This topic has 7 replies, 2 voices, and was last updated 11 years, 7 months ago by Halle Winkler.
September 3, 2012 at 6:03 pm · #10920 · hohl (Participant)
I successfully integrated the OpenEars framework an hour ago, but now that I want to test the voice recognition, it doesn’t work. I set up a very minimalistic view controller in my app to test it, but it always crashes as soon as any voice is detected.
It crashes in ps_start_utt at 0x80df2. I have logging enabled and get the following output: https://www.sourcedrop.net/WPg4e3069b5b0
The view controller source is just copied and pasted from the sample app: https://www.sourcedrop.net/bsB4e270527f7
Does it matter that my application is a music player and changes the AVAudioSession on launch?
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    NSError *audioSessionError = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&audioSessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:&audioSessionError];
    if (audioSessionError != nil) {
        NSLog(@"Something went wrong with initialising the audio session!");
    }
    AudioSessionSetActive(true);
    AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, ARAudioSessionPropertyListener, nil);
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        self.viewController = [[ARViewController alloc] initWithNibName:@"ARViewController_iPhone" bundle:nil];
    } else {
        self.viewController = [[ARViewController alloc] initWithNibName:@"ARViewController_iPad" bundle:nil];
    }
    self.window.rootViewController = self.viewController;
    [self.window makeKeyAndVisible];
    [OpenEarsLogging startOpenEarsLogging];
    return YES;
}

September 3, 2012 at 6:10 pm · #10922 · Halle Winkler (Politepix)
To find out why PocketsphinxController is crashing, set verbosePocketSphinx to true. It probably can’t find all or part of the acoustic model or the language model in your new app.
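For reference, enabling both logging layers before starting to listen looks roughly like this. This is a sketch based on the OpenEars 1.x sample app conventions; the exact property and method signatures for your OpenEars version may differ, and lmPath/dicPath are hypothetical placeholders for your generated model paths:

```objc
// Sketch: turn on both logging layers before listening starts.
// OpenEarsLogging covers the audio driver; verbosePocketSphinx covers the decoder.
[OpenEarsLogging startOpenEarsLogging];
self.pocketsphinxController.verbosePocketSphinx = TRUE; // decoder-level logs

// lmPath/dicPath: paths to your dynamically generated language model and dictionary.
[self.pocketsphinxController startListeningWithLanguageModelAtPath:lmPath
                                                  dictionaryAtPath:dicPath
                                               languageModelIsJSGF:NO];
```

With verbosePocketSphinx on, the decoder prints which model files it is trying to open, which is usually enough to spot a missing resource.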
September 3, 2012 at 6:16 pm · #10923 · hohl (Participant)
I already had [OpenEarsLogging startOpenEarsLogging]; in place, and setting verbosePocketSphinx doesn’t change anything: it just logs Listening and then crashes.
The acoustic model and language model are generated dynamically, so they shouldn’t be missing.
September 3, 2012 at 6:29 pm · #10925 · Halle Winkler (Politepix)
OpenEarsLogging and verbosePocketSphinx aren’t related: OpenEarsLogging logs the basic functionality of the audio driver etc., while verbosePocketSphinx logs what is going on under the surface in Pocketsphinx, which is where your issue is. I don’t think it’s possible that you get no new logging output when you turn on verbosePocketSphinx, since the crash occurs after Pocketsphinx has started. Please double-check that it is turned on so you can show your logs.

“The acoustic model and language model are generated dynamically, so they shouldn’t be missing.”

The language model can be generated dynamically, but the acoustic model is part of the “framework” folder that has to be dragged into an app and cannot be dynamically generated. My guess is that the acoustic model isn’t in your new app.
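A quick way to confirm the acoustic model actually shipped inside the app bundle is to probe for one of its files at runtime. A minimal sketch, assuming the stock OpenEars English acoustic model (the file name "mdef" is taken from that model and may differ in other versions):

```objc
// Sketch: check whether an acoustic-model file made it into the main bundle.
// "mdef" is one of the files in the stock OpenEars English acoustic model
// (assumption; substitute any file you know is in your model folder).
NSString *mdefPath = [[NSBundle mainBundle] pathForResource:@"mdef" ofType:nil];
if (mdefPath == nil) {
    NSLog(@"Acoustic model file not found at the bundle root; the framework "
          @"folder may have been added as a folder reference instead of a group.");
} else {
    NSLog(@"Acoustic model found at: %@", mdefPath);
}
```

If the file only resolves with an inDirectory: variant of pathForResource:, the resources were copied as a nested folder reference rather than flattened into the bundle root.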
September 3, 2012 at 6:32 pm · #10927 · hohl (Participant)
Oh, I mixed it up with the grammar model. All I have in my resources is this (which is just the whole framework folder): http://cl.ly/image/0b3m3C2L0q37

September 3, 2012 at 6:33 pm · #10928 · Halle Winkler (Politepix)
You just need to get the verbosePocketSphinx logging turned on and it will tell you what is going wrong.
September 3, 2012 at 6:36 pm · #10929 · hohl (Participant)
It doesn’t matter anymore. After your last comment, I found out that the framework folder must be included in flat form (using groups instead of folder references). Now it works.
Thanks for the support.
September 3, 2012 at 6:38 pm · #10930 · Halle Winkler (Politepix)
Ah, gotcha. OK, glad it’s working for you!