MissKitty

Forum Replies Created

Viewing 5 posts - 1 through 5 (of 5 total)

  • in reply to: lm/dic files #1024150
    MissKitty
    Participant

Problems solved. The app is running on the Mac; time to move to the iPhone. Thanks for your help, you can close this out.

    in reply to: lm/dic files #1024143
    MissKitty
    Participant

I now have speech recognition working with the large LM. My app is designed to start speech recognition when the user presses a button and to stop when a hypothesis is returned. In the code below I call resumeRecognition and suspendRecognition, and then make multiple say: calls via Flite.
1. From the log below it appears that every time Flite has finished speaking, Pocketsphinx resumes recognition. Is there a way to prevent this?
2. From the log below I use Flite to say "error netfor: hand", then "Love", and finally "Thirty". I only hear "error netfor: hand". Is there a way to wait after each phrase to make sure speech is complete, as there is with Google TTS?
3. I am using punctuation (":") in the spoken text. Is that a problem for Flite?

    **************************** xcode log *********************************
    2015-01-08 05:54:25.777 MTC[1257:91066] MTC button pressed.
    2015-01-08 05:54:26.026 MTC[1257:91066] MTC button released.
    2015-01-08 05:54:26.026 MTC[1257:91066] Local callback: Pocketsphinx has resumed recognition.
    2015-01-08 05:54:26.890 MTC[1257:91066] Local callback: Pocketsphinx has detected speech.
    2015-01-08 05:54:28.307 MTC[1257:91066] Local callback: Pocketsphinx has detected a second of silence, concluding an utterance.
    2015-01-08 05:54:28.313 MTC[1257:91066] MTC to gotWords
    2015-01-08 05:54:28.314 MTC[1257:91066] error netfor: hand
    2015-01-08 05:54:28.354 MTC[1257:91066] Love
    2015-01-08 05:54:28.377 MTC[1257:91066] Thirty
    2015-01-08 05:54:28.400 MTC[1257:91066] Local callback: Pocketsphinx has suspended recognition.
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Flite has started speaking
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Flite has started speaking
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Flite has started speaking
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Pocketsphinx has suspended recognition.
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Pocketsphinx has suspended recognition.
    2015-01-08 05:54:28.401 MTC[1257:91066] Local callback: Pocketsphinx has suspended recognition.
    2015-01-08 05:54:29.946 MTC[1257:91066] Local callback: Flite has finished speaking
    2015-01-08 05:54:29.946 MTC[1257:91066] Local callback: Flite has finished speaking
    2015-01-08 05:54:29.946 MTC[1257:91066] Local callback: Pocketsphinx has resumed recognition.
    2015-01-08 05:54:29.946 MTC[1257:91066] Local callback: Pocketsphinx has resumed recognition.
    2015-01-08 05:54:30.100 MTC[1257:91066] Local callback: Pocketsphinx has detected speech.
    2015-01-08 05:54:30.758 MTC[1257:91066] Local callback: Flite has finished speaking
    2015-01-08 05:54:30.759 MTC[1257:91066] Local callback: Pocketsphinx has resumed recognition.

****************** in viewDidLoad ***************************

// This is how to start the continuous listening loop of an available
// instance of OEPocketsphinxController, if we aren't already listening.

[OEPocketsphinxController sharedInstance].returnNullHypotheses = TRUE;

[[OEPocketsphinxController sharedInstance] setActive:TRUE error:nil];

if (![OEPocketsphinxController sharedInstance].isListening) {
    [[OEPocketsphinxController sharedInstance] startListeningWithLanguageModelAtPath:self.pathToFirstDynamicallyGeneratedLanguageModel dictionaryAtPath:self.pathToFirstDynamicallyGeneratedDictionary acousticModelAtPath:[OEAcousticModel pathToModel:@"AcousticModelEnglish"] languageModelIsJSGF:FALSE];
}

[self startDisplayingLevels];

// This suspends listening without ending the recognition loop.
[[OEPocketsphinxController sharedInstance] suspendRecognition];

************************* end viewDidLoad

- (IBAction)buttonDown:(id)sender {
    NSLog(@" MTC button pressed.");
}

- (IBAction)buttonUp:(id)sender {
    NSLog(@" MTC button released.");

    AudioServicesPlaySystemSound(1005);

    [OEPocketsphinxController sharedInstance].returnNullHypotheses = TRUE;

    [[OEPocketsphinxController sharedInstance] resumeRecognition];
}

- (void)handleLongPress:(id)sender {
    // Long press done by the user.
    NSLog(@" MTC long press");
}

- (void)speakWithNSString:(NSString *)text {
    // Create the Flite controller and voice once; re-allocating them on
    // every call discards any speech still in progress.
    if (!self.fliteController) {
        self.fliteController = [[OEFliteController alloc] init];
        self.slt = [[Slt alloc] init];
    }

    NSLog(@"%@", text);

    [self.fliteController say:[NSString stringWithFormat:@" %@", text] withVoice:self.slt];
}

- (void)myLogWithNSString:(NSString *)text {
    NSLog(@"%@", text);
}

- (void)pocketsphinxDidReceiveHypothesis:(NSString *)hypothesis recognitionScore:(NSString *)recognitionScore utteranceID:(NSString *)utteranceID {
    // This suspends listening without ending the recognition loop.
    [[OEPocketsphinxController sharedInstance] suspendRecognition];

    NSLog(@" MTC to gotWords");

    MTCccActivity *theInstance = [MTCccActivity getInstance];
    [theInstance gotWordsWithNSString:hypothesis];
}

@end
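[Editor's note] For question 2, one possible pattern (a sketch, not part of the original post) is to queue the phrases and issue the next say: call only from the "Flite has finished speaking" delegate callback, so each utterance completes before the next starts. Here speechQueue, enqueueSpeech:, and speakNextQueuedPhrase are hypothetical helpers; say:withVoice: and the fliteDidFinishSpeaking callback are the OpenEars pieces already visible in the post above, and speechInProgress is assumed to be available on OEFliteController.

```objc
// Sketch: serialize Flite utterances with a simple FIFO queue.
// Assumes a property: @property (strong) NSMutableArray *speechQueue;

- (void)enqueueSpeech:(NSString *)text {
    if (!self.speechQueue) {
        self.speechQueue = [NSMutableArray array];
    }
    [self.speechQueue addObject:text];
    // Only start speaking if Flite is idle; otherwise the
    // fliteDidFinishSpeaking callback will drain the queue.
    if (!self.fliteController.speechInProgress) {
        [self speakNextQueuedPhrase];
    }
}

- (void)speakNextQueuedPhrase {
    if (self.speechQueue.count == 0) return;
    NSString *next = [self.speechQueue firstObject];
    [self.speechQueue removeObjectAtIndex:0];
    [self.fliteController say:next withVoice:self.slt];
}

// OEEventsObserverDelegate callback, fired when an utterance completes.
- (void)fliteDidFinishSpeaking {
    [self speakNextQueuedPhrase];
}
```

With this in place, the three calls in gotWords become three enqueueSpeech: calls, and recognition can be resumed once the queue is empty rather than after the first utterance.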

    in reply to: lm/dic files #1024114
    MissKitty
    Participant

I just bought a new Mac. I transferred the source and re-downloaded OpenEars; Xcode 6.1.1 in both cases. It was working on the old Mac. I now have a new problem: it cannot find
OpenEars/OEEventsObserver.h

The Framework Search Path is the same as on the old Mac except for the user-name change. Any suggestions? I will not have access to the old Mac after today.

    in reply to: lm/dic files #1024057
    MissKitty
    Participant

The question I am asking is whether OpenEars will have any problem with an LM of my size. Android has nothing to do with it, except that the same LM works on Android.

    in reply to: lm/dic files #1024054
    MissKitty
    Participant

My LM consists of 329 sentences and 1,289 words. It runs fine with Pocketsphinx on Android, and I get excellent recognition. I can write some code to convert this file to your format so that I only have to maintain one LM file. Do you see any problem with this approach?
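[Editor's note] An alternative to converting the file offline (a sketch, not from the thread) is OpenEars' runtime language model generation, which the earlier post already implies via pathToFirstDynamicallyGeneratedLanguageModel. This assumes the OELanguageModelGenerator API; the file name "MyLM" and the two sample phrases are placeholders:

```objc
#import <OpenEars/OELanguageModelGenerator.h>
#import <OpenEars/OEAcousticModel.h>

// Build the .lm/.dic pair from an array of phrases at runtime, so the
// Android-format file can stay the single source of truth: parse it into
// an NSArray and hand it to the generator.
OELanguageModelGenerator *generator = [[OELanguageModelGenerator alloc] init];
NSArray *phrases = @[@"HELLO", @"GOOD MORNING"]; // placeholder sentences

NSError *error = [generator generateLanguageModelFromArray:phrases
                                            withFilesNamed:@"MyLM"
                                    forAcousticModelAtPath:[OEAcousticModel pathToModel:@"AcousticModelEnglish"]];
if (!error) {
    NSString *lmPath  = [generator pathToSuccessfullyGeneratedLanguageModelWithRequestedName:@"MyLM"];
    NSString *dicPath = [generator pathToSuccessfullyGeneratedDictionaryWithRequestedName:@"MyLM"];
    // Pass lmPath and dicPath to startListeningWithLanguageModelAtPath:...
}
```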
