Error when integrating the NeatSpeech demo
This topic has 12 replies, 2 voices, and was last updated 11 years, 4 months ago by Halle Winkler.
December 13, 2012 at 2:53 pm #14891 | appcontrol (Participant)
Hi there,
I have a project with OpenEars (1.2.4) integrated in it, for speech synthesis only. When I try to replace the SLT voice with one of the demo voices, the app crashes at the first synthesis, with the following message in the debug console: “Error: flite_hts_engine: specify models(trees) for each parameter.”
Did I miss something in the installation document?
Thanks,
Vincent.

December 13, 2012 at 3:05 pm #14892 | Halle Winkler (Politepix)
Hi,
Can you show the code you used? It just sounds a bit like the Emma voice (or whichever voice) is not instantiated at the time you are calling it.
December 13, 2012 at 3:06 pm #14893 | Halle Winkler (Politepix)
You can also contact me through the contact form and I’ll give you an address to email your code or project to if you want to use your free support email.
December 13, 2012 at 3:14 pm #14894 | appcontrol (Participant)
Hi,
thanks for the swift reaction, this is really good.
Here’s a code extract; the crash occurs on the line where stEmma (a static variable) is initialized.
Thanks,
Vincent

==========
+ (void)sayText:(NSString *)text {
    if (stFliteController == nil) {
        stFliteController = [[FliteController alloc] init];
    }
    if (stEmma == nil) {
        stEmma = [[Emma alloc] initWithPitch:0.0f speed:0.0f transform:0.0f];
    }
    if (theQueue == nil) {
        theQueue = [[NSMutableArray alloc] initWithCapacity:10];
    }
    if ([stFliteController speechInProgress]) {
        // NSLog(@"AudioSingleton sayText:%@ --> Will queue it", text);
        [theQueue addObject:text];
    } else {
        // NSLog(@"AudioSingleton sayText:%@ --> will say it", text);
        [stFliteController sayWithNeatSpeech:text withVoice:stEmma];
    }
}
==========

December 13, 2012 at 3:39 pm #14895 | Halle Winkler (Politepix)
OK, I see a few issues. The first is that the tutorial gives an example of how to do the memory management for both FliteController and the FliteController+NeatSpeech voices, and it’s a good idea to follow it, since it avoids memory-management issues. Here, the initialization occurs inside an instance method of a shared object, and there are a few ways that could go wrong. There’s no need to put NeatSpeech inside a singleton or to do your own queueing, since NeatSpeech manages its own queue internally; it is multithreaded and expects to be instantiated in a single view controller, not in a singleton whose thread we don’t know.
I would just set it up like the tutorial example:
December 13, 2012 at 4:25 pm #14896 | Halle Winkler (Politepix)
Just to explain a bit more about the internal queueing: you can send text to sayWithNeatSpeech: whenever you want, and if speech is currently in progress the new text will be queued behind the scenes and spoken when the previously queued speech is done. Or you can send a single very large piece of text and NeatSpeech will break it down and queue it up on its own. You can also dump the queue. It’s built on the assumption that you will need to queue, and it manages the whole process of putting synthesis on a secondary thread while keeping the results delivered by OpenEarsEventsObserver on the main thread.
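As a minimal sketch of what that means in practice (assuming self.fliteController and self.emma are set up as in the tutorial; the chunk strings are illustrative only):

    // Send chunks as they arrive; no speechInProgress check and no manual
    // NSMutableArray queue needed. NeatSpeech queues each chunk behind any
    // speech already in progress and speaks them in order.
    NSArray *chunks = @[@"First sentence.", @"Second sentence.", @"Third sentence."];
    for (NSString *chunk in chunks) {
        [self.fliteController sayWithNeatSpeech:chunk withVoice:self.emma];
    }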
December 13, 2012 at 5:50 pm #14897 | appcontrol (Participant)
Thanks for your answers.
The singleton is there because of the way the app is using sound, not to encapsulate NeatSpeech. And this works with the Slt voice, so I hoped it would be easy to switch to NeatSpeech.
Regarding the queuing, it’s in place because Slt and the previous sound mechanism I used couldn’t handle it, and the text comes in chunks because of the nature of the app.
I did not expect such a difference between OpenEars and NeatSpeech, but I’ll test further, and let you know if I need more support.
Vincent.
December 14, 2012 at 3:24 pm #14901 | appcontrol (Participant)
So, at this point I have something equivalent to the tutorial working, and by the way, the NeatSpeech voices are really good compared to the free ones…
But I’m still struggling to integrate it into my application, even in one of the many view controllers I have. Does NeatSpeech assume that the objects will be properties of the root view controller, or can they live in any VC?
Thanks,
Vincent

December 14, 2012 at 3:30 pm #14902 | Halle Winkler (Politepix)
Any VC. They can also be instantiated in a model that is controlled by a VC without any multithreading; the only reason I say to put them in a VC is that they should be on the main thread, and not in a singleton but in something which has a particular location in the view hierarchy and is normally memory-managed.
How are you struggling? Are you instantiating the voice and the fliteController in the emma and fliteController lazy-instantiation methods shown in the tutorial, and then referencing them with self., as in:
[self.fliteController sayWithNeatSpeech:@"I have always wished for my computer to be as easy to use as my telephone; my wish has come true because I can no longer figure out how to use my telephone." withVoice:self.emma];
?
I’m here to help, just let me know what the hangup is and I’m sure we can figure it out.
December 14, 2012 at 3:35 pm #14903 | Halle Winkler (Politepix)
This would be the lazy instantiation approach:
1. Make sure you’ve imported FliteController+NeatSpeech.h in the VC header after the import of FliteController.h.
2. Create an ivar and property of the voice and of the FliteController in the VC header, synthesize both in the VC implementation, and for each, override the accessor method with the following lazy accessors:

- (Emma *)emma {
    if (emma == nil) {
        emma = [[Emma alloc] initWithPitch:0.0 speed:0.0 transform:0.0];
    }
    return emma;
}

- (FliteController *)fliteController {
    if (fliteController == nil) {
        fliteController = [[FliteController alloc] init];
    }
    return fliteController;
}
Then, you don’t initialize either ever, or do any checking of whether they are instantiated, and you don’t have to queue, you just reference them like so:
[self.fliteController sayWithNeatSpeech:@"I have always wished for my computer to be as easy to use as my telephone; my wish has come true because I can no longer figure out how to use my telephone." withVoice:self.emma];
Also, just for sanity, double-check that you’ve added the -ObjC other linker flag to the target.
December 14, 2012 at 3:42 pm #14904 | Halle Winkler (Politepix)
“by the way, the NeatSpeech voices are really good compared to the free ones…”
And thanks for this! Very nice to hear.
December 14, 2012 at 4:33 pm #14905 | appcontrol (Participant)
O-keeee, I got it working now :-)
The problem was that I have a single Xcode project with several targets, and when I copied the Voices folder into my project, it was added to only one target, so the voice files were missing from the bundle resources (Build Phases -> Copy Bundle Resources).
So the problem had nothing to do with architecture, singleton or VCs.
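For anyone hitting the same crash, a quick runtime sanity check can confirm whether a voice data file actually shipped in the running target’s bundle. This is only a sketch: “emma” and the “voicedata” extension are hypothetical placeholders, so substitute the real file names from the Voices folder.

    // Check that a voice data file made it into this target's bundle
    // (i.e. into its Copy Bundle Resources build phase).
    NSString *voicePath = [[NSBundle mainBundle] pathForResource:@"emma"
                                                          ofType:@"voicedata"];
    if (voicePath == nil) {
        NSLog(@"Voice data file missing from this target's bundle resources.");
    }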
Thanks for your good support, and expect an order for the full version next week; but for now, time for some more tests, and… party.
With kind regards,
Vincent

December 14, 2012 at 4:39 pm #14906 | Halle Winkler (Politepix)
Fantastic! Glad it’s working for you and enjoy the party.