Xcode - Xamarin OpenEars Native Binding Not Working on Device but Works on Simulator


I have been working on binding the OpenEars v2.03 iOS framework in a Xamarin.iOS binding project. Let me explain what I have done so far. I'm new to Xcode, Xamarin, and bindings, so this is going to be a big question; hold your breath…

1) I built the OpenEars framework project in Xcode for the simulator, copied the "OpenEars" file from Framework/OpenEars.framework/Versions/Current/, and renamed it "libopenears-i386.a".

Likewise, I built the same library for an iPhone 4S by connecting the device to my Mac and choosing it as the target. I copied the generated OpenEars binary and renamed it "libopenears-armv7.a".

2) Using the lipo command, I bundled the two files (libopenears-i386.a and libopenears-armv7.a) into a single file, "libopenears.a", with the command below.

lipo -create -output libopenears.a libopenears-i386.a libopenears-armv7.a  

3) I created a binding project in Xamarin Studio and added libopenears.a, which generated libopenears.linkwith.cs automatically, containing the following code:

using System;
using ObjCRuntime;

[assembly: LinkWith ("libopenears.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++")]

I also tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.

4) The ApiDefinition file contains the interfaces for OpenEars; I have added one interface here:

[BaseType (typeof (NSObject))]
[Protocol]
interface OEEventsObserver {
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}

5) I referenced OpenEars.dll in an iOS sample project.

6) I added the language model and acoustic model to the binding library itself. (Dynamic language model generation isn't needed; I used the old OpenEars Xamarin sample project from Git and modified the example for the latest API changes, rather than using the new dynamic language model generator.)

View controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;

    string pathToLanguageModel;
    string pathToDictionary;
    string pathToAcousticModel;

    string firstVoiceToUse;
    string secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void Init ()
    {
        try {
            observer = new OEEventsObserver ();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController ();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        } catch (Exception e) {
            Console.WriteLine ("Exception message: " + e.Message);
            Console.WriteLine ("Inner exception message: " + e.InnerException.Message);
        }
    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        Init ();
    }

    #region Update

    public void UpdateStatus (string text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (string text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (string text)
    {
        //fliteController.SayWithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {
            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];

            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify!!", "Already listening", null, "OK", "Stop").Show ();
        }
    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try {
            StartListening ();
            //fliteController.Init ();
            //fliteController.Say ("Hai", new OEFliteVoice ());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine ("Speech in progress: " + fliteController.SpeechInProgress);
        } catch (Exception e) {
            Console.WriteLine (e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }

    #endregion
}

OpenEarsEventsObserverDelegate:

// Nothing here, just checking status for debugging

public class OpenEarsEventsObserverDelegate : OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController Controller {
        get { return _controller; }
        set { _controller = value; }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        Controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart ()
    {
        Console.WriteLine ("Pocketsphinx is starting up");
        Controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceId)
    {
        Controller.UpdateText ("Heard: " + hypothesis);
        Controller.Say ("You said: " + hypothesis);
    }

    public override void PocketsphinxContinuousSetupDidFail ()
    {
    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration complete");
        Controller.UpdateStatus ("Pocketsphinx calibration complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {
    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is listening");
        Controller.UpdateStatus ("Pocketsphinx is listening");
        Controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {
    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        Controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {
    }

    public override void PocketsphinxDidSuspendRecognition ()
    {
    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {
    }

    public override void FliteDidStartSpeaking ()
    {
    }

    public override void FliteDidFinishSpeaking ()
    {
    }
}

This works on the iOS simulator but does not run on a real device.

Simulator screenshot.

I got the following error message while running on the device. I'm getting the same message for all the interfaces.

Exception message: Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.
2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed exception:
Exception has been thrown by the target of an invocation. (System.Reflection.TargetInvocationException)
  at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj, System.Object[] parameters) [0x00016] in /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543
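This error means the managed wrapper could not find the Objective-C class at runtime, which usually indicates that the device slice of the native library never made it into the final app binary. One way to confirm is to run nm over the compiled app executable itself rather than the .a file; this is a sketch with a hypothetical app name, and it only works on builds where symbols have not been stripped:

```shell
# Sketch (app name/path are hypothetical): check whether the
# OEEventsObserver class was actually linked into the device
# executable. If grep prints nothing, the .a was present at
# bind time but its device slice was never linked in.
nm OpenEarsNewApi.app/OpenEarsNewApi | grep OEEventsObserver
```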

Am I missing something related to binding for devices?

I also tried building the same .dll using make files, and got the same error message.

For building the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile for generating OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: openears.dll

openears.dll: AssemblyInfo.cs openears.cs libopenears.a
	$(BTOUCH) -unsafe --new-style -out:$@ openears.cs -x=AssemblyInfo.cs --link-with=libopenears.a,libopenears.a

clean:
	-rm -f *.dll

You can check the complete mtouch error log here.

$ lipo -info libopenears.a
Architectures in the fat file: libopenears.a are: i386 armv7
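Note that the fat file lists only i386 and armv7. Besides -info, lipo's standard -verify_arch flag can be used to check for any one slice (a sketch using the same file name as above):

```shell
# lipo -verify_arch exits 0 only if every named slice exists
# in the fat file, so this reports whether arm64 is present.
lipo libopenears.a -verify_arch arm64 && echo "has arm64" || echo "missing arm64"
```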

Check with $ nm -arch armv7 libopenears.a:

The full nm command output is here.

I checked that OEEventsObserver exists in the simulator (i386) slice:

$ nm -arch i386 libopenears.a | grep OEEvent

Output:

         U _OBJC_CLASS_$_OEEventsObserver
00006aa0 s l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 s l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libopenears.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
         U _OBJC_CLASS_$_OEEventsObserver
00002d90 s l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 s l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I checked that it exists in the armv7 slice as well:

$ nm -arch armv7 libopenears.a | grep OEEvent

Output:

         U _OBJC_CLASS_$_OEEventsObserver
00005680 s l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 s l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libopenears.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
         U _OBJC_CLASS_$_OEEventsObserver
00002638 s l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 s l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I'm not sure what I'm missing. Yes, there are a lot of grammar mistakes; thank you for the time you spent reading this.

Update: Thanks @poupou and @halle for the valuable comments. I built a fat binary using all the architectures, including arm64 and x86_64 (a must), and used lipo to bundle them into one package. Now it works like a charm! I also set Project Properties -> Advanced -> Supported architectures -> ARMv7, and it runs on an iPad 2 and iPhone 4. I still need to test on an iPhone 6 and 6 Plus, but I expect they will work since they are in the arm64 family. I'm not sure how it behaves on armv7s devices (iPhone 5, iPhone 5c, iPad 4); I don't see armv7s support in OpenEars v2.03.
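The fix can be sketched as a build script, assuming the same project and file names used earlier (the SDK version and output paths depend on the local Xcode install):

```shell
# Sketch: build the two 64-bit slices that were missing; the
# i386/armv7 steps mirror the xcodebuild commands shown earlier.
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch x86_64 -configuration Release clean build
xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch arm64 -configuration Release clean build

# Combine all four slices into one fat static library.
lipo -create -output libopenears.a \
    libopenears-i386.a libopenears-x86_64.a \
    libopenears-armv7.a libopenears-arm64.a

# Sanity check: the fat file should now list i386, x86_64, armv7 and arm64.
lipo -info libopenears.a
```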

