
[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. Spaceleigh

    Spaceleigh

    Joined:
    Jun 24, 2017
    Posts:
    4
Hi. Love the asset. Quick question - how can you change the character mesh for the blend system at runtime? I've got several meshes on the same character and would like the lipsync to keep working if I switch to a different mesh. Basically, I'm making outfits for a character, and I've got 3 different masks I need to switch between depending on the amount of coverage I need. I tried looking for something simple like lipsync.blendSystem.characterMesh, which is where I expected it to be.
     
  2. Spaceleigh

    Spaceleigh

    Joined:
    Jun 24, 2017
    Posts:
    4
I figured out a way to do this by adding a LipSync component to each SkinnedMeshRenderer, and then switching the reference to the active LipSync when needed, instead of switching the mesh renderer.
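For anyone trying the same thing, here's a minimal sketch of that approach. (The OutfitSwitcher class and its field names are made up for illustration, and it assumes LipSync.Play takes a LipSyncData clip - check against your version of the asset.)
Code (CSharp):
// Hypothetical helper illustrating the workaround above: one LipSync
// component per outfit/mask mesh, with only the active one used for playback.
using RogoDigital.Lipsync;
using UnityEngine;

public class OutfitSwitcher : MonoBehaviour
{
    // One LipSync per outfit mesh, assigned in the inspector.
    public LipSync[] lipSyncPerOutfit;

    private LipSync activeLipSync;

    public void SwitchOutfit(int index)
    {
        activeLipSync = lipSyncPerOutfit[index];
    }

    public void Say(LipSyncData clip)
    {
        if (activeLipSync != null)
            activeLipSync.Play(clip);
    }
}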
     
  3. JasinAsap

    JasinAsap

    Joined:
    Jun 14, 2018
    Posts:
    1
Hi, I'm trying to set up a project with the LipSync asset. I'm using the Lite version right now to see how things work. I want to purchase the Pro version, but I've got a problem right from the start.

    It won't let me select any blend system, even though there are blend shapes for facial expressions on the character (it was generated with Autodesk Character Generator).
    There is also an error in the console that I don't understand.
    Thankful for any help!


    Screenshot
    ScreenshotProblem.png
     
  4. NawarRajab

    NawarRajab

    Joined:
    Aug 23, 2017
    Posts:
    11
Hi, I'm trying to generate LipSyncData from code. I've managed to do that, but when I want to play it, I get the following:
    [LipSync - Character] Loading data from an old format LipSyncData file. For better performance, open this clip in the Clip Editor and re-save to update.


followed by a lot of errors such as:


    Assertion failed: Assertion failed on expression: 'curveT >= GetRange().first && curveT <= GetRange().second'
    UnityEngine.AnimationCurve:Evaluate(Single)
    RogoDigital.Lipsync.LipSync:LateUpdate() (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:340)


I'm not using LipSyncClipSetup. I'm creating my LipSyncData as follows:
    Code (CSharp):
    lipSyncData.name = finishedClip.name;
    lipSyncData.clip = finishedClip;
    lipSyncData.length = finishedClip.length;
    lipSyncData.phonemeData = markers.ToArray();
    lipSyncData.emotionData = new EmotionMarker[0];
    lipSyncData.gestureData = new GestureMarker[0];
    Is there anything I'm missing?

    Can you point me in a direction please?
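For readers trying the same thing, the assignments above can be wrapped in one helper. This sketch assumes LipSyncData derives from ScriptableObject (it is saved as an asset by the Clip Editor, which suggests it does) and that the markers are PhonemeMarker objects - both are assumptions, not confirmed API.
Code (CSharp):
// Hedged sketch: building a LipSyncData instance at runtime from the same
// fields the post above sets. CreateInstance is used rather than 'new'
// because ScriptableObjects must be created that way.
using System.Collections.Generic;
using RogoDigital.Lipsync;
using UnityEngine;

public static class LipSyncDataBuilder
{
    public static LipSyncData Build(AudioClip finishedClip, List<PhonemeMarker> markers)
    {
        var lipSyncData = ScriptableObject.CreateInstance<LipSyncData>();
        lipSyncData.name = finishedClip.name;
        lipSyncData.clip = finishedClip;
        lipSyncData.length = finishedClip.length;
        lipSyncData.phonemeData = markers.ToArray();
        lipSyncData.emotionData = new EmotionMarker[0];
        lipSyncData.gestureData = new GestureMarker[0];
        return lipSyncData;
    }
}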
     
  5. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
Hi @Rtyper, just wondering if there's an ETA for full Timeline integration - I believe it's coming with v2?
     
  6. dgoyette

    dgoyette

    Joined:
    Jul 1, 2016
    Posts:
    856
I'm using LipSync Pro v1.42, and I'm running into an issue that only occurs in a released build, never when run under the Unity editor. The general issue is that I'm calling Play on a LipSyncData object and it doesn't play, though in other cases it will. I'll describe the two cases I've seen, with the hope that maybe someone has run into this before, or could offer some suggestions for how to debug this effectively (since it only occurs in a release build).

    The first case happens very regularly. I play a LipSync Pro clip shortly (about three seconds) after the scene starts. If I create a build of my game, and run it, that clip will not play. A few seconds later I play a second clip, which always plays just fine. At this point if I quit the game and launch it again, it will always play the first clip just fine. It's only the initial attempt to play that clip after doing a build where the clip will not play. Restarting the level doesn't help. The clip simply will never play until I've tried to play it once, quit the game, and launched it again.

    The second case occurs sporadically. I have some Coroutines set up to play a helpful suggestion every N seconds if the player hasn't done what they're supposed to do yet. So every N seconds I Play() a short LipSyncData object. Most of the time this works fine. However, every once in a while the clip doesn't play. And if it doesn't play the first time in the loop, it won't play any subsequent times. If I quit the game and relaunch it, the clip plays fine.

    Again, I'm hoping someone might offer some advice on how to diagnose this. Thanks.
     
  7. Casanuda

    Casanuda

    Joined:
    Dec 29, 2014
    Posts:
    38
    Hi,

    I see it was already mentioned, but I can confirm that on Mac with the latest version of lipsync I am getting stuck on "recognising phonemes".

    Thanks
    Cas
     
  8. aroontan_orca

    aroontan_orca

    Joined:
    Mar 29, 2018
    Posts:
    1
    Hi Casanuda, we are also experiencing the same problem. The issue seems to be with a 3rd party library - pocketsphinx.dylib for OSX (pocketsphinx.dll works fine on Windows). The auto-sync fails to load the library in Unity Editor OSX environment -- see error below. Hoping to hear back from Rogo Digital regarding this. Thanks, Aroon

    ------
    Unhandled Exception: System.DllNotFoundException: Assets/Plugins/Editor/Rogo Digital/LipSync/AutoSync 64bit/libpocketsphinx.3.dylib
    at (wrapper managed-to-native) RogoDigital.Lipsync.SphinxWrapper:psRun (RogoDigital.Lipsync.SphinxWrapper/MessageCallback,RogoDigital.Lipsync.SphinxWrapper/ResultCallback,int,string[])
    at RogoDigital.Lipsync.SphinxWrapper+<RecognizeProcess>c__AnonStorey1.<>m__1 () [0x0001e] in /Users/jeremyroh/Workspace/Collab/test-animation/Assets/Rogo Digital/LipSync Lite/AutoSync/Editor/SphinxWrapper.cs:117
     
    Last edited: Jul 9, 2018
  9. spakment

    spakment

    Joined:
    Dec 19, 2017
    Posts:
    16
@Rtyper For fixing the AutoSync issue on Sierra / High Sierra, have you considered using the pocketsphinx distributed by Homebrew ("brew install cmu-pocketsphinx")? Then you might be able to have a wrapper using something like:
    var info = new FileInfo("pocketsphinx");
    System.Diagnostics.Process.Start(info.FullName);

    Not sure if that's any help or not... (you've probably already tried it!)
     
  10. Hellwaiker

    Hellwaiker

    Joined:
    Jan 8, 2016
    Posts:
    45
Hi, with this system do I need to create blendshapes in an external application, or can it generate/retarget some basic blendshapes?
     
  11. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    269
Hello

    We tried to manipulate emotions in the bone blend system using the Slate cinematic plugin, but we have a problem. This is the info we got from the Slate developer:


    "Unfortunately (and after getting in contact with LipSync dev as well) there is no valid solution available yet for dynamically displaying bone-based emotions along with blend in/out possibility like it's done for blend-based ones. The developer kindly provided a solution and a method for "DisplayEmotion", but that provided solution does not work in the same way as currently the blend-shape implementation does. For example, it does not support gradual blending in and out of the emotion, but rather only an instant change to that emotion"

    My question is: is there a chance for an update from your side to make it possible for him to implement that feature in Slate?
    It's really important for our work - blendshapes are too memory-heavy, and using bones is more professional.

    I hope you can implement it; not being able to blend bone-based emotions is quite a big gap :)
     
    Last edited: Aug 1, 2018
  12. JamesGartland

    JamesGartland

    Joined:
    Jun 1, 2017
    Posts:
    7
    Is there any way to make Bone Transforms also affect the scale of the bone? I'm trying to use bone transforms with a Sprite Blend System to add squash and stretch to my sprites for more fluid transitions between phonemes.
     
  13. pilamin

    pilamin

    Joined:
    Apr 9, 2015
    Posts:
    7
    Does anybody know if the developer is fine? His last post was over three-and-a-half months ago, and he stated he was soon going to post details on the v2.0 update.

    Seems a bit ominous. Hope nothing happened to him.
     
  14. TonyLi

    TonyLi

    Joined:
    Apr 10, 2012
    Posts:
    8,620
    @Rtyper was last on the forum two days ago. Maybe there's just no update yet. Sometimes the forum stops sending notifications, so maybe he just didn't receive notice of the last few posts here.
     
  15. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    269
We contacted the dev by e-mail and he replied within a few days ;) So the asset is still under development, no worries.
     
    Last edited: Aug 2, 2018
  16. mkgame

    mkgame

    Joined:
    Feb 24, 2014
    Posts:
    553
Hi, under Unity 5.6.6f2 the AudioUtility class collides with Unity's own. I cannot use it because of the errors. At least two files are involved - please write the correct namespace before AudioUtility. Nice work by the way, lots of features!

    "Assets/Rogo Digital/LipSync Pro/Editor/LipSyncClipSetup.cs(504,5): error CS0104: `AudioUtility' is an ambiguous reference between `RogoDigital.AudioUtility' and `UnityEditor.AudioUtility'"

    Edit: in a clean project, this error does not occur.
     
    Last edited: Aug 8, 2018
  17. mkgame

    mkgame

    Joined:
    Feb 24, 2014
    Posts:
    553
MCS 1.6.4 seems to not be supported. How do I set up the morphs for MORPH 3D automatically? I installed the extension for MORPH 3D 1.6.3+, then I selected the Morph3D blend system and attached the MCS character. If I click on the preset button, I can only select the Mixamo setting, and if I do that, I get the wrong morphs. Are there no presets for MORPH 3D?

    Edit: I found it - install the preset package included in the installed MORPH 3D extension.
     
    Last edited: Aug 8, 2018
  18. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    123
Sent you an email but got no response. I'm trying to log into the download page on your site and having no luck - it's not recognizing my invoice number. I bought the asset along with some others.
     
  19. ceebeee

    ceebeee

    Joined:
    Mar 7, 2017
    Posts:
    231
Yeah, I just tried it to see if it would work for me, but it doesn't either. Honestly, despite what some say, I think they have stopped supporting this asset. I've not seen them reply to anyone here in a long time. Just because they 'logged in' to the forums means nothing. And saying 'maybe they didn't notice anyone had questions here' is ludicrous. That's literally their one job - to check the thread for their own product.
     
  20. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    123

I wanted the PlayMaker add-on, but I can't get it because I can't log into the download page. Might as well be looking into another asset.
     
  21. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    123
So I was able to log in by manually typing in my invoice number. I got the PlayMaker add-on from the extensions, so I'm good for now.
     
  22. ceebeee

    ceebeee

    Joined:
    Mar 7, 2017
    Posts:
    231
    I always hand-typed it in, and tried every possible number of digits the form allows, and it just doesn't work.
     
  23. mkgame

    mkgame

    Joined:
    Feb 24, 2014
    Posts:
    553
This asset works fine. Is it better suited for AAA games than SALSA? In SALSA I had to make expressions and animations externally somehow, while here a timeline editor is supported. I just set up an MCS character with both systems - I'm asking because I have no experience with these tools. The developer can't live off just this one asset, and I guess he's also afraid to write a comment here with sorry, sorry, sorry ;)
     
  24. eggtart

    eggtart

    Joined:
    Feb 4, 2013
    Posts:
    19
    What language models are available right now? Thanks!
     
  25. evfasya

    evfasya

    Joined:
    Feb 7, 2018
    Posts:
    6
    Hi everyone, I was wondering if there's a method to get the current blendshape/emotion value from the emotion that is created through the LipSyncPro tool (from the LipSyncData that I have created)? Please help :confused: and thanks in advance :D
     
  26. Appminis

    Appminis

    Joined:
    Apr 12, 2014
    Posts:
    119
    Lip Sync says there is native support for iClone characters - can you please provide information on how to use an iClone character? It seems like perhaps it needs its own Preset? The only preset available is Mixamo Fuse and if you use that, the iClone character doesn't really work well.
     
  27. evfasya

    evfasya

    Joined:
    Feb 7, 2018
    Posts:
    6
Okay, now I know how to do this :) (my email was also answered by RogoDigital support yesterday)

    I wanted to retrieve the blendshape values at runtime; it can be done by using this:
    float RogoDigital.Lipsync.BlendSystem.GetBlendableValue ( int blendable )


    First, get the LipSync component which has the blendshapes. I did it by assigning the component in the inspector, with a variable declaration something like this:
    public LipSync lipsyncComponent;


    Then in
    void Update()
    I can show the values at runtime in the console like this:
    Debug.Log(lipsyncComponent.blendSystem.GetBlendableValue(35));


    35 is the index of the component's blendshape. To find the index of a blendshape, try this: https://stackoverflow.com/questions/22694672/get-blendshapes-by-name-rather-than-by-index

    That's all!
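To save a trip to that link: the index can be looked up by name with Unity's own Mesh API - this part is plain Unity, not LipSync-specific (the class and field names here are made up for illustration).
Code (CSharp):
// Plain Unity: find a blendshape's index by name so it can be passed to
// blendSystem.GetBlendableValue. The renderer field is assigned in the inspector.
using UnityEngine;

public class BlendShapeLookup : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;

    public int IndexOf(string blendShapeName)
    {
        // Returns -1 if no blendshape with that name exists on the mesh.
        return faceRenderer.sharedMesh.GetBlendShapeIndex(blendShapeName);
    }

    void Start()
    {
        // List every blendshape with its index, as in the linked answer.
        Mesh mesh = faceRenderer.sharedMesh;
        for (int i = 0; i < mesh.blendShapeCount; i++)
            Debug.Log(i + ": " + mesh.GetBlendShapeName(i));
    }
}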
     
  28. matteatsmochi

    matteatsmochi

    Joined:
    Aug 2, 2018
    Posts:
    5
hi friends,
    quick question about phoneme sliders. I have 2 different sets of phonemes (image below is temp) and I'd like to control when they change using a C# manager. Anyone know how to do this with code? Thanks for the help.


    EDIT: I found
    blendSystem.SetBlendableValue();
    but I'm not quite sure how to set which blendshape to affect.
     
    Last edited: Aug 27, 2018
  29. evfasya

    evfasya

    Joined:
    Feb 7, 2018
    Posts:
    6
    You can use this function to print out all the active blendshapes of your character: https://stackoverflow.com/questions/22694672/get-blendshapes-by-name-rather-than-by-index

Then you can pass the blendshape index number to the SetBlendableValue method as:
    SetBlendableValue(int blendShapeIndexNumber, float value)
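Putting the two posts together, a minimal sketch of driving one slider from code. (The class name and the 0-100 value range are assumptions; the SetBlendableValue signature is taken from the posts above.)
Code (CSharp):
// Hedged sketch: setting one blendable directly through the blend system.
// Assumes a LipSync component reference assigned in the inspector, as in
// the earlier GetBlendableValue example.
using RogoDigital.Lipsync;
using UnityEngine;

public class PhonemeSliderDriver : MonoBehaviour
{
    public LipSync lipsyncComponent;

    // index: the blendshape's index on the mesh; value: typically 0-100,
    // matching Unity's blendshape weight range.
    public void SetSlider(int index, float value)
    {
        lipsyncComponent.blendSystem.SetBlendableValue(index, value);
    }
}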
     
    matteatsmochi likes this.
  30. mgmegag

    mgmegag

    Joined:
    Aug 7, 2018
    Posts:
    3
    Hi, can we change audio clip at run time in lip sync pro?
     
  31. hungrybelome

    hungrybelome

    Joined:
    Dec 31, 2014
    Posts:
    195
    @Rtyper Hi, I'm trying out the Lite version and the Clip Editor preview works great with my character, but when I try it at runtime, no blend shapes are changed. The provided Gettysburg sample file works great at runtime though. Also, lite seems to save files as LipSyncDataBase, instead of just LipSyncData like the provided Gettysburg files. I'm interested in buying the Pro version, but I need to somehow get the saved lite files to play correctly at runtime like in the Clip Editor preview.
     
  32. nathanjams

    nathanjams

    Joined:
    Jul 27, 2016
    Posts:
    164
  33. hungrybelome

    hungrybelome

    Joined:
    Dec 31, 2014
    Posts:
    195
    I bought Pro and I think the Autosync hang up issue on OSX is due to a pathing error (probably due to spaces?). If you move the .dylibs to a path without spaces (e.g. Assets/Plugins/Editor/libpocketsphinx.3.dylib) and then edit the path within the script, you'll get an instant crash and a more concerning error than the DllNotFoundException:

Code (CSharp):
    dyld: lazy symbol binding failed: Symbol not found: _err_set_debug_level
      Referenced from: Assets/Plugins/Editor/libpocketsphinx.3.dylib
      Expected in: /usr/local/lib/libsphinxbase.3.dylib

    dyld: Symbol not found: _err_set_debug_level
      Referenced from: Assets/Plugins/Editor/libpocketsphinx.3.dylib
      Expected in: /usr/local/lib/libsphinxbase.3.dylib

    Stacktrace:

      at (wrapper managed-to-native) RogoDigital.Lipsync.SphinxWrapper.psRun (RogoDigital.Lipsync.SphinxWrapper/MessageCallback,RogoDigital.Lipsync.SphinxWrapper/ResultCallback,int,string[]) <IL 0x00075, 0x003f3>
      at (wrapper managed-to-native) RogoDigital.Lipsync.SphinxWrapper.psRun (RogoDigital.Lipsync.SphinxWrapper/MessageCallback,RogoDigital.Lipsync.SphinxWrapper/ResultCallback,int,string[]) <IL 0x00075, 0x003f3>
      at RogoDigital.Lipsync.SphinxWrapper/<RecognizeProcess>c__AnonStorey1.<>m__1 () [0x00028] in /Volumes/OSX/UnityProjects/Fuse/Assets/Rogo Digital/LipSync Pro/AutoSync/Editor/SphinxWrapper.cs:144
      at (wrapper runtime-invoke) object.runtime_invoke_void__this__ (object,intptr,intptr,intptr) <IL 0x0001c, 0x000f9>

    Native stacktrace:

        0   ???                                 0x0000000113ec224a 0x0 + 4629209674
        1   ???                                 0x0000000113ec1be9 0x0 + 4629208041
        2   ???                                 0x0000000113e95482 0x0 + 4629025922
        3   ???                                 0x0000000113e955a9 0x0 + 4629026217
        4   libdyld.dylib                       0x00007fff5691b292 dyld_stub_binder + 282
        5   libpocketsphinx.3.dylib             0x0000000185e64008 format_desc + 38488
        6   libpocketsphinx.3.dylib             0x0000000185e583bf ps_init + 63
        7   libpocketsphinx.3.dylib             0x0000000185e55528 ps_run + 232
        8   ???                                 0x0000000174419c63 0x0 + 6245424227
        9   ???                                 0x000000017440c8b8 0x0 + 6245370040
        10  ???                                 0x000000012ebb023a 0x0 + 5078975034
        11  libmono.0.dylib                     0x000000012e00a1ca mono_get_runtime_build_info + 3654
        12  libmono.0.dylib                     0x000000012e135d02 mono_runtime_invoke + 117
        13  libmono.0.dylib                     0x000000012e13aa7e mono_runtime_delegate_invoke + 105
        14  libmono.0.dylib                     0x000000012e1617b8 mono_thread_create_internal + 1480
        15  libmono.0.dylib                     0x000000012e191b76 CreateThread + 1348
        16  libmono.0.dylib                     0x000000012e1b7ed2 GC_start_routine + 96
        17  libsystem_pthread.dylib             0x00007fff56c33661 _pthread_body + 340
        18  libsystem_pthread.dylib             0x00007fff56c3350d _pthread_body + 0
        19  libsystem_pthread.dylib             0x00007fff56c32bf9 thread_start + 13

    Debug info from gdb:


    =================================================================
    Got a SIGABRT while executing native code. This usually indicates
    a fatal error in the mono runtime or one of the native libraries
    used by your application.
    =================================================================

    [1005/153427:FATAL:platform_thread_posix.cc(235)] Check failed: 0 == pthread_join(thread_handle.handle_, __null) (0 vs. 3)
     
  34. tgekerrion

    tgekerrion

    Joined:
    Sep 11, 2018
    Posts:
    3
@Rtyper Hey! I've been looking into various lipsync plugins for a couple of days now, and yours is the most advanced that I came across. I'm planning to use it for a school project my team and I are working on, which will include an AI holding real-time conversations using a neural network for the AI responses. Love the freedom you provide with regard to the blend shapes and phonemes! But there is one downside at the moment: you state that "Please Note: LipSync Pro does not support automatic lipsyncing at runtime. Clips must be created in the editor beforehand." I've looked into possibly combining it with SALSA, since they have real-time voice input, but naturally, since you are competitors, I couldn't find a way to do that. (Note: I'm also a 3D artist, so coding is far from my area of expertise.)
    I've been wondering if you have been working on a real-time implementation, or if you could point me to a workaround for this issue?
    Thanks in advance!
    Kind regards,
    Kerrion!
     
  35. mgmegag

    mgmegag

    Joined:
    Aug 7, 2018
    Posts:
    3
Hi, what languages are supported by LipSync Pro?
    Also, comparing LipSync Pro and SALSA with RandomEyes, which one is better?
     
  36. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    269
  37. rshnbs

    rshnbs

    Joined:
    Jul 21, 2018
    Posts:
    2
Hello, I've created a new race in UMA by following the steps in Secret Anorak's tutorials. First I created a character in Adobe Fuse, imported it into Mixamo, and then into Blender. I converted it to UMA, and now I'm trying to use LipSync Lite on my new avatar, but it doesn't work. What am I missing, and what should I do to fix this?

     
  38. coverpage

    coverpage

    Joined:
    Mar 3, 2016
    Posts:
    384
I believe only English is supported by LipSync; SALSA probably works with any language.

    I have both SALSA and LipSync, and find myself using LipSync more nowadays. SALSA doesn't require processing the audio to classify the different phonemes (like 'a', 'o', 'c' etc.) as LipSync does. SALSA does it in realtime, but it does not have all the phonemes - it mostly does an open-and-close kind of motion. So while it is fast in this way, you have less control than with LipSync: since LipSync is preprocessed, you can edit exactly how you want the mouth motion to be. This control, and the richness of the full range of phonemes, is why I find myself using LipSync more these days.
     
  39. Roy927

    Roy927

    Joined:
    Sep 22, 2018
    Posts:
    7
    Can you please send me the fix too? I have the same problem as rsouza on page 12.
    Using 1.43 and Unity 2018.2.8f
     
    Last edited: Nov 12, 2018
  40. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    269
Hello, I'm using your system :) It is great; soon we will share more cutscenes made with it. Here is our webpage: http://www.endofsun.com

    But I'm writing to give some feedback about 2 improvements that would be great! :)

    1. It would be great if bone-based emotions blended better when the character talks. For now, the mouth emotion only appears between phonemes, which looks a bit odd, because there are very short micro-smiles between words. It would be better to blend toward some average position, since people can smile and talk at the same time :)

    2. It would be great if you could add some automatic eyelid movement when the eyes rotate :) It would be much more natural :) For example, if you look extremely far down, the eyelid closes a little automatically :)


    Just feedback - it would be great if you could implement it in the next updates :)
     
  41. magique

    magique

    Joined:
    May 2, 2014
    Posts:
    3,203
    Does this work with iClone characters?
     
  42. Olander

    Olander

    Joined:
    Sep 13, 2014
    Posts:
    391
Since nobody is going to answer this... I will. =)

    Indeed. It is really the only one that works with iClone Character Creator 2 & 3.

    VirtualRealities.school has a nice video on the process with lipsync tools for Unity (SALSA and LipSync Pro):
    www.youtube.com/watch?v=-kvRoc9uPN8
    See 32:30 for the section on iClone CC3's jaw issue.

    LipSync Pro is really good and includes a great eye controller. Most definitely worth the purchase.

    I did my own system for some time with C# and LeanTween, using ease-in and ease-out on each blendshape and jaw transform to give a good fake muscle acceleration. The end effect is really solid. LipSync Pro continues to work and makes configuring different personalities easier... and it keeps working through Unity versions, so not having source has not bitten the framework yet.
    *(I just updated to v1.43 and only had to add the RogoDigital. prefix to AudioUtility in a couple of scripts - Ctrl+H to change all.)
     
  43. pegassy

    pegassy

    Joined:
    Sep 28, 2017
    Posts:
    24
I am using Morph3D, and the character's eyes pop out when I use the eye controller. There were some posts saying the cause is the dummy eye sockets, but there was no clear solution. Can you help me?