
SALSA Lipsync Suite - lip-sync, emote, head, eye, and eyelid control system.

Discussion in 'Assets and Asset Store' started by Crazy-Minnow-Studio, Apr 23, 2014.

  1. Vector_6

    Vector_6

    Joined:
    Mar 9, 2017
    Posts:
    3

    Ahh, I couldn't find any reference to the updated class. I'll see if I can implement it! Thanks!!!
     
  2. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
  3. saylas_unity

    saylas_unity

    Joined:
    Apr 25, 2019
    Posts:
    3
    Hello! I've set up lip sync, eyes, head, eyelids, and emotions using bones on my generic rig.
    The rig is pretty non-standard, but everything seems to be working well except eye and head targeting. When I choose a target, the character looks in the wrong direction (for example, when looking at a camera in front of the character, it looks down and to the left). Do you have any ideas what could help?
     
    Last edited: Aug 15, 2019
  4. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello saylas_unity,

    It sounds like your bones are probably not aligned with Unity's left-handed coordinate system. You can either fix your bones to align them with Unity's coordinate system, or use our Fix Axis feature on your head and eye bones to automatically add a corrective hierarchy. See the "Fix Axis / Restore Axis" sections of our documentation for head and eyes for more details.

    https://crazyminnowstudio.com/docs/salsa-lip-sync/modules/eyes/using/#head-configuration
    https://crazyminnowstudio.com/docs/salsa-lip-sync/modules/eyes/using/#eye-configuration
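
    For anyone wondering what a corrective hierarchy boils down to, here is a rough, generic sketch of the idea (this is not SALSA's actual Fix Axis code, just an illustration, and it assumes the character faces world +Z at setup time): a new parent is inserted above the bone and aligned to Unity's convention, so whatever drives that parent can assume +Z forward while the bone keeps its authored orientation underneath it.

    using UnityEngine;

    // Rough illustration only -- NOT SALSA's Fix Axis implementation.
    // Inserts a parent above a bone so anything driving the new parent can
    // assume Unity's convention (+Z forward, +Y up), while the original bone
    // keeps its authored local orientation as a child offset.
    public static class BoneAxisFixSketch
    {
        public static Transform InsertCorrectiveParent(Transform bone)
        {
            // Name loosely modeled on the objects SALSA adds; purely hypothetical here.
            var corrective = new GameObject(bone.name + "_OffsetAxis").transform;
            corrective.SetParent(bone.parent, false);
            corrective.position = bone.position;
            corrective.rotation = Quaternion.identity; // world-aligned; assumes the character faces world +Z

            bone.SetParent(corrective, true); // preserve the bone's current world pose
            return corrective;
        }
    }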
     
  5. saylas_unity

    saylas_unity

    Joined:
    Apr 25, 2019
    Posts:
    3
    When I tried Fix Axis, it broke how the eyes look on the character.

    Which bone should I align to fix that issue? Are there any details on that in any of the manuals?
     
  6. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    You may have got the eyes turned 180° around…
    for example: the eyes are pointing +Z when they should be pointing -Z
     
  7. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    To answer your question, any bones that you link to the SALSA suite should be aligned to the left-handed coordinate system, since that is what Unity uses.

    EDIT: To clarify a bit, SALSA and EmoteR don't care about your bone orientation, since they capture a start and end point and interpolate between the two, similar to how blendshapes work. Eyes requires correct alignment because it can't rely on a single start and end point and needs to incorporate world space to calculate tracking vectors.
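
    To illustrate the world-space part, here is a minimal, generic target-tracking sketch (not the Eyes module's internal code): the look rotation is built from a world-space vector toward the target and assumes the bone visually faces local +Z, which is exactly why a misaligned bone ends up looking down-left instead of at the camera.

    using UnityEngine;

    // Minimal sketch of target tracking -- NOT the Eyes module's internals.
    // Quaternion.LookRotation assumes the bone visually faces local +Z; if the
    // modeled eye actually looks down another axis, this same rotation aims it
    // the wrong way.
    public class SimpleEyeAimSketch : MonoBehaviour
    {
        public Transform eyeBone; // assumed to face +Z when correctly aligned
        public Transform target;  // e.g. the camera

        void LateUpdate()
        {
            Vector3 toTarget = target.position - eyeBone.position; // world-space tracking vector
            eyeBone.rotation = Quaternion.LookRotation(toTarget, Vector3.up);
        }
    }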

    However, before you proceed we'd like to investigate your result further. Please send us an email to assetsupport@crazyminnow.com, reference this forum thread so we know who the email is from, and include your SALSA v2 invoice number. Thanks!
     
    Last edited: Aug 15, 2019
  8. TJD269

    TJD269

    Joined:
    Oct 17, 2017
    Posts:
    19
    Hi there, I'm slightly confused: I'm trying to update SALSA with RandomEyes and it's telling me the package is deprecated and I can no longer update to 2.0... half the reason I bought this asset was for the update. Is there something I'm missing? I love this asset and was looking forward to the individual eye control.

    Thanks!
     
  9. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hi TJD269,

    SALSA v2 is a complete rewrite, is not backwards compatible, and is a paid upgrade. SALSA v1 is deprecated, but you can continue to use it or you can upgrade.
     
  10. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Hello,

    It has taken a while, but I've now found the time to update SALSA and have a look at my UMAs.
    In the first scene, the UMAs are already placed in the scene (not dynamically generated).

    Using the GameObject menu, one click added the UMA DCS components. I also added an AudioSource.
    From my script I removed all CM_... events.

    As I understand it, I only have to set the audio clip on the AudioSource and play it? That would be much easier than before :)
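
    For reference, this is roughly what I mean in code (the field names in this sketch are just placeholders):

    using UnityEngine;

    // Sketch of "set the clip and play it on the AudioSource" -- field names are placeholders.
    public class PlayLine : MonoBehaviour
    {
        public AudioSource voiceSource; // the AudioSource on the UMA
        public AudioClip lineClip;      // the speech clip

        public void Say()
        {
            voiceSource.clip = lineClip;
            voiceSource.Play();
        }
    }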

    (screenshot attached)

    It looks okay so far, but when the audio clip is played on the AudioSource, I can hear it, but the lips do not move...

    What am I missing? Can you give me a hint?
     
  11. Deckard_89

    Deckard_89

    Joined:
    Feb 4, 2016
    Posts:
    316
    Hi,
    How can I completely disable SALSA head movement on my character? I set Head Templates to None and turned off "Head" in the Queue Processor, but the character's head is still being influenced during gameplay. I only want to use blend shapes and eye bones on this character (using the Fuse OneClick in Unity 2018.3.14).
     
  12. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello!
    Since you have a completely different configuration from the UMA Getting Started option, you might require some additional setup. The OneClick requires the UMA ExpressionPlayer to function. Are you spawning that at runtime? Or do you have one on your character? If you don't have one at all, that would be at least one reason there is no lip syncing.
     
  13. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello @Deckard_89 ,
    Setting the template to "none" should do the trick, as long as you agree to the popup dialog suggesting the configuration will be removed. I just tested this on the DAZ character I'm working with on some video tutorials and it worked as intended. Try disabling (un-checking) the "Enable Processing" option on the Head Configuration section. That should completely disable the head animations.

    Also, disabling the "head" option under the QueueProcessor only prevents the processor inspector from displaying the "head" animations currently in the queue. It is a filter setting and nothing more.

    Hope that helps!
    Darrin
     
    Deckard_89 likes this.
  14. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Sometimes, both when the UMA is not talking and when he is, I can see values in the Queue Processor like these:

    (screenshot attached)

    Unity is 2019.2.0.
    I have multiple scenes, three with UMAs already placed in the scene and one with dynamic spawning. I am currently working on the ones with UMAs already in the scene. I added the ExpressionPlayer but see no change yet...

    In both cases I have a script which changes some DNA and clothing at runtime when they are initialized.

    Thanks a lot for guiding me ;-)
     
  15. skinwalker

    skinwalker

    Joined:
    Apr 10, 2015
    Posts:
    509
    Hello, we are about to make blendshapes for talking and buy SALSA, but I'm wondering what the best blendshape configuration is for realistic talking. I found this:

    https://facefx.github.io/documentation/doc/default-character-setup

    What do you think, does it work with SALSA if we only make the mouth targets (no tongue or head rotations)? Or maybe I should use this setup? https://crazyminnowstudio.com/docs/...mg/OneClickVisemes_reallusionCC3-newFront.png

    Which one would give the best results?
     
    Last edited: Aug 24, 2019
  16. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    I'd say it depends on your artstyle, and the fidelity of your models…. I'd definitely want more than just the visemes to animate a face (eyebrows and eyelids perform most emotions).

    If I were starting from scratch I'd probably aim for FACS blendshapes, like in iPhoneX's ARKit: https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation
    (visual examples for each blendshape are excellent)

    FACS is a system based on the actual muscles in a human face, and originally developed to work with people who cannot understand the emotions behind facial expressions. (That original purpose turned out to be not-so-precise – people make expressions with their faces that don't always match a specific emotion, and other people have strong emotions without moving their faces.)
     
  17. skinwalker

    skinwalker

    Joined:
    Apr 10, 2015
    Posts:
    509
    Thanks for the suggestion. I see a lot of blendshapes and have no idea which ones to choose; obviously I can't implement all of them. The character we are making now is a human female, but the same setup will be used on a male later.
     
  18. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello,
    @wetcircuit offers excellent advice as she always does! We also have our requirements and recommendations (based on what we do for our OneClick implementations) documented here:
    https://crazyminnowstudio.com/docs/salsa-lip-sync/modules/overview/#requirements

    You are, of course, free to choose your own path and style. We also have a few videos released on version 2, with more to come soon!

    Hope that helps!
    Darrin
     
    skinwalker likes this.
  19. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    So, targeting SALSA's strengths:
    go with the 7 or 8 visemes (Darrin beat me to reply, haha)

    Eyebrows "in" towards the center
    Eyebrows up
    Eyebrows down

    top eyelid and bottom eyelid separately
    wide eyed

    nose and cheek scrunch

    then the mouth corners – FACS has so many of these and I don't think they're all that different, so the idea is to be able to move each corner of the mouth:
    "up" like a smile
    "out" like the mouth is stretched
    "down" like a frown
    "in" like the mouth is compressed

    If you do half the face you should be able to mirror them to the other side. You'll want left and right blendshapes for each.

    All these base blendshapes can be combined to make "emotes" for SALSA (basically: blendshape combos as expressions), which SALSA will dial in randomly as the character speaks.
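
    If it helps to picture it, an emote is just several of those base shapes dialed in at once. A quick Unity-side sketch of the "blendshape combo" idea (the shape names here are made up, use whatever your mesh actually exports):

    using UnityEngine;

    // Illustration of "blendshape combos as expressions" -- shape names are made up.
    public class EmotePreviewSketch : MonoBehaviour
    {
        public SkinnedMeshRenderer face;

        public void PreviewSmirk()
        {
            SetShape("mouthCornerUp_L", 80f);
            SetShape("browUp_L", 40f);
            SetShape("cheekScrunch_L", 25f);
        }

        void SetShape(string shapeName, float weight) // weight is 0-100 in Unity
        {
            int index = face.sharedMesh.GetBlendShapeIndex(shapeName);
            if (index >= 0) face.SetBlendShapeWeight(index, weight);
        }
    }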

    There is a book called STOP STARING, very helpful.
    https://www.amazon.com/Stop-Staring-Facial-Modeling-Animation/dp/0470609907
     
    Last edited: Aug 24, 2019
  20. skinwalker

    skinwalker

    Joined:
    Apr 10, 2015
    Posts:
    509
    Thanks for the help. Now I want to ask: since I exported my character from Character Creator 3, did some heavy modification, and then lost all of the morphs, should I merge the teeth and tongue into the main character mesh, or can SALSA use multiple skinned meshes to make those face expressions?
     
  21. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Also, if you are modeling your mesh in something like Blender and you use the mirror modifier on the mesh, always apply the modifier prior to creating your blendshapes. If you create blendshapes first (on the mirror) you will not be able to apply the mirror later and when you export, you will lose your blendshapes. I'm not sure how this works in Maya, etc. but I would assume it is similar.

    Here's a playlist of 3 videos we did a while back -- ShapeKeys (blendshapes) in Blender.
     
  22. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    The short answer is 'yes'.

    I would recommend taking a peek at the documentation we have. A lot of your questions should be answered there.
    https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/features/
    https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/faq/
    https://crazyminnowstudio.com/docs/salsa-lip-sync/
     
  23. Deckard_89

    Deckard_89

    Joined:
    Feb 4, 2016
    Posts:
    316
    Ok, it's a weird one then. Even if Head processing is not enabled, something still seems to interfere with the head. The thing is, I have Final IK's "Look At" IK on my character, and before I added SALSA it worked fine, but now the character's head seems to twist at an angle, like it's always trying to look behind him. It might be because of the new objects SALSA added to the bone structure:
    "Head_OffsetAxis" and "Head_FixedAxis"
    Do you think these may be confusing Final IK into thinking they are the transform of the head bone?

    The only way I can fix it is to put the "Clamp Head Weight" (Final IK setting) all the way to 1, but then I am losing the nice bit of subtle head movement that I had before. It's not a game-breaker of course, but it was a good thing I had going.
     
  24. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    It sounds like you're using one of our OneClick setups. These apply a coordinate fix to the bone automatically, and that orientation fix is likely still in place. If you don't want SALSA to control the head, relink the head bone, click the "Restore Axis" button to remove the corrective hierarchy, and then set the head configuration to none.

    See the eyes manual on Fix Axis / Restore Axis
    https://crazyminnowstudio.com/docs/salsa-lip-sync/modules/eyes/using/
     
  25. Deckard_89

    Deckard_89

    Joined:
    Feb 4, 2016
    Posts:
    316
    Yeah, that seems to have fixed it. Thank you!
     
    Crazy-Minnow-Studio likes this.
  26. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Check out our new SALSA LipSync Suite (v2) tutorial series. Subscribe to our YouTube channel for updates as we release additional tutorials.

     
  27. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Can anybody give me a clue why my UMAs are not moving their lips?

    The OneClick was applied successfully to my UMA that already exists in the scene, the folder UMA/Examples/Expression Examples exists (as the small UMA docs say it should), I added an AudioSource (which is playing the speech file), and I added an ExpressionPlayer... still no lip movement.

    Thanks :)
     
  28. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    200
    I was able to set it up with saySmall, sayMedium, and sayLarge. When I add more visemes, the mouth doesn't move at all. How do I use blendshapes like a, o, mbp, L, fv?

    The example in the intro video on the Asset Store page has multiple mouth movements, and I thought that example would be available, but only the box head is available.

    Are there any advanced examples or something?
     
  29. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Sometimes all we need is to sleep on it -- I think I know what the problem is. Since your characters are already in-scene, they're likely not firing the character-created event our UepDriver component is waiting for. I'll see about an option to bypass this requirement.

    Darrin
     
  30. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    The early tech-demo video you are referring to uses an MCS model. Morph3D no longer supports MCS, so we also pulled support for the model system. You are free to use MCS if you like, but you would have to rely on community support for it.

    As for your question(s) -- boxHead is currently the only 3D model we supply. If you would like to try a different model with more visemes applied, you can try one of the supported model systems and use a OneClick on it. We obviously cannot provide these models ourselves (since they are not ours), but any of them would serve as a demo and several are available for free, such as the Emotiguy model from DAZ -- featured in our new videos. Speaking of videos, if you haven't already watched them, we recently published several tutorial videos (with more on the way) that will jump-start you on your how-do-I question.

    Hope that helps,
    Darrin
     
  31. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    200
    I have my own model with multiple blendshapes like a, o, mbp, l, fv. I didn't use MCA or MCS or whatever. I just want to use my own blendshapes so it works like it did in that video.
     
  32. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Did you reset the trigger values using the Curve or Linear option after adding more shapes? See part 2 of the tutorial series on our YouTube channel, also linked a few posts above.
     
  33. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    @Firlefanz73, I've posted UMA OneClick 2.1.0 -- I am flying blind here because I don't have your configuration. On Start, if the UmaUepDriver doesn't see a DCA, it will simply look for an ExpressionPlayer. It assumes you have already properly configured your ExpressionPlayer, since there is no data to receive from the DCA character-created event, so make sure your ExpressionPlayer is configured. It should just work now. Previews will still be created on the substitute model -- that is just the way it is set up to work. Again, flying blind here, so you will have to test this and see if it works for your dynamic and non-dynamic characters.

    Thanks,
    Darrin
     
  34. namdo

    namdo

    Joined:
    Feb 23, 2015
    Posts:
    200
    I got it working. It was Adventure Creator's Shapeable component that was freezing the blendshapes.
     
  35. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Thanks a lot! I downloaded and imported the latest UMA OneClick.

    Now when starting the scene with the UMA already included, I get this error message:

    NullReferenceException: Object reference not set to an instance of an object
    CrazyMinnow.SALSA.BoneController..ctor (UnityEngine.Transform bone, CrazyMinnow.SALSA.TformBase min, CrazyMinnow.SALSA.TformBase max, System.Boolean applyPos, System.Boolean applyRot, System.Boolean applyScale, System.Single frac) (at <a566912c183740c7bc79407306aa1ba8>:0)
    CrazyMinnow.SALSA.SalsaUtil.RuntimeUpdateExpressionController (CrazyMinnow.SALSA.Expression expData) (at <a566912c183740c7bc79407306aa1ba8>:0)
    CrazyMinnow.SALSA.Eyes.UpdateRuntimeExpressionControllers (System.Collections.Generic.List`1[CrazyMinnow.SALSA.EyesExpression]& expressions) (at <a566912c183740c7bc79407306aa1ba8>:0)
    CrazyMinnow.SALSA.OneClicks.OneClickUmaDcsEyes.ConfigureHead (UnityEngine.GameObject umaGO) (at Assets/Crazy Minnow Studio/Addons/OneClickUMA/OneClickUmaDcsEyes.cs:55)
    CrazyMinnow.SALSA.OneClicks.UmaUepDriver.InitVars () (at Assets/Crazy Minnow Studio/Addons/OneClickUMA/UmaUepDriver.cs:127)
    CrazyMinnow.SALSA.OneClicks.UmaUepDriver.Start () (at Assets/Crazy Minnow Studio/Addons/OneClickUMA/UmaUepDriver.cs:37)

    My UMA has an AudioSource and an ExpressionPlayer, and it looks okay as far as I can see...

    (screenshot attached)

    Thanks!
     
  36. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Trying to debug:

    (screenshot attached)
     
  37. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Question: are you using standard UMA for this non-dynamic character? Or is it a product of UMA Power Tools or some other 3rd-party system? We were unaware UMA could produce non-runtime characters. We are not familiar with Power Tools (or other 3rd-party systems), so we likely can't support them. It appears that Power Tools probably creates a new skeletal structure, but again, we are completely unfamiliar with it.
     
  38. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    I am not using a 3rd-party system, and I am using UMA DCS.
     
  39. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Okay, please create a clean and minimal Unity test project (not your entire project) that clearly demonstrates the issue you're having, and we'll be happy to take a look for any SALSA-specific issues. There are too many variables with UMA characters for us to try to diagnose the issue from a few screenshots.

    Thanks!
     
    Firlefanz73 likes this.
  40. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    I will send you a PM now, thanks for helping!
     
  41. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    UMA OneClick v2.1.3 released.
    Updated to support manual setups in addition to the dynamic UMA GettingStarted method.

    Steps to implement:
    1. Same as previous, apply the OneClick to your avatar root.
    2. On the UmaUepDriver, disable (uncheck) the "UMA Character is Dynamic" option.
    3. Add a UMAExpressionPlayer to your avatar and configure it with these parameters:
      • UMAExpressionPlayer.expressionSet = yourExpressionSet;
      • UMAExpressionPlayer.umaData = yourUmaData;
      • UMAExpressionPlayer.Initialize();
    4. Then call our UEP driver's manual start function, passing a reference to the UMAExpressionPlayer you just configured (see the sketch below):
      • UmaUepDriver.ManualStart(yourUMAExpressionPlayer);
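
    Put together, a manual setup might look roughly like this. How you obtain the expression set and UMAData references is up to your project -- those, and the exact UMA namespaces, are assumptions in this sketch:

    using UnityEngine;
    using UMA;                         // UMAData, UMAExpressionSet (namespaces may differ per UMA version)
    using UMA.PoseTools;               // UMAExpressionPlayer
    using CrazyMinnow.SALSA.OneClicks; // UmaUepDriver

    public class ManualUmaUepSetup : MonoBehaviour
    {
        public UmaUepDriver uepDriver;               // added by the OneClick, "UMA Character is Dynamic" unchecked
        public UMAExpressionPlayer expressionPlayer; // on your avatar
        public UMAExpressionSet yourExpressionSet;   // however your project provides it
        public UMAData yourUmaData;                  // however your project provides it

        void Start()
        {
            // Step 3: configure the expression player.
            expressionPlayer.expressionSet = yourExpressionSet;
            expressionPlayer.umaData = yourUmaData;
            expressionPlayer.Initialize();

            // Step 4: hand the configured player to the UEP driver.
            uepDriver.ManualStart(expressionPlayer);
        }
    }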

    Enjoy!
     
    wetcircuit and Firlefanz73 like this.
  42. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,316
    Thanks a lot for supporting and guiding me; I've got the new SALSA LipSync working in my game now.
    And another one will follow later; I want to have talking UMAs there too :)
     
  43. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello! We have released another video in our tutorial series, discussing AudioSource and external analysis input into SALSA. Enjoy!
     
    wetcircuit likes this.
  44. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    SALSA LipSync Suite v2.1.1 is now available. This is a small release with a couple of bug fixes and some usability enhancements. We also laid the framework for some CC3 OneClick changes. Check out the Release Notes for more info.
     
  45. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Hello everyone! We have released another tutorial video for SALSA LipSync Suite v2 covering how SALSA and EmoteR work together to create emphasis emotes. Enjoy!

     
    wetcircuit likes this.
  46. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    Great videos! I've been trying the Text to Speech script on webGL. I missed lots of settings.
     
  47. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    Hey Crazy Minnows!

    I'm trying to work out a procedural hand-gesture thing, something similar to what's shown in this video:


    I think I can do this with a set of short mecanim animations, but I need a way to trigger them. I've tried a few things, like trying to get which emote is running (by name), buuuuut that doesn't seem to be panning out. I've done a few isTalking functions, but triggering from EmoteR might be better for several reasons….

    How about an Emote that sends a Unity event? (or better for me, a Playmaker event?)

    (screenshot attached)

    Or maybe you have another suggestion?
     
    Ony likes this.
  48. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    1,399
    Ha! You saw that video too, huh? Mike just shared that with me a couple of days ago -- pretty cool stuff. I have been considering some method to trigger animations (there are so many ways to do it). My current thought is to link an ExpressionController to an Animator and provide controls for setting an animator bool parameter true/false (on/off) depending on whether the controller has reached max/min. Something like this should inherently work within Playmaker as well.
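
    In the meantime, a stopgap along these lines would give you a mecanim/Playmaker hook today -- it simply polls the dialogue AudioSource as a stand-in for a proper ExpressionController/isTalking hook (the parameter name is just whatever you define in your Animator Controller):

    using UnityEngine;

    // Stopgap sketch only: drives an Animator bool from whether the dialogue
    // AudioSource is playing, standing in for a real ExpressionController hook.
    public class TalkingToAnimator : MonoBehaviour
    {
        public AudioSource dialogueSource;         // the source SALSA is analyzing
        public Animator animator;
        public string boolParameter = "IsTalking"; // define this in your Animator Controller

        void Update()
        {
            bool talking = dialogueSource != null && dialogueSource.isPlaying;
            animator.SetBool(boolParameter, talking);
        }
    }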

    We've got a few new features coming in the upcoming SALSA Suite v2.2.0 release, one being an option to exclude emotes from the hold-time variations you suggested a while back. I'll play with this a little and see if I can squeeze it in.

    Thanks!
    Darrin
     
    Ony and wetcircuit like this.
  49. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    Mecanim trigger would probably be more precise...

    ...but Event trigger could be routed through some other logic first.

    I would control a bank of randomized animations, and adjust with an IK-hand bobble that moves around so the gesture wouldn't move exactly the same every time it fires….

    Also, I figure I'd want frequent facial emotes but infrequent gestures… so 1 emote for all hand gestures might work for me (as opposed to 20 mecanims each as an Emote….)

    idk really… just thinking out loud. lol
     
    Crazy-Minnow-Studio and Ony like this.
  50. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,409
    Also, a Playmaker listener for isTalking would be wonderful.... all the Playmaker actions set parameters in SALSA; there are no actions to listen for when SALSA is SALSAing.

    Currently there's an issue on iOS where Playmaker loses its Get/Set GameObjects, so it's a minor issue doing generic API grabs off components….
     
    Crazy-Minnow-Studio likes this.