SALSA Lipsync Suite - lip-sync, emote, head, eye, and eyelid control system.

Discussion in 'Assets and Asset Store' started by Crazy-Minnow-Studio, Apr 23, 2014.

  1. xibanya

    xibanya

    Joined:
    Nov 26, 2016
    Posts:
    10
    Thanks! Does writing to the "average" property trigger an event action delegate, or is it that the changes from writing to that are picked up on an update loop from the Salsa3d object?
     
  2. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    It's not an event, the average property is monitored and acted upon by SALSA during LateUpdate, when the associated AudioSource is playing.
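
    For reference, here's a minimal sketch of feeding that property from your own analysis code each frame. It follows the thread's usage of Salsa3D and salsa.average; the gain value and placeholder helper are just illustrative:

    Code (CSharp):
    using UnityEngine;
    using CrazyMinnow.SALSA;

    // Minimal sketch: feed SALSA an externally computed amplitude each frame.
    // SALSA reads the average value in LateUpdate while its linked AudioSource
    // is playing. The gain value and ComputeAmplitude helper are illustrative.
    public class ExternalAnalysisFeed : MonoBehaviour
    {
        public Salsa3D salsa;                       // the Salsa3D component on your character
        [Range(0f, 10f)] public float gain = 1f;    // scale factor, tune to taste

        void Update()
        {
            // Replace this with your own analysis value (FFT bin average, RMS, etc.).
            salsa.average = ComputeAmplitude() * gain;
        }

        float ComputeAmplitude() { return 0f; }     // placeholder for your analysis
    }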
     
  3. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,098
    WAAAAAH! It sounds AMAZING! I want it meow…! :cool:
     
  4. xibanya

    xibanya

    Joined:
    Nov 26, 2016
    Posts:
    10
    An update from my end: I got much greater accuracy when I grabbed the EventInstance's ChannelGroup rather than the master ChannelGroup. The problem is that FMOD's RuntimeManager runs async with Unity unless you set it otherwise, so my solution may be inelegant, but it works. Rather than adding the DSP to the master ChannelGroup as soon as the EventInstance is created, I put this at the top of the update loop.

    After that, it goes on to grab the pointers from the low level FMOD API as it did in the last code snippet I posted.

    Code (CSharp):
    if (!gotIt && ev != null)
    {
        FMOD.RESULT result = ev.getChannelGroup(out channelGroup);
        if (result == FMOD.RESULT.OK)
        {
            channelGroup.addDSP(FMOD.CHANNELCONTROL_DSP_INDEX.HEAD, fft);
            gotIt = true;
        }
    }
    One minor change: since I was grabbing ONLY the channelGroup I actually wanted, I was able to simplify my write to salsa.average to this:

    Code (CSharp):
    salsa.average = spectrum[0].Average() * mod;
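
    For anyone piecing this together later, here's a rough consolidated sketch of the whole approach (the "last code snippet" with the low-level pointer grab is in an earlier post). It assumes the FMOD Studio Unity integration of that era (RuntimeManager.LowlevelSystem; newer versions expose CoreSystem instead), an FFT DSP created with createDSPByType, and the field names (ev, fft, mod, gotIt) from the snippets above. Treat it as a sketch, not a drop-in.

    Code (CSharp):
    using System;
    using System.Linq;
    using System.Runtime.InteropServices;
    using UnityEngine;
    using CrazyMinnow.SALSA;

    public class FmodSalsaFeed : MonoBehaviour
    {
        public Salsa3D salsa;
        public float mod = 20f;                   // scale factor, tune to taste

        private FMOD.Studio.EventInstance ev;     // create/assign your EventInstance elsewhere
        private FMOD.ChannelGroup channelGroup;
        private FMOD.DSP fft;
        private bool gotIt;

        void Start()
        {
            // Create the FFT DSP up front. Newer FMOD versions expose this as
            // RuntimeManager.CoreSystem instead of LowlevelSystem.
            FMODUnity.RuntimeManager.LowlevelSystem.createDSPByType(FMOD.DSP_TYPE.FFT, out fft);
            fft.setParameterInt((int)FMOD.DSP_FFT.WINDOWSIZE, 1024);
        }

        void Update()
        {
            // Attach the DSP once the event's ChannelGroup exists (see above).
            // With newer wrappers where EventInstance is a struct, use ev.isValid() instead.
            if (!gotIt && ev != null)
            {
                if (ev.getChannelGroup(out channelGroup) == FMOD.RESULT.OK)
                {
                    channelGroup.addDSP(FMOD.CHANNELCONTROL_DSP_INDEX.HEAD, fft);
                    gotIt = true;
                }
            }
            if (!gotIt) return;

            // Pull the FFT data out of the DSP and hand an average to SALSA.
            IntPtr data;
            uint length;
            fft.getParameterData((int)FMOD.DSP_FFT.SPECTRUMDATA, out data, out length);
            var fftData = (FMOD.DSP_PARAMETER_FFT)Marshal.PtrToStructure(data, typeof(FMOD.DSP_PARAMETER_FFT));
            if (fftData.numchannels > 0)
            {
                float[][] spectrum = fftData.spectrum;
                salsa.average = spectrum[0].Average() * mod;
            }
        }
    }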
     
    Crazy-Minnow-Studio likes this.
  5. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Thanks a lot for posting this, it's a great help for anyone else using FMOD.

    Michael
     
  6. Klarax

    Klarax

    Joined:
    Dec 2, 2013
    Posts:
    17
    Can someone tell me how I can set the eye target at runtime in C#, please? Thanks.
     
  7. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    You can use the following method:

    RandomEyes3D.SetLookTarget(GameObject target)
    Set the look target for eye tracking.

    Code (CSharp):
    using UnityEngine;
    using CrazyMinnow.SALSA;

    public class SetLookTargetClean : MonoBehaviour
    {
        public RandomEyes3D randomEyes;
        public GameObject lookTarget;
        public bool set;
        public bool unset;

        void Update ()
        {
            if (set)
            {
                set = false;
                randomEyes.SetLookTarget(lookTarget);
            }

            if (unset)
            {
                unset = false;
                randomEyes.SetLookTarget(null);
            }
        }
    }
    For more information about the API, see the Classes, Methods, and Properties section of the manual.
    https://crazyminnowstudio.com/unity...randomeyes-manual/classes-methods-properties/

    Code examples are provided in the Code Examples section of the manual.
    https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/manuals/randomeyes-manual/code-examples/
     
  8. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
  9. wetcircuit

    wetcircuit

    Joined:
    Jul 17, 2012
    Posts:
    1,098
    The Mic update is A LOT faster!
    Thanks! :cool:
     
    Crazy-Minnow-Studio likes this.
  10. Klarax

    Klarax

    Joined:
    Dec 2, 2013
    Posts:
    17



    This did not work; the eyes continue to look wherever they want.

    In addition, the SetAudioClip code does nothing for the mouth either. The sound plays, but there is no movement.

    I can't get this package to work whatsoever outside of a test scene.
     
  11. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hello Klarax,

    Are you using SALSA with a custom character or one of the character systems that we support such as Fuse, Daz, or MCS?

    If you're using a 1-click setup with one of the character systems that we support, many of these setups use two instances of RandomEyes: one instance for facial expression, and the other instance for eye control. In this case, you will need to ensure that you are referencing the instance that has the option [Use custom shapes only] disabled. This is the instance that controls eye movement.

    For SALSA, if you are hearing the audio play but the mouth is not moving, then it may be that your audio source is not linked to SALSA in the audio source property.
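
    If it helps, here's a rough sketch of checking both of those things from script. The field names useCustomShapesOnly and audioSrc are assumptions based on the inspector labels, so treat this as a guide rather than drop-in code:

    Code (CSharp):
    using UnityEngine;
    using CrazyMinnow.SALSA;

    // Rough sketch only: find the RandomEyes3D instance that drives eye movement
    // (the one with [Use custom shapes only] disabled) and warn if SALSA has no
    // AudioSource linked. The field names useCustomShapesOnly and audioSrc are
    // assumptions based on the inspector labels; check the manual for your version.
    public class SalsaSetupCheck : MonoBehaviour
    {
        public Salsa3D salsa;
        private RandomEyes3D eyes;

        void Start()
        {
            foreach (RandomEyes3D re in GetComponents<RandomEyes3D>())
            {
                if (!re.useCustomShapesOnly)    // assumed field for the inspector option
                {
                    eyes = re;                  // this instance controls eye movement; use it for SetLookTarget, etc.
                    break;
                }
            }

            if (salsa != null && salsa.audioSrc == null)    // assumed field name
                Debug.LogWarning("SALSA has no AudioSource linked; the mouth will not move.");
        }
    }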

    Michael
     
  12. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    We've posted another SALSA 2.0 technology preview video. This time we used our new combined 2D/3D workflow, our unlimited sprite/texture/material/trigger capability (sprites for this test), and multi-image sprite sequences for each of the eight visemes used to achieve smoother 2D traditional animation. This workflow could also be used to add full lip-sync and emotion expression to low poly 3D characters using texture or material swapping. We think the results look great. Let us know what you think.

     
    Last edited: Jun 4, 2018
  13. FiveFingerStudios

    FiveFingerStudios

    Joined:
    Apr 22, 2016
    Posts:
    370
    I just downloaded the 1-click for iClone.

    I'm noticing that RandomEyes is making only the left eye blink; how can I fix this?

     
  14. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hello sbmhome,

    Thanks for the video, we are looking into this now and will get back to you soon.

    Michael
     
  15. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    I suspect iClone may have renamed the BlendShape. Please expand your character's hierarchy (in Unity's Hierarchy), select the body object, expand the BlendShapes list on the SkinnedMeshRenderer component, and let us know what the 3 blink BlendShapes are named (a quick way to list them is sketched below).

    On RL_G6 characters they are as follows:
    • Eyes_Blink$$RL_48$$
    • Eyes_Blink_L$$RL_49$$
    • Eyes_Blink_R$$RL_50$$
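
    If it's easier than reading them off the inspector, here's a small helper (standard Unity Mesh API) that logs the BlendShape names:

    Code (CSharp):
    using UnityEngine;

    // Quick helper: drop this on the body object (the one with the SkinnedMeshRenderer)
    // and it logs every BlendShape name so you can see what the blink shapes are called.
    public class ListBlendShapes : MonoBehaviour
    {
        void Start()
        {
            var smr = GetComponent<SkinnedMeshRenderer>();
            if (smr == null || smr.sharedMesh == null) return;

            Mesh mesh = smr.sharedMesh;
            for (int i = 0; i < mesh.blendShapeCount; i++)
                Debug.Log(i + ": " + mesh.GetBlendShapeName(i));
        }
    }
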
    Thanks,
    Michael
     
    Last edited: Jun 5, 2018
  16. FiveFingerStudios

    FiveFingerStudios

    Joined:
    Apr 22, 2016
    Posts:
    370
    Thanks for the quick reply.

    Ok, I looked at it, and it looks like an issue on my side. I tested a non-altered version of the iClone character and it's OK.

    I imported the iClone character into Blender to reduce the polygons, and it changed the names of the blendshapes to the ones below (it removed the $$ and everything in between):
    Eyes_Blink
    Eyes_Blink_L
    Eyes_Blink_R


    I've looked at the blendshapes and they work; they are even mapped in the RandomEyes component (when I click on "Override" within Random Eyes it works).

    It's just not getting activated when the character blinks.

    This puts me in a bit of a pickle for this character, as I don't want to redo everything in Blender. Is there a way to make sure the mapping is correct?
     
    Last edited: Jun 5, 2018
    Crazy-Minnow-Studio likes this.
  17. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    We can provide an easy fix for this. Our Daz 1-click setup provides a bit more flexibility to deal with version discrepancies from one Genesis model to the next by exposing a comma separated string of shape search names. We'll add a similar feature to the iClone 1-click setup. We'll search for the default names, and you can add any custom naming to the inspector.

    Michael
     
  18. FiveFingerStudios

    FiveFingerStudios

    Joined:
    Apr 22, 2016
    Posts:
    370
    Awesome....looking forward to this. Thanks.
     
  19. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
  20. FiveFingerStudios

    FiveFingerStudios

    Joined:
    Apr 22, 2016
    Posts:
    370
    Thanks....this helped and I can now see both eyes blink!

    One note: I had to take out the directive "using Slate" as my code wouldn't compile with it. I don't use Cinematic Sequencer SLATE so hopefully this won't be an issue.
     
  21. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Thanks for letting me know. The SLATE namespace isn't supposed to be there; it must have been a test that slipped by. We'll update the package.
     
  22. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,104
    Hello,

    I just upgraded my Project to Unity 2018.1.3f1.

    UMA itself is working so far, but my "Talking Heads" look strange when they include the SALSA integration.
    I changed nothing there; with Unity 2017 it worked perfectly.

    I upgraded my UMA to the latest develop build but the problem remains.

    I see an error message:

    Exception: Bone not found.
    UMA.UMASkeleton.GetRotation (Int32 nameHash) (at Assets/UMA/Core/StandardAssets/UMA/Scripts/UMASkeleton.cs:615)
    UMA.PoseTools.UMAExpressionPlayer.Update () (at Assets/UMA/Core/StandardAssets/UMA/Scripts/UMAExpressionPlayer.cs:88)

    Could you have a look at that please? I am using your UMA Integration. Or should I post this in the UMA Forum?

    Thanks a lot!
    I hope to get it working soon; I still love SALSA for my "Talking Heads" :)
     
  23. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hi Firlefanz73,

    We're sorry you're experiencing this issue, but thanks for letting us know. We'll test our UMA integration with 2018.1.3f1 and the latest UMA DCS release to see if we can reproduce, and correct any issues that may be related to our workflow. Please give us time to investigate and get back to you.

    Thanks,
    Michael
     
    Firlefanz73 likes this.
  24. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,104
    Thanks a lot!
     
  25. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hi,

    We have tested our UMA workflow with the following software versions, but we were not able to recreate the issue you presented.

    Tested using:
    • Unity 2018.1.3f1
    • SALSA 1.5.5
    • UMA DCS add-on for SALSA 1.6.0
    It's possible that the issue occurs because of the separated head mesh, but that's just a guess. That's where I would begin troubleshooting: first test with a normal full-body mesh to see if the problem persists. Any additional details you can provide that might help us recreate the issue would also be helpful.

    Michael
     
  26. Firlefanz73

    Firlefanz73

    Joined:
    Apr 2, 2015
    Posts:
    1,104
    Okay, thanks a lot for testing!

    I will try to figure it out myself this weekend.
     
  27. The-Creature-Queen

    The-Creature-Queen

    Joined:
    Jan 19, 2017
    Posts:
    24
    Sorry if it's been asked; the thread is huge and search just returns everything.

    I use Fuse characters. Mixamo no longer auto-rigs the face. Will this work with the current state of auto-rigging (everything but the face)? Otherwise, what would you recommend rigging the face with to work with your product? I can rig well in Maya, at least, but I want the most painless "I don't have to keyframe animate this" integration with Unity.
     
  28. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hello Wolfride,

    SALSA is a blendshape-based system. It is designed to be flexible enough to work with many character designs and character generation systems, but the characters must have sufficient facial blendshapes to work (UMA is the only bone-based system that is an exception to this rule). If you are creating your own blendshapes, you can use our manual as a guide to create blendshapes for the best SALSA results.
    https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/manuals/salsa-manual/

    We also have a 3-part series on creating blendshapes in Blender.


    Here you can find links to all of the assets and character systems that SALSA has been tested with.
    https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/features/

    There are a number of different ways to set up SALSA depending on your character: basic setup using just the inspectors, 1-click setups for the most popular character generation systems, and our SalsaSync add-on for complex custom character setups.

    Michael
     
  29. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
  30. gregb

    gregb

    Joined:
    Feb 6, 2013
    Posts:
    2
    Hello,

    I have an issue with Random Eyes (2D).
    I am using Unity 2017.3.1f1 and Vuforia (augmented reality SDK).
    I want my character to look at the camera (the AR camera, to which I attached an empty GameObject). Everything works well in Scene view and the eyes follow the target, but in Game view the eyes don't follow the target.
    Any clues or ideas?

    Edit: same issue with Random Eyes 3D... it works in Scene view but not in Game view. Frustrating...

    Edit 2: same issue when I test the RandomEyes2D_QuickStart scene. If I move the camera, or the target (empty GameObject) that is a child of the camera, nothing happens in Game view. But if I move the target (empty GameObject) when it is not a child of anything, it works in Game view. Weird...

    Thanks in advance.
     
    Last edited: Jun 21, 2018
  31. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hi gregb,

    Your second edit gave us a clue about what might be happening; we'll look into this and get back to you. Please send your SALSA invoice number to assetsupport@crazyminnow.com.

    Thanks,
    Michael
     
  32. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hi gregb,

    We tested using RandomEyes2D on a 2D character in a 3D scene: we tracked a target in 3D space, then a child target in 3D space (moving only the child, and then moving the parent to drag the child along), and in each test the character's eyes tracked in the appropriate direction in 2D space (i.e. up, down, left, right).

    Are you able to send us a test scene that can reproduce the symptom, along with your SALSA invoice number?

    assetsupport@crazyminnow.com

    Thanks!
     
  33. gregb

    gregb

    Joined:
    Feb 6, 2013
    Posts:
    2
    Of course, I will send it to you in a few minutes.
     
  34. IsntCo

    IsntCo

    Joined:
    Apr 12, 2014
    Posts:
    141
    I work on games with low-poly 3D and flat textures on faces. It looks like v2.0 will provide the ability to lip-sync the face of a 3D object with texture swaps? When will this be available to use?
     
  35. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hello Appminis,

    Thanks for your interest in SALSA 2.0. We don't have a release date yet, but we are making great progress. We post our latest dev updates here.

    http://www.crazyminnowstudio.com/posts/salsa-v2-devblog/

    Thanks,
    Michael
     
  36. BoxxyBrown

    BoxxyBrown

    Joined:
    Jul 2, 2014
    Posts:
    4
    Hey guys. First, great asset! Super excited for 2.0.

    I had some trouble getting the lip sync to work at all for some MCS characters of mine. Come to find out, it was because I have the audio set to 5.1 surround for my project. If I set the audio source to 2D, lip sync works great; if I set it back to 3D, nothing happens (presumably because the audio source, if directly in front of my player, is playing through the center channel only). Any workarounds for this?

    Also on the subject of MCS, I've had an issue where the linked blendshapes seem to de-link at some point. I had this problem a while back, and I'm just getting back into this project that's using SALSA, so I'll report back if I can reproduce the problem. I also seem to recall that RandomEyes would work normally for a couple of playtests, and at some point the characters' cheeks would "blink." Haha.
     
  37. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hi BoxxyBrown,

    We're glad you like SALSA. We're really excited for 2.0 as well, and getting closer every day.

    SALSA 1.x can only process audio heard by the AudioListener, so the further an AudioSource is from the AudioListener, the weaker the signal available for processing becomes. This is resolved in 2.0 by using an all-new algorithm to process audio data. If you have situations in 1.x where you need the speaker to be far away from the AudioListener, but you still want SALSA to process lipsync, the current recommended workaround is to use two AudioSources. The first AudioSource has nothing to do with SALSA and uses spatial audio however you need to use it; the second AudioSource is set to 2D, is linked to SALSA for processing, and has its output routed through a muted mixer group (attenuation turned all the way down). Turning the attenuation down on a mixer group mutes the audible output but allows the audio data to continue to flow. In this scenario, you would then set the audio clip for each AudioSource and call AudioSource.Play() on each at the same time. We know this is a bit clunky, which is why we fixed it in 2.0, but it's our recommended approach for 1.x.
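
    Here's a minimal sketch of that two-AudioSource setup (field names are illustrative; the muted mixer group itself is configured in the editor and assigned to the second AudioSource's output):

    Code (CSharp):
    using UnityEngine;
    using CrazyMinnow.SALSA;

    // Minimal sketch of the 1.x workaround above: one spatialized AudioSource the
    // player hears, plus a silent 2D AudioSource that SALSA analyzes. The silent
    // source's output is routed (in the editor) to a mixer group with attenuation
    // turned all the way down. Component/field names are illustrative.
    public class DualSourceSpeech : MonoBehaviour
    {
        public AudioSource worldSource;   // 3D/spatial source the player hears
        public AudioSource salsaSource;   // 2D source linked to SALSA, muted via mixer group
        public Salsa3D salsa;             // SALSA component referencing salsaSource

        public void Say(AudioClip clip)
        {
            worldSource.clip = clip;
            salsaSource.clip = clip;

            // Start both at the same time so the lipsync stays aligned with the audio.
            worldSource.Play();
            salsaSource.Play();
        }
    }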

    For MCS characters, it's important to understand how they implemented the ability to activate/deactivate BlendShapes, how the indexes are assigned, and how and when those indexes can change. They use the Unity API to add BlendShapes to an FBX mesh when you activate a BlendShape in the inspector. Upon activation, the BlendShape is added to the end of the SkinnedMeshRenderer BlendShape array and assigned the next incremental index. However, since they are writing these changes to an FBX file and not a Unity asset file, the BlendShapes are automatically wiped out when you close and reopen your project and Unity detects changes in the mesh and re-imports it.

    After you re-open your project, MCS re-activates the previously activated BlendShapes, but in a predetermined order, not the order in which you activated them. This can result in the BlendShape indexes changing from the order you selected to the order they've assigned. This is what happens when you activate/deactivate in the editor, not at run-time, and then close/open your project. If you are activating at run-time, this does not apply, since there isn't an opportunity for a character re-import and BlendShape index reshuffling. This is the reason we added the option to our CM_MCSsync inspector to remap indexes.

    I hope that helps to clarify, and makes implementation easier to deal with.

    Michael
     
  38. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
    Hi @Crazy-Minnow-Studio, in v2 will it be possible to add expression blendshapes to parts of lip-synced dialogue? For example, allowing eyebrows to be raised for surprised questions, frowns added for angry statements, etc. I am also assuming that Timeline support will remain as per v1?
     
  39. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Hello! v2 has a new emphasis system that allows for automated responsive emotes. Unfortunately, there really is no way for the system to automagically know what an emotive response should be. However, just as with v1, emotes can be triggered manually. We are considering the ability to respond to Watson Tone Analysis, which may produce this sort of response -- but this is not set in stone.

    Most of the existing add-ons for SALSA will work out-of-the-box for SALSA v2, or will require very few modifications. Timeline included. v2 will likely have increased Timeline support, possibly including Emoter functionality.

    Hope that helps!
    --Darrin
     
  40. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
    Sorry, yes, I did mean manually triggering which emote will go where during the dialogue. I hadn't realised v1 can already do this; is there a tutorial showing how this works? Also, is this something that can be set inside Timeline using the editor, or is it only triggered via script?
     
  41. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Currently only callable in script. There is demo syntax code in the online manual. https://crazyminnowstudio.com/unity-3d/lip-sync-salsa/manuals/randomeyes-manual/code-examples/

    And we do have a video demonstrating it all:

     
  42. jeffsarge

    jeffsarge

    Joined:
    Jan 3, 2014
    Posts:
    24
    I was wondering if the Salsa microphone live input was able to record audio to a file.
    Thanks,
    Jeff
     
  43. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    It does not, live use only.
     
  44. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
    Great, thank you for that. I have also watched the tutorial video, and in the first part with the conversing boxheads, to add emotes to this timeline would I simply create an animation track and add animation events as required? Would these events then simply add the emotes on top of the lip syncing, or overwrite it? I am just trying to work out the best way of sequencing this in a visual way, if that makes sense.
     
  45. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    I haven't tested using Timeline in that manner; it may be possible. I seem to remember reading something about animation events not firing properly in Timeline, but that may be something else entirely (or, if it was, it may be fixed by now).

    If the blendshapes are different from the shapes used in lipsync, they should simply blend together. However, if your "emotes" use or include shapes that are used in lip-sync, there will be contention and the shapes will fight each other. SALSA v2.0 will gracefully allow conflicting shapes in an override mode where lipsync will win if shapes conflict. Another way to get around this is with something like our MorphMixer asset that allows you to create new blendshapes within the Unity editor by copying or mixing multiple shapes together to create a new shape. As long as the shapes are different, they should blend smoothly together.
     
  46. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
    Ah, you're completely right about Timeline and animation events; I just looked into this, and even up until recently there isn't official support yet. What a shame; I would've thought that using Timeline would make a sequence such as this much easier, but it is actually more painful. It looks like the only current solution is to use empty GameObjects with scripts that get enabled at certain points in the timeline and have their code executed on enable, or else use one of the third-party assets trying to solve this issue. Thank you for letting me know.
     
    Crazy-Minnow-Studio likes this.
  47. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    Yes, it seems like something that is sorely missing in the Timeline framework. I took a look at this last night and I believe we have a solution that seems pretty nice and will convert very easily to SALSA 2.0 when it is released. It needs some cleanup and a little more testing, but we should have it published later this evening or tomorrow at the latest. The existing AudioClip Timeline event will be rolled together with this custom shape event into a SALSA core set of Timeline types.

    We will post here when it is available.
     
    Prefab likes this.
  48. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
    The updated Timeline package has been uploaded and the blog post updated to reflect the changes included. The new package is now called "Unity Timeline SALSA Core" to reflect the nature of the package containing functionality core to the SALSA product. It includes the AudioClip and Emote track/clip types for Timeline.

    The TextSync Timeline package remains separate to avoid name conflicts when the TextSync package is not being used.

    Blog post: https://crazyminnowstudio.com/posts/unity-timeline-for-salsa-lip-sync/

    Enjoy!
    --Darrin
     
    Prefab likes this.
  49. Prefab

    Prefab

    Joined:
    May 14, 2013
    Posts:
    42
    This sounds pretty awesome! Would it be possible to see a short video of exactly how this is put into practice?
     
  50. Crazy-Minnow-Studio

    Crazy-Minnow-Studio

    Joined:
    Mar 22, 2014
    Posts:
    921
     
    Prefab likes this.