[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    Ok, I have installed the latest Audacity (2.1.1), recorded some audio, and then exported it using the same settings as per the screenshot above. I am still getting the same results unfortunately.

    Could you possibly document the procedure that you use, or advise what else I should check/compare?
     
  2. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    There is the Quickstart Guide PDF, but you're right, it should be more obvious. I'll put together more complete documentation for the next major update.

    I'm sorry you're still having problems! Sure, I'll record a video of the process.

    Looking at that screenshot you posted, it looks like the audio file isn't being passed to sapi_lipsync.exe correctly, which is odd... Can you go into AutoSync Windows/Editor/AutoSync.cs and add this:

    Code (CSharp):
    UnityEngine.Debug.Log(audioPath);
    at line 39, then tell me what comes up in the console when you press the AutoSync button again?
     
  3. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    Ok I tried that, please see below:
    upload_2015-11-9_20-58-39.png
    But unfortunately nothing appeared in the console. I tried it with multiple audio files, and even the Gettysburg one (which is actually working OK), but nothing printed to the console. Are you sure that's the correct line to put it on?
     
  4. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Ah, no, sorry! I was looking at the wrong version of the file. Put it on line 41, just before Process process = new Process();
     
  5. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    OK, I've worked it out: I need to place my audio files inside the Examples/Audio folder (where the sample audio is), and then AutoSync works perfectly. I didn't know that this was the case. Is there any possibility that this location can be set manually?
     
  6. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    There shouldn't be anything special about that folder, unless it's the issue with spaces in directory names (which is a problem with SAPI, and doesn't always seem to crop up). Could you try making sure the names of the folders one or two levels above the audio file don't have spaces? That's the only thing I can think of that could set the example audio folder apart from your own.
     
    Hamesh81 likes this.
  7. AndreInfante

    AndreInfante

    Joined:
    Oct 26, 2013
    Posts:
    5
    Hey Rtyper! Having a bit of an issue with this. I have a blend-shape controller that handles facial expressions, and it fights with lip-sync in a really ugly way. Is there a way to temporarily stop the plugin from messing with the blend shapes of the face while the facial expression is turned on? I tried to use your system for playing back emotions, but the minimum duration is much too long (I want to do a quick smile, then a quick frown).

    Grateful for any help,
    Andre
     
  8. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    You're right, the original folder had a space in it. I thought only the audio file names couldn't have spaces. Thanks for that. ;)

    Also, in relation to AndreInfante's comment above, is there a way to make the emotions very short, for example for blink blend shapes? At the moment the minimum is still too long for realistic blinks. It could also be useful to have a separate layer on top of phonemes and emotions for facial expressions such as blinks, tongue pokes, etc.
     
  9. nahie

    nahie

    Joined:
    Aug 3, 2013
    Posts:
    4
    Thanks for making this great product! I've had the same issues everyone else had with AutoSync, but I think I got it mostly working with the fixes suggested in the forum: no spaces in the file name or the folder path to the audio file. At first I thought it still wasn't working, but it seems that Unity isn't showing the markers on the timeline when you use AutoSync. The markers are actually there - you can see this by moving the mouse over where they would be, and the tooltip appears - but still no marker. This is the error I'm getting in the console:

    null texture passed to GUI.DrawTexture
    UnityEngine.GUI:DrawTexture(Rect, Texture)
    LipSyncClipSetup:OnModalGUI() (at Assets/Rogo Digital/LipSync/Editor/LipSyncClipSetup.cs:406)
    RogoDigital.ModalParent:OnGUI() (at Assets/Rogo Digital/Shared/Editor/ModalParent.cs:24)
    UnityEditor.DockArea:OnGUI()

    That pops up thousands of times (it hits the Unity limit of 999+) when loading LipSync. I'm using Unity 5.2.2 because some other assets I have seem to require it, so I'm thinking maybe that's the issue?
     
  10. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    LipSync won't be doing anything at all unless an animation is playing. If you want it to continue playing without actually affecting the blend shapes, your best bet is to add a public boolean variable to LipSync.cs that you change from your blend shape controller, then wrap the code in LipSync that controls blend shapes (the blocks at around line 370 for phonemes and around line 530 for emotions; have a look at the comments to see what does what) in an if statement so it does nothing when your boolean is set.
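    As a rough sketch, the change described above might look like this (the variable name and surrounding structure are invented for illustration; the real blocks in LipSync.cs will differ):

```csharp
// Hypothetical sketch only - "animateBlendShapes" is an invented name,
// and the method/line layout of the real LipSync.cs will differ.
public bool animateBlendShapes = true; // toggled from your blend shape controller

void LateUpdate () {
    // ... existing playback/timing logic ...

    if (animateBlendShapes) {
        // existing block that applies phoneme blend shape weights
        // (around line 370 in LipSync.cs)
    }

    if (animateBlendShapes) {
        // existing block that applies emotion blend shape weights
        // (around line 530 in LipSync.cs)
    }
}
```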

    You're both right, that minimum duration is way too long, it was set like that to stay readable/keep the handles usable, but especially now with the zoomable timeline, that's not really relevant. I'll reduce that minimum duration massively. All the same, I wouldn't recommend using emotions for blinks - you usually don't want blinking to be tied to specific points in audio, and it could be a hassle to add all the markers. LipSync includes another component called Eye Controller, which handles randomised blinking, and controls eye look targets (either random or looking at a transform). I'd advise using this, as it works with most setups LipSync works with and can add life to characters while saving you a lot of time!

    Do the markers appear if you save the LipSyncData file then reopen it? Also, are you on the latest version of LipSync (0.4)? The line the error's coming from is to do with drawing the timeline along the top of the waveform in 0.4, so I'm not sure why it would stop the markers from appearing...


    In general news, I uploaded 0.401 (meant to be the first of many smaller patches) to the asset store about a week ago, and it's still not been approved. The store doesn't give publishers any ability to get small fixes out quickly, which is unfortunate. At this rate, the next update will be submitted before the previous one gets published!
     
    Hamesh81 likes this.
  11. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    I have had a look at the Eye Controller component, but I can't see how you would use it for blink blend shapes. It seems to be set up for bone-based blinking only.
     
    Last edited: Nov 11, 2015
  12. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    No, in fact it only supports blendshape blinking! Have a look at the Lincoln example, it uses Eye Controller for that. The Left and Right Eye Transform fields are for look targets. The Left and Right Blink Blendshape dropdowns appear once you add a main body mesh.
     
  13. nahie

    nahie

    Joined:
    Aug 3, 2013
    Posts:
    4
    I'm using 0.4. I downloaded a fresh copy from the asset store to make sure, and started a new Unity project and imported it to check. It seems to work fine at first but if you save and close the project, then re-open it, it has the problem. It also has 22 "red" error messages at the top of the console when re-opening the project:

    Recursive Serialization is not supported. You can't dereference a PPtr while loading. (Constructors of C# classes may not load objects either eg. EditorGUIUtility.TextContent should be moved to OnEnable. See stacktrace.)
    UnityEngine.Resources:Load(String)
    LipSyncClipSetup:.ctor()
    UnityEditorInternal.InternalEditorUtility:LoadSerializedFileAndForget(String)
    UnityEditor.WindowLayout:LoadWindowLayout(String, Boolean)

    It does seem related to the markers because when I mouse over them, more of the error messages appear.
     
  14. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    OK, I managed to use the Eye Controller script, but it keeps triggering errors if I don't have eye transforms set. I have added a boolean that executes the eye transform parts of the code only when true, which allows the script to work without needing to assign eye transforms. However, I can't see my public boolean variable in the inspector. Is there something else I need to change to make it show up in the inspector?
     
  15. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Ah, yes - it uses a custom inspector, sorry about that. I'll make the eye transforms optional in the next update. In the meantime, instead of using a boolean in your if statement, it'd probably be easier to check whether lefteye and righteye are null; that saves you a bit of work.
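    A minimal sketch of that null check (the field names follow the suggestion in this post and may not match the actual script exactly):

```csharp
// Hypothetical sketch - guard the look-target code so it's skipped
// when no eye transforms are assigned in the inspector.
if (lefteye != null && righteye != null) {
    // existing code that rotates the eye transforms toward the look target
}
```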
     
  16. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Sorry, I missed your post! I've been trying but I haven't been able to recreate this yet - I think I know what could be causing it though, can you give me your email address so I can send a replacement file to you to test? (or if you don't want to post it publicly, you can send it to contact@rogodigital.com)
     
  17. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    The == null suggestion is a good idea, I will do that. But, is there any way that the Eye Controller script can be "reset" to use the default Unity inspector? There are other variables which I would like to make public so that it is easier to adjust the blink rate, blink speed etc from the inspector. Is this possible?
     
  18. olix4242

    olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,962
    I have a serious problem: I'm using blendshapes to customise my avatar, but when using LipSync, all my blendshape values reset to 0 and all customisations are gone. Is it possible to not influence/reset blendshapes that aren't in the phonemes or emotions lists? I think this feature is absolutely essential.
     
    Last edited: Nov 16, 2015
  19. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Sorry, I was away over the weekend, only just getting around to replying to posts/emails etc!

    Beta 0.401 was finally approved to the Asset Store today, so you can download the update now. It's quite a minor one, just fixes a few issues related to importing the package into some projects. I was planning on having it out as a small update almost 2 weeks ago, but unfortunately it took much longer to get approved than I expected!

    Because of this, and a number of other issues people have reported, I'm changing plans slightly - I'm putting updates on hold temporarily, and working on a new version of the runtime component. The current version of the component was created for alpha 0.2, and has had extra bits bolted on since, and as such is nowhere near as stable as it should be. A lot of the bugs that have turned up are incredibly difficult to track down because of how complicated it's got.

    As I'm working on this part-time, more or less single-handedly, this will probably delay new features and such by a couple of months, but I feel it's important for LipSync to be really robust and reliable before the full 1.0 release.

    Yes, if you need to you can either delete, or just comment out the contents of LipSync/Editor/EyeControllerEditor.cs.

    I know, that's an issue with the current version. I'm working on a fix for it (and a few other similar issues that require some blendshapes that ARE in the lists to be left alone when they're not needed) at the moment. It's turned out to be more difficult than I first thought to get good looking results from it.
     
  20. Hamesh81

    Hamesh81

    Joined:
    Mar 9, 2012
    Posts:
    405
    Great, that worked, thank you. One more question, please: when using the "Audio + Text" option for AutoSync, is there any particular way I need to input the text into the textbox? This may seem a silly question, but when I pasted my paragraphs into the textbox and hit process, the blendshape markers were only generated for the first half of the audio. When I use the audio-only method I have no issues, so I'm not sure what happened there. Could you explain the workflow for this, please?
     
  21. olix4242

    olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,962
    Wouldn't it be possible to add a list with blendshapes to exclude?
     
  22. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    That's strange... though I have to admit, I'm not 100% certain what format the text is expected in. This may just be one of the problems with SAPI. I'm sorry, I know it seems like avoiding the issue, but SAPI_lipsync.exe is basically a black box, and I don't really have the ability to fix problems caused by it. You could try using a text file instead of the textbox and see if that makes any difference, but really, the only real solution at the moment is to wait until the new version of AutoSync is ready!

    Yes, this would work as a fix for your issue, but really in the long-term the whole system needs to just not affect any blendshapes it doesn't currently need. That's what the new version that I'm working on now will do.

    If you can, it shouldn't be too difficult to add such a list yourself: add an array of ints in LipSync.cs, and check against that list in the loop where the blendshapes are set - if a shape's index is in the list, just skip it. You'd also need to expose the array in the custom inspector, LipSyncEditor.cs. If not, I can put a patch together for you, but it'll take a few days as I'm fairly busy with work at the moment!
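    A minimal sketch of that exclusion list, assuming a loop over blend shape indices (the array name and loop shape are invented; the real code in LipSync.cs will differ):

```csharp
// Hypothetical sketch only - names are invented for illustration.
public int[] excludedBlendShapes = new int[0]; // expose this in LipSyncEditor.cs too

// Inside the loop in LipSync.cs where blend shape weights are applied:
for (int shape = 0; shape < blendShapeCount; shape++) {
    if (System.Array.IndexOf(excludedBlendShapes, shape) >= 0)
        continue; // this shape is excluded - leave its weight alone
    // ... existing code that sets the weight for this shape ...
}
```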
     
    olix4242 likes this.
  23. olix4242

    olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,962
    Few days is ok for me ;)
     
  24. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
  25. Eitrius

    Eitrius

    Joined:
    Dec 30, 2013
    Posts:
    19
    I cannot get autosync to work anymore. Last week it would get about a third of the way through the progress bar then either skip the rest and not do anything, or it would completely freeze Unity. I downloaded the update for Lipsync today and now it skips through it and won't place any markers.
     
  26. movra

    movra

    Joined:
    Feb 16, 2013
    Posts:
    566
    What's your source file? AutoSync wouldn't work with an Ogg Vorbis audio clip for me, but it did after I externally converted the clip to a WAV file.
     
  27. Eitrius

    Eitrius

    Joined:
    Dec 30, 2013
    Posts:
    19
    I'm using MP3s, and I tried a WAV file. Neither one will work. Anyone have any ideas?
     
  28. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    How are you exporting the audio file? It needs to be uncompressed PCM audio in a .wav file to work, and (usually) can't have spaces in the file name or path. (Limitations of the phoneme detection software, sorry, they will be fixed when the new version of AutoSync is done.)
     
  29. Eitrius

    Eitrius

    Joined:
    Dec 30, 2013
    Posts:
    19
    Tried a PCM .wav in Unity and that still isn't working. I'm not the one exporting the audio, so I will check on that tomorrow and see if it's something on that end.
     
  30. BortStudios

    BortStudios

    Joined:
    May 1, 2012
    Posts:
    48
    Will this asset work with 2D models? Specifically, I actually want to have a sprite on a 3D model for the eyes / mouth etc. Can I set it to do a sprite swap for each phoneme? Otherwise, I assume there is some way to animate a mouth sprite into different positions and put it into the program. But I would like to do sprite swapping.
     
  31. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Are you getting any errors or warnings in the console when you start AutoSync?

    Sorry, but LipSync only works with 3D models at the moment. It uses blend shapes (or bones, with more limited support) for animation, so sprites aren't supported at all. 2D support is planned for the final release, but I can't say for sure when it'll be out just yet.
     
  32. BortStudios

    BortStudios

    Joined:
    May 1, 2012
    Posts:
    48
    Oh man, that's disappointing. Excited for the final release, though
     
  33. Ecnelis

    Ecnelis

    Joined:
    Jun 19, 2013
    Posts:
    2
    Hey, any idea when you plan on moving over to CMUSphinx? I've been messing with phonemes for this very purpose using their libraries, and then stumbled across this project. If it's not too far off, I'll likely abandon my adaptation as this already has some nice features included. :)
     
  34. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    It might be a couple of months away, it's turned out to be a little harder than I first thought, and with Christmas etc approaching I won't have too much time to work on LipSync this month.

    Work is progressing quite well on the rest of the update though - the new blend systems have been implemented, and I'll be writing a couple to come with LipSync as standard (one for blend shapes, one for UMA characters). For the time being, bone animation will probably stay built-in to LipSync instead of having its own blend system, because I know mixing blend shapes and bones can be useful. This may change later though.
     
  35. Ecnelis

    Ecnelis

    Joined:
    Jun 19, 2013
    Posts:
    2
    Naturally. Thank you for the response.
     
  36. jndaghir

    jndaghir

    Joined:
    Dec 19, 2015
    Posts:
    2
    I love Lipsync, thanks for making such a great product! Has anyone had issues loading XML files? I wanted to use the Autosync feature, so I ran a Windows partition on my Mac. I exported the phoneme-sync file as an XML file to try to open it on my Mac version of Unity. The XML file will open in the Autosync editor, but when it says to find an audio file, all .wav files are grayed out/unable to be selected. Has anyone made an Autosync file on a Windows PC and then tried to open it on a Mac?
     
  37. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    I haven't, but I will try! That sounds strange, when you say they're greyed out, do you mean in the standard file picker dialog that comes up after clicking File > Import XML in the LipSync Clip Editor?

    EDIT: Yes, Just tried it on my Macbook and seen that... I'll have a look into it!
     
    Last edited: Dec 19, 2015
  38. jndaghir

    jndaghir

    Joined:
    Dec 19, 2015
    Posts:
    2
    Wow, thanks for your quick reply! I tried using both the XML file and the regular data file, neither of them would work if made on a PC and then opened up on a Mac. Let me know if you figure it out; I'll keep looking at it too!
     
  39. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    I think I've worked it out - in the OnXMLImport() method in LipSyncClipSetup.cs, there's this line:
    Code (CSharp):
    string audioPath = EditorUtility.OpenFilePanel("Load AudioClip" , "Assets"+lastLoad , "wav;*.mp3;*.ogg");
    If the last part of that (the extension filter) is changed to contain only a single extension, then it works fine (for that extension). It looks like Windows correctly interprets it as multiple extensions, but OS X interprets it as a single extension (one that obviously doesn't exist!). Unity doesn't have a reliable way to allow multiple extensions like this across platforms, though. For now, if you edit that string to contain only the single extension (without the . or *) that your audio file uses, it should work.
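    For example, restricting the filter to WAV files only (just the last argument changes):

```csharp
// Workaround: a single-extension filter that OS X parses correctly.
string audioPath = EditorUtility.OpenFilePanel("Load AudioClip", "Assets" + lastLoad, "wav");
```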
     
  40. OlliIllustrator

    OlliIllustrator

    Joined:
    Nov 1, 2013
    Posts:
    71
    Hi,

    Thank you for your product.
    My preferred way of preparing facial animation is a jaw bone for mouth opening, with the rest of the shapes controlled by blendshapes.
    I tested the new version of LipSync with one of my characters and had problems because the jaw bone (which was animated in the animation clip) was not overridden by the LipSync animation - only the shape animation came through.
    I wrote about that at the end of your survey a couple of minutes ago. I just wanted to let you know that the problem is gone, and that LipSync does its job beautifully now and overrides the jaw in the animation clip.
    I am not really sure what the cause was; I recreated the Avatar mask I used for the animations (in exactly the same way it was set up), and since then everything runs as desired.
    So ignore my comment in the survey, and receive a big THANK YOU for this fantastic asset.
    It is a joy to work with and really adds wonderful possibilities to my work.

    Merry Christmas:)

    Oliver
     
  41. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi, @OlliIllustrator - thanks for the feedback! I'm not sure why it wouldn't have been overriding the animation correctly before (possibly you didn't have the animation/animator component linked with LipSync underneath the Use Bones checkbox?) but if it works now that's great!

    I'm taking a break from developing this (mostly) from now until the new year. I'll still be providing support through email as normal (except the 24th, 25th and 26th) if anyone has any issues.

    Have a Merry Christmas, everyone!
     
  42. giorgos_gs

    giorgos_gs

    Joined:
    Apr 23, 2014
    Posts:
    623
    Hi to all! I am thinking of buying this asset to use in my 2D adventure game. I know it doesn't support 2D, but I think I can tweak it via the bone-based animation and use it to animate the mouth, FFD, etc. Can this be done? Also, can you add 'scale' to the transform, and an option to enable/disable a GameObject, so that this asset can support 2D as well? It would be great, since it seems the best lip sync asset! Thanks...
     
  43. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Thanks!

    Probably, but as you pointed out, the bone system doesn't support scale at the moment, so you might not get the best results out of it. The next update has a pretty big change to how bones/blend shapes etc work, and it will come with 2D support then. I'm aiming to release it at the end of January.
     
  44. giorgos_gs

    giorgos_gs

    Joined:
    Apr 23, 2014
    Posts:
    623
    This is great news, I hope you also add enable and disable gameobject into it. Also another good feature for 2d is to add a way to input custom scripts in it, so that we can do everything else!
     
    Last edited: Dec 30, 2015
  45. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Yeah, this is pretty much the core change - all the code for affecting the mesh/sprite/whatever is moved to its own component called a Blend System, which is integrated with LipSync. I'll be creating some more in-depth documentation for it before it's released, but in theory you should be able to write a blend system that will let LipSync work with almost any kind of 2D or 3D character.
     
    giorgos_gs likes this.
  46. giorgos_gs

    giorgos_gs

    Joined:
    Apr 23, 2014
    Posts:
    623
    Will the new features also work with Mecanim, or only custom animation?
     
  47. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    They'll work similarly to the current system, where LipSync's animation is in addition to Unity's, so you can have your standard body animations in mecanim/legacy animation and LipSync running on top of that for facial animation.
     
  48. giorgos_gs

    giorgos_gs

    Joined:
    Apr 23, 2014
    Posts:
    623
    I ask because, in order to do a nice lipsync animation, you need the mouth, the emotions, and the movement of the hands or neck, which give emphasis to the emotions. That is why maybe the new system could interact with Mecanim? I cannot imagine how... How can I have my standard animation and run emotion animations (on hands, neck) on top of my current one? This is a headache...
     
    Last edited: Jan 4, 2016
  49. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Yeah, previously LipSync could only do mouth and face animation, but 0.5 also has the gestures track, where you can add triggers for animations to LipSync files alongside phonemes and emotions.

    In the current version, you could create a mecanim animation controller with a trigger that you set at the same time as starting the LipSync animation, but timing it right would be pretty difficult.
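    A hedged sketch of that workaround (the LipSync Play call and the trigger name are assumptions for illustration, not the documented API):

```csharp
// Hypothetical sketch - method and trigger names are assumptions.
public LipSync lipSync;   // the LipSync component on the character
public Animator animator; // the Mecanim animator with a "Gesture" trigger

public void PlayWithGesture (LipSyncData clip) {
    lipSync.Play(clip);             // start the facial animation
    animator.SetTrigger("Gesture"); // fire the matching body animation at the same time
}
```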
     
  50. giorgos_gs

    giorgos_gs

    Joined:
    Apr 23, 2014
    Posts:
    623
    So in the new version, how do these gesture tracks work? On top of Mecanim, or inside Mecanim?