[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
In Mecanim. In the editor, it adds a new state machine containing the gestures to your animator controller, so they can be triggered by LipSync at the right points in the lipsync animation.
     
  2. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8
    First off, great asset. It works great with our game and I am very happy I purchased it.

Quick question: as of now, is there any way of starting LipSync on an object at a certain time and ending it at a certain time during playback? E.g. start the LipSync 2 seconds in and end it 20 seconds in?

    Thank you for your time
     
  3. binarie

    binarie

    Joined:
    Dec 18, 2012
    Posts:
    8
    Hi, I'm getting the error below and AutoSync isn't working

    'SAPI error: Phoneme label not found in phoneme mapper'

Is this a Windows 10 issue with sapi.exe? I tried setting it to run in Admin mode with Windows XP/Vista/7/8 compatibility, but none of them made any difference.
     
  4. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
No, there isn't - bit of an oversight on my part! I hadn't thought about it, but I can see how useful that could be. It shouldn't be too difficult to implement, so I'll put it into 0.5 (out in 2 or 3 weeks).
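For anyone wanting to wire this up once the feature lands, a minimal sketch of the requested start/end window might look like the following. This is purely illustrative: the RogoDigital.Lipsync namespace, the PlayFromTime signature and the Stop(bool) call are assumptions, not the shipped API.

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace is an assumption

// Hypothetical sketch: start a LipSync clip part-way in, then stop it
// again once the desired window has elapsed.
public class TimedDialogue : MonoBehaviour
{
    public LipSync lipSync;   // the LipSync component on the character
    public LipSyncData clip;  // the processed .asset clip
    public float startTime = 2f;
    public float endTime = 20f;

    void Start()
    {
        // Assumed PlayFromTime(clip, time) method, as planned for 0.5.
        lipSync.PlayFromTime(clip, startTime);
        // Schedule the stop once the window has played out.
        Invoke("StopPlayback", endTime - startTime);
    }

    void StopPlayback()
    {
        lipSync.Stop(true); // Stop signature is an assumption
    }
}
```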

That's a known issue. It's nothing to do with Windows 10 as far as I can tell, but the speech API seems to fail on a small percentage of computers (including my main one!). When this happens you get the error you mention, and I haven't managed to find any solution for it yet.

I'm replacing the AutoSync system completely for LipSync 1.0, which I'm aiming to release in March, or April at the latest; this will solve these issues and bring it to Mac OS X too. In the meantime, you could try AutoSync on another PC if you have access to one, or add the phonemes manually in the editor (which often isn't anywhere near as time-consuming as it sounds at first!).
     
  5. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8

    Awesome, looking forward to it!
     
  6. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8
Also, at the moment any clip that I haven't previously turned into a LipSync .asset doesn't play audio within LipSync.

I have tried deleting Assets\Rogo Digital and re-downloading, but nothing worked. I got this glitch after exporting and importing an .xml file from another LipSync .asset.

Would it be possible for you to send me an earlier version of LipSync, or are you aware of any solution?

    Best Regards.
     
  7. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    This is really strange - I've tried recreating it, but I wasn't able to. Could you PM or email me a project that recreates the problem? What Unity version are you using?
I don't think an alternate version of LipSync would really help. Are you getting any errors or warnings in the console?

    I'll keep looking at it, but right now I'm not sure what could be causing it!
     
  8. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8
I'm using 4.6.9. I'll try to PM you - the project is quite large, so I'll copy the project and strip it down until I've isolated the problem.
     
  9. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
If sending it isn't an option, step-by-step instructions on how to recreate the error would be just as good! I've tried what I think you're saying you did, but in 4.6 this time (I had been using 5.3), and I still couldn't find the problem.
     
  10. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8
I've found a workaround: I just open Unity and Audacity side by side and use Audacity to navigate the audio clip.
Hopefully a later update will fix the glitch!

    Thanks.
     
  11. gameorchard

    gameorchard

    Joined:
    Mar 19, 2015
    Posts:
    8
I started up Unity today and it simply worked - it was probably just an issue with Unity.
     
  12. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    How weird! Oh well, if it happens again let me know, as there is one change to the code (specific to Unity 4.6) that could potentially fix it.
     
  13. MisfitMafia

    MisfitMafia

    Joined:
    Mar 14, 2013
    Posts:
    9
    Hey there,

We're using LipSync, and it works pretty well. Is it intended to only work with WAV data? We imported MP3s and it simply didn't work, but I noticed that a stand-alone SAPI program didn't work with them either.

Also, it doesn't seem to play back audio sources that are marked as streamed. The plugin throws an exception in LipSync.LoadData, at line 1155, when it tries to get the samples out of the audio source. However, the plugin doesn't currently use that data anywhere, and simply commenting out that line allows it to work. Is this just an oversight, or are you planning on using that data during playback?
     
  14. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi MisfitMafia,
If you're referring to AutoSync, then yes. The SAPI application that does the phoneme detection only works with WAVs. This will be changing in the near future, though! Version 1.0 will be released in a couple of months' time, and will feature a new version of AutoSync that works with MP3s (and will be Mac compatible too).

    Thanks for pointing that out! Yes, that code being there is an oversight. It's part of a feature that will alter the animation based on the volume of the audio (to differentiate between whispering and shouting), but wasn't finished in time for the last update. I obviously missed that when stripping the code for it out!

    When that feature is released, there will be an option to turn it off to allow streamed audio to work.
     
  15. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Update time!

    I've decided to split this version into two, so that I can get some of the features out sooner. The new 0.5 is finished, and I'm submitting it to the store tomorrow. Here's the changelog for this version:

    Features
    - New BlendSystem base class: facilitates adding compatibility with other assets/workflows instead of blendshapes.
    - Default BlendShapeBlendSystem class. (for using blend shapes - identical to LipSync pre-0.5)
    - Added [BlendSystemButton] attribute for marking buttons in the LipSync editor.
    - Added Export option in the Clip Editor. (Exports a self-contained .unitypackage for transferring between Unity versions)
    - Added PlayFromTime method.
    - Extensions Window for downloading 3rd party asset integrations. Find in the Window/Rogo Digital menu.
    - Improved presets system: Presets can now go in any folder named "Presets". The Presets dropdown also now takes subfolders into account.

    Fixes
    - Clip Editor now behaves correctly when dragging markers while zoomed in.
    - Emotion marker handles now re-appear when two emotions are separated by resizing.

    Changes
    - Removed outdated ExampleGUI.cs and XMLExampleGUI.cs.
    - Reduced the minimum emotion duration in the Clip Editor.
    - Added icons for presets and delete buttons in the LipSync editor.
    - Cleaned up emotion marker graphics to scale better.
    - Made left and right eye transforms optional in Eye Controller.

If you've been following this thread, you might notice a couple of absences from that list - most notably the new runtime and the gestures system. These have both been moved back to 0.6, which I've pencilled in for release on the 5th of February. That will be the final beta version before 1.0 (in March), which will feature AutoSync 2.0 - with OS X compatibility!
     
  16. MisfitMafia

    MisfitMafia

    Joined:
    Mar 14, 2013
    Posts:
    9
    Thanks for your quick reply, that cleared up some stuff.

    We've discovered a couple of other issues that we've kind of hacked a fix for, but it might be something you want to consider. Right now, the plugin assumes all responsibility for starting and stopping audio sources. This doesn't necessarily play well with third-party sound libraries, though. We have a specialized sound manager that is responsible for starting the sounds, and want the lip syncing to just match those sounds.

    Like I said, we kind of hacked a fix for this (made a new play function that takes an AudioSource, and removed some dependencies on the audioclip from LateUpdate), but it might be something you wish to consider.

One other issue we ran into: the plugin does its own timing for lip syncing instead of using the current play position of the audio source. Is there a reason you did it this way, rather than using the current position in the sound (audioSource.time) to compute your normalized sound position?
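The timing approach described here can be sketched in a few lines: read the normalized position straight off the AudioSource each frame, so the animation can never drift from the audio. This is a generic Unity sketch, not LipSync's actual code.

```csharp
using UnityEngine;

// Derive the normalized playback position (0..1) from the AudioSource
// itself rather than from a separately accumulated timer.
public class AudioDrivenTimer : MonoBehaviour
{
    public AudioSource source;

    public float NormalizedTime
    {
        get
        {
            if (source == null || source.clip == null || source.clip.length <= 0f)
                return 0f;
            // audioSource.time is the current play position in seconds.
            return Mathf.Clamp01(source.time / source.clip.length);
        }
    }
}
```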
     
  17. MisfitMafia

    MisfitMafia

    Joined:
    Mar 14, 2013
    Posts:
    9
    One final thing, is there a plan or would it be difficult to handle looping audio?
     
  18. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    No problem :)
    Good point. I originally made it to act as a sort of replacement for the AudioSource, in that you would do all the interaction with the dialogue audio through LipSync (which is why the LipSyncData asset has the AudioClip reference built-in). As with most things though, this doesn't work with everyone's workflow!

    Both of these things (and how the plugin works re: AudioSource playback) are part of the reason I'm completely rewriting the runtime again for 0.6. LipSync has evolved quite a lot since it was first launched last March, and the way I decided to do things then hasn't always turned out to be the best way of doing them.
    The new runtime system works completely differently - it's tied to audio playback instead of governing it, and allows for seeking, rewinding etc like a standard animation would.

Looping audio would be possible with the current version, but it would be difficult. You'd need to reset the timer, the current phoneme/emotion variables and a few other things when the loop happens. If it's not a super important feature, you might be better off waiting for 0.6 to be released (mid-March at the latest).
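A rough sketch of the loop-reset idea along those lines: detect the AudioSource wrapping around and restart the lip sync from the beginning. The LipSync method names and namespace here are assumptions, not the real API.

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace is an assumption

// Hypothetical workaround for looping with the current version: when the
// AudioSource's play position jumps backwards, the loop point was passed,
// so stop LipSync and play the clip again from the start.
public class LoopingLipSync : MonoBehaviour
{
    public LipSync lipSync;
    public LipSyncData clip;
    public AudioSource source;

    private float lastTime;

    void Update()
    {
        if (source.time < lastTime)
        {
            lipSync.Stop(false); // assumed signature
            lipSync.Play(clip);  // assumed signature
        }
        lastTime = source.time;
    }
}
```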
     
  19. MisfitMafia

    MisfitMafia

    Joined:
    Mar 14, 2013
    Posts:
    9
    Thanks again for the quick reply.

    It was fairly easy to modify to use existing AudioSources, at least for our purposes. And for looping, just Stop()ing and restarting the lip syncing seemed to be sufficient; again, for our purposes.

    If you could work in a way to use existing audio sources in future versions, though, that would make at least our lives a little easier in the future ;) I'm sure with the popularity of third-party audio managers out there, it might be a useful feature.
     
  20. Rolsct

    Rolsct

    Joined:
    Sep 8, 2014
    Posts:
    1
Hi!
I bought LipSync a little while ago, and I've finally started playing with it!
It's really a good tool, even in its beta state!
However, I've got one question that's quite crucial for me: I'm developing a tool where all the text will be in French.
Have you planned to adapt the list of phonemes and the AutoSync tool to the French language?

    Thx !
     
  21. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Good, good!
    Yeah, this is already in place for the next version!


    Thanks!
    I'll be honest, I hadn't. Forgive me if I'm being ignorant, but am I right in thinking that French uses mostly similar sounds to English? It should be possible to get a decent approximation of lipsyncing for French with LipSync at the moment. AutoSync may not work well though.

Foreign language support is an interesting topic, though. The upcoming AutoSync 2 uses a library called CMU Sphinx, which has support for many other languages through language models, so this is something I could integrate easily enough. Languages with sounds very different from English might be more difficult, though, as they may need a different set of phoneme poses. It's something I'd like to add, but it's definitely a feature for after version 1.0!
     
  22. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    In other news, the new runtime for beta 0.6 has been coming along much more smoothly than I expected! It's already looking much better (in my opinion).

    Check it out for yourself here: Webplayer Comparison.

It's a completely different approach from the old system, using animation curves that are precomputed when the file is loaded. This means that (on average) it runs faster, allocates less memory, doesn't trigger the garbage collector once it's playing, and can be sampled at any time - allowing for a much more accurate PlayFromTime method and finally making a real-time preview in the Clip Editor possible! I'll post a video at some point this week showing how that works.
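As an illustration of the precomputed-curve idea (not LipSync's actual source), Unity's AnimationCurve can be baked once at load time and then sampled at any point without allocating, which is what makes cheap seeking and real-time preview possible:

```csharp
using UnityEngine;

// Bake a phoneme weight into an AnimationCurve once, then sample it at
// any normalized time. Evaluate() allocates nothing, so repeated
// sampling never triggers the garbage collector.
public class CurveSamplingDemo : MonoBehaviour
{
    private AnimationCurve mouthOpen;

    void Awake()
    {
        // Precomputed when the data is loaded (illustrative keyframes).
        mouthOpen = new AnimationCurve(
            new Keyframe(0.0f, 0f),
            new Keyframe(0.5f, 1f),
            new Keyframe(1.0f, 0f));
    }

    public float Sample(float normalizedTime)
    {
        return mouthOpen.Evaluate(normalizedTime);
    }
}
```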
     
    KuzKuz, gurayg and wetcircuit like this.
  23. Danirey

    Danirey

    Joined:
    Apr 3, 2013
    Posts:
    548
    Hi,

I'm new to facial blendshapes. Sorry if this is a bit obvious, but I don't know how this works. The question is: I'm using Mixamo Fuse to create my characters. To use blendshapes, do I need to create them in a 3D editor, or does your asset provide some premade blendshapes to use out of the box? Again, sorry if the question is too basic.

    Cheers!
     
  24. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi Danirey,

    No problem! It depends on how you get your model out of Fuse. The easiest way is to use the Animate button (might be labelled differently in the new Adobe version) that uploads the model to Mixamo's auto-rigger. On the auto-rigger page there is a checkbox you can tick that will enable facial blendshapes on the model.

    LipSync has presets built-in for these Mixamo Fuse models, so if you do it this way, you won't even have to set up the poses manually in the LipSync editor!
     
    Danirey likes this.
  25. Danirey

    Danirey

    Joined:
    Apr 3, 2013
    Posts:
    548
    Hey, thanks!

I've been using Fuse and animating with Mixamo for quite a while, but I've never needed facial animations before. Thanks a lot for the info.


    Cheers
     
  26. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Beta 0.5 is now available from the Asset Store!

As always, if you own it you can download it for free from the downloads page.
    Please read the update guide before updating!!
    This update brings the minimum Unity version up to 5.0. This is due to the extra time involved in getting each release ready for 4.6, and the rapidly declining use of that version making it less and less worthwhile. If you really need the update for 4.6, let me know either here or by email.

    Here's the change list again for this version:

    Features
    - New BlendSystem base class: facilitates adding compatibility with other assets/workflows instead of blendshapes.
    - BlendshapeBlendSystem class. (for using blend shapes - identical to LipSync pre-0.5)
    - 2D support with SpriteSwapBlendSystem class.
    - Added [BlendSystemButton] attribute for marking buttons in the LipSync editor.
    - Added Export option in the Clip Editor. (Exports a self-contained .unitypackage for transferring between Unity versions)
    - Added PlayFromTime method.
    - Extensions Window for downloading 3rd party asset integrations. Find in the Window/Rogo Digital menu.
    - Improved presets system: Presets can now go in any folder named "Presets". The Presets dropdown also now takes subfolders into account.

    Fixes
    - Clip Editor now behaves correctly when dragging markers while zoomed in.
    - Emotion marker handles now re-appear when two emotions are separated by resizing.

    Changes
    - Removed outdated ExampleGUI.cs and XMLExampleGUI.cs.
    - Reduced the minimum emotion duration in the Clip Editor.
    - Added icons for presets and delete buttons in the LipSync editor.
    - Cleaned up emotion marker graphics to scale better.
    - Made left and right eye transforms optional in Eye Controller.
    - Dropped support for Unity 4.6 and lower.
     
  27. Mac77

    Mac77

    Joined:
    Jan 30, 2016
    Posts:
    4
    Hi,

    first of all thanks for this great addon. Keep up the good work.

I'm trying to use Eye Controller for my character, but unfortunately it seems like the axes of my eye bones are somehow not compatible with the axes altered by Eye Controller. The horizontal look has to be -90 to make my character look straight, and the vertical look rotates the pupils instead of moving them up and down. I think it's because I modelled the character in Blender, and that always causes problems with axes (in Blender, the Z axis is the up axis). I've been trying for two hours now to alter the eye bones in Blender to fit Eye Controller, but it's not working. It's quite hard to alter the rig on a fully animated character.

    So I'd like to make two feature requests:
    1. Would it be possible to implement an offset value for each horizontal and vertical look?
    2. Could you implement choosable bone axes for horizontal and vertical look?

    Cheers
    Marcus
     
  28. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi Marcus, thanks!

    Yes, I've actually just finished a big update for Eye Controller that's going to be available separately soon, but I'll still be including it with LipSync. There is an offset variable included in the system at the moment, but I can see how it wouldn't really help with the X rotation problem. The 0.6 update is maybe a week away, so I'll include the ability to change the axis in with that.
     
  29. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hey everyone, I've got a couple of announcements!

    First, LipSync 0.6 is very nearly finished, and will be submitted to the Asset Store tomorrow. This update includes the all-new runtime (Demo here), real-time animation preview in the clip editor, and the new gestures system for triggering mecanim animations at certain points during dialogue.

    I've put together a short video showing the editor preview in action:



    Second, Eye Controller 1.1 is being released on the Asset Store as a separate product. This probably isn't of too much interest to people in this thread, as a lot of you will already own LipSync, and the full version of Eye Controller will continue to be included with it. This update features a lot of improvements, including the ability to enable/disable certain features, compatibility with LipSync blend systems and more in-depth customisation of blinking and looking speed and duration. Here's the new editor:

    Eye Controller.PNG

    As I said, this product will be kept on par with the version included with LipSync (this update will be in with 0.6) - this is only to allow people to use it even if they don't need LipSync.

    I'll also be creating a whole new video tutorial series soon, as the existing one is really out of date now! Keep an eye on this thread for more info on that.
     
    KuzKuz and gurayg like this.
  30. Purpleshine84

    Purpleshine84

    Joined:
    Apr 8, 2013
    Posts:
    194
    Hi, I get an error with the new Update:
    Assets/Rogo Digital/Shared/Editor/ContinuationManager.cs(35,47): error CS1955: The member `RogoDigital.ContinuationManager.Job.ContinueWith' cannot be used as method or delegate

This is in an almost empty project. Could you help, @Rtyper? I also sent a personal message by email.

    Regards,
    Maurice

EDIT: Issue solved now with the help of Rhys from Rogo. There turned out to be a class in the root namespace called Action that Rogo also used. Commenting it out solved the problem.
     
    Last edited: Feb 3, 2016
  31. Mr-Oliv

    Mr-Oliv

    Joined:
    Sep 14, 2012
    Posts:
    33
    Hi! Any chance you'll be adding batch processing of sound files?
     
  32. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi! Yes, that's been suggested a couple of times. That'll be coming in version 1.0, when AutoSync 2 is finished. I'm expecting that to be early March at the moment. :)
     
  33. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    It's a couple of days late, but 0.6 has been submitted to the Asset Store for review. I ran into a few problems getting bone animation to work correctly with the new system, and because of that I've ended up having to delay the Gesture system again. It's close to completion, but I didn't want to push this version back any further.

    On the plus side, emotion markers with bones are implemented now, and Gestures will be coming in 0.61 in a week or so's time. We're pretty rapidly approaching 1.0!
     
  34. florianbepunkt

    florianbepunkt

    Joined:
    Nov 28, 2015
    Posts:
    45
    Hey @Rtyper,

    I just purchased your plugin and bookmarked this thread to follow development. Two questions:

    - Is there some sort of integration with the Cinema Director asset, or has someone already tried to use both assets together? I think they'd make a good couple.

    - As for Eye Controller: I really like the random eye movement, but I already use FinalIK to look at specific objects, triggered by code. Is there a way to keep the random eye movement (and possibly the auto target) running, but temporarily disable it when FinalIK kicks in? Ideally it would blend between Eye Controller and FinalIK seamlessly.

    Thanks,
    Florian
     
  35. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi Florian,

Yes, but it's not ready right now. It will be available to download from the Extensions Window in a few days.

You could disable random blinking from a script at the same time you enable the IK, which would probably be the simplest solution. If you want to use the auto target functionality as well, though, you'd probably be better off leaving look-at-target and auto target on all the time, and only using FinalIK on the head and neck etc. (I think FinalIK has some similar targeting functionality?) so that the eyes aren't affected.
    There's an update coming for Eye Controller soon that will transition back to random looking when using auto target and there's no target in range, so that should help too!
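A sketch of the "disable it from a script" suggestion: toggle Eye Controller's random behaviour off while FinalIK is driving the gaze, and back on afterwards. The EyeController field name (randomLook) and namespace are assumptions, not the real API.

```csharp
using UnityEngine;
using RogoDigital; // namespace is an assumption

// Hypothetical switch between Eye Controller's idle eye movement and an
// external IK system such as FinalIK.
public class EyeControllerIKSwitch : MonoBehaviour
{
    public EyeController eyes;

    // Call when FinalIK takes over the character's gaze.
    public void OnIKStarted()
    {
        eyes.randomLook = false; // field name is an assumption
    }

    // Call when FinalIK releases the gaze again.
    public void OnIKStopped()
    {
        eyes.randomLook = true;
    }
}
```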
     
  36. Stormy102

    Stormy102

    Joined:
    Jan 17, 2014
    Posts:
    495
Hm, this looks pretty cool - keeping an eye on this. Wondering if it would be possible to add runtime audio processing, e.g. for VoIP?
     
  37. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Thanks!

No, that wouldn't really be feasible for the way this works, but we have a separate product on the way (after LipSync 1.0 is released) called LipSync Live that will do just that! It'll support a lot of the same features as LipSync, but be completely real-time.
     
    Stormy102 likes this.
  38. Stormy102

    Stormy102

    Joined:
    Jan 17, 2014
    Posts:
    495
    Ok sounds cool. I'll definitely keep an eye out for it. Thanks Rtyper
     
  39. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Cinema Director and LipSync integration is now available!

    A package containing Cinema Director Events for all the basic functions in LipSync is now available to download from the Extensions Window.

    The next extension will be a Blend System for UMA 2, which will be ready before the end of February.
     
  40. pikifou

    pikifou

    Joined:
    Jan 6, 2015
    Posts:
    3
I've used the tool and I really enjoy it - congrats on the job, because it works like a charm.
Just a quick question. I'm making an adventure game without voice recording - there's too much text, so I won't record audio; I just want people's mouths to move. How can I use LipSync without audio? I think it's precomputed, so it should be possible to trick it (in AutoSync, maybe, there could be an option like "create from text only"). How would you do it?
     
  41. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi Pikifou,
    AutoSync is based on using audio, and the text is a secondary part of it - you can use it without text, but it's not possible without audio. I'll look into introducing some kind of audioless mode in the future, but at the moment a lot of LipSync relies on having an AudioClip present (for timing etc).
     
  42. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Good news, after a small issue regarding documentation, Eye Controller is now available from the Asset Store!

As I've stated before, owners of LipSync will get these improvements, and all future versions of Eye Controller will still be included with LipSync. There's no need to buy both - this just gives people who don't need LipSync access to Eye Controller.

    For everyone else, LipSync 0.6 is still pending review, and 0.61 (including Gestures) is very nearly finished, so you may actually end up just getting 0.61 instead... We'll see how that goes!
     
  43. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Phew, feels like a lot's happening at once here!

    LipSync 0.6 is now live - you can get the update now if you like, or wait about a week until 0.61 gets approved. Either way, after that we're on the home straight for version 1.0 in early March.

    A quick warning: If you're relying on the SpriteSwapBlendSystem at the moment, don't update to 0.6, as it will no longer work! Sprite Swapping will be coming back in 1.0 (or possibly before then via the extensions system), but 0.6 doesn't contain it, and trying to use the existing blend system will cause compilation errors. Sorry if this is an issue for you, but don't worry - there's not too long to wait until the new version is ready!

    As always, remember to uninstall the previous version of LipSync before downloading the current one!
     
  44. Casanuda

    Casanuda

    Joined:
    Dec 29, 2014
    Posts:
    53
    Hi. Great asset!

Any way to extract bone position/rotation from an animation file, rather than manually having to insert the values - for bone lip sync?

    Tnx
     
  45. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi Casanuda,
Not directly, I'm afraid. You could play your animation and copy the values from the transform at the right frame.
What do you need to do that for? There might be some other way around it!
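The "play the animation and copy the values" workaround can be semi-automated with Unity's legacy sampling API. This is only a sketch; the bone and time fields are placeholders for whatever your rig uses.

```csharp
using UnityEngine;

// Pose the hierarchy at a chosen time in the imported clip, then read
// the bone's local position/rotation off its transform so the values
// can be copied into a LipSync pose.
public class BonePoseSampler : MonoBehaviour
{
    public AnimationClip clip;  // the facial animation from the 3D app
    public Transform bone;      // e.g. the jaw bone
    public float sampleTime = 0.2f;

    [ContextMenu("Sample Pose")]
    void SamplePose()
    {
        // Applies the clip's state at sampleTime to this GameObject.
        clip.SampleAnimation(gameObject, sampleTime);
        Debug.Log("Position: " + bone.localPosition +
                  "  Rotation: " + bone.localEulerAngles);
    }
}
```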
     
  46. Casanuda

    Casanuda

    Joined:
    Dec 29, 2014
    Posts:
    53
No biggie... the facial animations are all set up in a 3rd-party program and can be exported as animations... it would just speed up the workflow back into Unity...

    Looking forward to auto sync for Mac!
     
  47. Mac77

    Mac77

    Joined:
    Jan 30, 2016
    Posts:
    4
    Hi,
what about constraints on the auto targeting? If the character faces away from a target that is inside the target range, the eyes roll to the back of the head because they still aim at the target, which looks really creepy. A maximum horizontal and vertical eye angle would help with this.

And another thing: how about a switch to turn on random looking while aiming at a target using auto target? At the moment auto-targeting disables the random eye movement, which is basically good, but an additional randomization of the eye movement would give it a more natural look, I think.

    EDIT: And one more: The line in the auto-target part
    float targetDistance = Mathf.Infinity;
    should be
    float targetDistance = autoTargetDistance;
    or else setting the target distance makes no sense.
     
    Last edited: Feb 14, 2016
  48. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Ah, I see. I'll put that on the list for a future update, it does sound useful. I'm already planning on improving the process of setting bone positions for version 1.0, because it is a bit too clunky at the moment!

    Yep, I'm putting out an update to Eye Controller soon that will add this - it'll use the same min/max rotations as random looking.
    I thought I'd already fixed the auto-target bug before I published it! Are you using the version of Eye Controller that comes with LipSync or the standalone version? Either way, I'll fix this in the next update too.

    You should still be able to change targetWeight when you're using auto target to get a blend between them like you want.
     
  49. Casanuda

    Casanuda

    Joined:
    Dec 29, 2014
    Posts:
    53
    That would be great! Thanks.
     
  50. AiryKai

    AiryKai

    Joined:
    Apr 16, 2014
    Posts:
    52
    Hi guys.

AutoSync does not work for my audio files, and I can't find the cause. Are there some requirements for the files?

Also, Emotions don't work. I configured them in the Clip Editor and set the emotion blendshapes, but they don't move.

    Any ideas?