[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    You only need to call one of these - DisplayEmotionPose is mainly for integration with other assets, and doesn't animate over time, so if you want to blend into one just call
    this.GetComponent<RogoDigital.Lipsync.LipSync>().SetEmotion("Smile", 4.0f);
    . No need for Time.deltaTime, as the duration value should be the time in seconds it will take to blend in.
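    For example, here's a minimal sketch of triggering that blend from your own script (assuming the component is on the same GameObject; the class name is just a placeholder):

    using UnityEngine;
    using RogoDigital.Lipsync;

    public class EmotionBlendExample : MonoBehaviour
    {
        void Start()
        {
            // Blends the "Smile" emotion in over 4 seconds - call it once, not every frame.
            GetComponent<LipSync>().SetEmotion("Smile", 4.0f);
        }
    }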

    That's the plan - I'll be making the Google Cloud feature available through the Extension Window after release, but the phoneme alignment part should work right from the start if you can provide a transcript yourself. I'm not sure if it will support Chinese characters or if it will need to be in Pinyin, but I'll make sure it's all in the documentation by the time it's out.
     
  2. Alvarezmd90

    Alvarezmd90

    Joined:
    Jul 21, 2016
    Posts:
    151
    I know I only have to use one of them, but the SetEmotion one doesn't work. DisplayEmotionPose works correctly, but SetEmotion causes the whole face to collapse. I have no clue why.
     
  3. CWatsonT2

    CWatsonT2

    Joined:
    Jan 9, 2019
    Posts:
    114
    Is there a tutorial for setting up the Eye Controller? I'm having some issues trying to get the eyes to track a target with a UMA character. What's strange is that I can get it to start working if I disable the Expression Player script that is created at runtime. I create the bone structure and then set the eye globes as the left and right look-at transform bones (I get errors if I don't have anything in there). Then when I disable the Expression Player, they start to work. What am I missing? The Eye Controller creates the Expression Player, so it must work with that.
     
  4. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    I'm guessing you're using bone transforms? I can't say I've come across this problem before, but I might have an idea about what's causing it. Is there any way you could send me a project folder (whether it's the original or a cut-down version) to reproduce the problem? You can use the email contact@rogodigital.com. I'll have a look over that method myself in the meantime and see if there's anything obviously wrong, but it might be a few more days as I'm focusing on getting the new 1.5 update out in the next week.

    No specific tutorial at the moment, but that sounds like the Expression Player is overriding the rotations of the bones itself. I think they may have changed it since I last updated the UMA integration. I'll look into fixing this myself, but before that happens, you might be able to get it working by using Unity's Script Execution Order. If you make sure EyeController.cs runs after the Default time (unless you see UMA Expression Player in that list, in which case it should run after that) then Eye Controller should take priority over UMA in setting the eye rotation.
     
  5. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    LipSync 1.5 has now been submitted to the Asset Store! It's out of my hands now, but hopefully it should be available in the next few days. Ordinarily, I'd make the new update available from the downloads section of the website, but I'm in the middle of updating how this system works, so I'm afraid you'll have to wait until it's available from the store.

    I'll be making a post very soon with the timescale for new modules, updates to existing ones, and additional language support, which will all be downloadable from the extension window to avoid requiring a whole new LipSync version.

    As always, here's the changelist for 1.5:
    Features
    • AutoSync 3
      • Modular system, allowing different tasks (phoneme detection, cleanup, etc) to be queued and executed in sequence
      • AutoSync Presets - contains a set of AS Modules + settings for each for easy access
      • Phoneme Mapping is now separate from Language Models, and the same phoneme maps can be shared between different modules
        • Includes new automatic mode, for finding the best phoneme map based on the input module and current PhonemeSet
      • New AutoSync Setup Wizard designed to simplify the setup process of modules that require 3rd party applications or paths to be set.
      • PocketSphinx Module - Phoneme detection module that recreates the behaviour of AutoSync2 for legacy purposes (Windows-only).
      • Phoneme Cleanup Module - Performs a very simple marker thinning-out process as used by the old High Quality preset in AutoSync2.
      • Marker Intensity from Volume Module - uses AudioClip volume to set phoneme marker intensities according to a curve. (Previously available from the Clip Editor, but now usable in batch mode.)
    • Montreal Forced Aligner Module - Brand-new phoneme detection module. Much more accurate than PocketSphinx, works on macOS + Windows. [Download from the Extension Window]
    • The Clip Editor can now look for a .txt file with the same name when an AudioClip is loaded and use this as the transcript.
    • API for opening the Extension Window and immediately downloading/installing a specified extension package.

    Fixes
    • Improved how the system handles the Project Settings file being deleted. A new file will be created that is identical to the default included one.
    • Removed Project Settings menu item in the Edit menu to avoid the warning in Unity 2018.3. The file can still be found in the same location as before.

    Changes
    • Clip Editor now stores its data as a temporary LipSyncData file for passing into AutoSync. Original API is still preserved as Properties, though with capital first letters for convention (e.g. .length is now .Length).
    • Removed LipSyncData constructors that were unusable as a ScriptableObject.
    • Removed legacy code for Unity 5.4 and lower.
     
    ftejada and haleler51 like this.
  6. ceebeee

    ceebeee

    Joined:
    Mar 7, 2017
    Posts:
    395
    Hi, I wanted to get back to you on this as I've been doing my own investigation. After a *lot* of experimentation with my own models and yours, I found I was never able to reproduce the problem with anything but the LipSync package, and I couldn't see any good reason why. However, today I finally stumbled on some interesting findings.

    It seems there's something about the Lincoln model's blendshapes that Unity doesn't like, but not just any of them; specific ones cause the error, namely the following indexes:

    4 BrowsIn_Left
    5 BrowsIn_Right
    6 BrowsOuter_LowerLeft
    7 BrowsOuter_LowerRight
    37 MouthWhistle_NarrowAdjust_Left
    38 MouthWhistle_NarrowAdjust_Right

    Now I don't know why these specific Blendshapes are troublesome to Unity, but you can simply remove them from the "Emotions" settings, and then everything runs without error.

    Hopefully this info can help to find a solution.
     
  7. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Unity seems to have got a lot faster at approving Asset Store updates recently!
    LipSync Pro 1.5 is now available from the Asset Store.

    Please pay attention to the update instructions text file on this one, as there may be some extra steps! Usually I'd post the change list again, but it seems a little pointless - it's two posts up ^ that way. ;)

    Huh, that's very interesting. I must have misread your first post, I didn't realise it was only happening on the Lincoln model. That's good in a way, at least this hopefully won't be a problem for lots of users! I'll try re-exporting the model and see if it's still happening then. Thanks for your help!
     
  8. f1chris

    f1chris

    Joined:
    Sep 21, 2013
    Posts:
    335
    I did a clean install in a new project and I'm getting these 2 errors in 2019.1.0f2

    Thanks for your help!!

    Assets/Rogo Digital/LipSync Pro/AutoSync/Editor/Modules/PocketSphinx/SphinxWrapper.cs(142,21): error CS0103: The name `PSRun' does not exist in the current context


    Assets/Rogo Digital/LipSync Pro/AutoSync/Editor/Modules/PocketSphinx/SphinxWrapper.cs(149,20): error CS0103: The name `PSRun' does not exist in the current context
     
  9. Prephonat

    Prephonat

    Joined:
    Mar 13, 2014
    Posts:
    4
    Hello, I installed the new Lipsync Pro 1.5 but I'm having a problem when trying to autosync using the new autosync module. I ran the autosync setup and installed the Montreal Forced Aligner extension. When I try to autosync using the new Default preset it gives me an error "Montreal Forced Aligner application path is not verified".
    I ran into this problem in a new Unity 2018.3.14f1 project.
    I hope you can help me with this problem, thanks!
     

    Attached Files:

  10. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Damn, that's my fault. I made a fairly last-minute change and forgot to re-test on macOS after. I've attached a replacement version of SphinxWrapper.cs that fixes the problem to this post. Just overwrite the existing one and it should work! I'll put out a patch to the Asset Store ASAP.

    Hi, did you complete the Setup Wizard after the module had downloaded? It's intended that you keep it open while the module installs and return to it after, but if you completed it beforehand you should be able to just run it again and have it detect the location of the application it needs. Let me know if you're still having trouble after that!
     

    Attached Files:

  11. Prephonat

    Prephonat

    Joined:
    Mar 13, 2014
    Posts:
    4
    I reinstalled LipSync and tried to install the module through the setup window, but the problem still persists. In the AutoSync setup window everything looks correct.
     

    Attached Files:

  12. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    It's strange that it seems to be finding the application path just fine... After clicking "Continue" on that last screen, can you open the Clip Editor, go to the settings screen, then click the "AutoSync" tab and tell me if the path appears under the "Montreal Forced Aligner Phonemes Module" section of that page?

    If it does, try just clicking the verify button next to it and see if the little red circle turns green. The setup wizard is supposed to do this automatically, but maybe that's failing somehow?
     
  13. Prephonat

    Prephonat

    Joined:
    Mar 13, 2014
    Posts:
    4
    Thanks, selecting the path manually solved the issue. Before that, the path was relative to the project location and it wasn't working.
     
  14. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Glad it's working now, at least, though I'd have thought the relative path would work. The path it auto-detects is always relative, but it runs just fine on my machine. It's slightly worrying if that can vary from PC to PC!
     
  15. f1chris

    f1chris

    Joined:
    Sep 21, 2013
    Posts:
    335
    LOL.... no problem...thanks for your super fast fix !!!!
     
  16. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    146
    Hello @Rtyper, I've tried the new version 1.5 and sadly I have to say that my issue persists: AutoSync does not create any phoneme markers for me, not even with the new MFA module. I even tried reinstalling the VC++ runtimes, used an empty 2019.1.0f2 project, and used the included SoX files instead of my local ones, but nothing new: the Clip Editor says "AutoSync completed successfully", there's no debug log, but still no phoneme markers. Is there any way to debug what's happening here?
     
  17. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    I just don't understand this. This may not help much, but could you take a screenshot of the clip editor immediately after running AutoSync (just the Default MFA preset is best) and post that here? I don't know if there'll be anything visible but I really can't think of much else to try looking at if you're not getting any messages in the console...

    I'd usually ask if you could send me a project folder to reproduce the problem, but honestly if this is happening in a completely clean project with LipSync Pro installed straight from the Asset Store, there shouldn't logically be any difference between that and me just creating that project myself!
     
  18. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    146
    Here is a screenshot. I've realized that LipSync Pro keeps its settings (such as the SoX path) between different Unity projects. Can you tell me where that information is stored? Maybe that's where the issue lies.
     

    Attached Files:

  19. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    The data uses Unity's EditorPrefs class, which I believe stores the values in the registry. I suppose it's possible that it could be a different issue with the MFA module than with PocketSphinx. Could you check the clip editor settings screen under the AutoSync tab and see what values are stored for both the sphinx path and the MFA path? If the MFA one is relative, could you try clicking browse and finding the absolute path to the file?

    It should work with relative paths, but as Prephonat found above, it seems like sometimes absolute paths are required? If this is the case for you too, I'll need to make an update to detect absolute paths instead and change how these values are stored, as they'll need to be per-project.
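    For reference, here's a minimal sketch of how an EditorPrefs-backed setting works (the key name below is just an example, not necessarily the one LipSync actually uses). Because the values live outside the project folder, they carry over between projects on the same machine:

    using UnityEditor;

    public static class PathSettingExample
    {
        const string kSoxPathKey = "Example_SoXPath"; // hypothetical key name

        public static string SoxPath
        {
            // Stored per-machine by Unity, not per-project.
            get { return EditorPrefs.GetString(kSoxPathKey, string.Empty); }
            set { EditorPrefs.SetString(kSoxPathKey, value); }
        }
    }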
     
  20. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    146
    @Rtyper When doing the setup, I had to do the same thing Prephonat described. I have absolute paths in there now, and had them there for all my tests.

    BTW, I've added some debug logs to the code and found out that phonemes are actually returned from AutoSync. They just never showed up because their time range was 0 to 1000 instead of 0 to 1; also, sustain seems to vary between true and false from one phoneme to the next. I can "fix" the results of AutoSync with something like the following function:

    public void FixLipSyncData(LipSyncData lsData)
    {
        for (int i = 0; i < lsData.phonemeData.Length; i++)
        {
            // AutoSync returned times in the 0-1000 range, so scale them back down to 0-1.
            lsData.phonemeData[i].time /= 1000.0f;
            lsData.phonemeData[i].sustain = false;
        }
    }
     
  21. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    I think this solves it!

    Sustain varying from one phoneme to the next is normal; that's based on the length of the time interval returned by MFA and the "minLengthForSustain" option. The default preset ordinarily stops it creating sustains at all (as I found the results looked a bit better without them).
    With the 0-1000 range thing though, I actually had a message from another user who saw this thread and suggested it might be to do with the OS region setting, and stupidly I didn't think it could apply in this situation. Am I right in thinking your system is set up to use a "," character to separate decimals? I don't know if this could have been the problem with PocketSphinx, but if that's right, then I think MFA is giving results with "." characters as decimal points and LipSync is using the system culture to interpret them as thousands separators, making the range wrong.

    Could you try finding this path immediately after running the MFA module:
    "<path-to-your-Users-folder>\<your-name>\Local\Temp\<company-name>\<product-name>\Gettysburg_MFA_Output"
    Where <company-name> and <product-name> are the company and product names from the top of PlayerSettings in your Unity project. There should be a .TextGrid file in there that you can open with notepad or some other text editor. If you could upload that file here, I can find out whether it's LipSync interpreting wrong, or MFA producing the wrong results and hopefully fix this problem properly!
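    For what it's worth, the likely fix on my end looks something like this (just a sketch, not the actual shipped code): parsing the TextGrid numbers with an invariant culture so "1.325" always means 1.325 seconds regardless of the OS locale.

    using System.Globalization;

    public static class TextGridParsingExample
    {
        public static float ParseTextGridTime(string value)
        {
            // Parsing without an explicit culture uses the system culture, so on a German
            // locale "1.325" can end up read as 1325. InvariantCulture forces "." as the decimal point.
            return float.Parse(value, NumberStyles.Float, CultureInfo.InvariantCulture);
        }
    }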
     
  22. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    146
    @Rtyper Yes, you are right, my PC is running with German language settings and uses "," for decimal separation. I'm pretty sure that this is the root of the issue. The .TextGrid file has dot-separated numbers. I attached it in a zip, because the file extension is otherwise forbidden.
     

    Attached Files:

    Rtyper likes this.
  23. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Great, I think we've got this solved then. I've attached a replacement file. Can you try overwriting the current version in the AutoSync/Editor/Modules/Montreal Forced Aligner folder, removing your earlier fix, and seeing if everything works correctly now?
     

    Attached Files:

  24. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    146
    Yes, it works fine now with the replaced TextGridUtility! :) PocketSphinx Phonemes still has the same issue I think, but I don't care anymore, because the MFA module creates great results as far as I can tell. Thanks for your help! I also wrote you a review on the Asset Store and am happy to finally use this tool some more.
     
  25. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    For anybody who's been thinking of picking up LipSync Pro for a while, now's your chance - it's currently 50% off in the May Madness sale! Pick it up from the Asset Store for only $17.50 until May the 15th.

    Great, thanks for the review!

    For anyone else encountering the same problem, I've just pushed an update to the Montreal Forced Aligner Module to the extensions window. Changes are as follows:

    V1.0.1
    -------
    Fixed bug that prevented the module from working on systems that use the "," character as a decimal separator.
    Fixed bug where only an absolute path would be accepted for the MFA application on some systems.

    I'll be updating the PocketSphinx module soon with the same "," character fix.
     
  26. ConfusedCactus

    ConfusedCactus

    Joined:
    Jul 3, 2018
    Posts:
    19
    Hi There,

    I just started using LipSync Pro, so here are some newbie questions:
    1) Is there a way to automatically create blendshapes? Currently I am making my own with Blender.
    2) I know you can choose "exaggerated" and "standard" for Fuse characters to generate those blendshapes, but I couldn't find any tutorials on using Adobe Fuse with LipSync Pro. Can anyone show me how to do this?

    Many thanks for the response!
     
  27. philipp-naegelsbach

    philipp-naegelsbach

    Joined:
    Mar 2, 2016
    Posts:
    2
    Hi,
    I started using LipSync Pro yesterday, but when I try "AutoSync->Run Default" on my dialog files I often get the "MFA output TextGrid file does not exist." error. The legacy presets (and the example audio files) work, though. What am I doing wrong? Is there some specific setup I have to do in my wav files' import settings?
    I also tried the updated TextGridUtility.cs...
     
    Last edited: May 3, 2019
  28. jpingen

    jpingen

    Joined:
    Jan 28, 2019
    Posts:
    22
  29. Nemquae

    Nemquae

    Joined:
    Nov 24, 2013
    Posts:
    2
    Hi Rhys,

    New to LipSync Pro, but so far it looks great. Excellent work. The only feature we need now is real-time phoneme detection. We'd like to use it on a live product with streaming audio input. Is this feature supported? If not, do you have any plans or suggestions to support this feature? We're looking at some competing frameworks, namely SALSA and OVRLipSync, but we'd like to use LipSyncPro if at all possible.

    -Lake
     
  30. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    1) No - this is part of the art pipeline really, so it's beyond the scope of LipSync Pro. For your own models, you'll always want to do this anyway so you have full control over the look of the animation.

    2) It's fairly straightforward if you're using their autorigger. You just export your character from Fuse via the auto rigger (making sure to enable the facial blend shapes option) as usual and download it once it's complete. In Unity, put a LipSync component on the root of the character, pick the "Blend Shape Blend System", add the "body" mesh to the "Character Mesh" field, and any other animated meshes (usually eyelashes, sometimes beard and/or mask as well) to the "Optional Other Meshes" field and click continue. You can then apply either of those presets you saw to fill in all the standard phoneme + emotion poses on the character.

    Hmm, that implies that something might be going wrong internally in the MFA_align application. Can you check that both the paths under the "AutoSync" tab of the Clip Editor's settings screen have little green circles next to them? It sounds like possibly your audio isn't being converted to the correct format, which is handled by an application called SoX. If that path isn't correct then it won't be able to convert it.

    Send me an email (contact@rogodigital.com) if it's not that. I may be able to send you a debug version of the module that can make the problem clearer.

    I'm afraid not - I haven't had any luck integrating any other PocketSphinx language models in the past. The Mandarin one was fairly recently updated to add the required phoneme dictionary component (that only the English one had contained before) and even that just caused a crash that I wasn't able to debug.

    There is some potential for Dutch support for the new Montreal Forced Aligner module, though it isn't one of their currently available models. I'll definitely be keeping an eye out for new models, and I have an update to this module planned for the future that will make supporting certain languages even simpler.

    Thanks! Unfortunately, this isn't something we're planning. A long time ago I had plans to make a "LipSync Live" asset, that would be for real-time lipsync, but I was never able to come up with anything that I felt solved a problem not already solved by assets like SALSA. I do intend to enable limited runtime lipsync in LipSync Pro in the future, but this will be via the AutoSync system, so it'll support processing audio that's been downloaded or added post-build, but not true realtime lipsync on mic input or streaming audio. Sorry!
     
  31. Nemquae

    Nemquae

    Joined:
    Nov 24, 2013
    Posts:
    2
    Thanks for the quick reply. That's a shame, but we'll move ahead with Salsa for now.
     
  32. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Does this work when the audio is streamed in from a url?
     
  33. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Not at runtime, no. Your audio needs to exist in the editor so a LipSyncData asset can be created for it (either manually or using AutoSync). If it fits in with your pipeline though, you can export LipSyncData to an .xml file, which could also be downloaded from a url along with the audio, and then the two can be played back. This workflow wouldn't support actual streaming though (where playback starts before the download is complete).
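    Roughly, the download side would look something like this (just a sketch: the URLs are placeholders, and the final call that hands the XML and AudioClip to LipSync is omitted because it depends on your setup):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    public class LipSyncDownloadExample : MonoBehaviour
    {
        public string audioUrl = "https://example.com/line01.ogg"; // placeholder
        public string xmlUrl = "https://example.com/line01.xml";   // placeholder

        IEnumerator Start()
        {
            // Download the whole AudioClip first - no streaming, playback starts after the download completes.
            using (var audioRequest = UnityWebRequestMultimedia.GetAudioClip(audioUrl, AudioType.OGGVORBIS))
            {
                yield return audioRequest.SendWebRequest();
                AudioClip clip = DownloadHandlerAudioClip.GetContent(audioRequest);

                // Then fetch the exported LipSyncData .xml as plain text.
                using (var xmlRequest = UnityWebRequest.Get(xmlUrl))
                {
                    yield return xmlRequest.SendWebRequest();
                    string xml = xmlRequest.downloadHandler.text;

                    // Pass 'clip' and 'xml' to the LipSync component here - the exact
                    // playback call depends on LipSync's XML API, so it isn't shown.
                }
            }
        }
    }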
     
  34. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    A couple of bits of news:

    1) I've just submitted a patch (1.501) to the Asset Store [EDIT: This is now live]. This patch is fairly important and fixes a few issues users have been reporting with the new AutoSync Setup Wizard and with certain system language/culture settings.
    I'll also be uploading a couple of updates for modules: Montreal Forced Aligner v1.0.3, and PocketSphinx v1.0.1 - These fix a few more issues regarding paths + settings, and the MFA 1.0.3 update should fix any issues with the macOS version. Once these are live, just download from the Extension Window.

    2) I'm putting together a roadmap for future updates + plans. There are a lot of things still in progress at varying levels - improved documentation, website upgrade, new AutoSync modules etc - and unless you trawl through all the posts in this thread it's difficult to keep track of them! For that reason, I'll be putting up a page on the website with a list of upcoming features and a (potentially rough) estimate of how far along they are/when you can expect them. Edit: This can be found here.

    Cheers!
     
    Last edited: May 19, 2019
  35. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    The 1.501 patch is now available from the Asset Store. Here's the changelist:

    Fixes
    - Fixed bug that prevented SoX from working with the auto-detected path from the Setup Wizard
    - Fixed bug that caused a "Failed to Connect" message in the Extension Window when certain system languages are in use.
    - Fixed incorrect version number being added on new LipSync components.

    - [MFA Module] Fixed bug that prevented the Montreal Forced Aligner module from working with the auto-detected path from the Setup Wizard.
    - [MFA Module] Fixed bug that made the Montreal Forced Aligner module fail when certain system languages are in use.
    - [PocketSphinx Module] Fixed bug that prevented the PocketSphinx module from working when certain system languages are in use.

    Changes
    - Clicking the "Browse" button next to a path on a settings screen now opens the file browser in the current path's location.
    - Improved the AutoSync Setup Wizard to lock until the download is complete when installing a module automatically.

    To anyone using the PocketSphinx module: There's an update now available from the Extensions Window. It's highly advised you download this update, as this fixes an issue with audio conversion that can stop the module from working completely in some cases.
     
  36. ggendron

    ggendron

    Joined:
    Mar 19, 2019
    Posts:
    4
    Hi!

    I've bought your plugin and it seems to work really well when using the example clip files.
    But I can't manage to generate my own clips because of several errors when using AutoSync.

    "Module failed compatibility check." is the last one I got.
    The paths for SoX and MFA are good (green icon) and I can't figure out what I'm missing here. Any hint?

    Screenshot_6.png Screenshot_7.png
     
  37. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Hi there,
    "Module failed compatibility check" means an AutoSync module requires some clip feature that wasn't present. You can get more information about what needs to be done by opening the AutoSync window. From the AutoSync menu, click on "Open AutoSync Window" and then select the preset you were running from the list at the top. The module that was failing the compatibility check will have an error message next to it in the lower part of the window telling you what's missing.

    If I had to, I'd guess you're probably trying to run the MFA module (e.g. the "Default" preset) without having a transcript for your clip? If so, you can either manually add one from the Clip Settings window (under the Edit menu in the clip editor), put one in a text file next to your AudioClip like the example audio files do, or, if transcribing the audio isn't feasible, you can use one of the two legacy presets instead which don't require a transcript.

    For that first error ("Audio Conversion Failed"), am I right in thinking that came from the Legacy preset? If so, can you try going to the "Window > Rogo Digital > Get Extensions" menu and downloading the "AutoSync PocketSphinx Module" - it's an update to the version that's included with the asset at present, and should fix this problem.
     
  38. ggendron

    ggendron

    Joined:
    Mar 19, 2019
    Posts:
    4
    Thanks a lot for your quick answer!
    It's all working well now, you guessed right on both errors :)

    Can't wait to work with your tool!
     
    Rtyper likes this.
  39. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    The roadmap is now available on our documentation page here.

    Slightly ahead of schedule, I've also just submitted the 1.501 update for LipSync Lite to the Asset Store. Again, I can't guarantee how fast the review will happen, but you can expect LipSync Lite to be available again sometime after tomorrow. For the first time (and after problems in the previous version) this new update will include full source code. Compiling it took a while, complicated the design and really didn't have much benefit, so you're now free to modify it if needs be.
    Note that it still doesn't contain Pro features: Emotion markers, Gesture markers, AutoSync or the Blendshape Baking tool, but any files created in LipSync Lite are fully compatible with LipSync Pro if you'd like to upgrade at a later date.
     
  40. ConfusedCactus

    ConfusedCactus

    Joined:
    Jul 3, 2018
    Posts:
    19
    Hi there,
    I really enjoyed the plugin so far. However, I found this error constantly popping up. It does not affect the gameplay though. Do you know how to fix it? Thanks a lot!

    NullReferenceException: Object reference not set to an instance of an object
    RogoDigital.EyeController.LateUpdate () (at Assets/Rogo Digital/LipSync Pro/Components/EyeController.cs:509)
     
  41. evfasya

    evfasya

    Joined:
    Feb 7, 2018
    Posts:
    8
    Hi!

    I have a question: is it possible to change a character's emotions (face blendshapes) with a script at runtime, while playing a LipSyncData clip (a speech with only mouth movements) that has been prepared beforehand?
     
    ElevenGame likes this.
  42. Alvarezmd90

    Alvarezmd90

    Joined:
    Jul 21, 2016
    Posts:
    151
    I have emailed you, Rtyper.
     
  43. j-bomb

    j-bomb

    Joined:
    Jan 14, 2018
    Posts:
    16
    Hey! I'm having the same "MFA output TextGrid file does not exist" issue this other user is having, and have both the green circles in the clip editor. I've sent you an e-mail.
     
  44. Majkel92

    Majkel92

    Joined:
    Apr 19, 2019
    Posts:
    2
    Hi,
    I imported LipSync 1.501, but I do not have a Plugins folder. Is that normal? The installation instructions say I have to remove a file from there.
     
  45. haleler51

    haleler51

    Joined:
    Apr 9, 2015
    Posts:
    30
    I had this exact problem a week ago. It turns out my audio files were not an ideal format for LipSync Pro. What fixed it for me was manually converting the audio clips to the specified format (16000Hz Mono, 16-bit .wav) and importing those into Unity for use with Autosync.
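    (For reference, a SoX command along the lines of "sox input.wav -r 16000 -c 1 -b 16 output.wav" should produce that format, though any audio editor that can export 16 kHz mono 16-bit .wav works too.)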
     
  46. j-bomb

    j-bomb

    Joined:
    Jan 14, 2018
    Posts:
    16
    Ah! See, I tried that; I saw it on the MFA wiki. I've converted my audio files to 16kHz, 16-bit mono wav and it doesn't work. I've also tried a variety of different wav formats, including 12kHz 8-bit mono. They all produce the same problem. I've tried deleting the MFA folder. Not sure what to do. It's weird, since the demo wav files will run AutoSync, but my own audio files will not.

    UPDATE: So the format conversion alone doesn't fix it. However, it turns out it does work with 16-bit, 16kHz mono wav files if I first 'Run Default' on the Gettysburg example audio clip successfully, then replace the audio clip and transcript, 'Run Default' on the new audio clip, and save it as its own clip. That only works on the Gettysburg.asset, though, since it comes with a TextGrid. If I 'Run Default' on the new file directly, it returns the same "TextGrid does not exist" error. This leads me to believe that the TextGrid is not being generated.
     
    Last edited: May 22, 2019
  47. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    452
    Do you have both left and right eye transforms assigned? If you do, it might be some problem with the re-parenting it does to allow changing the forward direction. Let me know if it's not just a missing reference and I'll dig a bit deeper.

    You can call DisplayEmotionPose(int emotion, float intensity) to instantly show an emotion on a character, but it's not possible to call SetEmotion and blend in/out smoothly, because it uses the same system to do this as playing back a clip and the two would interfere.

    If you don't need to crossfade between emotions though, you could call DisplayEmotionPose repeatedly and lerp the intensity value to fade-in or -out, it just won't support going from one emotion to another.
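    Something along these lines would do it (an untested sketch - the field names are placeholders, and the emotion is referenced by index as DisplayEmotionPose expects):

    using UnityEngine;
    using RogoDigital.Lipsync;

    public class EmotionFadeExample : MonoBehaviour
    {
        public int emotionIndex = 0;  // which emotion pose to show
        public float fadeTime = 1f;   // seconds to fade in

        LipSync lipSync;
        float intensity;

        void Start()
        {
            lipSync = GetComponent<LipSync>();
        }

        void Update()
        {
            // Move the intensity towards 1 over fadeTime seconds and re-apply the pose each frame.
            intensity = Mathf.MoveTowards(intensity, 1f, Time.deltaTime / fadeTime);
            lipSync.DisplayEmotionPose(emotionIndex, intensity);
        }
    }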

    And I have emailed you :) I agree about being able to set an intensity value for SetEmotion - I'll look into adding that.

    Regarding your edit, are you saying it actually works in that scenario? It sounds like all it would be doing is loading in the existing TextGrid with phonemes from the Gettysburg audio rather than phonemes from your own audio?

    Either way, I am trying to come up with a slightly more useful idea on this, but at the moment my best theory is just that this happens when MFA can't understand the audio itself.
    In the one example I've had sent to me (and if you wouldn't mind sending me an example of a clip that doesn't work for you, it could help me figure out what links them all) it seems like the speaker's accent was simply too strong/different from the data the language model was trained on to be used. It's quite a difficult problem, and MFA doesn't output a whole lot of information I can use to narrow things down, but right now that's the closest to an answer I can get.

    There shouldn't be any need to manually convert audio - SoX should be handling that automatically.

    That's fine. The instructions deal with updating from an existing version of LipSync Pro. If you're just installing it for the first time you can simply import the "LipSync Pro 1.501" unitypackage. There's a Getting Started guide as part of the documentation.
     
    evfasya and Alvarezmd90 like this.
  48. Majkel92

    Majkel92

    Joined:
    Apr 19, 2019
    Posts:
    2
    Hi, Rtyper. I have another problem. When I add an audio track longer than 1:30 minutes, the application crashes. The second problem is that speech movement is not always generated for the whole track; there are gaps of 10-30 seconds. How can I fix this?
     
  49. Alvarezmd90

    Alvarezmd90

    Joined:
    Jul 21, 2016
    Posts:
    151
    Thanks for helping me out. And I'm looking forward to that addition.
    For anyone who has the same issue with 'set_emotion': Here's the file that Rtyper shared with me:
    https://drive.google.com/open?id=1qxpfsRtlIMIAdnFz1UBtiWZbrzxgSPix
     
  50. dgoyette

    dgoyette

    Joined:
    Jul 1, 2016
    Posts:
    4,196
    I've come here after updating to 1.5, the latest on the Asset Store. Here's some feedback, and a question on whether I can use the new version of LipSync Pro at all.
    • After installing the new asset, I moved the top-level "Rogo Digital" folder into a subdirectory, "3rd Party", where I keep all my asset store assets. Later, when installing the Montreal module, it ended up installing into the top-level "Rogo Digital" folder (recreating it again), rather than the existing Modules folder that contained the default modules. This caused me some initial confusion until I realized that I had modules spread out across two different directories. Maybe the module importer can be made smarter to notice that the Modules folder has been moved?
    • Given that a transcript is required for the new Default parsing, I'd request that this be made clearer and easier to access. Can that transcript text box be shown in the main window? It seems clumsy to have to drill down into a menu item dialog for something that is required. Not to mention that the vague error you get if you don't include a transcript doesn't guide the user in that direction.
    • The biggest issue is that I'm also getting the TextGrid error that others are complaining about. In this case, I'm using it on some text that does indeed have an accent, and it's not clear to me whether that's the cause of the issue. As an example, I've gone to this website to generate some speech: https://ttsmp3.com/ I chose "Indian English / Raveena", and used the following sample text: "This is a test file, confirming whether the speech can be parsed." Download as MP3, bring it into Unity, and try to parse it in Lip Sync Pro, but it gives me the TextGrid error. Is the new parser just not capable of parsing text with an accent like this? I've also uploaded the MP3 file here if you'd like to try parsing it yourself: https://www.dropbox.com/s/vl93g97k5v1hfj0/LipSyncProTestFile.mp3?dl=0
    • Assuming I can't use 1.5, is there something I can edit in the older code to stop getting the following warning? "There are menu items registered under Edit/Project Settings: LipSync Consider using [SettingsProvider] attribute to register in the Unified Settings Window."
     
    Last edited: May 28, 2019