
[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. donkey0t

    donkey0t

    Joined:
    Oct 23, 2016
    Posts:
    45
    I don't get them now, but originally I noticed something about being unable to access/download some icon files. I'm guessing that might be the mouth pics?
     
  2. Michael_Berna

    Michael_Berna

    Joined:
    Jul 5, 2019
    Posts:
    10
    @Rtyper

    I have a fresh install of LipSync Pro 1.521 that I just purchased and installed. I got it up and running, but I am getting the following error in my unity debug console. Any idea what could cause this or how to fix it properly?

    Assets\Rogo Digital\LipSync Pro\Editor\LipSyncDataPreprocessor.cs(106,16): error CS1061: 'LipSync' does not contain a definition for 'GetCurveDataOut' and no accessible extension method 'GetCurveDataOut' accepting a first argument of type 'LipSync' could be found (are you missing a using directive or an assembly reference?)
     
    faizan143143 likes this.
  3. pigglet

    pigglet

    Joined:
    Aug 13, 2014
    Posts:
    113
    Hey! Glad you're back. It could sound inappropriate, but I believe someone should address this issue, so it might as well be me. I bought LSP back in 2018, and since then I've met a lot of developers on Discord and forums with a common refrain: "Well, LSP is good, but it seems like the developer has disappeared again, so maybe the asset is abandoned. We can't rely on it in production." Don't get me wrong, LSP is a really awesome solution and I wish I could use it in my game, but I'm not sure... can I rely on it? Are you back for the long haul and serious about support?
     
  4. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    No, those are harmless warnings from the extensions window if there's some network problem that prevents the icons from loading. I'd never seen this happen before, though I have another customer who's reported the same issue by email recently. I'll need to look into it further but I have no ideas for the moment...

    That method definitely exists in the latest version. Have you installed any extensions from the extensions window? It may be there's something outdated there I need to fix.

    Not an inappropriate question at all! If I'm being candid, I got burned out working on LipSync and let support slide as a result. Being able to redesign everything from the ground up for the new Cinetools versions is helping a lot in getting over how much of a nightmare it was to try and fix problems in the current version, and by formalizing support with the ticketing system etc. I'm hoping to make support much more manageable as well. I absolutely intend to stick around!
     
    dmenefee and pigglet like this.
  5. pigglet

    pigglet

    Joined:
    Aug 13, 2014
    Posts:
    113
    Yes, I got the same error after installing Timeline support
     
  6. pigglet

    pigglet

    Joined:
    Aug 13, 2014
    Posts:
    113
    I figured it out. One of the add-ons (the Timeline add-on in my case, but it might be a different one on your side) overwrites LipSync.cs with an old version. Just restore the original LipSync.cs from the latest asset install and it will be fine.
     
    Rtyper likes this.
  7. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Good to know, sorry for that! I'll get the Timeline extension updated to remove the extra file today.
     
  8. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    122
    @Rtyper I'm revisiting Lipsync pro after a few years, and loving how it's evolved.
    Would there be an iClone Character Creator 3 preset? I can't find one after a long search.
    Since the blend shapes exported by CC3 to Unity don't contain phoneme blend shapes, it's a real task to create them from scratch.

    Kind Regards
     
    f1chris likes this.
  9. f1chris

    f1chris

    Joined:
    Sep 21, 2013
    Posts:
    335
    +1

    Was about to ask as well, since I just bought the whole iClone/CC3/Live Face/3DXchange suite!!
     
    Dirrogate likes this.
  10. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    @f1chris
    I don't have Character Creator 3 to create one with unfortunately, but if you set a character up and make a preset for it, I could make it available for others on the extensions server. Alternatively if someone with access to CC3 could send me a character to use, I'd be happy to set up the preset myself.
     
    Dirrogate likes this.
  11. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    122
    Thanks @Rtyper, @f1chris. I'm sending a WeTransfer link to @Rtyper in a private message, to download a CC3 character.
    Many thanks.
     
    f1chris and Rtyper like this.
  12. f1chris

    f1chris

    Joined:
    Sep 21, 2013
    Posts:
    335
    Good idea, there's a trial version. I can also export a simple default character for you so you can play with it as well. Let me know the best way to send it, @Rtyper.

    EDIT: I hadn't played with your asset for a little while and just saw you're supporting iClone natively. That should work then, since iClone and CC3 share the same rigging.
     
    Last edited: Jun 30, 2020
  13. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    122
    Thank you @Rtyper for sending the test CC3 package. I'm on a delivery deadline for the next few days, but will try it out after that.
    Meanwhile, example output below: this came directly from iClone, which has an amazing text-to-speech mapper for its characters, and was then rendered in Unity. The audio was synced later in video editing software.
     
    Rtyper likes this.
  14. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6
    Hi! Hopefully @Rtyper or anyone else can help. I'm using the latest LipSync Pro in Unity 2019.4.0 LTS in a URP project, and everything seems set up and sort of works, but playback is broken. I mean, my model is set to Blendshapes and they are assigned in the LipSync Pro component, and moving the sliders morphs the face in the Scene, so I know my model is correct.

    The LipSync clip I made used AutoSync and generated the phoneme keyframes, but preview doesn't work, nor does Play on Awake in the scene. The only weird thing is that the LipSync component reports a warning in the inspector pane:

    But the mesh works, because the sliders in the LipSync component do change the mouth shapes in my scene. However, playback when running, or via preview in the LipSync clip window, doesn't move or animate anything. Only the sliders work. I'm not using gestures, just lipsync and emotions.

    At first I thought it was this mesh warning, but I don't think so now. I went to the Lincoln example in the samples and everything's pink (I assume the shaders are not URP-friendly), so I put it on wireframe just to see Lincoln, and he only plays his idle animation. His mouth doesn't move either. So is there some issue with this latest version of LipSync Pro and URP??

    Any help is appreciated - This seems pretty cool but I can't figure out what I'm missing. If the sliders work and the blendshapes are working in the sliders - I would think it should work...?

    Please help! thanks!
     
  15. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6

    Ahhh, never mind. Even though I watched the videos, I clearly missed something. My problem, in case others come here with the same issue, is that in the animation clips you need to move all the phonemes to 100% with the sliders. For some reason I thought the default was 100 intensity, but it defaults to zero. Moving all my sliders to 100 before playback made it work, so my problem is solved.

    Great asset - thanks!
     
    Rtyper likes this.
  16. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Hey everyone, here's some info on the next update! After talking with a few customers I've identified a few key areas of AutoSync & the batch processor that were lacking, and as I plan for AutoSync to make the jump over to LipSync 2 almost unchanged (at least as far as the public API goes), it made sense to address this now.

    The main thing is that the batch processor isn't very flexible - it takes AudioClips and turns them into LipSyncData or XML text files with the same names, and that's it. But what if you have existing LipSyncData files that you want to run through AutoSync again? What if you want the resulting files to end up in a different folder? There are a huge number of tasks, from localisation to an actor or temp dialogue being replaced, that could be simplified by batching. To that end, the new version will support both LipSyncData and AudioClips as inputs, and multiple output options per clip. Here's a mockup of the new UI:

    AutoSync3 Refresh.png
    The other new features here include support for sending one item of any batch to the clip editor instead of a file, using the existing clip editor contents as an item, running events on pre- and post-processing, and appending clips together. You might also have noticed there's now no distinction between running AutoSync in single mode & batch mode - it's now simply a batch with one item in.

    I'm also close to releasing several new AutoSync modules & updates (finally - I know this has been promised since day one!):
    • Azure Speech Recognition Module
      Uses Microsoft Azure to generate a transcript from audio. It supports dozens of languages and variants, though it will require setting up account credentials to use - their free tier lets you transcribe up to 5 hours of audio per month, so for the usual amount of dialogue in games it shouldn't be too prohibitive.
    • AudioClip Replacement Module
      A utility module (see the next paragraph for an example) for replacing the AudioClip of an existing LipSyncData file with another one. Supports several methods of matching new audio clips based on name, directory etc.
    • Update: Marker Cleanup Module
      A very unfinished version of this accidentally made it into version 1.521, so you may have already seen this (though the new modes aren't functional in that version) - this update adds new ways of determining which markers are unnecessary, and allows biasing towards less important phonemes (e.g. leaving Fs and Ps, which are much more important than more subtle vowel sounds).
    • Update: Intensity From Volume Module
      This will provide a better default volume curve, which can now be applied relative to the loudest and quietest parts of your clip, along with the same phoneme biasing as the Marker Cleanup Module to make it much more useful without spending ages fine-tuning the curve by trial and error.
    • French Language Model for MFA
      Exactly what it sounds like! Adds support for French language audio to the Montreal Forced Aligner module. This is the first of two language models that are currently possible, the other being German - others can be done in future but will require some additions to the module itself.
    As an example, say you're localising your game into French. You already have LipSyncData files for all your dialogue in English, and you've been provided with the equivalent audio clips in French. In the new AutoSync Window, you could add all your existing LipSyncData clips as a batch, choose to output the new data as LipSyncData files in a new folder, and add the AudioClip Replacement Module to find the new audio based on folder structure and swap them in, the Azure Speech Recognition Module to generate the new transcripts, then the MFA module using the new French language model to create the phonemes. All the existing Emotions and Gestures will be carried over into the new clips, and all of that can be saved as an AutoSync Preset to run on any additional clips as needed.
     
    CoyoteFringe, pigglet and MoMonay like this.
  17. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6
    Hi @Rtyper
    I'm not sure but I think I've found a bug. I'm Unity LTS 2019.4.0 and latest LipSync Pro.
    I copied and pasted the components from one character to another and then changed the Character Mesh property - but it didn't switch over the BlendShapes like I thought it would - that's ok I can reset them but there's a bug where the arrow expand sections for Phonemes and Emotions garbles up the inspector UI and I can't trash one Blendshape and add a new Blendable. I have to deselect the Object in the hierarchy and reselect it to get it to load the inspector properly again, but then when I click the phoneme again it garbles the inspector so I can't change it. Here's what it looks like:



    Also can you please tell me what this next error means and how to fix it or if I can ignore it?

    Thanks!
     
  18. pigglet

    pigglet

    Joined:
    Aug 13, 2014
    Posts:
    113
    Hey, @Rtyper
    Any chance you've managed to fix the warning/error with Timeline and no sound clip?
    Just to remind you, here is the warning:
    Code (CSharp):
    PlayOneShot was called with a null AudioClip.
    UnityEngine.AudioSource:PlayOneShot(AudioClip)
    RogoDigital.Lipsync.LipSync:PreviewAudioAtTime(Single, Single) (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:1091)
    RogoDigital.Lipsync.Extensions.Timeline.LipSyncMixerBehaviour:ProcessFrame(Playable, FrameData, Object) (at Assets/Rogo Digital/LipSync Pro Timeline/LipSyncMixerBehaviour.cs:40)
    and here is the error:
    Code (CSharp):
    UnassignedReferenceException: The variable clip of LipSyncData has not been assigned.
    You probably need to assign the clip variable of the LipSyncData script in the inspector.
    RogoDigital.Lipsync.LipSync.PreviewAudioAtTime (System.Single time, System.Single length) (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:1093)
    RogoDigital.Lipsync.Extensions.Timeline.LipSyncMixerBehaviour.ProcessFrame (UnityEngine.Playables.Playable playable, UnityEngine.Playables.FrameData info, System.Object playerData) (at Assets/Rogo Digital/LipSync Pro Timeline/LipSyncMixerBehaviour.cs:40)
     
  19. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Are you getting any errors in the console when the inspector is messed up like that? I imagine it's a null reference or something similar in the GUI code, but it'll be very difficult to track down without the line number.

    As for the warning in that screenshot - it's just pointing out that the mesh in the "Character Mesh" field above it is in a prefab, not an instance in the scene. If you're editing a prefab from the project panel then it's fine, you can ignore it. If that's showing on your prefab instance in the scene then you probably have the wrong version of the mesh renderer in that field.

    Sorry, I haven't yet! I'll take another look today, but if I remember rightly I wasn't able to reproduce those errors when you first sent them over. I'll let you know if I find anything different though!
    Edit: That said, I think the fix is fairly simple: open up LipSync.cs from the Components folder, and replace line 1086 (which should be
    if (IsPlaying || !audioSource)
    ) with this:
    if (IsPlaying || !audioSource || !audioClip)
     
    Last edited: Jul 20, 2020
  20. pigglet

    pigglet

    Joined:
    Aug 13, 2014
    Posts:
    113
    Oh, yes! It works, thank you so much! <3
     
    Rtyper likes this.
  21. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    In working on the new AutoSync features/modules I've run into a limitation with how the Phoneme Set system currently works. I'm trying to judge how much use this feature sees, so if any LipSync Pro users who have time could complete this (5 second, one question!) poll, that would be hugely helpful! https://strawpoll.com/x8s7v26a5
    Thanks!
     
  22. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6
    Hi! No errors at all in the console when the inspector is messed up. Any ideas what the issue is? I'm also getting an "Invalid AABB a" error, but I think that's due to a few points in my bone layer that I need to edit out, as they might be 1-point polys used to generate my skeleton. Could that be causing the inspector to mess up?

    Also, another few questions:
    1. Is there any way to "stack" emotions? Like if I have an animation and I want to do a "Surprise Brows Up" inside a "Happy" emotion - your interface says I can't do that, only when they straddle each other at a border. Their movements don't conflict with each other, so it should work if one emotion were nested inside the other?? I'm finding it hard to do complex emotions the way my character's blend shapes are set up, then.

    2. I can't get the Eye Controller to work at all. I am using blend shapes for emotions and phonemes, and a bone skeleton for character head, neck and body movement. I use blend shapes for the blinks and winks, but have eye bones for rotating the eyes. I can manage inserting blink morphs into the Emotion layer of the lip sync if I have to (except for that no-nesting issue above), but the main thing is that I want the eyes to look at a target empty GameObject for eye tracking, and then animate that null. Whenever I use the Eye Controller's Look At feature, the eyes roll back into his head and he looks like something out of Hellraiser, and no settings in your component or placement of the null can make the pupils show up properly aligned. The bones do work for the eyes in the Scene when manually adjusted. What's the trick with the Eye Controller? I'm using Blendshapes for the LipSync Pro component and Bones Only for the Eye Controller (leaving my blinks in the LipSync Emotion tab). What do you suggest for this setup?

    3. Is there any way to tweak the lip sync or emotion curves? Or are those hidden from us? I've seen Loose and Tight, but no way to view the curves. I'm finding it hard to know how to tweak the Rest Time you expose, and I might need to vary it throughout the talking.

    thanks again for your help!!
     
  23. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    I really wouldn't have thought so - it would have to be something inside the GUI code, and as it only happens when you open one of the phoneme poses, it's probably something in the code that draws those. I'm surprised nothing's throwing an error though. :confused: Would you be able to send me a project folder or .unitypackage or similar that reproduces the issue?

    Not directly, but you can use the Emotion Mixer to achieve something similar: when you add a new emotion marker, choose the "Add New Mixer" option. You'll then get a black emotion marker on the timeline, right click this and choose "Edit Mixer", you can then add multiple emotions and balance their effect against each other (in your case you'll probably want to change the "Mixing Mode" dropdown to "Additive", though if the emotions you choose share any blend shapes these might get pushed past 100% as a result).
    You can then use this like a normal emotion, so blending from one emotion into the mixer, then back into the same emotion (with a different marker) should give you the effect you want.

    It sounds like you're using the right approach - Eye Controller often has issues with look at when bones have different forward axes than expected, and the option to change it doesn't always do the trick. If possible, I'd suggest adding a dummy GameObject as the parent to each of your eye bones, and using that in Eye Controller, so that it can be pointed at the target with Z-forward, and you're free to change the local rotation of the child eye bones to correct it visually.
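    For reference, the dummy-parent approach amounts to something like this if driven by a plain script (illustrative only, using standard Unity APIs; Eye Controller would normally do the aiming itself):

```csharp
using UnityEngine;

// Sketch: aim a dummy parent at the target with Z-forward, leaving the
// actual eye bone (a child of this transform) free to carry a corrective
// local rotation so the pupils line up visually.
public class EyeAimParent : MonoBehaviour
{
    public Transform lookTarget;

    void LateUpdate()
    {
        // Rotates this (dummy) transform so its +Z axis points at the target.
        transform.rotation = Quaternion.LookRotation(
            lookTarget.position - transform.position);
    }
}
```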

    All this stuff is fixed in the new Eye Controller 2, it's just not quite ready to release yet!

    It's not possible to tweak the curves unfortunately - they're usually generated at runtime anyway as they are per-character and per-clip. Rest time can be a little hard to dial in - you may prefer to just avoid using it altogether by setting the value to something high (1 second or more), then add Rest phonemes into your clips where you want them.
     
    BriBill and mrwhip like this.
  24. seamon67

    seamon67

    Joined:
    Jan 30, 2016
    Posts:
    6
    Is there a way to add the Lipsync Pro component to a Fuse Character at runtime? There's a specific reason I cannot add the Lipsync Pro to the Prefab.
     
  25. seamon67

    seamon67

    Joined:
    Jan 30, 2016
    Posts:
    6
    Alright I managed to add a Lipsync component at runtime. A little jank but it works!
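    For anyone else trying this, here's a minimal sketch of the idea. The `RogoDigital.Lipsync` namespace is confirmed by the stack traces earlier in this thread, but the `audioSource` field and `Play(LipSyncData)` call are assumptions from memory of the asset's API - verify against your installed version:

```csharp
using UnityEngine;
using RogoDigital.Lipsync;

// Rough sketch: attach LipSync to a character at runtime and play a clip
// that was authored in the editor beforehand. Field/method names other than
// the component type itself are assumptions - check the asset's docs.
public class RuntimeLipSyncSetup : MonoBehaviour
{
    public LipSyncData dialogueClip; // pre-authored LipSyncData asset

    void Start()
    {
        LipSync lipSync = gameObject.AddComponent<LipSync>();
        lipSync.audioSource = gameObject.AddComponent<AudioSource>();

        // Phoneme/emotion poses still need to be configured (e.g. loaded
        // from a preset) before playback will actually move the face.
        lipSync.Play(dialogueClip);
    }
}
```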
     
  26. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6
    @Rtyper Why would preview in the LipSync clip editor not work? I select the character, but nothing in the clip window changes the character in the scene.

    You really need an FAQ for simple or common setup errors. Please make one. :) I've scoured your docs and watched all the videos, but you gloss over basic setup - perhaps I'm missing a fundamental with the Avatar or Animator Controller? (I've watched tutorials on this too.) It's entirely a custom character (actually just a shoulders-up character), no Fuse or external animations. All animations were done on the FBX bones in Unity clips.

    This is a new scene, and it's the first time I've tried to use LipSync Pro without auto-generate, because this clip has no talking and I just want to use your Emotions and Gesture hookup in it. I've placed a "Rest" phoneme in there anyway and that doesn't do anything. I know the component is working because I can select Phonemes or Emotions in the inspector and the character changes in the scene correctly. I'm trying to time emotions to a bit of music, not a spoken-word clip, and nothing is working in the clip editor preview, even though the character name is selected in preview there.

    I have an Animator Controller and the LipSync Pro component on the prefab in the scene, and the Skinned Mesh Renderer is selected. I do get that warning icon about a prefab, but other than clicking it and selecting Scene and the head mesh, I don't know how to tell it to select the head mesh from the in-scene prefab. Is this the issue?

    Basically I have a "Dancing" animation clip in Unity which I've set up to be a gesture and I've run your gesture setup which creates the Lip Sync layer and I want to put emotions and facial expressions during the dance gesture.

    What am I doing wrong? I can't seem to figure out what is happening with your component in this new scene since I tried it without using auto-generate. The LipSync clip is set to the default clip, and Play on Awake is on. Thanks for your help!
     
  27. mrwhip

    mrwhip

    Joined:
    Jan 30, 2020
    Posts:
    6
    @Rtyper Sorry for so many questions... Is there also a limit on the number of emotion blend shapes? I've added about 20, and they look good when you select them one at a time in the inspector, but if you open up the LipSync clip editor, it seems like the last four or five in there don't work. Like, I have emotions called Smile and MouthOpenWide, and neither of them mix with other emotions or work on their own. Any ideas why this might be happening?
    Thanks again so much for your help!
     
  28. meteorvirtual

    meteorvirtual

    Joined:
    Aug 21, 2020
    Posts:
    1
    Hello. I am running into an issue with the asset. The lip syncing looks good for the parts that contain audio, but between those segments the face does this bizarre thing. It seems to escalate, too: the first time the issue appears it just looks like he's sticking his lip out really far, but by the 2nd/3rd time it looks like this (attached). Is there anything I can do to fix this?

    Thanks, much appreciated.
     


  29. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    418
    I wrote you an e-mail but got no reply ;) We are using your system in our game The End of The Sun. I have a question: is there a chance to improve the bone-based system? Especially when an emotion and speech are used at the same time. For example, smiling + talking simply doesn't work well: instead of a smiling person talking, it looks like a neutral person talking, and in the gaps with no talking it puts short smile "glitches" on the face :) which doesn't look good. Maybe there is a better way to average the result when an emotion and a phoneme use the same bone at the same time? Here is our trailer if you would like to see:
     
  30. seamon67

    seamon67

    Joined:
    Jan 30, 2016
    Posts:
    6
    This happens when blend shapes are unbounded in Unity. If you don't need unbounded blend shapes, go to Project Settings -> Player -> and tick Clamp Blendshapes.

    If you do need unbounded blend shapes, then you need to check whether the blend shapes defined in the speech profile (the Mixamo profile in this case) are going below 0, and if they are, set each of them to 0 manually every frame.
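    For the second case, the per-frame clamp could be sketched as a small helper component (generic Unity API only; `faceRenderer` is whatever SkinnedMeshRenderer your speech profile drives):

```csharp
using UnityEngine;

// Sketch: clamp negative blend shape weights back to zero every frame,
// for projects that can't enable "Clamp Blendshapes" globally.
public class ClampNegativeBlendShapes : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;

    void LateUpdate()
    {
        int count = faceRenderer.sharedMesh.blendShapeCount;
        for (int i = 0; i < count; i++)
        {
            float weight = faceRenderer.GetBlendShapeWeight(i);
            if (weight < 0f)
                faceRenderer.SetBlendShapeWeight(i, 0f);
        }
    }
}
```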
     
    Last edited: Aug 31, 2020
  31. CosmicBoy

    CosmicBoy

    Joined:
    Apr 16, 2014
    Posts:
    33
    Hey there,

    Great asset so far! I'm having a problem figuring something out. I'm using sprite animations. I can manually create phoneme lip sync data by putting in markers; however, I have to double up every marker and sustain in between each pair to get the character to display them correctly in the editor and during playback.

    When I use AutoSync with dialogue, the markers it creates are accurate and great; however, when I play the animation, because there are no sustains you can't see any of them playing, since they come on and off so fast. I have to manually double up the generated markers and sustain them in order to display them properly.

    What am I doing wrong here?
     
  32. ksam2

    ksam2

    Joined:
    Apr 28, 2012
    Posts:
    1,045
    Does it work at runtime? I mean, can I add random audio? Or set up audio files at runtime and then save them somehow?
     
  33. ElevenGame

    ElevenGame

    Joined:
    Jun 13, 2016
    Posts:
    37
    @ksam2 I don't think runtime creation is possible yet. I remember it being mentioned as a planned feature some time ago, but I haven't read anything about it since.
     
  34. Alvare32

    Alvare32

    Joined:
    May 26, 2020
    Posts:
    3
    I would really like a feature to be able to call functions from markers. Is that a possible addition?

    Like a Unity Event with an argument or pre-configured value.
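    For illustration, the requested feature might look something like this on the receiving end. None of this exists in the asset; it's just the standard UnityEvent pattern the post is describing, with hypothetical names:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustration only: what a marker-driven event might look like.
// A serializable UnityEvent<string> subclass lets you wire up listeners
// in the inspector and pass a pre-configured argument when fired.
[System.Serializable]
public class MarkerEvent : UnityEvent<string> { }

public class MarkerEventReceiver : MonoBehaviour
{
    public MarkerEvent onMarker;          // assign listeners in the inspector
    public string markerArgument = "wave"; // pre-configured value

    // A playback system could call this when the clip passes a marker.
    public void FireMarker()
    {
        onMarker.Invoke(markerArgument);
    }
}
```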
     
    Last edited: Oct 16, 2020
  35. hvillacruz

    hvillacruz

    Joined:
    Aug 15, 2018
    Posts:
    8
    I'm using a Mac, and this is the error I got after installing the wizard:

    Assets/Rogo Digital/LipSync Pro/AutoSync/Editor/Modules/Montreal Forced Aligner/ASMontrealPhonemeDetectionModule.cs(44,24): error CS0115: 'ASMontrealPhonemeDetectionModule.ProcessWithTemplates(LipSyncData, AutoSync.ASProcessDelegate, PhonemeMarker, EmotionMarker)': no suitable method found to override
     
  36. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Yes, @ElevenGame is right, LipSync files have to be created in the Editor then played back at runtime.

    I'll have a look and see if there's a simple way to do this, but I have a feeling it would require a lot of reorganising code (duplicating a lot of stuff in the editor + runtime etc.) and I'm not planning on adding new features to LipSync 1.x at this point. It's definitely something that should be possible using the new 2.0 version.

    Hi, looks like you need to update to the newest version of LipSync Pro (1.53) from the Asset Store - that error is the new version of the MFA module failing to compile because the LipSync Pro version is too old.
     
  37. faizan143143

    faizan143143

    Joined:
    Oct 24, 2017
    Posts:
    4
    I also have this issue.
     
  38. faizan143143

    faizan143143

    Joined:
    Oct 24, 2017
    Posts:
    4
    I have a fresh install of LipSync Pro 1.521 that I just purchased and installed. I got it up and running, but I am getting the following error in my unity debug console. Any idea what could cause this or how to fix it properly?

    Assets\Rogo Digital\LipSync Pro\Editor\LipSyncDataPreprocessor.cs(106,16): error CS1061: 'LipSync' does not contain a definition for 'GetCurveDataOut' and no accessible extension method 'GetCurveDataOut' accepting a first argument of type 'LipSync' could be found (are you missing a using directive or an assembly reference?)
     
  39. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Can you tell me exactly what you did to get this error? A fresh install from the Asset Store would get you LipSync Pro 1.53, and won't have any compiler errors like that - have you downloaded any extensions etc.?
     
  40. r3ndesigner

    r3ndesigner

    Joined:
    Mar 21, 2013
    Posts:
    140
    upload_2020-10-27_11-33-38.png
    I'm getting these errors with the new update, 1.53.0.
     
  41. r3ndesigner

    r3ndesigner

    Joined:
    Mar 21, 2013
    Posts:
    140
    Any tips? I'm a little stuck now, I can't play my project and can't download the 1.52 version xD
     
  42. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    I don't know what to tell you, those classes definitely exist in the current version. Have you tried removing LipSync from your project and redownloading from the Asset Store? (Don't forget to import the .unitypackage after downloading)
     
  43. r3ndesigner

    r3ndesigner

    Joined:
    Mar 21, 2013
    Posts:
    140
    I'll try one more time, but I think I already did that xD. Anyway, I'll let you know if anything good happens ^^ Thanks
     
  44. anomas

    anomas

    Joined:
    Jul 3, 2016
    Posts:
    44
    Delete all the folders and files that are from LipSync and reinstall everything. It will work afterwards.
     
    Rtyper likes this.
  45. r3ndesigner

    r3ndesigner

    Joined:
    Mar 21, 2013
    Posts:
    140
    Fixed! The error was in my own script, I just noticed xD. Thanks, and sorry for wasting your time.
     
    Rtyper likes this.
  46. colpolstudios

    colpolstudios

    Joined:
    Nov 2, 2011
    Posts:
    129
    Hi, I am using UMA with lipsync pro, but want to be able to add a gesture.

    UMA generates the animation controller at runtime, so I cannot pick it in the Gestures animator slot.

    Is there a way to do this?
     
  47. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    445
    Let me look into this - it may be fairly trivial to do via script if you only want to trigger animations that are already on UMA's Animator Controller. If it genuinely generates the entire controller at runtime (not just adding the Animator component) and you want to add custom animations to it like LipSync's Gesture setup wizard does, this might not be possible.
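    For the "trigger existing animations" case, the script side is just the standard Animator API; something along these lines (the state and trigger names are placeholders):

```csharp
using UnityEngine;

// Sketch: trigger a gesture on an Animator whose controller was generated
// at runtime (e.g. by UMA), provided the state/trigger already exists on it.
public class GestureTrigger : MonoBehaviour
{
    public Animator characterAnimator;

    public void PlayGesture()
    {
        // Either fire a trigger parameter defined on the controller...
        characterAnimator.SetTrigger("Gesture");
        // ...or cross-fade directly to a named state:
        // characterAnimator.CrossFade("Wave", 0.25f);
    }
}
```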
     
  48. colpolstudios

    colpolstudios

    Joined:
    Nov 2, 2011
    Posts:
    129
    I may have worked it out.
    If you use Bone Builder, you get an Animator :)
    I'm working on it now.

    OK, so a day later, I found the issue.

    When you save your edited audio and gesture clip, be sure to use "Save As" and change the name to something totally unique, so you can find it easily.

    I am using a Playmaker action: when I hit space, the newly saved clip is the one that plays.

    All working with no errors; oh, I'm going to have so much fun with this :)
     
    Last edited: Oct 30, 2020
  49. colpolstudios

    colpolstudios

    Joined:
    Nov 2, 2011
    Posts:
    129
    Hi, I am curious about the "create mesh with blend shapes" feature.

    The Adobe Fuse character generator no longer supports transfer and auto-rigging to Mixamo.

    This means that if you create a cool character, you now have to use the place-markers system, but you lose the blend shapes.

    Will your feature overcome this pitfall?

    I just tried it; I am a new user, but it does seem possible by choosing the body mesh.

    I have what looks like the option to create new blend shapes.

    So I named one and saved it. But how do we use it? How many can we create?
     
    Last edited: Oct 29, 2020
  50. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    226
    Hello, thanks for any help. I'm using LipSync Pro with the Timeline integration. I am able to trigger lipsync from the Timeline, but when I pause or move the playhead, the audio keeps playing and doesn't reflect the actual position of the Timeline. It's as if the audio is not being evaluated by the graph.

    Is this a bug in LipSync, or am I not doing something right?

    Any help is very appreciated.
    Thank you.
     