
[RELEASED] LipSync Pro and Eye Controller - Lipsyncing and Facial Animation Tools

Discussion in 'Assets and Asset Store' started by Rtyper, Mar 11, 2015.

  1. Aleksandr409

    Aleksandr409

    Joined:
    Jan 4, 2019
    Posts:
    6
    I'll send you a private message.
     
  2. Alvarezmd90

    Alvarezmd90

    Joined:
    Jul 21, 2016
    Posts:
    144
Alright, I have a very annoying issue right now. It might have something to do with the model's face bone scale, but in my opinion that shouldn't matter. EDIT - I tried resetting the bone scales in 3ds Max and wasted an hour and a half, only for it to fail again.
Could you please upload the older version 1.5.1?

EDIT 2 - Lmao, found the older version in my other project and it works flawlessly there. 1.5.2 has problems.
     
    Last edited: Feb 7, 2020
  3. Alvarezmd90

    Alvarezmd90

    Joined:
    Jul 21, 2016
    Posts:
    144
Now that SetEmotion is working again, I get this error when resetting after an emotion:

    Code (CSharp):
KeyNotFoundException: The given key was not present in the dictionary.
System.Collections.Generic.Dictionary`2[TKey,TValue].get_Item (TKey key) (at <1f0c1ef1ad524c38bbc5536809c46b48>:0)
RogoDigital.Lipsync.LipSync.SetEmotion (System.String emotion, System.Single blendTime) (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:491)
RogoDigital.Lipsync.LipSync.ResetEmotion (System.Single blendTime) (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:505)
ChangeTheEmotions.LateUpdate () (at Assets/Scripts/ChangeTheEmotions.cs:18)
    Ah so borrowing that 'Set emotion' code from 1.5.2 works :D
     
    Aleksandr409 likes this.
  4. Aleksandr409

    Aleksandr409

    Joined:
    Jan 4, 2019
    Posts:
    6
Hello again. I went through all the points; my character's face blend shapes are missing, but there are bones, and they behave very strangely when moving - nothing changes. I attached a video of what I mean. I hope there are options to fix this. Thanks.
     
  5. dgoyette

    dgoyette

    Joined:
    Jul 1, 2016
    Posts:
    3,679
    Edit: This was super simple. I'll leave the post here in case it helps anyone else. Anyway, the issue was that it seems I accidentally included "Blink_Left" in the "Rest" phoneme for Lip Sync. It seems the Rest phoneme takes precedence, and you probably shouldn't ever put your Blink blend shapes there.


    Original question:
Has anyone experienced Eye Control only causing a single eye to blink? I've got blend shapes for both eyes, and if I adjust those blend shapes with the sliders, they each work fine. However, Eye Control is only adjusting the Blink_Right blend shape at runtime. You can see a capture during runtime in the second screenshot: Blink_Right quickly goes from 0 to 1 and back to 0, but Blink_Left doesn't change, numerically or on the model. Anyone know what might be causing that?


    upload_2020-2-12_16-48-50.png


    upload_2020-2-12_16-48-17.png
     
    Last edited: Feb 13, 2020
  6. BriBill

    BriBill

    Joined:
    Oct 8, 2019
    Posts:
    7
I'm using LipSync Pro in my educational VR project and so far the results have been spectacular! In my project I have a coach that provides dialogue, emotions, gestures, and eye movements, which breathe a lot of life into the character. The only piece I'm missing is for the head to also move and track the learner's movements. Is there an easy way to have the coach's head track the learner along with the eyes using LipSync Pro and Eye Controller, or should I handle that separately? If so, can you recommend a solution? Thank you!
     
  7. dgoyette

    dgoyette

    Joined:
    Jul 1, 2016
    Posts:
    3,679
    I'm not sure whether there's anything baked into LipSyncPro/EyeController for moving the head. However, it's pretty simple to do this using built-in Unity Animator methods.

Take a look at Animator.SetLookAtPosition and Animator.SetLookAtWeight. You basically tell the animator where to look and how strongly it should look there, and the head (or body too, if you want) will look at that position, overriding any other animations the model might be performing. You'll need to make sure your Animator layer has IK Pass ticked, and that you call the methods during OnAnimatorIK.

    This thread shows a simple example of someone doing that: https://forum.unity.com/threads/mecanim-animation-setlookatposition.267200/
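As a rough sketch of that approach (the component and field names here are my own placeholders, not from any asset):

Code (CSharp):
using UnityEngine;

// Head look-at sketch: requires a humanoid Animator with "IK Pass"
// enabled on the layer, as described above.
public class HeadLookAt : MonoBehaviour
{
    public Transform lookTarget;              // e.g. the player's camera
    [Range(0f, 1f)] public float weight = 1f;

    private Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Unity calls this during the IK pass of each enabled layer.
    void OnAnimatorIK(int layerIndex)
    {
        if (lookTarget == null) return;
        // Overall weight first, then body and head weights.
        animator.SetLookAtWeight(weight, 0.1f, 0.9f);
        animator.SetLookAtPosition(lookTarget.position);
    }
}

Tweaking the body/head weight arguments controls how much the torso turns along with the head.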
     
  8. DeadSeraph

    DeadSeraph

    Joined:
    Sep 8, 2017
    Posts:
    97
    Hi guys. Couple of questions:

-Is there a way to set up multiple lip sync clips to play on command? Currently I have 15 or so lip syncs done, but I'd like to be able to call them via code to play at a given time. Is this possible?

-We would like to be able to set a delay on when a clip plays. I've messed around with the settings a ton, but as best I can tell the delay time in the settings only affects when the animation starts, not when the animation and audio play in tandem. Would it be possible to simply add a delay value for the audio so that one delays the other? This seems like a pretty basic function.

    -Is the source code available anywhere for people who have purchased the pro version?

    Thanks for your time.
     
  9. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    473
Hello, we use lipsync and emotions with bone blending. The problem is how the bone blending system combines an emotion with phonemes. For example, if a happy emotion sets the lip corners to a higher position, and they are blended with phonemes that also use those bones, it doesn't take the average position - it doesn't add the emotion to those bones at all. As a result, mouth emotions are only visible when the character isn't talking, and even then the smile has strange "peaks" where it shows for a moment, disappears, and shows again. Is there any chance of improving this so emotions get an average position when a phoneme and an emotion use the same bone?
     
  10. mateuszjaworski

    mateuszjaworski

    Joined:
    Aug 8, 2019
    Posts:
    9
    Hey guys,

do weights for phonemes work for you? I am trying to change them, and even if I set a weight to 10% it still plays at 100%. For our timelines I am using the Slate Sequencer and the extension that makes LipSync Pro compatible with it. Is there something I have to do to make it work properly? At the moment we only have bone transforms, no blendshapes.
     
  11. domdev

    domdev

    Joined:
    Feb 2, 2015
    Posts:
    358
Hi, does this asset support WebGL?
     
  12. domdev

    domdev

    Joined:
    Feb 2, 2015
    Posts:
    358
Hi, we're planning to buy this asset. I was wondering if I can load an audio clip at runtime? We're using Google Cloud text-to-speech, and we want to load the resulting audio into this asset. Is that possible?
     
  13. shivamnyr

    shivamnyr

    Joined:
    Jul 4, 2017
    Posts:
    1
Is the limited run-time lipsync available now, where I can just push in an audio clip and the transcript and have it processed at run-time?
     
    Benutzer1 likes this.
  14. minad2017

    minad2017

    Joined:
    Dec 1, 2016
    Posts:
    37
    An error has occurred in the [Slate Cinematic Sequencer] plug-in.

    Assets\ParadoxNotion\SLATE Resources\Extensions\LipSync\LipSyncSpeech.cs(14,57): error CS0535: 'LipSyncSpeech' does not implement interface member 'ISubClipContainable.subClipSpeed'


    Can you fix it?
    unity ver 2019.3.2f1
    lipsyncpro ver 1.52
    slate ver 1.9.8
     
  15. imaginationrabbit

    imaginationrabbit

    Joined:
    Sep 23, 2013
    Posts:
    283
    I also just tried to use LipSyncPro with Slate and got the same error-

Open the script "LipSyncSpeech" and paste this at line 84 (or anywhere in the class body), and everything will work after that.

    Code (CSharp):
public float subClipSpeed
{
    get { return 0; }
}

public float subClipLength
{
    get { return 0; }
}
     
  16. minad2017

    minad2017

    Joined:
    Dec 1, 2016
    Posts:
    37
    Thank you very much.
    It works now.
     
  17. minad2017

    minad2017

    Joined:
    Dec 1, 2016
    Posts:
    37
I am developing with Playmaker.
I am having trouble calling SetEmotion from Playmaker.
With the Slate plugin, it was possible to operate Emotion.
As a test, I tried running a simple C# script, but Emotion still does not work.
    Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using RogoDigital.Lipsync;

public class SetEmotionTest : MonoBehaviour
{
    public LipSync lipSyncTarget;
    public string emotionname;
    public float blendtime;

    void Start()
    {
        lipSyncTarget.SetEmotion(emotionname, blendtime);
    }
}
    Below is the error message.

    NullReferenceException: Object reference not set to an instance of an object
    RogoDigital.Lipsync.LipSync.SetEmotion (System.String emotion, System.Single blendTime) (at Assets/Rogo Digital/LipSync Pro/Components/LipSync.cs:611)
    SetEmotionTest.Start () (at Assets/---Script/-Sound/SetEmotionTest.cs:14)

    unity ver 2019.3.2f1
    lipsyncpro ver 1.52
     
  18. FirstTimeCreator

    FirstTimeCreator

    Joined:
    Sep 28, 2016
    Posts:
    745
    Hello,

    I have sent a separate support request in via email but figured I would ask here as well about this issue.

1. When I set up the facial poses using the Bones system and try to preview an audio clip, all facial poses get reset to their default positions and all facial pose data appears to be lost.
2. When I save a preset, nearly all scale, transform and rotation data is set to 0 when the preset is loaded.

    There are no errors in the console.

The system actually works and moves the mouth/face joints until I click off of the preview window, and then everything is reset to the default position.

This wasted an entire day of work for me: I had finally finished setting up a complex face rig made with joints rather than blendshapes, went to test it, it was working great, then clicked on the hierarchy to go save the preset, and literally ALL facial poses had been reset to the default rig position.

    I just about popped a blood vessel.

    This should not be happening.
     
  19. DeadSeraph

    DeadSeraph

    Joined:
    Sep 8, 2017
    Posts:
    97
    I wouldn't expect much support from the dev at this point. The asset doesn't seem to be supported at all anymore. I'll be seeking a refund and moving to Salsa, unfortunately.
     
  20. vorokoi

    vorokoi

    Joined:
    Oct 18, 2019
    Posts:
    39
@DeadSeraph The questions you are asking are all answered in the documentation and in this forum thread. Yes, you can call clips via code, trigger them via events, set a delay - anything via script. There is a completely documented API reference. If it were not possible, why would there be an API? You are asking for the source code when the asset already ships with its source code - what do you think all the scripts it comes with are?

    Salsa is good for what it does, but doesn't offer the same flexibility and precision LipSync Pro does. It really isn't even close.

    Man, I would hate to be an asset publisher and have to deal with this kind of behavior from lazy/entitled users.
     
  21. msminotaur

    msminotaur

    Joined:
    Mar 27, 2014
    Posts:
    1
Hi - I just started using LipSync Pro recently, and while I've been able to run AutoSync on some of my audio files, I've run into an issue where certain audio files fail with an error telling me to check the encoding or enable conversion. The audio files that DO work are no different from the audio files that DON'T work (other than length and actual content...but they're all encoded the same), and we have conversion enabled. I've tried to look through all the posts in this thread, but I haven't seen much about AutoSync and this issue. Has anyone else encountered this inconsistency? Is there a solution I'm completely missing?

    Screen Shot 2020-03-05 at 3.31.11 PM.png
     
  22. DeadSeraph

    DeadSeraph

    Joined:
    Sep 8, 2017
    Posts:
    97
    1) If any of the things I have asked about are listed in the documentation, they certainly aren't clear or apparent, as I've been through the documentation multiple times, and I don't see a section that answers the questions I have asked. Maybe you could point me to the relevant sections?

    2) I'm not a programmer, I'm just doing some research for my team. We've purchased the pro version and it would be nice to not have to buy a different asset and relearn that one.

    3) If you purchase a product on the asset store, it is assumed that it is still being supported by the developers unless it has been deprecated. I'm not the only person who has had issues getting questions answered by the developer for this plugin.

    4) It's not a matter of being lazy. Support your asset or deprecate it. It's that simple. The developer has not been active in this thread for months and does not answer emails sent to their website for the product either. No need to be rude.
     
    AGregori likes this.
  23. Gubendran_Vins

    Gubendran_Vins

    Joined:
    Mar 20, 2018
    Posts:
    10
    Hi,
I am working with the LipSync Pro plugin for an Oculus Go VR application. Everything is working fine, but it reduces my FPS and hangs the application for a moment when my humanoid model starts to speak.

    Is there any solution for this?

    Thanks.
     
  24. FirstTimeCreator

    FirstTimeCreator

    Joined:
    Sep 28, 2016
    Posts:
    745


    Fri, Mar 6, 2:43 PM (4 days ago)

    Support Request was sent 4 days ago. If I don't hear anything by weeks end, I guess we will have no choice but to change to Salsa because this is holding up development.

This is a good product; if I can get some support it would be worth using. My last ditch effort to fix the problems will be upgrading to the latest Unity version. Surely others are experiencing these issues where the data is not saving to the preset properly for Generic rigs, as well as the facial poses being basically "erased" when you test a sample clip in preview - again using a Generic rig AND the Bones system (NOT blendshapes).
     
  25. vorokoi

    vorokoi

    Joined:
    Oct 18, 2019
    Posts:
    39
@Gregorik I don't think you know what crusade means. I answered someone's questions, which I guess you could have done too, but you chose not to? If you aren't going to address a topic, don't tell someone who is addressing it not to.

    I am using LipSync Pro 1.5.2 on 2019.3.4f1 with absolutely zero issues.
     
  26. RobertEspinoza

    RobertEspinoza

    Joined:
    Aug 18, 2014
    Posts:
    10
Hello everyone. We are having an issue with the LipSync Pro editor GUI not displaying properly, specifically the LipSync Data GUI. We are using Unity 2018.4.16f1 and have updated LipSync Pro to the latest version, 1.52, following the update guidelines. We were previously on an older version of LipSync Pro and were having component GUI issues where the menus appeared stacked, but that is now fixed. The issue we have now is that the LipSync Data windows don't display the GUI properly. Here is a screenshot of what I am talking about. If anyone has any idea how to resolve this, we would greatly appreciate it.

    upload_2020-3-16_14-25-46.png
     
  27. RobertEspinoza

    RobertEspinoza

    Joined:
    Aug 18, 2014
    Posts:
    10
OK, just wanted to answer my own question in case anyone else runs into this issue. Our problem was that we had moved the Rogo Digital Editor Default Resources to a subdirectory in the project without noticing. Once we moved all those resources back to the proper Editor Default Resources directory at the root of Assets, the UI started working properly.
     
    CoyoteFringe and AGregori like this.
  28. NawarRajab

    NawarRajab

    Joined:
    Aug 23, 2017
    Posts:
    20
Will this ever be solved? We have had this bug for more than 6 months now:

    Code (CSharp):
ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection.
Parameter name: index
System.ThrowHelper.ThrowArgumentOutOfRangeException (System.ExceptionArgument argument, System.ExceptionResource resource) (at <437ba245d8404784b9fbab9b439ac908>:0)
System.ThrowHelper.ThrowArgumentOutOfRangeException () (at <437ba245d8404784b9fbab9b439ac908>:0)
System.Collections.Generic.List`1[T].get_Item (System.Int32 index) (at <437ba245d8404784b9fbab9b439ac908>:0)
RogoDigital.Lipsync.BlendSystem.SetInternalValue (System.Int32 blendable, System.Single value) (at Assets/DMAI/3rd Party/Rogo Digital/LipSync Pro/Classes/BlendSystem.cs:230)
RogoDigital.Lipsync.BlendshapeBlendSystem.SetBlendableValue (System.Int32 blendable, System.Single value) (at Assets/DMAI/3rd Party/Rogo Digital/LipSync Pro/BlendSystems/BlendshapeBlendSystem.cs:67)
LipSyncEditor.OnInspectorGUI () (at Assets/DMAI/3rd Party/Rogo Digital/LipSync Pro/Editor/LipSyncEditor.cs:840)
UnityEditor.UIElements.InspectorElement+<>c__DisplayClass55_0.<CreateIMGUIInspectorFromEditor>b__0 () (at <6f28216fea9f453abf2e05b770ed3ee4>:0)
UnityEditor.PopupCallbackInfo:SetEnumValueDelegate(Object, String[], Int32)
     
  29. Joshua_Bogart

    Joshua_Bogart

    Joined:
    Aug 5, 2015
    Posts:
    4
    Is it safe to say this asset isn't a viable option for WebGL builds? More specifically, any details on WebGL 1.0 vs. WebGL 2.0, asm vs. web assembly, etc? 2018.4.20f1 is the current target, but "AudioPlayback timing mode is incompatible with WebGL target." is all I see when adjusting Timing Modes. There doesn't seem to be much documentation for this and forums seem thin with details.
     
    Last edited: Apr 9, 2020
  30. AGregori

    AGregori

    Joined:
    Dec 11, 2014
    Posts:
    437
    Heh, Lipsync Pro is part of the spring sale with 50% off. Since the asset is now clearly abandoned for 6+ months, new buyers should probably beware.
     
    Alvarezmd90 likes this.
  31. mic474

    mic474

    Joined:
    May 7, 2017
    Posts:
    3
Hi, I absolutely love the asset but I'm new to coding. Could anyone help me with an example of how to call clips from code? I have this from my first coding lessons:

public class startSpeech : MonoBehaviour
{
    public LipSync lipSyncCharacter;
    public LipSyncData clip1;

    void Start()
    {
        LipSync.Play(LipSyncData clip1, 1);
    }
}

    There seems to be something wrong with the way I try to handle the type 'LipSyncData, which is not valid in the given context'.

    Many thanks in advance!
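For future readers, a corrected sketch of the script above (assuming the RogoDigital.Lipsync namespace and the instance Play(clip, delay) call used elsewhere in this thread) would look like:

Code (CSharp):
using UnityEngine;
using RogoDigital.Lipsync; // LipSync types live in this namespace

public class startSpeech : MonoBehaviour
{
    public LipSync lipSyncCharacter;
    public LipSyncData clip1;

    void Start()
    {
        // Play is called on the LipSync instance, and the argument
        // list takes only the variable, not its type name.
        lipSyncCharacter.Play(clip1, 1f);
    }
}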
     
  32. Jakub_Machowski

    Jakub_Machowski

    Joined:
    Mar 19, 2013
    Posts:
    473
    SammmZ likes this.
  33. ben_unity723

    ben_unity723

    Joined:
    Sep 27, 2018
    Posts:
    3
    Rtyper, did this ever get integrated into the latest product? Considering buying this for my timeline mini animation video I'm building in Unity, or was looking to see if iClone's animations were a better option for timeline ease.
     
  34. Caruos

    Caruos

    Joined:
    Dec 30, 2016
    Posts:
    42
    Hello,

I'm trying to integrate LipSync Pro into my game, but so far I've encountered several major glitches in bone transform poses. Here are two that can easily be reproduced from the example scenes; could you please help me find the cause:

    Bug 1 : preset resetting scale

    Here's the testing protocol :

    1. Import LipSyncPro 1.52 in Unity 2019.3

    2. Open the "Example_02_Bone_Poses" scene

3. In the LipSync component on "GRFLHead_Liza", change the Z rotation on phoneme A from 260.2 to 265

    4. Save preset with "Relative bone transform"

    5. Apply preset on "GRFLHead_Liza"

=> the scales on the bones have been set to 0, not just in the LipSync component but in the scene as well! The model is now completely unusable
        upload_2020-5-25_13-46-31.png

    Bug 2 : Data corruption when selecting another LipSync object

    Here's the testing protocol :
    1. Import LipSyncPro 1.52 in Unity 2019.3

    2. Open the "Example_02_Bone_Poses" scene

3. Make a copy "GRFLHead_Liza2" of "GRFLHead_Liza". Both have a Z rotation on phoneme A of 260.2

4. In the LipSync component on "GRFLHead_Liza", change the Z rotation on phoneme A from 260.2 to 265

5. Click on "GRFLHead_Liza2", then "GRFLHead_Liza". GRFLHead_Liza now has a Z rotation on phoneme A of 270

6. Click on "GRFLHead_Liza2". GRFLHead_Liza2 now has a Z rotation on phoneme A of 270
Thanks in advance for your help.

    Regards.
     
  35. BenWoodford

    BenWoodford

    Joined:
    Sep 29, 2013
    Posts:
    108
    This is kind of a weird question, but if I have audio playing on one device on a LAN (let’s say speakers broadcasting to a room) and a character on individual devices that need to lip sync up to that audio being broadcast to the room, is that possible at all, or does this rely on a local audio source being present?

    Assume the clients are networked, and all clients are mute - only the server is playing the voice clip.
     
  36. donkey0t

    donkey0t

    Joined:
    Oct 23, 2016
    Posts:
    55
Hi, I'm just starting out with LipSync Pro and I'm trying to set up a Character Creator 3 sync. When I start to customise the phonemes I expect to see, given the videos I've seen online, a little gizmo showing me an example of the mouth shape, but I'm not seeing that. What do I need to do to enable that mouth shape guide?
     
  37. donkey0t

    donkey0t

    Joined:
    Oct 23, 2016
    Posts:
    55
re: Eye Controller. By default my model's eyes face into the skull when I use the Eye Controller. I've set the offset rotation on Y to 180, which means I can at least see their eyes. However, the LookAt doesn't seem to work - the character is just permanently looking off to their right. Any ideas?
     
  38. donkey0t

    donkey0t

    Joined:
    Oct 23, 2016
    Posts:
    55
Argh, now when I apply an Animation controller the eyes roll back into the head again. <frustrated>
     
  39. jeromeWork

    jeromeWork

    Joined:
    Sep 1, 2015
    Posts:
    372
    Does anyone have a phoneme preset they could share for Reallusion's Character Creator 3?

    Thanks
     
  40. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    450
    Hi everyone, I've finally got some updates on what's happening with LipSync and Eye Controller - I'm sorry that it took so long!

    First up, there's a new LipSync Pro update that's just gone live today on the Asset Store! LipSync Pro 1.521 contains several bug fixes relating to bone transforms, specifically when used in presets, and an improvement to the inspector UI, which was broken by the new editor theme in Unity 2019.3. You can find the full changelist on the Releases tab in the store: u3d.as/cag.

    Support
    I realise support has been lacking for a while, I've been replying to emails and picking up Facebook messages etc but there's always been a lack of organisation when it comes to keeping track of everything. To try and fix that I'm doing two things:
    1. Moving to a ticket-based support system.
      From next week on, if you send us a message on Facebook or email contact@rogodigital.com (or start a ticket manually HERE), it will create a ticket that I can use to keep track of support requests better. I'll aim to respond within 3 days, and it should help stop losing emails that have been replied to but still need further action.
    2. Starting a Discord server
I'm almost always on Discord anyway, as we use it to communicate within Rogo Digital, so I'll get notifications. It also handles checking of invoice numbers etc. to keep things clean and organised, and allows community discussion as well. You can join it here: https://discord.gg/V475qqh.
    The Discord doesn't (currently) automatically create tickets, but where someone sends a specific support request in the right channel and it's not user error or something easily solved, I'll create a ticket manually.

    With this in mind, while I will try to check-in on these forums more often, I can't guarantee responses here. The best places for getting support in future will be:


    Future Updates

    I've been talking about various future updates for a while, specifically 1.6 and 2.0. The plan was for 1.6 to be a compatibility update, only supporting 2019.2 and up, moving the editor UI over to the new UI Toolkit system in Unity, replacing our custom-made keyboard shortcuts and editor preferences screens with the built-in Unity versions that have been made available since ours were built. 2.0 would then be an entirely new ground-up rebuild with new features, aimed at solving many of the problems LipSync has accumulated over 5 years of having new features and systems added to an old codebase.

    While working on 1.6, I gradually realised that the amount of extra support code and new work required to migrate the UI cleanly was far too much for something that would be more-or-less thrown away anyway. I've also started and restarted work on 2.0 a few times, trying to find the best way to create a cleaner, more easily fixable, update-able and extensible plugin.

    tl;dr: Working on LipSync Pro in its current state is a pain, and several less drastic ideas for version 2 just wouldn't have solved enough problems.

    The upshot of all this is that I'm now working on new versions of LipSync and Eye Controller, and a new product called Performer that I'll share more details on another time. All these tools share a core codebase, handling a lot of runtime and editor functions, including a new version of BlendSystems, that makes compatibility between them and sharing bugfixes across them super simple. I'm really happy with the direction of this so far, which is why I feel finally ready to publicly announce it all.

    Although the asset development & support is just me, as a wider team we're currently working on our first commercial game, so to avoid confusion between the two sides to Rogo Digital, I've decided to create a new umbrella brand that our assets will fall under, reflecting the focus on cinematics & character animation: Cinetools

    Cinetools-forum.png

    I'll be putting together a post specifically on the new features in these tools soon, but in the meantime here's a quick preview of the redesigned editors (to fit in with the 2019.3 editor redesign) and the improved Blend Systems:
    upload_2020-6-5_14-24-36.png

The plan is to release Eye Controller 2 first, in late July, followed by LipSync Pro/Lite 2 later in the year. These will be paid upgrades (exact prices TBD, though it won't be huge) and we'll be setting a one-year grace period, meaning anyone who has bought LipSync Pro within a year of version 2 being released will get the upgrade for free. I believe this is a good balance between making sure those of you who bought it recently aren't denied value for money, and allowing me to continue working on these products into the future.

    Stay safe!
     
    Last edited: Jun 6, 2020
    MoMonay, CoyoteFringe and AGregori like this.
  41. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    450
    I know it's probably too late for many of the posts here, but I'll provide answers where I can for future reference!

    There's a processing step that the LipSync character does when it starts playing which will be causing the hang. You can use the "Data Preprocessor" in the Window > Rogo Digital > LipSync Pro menu to do the processing step in advance, though it will mean the LipSyncData clip will be tied to a single character. You can also try the Anim Clip Export if you want to avoid the runtime cost of LipSync entirely, you can download it from the Extensions Window (though audio playback won't be handled automatically).

It is compatible with WebGL, you'll just need to choose a timing mode other than AudioPlayback (Custom Timer is recommended, and it will fall back to this automatically) - it's because audio playback in WebGL doesn't report its time correctly for LipSync to use. You may also want to look at pre-processing or anim clip export (as said above) for speed.

    That looks fine to me, you probably just need the namespace - all the runtime parts of LipSync Pro are in the "RogoDigital.Lipsync" namespace, so you'll need to add "using RogoDigital.Lipsync" to the top of your script.

    Congrats on reaching your target! I hope the game does well!

    The timeline extension is now available for download, it doesn't support previewing in the editor but it does work in play mode.

    I've replied to your email, but the fixes for these issues are included in today's update.

    This is interesting, do you just mean you need LipSync to play on all the clients at once, or are you trying to bypass LipSync's audio playback and specifically only play audio on the server?

    If it's the former, I'd simply call LipSync.Play inside an RPC to call it on the clients separately. If it's the latter then I don't really have any simple advice - could you give any more info about what you're trying to do?
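A minimal sketch of that RPC approach (assuming a Mirror-style NetworkBehaviour and the single-argument Play(clip) overload; attribute names differ slightly in other networking libraries):

Code (CSharp):
using UnityEngine;
using Mirror; // assumption: Mirror networking; UNet's HLAPI is similar
using RogoDigital.Lipsync;

public class NetworkedSpeech : NetworkBehaviour
{
    public LipSync lipSyncCharacter;
    public LipSyncData clip;

    // Run on the server to make every client start the same clip.
    [Server]
    public void Speak()
    {
        RpcPlayClip();
    }

    [ClientRpc]
    private void RpcPlayClip()
    {
        // Each client plays the LipSyncData locally.
        lipSyncCharacter.Play(clip);
    }
}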

    Can you modify your hierarchy? The best way to fix issues like this is by adding a dummy parent object to your eye bones (or eye meshes depending on how your character is set up), modifying their transforms so that the dummies face Z-forward, but the real bones/meshes as children are facing the right way. This is all stuff I'm fixing in version 2, but that's the best approach for now.

    It should be visible by default (in the scene view, not the game view). I can't think of any reason why it wouldn't be. Are there any errors/warnings in the console that seem related?
     
    CoyoteFringe likes this.
  42. AGregori

    AGregori

    Joined:
    Dec 11, 2014
    Posts:
    437
    Excited to see that Rogo is back with an update!
    You do have something of a customer service disaster on your hands currently, including the newer reviews, but nothing that further bugfixes can't cure. ;)
     
    jeromeWork likes this.
  43. jeromeWork

    jeromeWork

    Joined:
    Sep 1, 2015
    Posts:
    372
    I've kept using and recommending the asset simply because it's the best at what it does, even with all its flaws.
     
    MoMonay, Rtyper and vorokoi like this.
  44. BenWoodford

    BenWoodford

    Joined:
    Sep 29, 2013
    Posts:
    108
    Nailed it, I need the server to play the audio and the clients to sync up. If it's possible to just bake the lip-sync in the editor and have it play without audio (or with audio muted perhaps...?) on the client that'd work. It's for a public-ish installation, so the server plays the audio and the clients do some stuff on their iPads in AR - that AR creature is the thing that needs the lip sync.
     
  45. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    450
    That's the plan :)

    Ahh, ok I see. In that case, I'd say the easiest option is exporting to an animation clip. You can download the AnimClip Export package from the extensions window (Window > Rogo Digital > Get Extensions), and after compiling you'll be able to find a new "Export as AnimClip" option in the clip editor. It's marked as experimental at the moment because it doesn't support every possible Blend System, but if you're using blendshapes or bone transforms it should work perfectly.

    You will then have to handle playing the audio manually, but that shouldn't be too much of an issue.
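Handling the audio manually alongside a baked clip could look something like this sketch (assuming the exported AnimationClip has been marked Legacy and added to an Animation component; component and clip names are placeholders):

Code (CSharp):
using UnityEngine;

// Sketch: start a baked lipsync AnimationClip and its audio together.
public class BakedSpeech : MonoBehaviour
{
    public Animation characterAnimation; // holds the exported AnimClip (Legacy)
    public AudioSource voiceSource;      // plays the matching audio file
    public string clipName = "Line01";   // placeholder clip name

    public void Speak()
    {
        characterAnimation.Play(clipName);
        voiceSource.Play(); // start both on the same frame to keep them in sync
    }
}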
     
    BenWoodford likes this.
  46. MoMonay

    MoMonay

    Joined:
    May 9, 2018
    Posts:
    18
Hi, this asset is probably my favorite of the hundreds of assets I own!

Sometimes when I reopen my Unity project, the preview in the clip editor no longer works for lipsync files I have already created - everything runs fine, but the skinned mesh blendshape values don't change during lipsync playback. Any ideas what might be causing this?
     
  47. raghuakella

    raghuakella

    Joined:
    Jun 11, 2020
    Posts:
    1
    Hi all,
I'm new to animation and only found out about LipSync Pro lately. Can this tool be used on videos recorded from a phone/camera? My idea is to record a toy and add lip sync to make it look like a talking toy. Any thoughts on whether this can be achieved with LipSync Pro?
     
  48. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    450
Thanks! Is this problem happening just in the clip editor preview, or at runtime as well? Could you also check whether any errors or warnings appear in the console when it happens? There shouldn't be any way for it to go from working to not working without something changing.

    Unfortunately not by default, sorry! It may be possible with a still image and some pre-edited frames (of different mouth shapes) made in Photoshop or something similar, but it would require some image editing and likely programming to create a custom blend system for LipSync. Video would be much much harder, and if you wanted it to be live (e.g. an AR app or similar) would be next to impossible.
     
  49. DarkTree

    DarkTree

    Joined:
    Mar 23, 2013
    Posts:
    239
Hello everyone, I'm going to buy LipSync Pro and I'm wondering: is it possible to change the morpheme set for other languages and detect morphemes other than English?
     
  50. Rtyper

    Rtyper

    Joined:
    Aug 7, 2010
    Posts:
    450
The morpheme set is completely customisable, that's no problem. AutoSync (for detecting phonemes automatically) is currently only available in English, though there are other language models that can be adapted to work with it - it will take some work to create the phoneme map converting the model's output to whatever set you create for use in LipSync Pro.
     