Question: How to use Face Capture with other animations playing simultaneously during runtime?

Discussion in 'Virtual Production' started by diegooriani_unity, Nov 13, 2021.

  1. diegooriani_unity

    diegooriani_unity

    Joined:
    Jul 23, 2021
    Posts:
    10
    I am trying to play an idle animation together with real-time face capture using Unity Live Capture. The issue I am facing is that the server connection between the devices stops when I run the game. Is it possible to have both animations running simultaneously? If so, can anyone point me in the right direction?

    I really appreciate any help you can provide.
     
  2. akent99

    akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    Have you tried creating a Timeline, dropping in the idle animation (looped), and then using the Timeline preview mode to play the animation?

    This is just a suggestion to try. I don't use Face Capture from Unity; I use a different tool, and I cannot work out how to capture only the face: it captures face and body movements at the same time. But I added an Activation Track in a Timeline to activate/deactivate the hair (so it appears/disappears) and played it in "preview mode" (the play button in the Timeline window). That did activate/deactivate the hair at the same time as my other tool was capturing live, so it may be possible if you can tell the Unity tools to capture only the face and nothing else.
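
    For what it's worth, the Timeline setup described above can also be built in code with Unity's Timeline API. This is only a rough sketch of the idea, not the poster's actual setup; the field names (`idleClip`, `hairObject`, etc.) are placeholders you would wire up in the Inspector.

    ```csharp
    using UnityEngine;
    using UnityEngine.Playables;
    using UnityEngine.Timeline;

    // Sketch: a looped idle AnimationTrack plus an ActivationTrack for the hair,
    // assembled at runtime instead of in the Timeline editor.
    public class IdleTimelineSetup : MonoBehaviour
    {
        public AnimationClip idleClip;    // looping idle animation
        public Animator characterAnimator;
        public GameObject hairObject;     // toggled by the Activation Track

        void Start()
        {
            var timeline = ScriptableObject.CreateInstance<TimelineAsset>();

            // Animation track that plays the idle clip on the character's Animator.
            var animTrack = timeline.CreateTrack<AnimationTrack>(null, "Idle");
            var clip = animTrack.CreateClip(idleClip);
            clip.duration = idleClip.length;

            // Activation track: the bound object is active while its clip plays.
            var actTrack = timeline.CreateTrack<ActivationTrack>(null, "Hair");
            var actClip = actTrack.CreateDefaultClip();
            actClip.duration = idleClip.length;

            var director = gameObject.AddComponent<PlayableDirector>();
            director.playableAsset = timeline;
            director.extrapolationMode = DirectorWrapMode.Loop; // loop the idle
            director.SetGenericBinding(animTrack, characterAnimator);
            director.SetGenericBinding(actTrack, hairObject);
            director.Play();
        }
    }
    ```

    Doing it in the Timeline editor is usually simpler; a script like this mainly helps if you need to generate the timeline dynamically.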

    I was also wondering about recording and playing animation clips at the same time: would they conflict (fight over who controls the Animator component, etc.)? If they do, there are some hacks to try. You could play back and record on different instances of the character standing next to each other; that way you can at least see the character's movements to time your performance, even if you cannot see the facial expressions on the same character. Or export a video of the looped idle and play that in another window. Recording the face over the top of other animation tracks can clearly make timing performances easier.
     
  3. diegooriani_unity

    diegooriani_unity

    Joined:
    Jul 23, 2021
    Posts:
    10
    Thank you for the reply. I managed to figure it out. However, I did not track the body, so there was no overlap of animations. The issue is that Unity stops the capture when you enter Play mode. The solution is to start the capture once the game is already running.

    On another note, I am interested to know more about your setup. Would you mind sharing more info (software and hardware), as I will be exploring body tracking at a later date?
     
  4. akent99

    akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    Glad you got it working! I would love to hear how it all goes for you.

    I am using some open source software called VSeeFace. It is aimed more at VTubers (Virtual YouTubers, upper half of the body). It uses a protocol called VMC, which a few other tools support as well. It works with VRM files, a format designed for 3D avatars, again used by VTubers but also by VRChat and other such apps. (VRM files are basically avatars using the Humanoid model, but with extra things defined for controlling the face, etc.) VRoid Studio is quite popular for creating VTuber characters, but there are also Blender plugins. There are also iPhone apps you can feed into VSeeFace, which then sends the data on to Unity, where it can be received via "EVMC4U" (more open source). A few such apps are Waidayo, iFacialMocap, TDPT (Three D Pose Tracker), 3d.kalidoface.com, and more. I then use EasyMotionRecorder (another open source project) to capture movements and create animation clips from them.

    Personally, I use VSeeFace with a Leap Motion camera for capturing hands, fed into EVMC4U and EasyMotionRecorder. I don't use its facial expression capture; instead I have been building up a set of animation clips for facial expressions. Then I layer walk animation clips, recordings from VSeeFace (upper half of the body), facial clips, etc. using Timeline, Sequences, and so on. Oh, and I use VRoid Studio to create all the characters because it's quicker. Example result: https://extra-ordinary-web-stories.web.app/ep1/ (work in progress).
     
  5. diegooriani_unity

    diegooriani_unity

    Joined:
    Jul 23, 2021
    Posts:
    10
    Very nice. So you are using mocap to animate the characters and animation clips to handle the face, is that correct? And what is EVMC4U? I am new to this whole world.
     
  6. akent99

    akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    EVMC4U (Easy Virtual Motion Capture for Unity, https://booth.pm/ja/items/1801535) is open source software (on GitHub). It receives the VMC protocol in Unity, which a number of VTuber tools use. I have been using a number of free packages from the VTuber community (free suits my budget!!! This is just a hobby for me).

    I do whatever a scene needs. My mocap cannot do walking (not well, anyway), so I tend to use walk animations for the legs (e.g. from the Unity Asset Store) and then use Avatar Masks in a Unity Timeline to override the top half of the body. I did not find facial mocap good enough for my liking (maybe my face is too boring, so it did not pick it up well), so I create facial expressions by hand as animation clips. I also found the finger tracking great at times, but not reliable enough to make a proper fist, so I hand-created a fist animation clip. Then I splice and override between all the different pieces in Timeline animation tracks to get as close as I can given the quality of the tools I am using.
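
    The same masking idea (walk clip on the legs, mocap recording overriding the upper body) can also be expressed with Unity's Playables API if you want it outside a Timeline. A minimal sketch, assuming you have a `walkClip`, an `upperClip` recorded from mocap, and an `upperBodyMask` AvatarMask; none of these are from the actual project described above:

    ```csharp
    using UnityEngine;
    using UnityEngine.Animations;
    using UnityEngine.Playables;

    // Sketch: layer an upper-body clip over a full-body walk using an AvatarMask.
    public class UpperBodyOverride : MonoBehaviour
    {
        public AnimationClip walkClip;    // full-body walk (drives the legs)
        public AnimationClip upperClip;   // mocap recording for the upper body
        public AvatarMask upperBodyMask;  // mask enabling only upper-body bones

        PlayableGraph graph;

        void Start()
        {
            graph = PlayableGraph.Create("UpperBodyOverride");
            var output = AnimationPlayableOutput.Create(graph, "Anim", GetComponent<Animator>());

            var mixer = AnimationLayerMixerPlayable.Create(graph, 2);
            graph.Connect(AnimationClipPlayable.Create(graph, walkClip), 0, mixer, 0);
            graph.Connect(AnimationClipPlayable.Create(graph, upperClip), 0, mixer, 1);

            // Layer 1 only affects bones enabled in the mask, overriding the walk there.
            mixer.SetLayerMaskFromAvatarMask(1, upperBodyMask);
            mixer.SetInputWeight(0, 1f);
            mixer.SetInputWeight(1, 1f);

            output.SetSourcePlayable(mixer);
            graph.Play();
        }

        void OnDestroy()
        {
            if (graph.IsValid())
                graph.Destroy();
        }
    }
    ```

    In a Timeline, the equivalent is setting the Avatar Mask on the Animation Track's "Apply Avatar Mask" option, which is what the workflow above describes.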

    Proper gloves and body mocap suit would probably make it all so much easier, but buying one of those suits is beyond my near $0 budget!!

    I wrote some blog posts at https://extra-ordinary.tv/behind-the-scenes/ if they are any help. It's random stuff, so you will need to sift through to find the Unity posts.
     
  7. diegooriani_unity

    diegooriani_unity

    Joined:
    Jul 23, 2021
    Posts:
    10
    Thank you for the info. I will definitely have a look at your blog. Meanwhile, this might be interesting for you —
     