
[FaceCapture] Feedback

Discussion in 'Virtual Production' started by GeniusKoala, May 18, 2021.

  1. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    Hey (me again),

    I tried the face capture. It actually reminds me of the facial-ar-remote project that I'm currently working with at work (we can discuss it if you're interested in motion capture for cartoon animation). This is much better, since it works out of the box and in Edit mode too (like the Virtual Camera, so cool!). I'm really impressed by how smoothly it runs in the Editor.

    I already have a feature request: being able to record the facial animation and microphone audio at the same time would be awesome. It's a critical feature for our needs at work, for example. Using the microphone in a live session is not easy in Unity, since you need to set a duration before you start playing the sound. Plus, it would be awesome to follow the same workflow you created for the facial animation, and to be able to route audio from the device to the computer and vice versa.
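    To illustrate the duration limitation I mean: a minimal sketch of live mic capture with Unity's built-in Microphone API. You must pass a fixed clip length up front, so the usual workaround is a short looping clip used as a ring buffer (the 10-second length and default device here are just example choices):

```csharp
using UnityEngine;

// Sketch only: Microphone.Start needs a fixed clip length, so live playback
// is typically done by looping a short clip as a ring buffer.
public class MicCapture : MonoBehaviour
{
    AudioSource source;

    void Start()
    {
        source = gameObject.AddComponent<AudioSource>();
        // null = default device; loop = true makes the 10 s clip wrap around.
        source.clip = Microphone.Start(null, true, 10, 44100);
        // Wait until the mic has delivered at least one sample before playing.
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }
}
```

    This works for monitoring, but you can hear why synchronized record-to-take support in the app itself would be much nicer.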

    Thanks for reading! I'm looking forward to using it and tweaking it!
     
  2. benmurray2

    benmurray2

    Joined:
    May 16, 2019
    Posts:
    3
    Looks pretty easy to set up, but the ARKit app seems to crash after about 30 seconds. I've tried on iPhone 11 Pro and iPad Pro so far.
     
  3. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    @GeniusKoala Thanks for the request. We have a plan to add on-device audio recording.

    @benmurray2 We're looking into this and will try and have a fix as soon as possible. Is there any more information you can provide about your setup? Your iOS version in particular. Also, does it say "Development Build" in the bottom corner of the app?

    It's worth noting that we have seen crashes on versions that aren't the latest iOS version (14.5). Please update to the latest iOS version for best results.
     
    Last edited: May 18, 2021
  4. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    Very useful. I'll definitely be using this for character work. Recording audio will be a great addition to the app.
     
    GeniusKoala likes this.
  5. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    Why isn't this section accessible through "Betas & Experimental Features", even though it's shown as being under that section?

    I have to keep searching for it in the email I got, or through my browser history o_O
     
  6. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    594
    This is a great stepping stone for raw data capture.
    I hope to see some more smoothing options too, as well as things like phoneme support alongside the audio mentioned above. I can tell you'd have base support for this regardless; it's just a good option for live capture cleanup.

    I can see that taking this round-trip into different applications would be great too.

    A small issue I have: the tongue always sits at the tip of the top teeth. I know the ARKit tongue is a bit gimmicky, but I couldn't find a way to make the tongue rest more naturally.
     
    akent99, GeniusKoala and Ruchir like this.
  7. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    The video seems to be unavailable, though.
     
    HIBIKI_entertainment likes this.
  8. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    594
    My apologies, I got ahead of myself and typed before the video had uploaded.
     
  9. beatdesign

    beatdesign

    Joined:
    Apr 3, 2015
    Posts:
    137
    I agree. Recording audio is the next must-have.

    I'd also suggest providing a more photorealistic face in the example scene. I know the important thing here is the richness of the blendshapes, and the sample scene already does a great job of that, but a more realistic face would encourage people to share their recordings on social media, letting more people know that Unity is finally making progress in virtual production.

    Another thing: if I connect the client to the server while in Edit mode, the connection is lost once I hit Play, and I have to reconnect. Not a bug per se, but it would be nice if the connection stayed active!
     
    Last edited: May 20, 2021
    GeniusKoala and Ruchir like this.
  10. marc_tanenbaum

    marc_tanenbaum

    Unity Technologies

    Joined:
    Oct 22, 2014
    Posts:
    637
    Sorry about that. Our thinking is that we want people to explicitly join the Open Beta, which gives us a clear indication of interest and intent. Put another way, we need to know that people want these tools so we can justify the cost of development.

    I'll ask the team if we should revisit this decision.
     
    Ruchir and HIBIKI_entertainment like this.
  11. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    594
    @Ruchir We just watch the forum section, so we can get back to it that way.
     
  12. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    Good call. For the beta, the sample face is really "programmer art" made by a developer to help people see how things work without having to build a face. I'm sure we'll have something higher quality for release.
     
  13. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    I agree. It's just the first iteration of a beta, so this kind of sample is no surprise, and it's really good enough for testing.

    I was actually wondering: what is Unity's strategy for face capture? Do you want to reach people making cartoons, or realistic humans as well? The two need different approaches, since cartoon animation is totally different. Many cartoon characters' faces don't look like real human faces and need appropriate adjustments when processing the data. Plus, cartoon animation is more dynamic and, in my opinion, needs more work than realistic humans (I may be wrong, since I'm just getting started on this topic). Given Epic's MetaHumans, my guess was that Unity would rather target a cartoon audience first. If I'm wrong and you don't really focus on one audience, it would be awesome for a future version of this face capture feature to include sample scenes for different types of characters. Also, if you look at software like Reallusion iClone, they take the iPhone blendshapes and retarget them onto their own blendshape system. The iPhone blendshapes are quite limited on their own, but they're a good start for a studio with the resources to build around them. That's actually what we are currently doing to reach high face capture quality for cartoons.
     
  14. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    I think using ARKit for facial capture is a good idea as I can use existing hardware (an iPhone X) and it's compatible with other capture systems such as Moves By Maxon for Cinema 4D. I'm working on a project that needs facial capture and planned to use the Cinema 4D solution, so have been creating ARKit compatible blend shapes. Now I can switch to facial capture inside Unity, with no changes to my models. This is a big improvement to my workflow.
     
    marc_tanenbaum and GeniusKoala like this.
  15. pauldrummond

    pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    145
    For me these tools are very important. In order of importance I'd put facial capture first, then virtual camera.
     
  16. fengkan

    fengkan

    Joined:
    Jul 10, 2018
    Posts:
    82
    Finally! I hope this becomes an official release soon.
     
    marc_tanenbaum likes this.
  17. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    594
    I'm really excited about all of these new projects (Virtual Camera / Face Capture). It'd be really nice to have raw facial capture in Unity too, both as an early production step and at later stages.
    It really shows potential new paths to explore and create,
    and it has opened a lot of doors that my team and I had previously written off. Fingers crossed for continued development here; we would happily help fund this project.

    We use ARKit in our pipelines in general, so it's really good to see it adopted here in a similar fashion.

    One thing I recently noticed is that I can't find the new Sequences package for Timeline anymore.
    Is it only available through the Cinematic Studio asset?
     
  18. markvi

    markvi

    Joined:
    Oct 31, 2016
    Posts:
    118
    It's still in beta, so it's not yet visible in most versions of Unity. Here's how to add it to any project:
    • In Unity 2021.1 and above, click the [+] button at the top left of the Package Manager, select Add package by name... and enter com.unity.sequences in the Name field and 1.0.0-pre.5 in the Version field.

      OR

    • In Unity 2020.3 and above, click the [+] button at the top left of the Package Manager, select Add package by git URL... and enter com.unity.sequences@1.0.0-pre.5.

      OR

    • In Unity 2019.4 and above, edit Packages/manifest.json and add the following line to the top of the list of dependencies:

      "com.unity.sequences": "1.0.0-pre.5",
     
    akent99 and HIBIKI_entertainment like this.
  19. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    594
    Thanks @markvi works perfectly
     
    markvi likes this.
  20. GerryM

    GerryM

    Joined:
    May 1, 2012
    Posts:
    20
    Greetings,

    thanks for letting us try this. However, we can't connect the Face Capture app to the server. The app runs on a current iPhone 12 Pro, starts up fine and even finds the server and port, yet after pressing Connect it takes a second or so and then displays "Can not connect to server" at the bottom.
    We tried several recent Unity versions on a MacBook Pro and on Windows. On Windows, a new project shows the "configure firewall" prompt on first start; it doesn't show this on the Mac, though. We also checked several ports.

    Any ideas? Thanks!
     
  21. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    @GerryM which version of the app and package are you using?
     
  22. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    This problem is always caused by not having the latest version of both the app and the package.
     
  23. GerryM

    GerryM

    Joined:
    May 1, 2012
    Posts:
    20
    The app is "Version 0.1 (400)" and the package is "1.0.0-pre.400"; I tried the HDRP Face Example...
    Is there a newer version? Both seem to be from May 14th.

    The error message in the app is "Can not connect to server" :-(
     
    Last edited: May 25, 2021
  24. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    And Unity version is 2020.3+?
     
  25. GerryM

    GerryM

    Joined:
    May 1, 2012
    Posts:
    20
    Unity Version 2021.1.7f1 (Windows)

    Funny enough, I got the app to connect to the Mac server by physically connecting the iPhone to the Mac.

    However, while the tracking looks fine in the app, it doesn't show up in Unity, sigh...
    Unity 2020.3.9f1 (Mac) does show the phone under connected clients, but the example head doesn't show any movement.
     
    Last edited: May 25, 2021
  26. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    Please click on the TakeRecorder and make sure that you have clicked the "Live" button to be in Live mode.
     
  27. GerryM

    GerryM

    Joined:
    May 1, 2012
    Posts:
    20
    Silly me, indeed, it's working now, thanks. Only on the Mac, though.

    One more piece of feedback: the app pushes the phone's brightness to the max on start, which is kind of annoying. Not sure if that's necessary.

    Thanks again!
     
  28. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    You mean it doesn't work on Windows?

    We'll fix this very soon.
     
  29. benmurray2

    benmurray2

    Joined:
    May 16, 2019
    Posts:
    3

    Hey there, I've updated iOS to 14.6 on the iPhone 11 Pro and I still get the crashes; it doesn't seem to matter whether I'm connected to Unity or not. Where can I update the iOS app now? I tried the original link and I get a message saying "This beta isn't accepting any new testers right now".

    Thanks,

    Ben
     
  30. markvi

    markvi

    Joined:
    Oct 31, 2016
    Posts:
    118
  31. BlakeSchreurs

    BlakeSchreurs

    Joined:
    Aug 10, 2016
    Posts:
    51
    I get an ArgumentNullException when connecting. Did I miss a step somewhere?

    2020.2.0
     
  32. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    The minimum supported version is 2020.3.

    Can you copy and paste the error from the console?

    Are you following the getting started guide posted in the original forum post? Also curious if you're using the face sample?
     
    Last edited: May 29, 2021
  33. benmurray2

    benmurray2

    Joined:
    May 16, 2019
    Posts:
    3
  34. cr4y

    cr4y

    Joined:
    Jul 12, 2012
    Posts:
    44
    On iPad + Windows PC I see "Trying to connect" and nothing happens. On the PC there are no error logs and the connected clients list is empty.
    https://i.imgur.com/DbIT0OL.png
    (using Unity 2020.3.5 and the current version of the FaceCaptureSample)
     
  35. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    @cr4y thanks for reporting. Can I get a few more details?
    • macOS or Windows?
    • What GPU?
    • What Unity version?
    • Which iPhone model?
    • Which iPhone iOS version?
    • Which app version? (you can find this in TestFlight)
    • Which Live Capture package version? (you can find this in the Package Manager)
     
  36. pbritton

    pbritton

    Joined:
    Nov 14, 2016
    Posts:
    159
    That is somewhat understandable, but it may not be the best way to gauge interest. I already struggle to find anything about this feature online, so if the final decision is based purely on current interest, the results may not be favorable. This is something I would like to implement in our animation curriculum, and there is an opportunity for Unity to connect with universities and colleges teaching animation, which could adopt this as part of their animation pipeline.
     
  37. pbritton

    pbritton

    Joined:
    Nov 14, 2016
    Posts:
    159
    I am also running into the same issue as cr4y.
    Windows 10 Build 19043
    RTX 2070 Super
    Unity 2021.1.12f1
    iPad Pro (M1)
    iPadOS 14.6
    Version 0.1
    Live Capture 1.0.1

    Update: I was able to resolve the initial connectivity issue by deleting the rule for Live Capture in the firewall settings. However, it now connects for 7 seconds and then disconnects.
     
    Last edited: Jun 21, 2021
  38. cr4y

    cr4y

    Joined:
    Jul 12, 2012
    Posts:
    44
    @bradweiers
    OS: Windows 10 PC + iPad Pro (A2229)
    GPU: Seriously, what difference does it make? xD GeForce GTX 1060 6GB
    Unity: 2021.1.12; it fails on other versions too
    iPadOS 14.4
    Version 0.1
    Live Capture 1.0.1-pre.465

    And yes, I've disabled the firewall (first using the "configure firewall" button in the Connections window, then in the Windows settings).

    Fun idea: some developers add debug logs when something happens (e.g. when a connection fails). Also, a reconnect button, or any indicator that the iOS app is trying to reconnect, would be much better than a "Trying to connect" button that lasts forever.
     
  39. akent99

    akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    I don't have 2021.2 installed yet, so I haven't had a chance to try the face capture. I was curious, however, whether it uses the VMC protocol (as used by apps like Waidayo, https://apps.apple.com/us/app/waidayo/id1513166077). VSeeFace (https://www.vseeface.icu/) + EVMC4U + EasyMotionRecorder etc. work pretty well for me. (E.g. VSeeFace can take VMC streams from other apps for more accurate facial capture, combine them with body movement, then send the combined stream through to Unity.) Just curious whether this tool tries to be compatible with those, or whether it has some benefit over them (other than becoming a built-in Unity standard, which has value in its own right).
     
  40. ScottSewellUnity

    ScottSewellUnity

    Unity Technologies

    Joined:
    Jan 29, 2020
    Posts:
    20
    @cr4y
    Thanks for the information. If the connection issue occurs at the Unity application layer, generally there is a useful log message. This leads me to suspect the client side or network topology to be the problem. Does manually entering the IP make any difference? Sometimes the client doesn't pick the right interface on the server to connect to when using the automatic server discovery, so that can help, as well as disabling unused or extra interfaces (VPNs, etc.).

    @akent99
    Both the Virtual Camera and Face Capture apps use a custom protocol to communicate with the editor (apart from the video streaming, which is RTSP). We considered various standards, but in practice no standard had all the features we need for our apps.

    The primary advantage of our apps is that they are integrated with the Live Capture API being developed, which allows for recording from multiple devices (vcam, face capture, third party, etc.) simultaneously. In the future, using this API will permit synchronizing different devices, which is critical for live scenarios.
     
    akent99 likes this.
  41. gjh_unity

    gjh_unity

    Joined:
    Oct 17, 2018
    Posts:
    3
    @bradweiers Hello, I'm having the same issue as GerryM. I see the server from the app on my phone but am unable to connect. I'm testing the ARKit Face Sample scene.

    Here is my info:

    OS: Windows
    Graphics card: GeForce GTX 1080 Ti
    Unity: 2021.1.16f1
    iPhone model: 12 Pro
    iOS version: 14.7.1
    Face capture app version: Not sure. Downloaded it yesterday from the app store.
    Live capture package version: 1.0.1-pre.585

    Any suggestions for me to try to get it running? Thanks!
     
  42. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    Can you try the Unity Virtual Camera app and see if that connects for you?
     
  43. gjh_unity

    gjh_unity

    Joined:
    Oct 17, 2018
    Posts:
    3
    Sure, I tried the Virtual Camera app and it has the same issue as the Face Capture app: I see my PC when starting the server in Unity, but get a "Could not connect to server" message in the app.
     
  44. mufoumu

    mufoumu

    Joined:
    Jan 27, 2020
    Posts:
    2
    I turned off Windows Defender Firewall and it finally connected.
     
  45. GeniusKoala

    GeniusKoala

    Joined:
    Oct 13, 2017
    Posts:
    97
    You should always keep your firewall enabled!

    Instead, add rules that let trusted programs do what they need.

    Open your firewall rules: type "firewall" in the Windows 10 search field and open it. You should see a window like this (sorry, mine is in French):

    upload_2021-9-13_14-30-4.png

    Now open the first entry on the left, "Règles de trafic entrant" (roughly, "inbound traffic rules"), and you should see a window like this:

    upload_2021-9-13_14-31-40.png

    Double-click the version of Unity you use and you will see this window:

    upload_2021-9-13_14-32-11.png

    Now just select "Allow the connection" (the first option), then click "Apply" and "OK". To be safe, authorize Unity Hub as well.
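    For anyone who prefers the command line, the same kind of inbound rule can be added from an elevated Command Prompt. This is only a sketch: the rule name and the editor install path below are examples and will differ on your machine.

```bat
:: Allow inbound traffic for the Unity Editor in Windows Defender Firewall.
:: The program path is an example; point it at the Unity.exe you actually launch.
netsh advfirewall firewall add rule name="Unity Editor 2021.1" dir=in action=allow enable=yes ^
    program="C:\Program Files\Unity\Hub\Editor\2021.1.12f1\Editor\Unity.exe"
```

    You can check it took effect with `netsh advfirewall firewall show rule name="Unity Editor 2021.1"`.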
     
    Last edited: Sep 13, 2021
    SimonRess and wetcircuit like this.
  46. stefanozan

    stefanozan

    Joined:
    Mar 27, 2019
    Posts:
    2
    Right now I'm able to connect and run face tracking in the Editor, but I see the connection code is set up as editor-only. Is there any support or option to run face tracking in a standalone PC build?

    I was going to implement a runtime version that uses CompanionAppServer, but I see it's marked internal. I mean, I could make the package local and start hacking away, but I'm confused about the design around runtime standalone connection support. Why is CompanionAppServer internal? Is there a different API class we're supposed to use?
     
    Last edited: Oct 8, 2021
  47. bradweiers

    bradweiers

    Joined:
    Nov 3, 2015
    Posts:
    59
    This first version is focused on being an editor utility, but adding runtime support out of the box is on our roadmap. Some functionality that's tied to the Editor, such as the Timeline integration, might be missing there.

    For the first iterations of the package we wanted to keep the public API surface small, so that we could stay as flexible as possible under the hood and change implementations without breaking users' projects. Once we have more clarity on exactly what we need to surface, we will gradually open up the API and add runtime support where applicable. For instance, this request for CompanionAppServer to be public is a useful one.
     
  48. stefanozan

    stefanozan

    Joined:
    Mar 27, 2019
    Posts:
    2
    Thanks for the quick reply. In that case I'll hack away on the Live Capture code, make CompanionAppServer public, and build my own runtime on top of it. I just wanted to make sure it was the right way to go. Hopefully there aren't too many low-level dependencies on editor code. Once you release a proper runtime version, I'll move to that.

    Timeline recording integration seems very much an editor-only feature. I can see playback being useful at runtime, though. I'm not using Timeline at the moment; I just need to drive the faces in real time.
     
  49. akent99

    akent99

    Joined:
    Jan 14, 2018
    Posts:
    588
    More of a "feeling" than deep insight, but the Editor/Play separation makes sense for game development, but less so for movie making. I want to record special effects or actions controlled via scripts, or capture live data. Some tools want to be in Play mode to work, others in Editor mode. The separation causes friction. E.g. using EVMC4U for motion capture needs to be in play mode, but rendering a timeline with sequences you also kick off play mode. So I have to turn on/off 'activate on entering play mode' on the top sequence. But I have to use Editor mode for scene composition... The nicest apps seem to be a movie making app where you just stay in Play mode all the time, but then it uses its own infrastructure (not scenes) for recording changes.
     
    wetcircuit likes this.
  50. GimmyDev

    GimmyDev

    Joined:
    Oct 9, 2021
    Posts:
    160
    I wonder if this will work with Android phones?