
[RELEASED] Polyphemus - facial capture system

Discussion in 'Assets and Asset Store' started by z4g0, Dec 13, 2019.

  1. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    POLYPHEMUS
    - facial capture system - markerless - single webcam -



    [ Asset Store ][ Documentation ]


    REQUIREMENTS
    • Unity 2018.3+ WINDOWS only
    • A computer* with a connected webcam or a laptop with integrated webcam.
      *For good capture quality, a mid-range to high-end computer is recommended ( e.g. GeForce GTX 1060 - 7th Generation Intel® Core ) along with an average-quality external or integrated webcam ( it doesn't need HD/4K resolution; a 30+ FPS framerate, however, is basically mandatory for smooth capture ).
    • Good light conditions: Natural light is recommended, though the system can still capture in low or bright light. Avoid intense frontal and/or back lights.
    • 3D model implementing morph targets via blendshapes ( or “Shape Keys” ).

    NB: This face capture system is based on facial landmark detection: while it can recognize a wide range of faces, it may have difficulty detecting some features on certain faces and appearances.


    • Does it support skeleton-based expressions?
      Polyphemus's capture system is based on blendshapes. It is still possible, however, to write a FaceTrackerAddon that rotates specific bones from point and metric data.
    • Does it need an external application?
      No, it works inside Unity.
    • Does it also track eye movements?
      Eye tracking is not officially supported yet: there's an experimental addon included in the examples.
    • Does it work like ARKit Remote face capture?
      Polyphemus uses a single standard webcam. Because of this limitation, it cannot guarantee the same results you'd get with a '3D' camera like the one on an iPhone X. While it can capture many expressions, some, like puffed cheeks, are impossible to get with this system.
    • Does my 3D model need to follow a naming convention for the blendshape names?
      No, it's up to you. There are also no guidelines about how many or which blendshapes you have to implement. For complex models, you can follow these specs.
    • Is it possible to edit captured clips?
      Yes. Polyphemus outputs standard AnimationClips, so you can edit frames and curves in Unity's standard Animation window.
    • It seems like it can't properly track parts of my face.
      Please check the lighting requirements. Also, try to stay 50cm / 1mt from the webcam. Your face must fit inside a rectangle of at least 80x80 pixels, so if you set a low webcam resolution you'll have to stay closer to the webcam. While the system can recognize a wide range of faces, it may have difficulty detecting some features on certain faces and appearances.

     
    Gekigengar likes this.
  2. InfiniteDesign8

    InfiniteDesign8

    Joined:
    Aug 22, 2017
    Posts:
    29
  3. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    460
    Wow, I'd love to try it out!

    What kind of blend shape setup does it need? Any guidelines to follow when creating the blend shapes?
     
  4. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    It's possible by implementing a FaceTrackerAddon: in a FaceTrackerAddon ( a simple MonoBehaviour ) you can read blendshape values and face metrics, and react to them ( to do things Polyphemus doesn't cover by default ).
    I just did a test, writing a simple FaceTrackerAddon that swaps 2D sprites on a skinned quad ( a skinned mesh is still mandatory ) according to the captured face expression.

    In the next update I'll add it as an example in the bundle :)
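    A rough sketch of what such a sprite-swap addon could look like (hypothetical: the actual FaceTrackerAddon base class isn't shown in this thread, so this simply reads the weights written into the skinned mesh each frame):

```csharp
using UnityEngine;

// Hypothetical sketch of the sprite-swap idea described above; it reads
// the blendshape weights the capture writes into the skinned quad and
// picks a sprite accordingly.
public class SpriteSwapAddon : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer face;   // skinned quad (still mandatory)
    [SerializeField] SpriteRenderer output;      // where the 2D face is drawn
    [SerializeField] Sprite neutralSprite;       // resting expression
    [SerializeField] Sprite openMouthSprite;     // open-mouth expression
    [SerializeField] int jawOpenIndex;           // index of a "jaw open" shape

    void LateUpdate()
    {
        // Weights run 0..100; swap sprites past an arbitrary threshold.
        float jaw = face.GetBlendShapeWeight(jawOpenIndex);
        output.sprite = jaw > 50f ? openMouthSprite : neutralSprite;
    }
}
```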

    Thanks!
    The number and naming of the blendshapes is up to you, but I suggest following the Faceshift or Apple ARKit guidelines ( https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation ) as they seem to work well on overlapping areas. The "oldman" example model bundled with the package follows this list, more or less.
     
  5. vresu

    vresu

    Joined:
    Sep 26, 2011
    Posts:
    2
    Does this work while playing the scene?
     
  6. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    Hi Vresu, the editor tool can still work ( capturing and updating blendshapes ) and record clips while Unity is in play mode, but it's not recommended, because all the bindings and changes you make in the blendshape editor will be lost once you stop the game.
     
  7. CanAydin

    CanAydin

    Joined:
    Mar 18, 2013
    Posts:
    10
    Hi there, I wrote an email, but I'm going to post it here as well; maybe I'll get a faster answer from someone else. When I try to open your example scene, my Unity crashes. I am using Unity 2018.4.9f1 and my specs are: Windows 10, AMD Ryzen 7 2700X eight-core processor at 3700 MHz, 16GB RAM, and an Nvidia GeForce RTX 2060.
     
  8. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    Hi Can, I just replied to your email about the webcam-related issues. As for this crash, could you please post the editor log here?
    Thanks
     
  9. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    3,999
    This looks interesting... I have actually designed my own HMD for facial performance capture using two cameras, and I will be using a mocap suit with this. From my understanding, this doesn't support joint-based facial rigs (with corrective shapes), so I will be keeping an eye on this. It feels like this was inspired by Dynamixyz Performer, which is EXTREMELY powerful and learns from your face.

    I am using marker-based capture (for stuff like puffed cheeks) since I will be able to carefully retarget my performance. Other than that, this is AMAZING work.
     
  10. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    Hi KRGraphics,
    Thanks! Yes, it doesn't support bone-based facial rigs natively ( it's based on blendshapes implementing actual expressions and/or facial features, rather than corrective shapes ). By extending it, it should be possible to drive bones according to point positions or metric values, but at the moment this isn't bindable directly from the editor and must be hardcoded in the addon.
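    To illustrate, a hardcoded addon along these lines might drive a jaw bone from a captured value (a hypothetical sketch: the names and the read-the-weight-back approach are assumptions, since the actual addon API isn't documented in this thread):

```csharp
using UnityEngine;

// Hypothetical hardcoded addon: rotates a jaw bone in proportion to a
// captured blendshape value read back from the driven mesh.
public class JawBoneAddon : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer face;  // mesh the capture drives
    [SerializeField] Transform jawBone;         // bone to rotate
    [SerializeField] int jawOpenIndex;          // "jaw open" blendshape index
    [SerializeField] float maxAngleDeg = 25f;   // rotation at full weight

    void LateUpdate()
    {
        // Normalize the 0..100 weight and map it onto a local rotation.
        float open01 = face.GetBlendShapeWeight(jawOpenIndex) / 100f;
        jawBone.localRotation = Quaternion.Euler(open01 * maxAngleDeg, 0f, 0f);
    }
}
```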
     
  11. FURKANKAHVECI

    FURKANKAHVECI

    Joined:
    May 12, 2013
    Posts:
    19
    I would like to see a Unity web demo or a video testing Polyphemus on Mixamo Fuse character blendshapes.
    +And a bit lower price
     
  12. z4g0

    z4g0

    Joined:
    Sep 6, 2013
    Posts:
    35
    Hi FURKANKAHVECI,
    I've done a test with the "Andromeda" character from the Mixamo library ( https://www.mixamo.com/#/?page=1&query=Andromeda&type=Character ) since it has facial blendshapes, unlike many other models in this library.
    Here you can find a quick video:

    I haven't linked all the blendshapes, since some jaw- and mouth-related blendshapes were similar and somewhat overlapping, and expressions like CheekPuff_Left and CheekPuff_Right are not supported by the tool.
     
    FURKANKAHVECI likes this.
  13. FURKANKAHVECI

    FURKANKAHVECI

    Joined:
    May 12, 2013
    Posts:
    19
    Yeah, Mixamo doesn't support some expressions.
    Still, very, very good capturing. Thank you for your time and reply.
     
    Thanathos likes this.
  14. Thanathos

    Thanathos

    Joined:
    Jan 17, 2014
    Posts:
    8
    Hope you see this one, now that it's on sale! I already got mine and will start playing with it :D
     
    FURKANKAHVECI likes this.
  15. Nhanpk

    Nhanpk

    Joined:
    Nov 8, 2017
    Posts:
    1
    Currently I see it only runs in the Unity editor. I wonder if it can be built into a Windows app with the face capture function.
     
    KRGraphics likes this.
  16. local306

    local306

    Joined:
    Feb 28, 2016
    Posts:
    147
    @z4g0 came across your asset during the sale this week. Gotta say it looks very promising!

    Are there any plans down the road to automate importing and configuring common models like iClone, DAZ, etc.?

    Also, can these animations work additively on top of currently applied blendshapes? Like, if my character is smiling, could this be used on top to make it look like a happy conversation?
     
    Last edited: Apr 15, 2020
  17. DeepShader

    DeepShader

    Joined:
    May 29, 2009
    Posts:
    667
    Hi, what is the technical reason this isn't working on an iMac?
     
  18. startupstor

    startupstor

    Joined:
    Aug 3, 2019
    Posts:
    7
    Just bought this plugin; it seems to only track faces in good lighting, which is fine. The initial problem is that even the provided demo says it needs help configuring blendshapes and wants me to manually link dozens of face parts. Very disappointed in the demo provided, and that real-time face tracking doesn't work for the demo's own model.
     
  19. amarillosebas

    amarillosebas

    Joined:
    Feb 28, 2013
    Posts:
    37
    Dude, what. I was about to buy this. The dev also isn't active here. I'm not feeling very confident.
     
  20. Zante

    Zante

    Joined:
    Mar 29, 2008
    Posts:
    405
    This is fantastic. If you can show this working seamlessly with UMA and its bone-driven facial animations, I'll buy it immediately and happily write something up. You're competing with very expensive solutions that just don't work as advertised, yet here you are with something that works, in-situ(!)

    I do hope you go the extra mile for supporting UMA. :]

    Edit [8th Sept 2020]: I've created a blendshape emulator for UMA. Long story short, it creates the blendshape entries and ensures they behave in the ranges you'd expect by driving a bone-based system which remains out of sight. The downside is that this only works at runtime. If Polyphemus can produce blendshape data at runtime, I can then map it to my blendshapes - or I can reconfigure them to work with it. Can you let me know how, and whether, Polyphemus is able to produce readable blendshape values at runtime (your videos seem to show it only working in edit mode)?


     
    Last edited: Sep 8, 2020
  21. colpolstudios

    colpolstudios

    Joined:
    Nov 2, 2011
    Posts:
    107
    Hi, I purchased this asset, but I'm unsure:

    How do I integrate audio so that it plays in sync with the animation?

    Has anyone been able to do this? If yes, please tell me how :)
     