
Official Introducing Enemies: The latest evolution in high-fidelity digital humans from Unity

Discussion in 'Announcements' started by LeonhardP, Mar 18, 2022.

  1. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
OK, I was curious because it seems far-fetched that they would support an old version like 2021.
     
  2. pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    143
    From the Github repository:

    • Minimum Requirements
      • Unity 2020.3 +
      • HDRP 10.9.0 +

    • Requirements for Skin Deformation and Skin Attachment GPU Path
      • Unity 2021.2 +

    • Requirements for new Eye and Skin Shaders
      • Unity 2022.2.0a16 +
      • HDRP 14.0.3 +
     
  3. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
    so there is this

    https://github.com/Unity-Technologies/com.unity.demoteam.digital-human.sample

    which appears to be the old demo (the bald guy). It says it works with 2019, so this skin deformation is most likely not included in that demo (you could try anyway and see if it works).

    I think that

    "Requirements for Skin Deformation and Skin Attachment GPU Path Unity 2021.2 +"

    can be interpreted as an indication that this "skin deformation" tech was already available back when they built the old demo, but they never included it in the official Unity releases. They have access to that tech and it works on those Unity versions, but whether it will ever be released for everyone is not 100% clear. They may just keep it in the drawer. Maybe they don't have the license to release the tech from the movie o_O

    Now, I think people should try to understand that there is a difference between the engine being capable of doing something and the work required to create that "something".

    If you are familiar with Shader Graph, you know you can create wonderful stuff with it, but you need to create it. All these amazing shaders are not included by default. A new shader graph is just that, a simple shader, a starting point; it's not even the default PBR material with all the options of the standard PBR material. (BTW, you can't recreate the standard Unity materials in Shader Graph 100%; they have some stuff under the hood you can't access. I've tried.)

    Anyway,

    so maybe Unity 2021.2 has the capability to deform skin :cool: but only if you know how to implement it. The team creating the movies definitely knows how to do it.
     
  4. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    The skin deformation is not really deformation. It is just a compute shader that measures the distance between vertices relative to their rest length (the length in the original mesh). That information is stored in the mesh's color channel and then used by Shader Graph to display a wrinkle map, or whatever you like to display wherever there is tension in the skin.
    Nothing really magic, just an elaborate shader trick. And this should work since 2021.2+, as it only needs the "new" Mesh API to access the mesh from compute shaders.
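    To make the rest-length idea concrete, here is a minimal plain-Python sketch of the same math (all names are made up for illustration; in the actual package this runs as a compute shader and the result is written into the vertex color channel, not returned as a list):

    ```python
    # Per-vertex skin tension from edge rest lengths, sketched on the CPU.
    # 0.0 = at rest, positive = stretched, negative = compressed.

    def vertex_tension(rest_positions, deformed_positions, edges):
        """Average relative stretch/compression of the edges around each vertex."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        tension = [0.0] * len(rest_positions)
        count = [0] * len(rest_positions)
        for i, j in edges:
            rest = dist(rest_positions[i], rest_positions[j])
            now = dist(deformed_positions[i], deformed_positions[j])
            strain = (now - rest) / rest  # relative change vs. rest length
            for v in (i, j):
                tension[v] += strain
                count[v] += 1
        return [t / c if c else 0.0 for t, c in zip(tension, count)]

    # Two vertices pulled apart to twice their rest distance -> strain 1.0
    rest = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    deformed = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
    print(vertex_tension(rest, deformed, [(0, 1)]))  # [1.0, 1.0]
    ```

    A shader then reads that per-vertex value and blends in a wrinkle normal map where the tension crosses some threshold.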
     
  5. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
    Ah, OK, I remember: accessing the deformed mesh at runtime inside the GPU. That is the holy grail, man. I really want to see that at some point in the future. Maybe new GPUs will make this possible.

    Well, you can't do that; you need to get the mesh out of the GPU, then push it back.

    I've actually made this for both Unity and Unreal, but it's pointless: accessing the mesh drops the frame rate, it tanks it hard, both on Unreal 5 and Unity. It was definitely easier on Unity, as C++ is ridiculous. After I figured out how to do it I was as happy as a young boy, then it hit me: it's not real time :eek:

    It works like this: you deform the skin using blendshapes/morphs, but once you do this you don't know the real-time position of the deformed vertices.

    So you need to get the deformed mesh back from the GPU, do whatever you need with it, then put the mesh back on the GPU. You can do this in Unity/Unreal, but it's expensive. In Unity I was 3D painting on the deformed mesh in real time; in Unreal I wanted to know where a vertex was so I could precisely place other objects there (while the character was animating).

    It worked, but it can't be used in real time.

    That's probably why you will never see a real-time game demo for these Unity "demo" movies; they can't run at normal frame rates. It was probably rendered frame by frame. It's a movie.

    You can't have both real time and deformation. My knowledge of GPUs is fuzzy, but it appears to be a technical limitation of how and why GPUs and CPUs are different.

    But maybe they will manage to do this at some point.
     
  6. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    The trick is to do everything on the GPU. Once data is on the GPU, you can do whatever you want very fast. You should never pull it back to the CPU. That being said, GPU programming is a nightmare, but worth it in the end. So yes, real-time deformation is definitely possible; Ziva is doing it, for example, even on modest GPUs. The biggest issue is then that this does not play together easily with physics, if physics is calculated on the CPU only. You could somehow pass some collider data to the GPU, but this is limited (you'd better use only primitives, and a limited number of them), and you don't get feedback back into the CPU world.
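    The "pass primitive colliders to the GPU" idea can be sketched in a few lines. This is plain Python standing in for what would be one compute-shader thread per vertex (function and variable names are illustrative, not from any Unity API):

    ```python
    # Resolving skin vertices against a sphere collider: the kind of cheap
    # primitive data you can upload to a GPU kernel each frame.

    def resolve_sphere(vertices, center, radius):
        """Push any vertex inside the sphere out to its surface."""
        cx, cy, cz = center
        out = []
        for x, y, z in vertices:
            dx, dy, dz = x - cx, y - cy, z - cz
            d = (dx * dx + dy * dy + dz * dz) ** 0.5
            if 0.0 < d < radius:
                s = radius / d  # scale the offset so the vertex lands on the surface
                out.append((cx + dx * s, cy + dy * s, cz + dz * s))
            else:
                out.append((x, y, z))
        return out

    # A vertex halfway inside a unit sphere gets pushed to the surface.
    print(resolve_sphere([(0.5, 0.0, 0.0)], (0.0, 0.0, 0.0), 1.0))  # [(1.0, 0.0, 0.0)]
    ```

    This is also why the feedback problem exists: the pushed-out positions live only in GPU memory, so CPU-side physics never learns the skin moved unless you pay for a readback.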
     
  7. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
    That Ziva appears to use AI to create/bake morphs ahead of time, which are then used to animate the character in real time. From what I'm seeing, it doesn't mess with the GPU while the app is running in real time.

    OK, so let me explain: GPUs are more than capable of deforming and animating a base mesh using blendshapes/morphs. But you need to have those blendshapes made before you use them in real time.

    Unreal has this workflow where, depending on how a bone is animated, it applies certain morphs with a certain value at runtime. You need the blendshapes/morphs to exist, and you need to set them up with the correct values to react to the bone animation. OK.
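    The runtime part being described, applying pre-made shapes with a weight driven by a bone, is just a weighted sum of per-vertex deltas. A plain-Python sketch with made-up data (the jaw-angle mapping is a hypothetical example of a bone-driven weight, not any engine's actual API):

    ```python
    # Blendshapes: base mesh plus a weighted sum of per-vertex deltas.
    # This linear blend is the part the GPU evaluates very cheaply each frame.

    def apply_blendshapes(base, shapes, weights):
        """base: list of (x, y, z); shapes: {name: list of (dx, dy, dz)};
        weights: {name: strength in 0..1}."""
        result = [list(v) for v in base]
        for name, weight in weights.items():
            for i, (dx, dy, dz) in enumerate(shapes[name]):
                result[i][0] += weight * dx
                result[i][1] += weight * dy
                result[i][2] += weight * dz
        return [tuple(v) for v in result]

    base = [(0.0, 0.0, 0.0)]
    shapes = {"smile": [(0.0, 1.0, 0.0)]}

    # Driving the weight from a bone, pose-driver style: a hypothetical jaw
    # angle mapped linearly (and clamped) to the shape's strength.
    jaw_angle, max_angle = 15.0, 30.0
    weight = max(0.0, min(1.0, jaw_angle / max_angle))
    print(apply_blendshapes(base, shapes, {"smile": weight}))  # [(0.0, 0.5, 0.0)]
    ```

    The expensive part is authoring the deltas ahead of time; the runtime blend itself is trivial.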

    This Ziva tech appears to do this: depending on the face animation, they use AI cloud tech to create the morphs. Then you get these morphs and you can use them for real-time animation. This AI calculation (machine learning or whatever it's called) takes quite some time.

    There is an open-source app for Unreal that lets you do this on your computer right now, but it is slow as hell; it takes days. It needs quite some power. Ziva probably has a network of good computers to cut down the time.

    But it is not created in real time directly on your GPU; it's just a more advanced version of tech that already exists.

    I may be off here, but I think you are mistaking this tech for something else. You can't put your hands inside the GPU while it's doing its stuff. I mean, you somehow can, but you lose the speed. The GPU is fast because you feed it data and it does something really fast. If you want certain data back from the GPU, it will give it to you, BUT depending on how big and complex the data is, it will get really slow.
     
  8. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    Ah, we are speaking about two different things ;) Yes, blendshapes or Ziva data need to be pre-calculated. Ziva does not seem to use blendshapes, but something more efficient, which as you said needs to be pre-calculated (by them). For example, they don't use precalculated normals and tangents, but rather calculate them in their own compute shader, which alone reduces the data needed, compared to blendshapes, to a third.
    Both kinds of data (blendshape and/or Ziva) need to be sent to the GPU and applied there. Even the SkinnedMeshRenderer does it like this, else it would be slow. You just tell the compute shader which shape to apply with which strength.
    You can absolutely also sync blendshapes to bones in Unity; I am using it to apply certain muscle-flex blendshapes. In Unity you need to script it yourself, but this is pretty easy.
    I was speaking about real mesh deformation, like skin/vertex deformation based on physics impact. This has nothing to do with morphs; it's softbody simulation, which is a topic of its own.
    The tech in the digital human package is neither of these; as I explained, it's "just" shader stuff. All of these have one thing in common: you should never leave the GPU / pull data back to the CPU if you want it to be performant on 100K+ vertices in real time.
     
  9. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
    It makes sense that they have some kind of proprietary tech similar to morphs, both to streamline and make stuff efficient, and to lock you into their system.

    I'm not familiar with this softbody stuff you are talking about; maybe it's something similar to the Unreal Niagara workflow.
     
  10. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    Niagara is particle simulation, not quite the same, but it could be usable for this. The only "official" lib integrating full softbody physics that I have seen so far is PhysX 5:


    But this will probably only be available in Omniverse, as Unity tries to build its own physics stuff (Unreal too, but there it seems already production-ready).
     
  11. pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    143
    That's how I understand it. What I need is some documentation and a demo scene so I can understand how to use it.
     
  12. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    I saw this uploaded. I downloaded the executable and it runs well on my laptop.
     
  13. mush555

    Joined:
    Feb 4, 2020
    Posts:
    23
    Wow, good news. Thanks.

    //edit
    It worked on my GTX 1050,
    but the FPS was probably 3-5.
    The hair is very beautiful. Antialiasing is working fine.
     
    Last edited: Nov 1, 2022
  14. Andy-Touch

    Andy-Touch

    A Moon Shaped Bool Unity Legend

    Joined:
    May 5, 2014
    Posts:
    1,325
    Thanks for the heads-up!

    Runs at about 20 FPS on my gaming machine. I like the quality settings window. :)
     
  15. pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    143
    Will the digital human package on GitHub be updated? The Unite 2022 keynote said the new version was available, but it still appears to be the old one from May 2020.
     
  16. altepTest

    Joined:
    Jul 5, 2012
    Posts:
    866
    Well, they have surprised me. At least it runs and you can test it. Good thing.
     
  17. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    I was wondering the same thing: will the new digital human be the Heretic or the new female?
    In the Unite 2022 keynote, around the 58-minute mark, she talks about the new digital human on GitHub; it didn't work for me. UNITE2022

    Here's the GitHub link; if you get it to work, please share. GitHub

    Thanks
     
  18. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    A good tutorial on the Hair Tool:

    Video
     
  19. LeonhardP

    Unity Technologies

    Joined:
    Jul 4, 2016
    Posts:
    3,045
  20. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    Thanks, I will test tonight when I get off work. Is it the Heretic male or the female Enemies model?

    Thanks again
     
  21. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    This is the core library that was used by the Heretic and Enemies demos. AFAIK there are no character models inside, only the shader graphs, shaders, and corresponding scripts.
    Hopefully the Enemies demo will be released soon too, so we can see how to properly use this lib, and also how to use the hair lib correctly.
     
    Last edited: Nov 8, 2022
  22. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    Thanks for feedback ❤
     
  23. Threeyes

    Joined:
    Jun 19, 2014
    Posts:
    71
    Does anyone know how to change the URP hair's color? I tried changing the material with no luck... Thanks!
     
  24. Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    551
    What kind of shader are you using right now? As it seems, you can currently only use Shader Graph, as you need to feed in the vertex positions, normals, and tangents. In that case, it is trivial to also set the color.
     
  25. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    Here's a mod for the Enemies demo; free-fly works great. Video
    Still excited to test Enemies in the editor, hopefully before GDC 2023. I've been adding the Heretic skin shaders to my model while waiting on the Enemies eyes for the caustic effect.
     
  26. pauldrummond

    Joined:
    Oct 30, 2014
    Posts:
    143
    Can we please have a guide on how to use the GPU skin deformation and attachment, the skin tension, and the wrinkle maps? Not the full Enemies scene, just a simple example so we can start using this in our own projects.
    Thanks.
     
  27. Kory-therapy

    Joined:
    Jun 5, 2013
    Posts:
    47
    Someone tested the Enemies demo on Steam Deck: video. Next I will try to get someone to do the Heretic. We didn't get the Enemies digital human 2.0 package in the Asset Store in 2022, but maybe at GDC in late March. I heard they are working on video capture with a phone to add lip sync, but my solution for that is iClone with its iPhone facial tracking.