Discussion in 'Announcements' started by LeonhardP, Mar 18, 2022.
ok, I was curious because it seems far-fetched that they would support an old version like 2021
From the Github repository:
Minimum Requirements:
- Unity 2020.3+
- HDRP 10.9.0+

Requirements for Skin Deformation and Skin Attachment GPU Path:
- Unity 2021.2+

Requirements for new Eye and Skin Shaders:
- Unity 2022.2.0a16+
- HDRP 14.0.3+
so there is this,
which appears to be the old demo (the bald guy). It says it works with 2019, so this skin deformation is most likely not included in that demo (you could try anyway and see if it works).
I think that
"Requirements for Skin Deformation and Skin Attachment GPU Path Unity 2021.2 +"
can be interpreted as an indication that this "skin deformation" tech was already available back when they built the old demo, but they never included it in the official Unity releases. They have access to the tech and it works on those Unity versions, but whether it will ever be released for everyone is not 100% clear. They may just keep it in the drawer. Maybe they don't have the license to release the tech from the movie.
Now, I think people should try to understand that there is a difference between the engine being capable of doing something and the work required to create that "something".
If you are familiar with Shader Graph, you know you can create wonderful stuff with it, but you need to create it. All these amazing shaders are not included by default. If you create a new shader graph, it's just that: a simple shader, a starting point. It's not even the default PBR material with all the options of the standard PBR material. (By the way, you can't recreate the standard Unity materials in Shader Graph at 100%; they have some stuff under the hood you can't access. I've tried.)
so maybe Unity 2021.2 has the capability to deform skin, but only if you know how to implement it. They, the team creating the movies, definitely know how to do it.
The skin deformation is not really deformation. It is just a compute shader that measures the distance between vertices relative to their rest length (the length in the original mesh). The information is stored in the mesh color channel and then used by Shader Graph to display a wrinkle map, or whatever you like to display wherever there is any sort of tension in the skin.
Nothing really magic, just an elaborate shader trick. And this should work since 2021.2+, as it only needs the "new" Mesh API to access the mesh from compute shaders.
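The tension trick described above boils down to simple edge math. Here is a minimal CPU-side sketch (Python, purely illustrative; the real thing runs in an HLSL compute shader, and all names here are made up):

```python
# Per-edge strain relative to the rest pose, averaged per vertex.
# This is the kind of "tension" value that could be stored in the
# vertex color channel for a wrinkle-map shader to read.

def edge_length(verts, a, b):
    ax, ay, az = verts[a]
    bx, by, bz = verts[b]
    return ((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2) ** 0.5

def skin_tension(rest_verts, deformed_verts, edges):
    """Return one tension value per vertex, clamped to [-1, 1]:
    positive = stretched, negative = compressed."""
    n = len(rest_verts)
    accum = [0.0] * n
    count = [0] * n
    for a, b in edges:
        rest = edge_length(rest_verts, a, b)
        now = edge_length(deformed_verts, a, b)
        strain = (now - rest) / rest          # relative stretch of this edge
        for v in (a, b):
            accum[v] += strain
            count[v] += 1
    return [max(-1.0, min(1.0, accum[v] / max(count[v], 1))) for v in range(n)]

# Tiny example: one edge stretched to twice its rest length.
rest = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deformed = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(skin_tension(rest, deformed, [(0, 1)]))  # → [1.0, 1.0]
```

On the GPU the same loop would be one thread per edge (or per vertex), writing the clamped value into the color channel that the shader graph samples.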
ah, ok, I remember: accessing the deformed mesh at runtime inside the GPU. That is the holy grail, man. I really want to see that at some point in the future; maybe new GPUs will make this possible.
well, you can't do that; you need to get the mesh out of the GPU and then push it back.
I've made this for both Unity and Unreal, actually, but it is pointless: accessing the mesh drops the frame rate, it tanks it hard, both on Unreal 5 and Unity. It was definitely easier on Unity, as C++ is ridiculous. After I figured out how to do it, I was happy as a young boy, and then it hit me: it's not real time.
it works like this: you deform the skin using blendshapes/morphs, but once you do this, you don't know the real-time positions of the deformed vertices.
So you need to get the deformed mesh back from the GPU, do whatever you need with it, then put the mesh back on the GPU. You can do this in Unity/Unreal, but it is expensive. In Unity I was 3D painting on the real-time deformed mesh; in Unreal I wanted to know where a vertex is, to precisely place other objects there (while the character was animating).
It worked, but it can't be used in real time.
That's probably why you will never see a real-time game demo for these Unity "demo" movies: they can't run at normal frame rates. It was probably rendered frame by frame. It's a movie.
You can't have both real time and deformation. My knowledge of GPUs is fuzzy, but it appears to be a technical limitation of how and why GPUs and CPUs are different.
but maybe they will manage to do this at some point.
The trick is to do everything on the GPU. Once data is on the GPU, you can do whatever you want very fast. You should never pull it back to the CPU. That being said, GPU programming is a nightmare. But worth it in the end. So yes, real-time deformation is definitely possible; ZIVA is doing it, for example, even with a modest GPU. The biggest issue is then that this does not play together easily with physics if physics is calculated on the CPU only. You could somehow pass some collider data to the GPU, but this is limited (you'd better only use primitives, and a limited number of them), and you don't get feedback back into the CPU world.
that Ziva appears to use AI to create/bake morphs ahead of time, which are then used to animate the character in real time. From what I'm seeing, it doesn't mess with the GPU while the app is running in real time.
ok, so let me explain: GPUs are more than capable of deforming and animating a base mesh using blendshapes/morphs. But you need to have those blendshapes made before you use them in real time.
Unreal has this workflow where, depending on how a bone is animated, it will apply certain morphs with a certain value at runtime. You need the blendshapes/morphs to exist, and you need to set them up with the correct values to react to the bone animation. OK.
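That bone-driven morph setup is essentially a mapping from a bone's rotation to a 0..1 blendshape weight. A minimal sketch (Python for illustration only; the angle thresholds and names are made up, not any engine's actual API):

```python
def pose_driven_weight(bone_angle_deg, start_deg, full_deg):
    """Map a bone angle to a blendshape weight in [0, 1]:
    0 until start_deg, then a linear ramp to 1 at full_deg."""
    if full_deg <= start_deg:
        raise ValueError("full_deg must be greater than start_deg")
    t = (bone_angle_deg - start_deg) / (full_deg - start_deg)
    return max(0.0, min(1.0, t))

# Hypothetical elbow corrective morph: starts engaging at 20 degrees,
# fully on at 90 degrees of elbow flexion.
print(pose_driven_weight(10.0, 20.0, 90.0))   # → 0.0
print(pose_driven_weight(55.0, 20.0, 90.0))   # → 0.5
print(pose_driven_weight(120.0, 20.0, 90.0))  # → 1.0
```

Every frame you evaluate this per driven shape and feed the result in as the morph strength; the sculpted shape itself still has to exist beforehand.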
this Ziva tech appears to do this: depending on the face animation, they use AI cloud tech to create the morphs. Then you get these morphs and can use them for real-time animation. This AI calculation (machine learning, or whatever it's called) takes quite some time.
There is an open source app for Unreal that lets you do this on your computer right now, but it is slow as hell; it takes days. It needs quite some power. Ziva probably has a network of good computers to cut down on the time.
But it is not created in real time directly on your GPU; it's just a more advanced version of tech that already exists.
I may be off here, but I think you are mistaking this tech for something else. You can't put your hands inside the GPU while it is doing its stuff. I mean, you can somehow do it, but you lose the speed. The GPU is fast because you feed it data and it does something really fast. If you want certain data back from the GPU, it will give it to you, BUT depending on how big and complex the data is, it can get really slow.
ah, we are speaking about two different things, yes. Blendshape or Ziva data needs to be pre-calculated. Ziva does not seem to use blendshapes, but something more efficient, which, as you said, needs to be pre-calculated (by them). They e.g. don't use precalculated normals and tangents, but rather calculate them in their own compute shader, which alone reduces the data needed compared to blendshapes to a third.
Both kinds of data need to be sent to the GPU (blendshape and/or Ziva) and be applied there. Even the SkinnedMeshRenderer does it like this, otherwise it would be slow. You just tell the compute shader which shape to apply with which strength.
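What that compute shader does per vertex is just a weighted sum of pre-authored deltas over the base mesh. In scalar form (a Python sketch of the math, not the actual shader; all names are made up):

```python
def apply_blendshapes(base_verts, shapes, weights):
    """base_verts: list of (x, y, z) rest positions.
    shapes: one list of per-vertex (dx, dy, dz) deltas per blendshape.
    weights: one strength per blendshape.
    Returns the blended vertex positions."""
    out = []
    for i, (x, y, z) in enumerate(base_verts):
        for deltas, w in zip(shapes, weights):
            dx, dy, dz = deltas[i]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        out.append((x, y, z))
    return out

base = [(0.0, 0.0, 0.0)]          # one vertex at the origin
smile = [[(0.0, 1.0, 0.0)]]       # one shape: move that vertex +1 in y
print(apply_blendshapes(base, smile, [0.5]))  # → [(0.0, 0.5, 0.0)]
```

On the GPU the outer loop becomes one thread per vertex, so applying even dozens of shapes to a 100K-vertex mesh stays fast, as long as the result never has to come back to the CPU.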
You can absolutely also sync blendshapes to bones in Unity; I am using it to apply certain muscle-flex blendshapes. In Unity you need to script it yourself, but this is pretty easy.
I was speaking about real mesh deformation, like skin/vertex deformation based on physics impact. This has nothing to do with morphs; it's softbody simulation, which is a topic of its own.
The tech in the digital human is neither of these; as I explained, it's "just" shader stuff. All of these have one thing in common: you should never leave the GPU / pull data back to the CPU if you want it to be performant on 100K+ vertices in real time.
it makes sense that they have some kind of proprietary tech similar to the morphs, both to streamline and make stuff efficient, and to lock you into their system.
I'm not familiar with this soft skin you are talking about; maybe it is something similar to the Unreal Niagara workflow.
Niagara is particle simulation, not quite the same, but could be usable for this. The only "official" lib that integrates full softbody physics I have seen so far is PhysX 5:
But this will probably only be available in Omniverse, as Unity is trying to build its own physics stuff (Unreal too, but there it seems already production-ready).
That's how I understand it. What I need is some documentation and a demo scene so I can understand how to use it.
I saw this uploaded — download the executable — and it runs well on my laptop
wow good news. thanks.
It worked on my GTX 1050,
but the fps was probably 3-5.
The hair is very beautiful. Antialiasing is working fine.
Thanks for the headsup!
Runs at about 20fps on my gaming machine. I like the quality settings window.
Will the digital human package on GitHub be updated? The Unite 2022 keynote said the new version was available but it still appears to be the old one from May 2020.
Well, they have surprised me. At least it runs and you can test it. Good thing.
I was wondering the same thing: will the new digital human be the Heretic or the new female?
In the Unite 2022 keynote, around 58 minutes in, she is talking about the new digital human on GitHub; it didn't work for me. UNITE2022
Here is the GitHub link; if you get it to work, please share. GitHub
Good tutorial on Hair Tool
The Digital Human package was updated a couple of weeks ago but the team hadn't added a new version tag yet. This has now been amended.
Thanks, I will test tonight when I get off work. Is it the Heretic male or the female Enemies model?
This is the core library, which was used by the Heretic and Enemies demos. AFAIK no character models inside, only the shader graphs, shaders, and corresponding scripts.
Hopefully the Enemies demo will be released soon, too, so we can see how to properly use this lib, and also how to use the hair lib correctly.
Thanks for feedback ❤
Does anyone know how to change the URP hair's color? I have tried changing the material with no luck... Thanks!
What kind of shader do you use right now? As it seems, you can only use Shader Graph as the shader right now, since you need to feed in vertex positions, normals, and tangents. In that case, it is trivial to also set the color.
Here is a mod for the Enemies demo with free fly; it works great. Video
Still excited to test Enemies in the game editor, hopefully before GDC 2023, but I have been adding the Heretic skin shaders to my model while waiting on the Enemies eyes for the caustic effect
Can we please have a guide on how to use the GPU skin deformation and attachment, the skin tension and wrinkle maps? Not the full Enemies scene, just a simple example so we can start using this in our own projects.
Someone tested the Enemies demo on Steam Deck (video); next I will try to get someone to do Heretic. We didn't get the Enemies digital human 2.0 package in the Asset Store in 2022, but maybe at GDC in late March. I heard they are working on video capture with the phone to add lip sync, but my solution to that is iClone with its iPhone facial tracking.
Can anyone give a tip on how the Mapped Direction/Parameters Textures work and what tools can be used to create them?
Has anyone been able to use this in their own project? I simply can't get anything to work. The SkinTensionRenderer script throws exception errors, and in the absence of any documentation I don't know how to proceed.
It's here download now
Where can we watch the GDC 2023 tech demo? I just watched Unreal 5.2, it was ok. Excited to watch Unity 2023; please post a link in the forum, I don't see anything on YouTube.
Nice, but I am getting errors from the main character prefab (missing references inside), and also some other errors. I did not find any install instructions, so I'm not sure if something special needs to be done to make it work.
I think we have to wait for 2023, idk yet. I'm still at work; when I get off I will test it.
It states 2023.1.0b8, which is available already and which I tried. Either there are some unknown prerequisites, or something in the package is corrupt.
Update: A few restarts of the Editor later, it looks much better. It seems to pick up the references now; still a few errors left, but the scene looks complete now.
Some preliminary thoughts about the project:
Got it fully working. With DX12 it runs okay and looks good; using the provided HDRP DX11 asset produces some very nasty artifacts, very buggy.
Lighting seems to have changed a bit from the GDC 2022 version: SSGI does not seem to be used anymore, and raytracing seems to be off for most things (only reflections, as far as I can see). I should have read the readme: you can switch raytracing effects on/off from the included quality menu, which needs to be loaded in parallel. APVs do a good job, but you can see the usual artifacts if you look for them.
The hair benefits dramatically from the new High Quality Line Renderer introduced in 2023; switching it off is a day-and-night difference. I hope they'll fix it so that we can also use it in XR. The hair suffers from more artifacts than were seen in the 2022 demo: it has some jittery behavior, and also some checkerboard-like artifacts from the internal clustering (the checkerboard seems to be mostly resolved with 2023.1.0b9). It also seems to use an outdated version of the hair package, which gets installed in the Assets folder, but I have not tried to upgrade it yet. Otherwise the hair looks pretty good, but very different in Editor/Scene mode than in Play mode.
Skin looks good, although it does not yet use new 2023 features like thickness and the advanced SSS profile features. It's a beefed-up shader from the Heretic demo; the biggest new feature is the wrinkle map stuff.
As in all character-based demos, most fidelity comes from a complex light setup, which is not feasible for real-time interactive games due to a) positioning (lights are always set in specific positions relative to the face), and b) cost (area lights with soft shadows are pretty expensive).
When I get off tonight I will test it and try to work with it all weekend.
I was wondering, do they have the different hairstyles from last year's tech demo?
Only one hairstyle, as it seems; I also did not see the source file (Alembic). And the EULA is too long and too complicated for me to understand whether this might be used in our own projects; maybe someone with a law degree can try to figure it out.
I tried with my new PC... it runs at 30-50 on ultra, but with post-processing and hair quality at medium. I also turned off DLSS because it looks really, really bad, I have to say it. This demo is better than anything I've seen from Unreal; well, Matrix is pretty close.
I finally got the Enemies demo to work. First I made a new build with the Unity 2023 beta, then loaded Enemies. Before I pushed start I had to hide the hair,
or the beta would crash. Then I saved without the hair; after that I loaded the one with the hair on, and it works fine. Next, my goal is to add the hair to one of my 3D models; so far I've saved the model as a prefab, next I'll do the hair.
Hopefully someone will make more hairstyles and release them in the Asset Store. I will keep posting my updates in the forum.
Here is a video of the 2023 roadmap
Unity 2023, I'm excited
How did you get that link? That video is unlisted.
Hey, I had the same problem (plus other bugs and glitches), but I restarted the editor a couple of times and it worked perfectly
I got it running after a few glitches -- it's really amazing seeing this running in the editor
I got the video in email.
I was wondering, has anyone subscribed to Sakura Rabbit? He makes VFX, shaders, and environments for Unity3D HDRP.
An amazing artist; he uploads 2 assets a month, not too expensive (youtube link). I wish I had known about him earlier; I found out about him around Dec 2022. If anyone else is subscribed to him, let me know ❤️
haha yeah, he (or she) was very happy because the CEO of Unity named him (or her xD) in an interview. Very creative guy (or girl).
I was a subscriber for a month. There's not really much stuff you can use. Either there is no license available (projects just meant for research), or it's stuff you can find elsewhere easily (like a slightly modified version of the eye shader demo, which is available as a sample inside Unity). Also, project access is limited: projects are deleted from the fanbox after a few months.
Off topic, but adding to this: Rabbit also often takes others' work (sometimes paid) and just resells it, which is shifty and not very legal...
Whoa, do you have any further info on this? I have never seen anyone complaining about this, and I am a subscriber of hers, but I do not want to use anything legally questionable. So I would much appreciate any links to threads, complaints, or even concrete examples where I can check on this. Thanks!
Steering the thread back on course...
I'm really happy that the Demo Team has released this project. It will act as a great place to study their workflows and see how we can apply them to our own! Great work!