
Enemies: New tech demo showcasing Unity's growing humanoid toolsets

Discussion in 'General Discussion' started by IllTemperedTunas, Mar 22, 2022.

  1. MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,977
    They showed this running in realtime on an Xbox on stage, with wind blowing the characters' hair around, and they even did live edits in the scene.

    So I'm surprised at how bad the performance is in the released demo in comparison to that.
     
  2. Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,769
    The Adam demo was focused on showcasing Timeline capabilities, if you have watched the tech talks.

    Enemies shows high fidelity in rendering, human faces, and hair.

    You don't need all of these today, and neither do games. You need to think in a wider scope, rather than just the narrow picture.
    People can use such tech equally to make stylized cartoon animations and games, as well as high-quality ones.

    And it doesn't seem their target audience is solely game devs at all.
     
  3. AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
    And what I'm asking is that they communicate that more clearly, since the demos are not intended for games, and yet their communication vaguely includes or hints towards games as well. This has been going on since the Blacksmith.

    Because if it was crystal clear that this is intended to showcase Unity's capabilities to produce cartoons and animations and pre-rendered stuff, no one would be complaining about the performance, because even 0.1fps would have been amazing for that use case.

    And yet, there are people complaining, and that is because of Unity's messaging.

    But I'm really not sure what your point is, if there even is one. I feel like you just wanted to disagree with me (and obnoxiously tell me what I need and don't need), at least that would explain why you are talking past my points.
     
    Last edited: Nov 4, 2022
  4. Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,769
    The point is, people think the presented tech must be used in such and only such a way as it was shown.

    But that is not the case. Whether it is Timeline, shaders, hair, or DOTS, it can be used in large and small, high-end and low-end productions alike. Like with DOTS: the general consensus among inexperienced people is that it is only suitable for moving many millions of things on the screen. While it can do that, it can also be used very well for a handful of things on the screen. The same applies to any other tech that is presented.

    Instead, what we have is people trying to play a high-end project on mid-tier hardware, seeing low fps, and telling us the tech is unusable.

    But these techs can be used for other types of projects too. If you start considering that, you will realise the tech is within reach.
     
  5. chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,790
    It's been nearly a decade of Unity's marketing being nothing but over-hyped BS. The marketing team has destroyed Unity's reputation with its lies.
     
    Murgilod, pm007, Jingle-Fett and 2 others like this.
  6. KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    It's sad. We still don't have a proper GI solution yet, and it should be explained that the tech shown in Enemies took a very large team with high-end tools to pull off.

    And I kind of feel Unity is getting into feature creep, where they have so many great ideas, but it all feels unfocused and haphazard. Also, stuff in the current version of the engine feels unfinished. I was very disappointed when I couldn't get a clean bake of an overcast sky.
     
    Deleted User and Lymdun like this.
  7. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    For once I'll tell you something good and positive :p

    The hair tech GitHub demo runs at a playable framerate on my GeForce GT 705, a GPU weaker than a Wii U, which is itself weaker than a Switch. So maybe not billions of hair particles, and probably nothing else running alongside the demo, like gameplay, but given the very low bar this represents, it means the tech is probably usable, if people are smart about where and when they apply it.

    Now the thing is, it didn't work when I installed the URP demo with the URP pipeline, and the same with HDRP... until I crossed them and ran the URP demo on the HDRP pipeline, and ditto for the HDRP demo. That meant I had pink materials, but they use some kind of debug lines that let me look at the simulation anyway... :rolleyes: The roots of the hair jittered like crazy, but that looks like a visual bug, because it barely affected the simulation: the other nodes collided and fell with gravity correctly, and given that hair is simulated from root to tip, that shows the jitter isn't propagating :). It works, but it probably should have had a bit of basic polish before being released publicly; I wasn't the only one who initially couldn't run it.
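    The root-to-tip behaviour described above can be sketched with a minimal Verlet chain. This is an illustrative toy, not code from Unity's hair package: node counts, step size, and segment length are made up, and a real strand solver would add collision and damping.

```python
# Minimal 2D Verlet hair strand, constraints solved root-to-tip.
GRAVITY = (0.0, -9.8)   # illustrative values
DT = 1.0 / 60.0
SEG_LEN = 0.1
N = 10                  # nodes per strand

def step(pos, prev, root):
    """One step: Verlet integrate, then re-satisfy segment lengths
    starting from the pinned root, so corrections flow root -> tip."""
    new_pos = []
    for (x, y), (px, py) in zip(pos, prev):
        vx, vy = x - px, y - py              # implicit velocity
        new_pos.append((x + vx + GRAVITY[0] * DT * DT,
                        y + vy + GRAVITY[1] * DT * DT))
    new_pos[0] = root                        # root is pinned to the scalp
    for i in range(1, N):                    # constraint pass, root to tip
        ax, ay = new_pos[i - 1]
        bx, by = new_pos[i]
        dx, dy = bx - ax, by - ay
        d = (dx * dx + dy * dy) ** 0.5 or 1e-9
        s = SEG_LEN / d
        new_pos[i] = (ax + dx * s, ay + dy * s)  # only the child moves
    return new_pos, pos                      # pos becomes next frame's prev
```

    Because each constraint only drags the child node toward its parent, a glitch confined to the root's rendering (rather than its simulated position) would indeed leave the rest of the strand falling and colliding normally, which matches the observation above.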

    Also, the new light probe solution looks good enough, and that was part of the Enemies demo too. It doesn't seem complicated, just a hierarchical light probe grid...
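    The core lookup behind a probe grid is just trilinear interpolation between the eight probes surrounding a point. A rough sketch of that idea follows; the function and grid layout are invented for illustration and are not Unity's Adaptive Probe Volume API. A hierarchical version would first pick which grid level (brick) contains the point, then do the same interpolation.

```python
import math

def sample_probes(grid, origin, spacing, p):
    """Trilinearly sample a regular 3D probe grid at world position p.
    grid[z][y][x] holds one probe's value (a scalar here for brevity)."""
    # convert world position to fractional grid coordinates
    fx = (p[0] - origin[0]) / spacing
    fy = (p[1] - origin[1]) / spacing
    fz = (p[2] - origin[2]) / spacing
    x0, y0, z0 = int(math.floor(fx)), int(math.floor(fy)), int(math.floor(fz))
    tx, ty, tz = fx - x0, fy - y0, fz - z0

    def g(x, y, z):  # fetch a probe, clamping to the grid bounds
        x = min(max(x, 0), len(grid[0][0]) - 1)
        y = min(max(y, 0), len(grid[0]) - 1)
        z = min(max(z, 0), len(grid) - 1)
        return grid[z][y][x]

    def lerp(a, b, t):
        return a + (b - a) * t

    # interpolate along x, then y, then z
    c00 = lerp(g(x0, y0, z0), g(x0 + 1, y0, z0), tx)
    c10 = lerp(g(x0, y0 + 1, z0), g(x0 + 1, y0 + 1, z0), tx)
    c01 = lerp(g(x0, y0, z0 + 1), g(x0 + 1, y0, z0 + 1), tx)
    c11 = lerp(g(x0, y0 + 1, z0 + 1), g(x0 + 1, y0 + 1, z0 + 1), tx)
    return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz)
```

    In practice each probe stores spherical harmonics rather than a single value, but the interpolation step is the same per coefficient.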

    Anything about the character, though, seems overkill without actual tooling planned. They don't do anything that isn't already known, or do it very differently. The main problem is that they haven't democratized the actual hard part: capturing high-fidelity characters and performances. You can't use most of the tech, because you need that data quality for it to make sense, especially for the eye shader, which is overkill. Again, MetaHuman isn't as impressive visually, but it brings you 90% of the way there for free and raises the quality floor for everyone by as much. And people have found ways to reach the last 10% from there, without the costly investment. Literally a game changer.

    Without a proper solution for facial and performance acquisition, it's about as impressive as those old crappy FMV shooters forgotten by time. :oops: Funnily enough, everyone and their mother is currently building easy real-time facial and performance capture: MetaHuman, again, has a basic way to do it, Meta has one for in-headset communication, etc...

    I tried and failed to stay positive, sorry for the constructive feedback.
     
    PanthenEye, Ryiah, AcidArrow and 2 others like this.
  8. KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    I agree with all of this, especially the hardware to capture face and body performances for your characters. The eye shader being overkill? With all of those insane options, I agree. You DON'T need a control to rotate the eyeball! Do that during the texturing phase.

    Most indies are using the iPhone to do facial performance capture, and that isn't enough, especially since with an HMC you can capture hundreds of expressions. But 4D cameras aren't wireless enough that you can just put in an SD card and save the data locally, and they are expensive.

    And also, with enough skill in ZBrush, you could sculpt all of your character's expressions and map them to your character rig. For capturing the actual model, if you have a good camera and programs like Agisoft Metashape or Reality Capture, you can get scans of your actors. And with Wrap3, you can get very nice topology from those scans. It's a lot of work, but you'll learn so much.

    I'm actually impressed with the Adaptive Probe Volume, since it's easy to set up and gets you nice lighting in your scenes. It just needs to be feature complete.
     
  9. KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,467
    I'm bumping this topic. I re-read the blog about creating the character model for Enemies, but I'm curious about how the wrinkle maps were authored.