
Unity, Epic/Unreal Tech Demo Thoughts and Comparisons to Practical Workflows

Discussion in 'General Discussion' started by LooperVFX, Sep 12, 2020.

  1. LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    180
    This thread is for discussion of flagship tech demos from Unity (and Epic, Crytek, etc.): how they are made and marketed, the expectations they set, and how they may misrepresent practical workflows.

    I will be replying here to the first half of this post by @neoshaman (also quoted below), branching off from an official Unity thread about the Scriptable Render Pipeline's future / roadmap:
    https://forum.unity.com/threads/wha...e-render-pipelines.924218/page-5#post-6282083

    Original post(s) quoted for context:
     
  2. LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    180
    I do appreciate that the UE5 demo walks through its features and isn't purely cinematic, and I wish Unity took an approach more like this for their "flagship" demos. I am hopeful, and will be interested to see how this holds up and compares to Unity's offerings by the time UE5 is actually available.

    To be fair here, Quixel Megascans are a huge payload of highly detailed static "baked" assets that are arguably impractical to use in a full game right now due to storage limitations alone. Nothing wrong with this for a demo, but it's worth mentioning that this is the sort of thing I'm speaking to. On the other hand, maybe this will spark some major advancements in affordable storage technology?
    https://wccftech.com/game-file-sizes-may-skyrocket-with-unreal-engine-5s-nanite-says-developer/

    You are right, as this was definitely the case with the "Adam" demos, which were ridiculously custom, but it's not a fair assessment of, say, "The Heretic," the latest demo, which was largely built with tools Unity has been developing into new versions of Unity and new packages: HDRP's built-in volumetric lighting, subsurface scattering materials, Shader Graph for custom materials, VFX Graph for the real-time particle simulations used heavily in The Heretic, the new Cinemachine, etc. It's not devoid of one-offs, but there are drastically fewer one-offs and baked anims / sims than in "Adam," which in comparison was really a testament to how much Unity needed to focus on these improvements before such workflows could become practical. It's still a work in progress, definitely. But it's come a long way.
    All that being said, it seems that Adam, and to a lesser degree The Heretic, are really focused on the use case of cinematics in games and of using Unity for film/TV, and should be presented and marketed as such, not as an example of what you should expect to do in Unity during interactive gameplay for the game dev industry at large. I think Unity could improve the clarity of this marketing / messaging.

    I believe Epic generally did a better job on the UE5 demo of mixing a cinematic with a playable interactive demo, with voiceover narration that makes it very clear which parts were interactive gameplay moments, which were cinematic, and often even what is real-time and what is a pre-made asset. This sets clearer expectations, and I hope Unity will take note and do the same. Book of the Dead has a mix of interactive and cinematic content, but in the cinematic teaser it's not clear which is which. And while Unity has interactive demos (Book of the Dead Interactive rather than the teaser, Massive Battle, MegaCity, Spaceship VFX), they are often separate, harder to find, rarely get the same amount of press coverage, and far fewer people see them. Those demos certainly deserve criticism in their own right, too, as to how practical they are versus how many highly custom components Unity has scraped together without finishing the APIs, Editor tools, or workflows.

    When appropriate, demos should be more clearly marked as "glimpses into the future of Unity," not implying that they use features available in Unity today. You could say some people will always misconstrue the expectations these demos set, but I believe a significant amount of ambiguity here is created deliberately, to "fake it before they make it" from a marketing point of view. So there's certainly a gap with Unity right now. And though I prefer the format and narration of Epic's UE5 demo, we'll still have to wait until sometime in 2021 to know exactly how practical the advertised workflows are. Hopefully we get binaries and project files for both The Heretic and the Lumen in the Land of Nanite demos. Here's hoping; regardless of what engine you use, Epic and Unity competing to improve features and workflows is a benefit to everyone.
     
    Last edited: Sep 12, 2020
    NotaNaN likes this.
  3. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    The Quixel Megascans ZBrush-scale asset size has been addressed as the whole point of Nanite, that is, the workflow, streaming, and size issues; that's what they mean when they tell you it virtualizes meshes and handles LOD automatically. I also had an in-depth discussion about the techniques in the UE5 thread. I.e., it's probably unifying an SDF method with lighting, which allows them to virtualize meshes and use the same structure for lighting. A kind of preview of the tech can be seen in the most advanced creations in Dreams on PS4.
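
    To make "SDF query" concrete, here is a toy sketch of the idea (my own illustration of the assumption above, not Epic's undisclosed implementation; all names are made up): one signed distance function drives both geometry, via sphere tracing, and lighting, via cheap soft shadows from the same field. That shared structure is what makes unifying mesh and lighting attractive.

    Code (CSharp):
    // Toy sketch: one signed distance function serving both geometry
    // (sphere tracing) and lighting (soft shadows). Illustrative only.
    using System;
    using System.Numerics;

    static class SdfSketch
    {
        // Scene: a single sphere of radius 1 at the origin.
        // Negative inside, positive outside, zero on the surface.
        static float Sdf(Vector3 p) => p.Length() - 1f;

        // Sphere tracing: step along the ray by the distance to the
        // nearest surface, which is always a safe step size.
        static bool Trace(Vector3 origin, Vector3 dir, out Vector3 hit)
        {
            float t = 0f;
            for (int i = 0; i < 128; i++)
            {
                hit = origin + dir * t;
                float d = Sdf(hit);
                if (d < 1e-3f) return true;   // close enough: surface hit
                t += d;
                if (t > 100f) break;          // ray left the scene
            }
            hit = default;
            return false;
        }

        // Soft shadow from the *same* distance field: the closer a ray
        // passes to geometry, the darker the penumbra gets.
        static float SoftShadow(Vector3 p, Vector3 toLight, float k = 8f)
        {
            float shade = 1f, t = 0.02f;
            for (int i = 0; i < 64 && t < 10f; i++)
            {
                float d = Sdf(p + toLight * t);
                if (d < 1e-4f) return 0f;     // fully occluded
                shade = MathF.Min(shade, k * d / t);
                t += MathF.Max(d, 0.01f);     // keep marching forward
            }
            return shade;
        }

        static void Main()
        {
            var eye = new Vector3(0, 0, -3);
            var dir = Vector3.Normalize(new Vector3(0.1f, 0.1f, 1f));
            if (Trace(eye, dir, out var hit))
                Console.WriteLine($"hit {hit}, shadow {SoftShadow(hit, Vector3.UnitY):0.00}");
        }
    }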


    Glimpses into the future don't work when that future doesn't clearly address any real problem. People have pointed to the problem by saying Unity doesn't make games, and therefore doesn't understand what to solve.
     
  4. chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,790
    I'm finding it hard to disagree with @neoshaman, as most of the Unity "demos" in the past 4 years have been completely useless to 99% of people using the engine.
     
    Kennth and Martin_H like this.
  5. Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Me too. Modern UE4 games visually match and surpass past demos, and pretty much everything those demos did is a stock engine feature nowadays. Meanwhile, I haven't seen a Unity game that gets anywhere close to matching Adam and The Heretic (please post examples if you have).

    There's an argument that UE4 being used by AAA productions while Unity isn't skews the results, but there are many UE4 games by smaller and indie devs with visuals you rarely see in Unity games, like Lost Soul Aside, which was being made by one guy, and several games in the PS5 reveal.
     
    Deleted User and OCASM like this.
  6. Deleted User

    Guest

    Not exactly true...
    - The linked article is pure speculation... I'd like to see what exactly contributed to the size of games like Call of Duty. I've heard that one of those games shipped with uncompressed WAV files weighing many gigabytes ;)
    - The Nanite renderer is actually designed to handle huge polycounts. Also, imported raw high-poly assets might end up with a much smaller polycount and size in the cooked game, as with traditional workflows. Not shipping normal maps and LODs should help regain a lot of disk space. Game installs might get bigger... but it's still practical to use Megascans in an engine designed to use Megascans in games ;)
    - Fortnite is smartly using some Megascans, updated to fit the game's stylized look. Games small and big have been using photoscans for years.

    Half of this game was made using large-scale scans in 2014, on UE3: rocks, buildings, the bridge, and even the sand roads. Totally practical to use in a full game ;)

     
  7. ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    The graphics in that video do NOT look real. They look like what a city slicker thinks the woods look like.
     
  8. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    They do not look real; they look highly detailed in a way that's uncanny for an in-game editor that isn't a professional tool. The only way to get that level of detail on a PS4 without compression, without baking, and without importing models is a new rendering paradigm. Also, this was made on a stock PS4.
     
    Deleted User likes this.
  9. zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    9,052
    They look like the constructed forests and such at Disneyland.
     
  10. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    I agree with @ippdev, and personally I think that this forest looks horrible.

    The most jarring thing is that the grass and vegetation are frozen in place, and some of the vegetation has an acid-green color that doesn't occur in reality. This creates the immediate impression that somebody made tree decorations out of something like iron and then painted them to look like plants. The unhealthy shininess on the wet leaves, along with the lack of smaller grass blades on the ground, only adds to the impression of an artificial "painted metal forest."

    It looks tolerable when viewed from a distance, but when a tree is far away, it might as well be a billboard.
     
    Martin_H likes this.
  11. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    That's not the point I'm trying to make anyway, so feel free.

    I was trying to build an understanding of the automatic density adjustment that removes the need for LODs (i.e., cuts workflow) and allows free export of high-density meshes at a reasonable memory cost (i.e., the power of SDFs).

    The limitation in the video is an amateur using Dreams on PS4, which is not the scale you would have in a AAA production: you can't import meshes and have to do everything from scratch, within the editor's limits. That's what makes it impressive that you can get that level of detail that easily. Detail density is key; you never have to make your billboards, and that's time saved.

    Basically, with polygons you must keep a constant eye on the count; that's why we do things like LODs, billboards, impostors, etc., which cost time, increase complexity, and jeopardize detail. The PS5 is 10x more powerful than a PS4; abstracting that workflow away, keeping the memory happy, and preserving the level of detail is important.
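
    To illustrate the bookkeeping a hand-authored LOD chain implies, here's a minimal sketch (the types, names, and thresholds are all illustrative, not any engine's actual API): estimate how big an object appears on screen and swap meshes accordingly. The pitch for virtualized geometry is essentially that this entire layer disappears.

    Code (CSharp):
    // Minimal LOD-selection sketch: pick a mesh based on the object's
    // approximate projected size on screen. Illustrative only.
    using System;

    record LodMesh(string Name, int TriangleCount);

    static class LodSelect
    {
        // Approximate projected height of a bounding sphere as a fraction
        // of the viewport: size ~ radius / (distance * tan(fov / 2)).
        static float ScreenCoverage(float radius, float distance, float vfovRadians)
            => radius / (distance * MathF.Tan(vfovRadians * 0.5f));

        // lods[0] = full detail ... last entry = billboard/impostor.
        static LodMesh Pick(LodMesh[] lods, float coverage)
        {
            if (coverage > 0.25f) return lods[0];
            if (coverage > 0.05f) return lods[1];
            return lods[2];
        }

        static void Main()
        {
            var tree = new[]
            {
                new LodMesh("tree_LOD0", 50_000),
                new LodMesh("tree_LOD1", 5_000),
                new LodMesh("tree_billboard", 2),
            };
            float fov = 60f * MathF.PI / 180f;   // 60-degree vertical FOV
            foreach (float d in new[] { 5f, 40f, 300f })
                Console.WriteLine($"{d,5}m -> {Pick(tree, ScreenCoverage(2f, d, fov)).Name}");
        }
    }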

    So the rendering aspect isn't really about the materials; it's about the resolution of detail and how it's handled for free.

    Also, I picked it up from a tweet by Seb Aaltonen.
     
    Kennth, NotaNaN, Deleted User and 4 others like this.
  12. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Yeah, we can pick at issues with what the forest looks like, but it's still quite nice from a tech perspective, and it was put together by someone with a 7-year-old game console.

    I think the catch there is that you get to that level of quality and ease-of-use by having good systems in place, and those systems will be making tradeoffs somewhere. I haven't tried out Dreams so I've no idea what or where theirs are.

    Regarding billboards in particular, I'm surprised that real-time impostor systems aren't more popular than they seem to be. Same deal with automated mesh LOD systems in general. I know that for the latter an automated result won't be as good as a hand-made one, but you can pick up the differences in QA where they matter and probably cut out well over 80% of the work.
     
    Kennth, AcidArrow, NotaNaN and 3 others like this.
  13. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    The most important thing about this demo is that the author discovered Uncanny Valley for Plants. I never thought it was a thing.

    Regarding Quixel and high-quality assets in general, one issue (in my opinion) with pre-made high-quality assets is that you may end up with a high-definition version of the RPG Maker effect, which would reduce the perceived value of the final work. Another issue is that when you need something that isn't available as a premade asset, the quality bar may end up being set too high.
     
    JoNax97, Ryiah and angrypenguin like this.
  14. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    I believe that impostors and billboards are heavily used; however, when they're used well, you don't notice them. For example...

    This is from "Rise of the Tomb Raider"
    upload_2020-9-14_12-51-53.png
    upload_2020-9-14_12-51-23.png
    Flat planes in the distance. Completely identical ones, too. However, there is pretty much only one spot where the glitch is visible, and you're highly likely to run by without looking in that direction or zooming in.
     
    Ryiah, Deleted User and Martin_H like this.
  15. MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    I don't agree on that one. You can kitbash insane-looking scenes with Quixel.

    Example





     
    Deleted User likes this.
  16. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Sure... And then your character art, props, animation, effects, etc. all need to match up in style and quality.
     
    Ryiah, Martin_H and MadeFromPolygons like this.
  17. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    For sure. What I mean is that I'm surprised they're not used even more. There's a UE example, I think, where a whole distant part of the map is rendered to a texture and thus reduced to a single quad. It doesn't need to be updated again until something, such as perspective, changes noticeably.

    Turning trees into quads is neat, but it can go much further in certain circumstances. Whole distant areas can potentially be turned into "matte paintings". Even if you have to update the impostor every few frames, you won't have to update them all at once, drastically reducing per-frame draw calls.

    I assume the catch is things like shadow casting.
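
    For the curious, here's a rough sketch of that "matte painting" idea in Unity terms: an offscreen camera renders the distant area into a RenderTexture shown on a quad, and only re-renders when the viewpoint has changed enough. The class, field names, and thresholds are illustrative, not a finished system; and as noted above, shadow casting is exactly what this naive version gets wrong, since the quad neither casts nor receives shadows like the real geometry.

    Code (CSharp):
    // Sketch: render distant scenery into a texture once, show it on a
    // quad, and refresh only when the main camera has moved or turned
    // enough that the parallax would be noticeably wrong.
    using UnityEngine;

    public class DistantImpostor : MonoBehaviour
    {
        public Camera impostorCamera;     // offscreen camera framing the distant area
        public Renderer impostorQuad;     // quad standing in for that area
        public float refreshAngle = 5f;   // degrees of view change before re-render
        public float refreshDistance = 10f;

        RenderTexture rt;
        Vector3 lastPos;
        Quaternion lastRot;

        void Start()
        {
            rt = new RenderTexture(1024, 1024, 24);
            impostorCamera.targetTexture = rt;
            impostorCamera.enabled = false;   // rendered manually below
            impostorQuad.material.mainTexture = rt;
            Refresh();
        }

        void LateUpdate()
        {
            Transform eye = Camera.main.transform;
            bool stale = Vector3.Distance(eye.position, lastPos) > refreshDistance
                      || Quaternion.Angle(eye.rotation, lastRot) > refreshAngle;
            if (stale) Refresh();
        }

        void Refresh()
        {
            Transform eye = Camera.main.transform;
            impostorCamera.Render();          // one draw into the texture
            lastPos = eye.position;
            lastRot = eye.rotation;
        }
    }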
     
    NotaNaN likes this.
  18. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    If you need to make an H.R. Giger-style alien world, a futuristic city, a spaceship, sci-fi weaponry, or something trivial like a runic fantasy crossbow or a set of demonic armor...

    If so far you've been Quixel kitbashing, you're SCREWED.

    Because to match that quality you will now need a Hollywood budget. Like, you know how for The Lord of the Rings they actually made real chainmail? You're going to have to do that.

    Then there's the RPG Maker effect.
    ----------
    Honestly, it reminds me of a common problem I encounter in manga/manhua. See, they need to quickly produce colored pages with detailed backgrounds, so they use 3D and kitbash. The result?

    Every fantasy city is the same and has THIS castle.
    upload_2020-9-14_13-52-28.png
    That's what you're going to get with kitbashing.

    Then there's the Wilhelm scream.

    At the very least, if you kitbash, make your own kit.
     
    Kennth, JoNax97, Ryiah and 4 others like this.
  19. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Nobody noticed the Wilhelm scream until cinema nerds pointed it out and it became a meme. Nobody around me ever knew about it, because I'm the sole nerd.

    It's like when I share pseudo-AAA work where I see all the flaws that make it not AAA, like the lack of local specular occlusion or linear animation blending, but no one notices.

    And even blockbuster movies share props, plot points, music compositions, and literal shots; a lot of AAA games source their textures from the same places.

    Only specialists notice; others won't until it's pointed out to them in an exhaustive way. Happens all the time. Not a big deal. Everything is a remix.
     
  20. Deleted User

    Guest

    That's exactly the same issue as building a game using outsourced asset companies, or using marketplace assets. Using Megascans doesn't introduce any new production issues, really.

    Guys, it's like arguing "nah, using 3D meshes is pointless, artists need to create art matching in style and quality for everything in-game." Yes, and that's been true since the first day of 3D graphics ;)

    And while it's awesome to have free Megascans in Quixel, scanning your own assets as a small team is totally doable.
    We successfully mixed photoscans with hand-made content back in 2014, in an indie team: just three environment artists, at a time when hardware and software for photogrammetry were heavily experimental, equipped with just a few DSLRs...
    Only the quality of the characters was drastically lower, but that was because of an artist who didn't even rig characters properly... ;)

    Also, not all content needs to be unique. Every studio separately creating new chairs makes exactly zero sense; it would be like movie studios designing new chairs for every movie. Dozens of Hollywood movies reuse building interiors and nobody cares.

    The cost skyrockets if you want to create a lot of top-notch content: an entire open world of photoscans, or The Last of Us levels of insane detail, with every single square centimeter scripted and polished. It's not like we need Hollywood budgets to avoid the "RPG Maker effect" in smaller games ;)
     
    Kennth and NotaNaN like this.
  21. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Hollywood is already the RPG Maker of big budgets. The movies mostly look the same, segregated only by genre.
     
  22. Deleted User

    Guest

    Not entirely sure which part of the American movie industry counts as "Hollywood," but that industry alone produces something like 1,000 movies a year. There are a ton of original-looking movies if one ignores the typical popcorn blockbusters :)
     
  23. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    It is really not a good analogy, as Hollywood has a Hollywood budget and has the entire world available as an asset library.
    If something is not available, they can build it for real by throwing money at the problem. The Quixel library, no matter how huge, won't match that.

    A lone indie cannot match this either. Photogrammetry with limited equipment can be a pain in the butt, still requires equipment, and will be largely restricted to things in your vicinity that were made in the modern era.

    Of course, if you actually have a movie-level budget, then all those problems won't apply to you. But that experience won't be useful to the majority of people making, or trying to make, their games...
     
    Kennth likes this.
  24. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Except that the style is "realistic" and the detail level is "very high," which is the point of Megascans.

    You're right, it's fine if everything you need is something you can scan. But if it's not then the detail bar to make new stuff that matches is potentially higher than if you'd taken other avenues.

    One of my early thoughts with any project is "how will we make or get all the content?" If scanned stuff can do all of it, great. If it can't then I'd be super cautious about where I do use it (and I do) to make sure it doesn't scope creep our art requirements elsewhere.

    Hence my concern at the idea of kitbashing high-detail stuff because it seems easy. Maybe it'll be fine. Or maybe you'll find all your other stuff suddenly has super-high requirements which aren't easy to match. To ship a game you need to finish all of it.
     
    Kennth, Ryiah and neginfinity like this.
  25. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    It's also still possible to do good old physical modelling with clay, balsa, or literal trash, and then scan that, like Harold Halibut.
     
    Deleted User likes this.
  26. Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Make physical kitbashed assets then scan them so you can kitbash in 3D.

    Next step: 3D print kitbashed assets so you can physically kitbash.

    Seriously, I'm not even sure where this discussion is going. What the hell are you people even arguing against?
     
  27. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Is it always about arguing against?

    Btw, Harold Halibut is a real Unity game, with real objects imported into the game through photogrammetry.

    So, like the title of the thread says, it's a practical workflow.



    @KokkuHub, see, you were onto something!

    Btw, modern stop motion makes heavy use of 3D printing too, to convert CG models into reality.
     
  28. MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Last edited: Sep 14, 2020
    Kennth and neoshaman like this.
  29. Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Ranting against scanned assets by citing bad examples feels like the whole rant against 2D artists using photo references. Just because something can be misused doesn't mean it cannot be used well. But if you want to spend time modelling rocks, chairs, and trash bins, and hand-painting concrete textures, go for it.
     
    Deleted User likes this.
  30. ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    It seems like a gimmick. This could have been done strictly in 3D apps in much less time. With a team that size this would have taken six months to a year tops.
     
    Kennth and neginfinity like this.
  31. Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    I appreciate the practical sets, even if it could have been done in 3D from scratch.
     
    Kennth, Ryiah and angrypenguin like this.
  32. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    For the sake of clarity, I'm not against them. I use them. All I'm saying is to consider the full project context in your decisions.
     
    Ryiah likes this.
  33. ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    It seems the drop in framerate and graphics card memory limits would prohibit large levels built from scans. And for what purpose, exactly? Take the Silent Hill-style trashed men's room shown in the video thumbnail above. It looks dirty and effed up, for sure. Is someone going to navigate into that room and spend more than a few seconds checking it out? And to pre-empt those who say gameplay may occur in there: are they going to be checking the urinal stains and broken tiles, or reacting to the game design mechanic? How much time did the player spend downloading that toilet? How much bandwidth did rendering it use, and what was the FPS cost? In a cost analysis, is it worth it?
     
    Kennth likes this.
  34. ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,853
    I am a practical guy [double entendre]. I have executed miniatures for stages, architecture, and complex signage. I appreciate the art and craft. I am pretty sure that those handling the grant monies thought it was a gas. I'm curious how much they raised on Kickstarter, but couldn't find the figures. I saw they wanted 192k USD.
     
  35. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    ippdev likes this.
  36. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Well, industry reports show that photogrammetry is actually faster; EA, and the guy behind that zombie game bought by Xbox, gave GDC talks about it. Also, there is a definite artistic touch to it that's not the same; it's basically motion capture for meshes. The raw data is useless, but it still adds a flair you can't get otherwise. Facial motion capture and scanning is the purest form of it.
     
  37. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Raw data isn't used; it is decimated. Also, UE5 will surely provide a new implementation of meshes through their virtualized geometry (suspected to be a sort of SDF-based method), where data gets crunched automatically to a reasonable size-to-detail ratio.
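
    "Decimated" can mean many algorithms. The simplest to picture is vertex clustering; here's a minimal sketch (illustrative only, and certainly not what UE5 actually does): snap vertices to a coarse grid, merge each cell into one vertex, and drop triangles that collapse.

    Code (CSharp):
    // Minimal decimation sketch: vertex clustering on a grid. Real
    // pipelines use far better methods (e.g. quadric error metrics);
    // this only illustrates "raw scan data gets crunched to a smaller mesh".
    using System;
    using System.Collections.Generic;
    using System.Numerics;

    static class Decimate
    {
        public static (Vector3[] verts, int[] tris) Cluster(
            Vector3[] verts, int[] tris, float cellSize)
        {
            var cellToNew = new Dictionary<(int, int, int), int>();
            var remap = new int[verts.Length];
            var newVerts = new List<Vector3>();

            // Merge all vertices that fall into the same grid cell.
            for (int i = 0; i < verts.Length; i++)
            {
                var key = ((int)MathF.Floor(verts[i].X / cellSize),
                           (int)MathF.Floor(verts[i].Y / cellSize),
                           (int)MathF.Floor(verts[i].Z / cellSize));
                if (!cellToNew.TryGetValue(key, out int idx))
                {
                    idx = newVerts.Count;
                    cellToNew[key] = idx;
                    newVerts.Add(verts[i]);   // representative vertex per cell
                }
                remap[i] = idx;
            }

            // Rebuild the index buffer, skipping triangles that collapsed.
            var newTris = new List<int>();
            for (int t = 0; t < tris.Length; t += 3)
            {
                int a = remap[tris[t]], b = remap[tris[t + 1]], c = remap[tris[t + 2]];
                if (a != b && b != c && a != c)
                {
                    newTris.Add(a); newTris.Add(b); newTris.Add(c);
                }
            }
            return (newVerts.ToArray(), newTris.ToArray());
        }
    }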
     
    Kennth and Deleted User like this.
  38. MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    No one is saying raw data is used? Nanite uses maps to virtualize the geometry, cool stuff!

    I don't care super much about that part though; I just hope we get more performance by default in the future.
     
  39. neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Not sure yet. He said that was the starting idea; he hasn't disclosed the final implementation or whether they kept GIM (geometry images). I have bet on SDF, but then how do you implicitly derive UVs from it (as in filtered pixel data, not exact UVs), unless it's a pixel soup in a dictionary accessed through SDF queries?
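
    For reference, one standard answer to texturing implicit/SDF geometry without authored UVs is triplanar projection: sample a texture along each world axis and blend by the surface normal. A minimal sketch of that idea (illustrative; no claim this is what Epic does):

    Code (CSharp):
    // Triplanar projection sketch: texture an implicit surface with no
    // UVs by blending three axis-aligned projections by the normal.
    using System;
    using System.Numerics;

    static class Triplanar
    {
        // Stand-in for a texture fetch; a real shader would sample a Texture2D.
        static Vector3 Sample(Vector2 uv) =>
            new Vector3(uv.X - MathF.Floor(uv.X), uv.Y - MathF.Floor(uv.Y), 0f);

        public static Vector3 Shade(Vector3 worldPos, Vector3 normal, float sharpness = 4f)
        {
            // Blend weights from the normal: faces pointing along X use
            // the YZ projection, and so on. Sharpness tightens the seams.
            var w = new Vector3(
                MathF.Pow(MathF.Abs(normal.X), sharpness),
                MathF.Pow(MathF.Abs(normal.Y), sharpness),
                MathF.Pow(MathF.Abs(normal.Z), sharpness));
            w /= (w.X + w.Y + w.Z);

            Vector3 x = Sample(new Vector2(worldPos.Y, worldPos.Z));
            Vector3 y = Sample(new Vector2(worldPos.X, worldPos.Z));
            Vector3 z = Sample(new Vector2(worldPos.X, worldPos.Y));
            return x * w.X + y * w.Y + z * w.Z;
        }
    }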