UE5

Discussion in 'General Discussion' started by scottymclue, May 26, 2021.

Thread Status:
Not open for further replies.
  1. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,012
    That's interesting, since one of the issues with Unity seems to be blurry/muddy textures that require something like Beautify to fix.

    That said, there are games made with UE which don't seem to have very heavy AA.

    I agree that UE5's new features are probably not quite ready yet for the typical game.
     
  2. Deleted User

    Deleted User

    Guest

    Quite the opposite.
    Only Lumen is a high-end feature, designed for current-gen consoles (it's not "next-gen"; games starting development in 2022 will commonly ignore the older consoles) and high-end PCs. It won't run on the GPUs currently integrated into CPUs, or on mobile.

    All other features can be used for your typical game.
    Nanite optimizes disk size and RAM usage even for low-poly static meshes (support for skeletal meshes is coming).
    That is coupled with a GPU-driven renderer and their own software rasterizer.

    You can forget about counting draw calls: AFAIK there's only a single draw call per material, no matter how many instances. No more static or dynamic batching of standard static meshes would be required, I guess. It solves decades-old workflow issues. Finally, we could go wild with modular level design. Easier to create a bigger or more detailed world, even in a small team!

    World Partition is an amazing upgrade for world-building in any team. It's a must-have when working in a huge open world, but all of us can utilize it. Death to sublevels, finally :)

    There are also great upgrades coming to the gameplay framework, audio, and physics. A lot of that is already present in 4.26/4.27 at the beta/experimental stage.
    https://docs.unrealengine.com/5.0/en-US/
     
  3. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Nah, Epic clearly stated this is nowhere near any good for PS4 let alone anything else. I am referring only to lumen in this case. It's very, very heavy. I've been evaluating it for a long time now. Their live streamed videos and docs clearly state the performance issues.

    For example for lumen to function indoors, the best performance they can offer is on a 2080+ card, 30fps. That's the limit, it's just too costly. That won't even run on PS4 btw, it is not supported yet.

    Nanite is, but currently there's no alternative except screen-space GI solutions for lighting. Try to put it in perspective: it is Epic saying this, not me. They are very clear about what the targets are for this new tech.

    It's not Switch.
     
    NotaNaN likes this.
  4. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,012
    That's why I said 'probably'! I haven't tested it myself, so I can't say for sure, but I was thinking of Lumen which is what I would find most useful.
     
  5. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Nanite so far improves performance for any meshes which can use it, since it fixes bottlenecks of both forward (overshading) and deferred (g-buffer bandwidth) rendering and batches per material, even if you have different meshes.

    If you use a single material with a texture atlas or texture array you can get as far as drawing an entire scene worth of opaque static meshes in a single draw call.

    On Switch this could allow us mere mortals to approach the scene complexity of Doom Eternal, which is unrivaled on Switch. I often stopped to look around at how wild the devs went with polygonal detail there. Modelled brick walls, actual geometry for rivets, bezels and other small details, hard-surface porn everywhere, all while pushing fewer total triangles on screen than Doom 2016 thanks to their new geometry culling pipeline.
     
    Deleted User likes this.
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Why do you keep talking about what-ifs and "could allow" when Epic is contradicting you? Watch their live streams for Nanite last week and Lumen this week. You will see they are clearly communicating that it is for next-gen runtimes. Anything else is not guaranteed and is just guesswork, which is bad for business.

    If you want Doom Eternal, use a highly modified and optimised UE4 or the upcoming Forward+ support in URP.

    It is guesswork which ruins developer dreams. Everyone is happier if they start and finish a project with known and clear boundaries. I don't mean to sound harsh but it is that kind of dreaming which can cause people to put Unity or even Unreal on a pedestal then knock them down later when it doesn't live up to guesswork.

    This is the state of Unity in 2021 and it is proven:

    [embedded video]

    Pretty good presentation in my view.
     
    NotaNaN and Antypodish like this.
  7. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    I agree with hippo that there are too many "ifs" here.

    In my opinion the best idea is to assume that a technology that is not finished now will never be completed, because by doing that you'll avoid the danger of betting on things beyond your control.
     
  8. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Watched both streams, and it's true: they state Lumen is intended for next-gen consoles and high-end PCs.
    The latter is somewhat useless for production since, unlike consoles, people will have all types of specs, most of them not that high-end, currently at least. Although it's still useful for things that aren't real-time games on PC. And maybe as a quality setting? You can tell Lumen was targeted at next-gen; it fits perfectly, especially when you consider the work/optimizations they'll do specifically for consoles, since the hardware is uniform.

    I think Nanite isn't as demanding as Lumen.

    I believe when they stated that it targeted next-gen, they were talking about Lumen. I don't remember hearing that about Nanite on the stream; correct me if I'm wrong.
     
  9. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The online docs for UE5 will give you further information, including timings on specific GPUs, such as Lumen's heavy performance indoors and so on.

    People reading things like Nanite being faster than a traditional pipeline would do well to remember Epic's comment that this doesn't include how hard it taxes other parts of the GPU. For example, bandwidth is a huge issue, not just draw call merging or micro-poly optimisations. When they say it's faster than a classic pipeline, they mean it's faster on a 2080. It's faster on the same high-end hardware.

    Put it on a switch and the compute shaders they use will never ever run, let alone anything else, and if they did get that working the rest would be taken out to a field and shot by bandwidth requirements. And that's just nanite.

    That said, I can't imagine anything better right now for next gen consoles except maybe Ratchet and Clank which I have a huge engine love affair with.
     
  10. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    I checked it out in the past and my conclusion was that the strengths (primarily the double-precision world) that would make me want to evaluate it were locked behind the far more expensive tiers, leaving it completely out of my reach.
     
  11. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yeah, on the same machine supporting it. So on a Switch this means it does not run, at all. If it did, you would be looking at (for the sake of argument) something going 4fps instead of 3fps.

    It's simply the wrong technology for a lot of devices, regardless. And yes, we have tested it on a 980 (it sucks) and a 2070S (it is 30fps full screen, and up to 60) using the Valley demo. It's incredibly slow on the 980; certainly not worth the customer dissatisfaction for those framerates on a 970 (or 980 in our case) in a real Nanite scene.

    So it is faster on a 980 running at 10-20fps at 720p as opposed to issuing 10,000 draw calls, which would run at even less than 10 or 20fps. It's still terrible.

    You surely understand it's faster relative to the same scene contents, and without any measurement of bandwidth, right? In short, you would have to be abusing a classic pipeline really badly for Nanite to be faster, and Nanite still isn't fast enough to be useful on lower-end hardware. I base this off actual testing.

    I'm quite sure it also matters a lot that they are talking about film and AEC contexts when they say it's faster.
     
    Zarconis, NotaNaN and FernandoMK like this.
  12. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    I disagree. The spoiler below has a video of the UE5 demo running at 1080p with 50% resolution scaling on an i7-4790K with 16GB RAM and a GTX 1060. He easily achieves 30 FPS.

    [embedded video]

    If we check the Steam hardware survey, just under 46% of users have at least a GTX 1060.

    https://store.steampowered.com/hwsurvey/directx/
     
    MadeFromPolygons likes this.
  13. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Their new temporal super resolution means low screen percentages can look good and still give better performance.
     
  14. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Well, I'm testing it right now, and running at 720p-ish with upscale is pretty darn ugly, to the point I'd wonder what you'd be achieving. I'm doing all my tests myself and it's not like that. I call fibbies and tweaksies, or rather: it's still not a Switch.

    Let's say it does work on a Switch eventually (technically it could, with a lot of optimisations): you would be struggling to find any spare processing left for anything else.

    I am testing on a 2070S and a 980, and both have a basic 6-core Ryzen CPU, so that may well be a factor in why it is slower.
     
  15. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    Oh I thought you were referring to or at least including non-enthusiast PCs. :p
     
    Last edited: Jun 12, 2021
  16. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Ya, if you want to find areas where Unity lags Unreal, the next-gen stuff seems like a poor place to start.

    Start with more basic stuff, like a lot of the higher-level features in HDRP that are poor implementations of things Unreal does better. Shading and the rendering pipelines are designed in a much more black-box way than in Unreal. It's like you need source access, whereas the lighting models in Unreal were actually designed to be extended.

    Or features like Terrain.

    A studio looks at the basic/core stuff first. If Unity isn't competitive there, the rest is almost a moot point.

    And for PC games Unity is not there yet. In some games you might win enough in other areas to offset the cost. But Unity will absolutely cost you more time/money on the rendering side if you are making a PC game in HDRP.
     
  17. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I think Unity is really good when it's just PhysX, C#, FMOD, URP, etc. That slimmed-down Unity is strong, reliable, and performant in my experience.

    It just falls to pieces whenever Unity adds stuff, since said stuff just isn't finished or is forever almost good enough. I am hoping Unity addresses that, based on their video above.
     
    Zarconis likes this.
  18. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,723
    Unity is really good when it's mostly about stuff they didn't really make? :p

    Wait... Does Unity need to outsource all their features?
     
  19. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    Considering we're back with Enlighten again... :p
     
  20. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    You had to go there :D
     
  21. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,723
    The things I really liked in Unity 4.x and earlier were Beast, the snappy editor / fast iteration, and writing shaders.

    Beast is gone.

    Editor is slow.

    Writing shaders seems to be on the way out (and Shader Graph is weirdly limiting).

    And I can't think of a single feature that was introduced after that era that I genuinely like to use. (Smaller stuff, sure, but a completely new feature... maybe the memory profiler? It's been experimental for 6 years now... They're not going to finish it, are they?)
     
  22. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I agree it's gotten hard as hell to write shaders. We have to duplicate this huge monolithic thing for a lit URP shader, and it's hard to change. Jason Booth wrote Better Shaders to address that problem, but I have a problem using his product: I don't want to keep relying on 3rd parties, and that's just because I like to be in control.

    So each version of Unity just seems like I have to do more work haha.
     
  23. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Hippo, the Switch GPU has the same architecture as the GTX 9XX series, just clocked lower.

    You are still focusing solely on the micropolygon part of Nanite. Yes, putting a million-triangle model there won't do. But the way it renders "normal" triangles is a net positive bandwidth-wise everywhere.

    Also, on the Switch the CPU is often the biggest bottleneck, and if you can shave off hundreds of draw calls by offloading stuff to the GPU, you should. I'm talking from experience: in a Switch port of a Unity project I worked on, we re-created the Unity terrain and foliage system entirely on the GPU using compute shaders and indirect drawing, which was significantly faster.
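    The compute-cull plus indirect-draw pattern described above can be sketched in Unity C# roughly as below. This is a minimal illustration, not the actual port's code: the kernel name, buffer names, and the culling compute shader itself are all assumed.

```csharp
using UnityEngine;

// Sketch: render many foliage instances with a single indirect draw call.
// A compute shader (not shown) is assumed to frustum-cull instances on the
// GPU and append the survivors to 'visibleBuffer'.
public class GpuFoliage : MonoBehaviour
{
    public Mesh grassMesh;
    public Material grassMaterial;
    public ComputeShader cullShader;   // hypothetical culling shader
    public int instanceCount = 100000;

    ComputeBuffer allInstances;        // world matrices of every instance
    ComputeBuffer visibleBuffer;       // append buffer of visible instances
    ComputeBuffer argsBuffer;          // indirect draw arguments

    void Start()
    {
        allInstances = new ComputeBuffer(instanceCount, 16 * sizeof(float));
        visibleBuffer = new ComputeBuffer(instanceCount, 16 * sizeof(float),
                                          ComputeBufferType.Append);
        // Layout: index count, instance count, start index, base vertex, start instance.
        argsBuffer = new ComputeBuffer(1, 5 * sizeof(uint),
                                       ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(new uint[] { grassMesh.GetIndexCount(0), 0, 0, 0, 0 });
    }

    void Update()
    {
        // GPU-side cull: no per-instance CPU work, no per-mesh draw calls.
        visibleBuffer.SetCounterValue(0);
        int kernel = cullShader.FindKernel("CSMain");
        cullShader.SetBuffer(kernel, "_AllInstances", allInstances);
        cullShader.SetBuffer(kernel, "_Visible", visibleBuffer);
        cullShader.Dispatch(kernel, Mathf.CeilToInt(instanceCount / 64f), 1, 1);

        // Copy the surviving instance count into slot 1 of the args buffer.
        ComputeBuffer.CopyCount(visibleBuffer, argsBuffer, sizeof(uint));

        grassMaterial.SetBuffer("_Visible", visibleBuffer);
        Graphics.DrawMeshInstancedIndirect(
            grassMesh, 0, grassMaterial,
            new Bounds(Vector3.zero, Vector3.one * 1000f), argsBuffer);
    }

    void OnDestroy()
    {
        allInstances.Release();
        visibleBuffer.Release();
        argsBuffer.Release();
    }
}
```

    The point of the design is that the instance count never round-trips to the CPU: the cull result stays in GPU memory and feeds the draw directly.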
     
    Deleted User likes this.
  24. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    No. It's not just clocked lower. It has far fewer units too. The Switch has 256 shader processors, 16 texture mapping units, and 16 render output units. A GTX 960 has 1,024 shader processors, 64 texture mapping units, and 32 render output units.

    Furthermore it has far lower memory bandwidth. The Switch uses LPDDR4 @ 3200MT which has a maximum bandwidth of 25.6GB/sec. A GTX 960 uses GDDR5 @ 7000MT which has a maximum bandwidth of 112GB/sec.

    On paper that's only a 4.375x difference but you need to keep in mind that the CPU and the GPU have to share while the GTX 960 has that memory entirely to itself. Memory bandwidth is a very important factor in a GPU and having both slow and shared memory is crippling to performance.

    If you want an example of how much memory bandwidth can affect a graphics card go look at benchmarks for the recently released RTX 3070 Ti. It has 4% more cores and 5% higher clocks than the base model but it's on average 12% faster. Why? Because it has around 35% more memory bandwidth.

    https://en.wikipedia.org/wiki/Nintendo_Switch#Technical_specifications
    https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_900_series
    https://www.tomshardware.com/news/nvidia-geforce-rtx-3070-ti-review
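    The figures above check out with simple peak-bandwidth arithmetic. The bus widths (64-bit for the Switch's LPDDR4, 128-bit for the GTX 960's GDDR5) are assumed from public specs:

```python
def peak_bandwidth_gb_s(mt_per_s, bus_width_bits):
    """Peak memory bandwidth: effective transfers/sec times bytes per transfer."""
    return mt_per_s * 1e6 * (bus_width_bits / 8) / 1e9

switch = peak_bandwidth_gb_s(3200, 64)    # Switch LPDDR4 @ 3200 MT/s -> 25.6 GB/s
gtx960 = peak_bandwidth_gb_s(7000, 128)   # GTX 960 GDDR5 @ 7000 MT/s -> 112.0 GB/s
print(switch, gtx960, gtx960 / switch)    # ratio is 4.375
```

    And as noted, the on-paper 4.375x gap understates the real difference, since the Switch's CPU shares that same bus.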
     
    Last edited: Jun 13, 2021
  25. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    When all is said and done, I really hope Switch is a target. It's technically possible, but there's still the SSD storage requirement as well. A regular hard drive is going to have problems feeding Nanite. This is all before post effects of course, and the entire game on top: enemies, laser-spewing whales and the like, with procedural audio.

    Honestly, Unity's pretty well suited to the switch, I think I'd pick Unity for it.
     
  26. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    They debunk that in the Nanite presentation video on their YouTube channel.
     
    Deleted User likes this.
  27. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Sorry, I should've been more clear. The context is a regular hard drive on a mobile device like a Switch. This device doesn't have the 32GB minimum system RAM to buffer a slow hard drive like the YouTube channel has, so it's not in the slightest bit debunked.

    On our 16GB systems we can see in our tests that the data is buffered into RAM very quickly, and is not really a bottleneck. We tried Valley and that was fine too. But when we added several different objects, the pressure increased quite a bit.

    So in the theoretical scenario of switch (which I've been talking about in nearly all my posts here as an example of scaling) I think it will matter quite a lot.
     
  28. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    The Switch doesn't have a hard drive but I do agree the internal storage and the memory card slot are very slow compared to the solid state drives in the PS5 and Xbox S/X. That said according to the tech overview Nanite has superior compression so it should still work for some use cases.

    Just in case someone missed them, here are the current presentations.

    [embedded videos]
     
  29. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Again, IO speed would only be such a problem if you're aiming for massive poly counts far beyond what the top Switch games currently do. Again, Doom Eternal is constantly streaming in meshes and textures and does fine for its level of fidelity. I consider that the upper bound of what's possible on the platform.
     
    Deleted User likes this.
  30. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    However, Doom Eternal does not use Nanite. Which means that while it is possible to maintain good performance on the platform, there's no guarantee that this applies to Nanite.
     
  31. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    Which means Doom Eternal is having to stream in larger mesh files than it otherwise would have if it used Nanite.
     
  32. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    That is not necessarily true.

    It means that the application operates using a different approach compared to Nanite, and whether this can be used to estimate performance of Nanite is unknown.

    I'd recommend waiting for an actual live test on Switch.
     
  33. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,141
    Nanite has a lossy compression technique. It doesn't store all of the source data, and out of what it does store it only streams in the info that it needs to render. Doom Eternal's tech review left me with the impression that it's largely the same techniques we've used for a while now.

    https://www.eurogamer.net/articles/digitalfoundry-2020-doom-eternal-tech-review
     
    Last edited: Jun 13, 2021
    Deleted User likes this.
  34. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Doom Eternal doesn't use a visibility buffer for rendering, but it does use a custom meshlet-like solution for culling and streaming its static level geometry. They did have to change some of their techniques for the Switch, because Maxwell doesn't do async compute (it only came to NVidia GPUs with Pascal), the lower bandwidth didn't play too well with their use of bindless textures everywhere, and their compute-based hidden triangle removal had to be replaced with NVidia's exclusive "fast GS", a limited form of geometry shaders which runs much faster than traditional GS for things like culling and multi-view drawing.
     
    neoshaman and Deleted User like this.
  35. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    If Nanite runs on Switch, Fortnite will let us see when they upgrade it in the future. :D:p
     
  36. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Anyone have a laptop with a GTX 920/930/940M lying around? You know, for science.
     
  37. Lord_Eniac

    Lord_Eniac

    Joined:
    Jan 28, 2020
    Posts:
    50
    Agreed.
     
  38. Zephus

    Zephus

    Joined:
    May 25, 2015
    Posts:
    356
    Can anyone tell me why it takes about 10 seconds to create a new Unreal Project with an example Scene in it while I have to wait about 2 minutes for an empty Unity project to load? What happened to Unity being the lighter engine?

    Also - why can I press the play button and the game just instantly starts? Everything is ten times more responsive while having a lot more going on and much better graphics. How?
     
  39. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Unity's C# / ease of programming doesn't come without cost :)
    Compiling shaders in Unreal takes a few years relative to Unity.

    In general, Unreal has better runtime performance and viewport editing performance.

    That said, around the start of this thread, and a long time ago, I used Unreal for a while, and crashes, for me personally, occurred way more often than in Unity.

    I've also submitted a few bugs for UE4, and one for UE5, including crashes that occur every time when doing something in the editor (features that straight up crash the engine), the most recent a few weeks ago, and never heard back to this day. Which kind of sucks.
     
    Last edited: Jun 21, 2021
  40. DuvE

    DuvE

    Joined:
    May 22, 2016
    Posts:
    168
    UE4 shaders take forever to recompile, this is true. Also, in general, working in UE4 is a slower process because of Blueprints. You also need to memorize some weirdly-named nodes.
     
  41. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    Idk about this part; I think the editor has been like this since the 2020 release.

    You need to enable the new Enter Play Mode options. Idk why they didn't make this active by default.
     
  42. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    I tried it out a while ago, it's pretty fast.

    From what I read, you'll need to make some changes/additions to your code if you want to use it all the time, since what the option basically does is disable domain reloading and scene reloading.

    If you run the following, then exit play mode, then press play again, the counter won't be reset to 0.
    It'll be what it was last time.

    [code screenshot from the blog post]

    https://blog.unity.com/technology/enter-play-mode-faster-in-unity-2019-3
    And if you have any assets, they'll need to do the same to work correctly.
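    The blog's pattern boils down to resetting static state yourself when domain reload is skipped. A minimal Unity C# sketch (the counter and key binding are illustrative, following the linked blog post):

```csharp
using UnityEngine;

public class Counter : MonoBehaviour
{
    // With Domain Reload disabled, static fields keep their values
    // between play sessions unless reset explicitly.
    static int counter = 0;

    // Runs before the first scene loads on every entry into play mode,
    // even when the Enter Play Mode options skip the domain reload.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.SubsystemRegistration)]
    static void ResetCounter()
    {
        counter = 0;
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            counter++;
            Debug.Log("Counter: " + counter);
        }
    }
}
```

    Without the `ResetCounter` method, the count carries over from the previous play session, which is exactly the behaviour described above.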
     
  43. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Really? screenshots of code on a programming forum? :) In any case - yep need to take care of resetting scene state and variables but it is a lot faster...
     
    FernandoMK likes this.
  44. ExtraCat

    ExtraCat

    Joined:
    Aug 30, 2019
    Posts:
    52
  45. Zarconis

    Zarconis

    Joined:
    Jun 5, 2018
    Posts:
    234
    I've tried it too, on a GTX 1080 Ti and a Vega 64, and I agree with Hippo: the performance sucks. UE has always had an issue with scalability; anything lower than medium settings and it looks several times worse than Unity out of the box with nothing on. Upscaling really doesn't work either.

    8ms of overhead for Lumen, per their docs, is useless for a game; 2-3ms like SVOGI would be ideal. One has to remember that outside of a tech demo, in game we've usually got multiple particle systems on the go, several characters, UI overhead, background event processing, a bit of foliage, water shaders, etc. I can tank a 1080 without trying all that hard, and that doesn't include the use of Lumen. Also, 30FPS on next-gen consoles is pretty poor.

    It needs a lot of work, I just hope this time around they don't rip it out of engine and wait until UE6 to give it another go.
     
  46. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    I mean, it's from the blog. It's faster to screenshot it, which takes 3-5s, than to write it out. :D
     
    hippocoder likes this.
  47. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,978
    And it's just as quick (and better to read) to copy and paste it, using code tags ;) No need to write anything :)
     
    Antypodish likes this.
  48. PutridEx

    PutridEx

    Joined:
    Feb 3, 2021
    Posts:
    1,136
    Oh, I thought the one in the blog was a picture. Just realized I can copy it. :eek:
     
  49. To be fair, it's early access, which means experimental. They have ~a year to improve performance and iron out the wrinkles. Who knows, maybe they'll pull this off.
     
  50. Zarconis

    Zarconis

    Joined:
    Jun 5, 2018
    Posts:
    234
    Yes, obviously. I wouldn't put much hope in it being ready in a year, though. Not that it's an issue for me personally; I don't develop based on things to come.
     