
Unreal Engine 5 = Game Changer

Discussion in 'General Discussion' started by DigitalAdam, May 13, 2020.

Thread Status:
Not open for further replies.
  1. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    arkano22 and Ryiah like this.
  2. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    N1warhead likes this.
  3. AlanMattano

    AlanMattano

    Joined:
    Aug 22, 2013
    Posts:
    1,501
    And baking in Unreal? I remember falling asleep while waiting for it. I feel like Unreal is more for a developer with 20 years of experience.

    I migrated from UDK3 to Unity 3/4 because I couldn't cope with the scripting and the editor limitations, for a game on a roughly 500x500 km map, targeting PC first and VR later.
    But now, after 10 years, it looks like I will never finish my game. Unity did teach me how to make a game!

    Question: is it actually better to start with URP, or with the built-in pipeline, which is mature but will eventually be deprecated?
    I question Unity's decision to split the render pipelines. My target is high-end PC and VR, but now I'm afraid of starting with HDRP...
     
    Last edited: May 16, 2020
  4. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    UE5's realtime global illumination is bake free. :)
     
    Ony, SunnySunshine, MrPaparoz and 2 others like this.
  5. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    It's funny how our passion for and history with Unity is putting me in a state of denial...

    I am hoping that in a few months things will be different

    like Unity announcing game development,
    finishing the preview packages,
    completing HDRP terrain vegetation,
    releasing realtime GI, and
    making a Nanite alternative available...

    and I will be chanting victory...

    but that is just my wishful thinking...

    when I sit down to plan my next project...I have to become a heartless businessman,
    and make the right choices...
    and these choices are too easy and clear cut...
    sigh....
     
    Ramobo, Tanner555, MrPaparoz and 2 others like this.
  6. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    In an open world you can get away with dynamic lights and no baked lightmaps.
    In my experience, Lightmass is faster than Enlighten when it comes to baking, because Enlighten is CPU-only, can require a day per bake depending on what you do, and can also hoard insane amounts of memory - like 24-gigabyte allocations on a 16-gigabyte system. However, Lightmass is not GI; for example, emissive materials do not produce light bounces. On the other hand, Lightmass baking can be distributed over the network through Unreal Swarm, which is not a thing with Enlighten.

    My experience with the Progressive Lightmapper in Unity has been fairly negative, despite it looking cool in preview. I did grab the Bakery asset to investigate the possibility of transferring lightmaps from it, though.

    Also... weren't there a ton of complaints on the Unity side when Enlighten was introduced in the first place? In the end, after all that time, Enlighten is being phased out.
    https://forum.unity.com/threads/uni...ghting-a-big-step-backward-from-beast.343025/
    ^^^ Enlighten vs Beast.
     
  7. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    It's definitely wishful thinking. Nanite is very clearly a complex beast based on everything I've read, including statements by the core developer that it took him more than a decade of research and more than three years of full-time development.

    https://twitter.com/BrianKaris/status/1260590413003362305
     
    AlanMattano and hippocoder like this.
  8. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,794
    3 years! That’s almost enough time to write an FBX exporter!
     
  9. AlanMattano

    AlanMattano

    Joined:
    Aug 22, 2013
    Posts:
    1,501
    I'm not so sure about that. Other 3D software has also struggled for decades with its FBX exporter. I'm still filing bug reports.

    I have 100 GB of RAM. But even just using one directional light and not baking, Nanite is attractive.

    [Image: UE5-Unity.jpg]
    I found this on Twitter.

    My wife is looking at the calendar for when I will release, and by now my baby is walking across the desk.
     
    Havok_ZA and hippocoder like this.
  10. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    Something I've "always" wondered: do people really just sit in front of their computers and wait while lighting is baking, doing nothing whatsoever? It's SSDs all over again.
     
  11. AlanMattano

    AlanMattano

    Joined:
    Aug 22, 2013
    Posts:
    1,501
    I mean baking, compiling code, building the project. When the CPU and GPU are maxed out, there is not much else you can do.

     
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    The only sensible conclusion of this thread is that @hippocoder was right all along, GI has been solved for video games ...
     
  13. jjejj87

    jjejj87

    Joined:
    Feb 2, 2013
    Posts:
    1,117
    I usually bake lighting on my secondary machine with NVMe drives. It allows me to keep working while it bakes.

    I've had multiple occasions where I started a bake -> left to get lunch -> came back to find out something had gone wrong.

    Now I just bake whenever I need to and keep working.

    I am thinking of upgrading the sculpting and Substance Painter machine, as it eats GPU and RAM like crazy.
     
    EternalAmbiguity and pcg like this.
  14. tatoforever

    tatoforever

    Joined:
    Apr 16, 2009
    Posts:
    4,369
    So you really think this whole demo is, what, 500 TB on the PS5 SSD? Think about it twice: the PS5 won't ship with more than 2 TB of storage, so if what you say were correct, the opening of that demo alone would take more than 2 TB.
    What I think this new tech does is encode high-poly mesh data into 8K textures (which can hold a tremendous amount of data, and the Epic folks said something about textures holding geometry data in their gamesfoundry interview video). That data is then streamed to the GPU and the mesh is reconstructed on the GPU. The resulting model is a high-poly mesh that does not require normal maps, tessellation, height maps, or whatever else we used before to fake dense models. It also makes it easier to hit the one-pixel-per-triangle target they mentioned.
    So what you need for games built with UE5 is not large storage space but fast storage drives.
    I can even speculate that those games will take roughly the same space as today's games, or maybe less.
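    Just to illustrate what I mean by "encoding mesh data into textures", here's a rough Python/NumPy sketch of the general idea - quantized vertex positions packed into the channels of an image and decoded again on the other side. This is only my guess at the concept; it is not Epic's actual format, and the resolution and bit depth are made-up numbers.

    Code (Python):
    # Illustrative sketch only: pack vertex positions into an RGB image so the
    # geometry can be streamed and decompressed like a texture. Not Epic's format.
    import numpy as np

    def encode_positions(positions, resolution=1024, bits=16):
        """Quantize (N, 3) float positions into an RGB image of uint16 texels."""
        lo, hi = positions.min(axis=0), positions.max(axis=0)
        norm = (positions - lo) / np.maximum(hi - lo, 1e-8)      # map to [0, 1]
        quantized = np.round(norm * (2**bits - 1)).astype(np.uint16)
        image = np.zeros((resolution, resolution, 3), dtype=np.uint16)
        idx = np.arange(len(positions))
        image[idx // resolution, idx % resolution] = quantized   # row-major packing
        return image, lo, hi

    def decode_positions(image, count, lo, hi, bits=16):
        """Reconstruct approximate positions from the packed image."""
        idx = np.arange(count)
        q = image[idx // image.shape[1], idx % image.shape[1]].astype(np.float64)
        return lo + (q / (2**bits - 1)) * (hi - lo)

    # An 8192x8192 RGB16 image would hold ~67 million vertices in roughly 400 MB
    # before any wavelet/JPEG-style compression is applied on top.
    verts = np.random.rand(1_000_000, 3).astype(np.float32)
    img, lo, hi = encode_positions(verts)
    approx = decode_positions(img, len(verts), lo, hi)
    print(np.abs(approx - verts).max())   # tiny quantization error at 16 bits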
     
    TalkieTalkie, Vagabond_ and Billy4184 like this.
  15. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    Correct. My friend said it is using a hybrid of something called 'Euclideon tech', which was conceived many years ago.
    The genius bit comes in the encoding and reconstruction on the GPU: when you drag a highly tessellated mesh into Unreal, it reconstructs the data on the fly and encodes it in a special proprietary format. Ultimately it will result in only a slightly bigger download size than existing games like 'Red Dead Redemption', etc. - well, that is until Stadia fully takes off.

    My friend also said that unlimited LOD will not affect the creation of movable, riggable characters because the workflow is not suited for that, although I'm not sure what she means.
     
    Last edited: May 16, 2020
    tatoforever likes this.
  16. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    About Lumen: we don't know the performance yet. If we can get it to run in desktop VR, I'm sold :p
    So freaking tired of lightmaps.
     
  17. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    It is a great time to grab a coffee, though :)
     
  18. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    The time is not the worst thing about it, though. But I would be a bit happier if the GPU Progressive Lightmapper worked with scenes bigger than my first apartment. That would give much faster turnaround times.

    But, man, all the problems.
     
  19. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    Haha, I'm not sure what you mean, as I don't handle graphics. But my friend said Unity has a good solution with something called light probe capture in the demo video, if they can move off the RTX cards and make it scalable.

    But remember, with VR it has to render both eyes, so performance is more critical, and you have to use a special render mode - is it called 'deferred' mode, or something?
     
  20. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Deferred rendering is not good for VR since it does not have a proper anti-aliasing mode.
     
    AlanMattano likes this.
  21. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    OK, it must have been some other mode then.
     
  22. protopop

    protopop

    Joined:
    May 19, 2009
    Posts:
    1,561
    I hate to admit there's some truth to this.

    The only real issue I have with Unity is that I'm afraid to update to new versions of the engine, and this is keeping me from updating games I need to update, because, especially on mobile, Apple and Google keep adding new requirements. But almost every update brings regressions and/or worse performance and breaking changes. If Unity can fix this I'm happy with the rest of the Unity experience, because aside from this I love Unity for what it is, not what it isn't.
     
    valarnur and LIVENDA_LABS like this.
  23. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    I can recommend getting a 16-core CPU. I have the AMD 3950X, and even though the PLM runs all 32 logical CPUs at 100%, the computer is still responsive in other programs.

    Edit: With PLM plus Project Acoustics baking, though, the computer starts to struggle - I guess because Project Acoustics uses Docker and the container takes 16 of my 32 gigs.
     
    Last edited: May 16, 2020
  24. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    Apple and Android are super critical when it comes to stable updates. Also, my friend says Xcode gets more bloated with every version. Best to do updates cautiously, sir.
     
    protopop likes this.
  25. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    I am using a Windows HP laptop with 4 GB of RAM to build the mechanics, and I use Trello for comms. Works well :)
     
  26. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    It all boils down to how complex the game is. Our scenes are quite big - for example this one:

    [Image: 152251a-1.png]

    A 512x512 terrain, some backdrop mountains, and an interior complex of about 50x300 meters. I wouldn't want to navigate that scene on a 4-gig laptop, even less so before the lightmaps are baked.
     
  27. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,929
    If visual scripting is part of the plan to make it easier to use, even in simple cases, it only makes sense.

    See, DOTS in its current state is just a backend of sorts. It does not have any of the nice wrappers/scaffolding/visual tools on top; it is just a skeleton. Being the most efficient skeleton possible, the obvious thing to do is to flesh it out with the existing editor workflow (gameobjects/components). That's what they're doing with the gameobject/entity conversion stuff, and I think it is brilliant: have two different representations of the same thing, one that's comfortable for humans to work with, and another that's comfortable for the computer to run. This way you get both an approachable editor and games that run lightning fast.

    The issue is they just let everyone use this raw skeleton as soon as they had it. They told everyone it is the best way to write efficient code and assumed most people would make sense of it, appreciate its potential, and benefit in some way. However, the vast majority of their userbase was just confused: "what? am I supposed to use this thing to write all my games from now on? It's much harder to use! It runs slower than my old code! Why can't they just make gameobjects faster?" and so on. Not blaming users, as this is perfectly understandable: Unity forgot who its audience was, expecting inexperienced developers to understand and use something they possibly can't. If they could, they would probably be using Unreal, or writing their own toy engine just for the sake of it.

    As a result, the entire engine is now in this weird limbo where things are too complex and cumbersome to be used in an “indie” spirit, but too brittle and underdeveloped to be used by professional devs. Good luck getting out of the “twilight zone”!
     
    Last edited: May 16, 2020
    DavidJares, jiraphatK, Ramobo and 5 others like this.
  28. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Is it the PS5's whopping memory bandwidth that makes this demo possible?

    https://www.psu.com/news/ps5-ram-amount-speed-officially-confirmed-by-sony/

    In comparison, your PC probably has only a tenth of the memory bandwidth of the PS5...



    Will we need a revolution in PC hardware to compete with the next generation of gaming consoles?
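    For rough numbers (assuming Sony's publicly stated ~448 GB/s figure for the PS5's unified GDDR6, and a typical dual-channel DDR4-3200 desktop as the PC baseline - both of those are my assumptions, not anything from the demo itself):

    Code (Python):
    # Back-of-the-envelope comparison: PS5 unified GDDR6 vs. a typical desktop.
    ps5_gddr6 = 448.0                        # GB/s, Sony's stated bandwidth figure
    ddr4_3200_dual = 3200e6 * 8 * 2 / 1e9    # MT/s * 8 bytes/transfer * 2 channels
    print(ddr4_3200_dual)                    # ~51.2 GB/s theoretical peak
    print(ps5_gddr6 / ddr4_3200_dual)        # ~8.75x - roughly the "tenth" above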
     
    Last edited: May 16, 2020
    Lex4art likes this.
  29. LIVENDA_LABS

    LIVENDA_LABS

    Joined:
    Sep 23, 2013
    Posts:
    377
    Sorry for the shameless plug, but we had to mention this: after analyzing the Epic footage extensively - the uncompressed Vimeo version and screenshots - I am sorry to say their use of TAA is... bad. It is still extremely blurry even at 4K and certainly does not do justice to their new tech. Our latest CTAA (Cinematic Temporal Anti-Aliasing) solution for Unity is infinitely better in every respect, period (HDRP version coming soon). CTAA introduces an almost imperceptible amount of blur and retains sharpness much better than any other solution during motion.

    From a visual perspective this has a tremendous impact on the final fidelity. UE's TAA gets even worse during motion. It has taken us literally six years to reach this level with CTAA, and we would like to invite anyone interested to at least check it out. CTAA also works with VR, in both multi-pass and single-pass modes.
     
    Shizola likes this.
  30. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    I would say global illumination is more important than the anti-aliasing technique, especially when coupled with a new way to handle high-resolution assets.

    Additionally, Epic Games could adapt your solution to their engine if it catches their interest.
     
    LIVENDA_LABS likes this.
  31. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Do you have an evaluation version? Many indies probably will not shell out over 100 EUR before they know it works well for their use case.
     
  32. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,161
    Wow, you sure took that post and really ran with it in the most ridiculous ways. That demo? Yeah, it COULD be 2 TB, because it's literally a tech demo. Did I say it would be that big? No. You said I said it'd be that big, though.

    The simple fact is that they said absolutely nothing about compression tech, and even Epic themselves said that the statue in question was twenty-four 8K textures. I then made a simple extrapolation based on the fact that an uncompressed 8K texture comes in at 85 MB, which means that statue comes in at 2 GB uncompressed. Then I made a generous assessment that even if they got that down to 50% compression, that's still coming in at a full gigabyte.

    This is something we have to consider because we are already dealing with some games coming in at 100 GB file sizes on current hardware. Epic has done literally nothing to assuage worries about increasing game file sizes in the era of primarily digital distribution.
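    Spelling that arithmetic out (note that a truly raw RGBA8 8K texture is ~268 MB; ~85 MB is roughly what an 8K texture in a 1-byte-per-texel block format like BC7 works out to with a full mip chain - that interpretation is mine, not something Epic stated - so if anything the estimate below is conservative):

    Code (Python):
    # The statue estimate above, written out.
    texels = 8192 * 8192
    rgba8_mb   = texels * 4 / 1e6               # ~268 MB, raw RGBA8, no mips
    bc7_mib    = texels * 1 * (4 / 3) / 2**20   # ~85 MiB, 1 byte/texel format + mips
    statue_gb  = 24 * 85 / 1000                 # ~2 GB for the quoted 24 textures
    shipped_gb = statue_gb * 0.5                # ~1 GB at a generous 50% compression
    print(rgba8_mb, bc7_mib, statue_gb, shipped_gb)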
     
    Martin_H, SunnySunshine and tmcdonald like this.
  33. tmcdonald

    tmcdonald

    Joined:
    Mar 20, 2016
    Posts:
    160
    I agree. I might even go so far as to say they've done the opposite with this technology. I'm half-expecting games to be in the 200-300 GB range not long after the next gen consoles come out.
     
  34. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Yes, but current-generation memory is due to be supplanted very soon by a new generation. A dual-channel kit of DDR4-3200 runs at 51.7 GB/s while a dual-channel kit of DDR5-8400 runs at 134.4 GB/s. Widespread adoption of DDR5 is a few years out, but the same can be said for UE5. By the time we need that memory performance, we'll have it.

    https://www.extremetech.com/computing/308848-sk-hynix-plans-for-blazing-fast-ddr5-8400-pc-memory
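    Those figures are just the usual theoretical-peak formula - transfer rate times 8 bytes per 64-bit channel times 2 channels - so real-world sustained bandwidth will be somewhat lower:

    Code (Python):
    # Theoretical peak bandwidth for a dual-channel memory kit.
    def dual_channel_gb_per_s(mega_transfers_per_s):
        return mega_transfers_per_s * 1e6 * 8 * 2 / 1e9   # 8 bytes per 64-bit channel

    print(dual_channel_gb_per_s(3200))   # ~51.2 GB/s for DDR4-3200
    print(dual_channel_gb_per_s(8400))   # ~134.4 GB/s for DDR5-8400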
     
    Last edited: May 16, 2020
  35. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    ...
    I'm slowly getting sick of hardware upgrades.
     
    Martin_H, valarnur and AcidArrow like this.
  36. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    Same, but in my case it's less the hassle with the hardware, since I enjoy building new machines, and more the hassle with the operating system. Windows simply cannot handle a motherboard upgrade without performance penalties, and you have to upgrade the motherboard nearly every time you want a new CPU.

    I'm very close to just purchasing new machines from iBUYPOWER, since the hardware you can select from is standard off-the-shelf components (ignoring the custom parts they create that you can also choose), and the normal cost is only a little higher than building the PC yourself, while sales can sometimes push the machines lower than if you did it yourself.

    https://www.ibuypower.com/
     
  37. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    They said it's wavelet compression, so expect JPEG-like compression ratios; that's also probably why they do software rasterization.
     
    tatoforever likes this.
  38. perholmes

    perholmes

    Joined:
    Dec 29, 2017
    Posts:
    296
    Honestly, we're a small shop two months into actual work on a Unity project, and we're very likely going to bail for UE. There are some things in UE that either are or will be so much better that it's worth starting over:

    * Render pipeline. We're deeply depressed that there's no way to do URP on mobile and HDRP on desktop. You have to do URP all the way or give up on mobile. So we can never shine on desktop. We are a cinematic-centric app, and it's a big loss for us. We need the way you can scale a single rendering engine up and down in UE. We'll never need a scriptable pipeline, we just need a single one that scales.

    * Dynamic LOD/Baked GI stuff. This is where a *lot* of our labor would have to go, and it disappears in UE in the future. With Unity, we'd have to fake dynamic lighting by using many bake sets.

    * Quixel.

    * The limited marketplace. Unreal has many things built-in that are marketplace modules in Unity. I wouldn't have cared, except that it's really hard to get these purchased modules to play together.

    * Licensing floor moved to $1M doesn't feel as rough anymore, and we can bring people in and out at will without thinking in "seats".

    I had sworn I wouldn't make a major project in C++ again, but I'm going through some UE C++ tutorials and installing the engine again. I sense an 80% probability that we'll be canceling the Unity subscription very soon and rebuilding our object model in UE. It's possible that a two month loss is still ultimately a time-saver.
     
    Ony, tyrot and AnvilNight like this.
  39. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    That shop seems to be US-only, also... Building computers lost its appeal to me many years ago - it is just going through the motions now. Another annoying problem is that computers eventually reproduce by division during upgrades.

    Speaking of upgrades, I usually use AMD CPUs, and those do not require motherboard upgrades that often. Also, Windows handled my last CPU switches just fine, although I definitely recall that in Windows XP or Windows 7 times you had to do some tweaks before a motherboard swap, otherwise you'd get a BSOD or something similar.
     
  40. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    @perholmes : I don't believe you can cancel it. At least back when I had Pro during the Unity 4.x cycle you couldn't; you essentially signed an agreement to pay for that entire cycle. So I wouldn't waste your money - keep using it if you can and see where Unity goes from here. They might surprise us (unlikely, but who knows). It's one thing to have their community complain about these things; it's entirely different when the competition lands a striking blow. If the wallet doesn't hurt them badly enough, nothing will change the way Unity is going, and yeah, we would all need to jump ship at that point - even though the lack of listening to and understanding of the community should be enough on its own. But if money doesn't drive change, nothing will.

    EDIT: I'm not a lawyer, so this isn't legal advice, just my experience with canceling Pro during the 4.x cycle.
     
  41. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
    Either way, having done a motherboard + CPU + RAM upgrade recently, at the moment I can only feel irritation upon hearing that there's a new type of RAM on the horizon again.
     
  42. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    There is? Wow. I remember just a year or two ago I spent nearly 1K on DDR4 memory (granted, that's with a couple of overnight shipments) - so around 750 for the RAM and the rest on overnight shipping, because I'm impatient, lol. I think it was more in the 850-900 range after the overnight shipments.
     
  43. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    DDR4 will still be used on Zen 3, so at least one more generation until it's deprecated.

    The 3080 Ti will have so much GDDR6 VRAM that you'll never need to load anything from normal RAM :p
     
  44. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,929
    Brian Karis hints at virtual geometry images and SVOs:
    https://twitter.com/BrianKaris/status/1260590413003362305

    SVOs (sparse voxel octrees) are well known, but the vanilla version is pretty memory-hungry. Geometry images are somewhat obscure though.

    The idea behind geometry images basically revolves around cutting a mesh open along specially selected seams and reparameterizing it to a flat square domain. It is similar to a seam-based UV unwrap, except that you encode the mesh itself (not UV coords) as quantized x,y,z positions in the r,g,b channels of the texture, using a single "island" so no texture space is wasted.

    After this, you have your mesh expressed as an image, so you can apply virtualization and image compression algorithms to it. In theory this should compress quite large meshes efficiently; with wavelet-based compression like JPEG you're cutting corners in the frequency domain.

    Additionally, with this scheme you need no UV coords (since the parameterization is implicit) and no triangle index buffer, so even more savings. Using traditional mipmapping on the geometry image could yield automatic mesh LODs.

    Really hope they clarify if they have solved the size problem. I don't really care how right now, any sufficiently advanced technology is indistinguishable from magic anyway :D
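    In the meantime, to make the geometry-image idea a bit more concrete, here is a toy sketch: positions stored in the channels of a square image, triangles implied by texel adjacency, and a texture-style mip step standing in for the "automatic LOD" part. A real geometry image needs the seam-cutting parameterization from Gu, Gortler and Hoppe's 2002 paper; the random grid below is just a stand-in, and none of this is confirmed to be how Nanite works.

    Code (Python):
    # Toy geometry-image decode: texel (x, y) -> vertex, neighbouring texels -> triangles.
    import numpy as np

    def geometry_image_to_mesh(gim):
        """Turn an (n, n, 3) geometry image into vertices + triangle indices."""
        n = gim.shape[0]
        verts = gim.reshape(-1, 3)
        tris = []
        for y in range(n - 1):
            for x in range(n - 1):
                i = y * n + x
                tris.append((i, i + 1, i + n))          # two triangles per texel quad
                tris.append((i + 1, i + n + 1, i + n))
        return verts, np.array(tris)

    def mip_level(gim):
        """Average 2x2 texel blocks -> next LOD, exactly like a texture mip."""
        return (gim[0::2, 0::2] + gim[1::2, 0::2] +
                gim[0::2, 1::2] + gim[1::2, 1::2]) / 4.0

    gim = np.random.rand(256, 256, 3)                # stand-in for a parameterized mesh
    v0, t0 = geometry_image_to_mesh(gim)             # ~130k triangles
    v1, t1 = geometry_image_to_mesh(mip_level(gim))  # ~32k triangles, "free" LOD
    print(len(t0), len(t1))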
     
    Last edited: May 16, 2020
  45. shredingskin

    shredingskin

    Joined:
    Nov 7, 2012
    Posts:
    242
    To me it was pretty clear that Sony struck a deal with Epic to sweet-talk the PS5.
     
  46. Vagabond_

    Vagabond_

    Joined:
    Aug 26, 2014
    Posts:
    1,148
    I don't think this means a lot, considering their UE4 tech demo also ran on a PlayStation.
    What I really think is that this is Epic Games, and they don't play cheap.

    EDIT: I would bet that they have something to show the public...

     
    Last edited: May 16, 2020
    ArmKe likes this.
  47. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,203
    It wouldn't surprise me if Epic Games reached out to both companies and went with the highest bidder.
     
  48. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Apparently the SVO is only for large-scale structure; they use distance fields for the medium scale (so a 3D texture). By cutting the depth they probably save memory.
     
    pcg likes this.
  49. arkano22

    arkano22

    Joined:
    Sep 20, 2012
    Posts:
    1,929
    I think you're referring to Lumen here? I also read somewhere about an SSGI / SDF / voxel combo used for high / mid / low frequency GI. They'd trace a voxel-based scene representation for the low-frequency GI, similar to how VXGI works. Just speculation here though, not sure.

    Tbh, if anyone from Unreal is reading this thread they must be laughing their asses off at how everyone is trying to guess how the heck they've done it. It reminds me of the speculation around Crytek's realtime GI back in the day (light propagation volumes).
     
    Last edited: May 16, 2020
    SunnySunshine likes this.
  50. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    And it will use VRAM, which will be super fast.
     