
So, Nanite

Discussion in 'General Discussion' started by Win3xploder, May 13, 2020.

Thread Status:
Not open for further replies.
  1. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    The point of GI tech is that it should ALWAYS look fine, though. I also have a lot of doubts about "realtime".

    So how it looks is its selling point, and the important one.

    There's also the question: must GI be realtime at all, if the world is static or changes infrequently?

    If an existing tech can produce comparable results to the new tech, then the new tech is kind of pointless, unless it has some amazing advantage somewhere.

    The problem with GI in general is that in many cases the results are too subtle. For example, RTX GI attempts were shown in videos before, and frankly they looked dated and unimpressive. The newest "Portal RTX" also looked like crap. Where's the wow effect?

    The point is that this sort of tech should have no edge cases to look out for.
    Otherwise it is just swapping one set of problems for another.
     
  2. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    This has to be trolling. Did you even watch the videos to say it looks subtle? Saying there's no value to realtime unless it looks better than offline baked lightmaps, regardless of everything else that makes them different, is like arguing that real-time 3D is worthless when pre-rendered backgrounds can look much better. Or that cascaded shadowmaps are worthless when we can just place pre-made shadow textures under trees.

    And running at 60fps on current gen consoles is not real-time enough for you?

    This is the one game where using lightmaps or probes is simply impossible: dynamic time-of-day on a large world where every building can be destroyed.

    Of course you can get vastly better performance and quality if your game is made of tiny rooms with loading screens between them and you can UV unwrap every single object to give them a good lightmap resolution without blowing your memory budget or waiting weeks for lightmaps to bake.
     
    Last edited: Dec 12, 2022
  3. Deleted User

    Deleted User

    Guest

    Of course, it doesn't need to be dynamic :)

    We've had light baking in Unreal for many years. Late UE4 also introduced GPU baking, which is much faster (on DXR-capable cards) and more correct; CPU Lightmass is more of a workaround, a separate application that translates the Unreal lighting model, so it's not quite the same thing. Still, the old baking is "good enough", and many projects still use it.

    Lumen is yet another option, for when what a given project needs is a dynamic, big world.

    Or maybe the team simply doesn't want to bake anymore. Lightmaps can be a workflow killer: a lot to upload/download to the repository every day, and you have to wait for another bake after changing anything in the scene. Rapid game patching might be impossible, which slows down testing.
    Game/level designers often hate light baking because it limits what the game can do in gameplay. Plus you get much more data to stream, making streaming a much bigger hassle. And if the game is big (even with a static environment), the final install size grows quickly because of lightmaps.
    Making much more detailed assets (which is possible thanks to Nanite, VSM, TSR, and other techniques) also makes baking more resource-consuming. Real-time GI is one piece of a more complex puzzle.

    Realtime GI scales well, which cannot be said of any baking. That's why it's so needed, even if we sometimes lose something minor aesthetically. Some of my artist friends miss the "soft look" of CPU Lightmass when switching to dynamic lighting, but the whole team loves the much faster workflow ;)

    So it doesn't need to be dynamic, but I hope to never again work on a game with baked lighting ;)
     
    Last edited by a moderator: Dec 12, 2022
    stonstad and Gooren like this.
  4. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    This. The issue with lightmaps is that they don't scale up. Lightmaps use resources based on total surface area, so the larger and more complex your environments are, the harder it becomes to use them. Most AAA games simply abandoned lightmaps last generation and moved to baked light probes (which have their own unique issues) due to the sheer size of their worlds.

    And if your game has an even slightly dynamic world (players can build/change/destroy stuff, or lighting changes drastically), you have to forgo GI entirely if all you have are baked solutions. That is the case with Fortnite: without Lumen, all the game can do is rely on plain SH lighting (which looks the same everywhere) with SSAO+SDFAO to ground objects together.

    Lightmaps aren't perfect either. Leaks are a thing in lightmaps too since texels rarely line up perfectly with the surfaces they are applied to. Shadows will only look as good as the lightmap resolution allows and if your scene is too large or your memory budget too low they will become blurry and inaccurate.

    But just like any real-time technique (shadows, reflections, even direct lighting), real-time GI has a cost and won't scale down. That doesn't make it worthless, just as none of the others are worthless.
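The surface-area scaling argument above can be sketched with back-of-envelope arithmetic. This is an illustration only, not engine code; the texel density and bytes-per-texel figures below are made-up but plausible assumptions:

```python
# Illustrative only: lightmap memory grows with total surface area,
# which is why large worlds outgrow baked lightmaps.
def lightmap_bytes(surface_area_m2, texels_per_meter=16, bytes_per_texel=8):
    """Rough lightmap memory estimate; both defaults are assumptions."""
    texels = surface_area_m2 * texels_per_meter ** 2
    return texels * bytes_per_texel

small_room = lightmap_bytes(200)         # ~200 m^2 of lit surfaces
open_world = lightmap_bytes(2_000_000)   # ~2 km^2 of lit surfaces
print(small_room / 2**20)  # ~0.39 MiB: trivial
print(open_world / 2**30)  # ~3.8 GiB: hopeless at the same texel density
```

The same arithmetic is why the only baked options left for huge worlds are much coarser structures like probe grids, with their own trade-offs.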
     
  5. Ng0ns

    Ng0ns

    Joined:
    Jun 21, 2016
    Posts:
    197
    Not to derail, but how are the updates to PSO compilation in 5.1? I know they've been working on a better solution to reduce the stutters.
     
  6. Deleted User

    Deleted User

    Guest

    It's already derailed to discuss Lumen, so... ;)

    Async shader compilation has been added; it makes a big difference.
    The DF analysis I linked previously ends with a performance analysis. There are still some shader compilation stutters. It's still being worked on, Alex repeats what he learned directly from Epic. Best to check the video from this link with the timestamp :)

     
    Ng0ns likes this.
  7. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    589
    I'll take any small, barely noticeable light leak in a fast-paced game, one you probably won't notice unless you look for it, over hours of baking :)
     
    stonstad, pm007 and Deleted User like this.
  8. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    well, in the video they are easily noticeable. But many devs these days would prefer some glitches and bugs over hours of work to release something good; hence Battlefield 2042, Cyberpunk 2077, etc...
     
  9. UhOhItsMoving

    UhOhItsMoving

    Joined:
    May 25, 2022
    Posts:
    104
    Any mention Epic makes of Lumen heavily emphasizes its real-time capabilities and what those allow for. This isn't to say that "how it looks" isn't important, just that the "real-time" part is the very reason for Lumen, which is what they said when they first introduced it:
    I don't know how you could have any doubts about that, lol.
    In addition to the reasons others have said, physically (not realistically, but physically), everything in the world, including characters, affects lighting, which is made up of both direct and indirect light. Just like a character can affect direct light, a character can also affect indirect light, as they are both equally light: "there is no distinction to be made between illumination emitted from a light source and illumination reflected from a surface."

    A simple example of this is a character blocking a doorway to a room. The more the character (which can be anything, by the way, not just a humanoid) blocks the doorway, the less light that enters in and bounces around the room, and thus, the darker the room gets. Even though the world itself isn't actually changing, the lighting still is. From a gameplay perspective, if a room suddenly darkens, it's obvious that someone or something has blocked or entered the doorway.

    One could obviously fake that (for example, simply putting a trigger at the doorway to darken the room), but that would still be, in effect, simulating real-time GI in a static scene, as that is, by definition, what a simulation is: "the imitation of the operation of a real-world process or system over time." So, by faking it, you would be manually simulating an effect of real-time GI that would otherwise happen naturally (and would also be explicitly expressing an interest in simulating that effect ;)).
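The doorway example above can be reduced to a toy one-bounce model. Everything here (the linear occlusion, the constants) is a made-up illustration of the idea, not how Lumen or any engine computes lighting:

```python
# Toy model: treat the doorway as the room's only light inlet, so indirect
# illumination inside scales with the doorway's unblocked fraction.
def room_brightness(doorway_area, blocked_area, outside_lux=1000.0, albedo=0.5):
    open_frac = max(0.0, (doorway_area - blocked_area) / doorway_area)
    return outside_lux * open_frac * albedo  # single-bounce approximation

print(room_brightness(2.0, 0.0))  # 500.0 -- doorway clear
print(room_brightness(2.0, 1.0))  # 250.0 -- character half-blocks it
print(room_brightness(2.0, 2.0))  # 0.0   -- doorway fully blocked
```

Even though no geometry inside the room changed, its lighting did, which is exactly the kind of response a baked solution cannot express.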
     
    OBiwer likes this.
  10. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Faking indirect lighting is almost impossible. Also, PLM's indirect lighting doesn't work well in those scenarios; you need to add area lights to windows and doorways for rooms to light up correctly. I've seen that Lumen does this much more nicely out of the box.

    When using doors it looked a bit strange because of the delay in indirect lighting. Maybe this has been fixed?
     
  11. OBiwer

    OBiwer

    Joined:
    Aug 2, 2022
    Posts:
    61
    I'd gladly take a (nearly-)realtime GI variant that gets 80% of the way to the baked high-quality result, at least in the editor, for a faster workflow. Being able to iterate faster is a huge cost saver.
     
  12. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Unity's problem isn't that it's baked per se; it's that the workflow is a pain. It's not made for actual games, it's made for prototypes.
     
  13. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    Why are you bringing up the rendering equation here and not official Lumen docs?

    The rendering equation describes the ideal case. The ideal case might not exist in reality; I don't recall Epic demonstrating characters affecting the GI environment.

    Trying to implement this sort of thing smells like feature creep.

    It is possible to aim lower, with tech that allows a quick rebake of the environment in non-realtime, but still allows a sufficiently quick update of the world's illumination.

    And that would be what OBiwer mentioned.
     
  14. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,083
    It really isn't. We've been doing it for ages, both in realtime and offline production.
     
  15. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    It never looks close to physically correct.
     
  16. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    Are we talking about light bounces?
     
  17. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Indirect lighting, yes.
     
  18. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    What do you mean by "faking"?
    Light cookies or something like that?
     
  19. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Using area lights etc.
     
  20. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    Well, Unity has area lights, but those are baked, so are you talking about lightmaps?
    I've seen very good lightmaps created with Unity. I don't understand your point, sorry, but I'm very curious.
     
  21. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    PLM isn't very physically correct. For example, in real life, if even a small gap in your blackout curtain lets the smallest stream of light into the room, indirect light will light up the entire room. Not so much in PLM. You need to use tonemapping to get it more physically correct, but that breaks if you have both outdoor and indoor parts in your scene; then you need triggers to change your LUT when moving from indoor to outdoor, etc. It gets messy fast.
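The indoor/outdoor mismatch is easy to put rough numbers on. The luminance figures below are ballpark assumptions, just to show the scale of the gap that a single fixed LUT has to cover:

```python
import math

# Stops (EV difference) between two luminance levels: each stop doubles light.
def stops_between(lum_a, lum_b):
    return math.log2(lum_a / lum_b)

sunny_exterior = 10_000.0  # cd/m^2, rough daylight ballpark (assumption)
dim_interior = 10.0        # cd/m^2, rough indoor ballpark (assumption)
print(stops_between(sunny_exterior, dim_interior))  # ~10 stops apart
```

A tonemap tuned for one end of a roughly 10-stop gap crushes the other end, hence the trigger volumes and LUT swaps.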
     
    Deleted User likes this.
  22. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    Sorry for this, but can you show me a real example of this, please? I'm looking for real photos on Google, and the photos I find don't show a scenario like the one you describe.
    Also, from my own experience I know that exposure values can be complicated sometimes, but if you use real camera values you can get great results (at least for games).
     
  23. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    ...anyway, I looked for images similar to what you described and did not find any image with an entire room lit up.
    I tried what you said: I found two references on Google and made a simple room with a window and a simple "curtain". I'm also using default values for the PLM (which is very bad quality xD, but it works for my test).
    8 min bounces - 16 max bounces + fixed exposure values.
    It took me like 5-7 minutes (creating the objects, opening Unity, importing them, creating the materials, petting my cat, and baking the lightmaps).
    For a 5-minute test, I'd say it's pretty close, at least to those references. Did I misunderstand you?

    References:
    ij.jpg
    dark-bedroom.jpg

    My test:
    aaaab.jpg
    saasaasa.jpg
     
  24. UhOhItsMoving

    UhOhItsMoving

    Joined:
    May 25, 2022
    Posts:
    104
    Because Lumen uses the rendering equation (that's what global illumination is, lol). By the way, that answer wasn't about Lumen specifically, just real-time GI in general since that's what your question was about.
    Software Ray Tracing: "Only Static Meshes, Instanced Static Meshes, Hierarchical Instanced Static Meshes, and Landscape terrain are represented in the Lumen Scene." However, "screen traces enable skinned meshes to receive and contribute to indirect lighting," but obviously, "are limited by what's visible on screen."

    Hardware Ray Tracing: "Hardware Ray Tracing supports a larger range of geometry types than Software Ray Tracing, in particular it supports tracing against skinned meshes."

    So, with both methods, skinned meshes can contribute to GI, just one is more limited than the other. The "Lumen in the Land of Nanite" and "Valley of the Ancient" demos used software ray tracing, and the "Matrix Awakens" demo used hardware ray tracing, so you can watch (or try) either of those to see their effects.

    Also understand that "character" does not inherently mean "skinned mesh." For example, "The Ancient" from the "Valley of the Ancient" demo wasn't a skinned mesh, but rather, a bunch of Nanite meshes attached to a skeleton. Since Nanite meshes can contribute to the Lumen Scene, a character that uses them can, as well (here's a mech somebody made using Nanite meshes).
    Here's a better example using a tiny hole (pinhole) in a dark room. This effect is called camera obscura.
    As you said, it's the exposure. For example, this is the first image with the exposure adjusted:
    upload_2022-12-17_0-33-18.png
     
    Deleted User likes this.
  25. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,631
    I find it weird that there's such a focus on this for Unreal when Unity has the same problem, and on some platforms there's almost nothing you can do.

    https://docs.unity3d.com/ScriptReference/Shader.WarmupAllShaders.html doesn't work on DX12, Metal and Vulkan

    and

    https://docs.unity3d.com/ScriptReference/Experimental.Rendering.ShaderWarmup.html is experimental (and has been for a few years already, so I don't see it losing its experimental status any time soon) and also doesn't really work
     
    Deleted User likes this.
  26. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Most cameras do not have enough f-stops to reproduce what your eyes can pick up.

    From what I have seen, Lumen replicates this much better out of the box than PLM. Edit: with the added benefit that it's real-time, so dynamic objects like doors can affect indirect lighting.
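A quick sketch of the f-stop point. The dynamic-range figures here are commonly cited ballpark assumptions, not measurements:

```python
# A sensor (or display pipeline) with N stops of dynamic range can hold a
# contrast ratio of 2**N between darkest and brightest before clipping.
def fits_in_range(scene_stops, device_stops):
    return scene_stops <= device_stops

scene = 16                  # bright window plus dark room, one frame (assumption)
camera_stops = 14           # decent modern camera sensor (assumption)
eye_effective_stops = 20    # human eye, with adaptation (assumption)
print(fits_in_range(scene, camera_stops))        # False -> something clips
print(fits_in_range(scene, eye_effective_stops)) # True
```

So a photo reference of such a scene will always show crushed shadows or blown highlights that your eyes, on location, would not see.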
     
  27. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
  28. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    Nothing, as long as what happens to the data is understood. Truncated data cannot be regained.

    Because of that, converting to higher-precision data types for our own calculations will have no impact on jitter or precision once the values are passed back to Unity, where they will be truncated to 32-bit.

    If we're doing a lot of calculations, then keeping our workings at higher precision may be worthwhile in some use cases, but the result used by Unity will not benefit from the additional precision.
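The truncation point can be demonstrated in a few lines. This is a generic float sketch using the Python standard library, not Unity API code:

```python
import struct

# Round-trip a 64-bit Python float through 32-bit storage, as happens when a
# double-precision result is handed back to a single-precision transform.
def to_float32(x):
    return struct.unpack('f', struct.pack('f', x))[0]

pos = 123456.789           # a large world-space coordinate, in 64-bit
stored = to_float32(pos)   # what survives 32-bit truncation
print(pos - stored)        # nonzero: the extra precision is gone for good
```

However careful the intermediate math, the error introduced at this last step is what the user-facing result carries.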

    :p

    It's just as likely that there's a bottleneck elsewhere that's bigger than the performance cost of those assets, and completely hides their own performance impact.
     
    Deleted User likes this.
  29. UhOhItsMoving

    UhOhItsMoving

    Joined:
    May 25, 2022
    Posts:
    104
    Just found a talk by Brian Karis (guy who made Nanite) that's pretty interesting & useful. It's not about Nanite itself, but rather, "the problem and the process of invention of what ultimately became Nanite." It starts at 4:00.


    Some notable quotes from the start that explain the problem Nanite is trying to solve (taken from the slides):
    A question Hamming became famous for asking was “What are the most important problems in your field?” Followed by “Why aren't you working on them?” I’m not sure how impactful and disruptive Nanite will ultimately be. Time will tell. What I am confident of though is the importance of the problem it is trying to solve. So tonight I thought it would be worthwhile to tell the story of my journey in trying to solve it and along the way tell you what I learned about both the problem and the process of invention of what ultimately became Nanite.

    Let’s start at the source of it all.

    Before creating a tool you must understand the people that will use it. Most of us don’t think of ourselves as tools programmers but that’s ultimately what the majority of computer graphics is about. Creating tools for artists to make art. So we need to understand artists.

    If your organization includes artists, talk to them. A lot. Become friends with artists. Try to understand their process, what they spend their time doing, what was different between what they intended and what was actually created. If you don’t have artists in your organization, watch videos of artists working. Watch instructional videos of how art software works. Look at breakdowns for how they made what they did.

    Many of you haven’t worked in game production so it is worth explaining some of the problems artists deal with. You might be surprised at how much of their time is sucked up in technical tasks that aren’t artistic.

    Laying out UVs, generating LODs, collision geometry, multiple versions of the same asset for different use cases, and then profiling and optimizing all of it to fit in budget. Optimization is familiar to everyone but in games they rule with an iron fist. I’m not sure how I can relay how dominant of a thought budgets are to artists any better than to tell you one of the most popular forums for game artists is called polycount.com. They named their website after the budget.

    And that isn’t the only budget they need to be concerned about. There are countless ones that dominate their every action. Polycount, draw calls, texture memory, mesh memory, light count, shadow casting light count, shader instruction count. These are just things that engineers can quantify and try and turn into a budget. There are a 1000 more things an artist could do to make things run slower and they need to know those too.

    [...]

    In a production environment, money and time are just as much of a limiting factor on quality as rendering the pixels with the latest greatest rendering technique. Anything that makes art more efficient, allows artistic vision to more directly be expressed, or enables more of the art team to contribute more widely because the process is less arcanely technical, will reap massive dividends.

    I argue there is no problem more important to work on in graphics right now than how to make high fidelity content cheaper to create.
     
  30. Deleted User

    Deleted User

    Guest

  31. martinvarga334

    martinvarga334

    Joined:
    Mar 31, 2021
    Posts:
    2
    Any news about Nanite in Unity? Nano tech?
     
  32. Why do you post the same in multiple threads? Here is your answer.
     
  33. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    I've finally got a bit of time to play with Lumen in Unreal 5.

    Results:
    Character is lit by emissive materials.
    upload_2022-12-29_14-37-9.png

    This is good.

    Character itself does not participate in global illumination. Which is not so good.

    If you enable the debug view, it is not even part of the Lumen scene. It does not exist.
    upload_2022-12-29_14-38-1.png

    In practice it means that the glowing logo on the chest does not contribute to the scene.
    upload_2022-12-29_14-38-39.png

    But if the glowing part is VISIBLE, then you get some effect, apparently via screen-space reflection:
    upload_2022-12-29_14-39-40.png

    The character appears in the scene when HARDWARE raytracing is enabled, however:
    upload_2022-12-29_15-6-29.png
    But surfaces on the character that are not visible in the scene do not contribute to lighting.

    I don't know maybe I've missed a checkbox for the characters somewhere.

    So, uh, people doing archviz are gonna love it, but it's not yet the "one true way to light up the scene" at the moment.

    Additionally from the docs:
    That means you won't be able to make a thunderstorm with it and expect GI to properly reflect lightning flashes.
     
  34. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    If it's truly important for glowy doodads on your character (and other moving objects) to contribute then what's stopping you from whacking a little light on them?

    For lightning in particular - large, fast, short lived - my question would be whether it can be ignored by the GI system? Nobody's going to see that the GI in a crevice during a lightning flash (which is faster than our eyes can adjust and should often be brighter than our screens go) isn't accurate, so I don't want to waste cycles on it.
     
    pcg and Deleted User like this.
  35. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    I wonder how Lumen handles fluorescent light flicker. It never looked good with precomputed real-time GI in Enlighten.
     
  36. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    It will produce a different effect, and the process of babysitting "little lights" is going to be time-consuming.

    The point of a GI system that supports characters is that you have unrestricted freedom to do anything you want. For example, you can have a "magma golem" type of character,
    which is a rock with magma/fire within its cracks. You can animate those, control their brightness and color, and IF the GI system supports them, this will be reflected in the environment.

    There's also the matter of stuff like having glowing parts on a character:

    Animating those with "little lights" is going to be a pain, and it would be better if GI could handle it instead.

    Then we have cyberpunk designs, where everything can be covered with completely irregular light strips.

    The interesting thing is that at some point there was a YouTube video of Unreal's "Elemental" demo converted to use one of the *GI technologies (SVOGI, I think). It could do exactly what I'm describing here. Apparently this specific video has since been taken down, for some bizarre reason.

    "Several seconds" is a LOT of time, and it's not a matter of crevices. In a situation where you have a dark room with an open window during a lightning strike, the room (in reality) is going to be lit up by indirect bounces. And speaking of crevices: if you look at the debug view, Lumen lighting is very fuzzy and operates on fairly large "splotches" of light, so there won't be any detail in crevices in the first place.

    The interesting thing is that common sense says this sort of situation should work with brute-force RTX GI lighting. However, this is not the case as far as I can tell.
     
  37. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    Yeah, I get that it'll be "wrong", I just don't care any more than my players will.

    When I play games with non-developers they don't notice most of what I do. Sometimes they still don't notice when I point stuff out and explain it.

    So even though it'll be "wrong", for a lightning strike I'd just put a big, GI-ignored light in my scene for half a second, and then get on with my life, because I don't think my players are going to know, notice or care that the flash isn't bounced properly.

    Even in feature films lighting isn't correct, often deliberately so on characters, so I wouldn't get hung up on arbitrary strip emission in their clothes influencing nearby surfaces accurately, either.

    Those things would be nice. But they're not necessary.
     
    Deleted User likes this.
  38. astracat111

    astracat111

    Joined:
    Sep 21, 2016
    Posts:
    725
    What about Nanite? Unreal is 30 GB. ¯\_(ツ)_/¯
     
  39. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    That's kinda not what it's about.

    Some of the earlier posters sort of positioned the system as the ultimate silver bullet that's gonna solve EVERYTHING.

    To which I responded that there are gonna be edge cases - another set of edge cases, to be specific.
    So here are the edge cases.

    I do hope that you aren't trying to "defend the honor of Lumen" here, as in those sort of discussions people get upset and take it personally and then it becomes one huge waste of time.

    By the way, in the test scenario in the pictures, it is possible to set the logo brightness to a factor of one million. In that case, seeing it will literally blind you, as it propagates through the builtin bloom like glowing particles. However, when you aren't seeing it, and that one-million-bright logo is pointed into the wall (looking at the character's back), it produces zero effect. Which looks incredibly odd in practice. I also believe that Nvidia VXGI can handle that scenario.
    What happened to all those SVOGI/VXGI/whatever GI discussions, by the way? They used to be a hot topic a couple of years ago; now everybody has forgotten about them.

    So "but my players won't notice" doesn't matter. This is a flaw. A flaw for which you'd need to develop a workaround. And rather than dismissing it as "unimportant", it would be best to keep it in mind and see if there's a way to fix it.

    Now, in this particular scenario, it is possible that emission actually works but I did not configure the material correctly. See, this material uses some nonsensical multi-layer system, and the logo is slapped on top of it procedurally. So if the logo does not provide "correct" emission, but some other nonsense like a particle glow, that could lead to an incorrect impression. In which case you're free to grab the 3rd person template and investigate it yourself.

    Overall, the impression is that, like someone else said earlier, Unreal sort of expects you to have a mostly static world. Which is probably another thing that needs to be fixed in the future.

    Lumen itself falls into the same category. It is not necessary. Using game engines is also not necessary. And so on.
     
  40. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    It is not 30 GB, it is about 95 GB, on my system at least. The reason is that at some point the editor is going to crash, and you'd need to know at least WHERE it crashed. That requires C++ debugging symbols, and they're somewhere between 30 and 50 gigabytes alone; without them you won't get engine stack traces. Now, the good thing is that the symbols compress fairly well using NTFS compression, but the engine is a behemoth, plus it really wants to be on an SSD. On top of that there's the matter of shader compilation: on a cold start, the test 3rd person template project took something like 10 or 15 minutes to open, because it was compiling seven thousand shader variants. And this stuff dumps gigabytes of junk somewhere on your SSD system drive (I forget where that damn shader cache folder is). Oh, and subsystems, like Linux support etc., are also a couple of gigabytes each.

    So, uh, yeah. In the case of Unity, you can easily keep 20 different versions installed on the same system and switch between them at will, as they're all relatively tiny and lightweight. Unreal is a space-devouring behemoth in comparison.

    //opinion.
    Now, the positive thing about Unreal is that they announced VS Code support (somewhere), and as much as I dislike that it is written in another interpreted language, at the moment VS Code is strongly preferable to Visual Studio, as Visual Studio has clearly lost its way and turned into another big blob of bloat.
     
  41. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    Sigh. No. Software doesn't have honor, and Epic doesn't need my help.

    Best for what?

    If you're specifically developing tech, groovy. If you're trying to ship a game which your audience will like, then I choose to consider their perspective.
     
    OBiwer likes this.
  42. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,554
    For technology advancement. Because tech that does not have this flaw is better than tech that does, and it is desirable to move towards the "perfect GI solution to rule them all", which we currently do not have.

    Regarding Lumen, you can download Unreal 5, play with it, and see whether I screwed up the settings somewhere and it's actually possible to make the glowing character affect the environment without plastering light sources onto it.
    Because at the time I honestly wasn't in the mood for trying to untangle the shader concoction they came up with for this specific character and just wanted a quick test.

    NVidia also offers RTXGI plugin which could be tested.
    https://developer.nvidia.com/rtx/ray-tracing/rtxgi
     
  43. Rastapastor

    Rastapastor

    Joined:
    Jan 12, 2013
    Posts:
    589
    Well, UE has Lumen: flawed, but still working for the most part. Lumen is one of the selling points of UE5, so we can expect a lot of work put into it throughout the UE5 lifecycle. Stuff can be fixed and improved.

    Unity has nothing, because I don't count the old version of Enlighten, where you still have to bake stuff that takes ages, and RTX is RTX :).

    End of argument.
     
    Deleted User likes this.
  44. BIGTIMEMASTER

    BIGTIMEMASTER

    Joined:
    Jun 1, 2017
    Posts:
    5,181
    @neginfinity
    if people acted on the mentality you're describing here, would you expect that a single video game would ever have been finished?

    nobody said Lumen is perfect. They're saying it's the best thing right now.

    you are the one creating outrageous counterarguments in your own mind. Why?
     
    GimmyDev, ontrigger, blueivy and 6 others like this.
  45. Deleted User

    Deleted User

    Guest

    Yeah, but downloading debugging symbols and additional platform support is optional. If a given engine version isn't used much, it actually is 30 GB for UE 5.0.

    Still, a lot of gigabytes could be saved if we'd have the option to opt-out of the Enterprise plugins, Virtual Production plugins, ML libraries... etc ;)

    upload_2022-12-30_17-3-2.png


    Fortunately, UE 5.1 finally brings On-Demand Shader Compilation, so only basic shaders are compiled on the first engine run, and after that only what is actually about to be rendered. It's like jumping to light speed after so many years of compiling all loaded shaders upfront.
    It was also truly annoying if someone needed to load a big map just to check a bug in a specific place. That frustration is now gone; even huge maps load quickly :)

    Plus virtual assets (an optional mechanism that moves bulk texture and audio data out of the repository) save at least dozens of gigabytes on large projects. Texture and audio sources are easily 40-90% of the content size.

    That probably brings joy to many more developers than new rendering toys ;)

    https://portal.productboard.com/epi...admap/c/847-odsc-on-demand-shader-compilation
    https://portal.productboard.com/epicgames/1-unreal-engine-public-roadmap/c/860-virtual-assets-beta
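    For anyone curious what opting a project into virtual assets roughly looks like, it comes down to a few lines in the project's DefaultEngine.ini. The section and key names below are recalled from Epic's setup guide and may differ between engine versions, so treat this as an illustrative sketch rather than a copy-paste recipe:

    ```ini
    ; Turn on content virtualization for this project
    ; (section/key names approximate - check the docs for your engine version)
    [Core.ContentVirtualization]
    SystemName=Default

    ; Point the virtualization system at a backend graph describing where
    ; the bulk payloads (texture/audio source data) are actually stored
    [Core.VirtualizationModule]
    BackendGraph=ExampleBackendGraph
    ```

    The backend graph name here is a made-up example; in practice it would reference a graph that routes payloads to persistent storage such as source control or a shared DDC.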
     
    Rastapastor likes this.
  46. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    If everyone acted only on that, then no. In reality we want a mix of people who create new tech to push current limitations out further, and people who make and finish stuff with what's available.

    There's no way I'd let those "flaws" slow me down in getting a job done. But also I'm glad that in 5 years they probably won't be there any more.
     
  47. BIGTIMEMASTER

    BIGTIMEMASTER

    Joined:
    Jun 1, 2017
    Posts:
    5,181
    Well, I think it takes all sorts of different personalities to make a good team, but there is a line between "if it's not perfect I won't use it" and "how do we make the tools we have right now better," where better = solving the current problems teams making games are facing. Not problems like "how can we simulate a billion universes, and each atom, and colors that humans cannot comprehend?"
     
    angrypenguin and Ryiah like this.
  48. BIGTIMEMASTER

    BIGTIMEMASTER

    Joined:
    Jun 1, 2017
    Posts:
    5,181
    So, here is a practical example of how these tools are actually helpful, even for dinky little solo game devs like me.

    My current project, which began in Unity but moved to Unreal, is close to finished. It was developed in UE4, but I tested migrating it to UE 5.1, enabled Lumen and Nanite, and did a test build. Zero issues. Performance is improved and it looks a heck of a lot nicer.

    I literally had to do almost no work - Epic's graphics engineers just made some magic buttons that make my project look better and perform better.

    How could anybody suggest this isn't a win-win for any developer?

    Now of course there are plenty of game projects that don't belong in Unreal, and plenty of teams that don't either. That's fine - I mean, duh. Different tools for different purposes. I care about Unity and Epic as much as they care about me - zilch.

    So I don't have any issue with Unity and I don't have any special love for Unreal - my only beef is with people who continually spread the sort of misinformation that gets beginners into wasting time and money. I would have been finished with this project half a year ago and saved a good chunk of money if people weren't wrongly suggesting that they are equivalent tools, that anything possible in one is possible in another, that Unreal "only looks better because it has post-processing enabled", etc.

    In other words, people just making stuff up that they never actually tested.
     
  49. impheris

    impheris

    Joined:
    Dec 30, 2009
    Posts:
    1,623
    That is a good point, and in fact most of the games you play are smoke and mirrors; you do not have to create any new tool or whatever, just creative solutions that achieve the same goal.
     
    Deleted User and angrypenguin like this.
  50. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,615
    Oh, for sure. I actually meant that I appreciate that people outside of my team, on tech dev teams, are attacking this kind of stuff and sharing the solutions.

    In a tech dev team solving new things it's a competitive advantage. In a team focused on shipping a game it can easily become a distraction.
     
    BIGTIMEMASTER likes this.
Thread Status:
Not open for further replies.