
Unity and Nvidia RTX Raytracing and AI on GPU?

Discussion in 'General Discussion' started by Arowx, Aug 20, 2018.

  1. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348

    What do you mean by gameplay died a long time ago?


    It could, if realtime GI and light probes get a major improvement.
     
  2. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    3,569
    So basically it can't then. Until the future. Which can be anywhere from a femtosecond to an infinity away.
     
  3. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
    It's just a week away -> Unity 2018.3, a holy grail of engines.
     
  4. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    22,957
    What are you expecting from 2018.3? Sounds like the Unity staff should be nervous.
     
  5. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
    A miracle which will solve all my problems and feed my desires to make a game.

    Also, I want 2018.3 to be like the unreleased Unreal Engine 5.
     
    Last edited: Aug 31, 2018
  6. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    126
    Exactly.
    Don't be so pessimistic! There are great modern games out there!
    But it's sad that great gameplay comes mostly from 2D games lately.
    I think the problem is that AAA development is too expensive to allow experimentation. Those studios need to make a profit to stay alive, so they just replicate what they know will sell well.
     
  7. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    126
    Seriously, if you can't make a good-looking destructible scene with realtime GI right now, you're doing it wrong...
     
  8. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    3,569
    If you can't tell the difference between how that will appear and literal raytracing then you're blind.
     
  9. olix4242

    olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,023
    Thanks!
    But actually, there was no baking involved in my example scene. There is just some realtime GI provided by SEGI, and reflections provided by the Post Processing SSRR and realtime reflection probes.

    I wouldn't say that RTX won't help in the future, but I think we simply aren't there yet in terms of performance. It's one thing to put a single character in the center of the screen in a static, closed space (that can be done well with a simple static reflection probe); it's quite another to do full raytracing in an open world mixed with closed spaces, where you have to shoot rays 10 000 units around the world (and hold/update the whole world in graphics memory).
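
    A quick back-of-envelope sketch shows the budget problem. The 10 gigarays/s figure below is NVIDIA's advertised peak for the top Turing card, taken purely as an assumption; real scenes will land far below it.

    Code (CSharp):
    // Back-of-envelope ray budget: how many rays per pixel do we actually get?
    // The 10 gigarays/s number is NVIDIA's marketing figure for the top Turing
    // card, assumed here only for illustration.
    using System;

    class RayBudget
    {
        static void Main()
        {
            const double raysPerSecond = 10e9;     // advertised peak (assumption)
            const double targetFps = 60.0;
            const double pixels = 1920.0 * 1080.0; // 1080p

            double raysPerFrame = raysPerSecond / targetFps;  // ~167 million
            double raysPerPixel = raysPerFrame / pixels;      // ~80

            Console.WriteLine($"Rays per frame: {raysPerFrame:N0}");
            Console.WriteLine($"Rays per pixel: {raysPerPixel:F1}");
            // ~80 rays per pixel has to cover shadows, reflections, GI bounces and
            // denoising history, which is why hybrid approaches are expected.
        }
    }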
     
    konsic likes this.
  10. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
    I think we just might be. There is a round of new Intel CPUs coming next year. Combined with the new RTX GPUs, a lot can be achieved.
     
  11. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    12,142
    Last I heard, Intel's next series is yet another refresh using a slightly more refined version of the process they've been using for a few generations now (marketing calls it 14nm with a string of pluses added onto the end). AMD, on the other hand, is preparing for a die shrink (marketing calls it 7nm), and there is a good chance we'll see yet more cores and higher clock speeds.

    AMD's response to NVIDIA's raytracing initiative will be interesting too, because they're the ones designing the hardware behind the consoles with the heaviest graphics. Microsoft and Sony can of course order specialized hardware, but they might choose not to, and that might adversely affect NVIDIA.
     
    Last edited: Aug 31, 2018
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    I think there is a lot of misunderstanding. RTX is basically hardware tree traversal (of a BVH, which they also call the scene representation) plus fixed-function triangle intersection. Typical GPUs used to be bad at this because of their architecture (lots of random memory access). It's still just another triangle buffer.
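
    Roughly, the traversal the RT cores replace looks something like the sketch below in software; the node layout and names are made up for illustration, since the real DXR/RTX acceleration structure is an opaque, driver-managed format.

    Code (CSharp):
    // Simplified BVH traversal sketch. The pointer chasing / random memory access
    // in Traverse() is exactly what classic GPUs were bad at and what the RT cores
    // now do in hardware. All types here are illustrative, not a real RTX layout.
    using System;
    using System.Collections.Generic;

    struct Ray { public float Ox, Oy, Oz, Dx, Dy, Dz; }

    struct Aabb { public float MinX, MinY, MinZ, MaxX, MaxY, MaxZ; }

    class BvhNode
    {
        public Aabb Bounds;
        public BvhNode Left, Right;       // null for leaf nodes
        public List<int> TriangleIndices; // only filled for leaves
    }

    static class BvhTraversal
    {
        // Standard slab test: does the ray hit this node's bounding box?
        static bool IntersectAabb(Ray r, Aabb b)
        {
            float tMin = 0f, tMax = float.MaxValue;
            return Slab(r.Ox, r.Dx, b.MinX, b.MaxX, ref tMin, ref tMax)
                && Slab(r.Oy, r.Dy, b.MinY, b.MaxY, ref tMin, ref tMax)
                && Slab(r.Oz, r.Dz, b.MinZ, b.MaxZ, ref tMin, ref tMax);
        }

        static bool Slab(float o, float d, float lo, float hi, ref float tMin, ref float tMax)
        {
            float inv = 1f / d;
            float t0 = (lo - o) * inv, t1 = (hi - o) * inv;
            if (t0 > t1) { float tmp = t0; t0 = t1; t1 = tmp; }
            tMin = Math.Max(tMin, t0);
            tMax = Math.Min(tMax, t1);
            return tMin <= tMax;
        }

        // Walk the tree; whole subtrees the ray misses are skipped. Leaf triangles
        // would then go to the fixed-function triangle intersection unit.
        public static void Traverse(BvhNode node, Ray ray, List<int> hitCandidates)
        {
            if (node == null || !IntersectAabb(ray, node.Bounds)) return;

            if (node.Left == null && node.Right == null)
            {
                hitCandidates.AddRange(node.TriangleIndices);
                return;
            }
            Traverse(node.Left, ray, hitCandidates);
            Traverse(node.Right, ray, hitCandidates);
        }
    }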

    cons:
    1. it won't help with volumetrics, since it's a triangle intersection system
    2. it still has plenty of noise, especially for bounces; the denoiser shader will be the new anti-aliasing, which means issues with reflections on transparency. Simple transparency accumulation (no reflections) will be better though, so cubemaps will help.

    neutral:
    1. it won't replace cubemaps, SSR, etc. It will supplement their weaknesses; you will use it to add detail, not to generate all detail.
    2. you will simply accumulate particle rendering to simulate volumetrics AFTER a first trace on opaques.
    3. I expect realtime baking (i.e. rendering to a cache) to become a common optimization to handle scene changes.
    4. most path tracing will be screen space to get nice shadow cutouts.

    Pro:
    1. you will have a LOD version of the scene in the "raytracing buffer", and can use that to get out-of-view shadows/reflections.



    Basically all the GPU-based GI implementations struggle with scene representation and traversal. RTX solves JUST that point. I expect hybrid solutions to emerge, with a scene representation that accelerates light field queries like in SEGI, instead of a full scene representation inside a triangle tree.
     
    LennartJohansen and konsic like this.
  13. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    196
    DXR supports custom intersection shaders.
     
    neoshaman likes this.
  14. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    6,449
    What if ray tracing is not really for today's display hardware but a first 'hardware' step towards real-time light field displays?




    Light fields are a bit like a hologram as they have variable depth focal points and can be viewed from multiple angles.

    And Nvidia have been working on them for some time. Their potential in AR and VR would be amazing and would solve a lot of the problems associated with depth of field and focus. However, generating a light field requires ray-tracing calculations...
     
  15. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    There is nothing raytracing does differently than rasterization for light fields.

    A light field is just having more directional data per pixel, which allows you to compute directional effects.

    An LPPV (i.e. a 3D texture box of spherical harmonics) is a light field, and so is a cubemap array.
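
    As a concrete example, querying a probe for a direction is just evaluating its spherical harmonics; a minimal band-0/band-1 evaluation looks something like this (the coefficient values are placeholders, not data from any real probe):

    Code (CSharp):
    // Minimal "light field query": evaluate band-0/band-1 spherical harmonics for
    // a direction, the way an LPPV cell or light probe stores light per direction.
    // The coefficients are placeholder values, one channel only, for illustration.
    using System;

    static class ShProbe
    {
        static readonly float[] coeffs = { 1.0f, 0.3f, 0.5f, 0.2f }; // c00, c1-1, c10, c11

        // Standard real SH basis constants for bands 0 and 1.
        const float Y00 = 0.282095f;
        const float Y1  = 0.488603f;

        // Direction (dx, dy, dz) is assumed to be normalized.
        public static float Evaluate(float dx, float dy, float dz)
        {
            return coeffs[0] * Y00
                 + coeffs[1] * Y1 * dy
                 + coeffs[2] * Y1 * dz
                 + coeffs[3] * Y1 * dx;
        }

        static void Main()
        {
            Console.WriteLine(Evaluate(0f, 1f, 0f)); // query "up"
            Console.WriteLine(Evaluate(1f, 0f, 0f)); // query "+X"
        }
    }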
     
  16. unity_zOvEB4IgVxRirA

    unity_zOvEB4IgVxRirA

    Joined:
    Sep 20, 2017
    Posts:
    112
    i hope to see HDRP RTX version :]
     
    konsic likes this.
  17. TimoHellmund

    TimoHellmund

    Joined:
    Mar 6, 2011
    Posts:
    60
    I just find it sad to see some negative, pessimistic people around here. Unity does not support it yet and might be late to the party (again), but that does not mean it is not viable. Perhaps not today, not tomorrow, but if you join the OMFG RTX RAYTRACING hype early on, being one of the first indies to do so, you could use that marketing effect.
    One does not need to build their game entirely around RTX / raytracing, nobody says that, but adding the possibility as a toggle or as an updated version for high-end PC gamers is not impossible, as we could see in the Gamescom presentations. It will be interesting to see whether UE4 (end of the year acc. to Epic) or Unity adds RTX / GI raytracing support sooner so that we devs can use it. Vulkan seems to be receiving it very soon. :)
     
    Last edited: Sep 2, 2018
    SteveEsco likes this.
  18. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,030
    Well, to be completely honest, I cannot imagine Unity implementing RTX support before Epic. Unity is usually not the first to implement new graphics features, especially when those features are largely specific to high-end PCs.

    For example, Unity was slow to implement GPU Instancing support, and the automatic GPU Instancing in Unity 5.4 was not useful. The GPU Instancing support in Unity 5.5 was awesome thanks to the inclusion of the DrawMeshInstanced API method. By that point, I think every other major engine already had useful GPU Instancing support.
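
    For context, a minimal use of that API looks roughly like the sketch below; the mesh and material are assumed to be assigned in the Inspector, and the material needs GPU instancing enabled.

    Code (CSharp):
    // Minimal sketch of Graphics.DrawMeshInstanced (available since Unity 5.5).
    // Mesh and material are assumed to be assigned in the Inspector; the material
    // must have "Enable GPU Instancing" ticked. One call handles up to 1023 copies.
    using UnityEngine;

    public class InstancedGrid : MonoBehaviour
    {
        public Mesh mesh;
        public Material material;

        private readonly Matrix4x4[] matrices = new Matrix4x4[1023];

        void Start()
        {
            // Lay the instances out on a simple grid.
            for (int i = 0; i < matrices.Length; i++)
            {
                Vector3 pos = new Vector3(i % 33, 0f, i / 33);
                matrices[i] = Matrix4x4.TRS(pos, Quaternion.identity, Vector3.one);
            }
        }

        void Update()
        {
            // Single draw call per frame for all instances.
            Graphics.DrawMeshInstanced(mesh, 0, material, matrices, matrices.Length);
        }
    }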

    My guess for RTX support will be Epic before the end of 2018, and then Unity within about 5 years. I do not work for either company, so I could easily be wrong on both of my guesses.
     
  19. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
    Well, the progressive lightmapper behaves like a raytracer (it's based on Radeon Rays), so raytracing already exists in Unity in some way ;).
    Though I would like Unity to improve GI, SEGI and create more unified lighting.
     
  20. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    Unity has already had support for RTU from the PowerVR GPUs since 2016.
    RTX will be done in due time.
     
  21. TimoHellmund

    TimoHellmund

    Joined:
    Mar 6, 2011
    Posts:
    60
    I read that Epic mentioned UE4 will get RTX / GI raytracing support before the end of 2018, and the Vulkan API is also about to get it very soon. So my hope actually is that we can get RTX functionality via the Vulkan API somehow when working with Unity. Has UT commented on this yet? I have not seen anything.
    I do not want to have to switch to UE4 to take advantage of the hype, as I am horrible at C++.
     
    Last edited: Sep 2, 2018
    jashan likes this.
  22. SteveEsco

    SteveEsco

    Joined:
    Apr 26, 2014
    Posts:
    16
    Couldn't agree more. This is a real "game-changer." Pun intended. :) A ton of time saved in development and awesome results. I can't wait!
     
    jashan likes this.
  23. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    You still need them, because they relax pressure on compute even for raytracing. For example, using a cubemap means you can concentrate your rays on out-of-screen reflections and get a less noisy raytrace, which means a less expensive denoiser and a clearer image.

    Realtime lights didn't remove the need for lightmaps; raytracing will not solve your lightmapping problems, just change the shape of them. IMHO you will have even more specialized lightmaps aimed at helping the raytraced data.
     
  24. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
  25. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    2,973
    Not necessarily: They have a denoiser system built into the GPUs which is designed for improving noisy ray-traced material. Granted, you could probably use that denoiser system also for other AI-purposes if your game benefits from those more than raytracing / denoising. But it is special purpose hardware that probably would be idle if your game doesn't need the denoising.

    I'm not sure about the compatibility between using cubemaps, lightmaps and GI pre-computing and raytracing. Could be that you can use the hacky approaches to optimize raytracing - but could also be that this would just add complexity with comparatively little benefit.

    Sometimes, it's really best to let go of old stuff altogether and just look forward without turning back. I'm not saying this is the case with RTX, yet, and it also depends a lot on what your target audience / platforms are, and what your development budget is. But sometimes, the benefit of cutting backwards compatibility and simplifying the pipelines (also in terms of production) simply outweighs the cost of creating something for a limited audience.

    I'm not much of a fan of Apple anymore (I was for a while, and actually got a Mac so I could use Unity back in 2007 when it was still Mac only) ... but if you look at Apple's history, I think it's safe to say that their approach of cutting old stuff and not compromising just for the sake of letting everyone use their product eventually paid off big time.
     
  26. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    It's pretty simple: they are just light accumulation, and RTX is just a layer on top of it.

    Look at the video above: it shows great reflections and shadows, but notice how grainy those great reflections are. You have a set number of rays; spread them out and you get less detail. Using a cubemap means you can concentrate all your rays on what's important, i.e. generally inter-reflections and dynamic objects. The other parts are non-moving and static, so they are easily "cached" by a cubemap.
    What I actually foresee is progressively caching static elements into a cubemap at run time, then using the dynamic rays only for dynamic elements and for capturing concave geometry that is hard to capture in a cubemap. There are many possible combinations. The best code is the code that is not run, hence we still use lightmaps.
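
    A very rough sketch of that split, per reflection sample; every type and helper name below is hypothetical, just to show where the ray budget would go.

    Code (CSharp):
    // Hypothetical sketch of the hybrid idea: static surroundings come from the
    // cached cubemap, and the limited ray budget is only spent where the cache
    // can't help. All names here are made up for illustration.
    struct ReflectionSample
    {
        public bool HitsDynamicObject; // e.g. from an ID mask of moving objects
        public bool StronglyConcave;   // geometry a single cubemap capture gets wrong
    }

    static class HybridReflections
    {
        public static float Shade(ReflectionSample s)
        {
            // Cheap path: reuse light already accumulated in the cached cubemap.
            if (!s.HitsDynamicObject && !s.StronglyConcave)
                return SampleCachedCubemap(s);

            // Expensive path: trace a real ray; fewer noisy pixels means a cheaper denoise.
            return TraceReflectionRay(s);
        }

        // Placeholders standing in for a cubemap fetch and an RTX/DXR trace call.
        static float SampleCachedCubemap(ReflectionSample s) => 1.0f;
        static float TraceReflectionRay(ReflectionSample s) => 1.0f;
    }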

    The denoiser runs on the tensor cores, which are just FP16 units optimized for DNNs, which means you can run arbitrary code on them, most notably NN architectures. They also run the NN anti-aliasing. I expect even crazier stuff to come out that isn't anti-aliasing or denoising, like potentially rendering CG-level graphics from just an object index coded into a pixel mask in a low-resolution buffer, which is something NNs already do. I don't worry about those cores sitting idle at all.

    I'm sure RTX will be great for indies overall, but expect mixed results at first.

    One way to see RTX is as out-of-screen access to texel data, which means you can further augment existing shaders by working around some current limitations, say by accessing a raytraced fragment...
     
    Last edited: Sep 17, 2018 at 7:58 PM
    jashan likes this.
  27. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    1,914
    OCASM and LennartJohansen like this.
  28. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    348
  29. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,164
    I think real time GI without any baking will be the next standard.
    When RTX card prices drop, it will be available for any game and anyone :)

     
    OCASM likes this.
  30. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    2,508
    CRYENGINE 5.5, which was released yesterday, now includes a major advancement with SVO Ray-traced Shadows offering an alternative to using cached shadow maps in scenes.



    https://www.cryengine.com/news/cryengine-55-major-release#

    They also provide a migration guide for Unity users.
     
  31. zombiegorilla

    zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    7,231
    They are still in business? ;)
     
    ShilohGames likes this.
  32. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,367
    their engine is made of cry,
    it runs on manly tears,
    so they won't give up easily.


    quod per sortem sternit fortem, mecum omnes plangite! ("since Fate strikes down the strong man, everyone weep with me!")
     
    zombiegorilla likes this.
  33. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    2,508
    Perhaps their business model is taking over all the Unity users who fail miserably to create open-world-ish games :oops:
     
  34. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,164
    CryEngine is an absolute mess to use, and many areas are incomplete or buggy.
    But they are the only ones offering fast global illumination without baking.