
Unity and Nvidia RTX Raytracing and AI on GPU?

Discussion in 'General Discussion' started by Arowx, Aug 20, 2018.

  1. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420

What do you mean by "gameplay died a long time ago"?


It could, if realtime GI and light probes get a heavy improvement.
     
  2. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    4,589
    So basically it can't then. Until the future. Which can be anywhere from a femtosecond to an infinity away.
     
  3. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
    It's just a week away -> Unity 2018.3, a holy grail of engines.
     
  4. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    24,481
What are you expecting from 2018.3? Sounds like Unity staff should be nervous.
     
  5. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
A miracle which will solve all my problems and feed my desire to make a game.

Also, I want 2018.3 to be like the unreleased Unreal Engine 5.
     
    Last edited: Aug 31, 2018
  6. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    283
    Exactly.
    Don't be so pessimistic! There are great modern games out there!
But it's sad that great gameplay has come mostly from 2D games lately.
I think the problem is that AAA development is too expensive to allow experimentation. Studios need to make a profit to stay alive, so they just replicate what they know will sell.
     
  7. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    283
Seriously, if you can't make a good-looking destructible scene with realtime GI right now, you're doing it wrong...
     
  8. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    4,589
    If you can't tell the difference between how that will appear and literal raytracing then you're blind.
     
  9. olix4242

    olix4242

    Joined:
    Jul 21, 2013
    Posts:
    1,188
    Thanks!
But actually, there was no baking involved in my example scene. There is just some realtime GI provided by SEGI, with reflections from Post Processing SSRR and realtime reflection probes.

I wouldn't say that RTX won't help in the future, but I think we simply aren't there yet in terms of performance. It's one thing to put a single character in the center of the screen in a static closed space (that can be done well with a simple static reflection probe); it's another to do full raytracing in an open world mixed with closed spaces, where you have to shoot rays 10,000 units around the world (and hold/update the whole world in graphics memory).
     
    zenGarden and konsic like this.
  10. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
I think we just might be. There is a round of new Intel CPUs coming next year. Combined with the new RTX GPUs, a lot can be achieved.
     
  11. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    13,065
Last I heard, Intel's next series is yet another refresh using a slightly more refined version of the process they've been using for a few generations now (marketing calls it 14nm with a string of pluses added onto the end). On the other hand, AMD is preparing for a die shrink (marketing calls it 7nm), and there is a good chance we'll see yet more cores and higher clock speeds.

AMD's response to NVIDIA's raytracing initiative will be interesting too, because they're the ones designing the hardware behind the consoles with the heaviest graphics workloads. Microsoft and Sony can of course order specialized hardware, but they might choose not to, and that could adversely affect NVIDIA.
     
    Last edited: Aug 31, 2018
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
I think there is a lot of misunderstanding. It's basically hardware tree traversal (a BVH, which they also call the scene representation) with fixed-function triangle intersection. Typical GPUs used to be bad at this because of their architecture (random memory access). It's still just another triangle buffer.

Cons:
1. It won't help with volumetrics; it's a triangle intersection system.
2. There is still plenty of noise, especially for bounces. Denoiser shaders will be the new anti-aliasing, which means issues with reflections on transparency. Simple transparency accumulation (no reflection) will be better, though, so cubemaps will still help.

Neutral:
1. It won't replace cubemaps, SSR, etc. It will supplement their weaknesses; you will use it to add details, not to generate all details.
2. You will simply accumulate particle renders to simulate volumetrics AFTER a first trace on opaque geometry.
3. I expect realtime baking (i.e., rendering to a cache) to become a common optimization for handling scene changes.
4. Most path tracing will be screen space, to get nice shadow cutouts.

Pro:
1. You will have a LOD version of the scene in the "raytracing buffer", and use that for out-of-view shadows/reflections.



Basically, all the GPU-based GI implementations struggle with scene representation and traversal. This solves JUST that point. I expect hybrid solutions to emerge, with a scene representation used to accelerate light field queries as in SEGI, instead of a full scene representation inside a triangle tree.
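The "fixed-function triangle intersection" described above can be sketched in a few lines. This is a minimal Möller–Trumbore ray/triangle test in plain Python (illustrative only; RT cores do not literally run this code, and the function name is mine):

```python
def ray_triangle(orig, d, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore ray/triangle intersection.

    Returns the hit distance t along ray direction d, or None on a miss.
    This is the primitive operation the RT hardware runs against every
    candidate triangle that the BVH traversal yields.
    """
    sub = lambda a, b: (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    cross = lambda a, b: (a[1]*b[2]-a[2]*b[1],
                          a[2]*b[0]-a[0]*b[2],
                          a[0]*b[1]-a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(d, e2)
    a = dot(e1, h)
    if abs(a) < eps:           # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(orig, v0)
    u = f * dot(s, h)          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(d, q)          # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)         # distance along the ray
    return t if t > eps else None
```

The "hardware tree traversal" part of the post only decides which triangles reach this test; the test itself is the same regardless of the acceleration structure.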
     
    LennartJohansen and konsic like this.
  13. OCASM

    OCASM

    Joined:
    Jan 12, 2011
    Posts:
    206
    DXR supports custom intersection shaders.
     
    neoshaman likes this.
  14. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    6,749
    What if ray tracing is not really for today's display hardware but a first 'hardware' step towards real-time light field displays?




    Light fields are a bit like a hologram as they have variable depth focal points and can be viewed from multiple angles.

And Nvidia has been working on them for some time. Their potential in AR and VR would be amazing, and would solve a lot of the problems associated with depth of field and focus. However, generating a light field requires ray-tracing calculations...
     
    AlanMattano likes this.
  15. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
There is nothing raytracing does differently from rasterization for light fields.

A light field is just having more directional data per pixel, which allows you to compute directional effects.

An LPPV (i.e., a 3D texture volume of spherical harmonics) is a light field, and so is a cubemap array.
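As a rough sketch of the "more directional data per pixel" idea: a single LPPV probe cell stores a few low-order spherical-harmonic coefficients, and shading queries them in an arbitrary direction. A minimal band-0/band-1 version in Python (Monte Carlo projection; illustrative only, the function names are mine):

```python
import math
import random

# Band-0/band-1 spherical harmonics: 4 coefficients give a crude but
# queryable directional signal, the kind of data one LPPV cell holds.
SH_C0 = 0.28209479177  # 1 / (2*sqrt(pi))
SH_C1 = 0.48860251190  # sqrt(3 / (4*pi))

def sh_basis(d):
    x, y, z = d
    return (SH_C0, SH_C1 * y, SH_C1 * z, SH_C1 * x)

def project(f, samples=20000, seed=1):
    """Monte Carlo projection of a directional function onto L1 SH."""
    rng = random.Random(seed)
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for _ in range(samples):
        z = rng.uniform(-1.0, 1.0)             # uniform direction on the sphere
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        fd = f(d)
        for i, b in enumerate(sh_basis(d)):
            coeffs[i] += fd * b
    # integral over the sphere = sample mean * surface area (4*pi)
    return [c * 4.0 * math.pi / samples for c in coeffs]

def evaluate(coeffs, d):
    """Reconstruct the stored signal in direction d."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(d)))
```

Projecting a clamped-cosine lobe around +z and evaluating toward +z versus -z reproduces the expected front/back asymmetry. That directional query is exactly what rasterized probes already provide, which is the post's point.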
     
    SunnyChow likes this.
  16. unity_zOvEB4IgVxRirA

    unity_zOvEB4IgVxRirA

    Joined:
    Sep 20, 2017
    Posts:
    213
I hope to see an HDRP RTX version :]
     
    konsic likes this.
  17. TimoHellmund

    TimoHellmund

    Joined:
    Mar 6, 2011
    Posts:
    62
I just find it sad to see some negative, pessimistic people around here. Unity does not support it yet and might be late to the party (again), but that does not mean it is not viable. Perhaps not today, not tomorrow, but if you join the OMFG RTX RAYTRACING hype early, being one of the first indies to do so, you could use that marketing effect.
One does not need to build their game entirely around RTX raytracing, nobody says that, but adding the possibility as a toggle, or as an updated version for high-end PC gamers, is not impossible, as we could see in the Gamescom presentations. It will be interesting to see whether UE4 (end of the year, according to Epic) or Unity adds RTX / GI raytracing support sooner so that we devs can use it. Vulkan seems to be receiving it very soon. :)
     
    Last edited: Sep 2, 2018
    SteveEsco likes this.
  18. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,252
Well, to be completely honest, I cannot imagine Unity implementing RTX support before Epic. Unity is usually not the first to implement new graphics features, especially when those features are largely specific to high-end PCs.

    For example, Unity was slow to implement GPU Instancing support. And the automatic GPU Instancing support in Unity 5.4 was not useful. The GPU Instancing support in Unity 5.5 was awesome thanks to the inclusion of the DrawMeshInstanced API method. By that point, I think every other major engine had useful GPU Instancing support before Unity.

    My guess for RTX support will be Epic before the end of 2018, and then Unity within about 5 years. I do not work for either company, so I could easily be wrong on both of my guesses.
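The win from DrawMeshInstanced-style instancing can be illustrated with a toy batcher (plain Python; `batch_instanced` and the tuple layout are hypothetical names, not a Unity API): identical mesh/material pairs collapse into one draw call carrying an array of per-instance transforms.

```python
from collections import defaultdict

def batch_instanced(submissions):
    """Group (mesh, material, transform) submissions into instanced draw
    calls, one per unique mesh/material pair, in the spirit of Unity's
    Graphics.DrawMeshInstanced.

    Returns a list of (mesh, material, transforms) draw calls.
    """
    groups = defaultdict(list)
    for mesh, material, transform in submissions:
        groups[(mesh, material)].append(transform)
    return [(mesh, mat, ts) for (mesh, mat), ts in groups.items()]
```

A scene with 1,000 identical rocks and one tree goes from 1,001 draw submissions to 2 instanced draw calls, which is where the performance win comes from.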
     
  19. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
Well, the progressive lightmapper behaves like a raytracer (it's based on Radeon Rays), so raytracing already exists in Unity in some way ;).
Though I would like Unity to improve GI and SEGI and to create more unified lighting.
     
  20. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
Unity has already had support for RTU from PowerVR GPUs since 2016.
RTX will be done in due time.
     
  21. TimoHellmund

    TimoHellmund

    Joined:
    Mar 6, 2011
    Posts:
    62
I read that Epic mentioned UE4 will get RTX / GI raytracing support before the end of 2018, and the Vulkan API is also about to get it very soon. So my hope is that we can get RTX functionality via the Vulkan API somehow when working with Unity. Has UT commented on this yet? I have not seen anything.
I do not want to switch to UE4 to take advantage of the hype, as I am horrible at C++.
     
    Last edited: Sep 2, 2018
    jashan likes this.
  22. SteveEsco

    SteveEsco

    Joined:
    Apr 26, 2014
    Posts:
    16
    Couldn't agree more. This is a real "game-changer." Pun intended. :) A ton of time saved in development and awesome results. I can't wait!
     
    jashan likes this.
  23. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
You still need them because they relax the pressure on compute even for raytracing. For example, using cubemaps means you can concentrate your rays on out-of-screen reflections and get a less noisy raytrace, which means a less expensive denoiser and a clearer image.

Realtime lights didn't remove the need for lightmaps, and raytracing will not solve your lightmapping problem, just change the shape of it. IMHO you will have even more specialized lightmaps aimed at helping the raytraced data.
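The budget argument can be made concrete with back-of-the-envelope numbers (the figures below are hypothetical, chosen only to show the shape of the tradeoff): with a fixed per-frame ray budget, caching static reflections in a cubemap concentrates every ray on the remaining dynamic pixels.

```python
def rays_per_pixel(ray_budget, static_pixels, dynamic_pixels, cubemap_cache):
    """Average reflection rays available per traced pixel.

    With a cubemap cache, static pixels are resolved from the cache and
    the whole ray budget goes to the dynamic pixels.
    """
    traced = dynamic_pixels if cubemap_cache else static_pixels + dynamic_pixels
    return ray_budget / traced

# Hypothetical frame: 1M rays, 90% of reflective pixels are static.
budget, static_px, dynamic_px = 1_000_000, 900_000, 100_000
naive = rays_per_pixel(budget, static_px, dynamic_px, cubemap_cache=False)
cached = rays_per_pixel(budget, static_px, dynamic_px, cubemap_cache=True)
```

Under these assumptions the cached path gives 10x the samples per dynamic pixel (10 vs 1), which is the "less noisy raytrace, cheaper denoiser" effect the post describes.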
     
  24. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
  25. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,040
    Not necessarily: They have a denoiser system built into the GPUs which is designed for improving noisy ray-traced material. Granted, you could probably use that denoiser system also for other AI-purposes if your game benefits from those more than raytracing / denoising. But it is special purpose hardware that probably would be idle if your game doesn't need the denoising.

    I'm not sure about the compatibility between using cubemaps, lightmaps and GI pre-computing and raytracing. Could be that you can use the hacky approaches to optimize raytracing - but could also be that this would just add complexity with comparatively little benefit.

    Sometimes, it's really best to let go of old stuff altogether and just look forward without turning back. I'm not saying this is the case with RTX, yet, and it also depends a lot on what your target audience / platforms are, and what your development budget is. But sometimes, the benefit of cutting backwards compatibility and simplifying the pipelines (also in terms of production) simply outweighs the cost of creating something for a limited audience.

    I'm not much of a fan of Apple anymore (I was for a while, and actually got a Mac so I could use Unity back in 2007 when it was still Mac only) ... but if you look at Apple's history, I think it's safe to say that their approach of cutting old stuff and not compromising just for the sake of letting everyone use their product eventually paid off big time.
     
  26. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
It's pretty simple: they are just light accumulation, and RTX is just a layer on top of it.

Look at the video above: it shows great reflections and shadows, but notice how grainy those great reflections are. You have a set number of rays; spread them out and you get less detail. Using a cubemap means you can concentrate all your rays on what's important, i.e. generally inter-reflections and dynamic objects. The other parts are non-moving and static; they are easily "cached" by a cubemap.
What I actually foresee is progressively caching static elements into a cubemap at runtime, then using the dynamic rays only for dynamic elements and for capturing concave elements that are hard to capture in a cubemap. Many combinations are possible. The best code is the code that is not run, hence we still use lightmaps.

The denoiser runs on the tensor cores, which are just FP16 units optimized for DNNs, which means you can run arbitrary code on them, most notably NN architectures. They also run the NN anti-aliasing. I expect even crazier things to come out that aren't anti-aliasing or denoising, like potentially rendering CG-level graphics from just an object index encoded in a pixel mask in a low-resolution buffer, which is something NNs already do. I don't worry about those cores sitting idle at all.

I'm sure RTX will be great for indies overall. But expect mixed results at first.

One way to see RTX is as out-of-scene access to texel data, which means you can further augment existing shaders by working around some current limitations, so let's say you access a raytraced fragment
     
    Last edited: Sep 17, 2018
    jashan likes this.
  27. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
    OCASM and LennartJohansen like this.
  28. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
     
    OCASM likes this.
  29. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,464
I think realtime GI without any baking will be the next standard.
When RTX card prices drop, it will be available for any game and anyone :)

     
    OCASM likes this.
  30. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    3,142
    CRYENGINE 5.5, which was released yesterday, now includes a major advancement with SVO Ray-traced Shadows offering an alternative to using cached shadow maps in scenes.



    https://www.cryengine.com/news/cryengine-55-major-release#

    They also provide a migration guide for Unity users.
     
    konsic likes this.
  31. zombiegorilla

    zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    7,458
    They are still in business? ;)
     
    ShilohGames likes this.
  32. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
Their engine is made of cry,
it runs on manly tears,
so they won't give up easily.


quod per sortem sternit fortem, mecum omnes plangite! ("since Fate strikes down the strong, weep with me, all of you!", from O Fortuna)
     
    Billy4184 and zombiegorilla like this.
  33. Peter77

    Peter77

    Joined:
    Jun 12, 2013
    Posts:
    3,142
Perhaps their business model is taking over all the Unity users who fail miserably to create open-world-ish games :oops:
     
    AlanMattano likes this.
  34. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,464
CryEngine is an absolute mess to use, and many areas are incomplete or buggy.
But they are the only ones offering fast global illumination without baking.
     
  35. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    420
  36. zenGarden

    zenGarden

    Joined:
    Mar 30, 2013
    Posts:
    4,464
  37. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    13,065
  38. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
Meh, it's about raytracing. I would like to know what's new in their solution and how they did it, rather than a migration-manual comparison :p
     
  39. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
The initial batch of reviewer benchmarks for RTX cards is sadly causing me to further moderate my expectations for user uptake of this tech. Even as a dev, I can no longer justify upgrading from a 1080ti to a 2080ti, at least not until the RTX and DLSS stuff is properly available. Which is a shame, because I like to live on the bleeding edge, but there are limits to how much bleeding wallet I can justify.
     
  40. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
I note that one of the Unite LA sessions, on October 25th, is currently labelled "Graphics: to be revealed". It's easy to speculate that ray tracing will get a mention in the keynote and that this session will go into more detail. Obviously I could be wrong.
     
    OCASM likes this.
  41. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    6,749
What if AMD pulls a raytracing-capable GPU out of the bag at a more reasonable price point?
     
  42. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
Well, that would help, but it's too early to say, given that we don't even really know yet what the raytracing on high-end Nvidia cards is like in practice in terms of performance. I'm not convinced there will be much room for useful raytracing performance in mid-range cards for some years yet, though I could be wrong, and it won't be hard for AMD to beat Nvidia on high-end prices since Nvidia has set the price bar so high this time around.

Anyway, at least we have the first piece of the jigsaw in place now: the Windows update with DirectX Raytracing support came out a few days ago.

Personally, it looks like I will be able to dabble from nearly the start of this attempt at a hybrid raytracing era, because although the performance of the 2080ti in game benchmarks, for example, isn't an amazing leap overall, it has quite a lot more compute performance, and the main thing I am working on needs all the compute power it can get. So I can just about justify getting this card, and will treat the raytracing stuff as an added bonus if it turns out to be good enough.
     
  43. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
  44. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
Whether that sort of historical stuff really makes me late or not depends on what I meant by "useful raytracing performance". It's still too early to really tell what will end up being practical for games using 2080tis and 2080s; we have some idea, but it will take time for the full picture to emerge, and to see what performance-versus-looks tradeoffs gamers will accept.

Certainly that's what the start of this era means to me: at a minimum we will get answers to some of these questions, and the chance for this stuff to go in various directions, succeed or fail, and get beyond the marketing. And even if the hype turns out to have raised expectations too much in some quarters, at least enough graphics programmers and game engine companies seem on board this time that we will actually get to see this stuff used in some meaningful ways in the months ahead.
     
  45. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
And in that sense I could compare it to VR: not a new concept, but one where the first generation of "modern VR" hardware and its associated platforms, APIs, and engine support was enough to at least let it stick a toe in the mainstream and get some questions answered about how far it had come, how practical it was in its current state, and how long the road ahead may be. Sure, other devices existed in the past when the tech was even less ripe, and this generation inevitably didn't live up to the hype or the things it encouraged people to imagine. So it plods on slowly, in a manner that leaves people with no choice but to either give up for now and wait for long-term progress, or cope with today's limitations and the less-than-lightspeed pace of change. But slowly, over time, things happen that add up to more practical capabilities, and then, if all interest didn't die in the intervening years, we eventually get to the promised land. The raytracing story may be quite similar, and I suppose it usually pays to anticipate that the first generation or so will be half-baked (pun intended).
     
  46. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
My point is exactly that we already know the costs and tradeoffs; the problem was that there was no market because it was hardware-locked. You simply get better shadows and reflections, even with the PowerVR hardware, but the cost is the number of rays used in the scene, which can produce grainy images, so you will still use old techniques like cubemaps, augmented for dynamic objects (where you spend the ray budget).
     
  47. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,033
Sure, but that's all to do with the broad nature of this "now we have enough rays for a hybrid approach, but not the whole thing" era, none of which I dispute. In fact, I got quite bored with all the past posts from people who missed the "hybrid" bit, or who wanted to imagine that the gap between that and full-scene raytracing is a small one.

What I was on about in regard to cost was very specific to the new Nvidia cards and exactly what people will be able to squeeze out of them, e.g. discovering any caveats in the ray budget we actually have compared to the Nvidia hype. Especially since, even when used in hybrid fashion, Nvidia still had to promote denoising on top of the raytracing to get practical results today. Aside from the developers who got early access to hardware for the initial demos and games, I don't think many of us can yet fully judge how good the Nvidia denoising is, so that's certainly one of the things that may affect practicality and that we cannot completely measure yet.
     
  48. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    3,785
But the most interesting part is that it's like the jump from fixed-function to programmable shaders. The RT part is basically incoherent access to fragments AND out-of-view sampling, which is something all advanced graphics techniques now suffer without (screen-space methods and various sampling schemes). THIS is bigger news than the raytracing part in itself. We have created a number of data structures that suffer from the coherent-access requirements of GPU architectures; having incoherent access will allow them to shine.

The denoising is just a neural-network filter. The main challenge is to have an NN run inference inside the rendering budget, but it also means the quality is potentially unbounded, given what's done now with NNs and the pace at which visual manipulation progresses in that subfield. NN upres is already the most efficient algorithm for that task; denoising is just that, upres from a low sample count, and that's what DLSS does too. NNs can already generate realistic high-res upscaled versions of images from low-res sketch versions in real time; the problem is making them fast enough for a game's rendering budget, not just real time. This depends on two major things, the training of the NN and the NN architecture, and that's a recent field that hasn't matured yet.

IMHO, NN rendering is the biggest news here: you could probably offload your entire shader budget to the NN by rendering a low-res buffer with just enough data for the NN to generate a full-res image. It's already been done in the general sense. It's only a matter of time before we specialize and optimize it for game rendering.

So for me, raw raytracing is the least interesting news, but it's the only one people can actually understand. It's marketing.
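The "reconstruct from a low sample count" framing can be shown with a deliberately crude classical stand-in, a 3x3 box filter (emphatically NOT how the tensor-core NN denoiser works): averaging neighbours trades a little blur for a large drop in per-pixel variance.

```python
import random

def box_denoise(img, w, h):
    """3x3 box filter over a flat row-major float image."""
    out = [0.0] * (w * h)
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < w and 0 <= ny < h:
                        acc += img[ny * w + nx]
                        n += 1
            out[y * w + x] = acc / n
    return out

def variance(img, truth):
    """Mean squared error against the known ground-truth value."""
    return sum((v - truth) ** 2 for v in img) / len(img)

# A 1-sample-per-pixel "render" of a flat 0.5 surface: pure noise around truth.
rng = random.Random(0)
w = h = 32
noisy = [0.5 + rng.uniform(-0.3, 0.3) for _ in range(w * h)]
filtered = box_denoise(noisy, w, h)
```

On this flat image the filter cuts variance by roughly the neighbourhood size. A trained NN does far better by learning where NOT to average (edges, texture), which is why it replaces hand-tuned filters here.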
     
    AlanMattano likes this.
  49. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,252
There are two separate issues for existing GTX 1080ti owners considering an upgrade. One issue is that the naming and price points have shifted around. The RTX 2080ti actually replaces the Titan Xp rather than the GTX 1080ti, and the Titan branding has moved to the Titan V at the $3k price point. The RTX 2080 takes over the price point previously held by the GTX 1080ti. Because of this model naming and price repositioning, existing GTX 1080ti owners have a difficult time justifying the upgrade, especially when comparing raw performance in existing games. That leaves many of them uninterested in staying at their existing price point with the RTX 2080, but also uninterested in moving up to the next price point to get the RTX 2080ti.

    The other issue is the new features are exciting but not widely supported yet. RTX is very exciting, and DLSS is very exciting. But since neither is currently widely available, it is understandable that many users will delay the upgrade.

    Personally, I ordered a couple RTX 2080ti cards the moment pre-orders were opened, and one of those will be replacing a GTX 1080ti. The computer components I get most excited about are graphics cards and SSD drives. With graphics cards, I get extremely interested in new features, because it unlocks amazing possibilities.

    Each time there is a big innovation jump (new features) in graphics cards, a lot of people complain that the innovation does not benefit existing games as much as raw performance would have. For example, some people complained when DX11 cards showed up because there were very few DX11 titles to take advantage of the DX11 features. Those people believed that new cards should have simply had even more raw performance in DX9 games instead of worrying about DX11. After all, those users had a library of DX9 games.

    I think of RTX in much the same way as the debate regarding DX9 and DX11 back then. It is an amazing new feature that will lead to exciting new advances in gaming.
     
    neoshaman likes this.
  50. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    2,252
    Side note: The RTX 2080ti cards I ordered on August 20 just shipped. I should have them tomorrow.
     
    neoshaman, crashTX and elbows like this.