RTX

Discussion in 'General Discussion' started by unitedone3D, Apr 21, 2020.

  1. unitedone3D

    unitedone3D

    Joined:
    Jul 29, 2017
    Posts:
    151
    Hi there! Just my 2 cents.
    For anyone making a raytraced 3D game for NVIDIA GeForce hardware (1060 through 3080 Ti): raytracing enables spectacular graphics by making real-time GI (Global Illumination), AO (Ambient Occlusion), and penumbra/contact shadows possible, which is what HDRP's implementation of raytracing provides. I struggled with whether to use it or not (I decided not to, but for another reason: the uncanny valley), because there are so few games that use RTX raytracing (notable ones are Minecraft, Quake 2, Metro Exodus and Control). The best looking ones use the most hardware-intensive raytracing method, path tracing (a Monte Carlo algorithm). It is a true, accurate reproduction of lighting and
    is very taxing on the GPU; it is the most advanced type of raytracing. Right now I think the only two games that do true path tracing are Quake 2 RTX (by NVIDIA) and Minecraft RTX. You can see raytracing in others, but I'm not sure they always calculate
    true GI with several bounces. That kind of lighting has to be (pre-)baked otherwise, and that is about to end with path-tracing-style raytracing. I think there is one Unity indie developer who is building an FPS game with raytracing...

    There are lots of positives to building your game with it. It is forward-looking and your graphics will be mind-blowing, but you need the gamer to have the hardware for it. At a certain point people will have to upgrade their GPU anyway, otherwise
    they are limited to playing older games on older engines, or the dev has to 'scale' the game down for the lowest PC/GPU. I think that can cause more problems (it is hard to support very old GPUs when your game requires a performant GPU to achieve RTX/HDRP graphics; otherwise you must use URP or a smaller, mobile-like render pipeline).
    Scalability is important, but if you are making a next-gen game for high-end PC or console (PS5, Xbox Series X...), it's hard to make it work on old hardware.
    Anyway, I'll finish by saying I think the uncanny valley is the problem with RTX and path tracing; it works in 'cartoony' games, not in realistic ones that try to capture reality. The uncanny valley is when things approach ultra-realism, like a photograph (it is usually discussed for human faces, where digital humans become so real that it creates revulsion).
    If we compare:

    Minecraft RTX (cartoony Lego blocks, no uncanniness despite ultra-real lighting), Quake 2 RTX (less cartoony, still exaggerated; a little uncanny valley with ultra-real lighting), and Control (not cartoony, a very realistic setting, and a substantial uncanny valley feeling; turning RTX on and off shows it). The largest element causing it is the GI: the raytraced 'photon noise' (spectral energy) in the shading and colors looks like a real photo. In other words, the shading looks richer and more photographic, and less like the CGI/gamey/plastic PBR shader look. This elicits the feeling of the uncanny valley (like blockbuster CGI movies having uncanny valley CG humans).
    Because it is so real you seem fooled, but you are not; you still know it's false despite its closeness to reality. It is also called the Reality TV effect / Soap Opera effect / 'Live TV' / 'the News' effect (a bit like the 24 fps film standard versus the 48 fps soap opera effect in films; video games can reach 240 fps but do not suffer from this because they strobe discrete frames instead of motion-blurred, smeared frames). One solution is to make the image darker to hide the uncanny valley effect of path-traced GI (reducing its 'obviousness', a bit like Jurassic Park did with its CGI dinosaurs: they were kept in the dark on purpose to hide the obvious computer VFX, and the effect is least visible in darkly lit scenes and most apparent in full daylight scenes where everything is fully lit and light bounces around from the sun). The other is to make a game with cartoonish proportions (a cartoon look extinguishes the uncanny valley effect, because a cartoon is not trying to be real). Thanks for reading. Just my 2 cents.
     
  2. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    RTX doesn't mean anything at all. It is purely up to the developer to choose what to do with this acceleration hardware.

    It doesn't do anything by itself, you see? It doesn't choose a look. It doesn't store colours or a visual style of its own accord. None of that. It's 100% a developer choice how a thing will look with it.

    So none of what you said makes any sense; you just observed games looking like that and then blamed RTX for it. That's 100% incorrect, and if RTX was removed and they replaced the technology with compute shaders (which you can do)... it would look the same - whatever the developers intended.

    Usually developers will use DXR to enhance the quality or accuracy of a particular technique that will work fine without it. Such as GI, or shadows, or most commonly AO. It can be used to get more accurate reflections by casting rays, queries etc.

    It's way lower level than how it looks.
     
  3. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    P_Jong likes this.
  4. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,189
    This. RTX is the branding for NVIDIA's raytracing implementation. DXR is the branding for Microsoft's DirectX Raytracing API that developers use to implement raytracing. Everyone labeling it as RTX is either doing so for marketing purposes or out of ignorance.

    Minecraft RTX, for example, is called that because NVIDIA is the one implementing the raytracing in the engine but they are using DXR just like everyone else and when AMD finally releases their implementation it will work with Minecraft too.

    NVIDIA's RTX is an implementation of BVH (Bounding Volume Hierarchy) traversal in hardware. It bounces rays around the scene, checking whether they intersect the volumes surrounding the objects; if a ray hits a volume, it then checks whether the ray intersects one of the triangles making up the object within that volume.

    After that they feed the information to the traditional shader hardware which has the actual task of rendering the scene.

    https://en.wikipedia.org/wiki/Bounding_volume_hierarchy
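    The two-phase test described above (cheap bounding-box check first, exact triangle check only on a hit) can be sketched roughly like this. This is a toy software illustration of the idea, not NVIDIA's actual hardware algorithm; it uses a standard slab test for the box and the well-known Möller–Trumbore method for the triangle:

    ```python
    # Toy sketch: test a ray against an axis-aligned bounding box (AABB),
    # and only if it hits the box, test the triangle inside it.

    def ray_hits_aabb(origin, direction, box_min, box_max):
        """Slab test: returns True if the ray passes through the box."""
        t_near, t_far = 0.0, float("inf")
        for axis in range(3):
            if abs(direction[axis]) < 1e-12:
                # Ray parallel to this slab: miss if the origin lies outside it.
                if origin[axis] < box_min[axis] or origin[axis] > box_max[axis]:
                    return False
            else:
                t0 = (box_min[axis] - origin[axis]) / direction[axis]
                t1 = (box_max[axis] - origin[axis]) / direction[axis]
                if t0 > t1:
                    t0, t1 = t1, t0
                t_near, t_far = max(t_near, t0), min(t_far, t1)
                if t_near > t_far:
                    return False
        return True

    def ray_hits_triangle(origin, direction, v0, v1, v2):
        """Moller-Trumbore intersection: returns hit distance t, or None on a miss."""
        def sub(a, b):   return [a[i] - b[i] for i in range(3)]
        def dot(a, b):   return sum(a[i] * b[i] for i in range(3))
        def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                                 a[2]*b[0] - a[0]*b[2],
                                 a[0]*b[1] - a[1]*b[0]]

        e1, e2 = sub(v1, v0), sub(v2, v0)
        pvec = cross(direction, e2)
        det = dot(e1, pvec)
        if abs(det) < 1e-12:          # Ray parallel to the triangle plane.
            return None
        inv_det = 1.0 / det
        tvec = sub(origin, v0)
        u = dot(tvec, pvec) * inv_det
        if u < 0.0 or u > 1.0:
            return None
        qvec = cross(tvec, e1)
        v = dot(direction, qvec) * inv_det
        if v < 0.0 or u + v > 1.0:
            return None
        t = dot(e2, qvec) * inv_det   # Distance along the ray to the hit point.
        return t if t > 1e-12 else None
    ```

    A BVH traversal then just applies `ray_hits_aabb` recursively down the hierarchy of boxes and only runs the (more expensive) triangle test at the leaves, which is the filtering that the RT cores accelerate.
    
    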
     
    Last edited: Apr 21, 2020
    hippocoder likes this.
  5. Rasly233

    Rasly233

    Joined:
    Feb 19, 2015
    Posts:
    264
    But doesn't DXR only work with RTX? If DXR is only an API for RTX hardware, then calling it RTX is not just marketing.
     
  6. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
    I don't think the moderator's post was about the differences between RTX and DXR, although you are correct. I feel he was just saying raytracing doesn't automatically imply a good artistic or realistic style.
     
    hippocoder likes this.
  7. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,189
    No. Check the spoiler below for a demonstration of DXR running through AMD's RDNA 2 architecture.


    Furthermore here is a link discussing NVIDIA's GTX 10 series (1060, 1070, 1080, 1080 Ti) running DXR.

    https://www.anandtech.com/show/14203/nvidia-releases-dxr-driver-for-gtx-cards

    Right, and I'm editing my post with additional information on the actual implementation.
     
    P_Jong and hippocoder like this.
  8. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
  9. unit_dev123

    unit_dev123

    Joined:
    Feb 10, 2020
    Posts:
    989
  10. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Sometimes branding becomes the generic term though. NVidia has done it before. "GPU" comes from the marketing campaign for the original NVidia GeForce card. ATI came up with "VPU" to describe their offerings, but GPU stuck.
     
    PerfidiousLeaf and Ryiah like this.