
Official Raytracing API Out of Experimental in 2023.1

Discussion in 'HDRP Ray Tracing' started by dnach, Oct 19, 2022.

  1. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    We are happy to announce that as of Unity 2023.1.0a13, the Ray Tracing API is officially out of experimental. This change follows recent improvements to the Ray Tracing API, ranging from stability and performance to broader compatibility with the engine’s existing feature set.

    Unity’s Ray Tracing API is used by the High Definition Render Pipeline (HDRP) to implement a collection of cutting-edge ray tracing effects such as:
    • Ambient Occlusion
    • Contact Shadows
    • Global Illumination
    • Reflections
    • Shadows
    • Subsurface Scattering
    • Recursive Ray Tracing
    • Path Tracing



    “Enemies” - HDRP Ray Tracing in Unity 2022.2

    To experiment with HDRP's ray tracing effects, you can now use the HDRP Sample Scene, which has been updated with new ray tracing quality settings. The new settings assets are introduced for the 2022.2 beta in "3D Sample Scene (HDRP) v14.1.0", and for the 2023.1 alpha in "3D Sample Scene (HDRP) v15.1.0":



    “HDRP Sample Scene” - Raytraced GI, Reflections, Shadows and AO enabled



    As of 2022.2, we now provide full desktop and console support for RT-capable hardware, including Xbox Series X|S. In the same release, we also made Terrain heightmaps compatible with ray tracing when using the Terrain system.

    Unity 2023.1 further advances Unity’s ray tracing support with full VFX Graph integration, allowing you to author complex particle effects that are compatible with HDRP’s ray tracing effects.

    This release also expands the Ray Tracing API to simplify the configuration of meshes added to the Ray Tracing Acceleration Structure (RTAS). This is achieved by introducing a new overload of RayTracingAccelerationStructure.AddInstance:

    Code (CSharp):
    RayTracingAccelerationStructure.AddInstance(ref Rendering.RayTracingMeshInstanceConfig config, Matrix4x4 matrix, Matrix4x4? prevMatrix = null, uint id = 0xFFFFFFFF);
    The new API lets you pass the new RayTracingMeshInstanceConfig struct to conveniently configure the mesh and material parameters of instances to be included in the RTAS. For example, using this new API it is now straightforward to process/animate the geometry of ray-traced meshes by retrieving the mesh vertex buffer using "Mesh.GetVertexBuffer" and binding it to a compute shader using "ComputeShader.SetBuffer".
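    For illustration, here is a rough, unverified sketch of what that flow could look like. The compute shader asset, its "Deform" kernel and the "_VertexBuffer" property are assumed names for this example, not part of the Unity API:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: animate a mesh's vertices on the GPU each frame, then
    // register the deformed mesh in the RTAS via the config-based overload.
    public class RayTracedDeformer : MonoBehaviour
    {
        public Mesh mesh;
        public Material material;
        public ComputeShader deformShader;           // assumed asset with a "Deform" kernel
        public RayTracingAccelerationStructure rtas; // created and built elsewhere

        GraphicsBuffer vertexBuffer;

        void Start()
        {
            // Allow raw (ByteAddressBuffer) access to the vertex data.
            mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;
            vertexBuffer = mesh.GetVertexBuffer(0);
        }

        void Update()
        {
            // Animate the vertex positions on the GPU.
            int kernel = deformShader.FindKernel("Deform");
            deformShader.SetBuffer(kernel, "_VertexBuffer", vertexBuffer);
            deformShader.SetFloat("_Time", Time.time);
            deformShader.Dispatch(kernel, (mesh.vertexCount + 63) / 64, 1, 1);

            // Add the deformed mesh to the acceleration structure.
            var config = new RayTracingMeshInstanceConfig(mesh, 0, material);
            config.dynamicGeometry = true; // vertices change, so the BLAS must be rebuilt
            rtas.AddInstance(config, transform.localToWorldMatrix);
        }

        void OnDestroy() => vertexBuffer?.Dispose();
    }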

    Update:

    RayTracingAccelerationStructure.AddInstances is also introduced in 2023.1a18, and provides full instancing support for the Ray Tracing API, allowing you to add large numbers of mesh instances to the RTAS and to use the instance ID in hit shaders to access per-instance data.

    Using instancing, it is now possible to more efficiently ray trace large, dense scenes that include high-frequency repeating meshes and detail. For more information on ray tracing instancing, including performance testing figures and a reference sample project, please check out the following slides.

    With the Ray Tracing API out of experimental in 2023.1, we can’t wait to see the amazing results you achieve using HDRP’s comprehensive ray tracing effects! Your feedback is crucial, so please let us know if you encounter any issues, and share any features and changes you would like to see. You can contact us directly in this thread, or by submitting a feature/change request using the official Roadmap Request Form.
     
    Last edited: Jan 11, 2024
    rdjadu, AntonioModer, Lex4art and 9 others like this.
  2. sqallpl

    sqallpl

    Joined:
    Oct 22, 2013
    Posts:
    384
    Great to hear that!

    Tree and foliage wind/touch bending is most of the time achieved by using vertex animation in shaders. Do you think that it will be possible to feed the RTAS with vertex animation from shaders in the future? Especially when it comes to instancing when one source mesh is instanced multiple times.
     
  3. INedelcu

    INedelcu

    Unity Technologies

    Joined:
    Jul 14, 2015
    Posts:
    183
    We don't think that is achievable. Each tree instance would have to write its animated result into GPU memory and build its acceleration structure on the GPU every frame, assuming the wind animation doesn't look the same for all trees. This means that instead of having only one Mesh for a tree prototype in GPU memory, we would need one Mesh per tree instance, plus its associated acceleration structure, on top of the GPU cost of writing these animated meshes to GPU memory and building their acceleration structures. The ray tracing pipeline is different from the rasterization pipeline, and they use different types of shaders (vertex and pixel shaders versus ray generation and hit shaders). If a vertex shader animates vertices, that animation is not automatically captured in GPU memory unless you write it out manually from the vertex shader or a compute shader. The SkinnedMeshRenderer component is an example that employs this type of pipeline when GPU skinning is enabled.
     
    LooperVFX, chap-unity and dnach like this.
  4. sqallpl

    sqallpl

    Joined:
    Oct 22, 2013
    Posts:
    384
    Thank you for the detailed response.

    So, if this is a general ray tracing limitation and we can't simply and cheaply feed the RTAS with the results of a vertex shader, I'm wondering what the future (if there is any) looks like for making many instances of animated foliage work with ray tracing?
     
  5. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    To achieve this, a compute pre-pass that processes the deformed geometry would currently be needed. With the recent API changes mentioned in the post, you can now conveniently deform the vertices of a mesh added to the RTAS via RTAS.AddInstance(Mesh), for example using a compute shader.

    Before enabling wind animation in the existing Trees/Detail painting system when using ray tracing, we are still exploring sensible approaches to avoid an excessive impact on performance and stability (e.g. due to exceedingly high memory usage), so RT support is currently limited to static trees/foliage.
     
    Last edited: Oct 19, 2022
    LooperVFX likes this.
  6. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,278
    Honestly, we're artists, not programmers.
    We want a simple solution to replace traditional offline rendering, with unity offline rendering.

    All this talk of RTAS.AddInstances and this code and that code, makes our heads spin. Why can't we as artists get something that just works?
     
    LooperVFX likes this.
  7. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,278
    So to get/use this in 2023.1.0a14, all I have to do is start a new project using the HDRP demo scene, then convert the scene to Raytracing using the HDRP wizard?

    When 2023.1 tech release launches, can we please get a new demo scene where it already starts up in DX12 with raytracing / pathtracing already enabled please? It makes sense to me...

    Better yet, can we start getting the Unity demos, so we can see how things were created and play with them ourselves? For example, that atrium in the Enemies demo looks ace!
     
    Gametyme likes this.
  8. guoxx_

    guoxx_

    Joined:
    Mar 16, 2021
    Posts:
    55
    Please add support for ComputeShader.SetBufferArray. As mentioned, we have Mesh.GetVertexBuffer and can use the result directly in a compute shader, but a mesh may contain multiple vertex buffers. This is already supported internally by Unity's ray tracing shaders:
    Code (CSharp):
    ByteAddressBuffer unity_MeshVertexBuffers_RT[kMaxVertexStreams];
    ByteAddressBuffer unity_MeshIndexBuffer_RT;
    If we had the same capability in compute shaders, we could make the same code work for both compute and ray tracing, which would simplify things.
     
    Glacier-Games and matthewhxq like this.
  9. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    To get started with HDRP ray tracing, you can now:
    1. Create a project using HDRP sample scene
    2. Under "Project Settings"/"Quality" set RenderPipelineAsset to either 'HDRPRaytracingQuality.asset' or 'HDRPRaytracingRTGIQuality.asset'
    3. Control the desired RT effect in the inspector via Volume Override settings. In the scene hierarchy, look for "Lighting"/"Volumes"/"Volume Global"/"Volume Ray Tracing".
    4. You can also add your own volume overrides. For example, you can add a Screenspace Reflections override and set the Tracing method to "Raytracing".
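    For reference, the same kind of override can also be toggled from script. A rough, unverified sketch, assuming a Volume whose profile already contains a Screen Space Reflection override:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    // Sketch: switch an existing SSR volume override to ray tracing at runtime.
    public class EnableRayTracedReflections : MonoBehaviour
    {
        public Volume volume;

        void Start()
        {
            if (volume.profile.TryGet(out ScreenSpaceReflection ssr))
            {
                ssr.tracing.overrideState = true;
                ssr.tracing.value = RayCastingMode.RayTracing;
            }
        }
    }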
    Thank you for the feedback! I agree that a C# API for ComputeShader/Material.SetBufferArray would be useful.
    I believe this is currently unsupported, as it would be limited to more modern platforms. I logged a request with our team, so I will follow up on this and keep you updated.
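    In the meantime, one workaround is to bind each vertex stream to its own ByteAddressBuffer slot. A rough, unverified sketch; the "_VertexBuffer0/1/..." naming convention is assumed for this example, not a Unity API:

    Code (CSharp):
    using UnityEngine;

    // Sketch: bind every vertex stream of a mesh to a compute kernel,
    // one buffer per stream, mirroring the internal per-stream RT buffers.
    public static class MeshComputeBinding
    {
        public static GraphicsBuffer[] BindVertexStreams(ComputeShader cs, int kernel, Mesh mesh)
        {
            mesh.vertexBufferTarget |= GraphicsBuffer.Target.Raw;
            var buffers = new GraphicsBuffer[mesh.vertexBufferCount];
            for (int stream = 0; stream < buffers.Length; stream++)
            {
                buffers[stream] = mesh.GetVertexBuffer(stream);
                // Compute shader declares e.g. ByteAddressBuffer _VertexBuffer0, _VertexBuffer1, ...
                cs.SetBuffer(kernel, "_VertexBuffer" + stream, buffers[stream]);
            }
            return buffers; // caller disposes these after dispatching
        }
    }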
     
  10. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,278
    I see no such settings or assets.
    Are you sure it's in 2023.1.0a14?

    upload_2022-10-21_18-30-32.png
     
  11. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    The new Raytracing Quality Assets were added for the latest 2022.2 beta in "HDRP Sample Scene v14.1.0", and for the 2023.1 alpha in "HDRP Sample Scene v15.1.0".

    It seems like there was a short delay with the latter (template update for 2023.1a), while these are already available for 2022.2b.

    Edit: HDRP Sample Scene v15.1.0 should now be available!
     
    Last edited: Oct 28, 2022
    newguy123 likes this.
  12. merpheus

    merpheus

    Joined:
    Mar 5, 2013
    Posts:
    202
    I wish there were a way to run the ray tracing API/shaders on Metal/M1 Macs. It's nice to use Unity with Metal from day one, but not getting Ray Tracing API support (despite Metal having one) is sad and affects our workflow between team members who use different devices (one on Mac/Metal, the other on Windows/DX12).

    At least having ray tracing shader support, even if the HDRP ray tracing pipeline doesn't work, would be a good start.
     
  13. argibaltzi

    argibaltzi

    Joined:
    Nov 13, 2014
    Posts:
    221
    Any chance this will work with DOTS, or is it a gameobjects feature only?
     
  14. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    You might need to manage the instances manually instead of letting CullInstances() do it, but the new API linked in the OP doesn't require a Renderer, just a Mesh and a Material. You could ping hybrid renderer team to see if they want to integrate it, or just do it yourself.
     
    argibaltzi likes this.
  15. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    DOTS does not officially support Raytracing at the moment, but the team will consider supporting this in the future.

    As @burningmime mentioned above, you cannot currently use RayTracingAccelerationStructure.CullInstances() to add such instances to the Ray Tracing Acceleration Structure efficiently. It may be possible to get a custom implementation working via RayTracingAccelerationStructure.AddInstance, though we have not fully verified this and would not recommend it due to performance concerns.
     
    Last edited: Nov 4, 2022
  16. ftejada

    ftejada

    Joined:
    Jul 1, 2015
    Posts:
    695
    Hi everyone!!!
    I'm playing a bit with Unity 2023 HDRP and raytracing and I'm very confused.

    I'm trying to make a room dark because there's no sunlight coming in...
    But I can't achieve it. The ambient lighting of PBS always stays, as you can see in the following video:


    I have tweaked all the parameters of "SSGI" but I cannot eliminate this annoying problem. I also have a reflection probe updating in real time... but it doesn't do anything either.

    I'm confused. I have tried to follow this guide: https://forum.unity.com/threads/hdr...acing-lighting-troubleshooting-guide.1248550/
    and other Unity videos like this.
    But I don't know how to solve these problems.

    Is Unity not able to handle this type of scenario without having to change all kinds of settings in the PBS, intensity of directionalLight, or patching settings with "Indirect Lighting Controller or Exposure"?

    How can I handle this type of situation? Any info, help, etc??

    Cheers
     
  17. chap-unity

    chap-unity

    Unity Technologies

    Joined:
    Nov 4, 2019
    Posts:
    788
    Hey, you should probably have made your own post since it looks like a specific setup issue, but anyway.
    Could you make the video "not private" and/or provide a repro project so that we can have a look?

    Recently we've introduced an ambient light dimmer in Ray Traced GI, Ray Traced Reflections and Recursive Rendering to take care of exactly this issue (basically, ambient lighting falling back to the ambient probe after the last bounce). Did you try that? It's an advanced parameter, so you will need to click the ellipsis (three vertical dots) in the top right corner of the override and click "Show additional properties".
     
  18. LooperVFX

    LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    182
    Agreed, this is a Unity Editor user experience / tooling issue, not a core technical feature issue. It's already working, just without an artist-friendly or rapid-prototyping UI/UX on top (very valuable to developers, not just artists). So it's less of a question for @INedelcu and more of a question for @LaurentGibert, or at least he'll know which team is or will be managing this.

    I suspect this hasn't gotten a lot of attention because many artists don't know exactly what to ask for, other than to just make their existing workflows keep working. That is reasonable to ask, and I don't blame the artists for this problem we engineers and programmers made, but in this case (making vertex offset work for ray tracing) it would limit accessibility and flexibility for future features. So, as a developer/programmer who works with artists, I'll tell you what would be the best of both worlds:

    A "compute" (shader) graph, and perhaps a domain-specific "mesh" graph variant (one that may take advantage of mesh shader GPU hardware when available, and fall back to pure GPU compute when it is not, or to DOTS-style CPU compute as another fallback or as desired).

    In practice this would work in many ways like Blender's or Houdini's geometry node graph tools or "modifiers".

    This would separate the mesh abstraction from the material and lighting abstraction, back to where they belong (in most cases; I know many exceptions exist).

    You have a Mesh Object, you deform it, manipulate it, generate it from scratch, whatever you want using "Mesh" compute nodes / functions in the compute graph. This can either update dynamically every frame at runtime or be baked to static.

    And then you can apply whatever material you want to that mesh to make the surface material appear or lit how you desire.

    We would no longer have this collapsed abstraction of "gotta use a vertex shader", where the only way to performantly deform a mesh is to hack it into the vertex shader, which can do very little (even less in the ray tracing pipeline), and which is eventually heading toward a future where vertex shaders are deprecated altogether in favor of mesh/compute shaders.
     
  19. burningmime

    burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    Is stream-out with a null GS an option? This would allow you to run some user-defined legacy vertex shader and get the modified positions to put into the RTAS. Don't know how performant it would be if there were 10000 plants waving in the wind with alpha-tested materials, but it would be a seamless workflow instead of requiring compute shaders for mesh deformation.
     
  20. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    5,200
    Does this also mean VR is supported with ray tracing?
     
  21. m0nsky

    m0nsky

    Joined:
    Dec 9, 2015
    Posts:
    263
    Ray tracing in VR works fine up to 2021.2. In 2023.1, there are two known issues which I have reported and described over here. (these issues are only present in Windows standalone builds, you should be able to develop in Editor without issues)
     
    DevDunk likes this.
  22. Deleted User

    Deleted User

    Guest

    Is it possible to safely call AddInstance from a thread other than the main thread, or even better, can it be called from multiple threads simultaneously?
    That would improve performance a lot when using entities as well as a lot of procedural AABBs.
     
  23. m0nsky

    m0nsky

    Joined:
    Dec 9, 2015
    Posts:
    263
  24. INedelcu

    INedelcu

    Unity Technologies

    Joined:
    Jul 14, 2015
    Posts:
    183
    @hubbux None of the functions in RayTracingAccelerationStructure (RTAS) are thread-safe.
    CullInstances uses multi-threading and SIMD for plane-vs-AABB or sphere-vs-AABB tests against Renderers' bounds, so please use it if you can. This is what HDRP uses now. You can use AddInstance before or after CullInstances as well.
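    For illustration, a rough, unverified sketch of driving the RTAS from the main thread with CullInstances (the config field and flag names here are from memory, worth double-checking against the scripting reference):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: cull renderers against the camera frustum and build the RTAS.
    public class PopulateRTAS : MonoBehaviour
    {
        public RayTracingAccelerationStructure rtas;

        void Update()
        {
            var cullingConfig = new RayTracingInstanceCullingConfig
            {
                flags = RayTracingInstanceCullingFlags.EnablePlaneCulling,
                planes = GeometryUtility.CalculateFrustumPlanes(Camera.main),
            };
            rtas.CullInstances(ref cullingConfig); // multi-threaded, SIMD culling
            rtas.Build(Camera.main.transform.position);
        }
    }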
     
    m0nsky likes this.
  25. dnach

    dnach

    Unity Technologies

    Joined:
    Mar 9, 2022
    Posts:
    90
    Instancing support for the Ray Tracing API is now available, allowing you to efficiently add large numbers of mesh instances to the Ray Tracing Acceleration Structure and improve ray tracing performance.

    RayTracingAccelerationStructure.AddInstances is the counterpart of Graphics.RenderMeshInstanced (used in the rasterization pipeline), and adds a Mesh ray tracing instance to the TLAS for each matrix in the array. In the shader, you can use the instance ID to access per-instance data. For more information, including performance testing figures and a reference sample project, please check the following slides.
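    As a rough, unverified sketch of what a call could look like (the grid layout is purely illustrative):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: add 1024 copies of one mesh to the RTAS in a single call.
    public class InstancedRayTracing : MonoBehaviour
    {
        public Mesh mesh;
        public Material material;
        public RayTracingAccelerationStructure rtas;

        void Update()
        {
            var matrices = new Matrix4x4[1024];
            for (int i = 0; i < matrices.Length; i++)
                matrices[i] = Matrix4x4.Translate(new Vector3(i % 32, 0f, i / 32));

            var config = new RayTracingMeshInstanceConfig(mesh, 0, material);
            // Hit shaders can read the instance index to fetch per-instance data.
            rtas.AddInstances(config, matrices);
        }
    }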

    You can try out Raytracing Instancing, along with other recent Raytracing improvements, in the newly available Unity 2023.1 beta.
    As part of the beta release, we also have an ongoing sweepstakes that you can check out, for a chance to win an NVIDIA Geforce RTX 3070!

    We highly appreciate your feedback and reports, so please keep them coming and let us know if you encounter any issues, or if there are additional improvements you would like to see :)
     
    Last edited: Jan 27, 2023
  26. LooperVFX

    LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    182
    I doubt native Metal ray tracing support is a priority for Unity (and it is not likely to become one). Also, Vulkan is not supported by Apple (and is not likely to be).

    That said, one possibility would be to follow the progress of MoltenVK's (Vulkan to Metal translation layer) support for raytracing: https://github.com/KhronosGroup/MoltenVK/issues/427

    I also found an informative blog post by Codeweavers (who are also MoltenVK contributors) on these particular challenges of DirectX12 & DirectX Raytracing API translation to Apple Metal API: https://www.codeweavers.com/blog/cj...etting-there-crossover-support-for-directx-12

    Some additional info on the status of MoltenVK translating Metal 3 features to Vulkan features / API calls: https://github.com/KhronosGroup/MoltenVK/discussions/1616

    So, it's no trivial task, but there's hope and people working on it (at Khronos Group, Codeweavers, and also Apple since it seems Metal 3 may be required for tier 2 level resource binding, etc.)

    Once this foundation is complete, then the developers at Unity will be more likely to implement and verify this feature. Not necessarily using MoltenVK, as Unity commonly writes their own API / code translation layers, but it will at least provide a proven path of reference for the required API features and translation toolchains, etc.
     
    Last edited: Jan 29, 2023
  27. vx4

    vx4

    Joined:
    Dec 11, 2012
    Posts:
    182
    Can we use Inline raytracing inside vfx graph?
     
  28. a-h-r

    a-h-r

    Joined:
    Jun 8, 2023
    Posts:
    4
    What about now that Unity has partnered with Apple? Do you think it will support ray tracing using Metal?
     
  29. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    5,200
    Probably not; I think it was even removed from "under consideration" on the roadmap.
     
  30. LooperVFX

    LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    182
  31. Timten

    Timten

    Joined:
    Nov 11, 2017
    Posts:
    13
    Hello,
    1. I would like to know when Entities Graphics will support Ray Tracing.
    2. Will it have a big performance impact if I change all entities from Entities Graphics rendering back to GameObject rendering?
    I love DOTS, so I will not go back to the GameObject/MonoBehaviour world, but I also love Ray Tracing, looking forward to hearing from you, thanks.
     
    Tigrian, newguy123 and DevDunk like this.
  32. ArIaNFury008

    ArIaNFury008

    Joined:
    Dec 22, 2019
    Posts:
    28
    hi, sorry for my noob question:

    Is ray tracing ready for open-world foliage games like Sons of the Forest?
     
  33. LooperVFX

    LooperVFX

    Joined:
    Dec 3, 2018
    Posts:
    182
    Yes, and Sons of the Forest was made with Unity! As I understand, raytracing static foliage has been supported for some time, while animated "swaying in the wind" foliage raytracing support was added more recently in one of two ways:

    1. Mixed tracing, that is, mixed between worldspace raytracing and screenspace raymarching. (The difference between tracing and marching is an implementation detail that affects the end result far less than the "space" it is paired with, so don't get caught up on that.) The static/immovable parts of foliage will be fully raytraced in worldspace with GPU ray tracing hardware acceleration, while the animated parts of the foliage will be raymarched in screen space in GPU software (still very performant). Altogether this looks and performs, in many situations, just about (but not quite) as well as if it were all fully worldspace raytraced.

    This Mixed tracing + vertex shader approach is easier / simpler to get working and will likely work for most situations. The vertex shader can be authored with ShaderGraph.

    2. The new Unity Mesh API with compute shader access to the vertex/index buffers (mesh data). With this, a shader can animate foliage for the highest quality worldspace raytracing. It can also be used to recompute normals in a compute shader, author custom LOD (level of detail) behavior, or generate completely procedural, dynamic geometry in real time. Many things are possible that were not before.

    This Mesh API + compute shader approach requires writing code with an advanced knowledge of graphics programming. It is very powerful / open-ended, and can be utilized for cases demanding very high quality and accuracy. However, with some boilerplate and well-crafted example code from such a programmer, it wouldn't be too difficult for the average developer or technical artist to work with.

    All this plus the Raytracing API being production ready means that Unity can be further customized to fit the requirements of different projects, tailor optimizations, custom features, stylized rendering, etc.
     
    Last edited: May 2, 2024
    DevDunk and ArIaNFury008 like this.