
Question RayTracingAccelerationStructure.AddInstance() with procedural mesh

Discussion in 'High Definition Render Pipeline' started by AntonioNoack, Jul 26, 2022.

  1. AntonioNoack

    Joined: Mar 26, 2020
    Posts: 7
    Hey,

    I have a procedural shader that I can visualize with Graphics.DrawProcedural() and CommandBuffer.DrawProcedural(matrix, shader, pass, triangles, baseMeshVertexCount, instanceCount).
    I would like to use a RayTracingAccelerationStructure to find all procedural instances that intersect a custom ray or lie within a certain area (to execute a process called "light sampling"; my procedural instances are global-illumination surfels, i.e. cubes).
    I want to avoid CPU-GPU bottlenecks, so creating a traditional mesh on the CPU side is not an option (or at least something I really want to avoid).
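
    For reference, the kind of draw call I mean looks roughly like this (just a sketch; surfelMaterial and surfelCount are placeholders for my setup):

    Code (CSharp):
    // One cube (36 vertices) per surfel; positions are read from a StructuredBuffer in the vertex shader.
    Graphics.DrawProcedural(
        surfelMaterial,                                  // procedural shader that reads the surfel buffer
        new Bounds( Vector3.zero, Vector3.one * 1000f ), // generous bounds, no culling
        MeshTopology.Triangles,
        36,                                              // vertices per instance (one cube)
        surfelCount );                                   // number of surfel instances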

    There is a method RayTracingAccelerationStructure.AddInstance() to which you can add a Renderer. Looking at the default MeshRenderer, it accepts procedural shader-based materials, and there is an option to "Ray Trace Procedurally". However, there isn't any obvious way to set the number of procedural instances.

    a) Is there a way to set that property so that the RTAS would work?
    b) If not a), could I implement a custom Renderer that works with the RTAS?
    c) If not a), is there an existing Renderer that works with the RTAS?
    d) If neither a), b) nor c), is there really no way? Then I'd have to write my own acceleration structure on the GPU :/ (I have the code for the CPU side, but porting it to the GPU is something else.)
     
  2. INedelcu (Unity Technologies)

    Joined: Jul 14, 2015
    Posts: 176
    Hi,

    Graphics.Draw* and Graphics.Render* functions don't have any effect in ray tracing, since rasterization and ray tracing are two separate graphics pipelines that use different shader types, but you can mix their results together (e.g. generate ray-traced reflections and combine them with rasterization). If you do vertex manipulation or generate triangles in a vertex/geometry/tessellation shader, the result is not automatically visible in ray tracing, because the RayTracingAccelerationStructure (RTAS) build process only uses geometries that are in main GPU memory.

    Current ray tracing hardware can do ray-triangle and ray-AABB intersections. The latter might be useful in your case.

    In ray tracing, procedural geometry is generated using an intersection shader, where you evaluate a function to compute the geometry intersection (the intersection T). This shader type is executed when you use AABBs instead of actual triangle geometry. "Ray Trace Procedurally" causes an AABB to be added to the RTAS instead of the actual Mesh geometry, and also enables intersection shader evaluation on the GPU.
    In your case it's better to use the other signature, which takes a GraphicsBuffer instead, so that you can add multiple AABBs in one go:

    int AddInstance(GraphicsBuffer aabbBuffer, uint aabbCount, bool dynamicData, Matrix4x4 matrix, Material material, bool opaqueMaterial, MaterialPropertyBlock properties, uint mask, uint id);

    The shader used by the material passed as an argument needs to have a shader pass containing your intersection shader. There's a small (not particularly useful) example of an intersection shader in the SetShaderPass documentation.
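
    Roughly, the C# side would look like this (just a sketch; aabbData, aabbCount and proceduralMaterial are placeholder names, and each AABB entry is a min/max float3 pair):

    Code (CSharp):
    // One AABB = 6 floats (min.xyz, max.xyz).
    GraphicsBuffer aabbs = new GraphicsBuffer( GraphicsBuffer.Target.Structured, aabbCount, 6 * sizeof( float ) );
    aabbs.SetData( aabbData ); // or fill the buffer on the GPU

    RayTracingAccelerationStructure rtas = new RayTracingAccelerationStructure();
    rtas.AddInstance( aabbs, (uint) aabbCount, dynamicData: false, Matrix4x4.identity,
                      proceduralMaterial, opaqueMaterial: true, properties: null );
    rtas.Build();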

    I attached an example project from our graphics tests database that uses intersection shaders. Switch to the Game view to see the ray-traced scene. The AABB list is generated in RayTracingProceduralIntersection.cs, but you can also generate the AABB buffer in a compute shader. Also check the intersection shaders in the .shader files.
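
    If you generate the AABBs in a compute shader instead, the dispatch side could look roughly like this (a sketch only; the compute shader asset, kernel name and thread-group size are hypothetical):

    Code (CSharp):
    // Fill the AABB buffer on the GPU instead of uploading it with SetData().
    int kernel = aabbCompute.FindKernel( "GenerateAABBs" );
    aabbCompute.SetBuffer( kernel, "_AABBs", aabbs );
    aabbCompute.SetInt( "_Count", aabbCount );
    aabbCompute.Dispatch( kernel, Mathf.CeilToInt( aabbCount / 64f ), 1, 1 ); // kernel uses 64 threads per group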
     


    cecarlsen, wm-VR and m0nsky like this.
  3. AntonioNoack

    Joined: Mar 26, 2020
    Posts: 7
    Thank you, especially for the sample. It works now :)
     
  4. cecarlsen

    Joined: Jun 30, 2006
    Posts: 870
    Hi @INedelcu, I just tried the IntersectionShaderTest example in Unity 2022.2.1 (HDRP 14.0.4) and it crashed Windows. Is there an updated example?
     
  5. INedelcu (Unity Technologies)

    Joined: Jul 14, 2015
    Posts: 176
    Hi @cecarlsen

    I can open that project in 2022.2.1f1 without issues on my PC. Can you report a bug using the Bug Reporter, or does it actually crash your operating system?
     


  6. cecarlsen

    Joined: Jun 30, 2006
    Posts: 870
    Hi again @INedelcu. I closed the other Unity instances running in the background, and the example seems to stay running now. It looks very odd, though. It seems the Game view camera is somehow being overridden by the Scene view camera. And I don't see any vertical bars in the background like in your screenshot. It is a completely clean project with HDRP 14.0.4 and DXR enabled (using the wizard).

    EDIT: Hold on. The original project works now in 2022.2. Not sure what exactly went wrong. I'll get back once I've understood more.

    EDIT2: So I presumed the example was for HDRP with DXR (because of the forum section), but to my surprise it does not depend on any of that, just DX12. It is a very useful example; there is a lot to digest here.
     


    Last edited: Jan 5, 2023
  7. INedelcu (Unity Technologies)

    Joined: Jul 14, 2015
    Posts: 176
    m0nsky and cecarlsen like this.
  8. cecarlsen

    Joined: Jun 30, 2006
    Posts: 870
    Hi @INedelcu

    I am attempting to use AddInstance( aabbs ) in HDRP with the goal of adding ray-traced SSAO and SSR on top of rasterized instances rendered with Graphics.RenderMeshIndirect(). I've duplicated the HDRP Lit shader and the RaytracingIntersection.hlsl include, modified the include, and updated the references in Lit to point to it.

    The sphere on the left is rendered by a MeshRenderer.

    The sphere on the right is rendered using Graphics.RenderMesh() and RayTracingAccelerationStructure.AddInstance( aabbs ), both using the modified Lit shader. The intersection shader tests against a sphere instead of triangles.

    [Screenshot: CustomInstancedDXR.png]

    A couple of questions:
    1) Does the code below look sane to you? There is so little documentation on this ... I am not sure if I am using the API as intended.
    2) Any idea why the mesh on the right (RenderMesh) is not affected by lights (2023.1)? It seems to work in 2022.2.7. EDIT: Oh, it could be because the Lit shader originated from 2022.2. This is exactly why this needs to be supported by ShaderGraph and BlockShaders.

    The C# script for testing looks like this:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.HighDefinition;

    public class RayTracingInstancesAABB : MonoBehaviour
    {
        [SerializeField] Camera _camera;
        [SerializeField] Material _material;
        [SerializeField] Mesh _mesh;
        [SerializeField] bool _animate;

        GraphicsBuffer _aabbs;
        AABB[] _aabbsData;

        RayTracingAccelerationStructure _rayStruct;
        RayTracingInstanceCullingConfig _cullConfig;

        HDCamera _hdCam;

        RenderParams _renderParams;

        public struct AABB { public Vector3 min, max; }


        void Start()
        {
            _aabbsData = new AABB[ 1 ];
            _aabbs = new GraphicsBuffer( GraphicsBuffer.Target.Structured, _aabbsData.Length, stride: 6 * sizeof( float ) );
            _rayStruct = new RayTracingAccelerationStructure();

            _material.SetBuffer( "_AABBs", _aabbs );

            RayTracingInstanceCullingTest cullingTest = new RayTracingInstanceCullingTest();
            cullingTest.allowAlphaTestedMaterials = true;
            cullingTest.allowOpaqueMaterials = true;
            cullingTest.allowTransparentMaterials = false;
            cullingTest.instanceMask = 255;
            cullingTest.layerMask = -1;
            cullingTest.shadowCastingModeMask = -1;

            _cullConfig = new RayTracingInstanceCullingConfig();
            _cullConfig.subMeshFlagsConfig.opaqueMaterials = RayTracingSubMeshFlags.Enabled | RayTracingSubMeshFlags.ClosestHitOnly;
            _cullConfig.subMeshFlagsConfig.alphaTestedMaterials = RayTracingSubMeshFlags.Enabled;
            _cullConfig.subMeshFlagsConfig.transparentMaterials = RayTracingSubMeshFlags.Disabled;
            _cullConfig.instanceTests = new RayTracingInstanceCullingTest[] { cullingTest };

            _hdCam = HDCamera.GetOrCreate( _camera );

            _renderParams = new RenderParams( _material ) {
                renderingLayerMask = GraphicsSettings.defaultRenderingLayerMask,
                lightProbeUsage = LightProbeUsage.BlendProbes,
                worldBounds = new Bounds( Vector3.zero, Vector3.one * 1000 ), // Screw culling for now.
                camera  = _camera,
            };
        }


        void OnDestroy()
        {
            if( _aabbs != null ) _aabbs.Release();
            _aabbs = null;

            if( _rayStruct != null ) _rayStruct.Release();
            _rayStruct = null;
        }


        void Update()
        {
            // Update sphere transform.
            float ani = _animate ? Mathf.Sin( Time.time ) * 0.05f : 0f;
            Vector3 spherePos = new Vector3( 1, 1 + ani * 0.5f, 2 );
            float sphereRadius = 0.5f + ani;
            _aabbsData[ 0 ] =
                new AABB() {
                    min = spherePos - Vector3.one * sphereRadius,
                    max = spherePos + Vector3.one * sphereRadius
                };
            _aabbs.SetData( _aabbsData );

            // Update ray tracing structure for DXR SSAO and SSR. Eventually I guess we need a manager to handle multiple sources per camera.
            _rayStruct.ClearInstances();
            _rayStruct.CullInstances( ref _cullConfig );
            _rayStruct.AddInstance( _aabbs, (uint) _aabbs.count, dynamicData: false, Matrix4x4.identity, _material, opaqueMaterial: true, properties: null );
            _rayStruct.Build( _camera.transform.position );
            _hdCam.rayTracingAccelerationStructure = _rayStruct;

            // Issue rasterisation. This will eventually be done by Graphics.RenderMeshIndirect, but that requires a different shader.
            Graphics.RenderMesh( _renderParams, _mesh, submeshIndex: 0, Matrix4x4.TRS( spherePos, Quaternion.identity, Vector3.one * sphereRadius * 2 ) );
        }
    }

    And the modified RaytracingIntersection.hlsl looks like this:

    Code (HLSL):
    #ifndef UNITY_RAYTRACING_INTERSECTION_INCLUDED
    #define UNITY_RAYTRACING_INTERSECTION_INCLUDED

    // Engine includes
    #include "UnityRayTracingMeshUtils.cginc"

    // Raycone structure that defines the state of the ray
    struct RayCone
    {
        float width;
        float spreadAngle;
    };

    // Structure that defines the current state of the visibility
    struct RayIntersectionDebug
    {
        float t;                // Distance of the intersection
        float2 barycentrics;    // Barycentrics of the intersection
        uint primitiveIndex;
        uint instanceIndex;
    };

    // Structure that defines the current state of the visibility
    struct RayIntersectionVisibility
    {
        float t;                // Distance of the intersection
        float velocity;         // Velocity for the intersection point
        RayCone cone;           // Cone representation of the ray
        uint2 pixelCoord;       // Pixel coordinate from which the initial ray was launched
        float3 color;           // Value that holds the color of the ray or debug data
    };

    // Structure that defines the current state of the intersection
    struct RayIntersection
    {
        float t;                // Distance of the intersection
        float3 color;           // Value that holds the color of the ray
        RayCone cone;           // Cone representation of the ray
        uint remainingDepth;    // The remaining available depth for the current Ray
        uint sampleIndex;
        uint rayCount;
        uint2 pixelCoord;       // Pixel coordinate from which the initial ray was launched
        float velocity;         // Velocity for the intersection point
    };

    struct AttributeData
    {
        float2 barycentrics; // Barycentric value of the intersection
        float3 normalOS; // For RayTraceProcedurally
    };

    // Macro that interpolates any attribute using barycentric coordinates
    #define INTERPOLATE_RAYTRACING_ATTRIBUTE(A0, A1, A2, BARYCENTRIC_COORDINATES) (A0 * BARYCENTRIC_COORDINATES.x + A1 * BARYCENTRIC_COORDINATES.y + A2 * BARYCENTRIC_COORDINATES.z)

    // Structure to fill for intersections
    struct IntersectionVertex
    {
        float3 normalOS;
        float4 tangentOS;
        float4 texCoord0;
        float4 texCoord1;
        float4 texCoord2;
        float4 texCoord3;
        float4 color;
    #ifdef USE_RAY_CONE_LOD
        float  triangleArea;
        float  texCoord0Area;
        float  texCoord1Area;
        float  texCoord2Area;
        float  texCoord3Area;
    #endif
    };


    struct AABB
    {
        float3 min;
        float3 max;
    };

    StructuredBuffer<AABB> _AABBs;


    void ReadSphere( out float3 pos, out float radius )
    {
        AABB aabb = _AABBs[ PrimitiveIndex() ];
        pos = (aabb.min + aabb.max) * 0.5;
        radius = ( aabb.max.x - aabb.min.x ) * 0.5;
    }


    float HitSphere( float3 center, float radius, float3 ro, float3 rd )
    {
        float3 oc = ro - center;
        float a = dot(rd, rd);
        float b = 2.0 * dot(oc, rd);
        float c = dot(oc,oc) - radius*radius;
        float discriminant = b*b - 4*a*c;
        if( discriminant < 0.0 ) {
            return -1.0;
        } else {
            float numerator = -b - sqrt(discriminant);
            return numerator > 0.0 ? numerator / (2.0 * a) : -1;
        }
    }


    // Fetch the intersection vertex data for the target vertex
    void FetchIntersectionVertex(uint vertexIndex, out IntersectionVertex outVertex)
    {
        float3 sphereCenter;
        float sphereRadius;
        ReadSphere( sphereCenter, sphereRadius );

        float3 ro = ObjectRayOrigin();
        float3 rd = ObjectRayDirection();

        float hitDist = HitSphere( sphereCenter, sphereRadius, ro, rd );
        outVertex.normalOS = 0;
        if( hitDist >= 0.0 ) outVertex.normalOS = normalize( ro + hitDist * rd - sphereCenter );

        outVertex.tangentOS  = 0.0;
        outVertex.texCoord0  = 0.0;
        outVertex.texCoord1  = 0.0;
        outVertex.texCoord2  = 0.0;
        outVertex.texCoord3  = 0.0;
        outVertex.color  = 0.0;
    }

    void GetCurrentIntersectionVertex(AttributeData attributeData, out IntersectionVertex outVertex)
    {
        outVertex.normalOS = attributeData.normalOS;

        //float3 rd = ObjectRayOrigin();
        float3 rd = float3( 0, 1, 0 ); // The HDRP does not provide RayIntersection to this function :(

        outVertex.tangentOS  = float4( normalize( cross( cross( attributeData.normalOS, rd ), attributeData.normalOS ) ), 1 );
        outVertex.texCoord0  = 0.0;
        outVertex.texCoord1  = 0.0;
        outVertex.texCoord2  = 0.0;
        outVertex.texCoord3  = 0.0;
        outVertex.color      = 0.0;
    }


    // Compute the proper world space geometric normal from the intersected triangle
    void GetCurrentIntersectionGeometricNormal( AttributeData attr, out float3 geomNormalWS )
    {
        geomNormalWS = mul( (float3x3) UNITY_MATRIX_I_M, attr.normalOS ); // World to object attr.normalOS;
    }


    [shader("intersection")]
    void IntersectionMain()
    {
        float3 sphereCenter;
        float sphereRadius;
        ReadSphere( sphereCenter, sphereRadius );

        float3 ro = ObjectRayOrigin();
        float3 rd = ObjectRayDirection();

        float hitDist = HitSphere( sphereCenter, sphereRadius, ro, rd );
        if( hitDist >= 0.0 ) {
            AttributeData attr;
            attr.barycentrics = 0.0;
            attr.normalOS = normalize( ro + hitDist * rd - sphereCenter );
            ReportHit( hitDist, 0, attr );
        }
    }


    #endif // UNITY_RAYTRACING_INTERSECTION_INCLUDED
     
    Last edited: Feb 22, 2023