Question: Optimization of HashMap Bottleneck in Burst-Compiled Job for Normals Calculation

Discussion in 'Burst' started by Leyren, May 13, 2023.

  1. Leyren (Joined: Mar 23, 2016 | Posts: 29)

    Hey!

    I have procedurally generated terrain for which I build the meshes with Marching Cubes in jobs, one per chunk. As a last step, I compute smoothed normals by grouping vertices by their coordinates (vertices with the same coordinates sit at different indices in the vertices array, because triangles don't share vertices).
    Removing this step speeds up the entire job by a factor of 2.

    Any ideas on how to improve this? I can't think of a way to avoid a map/dictionary, since the normals need to be grouped by position.

    The only other idea I have is to run separate jobs after the mesh generation that compute the normals for all chunks in parallel.


    Code (CSharp):
    public void SmoothNormals(int vertCount, int trisCount)
    {
      // Accumulate triangle normals per vertex position, so duplicated
      // vertices at the same position end up sharing one smoothed normal.
      UnsafeHashMap<Vector3, Vector3> dict = new UnsafeHashMap<Vector3, Vector3>(vertCount, AllocatorManager.Temp);

      for (int i = 0; i < trisCount; i += 3)
      {
        Vector3 vertex1 = vertices[triangles[i]];
        Vector3 vertex2 = vertices[triangles[i + 1]];
        Vector3 vertex3 = vertices[triangles[i + 2]];

        Vector3 triangleNormal = Vector3.Cross(vertex2 - vertex1, vertex3 - vertex1).normalized;

        // Three hash lookups plus three hash writes per triangle.
        dict[vertex1] = dict.TryGetValue(vertex1, out Vector3 v1) ? v1 + triangleNormal : triangleNormal;
        dict[vertex2] = dict.TryGetValue(vertex2, out Vector3 v2) ? v2 + triangleNormal : triangleNormal;
        dict[vertex3] = dict.TryGetValue(vertex3, out Vector3 v3) ? v3 + triangleNormal : triangleNormal;
      }

      // One more hash lookup per vertex to emit the averaged normal.
      for (int i = 0; i < vertCount; i++)
      {
        normals.Add(dict[vertices[i]].normalized);
      }
    }
     
  2. DreamingImLatios (Joined: Jun 3, 2017 | Posts: 4,264)

    Some low-hanging fruit:
    Use Unity.Mathematics types for calculations like cross and normalize, and for the keys and values stored in the hashmap (hashing involves GetHashCode() and Equals()).

    You are calling Add() on normals. That should be AddNoResize(), and you should set the capacity appropriately beforehand.

    And for your question, the hashmap is completely unnecessary. You already have the actual indices of the vertices from the triangles, so instead you just need:
    Code (CSharp):
    var summedNormalPerVertex = new NativeArray<float3>(vertCount, Allocator.Temp, NativeArrayOptions.ClearMemory);
    The clear-memory option means you can simply always add the new triangle normal, because the first add for each vertex is just an add to zero.
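
    Putting that together with the points above, a minimal sketch of the index-based accumulation (this assumes the vertices, triangles, and normals containers from the original post hold float3 and that the capacity of normals was reserved up front; it is an illustration, not code from the thread):
    Code (CSharp):
    var summedNormalPerVertex = new NativeArray<float3>(vertCount, Allocator.Temp, NativeArrayOptions.ClearMemory);

    for (int i = 0; i < trisCount; i += 3)
    {
      float3 v1 = vertices[triangles[i]];
      float3 v2 = vertices[triangles[i + 1]];
      float3 v3 = vertices[triangles[i + 2]];
      float3 triangleNormal = math.normalize(math.cross(v2 - v1, v3 - v1));

      // ClearMemory zeroed the array, so the first add per vertex is 0 + normal.
      summedNormalPerVertex[triangles[i]] += triangleNormal;
      summedNormalPerVertex[triangles[i + 1]] += triangleNormal;
      summedNormalPerVertex[triangles[i + 2]] += triangleNormal;
    }

    // Capacity was reserved beforehand, so AddNoResize skips the growth check.
    for (int i = 0; i < vertCount; i++)
      normals.AddNoResize(math.normalize(summedNormalPerVertex[i]));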
     
  3. Leyren (Joined: Mar 23, 2016 | Posts: 29)

    Thanks for your answer. Sadly, the last part is not applicable in this use case: each triangle is made up of its own three vertices (I need separate UV data for them for rendering), so there are multiple vertices at the same position with different indices in the list. The goal of this step is to ensure that vertices at the same position still share one common normal.
     
  4. DreamingImLatios (Joined: Jun 3, 2017 | Posts: 4,264)

    In that case, you can still reduce the number of hashmap lookups by hashing each vertex only once. Create a NativeHashMap<float3, int> whose value is an index into a NativeList<float3> of accumulated normals. Then create a NativeArray<int> where each element is also an index into that NativeList<float3>, which you populate by hashing each vertex once.

    Then, when iterating triangles, you simply do list[array[vertexID]] to get your desired normal, as in the sketch below.
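
    A minimal sketch of that remapping (the vertices, triangles, and normals containers are from the original post; positionToSlot, uniqueNormals, and remap are illustrative names, not from the thread, and float3 storage is assumed):
    Code (CSharp):
    // One hash lookup per vertex: map each position to a canonical slot.
    var positionToSlot = new NativeHashMap<float3, int>(vertCount, Allocator.Temp);
    var uniqueNormals = new NativeList<float3>(vertCount, Allocator.Temp);
    var remap = new NativeArray<int>(vertCount, Allocator.Temp);

    for (int i = 0; i < vertCount; i++)
    {
      if (!positionToSlot.TryGetValue(vertices[i], out int slot))
      {
        slot = uniqueNormals.Length;
        positionToSlot.Add(vertices[i], slot);
        uniqueNormals.Add(float3.zero);
      }
      remap[i] = slot;
    }

    // The triangle pass touches only integer indices -- no further hashing.
    for (int i = 0; i < trisCount; i += 3)
    {
      float3 v1 = vertices[triangles[i]];
      float3 v2 = vertices[triangles[i + 1]];
      float3 v3 = vertices[triangles[i + 2]];
      float3 triangleNormal = math.normalize(math.cross(v2 - v1, v3 - v1));

      uniqueNormals[remap[triangles[i]]] += triangleNormal;
      uniqueNormals[remap[triangles[i + 1]]] += triangleNormal;
      uniqueNormals[remap[triangles[i + 2]]] += triangleNormal;
    }

    // Every duplicated vertex reads back the shared, smoothed normal.
    for (int i = 0; i < vertCount; i++)
      normals.AddNoResize(math.normalize(uniqueNormals[remap[i]]));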
     
  5. CodeSmile (Joined: Apr 10, 2014 | Posts: 5,882)

    ALWAYS use Mathematics library types in Burst-compiled code!
    And it can often be beneficial to split a job into two smaller tasks, even when that means iterating over the same collection multiple times (see the sketch below)!
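
    As an illustration of that split, a hypothetical sketch of two Burst-compiled jobs chained by a dependency (the job and field names here are made up, not from the thread; the Execute bodies are stand-ins for the real mesh and normal work):
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using Unity.Mathematics;

    [BurstCompile]
    struct BuildMeshDataJob : IJob
    {
      public NativeList<float3> vertices;

      public void Execute()
      {
        // Stand-in for the marching-cubes vertex generation.
        vertices.Add(new float3(0f, 0f, 0f));
      }
    }

    [BurstCompile]
    struct SmoothNormalsJob : IJob
    {
      [ReadOnly] public NativeList<float3> vertices;
      public NativeList<float3> normals;

      public void Execute()
      {
        // Stand-in for the normal accumulation sketched earlier.
        for (int i = 0; i < vertices.Length; i++)
          normals.Add(math.normalize(new float3(0f, 1f, 0f)));
      }
    }

    // Scheduling: the smoothing job only starts once mesh generation completes.
    // var buildHandle = new BuildMeshDataJob { vertices = verts }.Schedule();
    // var smoothHandle = new SmoothNormalsJob { vertices = verts, normals = norms }.Schedule(buildHandle);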
     
  6. Leyren (Joined: Mar 23, 2016 | Posts: 29)

    Awesome, thanks to both of you! I fiddled around with the approach of reducing the dictionary calls, swapped to the Mathematics types everywhere (although the difference from that seemed to be very minor in this case), and delegated the computation of the normals to a separate job that runs between the jobs generating the other mesh data and the mesh-physics baking.