Ordered NativeMultiHashMap

Discussion in 'Entity Component System' started by Radu392, Oct 10, 2019.

  1. Radu392

    Joined:
    Jan 6, 2016
    Posts:
    210
    Hello, I’m looking for a multidimensional container to store a bunch of float3s for pathfinding purposes.

    Constraints:

    1. Has to work in a bursted job.
    2. Has to support a different number of values per key.
    3. The number of keys can be constant; my entities are never created or destroyed after initialization.
    4. Has to be deterministic.
    5. Has to be fast.

    Here is what I’ve tried so far:

    1. One big static NativeMultiHashMap with keys being entity indexes. Though this is the fastest option, it fails spectacularly when adding/removing waypoints. The order of waypoints keeps changing at random, since the container's iteration order is not deterministic. This was confirmed by mike_acton here: https://forum.unity.com/threads/nativemultihashmap-order.545713/. Sadly, I only found out about that thread after having written the system and tested it out for myself.

    2. Dynamic buffers. This is an option I tried a few weeks ago, when I knew almost nothing about ECS, so I didn't document it and maybe I didn't do it the proper way back then. It satisfied every constraint except number 5, because from what I remember, GetBufferFromEntity<Waypoint> used to be quite slow for tens of thousands of entities. I might have to try it again with all the knowledge I've gathered about ECS since. Though I have to say that redesigning my pathfinding system for a 4th time is putting me down, which is why I'm posting this! I also have not found a way to use a dynamic buffer without calling Reinterpret on it, an experimental feature that might of course get removed.

    3. One big static List<List<float3>> with the first dimension indexed by entity index. Since I never create or destroy entities, if I create my 40k units initially, then indices 0 to 40k of my list will always correspond to the right entity. The only problem with this method is writing a proper job to copy the data from that big list into a NativeArray<float3> with proper offsets. Here's the job:

    Code (CSharp):
            struct CopyDataJob : IJob {
                [WriteOnly] public NativeArray<float3> Waypoints;
                public NativeArray<int> waypointsLengthPerEntity;
                public NativeArray<int> slicesStart;

                public void Execute() {
                    int entityCount = PathfindingSystem.Waypoints.Count;
                    int offset = 0; // running total of waypoints copied so far
                    for (int i = 0; i < entityCount; ++i) {
                        List<float3> ways = PathfindingSystem.Waypoints[i];
                        int count = ways.Count;

                        // Entity i's slice starts where the previous one ended.
                        slicesStart[i] = offset;
                        waypointsLengthPerEntity[i] = count;

                        for (int j = 0; j < count; ++j)
                            Waypoints[offset + j] = ways[j];
                        offset += count;
                    }
                }
            }
    Then in the job with entities, I get the waypoints corresponding to that entity as follows:

    Code (CSharp):
    NativeArray<float3> waypoints = new NativeArray<float3>(Waypoints.GetSubArray(slicesStart[entity.Index], waypointsLengthPerEntity[entity.Index]), Allocator.Temp);
    The bottleneck here is the CopyDataJob job. It scales poorly past a few thousand entities, even though it's done in a job. The job itself can't be bursted since it references the List<List<float3>> object, and I also could not think of a way to parallelize it. (I don't even think that's possible, since the number of waypoints can differ per entity.)
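
    To illustrate what I mean in option 2, a minimal buffer element might look like the sketch below (names here are illustrative, not my actual code). With implicit conversion operators on the element, you can read and write float3s directly, so the Reinterpret call isn't even needed:

    Code (CSharp):
            // Illustrative buffer element wrapping one float3 waypoint.
            [InternalBufferCapacity(8)] // short paths stay inline in the chunk
            public struct Waypoint : IBufferElementData {
                public float3 Value;

                // Implicit conversions avoid calling Reinterpret<float3>() on the buffer.
                public static implicit operator float3(Waypoint w) => w.Value;
                public static implicit operator Waypoint(float3 v) => new Waypoint { Value = v };
            }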

    Does anyone have a solution that can scale up properly?
     
  2. Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,780
    1. As you found out, HashMap has no guaranteed ordering, so that is out of the window.

    3. By copying data from/to arrays, you will probably lose the whole benefit of arrays and their performance.
    That said, if you have a fixed number of waypoints, or a maximum number of waypoints, for example 20, you can nicely set up index offsets into one linear array. Then you don't really need to copy or resize the array at all. You just need to track which entity is associated with which offset index, if needed, and which slots are spare.

    2. Buffers are not as slow as they may seem. I use them quite a lot.

    Either way, your best bet is to run a stress test for each relevant approach while profiling performance, especially the 2nd one. For example, instead of your target of 10k or 20k, run 40k or 80k entities and see how they behave. You will see what your system's capacity is.

    Burst and Threaded Jobs, of course, when applicable.
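
    The fixed-offset layout from point 3 could be sketched like this (MaxWaypoints = 20 is just an example cap, names are illustrative):

    Code (CSharp):
            const int MaxWaypoints = 20; // example cap per entity

            // One flat allocation; entity i owns slots [i * MaxWaypoints, i * MaxWaypoints + count).
            NativeArray<float3> waypoints = new NativeArray<float3>(entityCount * MaxWaypoints, Allocator.Persistent);
            NativeArray<int> waypointCount = new NativeArray<int>(entityCount, Allocator.Persistent);

            void SetPath(int entityIndex, NativeArray<float3> path) {
                int baseIndex = entityIndex * MaxWaypoints;
                for (int j = 0; j < path.Length; ++j)
                    waypoints[baseIndex + j] = path[j];
                waypointCount[entityIndex] = path.Length; // overwrite in place, no copy or resize
            }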
     
  3. Radu392

    I just spent a few hours refactoring the code to work with Dynamic Buffers instead. Can confirm, much better than my other solutions.
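
    For anyone who finds this later, reading the buffer inside a bursted job looks roughly like this (a sketch with illustrative names, assuming a Waypoint buffer element with a float3 Value field; IJobForEachWithEntity_EB is the buffer variant of IJobForEachWithEntity):

    Code (CSharp):
            [BurstCompile]
            struct FollowWaypointsJob : IJobForEachWithEntity_EB<Waypoint> {
                public void Execute(Entity entity, int index, DynamicBuffer<Waypoint> waypoints) {
                    for (int i = 0; i < waypoints.Length; ++i) {
                        float3 target = waypoints[i].Value;
                        // ... per-waypoint pathfollowing logic ...
                    }
                }
            }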
     
    Antypodish likes this.