
Can there be a list/array of NativeArrays in a Job?

Discussion in 'Entity Component System' started by Inter-Illusion, Mar 19, 2019.

  1. Inter-Illusion

    Inter-Illusion

    Joined:
    Jan 5, 2014
    Posts:
    598
    Q: Is it possible to have an array of NativeArrays in a job?

    I have a job that takes the results of several other jobs and does some operations on them as a whole.

    Each of the initial jobs has its own NativeArray<float>.
    The second job needs to access all of them and operate on some of them (something like finding the max value and updating that one).

    I see two ways of doing it:
    1- Have an array of NativeArrays pointing to the ones from each previous job
    Code (csharp):
    // Silly example to show the idea
    public struct JOB_Update : IJob
    {
        public NativeArray<float> pos;
        public NativeArray<float> result;

        public void Execute() { result[0] = pos[0] + 3; }
    }

    public struct JOB_UpdateHigh : IJob
    {
        public NativeArray<float>[] positions; // each element is a copy of a JOB_Update.result

        public void Execute()
        {
            // Find the index of the max value in positions
            int index = 0;
            for (int i = 1; i < positions.Length; i++)
                if (positions[i][0] > positions[index][0]) index = i;
            positions[index][0] = 10;
        }
    }
    2- Have in-between jobs that copy the data from JOB_Update into JOB_UpdateHigh
    Code (csharp):
    public struct JOB_Update : IJob
    { ...
    }

    public struct JOB_CopyFloat : IJob
    {
        public NativeSlice<float> from;
        public NativeSlice<float> result;

        public void Execute()
        {
            result[0] = from[0];
        }
    }

    public struct JOB_UpdateHigh : IJob
    {
        public NativeArray<float> positions; // a JOB_CopyFloat moves each JOB_Update.result into positions[x]

        public void Execute()
        {
            // Find the index of the max value in positions
            int index = 0;
            for (int i = 1; i < positions.Length; i++)
                if (positions[i] > positions[index]) index = i;
            positions[index] = 10;
        }
    }
    Ideally, the first approach would be best, but given that jobs have restrictions on using plain C# arrays and lists, is it even supported?
    Is there a better way that doesn't need to spawn jobs just to copy data?
     
    Last edited: Mar 19, 2019
  2. Abbrew

    Abbrew

    Joined:
    Jan 1, 2018
    Posts:
    417
    Try using a flattened array: instead of n arrays of length k, use one array of length n * k, and store the elements of each array in its respective slot. For example,
    [a,a,a]
    [b,b,b]
    [c,c,c]
    becomes
    [a,a,a,b,b,b,c,c,c]
    Make sure to label the NativeArray inside of the job with [NativeDisableParallelForRestriction]
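
    A minimal sketch of that layout (the struct and names are invented for illustration, not the poster's actual code): one IJobParallelFor over the n blocks, where each Execute(index) writes its own k-element block of the shared n * k array. Writing outside the current index is exactly what [NativeDisableParallelForRestriction] permits, so the scheduling must guarantee the blocks never overlap.
    Code (csharp):
    using Unity.Collections;
    using Unity.Jobs;

    public struct JOB_UpdateBlocks : IJobParallelFor
    {
        // Flattened n * k array; the attribute relaxes the per-index
        // safety check so a job iteration can write its whole block.
        [NativeDisableParallelForRestriction]
        public NativeArray<float> all;
        public int blockLength; // k

        // index = which block (0..n-1)
        public void Execute(int index)
        {
            int offset = index * blockLength;
            for (int i = 0; i < blockLength; i++)
                all[offset + i] += 3f;
        }
    }

    // Scheduling, one iteration per block:
    // var all = new NativeArray<float>(n * k, Allocator.TempJob);
    // var handle = new JOB_UpdateBlocks { all = all, blockLength = k }.Schedule(n, 1);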
     
  3. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    4,271
    You can also use NativeSlice if you want to split the array for multiple jobs.
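
    A sketch of that idea (names illustrative): the producer job receives NativeSlice views into one shared array instead of owning its own allocation. Note that Unity's safety system tracks the underlying container, so independent jobs writing disjoint slices of the same array may still need chained dependencies.
    Code (csharp):
    using Unity.Collections;
    using Unity.Jobs;

    public struct JOB_Update : IJob
    {
        [ReadOnly] public NativeSlice<float> pos;
        public NativeSlice<float> result;

        public void Execute() { result[0] = pos[0] + 3f; }
    }

    // Carve non-overlapping views out of one array:
    // var all = new NativeArray<float>(n * k, Allocator.TempJob);
    // NativeSlice<float> first  = all.Slice(0, k);
    // NativeSlice<float> second = all.Slice(k, k);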
     
  4. Inter-Illusion

    Inter-Illusion

    Joined:
    Jan 5, 2014
    Posts:
    598
    Hi,
    Thanks a lot for the suggestions.

    I managed to solve my issue with a combination of both.
    Now, when scheduling all the jobs, I create one bigger NativeArray that contains the values I want to modify on each object (they are all flattened together).

    Then, for the jobs that need to read/write a random number of them, I pass that shared array along with another array holding the indices of the values the job should modify.
    And for the smaller objects that only interact with one or two values, I pass them as NativeSlices of the bigger array. That way, I don't have to allocate extra memory to hold the indices.

    I had to disable the safety protections for the bigger NativeArray, but it's easy to avoid race conditions by setting the right dependencies when scheduling the jobs.

    The bonus of using the bigger flattened array is that the values are laid out sequentially and grouped by object instead of fragmented across memory, so there are fewer cache misses when the jobs access the data, which gives a nice speed boost.
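
    The combined setup described above could look roughly like this (struct and field names are invented for illustration): one flattened array shared by every job, an index array for jobs touching an arbitrary subset, and slices for jobs touching a fixed range.
    Code (csharp):
    using Unity.Collections;
    using Unity.Jobs;

    // Touches an arbitrary subset of the shared values via an index list.
    public struct JOB_UpdateMax : IJob
    {
        [NativeDisableParallelForRestriction]
        public NativeArray<float> shared;           // flattened values for all objects
        [ReadOnly] public NativeArray<int> indices; // entries this job may modify

        public void Execute()
        {
            // Find the max among the listed entries and update it.
            int best = indices[0];
            for (int i = 1; i < indices.Length; i++)
                if (shared[indices[i]] > shared[best]) best = indices[i];
            shared[best] = 10f;
        }
    }

    // Jobs that only touch one or two values get a NativeSlice view instead,
    // e.g. a two-element window at some per-object offset:
    // var view = shared.Slice(objectOffset, 2);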

    Again, thanks a lot for the suggestions.
    Frank