Best way to iterate over each entity in a ComponentGroup in a Job

Discussion in 'Entity Component System' started by PublicEnumE, Mar 16, 2019.

  1. PublicEnumE

    Joined:
    Feb 3, 2019
    Posts:
    729
    I'd like to iterate over each Entity of a ComponentGroup, and do some work on each.

    It'd be great to know the most efficient way to do this (in March 2019), with as much parallelism as possible.

    What's the most up-to-date way of doing this in a Job? Right now, I'm using IJobChunk, like this:

    Code (CSharp):
    private struct Job : IJobChunk
    {
        public ArchetypeChunkEntityType archetypeChunkEntityType;

        public void Execute(ArchetypeChunk chunk, int chunkIndex, int firstEntityIndex)
        {
            NativeArray<Entity> entities = chunk.GetNativeArray(archetypeChunkEntityType);

            for (int i = 0; i < entities.Length; i++)
            {
                Entity entity = entities[i];

                // do work here
            }
        }
    }
    ...But this seems to leave some parallelism on the table: it looks like I could be processing each of these entities in a separate, concurrent Job.

    Is there another Job type, or API, that would let me do that? Or is there another approach in general which is better these days?

    Thanks!

    P.S. I don't think I can use IJobProcessComponentDataWithEntity for this, because I'm constructing the ComponentGroups at runtime, and they can involve an unbounded number of Component types.
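
    For reference, here is a minimal sketch of how a job like the one above gets scheduled from a system, assuming the early-2019 Entities preview API (m_Group is a hypothetical ComponentGroup field created in OnCreateManager). Scheduling it this way gives one Execute call per chunk, with chunks spread across worker threads:

    Code (CSharp):
    protected override JobHandle OnUpdate(JobHandle inputDeps)
    {
        var job = new Job
        {
            // type handle for reading the Entity array out of each chunk
            archetypeChunkEntityType = GetArchetypeChunkEntityType()
        };

        // one Execute per chunk, distributed across worker threads
        return job.Schedule(m_Group, inputDeps);
    }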
     
  2. ilih

    Joined:
    Aug 6, 2013
    Posts:
    1,416
    Jobs can be executed against a ComponentGroup by using ScheduleGroup instead of Schedule:
    job.ScheduleGroup(group, inputDeps);
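
    A rough sketch of what that looks like (Translation is only a placeholder component here; any IComponentData works, and the surrounding system code and names are assumptions):

    Code (CSharp):
    [BurstCompile]
    private struct MoveJob : IJobProcessComponentDataWithEntity<Translation>
    {
        public void Execute(Entity entity, int index, ref Translation translation)
        {
            // do work here
        }
    }

    // in the system's OnUpdate, with "group" built at runtime:
    JobHandle handle = new MoveJob().ScheduleGroup(group, inputDeps);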
     
  3. PublicEnumE

    Yes, but with IJobProcessComponentDataWithEntity, you still have to specify at least one ComponentData type as a generic argument.

    There's no IJobProcessEntities type of interface...is there?? :D
     
  4. Micz84

    Joined:
    Jul 21, 2012
    Posts:
    451
    Unless processing one entity is very computationally intensive, there would be no point in running the calculation for each entity on a separate thread. You can't have more concurrently running jobs than you have cores in your system, so even if you started 100 jobs in parallel at the same time, only a few of them would actually be running. Moreover, starting that many jobs would not be good, because starting a job has its own cost.
     
  5. PublicEnumE

    Very true. But the way I'm writing it now, there is zero parallelism between entities, only between chunks.

    But at the end of the day, I just want to make sure I'm using the most efficient, up-to-date approach that Unity intends. :) Is the IJobChunk example at the top currently the best way?
     
  6. Micz84

    If you have enough chunks, it will run in parallel. There is no built-in job I know of that is more granular. You could create your own, but I do not think starting a job per entity would be efficient. You could test it, though.
     
  7. PublicEnumE

    Thanks!

    "I do not think starting a job per entity would be efficient"

    I would be a fool to argue. I'm guessing IJobProcessComponentData isn't doing that either.
     
  8. Micz84

    IJobProcessComponentData is chunk iteration under the hood.
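
    Illustratively, what it does per chunk behaves roughly like a hand-written IJobChunk loop. This is a simplification, not the actual package source; Translation, translationType, and jobData are placeholder names:

    Code (CSharp):
    // per chunk, not per entity:
    NativeArray<Translation> data = chunk.GetNativeArray(translationType);
    for (int i = 0; i < data.Length; i++)
    {
        // NativeArray elements can't be passed by ref directly,
        // so copy out, call the user's Execute, copy back
        Translation value = data[i];
        jobData.Execute(ref value);
        data[i] = value;
    }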
     
  9. AndesSunset

    Joined:
    Jan 28, 2019
    Posts:
    60
    I'm absolutely being pedantic here, but this doesn't actually imply anything about the Job structure. You can use chunk iteration with no jobs at all. Or you could fire off your own IJobs at whatever granularity level you might want.
     
  10. PublicEnumE

    Yes, but in this case they are only using one Job per chunk (so says the source).
     
  11. Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    We process per chunk because we found that this actually gives the best scalability and performance. There is a lot of overhead in splitting work too finely, which makes splitting below chunk boundaries perform worse.

    Is there a specific performance issue you are observing in the profiler?
     
  12. PublicEnumE

    Absolutely not. :D Things change fast around here. I just want to make sure I'm doing this the best way. :)

    Thanks for the helpful comments on here, I appreciate the advice.

    @Joachim_Ante Good luck next week!
     
    Last edited: Mar 16, 2019