Using IJobProcessComponentDataWithEntity results in same JobComponentSystem's OnUpdate()

Discussion in 'Entity Component System' started by yossi_horowitz_artie, Mar 1, 2019.

  1. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    Is there a way to implement algorithms like these using JobComponentSystem/IJobProcessComponentDataWithEntity?


    Code (Pseudocode):
    foreach entity with TagComponent and without TagComponentState
        - Add TagComponentState to entity
        - If tagComponent.AssociatedEntity doesn't have Buffer
            - Add Buffer to tagComponent.AssociatedEntity
        - Add entity to tagComponent.AssociatedEntity's buffer
    Or, even more simply:

    Code (Pseudocode):
    foreach entity with TagComponent and without TagComponentState
        - Add TagComponentState to entity
        - If tagComponent.AssociatedEntity doesn't have TagComponentCounterComponent
            - Add TagComponentCounterComponent to tagComponent.AssociatedEntity
        - Increment tagComponent.AssociatedEntity's TagComponentCounterComponent.Counter value
    I don't see how I could do it using EntityCommandBuffer, since you can't queue anything conditional inside it.

    I could make my own queue of commands in a native container, populate it in the job, and loop through it in JobComponentSystem.OnUpdate() after the job is done, but I don't see any examples that call jobHandle.Complete() and then process the job's results in OnUpdate().

    I could queue up events as _new_ entities (with "RequestRegisterTagComponent" Components) in EntityCommandBuffer, establish a barrier, and process those event entities to apply the buffer setup/counter setup in a different System's non-Jobified OnUpdate() later. But that extra layer of indirection wouldn't actually get me anything; it seems to me that I might as well not try to jobify the above algorithms in the first place.

    Is there another solution?
     
  2. Piefayth

    Piefayth

    Joined:
    Feb 7, 2017
    Posts:
    61
    If you need to check components conditionally like that, you typically want to use chunk iteration.

    Essentially, you will say

    -> Schedule a new IJobChunk on a ComponentGroup selecting TagComponent / Subtractive<TagComponentState>
    -> For each chunk, you can call chunk.Has(ArchetypeChunkComponentType<T>) or chunk.Has(ArchetypeChunkBufferType<T>) to see if it has the component / buffer you expect
    -> If it doesn't, add the component / buffer via EntityCommandBuffer
    -> Set the component or buffer data via the EntityCommandBuffer (I just learned as I was typing this that you can .SetBuffer() from ecb!)
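    A rough sketch of those steps (the job and field names here are hypothetical, taken from the pseudocode above, and the signatures assume the preview-era Entities API):

    Code (CSharp):
    struct RegisterTagJob : IJobChunk
    {
        public EntityCommandBuffer.Concurrent CommandBuffer;
        [ReadOnly] public ArchetypeChunkEntityType EntityType;
        [ReadOnly] public ArchetypeChunkComponentType<TagComponentCounterComponent> CounterType;

        public void Execute(ArchetypeChunk chunk, int chunkIndex, int firstEntityIndex)
        {
            var entities = chunk.GetNativeArray(EntityType);

            // chunk.Has answers for the chunk as a whole: every entity in a
            // chunk shares the same archetype, so either all of them have the
            // component or none of them do.
            bool hasCounter = chunk.Has(CounterType);

            for (int i = 0; i < chunk.Count; i++)
            {
                CommandBuffer.AddComponent(chunkIndex, entities[i], default(TagComponentState));
                if (!hasCounter)
                    CommandBuffer.AddComponent(chunkIndex, entities[i], default(TagComponentCounterComponent));
            }
        }
    }

    Scheduled on a ComponentGroup selecting TagComponent / Subtractive<TagComponentState>, with the command buffer obtained from a barrier system.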
     
  3. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    Piefayth -- thanks for your reply!

    That wouldn't work because chunk.Has() can't tell me about the contents of the EntityCommandBuffer; it can only tell me about the state of the world that already exists.

    If iteration 1 of the Job causes
    buffer.AddComponent(associatedEntity, new TagComponentCounterComponent() {Counter = 1} )


    Iteration 2 would need to cause
    buffer.SetComponent(associatedEntity, new TagComponentCounterComponent() {Counter = 2} )


    based not on the contents of the World, but on the contents of the EntityCommandBuffer, which is, as far as I know, impossible.
     
  4. Piefayth

    Piefayth

    Joined:
    Feb 7, 2017
    Posts:
    61
    Oh, yeah, you're probably better off running a job that populates a NativeHashMap<Entity, TagComponentCounterComponent> and then a second job that dumps all the data from the final state of the map into the ECB.
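    Sketched out, the second job might look something like this (names hypothetical; it runs as a plain IJob so it can safely read the whole map at once, and it assumes GetKeyArray is available in your Collections version):

    Code (CSharp):
    struct ApplyCountersJob : IJob
    {
        [ReadOnly] public NativeHashMap<Entity, TagComponentCounterComponent> Counters;
        public EntityCommandBuffer CommandBuffer;

        public void Execute()
        {
            // Drain the final state of the map into the command buffer.
            var keys = Counters.GetKeyArray(Allocator.Temp);
            for (int i = 0; i < keys.Length; i++)
            {
                TagComponentCounterComponent counter;
                Counters.TryGetValue(keys[i], out counter);
                CommandBuffer.AddComponent(keys[i], counter);
            }
            keys.Dispose();
        }
    }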
     
  5. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    That sounds perfect! Thank you!
     
  6. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    (I guess the answer to the question in the title of the thread is "surely there's a way not to have to.")
     
  7. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    Okay, so now I'm doing this:

    Code (CSharp):
    // Simplified and pseudocode:

    OnUpdate(dependencies)
    {
        hashMap = new HashMap();

        jobThatPopulatesHashMap = new IJobProcessComponentDataThatPopulatesHashMap() { HashMap = hashMap.ToConcurrent() };
        dependencies = jobThatPopulatesHashMap.Schedule(this, dependencies);

        jobThatIteratesThroughHashMap = new IJobNativeMultiHashMapVisitKeysThatIteratesThroughHashMap() { HashMap = hashMap };
        dependencies = jobThatIteratesThroughHashMap.Schedule(this, dependencies);

        return dependencies;
    }
    This yields the error message --
    InvalidOperationException: The previously scheduled job ProcessComponentDataJobThatPopulatesHashMap writes to the NativeArray ProcessComponentDataJobThatPopulatesHashMap.HashMap. You must call JobHandle.Complete() on the job ProcessComponentDataJobThatPopulatesHashMap, before you can read from the NativeA


    This is a very clear and straightforward error message, but it leaves me with two questions:

    1) Isn't the whole point of passing these dependencies around and into the job Schedule() functions to spare us hand-written sync points like this on the main thread? I figured that if Job B depends on Job A, the scheduler simply wouldn't run Job B until Job A is done.

    2) As the error message suggests, I can bypass the problem by calling dependencies.Complete() before scheduling the second job. But is that allowed/safe/reasonable? I haven't seen Complete() calls on job handles returned by IJobProcessComponentData in any JobComponentSystem.OnUpdate() example code; I figured that if you're using a JobComponentSystem, you're supposed to let the system handle everything related to jobs beyond scheduling them and specifying their dependencies.
     
    Last edited: Mar 12, 2019
  8. sngdan

    sngdan

    Joined:
    Feb 7, 2014
    Posts:
    1,154
    1 - It should work, but you would have to show your real code for the community to review it.
    2 - You can, and sometimes have to, do this. From your pseudocode, though, it looks like it shouldn't be required in your case.
     
  9. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    For anyone curious, the problem was that I was reading from the hashmap in the Schedule() function for the second job (it's a custom job), and the Schedule() function of course runs on the main thread.
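    In other words: a custom Schedule() helper may only store the container into the job struct; any reads of it belong in Execute(), which the scheduler won't run until the producing job has completed. A minimal sketch of the corrected shape (names hypothetical):

    Code (CSharp):
    struct VisitKeysJob : IJob
    {
        [ReadOnly] public NativeHashMap<Entity, TagComponentCounterComponent> HashMap;

        public void Execute()
        {
            // Safe: by the time Execute runs, the job that fills HashMap is done.
            var keys = HashMap.GetKeyArray(Allocator.Temp);
            // ... visit keys ...
            keys.Dispose();
        }

        public static JobHandle ScheduleVisit(
            NativeHashMap<Entity, TagComponentCounterComponent> map, JobHandle deps)
        {
            // No map.Length or GetKeyArray here: touching the map on the main
            // thread while a writer job is in flight triggers the safety error.
            return new VisitKeysJob { HashMap = map }.Schedule(deps);
        }
    }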
     
  10. GilCat

    GilCat

    Joined:
    Sep 21, 2013
    Posts:
    676
    Here you should pass the concurrent version of the hashmap, since you will be writing to it from a parallel job. I can't tell if you are doing that.
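    The writer side would then hold the concurrent view, e.g. (hypothetical names; note that NativeHashMap<K, V>.Concurrent only supports TryAdd, so a counter that actually needs incrementing would want a NativeMultiHashMap or similar instead):

    Code (CSharp):
    struct PopulateHashMapJob : IJobProcessComponentDataWithEntity<TagComponent>
    {
        public NativeHashMap<Entity, TagComponentCounterComponent>.Concurrent HashMap;

        public void Execute(Entity entity, int index, [ReadOnly] ref TagComponent tag)
        {
            // TryAdd is the only write the concurrent view allows; it returns
            // false if the key is already present.
            HashMap.TryAdd(tag.AssociatedEntity, new TagComponentCounterComponent { Counter = 1 });
        }
    }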
     
  11. yossi_horowitz_artie

    yossi_horowitz_artie

    Joined:
    Jan 30, 2019
    Posts:
    87
    I am; I've updated the sample code to clarify that.
     
    GilCat likes this.