
How to Chain System Group Job Dependencies?

Discussion in 'Entity Component System' started by gilley033, Aug 17, 2018.

  1. gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,181
    tl;dr: I need a way to chain groups of systems together so that the jobs from systems in the same group run in parallel, but Group B runs after Group A, Group C runs after Group B, and so on. Is this possible?

    Edit: I know UpdateBefore/UpdateAfter exist, but I need to manually update the systems so that I can run them multiple times in the same frame (once for each physics tick, the same way FixedUpdate is called multiple times in the standard Unity model), so I don't think I can use those attributes.

    Long Version:

    I am running into a common scenario in my port of a custom 2D physics engine to the new ECS model: I have multiple systems performing the same task, but in slightly different ways depending on which components an entity has.

    For example, consider the task of calculating the Axis Aligned Bounding Box of a physics object. The goal (calculating the AABB) is the same, but the calculation is different for circles, boxes, etc.

    My first crack at this problem was a single system with multiple jobs, one for each shape. The problem is that, as I understand it, multiple jobs within one system have to be scheduled as dependent on each other. However, since a physics object can only be one shape, we know that each job operates on a disjoint set of entities, so the jobs aren't actually dependent on each other and could run in parallel.

    Getting around this issue is simple. Create separate systems, one for each type of shape, and organize those systems into a single group using the UpdateInGroup attribute.

    GroupA
    System A1 : Calculate AABB for Circles
    System A2 : Calculate AABB for Boxes
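    A minimal sketch of this grouping, assuming the 2018-era Entities preview API, where an update group is just an empty marker class (the system and group names here are hypothetical):

    Code (CSharp):
    using Unity.Entities;
    using Unity.Jobs;

    // Hypothetical marker class; in this API a "group" is any empty class
    // referenced from UpdateInGroup.
    public class GroupA { }

    [UpdateInGroup(typeof(GroupA))]
    public class CircleAABBSystem : JobComponentSystem
    {
        protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            // Schedule the circle AABB job here.
            return inputDeps;
        }
    }

    [UpdateInGroup(typeof(GroupA))]
    public class BoxAABBSystem : JobComponentSystem
    {
        protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            // Schedule the box AABB job here.
            return inputDeps;
        }
    }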

    Now here comes the problem. If I have another group of systems (Group B) that read from the AABB component, and are thus dependent on Group A, which writes to the AABB component, how do I schedule those systems' jobs so that they run after the first group but in parallel with each other (again, the AABB-reading systems are guaranteed not to be dependent on each other)?

    It's important to note that while the systems in the first group, which wrote to the AABB, differed in which Shape component they required, the systems in the second group, which read from the AABB, might differ in some other component (not Shape). So it's not as simple as chaining the job dependencies based on shape.

    Edit: Also important: I will be running these systems manually in my own loop, because each iteration of the systems corresponds to a "physics tick," and multiple ticks will run in a single frame. So just using the UpdateAfter/UpdateBefore attributes will not work, since I am not using Unity's automatic update loop.
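    To make the manual loop concrete, here is a rough sketch, assuming the 2018-era API (World.Active.GetOrCreateManager and a public Update() on systems) and assuming the physics systems are kept out of the automatic loop (e.g. with [DisableAutoCreation]). The driver, system names, and tick count are hypothetical:

    Code (CSharp):
    using Unity.Entities;

    // Hypothetical driver that re-runs the physics systems once per physics
    // tick, possibly several times per rendered frame.
    public class PhysicsTickDriver : ComponentSystem
    {
        protected override void OnUpdate()
        {
            var circleAABB = World.Active.GetOrCreateManager<CircleAABBSystem>();
            var boxAABB = World.Active.GetOrCreateManager<BoxAABBSystem>();

            int ticksThisFrame = 2; // however many fixed steps have accumulated

            for (int tick = 0; tick < ticksThisFrame; tick++)
            {
                // "Group A": each Update() schedules that system's jobs.
                circleAABB.Update();
                boxAABB.Update();
                // "Group B" systems (the AABB readers) would be updated here,
                // after Group A, once per tick.
            }
        }
    }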
     
    Last edited: Aug 17, 2018
  2. 5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    I think they do not need to be dependent on each other. I have one system with five jobs that are unrelated to each other, so all of them use only that one OnUpdate argument `JobHandle` when scheduling. Then I combine all the handles with JobHandle.CombineDependencies before returning, so that the whole batch can be another system's input deps. JobHandle.CombineDependencies does not combine them in a dependent way; it just bundles them up into one unit with no change in dependency.
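    A minimal sketch of this pattern, assuming the 2018-era JobComponentSystem API; the components and jobs are hypothetical stand-ins:

    Code (CSharp):
    using Unity.Collections;
    using Unity.Entities;
    using Unity.Jobs;

    // Hypothetical components; the two jobs below touch disjoint component types.
    public struct Drag : IComponentData { public float Value; }
    public struct Velocity2D : IComponentData { public float X, Y; }
    public struct Lifetime : IComponentData { public float Seconds; }
    public struct Age : IComponentData { public float Seconds; }

    struct ApplyDragJob : IJobProcessComponentData<Drag, Velocity2D>
    {
        public void Execute([ReadOnly] ref Drag drag, ref Velocity2D velocity)
        {
            velocity.X *= drag.Value;
            velocity.Y *= drag.Value;
        }
    }

    struct AgeJob : IJobProcessComponentData<Lifetime, Age>
    {
        public void Execute([ReadOnly] ref Lifetime lifetime, ref Age age)
        {
            // Advance age, clamped to the entity's lifetime.
            age.Seconds += 1f / 60f;
            if (age.Seconds > lifetime.Seconds)
                age.Seconds = lifetime.Seconds;
        }
    }

    public class IndependentJobsSystem : JobComponentSystem
    {
        protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            // Each job depends only on inputDeps, not on the other job,
            // so the scheduler is free to run them in parallel.
            JobHandle drag = new ApplyDragJob().Schedule(this, inputDeps);
            JobHandle age = new AgeJob().Schedule(this, inputDeps);

            // CombineDependencies does not serialize the two jobs; it returns
            // one handle that completes when both are done, which downstream
            // systems then receive as their inputDeps.
            return JobHandle.CombineDependencies(drag, age);
        }
    }

    Note that this only works cleanly because the two jobs touch disjoint component types; as comes up later in this thread, the safety system tracks dependencies per ComponentType, not per entity.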
     
    Singtaa likes this.
  3. gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,181
    Yes, you are right, the jobs from the same group are not dependent on each other. I did not know about JobHandle.CombineDependencies. That is exactly what I am looking for, thank you so much!

    Edit: Actually, I see now you are saying to have multiple jobs in the same system and then combine the dependencies in the OnUpdate method. That makes sense, though the reason I did not think to do this before is this piece of text from the documentation:
    I took that to mean you should chain the jobs within the same system together, but is it actually just saying that all jobs in the system must use the input dependencies when they are scheduled?

    One final thought: does this work with IJobProcessComponentData as well? I.e., can I have multiple structs that implement IJobProcessComponentData in a single system? I just wonder if it's different because you pass in `this` as the first parameter, so I am worried it won't know how to deal with the same `this` being passed in for multiple scheduled jobs.
     
    Last edited: Aug 18, 2018
  4. 5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Yes, as a rule all jobs should at least use the input deps, to make the dependency chain work. If they don't, the input deps of a later-running system lose their meaning, because they no longer cover all the dependencies of the prior systems. Making jobs additionally depend on each other is for cases like two jobs writing to the same ComponentDataArray, or when you need a specific read-write order.

    CombineDependencies can take up to three handles at once, but you can apply it again and again until you have bundled up all of your handles (see the sketch at the end of this post).

    Yes, you can have multiple IJobProcessComponentData jobs in one system; they work the same way. When scheduling an IJobProcessComponentData job there is still an overload with an optional `JobHandle` argument next to the mandatory `this` argument, and you should still pass the input deps there.

    What is `this` for? Because this type of job can seemingly get its data from nowhere, the real implementation has to add a component group to some system to obtain that data. Using `this` works as if you created an injected group on the `this` system with the components from the generic parameters and fed them to a normal job with a loop. Anyway, that means `this` is unrelated to the dependency chain; you are still required to depend on the input deps.
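    A sketch of that repeated pairwise combining (the array overload mentioned in the next post makes the loop unnecessary); the helper name is hypothetical:

    Code (CSharp):
    using Unity.Jobs;

    static class JobHandleFolding
    {
        // Fold any number of handles into one by combining two at a time.
        public static JobHandle CombineAllPairwise(params JobHandle[] handles)
        {
            JobHandle combined = default(JobHandle);
            foreach (JobHandle handle in handles)
                combined = JobHandle.CombineDependencies(combined, handle);
            return combined;
        }
    }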
     
    Last edited: Aug 18, 2018
  5. gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,181
    Thanks!
     
  6. Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    Actually there is an array version, so it really is as many JobHandles as you like.
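    For reference, a sketch of the array version: JobHandle.CombineDependencies also accepts a NativeArray&lt;JobHandle&gt;, so any number of handles can be combined in one call (the helper here is hypothetical):

    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;

    static class JobHandleUtil
    {
        // Combine any number of handles in a single call via the
        // NativeArray overload of JobHandle.CombineDependencies.
        public static JobHandle CombineAll(JobHandle[] jobs)
        {
            var handles = new NativeArray<JobHandle>(jobs.Length, Allocator.Temp);
            for (int i = 0; i < jobs.Length; i++)
                handles[i] = jobs[i];

            JobHandle combined = JobHandle.CombineDependencies(handles);
            handles.Dispose();
            return combined;
        }
    }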
     
    zhuchun, T-Zee and 5argon like this.
  7. gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,181
    @Joachim_Ante

    Hmm, even though my jobs operate on different entities, it looks like you cannot combine the dependencies if you are writing to the same component type in each job. You get this error:

    Should this be happening? Here are what the two jobs look like:

    Code (CSharp):
    [BurstCompile]
    [RequireSubtractiveComponent(typeof(Heading2D), typeof(Scale2D))]
    struct TransToMatrix : IJobProcessComponentData<RenderPosition2D, TransformMatrix>
    {
        //[ReadOnly] public ComponentDataArray<RenderPosition2D> positions;
        //public ComponentDataArray<TransformMatrix> matrices;

        public void Execute([ReadOnly] ref RenderPosition2D position, ref TransformMatrix matrix)
        {
            float2 positionV = position.Value;
            matrix = new TransformMatrix
            {
                Value = float4x4.translate(new float3(positionV.x, positionV.y, 0.0f))
            };
        }
    }
    Code (CSharp):
    [BurstCompile]
    [RequireSubtractiveComponent(typeof(Heading2D))]
    struct ScaleTransToMatrix : IJobProcessComponentData<RenderPosition2D, Scale2D, TransformMatrix>
    {
        //[ReadOnly] public ComponentDataArray<RenderPosition2D> positions;
        //[ReadOnly] public ComponentDataArray<Scale2D> scales;
        //public ComponentDataArray<TransformMatrix> matrices;

        public void Execute([ReadOnly] ref RenderPosition2D position, [ReadOnly] ref Scale2D scale, ref TransformMatrix matrix)
        {
            float2 positionV = position.Value;
            float2 scaleV = scale.Value;
            float4x4 pos = float4x4.translate(new float3(positionV.x, positionV.y, 0.0f));
            matrix = new TransformMatrix
            {
                Value = math.mul(pos, float4x4.scale(new float3(scaleV.x, scaleV.y, 1f)))
            };
        }
    }

    Will this ever be possible in the future?
     
  8. Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    @gilley033

    That's definitely by design. Currently, dependencies are determined by read/write access to ComponentTypes, to allow parallelism for multiple readers. So even though these jobs are exclusive due to subtractive components, the job dependency system can't know that, and changing that would have a non-scalable performance cost for determining dependencies. So we won't do that. That said, using the ArchetypeChunk-based API you can disable the safety system between the buffers and basically go all unsafe. But I think in practice this is rarely necessary; I would hope you have many other jobs that can run in parallel.
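    Not the ArchetypeChunk API itself, but for illustration, this is the kind of safety opt-out being described here: a sketch using the NativeDisableContainerSafetyRestriction attribute, where the job and buffer are hypothetical and you take full responsibility for guaranteeing that concurrent jobs really touch disjoint entities:

    Code (CSharp):
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Jobs;

    // Hypothetical job that opts one container out of the safety checks, so
    // two such jobs can write the same buffer type concurrently. Correctness
    // (writing disjoint ranges/entities) becomes entirely your responsibility.
    struct UnsafeDisjointWriteJob : IJobParallelFor
    {
        [NativeDisableContainerSafetyRestriction]
        public NativeArray<float> Buffer;

        public void Execute(int index)
        {
            Buffer[index] = 0f;
        }
    }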
     
  9. gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,181
    Hmm, I guess I am just looking to make anything and everything run in parallel if possible. Isn't that one of the main points of this whole ECS/Jobs paradigm shift?

    If I understand you correctly, the reason it would hinder performance is that you'd have to add a check to see whether the different jobs are operating on different entities, whereas right now it just checks whether they are operating on the same ComponentType.

    I get that, but wouldn't it be possible to add a special IJob (or maybe System) that automatically combines exclusive jobs into one? You wouldn't have to change the way dependencies are determined, because all the other jobs being scheduled would just see the combined job, which would sum the dependencies of the exclusive jobs, e.g., the combined job reads from ComponentTypes X and Y and writes to ComponentType Z. But within the special job there would be an iterator per EntityArchetype, which would allow the same component type to be written to in parallel.

    Something like this:

    Code (CSharp):
    public struct SomeJob : IJobExclusive<RenderPosition2D, Heading2D, TransformMatrix>
    {
        public void Execute(ref RenderPosition2D position, ref TransformMatrix matrix)
        {
        }

        public void Execute(ref RenderPosition2D position, ref TransformMatrix matrix, ref Heading2D heading)
        {
        }
    }
    Maybe you'd have to define Execute methods for every possible combination, though it would be better if you could ignore certain combinations that you know your game does not need.

    I don't know, perhaps this doesn't make a lick of sense given the way the ECS/Job system is set up. It just seems like this sort of special case could take advantage of the fact that different archetypes are laid out in different chunks.

    Maybe I am crazy, but I see this pattern of combining components in different ways to determine logic a lot, as a means of converting OOP code that relies on polymorphism into ECS code. That is, I tend to have a lot of systems designed around doing some general thing, where the combination of components determines the specifics of how that thing is done (see the AABB calculation example in the OP). In these cases, the different jobs triggered by the different component combinations are exclusive of each other, and thus seem like logical fits for parallelism.

    I will look into working with the ArchetypeChunk-based API.

    Thanks!