
Incorrect "Previous Scheduled Job" Error?

Discussion in 'Entity Component System' started by PublicEnumE, Apr 4, 2019.

  1. PublicEnumE

    PublicEnumE

    Joined:
    Feb 3, 2019
    Posts:
    729
    When I schedule this job:

    Code (CSharp):
    private struct TestJob : IJob
    {
        public DynamicBuffer<MyElementData> buffer;

        public void Execute()
        {
        }
    }
    with this code (in which each entry in
    myEntities
    is guaranteed to be unique):

    Code (CSharp):
    for (int i = 0; i < myEntities.Length; i++)
    {
        Entity myEntity = myEntities[i];

        DynamicBuffer<MyElementData> buffer = EntityManager.GetBuffer<MyElementData>(myEntity);

        TestJob testJob = new TestJob()
        {
            buffer = buffer
        };

        testJob.Schedule(inputDeps);
    }
    I get the following error:

    "InvalidOperationException: The previously scheduled job MySystem:TestJob writes to the NativeArray TestJob.buffer. You must call JobHandle.Complete() on the job MySystem:TestJob, before you can read from the NativeArray safely."

    I'm not sure why this would be. If each Entity in myEntities is unique, then a different DynamicBuffer should be passed into each Job. Unity shouldn't think that any two Jobs are trying to read or write to the same DynamicBuffer.

    Am I misunderstanding something about how job dependencies work? Thanks!
     
  2. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    I think job dependencies are tracked per type, so the safety system doesn't know exactly which buffer is being written.
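    To make "per type" concrete: even though every entity's buffer holds different data, all DynamicBuffer<MyElementData> instances hand out the same per-type safety handle. Here is a sketch (not a fix) of the loop from the first post, annotated with where the check presumably fires:

    Code (CSharp):
    // All DynamicBuffer<MyElementData> instances share one safety handle,
    // regardless of which entity they came from.
    for (int i = 0; i < myEntities.Length; i++)
    {
        // Main-thread read of that per-type handle. From the second iteration
        // on, a previously scheduled TestJob (a writer on the same handle) is
        // still in flight and was never completed, so this call throws the
        // exception quoted above.
        DynamicBuffer<MyElementData> buffer = EntityManager.GetBuffer<MyElementData>(myEntities[i]);

        // Registers TestJob as a writer on the DynamicBuffer<MyElementData>
        // handle; the returned JobHandle is discarded, so nothing ever
        // declares or completes that dependency.
        new TestJob { buffer = buffer }.Schedule(inputDeps);
    }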
     
    PublicEnumE likes this.
  3. PublicEnumE

    PublicEnumE

    Joined:
    Feb 3, 2019
    Posts:
    729
    Could you explain that a bit more? I'm having trouble following.

    I'm starting multiple, individual Jobs with different data assignments. Why would Unity treat them like they're connected at all?

    Are you saying that Unity sees I have multiple TestJobs running, and assumes they must all be working on the same data? That doesn't sound right...

    Thanks!
     
    Last edited: Apr 4, 2019
  4. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    That interconnection is the safety handle inside your DynamicBuffer (and inside other things like a NativeArray obtained from an ArchetypeChunk). Your IJobs look separate, but they actually aren't, because you can't create a DynamicBuffer out of thin air; it comes from the ECS database, and the database won't hand you a slice of itself without the safety handle tagging along. When a job is scheduled, its public fields are inspected for dependencies, and that is where the safety handle is found.
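    In other words, the safety handle travels inside the DynamicBuffer field into every job you schedule. A minimal sketch (assuming the same TestJob, myEntities and inputDeps as in the first post) of the most literal way to satisfy it, which is exactly what the error message asks for, at the cost of stalling the main thread:

    Code (CSharp):
    JobHandle handle = inputDeps;

    for (int i = 0; i < myEntities.Length; i++)
    {
        // Finish the previously scheduled TestJob before touching
        // DynamicBuffer<MyElementData> on the main thread again.
        handle.Complete();

        DynamicBuffer<MyElementData> buffer = EntityManager.GetBuffer<MyElementData>(myEntities[i]);

        handle = new TestJob { buffer = buffer }.Schedule();
    }

    return handle;
    This is safe but fully serial; a sketch that keeps the lookups off the main thread entirely follows the last reply below.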
     
    PublicEnumE likes this.
  5. PublicEnumE

    PublicEnumE

    Joined:
    Feb 3, 2019
    Posts:
    729
    Thanks for the reply, but I still don’t get it.

    No two Jobs are actually trying to write to the same DynamicBuffer. So why would a safety check think this is happening?

    Thanks for all the explanations.
     
  6. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Because the dependency is per type, ECS cannot possibly know whether it is the "same DynamicBuffer". It only knows, through the component job manager (which distributes the safety handle to reference-like structs such as NativeArray or DynamicBuffer), that the type DynamicBuffer<MyElementData> is in a job somewhere.

    If you were the EntityManager (EM), you would have to worry: if you allow the Get, could this memory suddenly change at any time (potentially in between lines of code, because jobs run on other threads)? Therefore the only safe solution is for the EM to complete all jobs that have a DynamicBuffer<MyElementData> among their public fields.

    Why can't ECS detect exactly which native container is being written, instead of using a per-type solution like this? That would be ideal, of course, but remember that these native containers map directly to a packed memory area that is consistent and calculable per type, so tracking safety per type is likely the most efficient design. The "which container" question is also hard to answer because the containers were never real in the first place (the word "container" might make you think you are handed a separate subset of the data). They are just glorified memory pointers into the big shared ECS database, with some length, bounds, and safety checks on top.
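
    Building on that, the usual way around the per-type check is to never touch the buffer on the main thread while jobs are in flight: pass a BufferFromEntity<MyElementData> lookup into the job, resolve the buffer inside Execute(), and chain each JobHandle so every writer declares the previous one as its dependency. A hedged sketch, assuming the same JobComponentSystem, myEntities and MyElementData as in the first post:

    Code (CSharp):
    private struct TestJob : IJob
    {
        public Entity entity;

        // Per-type lookup; it still carries the shared safety handle, but the
        // chained handles below declare the required ordering.
        public BufferFromEntity<MyElementData> bufferFromEntity;

        public void Execute()
        {
            DynamicBuffer<MyElementData> buffer = bufferFromEntity[entity];
            // ... read/write the buffer here, off the main thread ...
        }
    }

    protected override JobHandle OnUpdate(JobHandle inputDeps)
    {
        BufferFromEntity<MyElementData> bufferFromEntity = GetBufferFromEntity<MyElementData>(false);

        for (int i = 0; i < myEntities.Length; i++)
        {
            // Each Schedule takes the previous handle as its dependency, so
            // the writers on the DynamicBuffer<MyElementData> handle are
            // properly ordered and the main thread never reads the buffer.
            inputDeps = new TestJob
            {
                entity = myEntities[i],
                bufferFromEntity = bufferFromEntity
            }.Schedule(inputDeps);
        }

        return inputDeps;
    }
    The jobs still run one after another because they all write through the same per-type handle; if a job only reads, requesting the lookup with GetBufferFromEntity<MyElementData>(true) and marking the field [ReadOnly] lets readers overlap.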