Jobs with GameObjects: Am I doing it right?

Discussion in 'Entity Component System' started by wx3labs, Dec 6, 2020.

  1. wx3labs


    Joined:
    Apr 5, 2014
    Posts:
    77
In my current project (a 2D space game) I am sticking with traditional MonoBehaviours, but using Jobs to handle some tasks that involve iterating over lots of data. For example, I have implemented a job that predicts future collisions so the AI can avoid them.

    Here's how it works:
1. A single CollisionPredictor MonoBehaviour looks at all GameObjects in the world that are a collision risk (i.e., that implement IAICollisionRisk), puts their AIColliderData (position, velocity, radius) into a NativeArray, and puts their corresponding Rigidbody into a List.
2. It starts a job to predict collisions. The job runs a simple step simulation, advancing collider positions by their velocities. After each step it checks how close every pair of AIColliderData entries is (skipping pairs where the first collider isn't an agent).
    3. The job puts each collision prediction into another NativeArray. A collision prediction consists of the index of the collider and the index of the collidee (plus some other data).
    4. When the job is done, the CollisionPredictor goes through the prediction array and pulls the corresponding Rigidbody from its list.
5. A dictionary associates Rigidbodies with Agents, so the CollisionPredictor can now tell each agent, "here are the Rigidbodies you might collide with soon."
And it does work. But before I implement other similar jobs, I wanted a sanity check: does this seem like a "correct" way to tackle this problem?
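Steps 1–5 could be sketched roughly like this. Only the type names (CollisionPredictor, AIColliderData, IAICollisionRisk) come from the post; the field layouts, job body, and step parameters are assumptions for illustration:

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

// Field layout is an assumption based on the post's description.
public struct AIColliderData
{
    public Vector2 Position;
    public Vector2 Velocity;
    public float Radius;
    public bool IsAgent;
}

public struct CollisionPrediction
{
    public int ColliderIndex;  // index into the input NativeArray
    public int CollideeIndex;  // maps back to the parallel List<Rigidbody>
    public float TimeToImpact;
}

[BurstCompile]
public struct PredictCollisionsJob : IJob
{
    [ReadOnly] public NativeArray<AIColliderData> Colliders;
    public NativeList<CollisionPrediction> Predictions;
    public float StepSize;
    public int StepCount;

    public void Execute()
    {
        for (int step = 0; step < StepCount; step++)
        {
            float t = step * StepSize;
            for (int i = 0; i < Colliders.Length; i++)
            {
                if (!Colliders[i].IsAgent) continue;  // skip non-agent "firsts"
                for (int j = 0; j < Colliders.Length; j++)
                {
                    if (i == j) continue;
                    // Advance both colliders to time t and compare distance
                    // against the sum of their radii.
                    Vector2 pi = Colliders[i].Position + Colliders[i].Velocity * t;
                    Vector2 pj = Colliders[j].Position + Colliders[j].Velocity * t;
                    float minDist = Colliders[i].Radius + Colliders[j].Radius;
                    if ((pi - pj).sqrMagnitude < minDist * minDist)
                        Predictions.Add(new CollisionPrediction
                        {
                            ColliderIndex = i,
                            CollideeIndex = j,
                            TimeToImpact = t
                        });
                }
            }
        }
    }
}
```

After Complete(), the main thread reads Predictions and indexes into the List of Rigidbodies that was filled in the same order as the NativeArray.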

    Specific concerns:
• The job checks n^2 pairings on each simulation step, with none of the search-space reduction I would get from a physics spherecast.
• Intuitively, using array indices as a replacement for object references feels fragile. Bugs that would normally produce conspicuous NullReferenceExceptions might instead silently reference the wrong object.
     
  2. desertGhost_


    Joined:
    Apr 12, 2018
    Posts:
    259
This approach should be fine. I would write a plain MonoBehaviour alternative (no job code) and check that the speed-up from the job system actually exists, and is worth it, at the scale of units you have: the overhead of scheduling jobs, managing this data, writing back results, etc. can exceed the gains of jobified code depending on the number of units.
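One way to set up that comparison is to wrap both paths in profiler markers and read the results in the Unity Profiler. The method names here are hypothetical placeholders for the two implementations being compared:

```csharp
using Unity.Profiling;
using UnityEngine;

public class PredictionBenchmark : MonoBehaviour
{
    static readonly ProfilerMarker JobMarker = new ProfilerMarker("Predict.Job");
    static readonly ProfilerMarker PlainMarker = new ProfilerMarker("Predict.Plain");

    void Update()
    {
        using (JobMarker.Auto())
            RunJobifiedPrediction();   // schedule + Complete() inside the marker,
                                       // so data copying and sync costs are counted

        using (PlainMarker.Auto())
            RunPlainPrediction();      // same math, ordinary managed loops
    }

    void RunJobifiedPrediction() { /* schedule job, Complete(), read back */ }
    void RunPlainPrediction() { /* identical loops over a managed array */ }
}
```

Keeping the schedule/complete and data-marshalling work inside the job marker is the important part; measuring only the job's Execute time would hide exactly the overhead being questioned.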

You could also write a custom container, or use containers from the preview packages, to reduce this complexity. Doing more work (even if some of that work is very fast) is, at the end of the day, more work for the CPU, and can be slower than traditional approaches if the overhead of doing that work exceeds the traditional cost. You could implement your own native container for spatial partitioning that would handle this more efficiently.
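A minimal version of that spatial-partitioning idea, sketched with the existing NativeMultiHashMap rather than a fully custom container (the cell size and the AIColliderData Position field are assumptions): hash each collider into a grid cell, then only test pairs from the same and adjacent cells instead of all n^2 pairings.

```csharp
using Unity.Collections;
using Unity.Mathematics;

// Uniform-grid broadphase sketch. Candidate pairs come only from nearby
// cells, so the pair count drops from O(n^2) toward O(n) for spread-out scenes.
public static class SpatialHash
{
    public static int2 Cell(float2 position, float cellSize)
        => (int2)math.floor(position / cellSize);

    // Bucket every collider index by its cell. Cell size should be at least
    // the largest collider diameter so neighbors are always in adjacent cells.
    public static void Build(NativeArray<AIColliderData> colliders,
                             float cellSize,
                             NativeMultiHashMap<int2, int> grid)
    {
        grid.Clear();
        for (int i = 0; i < colliders.Length; i++)
            grid.Add(Cell(colliders[i].Position, cellSize), i);
    }
}
```

A job would then iterate each agent's cell plus its eight neighbors via the map's iterator instead of scanning the whole array.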

How fragile this solution is (or feels) depends entirely on how robustly and reliably you map those indices. If you have a robust method for adding and removing data, or you rebuild the data structure each time the job runs (which could add too much overhead for jobifying to make sense), then your solution shouldn't be fragile at all.
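The "rebuild each time" option can look like this: the index-to-Rigidbody mapping only ever lives for one schedule/complete cycle, so an index can never dangle across frames. The class shape and method name are assumptions; only the CollisionPredictor name comes from the thread:

```csharp
using System.Collections.Generic;
using Unity.Collections;
using UnityEngine;

public class CollisionPredictor : MonoBehaviour
{
    readonly List<Rigidbody2D> _bodies = new List<Rigidbody2D>();

    void SchedulePrediction(IReadOnlyList<Rigidbody2D> risks)
    {
        _bodies.Clear();
        var data = new NativeArray<AIColliderData>(risks.Count, Allocator.TempJob);
        for (int i = 0; i < risks.Count; i++)
        {
            _bodies.Add(risks[i]);  // index i is valid in both structures
            data[i] = new AIColliderData { /* position, velocity, radius */ };
        }
        // ...schedule the job here. After Complete(), a prediction's
        // ColliderIndex maps straight back into _bodies, and the mapping is
        // discarded before the next run. Dispose data once the job finishes.
    }
}
```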
     
  3. wx3labs


    Joined:
    Apr 5, 2014
    Posts:
    77
    Thanks for the reply, I'm glad to hear that it doesn't seem like an obviously flawed approach.

I may test a MonoBehaviour approach for comparison, as you suggest.