Overhead on starting jobs

Discussion in 'Entity Component System' started by LennartJohansen, Mar 21, 2018.

  1. LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    Hi.

    How big is the overhead of starting a job? I have a process where I would like to run several smaller jobs one after another to produce some final data.

    Is this a good approach, or should I try to make the jobs bigger? It could be hundreds of jobs in a frame.
     
  2. RootCauseProductions

    Joined:
    Feb 1, 2018
    Posts:
    31
    It depends on your data and your needs. However, you can string jobs together as dependencies so when the first one finishes the next one starts. By doing that, you will eliminate most (if not all) of the overhead of starting the jobs.
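
    A minimal sketch of that pattern, assuming two made-up IJob structs where the second depends on the first:

    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;

    public static class ChainedJobsExample
    {
        // Hypothetical jobs: the first fills a buffer, the second modifies it.
        struct ProduceJob : IJob
        {
            public NativeArray<float> Values;
            public void Execute()
            {
                for (int i = 0; i < Values.Length; i++)
                    Values[i] = i * 0.5f;
            }
        }

        struct ConsumeJob : IJob
        {
            public NativeArray<float> Values;
            public void Execute()
            {
                for (int i = 0; i < Values.Length; i++)
                    Values[i] *= 2f;
            }
        }

        public static void Run()
        {
            var values = new NativeArray<float>(1024, Allocator.TempJob);

            // Passing the first handle into Schedule makes the second job wait
            // for it, so the chain runs without a main-thread sync in between.
            JobHandle first  = new ProduceJob { Values = values }.Schedule();
            JobHandle second = new ConsumeJob { Values = values }.Schedule(first);

            second.Complete();
            values.Dispose();
        }
    }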
     
  3. LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    That I get. I was thinking more about Unity's internal overhead: creating handles, sorting jobs by dependency, etc., and whether that is something to consider.
     
  4. RootCauseProductions

    Joined:
    Feb 1, 2018
    Posts:
    31
    The overhead doesn't seem to be too much. Remember, ECS is built on top of the Job system, which is designed to handle large numbers of systems and components.

    Keeping everything in one job is easier to manage and work with, but I can see several reasons to use multiple jobs. If you have large amounts of intermediate data, breaking the work into separate jobs allows data that is no longer in use to be released and reused elsewhere (this is not C# garbage collection; Unity manages this memory manually). Another reason for separating the jobs is that they can be reused in another chain, although functions can do the same thing.
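
    To illustrate the intermediate-data point, here is a rough sketch (job names are made up) where a temporary buffer is handed to the chain and freed as soon as the job that consumes it finishes:

    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;

    public static class IntermediateDataExample
    {
        struct BuildIntermediateJob : IJob
        {
            [WriteOnly] public NativeArray<float> Intermediate;
            public void Execute()
            {
                for (int i = 0; i < Intermediate.Length; i++)
                    Intermediate[i] = i;
            }
        }

        struct ReduceJob : IJob
        {
            // DeallocateOnJobCompletion frees the buffer as soon as this job
            // finishes, without waiting for the rest of the frame.
            [ReadOnly, DeallocateOnJobCompletion] public NativeArray<float> Intermediate;
            public NativeArray<float> Result; // length 1
            public void Execute()
            {
                float sum = 0f;
                for (int i = 0; i < Intermediate.Length; i++)
                    sum += Intermediate[i];
                Result[0] = sum;
            }
        }

        public static void Run()
        {
            var intermediate = new NativeArray<float>(4096, Allocator.TempJob);
            var result = new NativeArray<float>(1, Allocator.TempJob);

            JobHandle handle = new BuildIntermediateJob { Intermediate = intermediate }.Schedule();
            handle = new ReduceJob { Intermediate = intermediate, Result = result }.Schedule(handle);
            handle.Complete();

            // 'intermediate' was already deallocated when ReduceJob completed.
            result.Dispose();
        }
    }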
     
  5. Joachim_Ante

    Unity Technologies

    Joined:
    Mar 16, 2005
    Posts:
    5,203
    It's quite heavily optimized. As a general rule, we say make sure jobs are larger than 0.05ms each to keep the overhead at a reasonable level.

    Also, when profiling, do note that:
    a) JobDebugger overhead is quite big (in the Austin demo it's around 5ms). In a player build it is stripped out completely. In the Editor you can turn it off in the Jobs menu, but of course that turns off all error messages, so you might get crashes if you cause a race condition.

    b) Scheduling many jobs on the main thread is cheap; what's expensive is waking up worker threads with semaphores. For this reason, Job.Schedule in fact doesn't schedule any jobs... it puts them on a main-thread queue. You need to call JobHandle.ScheduleBatchedJobs() to actually kick them off. This is done automatically at the end of each component system. Generally you want to schedule a bunch of jobs with dependencies and then kick them all off at the same time; this way main-thread overhead is minimized.
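
    As a rough illustration of that pattern outside a component system (where you have to kick the batch yourself; the job type here is just a placeholder):

    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;

    public static class ScheduleBatchedExample
    {
        // Placeholder job, scheduled three times just to show the pattern.
        struct StepJob : IJob
        {
            public NativeArray<int> Counter;
            public void Execute() { Counter[0] += 1; }
        }

        public static void Run()
        {
            var counter = new NativeArray<int>(1, Allocator.TempJob);

            // Schedule() only puts the jobs on the main-thread queue.
            JobHandle a = new StepJob { Counter = counter }.Schedule();
            JobHandle b = new StepJob { Counter = counter }.Schedule(a);
            JobHandle c = new StepJob { Counter = counter }.Schedule(b);

            // One call wakes the worker threads for the whole batch
            // (component systems do this automatically at the end of their update).
            JobHandle.ScheduleBatchedJobs();

            // Wait only where the result is actually needed.
            c.Complete();
            counter.Dispose();
        }
    }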
     
  6. LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    Sounds good.

    Lennart
     
  7. OndrejP

    Joined:
    Jul 19, 2017
    Posts:
    303
    I had to do some raycasts into a bunch of shapes and was considering Burst + jobs to improve the performance.
    The workload was quite small: 40-60 shapes, mostly boxes, some cylinders and spheres.
    The raycasts included transforming the ray from world space into the local unit space of each shape,
    then transforming the hit point back into world space.
    Also, I could not do this asynchronously; I needed the result immediately.

    Rough process was:
    1. Calculate local rays into "rays array"
    2. Do batched raycasts (first boxes, then spheres, then cylinders...)
    3. Transform results back into world space
    ...all was done in one bursted job / function

    With just C# it took about 50 us (microseconds)
    With a job and Burst it took about 17 us
    With just Burst it took 5 us

    I was using IJob and Run, so I guess 12 us was the job overhead in my case?
    Getting rid of the job added some complexity (passing pointers instead of NativeArrays), but it was worth it.
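
    For anyone comparing the two approaches, here is a minimal, made-up sketch of the difference (the real raycast work is replaced by a trivial loop, and calling a [BurstCompile] static method directly from managed code needs a recent Burst version):

    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Jobs;
    using Unity.Mathematics;

    public static class RunVsBurstFunctionExample
    {
        // Variant A: a Burst-compiled IJob executed synchronously with Run().
        [BurstCompile]
        struct TransformPointsJob : IJob
        {
            [ReadOnly]  public NativeArray<float3> Input;
            [WriteOnly] public NativeArray<float3> Output;
            public void Execute()
            {
                for (int i = 0; i < Input.Length; i++)
                    Output[i] = Input[i] * 2f;
            }
        }

        // Variant B: a Burst-compiled static function called directly, no job at all.
        [BurstCompile]
        static class Kernels
        {
            [BurstCompile]
            public static unsafe void TransformPoints([NoAlias] float3* input,
                                                      [NoAlias] float3* output, int count)
            {
                for (int i = 0; i < count; i++)
                    output[i] = input[i] * 2f;
            }
        }

        public static unsafe void Run(NativeArray<float3> input, NativeArray<float3> output)
        {
            // Variant A: job overhead included even though it runs on the main thread.
            new TransformPointsJob { Input = input, Output = output }.Run();

            // Variant B: plain function call into Burst-compiled code,
            // passing pointers instead of NativeArrays.
            Kernels.TransformPoints((float3*)input.GetUnsafeReadOnlyPtr(),
                                    (float3*)output.GetUnsafePtr(), input.Length);
        }
    }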

    Notes:
    ReadOnly and WriteOnly attributes in the job, NoAlias attributes on the bursted functions, safety checks off
    Local unit space - the raycasts are transformed into the local space of the shape as if the shape had zero center and unit size; this makes the raycasting code faster, since it doesn't have to take offset and scale into account (see the sketch below).
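
    A rough sketch of the local-unit-space idea for a single box, using Unity.Mathematics (a reconstruction for illustration, not the original code):

    Code (CSharp):
    using Unity.Mathematics;

    public static class LocalUnitSpaceRaycast
    {
        // Transform a world-space ray into the unit space of a box given by a TRS,
        // intersect the canonical box [-0.5, 0.5]^3, and map the hit back to world space.
        public static bool RaycastBox(float3 origin, float3 dir,
                                      float3 center, quaternion rotation, float3 size,
                                      out float3 worldHit)
        {
            float4x4 trs = float4x4.TRS(center, rotation, size);
            float4x4 worldToLocal = math.inverse(trs);

            // Points are transformed with translation, directions without it.
            float3 o = math.transform(worldToLocal, origin);
            float3 d = math.rotate(worldToLocal, dir);

            // Standard slab test against the unit box.
            float3 invD = 1f / d;
            float3 t0 = (-0.5f - o) * invD;
            float3 t1 = (0.5f - o) * invD;
            float tmin = math.cmax(math.min(t0, t1));
            float tmax = math.cmin(math.max(t0, t1));

            worldHit = default;
            if (tmax < 0f || tmin > tmax)
                return false;

            // The hit point is found in local space and transformed back to world space.
            float t = tmin >= 0f ? tmin : tmax;
            worldHit = math.transform(trs, o + d * t);
            return true;
        }
    }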

    Can someone please explain what the overhead of Job.Run is?
    Since there's no scheduling, I'd expect it to be very close in performance to the bursted function, but it's not.
    @Joachim_Ante