IJobFor + Long Jobs + Temporary Data = Useless

Discussion in 'Entity Component System' started by LaireonGames, Jun 30, 2020.

  1. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    I've been fighting with jobs for a while now, and the more I use them the more I find IJobFor utterly useless, which is a real shame.

    I'm using jobs to generate custom terrain meshes, and I split the biggest ones across IJobFor, since it makes the most sense to spread that big job over multiple cores.

    Well, as soon as you use IJobFor, it seems to create many jobs with a high potential to make Unity stall in WaitForJobGroupID inside its internal methods, things like particles or rendering bounding volumes. I've already started the process of reporting this as a bug, but no confirmation of it yet.

    That alone is bad, but I'm seeing more and more that jobs were not designed to be long-running, especially with IJobFor. We keep getting crashes, and I'm fairly sure it's because of a warning we are getting about allocations staying alive for too many frames.

    The reason is that I allocate temporary arrays with Allocator.Temp inside the IJobFor, once per execution. This was great for speed (it seemed to offset the warning in the docs that allocating memory in a job is really slow). However, since the job runs for more than 4 frames, sometimes 15, these allocations become a problem and trigger a warning saying they will likely leak.
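
    Roughly what that pattern looks like (a minimal sketch with a hypothetical MeshJob, not my actual terrain code):

    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;

    [BurstCompile]
    struct MeshJob : IJobFor
    {
        public void Execute(int index)
        {
            // Scratch space for this iteration. Allocator.Temp is the only
            // allocator Unity allows inside a running job, and it assumes
            // the allocation dies within a few frames.
            var scratch = new NativeArray<float>(1024, Allocator.Temp);

            // ... build this piece of the mesh using scratch ...

            scratch.Dispose();
        }
    }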

    Well, now I have a real problem: I can't make those per-run arrays persistent. To do that I would need to create the arrays outside the job and pass them in, but I don't know how many times that job will run... I can use the thread index trick and make as many arrays as there are thread indexes, but that count has to be hard-coded, so it feels like a mess.

    Instead it feels like I need to schedule my own batches of IJob, so there is a consistent number of jobs I can account for, and pass the temporary lists into each one. That feels like I'm coding a simplified version of IJobFor while losing the flexibility of varying max thread/core counts :/

    Am I mad? Have I missed something?
     
  2. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    Why can't you make the arrays persistent? Sorry if this is a stupid question; I'm far from an expert myself and still learning all the ins and outs.

    Allocator.Persistent doesn't mean the allocations never go away; it just means they are allocated in a way that lets them stay around longer, which is a tad slower. You can still dispose of them after, say, 30 frames or whenever you want.

    I have several jobs, though they use the new SystemBase method, that use Allocator.Persistent because they can potentially last for more than 4 frames, and I just Dispose of them when done.
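
    Something like this, roughly (a sketch; LongRunningJob and the polling setup are placeholders, not my actual code):

    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;

    // Placeholder standing in for the real multi-frame work
    struct LongRunningJob : IJob
    {
        public NativeArray<float> Results;
        public void Execute() { /* ... heavy work writing Results ... */ }
    }

    public class JobRunner : MonoBehaviour
    {
        NativeArray<float> results;
        JobHandle handle;
        bool running;

        void Start()
        {
            // Persistent allocations can safely outlive many frames
            results = new NativeArray<float>(4096, Allocator.Persistent);
            handle = new LongRunningJob { Results = results }.Schedule();
            running = true;
        }

        void Update()
        {
            // Poll each frame; Complete() and Dispose() once the job is done,
            // which may be 30 frames later or whenever
            if (running && handle.IsCompleted)
            {
                handle.Complete();
                results.Dispose();
                running = false;
            }
        }
    }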
     
  3. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    This is an allocation while the job is running. You are only allowed to do that with Allocator.Temp; otherwise Unity throws an error and won't finish the job.
     
  4. - Jobs weren't created for long-running service threads; they were created to parallelize job-like tasks. This means you should usually only put tasks in jobs if they don't run longer than 1-4 frames tops (hence the warning when you keep the allocated arrays around longer).
    - Allocating inside a running job is a bad idea (allocation is slow); I don't know if it's even possible.
    - Or do you mean at the beginning, before the schedule? Because that is outside of the job itself.

    - Summary: if you want long-running tasks, use Threads or Tasks. Jobs are for per-frame work, as the naming suggests.
     
  5. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    "Job" is ambiguous and doesn't suggest per-frame in any way. They are not called frame jobs anywhere in the documentation that I can remember.

    You can indeed allocate inside a job, but only with Allocator.Temp, as I mentioned. I also mentioned that the allocation cost didn't outweigh the gains I got from having that data available, so it worked out well for me.

    That summary is bad and misleading. If I run threads/tasks, I am competing with the job system's thread queues and thus fighting context switching. It's why I'm trying to use jobs in the first place... (well, that and Burst)

    The job system should allow me to hook into it gracefully for long-running jobs. I agree it doesn't feel designed for long-running work, and that is a bad design flaw, since my options now are to either put up with context switching or fight with a job system that only 90% works for long-running scenarios.
     
    JoNax97 likes this.
  6. No, they weren't created for long threading. If they had been created for long-term service threads, they would support this. But they weren't. It's as simple as that.
    You're using the tool in the wrong way.

    I agree that they should write clearer documentation: the standard definition of a job (a small batch of work) is not precise enough, so they should state explicitly that the system is intended for small batches of work completed within one or two frames.


    Oh, and what do you think is happening when you schedule a job for 10-15 frames? You're competing for threads with all the Unity threads for an extended period of time.
    In general it is a very bad idea. You should really break that work up into smaller pieces if you can, especially if you want to schedule gracefully with the job system, so it can decide when to give you the resources to complete it piece by piece.
    Long threading is good for systems like audio, and sometimes rendering, where the workload is almost constantly high. But it is a waste when you occupy a thread for an extended time, release it, and then occupy it again.
     
  7. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    No, it's not as simple as that... there seems to be no good answer in Unity for processing long-running background tasks/threads/jobs, etc.

    I got sick of the issues and slowness caused by context switching on traditional threads, and am now trying Unity's system for multithreading.

    Both systems have issues in different ways. Finding which one fits my project best is the real challenge, and so far not contending for resources with the job system seems like my most promising option.
     
  8. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    I don't think there is anything frame-specific about jobs themselves. Unity has used jobs internally for more than a frame; I know that because I've seen the allocator complain. And I've never seen anything specific about why jobs can't be long-lived.

    Temp allocations, though, are frame-based, so using those in a long-running job is wrong.

    If you allocate Persistent outside the job, then yeah, it will take more work in a parallel context, as you said. It's just not a common pattern, so that's sort of expected.
     
    LaireonGames likes this.
  9. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    333
    I don't understand the problem here. Why don't you just allocate a persistent array at the same time as scheduling the job?
     
  10. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    So with IJobFor, Unity decides how many internal jobs to create when you call ScheduleParallel. From what I understand, this changes depending on the CPU and the cores/virtual threads available.

    E.g. for one call to ScheduleParallel, in some cases 5 jobs might run at once, in others it could be 10 (purely theoretical numbers).

    I need an array per running instance of the job. If I pass in the data from the outside, all the threads in the IJobFor will try to use the same data.

    The thread trick I mentioned is that I could check the max thread count and manually make arrays for each potential thread, but even that is really dangerous, because I can't make a nested NativeArray to account for a changing max thread count :/ (e.g. I might think 50 is a safe max thread limit, then a new CPU comes out with more cores and my job falls over).
     
  11. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    I guess I could pass in one huge flat array whose size is determined by the max thread limit * dataNeeded, and just read from the section I need based on the thread index... f**k, why didn't I think of that earlier!!!

    Still pretty damn messy, though.
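
    Something like this, I guess (a sketch; TerrainJob and the field names are placeholders):

    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Jobs;
    using Unity.Jobs.LowLevel.Unsafe;

    [BurstCompile]
    struct TerrainJob : IJobFor
    {
        // One flat slab shared by every worker. Each thread only touches its
        // own slice, so the parallel-for write restriction is disabled.
        [NativeDisableParallelForRestriction] public NativeArray<float> Scratch;
        [NativeSetThreadIndex] int threadIndex; // filled in by Unity per worker
        public int PerThread;

        public void Execute(int index)
        {
            int start = this.threadIndex * this.PerThread;
            // ... use Scratch[start .. start + PerThread) as this thread's temp data ...
        }
    }

    // Allocation side, sized for the worst case so a new CPU with more cores
    // can never overflow it:
    // var scratch = new NativeArray<float>(
    //     JobsUtility.MaxJobThreadCount * perThread, Allocator.Persistent);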
     
  12. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    333
    Oh I see! I actually have a situation like that in one of the things that I'm working on at the moment. Each thread gets its own NativeMultiHashMap. My solution was to replace IJobFor with a few manually scheduled jobs that share the workload. I have a little InterlockedArrayIterator utility type that allows each job to safely consume from the same queue in parallel.
     
  13. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    That is exactly what I'm in the middle of coding now :) I'm guessing I will take a noticeable calculation-time hit with this new workflow, but I managed to get a longer chain of job dependencies, so hopefully it's not too bad.

    Did you notice a big difference with the switch?
     
  14. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    333
    This is mine. It's a quick and dirty internal type with no testing, so proceed with caution, but it's pretty simple.

    My job does quite a lot of work for each item consumed, so the overhead of consumption with this is negligible. If you're worried about it being slow, you could modify it to read in batches.

    Code (CSharp):
    using System;
    using System.Threading;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;

    struct InterlockedArrayIterator<T> : IDisposable
        where T : unmanaged
    {
        private NativeArray<T> data;
        private NativeArray<int> index;

        public InterlockedArrayIterator(NativeArray<T> data, Allocator allocator)
        {
            this.data = data;
            this.index = new NativeArray<int>(1, allocator);
            this.index[0] = -1; // so the first Increment yields 0
        }

        public void Dispose()
        {
            this.data.Dispose();
            this.index.Dispose();
        }

        public unsafe bool TryGetNext(out T value, out int index)
        {
            // Atomically claim the next slot; safe across concurrently running jobs
            ref var indexRef = ref UnsafeUtility.ArrayElementAsRef<int>(this.index.GetUnsafePtr(), 0);
            index = Interlocked.Increment(ref indexRef);

            if (index >= this.data.Length)
            {
                value = default;
                return false;
            }

            value = this.data[index];
            return true;
        }
    }
    I just schedule four jobs in parallel, each doing a while (this.Queue.TryGetNext(out var value, out var index)) { DoWork(value, index); }.
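
    The scheduling side is roughly this (WorkerJob is a placeholder wrapping that loop, and data is assumed to be a NativeArray<int> the caller already allocated):

    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;

    // Placeholder worker that drains the shared queue
    [BurstCompile]
    struct WorkerJob : IJob
    {
        public InterlockedArrayIterator<int> Queue;

        public void Execute()
        {
            while (this.Queue.TryGetNext(out var value, out var index))
            {
                // DoWork(value, index);
            }
        }
    }

    // The iterator owns data from here on
    var queue = new InterlockedArrayIterator<int>(data, Allocator.TempJob);
    var handles = new NativeArray<JobHandle>(4, Allocator.Temp);
    for (int i = 0; i < 4; i++)
        handles[i] = new WorkerJob { Queue = queue }.Schedule();

    // One handle representing all four workers
    var combined = JobHandle.CombineDependencies(handles);
    handles.Dispose();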
     
    Last edited: Jul 1, 2020
  15. LaireonGames

    LaireonGames

    Joined:
    Nov 16, 2013
    Posts:
    705
    Ah OK, it took me a while to figure out what that was doing! Thanks for that; if this split away from IJobFor doesn't work out, I will give something like that a try.
     
  16. SamOld

    SamOld

    Joined:
    Aug 17, 2018
    Posts:
    333
    Mine is even slightly more complicated than necessary. All you really need is the shared interlocked index increment. I think something like this should do it. I haven't run this code; I just threw it together now.

    Code (CSharp):
    using System.Threading;
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Jobs;

    [BurstCompile]
    struct FooJob : IJob
    {
        // Length 1 array acting as a single shared value, saves us dealing with pointers.
        // Initialise Index[0] to -1 before scheduling so the first Increment yields 0.
        public NativeArray<int> Index;
        [ReadOnly] public NativeArray<int> Data;
        public NativeArray<int> YourPerJobArray;

        public unsafe void Execute()
        {
            // Get a ref to the single item in the Index array
            ref var indexRef = ref UnsafeUtility.ArrayElementAsRef<int>(this.Index.GetUnsafePtr(), 0);
            int i;

            // Atomically increment the index and take the data at that point
            while ((i = Interlocked.Increment(ref indexRef)) < this.Data.Length)
            {
                var value = this.Data[i];
                // Do work
            }
        }
    }
    With both of these methods you may need to mess with some attributes to persuade the safety system that you know what you're doing. I just realised that I've had safety checks disabled in the editor for 3 days!
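
    For example, something like this on the shared containers (one option; [NativeDisableContainerSafetyRestriction] lives in Unity.Collections.LowLevel.Unsafe):

    Code (CSharp):
    // Lets several concurrently scheduled jobs hold a writable reference to
    // the same container without the safety system reporting a conflict
    [NativeDisableContainerSafetyRestriction]
    public NativeArray<int> Index;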
     
    Last edited: Jul 1, 2020