
Problem: Why am I getting a Thread Context Error in a build only, once I use 3 or more jobs?

Discussion in 'Entity Component System' started by Xrystal, Jul 9, 2018.

  1. Xrystal

    Xrystal

    Joined:
    Mar 25, 2014
    Posts:
    203
    I used to use the LibNoise system for my landscape generation, but because its classes contain elements that are not job-system friendly, I decided to write my own noise system, based on LibNoise, for use with the job system.

    It all seemed fine until I tried to run it outside of the editor... crash!

    I have narrowed it down to a limitation in the number of Jobs an application can set up and use.

    Here is a minimal set of routines to demonstrate the problem:

    Code (CSharp):
    // Initialise arrays
    Landscape.Build(noiseData, planetData, landscapeData, latitude, longitude, heights);
    ColorMap.ApplyGradient(cg, heights, colors);
    texture = ColorMap.CreateTexture(landscapeData.sizeX, landscapeData.sizeZ, colors);
    material.mainTexture = texture;
    // Dispose of arrays
    ApplyGradient is a job scheduler that uses a gradient to create a color array, which is then used by the CreateTexture function before the texture is applied to the material.
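    For context, ApplyGradient is roughly this shape (a simplified sketch; GradientJob, GradientEntry and the field names are stand-ins for my actual types):

    Code (CSharp):
    // Sketch only: schedules a parallel job that maps heights to colors via the gradient.
    public static void ApplyGradient(NativeArray<GradientEntry> cg,
                                     NativeArray<double> heights,
                                     NativeArray<Color32> colors)
    {
        var job = new GradientJob
        {
            gradient = cg,
            heights  = heights,
            colors   = colors
        };
        // IJobParallelFor: one iteration per height sample, batched in chunks of 64.
        JobHandle handle = job.Schedule(heights.Length, 64);
        handle.Complete();
    }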

    Landscape.Build has the following:
    Code (CSharp):
    JobHandle gpcHandle = GenerateCoordinates(planetData, landscapeData, latitude, longitude);
    GenerateHeightsFromCoordinates(noiseData, planetData, latitude, longitude, heights, gpcHandle);
    GenerateCoordinates is a job schedule function that sets up the location to generate and produces a set of arrays for use by other jobs that need this information.

    GenerateHeightsFromCoordinates is a set of job schedule functions, each handling a different aspect (noise module), which together produce an array of heights.

    This is a minimal portion of the job schedule functions that triggers the error:
    Code (CSharp):
    public static void ComplexPlanetJob(NoiseData noiseData, PlanetData planetData, NativeArray<double> latitude, NativeArray<double> longitude, NativeArray<double> heights, JobHandle reqHandle)
    {
        reqHandle.Complete();

        PlanetDefinitionData data = ComplexPlanetData();

        NativeArray<double> h01Results = new NativeArray<double>(heights.Length, Allocator.Persistent, NativeArrayOptions.UninitializedMemory);

        FractalData fractalData = new FractalData();
        fractalData.amplitude = 1.0;
        fractalData.frequency = data.continentFrequency;
        fractalData.curveType = CurveType.Quintic;
        fractalData.lacunarity = data.continentLacunarity;
        fractalData.octaves = 14;
        fractalData.persistence = 0.61;
        fractalData.seed = planetData.seed + 0;
        fractalData.heightOffset = data.terrainOffset;

        JobHandle h01 = JobScheduler.Fractal(noiseData, fractalData, latitude, longitude, h01Results, reqHandle);

        NativeArray<double> h02Results = new NativeArray<double>(heights.Length, Allocator.Persistent, NativeArrayOptions.UninitializedMemory);
        NativeArray<CurveListData> curveData = new NativeArray<CurveListData>(10, Allocator.Persistent, NativeArrayOptions.UninitializedMemory);
        curveData[00] = new CurveListData(-2.0000 + data.seaLevel, -1.625 + data.seaLevel);
        curveData[01] = new CurveListData(-1.0000 + data.seaLevel, -1.375 + data.seaLevel);
        curveData[02] = new CurveListData(-0.0000 + data.seaLevel, -0.375 + data.seaLevel);
        curveData[03] = new CurveListData(+0.0625 + data.seaLevel, +0.125 + data.seaLevel);
        curveData[04] = new CurveListData(+0.1250 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[05] = new CurveListData(+0.2500 + data.seaLevel, +1.000 + data.seaLevel);
        curveData[06] = new CurveListData(+0.5000 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[07] = new CurveListData(+0.7500 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[08] = new CurveListData(+1.0000 + data.seaLevel, +0.500 + data.seaLevel);
        curveData[09] = new CurveListData(+2.0000 + data.seaLevel, +0.500 + data.seaLevel);

        JobHandle h02 = JobScheduler.Curve(curveData, h02Results, h01Results, h01);

        h02.Complete();
        heights.CopyFrom(h02Results);
        h01Results.Dispose();
        h02Results.Dispose();
        curveData.Dispose();
    }
    The JobScheduler functions create the jobs, fill in the data, and schedule the jobs ready for processing.
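    For reference, JobScheduler.Fractal looks roughly like this (a sketch; the FractalJob struct and its field names are simplified from my actual code):

    Code (CSharp):
    // Sketch only: wraps an IJobParallelFor job and chains it on the given dependency.
    public static JobHandle Fractal(NoiseData noiseData, FractalData fractalData,
                                    NativeArray<double> latitude, NativeArray<double> longitude,
                                    NativeArray<double> results, JobHandle dependency)
    {
        var job = new FractalJob
        {
            noiseData   = noiseData,
            fractalData = fractalData,
            latitude    = latitude,
            longitude   = longitude,
            resultArray = results
        };
        // One iteration per sample, batched in chunks of 64; the caller
        // passes the returned handle as the dependency of the next job.
        return job.Schedule(results.Length, 64, dependency);
    }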

    This is the same block of code but without the Job Schedulers; this time they are standard functions.
    Code (CSharp):
    public static void ComplexPlanet(NoiseData noiseData, PlanetData planetData, NativeArray<double> latitude, NativeArray<double> longitude, NativeArray<double> heights, JobHandle reqHandle)
    {
        reqHandle.Complete();

        PlanetDefinitionData data = ComplexPlanetData();

        FractalData fractalData = new FractalData();
        fractalData.amplitude = 1.0;
        fractalData.frequency = data.continentFrequency;
        fractalData.curveType = CurveType.Quintic;
        fractalData.lacunarity = data.continentLacunarity;
        fractalData.octaves = 14;
        fractalData.persistence = 0.61;
        fractalData.seed = planetData.seed + 0;
        fractalData.heightOffset = data.terrainOffset;

        FractalModule fm = new FractalModule();
        fm.fractalData = fractalData;
        fm.latitude = latitude;
        fm.longitude = longitude;
        fm.resultArray = heights;
        fm.noiseData = noiseData;
        fm.Evaluate();

        NativeArray<CurveListData> curveData = new NativeArray<CurveListData>(10, Allocator.Persistent, NativeArrayOptions.UninitializedMemory);
        curveData[00] = new CurveListData(-2.0000 + data.seaLevel, -1.625 + data.seaLevel);
        curveData[01] = new CurveListData(-1.0000 + data.seaLevel, -1.375 + data.seaLevel);
        curveData[02] = new CurveListData(-0.0000 + data.seaLevel, -0.375 + data.seaLevel);
        curveData[03] = new CurveListData(+0.0625 + data.seaLevel, +0.125 + data.seaLevel);
        curveData[04] = new CurveListData(+0.1250 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[05] = new CurveListData(+0.2500 + data.seaLevel, +1.000 + data.seaLevel);
        curveData[06] = new CurveListData(+0.5000 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[07] = new CurveListData(+0.7500 + data.seaLevel, +0.250 + data.seaLevel);
        curveData[08] = new CurveListData(+1.0000 + data.seaLevel, +0.500 + data.seaLevel);
        curveData[09] = new CurveListData(+2.0000 + data.seaLevel, +0.500 + data.seaLevel);

        CurveModule cm = new CurveModule();
        cm.curveData = curveData;
        cm.resultArray = heights;
        cm.sourceArray = fm.resultArray;
        cm.Evaluate();

        curveData.Dispose();
    }
    The FractalModule and CurveModule classes include a full copy of the Execute function and an Evaluate function that performs the for loop equivalent of the IJobParallelFor interface.
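    In other words, Evaluate is just the job's Execute body run in a plain loop, roughly:

    Code (CSharp):
    public void Evaluate()
    {
        // Same work as the job's Execute(index), run serially on the main thread
        // instead of being scheduled through the job system.
        for (int i = 0; i < resultArray.Length; i++)
            Execute(i);
    }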

    This means the only difference between the two running processes is that one uses jobs for FractalData and CurveData and the other doesn't. The version using jobs crashes the system once the CurveData section is added; the version not using jobs for these two elements works fine.

    Remember, the error message only occurs in the build version, not in the editor.

    Unless someone can explain why, I can only conclude that there is a limit on how many jobs, or chained jobs, you can have running.

    Thanks for reading and any assistance offered.