Hi all,

After dabbling with ECS a while ago, I'm taking a new look at it to solve some practical problems, and I'm getting to grips with a few of the (welcome) changes to coding with ECS. To refresh my memory I've been recreating some old work from scratch, writing my own heavily commented version of the AccelerationParallelFor example, found here: https://github.com/stella3d/job-sys...ter/Assets/Scripts/AccelerationParallelFor.cs

I seem to be doing something wrong with my velocities NativeArray<Vector3>, as it appears the new AccelerationJobs scheduled in Update are trying to access the same native array the previous AccelerationJob is using. From what I can tell, the older AccelerationJobs should already have concluded: LateUpdate calls m_MoveJobHandle.Complete(), and the move job depends on the AccelerationJob completing. Why are the next AccelerationJobs causing this error? Thanks!

InvalidOperationException: The previously scheduled job LiquidBodyParallelFor:AccelerationJob writes to the NativeArray AccelerationJob.velocity. You are trying to schedule a new job LiquidBodyParallelFor:AccelerationJob, which writes to the same NativeArray (via AccelerationJob.velocity). To guarantee safety, you must include LiquidBodyParallelFor:AccelerationJob as a dependency of the newly scheduled job.
Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System;
using Unity.Entities;
using Unity.Burst;
using Unity.Mathematics;
using Unity.Transforms;
using Unity.Jobs;
using Unity.Collections;
using UnityEngine.Jobs;

public class LiquidBodyParallelFor : BaseJobObject
{
    public Vector3 m_Acceleration = Vector3.up;
    public Vector3 m_AccelerationMod = Vector3.down * 0.001f;

    NativeArray<Vector3> m_Velocities;
    TransformAccessArray m_TransformsAccessArray;

    // Defined in this class
    MoveJob m_MoveJob;
    AccelerationJob m_AccelJob;

    JobHandle m_MoveJobHandle;
    JobHandle m_AccelJobHandle;

    /// <summary>
    /// In Start, we create a bunch of bodies to work with, generating references to both their
    /// Transforms (in a TransformAccessArray) and their renderers.
    ///
    /// We also create a persistent NativeArray to hold the velocities of these objects.
    /// </summary>
    protected void Start()
    {
        m_Velocities = new NativeArray<Vector3>(m_BodyCount, Allocator.Persistent);
        m_Objects = LiquidUtil.PlaceLiquidBodies(m_BodyCount, m_BodyPlacementRadius, new Vector3(0, 50, 0));

        for (int i = 0; i < m_BodyCount; i++)
        {
            var obj = m_Objects[i];
            m_Transforms[i] = obj.transform;
            m_Renderers[i] = obj.GetComponent<Renderer>();
        }

        m_TransformsAccessArray = new TransformAccessArray(m_Transforms);
    }

    /// <summary>
    /// A parallel job - needs no object reference information other than an index.
    /// The values required for the actual execution must still be set when the struct
    /// is created (timescale, access to the velocity array, current accel, accel modification).
    /// </summary>
    struct AccelerationJob : IJobParallelFor
    {
        public NativeArray<Vector3> velocity;
        public Vector3 acceleration;
        public Vector3 accelerationMod;
        public float deltaTime;

        /// <summary>
        /// Executing this job feeds any desired acceleration behaviour into the transform's
        /// velocity. For now, we just add accelerationMod.
        /// </summary>
        public void Execute(int i)
        {
            velocity[i] += (acceleration + accelerationMod) * deltaTime;
        }
    }

    /// <summary>
    /// A parallel job, specifically involving transforms - these must define the input data
    /// required to execute, and the execution method itself:
    /// Execute(int index, TransformAccess transform).
    /// </summary>
    struct MoveJob : IJobParallelForTransform
    {
        [ReadOnly]
        public NativeArray<Vector3> velocity; // The velocities from AccelerationJob
        public float deltaTime;

        /// <summary>
        /// Executing this job simply adds the velocity to the transform's position.
        /// </summary>
        public void Execute(int i, TransformAccess transform)
        {
            transform.position += velocity[i] * deltaTime;
        }
    }

    /// <summary>
    /// In Update we prepare the movement and acceleration jobs, populating their JobHandles.
    /// These are run to completion by jobHandle.Complete() in LateUpdate.
    /// </summary>
    public void Update()
    {
        // Prepare the jobs by defining the common data going into each parallel task
        m_AccelJob = new AccelerationJob()
        {
            deltaTime = Time.deltaTime,
            velocity = m_Velocities,
            acceleration = m_Acceleration,
            accelerationMod = m_AccelerationMod
        };

        m_MoveJob = new MoveJob()
        {
            deltaTime = Time.deltaTime,
            velocity = m_Velocities
        };

        // Schedule the acceleration job, then the move job, declaring that the
        // move job depends on the completion of the acceleration job
        m_AccelJobHandle = m_AccelJob.Schedule(m_BodyCount, 64);
        m_MoveJobHandle = m_MoveJob.Schedule(m_TransformsAccessArray, m_AccelJobHandle);
    }

    /// <summary>
    /// In LateUpdate we make sure the movement job completes. Since it depends on the
    /// acceleration job, both will be completed.
    /// </summary>
    public void LateUpdate()
    {
        m_MoveJobHandle.Complete();
    }

    /// <summary>
    /// It is important to dispose of native containers after we're done with them.
    /// </summary>
    private void OnDestroy()
    {
        m_Velocities.Dispose();
        m_TransformsAccessArray.Dispose();
    }
}
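For reference, the route the exception message itself suggests (if you wanted back-to-back schedules against the same array, rather than completing everything each frame) is to pass the previous frame's handle in as a dependency when scheduling. A minimal sketch of that variant, reusing the fields from the class above — this is not the fix the thread eventually settled on:

```csharp
// Hypothetical variant of Update: chain each new AccelerationJob onto the
// previous frame's move handle, so the safety system sees an explicit dependency
// between consecutive writers of m_Velocities.
public void Update()
{
    m_AccelJob = new AccelerationJob()
    {
        deltaTime = Time.deltaTime,
        velocity = m_Velocities,
        acceleration = m_Acceleration,
        accelerationMod = m_AccelerationMod
    };

    m_MoveJob = new MoveJob()
    {
        deltaTime = Time.deltaTime,
        velocity = m_Velocities
    };

    // Scheduling against m_MoveJobHandle (the tail of last frame's chain)
    // satisfies "you must include ... as a dependency of the newly scheduled job".
    m_AccelJobHandle = m_AccelJob.Schedule(m_BodyCount, 64, m_MoveJobHandle);
    m_MoveJobHandle = m_MoveJob.Schedule(m_TransformsAccessArray, m_AccelJobHandle);
}
```

IJobParallelFor's Schedule extension takes an optional JobHandle dependency as its third argument, so last frame's handle can be threaded straight in.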
Solved. For anyone who runs into a similar problem: the logic in the updates is fine. I needed more control over refreshing and recreating the native arrays, so I moved from creating things in Awake to custom 'Spawn' and 'Dispose' functions. I'm not certain, but I think an array was being left undisposed somewhere, interfering with the safety system.
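In case it helps anyone searching later, here's a rough sketch of what such 'Spawn'/'Dispose' helpers might look like (the names and structure are my own guesses, not the actual code from the fix): allocate fresh containers on spawn, and guard teardown with a Complete() and IsCreated checks so nothing is disposed while a job may still be using it:

```csharp
// Hypothetical helpers: explicitly create and tear down the native containers,
// instead of relying on Awake/Start and OnDestroy timing.
public void Spawn()
{
    DisposeContainers(); // make sure nothing from a previous run is left behind
    m_Velocities = new NativeArray<Vector3>(m_BodyCount, Allocator.Persistent);
    m_TransformsAccessArray = new TransformAccessArray(m_Transforms);
}

public void DisposeContainers()
{
    m_MoveJobHandle.Complete(); // never dispose an array a scheduled job may still touch
    if (m_Velocities.IsCreated)
        m_Velocities.Dispose();
    if (m_TransformsAccessArray.isCreated)
        m_TransformsAccessArray.Dispose();
}
```

Checking NativeArray.IsCreated (and TransformAccessArray.isCreated) makes the teardown safe to call repeatedly, which is handy when spawn/despawn cycles don't line up with MonoBehaviour lifecycle events.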