Is ECS the right choice?

Discussion in 'Entity Component System' started by betaFlux, Jan 27, 2020.

  1. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Hello,

    during the past few days I've been trying to implement ECS by watching tutorials and grabbing some bits here and there for my own project, but after reading more and more about the topic, I doubt that it is the right choice for the project I'm working on, since there are quite a few MonoBehaviour systems interacting with one another. But maybe I'm wrong and it can be done?

    Facts about my project:
    - it's 3D
    - multiple MonoBehaviours on every character
    - characters come in fbx format, which means deep gameobject hierarchies
    - every character has one mesh with two material slots (skin and cloth)
    - the "game" consists only of NPCs (no player character) with StateMachine
    - heavy use of NavMesh


    Am I correct in assuming that it is better to wait until ECS is a bit more grown up?
     
  2. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Currently I think ECS is worth it if you are able to frame your problem in a data-oriented way. You didn't state what the bulk of your work would be; I think that is the most important thing. You can learn what the ECS API has available to play with, then estimate how much number crunching you would be doing with the ECS database and systems.

    Games that immediately suggest data-oriented design are games with a lot of similar things that keep updating. In the end all games may be thought of like this (the far-future goal of Unity ECS once it matures), but right now you'd better stick with games where the benefit of DOD is more obvious, e.g. RTS, shoot 'em up/bullet hell, tycoon/simulator games, music games, tiled puzzle games, ...

    For other games I would at least wait for a GetSingleton where the T could accept a MonoBehaviour, if that is ever implemented. I can imagine it would be a really nice Mono-ECS bridge (so you can use the exposed inspector easily and bring things into ECS).
     
    Last edited: Jan 27, 2020
  3. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    DOTS is great for optimising slow areas of your game. You could profile your game, look for the areas that are slow, then bring in DOTS to replace those systems and improve the game's performance.

    Also, you don't have to fully adopt DOTS: you can gain a lot by moving your code and data to Native Arrays and Jobs to benefit from multi-threading, and by adding the Mathematics library and the Burst Compiler for further improvements to performance hot spots.
     
    Last edited: Jan 27, 2020
    SamFernGamer4k and betaFlux like this.
  4. Sarkahn

    Sarkahn

    Joined:
    Jan 9, 2013
    Posts:
    440
    For whatever reason it accepts a MonoBehaviour for GetSingletonEntity but not for GetSingleton. For now you can do this:
    Code (CSharp):
    T GetSingletonObject<T>()
    {
        var entity = GetSingletonEntity<T>();
        var comp = EntityManager.GetComponentObject<T>(entity);
        return comp;
    }
    You can call this in a JobComponentSystem in OnStartRunning to get a MonoBehaviour on a gameobject with a ConvertToEntity set to "ConvertAndInjectGameObject". Kinda awkward but it works for now.
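    For illustration, a rough usage sketch (the system and MonoBehaviour names here are made up, not from any actual project), relying on the behaviour described above: the helper lives inside the system and is called from OnStartRunning, with the MonoBehaviour sitting on a GameObject whose ConvertToEntity is set to "Convert And Inject Game Object".
    Code (CSharp):
    using Unity.Entities;
    using Unity.Jobs;

    // Hypothetical MonoBehaviour with inspector-exposed settings.
    public class AgentSettings : UnityEngine.MonoBehaviour
    {
        public float visionRange = 10f;
    }

    public class AgentSettingsSystem : JobComponentSystem
    {
        AgentSettings _settings;

        // The helper from the post above, unchanged.
        T GetSingletonObject<T>()
        {
            var entity = GetSingletonEntity<T>();
            var comp = EntityManager.GetComponentObject<T>(entity);
            return comp;
        }

        protected override void OnStartRunning()
        {
            _settings = GetSingletonObject<AgentSettings>();
        }

        protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            // read _settings here when building job data
            return inputDeps;
        }
    }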
     
    charleshendry likes this.
  5. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,850
    Currently I have the same reservations as the OP and the same enthusiasm as the posters replying "give it a go". I would be using it for a subatomic rigidbody particle sim, which is not only a bottleneck (not enough subatomic species can be in the game at once to make a danged iron atom) but now crashes to desktop every single time the rigidbodies emit and anywhere from 5 to 50 are on stage... then kaboom, crash. So DOTS is a no-brainer for that subsystem.

    But I saw the issues with character hierarchies exploding, and with two render pipelines rendering because the Hybrid Renderer was somehow in play and people didn't seem to be able to trashcan it and use the legacy renderer. The legacy renderer works dandy, gives the game the look I need, and I do not want to convert hundreds of materials to work with the Hybrid Renderer. So it gets kinda confusing. Unity is great at "here is a demo with a ton of cubes and no hierarchy, see how fast it is"... wow, cool, let me go try... oh wait, this ain't working the way it was advertised. I've been bitten a hundred times and have as many mostly-finished projects as bite marks. I don't want to ignore this one out of frustration; it is too good. But PhysX has dead-stopped me with a regression bug that seems to pop up every few months.
     
    tonialatalo likes this.
  6. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Thanks guys for taking the time to answer! I really appreciate your input!

    Also, if anyone tried implementing DOTS into an existing 3D MonoBehaviour based project, please let me know how it went. I'd love to know!

    This game is going to be a Majesty clone. Right now all I have are NPCs running around attacking enemies in their proximity. There are constant distance checks, action selection, NavMesh path selection and more. I'll go into detail about it below. I would happily go the DOTS road without hesitation, but I couldn't find a tutorial or any info about how to handle NavMesh and animations with SkinnedMeshRenderers. I think these are not covered by DOTS yet?

    Though I have the feeling that this answers my question. A bridge between pure ECS and MonoBehaviour seems necessary if you want to convert dynamic, complex games to DOTS.

    That's a wonderful idea, but the systems listed below which would need this are all part of what defines the agent. I just can't see how to make only part of the system data-driven while maintaining their function and communication with the left over MonoBehaviours.

    This is how the system looks in a nutshell (Controllers are all MonoBehaviours):
    - AgentController with all System.Action events necessary for all other controllers to subscribe to like OnDrawWeapon, OnAttack, OnDeath, etc.
    - AggroSensor to find closest enemy with OverlapSphereNonAlloc
    - SkillController to find the skill/spell/action(data classes) with the highest priority, only paused while an action is being performed

    Controllers that only listen to AgentController events
    - AnimationController
    - SFXController
    - VFXController

    I imagine these last 3 controllers are acceptable MonoBehaviours since they are only waiting for events to fire off.
    But controller 2 and 3 are the reason I have the urge to implement DOTS or at least the Job System with Burst. They do a lot every frame.

    That's what I thought too, but I can't for the life of me figure out how to weave it in to make it do things faster. In the AggroSensor I have implemented a Job to calculate all distances which I thought could be quite the relief for the CPU:
    Code (CSharp):
    [BurstCompile]
    public struct GetDistanceJob : IJob {

        public float3 start;
        public float3 end;
        public NativeArray<float> result;

        public void Execute() {

            result[0] = math.distance(end, start);
        }
    }
    which I implement inside this method:
    Code (CSharp):
    public Agent GetClosestEnemy(float range) {

        List<Agent> enemies = GetEnemiesInRangeNonAlloc(range); // Peek below

        // The typical get-closest approach
        Agent closest = null;
        float closestDist = Mathf.Infinity;

        for(int i = 0; i < enemies.Count; i++) {

            // Job implementation
            NativeArray<float> distanceResult = new NativeArray<float>(1, Allocator.TempJob);
            GetDistanceJob job = new GetDistanceJob {

                start = _agent.m_transform.position,
                end = enemies[i].transform.position,
                result = distanceResult
            };

            JobHandle jobHandle_GetDistance = job.Schedule();
            jobHandle_GetDistance.Complete();
            float dist = job.result[0];

            distanceResult.Dispose();

            float currDist = dist;

            if(closest == null || currDist < closestDist) {

                closest = enemies[i];
                closestDist = currDist;
            }
        }

        return closest;
    }
    Code (CSharp):
    List<Agent> GetEnemiesInRangeNonAlloc(float range) {

        int numHit = Physics.OverlapSphereNonAlloc(_agent.transform.position, range, _allAgents, 1 << LayerMask.NameToLayer("Agents"));

        if(numHit == 0) {

            return null;
        }

        List<Agent> agentList = new List<Agent>();

        for(int i = 0; i < numHit; i++) {

            Agent agent = _allAgents[i].GetComponent<Agent>();

            if(agent != _agent && agent.dead == false && agent.IsEnemy(_agent)) { // ***

                agentList.Add(agent);
            }
        }

        return agentList;
    }
    *** IsEnemy leads to the following relation check and another job implementation:
    Code (CSharp):
    public static int GetRelations(Faction faction1, Faction faction2) {
        // Faction = enum
        _relationDataCollection = new NativeArray<RelationData>(data, Allocator.TempJob);
        NativeArray<int> relationResult = new NativeArray<int>(1, Allocator.TempJob);
        IterateRelationsJob job = new IterateRelationsJob {

            relationDataArray = _relationDataCollection,
            faction1 = faction1,
            faction2 = faction2,
            relationResult = relationResult
        };

        JobHandle jobHandle_GetRelation = job.Schedule();
        jobHandle_GetRelation.Complete();
        int result = job.relationResult[0];

        _relationDataCollection.Dispose();
        relationResult.Dispose();

        return result;
    }
    Code (CSharp):
    [BurstCompile]
    public struct IterateRelationsJob : IJob {

        public NativeArray<RelationData> relationDataArray;

        public Faction faction1;
        public Faction faction2;

        public NativeArray<int> relationResult;

        public void Execute() {

            if(faction1 == faction2) {

                relationResult[0] = 256; // If factions are the same, we are friends anyway
            }
            else {

                for(int i = 0; i < relationDataArray.Length; i++) {

                    RelationData data = relationDataArray[i];

                    if(data.faction1 == faction1 && data.faction2 == faction2) {

                        relationResult[0] = data.relations;
                    }
                }
            }
        }
    }
    I was under the impression that no matter where you implemented Jobs, they would always increase performance. But my stress tests with a lot of agents running around had all the same outcomes in the profiler with or without Jobs.

    I must be using the Job System incorrectly. Should I rather rewrite the system and iterate over all agents in the scene in one big main Update loop and assign the jobs from there to the controllers (one job / agent)?
     
    Last edited: Jan 29, 2020
  7. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    @betaFlux Always profile, profile, profile first, it's the only way to understand what areas of your game need a performance boost.

    Otherwise you could be trying to optimise and multi-thread areas that do not need it.
     
    betaFlux likes this.
  8. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Just a few notes as I go over your code examples:
    • GetEnemiesInRangeNonAlloc
      • If you use layers and move entities onto separate layers, e.g. Player, DeadEnemy, Enemy, <Faction>, you can massively reduce the number of agents you collect and remove the filtering loop from your code.
    • GetDistanceJob
      • It looks like you are hard-coding the result to index 0 in a native array. You should be passing in a Native Array of end vectors and writing the distance results out to another array.
    • GetClosestEnemy
      • If you used a Native Array filled with the enemies' positions, your GetDistanceJob would be more performant.
    • GetRelations
      • Again it looks like you are using the Job System like a simple function rather than as intended: a batch-based function that can compare multiple sets of data. The relationResult[0] clearly shows this.
      • The thing is, unless factions can change allegiance mid-game, you only need to know whether an agent is friend or foe, and that can be done with a simple boolean variable and, for this type of searching, an Enemy or Faction layer.
    Anywhere you are trying to jobify and multi-thread your code, a loop or a fixed array index variable should help you realise that you are not actually multithreading it.

    The idea is to write code that will work through a batch of data and can do so as lots of jobs.

    This should help https://docs.unity3d.com/Manual/JobSystemParallelForJobs.html
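    For illustration, a rough sketch of the batched shape described above (type and field names are made up here): one parallel job computes every distance in a single Schedule() call instead of one job per enemy.
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using Unity.Mathematics;

    [BurstCompile]
    public struct DistancesJob : IJobParallelFor
    {
        [ReadOnly] public NativeArray<float3> enemyPositions; // one entry per enemy
        public float3 selfPosition;
        public NativeArray<float> distances;                  // one result per enemy

        public void Execute(int index)
        {
            distances[index] = math.distance(selfPosition, enemyPositions[index]);
        }
    }
    Usage would be something like: fill enemyPositions once, then new DistancesJob { enemyPositions = positions, selfPosition = self, distances = results }.Schedule(positions.Length, 32).Complete(); and pick the minimum on the main thread.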

     
    tonialatalo, Sarkahn and betaFlux like this.
  9. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Arowx, thanks for the excellent advice! I guess I'm going to rewrite a bit of code now. :)
     
  10. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    The profiler is the tool that you need to use in order to see if you should use ECS right now.
    It will show you what portions of the engine are Grunting the hardest. However.....

    Right away you should be using the Job System, the Burst Compiler, Native Containers, and Mathematics. There is no reason not to use them. But there is also no reason to change your current project, especially if it's near completion.

    The Job System is a great place to start. It's helpful to find the smallest system in your project and clock its speed as a baseline. Convert it to the Job System. Verify it runs. Enable Burst. EDIT: enable Burst one job at a time, once you know the job works without it.

    Start with something small, like isolating for-loops into single systems, and enable Burst. Start using the Mathematics library. Even if entities scare you, you can manage data without using Entities or Components. AND IT'S IMPRESSIVE.

    - Native Containers (NativeArray, NativeList, NativeSlice), e.g. NativeArray<int2>
    - Operate on the data in those containers using systems: IJob
    - Increase the speed of those systems, in almost every case, with Burst: [BurstCompile]
    - Use Mathematics library types where possible: int2, int3, int4, float4, bool4, ...

    Start, and don't look back.
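    As a rough sketch of that workflow (the names below are made up): a plain for-loop moved into an IJob over NativeArrays, using a Mathematics type, with Burst added only after the job has been verified.
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using Unity.Mathematics;

    [BurstCompile] // per the advice above, add this only after the job runs correctly without it
    public struct ScaleLengthsJob : IJob
    {
        [ReadOnly] public NativeArray<float3> input;
        public NativeArray<float> output;

        public void Execute()
        {
            for (int i = 0; i < input.Length; i++)
                output[i] = math.length(input[i]) * 2f;
        }
    }
    Scheduling it is just: new ScaleLengthsJob { input = input, output = output }.Schedule().Complete(); followed by disposing both arrays.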
     
    betaFlux likes this.
  11. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Thank you! Very motivating advice! My project is far from complete. I'm pausing development while learning about the "new ways". Reading your comment, I realize I've just scratched the surface of an overwhelming set of mechanics. After 8 years of programming in Unity I feel this was all just a joke and now I have to learn real programming, but as you say, it's worth it.

    I'm having a hard time finding good spots for the Job System. Currently I'm trying to implement a job-based distance check system, since the traditional way is rather computation-heavy with a minimum of 20 NPCs calculating proximity to each other.

    I know every game is different, but are there any general good fits for the Job System? Maybe whenever a loop is called in Update? That's what I'm concentrating on right now.
     
  12. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    One method I am using to convert:
    identify single systems which can be converted, and convert only one thing at a time.

    Yes, start by migrating loops away from the Main Thread. If you don't have a problem, don't do it. But in general, main loop is not the place the workers work.

    The Job System is exactly the tool for that.
    The side effect is that you can use [BurstCompile] attribute

    You will inevitably be forced into Native Containers, so embrace it. There is NO LEARNING HURDLE in using Native Containers, except that you MUST destroy them, and you decide when to do that. If it's temporary, Dispose() ASAP.
    If it's persistent, Dispose() in OnDestroy(). Simple rules to start with.
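    A tiny sketch of those two rules (names made up here, not from the thread): a temporary container is disposed right after use, a persistent one is released in OnDestroy().
    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;

    public class AgentDistancesHolder : MonoBehaviour
    {
        NativeArray<float> _distances; // persistent: lives as long as this component

        void Awake()
        {
            _distances = new NativeArray<float>(64, Allocator.Persistent);
        }

        void Update()
        {
            // temporary: created, used and disposed within the frame
            var scratch = new NativeArray<float>(16, Allocator.TempJob);
            // ... schedule/complete a job with scratch here ...
            scratch.Dispose();
        }

        void OnDestroy()
        {
            if (_distances.IsCreated)
                _distances.Dispose(); // persistent containers must be released manually
        }
    }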

    My first use and attempt was just getting rid of an allocation/assignment spike. It turns out there are many ways to do that. The one I chose, which I described in my previous post, has the side effect of simply using the hardware more efficiently.

    Here's a 5 dollar app, you need a $550 GPU to use it, on top of a decent machine.
    ^
    That by itself should explain the reality of just DON'T LOOK BACK, but start somewhere or you will feel discouraged, and I am here to tell you, It will all be okay.

    Response to Topic Name: You don't have a choice in using ECS, you have a choice to not use part of it today. Unity has made it clear that ship has sailed.
     
    Last edited: Jan 31, 2020
    betaFlux likes this.
  13. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    Everything that does "work" is a system performing a job.

    The main loop should manage the allocation of memory for the workers, then issue the list of Schedule() calls to jobs. The end of the main loop cleans up with Dispose() and Complete() when required, and not always Complete(): only before you want the data from a job that is not already complete. Depending on the allocation of memory, your job could be in process for multiple frames on a separate thread, if that is the requirement.
    You gain access to job scheduling via Jobs, provided in Unity.Jobs.

    Copy and create the DATA. Understand that it's really fast to copy a chunk of data temporarily and work on it, and all the while the system is never waiting for updated data; it just gets updated when it's completed and/or scheduled by the main loop. Other threads do the work, thus making your program (main loop) faster. Do not be afraid of using memory in this way; that is why you have control of it. It's useless memory if you do not use it.
    This commands you to use the NativeContainers provided under Unity.Collections.

    Copying or creating temporary DATA that you want to work on allows the active DATA to continue to be acted on. It's temporary, so it's a low-cost idea, and the memory is just sitting there, PILES OF IT. It's like the Scrooge McDuck cartoon where he swims in mountains of gold coins. (Don't get carried away, though.)

    If you operate on the principle that you don't want to wait for data, then don't become dependent on it. Have a working data set and an operational data set. It's /ONE/ of the reasons that it looks like twice as much typing to achieve the same thing.
    In that case there are twice the things to type, because Working + Operational = 2 things. A small sketch of this follows.
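    A compressed sketch of that pattern (a plain MonoBehaviour with names made up here): schedule at the top of the frame, read only the working copy during Update, then Complete() and copy at the end of the frame.
    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;

    public class DeferredJobRunner : MonoBehaviour
    {
        NativeArray<float> _working;      // what gameplay code reads this frame
        NativeArray<float> _operational;  // what the job writes into
        JobHandle _handle;

        void Awake()
        {
            _working = new NativeArray<float>(128, Allocator.Persistent);
            _operational = new NativeArray<float>(128, Allocator.Persistent);
        }

        void Update()
        {
            // Kick off the work; do NOT Complete() here.
            _handle = new FillJob { output = _operational }.Schedule();
            // ... the rest of Update reads _working, never _operational ...
        }

        void LateUpdate()
        {
            // Only now do we need the data, so this is where we wait (often it is already done).
            _handle.Complete();
            _operational.CopyTo(_working); // copy the fresh results into the working set
        }

        void OnDestroy()
        {
            _handle.Complete();
            _working.Dispose();
            _operational.Dispose();
        }

        struct FillJob : IJob
        {
            public NativeArray<float> output;
            public void Execute()
            {
                for (int i = 0; i < output.Length; i++)
                    output[i] = i * 0.5f; // placeholder work
            }
        }
    }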

    You end up "boiler plating" after an hour or two; you'll probably have a skeleton template that you just flop in, change the names, cut something out or add something, and save under a different name. After that it starts getting easier, and I mean easy. Because all the code files are very similar in style, shape, naming and !!LENGTH!!, it becomes really obvious when something is not quite right. I became accustomed quickly. The files are typically easy to understand, at least in my personal format.

    Establish or ADOPT a naming convention as an imperative. If you thought you were stringent already, you now have 3-8 additional namespaces to deal with by default, in addition to the original engine's namespaces, which in many cases parallel the DOTS ones.
    Not to mention the Unity.Mathematics types: there are over 20 more types right there. And that still doesn't address that you now have 2-3 additional containers of data for each type of represented data. The result is that naming rapidly becomes complex, and it will cause you significant "headaches" as your project grows in complexity. Unity is having an internal struggle with this issue to some degree, but they will sort it out on their end. And you still have to name more than twice as many types of things.

    !!!!BUT!!!!! you get the [BurstCompile] attribute, courtesy of the Unity.Burst package. If you don't understand what Burst is, then what you need to know is that there are secrets hidden within various CPUs, and Burst knows exactly how to find many of them, automatically. Just be sure that you apply Burst to ONE job at a time; otherwise you could overdose and have a stroke or coronary. It's really powerful... the most powerful single-word tool I have witnessed for as long as I can remember.

    Everything mentioned above does NOT REQUIRE Entities. So you can rest easy knowing that you can get intimate with the Job System and not have to start sleeping with its ugly cousin, not yet at least... The Entities package will continue to change. I feel confident, however, that when Unity places a version number of 1.0 or higher on something, and not (a) or (b), it's solid.
     
    Last edited: Jan 31, 2020
    CanardCoinCoin and betaFlux like this.
  14. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    In your current situation,
    create an array of distances and an array of positions.

    Create a job that gets the current positions of those NPCs and assigns the data to a temporary array. Schedule it.
    Don't force it to complete.

    Copy the array to the working set when .IsCompleted is true.

    Create another job that uses the working set to calculate the distances, like for (i = 0; i < #; i++) math.distance(.... , ....[i+..]), and assigns them to a temporary array.
    Don't force it to complete.

    Copy the temp array to the active array whenever ( .IsCompleted ).

    Create one more job to use the data for whatever you want...... and
    always use the active data set for operating, and don't worry if new data isn't ready until next frame. (Force this or not, it's up to you; it's really up to the game's demands.)

    The only thing in there that slows you down (if at all) is that you need to gather the data. But if nothing waits for that to happen, it doesn't need to hold you up. It can occur over more than one frame (see the sketch below).

    You will default to Read/Write, so SPECIFY [ReadOnly] explicitly whenever
    you have no intention to write data.
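    A rough sketch of that recipe (all names made up here; it assumes a TransformAccessArray built once from the NPC transforms): one job gathers positions into a temporary array, a dependent job turns them into distances, and results are copied into the active array only once the handle says it is done. The actual property is spelled JobHandle.IsCompleted.
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using Unity.Mathematics;
    using UnityEngine.Jobs; // TransformAccessArray, IJobParallelForTransform

    // Job 1: copy the current NPC positions into a temporary array.
    struct GatherPositionsJob : IJobParallelForTransform
    {
        public NativeArray<float3> tempPositions;

        public void Execute(int index, TransformAccess transform)
        {
            tempPositions[index] = transform.position;
        }
    }

    // Job 2: turn the gathered positions into distances from one origin.
    [BurstCompile]
    struct DistancesFromOriginJob : IJob
    {
        public float3 origin;
        [ReadOnly] public NativeArray<float3> tempPositions;
        public NativeArray<float> tempDistances;

        public void Execute()
        {
            for (int i = 0; i < tempPositions.Length; i++)
                tempDistances[i] = math.distance(origin, tempPositions[i]);
        }
    }

    // Scheduling, without forcing completion (npcTransforms is a TransformAccessArray
    // built once from the NPC transforms; tempPos, tempDist and activeDistances are
    // persistent NativeArrays owned by the caller):
    //
    //   JobHandle gather = new GatherPositionsJob { tempPositions = tempPos }.Schedule(npcTransforms);
    //   JobHandle dist   = new DistancesFromOriginJob { origin = self, tempPositions = tempPos, tempDistances = tempDist }
    //                          .Schedule(gather);   // runs after the gather job
    //
    //   // Later, possibly next frame:
    //   if (dist.IsCompleted) { dist.Complete(); tempDist.CopyTo(activeDistances); }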

    Spoiler:
    Try NativeArray<float4> and store the distance in the .w component.
    You can swizzle it to make it disrespect data types a little:
    v1.w = v2.w = math.distance(v1.xyz, v2.xyz)
    or
    v1.w = math.distance(v1.xyz, v2), where v1 is a float4 and v2 is a float3....

    One note: you cannot assign to the .w, .x, .y or .z component values of an element from outside a NativeContainer.

    Meaning ArrayOfStuff[i].x = something is not valid. I just don't want you to do that and spend an hour thinking it's screwed up; it just might not throw an exception.

    ArrayOfStuff[i] = something is valid, assuming types are respected.

    I'll allude to why this is amazing if you can't see it, but I hope you can see that you can copy two sets of data in one copy instruction. That is just another tip of the iceberg, not to mention how tightly packed this data is for other purposes.
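    A tiny sketch of both points (the array name is made up): pack the distance into .w, and remember that a NativeArray element must be copied out, modified, and written back as a whole.
    Code (CSharp):
    using Unity.Collections;
    using Unity.Mathematics;

    public static class Float4PackingExample
    {
        public static void Demo()
        {
            var points = new NativeArray<float4>(2, Allocator.Temp);

            // points[0].w = 1f;     // not allowed: the indexer returns a copy, not a reference
            float4 a = points[0];    // copy the whole element out
            float4 b = points[1];
            a.w = b.w = math.distance(a.xyz, b.xyz); // xyz holds the position, w stores the distance
            points[0] = a;           // writing the whole float4 back is a single assignment
            points[1] = b;

            points.Dispose();
        }
    }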

    To access x/y data in a single int loop, where the array is evenly dimensioned (square), or from a function whose input requires separate coordinates:

    This is useful to simply convert stuff to jobs where it doesn't make sense to approach the work in a completely different way. NativeContainers are not [x,y] accessible, and that is a bad way to store and access data anyway. BUT if you were doing that before, it's okay; what I just explained makes it really easy to mash out the conversion from standard loops to jobs.

    from 650ms to 8.6ms

    The loop below now runs at 8.6 ms.... and that includes writing to a 1024x1024 Texture2D and calling Apply().

    This loop started at over 650 ms (main thread).
    NativeContainer and IJob implemented: 300-330 ms

    [BurstCompile]: 60-65 ms
    [ACheapTrick]: ~40 ms


    for(x = 0 ......
    for (y = 0 ........
    result = fN(x, y)

    replaced with
    for(i = 0.....; i++) result = fN(math.floor(i/width), math.frac(i/width) * width)

    I apologize that this does require "boiler plate", but it takes two clicks and a couple of pasted names, so it's just as fast to implement.

    fN is Unity.Mathematics.noise.cnoise() in the actual example from last night,
    inside an IJob assigning values from cnoise to a NativeArray.
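    A rough sketch of that flattened-index loop, with noise.cnoise standing in for fN (grid size and names are made up here):
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using Unity.Mathematics;

    [BurstCompile]
    public struct NoiseFillJob : IJob
    {
        public int width;                 // e.g. 1024 for a 1024x1024 grid
        public NativeArray<float> result; // length = width * width

        public void Execute()
        {
            for (int i = 0; i < result.Length; i++)
            {
                // recover x/y from the flat index instead of nesting two loops
                float x = math.floor(i / (float)width);
                float y = math.frac(i / (float)width) * width;
                result[i] = noise.cnoise(new float2(x, y) * 0.01f); // scaled so neighbouring samples differ
            }
        }
    }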


    Test it out for yourself. =) I have a feeling you will have Burst-compiled jobs running on a separate thread within like an hour!

    HUGE TIP: INCONSISTENT LINE ENDINGS
    When you commit a change in your IDE, have it correct inconsistent line endings before saving. They screw Burst up at the moment, and the mistake is undetectable: Burst will do unexpected things and drive you nuts with no errors, until you disable it so you can read the error, or until a NativeContainer doesn't get Disposed as a result and tells you something is not right. That is a huge concern, and you may waste a lot of time fixing things that were never broken. SO APPLY BURST ONE JOB AT A TIME, and follow up by disabling Burst entirely when you are finished working, and see if there are actually any errors.
     
    Last edited: Jan 31, 2020
    betaFlux likes this.
  15. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    https://docs.unity3d.com/ScriptReference/Unity.Jobs.IJob.html

    That's where I found my basic boilerplate for this type of conversion (fast and dirty use of jobs and containers).
    I removed all the comments and types and flattened it as much as possible. It's easy to read and work-ready.
    I also removed the Update() method and the MonoBehaviour, and replaced it with:

    using Unity.Collections;
    using Unity.Jobs;

    public static class LoopIJob {
        // [BurstCompile] // Uncomment after the job is tested without Burst
        public struct ForILoopJob : IJob {
            [ReadOnly] public NativeArray<int> ReadArray;
            public NativeArray<int> WriteArray;
            public int Value;
            public void Execute() {
                int i;
                for(i = 0; i < WriteArray.Length; i++) { } // placeholder bound; loop goes here, this is the magic of BoilerPlating
            }
        }
        public static JobHandle StartJob(NativeArray<int> readArray, NativeArray<int> writeArray, int value) {
            var job = new ForILoopJob() {
                ReadArray = readArray,
                WriteArray = writeArray,
                Value = value
            };
            JobHandle jobHandle = job.Schedule();
            return jobHandle;
        }
    }

    And I have 4 versions now which I use commonly. I look through my code, find a loop, see which template fits, then convert. Test, apply Burst, then move on, unless you want to refactor. You'll end up with your own templates, which you will quickly identify, loop by loop. It just snowballs. It will allow you to see your existing code operating on multiple threads and utilizing low-level optimizations, with literally minimal work.

    I hope this helps. Everyone keeps saying Entity this and Component that. I needed to find a way to actively learn the new ideas without starting a new project from nothing. Now I feel comfortable with the idea that I need to make entities, but I still haven't even gone there yet; I'm still getting comfortable with the way I think about my program and how to set up the systems, because it's not completely obvious until you can make the connections in your brain. I also felt like, man,
    "This is really not going to be okay, I don't understand any of it." It was depressing because I felt useless. But I feel so much better now because I can keep up with the competition again.
     
    Last edited: Jan 31, 2020
    betaFlux likes this.
  16. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Wow, thanks for these in depth answers, mec! A lot of helpful info here!

    Can native containers be used anywhere in code, replacing generic lists and such, even without jobs? Also, is there a rule for when to use Allocator.Temp vs Allocator.TempJob?

    I thought as much. As soon as ECS can handle fbx, navmesh and the animator, I'm in for it all the way. I think I'll focus on Jobs and Burst for now.

    I'm still not sure if the way I do it is correct:
    NPC.cs -> void UpdateLoop -> For(AllEnemiesInScene) -> DistanceCheckJobs
    Would it be even more performant if I did this?:
    NPCManager.cs -> void UpdateLoop -> LoopThroughAllNPCsJob -> NPC.UpdateLoop -> For(AllEnemiesInScene) -> DistanceChecks

    When you say "Apply Burst to one job at a time" do you mean I should put it only on one chosen job in the whole project, make sure it doesn't break anything then implement the next? I've put the BurstCompile attribute above every job by default.

    Yes, I'm also looking forward to it! I like how ECS handles things and forces me to write clean code. After digging a bit into it, every time I build a new MonoBehaviour and start GetComponent-ing, it feels ugly and wrong. So I started to convert part of the code base to a custom MVVM system, very similar to the structure of Quill18's Base Building Game. It's so much more satisfying to work on!

    Thanks for all the helpful tips, though I have to say I do not get most of it. I really have to study it more to be able to grasp casual talk about this stuff. From what you wrote I understand that I don't have to .Complete a JobHandle, but how and why is not clear to me.

    This is how my DistanceCheckClass looks right now:
    Code (CSharp):
    public override void Update()
    {
        _closestEnemy = GetClosestEnemy_WithJobs(visionRange);

        if(_closestEnemy != null)
        {
            _agent.Execute_SetAttackTarget(_closestEnemy);
        }
    }

    AgentMonoController GetClosestEnemy_WithJobs(float range)
    {

        AgentMonoController closestEnemy = null;

        AgentMonoController[] enemies = GetEnemiesInRangeNonAlloc(range).ToArray();
        float3[] enemyPositions = enemies.Select(e => (float3)e.m_transform.position).ToArray();

        NativeArray<float3> positions = new NativeArray<float3>(enemyPositions, Allocator.TempJob);
        NativeArray<float> distances = new NativeArray<float>(enemyPositions.Length, Allocator.TempJob);

        if(_agent != null)
        {

            DistanceCheckJob job = new DistanceCheckJob()
            {
                positions = positions,
                selfPosition = _agent.transform.position,
                distances = distances
            };
            JobHandle jobHandle = job.Schedule(distances.Length, 1);
            jobHandle.Complete();

            float currDist = Mathf.Infinity;
            for(int i = 0; i < distances.Length; i++)
            {
                if(distances[i] < currDist)
                {

                    currDist = distances[i];
                    closestEnemy = enemies[i];
                }
            }
        }

        distances.Dispose();
        positions.Dispose();

        return closestEnemy;
    }

    [BurstCompile]
    struct DistanceCheckJob : IJobParallelFor
    {

        public NativeArray<float3> positions;
        public NativeArray<float> distances;
        public Vector3 selfPosition;

        public void Execute(int i)
        {

            distances[i] = math.distancesq(selfPosition, positions[i]);
        }
    }
    This is what I could come up with thanks to all the input from you and the others. I'm trying to get the hang of it and see how to implement the tricks that you mentioned, but it seems too tricky for my knowledge. ^_^

    Interesting, I'll have a look at that! I appreciate every reusable job template I can get my hands on. Most examples on the internet are very specific and hard to learn from.
     
    Last edited: Jan 31, 2020
  17. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    Yes! Every bit of info helps, but even more so if the info comes from someone who knows how it is to feel dumb in the beginning and where the difficulties are. Thanks for your input!
     
  18. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    I was looking at your code and I understand where the confusion comes from. I'll share a demonstration project.
     
  19. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    You can; there are limitations on what they can contain (they cannot contain nullable types). They allow you to share information between managed and unmanaged memory. I use NativeArray and NativeList in a lot of places, for almost everything.

    Persistent is pretty much an array of aligned memory that you are responsible for releasing. It will feel like [] array does, except it is accessible outside of the managed environment.

    TempJob is for jobs; it expires after 4 frames.

    Temp is usable from the main thread and not from a job; it expires after 1 frame.

    I was reading an article written by a Unity developer and got the impression that using Temp is not particularly common for many use cases, but that it is the fastest allocation and expires the fastest. So it feels finite and delicate, the way I perceive it. However, clarification on the extent to which we can rely on Temp is still not entirely obvious.

    Something to think about:
    Temp allocation might be a finite reserved area of memory that doesn't really do any alignment checks or clearing, so that it can allocate extremely fast. In that case it would be preferable for COPY operations; I'll remember to update this if I find out.
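    As a rough sketch of the two short-lived allocators described above (names made up; my reading of the lifetimes): Temp for same-frame scratch work, TempJob for data handed to a job.
    Code (CSharp):
    using Unity.Collections;
    using Unity.Jobs;

    public static class AllocatorLifetimes
    {
        struct ClearJob : IJob
        {
            public NativeArray<int> data;
            public void Execute()
            {
                for (int i = 0; i < data.Length; i++) data[i] = 0;
            }
        }

        public static void Example()
        {
            // Allocator.Temp: fastest, scratch space, dispose within the same frame.
            var scratch = new NativeArray<int>(16, Allocator.Temp);
            scratch.Dispose();

            // Allocator.TempJob: for containers a job reads/writes; dispose within a few
            // frames, after the job using it has completed.
            var jobData = new NativeArray<int>(16, Allocator.TempJob);
            new ClearJob { data = jobData }.Schedule().Complete();
            jobData.Dispose();
        }
    }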
     
    Last edited: Feb 1, 2020
  20. mechanimal1

    mechanimal1

    Joined:
    Aug 22, 2019
    Posts:
    11
    It's advisable to ensure that the code operates correctly before Burst is enabled on that job. As you enable more features, with anything in the world, you are increasing the complexity, and unexpected results can occur. By enabling Burst on one job at a time, you have a basis for knowing what caused something.

    If you take the option below into account a situation could occur where you have waited for many seconds, or possibly minutes, and this isn't normal behavior for your project. You may wrongly assume there is a problem. It may lead you to needless investigation or potentially even down a road of "Fixing what is not broken".... I can't be the only one....

    Synchronous compilation
    Applies only to the Editor
    Compiles the burst instructions ahead of time, when using the Editor. The compilation occurs when you press play in the Editor window. When synchronous compilation is not used, the Job code will execute normally until the burst compile completes.

    It's common to notice this after you have made changes to the job code. (2019.3)

    The side effects of turning on Synchronous:
    1. The editor may actually freeze for a number of seconds when you first press play after changing the job code.
    2. The test simulation will not suffer a performance loss from having to compile while it's simulating.
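    For reference, a small sketch of setting this per job via the attribute (the job itself is just a placeholder of mine); if I recall correctly there is also an editor-wide toggle under the Jobs > Burst menu.
    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;

    [BurstCompile(CompileSynchronously = true)] // compile before the first run; the editor may hitch on Play
    public struct WarmCompiledJob : IJob
    {
        public NativeArray<int> data;

        public void Execute()
        {
            for (int i = 0; i < data.Length; i++)
                data[i] *= 2;
        }
    }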
     
    Last edited: Feb 1, 2020
  21. betaFlux

    betaFlux

    Joined:
    Jan 7, 2013
    Posts:
    112
    It's like reading a very informative PDF guide, thanks so much! For now I'm going to change some Allocators and maybe implement some more native containers. :)
     
  22. sheikhg1900

    sheikhg1900

    Joined:
    Sep 13, 2021
    Posts:
    8
    With ECS I found a major boost in performance.
    But I think it is not ready for production releases. If you are planning to build an Android game, you will be in trouble when you decide to publish it. You will face different kinds of issues.
    You will be able to make a Mono build, but it will not be accepted by the Google Play Store.
    An IL2CPP build will only work in development mode, which again will not be accepted by the Google Play Store.
    I developed a complete application in ECS and am now struggling to publish it to the Play Store. So far no luck :-( .
    I think I now have to rewrite the code without ECS.
    A very painful experience with ECS development.
    My suggestion: if you want to make a mobile app with ECS, keep testing on mobile during development. Don't forget to use IL2CPP mode; it is the only option that will be accepted by the Google Play Store.

    One more thing I learned during the development of my game: I shouldn't have used preview packages. These packages are not ready for production releases (as Unity itself suggests).
     
    brunoliveiraaraujo likes this.