Performance Optimisation and Pool Managers

Discussion in 'Scripting' started by Aedous, Jun 2, 2020.

  1. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Hey guys! I've got a few questions about optimisation in the game I'm working on, and was wondering if anyone could chime in with their knowledge. I'm using Unity 2018.1.3f1 to develop.

    I've created a Pool Manager to store the majority, if not all, of the objects that will be spawning, and these are all instantiated during the first load when the game starts. I'm using ScriptableObjects as keys into a dictionary, so I can easily identify which pool to grab cached GameObjects from (not sure if this is the best way to do things).
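    Roughly, the setup looks something like this (a simplified sketch with placeholder names, not my exact code):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    // Simplified sketch of a ScriptableObject-keyed pool manager (placeholder names).
    public class PoolManager : MonoBehaviour
    {
        [System.Serializable]
        public struct PoolEntry
        {
            public ScriptableObject key;   // ScriptableObject used as the dictionary key
            public GameObject prefab;
            public int count;
        }

        public PoolEntry[] entries;

        private readonly Dictionary<ScriptableObject, Queue<GameObject>> pools =
            new Dictionary<ScriptableObject, Queue<GameObject>>();

        void Awake()
        {
            // Pre-instantiate everything during the initial load so nothing is instantiated mid-game.
            foreach (var entry in entries)
            {
                var queue = new Queue<GameObject>(entry.count);
                for (int i = 0; i < entry.count; i++)
                {
                    var go = Instantiate(entry.prefab);
                    go.SetActive(false);
                    queue.Enqueue(go);
                }
                pools.Add(entry.key, queue);
            }
        }

        public GameObject Spawn(ScriptableObject key)
        {
            var go = pools[key].Dequeue();
            go.SetActive(true);
            return go;
        }

        public void Despawn(ScriptableObject key, GameObject go)
        {
            go.SetActive(false);
            pools[key].Enqueue(go);
        }
    }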

    Question 1: Is it best to call GC.Collect after everything has been spawned at the start, since simply instantiating a bunch of GameObjects will cause some memory allocation?

    I've also been profiling my code and trying to reduce as many of the GC allocations that come up during gameplay as I can, and I've noticed that simply using delegates can cause some allocation. I normally subscribe some of my GameObjects to delegate events in 'OnEnable' using '+=' and then unsubscribe in 'OnDisable' using '-='. What shows up in the profiler is 'Delegate.Combine()'.

    Question 2: What methods are there that I can use to reduce GC Allocation with delegates?

    Any help would be greatly appreciated :).
     
    Last edited: Jun 2, 2020
  2. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,294
    For question 1: no. You've instantiated a bunch of objects, so there's nothing to GC. The GC cleans up stuff that's not referenced anymore.

    For question 2: you can't. If you want to get around the allocations, you do this manually - give the things that need the callback an interface, and implement the callback by adding them to a list and calling a method on all the objects in the list.
     
  3. _met44

    _met44

    Joined:
    Jun 1, 2013
    Posts:
    633
    Hey, you have to cache your delegate in order to avoid recreating it every time you pass it to a method.
     
  4. Depends on the technique used. The best way to find out is to use the Profiler. There are some methods that allocate even when you don't expect it.

    Delegates can be one of the worst offenders when it comes to sneaky garbage. Make sure you always register a method directly and avoid registering lambda delegates entirely.
    Also keep your register/unregister cycle minimal (I know it's a delicate balance between unneeded delegate calls and registers/unregisters).
     
  5. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Thanks for the response. I was actually pretty afraid of this, as I use events as my go-to tool for keeping my code readable and manageable. Are you suggesting that staying away from delegates entirely is probably the best route for creating optimised code?

    I looked this up and don't quite understand how to cache my delegates. Here is a sample of my code; I'm not sure how caching will apply to it, so maybe you could chime in a bit on what you mean :).

    Code (CSharp):
    public delegate void DamageTriggers(DamageOutput damageoutput);
    public DamageTriggers OnStartWithReference;

    // Sample subscribe on enable
    damageOutput.OnStartWithReference -= OnDamageOutputStarted;
    damageOutput.OnStartWithReference += OnDamageOutputStarted;

    // Sample unsubscribe on disable
    damageOutput.OnStartWithReference -= OnDamageOutputStarted;
    Yep, I've been using the profiler to pinpoint the GC.Alloc calls and what's causing the lag spikes. I wish I'd known about delegates causing memory allocation, as I use them quite a lot. By avoiding the lambda expression, do you mean not using '+=' and '-='? That may be an issue, as I have a parent object that fires some messages and child objects that use the parent object as a sort of guideline to figure out when they need to do things, so multiple child objects could be listening to one particular event of the parent object.
     
  6. I mean avoid the

    Code (CSharp):
    Something.SomeAction += () => doSomething();
    style, or any derivative with parameters. These basically make it impossible to unregister as well.

    Instead, if and when you use delegates, it should be the
    Code (CSharp):
    [...]
    Something.SomeAction += doSomething;
    [...]

    void doSomething() {
        // do something
    }
     
  7. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Yep, I basically subscribe/unsubscribe most of my events using '+=' and '-=' just to avoid the lambda expressions, but I still get the memory allocation either way :(
     
  8. Memory allocation is not your enemy. The GC is.

    The best way to minimize the performance implications of delegates:
    - try to register all of your delegates at startup time
    - try not to unregister your registered listeners until exit time
    - avoid lambdas "at all cost" (this is probably my personal flaw)

    If you don't tweak your delegates during runtime, there should be no allocation (provided you don't allocate inside your delegates, obviously).
    The GC problem appears when you constantly register/unregister listeners. Every time you do that you create some allocation, and there is a chance that the abandoned action object becomes garbage for the GC, so the next time you register, a new one has to be created.
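    To make that concrete, a deliberately bad, hypothetical example (illustrative names, not code from this thread):

    Code (CSharp):
    using System;
    using UnityEngine;

    // Illustrative worst case: every '+=' converts the method group into a new delegate
    // object (and Delegate.Combine/Remove may build new invocation lists rather than
    // mutating the old ones), so doing this per frame churns the GC.
    public class SubscribeChurnExample : MonoBehaviour
    {
        public event Action onSomething;

        void Update()
        {
            onSomething += Handler;   // allocates a delegate
            onSomething -= Handler;   // allocates again for the comparison delegate
        }

        void Handler() { }
    }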
     
  9. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Hmm, thanks for that. I was wondering about this as well, particularly:

    - try to register all of your delegates at startup time
    - try not to unregister your registered listeners until exit time

    I always subscribe/unsubscribe events in 'OnEnable' and 'OnDisable' as I thought this was 'best practice'. Perhaps if I only subscribed in Awake and unsubscribed in OnDestroy this would have better results (something like the sketch below). I do read a lot of articles suggesting to sub/unsub in 'OnEnable' and 'OnDisable'; I guess that's a bad idea for objects that are constantly spawned using pooling.
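    For reference, a rough sketch of what I mean, reusing the DamageOutput/OnStartWithReference names from my earlier snippet (illustrative only, not my actual code):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical pooled listener that subscribes once for its whole lifetime
    // instead of re-subscribing every time it is pulled from the pool.
    public class PooledListener : MonoBehaviour
    {
        public DamageOutput damageOutput;   // assumed reference, as in the earlier snippet

        void Awake()
        {
            // One subscription (and one allocation) for the lifetime of the pooled object.
            damageOutput.OnStartWithReference += OnDamageOutputStarted;
        }

        void OnDestroy()
        {
            damageOutput.OnStartWithReference -= OnDamageOutputStarted;
        }

        void OnDamageOutputStarted(DamageOutput output)
        {
            // Early out while the object sits disabled in the pool.
            if (!isActiveAndEnabled)
                return;

            // ...actual response...
        }
    }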
     
  10. Now, it is your job to decide which one is better/worse: keep the subscription of the frequently spawned pooled objects and early-out in a potential invoke, or unregister/re-register in OnEnable/OnDisable. The first is a little CPU overhead; the second is garbage, and eventually CPU (GC).
     
  11. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Just did a quick test, and it does in fact reduce the memory allocation if I just subscribe once in Awake and then remove the subscription when the object is destroyed, which makes a lot of sense given what I've researched, which states that essentially '+=' will always allocate memory.
    Therefore my thinking is that if something doesn't need to constantly re-register its events and can do it just once, then maybe it's best to do just that: have the messages set themselves up once when they are needed, and completely remove them when they are destroyed.

    The next issue that I'm trying to work around is adding dynamic objects that need to subscribe to events of the parent. I'm leaning towards using the interface option rather than events to handle this, which should hopefully reduce the number of times an object needs to allocate memory when it gets added to a list.
     
  12. _met44

    _met44

    Joined:
    Jun 1, 2013
    Posts:
    633
    You can just create a delegate field and assign your method to it in Awake; then, instead of passing your method to += and -=, you pass that field. This way the delegate instance does not need to be recreated: you're reusing an existing one and avoiding the allocation.


    Code (CSharp):
    YourDelegateType _yourMethodDelegate;

    void Awake()
    {
        _yourMethodDelegate = this.YourMethod;
    }

    void OnEnable()
    {
        // register delegate
        thatTargetObject.TheEvent += _yourMethodDelegate;
    }

    void OnDisable()
    {
        // unregister delegate
        thatTargetObject.TheEvent -= _yourMethodDelegate;
    }

    void YourMethod()
    {
        // blabla
    }
    I'll add this to explain: when you pass your method directly as a callback, what it really does is create a new delegate instance for you automatically, hence the allocation if you don't cache it.
    Code (CSharp):
    thatTargetObject.TheEvent += YourMethod;

    // is in fact the same as:

    thatTargetObject.TheEvent += new YourDelegateType(YourMethod);
     
    Last edited: Jun 3, 2020
  13. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,294
    I think that when you're jumping through that many hoops in order to make the callback work, the interface-list version ends up being easier to read, and a lot easier to reason about.
     
  14. _met44

    _met44

    Joined:
    Jun 1, 2013
    Posts:
    633
    Do you mind showing a code example @Baste? I might be missing something, as I don't get how it would end up being less code, but if you have a fancy trick to set this up like that I'd love to learn about it please.
     
  15. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Yep, I would like to see your implementation as well @Baste, if you wouldn't mind. I've actually tried out the method of caching the function and then subscribing with it when you first mentioned it; it does produce less garbage, but still produces some, so I was glad that it did actually make a difference to garbage collection, roughly about half.
    Will have to test whether the allocation only happens on the first subscription and perhaps doesn't happen again on the subscriptions after that.

    Thanks for the info guys, this is really helping :), but yes, I would definitely like to see your interface implementation @Baste, as I think you are right that it will produce less garbage in the long run and may be a lot more readable.

    I just recently re-watched this video on developing Inside stutter-free at 60 FPS.
    Mention of event subscription
    Question on event subscription

    Funnily enough, they do mention that subscribing/unsubscribing is quite expensive over runtime, and they mentioned using their own interface with a list to cycle through to reduce it.
     
  16. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,294
    Sure. I don't do it much (because I generally just accept the perf hit of delegates), but I'm pretty sure this is what PlayDead is talking about. It's just how you'd have written it if delegates didn't exist:

    Code (csharp):
    // less allocating delegate code:
    public class Foo : MonoBehaviour {
        public event Action spacePressed;

        void Update() {
            if (Input.GetKeyDown(KeyCode.Space))
                spacePressed?.Invoke();
        }
    }

    public class Bar : MonoBehaviour {
        public Foo foo;
        private Action report;

        void Awake() => report = Report;

        void OnEnable() => foo.spacePressed += report;
        void OnDisable() => foo.spacePressed -= report;

        private void Report() {
            Debug.Log("Space was pressed");
        }
    }

    // interface version:
    public interface ISpacePressedListener {
        void SpacePressed();
    }

    public class Foo : MonoBehaviour {
        private List<ISpacePressedListener> spaceListeners = new List<ISpacePressedListener>(10);

        private void Update() {
            if (Input.GetKeyDown(KeyCode.Space))
                foreach (var listener in spaceListeners)
                    listener.SpacePressed();
        }

        public void AddSpaceListener(Bar bar) => spaceListeners.Add(bar);
        public void RemoveSpaceListener(Bar bar) => spaceListeners.Remove(bar);
    }

    public class Bar : MonoBehaviour, ISpacePressedListener {
        public Foo foo;

        void OnEnable() => foo.AddSpaceListener(this);
        void OnDisable() => foo.RemoveSpaceListener(this);

        public void SpacePressed() {
            Debug.Log("Space was pressed");
        }
    }
    So all the extra lines of code come from the definition of the interface; other than that it's the same thing.

    EDIT: oh, and if you're only ever sticking things of the type Bar in that list, don't create ISpacePressedListener, just have Foo have a List<Bar>. Keep It Simple, Stupid.
     
    MartinTilo likes this.
  17. Aedous

    Aedous

    Joined:
    Jun 20, 2009
    Posts:
    244
    Thanks for that! Will have to go through my code and re-organise some things.

    I don't mind the memory allocation if I don't really have a choice, or if it's something that happens just once in the lifetime of the game. However, I do want to try and reduce the garbage generated by events on things that are spawned during runtime, to cut down the spikes that appear when playing.

    I'll probably end up using a combination of both styles and just select which one according to the scenario.

    Thanks for the help! :)