
Force Editor to release memory?

Discussion in 'Editor & General Support' started by gilley033, May 7, 2020.

  1. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I have an editor tool which is importing external heightmap files and creating terrains from them. Part of the tool is importing tile sets, which could be of extremely large sizes. Unity obviously has memory limitations, which I thought I got around by adding the ability to convert the generated terrains to prefabs, and then removing the terrains used to generate those prefabs from the scene.

    I use GameObject.DestroyImmediate to destroy the terrains once they are converted to a prefab asset, and try to force memory freeing by calling EditorUtility.UnloadUnusedAssetsImmediate and GC.Collect.
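    For reference, the conversion and cleanup step boils down to something like this (a condensed sketch of what I just described; the prefab path and the use of PrefabUtility.SaveAsPrefabAsset are stand-ins for my actual tool code):

    Code (CSharp):
    using System;
    using UnityEditor;
    using UnityEngine;

    public static class TerrainPrefabConversion
    {
        // Condensed sketch, not the actual tool code.
        public static void ConvertAndRelease(GameObject terrainObject, string prefabPath)
        {
            // Save the generated terrain as a prefab asset, then remove the scene instance.
            PrefabUtility.SaveAsPrefabAsset(terrainObject, prefabPath);
            GameObject.DestroyImmediate(terrainObject);

            // Try to force the editor to give the memory back.
            EditorUtility.UnloadUnusedAssetsImmediate();
            GC.Collect();
        }
    }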

    However, memory continues to increase and if enough tiles are imported, the editor will crash.

    It seems to me that although I am destroying the terrain game objects, somehow the scene is still referencing them. This seems likely as I have tested destroying the prefab assets themselves after they are created, and when I try to save the active scene where the terrain game objects were created/destroyed, I get this error:

    Code (CSharp):
    Component could not be loaded when loading game object. Cleaning up!
    UnityEditor.SceneManagement.EditorSceneManager:SaveScene(Scene)
    TerrainImporter.TerrainImporter:RemoveDuplicateTerrainInScene() (at E:/Backup/GithubRepositories/Terrain-Importer-Library/TerrainImporter/TerrainImporter.cs:866)
    TerrainImporter.TileSetImporter:Import() (at E:/Backup/GithubRepositories/Terrain-Importer-Library/TerrainImporter/TileSetImporter.cs:126)
    TerrainImporter.Importer:InitializeImport() (at E:/Backup/GithubRepositories/Terrain-Importer-Library/TerrainImporter/Importer.cs:51)
    TerrainImporter.TerrainImporterObjectEditor:InitializeImport() (at E:/Backup/GithubRepositories/Terrain-Importer-Library/TerrainImporter/TerrainImporterObjectEditor.cs:50)
    UnityEditor.EditorApplication:Internal_CallDelayFunctions()

    Looking at the profiler, I see ManagedHeap.UsedSize growing, so clearly something from the heap is not being freed.

    I'm not sure what I am doing wrong. I feel that the two methods I call to free memory should be enough, so long as there are no references to the terrain, which there are not. Is there another method I should be calling? Has anyone got such a tool working correctly, where you are generating prefabs via an editor tool?
     
    the_unity_saga likes this.
  2. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I also tried this sequence of method calls and it doesn't help.

    Code (CSharp):
    AssetDatabase.SaveAssets();
    GC.Collect();
    EditorUtility.UnloadUnusedAssetsImmediate();
    AssetDatabase.Refresh();
    Sometimes ManagedHeap.ReservedUnusedSize increases and UsedSize still increases as well, though not by as much.
     
  3. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Some more info: here is the Simple profile analysis before and after my editor tool runs (here it is importing four heightmaps and creating four terrain prefabs).

    Before:
    upload_2020-5-6_23-9-23.png

    After
    upload_2020-5-6_23-9-41.png

    Of note, Textures and Total Objects in Scene go up by 17 and 3 respectively; however, Textures memory remains at 5MB. Assets decreases from 1584 to 1544, which I don't understand.
     
  4. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Other things I have tried:

    Deleting the temporary textures that are assigned to the terrain alphamap using DestroyImmediate(obj, true) instead of AssetDatabase.DeleteAsset.

    Calling Resources.UnloadAsset(terrain.terrainData) before destroying the terrain in the scene. I also tried calling this on the terrainData of the terrain prefab.

    Setting terrain.hideFlags = HideFlags.HideAndDontSave;
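
    Condensed, those attempts look roughly like this (just a sketch; "terrain" and "tempAlphamapTexture" are illustrative names, not the actual tool code):

    Code (CSharp):
    // Sketch of the attempts listed above.
    Object.DestroyImmediate(tempAlphamapTexture, true);   // instead of AssetDatabase.DeleteAsset
    Resources.UnloadAsset(terrain.terrainData);           // before destroying the scene terrain
    terrain.hideFlags = HideFlags.HideAndDontSave;
    Object.DestroyImmediate(terrain.gameObject);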

    Please, I have seen several threads about this issue and they all remain unsolved. Is there no Unity technician that can comment on this issue?
     
  5. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I downloaded the Memory Profiler package but I guess I'm not using it right, because when I take the difference between a pre-tool use snapshot and post-tool use snapshot, it doesn't show the memory increasing much.

    upload_2020-5-7_16-4-26.png

    5039 new items in the post-tool use snapshot, but the Owned Size of these new items is only 368.4KB and Native Size only 2.3 MB.

    Yet the task manager shows the editor increasing by nearly 800MB.
     
  6. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    It looks like I have figured out what is going on. I allocate what I thought were temporary arrays in different places in the tool. These arrays are scoped to local methods, so their memory should theoretically be freed once the method finishes executing and the garbage collector runs.

    For whatever reason, however, they are not freed. I don't know if this is some weird gotcha because the tool effectively runs in a single editor frame, or if the garbage collector is just not able to run in the editor. Even if I run the tool and then try to force the garbage collector to run via a separate command, well beyond the frame that the tool was run in, the memory is not freed.

    I have tried to reduce new allocations by reusing arrays, however there are some spots where this is just not possible. For instance, I am using Texture2D.GetPixels32 and Texture2D.EncodeToPNG, both of which allocate new arrays.
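
    Where I do control the allocation, the reuse looks something like this (a sketch that assumes every tile shares the same heightmap resolution; ReadHeightsInto and GetTerrainDataForTile stand in for my own code):

    Code (CSharp):
    // Sketch of buffer reuse across tiles; ReadHeightsInto and GetTerrainDataForTile are placeholders.
    float[,] heightsBuffer = new float[resolution, resolution];   // allocated once

    foreach (string tilePath in tilePaths)
    {
        ReadHeightsInto(tilePath, heightsBuffer);                 // overwrite the same buffer for each tile
        TerrainData tileData = GetTerrainDataForTile(tilePath);   // placeholder lookup
        tileData.SetHeights(0, 0, heightsBuffer);
    }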

    Has anyone run into these issues before? Is there a way to force the garbage collector to collect these arrays?
     
  7. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    Can you still find these arrays in the Memory Profiler?

    What do you mean by that? Are these arrays used in lambdas/anonymous functions, or are they just declared in the methods, and once a method is done nothing retains any references to them?

    The list you looked at in the Memory Profiler package only contains Objects, but some memory is allocated or reserved without any objects attached to it. Speaking of reserved memory: that might be where your memory usage is growing. I.e. during the loading and conversion of your textures, you allocate a bunch of memory that all needs to sit on the managed heap, potentially with native memory behind it. If the existing native memory pools and the empty managed heap space don't have enough room for it, the heap expands. Unity never shrinks these pools and heaps though, assuming they will be used again, and because getting them from the OS in the first place is performance intensive, it just keeps the space, even if it is no longer used. You have no way to give that memory back to the OS, short of restarting the editor. You can only try to avoid letting the memory expand to this degree.

    That is, if this really is just down to temporary memory use and not a genuine memory leak and/or heap fragmentation.

    You could take a look at the Memory Map (Diff) in the memory profiler packages, to get a better understanding of all the memory changes, including allocations without Objects and reserved memory sections.
     
  8. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Unfortunately I do not find any of the arrays in the Memory Profiler; however, I am new to the package and may be missing something.

    All of my code is contained in an external C# library that I use via a DLL, so perhaps that is the reason nothing shows up in the Memory Profiler.
    The latter; my code is complicated, but you can think of it like this:
    Code (CSharp):
    using UnityEditor;
    using UnityEngine;

    public class CustomInspector : Editor
    {
        public override void OnInspectorGUI()
        {
            if (GUILayout.Button("Create Terrain"))
            {
                EditorTool tool = new EditorTool();
                tool.CreateTerrainFromFiles();
                FreeMemory();
            }
        }
    }
    Where all of the array allocations, and everything else for that matter, happen inside that base CreateTerrainFromFiles method. There are no outside references to the EditorTool class, so it should be destroyed immediately after OnInspectorGUI runs. Yet if I try to call UnloadUnusedAssetsImmediate or GC.Collect manually many "editor frames" later, the memory created in the editor tool is not released.

    This makes sense and explains why memory doesn't ever go down; however, I still don't understand why it keeps growing. Unity not releasing the pool memory is not an issue, it seems. Within these pools, if I create an array but then no longer have a use for it, I should be able to set the array to null and tell Unity to "reclaim" the array so that the next new array call uses that same memory. This does not seem to be happening, however. Unity just keeps increasing the pool size every time I create a new array (and likely any time I create any new reference type or object that needs to be allocated on the heap).

    Why is that? Is it because the editor code is all executed in a single "editor frame"? Or should things work how I just stated, and there is some error in my code that makes Unity think the allocated memory belongs to my arrays and cannot be reused?

    Put another way, I feel like it should work like this:
    Code (CSharp):
    Stuff s1 = new Stuff();
    Stuff s2 = new Stuff();
    Stuff s3 = new Stuff();
    UseStuff(s1, s2, s3);
    s1 = null;
    s2 = null;
    s3 = null;
    UnityCollectNullStuff();
    Stuff s4 = new Stuff();
    Stuff s5 = new Stuff();
    Stuff s6 = new Stuff();
    UseStuff(s4, s5, s6);
    //etc.
    In this example, Unity might allocate new memory for the first three "new Stuff" calls, but the second set of calls shouldn't need any; the memory from the first three calls should be reused.

    However, either I am not calling the right methods to notify Unity to mark the memory as "open for reuse", or Unity is unable to reuse that memory, which seems odd.

    I tried this, but for the reason stated above (code is in a DLL), I believe the memory is not showing up in the profiler.
     
  9. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    @MartinTilo

    I have simplified the main issue I was talking about in the previous post into a simple editor test:

    Code (CSharp):
    using UnityEditor;
    using System;

    public static class EditorFreeMemoryTest
    {
        [MenuItem("Assets/Test Memory")]
        static void MemoryTest()
        {
            for (int i = 0; i < 10; i++)
            {
                CreateMemory();
                FreeThatMemory();
            }
        }

        static void CreateMemory()
        {
            // 10,000 x 10,000 ints = roughly 400MB on the managed heap.
            int[,] stuff = new int[10000, 10000];
            stuff = null;
        }

        static void FreeThatMemory()
        {
            EditorUtility.UnloadUnusedAssetsImmediate();
            GC.Collect();
        }
    }
    Each call to CreateMemory allocates roughly 400MB of data. Ideally, that 400MB would be allocated once and then reused by each new array, but it is not; instead, 4000MB of new memory is allocated by Unity.

    Is there any way to get Unity to reuse the memory? I'm going to keep testing stuff to see if I can come up with something, but in the meantime, if you have any ideas please let me know!
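
    The only workaround I can see on my end is to hold on to one buffer myself, something like this (just a sketch; whether it maps cleanly onto my real tool code is another question):

    Code (CSharp):
    // Sketch: allocate the big buffer exactly once and overwrite it, instead of new-ing it per call.
    static int[,] reusableStuff;

    static void CreateMemoryReusing()
    {
        if (reusableStuff == null)
            reusableStuff = new int[10000, 10000];   // ~400MB, allocated only the first time
        // ... overwrite the contents of reusableStuff here ...
    }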

    Thanks so much for the assistance, it is greatly appreciated!
     
  10. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    You might want to add a GC.WaitForPendingFinalizers call to your FreeThatMemory() method.
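
    I.e. something along these lines (just a sketch of the order I would try; whether it helps depends on what is actually holding on to the memory):

    Code (CSharp):
    // Sketch: give finalizers a chance to run between two collections.
    static void FreeThatMemory()
    {
        EditorUtility.UnloadUnusedAssetsImmediate();
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();   // collect whatever the finalizers just released
    }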

    No, that should still show up; only native allocations made in .dlls, like C++ malloc/free and such, wouldn't be captured. The issue is more likely one of discovery, or the arrays have indeed been collected but have already expanded or fragmented the heap.

    Also I was slightly mistaken. Managed Heap Memory can be returned to the OS if the entire heap section has been emptied. It's usually just quite rare for that to happen as random allocations are likely to land in these and stick around.

    Basically, the heap consists of chunks. You can see these as managed heap allocations in the Memory Map. The Memory Profiler currently doesn't have any information about Mono/IL2CPP internal allocations that are not scripting objects. These allocations can be caused by e.g. the use of Reflection or Generics and contain type metadata and the like. So if a managed allocation in the Memory Map appears empty, i.e. no brighter blue colored objects are placed within it, it could either be an empty heap chunk or a scripting-backend internal allocation containing this kind of scripting metadata. If it is a heap chunk, it would get returned to the OS after some time and some GC.Collects.

    Now to the next problem: if the current chunk doesn't have the space for a new allocation, the GC runs and tries to free up space. If that doesn't free up enough space, the heap expands and the new allocation goes into the newly allocated chunk. From that point forward, the previous chunk is never again scanned to see if it has enough space for a new allocation. The GC only moves forward. The chunk will stick around in memory until it's entirely empty and eventually returned to the OS.

    To check if that might be what's happening to you, the Memory Map, and the diff of that map, is going to be key.
    So you could try running your code in smaller iterations while taking memory snapshots along the way. Also, the editor is likely to add some noise to the allocations. To observe this behavior at first, it might be easier to study this in a built player with a synthetic test just like the code you posted here.

    1. Take a snapshot (in 2019.3 on taking the first snapshot a small amount of memory might be allocated to stream the snapshot out. This should reduce that noise too)
    2. Clear the memory (unload unused assets, GC.Collect, wait for finalizers)
    3. Take a snapshot
    4. Allocate your array (maybe assign it to a field in a type that should be easy to search for in the snapshot)
    5. Take a snapshot
    6. Free the Array
    7. Take a snapshot
    8. Allocate a new array
    9. Take a snapshot
    Now you can diff these snapshots against each other, search for the array on there, and check the Memory Map Diff view to see what's going on.
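
    For step 4, something like this makes the array easy to find again in the snapshots (the class and field names are arbitrary, just pick something searchable):

    Code (CSharp):
    // Sketch: park the test array in a static field with a searchable name.
    public static class MemorySnapshotTestHolder
    {
        public static int[,] TestArray;

        public static void Allocate() { TestArray = new int[10000, 10000]; }   // steps 4 and 8
        public static void Free()     { TestArray = null; }                    // step 6
    }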
     
  11. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I am a little confused; you say I should study this in a "built player", however I am specifically concerned with the behavior in the editor, as this is editor-only code. I am not sure how testing all of this in the player would be beneficial given that.

    In any case, I created a static field to hold the array reference, then in my CreateMemory method just allocated a new int[10000, 10000] array.

    FreeMemory doesn't seem to do anything; I see no difference in Task Manager when calling this method, even with GC.WaitForPendingFinalizers added. However, I still took some snapshots after calling this method.

    What I found was that the memory profiler shows the previous array's memory being deleted after the first CreateMemory call.

    However, this memory is not returned to the OS; Task Manager just keeps growing and growing in size, even if FreeMemory is called manually.

    PreCreateMemory1
    upload_2020-5-12_23-32-40.png

    PostCreateMemory1
    upload_2020-5-12_23-33-11.png

    As you can see the array shows up and is roughly the size I would expect it to be (~400MB). The memory address is 1510014976.

    The Memory Map also shows up as this:
    upload_2020-5-12_23-43-25.png

    I am guessing this bright blue region is the array.

    PostFreeMemory1

    The array is gone, which seems to be correct; however, Task Manager does not show this memory being reclaimed.
    The deleted array does show up in the diff, and the bright blue region in the Memory Map turns dark blue (whatever that means).

    When I call CreateMemory again I get a Memory Map with the Dark Blue region seen in PostFreeMemory1 snapshot, plus a new bright blue region which I presume is the new array:

    upload_2020-5-12_23-47-1.png

    PostCreateMemory2
    The bright blue region turns dark blue.

    The first question is, what does a dark blue region mean? It seems like it is a block of memory that is freed/available for reuse, which makes it seem like my FreeMemory method is actually doing something. But if that is the case, why is Unity allocating a new chunk instead of reusing this memory? I am going to upload the snapshots in the next post, thanks for all the help!
     
  12. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Never mind, they were too large. You can download them from my Dropbox account if you are interested, although the description/images posted above will probably be enough for you to see what is going on here.

    Dropbox Link to Snapshots
     
  13. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    The GC behaves the same in editor and in player, I just thought it might be easier to spot and easier to keep your allocations clean in a player. Especially because the Memory Profiler UI might allocate some memory while taking the snapshot, e.g. for adding it to the list in the UI.

    The reason those kinds of allocations between the snapshots might be detrimental to the test is that, say, you asked for 400MB of heap space and you get a new heap section that fits that space exactly, just as the Memory Map shows here. Now, something else also allocates before you free the memory again. That creates a new heap section, leaving the old one behind. Now, in your follow-up snapshot, that section is empty and should eventually be returned to the OS. (I'm not exactly sure under what precise parameters, but it can take multiple frames and GC.Collects.)

    Then again, maybe doing these tests in the editor could just be a good showcase of what is likely going wrong with your allocation pattern / how hard it is to keep the heap from fragmenting, especially when making these huge allocations.

    Yep. The heap section can be empty after a GC sweep but it is still allocated. The dark blue in the Memory Map shows that it is still allocated. The task manager doesn't know or care about allocated memory vs used memory. It has no idea that this Mono memory is no longer used. It would just know if it eventually was returned to the OS.

    Yes to all of that. Dark blue is just allocated managed memory; bright blue within that means the space is taken up by a managed object. Btw, you can select these regions and, in the bottom pane, select Object List from the drop-down at the right to see what objects are in there, to confirm it is the array.

    Note: I can see that you have a version of the Memory Profiler where the UI is broken. The labels at the top should have colored boxes next to them. Try updating to a newer version of the package, they are all backwards compatible so far.


    If you diff PostCreateMemory1 and PostFreeMemory1 and check the Memory Map Diff, you can see that besides the array having been deallocated, other smaller managed objects have been allocated in other chunks in the meantime. By the time you get to allocate your new array, the old heap section is therefore no longer at the front of the heap and will not be considered for the new allocation. It is vacant and wasted space, until Mono decides to eventually return it to the OS.

    Btw, have you considered using NativeArrays for your tool? Those are backed by native memory instead of managed memory, so you get less of this funkiness with the GC-managed heap.
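
    E.g. roughly this pattern (just a sketch; NativeArray lives in Unity.Collections and needs an explicit Dispose):

    Code (CSharp):
    using Unity.Collections;

    public static class NativeBufferSketch
    {
        // Sketch: a NativeArray buffer is backed by native memory and is freed deterministically.
        public static void ProcessTile(int resolution)
        {
            NativeArray<float> heights = new NativeArray<float>(resolution * resolution, Allocator.Persistent);
            try
            {
                // ... fill and use the buffer ...
            }
            finally
            {
                heights.Dispose();   // returns the native memory right away, no GC heap growth involved
            }
        }
    }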
     
  14. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    While I understand the reasoning behind this (so users can have equivalent performance in editor vs. player [for the most part]), I do wish there were some special consideration for editor-only memory allocations, i.e., allocations made outside of Play Mode within the editor. I mean, there are just different use cases between these types of allocations and allocations within a game, so it doesn't make sense to treat them the same. I guess this is GC related and not Unity related, though?

    Okay, I get what you're saying, but yeah, I think I need to keep testing in the editor, as this isn't typical code that will eventually be run in a build; it is editor stuff only, and if there's an issue with how things are working in the editor then it is pertinent to my problem.

    Okay, that's what I figured.

    When I click on the regions I don't see any additional information. I don't see any new version to update to; I am using 0.2.3 preview 2 (I am on Unity 2018.4, which might be why there is no other version available).

    While I do understand this, if I run a test where memory is created and freed in successive iterations of a for loop, then nothing else should be able to allocate new memory in between the array creation/freeing steps, right? Take this code, for instance:
    Code (CSharp):
    static void MemoryTest()
    {
        for (int i = 0; i < 10; i++)
        {
            CreateMemory();
            FreeThatMemory();
        }
    }
    How can something else be allocated in between those calls? Just to take a double look at this, I did a snapshot before running this code and after. Here is what the post-code snapshot Memory Map looks like:
    upload_2020-5-13_13-13-6.png
    This memory map shows the second chunk having a single array at the end, a third chunk with only an array, and a fourth chunk with 6 arrays. I'm not sure if the other two arrays are somewhere in the gray areas or were created with reused memory.

    I'm not sure if I fully understand all the info you've put out there, but I believe this shows the issue isn't with a new chunk being allocated and causing the old chunk to be ignored, as all of the final arrays are part of the same chunk. Therefore, the code should at least be able to reuse these last 6 arrays that are part of the same chunk, right?

    There may be a few spots where I can use NativeArrays, but one of the main things I am doing is applying heightmap/alphamap/other terrain data to Unity Terrains. The methods of the Terrain and TerrainData classes do not take NativeArrays as far as I can tell, so even if I used them I think I would need to allocate a managed array eventually.

    I am also using Texture2D.GetPixels32 and EncodeToPNG, both of which allocate new managed arrays. If you have a workaround for those two methods that would be fantastic, but I don't think there is one.

    Thanks again for all the help. You are clearing up a lot of stuff even though it is looking like there won't be a way to get around the problem. I guess I just need to reuse my memory, although I think I will still see the memory grow due to methods where arrays cannot be reused, such as GetPixels32 and EncodeToPNG. It's too bad these methods do not allow a List as input.
     
  15. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Strangely enough, through chance I have found that in my editor tool, calling AssetDatabase.SaveAssets() is the only thing that is needed to keep the memory leak from happening. Not sure why, but GC.Collect and UnloadUnusedAssetsImmediate are not even needed.

    In the Test Memory code I posted this is not the case, but hey, it works for my editor tool so that is the main issue solved at least.
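
    So the end of the import now effectively just does this (a minimal sketch; the other cleanup calls turned out not to matter for my tool):

    Code (CSharp):
    // After generating the terrain prefabs and destroying the scene copies:
    AssetDatabase.SaveAssets();   // this alone keeps the editor's memory from ballooning for me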
     
  16. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    There are some quite complex reasons for Unity (/Mono/IL2CPP) using the GC it is using. Changing the GC is far from trivial and has all sorts of implications, not least of which on the API, projects, performance and stability, as you could easily have built upon some assumptions that are GC specific. Separating "Play mode memory" from "Editor memory" is also not something that could be done without having to seriously reconsider how to implement the smooth interactions between the two that you've probably come to expect as quite normal with Unity. (Drag & Drop assignment, Editor Scripts interacting with the Player, tools ...)

    If they are dark blue, there is nothing in them. If they have light blue content (or yellow for native in the green regions) then opting to show Objects should show what objects are contained in there.


    Sorry, just realized we hadn't pushed that fix in a new release yet. I'll see if we can drop a new version with this. (And no, version availability is currently still the same across all versions >2018.3, and we're trying to keep it that way for at least as long as 2018.4 is a supported LTS release.)

    You can still have scripting allocations on other threads happening simultaneously. There is also the potential for the JIT or statics to be initialized in the first iteration of the loop. So the chances aren't 0%.


    Just to clarify: the regions shown here that start with an address label with a dark background are not heap sections. They show the memory contents for allocations that are close enough in their virtual address range that they get grouped together in the UI. The heap sections are only the dark blue blocks; every one of these is its own heap section.

    Hmm, good point that some APIs might not accept NativeArrays as input. I'm not very firm in these APIs, so I wouldn't know. Maybe, if some things use arrays of the same size, you could at least reuse some of the arrays? It'd make your code faster too, because allocating memory takes time.


    And good to hear you found something that works, even though I don't exactly know why it would...
     
  17. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    Thinking some more on this: this might force the editor to release some unused heap sections simply by virtue of this call using a bunch of native memory, upping the memory pressure. Meanwhile, you might be saving some assets that a user wouldn't want to have saved just at that moment?

    While generally not an advisable practice, you could try to call GC.Collect a whole bunch of times to trigger the release of the memory sooner.
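
    E.g. (crude, and only as a test):

    Code (CSharp):
    // Crude test only: repeated collections may coax Mono into returning empty heap sections sooner.
    for (int i = 0; i < 10; i++)
    {
        GC.Collect();
        GC.WaitForPendingFinalizers();
    }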
     
  18. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I had a thought that because I am writing new assets to the project in my editor tool, Unity keeps those assets in memory until SaveAssets is called. But I'm obviously not sure if that is actually the case.

    You are right about saving assets the user might not want saved, but as this is the only method that works, I will have to just accept that, put up a warning that SaveAssets will be called when using the tool, and tell the user exactly what that means.
     
  19. Ne0mega

    Ne0mega

    Joined:
    Feb 18, 2018
    Posts:
    755
    I am having this problem, except over time Unity stops hogging so much memory. For example, 15 minutes ago it was hogging about 12 GB; now it is down to 6 GB, and I think it might be Windows telling Unity to give up some memory for other tasks? In those 15 minutes I've run my browser, opened about 10 tabs, and done lots of surfing around looking for answers.

    I've been doing some gnarly run-time texture swapping/remapping on 2048s, and am pretty sure that is what is causing the memory allocation growth, because it was not happening before this. (As in, it used to eat up to 3 GB, and I thought it was bad then...)

    EDIT: 5 minutes later, it dropped back down to a normal 700 MB, and my computer is running normally again. So, what is telling Unity to do this?
     
    Last edited: Mar 27, 2021
  20. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    Since you are working with runtime code, have you tried using GC.Collect?
     
  21. Ne0mega

    Ne0mega

    Joined:
    Feb 18, 2018
    Posts:
    755
    No. This is actually the start of my research into the whole problem, so I am still debating whether I care. Also, I am kind of in a transition/prototype phase, as eventually these shaders will be driven mostly by actual created PNGs. I have read a lot about garbage collection, and have mostly just done my best to mitigate it before it becomes a problem.

    I just find it odd and I don't see an explanation for this behavior in everything I read here in this thread, so my curiosity has been piqued. I have noticed this before, both the memory spike and the cool-down over time.

    I am also experiencing a different kind of "memory leak" where it takes longer and longer to load up the game screen every time, until it is 11-15 seconds. I am now stopping and rebooting after about 10 seconds of delay per iteration; this is about once every 3 hours of continuous iteration/experimentation.

    Both of these I find minor inconveniences at the moment, but I am concerned it may make the shaders I am developing unmanageable.
     
  22. Ne0mega

    Ne0mega

    Joined:
    Feb 18, 2018
    Posts:
    755
    So this little thing has popped up a couple of times today. Most of the time, Unity is doing fine. I would check periodically, and it would be running normally. I can't remember the first time, but the second time was right after I copied and pasted a shader graph, renamed it, and then applied the new shader graph to a material.

    There is no "baking shader" or "collecting shader" or any shader work messages, but I now strongly think it is related to Shader Graph.

    EDIT: It happened once yesterday, but no Shader Graph work had been done for over an hour. It was a bunch of UI restructuring this time.

    It is a minor annoyance right now, as it grinds everything to a halt, and takes about 5 minutes to save and shut down (as opposed to five seconds) to restart. But it is starting to make me wary of shader graph work, which I have noticed has slowed down my experiments.

    However, as I have mentioned, if I just let it go, or do something else on my computer, over 15 - 20 minutes, it bounces back to normal.
     
    Last edited: Mar 30, 2021
  23. mailfromthewilds

    mailfromthewilds

    Joined:
    Jan 31, 2020
    Posts:
    217
    So how do you guys force the editor to release memory? GC.Collect, the Resources clean/unload methods (or whatever they were), and other methods don't do anything for me.

    In my case it's meshes (I'm 90% certain). If your meshes are loaded into memory from the first frame to the last, then (MAYBE) Unity won't increase memory with these meshes, but for anything like loading the meshes from disk, from Resources, from Addressables, from the internet, or generating them at runtime, I am quite certain the editor will not clean these up for you when you click stop on Play mode, and then it will load them again at runtime, adding to the memory.

    Of course it can be textures and other things too, but in my case I'm sure it's meshes, as they take up the most space and I have a lot of them (auto-generated ones, or ones loaded into memory during runtime).

    So is there any way? Restarting Unity every once in a while is kind of annoying.
     
  24. gilley033

    gilley033

    Joined:
    Jul 10, 2012
    Posts:
    1,191
    I'm not sure about in-game resources; this thread was focused on resources loaded outside of Play Mode, for instance with an editor tool. I would have thought any resources loaded in Play Mode would be unloaded when you exit Play Mode. Which exact method are you using to load the meshes?
     
  25. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    You should use the Memory Profiler and make that 100% certain. It should also help you see if those meshes still have any references to them that are keeping them in memory, what those references are, and therefore how to get rid of that memory.

    There's also a chance that some of these meshes are created in a way (possibly involving HideFlags, e.g. HideFlags.HideAndDontSave) that would require you to explicitly call Destroy() or DestroyImmediate() on them to get rid of them.
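
    E.g. something along these lines, assuming you still hold a reference to the generated mesh (just a sketch):

    Code (CSharp):
    // Sketch: a dynamically created mesh with HideAndDontSave needs an explicit destroy.
    Mesh generatedMesh = new Mesh { hideFlags = HideFlags.HideAndDontSave };
    // ... use the mesh ...
    Object.DestroyImmediate(generatedMesh);   // or Object.Destroy(generatedMesh) at runtime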
     
  26. Ne0mega

    Ne0mega

    Joined:
    Feb 18, 2018
    Posts:
    755
    J1wan likes this.
  27. theforgot3n1

    theforgot3n1

    Joined:
    Sep 26, 2018
    Posts:
    208
    Has anyone written an editor script that frees up memory reliably?

    I have a procedural generation system which (undesirably) uses a ton of system memory, which I am trying to debug. If I had an editor script to free up memory in between plays, it would help my workflow a great deal.
     
  28. MartinTilo

    MartinTilo

    Unity Technologies

    Joined:
    Aug 16, 2017
    Posts:
    2,460
    I highly doubt there is a one-size-fits-all solution to this.
    You should use the Memory Profiler and figure out what exactly is using those resources, and whether and how it could be cleaned up.

    If that is Native memory attached to dynamically created objects of Asset Types (UnityEngine.Object inheriting types that are not GameObjects or Components) then you should make sure you call Destroy on them when you no longer need them.

    If that is managed memory, you need to make sure it is either reused or no longer referenced so the GC can then automatically take care of it.

    In either case, or any other case, the Memory Profiler should help you in figuring out what it actually is and how to address it.