Runtime mesh generation, reduce memory consumption

Discussion in 'Scripting' started by Jenna, Sep 19, 2022.

  1. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    Code (CSharp):
    meshfilter.sharedMesh = new Mesh();
    meshfilter.sharedMesh.vertices = vertices;
    meshfilter.sharedMesh.normals = normals;
    meshfilter.sharedMesh.uv = uvs;
    meshfilter.sharedMesh.SetIndices(indices, MeshTopology.Triangles, 0);
    So this is my current code. I'm making a client for an already existing, 20 year old virtual world that streams mesh data, texture data, etc over the internet. The meshes are being downloaded in another thread which then decodes them into arrays of vertices, normals, uvs, and indices for turning into a mesh in Unity.

    If I don't set the sharedMesh to a new Mesh first, it throws a NullReferenceException because the instantiated object has no initial mesh set in the filter.

    The code works, but it consumes a lot of memory. I know this is the cause, because if I comment out this bit of code that applies the meshes to the instantiated objects used for displaying them, memory consumption stays low.

    Please note that when meshes are repeated with the same materials, they end up sharing the same mesh and materials, so I have that taken care of already.

    The amount of memory being used is absolutely insane. When I cache the meshes to disk to see how much space they take up, they only take up a few hundred megabytes, yet they end up using a couple dozen gigabytes of RAM at runtime.

    So how can I improve this?
     
    Last edited: Sep 19, 2022
  2. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    1,244
    An improvement would be reusing the existing mesh:
    Code (CSharp):
    MeshFilter meshFilter = gomesh.GetComponent<MeshFilter>();
    if (meshFilter.sharedMesh == null)
    {
        meshFilter.sharedMesh = new Mesh();
    }
    else
    {
        meshFilter.sharedMesh.Clear(); // resets an existing mesh to an empty state
    }
    Mesh mesh = meshFilter.sharedMesh;
    mesh.vertices = vertices;
    mesh.normals = normals;
    mesh.uv = uvs;
    mesh.SetIndices(indices, MeshTopology.Triangles, 0);
    Note: if your gomesh MeshFilter has a pre-existing mesh assigned (i.e. sharedMesh isn't null), you will probably want to set sharedMesh to null yourself before calling the above (maybe in Awake, depending on how you've set things up). That way you prevent whatever mesh was already assigned from being modified.
     
  3. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    It's always null at the point the code runs, because the prefab that gets instantiated has no mesh set on its MeshFilter component, and the mesh is never revisited in the code afterwards. From there the data is either destroyed when the object is deleted from the scene by the server, or the object is rotated, moved, etc. - but that happens on the transform, not on the mesh itself.
     
  4. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    1,244
    My suggestion assumes you're reusing these gomesh objects from a pool. Instantiating anything increases memory use every single time, so that would be the first thing to tackle if you aren't pooling (unless you can't pool these for whatever reason).
     
  5. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    It's instantiating from a blank prefab with no mesh in it because the mesh data is received from a remote server, and not stored in the project. Though I do have it caching downloaded mesh data to disk so it doesn't have to redownload the meshes every single time.

    If an object has to request a mesh that's already been processed and spawned in the scene, it grabs the sharedMesh of the preexisting object's meshfilter. However, this is a very rare occurrence and doesn't save much at all.
     
  6. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    1,244
    All I can suggest is that instead of destroying an object, return it to a pool; then when you need to instantiate one, grab it from the pool instead, unless the pool is empty, in which case do what you're doing now with Instantiate. That said, unless there is significant use of this, a "couple dozen gigabytes" is excessive.
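    A pool along those lines could be sketched like this (a minimal sketch; the MeshObjectPool and prefab names are placeholders, not from the thread):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical pool for a blank mesh-holder prefab.
public class MeshObjectPool : MonoBehaviour
{
    public GameObject prefab; // blank prefab with an empty MeshFilter
    private readonly Stack<GameObject> pool = new Stack<GameObject>();

    public GameObject Get()
    {
        if (pool.Count > 0)
        {
            GameObject go = pool.Pop();
            go.SetActive(true);
            return go;
        }
        return Instantiate(prefab); // only allocate when the pool is empty
    }

    public void Return(GameObject go)
    {
        // Keep the Mesh instance around for reuse; just empty its data.
        MeshFilter mf = go.GetComponent<MeshFilter>();
        if (mf.sharedMesh != null) mf.sharedMesh.Clear();
        go.SetActive(false);
        pool.Push(go);
    }
}
```

    The point is that both the GameObject and its Mesh instance get recycled instead of reallocated per object.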

    Also,
    Code (CSharp):
    void OnDestroy()
    {
        MeshFilter meshFilter = GetComponent<MeshFilter>();
        if (meshFilter.sharedMesh != null) Destroy(meshFilter.sharedMesh);
    }
    Edit: The above code is more likely to help. I recall meshes being responsible for crazy memory consumption when I didn't destroy them, and that destroying them solved it. Should have been my first suggestion :D
     
    Last edited: Sep 19, 2022
  7. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    Well, currently there are no destruction packets being processed, since I control the location in the virtual world in question and am making sure that nothing is created, deleted, or modified while my Unity-based client is logged in, just to have a sterile test environment...

    The issue here is that I have a few hundred megs worth of meshes that are using up 30 gigabytes of memory once they're put in...

    There is also something I haven't mentioned that MIGHT be a factor... I haven't compiled yet; this is all just being run in play mode in the Unity editor. I don't know if that would cause this 5-10 fold increase in expected RAM use or not. I don't think it should, because I've made far more complex scenes in Unity by hand, and they didn't use this much memory.
     
  8. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    1,244
    I guess try it in a build. If it's fine there, it could be an editor-specific issue (in which case maybe try another Unity version, unless you're locked into a specific one). But you definitely do want to call Destroy on meshes you've created with new Mesh().
     
  9. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    There's only a SINGLE new Mesh call in the entire project, and it's right there... I tried doing it without one, but it throws a NullReferenceException every single time because the prefab has no mesh. It was also the only way to use sharedMesh without every new mesh that comes in changing every other object with a mesh applied, because every instanced object had the same shared mesh. It was amusing to see every mesh object change every time a new mesh was ready for use, but that is NOT desired behavior.
     
  10. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    Just ran a compiled version of it... it's still using an insane amount of memory...
     
  11. polemical

    polemical

    Joined:
    Jun 17, 2019
    Posts:
    1,244
    It's also possible the function is being called more often than expected. Maybe put a
    Debug.Log("added", this);
    or something in there, run it and see. If there's no issue like that, I'd say try the Profiler while you're waiting for more answers here, but if I can think of anything else I'll mention it.
    Edit: correction:
    Debug.Log("added", gomesh);
     
    Last edited: Sep 19, 2022
  12. Yoreki

    Yoreki

    Joined:
    Apr 10, 2019
    Posts:
    2,301
    Are you doing anything else with the mesh or its contents afterwards? Just because you commented out those lines of code and the problem disappeared does not mean they actually caused it. Imagine someone has a performance problem due to some O(n!) algorithm they invented, which runs on a list of 100,000 elements, and they tell you the performance problem disappears when they comment out the code which fills the list. So that must be the problem, right?
    No. We still know basically nothing about what you are doing.

    Post some more code. Do some more testing. Check in the profiler whether you can break down the high RAM usage, or find its cause. Do you have to use SetIndices? I don't remember ever using that function when I wrote mesh generation code. Maybe I just forgot about it - but the docs imply it's meant for meshes that follow a topology other than triangles.
     
    AnimalMan and polemical like this.
  13. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    After the mesh is set into the filter, it is never touched again. Never modified.

    I might not have to use SetIndices, but the data received from the server has indices rather than triangles, and I'm unsure how to use the indices as triangles.
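    For what it's worth, with MeshTopology.Triangles the index array and the triangle list are the same data: every three consecutive indices form one triangle. A sketch (assuming indices is an int[] whose length is divisible by 3):

```csharp
using UnityEngine;

public static class MeshUtil
{
    // With triangle topology, the index buffer IS the triangle list, so
    // assigning mesh.triangles is equivalent to
    // mesh.SetIndices(indices, MeshTopology.Triangles, 0).
    public static void ApplyTriangles(Mesh mesh, int[] indices)
    {
        mesh.triangles = indices;
    }
}
```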

    The arrays being used are all reused rather than being reallocated every time. I'm using queues rather than lists or dictionaries where ever feasible.

    As for posting more code, I'm willing to invite you to the GitHub project... It will be a BSD-licensed open source project, but as it's so early in development, I don't want to open the floodgates to forks just yet. I mean, I've got a license written up, but it's not like that will stop certain people who would use it to steal content from the virtual world in question.
     
  14. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    [screenshot: Unity profiler memory breakdown]
  15. Bunny83

    Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    2,217
    Well, you do know that a Mesh is NOT destroyed when a referencing GameObject is destroyed? You are responsible for destroying any instances of Mesh, Material, or Texture2D that you create, directly or indirectly. You may indirectly create them by using the "mesh" property instead of sharedMesh.

    Meshes are assets and usually exist as assets in your project; those are never destroyed either. When you instantiate an object that references such a mesh asset, the mesh stays in memory, as it may be used by more than one instance and is not bound to the life-cycle of a particular GameObject instance. So I guess you never destroy your meshes. Meshes that are no longer referenced stay in memory and can still be found with FindObjectsOfType. When you call Resources.UnloadUnusedAssets, such instances are destroyed as long as they aren't referenced anymore. However, you should not rely on this method to clean up assets you no longer need, since it's quite expensive.

    You posted a link to a GitHub repo, but you haven't said which file contains your problematic code. I found several where you use meshes. There are endless pages of code with tons of commented-out code or code in preprocessor tags. It's a mess to look at, and that certainly does not help here. At least post a clear link to where your code is located.
     
    cyriaca likes this.
  16. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    Except nothing's being destroyed. I thought I explained this... I haven't even gotten around to implementing the packet handler that removes objects, because objects aren't being removed in the sterile testing environment on the server I'm testing in.

    Ok, but again, the problem isn't that they're remaining in memory because nothing is being destroyed. I've even completely nulled out the initial mesh on the prefab that gets instantiated, to the point that sharedMesh starts out null the moment the prefab is instantiated.

    The problem is that there's runaway memory use. I know, I know, not cleaning up meshes when they're no longer used would cause a runaway memory problem, but again, that's not the issue... The issue is memory use while the meshes ARE in use. If they're created and stored in an array, there's far less memory use. When they're inserted into a MeshFilter, the memory use is absolutely insane.

    Also, the prefab being used has no mesh in its MeshFilter. It's blank. None. So much so that if you try to modify sharedMesh without putting something IN sharedMesh first, it throws a NullReferenceException. That's literally the only reason I'm doing filter.sharedMesh = new Mesh(); there's nothing there to replace and thus nothing to destroy.

    The problem code will be in Assets\Scripts\AssetManager.cs and Assets\Scripts\SimManager.cs

    Given that storing the resulting meshes in an array without using them takes far less memory than putting them into a MeshFilter, I doubt it's anything in the Assets\libreMetaverse folder, which is the backend that handles networking with the Second Life servers and decoding assets into something usable. All of that happens with no memory leak UNTIL MeshFilters start being filled.

    I mean, yeah, memory goes up by a good bit before mesh filters are filled, but it's not a problem. If I render everything as a Unity box primitive while still decoding the meshes, it's not a problem.

    That said, I'm also having an issue with combining meshes into a single mesh with submeshes. Any time I try, strange things happen with the vertices, so I'm creating each submesh as a separate mesh.

    If there is a better way of going about this than the way I'm doing it, I'm open... If you look at the work I've put into it, and all the commits, you can see I'm definitely not a "please code it for me and I'll take credit for it" kind of person.

    If there's a way to attach a MeshFilter to an instantiated object with the mesh to be used at the time of creation, and then have that mesh removed from memory, I'm all for that. I just don't know what I'm doing wrong and would appreciate some help.
     
    Last edited: Sep 20, 2022
  17. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11


    I don't think it's my meshes... According to this, a perfectly sane amount of memory is being used for meshes and textures. You might go "wait, 1.14 GB of memory for almost 2,000 textures, 1 GB of memory for almost 100k meshes, and that's sane?" And my reply is "for Second Life, you have no idea"... People on SL will give the insole of a shoe a 1024x1024 texture, and you never really see it because there's a foot covering it. Also, you can divide by 6 and get a more realistic estimate of how many meshes there would be, if I could figure out how to do submeshes without things freaking out at the boundary between one submesh and the next and looking janky.

    MOST content creators on Second Life really have no concept of a memory budget. But that's a problem I can solve at a later date.

    For right now, I need to track down this 15+ GB of untracked memory... The other nearly 8 GB sounds about right from my experience with clients based on the Linden Lab reference code, like Firestorm: the scene being rendered uses about that much memory there. So wherever this untracked memory is coming from is the issue...

    How the heck do I track untracked memory?
     
    Last edited: Sep 20, 2022
  18. Bunny83

    Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    2,217
    Well, as I said, it's hard to follow your code. The two files you mentioned both have over 1200 lines, with tons of commented-out code that makes searching painful, and they are completely scattered with preprocessor tags. I always prefer to separate two completely different approaches to a problem, often even into separate classes; when there is some shared logic, I usually create a method with a well-defined interface / parameter list that can be used by both approaches. Interleaved preprocessor tags make the code 10 times harder to read, especially when you don't know which defines are actually used ^^.

    Something most likely not related to your problem, but I just noticed it, is the UnityMainThreadDispatcher you're using. This is an inefficient implementation because:

    1. It acquires a lock every frame no matter what. Getting / releasing a lock is expensive. If the queue is empty there's no reason to take the lock.
    2. A lock should only be held as long as necessary. This dispatcher dequeues one item and processes it while still holding the lock, so all the work done in the item blocks the queue.
    Usually you either get the lock, dequeue an element, release the lock and then process the element. Or if there are many items scheduled and you want to process all pending in a row, you get the lock, copy all elements into a local List / queue, clear the shared queue / list and release the lock. After that you can process the copied elements one by one from the copied list. This list of course can be re-used by the dispatcher. I always go for the second option as it requires just a single lock per frame with minimal overhead and scales well when many elements are scheduled.

    A lock should only protect the shared data which in this case is the queue. So only queuing and dequeuing needs to be protected.
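    The second option might look roughly like this (a sketch, not the actual UnityMainThreadDispatcher from the project):

```csharp
using System;
using System.Collections.Generic;

public class Dispatcher
{
    private readonly Queue<Action> shared = new Queue<Action>(); // written by worker threads
    private readonly List<Action> local = new List<Action>();    // reused every frame

    public void Enqueue(Action action)
    {
        lock (shared) shared.Enqueue(action);
    }

    // Call once per frame on the main thread (e.g. from Update()).
    public void Pump()
    {
        if (shared.Count == 0) return; // cheap pre-check: no lock taken when idle

        lock (shared) // one short lock: copy everything out and clear
        {
            local.AddRange(shared);
            shared.Clear();
        }

        foreach (Action action in local) // run the work outside the lock
            action();
        local.Clear();
    }
}
```

    The lock only protects the shared queue; all the actual work happens on the copied list, so a slow item never blocks the worker threads that are enqueuing.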

    How did you actually figure out how much memory is used? Did you use the Unity Memory Profiler? As long as a Mesh instance is read/writable, it holds memory in two places: in CPU main memory, and in GPU video memory once the mesh has been uploaded. If you mark a mesh as not readable, the CPU-side memory is released; however, you can then no longer read, write, or modify that mesh, as the data only exists on the GPU side.
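    Marking a mesh as not readable happens at upload time, roughly like this (a sketch; only do it once the mesh is final, since the data can't be read back afterwards):

```csharp
using UnityEngine;

public static class MeshUpload
{
    // Pushes the mesh data to the GPU and releases the CPU-side copy.
    // After this, vertices / triangles etc. can no longer be read or modified.
    public static void FinalizeMesh(Mesh mesh)
    {
        mesh.UploadMeshData(markNoLongerReadable: true);
    }
}
```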

    You haven't mentioned any hard facts about how many objects / meshes we're talking about, or how many vertices / triangles they have. You always just use words like "huge". You said

    Well, in which format do you store them on disk? Mesh data that is a few hundred megabytes is really a lot if all of it has to be loaded into memory. Certain file formats are designed to reduce the required space, and the actual representation in memory does indeed require much more. That's just how it is.

    Submeshes only make sense if you have meshes that share the same vertices but define different triangles with different materials. If you just combine individual meshes into separate submeshes, there's no real benefit. Actually, it's easier to get into trouble, depending on the vertex count and whether you use 16- or 32-bit index buffers for your mesh(es). Keep in mind that a newly created mesh uses a 16-bit index buffer and can therefore only address 64k vertices. If a single mesh needs more vertices than that, you have to switch the index format to 32 bit. Of course, a 32-bit index buffer requires twice the size: the index buffer of a mesh with 5000 vertices and 7000 indices requires 14k bytes with a 16-bit index buffer and 28k with a 32-bit one. The vertices are usually the major part of a mesh, though. Just the vertices of a 5000-vertex mesh take 60k bytes (3 floats each). Normals? Another 60k. Tangents (4 floats)? Another 80k. UVs? 40k - 80k bytes depending on the UV format used, and the same for every additional UV channel.
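    Switching a mesh to a 32-bit index buffer is a one-liner (a sketch; requires a Unity version where Mesh.indexFormat exists, i.e. 2017.3 or newer):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class BigMesh
{
    public static Mesh Create()
    {
        Mesh mesh = new Mesh();
        // Default is IndexFormat.UInt16 (max 65,535 vertices); switch before
        // assigning vertex / index data that exceeds that range.
        mesh.indexFormat = IndexFormat.UInt32;
        return mesh;
    }
}
```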

    So can you actually give us any stats about how many objects there are and the average vert / tri count per mesh? Have you actually used the Profiler? You know you can also enable deep profiling to see which line of code takes how long and how much memory is allocated? Be warned, though: if you have heavy code that executes millions / billions of lines, the Profiler could even crash your system :) Make sure you save your work before you start profiling.

    ps: I just saw your latest post, so some of my points have been answered already, at least partially. Yes, I know Second Life; I tried it years ago and haven't touched it since. About the untracked memory, have a look at this post. We don't know which Unity version you're using, which scripting backend, or even your target platform. I assume standalone (Windows / Linux / Mac)?
     
    AnimalMan likes this.
  19. Jenna

    Jenna

    Joined:
    Apr 28, 2013
    Posts:
    11
    It was worse. I've been cleaning it up.

    Yeah I get that.

    > Something most likely not related to your problem, but I just noticed it, is the UnityMainThreadDispatcher you're using. This is an inefficient implementation because:
    > 1. It acquires a lock every frame no matter what. Getting / releasing a lock is expensive. If the queue is empty there's no reason to take the lock.
    > 2. A lock should only be held as long as necessary. This dispatcher dequeues one item and processes it while still holding the lock, so all the work done in the item blocks the queue.
    There used to be a lot more of it. I'm still learning multithreading, and I'm using it as training wheels. I have been doing what I can to eliminate it, and I've removed about 95% of the instances that aren't just me being lazy with Debug.Log calls.

    Note I'm confused over how to track down my memory issues, not over the performance issues from thread locking.

    Well, like in the screenshot above, I've figured out that it's definitely not my meshes that are the problem... It's something else entirely.

    The raw asset data received from Second Life, so what ever format it uses.

    That's good to know, thanks. I was worried to death it might be caused by my using separate objects for the different materials (which SL, and thus the backend library I'm using, refers to as "faces" - which has confused some people into thinking I'm creating each triangle as a separate mesh, when that's not the case at all).

    I don't have that information, but I could calculate it. I'm not sure it's relevant, though, since the amount of mesh memory according to the profiler is within expected values, and I've been barking up the wrong tree... Not that all the things I've done to save mesh memory as a result aren't good. They just don't solve my problem.

    Thanks, I'll take a look. And yes, standalone Windows at the moment, because I only have Windows, so it's easier for me to test on. The native-code JPEG2000 library I'm using is open source, and everything else is straight C# included directly in the Unity project's source code, so once it's in a usable condition, I'll find people to join the project who can compile the JP2K native library for the other platforms. This may even end up including a mobile client - it would be the first graphical mobile client for SL in a very long time.
     