
Feedback Library folder takes up more than 1 GB on a clean 2021.3.0f1 URP project. Isn't that bad?

Discussion in 'Editor & General Support' started by ThisIsSomeRandomUsername, May 22, 2022.

  1. ThisIsSomeRandomUsername

    ThisIsSomeRandomUsername

    Joined:
    Jan 22, 2022
    Posts:
    4
    I think I should try to call the Unity devs' attention to this. 1 GB is just too much, especially if they want to make URP the new standard.

    On 2018 (non-URP) versions the Library folder would take up about 30 megabytes on a clean project.
    On 2021.3.0f1 (non-URP) the Library folder takes up about 300 megabytes on a clean project.
    On 2021.3.0f1 (URP) the Library folder takes up 1.2 gigabytes on a clean project. How did that happen? Why?

    I think they should try to reduce the size a little, because this means that if someone has 10 Unity projects (say some small jam games), that's 12 GB kicked off that person's hard drive, and as more projects are created that computer's performance will be degraded (because in my experience even a few GB on or off make a big difference, EVEN on a 1-terabyte HDD).

    "Delete the Library folder from old projects"? "Get an SSD"? Well, those might actually be good ideas. But aren't there ways for the Unity devs to reduce that folder's size? I mean, just HOW did it become that large and why is it necessary for it to be that large? Can someone at least explain that to me?
     
    Last edited: May 22, 2022
    Gustjc likes this.
  2. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    A gigabyte is nothing. Once you start developing a project it's going to swell way past that. My last work project using the new pipelines totaled 130 GB and out of that only 30 GB were assets.
     
    Last edited: May 22, 2022
    SuspecM likes this.
  3. Scyra

    Scyra

    Joined:
    Nov 21, 2017
    Posts:
    19
    It's a bother even on non-URP. The Library/Artifacts folder gets bloated, and I think the overall size of the Library folder is also affected by how many assets you've downloaded from the store. I can't clearly recall, but I think I had a clean project weighing in at ~4 gigabytes before I backed up my (many) assets and fully removed/reinstalled Unity.
     
  4. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Speaking of which, I recommend deleting the folder every once in a while, as Unity is not perfect at removing old and unnecessary files from it. I have noticed on occasion that the folder is smaller after completely regenerating.

    As for store assets affecting the size: only if they have been added to the project.
     
  5. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,034
    URP uses Burst now, doesn't it? If so, that's 800 MB in the Library straight off for that package.

    Honestly, the size of Unity installs really bugs me, and when you take into account the UPM cache, the extracted UPM packages, and the copies of those packages per project (why, when you can't edit them?), Unity is easily becoming as large as an Unreal install.

    Most of my client projects used to take a few GB, and that normally includes the Library, but nowadays it feels really difficult to keep things lean, not to mention that lots of packages seem to needlessly end up forcing unused resources to be included in builds.

    Another pet peeve is simply the number of files a project creates - tens to hundreds of thousands if you include the Library - mostly tiny files that I'm sure adversely affect loading times, and most definitely affect upload/download times for cloud backups. It's gotten to the point where I don't even bother backing up the Library folder any more except for a couple of client projects, and will often just delete the Library folder altogether once a project is finished with.

    I'm honestly surprised that Unity hasn't invested in packing files together. Not sure if that's really applicable for meta files, but I'm pretty sure it could be done elsewhere. Thinking about it, I wonder if it would make sense to provide the option of using a DLL version of packages instead of loose source files.
     
    Last edited: May 22, 2022
  6. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Only for HDDs. SSDs are designed for this; in fact, when working with small files they won't achieve their maximum performance unless you are accessing many of them at once. Meanwhile, HDDs are both slow at accessing small files and can only do so one file at a time.
     
    Last edited: May 22, 2022
  7. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,026
    Sort of. SSDs have trouble processing small file reads, even sequentially, more because of OS filesystem limitations than any hardware ones. This is still a major problem on Windows.
     
  8. valarnur

    valarnur

    Joined:
    Apr 7, 2019
    Posts:
    436
    I have made a request to rewrite SRP 14 and 15 to reduce the size of the API and the packages as well, especially Burst.
     
    Noisecrime likes this.
  9. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    7,256
    I thought we were over the days of being precious about our storage space? My computer has nearly 10TB of storage and I still have space for more drives.
     
    SuspecM likes this.
  10. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,716
    Mind, not everyone can buy 2-4 TB of space. Not everyone earns £, €, or $; for some indies it may still be expensive, especially when they also need backup drives and have to hold other files than just Unity projects. That's assuming they keep their files outside the cloud.
     
    Rodolfo-Rubens likes this.
  11. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    7,256
    I mean, storage space is the cheapest component of building a PC. So cheap it's a joke how cheap it is. A 1TB drive here is about 50 dollarydoos (Australian dollars), 2TB is 80.

    SSDs less so, but if you're on a budget then what can you say?
     
    SuspecM likes this.
  12. valarnur

    valarnur

    Joined:
    Apr 7, 2019
    Posts:
    436
    It's not only about size economy; it also possibly affects editor responsiveness and script re/compilation.
     
    angrypenguin likes this.
  13. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    7,256
    I think the only code getting recompiled is the code you edit and any assemblies referencing the edited assembly; ergo, packages aren't getting recompiled when you edit your scripts.

    However in the context of domain reload speeds, that's a fair cop. Though I imagine there are more factors than pure package size that affect their overall performance.
     
  14. Murgilod

    Murgilod

    Joined:
    Nov 12, 2013
    Posts:
    10,026
    "Possibly" doing a lot of heavy lifting here.
     
  15. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Last edited: May 23, 2022
  16. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,484
    I have about 6 terabytes of storage total, but fast storage sits on two 500GB SSDs and I keep running out of space on both of them. So, yeah, it is better when software doesn't waste space.
     
    Noisecrime likes this.
  17. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Speaking of which, I found a solution: a directory junction (aka symlink) to the PackageCache folder. Packages will be extracted to and stored there, but only projects that reference them in their manifest will load them when being opened.

    While testing I was able to have multiple projects open at the same time and was able to import packages into one without affecting the others. The only problem occurred when I tried to remove a package from one: the others immediately locked up. Once they were restarted, though, they added the package back (since it was still in their manifests).

    I don't see any reason why Unity cannot do this themselves, but until they do, nothing prevents you from doing it.
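    In case it helps anyone trying this, here is a rough sketch of what the junction looks like on Windows using the built-in mklink command. The shared target path is just an example - put it wherever suits your setup - and this is obviously not an official workflow, so treat it as at-your-own-risk:

        rem Run from the project root with the Unity editor closed.
        rem Create the shared cache folder once (path below is just an example):
        mkdir D:\UnityShared\PackageCache
        rem Remove the project's local cache and replace it with a junction:
        rmdir /S /Q Library\PackageCache
        mklink /J Library\PackageCache D:\UnityShared\PackageCache

    As noted above, every project pointed at the shared folder sees the same extracted files, which is exactly why removing a package from one project yanks it out from under the others until they restore it.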
     
    Last edited: May 24, 2022
    Gustjc, Noisecrime and Gekigengar like this.
  18. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,034
    Interesting. I was considering this whilst writing my earlier reply; however, I was and still am concerned that directly symlinking to the extracted package cache will lead to weird issues. You discovered one: when removing a package it presumably deletes the extracted package from the cache, meaning other projects have to re-extract it again. I have a sneaking feeling there might be more gotchas though, since the cache folder is now being used by Unity for two different purposes at the same time.

    That doesn't discount the concept though, but I think I'd be more comfortable making a secondary cache to symlink into.

    I can understand why Unity chose to do it this way, as having the packages embedded in the project (albeit within the Library cache) means you can be sure not to accidentally update or alter that code. Ultimately, though, I'd like a preference setting to choose between embedding and using the global package cache, and let Unity take care of the potential conflicts - i.e. instead of deleting the package, its reference is just removed from the manifest.

    The global package cache has another issue though: if, like myself, you have dozens or hundreds of legacy projects from clients and personal work, then the npm and cache folders will quickly become huge as you create new projects or regularly update a subset of packages in legacy projects. I regularly delete the npm & cache folders to try and stay on top of this, but checking now I see these folders are back to taking up over 8GB! Almost 40% of that is just three Burst versions ;(

    I had an idea of creating a 'Package Manager Manager' that would track or parse Unity project manifests and keep tabs on the packages and versions used. This would highlight projects where you could maybe update low-impact packages (e.g. Visual Studio support) and remove all unused package versions from the npm cache. The trouble is, as soon as you create a new project you'll download and install the default package versions unless you remember to update the manifest first.
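    For anyone who hasn't poked at it: each project's Packages/manifest.json is just a flat map of package names to versions, so a tool like that would mostly be diffing these small files across projects. Roughly what it looks like (the version numbers here are made up for illustration):

        {
          "dependencies": {
            "com.unity.burst": "1.6.6",
            "com.unity.render-pipelines.universal": "12.1.7",
            "com.unity.ide.visualstudio": "2.0.16",
            "com.unity.test-framework": "1.1.31"
          }
        }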

    So ultimately I feel just deleting the global package cache folder periodically is probably the easiest method, but I'm still occasionally tempted to see if a manager app might be worthwhile.
     
  19. Gustjc

    Gustjc

    Joined:
    Oct 16, 2017
    Posts:
    9
    That's actually a great idea. Will try that one later, thanks.

    I usually create a lot of small projects and prototypes while learning or following tutorials, and it just feels wrong to have so many large duplicated files for the package cache. So much so that I've stuck with only using 2019 LTS and have no plans to try out the new LTS versions unless I start some 'real' project.

    I really hope Unity starts caring more about duplicating so many files all the time.
    I might be wrong, but it is my understanding that a package version x.x.x is always the same and should not have any differences between projects, so I'm not sure why the package cache is not something made global instead of duplicating the same files over and over again.

    Edit: I guess one could say it's important to be able to make custom changes to a package's code in specific situations to adapt it to your project, and in that case it would be necessary for the packages to be local. But I at least have never needed to do that, and even if that was indeed required for some reason, Unity could mark that package as changed and make a local copy of only that one.
     
    Last edited: May 30, 2022
    Ruslank100 and Noisecrime like this.
  20. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    I have an ongoing issue not with the total size of project data but with the use of lots of tiny files, which are so good at slowing down even the best storage systems.

    [graph: drive read performance vs. file size for various drive types, referenced in the reply below]
    Optimal file size is about 64K and up, and anything below the native minimum allocation size (2-4K) is just wasting space.

    How hard would it be for Unity to tackle both by creating a file format that compresses and combines these small files for faster loading and better performance, with a slight cost overhead when updating files?

    Compression levels could even be optional for various hardware platforms.
     
  21. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,603
    Which small files are you referring to? Because I imagine that it'd make things quite painful for version control, at the very least.

    And what is your "ongoing issue" in practical terms? By which I mean, what things do you do in projects of what size where this actively hampers your productivity? Your graph there shows that except for the old-style spinny drive the drop in performance from best case to worst is less than 40%.
     
  22. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Yet if someone offered you a 40% faster SSD, RAM, or CPU right now, wouldn't you consider that a worthy upgrade?

    PC users often pay lots of money for performance boosts from their components to save them time, so why waste any of that money on poorly performing software?
     
  23. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Check out the Artifacts folder and all your .meta files.
     
  24. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    DOTS/Devil's Advocate - Burst makes a huge, huge difference to performance. Without Burst I doubt it makes sense for URP to support Forward+. It would just be CPU bottlenecked.

    800 MB is big, but the build does not suffer that cost.

    Come on. I've been developing for many years on a <100 euro SSD that is 1TB. HD space is one of the cheapest things in the computer. And they can get REALLY cheap. Like a backup drive? You're looking at pennies. Artists are the ones that need the storage!

    Artists kill 800 MB many times over just for one model. All those substance files, revisions, uncompressed sources, etc.

    HD space is something artists or gamers run out of a lot. A programmer, not so much... mind you, it will depend on how much of a squirrel people are about hoarding their nuts.
     
    Luxxuor likes this.
  25. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,603
    That's not at all the same thing. Getting 40% faster across the board isn't the same as speeding up your worst case by 40%. And you talk about this as if addressing the worst cases won't have costs elsewhere, which it could well do.

    As I asked:
     
  26. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194

    Every time I open Unity.
     
  27. zinexe

    zinexe

    Joined:
    Aug 21, 2015
    Posts:
    5
    Reading this thread and seeing people white knight this complete and utter mess of huge Unity projects is just sad.

    This creeping bloat is what has really turned me off Unity (and before that UE); they become so slooow.
    Starting a new project takes several minutes - what are they even thinking?

    Doesn't Unity have a single good system designer left?

    An empty project should be way below 100 MB just to be sensible (even 100 MB is pushing it).
    Several GBs is pure madness; whatever it's doing is way worse and slower than it was in the past.

    What a bloated pile of junk. I rarely feel like quickly firing up a new project in Unity, due to all the weird dances you have to do until you arrive in your project several minutes later (usually with some errors/warnings about updates). It's SUCH a bad experience!

    If they somehow manage to get a render pipeline to take up more than a megabyte or two, it's really impressive, in a bad way.

    If nothing else, they could learn from Godot, which starts before you can blink and doesn't waste your disk space with a billion files... even moving or zipping a Unity project is a nightmare.

    My new project: 300 MB of assets (yes, actual graphics, animations, sounds, etc.) and the project is 5.02 GB! Just lul, wth.
     
  28. spiney199

    spiney199

    Joined:
    Feb 11, 2021
    Posts:
    7,256
    Weird, because starting a new project for me takes a few seconds. My largest project takes < 10 seconds to open.

    This is running my projects off of one of my M.2 drives. You'd expect a regular SSD to be a bit slower. What hardware are you running off? Almost sounds like you're running off an old-school HDD...
     
  29. DragonCoder

    DragonCoder

    Joined:
    Jul 3, 2015
    Posts:
    1,634
    That the people who actually pay for their product (Unity Pro) use reasonably beefy computers :p

    Yeah, sure, Unity projects are larger than Godot's (and I too wonder why the Library folder cannot be based in a centralized location shared between projects), but the comparison is not fair. The number of features in Unity is simply higher...
     
  30. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Hey, the asteroid from 66 million years ago called, it wants its dinosaur back. Jokes aside, project creation and load times are worse now than they were before, but they should never take you several minutes. The work project that I mentioned at the start of the thread only takes me seconds to load.

    I'm thinking some combination of a very weak processor, very low memory, and that old-school HDD. You almost have to go out of your way to have serious performance problems with Unity. My computer back when I started wouldn't have been this bad, and it'd be nearly 15 years old now.
     
    Last edited: Apr 14, 2023
  31. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,034
    Weird - I'm definitely on the side of creation and start-up times in Unity taking far longer than necessary.

    The problem I've found is it's just not consistent from one day to the next for the same project. I get the feeling that there is more to this than simply hardware or even project complexity. The only 'variable' aspect I can think of is the network; the Unity Editor is certainly far more demanding on network operations these days, especially with packages needing to be checked, but I cannot be sure if that explains the variations I see.

    Without actually finding some method to monitor and record the time it takes to open specific projects, I don't have any hard numbers and just have to rely on my 'feeling' that projects sometimes take longer than normal to open.

    My gut feeling though is that there is something going on due to my usage patterns with Unity.
    • Perhaps the frequency at which I swap between projects or return to old legacy projects has some impact. For some reason, returning to an old project (yet still using the same editor version as most of my other projects) inexplicably takes far longer to start up the first time, and it is not due to rebuilding the library or caches.
    • The first project of the day that I open also takes longer than normal.
    • Maybe it's as simple as the bloody stupid Unity Hub losing sign-in credentials when the PC goes to sleep, so opening a project is just waiting to time out on package checks etc.
    Now in general I think most projects actually open up in under a minute; only creating new projects or moving projects to a new editor takes longer, and that is frequently due to Unity having to, say, download, decompress or rebuild caches (e.g. the first time a template is accessed in a new editor it has to be decompressed). So much of this could be a perception issue: when I just want to start working and here I am waiting 20-30 seconds for a project to open, it can feel like it's taking minutes!

    What I have not noticed, however, is having projects or the editor installed on SSD having any measurable impact on load times. In fact I'm just now testing having both editors and projects installed on HDD vs SSD, and the results are negligible! However, this is on a tiny < 300MB project; an SSD should have a greater impact on overall performance the larger a project gets (depending on whether you load assets up front or on demand - another aspect that we may not all be testing or judging the same way).

    This does point to another factor though: if my projects are generally under 2-5 GB, then any impact from, say, Unity not being signed in and timing out waiting for network commands, or rebuilding caches etc., will proportionally have a greater impact on the load time than for a 50GB project. An additional 5-10 seconds of load time due to these weird issues or usage patterns is far more noticeable on a project that should load in 20 seconds than on one that takes 60 or longer.


    For reference, a few tests on a 250MB script-heavy project.
    It shows a matrix of editor and project on HDD or SSD, followed by the first run and subsequent runs.

    Editor: 2019.4 (HDD) Project (HDD) - 26 seconds - 18 seconds - 18 seconds
    Editor: 2019.4 (HDD) Project (SSD) - 18 seconds - 17 seconds - 18 seconds
    Editor: 2021.3 (SSD) Project (HDD) - 48 seconds - 16 seconds - 18 seconds
    Editor: 2021.3 (SSD) Project (SSD) - 20 seconds - 17 seconds - 18 seconds
    As you can see, apart from the first run with the project on HDD, load times are constant. Why is the first run on HDD longer? I have no idea; the project was literally copied (including Library) straight from SSD to HDD - maybe disk cache? What will be interesting is if I can run the same tests over a few days and see what happens to load times, to see if they reflect my perception that load times can change for no good reason.

    Edit:
    A day later and after a restart of the machine (performed in 3,4,1,2 order):
    1. Editor: 2019.4 (HDD) Project (HDD) - 148 seconds - 18 seconds
    2. Editor: 2019.4 (HDD) Project (SSD) - 18 seconds - 16 seconds
    3. Editor: 2021.3 (SSD) Project (HDD) - 58 seconds - 16 seconds
    4. Editor: 2021.3 (SSD) Project (SSD) - 32 seconds - 17 seconds

    So this is strange. Firstly, it sort of confirms that after a certain amount of time or a machine restart, loading up a project for the first time takes longer, at least for me. HDD projects do worse, but HDD editor and HDD project is really bad! Not sure what to make of HDD editor and SSD project; that seemed to have no impact at all?

    I'm beginning to suspect that various levels of caching might be at play here, from hardware to software.

    Edit 2
    Another day later but no restart of the PC (performed in 4,3,2,1 order):
    1. Editor: 2019.4 (HDD) Project (HDD) - 44 seconds - 17 seconds
    2. Editor: 2019.4 (HDD) Project (SSD) - 94 seconds - 17 seconds
    3. Editor: 2021.3 (SSD) Project (HDD) - 48 seconds - 17 seconds
    4. Editor: 2021.3 (SSD) Project (SSD) - 31 seconds - 16 seconds

    This is the last update I'll make, as I feel this has adequately shown the inconsistent load issues I've seen.
    • Something to note is that I swap the order of tests each day, and I feel it shows the bulk of the delayed loading I'm seeing comes from the first use of the editor that day, and then of the project.
    • As expected, having the editor/project on SSD helps over HDD for the initial 'run' of the day, though for a small project like this it's negligible upon restarting/reloading the editor/project.
    • In all cases the first run of the day takes longer, suggesting the editor is doing additional work, checking, caching etc.
    I do find it strange that others are reporting larger projects opening in seconds; while I would consider the reopening times above to be within that statement, I'd argue the first run clearly is not. This could be a perception issue, editor settings, or maybe even some hardware/OS settings.

    For reference, I'm pretty sure I've excluded my project folders from Windows Defender, and I have fast fibre broadband and no particular network problems.
     
    Last edited: Apr 17, 2023
    Ryiah likes this.
  32. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    An overly zealous antivirus or firewall might have a significant impact in that case. I've largely forgotten how much of an impact they can have, thanks to how fantastically efficient Windows Defender is, but thinking back I do remember having to turn off or add exceptions to third-party ones due to them causing tasks to take unusually long.
     
    Noisecrime likes this.
  33. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,603
    I've noticed this playing games these days. More than one game on the PS5 loads from the OS to the game's main menu lightning fast, only to sit there for a while checking for online content.
     
  34. zinexe

    zinexe

    Joined:
    Aug 21, 2015
    Posts:
    5

    AMD 3950X 16-core, 64 GB RAM, 2 TB M.2 NVMe (5000 MB/s), RTX 4090.

    No, it's not a system spec issue; it always takes a minimum of 45 seconds to start a new project, since it's compiling who knows what. It almost feels like a Microsoft product - everything takes ages now, buried in logins and wizards.
    On an old-school HDD/SSD I wouldn't dare think how long it would take.

    Once a new Unity project is finally ready, you still need to spend several minutes updating outdated packages, switching to linear rendering, etc., each triggering some recompile.

    I'm not really comparing Unity to Godot. Of course there are more features to load, etc. But just because you are one of the "big boy" engines doesn't mean you should squander storage space and not keep things lean.
    An empty Unity project is surprisingly still only around 120 MB, but it quickly bloats.

    A small Unity project, just 4 small FBX 3D models:
    assets: 320 MB, project size: 5.16 GB, Library folder: 4.74 GB.

    What's in there? It's like the Library of Alexandria stored in a basic project folder. "Artifacts"? x)

    Comparing to Godot (not necessarily a 1:1 comparison), just as a reference:
    Godot takes 3-5 seconds to start a new project, and an empty project is 5 MB.
    A Unity install is 7.24 GB per version; Godot is 112 MB.

    It's not that I don't like Unity, just that it's starting to get bloated and feel monolithic, similar to UE.
     
  35. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    See, this is what you should have said initially, and not that it takes several minutes, because it just doesn't.

    For reference, mine is a 5950X, 64GB RAM, 3x 2TB (OS, personal, work) M.2 NVMe (3500), RTX 3070.

    I've never had to update outdated packages with a new project. In fact I just checked my current work project, which is now a few months old, and only three packages have available updates: JetBrains Rider, Visual Studio Editor, and Unity's Test Framework.

    Seeing that you know there is an "Artifacts" folder in there, I imagine you've taken the time to at least poke through some of the folders. It should be obvious what most of them do. Artifacts is the only one with an unusual name, but it's just the imported form of your assets.

    Godot doesn't have a large cache because it hasn't yet reached the point where it needs one. It's a less complex engine, but then it's less capable too. In theory you could use Unity without a cache if they allowed it, but everything would take considerably longer, and in a professional environment time is far more valuable than storage space.

    Just as an example, if I delete the Library folder for my largest work project, the total time to make a new build for that project is around two hours, but every subsequent build is at most 30 minutes. I'm more than willing to trade 100GB for a quarter of the build time, especially when I'm making builds for multiple platforms.
     
    Last edited: Apr 23, 2023
  36. zinexe

    zinexe

    Joined:
    Aug 21, 2015
    Posts:
    5

    I just don't get this whiteknighting, like you are trying to defend Unity for whatever reason.

    If it takes 45 seconds on a PCIe 4 NVMe drive, it could easily take minutes elsewhere, and 45 seconds is still absurd; it wasn't always like this.

    If your project takes hours to build, I don't think it's some small quick game idea or a prototype with 2-3 small models. Yet even tiny projects have extreme overhead now.

    You want to spend hundreds of GB to trade off for build speed, good for you.
    Maybe it makes sense if your project takes hours to build, but 2 models and a script don't need a 5 GB cache.

    Saying "what's in there" was clearly a rhetorical question. Of course I know it has cached files for assets; those are not even the big problem, but having 90% of the Library folder just be some build/compiler artifacts - do you realize how much data 4 GB is in text?

    Besides the actual cached files, I'm not even including the several GB being saved in the temp cache in your app data folder, lightmap data, etc.; those are not even included.

    A 3D engine does not need to take up some monolithic piece of storage to be effective - at all!

    Why even defend that you "don't have to" update your packages? Fine, you don't update them, but there are warnings shown by default, and it's pretty normal to update to the latest when starting new projects.

    It seems like you are trying to win some non-existent argument. What is your point?
    That there is absolutely nothing to look at, Unity is the perfect program? Geez :)

    My point is, Unity is way, way slower than it used to be; empty projects take forever to start and take up loads of space, and even switching from VS back to Unity triggers a compile that takes longer than before. The experience is just overall worse, since sometime around v2017.

    Godot has a design directive to be lightweight and fast to use; it's not some impossible inverse scaling related to features. You'd be surprised what you could do with even 64 KB in the past.
     
  37. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    It's not whiteknighting. It's learning to pick my battles. I care about how long a task takes to complete. I don't care how much space it takes to do so. Storage space is trivial to increase; performance is much less so. If I want to increase the former I can do so for $100. If I want to increase the latter it would take thousands of dollars.
     
  38. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,034
    Coming from the days where games would fit into 48K or less, I understand the frustration at the ballooning size of Unity and its projects. At this stage I'm not even sure Unity is that far off Unreal when you add up all the various caches it sticks everywhere (e.g. the npm cache, extracted packages, and the same packages again in your project). However, I also appreciate the caching of data in projects to alleviate run and build times.

    I go for a middle ground: once a project is complete I tend to strip out build caches and other caches not needed to run the editor from the project's Library folder, or sometimes delete the Library entirely.

    The most obvious solution to these diametrically opposed desires, which often come down to personal choice, would be for Unity to offer the option to not cache data, or to easily delete it. However, that would likely result in more bug reports and even more complaints about builds taking too long.

    In the meantime, though, there is nothing stopping you from writing an editor script, or even some PowerShell/batch script, to automatically delete the caches or the Library. Perhaps it's run on a per-project basis, since as you said these sorts of caches do not necessarily make sense for small experimental projects and the time to regenerate them might not even be an issue, though for bigger projects I think you'd soon come to appreciate having the caches and the time they save.
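    Something along these lines is all it really takes - a rough batch sketch for a finished project, not an official Unity tool. The exact folder names vary by Unity version, so check what is actually in your Library before deleting anything:

        rem Run from the project root with the editor closed.
        rem Option 1: strip just the big caches (Artifacts is the imported-asset cache).
        rmdir /S /Q Library\Artifacts
        rmdir /S /Q Library\ShaderCache
        rem Option 2: for an archived project, drop the whole Library and let Unity
        rem regenerate it if the project is ever opened again.
        rem rmdir /S /Q Library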

    I will admit I'm confused as to why Unity insists on unpacking packages both in the cache and then copying them over to the project's Library cache as well. For many packages it's negligible - they can be a few MB - but some, like Burst, take up 800 MB. I'd rather those packages either stayed in the local cache, or, if they had to be in the project, that the local cache didn't also keep unpacked versions. Again, providing users an option to not unpack local packages would be nice, and I'd rather take the decompression hit each time a package is used in a new project, as that is infrequent.

    I think my last paragraph sums up my frustration with Unity well: no one really seems to care about balancing disk-space efficiency against time, and there are no options for those who might want more focus on lower disk usage.
     
    zinexe and Ryiah like this.
  39. zinexe

    zinexe

    Joined:
    Aug 21, 2015
    Posts:
    5
    I know that these things are not just there for bloat - they are there to optimize speed, etc. - but being lightweight is also clearly not a design principle at Unity.

    This is a shame, I think.
    Personally I see the art of good code as doing a lot with very little.
    Responsiveness and not waiting for loading bars is a user experience design goal.
    Be lightweight, efficient, don't bloat.

    It's like old printer drivers: the driver itself was maybe a few KB, yet the driver installs grew to 450 MB+. That's more data than a compressed full-length feature film. They *could* have used vector gfx and kept a goal of never going beyond 2 MB.

    Design principles can help enforce more efficient code as well as a better experience.

    Why these caches are local to each project, I don't know; I can only assume they are unique and related to user data/unique packages.

    I think there is a design flaw in mixing user data with engine files in itself.
    Power users know they can clean the Library folder, but many do not; they just wonder why their tiny 5 MB data project turned into 5 GB+.

    It's not about the price of storage, it's the underlying philosophy: if something takes tons of space, it always makes me wonder whether it is actually bad code/design.

    I admit I was hoping Unity would not go down a similar path to Unreal, because to me Unity was still a lot lighter and faster to use than Unreal, in which even a barebones project always seemed to be 2 GB+.

    For Unreal the problem was mainly a lot of big, detailed standard assets (terrains, trees, etc.).

    Unity seems to have more problems with large amounts of small file fragments, caches, and temp data.

    I'm still wondering how code (text) can be 5 GB+ (even with some of it in binary form).
     
    unity_18smith111 and Reactorcore like this.
  40. unity_18smith111

    unity_18smith111

    Joined:
    Sep 15, 2022
    Posts:
    3
    Oh god, please. Your software is bloated.
     
  41. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,711
    Build size for that project varied depending on the platform, but the smallest was 5GB. That's just how it is for the source files when you're working on a high-quality project primarily targeting the consoles.
     
    Last edited: Sep 24, 2023