Question Near-term status of Unity 6 and Desktop 4Gb Web support

Discussion in 'Web' started by ferretnt, May 9, 2024.

  1. ferretnt

    ferretnt

    Joined:
    Apr 10, 2012
    Posts:
    417
Although Unity 6 now has an option to exceed 2 GB and set a maximum memory size of 4096 MB for desktop browser builds, a quick test suggests we're still some way from being able to ship a build that actually uses 4 GB, even if you restrict users to the latest (stable) Chrome...

    Based on a very quick test: a simple WebGL spinning cube (default scene plus a cube with an Animator), with a single script whose Start() function allocates three 1 GB NativeArrays (or C# arrays), will run, but with some strange WebGL lighting artefacts. Commenting out the allocations makes the scene render correctly again.
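    For concreteness, a minimal sketch of the kind of repro script described above (the component name, byte element type, and exact allocation pattern are my own assumptions, not the poster's actual code):

    ```csharp
    using Unity.Collections;
    using UnityEngine;

    // Hypothetical repro: allocate three 1 GiB native buffers in Start()
    // to push the WebAssembly heap past the 2 GB mark.
    public class BigAllocRepro : MonoBehaviour
    {
        const int OneGiB = 1024 * 1024 * 1024; // fits in an int (NativeArray length is int)
        NativeArray<byte> a, b, c;

        void Start()
        {
            a = new NativeArray<byte>(OneGiB, Allocator.Persistent);
            b = new NativeArray<byte>(OneGiB, Allocator.Persistent);
            c = new NativeArray<byte>(OneGiB, Allocator.Persistent);
        }

        void OnDestroy()
        {
            if (a.IsCreated) a.Dispose();
            if (b.IsCreated) b.Dispose();
            if (c.IsCreated) c.Dispose();
        }
    }
    ```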

    The same scene in WebGPU crashes immediately (commenting out the memory-allocating script allows it to run correctly).

    (Note that in both cases "Target WebAssembly 2023" was checked, because it's my understanding that it has to be for a >2 GB heap.)
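    For reference, the relevant Player Settings can also be set from an editor script. I believe these properties exist as of Unity 2021.2+, but treat the API names and the specific values here as assumptions to verify against your Unity version:

    ```csharp
    using UnityEditor;

    // Editor-only sketch: configure the WebGL heap sizes for a >2 GB build.
    // Sizes are in MB; the 512/4096 values are illustrative, not recommendations.
    public static class WebHeapConfig
    {
        [MenuItem("Tools/Configure 4GB Web Heap")]
        static void Configure()
        {
            PlayerSettings.WebGL.initialMemorySize = 512;   // MB
            PlayerSettings.WebGL.maximumMemorySize = 4096;  // MB
            PlayerSettings.WebGL.memoryGrowthMode = WebGLMemoryGrowthMode.Geometric;
        }
    }
    ```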

    Now, in fairness, Unity 6 does warn you about setting a >2 GB limit, directing you to the link below, which (paraphrased) says that some parts of the browsers' resource handling don't really support >2 GB correctly. (Presumably that affects any WebGL tech, not just Unity, i.e. there *shouldn't* be ANY 4 GB web pages out there built on other engines that run, but I have seen some that seem to work??)

    https://issues.chromium.org/issues?q=1476859

    We would like to make some technical plans around whether we'll ever be able to use 4 GB of RAM on desktop browsers, without which our game is pretty crippled compared to its own high-end phone SKU. So, a two-part question:

    (1) Does Unity consider all of the engine-side work "done", with all of the remaining work falling to browser vendors, and if so, is there a predicted desktop Chrome version in which Unity expects all of this to work? I assume that Unity has a set of outstanding Chrome/Safari/etc. tickets they're watching, after which 4 GB should be fully working. Can that list be shared?

    (2) If we have an app which really does need >2 GB of RAM, is there any way we can order our allocations, or pre-allocate, to reduce crash frequency, since it seems like specific WebGL (and WebGPU, based on the crash above?) APIs are the problem? (In practice I'm not sure how easy that is, since most of our 4 GB needs to be GPU-accessible at some point; it's not like we have 2 GB of data that could just live in the Mono heap...)
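    One possible shape of that mitigation, sketched under the assumption (not confirmed anywhere in this thread) that forcing the Wasm heap to its final size early, before GPU-heavy loading, avoids heap growth during the affected graphics calls:

    ```csharp
    using System.Collections.Generic;
    using Unity.Collections;
    using UnityEngine;

    // Hypothetical mitigation: reserve large CPU-side buffers in chunks at
    // startup so the heap reaches its final size before any rendering work.
    // Chunk count is illustrative; NativeArray length is an int, so each
    // chunk stays at 1 GiB.
    public class EarlyHeapReserve : MonoBehaviour
    {
        [SerializeField] int reserveChunks = 3; // 1 GiB each
        readonly List<NativeArray<byte>> reserves = new List<NativeArray<byte>>();

        void Awake()
        {
            const int OneGiB = 1024 * 1024 * 1024;
            for (int i = 0; i < reserveChunks; i++)
                reserves.Add(new NativeArray<byte>(OneGiB, Allocator.Persistent,
                                                   NativeArrayOptions.UninitializedMemory));
        }

        void OnDestroy()
        {
            foreach (var r in reserves)
                if (r.IsCreated) r.Dispose();
        }
    }
    ```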

    I really LOVE all the work Unity is putting into turning the web, everywhere, into a usable platform, and I know that Unity is pushing a very large rock up a hill, with a lot of other companies needing to play ball. That's been true ever since "Web was the big new platform for Unity 5"... I won't go quite off the deep end like Alexander St John's LinkedIn rant about the state of web tech, but it's 2024, and in terms of RAM a desktop browser has the same spec as a 2014 iPad Air 2...
     
    Last edited: May 9, 2024
  2. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,974
    Well, it would be risky to "plan for" something that may or may not work flawlessly, even if some other party were to say: "sure, it will work by date X". So yes, you ought to have a backup plan for the 2 GB version in any case.

    Perhaps rely on 2 GB to begin with, since it's your safety net, and treat the "4 GB boost" as something that by necessity has to be an extension. Then you make it work by merely batch-exporting either the "2k" or the "4k" quality assets. You're welcome. :)

    Unless the memory isn't for assets but for procedural content, which would also get rid of the initial multi-gig download. In that case you would design your generative algorithms to work with a 2k - 4k quality slider of some sort.

    What about system requirements? Perhaps you want to fall back to 2k mode at runtime if the user's system has no more than 8 GB of memory installed, which typically means barely 4 GB of available memory under the best of conditions.
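    A sketch of that runtime check, assuming Unity's SystemInfo.systemMemorySize (reported in MB) is a good enough proxy for installed RAM; the 8 GB threshold is the one suggested above, not a tested cutoff:

    ```csharp
    using UnityEngine;

    // Hypothetical quality fallback: drop to "2k" assets on machines with
    // 8 GB or less of installed RAM.
    public static class QualityFallback
    {
        public static bool Use2kMode =>
            SystemInfo.systemMemorySize <= 8192; // MB, as reported by Unity
    }
    ```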

    You will have to plan and design a memory-flexible solution to begin with, because it's in your best interest every way you look at it! ;)
     
  3. ferretnt

    ferretnt

    Joined:
    Apr 10, 2012
    Posts:
    417
    OK, sorry, I wish I hadn't written "ever be able", because clearly that got focused on rather than the specific questions I numbered...

    A full list of (cross-organizational: Unity, Chromium, etc.) tickets needed to get to a working 4 GB web build, whilst not a date, would be a great thing. I assume Unity has such a list internally, and my first mistake was asking on the forum. That list would at least represent a burn-down. Granted, making plans based on the burn-down of an initiative driven by committee is its own adventure...

    Yes, I am aware you can scale content (it's what I do for a living...), but not all content scales elastically. Assume we know how to scale content and that this question is independent of that. Given finite effort, min specs still exist for a reason. In 2024, we'd love the min spec not to be 2 GB.
     
  4. CodeSmile

    CodeSmile

    Joined:
    Apr 10, 2014
    Posts:
    6,974
    I trust you know what you are doing. WebGL isn't a target audience that I would expect to favor high-memory-usage games, or even be able to play them. So I hope you're catering to a profitable niche where developing web content at that scale actually provides a positive ROI. I have my doubts; I guess that came through in my post's subtext. ;)