
Which platforms use the full GPU and which don't?

Discussion in 'General Discussion' started by ARealiti, Jul 28, 2020.

  1. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
With frame rates unlocked, UWP builds can use up to 100% GPU and Windows builds about 50% on my machine. The player in the Unity editor itself only uses about 30% GPU, and WebGL can use 25-50% depending on the build and what's in it. So what on earth is going on, Unity? Can anyone shed some light on this? All three Windows platforms (UWP, native and WebGL) should be able to utilise 90%+ of the GPU with a single running instance.

    Thanks
     
  2. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
That really depends on the game and how efficient your code is. I have built some really crazy DOTS demos where my CPU was only 50% utilized and my GPU was 100% utilized. That was with a Windows build on a high-end PC.
     
    Joe-Censored and angrypenguin like this.
  3. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    The percentage of GPU usage depends on the work you're asking of the GPU, the performance capabilities of your specific GPU, and whether your CPU is capable of delivering data to the GPU faster than the GPU can process it.
     
    angrypenguin likes this.
  4. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Hi, no, it's nothing to do with that. Everything is running under 60 fps at high complexity and still only hits 40% GPU max on WebGL, and WebGL usage is so inconsistent: sometimes the most complex scenes, with almost no CPU load, still only use 20-25% GPU and lag like crazy. If they used 90%+ GPU like the UWP build, which gets 50 fps, they would be fine and the difference wouldn't be that big; but because WebGL only uses less than half the GPU for exactly the same thing, even though it is well below 60 fps, of course I'm getting a 20 fps frame rate. This isn't rocket science, Unity devs. Is this a WebGL limitation like the 4GB RAM per tab? Please respond. Basically, I don't believe WebGL is slower; it's just throttled to heck. Unity devs, answer please!
     
  5. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
    Just differences between the platforms and their optimizations. WebGL, for example, is very dependent on the browser and often doesn't have the latest features of the hardware, and Microsoft has basically deprecated UWP.

    https://www.thurrott.com/dev/206351/microsoft-confirms-uwp-is-not-the-future-of-windows-apps

Unity uses Emscripten to translate code into WebAssembly, which is then executed in a sandbox. That imposes a performance penalty which is, on average, around 10% compared to native code.

    https://docs.unity3d.com/Manual/webgl-performance.html
    https://www.usenix.org/conference/atc19/presentation/jangda

WebGL isn't immune to this kind of penalty either, meaning you now have two sources of overhead because you're using both at the same time. I wasn't able to find an official statement or benchmarks, but I'd expect at least another 10%, and the comments from people on Stack Overflow suggest that may be optimistic.

    https://stackoverflow.com/questions/17516187/performance-of-webgl-and-opengl

    So, yes, WebGL is slower and not just because it's missing features but because it's sandboxed within a browser.
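If the gap matters for a specific project, one practical mitigation is to detect the platform at runtime and scale the workload accordingly. A minimal sketch, assuming only standard UnityEngine APIs (the class name and quality level index are just illustrative):

```csharp
using UnityEngine;

// Minimal sketch: log what the WebGL (or any) build actually gets from the
// browser/driver, and drop quality on WebGL where the feature set is more limited.
public class PlatformCapabilityLogger : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"Graphics API: {SystemInfo.graphicsDeviceType}");
        Debug.Log($"GPU: {SystemInfo.graphicsDeviceName} ({SystemInfo.graphicsMemorySize} MB)");
        Debug.Log($"Compute shaders: {SystemInfo.supportsComputeShaders}");

#if UNITY_WEBGL && !UNITY_EDITOR
        // WebGL 2.0 builds typically report an OpenGLES3-class API and miss newer
        // features, so pick a lighter quality level there (index is project-specific).
        QualitySettings.SetQualityLevel(1, true);
#endif
    }
}
```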
     
    Last edited: Jul 29, 2020
    angrypenguin likes this.
  6. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
It is not reasonable to expect WebGL to perform as well as a native build. WebGL is running inside a web browser, and that creates some bottlenecks.

    UWP is basically not a relevant option. UWP was always a strange thing for Microsoft to push, and it makes sense for Microsoft to back away from UWP.
     
    Joe-Censored likes this.
  7. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Browsers also perform validations and apply some limits to WebGL content for security purposes.
     
  8. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
You're all just not getting it. If you were gamers you'd know the one thing they all try to do is max out the GPU; why do you think they overclock it? As this article puts it:

    "Low GPU usage in games is one of the most common problems that trouble many gamers worldwide. Low GPU usage directly translates to low performance or low FPS in games, because GPU is not operating at its maximum capacity as it is not fully utilized."

    https://graphicscardhub.com/low-gpu-usage/

Running at a max of 30% GPU isn't going to maximise performance, it's going to kill it stone dead! Trying to invent made-up reasons about why the browser runs slower is nonsense.

To prove that, I bought this https://assetstore.unity.com/packages/tools/gui/embedded-browser-55459 and built to Windows EXE and UWP with exactly the same Angular GUI overlaid on a 3D scene, with frame rates unlocked, as I had in my browser WebGL build. The Unity3D builds for Windows EXE and UWP run at exactly the same frame rates and GPU usage (70-90%) with and without the in-game browser Angular GUI. So if one guy by himself, using Chromium (the same code base that Chrome and Edge now use), can embed a browser in Unity3D with no performance cost, there is absolutely no performance cost to running in the browser, period! It could only be WebGL, but it isn't; it's WebGL being throttled, whether by Unity3D, the browser, or the WebGL standard, and I can't get to the bottom of it!

What's going on, Unity devs? Answer please!
     
    Last edited: Jul 30, 2020
  9. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Riddle me this, browser-slow believers o_O...
All shots at the same resolution:

Windows EXE build with in-game Chromium browser and Angular GUI overlaid, frame rate on the right: 172 FPS (screenshot attached).
Windows EXE build with nothing else: 172 FPS (screenshot attached).
WebGL with Angular in a real Chrome browser: 91 FPS (screenshot attached).

Exactly the same performance: the Chromium browser cost nothing, and both ran with the same GPU usage of about 70%, whereas the WebGL build runs at about 30% GPU and at half the speed!

What's going on with WebGL builds, Unity devs? Answer please!
     


    Last edited: Jul 30, 2020
  10. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
GPU usage 75% with the Windows EXE builds (screenshot attached).
GPU usage 35% with the WebGL build (screenshot attached). Of course it's slower; nothing to do with WebGL or the browser, just less GPU usage.
What's going on with WebGL builds, Unity devs? Answer please!
     
  11. bobisgod234

    bobisgod234

    Joined:
    Nov 15, 2016
    Posts:
    1,042
    Did you read the articles Ryiah linked to?
     
    angrypenguin and Joe-Censored like this.
  12. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
What is your CPU usage while running that? It's very likely the browser is far more CPU-bound than a native build, since everything was converted to JavaScript: it cannot use SIMD and is probably single-threaded (I'm not up to date on how widespread multithreaded JavaScript support is in browsers, or whether Unity leverages it).
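For what it's worth, one way to check whether a build is CPU-bound or GPU-bound from inside the player is to compare the CPU and GPU frame times that Unity reports. A minimal sketch, assuming FrameTimingManager is supported on the platform and graphics API in use (it simply returns no samples where it isn't):

```csharp
using UnityEngine;

// Minimal sketch: compare CPU vs GPU frame times to see which side is the
// bottleneck. FrameTimingManager needs platform/driver support; on unsupported
// setups GetLatestTimings returns zero samples.
public class FrameBottleneckProbe : MonoBehaviour
{
    readonly FrameTiming[] timings = new FrameTiming[1];

    void Update()
    {
        FrameTimingManager.CaptureFrameTimings();
        if (FrameTimingManager.GetLatestTimings(1, timings) > 0)
        {
            double cpuMs = timings[0].cpuFrameTime;
            double gpuMs = timings[0].gpuFrameTime;
            Debug.Log($"CPU: {cpuMs:F2} ms, GPU: {gpuMs:F2} ms, " +
                      (cpuMs > gpuMs ? "CPU-bound" : "GPU-bound"));
        }
    }
}
```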
     
    angrypenguin and Joe-Censored like this.
  13. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Thanks for the good question; it shows someone is thinking about what I'm saying now. But that's not the reason...
WebGL CPU usage at 13% when running the example (screenshot attached).
WinEXE CPU usage at 11% when running the example (screenshot attached).
WinEXE CPU usage at 11% when running the example with the in-game Chromium browser overlay (screenshot attached).

So no real difference, and certainly not the CPU maxing out as the bottleneck.

I think "WebGL is slow in the browser" is a myth; throttling is what's causing it, like the 4GB-per-tab memory limit. It's artificial.
     
  14. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
    Of course he didn't. He's too convinced his own thoughts are the only accurate source of information in spite of the fact that there is a plethora of information out there that says he's full of it. He's the poster child for the Dunning-Kruger effect.

    https://en.wikipedia.org/wiki/Dunning–Kruger_effect

Specifically, for anyone actually curious and wanting to learn: it's the instruction itself that carries the bottleneck, because it has to be passed through multiple layers to reach the GPU. Once it has, though, it's irrelevant whether that instruction operates on a single asset or on multiple assets at once.

With proper batching you can largely bypass the limitation, which is why it's an "average of 10%" and not a "flat 10%"; on the other hand, if you implement your app poorly you will see more than just a 10% penalty. I don't know what the upper limit is, but I'm willing to bet it's around the amount the OP is getting.
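As one concrete example of that kind of batching, GPU instancing submits many copies of a mesh in a single draw call. A minimal sketch, assuming a mesh and an instancing-enabled material are assigned in the Inspector:

```csharp
using UnityEngine;

// Minimal sketch: draw many copies of one mesh in a single instanced call
// instead of one draw call per object. Assumes 'mesh' and 'material' are
// assigned in the Inspector and the material has "Enable GPU Instancing" on.
public class InstancedGrid : MonoBehaviour
{
    public Mesh mesh;
    public Material material;

    Matrix4x4[] matrices;

    void Start()
    {
        matrices = new Matrix4x4[1000];
        for (int i = 0; i < matrices.Length; i++)
        {
            var pos = new Vector3(i % 32, 0f, i / 32) * 2f;
            matrices[i] = Matrix4x4.TRS(pos, Quaternion.identity, Vector3.one);
        }
    }

    void Update()
    {
        // One call per frame covers up to 1023 instances.
        Graphics.DrawMeshInstanced(mesh, 0, material, matrices);
    }
}
```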

    SIMD is functional within Firefox but it's completely absent from Chrome.

    https://blog.mozilla.org/javascript/2015/03/10/state-of-simd-js-performance-in-firefox/
    https://bugs.chromium.org/p/v8/issues/detail?id=4124

There are ways to multithread your code in JavaScript, and Unity has had experimental support for it since 2019.1. That said, some browsers have the required features disabled by default due to security exploits like Spectre.

    https://forum.unity.com/threads/multithreading-and-webgl.817986/#post-5488017
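If someone wants to try that experimental threading support, a minimal editor-script sketch would look like the following, assuming the PlayerSettings.WebGL.threadsSupport and memorySize properties exist in the Unity version being used (and noting that browsers also need SharedArrayBuffer enabled):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Minimal sketch: toggle Unity's experimental WebGL threading from an editor
// script before building. Assumes PlayerSettings.WebGL.threadsSupport exists
// in this Unity version; browsers must also allow SharedArrayBuffer.
public static class EnableWebGLThreads
{
    [MenuItem("Build/Enable Experimental WebGL Threads")]
    static void Enable()
    {
        PlayerSettings.WebGL.threadsSupport = true;
        PlayerSettings.WebGL.memorySize = 512; // MB; a fixed heap size for the threaded build
    }
}
#endif
```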
     
    Last edited: Jul 30, 2020
    NotaNaN, De-Panther, GCatz and 2 others like this.
  15. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Oh, here we go: the typical inexperienced developer who has never done anything real on the web. Come back when you've built some real browser systems, instead of just quoting "sandbox" as the reason as if you even know how that works.

    Show me your great examples with only 10% performance loss.

From Unity themselves, which I agree with...
"What kind of performance can you expect on WebGL? In general, you should get performance close to native apps on the GPU, because the WebGL graphics API uses your GPU for hardware-accelerated rendering."
    https://docs.unity3d.com/Manual/webgl-performance.html

    It should be "close" not using less than half as much GPU resource and running more than twice as slow.

So unless you have something real to add beyond a few quotes that mean nothing, like real data from a study giving the exact percentage of performance lost, rather than just quoting things and declaring that you're right and I'm wrong, then shut the ... up b...
     
    Last edited: Jul 30, 2020
    bluecollar likes this.
  16. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
From the very same page, the sections that you chose to ignore because they agree with my statement...
     
    De-Panther and angrypenguin like this.
  17. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
    Show me your great examples with only 10% performance loss.
     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
    I provided my evidence backed up by professionals. Feel free to peruse the links I provided.
     
  19. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Cop out much? "Close", not half or less, Mr Troll. Your inexperience is obvious and you hijacked an important thread. If I ignored you, that was up to me; what you said was rubbish and not based on your abilities or experience.
     
  20. bobisgod234

    bobisgod234

    Joined:
    Nov 15, 2016
    Posts:
    1,042
    Obviously this is the kind of thing you want to post on an "important thread", along with lots of exclamation marks and shouting. That will get you an answer to your question.

May I ask how a WebGL build and a native build with an integrated web browser are comparable? In the latter case, the heavy lifting (rendering, skinning) is being handled by multi-threaded native code, as opposed to the single-threaded WebAssembly of the former. There is a good chance that this is where the bottleneck is.

    Have you tried other engines that have a WebGL option vs native (e.g. Godot)? That might help reveal if it's a bottleneck on the Unity side or the WebGL side of things.
     
    Last edited: Jul 30, 2020
    KyleOlsen and Ryiah like this.
  21. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
    Godot has awful performance on WebGL but in this case it's due to fewer resources invested in its development.

    https://www.reddit.com/r/godot/comments/cirv9c/how_viable_is_html5/
    https://www.reddit.com/r/godot/comments/hdzhwi/question_about_godots_performance_especially/

    Meanwhile Epic Games removed support for it from the engine starting with UE 4.24. You can still make use of it but it's now up to the community to develop it. The repository it's in though has three contributors and was last updated two months ago. It's basically dead.

    https://docs.unrealengine.com/en-US/Platforms/HTML5/GettingStarted/index.html
     
    Last edited: Jul 30, 2020
    Joe-Censored and bobisgod234 like this.
  22. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
See, this is a good suggestion, but did you notice what "know it all" did? They immediately came in and squashed your reasonable and intelligent suggestion with their zero-experience "knowledge" based on their "reading".
Yes, I will try other engines to see if anything else takes full advantage of the GPU with WebGL, like three.js, Babylon and PlayCanvas. I'll ignore "know it all", who should have just said from the beginning "I hate WebGL! And I have never used it but I just know everything!" and not bothered with all their rubbish.
     
    Last edited: Jul 30, 2020
    bluecollar likes this.
  23. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
The ANSWER: Unity is c... with WebGL.

The winner: PlayCanvas kicks, end of discussion, WebGL is fine. Unity only using 30% GPU s...ks big time, "know it all" knows nothing, end of discussion.

PlayCanvas Mozilla demo for the WebGL 2 release using 80% GPU (screenshots attached).
I will later post a PlayCanvas version of the same demo I posted above for the Unity3D Windows EXE and WebGL builds.
     
  24. bobisgod234

    bobisgod234

    Joined:
    Nov 15, 2016
    Posts:
    1,042
    I built the default URP scene in WebGL and it uses 98% of my GPU.

I think GPU usage is going to depend very heavily on the game itself. 100 very dense CPU-skinned meshes will probably be CPU-bound (and limited to one thread, so low apparent CPU usage), whereas 100 layers of transparent full-screen quads on a 4K monitor will probably max out your GPU.
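For anyone who wants to reproduce the GPU-bound end of that comparison, here's a minimal sketch that deliberately creates a fill-rate-bound scene; it assumes a material with a transparent shader is assigned in the Inspector:

```csharp
using UnityEngine;

// Minimal sketch: stack many transparent full-screen quads in front of the
// camera to create a deliberately fill-rate-bound (GPU-bound) scene.
// Assumes 'transparentMaterial' uses any transparent shader.
public class FillRateStress : MonoBehaviour
{
    public Material transparentMaterial;
    public int layers = 100;

    void Start()
    {
        Camera cam = Camera.main;
        for (int i = 0; i < layers; i++)
        {
            var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
            quad.GetComponent<Renderer>().material = transparentMaterial;
            // Place each quad slightly further from the camera and scale it to fill the view.
            quad.transform.SetParent(cam.transform, false);
            quad.transform.localPosition = new Vector3(0f, 0f, 1f + i * 0.01f);
            quad.transform.localScale = new Vector3(4f, 4f, 1f);
        }
    }
}
```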
     
  25. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
PlayCanvas is much faster than Unity3D, though; I'm tempted to switch. I'll try the URP scene and see what I get, since I'm just using the legacy pipeline. Thanks for giving it a go.
     
    Last edited: Jul 30, 2020
  26. andyz

    andyz

    Joined:
    Jan 5, 2010
    Posts:
    2,276
Having worked with WebGL, I have concluded it is unusable web tech, at least via Unity: the resource cost is too high, it depends on browser choice/version, Mac Safari complains your page is using too much energy, and mobile doesn't really work!
Using a lighter framework may work, but the state of HTML5 is disappointing; I think in part it's Apple pushing for native apps as usual.
UWP is something I never wanted and was a poor solution next to normal (Win32?) apps, so I'm glad to hear it may be going!
     
  27. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
Native JavaScript WebGL frameworks are usually going to be more efficient in browsers than Emscripten-based engines like Unity, Godot and UE4, especially in terms of memory usage and resource management. If you can use them (i.e. your programmers have web development experience and you are willing to deal with the much less robust 3D scene tooling and content pipeline), you should.
     
    De-Panther and Ryiah like this.
  28. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Hi, KH. Because the business project I'm working on at the moment is all point cloud LiDAR data, without much need for physics, fancy effects, shaders or lighting, I'm architecting the GUI in Angular and abstracting the 3D viewer as a plugin/service. That allows plugging in WinEXE (Unity3D), WebGL (PlayCanvas, or maybe Unity Wasm for things not requiring high performance) and now WebGPU (using Chrome Canary, in hand-written code, and perhaps Babylon/PlayCanvas when they support it). It's a one or two year project, so WebGPU may well be fully released by mid next year; whenever that is, the plugin viewer architecture will allow WebGPU to be "plugged in" to the Angular GUI without modification. The in-game browser https://assetstore.unity.com/packages/tools/gui/embedded-browser-55459 works great and allows exactly the same GUI to run in a WinEXE build, using the same Angular source, as runs in an external browser with WebGL or WebGPU.

The plugins using browser-native code (i.e. JavaScript) for WebGL and WebGPU also avoid the 4GB-per-tab limit, as they are JavaScript and not WebAssembly.

    Thanks for the discussion. Appreciated.
     
    Last edited: Aug 1, 2020
  29. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Unity needs to stop using Emscripten, hire some developers with real web coding experience, and just get a WebGPU build done. Their own proprietary Tiny/whatever Canvas 2D doesn't cut it; they need to stop moaning and parading around as though they know how to code in the browser, and get some Google devs or something working with them.
     
  30. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    Can you show us your stats popup and the profiler? Screenshots and hardware stats aren't actually helpful on their own because they may well show there's an issue but they give no indication of where it's coming from. For all we know your scene is just choking on loads of tiny batches that aren't an issue for a native client with a different graphics API.

    What kind of CPU is that?

    On my own CPU (quad core with hyperthreading) that's exactly what I'd expect to see in WebGL: one thread maxed out showing 12.5% total CPU usage, and the other threads with almost nothing to do for reasons already covered by @Ryiah. In other words, on my PC that's precisely what a maxed out CPU bottleneck would look like.

    Having lower CPU usage and higher GPU usage in native versions doesn't surprise me, though they're not comparable environments.

As a game developer, I'm sure that you know that higher GPU utilisation is achieved by the CPU giving it more stuff to do. Assuming you can't upgrade the hardware itself, that means one of two things:

    - The CPU being able to submit more instructions. This requires having more processing efficiency CPU-side to be able to create and issue more work items to the GPU. This in turn needs things such as efficient native code, multi-threading and use of modern, low-level graphics APIs designed for this (eg: Metal, Vulkan). WebGL's access to those things is hit and miss.

- The CPU submitting instructions which include more GPU work. This is typically achieved by effective batching at some stage of the pipeline, from well-constructed art assets to real-time batching algorithms, or various things at the stages in between. These are up to the developer(s) to optimise for.

    An example of the latter is constructing your scene in a way that Unity can submit huge chunks of it to the GPU for drawing at once, rather than separately submitting each individual piece.
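As a minimal sketch of that idea, Unity can combine a whole hierarchy of non-moving meshes into static batches at runtime, assuming the children share materials and never move:

```csharp
using UnityEngine;

// Minimal sketch: combine all meshes under one root into static batches so
// Unity can submit large chunks of the scene per draw call instead of one
// draw call per object. Assumes the children share materials and never move.
public class CombineEnvironment : MonoBehaviour
{
    public GameObject environmentRoot;

    void Start()
    {
        // Marks and combines the child meshes for static batching at runtime.
        StaticBatchingUtility.Combine(environmentRoot);
    }
}
```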
     
    MartinTilo, bobisgod234 and Ryiah like this.
  31. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Thanks, but it really does come down to Unity not knowing what they're doing on the web/browser and putting all their eggs into the app store / native camp, in line with Apple.

This is a much bigger battle than Unity and Unity devs think it is. JavaScript is by far the number one language in the world, and Google, Microsoft, Amazon, Facebook and a lot of others are all lining up against Apple and the App Store native-only model, not just in 3D/games...

    https://www.cnet.com/news/google-web-app-plans-collide-with-apple-iphone-safari-rules/
     
  32. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    They may not be browser experts (I don't know), but keep in mind that to them WebGL is just one of 27 platforms. When I chose Unity to work in it wasn't because they were ever the best at any given platform, it's because as long as I looked after the design side of things they took care of 99% of the compatibility considerations when targeting arbitrary other platforms.

    If you specifically need to just get the most out of a web browser then I'd suggest looking at tools which focus on that.

    That aside, if you want to get the most out of any real-time 3D platform then the stuff people have mentioned here is going to be important at some point or another regardless of platform. Data doesn't get to the GPU or VRAM by magic. That stuff has been getting optimised one step at a time since real-time 3D graphics were a thing, and if your content isn't designed to take advantage of those optimisations where relevant you're really just hoping it'll work via brute force.
     
    De-Panther, MartinTilo and Ryiah like this.
  33. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
This. PlayCanvas getting more performance than Unity is completely expected since it's made for WebGL. Unity is made to be cross-platform, and one of the side effects of being cross-platform is that you're not spending all of your resources on just one platform.

It very much depends on who you ask. According to the IEEE, JavaScript is fifth, preceded by Python, Java, C, and C++. Even if it were first, though, it wouldn't be solely because of front-end development but because it can be used for the full development stack and has features like asm.js, Emscripten, simd.js, etc. that expand its usefulness.

    https://spectrum.ieee.org/at-work/tech-careers/top-programming-language-2020
     
    De-Panther likes this.
  34. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Geez, you're going back to an organisation that was last relevant in the last millennium; they still recommend standards for waterfall development. Next you'll be bringing out quotes from guys in white coats at IBM.

    https://insights.stackoverflow.com/survey/2020#technology-programming-scripting-and-markup-languages
     
    Last edited: Aug 2, 2020
  35. ARealiti

    ARealiti

    Joined:
    Oct 4, 2013
    Posts:
    133
Unity needs to stop using Emscripten, hire some developers with real web coding experience, and just get a WebGPU build done. Their own proprietary Tiny/whatever Canvas 2D doesn't cut it; they need to stop moaning and parading around as though they know how to code in the browser, and get some Google devs or something working with them.
     
  36. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,572
Acting arrogant and looking down on others will make people assume that you're very young and new to programming in general, and it will decrease your chances of getting any assistance in the future. Something to keep in mind.

A more reasonable idea would be to ditch the web as a platform, as it seems to be quite volatile. Basically, there doesn't seem to be any guarantee that your application will remain usable for a long time if it is made for the web.

I used to enjoy Ken Perlin's work, for example. His site had tons of small and amazing visual demos... except they were all written in Java as applets. Applets got axed, so nothing works there anymore. There were also tons of Flash games, and Flash got axed too.

At least on a normal PC, applications last much longer before they can no longer be launched.

To theoretically reach full GPU/CPU utilization, you'd need to work multithreaded and disable VSync. However, this isn't necessary in most cases, as you may end up wasting electricity for nothing. Normally, an application waits for the vblank signal before refreshing, and while it waits neither the CPU nor the GPU is doing anything useful. Disabling the wait for vsync makes the computer redraw the scene continuously, likely causing screen tearing and increased battery drain. In that scenario you'll either saturate the GPU first or reach 100% utilization... on a single CPU core, unless you've invested in DOTS.
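In Unity terms, a minimal sketch of unlocking the frame rate for that kind of test, using the standard QualitySettings and Application APIs, looks like this (WebGL builds still sync to the browser's requestAnimationFrame, so it mostly affects native builds):

```csharp
using UnityEngine;

// Minimal sketch: disable vsync and the frame-rate cap so the engine renders
// as fast as the CPU/GPU allow. Useful only for benchmarking; it wastes power
// and can cause tearing in normal use.
public class UnlockFrameRate : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;     // don't wait for vblank
        Application.targetFrameRate = -1;   // no engine-imposed cap
    }
}
```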

Generally, I'd be inclined to treat hitting 100% GPU utilization in my application as a problem, because it means that less powerful hardware wouldn't be able to handle the game.
     
    Last edited: Aug 2, 2020
  37. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,639
Yeah, having done some web development myself, I'm actually amazed that a Unity project can be built for the web in any capacity.
     
  38. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    To be honest, though, that's one of the really neat things about Unity and its truckload of supported platforms. One of my old games has been built for PC, iOS, Android, the old WebPlayer, Flash, WebGL, probably console via UWP, and I didn't have to do any of the underlying work to make it compatible with any of those platforms.

When the native web plugin got canned I basically just did a new build, ran some QA, and... that was it. Same deal when Flash got the boot and things moved over to WebGL.

    If I'd directly supported one of those platforms myself then I'd have got a bit more performance, I'm sure. But then the project's web build support would have just been dead when the native plugin was canned, because it was a small project that wouldn't have warranted a manual port to an unfamiliar technology.
     
    De-Panther, MartinTilo and Ryiah like this.
  39. zombiegorilla

    zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    9,052
Watch the language and insults. Thread bans have been issued; stay on topic and don't get personal.
     
    De-Panther likes this.
  40. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,195
I remember reading up on the technologies they are using (asm.js, Emscripten, WebAssembly) back when they initially announced it, and being very impressed that there is only a relatively minor performance penalty when you consider that they're not compiling down to native code, but rather into JavaScript and Wasm for execution in a browser.

Wikipedia has a couple of examples of Emscripten taking C and translating it into JavaScript, if you're curious.

    https://en.wikipedia.org/wiki/Asm.js

    And a C snippet compiled into Wasm.

    https://en.wikipedia.org/wiki/WebAssembly
     
    Last edited: Aug 3, 2020
  41. De-Panther

    De-Panther

    Joined:
    Dec 27, 2009
    Posts:
    589
    Some of the employees of Unity did work on browser development and on web standards.

The difference in performance between web-based game engines and Unity is not because of JS vs Wasm, but because those engines are optimized specifically to run in the browser.
When comparing the performance of JS vs Wasm, Wasm has the advantage; it's just that Unity doesn't use it to its maximum potential in the Unity WebGL build (with Tiny they do some amazing stuff, but it's only the beginning).

If you need something to run well on the web, you can use a web-based render/game engine like Babylon.js, PlayCanvas or three.js. Or you can optimize your Unity project to run better on WebGL; don't expect that something you built for native desktop will run exactly the same on other platforms.
I've seen stuff made with Unity that looks great and runs fast in the browser, and I've seen stuff developed with web rendering engines that looks bad and struggles to run in the browser.
The engine is a tool; you need to learn how to use it right.

Regarding ditching the web platform because Flash games or Java applets are no longer available: those were never web standards.
But games and experiences that were developed using web standards, like WebGL, still work years later...

I think Unity has lots of room for improvement on the web platform, but saying that they don't know web development, or that they shouldn't use Emscripten/WebAssembly, mainly says that whoever claims that needs to check their sources more carefully.
     
    DanOrgan likes this.
  42. DanOrgan

    DanOrgan

    Joined:
    Mar 13, 2018
    Posts:
    6
I've tried your WebXR Exporter fork (https://github.com/De-Panther/unity-webxr-export) and it was pretty impressive, even on Oculus Go. For comparison, the old WebVR exporter was unusable because of its low fps.

But I have the impression that Babylon/three.js applications are faster on mobile VR devices. Do you recommend switching to web-based game engines if I want to make a game for Oculus Quest/Go? Or is it worth trying to optimize a Unity project as much as possible?
Have you tried comparing your WebXR exporter to Babylon.js/three.js? I mean performance-wise, of course.
     
  43. De-Panther

    De-Panther

    Joined:
    Dec 27, 2009
    Posts:
    589
This is not the thread for this question, so I'll try to keep it short.
Unity WebGL is not optimized for mobile. As such, it is harder to optimize it to reach the same performance on mobile that web-platform-specific tools can.
Having said that, Unity has a great toolset and pipeline, and also lots of ready-made assets and components/systems.
So it's a question of what will take you longer: optimizing the game, or the other aspects of the development.

Anyway, if the question is between Babylon.js and three.js, in most cases I'd recommend Babylon.

But it all depends on the project and the scope.