How to handle truly open world maps with Unity, due to floating point limitations?

Discussion in 'World Building' started by Marcos-Elias, May 9, 2018.

  1. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    DOTS is perfectly usable for this right now in the hands of more generally experienced developers. Core ECS and physics were two parts of DOTS that were amazingly stable from day one, far more so than SRPs ever were or still are. A good number of commercial projects using DOTS are also in the works, according to Unity (I don't have the post handy, but it was fairly recent).

    We don't do open world by design, but we do pretty large scale worlds using DOTS physics. We have terrain, thousands of dynamic building structure pieces, tens of thousands of vegetation colliders, and characters and vehicles. And I can rebuild that every frame with zero main thread impact.

    Transforming that to another space would almost be white noise relative to everything else.

    But open world would IMO require a lot of the same performance-first design approaches that scale forces, just for different reasons: best-practice approaches pretty much everywhere. Even within DOTS physics we use more advanced approaches like dynamic compound collider partitioning, multiple physics worlds to separate static content, etc.

    The biggest obstacle *is* performance. Transforming is cheap, but moving often is not. So you end up needing rather deep knowledge of a lot of different areas in order to create implementations that can move things performantly enough, and it generally means using lower-level APIs.

    Also, transforming things every frame is nothing new, and it's not continuous vs. periodic; it's whatever fits the context best. Since the cost of moving can vary a lot, different areas might have their own context-specific logic for how often they transform. Very often it might be based on whatever internal partitioning strategy is used, where that partitioning exists for other reasons. I would say transforming is probably best thought of as feature-specific, i.e. don't create some grand design that tries to force a single approach. That can only end badly IMO.
     
    Marcos-Elias and Lurking-Ninja like this.
  2. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    @Marcos-Elias, before I answer your questions I will let a game of the year answer some. In the video linked below, Alex Beachum describes a Unity game, Outer Wilds, written with continuous floating origin (timestamps 31:49-32:49). From now on, think of these two things together: "continuous floating origin" and "game of the year".
     
    Deleted User likes this.
  3. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    The assumption that one cannot smoothly travel more than 10 km across a fixed map is not supported by evidence. In the link below, I move smoothly over a distance of more than 50 km with no jitter evident. There IS a small amount of one form of jitter in one of the two towers at 55,000 m from the origin; it is removed in the second tower. Unity 2018, all float32, not one double.

     
    Last edited: May 22, 2021
    FlightOfOne and deus0 like this.
  4. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    I fully support the suggestion that Unity should provide floating origin natively; note that static objects are useless in a system where everything moves. However, such a change must be based on evidence: proof that the solution will scale and not fail due to glitches, speed, or some other factor.

    The problems discussed here are not caused by limitations of Unity but by limitations of the design and algorithms that manifest jitter and the proof is that I can solve these issues with Unity as it is. Changing Unity based on incorrect assumptions and unproven ideas about the nature of the solution would only harm all who develop and use it.

    Admittedly, there may be some smoking gun that is a problem with moving everything, but so far I have not found it, and I will continue to test and prove everything I can.
     
  5. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    As a somewhat late addition to the discussion initiated by @Marcos-Elias, I would like to include another aspect relating to the development of large scale worlds. A scenegraph that spans a large continuous space (e.g. over 50 km) can sometimes exhibit what I call distant relative jitter: a nearby object visibly jitters because it is connected to a very distant scene node. I combine CFO with "Dynamic Resolution Spaces" (DRS) to solve this. My video above did not demonstrate this so well, and the one attached below is clearer. It was with versions of these two assets that I was able to have a single static scenegraph spanning the Solar system. DRS can be used to handle the dynamic modifications to the scene as you approach something, to ensure it is repositioned more accurately within a higher-resolution space. There is no need for a fixed subdivision/sectorisation model, but it will work with one.
     
    deus0 and Marcos-Elias like this.
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The world's contents will exhibit drift even if they are being moved to the player. A force applied at 50 km cannot be as accurate as one nearby. How bad that is probably isn't noticeable at 50 km, but it will keep getting worse.

    So for some things it might be a problem. Certainly not for Outer Wilds.
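    (For a concrete sense of scale, a small standalone C# snippet, not from this thread, that prints the gap between adjacent 32-bit floats at a few distances; at 50,000 m the available positional resolution is already about 4 mm.)
    Code (CSharp):
    // Standalone illustration: print the spacing between a float value and the next
    // representable float, i.e. the best positional resolution at that distance.
    using System;

    static class FloatSpacing
    {
        static void Main()
        {
            foreach (float x in new[] { 1f, 1000f, 50000f, 500000f })
            {
                int bits = BitConverter.ToInt32(BitConverter.GetBytes(x), 0);
                float next = BitConverter.ToSingle(BitConverter.GetBytes(bits + 1), 0);
                Console.WriteLine($"{x,9} m -> spacing {next - x} m");
            }
        }
    }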
     
    awesomedata likes this.
  7. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    And this is a very good use case supporting @Marcos-Elias's call for more open access to the lower-level physics API. With that access, distant physics could be handled by mathematical modelling and only presented visually when close. The mathematical modelling can be centred at the origin, no matter what reference position the visual representation of those physical objects uses.
     
    awesomedata and deus0 like this.
  8. aaronmichaelfrost

    aaronmichaelfrost

    Joined:
    Feb 4, 2021
    Posts:
    39
    Check out my solution. I'm using it in my own game: https://assetstore.unity.com/packages/slug/204179.

    It's on the asset store and it supports Mirror/Multiplayer. You just have to DM a staff member in the Discord if you need the Mirror extensions.

    There is also this getting started guide video:
     
    Erebar and Marcos-Elias like this.
  9. Tony-Lovell

    Tony-Lovell

    Joined:
    Jul 14, 2014
    Posts:
    127
    I'm always perplexed why people have physics-related offerings that defy origin shifts. Is the issue that the physics layer does not allow you to simply go in and alter the positions within the history of the rigidbodies?

    I mean, if I want to move an object to the right 2000 units, shouldn't I be able to simply tweak its data to make it think it is 2000 units further to the left the moment before I do so, resulting in a NOP no matter its prior state?
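    (A minimal sketch of the kind of "NOP shift" being asked about, using only the public Rigidbody API; the helper name is mine.)
    Code (CSharp):
    // Sketch: teleport a rigidbody by a fixed offset without generating forces,
    // sweeps or interpolation toward the new position.
    using UnityEngine;

    public static class RigidbodyShiftUtil
    {
        public static void Shift(Rigidbody rb, Vector3 offset)
        {
            rb.position += offset;   // direct teleport, no sweep, no forces applied
            // Linear and angular velocities are untouched, so if everything else is
            // shifted by the same offset the body should not notice the move.
        }
    }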
     
    awesomedata and cosmochristo like this.
  10. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I think, at least in Unity, this sometimes becomes a chicken-and-egg situation. Unity does things in discrete steps, and all operations to "move" objects execute before other parts of the program are allowed to run and eventually let the overall game loop tick over.

    Because of this, and because we have no direct control over these systems (one might have somewhat better control of the game 'ticks' in DOTS/ECS, though I'm not sure about that yet, since the "update" loop might still be doing what it currently does and submitting all rendering batches ASAP, leading to the rendering jitters), this batching, while saving performance by not always being selective about which meshes and GameObjects it moves, prevents us from having direct access to the "moved" mesh data until the next cycle of the overall game loop.

    This is the only time we're allowed to move the mesh data itself (directly) outside of just before the shader/material rendering pass. This may have changed though, so anyone @Unity please correct me if I'm wrong.
     
    cosmochristo likes this.
  11. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    We really need to have a clearly defined API that allows control of important stages of the processing pipeline during each frame.
    I thought I had found an answer: set
    Code (CSharp):
    Physics.autoSyncTransforms = false;
    and manually control physics updates. However, this setting had no effect (Unity 2020.4.37).
    I called it from an Update loop, not FixedUpdate, and I don't see why that should matter.
    The API documentation is very concise, which is good, but more detail, clarity and explanation, with examples are needed.
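    A minimal sketch of the kind of manual control described above, assuming the 2020-era API (Physics.autoSimulation, Physics.SyncTransforms and Physics.Simulate; the class name is illustrative):
    Code (CSharp):
    // Sketch only: step PhysX manually so transform changes (e.g. an origin shift)
    // can be applied and synced at a controlled point in the frame.
    using UnityEngine;

    public class ManualPhysicsStep : MonoBehaviour
    {
        void OnEnable()
        {
            Physics.autoSimulation = false;      // stop Unity stepping physics on its own
            Physics.autoSyncTransforms = false;  // stop automatic Transform -> physics sync
        }

        void FixedUpdate()
        {
            // ... apply forces and perform any origin shift on Transforms here ...
            Physics.SyncTransforms();                // push Transform changes into the physics scene once
            Physics.Simulate(Time.fixedDeltaTime);   // advance the simulation one fixed step
        }
    }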
     
  12. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I've always agreed with this point. The "concise" is only useful to the programmers who developed the API. As someone who _uses_ it though, I would argue that more documentation on software is _always_ better -- even if it gets to be a bit long-winded. At the very least you can't blame the tool developers when you don't understand something.


    To be fair, the game update loop used to be clearly defined and the API seemed to follow that (despite a lot of it still being a bit of a black-box), but true _control_ over those steps of the physics and rendering processing pipeline was off-limits for a very long time until just recently-ish. They are still under heavy construction right now. I'm sure we won't get a true solution to this problem until the end of 2023 -- at the soonest.

    Unity claims its World Building department is in full-swing, but I'm not sure it's taking into account floating origin worlds. That being said, Joachim himself said they were working on it at least a few years ago, around the time DOTS was really getting underway. So perhaps they've finally found an acceptable solution for larger worlds -- and not just on the vertical axis -- and maybe that will be built into the World Building tools.

    Perhaps we'll know when MegaCity is released again.
     
    cosmochristo likes this.
  13. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    The biggest problem with DOTS and its related features is that it is incompatible with PhysX, requiring the use of Unity Physics or other custom solutions. So all the vehicle systems made with PhysX will not work properly, thus requiring a rewrite of the entire game.
    It would be way simpler if Unity just exposed the PxScene::shiftOrigin(const PxVec3& shift) function to us, like Unreal did a few years ago. I really don't understand why they chose to completely ignore this. PhysX solved it a long time ago :(
     
    CosmicStud and awesomedata like this.
  14. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    Out of curiosity, what is the equivalent Unreal (exposed) function? Are they also using the NVIDIA code or something else?
     
  15. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    UE4 uses PhysX too; it manages this automatically when you enable the "origin rebasing" checkbox in World Settings. UE5 has an experimental double-precision feature to provide large world coordinates without shifting the origin, and also a custom physics engine (Chaos) to get rid of PhysX.

    Flax Engine (not very popular yet) has optional 64-bit world coordinates to allow large worlds out of the box, although it must be compiled from source to enable that.

    I hope there will be more tutorials and sample content for DOTS and Unity Physics. They are very promising but still a bit hard to deal with.
     
    awesomedata and cosmochristo like this.
  16. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    PhysX is a huge API surface and insanely complex, and it's showing its age. There is a reason why so many commercial games use engines like Havok. More and more are also rolling their own. It's just not as difficult a domain as it was when there were so few implementations and less open source.

    For 64-bit, Rust has a good library in Rapier3D. The way Rust's type system works makes supporting both 32- and 64-bit easier than in some other languages. It's also easy to parallelize.

    The thing with 64-bit and open world is that it opens up completely new partitioning strategies. You still have to partition, if nothing else just to get good parallelism, but you aren't locked into only using contiguous partitions, which allows for some creative approaches.

    There's no good reason anymore for engines not to have 64-bit physics. The performance hit is noticeable but not significant; for an average game it might even be negligible. That's based on testing Rapier 32-bit vs 64-bit. Weigh that minor perf hit against not having to transform and having more freedom with partitioning, and it's a no-brainer which approach wins (assuming you have a 64-bit physics engine).
     
    cosmochristo likes this.
  17. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    I'd like to follow these things in Unreal, even though I don't use it. Is there a UE5 link that covers them?
     
  18. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    You can easily search for "world composition" and "large world" in the UE documentation to get the most up-to-date guides.
     
  19. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
  20. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
  21. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    That's probably also a reason Unity is so focused on DOTS, which fixes a lot of performance issues, especially when it comes to big worlds with lots of content.

    Rendering also has to be taken into account: graphics cards generally render in 32-bit floats (to my knowledge), so camera-relative rendering (which HDRP already does) is basically a prerequisite. According to the roadmap, this should also come to URP at some point; maybe that will open Unity up to a 64-bit coordinate system.
     
  22. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    That's unfair though; open world will always be performance oriented and you always have to do some legwork. I say this as someone who wants to do open world and has criticized some fundamentals of Unity (like origin shifting). A game that wasn't built from the ground up to be open world, especially on low-end hardware, will always run into performance issues, especially when you have ultra-long vistas with sparse, randomly scattered occluders, which is about the worst case, as in that game.

    Also, this game is not representative of what people think of as an "open world" game, technically speaking.

    Now, Unreal HAS elements that facilitate open world, but those are very fundamental, like Nanite taking care of occlusion and LOD. And those are industry-wide game changers that took 10 years to build.
     
    cosmochristo likes this.
  23. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Wow. You got me. I had no idea camera-relative rendering hadn't already happened in URP too. I thought it had existed way back when they introduced it to HDRP. D:

    I would love to believe this, but Unity hasn't said anything about floating origin and DOTS together in the same sentence. However, it's possible I may be wrong. Years ago, Joachim Ante responded to one of my questions that they were looking to make open world a part of the editor experience. But I don't know what that was in the context of -- i.e. was it DOTS, or the Unity Editor itself? My question was about both, and not requiring DOTS to do origin shifting at all. DOTS as a prerequisite for origin shifting isn't ideal to me -- but if it enables 64-bit worlds, then fine. However, that isn't accounting for all of the tooling (i.e. Terrain generation) that would need to be reworked for a floating origin solution. It's this requirement, I believe, that has held up a native floating-origin or 64-bit open world solution. Unity desperately just wants us to make small games. Otherwise, many important tools would "need" to be reworked by Unity to support a floating origin -- many tools in which the original developers of those tools are nowhere to be found anymore.



    I don't think it's unfair. Most game engines (not necessarily the open-to-everyone engines such as Unreal or Unity) have built-in assets and optimization techniques to handle the kind of issues present in open-world games. Granted, origin-shifting isn't always one of these techniques (i.e. Zelda: Breath of the Wild doesn't use the whole 10,000km x 10,000km it has been granted, which ensures origin-shifting isn't necessary and also that they have and maintain a working physics system).

    Unity is notoriously bad at performance and optimization, which is why I was looking forward to DOTS. Unreal already has a semi-made implementation of DOTS in its node system, and as such, I think it's fair to point out the flaws that I think Unity has had plenty of opportunity to contend with. Open world is something the programmers of Alba should've been able to pull off easily if Unity had standard features that have been present since PS2 days. Features such as HLOD support, for example, which would have helped the distant tree issue in Alba that nearly crushed their game. Its usage of impostors was a great start, but it was their choice of targeting older hardware that crippled them in that area. However, while that was the _developers'_ fault for targeting old hardware, HLOD would have saved them here on old hardware. To be fair, Unity has had a number of HLOD attempts (one with DOTS and one with its own modified engine), but since they've never pulled the trigger on it officially to integrate it into the engine, games like Alba tend to naturally fall to Unity's lack of apparent optimization features. We don't even have something as simple as basic LOD generation.

    That being said -- I see where you're coming from. And I agree, this game isn't even close to "open world" -- but Unity is quick to pretend that it is (thus their video touting "optimization" techniques for "open worlds"), which is why I have no shame in holding a magnifying glass to Unity's own desired standards. This is why I believe it is fair to point out just how terrible Unity's approach to "optimization" really is as it currently stands.


    Also, nothing is stopping Unity from developing features like this. Someone has already developed an early version of Nanite for Unity. I hold a lot of contempt sometimes at Unity's desire to pull in other industries and woo them with promises of a great ecosystem, but ignore prioritizing game developers who have been their bread-and-butter since their inception. Optimization features are part of _any_ professional game engine. Unity's lack of development in any of these areas has proven to have a huge impact on many small developers who just (rightfully) assume the engine will "just work" and handle much of the optimizations itself.

    To tout "open world" and "optimizations" and then show a (clearly tiny) game like Alba as a flagship product that demonstrates the concepts of "optimization" and "open world" in Unity says a lot about their engine (and their morals as a company) -- and as such, I feel that Unity (as a business) genuinely needs to be burned a bit for pushing the narrative that they "support" games that are "open world" and "optimized" -- which is a pretty big lie.
     
    Last edited: Nov 15, 2022
    Gravesend and frarf like this.
  24. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    792
    I think Unity first needs the capacity to realize open world within the 32-bit float range without problems. Depending on the mechanics of the game and the level design, you can pack quite a lot in there. Genshin Impact, for example, has good level design in my opinion: it feels like a large area, but the initial map is only about 4-5 km in diameter. And I think it was not an easy task to fill the world so densely with content, even if the assets are all very low poly.

    By the way, a tiling system is used in UE5. I don't know the details; I just read something about tile size and tile offset regarding rendering with 64-bit coordinates. My interpretation is that internally the world is divided into tiles, and each tile has its own origin.
    Incidentally, Unreal also has a default world-bounds limit of about 21 km.
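    (My own illustration of that tile-plus-local-origin interpretation, not UE5 code; the tile size is an arbitrary assumption.)
    Code (CSharp):
    // Sketch: split a large double-precision coordinate into an integer tile index
    // plus a small float offset inside that tile, so rendering and physics can work
    // with precise local floats.
    using System;

    static class TiledCoordinate
    {
        const double TileSize = 2048.0;   // assumed tile size in metres

        public static void Split(double world, out int tile, out float local)
        {
            tile = (int)Math.Floor(world / TileSize);
            local = (float)(world - tile * TileSize);   // stays within [0, TileSize), so float remains precise
        }
    }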
     
    cosmochristo and awesomedata like this.
  25. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    Some great island maps can be achieved under the 20 km limit very easily, so they don't require origin shifting or large world coordinates at all: they can use 10k units/meters positive and 10k negative in both directions, giving a square of 20 km per side. But for real-world simulators this becomes very hard, since we must follow a direct path, for instance a 30, 50 or 100 km straight road (to keep it as close as possible to real-world locations).

    I made a loading/unloading system in Unity using 3 km x 3 km tiles (each one with its own origin at 0,0,0). Once the user gets close to the border, the next chunk is loaded and connected at the border, and as the player continues moving forward all the previous ones are unloaded (using async scene load/unload). It works great about 99% of the time.
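    (A rough sketch of that kind of tile streamer; the scene naming and radius are made up for illustration, and a real system would also reconnect borders and apply the per-tile origins described above.)
    Code (CSharp):
    // Sketch only: additively load the player's current 3 km tile and its neighbours,
    // and unload tiles that are no longer nearby.
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    public class TileStreamer : MonoBehaviour
    {
        public Transform player;
        public float tileSize = 3000f;   // 3 km x 3 km tiles

        readonly HashSet<Vector2Int> loaded = new HashSet<Vector2Int>();

        void Update()
        {
            var current = new Vector2Int(
                Mathf.FloorToInt(player.position.x / tileSize),
                Mathf.FloorToInt(player.position.z / tileSize));

            // Tiles we want resident: the current one plus its 8 neighbours.
            var wanted = new HashSet<Vector2Int>();
            for (int dx = -1; dx <= 1; dx++)
                for (int dz = -1; dz <= 1; dz++)
                    wanted.Add(new Vector2Int(current.x + dx, current.y + dz));

            foreach (var tile in wanted)
                if (loaded.Add(tile))
                    SceneManager.LoadSceneAsync($"Tile_{tile.x}_{tile.y}", LoadSceneMode.Additive);

            loaded.RemoveWhere(tile =>
            {
                if (wanted.Contains(tile)) return false;
                SceneManager.UnloadSceneAsync($"Tile_{tile.x}_{tile.y}");
                return true;
            });
        }
    }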

    The biggest problem we still face is the coordinate shifting when there is a lot of physics activity, like moving pedestrians, cars, triggers, and so on. Moving everything that is loaded is a heavy operation, since we must do it for every physics object currently being simulated. If they allowed us to use shiftOrigin directly from PhysX, that performance hit probably would not be so noticeable for most games. At least in all my tests in UE4 with origin rebasing I never noticed a hiccup: the origin is shifted with great performance and everything just continues working.
     
    awesomedata likes this.
  26. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Seconded. However, it's very possible that Unity simply has a lot of custom code for special collider detection and sweep tests that they don't want to redo if the user can suddenly change an object's whole position (e.g. for rigidbodies that are "sleeping" and at rest, what should Unity do if they can't be accessed yet get moved inside some collider by a tiny fractional amount? Havok just gently shifts bodies like that out of the collider, though it has to evaluate those at-rest bodies and colliders any time they are moved, which could cause a performance hit / CPU spike).

    I really am not sure what their _true_ hangup is on the physics side of things for open worlds, but it's definitely getting old just waiting for super-basic and fundamental issues to be solved with the physics engine -- especially when Unity seems to be focused on advertisements and showcasing pretty graphics instead.

    Yes, we get it. Unity can be as pretty as Unreal now. It's time to make it as FUNCTIONAL too. :/
     
  27. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    If we could control the messaging from a World shift transform change then we could stop physics from reacting to it. If it does not receive the message then the transform change would be transparent to the physics system.
     
  28. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    @awesomedata
    Let's refocus. I agree the video is kind of a lie, as it's not representative of the issues people have with open world in Unity.
    - The main performance issue is fillrate, more precisely stacked transparency. That's an industry-wide problem not even Nanite has solved, and Unity won't solve it easily; it's custom rendering work (see how Horizon solved it with a depth prepass) and you can probably do it yourself in Unity.
    - The second issue they had was too many Animators. Again, while Unity carries some fault (overloaded main thread), they also have some solutions, which the demo uses (Burst), and others have other solutions (a forced flat hierarchy even for character bone animation). It's entity management: while Unity could give devs a leg up, it's the responsibility of the dev.
    - The third issue is Unity-specific: the culling, which is rigid and obfuscated in the engine. But ultimately it's not such a hard problem, because they are doing something different and using a clever solution; it's not a generic problem.

    Also, Nanite isn't magic: handle it badly and you will still have performance issues, and the fillrate/transparency problem they had is exactly the kind of issue Nanite doesn't solve. It was their main issue. No matter what you do, open world requires a certain amount of planning that this game clearly didn't do. HLOD wouldn't have saved them at all; opaque (z-buffer rejection) and stylized foliage was the right decision for their specific game.

    The issue with Unity is when you have no workaround, or the workarounds are too costly. Origin shifting and the physics system belong in that category; nobody is going to implement and maintain a complex physics system just for one game. HLOD is really a service issue: it's not tied to the engine, there are alternatives, nothing prevents you from rolling your own, and it tends to be game-specific because it can break visuals. Open world games existed before Nanite, and one guy doing it isn't proof that it works everywhere on every platform Unity covers; we shouldn't need a Nanite-scope solution for open world, even though having one is fine.

    Unity has issues, but let's not assign every problem the same responsibility, or it muddies the metrics and forces Unity to make bad choices by chasing too many rabbits. Fundamental issues should be sorted first.
     
    Deleted User and frarf like this.
  29. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Or when the proposed "solution" by Unity is simply too buggy, complicated, or unreliable to use consistently without deep project-authoring problems -- which Unity has in spades.


    That said -- I agree with most everything else you've said, except for this:

    HLOD isn't game-specific. It's been used in some form since even before PS2 days. It's a standard way to handle large view distances without special shader magic. It's nothing more than partitioning the world, starting with a broad LOD for the larger chunks; then, as one gets closer, the smaller chunks start to pop in and gain detail -- but only within the chunk being processed. Everything else is essentially the lowest-level LOD (before becoming completely invisible after a certain distance). Handling LODs individually without accounting for the hierarchy of LODs (e.g. for streaming textures) can really impact performance -- and Unity's texture handling, etc., being based natively around GameObjects (including for physics processing) can make your FPS tank and hit rock bottom _very_ quickly as the scene gets larger.
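    (To make the idea concrete, a toy sketch of the per-chunk switch only; a real HLOD system would also stream the detailed content and its textures in and out rather than just toggling them.)
    Code (CSharp):
    // Toy sketch of hierarchical LOD selection: far away, a chunk renders one merged
    // low-poly proxy; up close, its detailed children are enabled instead.
    using UnityEngine;

    public class HlodChunk : MonoBehaviour
    {
        public Renderer proxyRenderer;        // merged low-detail mesh for the whole chunk
        public GameObject detailedContent;    // parent of the chunk's full-detail objects
        public float switchDistance = 500f;   // assumed transition distance

        void Update()
        {
            float dist = Vector3.Distance(Camera.main.transform.position, transform.position);
            bool useDetail = dist < switchDistance;

            proxyRenderer.enabled = !useDetail;
            detailedContent.SetActive(useDetail);
        }
    }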

    That's why I got kind of grumpy about the video essentially being a freaking LIE -- not a single one of these issues has anything to do with making open-world games. Unity still thinks everyone makes mobile games with their engine. So why wouldn't they optimize _at least_ the LOD handling system?

    Not necessarily -- While it fit better with the character detail levels, had they decided to use texturing on those chars too, HLOD would have been the better choice (assuming they used RGB to simulate wind and applied the shader to a single mesh at a time) -- except they used Unity Terrain. So they're not dealing with plain old meshes anymore. This is exactly the point I'm trying to make about open world development -- Unity's "gotcha!" workflows. HLOD as a standard tool for optimization, applied to Unity Terrains, would have prevented any issues with trying to turn a Unity terrain into a mesh to process it with one's own custom HLOD implementation. But no -- everything in Unity needs to be siloed.


    Aside from that, I'm mainly pointing at floating-origin and LOD-management issues for larger games. I'm not a fan of Nanite due to the transparency issues it has at the moment. So I agree with you. But even Nanite is making progress in that department. Unity hasn't even begun. At the end of the day, my gripes aren't about depth-sorting issues. That's a shader issue. And shader issues are definitely on the hands of the developer -- not Unity. However, Unity has done a pretty good job on updating their shader systems and providing solid tools to author them. At the end of the day, I really am honestly impressed.


    Sadly, while I completely agree with these statements -- Unity is _already_ chasing too many rabbits right now, isn't it? Visual Scripting, for example, was supposed to be finished before the end of last release -- and now even that has been essentially abandoned.


    On a lighter note --

    I think Unity does depth pre-pass though, doesn't it? -- As far as I was aware, that was a feature someplace at some point. Or is it something to do with VR / Stereoscopic 3D stuff only?
     
  30. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    And in the meantime, UE 5.1 defaults to 64-bit world coordinates and continues pushing towards having 64-bit as an option everywhere, plus more things that matter for open world like Nanite terrain/vegetation. Unity is never going to catch up to modern practices for open world games; that ship has sailed.

    Unity simply has no technical vision anymore. Sure tech like Nanite has issues, but that is sort of missing what is most important. Other engines are actually investing in the next generation, spending time and money exploring. Unity is not doing that. If you look closely almost all of their new tech is copying well known approaches and producing sub par implementations of those. The end result of that is a widening gap that they will never recover from. And eventually they will just admit it and openly abandon trying to compete in more and more areas that matter to people making games outside of mobile.
     
  31. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Sadly, and a bit reluctantly -- I essentially agree with all of this.

    I'd love Unity to prove me wrong.
    But I'm just not seeing that happening with the current mobile ad-facing direction Unity seems to be going in.
    Their roadmap claims they want to help people make games that scale -- but what in their roadmap truly enables that?

    DOTS isn't a bad start, but they can't keep these 5-10 year long development stretches for basic tools going when most other companies are leaving them in the technological dust -- especially innovation-wise.

    Unity, as a company, seriously has NO vision anymore.



    Something tells me if we all decided to boycott Unity -- stuff just _might_ start to change.
    I guess we should all start to hit them where it hurts:

    -- Their pocketbook. :/

    They're already laying off employees left and right. Might as well get it over with. :/
     
  32. Tony-Lovell

    Tony-Lovell

    Joined:
    Jul 14, 2014
    Posts:
    127
    I can barely track the acronyms and insights you guys are sharing, but I can totally see that, no matter the underpinnings, we'd all be in a better place if Unity itself had a vision for "world" vs. "scene", as in, the game engine depicts a single-precision scene within a double-precision world. If they chose a clear API for expressing these, and made sure to use this nomenclature in all documentation, asset-creators would have a single coding pattern to support, and more stuff would "just work".
     
    awesomedata and cosmochristo like this.
  33. FirstTimeCreator

    FirstTimeCreator

    Joined:
    Sep 28, 2016
    Posts:
    768
    I have this fully working in my project. I built my own floating origin system along with everything else. The key is that you need to execute physics manually in FixedUpdate, AFTER all physics updates execute, then do your origin position shifts in LateUpdate. As for particles, all of mine use local space inside a container and I shift just that container; the particles exist in its local space, so there are no LOOPS over particle systems in the origin update. Updating vast amounts of particles the way I have seen people do it, by updating every particle position manually, is not efficient at all.

    You see, when you shift the rigidbody position the physics engine updates it, but you need to overwrite what the physics engine has output before the next frame. So what I do is execute physics manually at the end of EVERY physics update in FixedUpdate: Physics.Simulate(Time.fixedDeltaTime);

    It is the very last thing called after all physics operations. Then, after the physics engine has output its position writes to all RBs, I execute an origin shift in LateUpdate.

    Take note that I'm also updating over 1,000 objects. The projectiles are drawn with DrawMeshInstanced(), but I still have to update each projectile's position matrix when I do a shift, along with many other things that are position-tracked. The performance overhead of doing a shift with 150 ships, each of which has both a convex collider AND a full mesh collider (following each ship's position), is actually quite minimal -- barely even noticeable in profiling. Updating the projectile 4x4 matrices actually takes more CPU overhead than shifting all the collider positions.

    There really is no need for a double precision system, floating origin works really well when you implement it correctly.
    You can see it working just fine in my project here:
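    (A minimal sketch of the ordering described above; the field names, the shift threshold and the shift method are illustrative guesses rather than the actual project code, and manual stepping assumes auto-simulation has been turned off.)
    Code (CSharp):
    // Sketch only: physics is stepped manually as the last thing in FixedUpdate, and
    // the origin shift happens in LateUpdate after PhysX has written back positions.
    using UnityEngine;

    public class OriginShiftDriver : MonoBehaviour
    {
        public Transform player;
        public float threshold = 1000f;   // assumed shift trigger distance

        void OnEnable()
        {
            Physics.autoSimulation = false;   // required so Physics.Simulate can be called manually
        }

        void FixedUpdate()
        {
            // ... all force application / gameplay physics runs earlier in the step ...
            Physics.Simulate(Time.fixedDeltaTime);   // the very last physics call of the step
        }

        void LateUpdate()
        {
            Vector3 offset = player.position;
            if (offset.sqrMagnitude < threshold * threshold) return;

            // Shift every root object back toward the origin; instanced projectile
            // matrices and other tracked positions would be offset the same way.
            foreach (GameObject root in gameObject.scene.GetRootGameObjects())
                root.transform.position -= offset;
        }
    }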
     
    Last edited: Nov 30, 2022
    neoshaman and cosmochristo like this.
  34. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    @FirstTimeCreator can you please clarify whether or not you use threshold-based world shifting, where the world is shifted once the player moves some distance from the origin?
     
  35. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    @awesomedata
    Let's clarify what you call HLOD, because I see two systems:
    - A baking system that takes a scene and create a grouped LOD version of it
    - A culling system that replace objects at a distance with a grouped LOD

    I was kinda talking about the former (which Unreal has), which has visual implications, but you seem to be talking about the latter, which seems like a non-technical issue to me (maybe I'm wrong).
     
  36. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Ideally, since it is a fairly complex process on the technology front, HLOD is really just a streaming+culling system and a mesh simplification system (generated, not just baked -- and ultimately stored on the HDD) rolled into one.

    Unreal does a mesh simplification step when it generates the HLOD too (something like what Simplygon was about), and their HLOD handles large worlds with floating origin. It does this for all versions of the mesh partitioning and chunking simplifications. The culling system is a separate system in Unreal, but it uses the generated HLOD meshes as input to the (more generalized) streaming+culling system.



    Also, if anyone's curious:




    Unity's lagging pretty bad these days...




    #FckU-EAGaemzXCEO-Ad-fapping-$$$-grabbn-fkTRD-in202X
     
    Last edited: Dec 1, 2022
  37. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    @awesomedata thanks for showing that UE demo; it is certainly a big change and the results are very compelling.
     
    awesomedata likes this.
  38. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Honestly, Unreal Engine's entire workflow sort of makes me cringe...

    ...but, unlike Unity, Epic is actually improving everything else.



    Not gonna lie -- their technology is definitely looking sweeter and sweeter everyday.

    After that demo showing off such a ridiculous world-size...
    As an artist/designer at my core... I'm heavily debating switching over to the dark side...



    I don't want to see Unity die... but it's already happening. And Unity's CEO is too tone-deaf to stop it. :/
     
    Last edited: Dec 1, 2022
  39. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    @awesomedata "..but, unlike Unity, Epic is actually improving everything else."
    I think Unity is trying to improve everything as well, but it does appear that UE has maintained some advances ahead of Unity.
    However, isn't Unreal tied to Windows?
     
  40. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    This article covers important concepts around double precision, some of the challenges, and the solution they found to make it easier. Worth reading!

    Now we have some competition from Unreal Engine 5, Flax Engine and also Godot. Please Unity, consider adding this long-requested feature haha ♥️. Smaller engines are working on it. It would be amazing to have large-world support out of the box in the most popular game engine.

    https://godotengine.org/article/emulating-double-precision-gpu-render-large-worlds
     
    Last edited: Dec 7, 2022
    awesomedata and Peter77 like this.
  41. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
  42. TerraUnity

    TerraUnity

    Joined:
    Aug 3, 2012
    Posts:
    1,251
    Marcos-Elias and cosmochristo like this.
  43. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    The HPTransform component uses LateUpdate() to do its work. This will not scale well if you have large numbers of objects in the scene (which is likely if you have such large worlds).
     
    Marcos-Elias likes this.
  44. TerraUnity

    TerraUnity

    Joined:
    Aug 3, 2012
    Posts:
    1,251
    Even though the purpose of this solution is not related to large numbers of objects, your assumption is true; maybe you could alter the code to only update the objects surrounding the camera on demand, for example through event callbacks or with a condition to only update when the camera moves. I'm not sure they have to be calculated in LateUpdate every frame, though!
     
  45. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    Thanks @TerraUnity for sharing. I have a lot of questions, so I will probably PM you about some of them.
    Firstly, where does this fit with Unity Technologies' development cycle plans? I see it is a "Unity-Technologies" repository and the manual pages have the Unity logo, so this implies they have some official pathway for the inclusion of this code...?
     
    TerraUnity likes this.
  46. TerraUnity

    TerraUnity

    Joined:
    Aug 3, 2012
    Posts:
    1,251
    @cosmochristo My pleasure. Yes, this is an official Unity solution, but they usually release these as side projects with no development cycle plans or support, which is reasonable. It's good that they are open source, though, so the community can contribute if they have the knowledge to.
     
    cosmochristo likes this.
  47. Marcos-Elias

    Marcos-Elias

    Joined:
    Nov 1, 2014
    Posts:
    159
    Oh, that is great! But at the same time it will be useless for most games :( From the documentation:

    "Unfortunately, as of today, the high-precision framework does not work with the physics engine and can only be used for visualization purposes. For this reason, we will need to remove the colliders from both cube objects."
     
  48. HIBIKI_entertainment

    HIBIKI_entertainment

    Joined:
    Dec 4, 2018
    Posts:
    595
    Pretty sure someone mentioned it here already, but this is a large topic.

    However, for those of you working on HDRP projects, you'll be utilising camera-relative rendering, which by default puts GOs in negative world space before the final translation.
    (This affects shaders too, for those unaware.)

    You can have pretty large worlds without a hitch, 25 km²+ easily, with no need for floating origin movement.
    If anything, many old pipeline habits tend to slow HDRP down more than their counterparts do.
     
  49. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    @HIBIKI_entertainment
    Is any of "camera-relative rendering which by default puts GOs in negative world space before the final translation" explained somewhere?

    Rendering is camera-relative drawing so "camera-relative" becomes "camera relative camera relative drawing"! Why this meaningless double-speak?

    "put GOs in negative world space before the final translation" ?
    appears to contradict:
    "No need for floating point movements." really? Why?

    It would be really helpful to see some documents linked and code examples of the basic algorithm here.

    The physics underpinning all we do.

    A lot of time and effort is put into simulating physics in Unity, by applying forces and causing a chain of physics calculations that eventually lead to the displacement of objects. Much of it derives from Newton's centuries-old laws of motion.

    Although there is plenty of good physics being implemented, one of the fundamental cornerstones is mostly being ignored!

    The reason this must change is explained by Hawking: "the theory of relativity forces us to change fundamentally our ideas about space and time." (Hawking, A Briefer History of Time, p. 33)

    and:
    "Newton was very worried by this lack of absolute position, or absolute space, as it was called, because it did not accord with his idea of an absolute God. In fact, he refused to accept the lack of absolute space, even though it was implied by his laws."
    (A Briefer History of Time, p. 23, last paragraph)

    We use positional data. We implement positional algorithms. If we don't want to be mired in centuries-old absolute positional thinking, then we need to adopt relative thinking.

    By exploiting an understanding of floating point and adding origin-centring and reverse motion, the position-independent principle provides a way forward to using the knowledge that there is no absolute position: i.e. relative computation. There are plenty of documents and some implementations (e.g. continuous floating origin) online. Note that these implementations are *not* the same as the fake "floating origin" that shifts the World based on a threshold movement of the player across an absolute or static map; it is the relative movement of the World around a stationary player.
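    For illustration only, a bare-bones sketch of that continuous relative-motion idea: the player and camera never leave the origin, and the intended player movement is applied in reverse to a world root every frame (names and input handling are mine, not from any particular implementation).
    Code (CSharp):
    // Sketch: continuous floating origin. The player stays at (0,0,0); the world is
    // moved in the opposite direction of the intended player motion each frame.
    using UnityEngine;

    public class ContinuousFloatingOrigin : MonoBehaviour
    {
        public Transform worldRoot;   // parent of all scene content except the player/camera
        public float speed = 10f;     // metres per second

        void Update()
        {
            // The movement the player would make this frame...
            Vector3 move = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"))
                           * speed * Time.deltaTime;

            // ...is applied to the world in reverse, so the player never accumulates
            // a large, imprecise position.
            worldRoot.position -= move;
        }
    }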

    As a long-time developer, I want to be treated with respect and be given examples and explanations, not double-speak (camera relative rendering).
     
    Last edited: Dec 14, 2022
  50. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Camera relative is a good way to keep the rendering pipeline 32 bit. It doesn't solve the problem of very large worlds overall. You still need 64 bit or origin shifting for that.

    The rendering internally won't have precision issues, but the camera will start moving in larger and larger increments, which will be noticeable; it just takes a bit more distance to get there.
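    (A sketch of that idea in isolation, not HDRP's actual implementation: absolute positions are kept in doubles on the CPU, and only small camera-relative floats are handed to the renderer.)
    Code (CSharp):
    // Sketch only: convert an absolute double-precision world position into a float
    // position relative to the camera, which stays small and precise near the viewer.
    using UnityEngine;

    public struct DVector3
    {
        public double x, y, z;
        public DVector3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    public static class CameraRelative
    {
        public static Vector3 ToCameraRelative(DVector3 world, DVector3 camera)
        {
            // Subtract in double precision; only the small result is truncated to float.
            return new Vector3(
                (float)(world.x - camera.x),
                (float)(world.y - camera.y),
                (float)(world.z - camera.z));
        }
    }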