Resolved Large world coordinates - double precision data

Discussion in 'DOTS Dev Blitz Day 2022 - Q&A' started by bb8_1, Dec 8, 2022.

  1. bb8_1

    bb8_1

    Joined:
    Jan 20, 2019
    Posts:
    97
    Will Unity - and especially the DOTS part of it - have support for large world coordinates (meaning support for double-precision data), like your main competitor game engines already have?
     
    ThatDan123 likes this.
  2. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    74
    Hi,

    The Unity engine itself (transforms, physics, editor) doesn't support double precision. So basically our recommendation for max world size is around 50 km (i.e. 50,000 units).

    However, the DOTS math library and the Burst compiler do support it (i.e. 64-bit floating-point numbers).
    Also, please note that HDRP shaders work in camera-relative position to minimize graphics glitches when dealing with objects far from the origin.
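    For example (a quick sketch; the component and job names here are hypothetical, but double3 and Burst's 64-bit support are real):

    using Unity.Burst;
    using Unity.Entities;
    using Unity.Mathematics;

    // Hypothetical component storing a double-precision world position.
    public struct WorldPositionD : IComponentData
    {
        public double3 Value;
    }

    [BurstCompile]
    public partial struct IntegrateJob : IJobEntity
    {
        public double DeltaTime;
        public double3 Velocity;

        // Burst compiles 64-bit floating point natively, so this math
        // stays in double precision end to end.
        void Execute(ref WorldPositionD position)
        {
            position.Value += Velocity * DeltaTime;
        }
    }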
     
    Last edited: Dec 8, 2022
    Slaghton and bb8_1 like this.
  3. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    43
    Double precision has a steep performance penalty, so we're unlikely to switch component width. But the DOTS team has some early ideas for supporting large, open worlds without requiring a 64-bit-wide translation, and it's something we intend to explore over the coming year.

    With the revisions to the Transform System in Entities 1.0, it's feasible for users to replace the entire Entity transform stack. So folks who want to weigh the tradeoffs between precision and performance can do so themselves.
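    As a rough sketch of what that could look like (the component name and the camera-relative strategy here are my assumptions for illustration, not a shipped API):

    using Unity.Entities;
    using Unity.Mathematics;
    using Unity.Transforms;

    // Hypothetical 64-bit position component replacing the built-in transform data.
    public struct WorldPositionD : IComponentData
    {
        public double3 Value;
    }

    // Rebase to the camera each frame and hand the renderer a float matrix.
    public partial struct CameraRelativeTransformSystem : ISystem
    {
        public void OnUpdate(ref SystemState state)
        {
            // In a real setup this would come from a camera singleton.
            double3 cameraPosition = double3.zero;

            foreach (var (position, localToWorld) in
                     SystemAPI.Query<RefRO<WorldPositionD>, RefRW<LocalToWorld>>())
            {
                // Subtract in double precision; only the small camera-relative
                // residual is rounded to float, so precision holds up far from origin.
                float3 relative = (float3)(position.ValueRO.Value - cameraPosition);
                localToWorld.ValueRW.Value = float4x4.Translate(relative);
            }
        }
    }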
     
    Slaghton, stonstad, bb8_1 and 3 others like this.
  4. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    473
    I just want to highlight that 50 km is not practical with single-precision floats. I notice jitter only 2 km away from the origin in our first-person VR game.

    Native origin shifting could solve that, and would also make it work for multiplayer etc.
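    This is the kind of thing we have to hand-roll today (a bare-bones sketch; a real version would also rebase rigidbody state, world-space particles, and networked positions):

    using UnityEngine;

    public class FloatingOrigin : MonoBehaviour
    {
        public Transform trackedCamera;   // the transform to keep near the origin
        public float threshold = 1000f;   // rebase once we drift this far out

        void LateUpdate()
        {
            Vector3 offset = trackedCamera.position;
            if (offset.sqrMagnitude < threshold * threshold)
                return;

            // Shift every root object (camera included) back toward the origin.
            foreach (GameObject root in gameObject.scene.GetRootGameObjects())
                root.transform.position -= offset;
        }
    }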
     
    bb8_1 likes this.
  5. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    43
    > Native origin shifting could solve that and also make it work for multiplayer etc.

    Indeed :)
     
    bb8_1 and Max-om like this.
  6. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    74
    Sorry about that. What render pipeline are you using for your VR game (built-in, URP, HDRP)? My 50,000-unit assumption is in an HDRP context (HDRP has native origin shift for rendering). So 50,000 units should be fine in the editor when using HDRP. But this is for rendering only. You may have accuracy issues with physics or other systems at 50,000 units.
     
  7. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    473
    I'm on built-in, but I tried a POC on HDRP because we want to go open-world for our next title. It's a little better, but not a lot; there's still jitter, I guess from physics. I'm on PhysX though.

    Edit: still far from 50k though. I could observe jitter at around 2k.
     
    pm007 and bb8_1 like this.
  8. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    74
    Yes, HDRP is just solving rendering issues. All other systems (including physics) will jitter much earlier than 50 km, as you mention :(
     
    bb8_1 likes this.
  9. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    475
    Can you please elaborate more on this penalty? Unreal passes two single-precision vectors instead of one, and they assert that the performance impact across CPU/GPU is minimal.
     
    bb8_1 likes this.
  10. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    43
    I'll caveat that I haven't profiled this change in Transform V2. This opinion is based on anecdotal (but I consider reliable) information as well as my experience doing performance programming.

    While the compute latency is about the same for single vs double precision, the memory bandwidth is not, and efficiently amortizing memory costs is a cornerstone of performance programming.
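    To put rough numbers on that (just sizes, not a profile):

    using Unity.Collections.LowLevel.Unsafe;
    using Unity.Mathematics;
    using UnityEngine;

    public static class WidthDemo
    {
        public static void Print()
        {
            // 12 bytes vs 24 bytes: half as many translations fit per
            // 64-byte cache line, i.e. twice the memory traffic for the
            // same entity count.
            Debug.Log($"float3:  {UnsafeUtility.SizeOf<float3>()} bytes");
            Debug.Log($"double3: {UnsafeUtility.SizeOf<double3>()} bytes");
        }
    }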

    I've heard this too, but honestly I'm inclined to believe it only after seeing some rigorous proof. I'm not in a position to test UE personally, obviously.
     
    Luxxuor and bb8_1 like this.
  11. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    473
    Ironic how Mono-JITed code casts every float to a double before operating on it, and then casts back to float and loses that precision. So it round-trips through double without gaining any benefit from it :)
     
    Gekigengar and bb8_1 like this.
  12. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    43
    Ironic is one word for it.
     
    stonstad likes this.
  13. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    475
    In a microbenchmark specific to Transform.Position I can certainly understand memory bandwidth doubling when comparing one Vector3 to two Vector3s...

    But in a real-world test with a scene containing 2,500 transforms, is GPU bandwidth *meaningfully* impacted by passing an additional Vector3 parameter to the default shader for each batched call?

    I wonder if the impact was sufficiently small that Epic felt it was a worthwhile change. I'm not a performance programmer; I'm trying to better understand Unity's thinking.

    I am tempted to build a prototype of Unreal's implementation and measure a scene with thousands of game objects passing an additional Vector3 to the default shader. I'm willing to wager that the impact to FPS is negligible, or at minimum not meaningful.
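    Something along these lines is what I have in mind (the shader property name is hypothetical; the question is whether this one extra vector moves the needle):

    using UnityEngine;

    public class OriginOffsetFeeder : MonoBehaviour
    {
        // Hypothetical global property consumed by a modified vertex shader.
        static readonly int OriginOffsetId = Shader.PropertyToID("_WorldOriginOffset");

        public Vector3 worldOrigin; // high-order part of the camera's world position

        void Update()
        {
            // One extra vector per frame (or per batch) is the bandwidth
            // cost under discussion.
            Shader.SetGlobalVector(OriginOffsetId, worldOrigin);
        }
    }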
     
    Last edited: Dec 13, 2022
    Gekigengar and bb8_1 like this.
  14. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    690
    Why not? At the place I used to work, our R&D team consistently investigated and benchmarked against all of our competitors' products, not just the biggest one.
     
    bb8_1 likes this.
  15. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    43
    Maybe, but I'm not convinced this would be predictive of the performance implications of doubling the size of Translation.
     
  16. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    690
    I don't know; they have data from Fortnite (in-house by Epic), FF15, FF16 (Square Enix), the new Witcher, the new Cyberpunk (CDPR), all confirmed to use Unreal Engine 5, and they are all large-scale open-world games.

    But hey, I still don't think it's enough evidence to say they've done their homework, considered development cost and time, or run any performance-impact comparison between implementations before arriving at their final decision.
     
    stonstad and bb8_1 like this.
  17. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    709
    I wouldn't call Fortnite a large-scale open-world game; I once read it's only about 2.6 km x 2.6 km.
     
  18. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    475
    The point being made is that Unreal has, as a matter of fact, shipped titles with large-world-coordinate support (the topic of this thread), and it runs fine on consoles.
     
  19. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    709
    FF15 has a map size of 1800 km², which is about 42 km x 42 km. That would still be within the max recommended size for HDRP (see post above). The point is that many large open-world games still basically don't need origin shifting or 64-bit coordinates.
    Unreal uses cm as a unit, but I don't know if that's only in the editor or also under the hood; if the latter, Unreal would reach the limit much faster than Unity.
     
  20. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    475
    UE4 WORLD_MAX is 21km. UE 5.1 WORLD_MAX is 88,000,000 km. LWC is enabled by default in UE 5.1.
    Large World Coordinates in Unreal Engine 5 | Unreal Engine 5.1 Documentation
     
    bb8_1 likes this.
  21. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    473
    1000.001: 1000 meters plus one millimeter, expressed in meters.
    100000.1: the same length expressed in centimeters, with the same number of significant digits.

    In other words, changing the unit doesn't change how many significant digits a float has to carry, so the precision limit is reached at the same physical distance either way.
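    A quick way to see it:

    using UnityEngine;

    public static class UnitPrecisionDemo
    {
        public static void Print()
        {
            // 1 km plus 1 mm takes 7 significant digits either way; a float
            // carries roughly 7, so both units strain it equally.
            float meters      = 1000.001f;
            float centimeters = 100000.1f;

            Debug.Log(meters.ToString("R"));      // round-trip representation
            Debug.Log(centimeters.ToString("R")); // same digit budget in cm
        }
    }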
     
    Last edited: Dec 14, 2022
  22. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    709
    Still way too small for KSP :p

    UE5 uses a tiling system; I don't know exactly how it's implemented, but on the GPU side UE5 still works with single precision.
     
  23. rawna

    rawna

    Joined:
    Aug 13, 2015
    Posts:
    25
    FF15 uses the Luminous Engine.
    FF16 started with Unreal, then they moved to a modified version of the FF14 engine.
     
    bb8_1 likes this.
  24. KnewK

    KnewK

    Joined:
    Sep 2, 2012
    Posts:
    18
    If Minecraft can do it, Unity should be able to do it too. I'm using ArcGIS in Unity, which shows the whole world at scale. The precision jitter is definitely a problem. Luckily, ArcGIS provides a rebasing component that updates the origin to the camera every now and then to avoid the issue (like a floating origin).