
Resolved: Large world coordinates - double-precision data

Discussion in 'DOTS Dev Blitz Day 2022 - Q&A' started by bb8_1, Dec 8, 2022.

  1. bb8_1

    bb8_1

    Joined:
    Jan 20, 2019
    Posts:
    100
    Will Unity, and especially the DOTS part of it, have support for large world coordinates (meaning support for double-precision data), like your main competitor game engines already have?
     
    AntonioModer and ThatDan123 like this.
  2. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    97
    Hi,

    The Unity engine itself (transforms, physics, editor) doesn't support double precision, so our recommendation for maximum world size is around 50 km (i.e. 50,000 units).

    However, the DOTS math library and the Burst compiler do support it (i.e. 64-bit floating-point numbers).
    Also, please note that HDRP shaders work in camera-relative position to minimize graphics glitches when dealing with objects far from the origin.
     
    Last edited: Dec 8, 2022
    AntonioModer, Slaghton and bb8_1 like this.
  3. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    Double precision has a steep performance penalty, so we're unlikely to switch the component width. But the DOTS team has some early ideas for supporting large, open worlds without requiring a 64-bit-wide translation, and it's something we intend to explore over the coming year.

    With the revisions to the Transform System in Entities 1.0, it's feasible for users to replace the entire Entity transform stack. So folks who want to weigh the trade-offs between precision and performance can do so themselves.
     
  4. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    I just want to highlight that 50 km is not practical with single-precision floats. I notice jitter only 2 km from the origin in our first-person VR game.

    Native origin shifting could solve that and would also make it work for multiplayer, etc.
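The scale of that jitter can be sanity-checked from float32 spacing alone. A minimal sketch in plain Python (my illustration, not anything Unity ships), using `struct` to emulate 32-bit floats:

```python
import struct

def float32_ulp(x: float) -> float:
    """Gap between x and the next representable 32-bit float above it."""
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    nxt = struct.unpack('<f', struct.pack('<I', bits + 1))[0]
    return nxt - x

print(float32_ulp(1.0))      # ~1.19e-07 near the origin
print(float32_ulp(2000.0))   # ~1.22e-04 (about 0.1 mm) at 2 km
print(float32_ulp(50000.0))  # ~3.9e-03 (about 4 mm) at 50 km
```

At 2 km from the origin, adjacent representable positions are already about 0.1 mm apart, which head-tracked VR can make visible once physics and interpolation stack their own rounding on top; at 50 km the positional grid is roughly 4 mm.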
     
  5. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    > Native origin shifting could solve that and also make it work for multiplayer etc.

    Indeed :)
     
    bb8_1 and Max-om like this.
  6. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    97
    Sorry about that. Which render pipeline are you using for your VR game (built-in, URP, HDRP)? My 50,000-unit assumption is in an HDRP context (HDRP has a native origin shift for rendering), so 50,000 units should be fine in the editor when using HDRP. But that covers rendering only; you may have accuracy issues with physics or other systems at 50,000 units.
     
  7. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    I'm on built-in, but I tried a proof of concept on HDRP because we want to go open world for our next title. It's a little better, but not a lot; there's still jitter, I guess from physics. I'm on PhysX, though.

    Edit: still far from 50k, though. I could observe jitter at around 2k.
     
    cosmochristo, pm007 and bb8_1 like this.
  8. arnaud-carre

    arnaud-carre

    Unity Technologies

    Joined:
    Jun 23, 2016
    Posts:
    97
    Yes, HDRP only solves the rendering issues. All other systems (including physics) will jitter much earlier than 50 km, as you mention :(
     
    bb8_1 likes this.
  9. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    Can you please elaborate on this penalty? Unreal passes two single-precision vectors instead of one, and they assert that the performance impact across CPU and GPU is minimal.
     
    bb8_1 likes this.
  10. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    I'll caveat that I haven't profiled this change in Transform V2. This opinion is based on anecdotal (but I consider reliable) information as well as my experience doing performance programming.

    While the compute latency is about the same for single vs double precision, the memory bandwidth is not, and efficiently amortizing memory costs is a cornerstone of performance programming.

    I've heard this too, but honestly I'm inclined to believe it only after seeing some rigorous proof. Obviously, I'm not in a position to test UE personally.
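The memory-bandwidth half of that argument is easy to make concrete. A small sketch with a hypothetical entity count, using Python's `array` to stand in for a component stream:

```python
from array import array

n = 100_000                             # hypothetical entity count
pos_f32 = array('f', bytes(4 * 3 * n))  # n float3 translations
pos_f64 = array('d', bytes(8 * 3 * n))  # n double3 translations

bytes_f32 = len(pos_f32) * pos_f32.itemsize
bytes_f64 = len(pos_f64) * pos_f64.itemsize
print(bytes_f32, bytes_f64)  # 1200000 2400000
# Per-element compute latency is similar, but each cache line now holds
# half as many translations: twice the memory traffic for the same pass.
```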
     
    cosmochristo, Luxxuor and bb8_1 like this.
  11. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Ironic how Mono-JITed code typecasts every float to a double before operating on it and then typecasts the result back to float, losing that precision. So it round-trips through double without gaining any benefit from it :)
     
    stonstad, Gekigengar and bb8_1 like this.
  12. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    Ironic is one word for it.
     
    stonstad likes this.
  13. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    In a microbenchmark specific to Transform.position, I can certainly understand memory bandwidth doubling when comparing a single-precision Vector3 to a double-precision one...

    But in a real-world test with a scene containing 2500 transforms, is GPU bandwidth *meaningfully* impacted by passing an additional Vector3 parameter to the default shader for each batched call?

    I wonder if the impact was sufficiently small that Epic felt it was a worthwhile change. I'm not a performance programmer; I'm trying to better understand Unity's thinking.

    I am tempted to build a prototype of Unreal's implementation and measure a scene with thousands of game objects passing an additional Vector3 to the default shader. I'm willing to wager that the impact on FPS is negligible, or at minimum not meaningful.
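For scale, the bandwidth cost of that extra per-instance Vector3 in the scene described above is back-of-envelope arithmetic (the frame rate is hypothetical):

```python
transforms = 2500              # scene size from the post above
extra_bytes = 3 * 4            # one additional float32 Vector3 per batched call
per_frame = transforms * extra_bytes
per_second = per_frame * 144   # at a hypothetical 144 fps
print(per_frame, per_second)   # 30000 4320000
```

Roughly 30 KB per frame, about 4 MB/s: noise next to GPU memory bandwidth measured in hundreds of GB/s. The contested cost in this thread is on the CPU/component side, not the shader parameter.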
     
    Last edited: Dec 13, 2022
    Gekigengar and bb8_1 like this.
  14. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    724
    Why not? At the place I used to work, our R&D team consistently investigated and benchmarked all of our competitors' products, not just the biggest one.
     
    stonstad and bb8_1 like this.
  15. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    Maybe, but I'm not convinced that would be predictive of the performance implications of doubling the size of Translation.
     
  16. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    724
    I don't know; they have data from Fortnite (in-house at Epic), FF15 and FF16 (Square Enix), and the new Witcher and Cyberpunk titles (CDPR), all confirmed to use Unreal Engine 5, and all large-scale open-world games.

    But hey, I still don't think that's enough evidence to say they've done their homework, considered development cost and time, or run any performance comparison between implementations to arrive at the final decision.
     
    stonstad and bb8_1 like this.
  17. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    777
    I wouldn't call Fortnite a large-scale open-world game; I once read it's only about 2.6 km x 2.6 km.
     
  18. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    The point being made is that Unreal has, as a matter of fact, shipped titles with large world coordinate support (the topic of this thread) and it runs fine on consoles.
     
  19. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    777
    FF15 has a map size of 1,800 km², which is about 42 km × 42 km. That would still be within the max recommended size for HDRP (see the post above). The point is that many large open-world games still basically don't need origin shifting or 64-bit coordinates.
    Unreal uses centimeters as its unit. I don't know if that's only in the editor or also under the hood; if the latter, Unreal would reach the limit much faster than Unity.
     
  20. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    UE4's WORLD_MAX is 21 km. UE 5.1's WORLD_MAX is 88,000,000 km, and LWC is enabled by default in UE 5.1.
    Large World Coordinates in Unreal Engine 5 | Unreal Engine 5.1 Documentation
     
    tspk91 and bb8_1 like this.
  21. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    1000.001: 1000 meters and one millimeter, expressed in meters.
    100000.1: 1000 meters and one millimeter, with the same number of digits, expressed in centimeters.
     
    Last edited: Dec 14, 2022
  22. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    777
    Still way too small for KSP :p

    UE5 uses a tiling system; I don't know exactly how it's implemented, but on the GPU side UE5 still works with single precision.
     
  23. rawna

    rawna

    Joined:
    Aug 13, 2015
    Posts:
    35
    FF15 uses the Luminous engine.
    FF16 started with Unreal; then they moved to a modified version of the FF14 engine.
     
    bb8_1 likes this.
  24. KnewK

    KnewK

    Joined:
    Sep 2, 2012
    Posts:
    19
    If Minecraft can do it, Unity should be able to do it too. I'm using ArcGIS in Unity, which shows the whole world at scale. The precision jitter is definitely a problem. Luckily, ArcGIS provides a Rebasing component that updates the origin to the camera position every now and then to avoid the issue (like a floating origin).
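The rebasing component KnewK describes boils down to a floating-origin shift. A one-dimensional sketch in plain Python (the class, threshold, and `struct`-based float32 emulation are all hypothetical, purely for illustration):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

REBASE_DISTANCE = 1000.0  # hypothetical threshold before we shift the origin

class World:
    """1-D world: float32 positions near the origin + a double world offset."""
    def __init__(self):
        self.origin = 0.0     # accumulated shift, kept in double precision
        self.positions = []   # float32 positions relative to the origin

    def add(self, world_pos: float) -> None:
        self.positions.append(f32(world_pos - self.origin))

    def maybe_rebase(self, camera_local: float) -> bool:
        # Once the camera drifts past the threshold, shift every position
        # so the camera is back at 0, folding the shift into the origin.
        if abs(camera_local) > REBASE_DISTANCE:
            self.origin += camera_local
            self.positions = [f32(p - camera_local) for p in self.positions]
            return True
        return False

w = World()
w.add(123456.75)           # ~123 km out: float32 spacing here is ~7.8 mm
w.maybe_rebase(123456.0)   # camera has travelled there, so shift the origin
print(w.positions[0])      # 0.75: back in float32's high-precision range
```

Only the double-precision `origin` ever grows large; everything the simulation touches stays small, which is the same trick HDRP applies to rendering.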
     
  25. Velctor

    Velctor

    Joined:
    Dec 5, 2020
    Posts:
    2
    Maybe you're interested in this:
    https://web.mit.edu/tabbott/Public/quaddouble-debian/qd-2.3.4-old/docs/qd.pdf
    I've implemented double-double on Burst in Unity for continuous high-precision coordinates spanning the full visible universe (46 billion light-years in diameter), and there's no obvious performance impact on desktop CPUs. The algorithm works for single precision too (for the GPU), but in my opinion that's not necessary, because we can upload camera-relative positions of meshes to the GPU just for rendering; single-precision positions are really enough for that.
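For reference, the core building block of the double-double arithmetic in that paper is the error-free two_sum transformation. A sketch in plain Python (whose floats are IEEE doubles):

```python
def two_sum(a: float, b: float):
    """Error-free addition: s = fl(a + b), and a + b == s + e exactly."""
    s = a + b
    v = s - a
    e = (a - (s - v)) + (b - v)
    return s, e

# The rounding error the hardware add throws away is recovered in e:
s, e = two_sum(1.0, 1e-17)
print(s, e)  # 1.0 1e-17
```

Carrying a value as a (hi, lo) pair built from such results gives roughly 106 bits of significand, around 32 decimal digits, which is consistent with universe-scale ranges at sub-millimeter resolution.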
     
    jivalenzuela, bb8_1 and Greexonn like this.
  26. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    So you used an approach similar to the linked library? And do you mean there's no performance impact vs. double precision, or vs. single?
     
    Last edited: Sep 14, 2023
    bb8_1 likes this.
  27. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    Just a friendly reminder that Unreal has supported double-precision transforms (enabled by default) since April 2022; they were introduced with Unreal 5.0.

    Unity, how are your plans coming along to remain technically competitive?
     
  28. jivalenzuela

    jivalenzuela

    Unity Technologies

    Joined:
    Dec 4, 2019
    Posts:
    68
    Steadily and methodically :)
     
    bgebitekin, kdchabuk and stonstad like this.
  29. Quatum1000

    Quatum1000

    Joined:
    Oct 5, 2014
    Posts:
    889
    Does this refer to the 'point of return on investment'? It seems that less than 0.01% of PC games, in contrast to Android offerings, might only reach that point by the third decade, if at all.

    Oh, and waiting for the 'point of return on investment' is likely to increasingly widen the divide with other 3D engines. :)

    Edit:
    And for those who don't quite grasp the implications of extended distances: the command buffers and shaders also have to go through the whole litany of 64-bit calculations. It wouldn't be pleasant to see the textures on objects, and the vertices themselves, start to swim.
     
    Last edited: Jan 17, 2024
    cosmochristo and bb8_1 like this.
  30. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    Unreal's large world coordinate support is fantastic. It just works. How are the steady and methodical plans coming along on the Unity side? Any progress to share?

    Thanks,
    Shaun
     
    bb8_1 likes this.
  31. bb8_1

    bb8_1

    Joined:
    Jan 20, 2019
    Posts:
    100
    There are ways to pack a double as 2 floats so as to calculate heights more precisely on the GPU (for creating procedural worlds) when distances are far from the camera, keeping the camera at the origin the whole time. HDRP already does this (it moves the camera to the origin), and it also uses a kind of logarithmic function for the depth buffer, which is likewise helpful for creating Earth-sized planets. But it requires more memory on the GPU side, and the code is slower.
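The packing bb8_1 mentions is the single-precision analogue of the double-double idea: store a high float32 plus a float32 correction term. A sketch in plain Python (`struct` emulates float32 rounding; the sample value is arbitrary):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

def split(x: float):
    """Pack a double as a (hi, lo) pair of float32 values."""
    hi = f32(x)
    lo = f32(x - hi)   # the part the first rounding threw away
    return hi, lo

x = 123456.123
hi, lo = split(x)
# hi alone is ~2 mm off at this magnitude (treating units as meters);
# hi + lo recovers x to better than a nanometer.
```

The usual trade-off, as the post says: twice the position data on the GPU, and a few extra ALU operations per vertex to reconstruct the camera-relative value from the two halves.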
     
  32. stonstad

    stonstad

    Joined:
    Jan 19, 2018
    Posts:
    648
    Thanks, bb8. To remain relevant as an engine product, it is time for Unity to support double-precision coordinates in transforms and rendering pipelines.
     
    TheGamery, Max-om and bb8_1 like this.
  33. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    It does seem a little crazy. Is there any doc on the design motivations behind this?
     
    bb8_1 likes this.
  34. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    There is more crazy stuff going on in Mono than sane stuff :D
    Meanwhile, .NET 8 is faster than ever; JITed code is often on par with native code.
     
    stonstad and bb8_1 like this.
  35. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    777
    JITed code is actually always native (machine) code, but the code quality is far better than Mono's. A large part of the performance of modern .NET also comes from optimizations in the .NET library.
     
    stonstad and bb8_1 like this.
  36. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    I meant handwritten native code. .NET 8 outperforms a lot of handwritten native code.
     
    bb8_1 likes this.
  37. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    ok, thanks for clarifying.
     
  38. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    To be fair, I can think of one case where there would be some benefit: if you are going to multiply two or more large numbers and then reduce them, as in the square-root-of-sum-of-squares distance formula, then converting to double beforehand reduces the additional error from the squared floats "overflowing" the 32-bit mantissa. Then, after the double square root, put the result back into a float.

    I would still like to see Unity's design reasoning (if it is documented), the implementation details, and some measurements.

    My alternate preference is to minimise the size of the numbers for the most important calculations (around the observer/player) by keeping the observer at the origin (no periodic shifting) and doing everything in floats. No need to incur the casting costs of upcast-calculate-downcast or the extra memory costs of the doubles.
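The square-root case above can be demonstrated directly. A sketch in plain Python, with `struct` emulating float32 rounding (the sample offsets are hypothetical):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double (IEEE binary64) to the nearest 32-bit float."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

dx, dy = f32(100000.1), f32(0.1)   # a mostly-axial offset, in float32

# All-float32: dy*dy (0.01) is far below half an ulp of dx*dx (~512 at
# this magnitude), so it vanishes before the square root ever sees it.
assert f32(f32(dx * dx) + f32(dy * dy)) == f32(dx * dx)

# Upcast-calculate-downcast: the double-precision sum keeps dy's
# contribution, and only the final result is rounded back to float32.
assert dx * dx + dy * dy > dx * dx
```

The single downcast at the end loses nothing that float32 could have represented anyway, which is why the upcast pays off specifically in reductions like this.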
     
  39. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    777
    HDRP does exactly that (camera-relative rendering), but positions in the scene are stored as floats, and physics is also calculated in floats in world space.
    https://docs.unity3d.com/Packages/c...on@17.0/manual/Camera-Relative-Rendering.html
     
  40. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    Well, Mono does it for two floats: if you, for example, multiply two floats, it will cast both to doubles, multiply them, and cast the result back to float. Zero gained.
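Max-om's "zero gained" is provable for a single operation. A sketch in plain Python (whose arithmetic is IEEE double, conveniently mirroring Mono's behavior), with `struct` emulating the float32 downcast:

```python
import math
import random
import struct

def f32(x: float) -> float:
    return struct.unpack('<f', struct.pack('<f', x))[0]

random.seed(42)
for _ in range(1000):
    a = f32(random.uniform(-1e6, 1e6))
    b = f32(random.uniform(-1e6, 1e6))
    # Two 24-bit significands multiply into at most 48 bits, which a
    # 53-bit double holds exactly: the upcast multiply never rounds.
    m, _ = math.frexp(a * b)
    assert m * 2 ** 48 == int(m * 2 ** 48)
# So the final downcast is the only rounding step, and the result is
# bit-identical to a native float32 multiply. Nothing gained, nothing lost.
```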
     
  41. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    So I am told. And the description is the same as my floating-origin algorithm. However, they do not provide details of the implementation, and they have not acknowledged my prior work in this area. Pretty shabby, really.
     
  42. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    In general, yes, unless it saves on the extra accuracy loss from overflow of the mantissa, as I described. My example is only an isolated case where there is some potential benefit (because you take the square root before casting the result back). As a general approach, I agree with you. So I would like to see the justification.
     
  43. Max-om

    Max-om

    Joined:
    Aug 9, 2017
    Posts:
    499
    That resolution gain is so small that it's not worth it. MS's JIT has never done it as far as I know, not even back in the full-framework days.
     
  44. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    250
    And you may be right that it is not worth it. I would not do it either. I see it as a band-aid over a fundamentally flawed foundational design.
     
    Max-om likes this.