
Floating Point Errors and Large Scale Worlds

Discussion in 'World Building' started by Phelan-Simpson, Apr 15, 2018.

  1. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
I have measured the moving collider performance issue raised by @Marcos-Elias, @ristophonics, and @razzraziel. With 10k collider objects (boxes with a collider component), continuously transforming the group by transforming the scene root produces a significant physics load in the profiler. As in my latest paper, I ran it on my Core i5 laptop with very little geometry visible and used the mouse keys to continually move the scene root.

In the first image attached, the simple visible geometry can be seen on the left of the screen, with the Profiler window on the right.

    CollidersActiveHierarchy-30pc.jpg

You need to tell Unity not to perform unnecessary processing when the scene root is moved, and this can be done by turning the active hierarchy above the colliders off before each transform and back on after it:
Code (CSharp):
nucleonsGroup.SetActive(false);
// Move the scene to the new position by changing the scene parent object's transform.
transform.Translate(rotated_transform);
nucleonsGroup.SetActive(true);
    The next image shows what happens when I do this.


The profiler measurements show that the processing devoted to collider movement is now small compared with the proportion of CPU devoted to rendering a few simple objects, even while continuously transforming 10,000 colliders. Moving colliders therefore does not need to cause a significant performance load.
     

    Attached Files:

    Last edited: Sep 8, 2019
  2. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,361
    Did you profile it in a build or editor?

    I'm asking because:
     
  3. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
@Peter77 Good question. I am running from the editor Play button because, even though I have Development Build ticked and Autoconnect Profiler ticked, no profiler is displayed when I run it as an app.

    I run it full screen, so the rendering of editor windows superpig mentioned is not an issue.
    I also turn vsync off so there are no cpu spikes from that.
    There are no memory allocations during run either, after the initial creation of 10k colliders.
    There may be other interference from the editor but, on average, the relative load comparisons are still valid anyway.

The documentation on profiling with a development build talks about running the Profiler on another machine, which would be ideal, but I do not have a network setup.

    If you can tell me how to get the profiler up on the same machine when running from a built app then I will try it.
     
  4. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,361
    That should be no different:
    • Start development build
    • Open Profiler window
    • Select Player in Profiler drop-down
    • Press record in Profiler window
     
  5. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
It is different on my system: after starting the build there is no menu option to open the Profiler. Is there a keyboard shortcut?
There are no menu options for anything. macOS Mojave 10.14.6, Unity 2018.4.7f1.
     
  6. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,361
You're saying that when you start the build, the "Window > Analysis > Profiler" menu item disappears from the Unity Editor main menu? :eek:
     
  7. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
I mean it is not available in the build that I run. If I need to run it from the editor (like I have been doing), then... do you mean I should close the editor after starting the Profiler, and it will connect with the build when I run that?
     
  8. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
@Peter77: I worked it out. Here are the profiler measurements from two dev builds: the first with an active hierarchy during moves, the second with it inactive.
Profile with the 10k colliders active under the root node while moving the hierarchy:
    HierarchyActive2-moving40pc.jpg

    Profile for inactive hierarchy:

    HierarchyInactiveUnity2018.7-40pc.jpg

So the difference is 60+ fps (active hierarchy) versus 200+ fps (inactive).

For these tests, I used 1280x800 with rendering quality set to Very Low, on the same version of Unity (2018.4.7). The only apps running were the Unity editor with the Profiler and the game from the dev build.
     

    Attached Files:

    buFFalo94 and Peter77 like this.
  9. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    I put an example video online here:


The relevant scripts shown attached in the video are:
Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

/*
 * Performs simple floating origin player movement by reverse-transforming the scene.
 * Floating origin movement only happens if there is
 * a change in navigation input and it is above a minimum threshold.
 *
 * Here's an online example testing 10k colliders: https://youtu.be/DsWLQFnRdLo
 * which shows how the PlayerMove and PlayerView are attached.
 *
 * Assumptions:
 * Player object has camera attached, it is positioned at the origin, and
 * tagged "playerCapsule".
 * This script is attached to the scene parent object.
 */
public class PlayerMove : MonoBehaviour
{
    // Minimum player collision detection distance
    private const float COLLISION_DISTANCE = 2f;
    // Multiplier for when bouncing back from collisions
    private const float COLLISION_ADJUST = 1.2f;
    // Threshold for detecting navigation changes
    private const float NAV_SHAKE_THRESHOLD = 0.0001f;
    // Max distance for detecting player collisions
    private const float RAYCAST_DISTANCE = 10f;

    // Multiplier to each movement input
    private readonly float speed = 7.0f;
    private GameObject player;
    private GameObject SceneRoot;
    // Layer mask for ray casting
    private int layerMask;
    // Horizontal movement deltas
    private float deltaX;
    private float deltaZ;
    private float speedAdj;
    // Current reverse transform
    private Vector3 reverseMovement;
    // Rotated reverse transform
    private Vector3 rotated_transform = new Vector3(0f, 0f, 0f);
    private readonly Vector3 player_position = new Vector3(0f, 0f, 0f);
    private RaycastHit rayCollision;

    void Start()
    {
        // Allocate memory once only
        reverseMovement = new Vector3(0, 0, 0);

        // Use Physics.Raycast to cast a ray forward into the scene to check for collisions.
        // Create a bit mask with 0 for the player layer (layer 8) to use in Raycast.
        layerMask = 1 << 8;
        layerMask = ~layerMask;

        // Turn off cursor display in the game window
        Cursor.visible = false;

        // Get access to the player object
        player = GameObject.FindGameObjectWithTag("playerCapsule");
        SceneRoot = GameObject.FindGameObjectWithTag("root");

        if (player == null)
        {
            print("player not found");
        }
    }

    /// <summary>
    /// Do not use FixedUpdate here because performance drops dramatically (on dual core i5 MacBook Pro).
    /// <seealso cref="PlayerView.cs"/>
    /// </summary>
    void Update()
    {
        // Get the horizontal movement changes from the keyboard and
        // negate them so we can move the scene in reverse.
        deltaX = -Input.GetAxis("Horizontal");
        deltaZ = -Input.GetAxis("Vertical");

        // Only process floating origin movement if there is a navigation input
        // change and it is above the noise/shake threshold.
        // Performance: don't really want a square root here -
        //   or even a comparison of squares.
        if ((Mathf.Abs(deltaX) + Mathf.Abs(deltaZ)) > NAV_SHAKE_THRESHOLD)
        {
            speedAdj = Time.deltaTime * speed;

            // Scene reverse transform for floating origin navigation.
            // Make movement delta proportional to time since last move and speed factor.
            // Performance: changed this to assignment so no mem alloc and GC needed, and
            // 2 multiplies are a bit faster than multiplying by a 3D vector.
            reverseMovement.x = deltaX * speedAdj;
            reverseMovement.z = deltaZ * speedAdj;

            /*// Uncomment to do player collision detection.
              // If the player collided with a close object then ...
            if (Physics.Raycast(player_position, player.transform.TransformDirection(Vector3.forward), out rayCollision, COLLISION_DISTANCE, layerMask)
                && (rayCollision.distance < COLLISION_DISTANCE))
            {
                // ... bounce back a little from the collision
                transform.Translate(-rotated_transform * COLLISION_ADJUST);
            }
            else // no collision, so move the scene in reverse
            {*/
                // Use player camera rotation to modify the reverse movement vector so
                // that player forward corresponds to forward movement input.
                rotated_transform = Quaternion.Euler(player.transform.localEulerAngles) * reverseMovement;

                SceneRoot.SetActive(false);

                // Move the scene to the new position by changing the scene parent object transform.
                transform.Translate(rotated_transform);

                SceneRoot.SetActive(true);
            /*}*/
        }
    }
}
    and
Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

/*
 * Rotates the player, and hence the attached player view, based on mouse input.
 *
 * Assumptions:
 * This script is attached to the player object.
 * Camera is attached to the player object.
 */
public class PlayerView : MonoBehaviour
{
    // Control how fast the player view responds to mouse movement.
    // More sophisticated games would allow the sensitivity to be changed to suit a player's preferences.
    private const float sensitivity = 2f;

    // Limit (clamp) the vertical rotation to +/- this angle
    private const float vertClamp = 60.0f;

    // Rotation about the horizontal axis (up and down)
    private float currentRotationX = 0;
    // Left and right rotation
    private float currentRotationY = 0;

    void Update()
    {
        // Horizontal and vertical rotation at the same time
        currentRotationX -= Input.GetAxis("Mouse Y") * sensitivity;
        currentRotationX = Mathf.Clamp(currentRotationX, -vertClamp, vertClamp);

        currentRotationY = transform.localEulerAngles.y + Input.GetAxis("Mouse X") * sensitivity;

        transform.localEulerAngles = new Vector3(currentRotationX, currentRotationY, 0);
    }
}
     
    Last edited: Sep 13, 2019
  10. ncho

    ncho

    Joined:
    Feb 1, 2014
    Posts:
    93
At a map size of 4096x4096, am I at risk of any physics inaccuracies? I haven't noticed anything yet, but my game does rely significantly on raycasting, so I'm somewhat paranoid about this.
     
  11. Stardog

    Stardog

    Joined:
    Jun 28, 2010
    Posts:
    1,861
Probably not. If you put the middle of the terrain at the origin, you will only be 2048 units from the edges.
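To put rough numbers on it: the gap between a float coordinate and the next representable float (one ULP) is the best positional resolution physics has at that distance. A minimal sketch in plain C# (assuming a runtime with `BitConverter.SingleToInt32Bits`, which newer .NET versions provide):

```csharp
using System;

public static class FloatResolution
{
    // Returns the gap between x and the next representable float above it,
    // i.e. the best positional resolution available at that coordinate.
    public static float UlpAt(float x)
    {
        int bits = BitConverter.SingleToInt32Bits(x);
        return BitConverter.Int32BitsToSingle(bits + 1) - x;
    }

    public static void Main()
    {
        // At 2048 units from the origin the spacing is 2^-12, about a quarter
        // of a millimetre if one unit is a metre - so a 4096x4096 map centred
        // on the origin is normally safe for physics.
        Console.WriteLine(UlpAt(2048f));
    }
}
```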
     
  12. ristophonics

    ristophonics

    Joined:
    May 23, 2014
    Posts:
    32
    Depends... Is the 4096x4096 the resolution of the heightmap or the distance in meters of the terrain?
     
  13. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
If you are using a threshold of similar size, I would have to say yes. The tile size might be OK for keeping the map vertex coords small, but the threshold can independently lead to physics errors at much smaller distances.
    I did some very constrained physics tests because physics can be highly sensitive to error. I reported the experiment here:
    https://www.researchgate.net/public...ghting_cubes_battle_for_positional_invariance

    and you can see the videos here: part 1: 1DOF:
    https://youtu.be/vbIb9dh1f7o
    part 2: 2DOF:
    https://youtu.be/80W113eL6wQ

    Note that the videos compare continuous floating origin(CFO) with the threshold method that I call POS.

The main point I would like to make is that there will always be some increased error from the lower 3-space resolution induced by greater distance from the origin, and complex calculations are sensitive to it and magnify it.
As for map tiles, I used to use a LOD system that effectively adjusts size when you get closer.
     
    Last edited: Oct 24, 2019
    PrimalCoder and buFFalo94 like this.
  14. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    264
    Really cool tests
     
    cosmochristo likes this.
  15. signalsaudio

    signalsaudio

    Joined:
    Jul 13, 2019
    Posts:
    4
    Sorry to necro an old thread, but there was tons of information in here that is awesome - and from my experience, really hard to find.

    After reading through this thread I tried the following experiment, which may be useful to some people trying to make open world games in Unity:



    Other things I thought of that are not in the video:

    1. You actually could resolve some physics issues at those distances by using the "Continuous Floating Origin (CFO)" "display layer" to calculate physics from near the origin on the "floating error corrected" representations of enemies (will have to watch video to understand what I mean by that). I didn't really consider this because the game I'm creating is multiplayer, and the server would be verifying hit calculations on the "collider world" anyway.

2. I found that even without colliders, the CFO method struggles with moving immense numbers of game objects under a root parent node in Unity. However, some tests I tried indicated that this may be easily resolved by setting game objects that are not near the player inactive. I'll have to do more research to verify this, but the early tests I did seemed promising.

    3. A big benefit to this approach is that it also allows you to use regular control methods on the "regular collision layer". Nothing is changed about how a player interacts with that world - just that its mesh renderers are stripped so it isn't actually being rendered.

    4. A good piece of advice to make your life easier is to simply use the reverse transform.position on the CFO "display layer" - no need to map the controls backwards or anything over-complicated. Since player rotation is generally local anyway I believe you can just use a direct copy of the player's rotation from the "regular collision layer" for most situations.

    The slightly more involved process of implementing this hybrid approach is creating the script to instantiate both worlds on top of each-other, strip the mesh colliders on one, and the mesh renderers on the other - but that should not pose much difficulty for most Unity programmers.

    Big thanks to CosmoChristo for his posts and research papers. Awesome stuff.
     
    cosmochristo likes this.
  16. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
Note that the continuous methods allow greater (e.g. double) performance for some physics operations relevant to this discussion, such as avatar/view proximity, collision, and distance tests, because relative-to-zero design eliminates some variables and operations automatically.
I provide an explanation in this short article:
    https://www.researchgate.net/publication/342510617_Position_Independent_Principle
     
    Marcos-Elias likes this.
  17. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    This is a good way to visually compare the two floating origin methods side-by-side. Another variation that I think would be worth making is to have just one copy of the world and two avatars, one for each navigation method.
My comparisons showed each method one at a time, like this video of the object shift method, which shows the rendering problems starting from the first shift and how accelerating to faster speeds causes glitches.

    and this one that demonstrates scaling out into planetary space with simple physics in a HUD:

    I think your side-by-side comparison would be more effective.
     
  18. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
Hi @Phelan-Simpson, I'm curious about how you arrived at 1cm accuracy for that distance. I asked myself similar questions some time ago and ended up developing some theory and rules to predict base worst-case accuracy. My aim was to give people a proven way to estimate accuracy limits.
So my rule (3.4 × distance × machine epsilon for single/double precision) gives a value of 4cm worst-case accuracy for single precision between two points/vertices at a distance of 99,999.99. In other words, most of the time you will get 1cm or so accuracy, but occasionally it will be 4cm.
On top of that, there is the magnification of the spatial resolution base error, due mainly to multiplications (if you multiply a number by 10, the error is multiplied by 10). Although I use this rule, magnification tends to overwhelm it and is hard to predict when you only write some of the code: Unity may contribute more error than your code does.
For example, I find the Unity reparenting operation generates noticeable error in the transform value. Observing this error at a distance of > 10^11 (near an average Mars orbit), the reparenting magnification plus whatever is done in my scripts is around 45 times what my rule predicts.
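The rule quoted above can be sketched directly; this is a hypothetical helper, not code from the thread, with the 3.4 factor and single-precision machine epsilon taken from the post's description:

```csharp
using System;

public static class AccuracyRule
{
    // Machine epsilon for single precision (spacing between 1.0 and the next float).
    private const double SingleEpsilon = 1.1920929e-7;

    // Worst-case base accuracy between two points at the given distance
    // from the origin, per the rule: 3.4 * distance * machine epsilon.
    public static double WorstCase(double distance) => 3.4 * distance * SingleEpsilon;

    public static void Main()
    {
        // At a distance of 99,999.99 this gives roughly 0.04, i.e. ~4 cm
        // if one unit is a metre - the worst-case figure quoted above.
        Console.WriteLine(WorstCase(99999.99));
    }
}
```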
     
  19. Omti1990

    Omti1990

    Joined:
    Jun 22, 2018
    Posts:
    9
I'm sorry for necroing this; I saw this thread and found it very helpful for programming my space game. I've just been facing a few practical problems that have only barely been addressed here: how do I get serious collision physics to work with a continuous floating origin (CFO) setup?

There've been code suggestions for raycast collision detection, and that works well enough for frontal collisions. But what about hits that aren't directly frontal, where my spaceship collides with something at the sides?

Do you know if there's some elegant solution to detect such hits? I've been trying to use the collision detection from rigidbodies and it sort of works, but there's a major problem. As far as I can tell, if you want to make use of rigidbodies in a CFO setup, you need to restrain their ability to move on the x, y and z axes. That also means you can't feed them a velocity (I tried, but the vector remained {0,0,0}), which means their collision detection is crippled and can easily lead to glitching through objects.

    Is there some way to get at the underlying functions and just feed data into them? I've been looking into the scripting API, but I don't think they're exposed. (Please correct me if I'm wrong)

Now I come to the next part of the problem: how do you actually calculate the collision? This might not be much of a problem for first person shooters and the like, where it's mostly about being bumped back when crashing into a wall. In my case I do need rather more complicated calculations (to say the least). I've shown the easy cases below. In the case of a frontal elastic collision with a movable object, you can just use standard one-dimensional impulse calculations: the spaceship gets reflected but loses part of its velocity to the object it has collided with. The other option is that the object is static and the spaceship just gets reflected; in that case it doesn't even have to be frontal.

But what about elastic collisions that aren't frontal or aren't with a static partner? What to do when crashing into two objects at the same time (or at least two colliders of the same model)? I believe under normal circumstances the rigidbody component would calculate all of this. I've been looking into the calculations myself, and even university course level material on the internet tends to be simplified.
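For reference, the frontal elastic case mentioned above reduces to the standard one-dimensional impulse equations; a plain C# sketch (assumed names, not Unity API):

```csharp
public static class Elastic1D
{
    // Post-collision velocities for a 1D elastic collision between masses
    // m1 and m2 with incoming velocities v1 and v2, derived from conserving
    // both momentum and kinetic energy.
    public static (float v1After, float v2After) Resolve(float m1, float v1,
                                                         float m2, float v2)
    {
        float v1After = ((m1 - m2) * v1 + 2f * m2 * v2) / (m1 + m2);
        float v2After = ((m2 - m1) * v2 + 2f * m1 * v1) / (m1 + m2);
        return (v1After, v2After);
    }
}
```

The static-object case is the limit where m2 is effectively infinite: v1After tends to -v1, a pure reflection, which is why that case doesn't have to be frontal.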

    To be honest, I would rather not try to program a custom rigidbody sort of collision system myself if I can avoid it. Do you know if unity's physics calculations are exposed somewhere? Or if there are modules that I could use for 3d collisions (or even assets)?
     
  20. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    hi @Omti1990,

If you are talking about collisions involving the zero-centred avatar (and attached ship), then I think it is probably worth considering making a custom CFO collision detection system, if you can exploit faster optimisations like those I mention in the Position_Independent_Principle article linked earlier, at least for first-approximation collision detection.

I have only played with overly simplified collision responses myself so far, so I don't have a ready-made solution for you. However, I'd be happy to discuss further because I think I will be looking at doing the same thing later on.
     
  21. Omti1990

    Omti1990

    Joined:
    Jun 22, 2018
    Posts:
    9
My current solution is to use boxcasts from the colliders, add the normals of all the RaycastHits, and then use the reflection function from Vector3/Vector3d to calculate where my spaceship should be reflected. This works decently well for demonstration purposes, but actual collision physics is far more complicated than that. I'd really love something where I'd just have to plug in rotations, masses, collision points and velocities and it'd do the physics for me. No clue if someone has made an asset that does that sort of thing.
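The normal-summing approach described above might look roughly like this in Unity terms; a sketch only, where `shipVelocity` and the boxcast parameters are assumed names:

```csharp
using UnityEngine;

public class ReflectOnHit : MonoBehaviour
{
    public Vector3 shipVelocity;            // assumed: maintained by the ship controller
    public Vector3 halfExtents = Vector3.one;

    void FixedUpdate()
    {
        if (shipVelocity == Vector3.zero) return;

        // Gather all hits along the ship's travel for this physics step.
        RaycastHit[] hits = Physics.BoxCastAll(
            transform.position, halfExtents, shipVelocity.normalized,
            transform.rotation, shipVelocity.magnitude * Time.fixedDeltaTime);

        if (hits.Length == 0) return;

        // Sum the contact normals, then reflect the velocity about the result.
        Vector3 combined = Vector3.zero;
        foreach (RaycastHit hit in hits)
            combined += hit.normal;

        shipVelocity = Vector3.Reflect(shipVelocity, combined.normalized);
    }
}
```

This handles side hits as well as frontal ones, since the averaged normal points away from whatever surface combination was struck.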

Anyways, I've come across another problem. Actually moving a parent transform works well enough if you're only moving basically static objects. The problem I've encountered happens if you've moved the parent transform for your world far from the origin and try to add moving objects to it (in my case, projectiles): they move really oddly. I suspect this is because their transform.localPosition has high float values on at least one axis. The further you move the parent transform away, the worse these effects get.

    Even with static objects, I suspect there are limits, which I might easily surpass at the solar system scale I've been using. So I don't think just moving the parent transform is really a solution for that kind of problem. I've been thinking I could just move the objects individually and just use my own coordinate system in double Vectors to determine their position, but that opens the question how do I effectively call on mobile objects or objects I might or might not want to spawn later on.

My current best bet is to just get all children during Awake, put them into some array, and have thousands of deactivated game objects sit around for everything at the start, activating them if needed. I'm just worried that won't be particularly performant, although I could optimise by only updating those that are actually currently visible. I'm still worried about what might happen to performance in some bigger space battle if thousands of projectiles have to be updated every frame.
     
  22. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
You are probably right. There is a way to manage the precision and distance from origin of every active thing, not just the CFO avatar (and stuff near it).

OK, well, I would like to investigate this type of dynamic performance issue with you, if that's OK. I can test it on my current framework; maybe you can make a small test project that I can try. My main time sink is trying to get acceptable flight dynamics in different navigation modes - e.g. when transitioning from space, to near planet, to a detailed region (where you might need some pitch and roll), to walk/terrain follow, and then back out again.

Yes, that's right, and the hierarchy should not remain static, as you say - static hierarchies are useless. I use a dynamically modifiable hierarchy. I am using single precision only and go out to Mars using this approach. Without it I would not get away with single precision going from space to ground.

Haha, yes, I use activate/deactivate all the time on sub-hierarchies.
     
  23. KeinZantezuken

    KeinZantezuken

    Joined:
    Mar 28, 2017
    Posts:
    53
    @cosmochristo
    @Omti1990
    If you get somewhere please update this thread when you can, I'm interested in this as well.
     
  24. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
OK, a minor update. As to your question about SetActive false/true: it improves performance in my app, and I do this every time there is a camera "move" (in quotes because the camera actually stays at zero). So I don't think you will have a performance hit; instead, fps should improve. You will have to design it differently for what you want (as you are already thinking), and one of the changes is to not deactivate anything containing sound, or you will get audio glitches. So sound elements may need a separate hierarchy if you have a complex audio landscape.
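The separate sound hierarchy suggested above could be as simple as keeping audio sources under their own root, outside the subtree that gets toggled. A sketch with assumed names (`worldRoot`, `audioRoot`), not code from the thread:

```csharp
using UnityEngine;

public class FloatingOriginMove : MonoBehaviour
{
    public GameObject worldRoot;   // assumed: geometry and colliders, safe to toggle
    public Transform audioRoot;    // assumed: streamed audio, never deactivated

    // Apply one floating-origin reverse move without glitching audio.
    public void MoveScene(Vector3 reverseMove)
    {
        worldRoot.SetActive(false);
        worldRoot.transform.Translate(reverseMove);
        worldRoot.SetActive(true);

        // Audio moves by the same amount but is never toggled, so
        // streamed sounds keep playing across the move.
        audioRoot.Translate(reverseMove);
    }
}
```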

As for vector algebra, I am still only using it for determining if the player is facing certain objects. No collision detection yet.
     
  25. AndreiMarian

    AndreiMarian

    Joined:
    Jun 9, 2015
    Posts:
    71
For the floating point precision problem there are mitigations and solutions. All suggestions presented so far seem like mitigations. But why not take the bull by the horns and make it as it ideally should be?

Using floating point, single or double, in 3D worlds is the same blunder, like hammering nails with pliers: the whole idea behind "floating" point is to use it either at small or at large scale. Not both at a time!
Inconsistent accuracy: 7 decimals near the origin, yet skipping integers past the insignificant value of ±16,777,216. I say "insignificant" because the float positive range is 0 to 340,282,366,920,938,463,463,374,607,431,768,211,456.
The problem is doubled due to negatives.
So really the usable fraction of a float is 1 / 81,129,638,414,606,681,695,789,005,144,064. You see now?

Now what I'm proposing is making a Unity package with integer based coordinates, and possibly math. Use int, or even better, long.
I know it's a huge undertaking but it would be the solution. We would have a steady ±2,147,483,647 for the same 32-bit price, possibly even faster on CPU, and ±9,223,372,036,854,775,807 with long.
Using double only sweeps the issue under the carpet.
Need 3 decimals of precision? Use mm as the unit. Need precision of merely thousands? Use km as the unit. Either way, you get more space for the buck than with float. Plus consistency. Using long units as mm you'd get ~2 light-years across at mm precision, as 1 ly ≈ 9.4605284e15 m.

Note: You would have to manually prevent/handle overflow cases, or the engine would have to reserve some values corresponding to NaN and ±∞ in floating point.
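A sketch of what such integer coordinates could look like; `Vector3L` is a hypothetical struct, not part of Unity, with millimetres as the unit as suggested:

```csharp
using UnityEngine;

// World positions stored as long millimetres: uniform precision everywhere,
// with a range of ±9.2e18 mm (~2 light-years across, as noted above).
public struct Vector3L
{
    public long x, y, z;

    public Vector3L(long x, long y, long z) { this.x = x; this.y = y; this.z = z; }

    public static Vector3L operator +(Vector3L a, Vector3L b)
        => new Vector3L(a.x + b.x, a.y + b.y, a.z + b.z);

    // For rendering, convert to a float Vector3 relative to the camera's
    // integer position, so the values handed to the GPU stay small.
    public Vector3 RelativeTo(Vector3L origin)
        => new Vector3((x - origin.x) * 0.001f,
                       (y - origin.y) * 0.001f,
                       (z - origin.z) * 0.001f);
}
```

Overflow handling is deliberately left out here; as noted above, it would have to be done manually.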
     
  26. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @AndreiMarian, Integers could be utilised for greater accuracy, as you say. Floating point has the advantage of greater scale. Personally I prefer floats. I think if multi-precision int math could be made to work well then they could be a good option but I have not seen any proof of that. I think it is worth pursuing though. :)

As for a solution: when one combines position independence of calculations with the observation that small numeric values lead to less error propagation, you have a solution where calculations can be performed at or close to the origin to minimise error in the result. The results can then be applied to whatever location they are relevant to.

This is a general solution, whether you use integer or float. It is not some sort of hack that only works in some cases. When coded as algorithms, my two main algorithms are Continuous Floating Origin and Relative Spaces, which work together. And the first is not origin rebasing or object/origin shifting based on some threshold distance from the origin - those may be considered hacks because they have glitches and other limitations. For example, what threshold applies generally to every application?

I don't claim to have the only answers, but I have spent nearly two decades on this very subject and built a great deal of evidence that my approach is a general and effective solution with good performance. It does require a change from absolute to relative thinking, and that is not always easy: even Newton, whose Laws showed there is no absolute space or position, could not accept that everything is relative (Hawking, A Briefer History of Time, bottom of page 23).

    I hope this helps :)
     
    AndreiMarian likes this.
  27. AndreiMarian

    AndreiMarian

    Joined:
    Jun 9, 2015
    Posts:
    71
It's my favorite of the existing approaches. If only it could fit all cases, including mine... but unfortunately I can't use it in my case (more about that below).

TL;DR
I want a robust, consistent, simultaneous, light-year-spanning world. long delivers these.

I've read through the paper on which you've based your approach, I've seen your git code and the related videos. I for one like how it turned out, and kudos for the activation/deactivation trick that skips unnecessary recalculation.
I like it as one of the most straightforward approaches, not straying (too much) from the ideal, and it's relatively performant.

Everything is relative, I agree, but it's about space-time, not just space. For our case the ideas that come up are:
1. Computing events in order of range from the camera, at a speed greater than the fastest locomotion speed, so that "updating" is not observable while travelling. But this means high spatial accuracy plus low processing.
2. Only computing events in the observable range. This is your approach, but with no simultaneity.

    In my specific case I just want the regular game routine, but scaled to a large world. That is, I want simultaneity, i.e. things should still happen even if I move past the reliable (4000 units?) range. This implies the same precision all over the world.
    Just one example: if I send a ship to mine some asteroid in the far distance, I don't want the two to overlap due to floating-point error and explode. Nor do I want to compute the mining event at a later time, when I'm finally close enough.
    A workaround would be to skip collision checking altogether, but that's crude. Plus it's not just physical collisions; there are essential matters like "do something when in range", e.g. shoot an enemy when in visual range but not yet in their weapons range. If the difference between these ranges is, say, a mile, and the precision is 2 miles, you can imagine the mess. It may even shoot friendlies, etc. Anyway, anything you can think of can happen.

    Using float is just wrong, like using a rubber hammer for nails. Replacing it with double is like using a ten-times-bigger rubber hammer.

    As for the double coordinate set (x, y, z, x', y', z'), it's both cumbersome for us devs and a burden on CPU and memory. Plus, how do you deal with rendering? You'd have to convert to regular coordinates for the GPU.
     
  28. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    Thanks for the positive comments :).
    Note that the activation/deactivation is just a workaround for an inefficiency in Unity - it would be better if you could tell Unity that a node and its subtree are being moved by floating origin, so it shouldn't send unnecessary events. Also, you can't use it on anything containing streamed audio, and maybe some other things.
     
  29. AndreiMarian

    AndreiMarian

    Joined:
    Jun 9, 2015
    Posts:
    71
    The floating-point idea and its inner implementation are worth something close to a Nobel prize. A world in a seed, so ingeniously distributed. And I mean it. But as I said in other posts, it doesn't fit our use case. Why? Because it doesn't provide for our needs. And even if it did, there is something else more suitable.

    We have to realize that all you, I, and everyone else are trying to do here is come up with ways to patch the system such that we can make space, which is the foundation of a game, simply... work ;)
    For so many devs the 3000-5000 unit limit came as a surprise; otherwise the internet wouldn't be packed full with the subject. Let's be honest, guys: this is a sign of some poor thinking, not a hardware limitation. And it's not Unity's fault; it's the downside of how all engines have ever been conceived.

    To better see what I mean, take time instead of space. What if, all of a sudden, you discovered that 25 minutes into the game a frame would start to take arbitrary values, even using deltaTime, due to some type limitation? After 1 hour a frame would take anywhere between 5 ms and 200 ms?
    Would you be happy just capping play time to 25 mins? (The current situation.)
    Would you call restarting the game every 25 mins a solution?
    Would you be happier if the fluctuation were reduced to 10 ms?
     
    Last edited: Feb 19, 2022
  30. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    I fully agree: temporal jitter is as important as spatial, and it even causes some spatial jitter. CFO should be positional centering, not just spatial: you remain at the center of space and time [and other positional fields]. I have not yet had the time to implement this :).

    As for your mining example, my plan is to put distant calculations in the back-end/server-side processing: calculated in a centered model and recorded for the place they relate to, but not rendered (in detail) unless a player is viewing them up close.
     
    AndreiMarian likes this.
  31. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    Collisions and physics.
    This is a somewhat slow-moving thread, but, to paraphrase the wise Treebeard: the acquisition of deep knowledge is not worthwhile unless you spend a lot of time acquiring it.
    From time to time, questions are raised about collisions and physics under continuous floating origin, so here is a demo of a ship colliding with terrain and other objects, plus some active physics.
     
  32. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @TerraUnity correctly pointed out to me that the above video does not demonstrate use of Unity rigidbody and collision physics very well. So here is a better demo:
     
  33. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    Hi again @Omti1990, I have an update for this.
    The video:
    (deleted) but you can see some collision at work in later ones and in this one:


    shows many kinds of ship collisions working, including from the side. This uses plain Unity rigidbody collisions to make the ship react, so there is no difficult detection to handle. Of course, to make the Universe move the other way, instead of the ship moving, I have to write a little extra code.

    Note: this works for speeds up to, say, 1000 m/s, but beyond that some penetration of terrain/objects can occur. Two options I have tried that work for higher speeds are:
    1) fire more rays in different directions,
    2) make invisible surfaces some distance around the ship and use Unity rigidbody collisions so that collision detection is handled automatically by Unity.
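    As a rough illustration of option 1 (illustrative only; the direction set and look-ahead factor are my own choices, not taken from the demo), several rays can be cast ahead of the ship, covering the distance it will travel over the next couple of physics steps, so fast motion can't tunnel through thin geometry:

```csharp
using UnityEngine;

// Sketch: probe ahead of a fast-moving ship with a fan of rays so that
// high speeds do not tunnel through terrain between physics steps.
public class HighSpeedCollisionProbe : MonoBehaviour
{
    static readonly Vector3[] localDirs =
    {
        Vector3.forward,
        (Vector3.forward + Vector3.up).normalized,
        (Vector3.forward - Vector3.up).normalized,
        (Vector3.forward + Vector3.right).normalized,
        (Vector3.forward - Vector3.right).normalized,
    };

    public bool WillHit(float speed, out RaycastHit hit)
    {
        // Look ahead roughly two physics steps of travel.
        float reach = speed * Time.fixedDeltaTime * 2f;
        foreach (var dir in localDirs)
        {
            if (Physics.Raycast(transform.position,
                                transform.TransformDirection(dir),
                                out hit, reach))
                return true;
        }
        hit = default;
        return false;
    }
}
```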
     
    Last edited: Apr 14, 2021
    Omti1990 and Digika like this.
  34. Omti1990

    Omti1990

    Joined:
    Jun 22, 2018
    Posts:
    9
    Sorry, my updates for this thread were borked somehow.

    Anyways, I can explain how I implemented my floating origin system.

    First and most importantly, I downloaded the Vector3d asset from GitHub, which is basically Vector3 with double precision.
    Since I didn't quite think things through from the beginning, I don't have one super-parent for all my objects to determine their position via floating origin, so I implemented some interfaces:

    Code (CSharp):
    1. public interface ISpaceObject
    2.     {
    3.         Vector3d Origin{get;}
    4.         Vector3d CurrentPosition{get;}
    5.         Vector3 Velocity{get;}
    6.     }
    7.  
    8. public interface IUpdatingSpaceObject: ISpaceObject
    9.     {
    10.      
    11.         void UpdateSpaceObject();
    12.         double CurrentDistance{get;}
    13.  
    14.     }
    15.  
    The origin isn't technically necessary, but I figured it'd be cleaner if I set some starting position separate from the CurrentPosition for all space objects. For the purposes of the floating origin, I'm always using IUpdatingSpaceObject. I'm also using the parent interface in other interfaces, so I kept it separate in an attempt to keep things cleaner.

    I'm using a singleton to keep track of my IUpdatingSpaceObjects and run the update script. Since I didn't really understand Actions when I originally programmed this, it's using Lists. I'm using FixedUpdate to move the objects in space; interface stuff and AI behaviours run in Update though.

    Code (CSharp):
    1. public class Services : SingletonMonoBehaviour<Services>
    2. {
    3.  
    4.    private List<IUpdatingSpaceObject> updatingPlanets = new List<IUpdatingSpaceObject>();
    5.        public List<IUpdatingSpaceObject> UpdatingPlanets{get{return updatingPlanets;} set{updatingPlanets = value;}}
    6.  
    7.     private List<IUpdatingSpaceObject> updatingProjectiles = new List<IUpdatingSpaceObject>();
    8.       public List<IUpdatingSpaceObject> UpdatingProjectiles{get{return updatingProjectiles;} set{updatingProjectiles = value;}}
    9.     private List<IUpdatingSpaceObject> updatingMobs = new List<IUpdatingSpaceObject>();
    10.     public List<IUpdatingSpaceObject> UpdatingMobs{get{return updatingMobs;} set{updatingMobs = value;}}
    11.  
    12.  
    13.     private GameUpdates gameUpdates = new GameUpdates();
    14.    
    15.  
    16.     private void FixedUpdate()
    17.     {
    18.        
    19.             //Current Update Order:
    20.  
    21.             //Planets
    22.             gameUpdates.UpdateSpaceObjects(UpdatingPlanets);
    23.             //Projectiles
    24.             gameUpdates.UpdateSpaceObjects(UpdatingProjectiles);
    25.             //Ships
    26.             gameUpdates.UpdateSpaceObjects(UpdatingMobs);
    27.  
    28.             //PlayerShip
    29.             PlayerShip.UpdatePlayerShip();
    30.  
    31.         }
    32.  
    33.      
    34.     }
    35.  
    I put my updating function into a separate class to keep my singleton cleaner (in hindsight: nice try, me). Since I didn't know about static classes back then, I'm actually instantiating this, but you could obviously also make it a static class. I just haven't had the drive to refactor things yet.

    Code (CSharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3. using EscapeFromEarth;
    4.  
    5. public class GameUpdates
    6. {
    7.  
    8.     private int i = 0;
    9.     private int n = 0;
    10.  
    11.     private IUpdatingSpaceObject spaceObjectCheck;
    12.  
    13.     bool check = false;
    14.  
    15.     public void UpdateSpaceObjects(List<IUpdatingSpaceObject> updatingSpaceObjects)
    16.     {
    17.         if(updatingSpaceObjects.Count != 0)
    18.         {
    19.             for(i = 0; i < updatingSpaceObjects.Count; i++)
    20.             {
    21.                 //save updating object for later checks
    22.                 spaceObjectCheck = updatingSpaceObjects[i];
    23.  
    24.                 //Update UpdatingSpaceObjects
    25.                 updatingSpaceObjects[i].UpdateSpaceObject();
    26.  
    27.                 //Secure if deletion has occured
    28.                 if(updatingSpaceObjects.Count == 0) break;
    29.                 if(i>=updatingSpaceObjects.Count) break;
    30.  
    31.                
    32.              
    33.                 //In case something has been deleted run updates for the following objects and deal with potential further deletions
    34.                 RerunOnDeletion(updatingSpaceObjects);
    35.                 if(updatingSpaceObjects.Count == 0) break;
    36.              
    37.             }
    38.         }
    39.         else
    40.         {
    41.             //Debug.LogWarning("updatingSpaceObjects is empty!");
    42.         }
    43.      
    44.     }
    45.  
    46.     private void RerunOnDeletion(List<IUpdatingSpaceObject> updatingSpaceObjects )
    47.     {
    48.      
    49.         if(spaceObjectCheck != updatingSpaceObjects[i])
    50.         {
    51.             check = false;
    52.             i--;
    53.             //Debug.Log("Object got removed! i: " + i + "Count: " + updatingSpaceObjects.Count);
    54.             if(i<0)
    55.             {
    56.                 i=0;
    57.                 while(check == false)
    58.                 {
    59.                     //Debug.Log("Updating additional updatingSpaceObject");
    60.                     spaceObjectCheck = updatingSpaceObjects[i];
    61.                     updatingSpaceObjects[i].UpdateSpaceObject();
    62.                     if(updatingSpaceObjects.Count == 0) check = true;
    63.                     else if(spaceObjectCheck == updatingSpaceObjects[i]) check = true;
    64.                     else Debug.Log("Repeating while loop!");
    65.                 }
    66.              
    67.              
    68.             }
    69.         }
    70.     }
    71.  
    72.  
    73. }
    The code is probably more complicated than it needs to be. It's just that SpaceObjects can potentially be destroyed during their update, so I needed to do some shenanigans with the list. Since the code works, I kept to the old wisdom of never touching a running system.
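    For reference, one simpler pattern that tolerates removals during iteration is to update from a snapshot of the list (a sketch only, not a drop-in replacement; note the Contains check is O(n), so for very large lists a HashSet of removed objects would be cheaper):

```csharp
using System.Collections.Generic;

public static class SpaceObjectUpdater
{
    // Iterate over a copy so UpdateSpaceObject() may freely remove any
    // object (itself or others) from the live list mid-loop.
    public static void UpdateSpaceObjects(List<IUpdatingSpaceObject> live)
    {
        foreach (var obj in live.ToArray())   // one snapshot allocation per call
        {
            if (live.Contains(obj))           // skip anything already removed
                obj.UpdateSpaceObject();
        }
    }
}
```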

    That's just the code necessary to run things from my singleton. Now I come to my IUpdatingSpaceObjects. I'm using one basic example here, my MobileObject:
    Code (CSharp):
    1. public class MobileObject : MonoBehaviour, IUpdatingSpaceObject
    2. {
    3.  
    4.    [SerializeField] protected Vector3d origin;
    5.     public Vector3d Origin{get{return origin;}protected set{origin = value;}}
    6.    protected Vector3d currentPosition;
    7.     public Vector3d CurrentPosition {get{return currentPosition;} private set{currentPosition = value;}}
    8.     protected Vector3 movement = Vector3.zero;
    9.     public double CurrentDistance{get{return (CurrentPosition-Services.Instance.PlayerShip.CurrentPosition).magnitude;}}
    10.     public Vector3 Velocity
    11.     {
    12.         get{return movement;}
    13.         protected set{movement = value;}
    14.     }
    15.  
    16.    protected bool initialised = false;
    17.  
    18.    protected virtual void Awake()
    19.     {
    20.         if(initialised == false)
    21.         {
    22.             currentPosition = origin;
    23.             initialised = true;
    24.         }
    25.      
    26.      
    27.  
    28.         Services.Instance.UpdatingMobs.Add(this);
    29.  
    30.     }
    31.  
    32.  
    33.    public virtual void UpdateSpaceObject()
    34.     {
    35.      
    36.         currentPosition += movement;
    37.         if((currentPosition- Services.Instance.PlayerShip.PlayerPosition).magnitude < Services.Instance.Environment.FadeoutDistance)
    38.         {
    39.                 gameObject.SetActive(true);
    40.                 transform.position = (Vector3)(currentPosition - Services.Instance.PlayerShip.PlayerPosition);
    41.         }
    42.         else
    43.         {
    44.              
    45.                 gameObject.SetActive(false);
    46.         }
    47.      
    48.     }
    49. }
    The FadeoutDistance is set to 50,000 for me. I figured 50 km is more than enough, though you can obviously add more.

    movement is the velocity of the MobileObject.

    Now the PlayerShip basically works the same. Since we're using floating origin, the PlayerShip's position obviously won't get changed.
    Code (CSharp):
    1. public virtual void UpdatePlayerShip()
    2.     {
    3.      
    4.         currentPosition += movement;
    5.      
    6.     }
    I implemented this sort of floating origin since moving the world GameObject still caused floating-point errors at lunar distances.

    @cosmochristo Thanks for researching the collisions. I'll have a look and see if it works for me.
    Edit: Actually, why did you remove your video?
     
    Last edited: Apr 11, 2021
    Digika likes this.
  35. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @Omti1990 Re video: I was removing an older one, as there was a better updated version, and I made a mistake removing the wrong one, so I ended up taking out two. :( I updated the post with a valid link.
     
    Last edited: Apr 14, 2021
    Omti1990 likes this.
  36. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @Omti1990: when I originally used SetActive to turn sub-trees of objects off/on (per frame), it gave an improvement in performance. [BTW: I set the objects to off, then changed the transform, then set them back on - the opposite of what you are doing.] With the recent version of Unity I am using (2018.4.2x-2018.4.33) and my upgrade to a 2020 laptop, I have found this halves the frame rate. So now I only do it when I approach/leave an area, and that happens infrequently (e.g. not more than once a minute). I do not know the reason for the difference, but the old computer died, so I cannot go back and retest. So one has to test the frequency of SetActive use to determine whether it is worth it.

    Also, everything is floats at Solar-system scale.
     
  37. Omti1990

    Omti1990

    Joined:
    Jun 22, 2018
    Posts:
    9
    @cosmochristo, did you try firing some projectiles somewhere far from 0/0/0 of your environment transform? I mean, it's possible that I somehow screwed up the code back then, but even at lunar distances they went all over the place, while working perfectly fine near 0/0/0. I also didn't have any issues moving my spaceship around; it was only when several transforms moved that I got weird effects.

    About the SetActive command: yeah, that's probably less than efficient, actually. Thank you for pointing it out.

    I should probably put it after the movement and also check if it's active. Is there some serious performance gain from turning them off, moving them, and then turning them on again? A la this?

    Code (CSharp):
    1. public virtual void UpdateSpaceObject()
    2.     {
    3.    
    4.         currentPosition += movement;
    5.         if((currentPosition- Services.Instance.PlayerShip.PlayerPosition).magnitude < Services.Instance.Environment.FadeoutDistance)
    6.         {
    7.                 gameObject.SetActive(false);
    8.                 transform.position = (Vector3)(currentPosition - Services.Instance.PlayerShip.PlayerPosition);
    9.                 gameObject.SetActive(true);
    10.         }
    11.         else
    12.         {
    13.            
    14.                 gameObject.SetActive(false);
    15.         }
    16.    
    17.     }
    That said, what extra code did you use for the collisions? I was pretty much resigned to programming my own s***ty collision system. I did try restraining my spaceship on the x, y, z axes and enabling rotation, but the behaviour isn't really satisfying, since I don't know how to make Unity understand that the world is moving rather than my spaceship.
     
  38. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    If you give me some simple example code+model (can be via a private GitLab or whatever) then I'd be happy to do some tests and compare results. Logically, they will suffer jitter, but I think if missiles were fired from the Moon to Earth, for example, then they should behave fine. *However*, it is a very good question, because I think the way I would approach it, if there are distant jitter issues, is to calculate the path in a mathematical model as if fired from zero and displace it to its reference position (i.e. displacement every frame).

    ...
    The first option is what I was using on the old computer, where I measured roughly double the performance. The intention was to prevent Unity from generating useless messaging when the FO transform changes. Now I measure roughly half the performance (!) on new equipment and Unity versions.

    You can PM me about how I do it. I generally don't go into detail publicly until after I have published it somewhere.

    And from my experiments so far, the Unity physics-based collisions I am using are fine except when the speed is too great for the frame rate - in which case penetration of surfaces will occur anyway. Therefore, one either has to control speed very carefully or create custom additions that handle the higher speeds.
     
    Last edited: Apr 17, 2021
  39. brentskegg

    brentskegg

    Joined:
    Nov 21, 2017
    Posts:
    6
    This is somewhat OT, but how well does this work in a multiplayer environment where you have an authoritative server (which is sending position updates to clients)?

    Preferably I would like to use Unity physics on the server, but I'm not sure how to solve it: 32-bit floats prevent using a unified simulation for all players, and I don't think Unity physics has a way to simulate many "worlds" (one for each player, or something like that) out of the box.

    Probably a lot of other issues to solve too.
     
  40. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @brentskegg, this is a common question and is not off topic :)
    If I understand you correctly, there are two questions here: one about the floating-point precision used, and one about the compatibility of floating origin with multiplayer.

    1. Precision.
    My game/simulation pipeline designs do not suggest/require single precision everywhere - just at the sharp end: the high-performance, frame-by-frame rendering of player views. My first multiplayer was an experimental VRML world, and the server used doubles for map calculations but served the VRML as floats only.

    If I wrote a server now, it would also use double precision for maps/worlds, players, and objects. Double positions for clients are only converted to floats using a precision-preserving step: a double floating-origin subtraction is performed first. In other words, the server uses the position of each player as their floating-origin reference position. This can be done on the client or the server. If doubles go to the client programs, then there is an increase in bandwidth requirements. You have to decide on this early in your design. The client sends player position deltas back to the server, and the server updates the double positions.
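    The precision-preserving step could look something like this (a sketch assuming the Vector3d double-precision type mentioned earlier in the thread; ToClientSpace is an illustrative name, not a real API). The subtraction happens entirely in double precision, so only the small *relative* offset is narrowed to float, and no meaningful precision is lost in the cast:

```csharp
using UnityEngine;

// Sketch: convert an absolute double-precision server position into a
// float position relative to the player's floating-origin reference.
public static class ServerToClient
{
    public static Vector3 ToClientSpace(Vector3d objectWorldPos, Vector3d playerWorldPos)
    {
        // Double-precision subtraction first: the delta is small, so it
        // survives the narrowing cast intact.
        Vector3d relative = objectWorldPos - playerWorldPos;
        return new Vector3((float)relative.x, (float)relative.y, (float)relative.z);
    }
}
```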

    2. FO and multiplayer.
    All floating origin works with multiplayer, because each player's view of the world is independent of other players' views, and these views have no effect on other players or the world. Only interaction affects the world. Their relative positions are the same as if there were no floating origin. The difference is that CFO and RFO improve accuracy, scalability, and performance, and minimise jitter.

    Physics comment: I let Unity do as much of the physics as possible, on the client. For example, it generates the player collision vector for me. I just take the vector and push the world the other way. It works efficiently. The game Outer Wilds has been reported to do the same. I have not tried to do high-performance multiplayer interaction with this system - I think it is a difficult area no matter what system you use, because bandwidth and lag are difficult issues to solve well.
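    A sketch of the "push the world the other way" idea (my reconstruction for illustration, not the actual code; worldRoot is an assumed parent of all scene content): let Unity's solver displace the ship, then transfer that displacement, reversed, onto the world root so the ship returns to the origin while all relative positions are preserved.

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class ShipCollisionResponder : MonoBehaviour
{
    [SerializeField] Transform worldRoot; // assumed parent of all scene content

    // Runs before each physics step: any displacement the solver gave the
    // ship in the previous step (e.g. from a collision) is transferred to
    // the world in reverse, and the ship returns to the origin. Relative
    // positions are unchanged, so the collision response still "happens".
    void FixedUpdate()
    {
        Vector3 drift = transform.position;   // ship normally sits at (0,0,0)
        if (drift != Vector3.zero)
        {
            worldRoot.position -= drift;      // push the universe the other way
            transform.position = Vector3.zero;
        }
    }
}
```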
     
    Last edited: Jun 2, 2021
    Marcos-Elias likes this.
  41. brentskegg

    brentskegg

    Joined:
    Nov 21, 2017
    Posts:
    6
    Thanks for your answer!

    I'm experimenting with FO (periodic shifting right now) and multiplayer. I haven't got that far yet, but I see problems on the horizon, so to speak.

    I was hoping to run physics on the server and have it be authoritative, but I guess it can still be somewhat authoritative while the client runs its own physics.

    Basically I will try something like this:
    1. Server keeps track of clients in double precision and runs no (collision) physics
    2. Clients run physics, send position deltas to the server, and receive other clients' positions from the server

    The server can do sanity checks on the deltas. The client runs/displays graphics using FO.

    I will probably run into unforeseen problems :)
    But it's a start.
     
  42. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    I expect the server communications and tracking are pretty much independent of the navigation method. My main advice is that if you want a large, continuously navigable world, then don't write too much code before settling on the main nav method. Otherwise there may be a lot of redesign and recoding. I did submit a CFO asset to the store on the 17th of May, but there is no accept/reject yet.
     
  43. brentskegg

    brentskegg

    Joined:
    Nov 21, 2017
    Posts:
    6
    Yes, they'll be independent, but as the server keeps track of positions in double precision, some conversion is needed for the client, as you said.
     
  44. amirpaktin

    amirpaktin

    Joined:
    Jun 14, 2021
    Posts:
    19
    Hello, this is a PlayerMove script, but where is the view script? I could not see it.
     
  45. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    Last edited: Jul 18, 2021
    amirpaktin likes this.
  46. Gerigory

    Gerigory

    Joined:
    Jan 16, 2019
    Posts:
    3
    @cosmochristo Hi Chris, thanks for sharing. I'm currently looking for techniques to solve the floating-point precision problem, and your solution sheds some light on it. However, I still have some questions about the implementation details. To make them clear, I'd like to illustrate with an example scene:
    1. We have a scene with lots of objects; all the objects are attached to a root empty node (denoted as root for simplicity).
    2. At first, root & viewpoint are both at the origin (vec3(0, 0, 0)), one of the objects, A, has world coordinates vec3(10, 0, 0), and its coordinates relative to its parent (root) are vec3(10, 0, 0) too.
    3. When the camera or player moves forward along the X axis by about 5, the viewpoint's coordinates are still vec3(0, 0, 0) while the root's coordinates are vec3(-5, 0, 0); A's relative coordinates stay vec3(10, 0, 0), and its coordinates relative to the camera are vec3(5, 0, 0). All looks great at this point.
    4. Things start to change when the camera moves too far. Say we encounter precision problems at 4k (m), and the camera moves forward along the X axis by about 5k (m). What happens now? Do the root's coordinates go to vec3(-5k, 0, 0), and does this -5k value lead to a precision error? At the same time, what about another object whose coordinates relative to the camera are vec3(-500, 0, 0), which makes its coordinates relative to the root vec3(4.5k, 0, 0)? Will this 4.5k value lead to a precision error?

    I'm not sure if I've understood your point correctly; I'm just wondering how the objects' coordinates evolve, how to keep the root's coordinates from going beyond the precision range, and how to keep the coordinates of root's children relative to root from going beyond the precision range. Looking forward to your reply!
     
  47. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    @Gerigory Good question, and your reasoning is correct. What you seem to be describing is Distant Relative Jitter (DRJ) and, as you have worked out, CFO does not directly address it. This does not become an issue until about 50 km, by my tests. See:
    The_nature_of_distant_relative_jitter
    and:


    My solution, that plugs into CFO, is here: dynamic-resolution-spaces-198907
     
  48. Gerigory

    Gerigory

    Joined:
    Jan 16, 2019
    Posts:
    3
    Hi Chris, glad to receive your reply; that's exactly what bothers me. I've read your "Dynamic Resolution Space" document and got the information listed below:
    1. The root node uses floating point for far-distant coordinates.
    2. An object far from the root node but near the camera will experience "Distant Relative Jitter".
    3. What "Dynamic Resolution Space" does is detach those objects from the root node when the camera enters the space and attach them back when the camera leaves.

    After reading, I've got some new questions:
    1. Does the detach/attach mechanism use double coordinates from the object system?
    2. What happens when we need to reset the camera's position to the origin when in a dynamic resolution space? Should we shift the objects that are detached from the root node?
    3. Will it be a much more serious problem when the camera moves a much, much longer distance, like 1000 km?
    4. Why don't you reset the root node's world coordinates to the origin when it reaches a distance threshold (1 km, for example)?
    5. Will it lead to bad performance to shift all the rigidbodies' coordinates in the physics world in one frame when the root node's coordinates change?
     
  49. cosmochristo

    cosmochristo

    Joined:
    Sep 24, 2018
    Posts:
    224
    It can do so, but the main idea is that you don't need to, and I don't: I use floats everywhere.
    [QUOTE]
    2. What happens when we need to reset the camera's position to the origin when in a dynamic resolution space? Should we shift the objects that are detached from the root node?
    [/QUOTE]
    That doesn't happen. What you are describing is the threshold-based origin-shifting algorithm that is falsely named "floating origin". The Epic Unreal documentation is the only place (apart from my documents) that tells the truth: they describe the algorithm for what it does, calling it "origin rebasing", i.e. the same as what you are referring to.
    Real floating origin keeps the camera at the origin always (i.e. continuously).
    The same answer applies to all your other questions: you are assuming the algorithm and design are like the fake floating origin on the Unity wiki - they are not! Real continuous floating origin is more like the exact opposite :)
    Unfortunately, the wiki has misled thousands of developers down this blind alley, because it is falsely named and people think that my floating origin is the same!
    I have had to ask Unity Tech to correct or redo this wiki article.
     
  50. Gerigory

    Gerigory

    Joined:
    Jan 16, 2019
    Posts:
    3
    [QUOTE]
    That doesn't happen. What you are describing is the threshold-based origin-shifting algorithm that is falsely named "floating origin". The Epic Unreal documentation is the only place (apart from my documents) that tells the truth: they describe the algorithm for what it does, calling it "origin rebasing", i.e. the same as what you are referring to.
    Real floating origin keeps the camera at the origin always (i.e. continuously).

    Same answer to all these as before: you are assuming the algorithm and design are like the fake floating origin on the Unity wiki - they are not! Real continuous floating origin is more like the exact opposite :)
    Unfortunately, the wiki has misled thousands of developers down this blind alley, because it is falsely named and people think that my floating origin is the same!
    I have had to ask Unity Tech to correct or redo this wiki article.
    [/QUOTE]

    Ok, I see. But if we use CFO, we need to alter the whole scene's coordinates every frame, and we need to do this for the physics world too, which can't be done by transforming the root node only; instead we have to do the transform for all the objects. Won't that lead to bad performance?