
Game Object Distortion at Small Scales

Discussion in 'General Graphics' started by btschumy, Jul 30, 2021.

  1. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    I have an astronomy visualization app. It allows you to position the camera relative to various game objects. The distance to the object can vary widely (astronomical scales), anything from 0.1 to (say) 500,000,000.

    With many objects, such as stars, I scale the GO transform as the camera zooms in or out. This is because I want the object to still be visible at large distances, but as the camera approaches I scale the transform down toward the object's actual size.

    It is working well in general but when I get very close to some objects they become distorted. You can see this in the attached picture. In this example I am scaling the "star" by 0.03. Note that the Sun label is also distorted and it is being scaled by a similar small amount.

    I am unsure how to fix this. The five-pointed star representing the Sun is a Quad using a 128x128 image. Increasing the size of the image doesn't seem to help.

    I'm a bit lost on what to do here. Any advice would be greatly appreciated.

    Distortion.jpg
     
  2. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,355
    You're probably encountering problems with floating-point precision. If you want to model something like this you'll probably need to virtualize your scale somehow.
     
  3. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    Thanks for the response.

    I think the crux of the problem may be that the local scale is applied to the object *before* it is scaled based upon the distance to the camera and the camera FOV. If I take a 128x128 image and scale it by (say) 0.03, then of course the image will be distorted, since you really have 128 * 0.03 = 3.84 pixels to work with in a given dimension.

    I don't know the finer points of Unity (heck, I barely know the coarse points), but is there any way to change the order of operations in the pipeline so the distance-based scaling happens before the localScale is applied? That might solve it.

    Changing my scene's scale as I zoom in will probably be difficult to do but I can explore it if that is my only option.
     
  4. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,355
    What I mean by virtualizing your scale: people typically have trouble with anything greater than 10,000 in any of the position components. Unity uses 32-bit floating-point numbers in its Vector3 format. There's only so much space for number representation, so the higher the magnitude, the coarser the representable values become.
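That granularity is easy to measure outside Unity. Here is a minimal Python sketch (the helper name `f32_ulp` is made up for this example) that computes the gap between adjacent representable 32-bit floats, the same IEEE-754 format a Vector3 component uses:

```python
import struct

def f32_ulp(x: float) -> float:
    """Gap between x and the next representable 32-bit float above it."""
    # Round x to its nearest float32 value first.
    xf = struct.unpack('<f', struct.pack('<f', x))[0]
    bits = struct.unpack('<I', struct.pack('<f', xf))[0]
    nxt = struct.unpack('<f', struct.pack('<I', bits + 1))[0]
    return nxt - xf

print(f32_ulp(10_000.0))       # 0.0009765625
print(f32_ulp(500_000_000.0))  # 32.0
```

So around a coordinate of 10,000 positions can only change in steps of about 0.001 units, but out at 500,000,000 the steps are 32 units wide — easily enough to visibly distort or jitter small geometry.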

    What some people do in space simulations is change how the scale is represented depending on how far back the camera is pulled. So you might have to change it from 1 unit = 1 light year to 1 unit = 1000 light years, or 1 unit = 1,000,000, or whatever. Rather than trying to construct your galaxy in game objects at a realistic scale, you should probably think more towards using them to construct just a "visualization".
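As a sketch of that idea (the function name and the target range are assumptions for illustration, not Unity API): derive the light-years-per-scene-unit factor from the current camera distance, so the distance always lands in a range where float32 is comfortably precise.

```python
import math

def pick_scale(dist_ly: float) -> float:
    """Hypothetical helper: choose light-years per scene unit so that the
    camera-to-target distance always maps into [10, 100) scene units."""
    exponent = math.floor(math.log10(max(dist_ly, 1e-6)))
    return 10.0 ** (exponent - 1)

# Example: Alpha Centauri vs. a distant galaxy cluster.
for d in (4.3, 500_000_000.0):
    print(d, "ly ->", d / pick_scale(d), "scene units")
```

Whenever the camera retargets or crosses an order of magnitude, you would divide all positions by the new factor, so nothing in the scene ever sits at a huge coordinate.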
     
  5. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    Thanks for the further explanation.

    Initially I did do something like this. I used 1 unit = 1000 light-years. This worked well in most cases, but I found that in order to zoom in close to (say) the Sun, I needed to have the camera's nearClipPlane be 0.001. I found that this caused some strange behaviors: labels would be clipped out well before the objects that were being labeled. It seems like keeping the nearClipPlane around 0.01 or greater prevented this.

    This led me down the path of trying a 1:1 representation of scene units to light-years. Doing so allowed me to get close to objects and have the labels clip at the same point as the objects being labeled. However, I then noticed the distortion of things when getting close.

    Currently I am using:

    nearClipPlane = 0.1
    farClipPlane = 1E+09

    Is that a reasonable range? I do need to be able to zoom out to see galaxy clusters at a camera distance of around 500,000,000 light-years.

    Is the solution to change the way things are scaled when zoomed out as opposed to being zoomed in?
     
  6. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,355
    Right, neither one of those scales is going to work in every situation.
    What I'm saying is- don't try to choose the perfect scale because it's impossible for something this size. You need to dynamically change the scale to suit your situation.

    Nowhere close. I believe the depth buffer is 16 bits. Typically people use something more like 0.1 to 1000. You could try it, but if you have any overlapping objects then you'll probably get z-fighting or other rendering problems.
     
  7. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    Thanks for sticking with me on this.

    I can't see that 0.1 to 1000 could possibly work for me. I need to be able to be within a few light years of the Sun with stars several light-years away visible, yet still see galaxies 1 million light years away in the distance.

    Even though my clip range is huge, I really haven't seen any problems except for the distortion when getting close to an object (say, within a light-year). I developed this app first on Apple's SceneKit, then on Urho3D (which I abandoned), and now finally on Unity (because I wanted it to be cross-platform). In all cases I have used a large clip range.

    Right now all the top-level objects are direct children of the main scene. Although it would take several days of rework, I could define a Universe GO that is a child of the scene, alongside the camera. The Universe could then be scaled as needed depending on how near or far I am from things. I'm having a hard time thinking through whether this would give me more flexibility.
     
  8. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,355
    So if it's only a problem when you're close, maybe try with both a near and a far camera. Set up two cameras, with one's clip planes at 0.1-1000 (that's the near camera) and the other starting at 1000 and covering the rest of the range. Use the "depth" field on each camera to ensure that the far camera renders first, and then set the "clear flags" of the near camera so that it doesn't erase everything the far camera rendered. You should see the results of both cameras on the screen at once.

    I guess you could make both of these cameras children of the same object so that their positions are always in sync.
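A minimal C# sketch of that rig (the component and field names here are hypothetical; `depth`, `clearFlags`, and the clip-plane properties are standard Unity `Camera` members):

```csharp
using UnityEngine;

// Attach to a parent object holding both cameras so they stay in sync.
public class LayeredCameraRig : MonoBehaviour
{
    public Camera farCamera;   // covers 1000 .. 1e9
    public Camera nearCamera;  // covers 0.1 .. 1000

    void Start()
    {
        farCamera.nearClipPlane = 1000f;
        farCamera.farClipPlane  = 1e9f;
        farCamera.depth         = 0f;   // lower depth renders first

        nearCamera.nearClipPlane = 0.1f;
        nearCamera.farClipPlane  = 1000f;
        nearCamera.depth         = 1f;  // renders on top of the far camera
        // Clear only the depth buffer between passes, so the near camera
        // draws over the far camera's image instead of wiping it.
        nearCamera.clearFlags = CameraClearFlags.Depth;
    }
}
```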
     
  9. runner78

    runner78

    Joined:
    Mar 14, 2015
    Posts:
    760
    You can also "fake" the zoom in. When you get closer to an object, switch to an alternative local space (one local space for far-away objects and one local space for the close object). Then you scale the object up as you get closer.
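A rough sketch of that switch in Python (the function name, the 1-light-year threshold, and the billboard scale are all invented for illustration):

```python
def proxy_scale(dist_ly: float, switch_at: float = 1.0,
                far_scale: float = 0.001) -> tuple:
    """Hypothetical two-space zoom: return (space_name, local_scale).

    Beyond switch_at light-years the object lives in the 'far' space as a
    tiny billboard; inside it we switch to a 'near' space and interpolate
    the scale up toward 1.0 (true size) as the camera approaches."""
    if dist_ly >= switch_at:
        return "far", far_scale
    t = dist_ly / switch_at  # 1.0 at the boundary, 0.0 at contact
    return "near", 1.0 - t * (1.0 - far_scale)

print(proxy_scale(2.0))   # ('far', 0.001)
print(proxy_scale(0.0))   # ('near', 1.0)
```

The scale is continuous across the boundary, so the hand-off between the two spaces is invisible to the viewer.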
     
  10. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    Yes, this is what I suspect I will need to do. Unfortunately, it will take significant reworking of the app (I think). I will try to tackle it in a few days.
     
  11. btschumy

    btschumy

    Joined:
    Jul 31, 2019
    Posts:
    91
    I don't think that two cameras will help. It is not so much the near/far range that is the problem. My main issue is the fact that when getting close to objects, they get distorted because I set a small local scale on them. As I responded to runner78, I think the solution will be to change the light-years/unit scale as I get close.