Accurately calculating distance object is in front of background

Discussion in 'Shaders' started by a436t4ataf, Oct 14, 2019.

  1. a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    313
    Imagine your scene, with a large quad somewhere in front of the camera. On that quad, I'm trying to draw the distance that the background is behind the quad.

    I'm sure I'm doing something stupid, something so obviously simple that I keep making the same mistake and glossing over it when debugging.

    I tried:
    1. Implement GrabPass (apparently not supported any more because of SRPs, but I want to get it working with this first, and worry about how to write hacks for SRPs later)
    2. Read the _CameraDepthTexture:
    * Vert: o.savedClip = UnityObjectToClipPos( input.vertex );
    * Frag: screenUV = (clipPosition.xy / clipPosition.w) * 0.5f + 0.5f;
    * Frag: depth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, screenUV));

    ...this gives an approximately-correct view, but draws the DEPTH not the DISTANCE. i.e. at the center of the screen, it's correct, but the further you move towards the edges of the screen, the more incorrect it becomes. (As I understand it, Unity's depth texture stores view-space z for each pixel -- the distance along the camera's forward axis -- rather than the straight-line distance to the pixel?)
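
    For reference, here's that depth-texture read consolidated into a minimal vert/frag pair (a sketch, assuming the built-in render pipeline; ComputeScreenPos is Unity's helper that folds the *0.5+0.5 remap, plus the platform y-flip, into homogeneous coordinates):

    struct v2f
    {
        float4 pos       : SV_POSITION;
        float4 screenPos : TEXCOORD0;
    };

    sampler2D_float _CameraDepthTexture;

    v2f vert( appdata_base input )
    {
        v2f o;
        o.pos = UnityObjectToClipPos( input.vertex );
        o.screenPos = ComputeScreenPos( o.pos ); // homogeneous; divide by w in frag
        return o;
    }

    fixed4 frag( v2f IN ) : SV_Target
    {
        float2 screenUV = IN.screenPos.xy / IN.screenPos.w;
        // LinearEyeDepth converts the raw depth-buffer value to view-space z:
        // distance along the camera's forward axis, not to the pixel itself
        float depth = LinearEyeDepth( SAMPLE_DEPTH_TEXTURE( _CameraDepthTexture, screenUV ) );
        return depth; // visualise raw eye depth
    }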

    So then I tried:
    1. As above, but multiply depth by the length of the viewspace vector (an idea from @bgolus in a different thread - https://forum.unity.com/threads/optimization-reconstructing-distance-along-ray-from-depth.386511/ - which seems to me an obviously good way of doing it, and ought to work):
    * Vert: o.viewSpaceViewDirection = mul( UNITY_MATRIX_MV, input.vertex );
    * Frag: depth *= length( IN.viewSpaceViewDirection / IN.viewSpaceViewDirection.z );

    ... which seems to be more accurate, although it still changes a bit strangely as you rotate the camera, especially up/down. BUT this is measuring the distance from the camera to the background, not the distance from the QUAD to the background.
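
    Written out in full, that correction looks like this (a sketch; viewSpaceViewDirection as named above). Dividing the view-space position by its own z gives a ray whose z component is 1, so its length is exactly the factor that converts depth-along-the-forward-axis into distance-along-the-view-ray:

    // Vert: the view-space position of the quad vertex doubles as the view ray
    o.viewSpaceViewDirection = mul( UNITY_MATRIX_MV, input.vertex ).xyz;

    // Frag: scale axis-aligned depth into Euclidean camera-to-background distance
    float3 ray = IN.viewSpaceViewDirection / IN.viewSpaceViewDirection.z;
    float distanceToBackground = depth * length( ray );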

    I have tried many different ways of subtracting the distance to the quad, and nothing works. The main ways I expected to get distance to quad that's correct:

    A: use the same "multiply by viewSpace/viewSpace.z" trick as above
    B: use screenPosition.w (with or without the multiplication from A above)
    C: use UNITY_Z_0_FAR_FROM_CLIPSPACE( clipPosition ) // tried screenPosition too, because some example shaders use that instead
    D: use length( _WorldSpaceCameraPos - IN.worldSpacePos ) -- where:
    worldSpacePos = mul( unity_ObjectToWorld, input.vertex );

    ...etc. Nothing gives correct results.
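
    For concreteness, the combination of A and D that I keep expecting to work (but which doesn't, for me) looks roughly like this: since the quad fragment and the background pixel sit on the same camera ray, subtracting the two camera distances should give the quad-to-background separation along that ray:

    // Frag (viewSpaceViewDirection = view-space position of this fragment, as above):
    float3 ray = IN.viewSpaceViewDirection / IN.viewSpaceViewDirection.z;
    float bgDistance   = LinearEyeDepth( SAMPLE_DEPTH_TEXTURE( _CameraDepthTexture, screenUV ) ) * length( ray );
    float quadDistance = length( IN.viewSpaceViewDirection ); // camera to this fragment
    float separation   = bgDistance - quadDistance;           // quad to background, along the ray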

    Expected results:
    - moving camera forwards/backwards should have no effect: the distance from quad-to-background hasn't changed!
    - tilting camera up/down should only make a difference to distances in the vertical plane, but instead it changes distances in the horizontal direction too
    - panning camera left/right should only make a difference to distances horizontally, but instead changes distances vertically too
     
    Last edited: Oct 14, 2019