
Question: per-object screen space uv issue

Discussion in 'Shaders' started by MaT227, Jun 2, 2020.

  1. MaT227

    Joined:
    Jul 3, 2012
    Posts:
    628
    Hey everyone,

    I am currently trying to sample a texture in screen space. This works well:
    Code (CSharp):
    // Perspective divide, then convert to [0,1] screen UVs.
    float4 positionCS = vertexInput.positionCS / vertexInput.positionCS.w;
    screenPos = ComputeScreenPos(positionCS).xy;
    // Correct for the aspect ratio so the texture isn't stretched horizontally.
    float aspect = _ScreenParams.x / _ScreenParams.y;
    screenPos.x = screenPos.x * aspect;
    But I would like to be able to constrain the UV position and scale based on the object's position and distance from the camera. I found some examples, but I'm facing some issues that I don't see how to fix for the moment. Here's the code:
    Code (CSharp):
    float4 positionCS = vertexInput.positionCS / vertexInput.positionCS.w;
    screenPos = ComputeScreenPos(positionCS).xy;
    float aspect = _ScreenParams.x / _ScreenParams.y;
    screenPos.x = screenPos.x * aspect;
     
    // Recenter the UVs on the object's origin in screen space.
    float4 originCS = TransformObjectToHClip(float3(0.0, 0.0, 0.0));
    originCS = originCS / originCS.w;
    float2 originSPos = ComputeScreenPos(originCS).xy;
    originSPos.x = originSPos.x * aspect;
    screenPos = screenPos - originSPos;
     
    // You can match the object's distance like this
    float3 cameraPosWS = GetCameraPositionWS();
    float3 originPosWS = TransformObjectToWorld(float3(0.0, 0.0, 0.0));
    float d = length(cameraPosWS - originPosWS);
    screenPos *= d;
    And here's the issue I am facing: when the object is near the screen edges, the texture starts to move. Is there a way to avoid that?



    I am using URP, but this doesn't really matter.
     
  2. MaT227

    Joined:
    Jul 3, 2012
    Posts:
    628
    This seems to be related to the FOV and its associated distortion, but I don't see a way to get rid of it for the moment.
     
  3. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    There's not really a solution for this artifact. Because of the perspective projection, as objects get further toward the sides you can see more of the back side of the object, and in screen space the object gets stretched out, so the screen space distance between the object's center point and its furthest extents increases.
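
    A quick back-of-the-envelope sketch of that stretching (my notation, not from the thread): a pinhole camera maps a point at angle \(\varphi\) off the view axis to the image-plane coordinate

    \[
    x = \tan\varphi, \qquad \frac{dx}{d\varphi} = \frac{1}{\cos^{2}\varphi}
    \]

    so the same angular width covers \(1/\cos^{2}\varphi\) times more screen distance at angle \(\varphi\) than at the center. At the edge of a 140 degree FOV (\(\varphi = 70^\circ\)) that factor is roughly \(1/0.342^{2} \approx 8.5\).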

    Extreme example with a 140 degree fov.
    [Image: upload_2020-6-2_14-19-38.png]

    But... the code you have above is also slightly wrong, so it's worse than it should be! You don't want to be multiplying the screen position by the distance, you want to multiply it by the depth. The easiest way to get that is to transform your object's center world space position into view space and use the view space -z. It's negated because on the GPU, view space is -Z forward, so -viewPos.z will get you a positive value for things in front of the camera. You could also use abs(viewPos.z).

    *edit: the depth is also the originCS.w in your example (before you divide by w)! That'll work better, since it's also correct for orthographic views where you don't want to scale by the depth (originCS.w is 1.0 in that case).
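
    A minimal sketch of that correction, keeping the setup from the original post (URP's ComputeScreenPos / TransformObjectToHClip; variable names are mine):
    Code (CSharp):
    float4 positionCS = vertexInput.positionCS / vertexInput.positionCS.w;
    float2 screenPos = ComputeScreenPos(positionCS).xy;
    float aspect = _ScreenParams.x / _ScreenParams.y;
    screenPos.x *= aspect;
     
    // The clip space w of the object's origin is its view space depth
    // (and 1.0 for orthographic projections, where no scaling should happen).
    float4 originCS = TransformObjectToHClip(float3(0.0, 0.0, 0.0));
    float depth = originCS.w;
    originCS /= originCS.w;
    float2 originSPos = ComputeScreenPos(originCS).xy;
    originSPos.x *= aspect;
     
    // Recenter on the object and scale by depth, not by distance.
    screenPos = (screenPos - originSPos) * depth;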

    Here's an example of the same setup using the distance like your shader code rather than the depth.
    [Image: upload_2020-6-2_14-22-4.png]
    Notice how the screen space grid is even changing scale at the corners! This is what you're seeing, so it's doubly bad.
     
    Last edited: Jun 2, 2020
  4. MaT227

    Joined:
    Jul 3, 2012
    Posts:
    628
  5. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    Won't help here at all, not by itself at least.

    That's applying a distortion to the final rendered image to get something that "feels" less distorted for a static image. It can also make people sick in motion. Lots of games already do this to some subtle degree to get a specific visual style, and all VR rendering does something like this to correct for the distortion the physical lenses in the headsets introduce (it also reduces bandwidth requirements for the display).

    When you're computing the screen space position, it's being calculated using the original linear projection / pinhole camera model that all modern GPUs render with.

    If you use both, all it means is you get a distorted screen space texture. This is a bad example because the math is wrong, but it gives you an idea of the distortion you'd see. This is just taking the above image and doing a Photoshop spherize on a larger square canvas.
    [Image: upload_2020-6-2_16-40-8.png]
    The spheres now remain circles on screen, but see how the screen space texture starts to bend?

    The solution to this is to do the "screen space" texturing in some other space, like view direction or spherical space, but there are lots of problems there too.

    The easiest option is to do something akin to camera facing UVs, where the vector from the camera position to the object's center determines the "screen space" UVs, as in the sketch below. But those distort like crazy unless you're using a fish eye or barrel distortion post process.
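
    For reference, a minimal sketch of what those camera facing UVs could look like (my names, not from the thread; positionWS is assumed to be the fragment's world space position):
    Code (CSharp):
    // Build a basis oriented from the camera toward the object's origin.
    float3 originWS = TransformObjectToWorld(float3(0.0, 0.0, 0.0));
    float3 toObject = normalize(originWS - GetCameraPositionWS());
    // Assumes toObject is never parallel to world up; use a different
    // reference axis if the object can pass directly above the camera.
    float3 rightDir = normalize(cross(float3(0.0, 1.0, 0.0), toObject));
    float3 upDir = cross(toObject, rightDir);
     
    // Project the offset from the object's origin onto that basis.
    // The UVs are in world units, so the texture sticks to the object
    // instead of sliding as the object nears the screen edges.
    float3 offsetWS = positionWS - originWS;
    float2 facingUV = float2(dot(offsetWS, rightDir), dot(offsetWS, upDir));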
     
  6. MaT227

    Joined:
    Jul 3, 2012
    Posts:
    628
    Thanks again @bgolus for the detailed answer. I think the artifact isn't visible enough to be worth pushing things forward and trying to fix, at least for now.