What coordinate space is the fragment shader's input in?

Discussion in 'Shaders' started by GreatWall, Mar 15, 2018.

  1. GreatWall

    GreatWall

    Joined:
    Aug 7, 2014
    Posts:
    57
    I use "i.position = UnityObjectToClipPos(v.position);" to convert a point from model space to clip space. As I understand it, in clip space x, y, z are all in [-w, w], and points whose x, y, z fall outside [-w, w] are clipped. My question is: in the fragment shader, what space is "i.position" in?
    Shader "Unlit/CustomShader"
    {
        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex MyVertexProgram
                #pragma fragment MyFragmentProgram
                #include "UnityCG.cginc"

                struct VertexData {
                    float4 position : POSITION;
                };

                struct Interpolators {
                    float4 position : SV_POSITION;
                };

                Interpolators MyVertexProgram (VertexData v) {
                    Interpolators i;
                    i.position = UnityObjectToClipPos(v.position);
                    return i;
                }

                float4 MyFragmentProgram (Interpolators i) : SV_TARGET {
                    return i.position;
                }
                ENDCG
            }
        }
    }
    The result is that the object becomes yellow when the camera moves too close to it (as in figure 1). That means x and y are always larger than 1.
    Then I replaced the return value with "return i.position / i.position.w". I thought this would convert the coordinate to NDC space, where components lie in [-1, 1], but the result is still yellow (as in figure 2). That means x and y are still always larger than 1.
    So I am confused. I thought the left half of the sphere would be slightly blue, since x < 0 there.
     

    Attached Files:

  2. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,546
  3. GreatWall

    GreatWall

    Joined:
    Aug 7, 2014
    Posts:
    57
    I'm wondering about the fragment shader's input. Is it in clip space or NDC?
     
  4. GreatWall

    GreatWall

    Joined:
    Aug 7, 2014
    Posts:
    57
    The result pictures show that it is in neither clip space nor NDC. Or did I make a mistake somewhere?
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    The XY values of SV_Position in the fragment shader are the pixel coordinates (0 to the resolution), Z is depth (0.0 to 1.0), and W is 1.0, the same as VPOS (or gl_FragCoord in GLSL).
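    To visualize that, here is a minimal sketch of a fragment shader that maps those pixel coordinates back to a 0-to-1 screen gradient, assuming Unity's built-in _ScreenParams variable (its xy holds the render target resolution in pixels):

        float4 MyFragmentProgram (Interpolators i) : SV_TARGET {
            // SV_Position.xy arrives in pixels here; dividing by the
            // screen resolution remaps it to the [0,1] range.
            float2 screenUV = i.position.xy / _ScreenParams.xy;
            return float4(screenUV, 0, 1);
        }

    With this version the object should show a red/green gradient across the screen instead of being uniformly yellow.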
     
  6. GreatWall

    GreatWall

    Joined:
    Aug 7, 2014
    Posts:
    57
    As I understand it, on OpenGL the x coordinate goes from 0 to 1 left to right, and the y coordinate goes from 0 to 1 bottom to top.
    If so, why is the color yellow? Yellow is (1,1,0), which means x and y are both 1. The following figure uses an orthographic camera whose frustum is a cube the same size as the object. If the frag shader's input i.position is in pixel coordinates, then the cube's left half should be a color whose r is small, such as 0. But it is yellow. @bgolus
    I am confused.
     

    Attached Files:

  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    The bottom left pixel is 0,0, the top right pixel is screen resolution XY - 1. So if you're rendering at 640x480 the bottom left is 0,0 and the top right is 639,479. It's yellow because the X and Y are 1 or greater for almost every pixel on screen.

    Between the output of the vertex shader and the input of the fragment shader that SV_Position value is used for rasterization. It starts as homogeneous clip space when output from the vertex shader, and is transformed into NDC and ultimately viewport space / pixel positions before getting to the fragment shader. If you want the NDC in the fragment you have to pass the clip space position as an additional semantic (like TEXCOORD#) and do the perspective divide on your own.
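    A minimal sketch of that approach, building on the shader above (the clipPos field name is my own, and TEXCOORD0 is just an unused interpolator semantic):

        struct Interpolators {
            float4 position : SV_POSITION;
            float4 clipPos : TEXCOORD0; // untouched copy of clip space
        };

        Interpolators MyVertexProgram (VertexData v) {
            Interpolators i;
            i.position = UnityObjectToClipPos(v.position);
            i.clipPos = i.position; // pass clip space along unmodified
            return i;
        }

        float4 MyFragmentProgram (Interpolators i) : SV_TARGET {
            // Perspective divide in the fragment shader gives NDC,
            // where x, y, z are in [-1, 1] for visible points.
            float3 ndc = i.clipPos.xyz / i.clipPos.w;
            return float4(ndc, 1);
        }

    Because clipPos uses a TEXCOORD semantic, the rasterizer interpolates it without the viewport transform that SV_Position goes through, so the per-pixel divide recovers the NDC values you expected.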

    https://forum.unity.com/threads/wha...shader-input-sv_position.520499/#post-3413867
     
  8. GreatWall

    GreatWall

    Joined:
    Aug 7, 2014
    Posts:
    57
    @bgolus Thanks a lot. I'm not confused anymore. Thank you!