
Obtaining screen-space texel size?

Discussion in 'Shaders' started by FerrousBueller, Nov 1, 2015.

  1. FerrousBueller


    Joined: Oct 31, 2015
    Posts: 3
    Hi everyone!

    I'm new to Unity and Cg shaders but I have a small amount of experience writing HLSL shaders for XNA.

    I'm trying to produce a gravitational lensing effect for a sphere in space. Basically, I have a textured quad at the same location as the sphere. This quad always points toward the camera. I am using Unity's example GlassStainedBumpDistort shader as a starting point for my distortion shader. This shader is applied to the quad with the following normal map:

    Wormhole_normal_swapped_smooth_center.png

    The sphere would be located in the centre of the quad. This produces a nice-looking distortion of the scene behind the quad, which resembles "gravitational lensing" around the sphere, as shown here:
    Zoomed_out.png

    Here is the relevant part of the shader which distorts the background:

    Code (HLSL):
    sampler2D _GrabTexture;
    float4 _GrabTexture_TexelSize;
    sampler2D _BumpMap;
    sampler2D _MainTex;
    float4 _MainTex_TexelSize;
    float _BumpAmt;

    half4 frag (v2f i) : SV_Target
    {
        // calculate perturbed coordinates
        half2 bump = UnpackNormal(tex2D(_BumpMap, i.uvbump)).rg;
        float2 offset = bump * _BumpAmt * float2(_MainTex_TexelSize.x, -_MainTex_TexelSize.y) * 500;
        i.uvgrab.xy = offset + i.uvgrab.xy;

        half4 col = tex2Dproj(_GrabTexture, UNITY_PROJ_COORD(i.uvgrab));

        UNITY_APPLY_FOG(i.fogCoord, col);
        return col;
    }

    As I understand it, the shader is computing an offset vector (x,y) based on the (Red,Green) channels of the quad's normal map. Then the current state of the framebuffer is sampled at the current pixel coordinate plus this offset vector.

    However, the problem I am having is that this offset vector does not scale with the size of the quad as projected onto the screen. For example, if I lower the camera's FOV to zoom in on the sphere, I expect the distortion effect to scale exactly with the sphere:

    Zoomed_in_correct.png

    However, what actually renders is the following:

    Zoomed_in_wrong.png

    I think in order to solve this issue, in the shader I need to be able to get the size of the sampled normal map texel, in screen space. I.e. how many screen pixels wide is the current texel. Then I can scale the offset vector by this texel size. E.g. in the zoomed-out case above, each normal map texel is 1 screen pixel wide whereas in the zoomed-in case, each normal map texel is 3.5 screen pixels wide. So I should scale the offset by a factor of 3.5. How can I access this scaling factor in a fragment shader?

    I have tried using _MainTex_TexelSize but that just returns the size of a texel in texture space (e.g. for a 2048x2048 texture, _MainTex_TexelSize.xy is 1/2048). What I want is the size of a texel in screen space.

    Any help would be appreciated. Thanks in advance!
     
    Last edited: Nov 1, 2015
    lvvova likes this.
  2. bgolus


    Joined: Dec 7, 2012
    Posts: 12,248
    The easiest way to handle this is with ddx and ddy. ddx(uv.xy * tex_TexelSize.zw) and ddy(uv.xy * tex_TexelSize.zw) tell you how much the texel-scaled UVs change from one screen pixel to the next along x and y. I.e. they get you the screen space texel size you're trying to find.
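    As a sketch against the shader in the question (the variable names and the choice of combining both derivatives are illustrative, not the only way to do it):

    ```hlsl
    // Sketch: screen-space texel size via derivatives.
    // uv * _TexelSize.zw converts the UV into texel units; ddx/ddy then
    // give texels-per-screen-pixel, so the reciprocal is the number of
    // screen pixels each texel covers.
    float2 texelUV = i.uvbump.xy * _MainTex_TexelSize.zw;
    float texelsPerPixel = max(length(ddx(texelUV)), length(ddy(texelUV)));
    float pixelsPerTexel = 1.0 / texelsPerPixel; // scale factor for the offset
    ```

    For a square, camera-facing quad the x and y derivatives have essentially the same magnitude, so either one alone would do.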
     
    kotoezh, lvvova and FerrousBueller like this.
  3. FerrousBueller


    Joined: Oct 31, 2015
    Posts: 3
    Thanks! That works. I didn't know you could use ddx and ddy to find the derivatives of the coordinates themselves. I thought ddx and ddy could only calculate the derivatives of color values. Now I know.

    For those interested, here is my fixed code:

    Code (HLSL):
    sampler2D _GrabTexture;
    float4 _GrabTexture_TexelSize;
    sampler2D _BumpMap;
    float _BumpAmt;

    half4 frag (v2f i) : SV_Target
    {
        // calculate normal map's texel size in screen space
        float scl = 1.0 / abs(ddy(i.uvbump.y));

        // calculate perturbed coordinates
        half2 bump = UnpackNormal(tex2D(_BumpMap, i.uvbump)).rg;
        float2 offset = bump * _BumpAmt * _GrabTexture_TexelSize.xy * scl;
        i.uvgrab.xy = offset + i.uvgrab.xy;

        half4 col = tex2Dproj(_GrabTexture, UNITY_PROJ_COORD(i.uvgrab));

        UNITY_APPLY_FOG(i.fogCoord, col);
        return col;
    }
    Since this shader is applied to a camera-facing square quad, the texels project to squares on screen (their x and y screen sizes are equal). So I only need to calculate one dimension; I chose ddy to get the y size.
     
    lvvova likes this.
  4. bgolus


    Joined: Dec 7, 2012
    Posts: 12,248
    One minor optimization: UnpackNormal

    If you're using DXT5nm (the default compressed format for normal maps in Unity on non-mobile platforms), then you're paying the cost of reconstructing the z component ... which you don't use. You're better off doing:

    tex2D(_BumpMap, i.uvbump).ag * 2.0 - 1.0;

    If you're not using the default compressed normals, UnpackNormal just does the * 2.0 - 1.0 anyway, so you're fine.
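    In context, the sampling line in the fixed shader above could become (a sketch, assuming the DXT5nm default):

    ```hlsl
    // DXT5nm packs the normal's x into alpha and y into green; sampling
    // .ag and remapping from [0,1] to [-1,1] skips UnpackNormal's z
    // reconstruction, which this shader never uses.
    half2 bump = tex2D(_BumpMap, i.uvbump).ag * 2.0 - 1.0;
    ```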
     
    lvvova likes this.
  5. FerrousBueller


    Joined: Oct 31, 2015
    Posts: 3