
Question: How to get pixel normal?

Discussion in 'Shaders' started by Zimaell, Sep 14, 2023.

  1. Zimaell

    Joined:
    May 7, 2020
    Posts:
    337
    Is it even possible to get a per-pixel normal?

    I'm interested in a specific pixel on a terrain, namely what kind of surface it lies on: is it flat or inclined?

    Is it possible to get such a normal, and if so, how?

    I'm manipulating the tiling in the fragment part, but I need the normal to get the intended result...
     
  2. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,221
    The shader needs to pass the world space vertex normal from the vertex function to the fragment function. The value the fragment shader receives is then that pixel's normal. If you're doing any kind of lighting, you already need to be calculating that normal, so it's hopefully either already being calculated in the vertex shader, or already being passed to the fragment shader.
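    For reference, here's a minimal sketch of what that can look like in a built-in render pipeline vertex/fragment shader. The struct and variable names are illustrative, not taken from this thread, and it assumes UnityCG.cginc is included:
    Code (csharp):
    struct v2f
    {
        float4 pos : SV_POSITION;
        float3 worldNormal : TEXCOORD0;
        float3 worldPos : TEXCOORD1;
    };

    v2f vert (appdata_full v)
    {
        v2f o;
        o.pos = UnityObjectToClipPos(v.vertex);
        // rotate the object space normal into world space
        o.worldNormal = UnityObjectToWorldNormal(v.normal);
        // world position, useful later for the derivative based normal
        o.worldPos = mul(unity_ObjectToWorld, float4(v.vertex.xyz, 1.0)).xyz;
        return o;
    }

    half4 frag (v2f i) : SV_Target
    {
        // re-normalize, since interpolation can shorten the vector between vertices
        float3 pixelNormal = normalize(i.worldNormal);
        // visualize the normal as a color to sanity check it
        return half4(pixelNormal * 0.5 + 0.5, 1.0);
    }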
     
    Zimaell likes this.
  3. Zimaell

    Joined:
    May 7, 2020
    Posts:
    337
    The screenshots show what the problem is: each vertex has its own normal, but the pixels can face in other directions, and because of that the visual result is wrong. You can see it in the screenshots...
     

    Attached Files:

  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,221
    You want the actual geometry normal, not the interpolated vertex normal. A "pixel" has no normal of its own, since it's a singular point in space; the only normal it has is the one it gets from the geometry being rendered.

    If you want the geometry normal, yes, you can get that using screen space partial pixel derivatives.

    The very short explanation is that the partial pixel derivative functions tell you how much a value changes between the current pixel and the pixel beside it, either vertically (ddy()) or horizontally (ddx()). You can use them on the interpolated world position to get two vectors that lie along the geometry's surface, and the cross product of those two vectors gives you the geometry surface normal.
    Code (csharp):
    half3 worldNormal = normalize(cross(ddy(i.worldPos), ddx(i.worldPos)));
    This does have one issue: float precision. The precision of the interpolated world position affects the accuracy of the calculated normal, and as you get farther from the world origin, or very close to a surface, the loss of precision adds a lot of noise.

    You can significantly improve this by passing the camera relative world position from the vertex shader to the fragment shader. Literally just subtract the camera's world position from the actual world position in the vertex function and output that, then run the above code on that interpolated value in the fragment shader. If you don't want to pass two different world positions, you can add the camera position back in the fragment shader.

    You can't just subtract the camera position from the interpolated world position in the fragment shader, because the precision was already lost by the time the value was interpolated.
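    Here's a sketch of that camera relative approach, assuming a built-in render pipeline shader where unity_ObjectToWorld and _WorldSpaceCameraPos are available (the cameraRelativeWorldPos name is just illustrative):
    Code (csharp):
    // vertex function: output the camera relative world position instead of
    // (or in addition to) the absolute world position
    o.cameraRelativeWorldPos = mul(unity_ObjectToWorld, float4(v.vertex.xyz, 1.0)).xyz - _WorldSpaceCameraPos;

    // fragment function: the derivatives of the camera relative position give the
    // same surface aligned vectors, but with far less precision related noise
    half3 worldNormal = normalize(cross(ddy(i.cameraRelativeWorldPos), ddx(i.cameraRelativeWorldPos)));

    // if an absolute world position is still needed, add the camera position back here
    float3 worldPos = i.cameraRelativeWorldPos + _WorldSpaceCameraPos;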
     
    Zimaell likes this.