
Shader Angle vs Raycast Angle Discrepancy

Discussion in 'Shaders' started by surgio, Oct 15, 2019.

  1. surgio

    surgio

    Joined:
    Feb 26, 2018
    Posts:
    3
    I am trying to make a shader that filters all surfaces that the camera can see within 40 degrees.

    Specifically:
    Camera directly facing a surface is 0 degrees (acceptable, paint red pixel)
    Camera view angle hitting the surface > 40 degrees (not acceptable, paint white pixel)

    However, when I try to verify the results with a Raycast hit angle, the results are different. In some areas where the shader tells me the surface angle is within 40 degrees, the Raycast tells me it is more than 40 degrees. In other areas, the opposite happens - Raycast within 40 degrees, shader indicates outside 40 degrees.


    Here is my shader code:
    Code (CSharp):
    Shader "Custom/AngleFilter"
    {
        Properties
        {
        }

        SubShader
        {
            Tags{ "RenderType" = "Opaque" }

            CGPROGRAM
            #pragma surface surf Lambert vertex:vert

            struct Input
            {
                float3 viewDir;
            };

            void surf(Input IN, inout SurfaceOutput o)
            {
                float angle = degrees(acos(dot(normalize(o.Normal), normalize(IN.viewDir))));

                o.Alpha = 1;
                if (angle > 40)
                {
                    o.Emission.r = 1;
                    o.Emission.g = 1;
                    o.Emission.b = 1;
                    return;
                }

                o.Emission.r = 1;
                o.Emission.g = 0;
                o.Emission.b = 0;
            }

            ENDCG
        }

        FallBack "Diffuse"
    }
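
    The CPU-side verification is roughly like the script below (a simplified sketch, not my exact code - the class name and the single centre-screen ray are just for illustration). It casts a ray from the camera, measures the angle between the hit normal and the direction back toward the camera, and draws a green or purple line depending on whether that angle is within 40 degrees:

    Code (CSharp):
    using UnityEngine;

    // Simplified sketch of the verification raycast (illustrative only).
    public class AngleCheck : MonoBehaviour
    {
        public Camera cam;

        void Update()
        {
            // Cast from the camera through the centre of the screen.
            Ray ray = cam.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // Angle between the hit surface normal and the direction back toward the camera.
                float angle = Vector3.Angle(hit.normal, -ray.direction);
                bool acceptable = angle <= 40f; // should match the red/white split in the shader
                Debug.DrawLine(ray.origin, hit.point, acceptable ? Color.green : Color.magenta);
            }
        }
    }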

    I have tried:

    1. Changing the view angle of the camera, thinking that maybe there is a perspective distortion that affects the surface angle seen by the camera. However, the same discrepancy between the raycast angle and the shader angle occurs.

    2. Tried using WorldNormalVector(IN, o.Normal) instead of just plain o.Normal. Same error, nothing changes.

    3. Have verified that the Raycast origin position/direction is exactly the same as the shader camera position/direction.

    4. Tried with and without RecalculateNormals() - which I use to smooth out the meshes. The red pixel area that falls within the acceptable 40 degree range changes (because smooth surfaces become more faceted), however the raycast still shows different results.


    The only reason I can think of is that the surface normals seen by the shader are slightly different from the surface normals picked up by the raycast. Hence my efforts have been spent trying to reconcile this difference.

    Is it something else? What am I missing?
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    A CPU side raycast is going to be tracing against the collision mesh, and it's possible the collision mesh and render mesh geometries don't match. Even if the meshes are the same, the normal a raycast returns is the actual triangle normal and not the interpolated normal you get in the shader, so unless your render mesh is using flat shading the normals won't match.
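
    You can see what the raycast is actually returning by rebuilding the flat triangle normal from the collider's mesh data (a rough sketch, assuming a non-convex MeshCollider and no non-uniform scale; "hit" is the RaycastHit from your verification ray):

    Code (CSharp):
    MeshCollider mc = hit.collider as MeshCollider;
    if (mc != null && mc.sharedMesh != null)
    {
        Mesh mesh = mc.sharedMesh;
        int[] tris = mesh.triangles;
        Vector3[] verts = mesh.vertices;

        // The three corners of the triangle that was hit.
        Vector3 p0 = verts[tris[hit.triangleIndex * 3 + 0]];
        Vector3 p1 = verts[tris[hit.triangleIndex * 3 + 1]];
        Vector3 p2 = verts[tris[hit.triangleIndex * 3 + 2]];

        // Geometric (flat) normal of that triangle, moved into world space.
        // This is what hit.normal gives you, independent of the per-vertex
        // normals the shader interpolates across the face.
        Vector3 flatNormal = hit.transform.TransformDirection(
            Vector3.Cross(p1 - p0, p2 - p0).normalized);
    }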

    But perhaps more importantly: if that example shader really is the shader you're using (which it's not, since that shader doesn't compile), the o.Normal value is uninitialized by default, so you're comparing the view dir with float3(0,0,0). Usually o.Normal is used to set the tangent space normal. If you are setting it, you're usually going to be using a normal map, which isn't really what you want to do if you're looking to compare the face normal with the view direction. For that you want to use the IN.worldNormal value, though there's a bug where, if you set o.Normal, IN.worldNormal also isn't initialized and you need to use WorldNormalVector(IN, float3(0,0,1)) to get the actual world normal. However, IN.viewDir will be in tangent space if you're setting o.Normal, so really all you need is:

    normalize(IN.viewDir).z > cos(radians(40))
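
    Dropped into the surf function, that might look something like this (a sketch, untested). The point is that if IN.viewDir is in tangent space, the surface's own normal in that space is (0,0,1), so dot(N, V) collapses to viewDir.z, and comparing it against cos(40°) is the same test as comparing the angle against 40 degrees:

    Code (CSharp):
    void surf(Input IN, inout SurfaceOutput o)
    {
        // In tangent space the geometric surface normal is (0,0,1),
        // so dot(N, V) reduces to the z component of the view direction.
        float cosAngle = normalize(IN.viewDir).z;

        // cos() is decreasing on [0,90] degrees, so "angle < 40" is the same as "cos(angle) > cos(40)".
        o.Emission = cosAngle > cos(radians(40)) ? float3(1, 0, 0) : float3(1, 1, 1);
        o.Alpha = 1;
    }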
     
  3. surgio

    surgio

    Joined:
    Feb 26, 2018
    Posts:
    3
    explain.png
    Here are some screenshots of the problem. The purple and green lines represent raycasts from a camera position, and the red dots represent the shader results relative to the same camera position. The yellow circle indicates the surface point of interest.

    Green Line = Raycast within 40 degrees
    Purple Line = Raycast outside 40 degrees

    Red Dots = Shader within 40 degrees
    No Dots = Shader outside 40 degrees

    So intuitively, the green line should only fall within the red dot area and the purple line should only fall outside of the red dot area. However, you can see in the above images that this is not the case around the edges of the 40 degree area.

    I suspect you are right that the mesh used by the raycast and the mesh used by the shader are different. So I have tried running the same test but with a cylinder primitive rather than a custom model. The results show consistent behavior between the raycast and shader results (maybe only very slightly different if I squint my eyes).

    Is there anything I can do to the model so that the mesh the shader uses is the same as the mesh used by the raycast?

    -----

    Fyi, I have tried normalize(IN.viewDir).z > cos(radians(40)). But this just highlights everything within 40 degrees relative to the camera direction. I am looking for surface angles that are within 40 degrees relative to the camera, so this does not work for me. In my instance, I find the angle between the surface normal and the camera view direction:

    angle = degrees(acos(dot(normalize(o.Normal), normalize(IN.viewDir))));
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Yes, assign the render mesh as the collision model of the mesh collider. That's it. Again, if they don't match, it may be because of the interpolated normals vs actual poly normals. This one is harder to deal with generally, though the dots make me think you're using a geometry shader? If that's the case, you can calculate the actual surface normal from the vertices. Alternatively you can calculate the surface normal using derivatives (see the various threads on flat shading).
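
    The derivative version looks roughly like this inside a surface shader (a sketch, untested; it needs float3 worldPos in the Input struct, and the cross product may need negating depending on the graphics API):

    Code (CSharp):
    void surf(Input IN, inout SurfaceOutput o)
    {
        // Flat per-triangle world normal reconstructed from screen-space
        // derivatives of the interpolated world position.
        float3 flatNormal = normalize(cross(ddy(IN.worldPos), ddx(IN.worldPos)));
        float3 viewDir = normalize(_WorldSpaceCameraPos - IN.worldPos);

        // Same 40 degree test as before, but against the flat normal.
        o.Emission = dot(flatNormal, viewDir) > cos(radians(40)) ? float3(1, 0, 0) : float3(1, 1, 1);
        o.Alpha = 1;
    }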

    Or you can try to calculate the interpolated normal from the raycast instead of using hitInfo.normal. Use the barycentric coordinate and the triangle index to get the per-vertex normals and interpolate them manually. The Unity documentation for hitInfo.barycentricCoordinate is exactly that code:
    https://docs.unity3d.com/ScriptReference/RaycastHit-barycentricCoordinate.html
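
    Condensed, that interpolation is something like this (a sketch based on the approach on that page; "hit" is the RaycastHit, and non-uniform scale would need a proper normal transform instead of TransformDirection):

    Code (CSharp):
    MeshCollider meshCollider = hit.collider as MeshCollider;
    if (meshCollider != null && meshCollider.sharedMesh != null)
    {
        Mesh mesh = meshCollider.sharedMesh;
        Vector3[] normals = mesh.normals;
        int[] triangles = mesh.triangles;

        // The three vertex normals of the triangle that was hit.
        Vector3 n0 = normals[triangles[hit.triangleIndex * 3 + 0]];
        Vector3 n1 = normals[triangles[hit.triangleIndex * 3 + 1]];
        Vector3 n2 = normals[triangles[hit.triangleIndex * 3 + 2]];

        // Blend them with the barycentric coordinate of the hit point -
        // this is what the GPU does when it interpolates normals for the shader.
        Vector3 bary = hit.barycentricCoordinate;
        Vector3 interpolated = (n0 * bary.x + n1 * bary.y + n2 * bary.z).normalized;

        // Move from the mesh's local space into world space before comparing angles.
        Vector3 smoothWorldNormal = hit.transform.TransformDirection(interpolated);
    }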

    See if that syncs them up for you.
     
  5. surgio

    surgio

    Joined:
    Feb 26, 2018
    Posts:
    3
    This did it for me! Raycast and shader normals sync up perfectly.

    Thank you so much :)