
Procedural shader - incorrect bump mapping

Discussion in 'Shaders' started by specterdragon, Jan 13, 2015.

  1. specterdragon

    specterdragon

    Joined:
    Dec 30, 2013
    Posts:
    21
I'm sure this has been asked before, so if someone can point me in the right direction, I'd appreciate it. Here's the issue I'm seeing... This is a bit long-winded, but I don't have any pics to share at the moment, so I'll try to explain the issue.

I have a model, imported from Blender. It's a simple heightmap-displaced plane intended for use as a landscape. Because I don't want to paint the landscape by hand, I've worked out a simple procedural tri-texture shader that uses the world coordinates and world normal to determine which texture to use and the UV for each. This worked fine. I added normal mapping, which wasn't hard but has inconsistent results.
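
    For context, the colour sampling is roughly a world-space projection along each axis, blended by the world normal; something like this (the sampler and tiling names here are simplified placeholders, not my exact code):

    Code (csharp):
    // Rough sketch of the world-space tri-texture lookup (names are placeholders)
    fixed4 triplanarColor (float3 wsPos, float3 wsNormal,
                           sampler2D texTop, sampler2D texSide, float tiling)
    {
        // blend weights from the world normal, sharpened a bit
        float3 blend = pow(abs(wsNormal), 4.0);
        blend /= (blend.x + blend.y + blend.z);

        // project the world position onto the three axis-aligned planes for the UVs
        fixed4 colX = tex2D(texSide, wsPos.zy * tiling);
        fixed4 colY = tex2D(texTop,  wsPos.xz * tiling);
        fixed4 colZ = tex2D(texSide, wsPos.xy * tiling);

        return colX * blend.x + colY * blend.y + colZ * blend.z;
    }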

    "flat" pixels (i.e. those with a tangent-space normal of 0,0,1) work great. So a normal map with rgb=0.5,0.5,1.0 works correctly.

The issue is that Y-axis normal displacements in the normal map appear to be backwards, but ONLY on surfaces where the worldNormal is closer to vertical (note that the projection is vertical in this case). Pixels with the same (albeit distorted) projection whose worldNormal is closer to horizontal have the correct normal displacement. At first, I thought the projection just needed to be flipped in Y, but then the results are also flipped. FWIW, the X-axis displacement is correct.

I'm wondering if there's an issue with the bitangent direction that might be causing the Y-directed normals to flip once the pixel's worldNormal passes a certain point, but I can't tell. Is there a known issue? I've tried different combinations of letting Blender export the tangents and having Unity calculate them, in both .blend and FBX files, with pretty much the same result.

Questions... any thoughts on what might be happening (the UV must be calculated procedurally, so I can't remap it in Blender)? And, in a surface shader, is there any way to get at the currently used bitangent vector, so I can check that value and/or change the tangent-space coordinate system on a pixel-by-pixel basis to use for the normal displacement?
     
  2. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,520
Some images and a clearer explanation would help.

    Is the plane being displaced in the shader or has it already been displaced in Blender?

I'm also not sure what you mean by "normal displacement". I assume you just mean normal mapping. It sounds a bit like the normal map contains the displacement height map, but I don't think that's the case here.

Note that the tangent and binormal are defined to point in the directions of increasing u and v for a given set of uv coordinates. They can't just be any two vectors that form an orthogonal basis with the normal. This means that if you generate your uv coordinates for the normal map, you also have to generate the accompanying tangent and binormal. This might well be the source of the issue; see the example below.
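
    For example, if your generated top-projection uv is just wsPos.xz, the matching frame is simply the world X and Z axes (flip the binormal if your v runs the other way), roughly:

    Code (csharp):
    // Example: uv = wsPos.xz for the top projection, so the matching
    // tangent/binormal are the world X and Z axes.
    // _BumpMap and uvTop are placeholder names.
    float3 tsNormal = UnpackNormal(tex2D(_BumpMap, uvTop));
    float3 T = float3(1, 0, 0);   // direction in which u increases
    float3 B = float3(0, 0, 1);   // direction in which v increases
    float3 wsBumpedNormal = normalize(mul(tsNormal, float3x3(T, B, wsNormal)));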



The best way would be to mathematically derive a tangent and binormal that fit the generated uv coordinates. There is, however, another approach called "Normal Mapping Without Precomputed Tangents". It leads to the following code, which cleverly uses the ddx and ddy instructions to calculate a tangent and binormal for any uv coordinates:

    Code (csharp):
    float3x3 calcWsCotangentFrame (float3 wsNormal, float3 wsInvViewDir, float2 tsCoord)
    {
        // get edge vectors of the pixel triangle
        float3 dp1 = ddx(wsInvViewDir);
        float3 dp2 = ddy(wsInvViewDir);
        float2 duv1 = ddx(tsCoord);
        float2 duv2 = ddy(tsCoord);

        // solve the linear system
        float3 dp2perp = cross(dp2, wsNormal);
        float3 dp1perp = cross(wsNormal, dp1);
        float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
        float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

        // construct and return a scale-invariant cotangent frame
        float invmax = rsqrt(max(dot(T, T), dot(B, B)));
        return float3x3(T * invmax, B * invmax, wsNormal);
    }
(Not my code; I find the variable naming a bit cryptic.)
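
    You then build the frame per pixel and rotate the unpacked normal map sample with it, roughly like this (the variable names here are just illustrative):

    Code (csharp):
    // Example usage in a fragment program (variable names are illustrative)
    float3 wsInvViewDir = wsPos - _WorldSpaceCameraPos;   // from the camera towards the pixel
    float3x3 tbn = calcWsCotangentFrame(normalize(wsNormal), wsInvViewDir, uv);
    float3 tsNormal = UnpackNormal(tex2D(_BumpMap, uv));
    float3 wsBumpedNormal = normalize(mul(tsNormal, tbn));

    (In a surface shader o.Normal expects a tangent-space normal, so this is easiest to use in a plain vertex/fragment shader or with a custom lighting setup.)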
     
    specterdragon likes this.
  3. specterdragon

    specterdragon

    Joined:
    Dec 30, 2013
    Posts:
    21
    Thanks for the reply. Even if I didn't explain it very well (yeah, I used the term "displace" in multiple contexts - sorry for that), you seemed to get my point. You put me on the right track and I found a solution. Essentially, it had nothing to do with the model coming from Blender (at least not directly).

Unity was calculating the tangents and bitangents. For whatever reason, v.tangent.w was flipped wherever dot(v.normal, (1,0,0)) > 0.5, or roughly put, within 45 degrees of the X axis. I set up a simple vertex shader that sets v.tangent.w to 1 (always), and that solved the issue (sketch below). *shrug* That's what I get for trusting the engine to do all the work for me. LOL
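
    Roughly, the workaround looks like this (assuming a surface shader with vertex:vert added to the #pragma surface line):

    Code (csharp):
    // Workaround sketch: force a consistent tangent handedness
    void vert (inout appdata_full v)
    {
        v.tangent.w = 1.0;
    }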

    Thanks again!
     
    Last edited: Jan 15, 2015