Shader[Rendering Object Normals as Colors] - Usable as a RenderTexture'd normal map??

Discussion in 'Shaders' started by SomeTallGy, May 30, 2014.

  1. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Hi folks, I am trying to use a custom shader on 3D objects that I'd like to render to a texture to pop into Shader Forge as a normal map.

    This is the code I am using - straight from http://docs.unity3d.com/Manual/ShaderTut2.html

    Code (csharp):
    Shader "Tutorial/Display Normals" {
        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos : SV_POSITION;
                    float3 color : COLOR0;
                };

                v2f vert (appdata_base v)
                {
                    v2f o;
                    o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                    o.color = v.normal * 0.5 + 0.5;
                    return o;
                }

                half4 frag (v2f i) : COLOR
                {
                    return half4 (i.color, 1);
                }
                ENDCG
            }
        }
        Fallback "VertexLit"
    }
    However, while objects rendered with this shader do produce something that looks usable as a normal map, it isn't converted to the more blueish map that Unity/Shader Forge seem to understand - so the map isn't bending light properly and I am getting unexpected results. A similar problem is discussed in more detail here: http://forum.unity3d.com/threads/135841-Creating-runtime-normal-maps-using-renderToTexture

    Now I am a bit of a newbie with writing custom shaders, but I would like my camera to render the correct color values on 3D objects using a shader so I can use the render texture as a normal map in shader forge.

    Is this possible?

    Thank you very much in advance!
     
  2. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    Anything is possible ;).

    You're looking at object-space normals that are modified to be easier to display. If you remove the "* 0.5 + 0.5", you see how red indicates how much something points in the positive X direction, green for Y, and blue for Z.

    The kind of normals used in normal maps are usually 'tangent space' normals, relative to the surface of the object, not to the object as a whole. Here's a video of someone who can explain it better than I can.

     
  3. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Okay, I see what you are saying. However, there is a problem I am having, per the attached screenshot. I am getting the RGB normals as you described, but one side of the object has values that are only black with no gradation - so light isn't being reflected correctly. You can see on the hexagon shapes that the lower left side is completely black with no shading.

    (this is what the shader produces after I remove the *0.5 + 0.5 as you suggested).

    How do you get around this problem?

    $Screen Shot 2014-05-30 at 12.12.22 PM.png
     
  4. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    I never suggested that removing "* 0.5 + 0.5" would fix your problems. It was meant to help you understand how those normals work, and to get you started on learning how to solve your problem.

    If you want a finished solution, I suggest using the commercial jobs forum.

    But it's probably worth asking why you would want to create tangent space normal maps from your in-game objects. It seems like a work-around for something. What is the original problem you're trying to solve?
     
  5. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Oh, pardon me - when I did remove it I got my hopes up, thinking I was on the right track (I was getting the blue-tinted normals rather than the pink/purple ones).

    Anyway, here is what I am trying to accomplish.

    1) Create a shader that shows normal colors on 3D objects.
    2) Render those shaded objects into a render texture.
    3) Use that render texture as a normal map.

    This is so that I can place normal maps on a plane at runtime.

    This is the before and after result I am trying to achieve:
    $Screen Shot 2014-05-25 at 1.09.48 PM.png
    $Screen Shot 2014-05-25 at 1.09.16 PM.png

    I have been trying to get this result by applying normal colors onto a 3D hexagon grid and then blending the normal maps through Shader Forge.
    $Screen Shot 2014-05-29 at 9.57.37 PM.png
    $Screen Shot 2014-05-29 at 9.59.09 PM.png

    While I did successfully create a normal map that updates at runtime, the normal colors are off and it's creating undesired lighting.
    $Screen Shot 2014-05-29 at 9.59.16 PM.png

    To your point about posting this on the jobs forum: yeah, that would be a quick fix, but I'd rather learn the solution, and hopefully someone with the same problem will find this thread. I figured this was a common trick.
     
  6. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    It is a common trick, and sometimes people nicer than me give out answers straight up. I mainly try to nudge people in the right (hopefully) direction instead, in the hope that they get better at figuring out how to get things working.

    You can already place normal maps on a plane at runtime, by assigning a normal map to a 'bumped' material at runtime. But I guess you're talking about dynamically combining multiple normal maps together at runtime. Do you really need the normals to be created from 3D geometry, or is that something you could do in advance?

    An important thing to consider is that when you're combining normal maps, simply blending the colors together does not give accurate results. Keep in mind that normals are vectors (usually unit vectors), so you might want to use some 3D math to get the correct average normal.
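    RC-1290's point about normals being vectors can be shown with a minimal C sketch (illustrative only, not code from the thread): summing two unit normals and renormalizing gives a valid average, whereas a plain 50/50 blend of the stored colors does not stay unit length.

    Code (c):
    ```c
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    static Vec3 normalize3(Vec3 v) {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        Vec3 r = { v.x / len, v.y / len, v.z / len };
        return r;
    }

    /* Correct average: sum the normals as vectors, then renormalize. */
    static Vec3 average_normals(Vec3 a, Vec3 b) {
        Vec3 s = { a.x + b.x, a.y + b.y, a.z + b.z };
        return normalize3(s);
    }

    int main(void) {
        Vec3 a = { 1.0f, 0.0f, 0.0f };  /* points along +X */
        Vec3 b = { 0.0f, 0.0f, 1.0f };  /* points along +Z */
        Vec3 avg = average_normals(a, b);

        /* A plain 50/50 blend of the two would give (0.5, 0, 0.5),
           which has length ~0.707 and is not a unit normal. */
        printf("%f %f %f\n", avg.x, avg.y, avg.z);  /* ~0.707107 0.000000 0.707107 */
        return 0;
    }
    ```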
     
  7. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    I just noticed your other thread, and I guess you really do want to create the normals at runtime. Unfortunately, I have never written a normal map baking tool before, and I don't know exactly how the normal data should be stored, to match other normal maps for Unity. Especially when it comes to the compression of existing normal maps.

    I think you'll want to take a look at UnpackNormal and UnpackNormalDXT5nm in UnityCG.cginc (since you might have to do the reverse)
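    For reference, here is a C port of the math inside UnpackNormalDXT5nm (the real function lives in UnityCG.cginc and is written in Cg; this sketch just shows the idea): x and y come from the alpha and green channels, and z is reconstructed from the normal's unit length.

    Code (c):
    ```c
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z, w; } Tex4;  /* a sampled texel, channels in [0, 1] */
    typedef struct { float x, y, z; } Vec3;

    static float clamp01(float v) { return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v); }

    /* DXT5nm stores the normal's x in alpha (w) and y in green;
       z is rebuilt from the fact that the normal has unit length. */
    static Vec3 unpack_dxt5nm(Tex4 t) {
        Vec3 n;
        n.x = t.w * 2.0f - 1.0f;
        n.y = t.y * 2.0f - 1.0f;
        n.z = sqrtf(1.0f - clamp01(n.x * n.x + n.y * n.y));
        return n;
    }

    int main(void) {
        Tex4 texel = { 0.0f, 0.5f, 0.0f, 0.5f };  /* green = alpha = 0.5 */
        Vec3 n = unpack_dxt5nm(texel);
        printf("%.1f %.1f %.1f\n", n.x, n.y, n.z);  /* a flat normal: 0.0 0.0 1.0 */
        return 0;
    }
    ```

    Going the other way (baking) would mean doing the reverse: packing x into alpha and y into green, and letting z be discarded.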
     
  8. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Thanks.
     
  9. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    You're rendering things in camera space, assuming that the render texture should be upright with respect to the camera.

    So you'll want to get the camera-space normal and compress that down to 0-1 range (using * 0.5 + 0.5;) in order to be able to save it in a texture. It should pretty much be in a default tangent space by then anyway (with the camera normal being the vertex normal, and screen-space up/left being tangent/binormal - as it's a flat surface it's not a problem).

    Then you should be able to use Reoriented Normal Mapping to combine the two normals together in your shader.
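    The Reoriented Normal Mapping blend Farfarer mentions can be sketched in C (an illustrative port of the blend from Barré-Brisebois and Hill's "Blending in Detail"; the thread's shaders themselves are Cg). It reorients the detail normal so it sits on the base normal instead of on a flat plane; both inputs are assumed already unpacked to [-1, 1].

    Code (c):
    ```c
    #include <math.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    static Vec3 normalize3(Vec3 v) {
        float len = sqrtf(dot3(v, v));
        Vec3 r = { v.x / len, v.y / len, v.z / len };
        return r;
    }

    /* Reoriented Normal Mapping: rotate the detail normal onto the
       base normal. Inputs are unpacked unit normals in [-1, 1]. */
    static Vec3 rnm_blend(Vec3 base, Vec3 detail) {
        Vec3 t = { base.x, base.y, base.z + 1.0f };   /* base   + (0, 0, 1)   */
        Vec3 u = { -detail.x, -detail.y, detail.z };  /* detail * (-1, -1, 1) */
        float s = dot3(t, u) / t.z;
        Vec3 r = { t.x * s - u.x, t.y * s - u.y, t.z * s - u.z };
        return normalize3(r);
    }

    int main(void) {
        Vec3 flat = { 0.0f, 0.0f, 1.0f };
        Vec3 detail = { 0.6f, 0.0f, 0.8f };
        Vec3 blended = rnm_blend(flat, detail);
        /* Blending over a flat base returns the detail normal unchanged. */
        printf("%.2f %.2f %.2f\n", blended.x, blended.y, blended.z);  /* 0.60 0.00 0.80 */
        return 0;
    }
    ```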
     
  10. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    Does that match how normal map assets are stored? Because so far I've only seen that used for Deferred Lighting, where it is done for the Depth+Normals+specular buffer. And in that case it's immediately undone in the lighting pass.

    Ahhh, nice, I was looking for that link, but I didn't use the right words in the bookmark.
     
    Last edited: May 31, 2014
  11. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Farfarer, thanks for your input. Here is what my renderTexture is producing: $Screen Shot 2014-05-31 at 11.20.16 AM.png

    If I were to take a screenshot of that, import it into Unity, and set it as a normal map, it would work.
    I believe that's because Unity converts the normal map into a different format (the blueish texture), based on my research. Shader Forge makes that pretty clear in this screengrab: $normalConvert.jpg

    So here is where I am stuck: how do I get this render texture converted to that blueish-tinted one so Unity will render it correctly?

    From what I have been researching for the last 3 days, there are two possible routes:

    A) Create a shader for the 3D objects being captured to the render texture that displays the correct colors (what's mostly being discussed here). Unfortunately, it's hard to get a straight, newbie-friendly answer on how to achieve that (folks here are very experienced with shaders and talk in a language that I don't quite understand).

    or

    B) Run my current renderTexture through a script that converts the colors to what I need. I found this thread: http://answers.unity3d.com/questions/47121/runtime-normal-map-import.html which has given me some clues, but I am not sure it is the most efficient approach.

    So I know I am close - just trying to find the missing link to get this working.
     
    Last edited: Jun 2, 2014
  12. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    That render texture is simply showing a standard normal map. It's packed from the range -1 to 1 into the range 0 to 1 by * 0.5 + 0.5;

    To unpack it back into -1 to 1 you just do the opposite... multiply by two then subtract one... * 2 - 1;

    That'll give you the deep blue result that you see in Shader Forge (which is simply an unpacked normal map).
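    The pack/unpack round trip Farfarer describes, as a minimal C sketch (names are illustrative):

    Code (c):
    ```c
    #include <stdio.h>

    /* Pack a normal component from [-1, 1] into the [0, 1] range a
       texture can store, and unpack it back out. */
    static float pack_component(float n)   { return n * 0.5f + 0.5f; }
    static float unpack_component(float c) { return c * 2.0f - 1.0f; }

    int main(void) {
        float n = -1.0f;
        float stored = pack_component(n);    /* -1 is stored as 0.0 */
        float recovered = unpack_component(stored);
        printf("stored=%.2f recovered=%.2f\n", stored, recovered);  /* stored=0.00 recovered=-1.00 */
        return 0;
    }
    ```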
     
  13. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Farfarer, you are exactly right - that did change it to the dark blue I was looking for; thank you so much for explaining that clearly. I am almost there, but there seems to be a channel that's not being read by the shader. Please look at the attached screenshots:

    $Screen Shot 2014-05-31 at 1.53.26 PM.png
    $Screen Shot 2014-05-31 at 1.53.34 PM.png

    As you can see, it seems that the darker blue shades in the generated normal map are not bending any light.

    What am i missing?

    BTW, here is my updated shader code:

    Code (csharp):
    Shader "Tutorial/Display Normals" {
        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos : SV_POSITION;
                    float3 color : COLOR0;
                };

                v2f vert (appdata_base v)
                {
                    v2f o;
                    o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                    o.color = (1 - (v.normal * 0.5 + 0.5)) * 2 - 1;  // one minus to "flip" the normals to the correct camera orientation
                    return o;
                }

                half4 frag (v2f i) : COLOR
                {
                    return half4 (i.color, 1);
                }
                ENDCG
            }
        }
        Fallback "VertexLit"
    }
     
  14. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    You can't do that inside the same shader. All you're doing there is packing it from -1 to 1 into 0 to 1, then immediately unpacking it back into -1 to 1 and trying to save that to an image (which you can't - it has negative values).

    You'll want to uncompress it using * 2 - 1 in the shader you're blending the normal maps together, not the shader you're rendering them out of.
     
  15. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Okay, that seemed to do the trick, although the normals look much more intense now - definitely a step in the right direction.
    $Screen Shot 2014-06-01 at 2.45.11 PM.png

    This is the code that's in the shader - did I put the conversion in the right place (Line 10)? I only ask because the normals look pretty intense.

    Code (csharp):
    1. fixed4 frag(VertexOutput i) : COLOR {
    2.                 i.normalDir = normalize(i.normalDir);
    3.                 float3x3 tangentTransform = float3x3( i.tangentDir, i.binormalDir, i.normalDir);
    4.                 float3 viewDirection = normalize(_WorldSpaceCameraPos.xyz - i.posWorld.xyz);
    5. /////// Normals:
    6.                 float2 node_967 = i.uv0;
    7.                 float3 node_962_nrm_base = UnpackNormal(tex2D(_normal,TRANSFORM_TEX(node_967.rg, _normal))).rgb + float3(0,0,1);
    8.                 float3 node_962_nrm_detail = tex2D(_footprint,TRANSFORM_TEX(node_967.rg, _footprint)).rgb * float3(-1,-1,1) ;
    9.                 float3 node_962_nrm_combined = (node_962_nrm_base*dot(node_962_nrm_base, node_962_nrm_detail)/node_962_nrm_base.z - node_962_nrm_detail) ;
    10.                 float3 node_962 = node_962_nrm_combined * 2 - 1;
    11.                 float3 normalLocal = node_962;
    12.                 float3 normalDirection =  normalize(mul( normalLocal, tangentTransform )); // Perturbed normals
    13.                 float3 lightDirection = normalize(lerp(_WorldSpaceLightPos0.xyz, _WorldSpaceLightPos0.xyz - i.posWorld.xyz,_WorldSpaceLightPos0.w));
    14. ////// Lighting:
    15.                 float attenuation = LIGHT_ATTENUATION(i)*2;
    16.                 float3 attenColor = attenuation * _LightColor0.xyz;
    17. /////// Diffuse:
    18.                 float NdotL = dot( normalDirection, lightDirection );
    19.                 float3 diffuse = max( 0.0, NdotL) * attenColor;
    20.                 float3 finalColor = 0;
    21.                 float3 diffuseLight = diffuse;
    22.                 float4 node_2 = tex2D(_diffuse,TRANSFORM_TEX(node_967.rg, _diffuse));
    23.                 finalColor += diffuseLight * node_2.rgb;
    24. /// Final Color:
    25.                 return fixed4(finalColor * node_2.a,0);
    26.             }
     
    Last edited: Jun 1, 2014
  16. RC-1290

    RC-1290

    Joined:
    Jul 2, 2012
    Posts:
    639
    If you're going to write the shader in Shader Forge, you should probably perform the conversion in Shader Forge as well. Reading generated shader code like that is very confusing, and it's much easier to make mistakes that way.
     
  17. SomeTallGy

    SomeTallGy

    Joined:
    Jan 3, 2014
    Posts:
    18
    Done.

    After some trial and error, I added a code node that performs the * 2 - 1 on the RGB of the renderTexture.
    Now I have a nice, clean result, and I am sure this might be of use to other folks :)

    $Screen Shot 2014-06-01 at 9.58.43 PM.png
    $Screen Shot 2014-06-01 at 9.58.50 PM.png

    It's pretty cool, as I can move the 3D objects around in the scene and the normals follow in real time. Very cool effect :)

    Thanks for the help!