
Runtime generated bump maps are not marked as normal maps

Discussion in 'General Graphics' started by fra3point, Jun 28, 2016.

  1. fra3point

    fra3point

    Joined:
    Aug 20, 2012
    Posts:
    269
    Hi!

    As the title says, I use some runtime generated bump maps.
    I create them in Start(), modify their pixels, and then put them in the materials with SetTexture().
    But these maps are not marked as Normal Maps, because that is an Editor feature.

    What I need is a kind of "Mark as Normal Map" simulation for runtime use.

    Some days ago I did something like this:
    • Override TextureImporter settings by creating an AssetPostprocessor subclass, marking by default that texture as Normal Map
    • Create the texture to be treated as bump map
    • Apply a color conversion with a custom Diffuse2Normal() function
    • Save it as a PNG image in Assets/...
    • AssetDatabase.Refresh();
    • <the overridden Texture Importer automatically converts the texture in Normal Map>
    • Load from AssetDatabase the newly created asset
    • Put it in the right material
    And that worked well, but it was too slow, worked only in the editor and forced me to save the texture on the hard disk. The problem is that I don't generate one texture, but dozens.
    I want to store the textures in memory, not on the hard disk.

    So, I need something working in both editor and build... Just like this:
    • Create the texture to be treated as bump map
    • Apply a color conversion with a custom Diffuse2Normal() function
    • Programmatically mark it as Normal Map
    • Put it in the right material
    I read about a special normal map encoding called DXTnm. I suppose that marking as Normal Map from the editor does this conversion. Maybe reproducing it is the solution?

    I'd appreciate any suggestion, thank you! :)
     
  2. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,411
  3. fra3point

    fra3point

    Joined:
    Aug 20, 2012
    Posts:
    269
    @mgear This means that this will work only with custom non-unpacked shaders, and I have to modify them?
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    DXTnm is a DXT5 texture with the red channel of the normal map stored in the alpha channel of the texture and the red and blue channels blacked out. The green channel of the normal map remains as the green channel of the texture.

    You can use an uncompressed RGBA with that same layout and it should work.

    rgbvsdxtnm.png

    edit: If you're using any version of Unity newer than 5.5 you'll want the red and blue channels to be white, not black. But you also do not need to swizzle normal maps for them to work anymore.
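    That swizzle is just per-pixel channel shuffling. A minimal sketch of it (Python for illustration; the helper name is made up, the math is the same in C#):

    ```python
    def encode_dxtnm(r, g, b):
        """Swizzle one RGB normal-map pixel (0-255 ints) into the DXTnm
        layout: X moves from red to alpha, Y stays in green. Pre-5.5
        Unity blacked out R and B; 5.5+ expects them white instead."""
        x = r  # tangent-space X lives in the red channel of a plain map
        y = g  # tangent-space Y stays where it is
        return (255, y, 255, x)  # (R, G, B, A); use (0, y, 0, x) pre-5.5

    print(encode_dxtnm(107, 153, 250))  # -> (255, 153, 255, 107)
    ```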
     
    Last edited: Mar 12, 2021
  5. fra3point

    fra3point

    Joined:
    Aug 20, 2012
    Posts:
    269
    @bgolus That's good, it works! Thanks. The only problem I am facing is the following:

    • I have a material (Standard Shader) with a diffuse and a bump map assigned.
    • On Start() I take these maps, I modify them and then I put the newly created textures back in the material.
    • Everything works fine for diffuse maps.
    • If the original bump map is already marked as Normal Map, it looks bad after the computation, even if I don't try to manually apply the DXTnm conversion to it.
    • If the original bump map is NOT marked as Normal Map, it looks good after the computation.

    But that bump map is already marked as Normal Map in the editor because the model should look nice in edit mode, too, and not only in play mode.
    I do not understand why modifying a texture marked as NM (DXTnm) doesn't produce a DXTnm texture.

    For now, I use bump maps not marked as NM as reference maps in the material (which obviously look bad in edit mode), and I get perfect results in play mode.

    To avoid this difference between edit and play mode, I think there are 2 possibilities:

    1) On Start(), re-convert the original bump map to RGB (impossible, because the red and blue channels were cleared), modify it, and put it back to DXTnm.
    2) Give up.
     
    Last edited: Jun 29, 2016
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    You've kind of answered your own question of what you're doing wrong, but we'll get to that in a moment.

    First off ignore option 2.

    Option 1 is in fact totally possible; how do you think DXTnm works in shaders? That blue channel (or normal z axis) is still important, but it can be derived from having only the "red" (normal x axis, stored in the texture asset's alpha) and "green" (normal y axis).

    To convert a DXTnm texture back to an RGB normal map just move the alpha channel back to the red channel and recalculate the blue channel. Here's the snippet of code Unity uses in its shaders:
    Code (CSharp):
    1. inline fixed3 UnpackNormalDXT5nm (fixed4 packednormal)
    2. {
    3.     fixed3 normal;
    4.     normal.xy = packednormal.wy * 2 - 1;
    5.     normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
    6.     return normal;
    7. }
    The shader code saturate(dot(normal.xy, normal.xy)) produces the same result as clamp(normal.x * normal.x + normal.y * normal.y, 0.0, 1.0), which is a little easier to translate to C# once you understand it, though you could also use Unity's Vector2.Dot(). Once you've recreated the normal, remap it back to a 0.0 - 1.0 range with normal * 0.5 + 0.5 and you have your RGB values again. There will be some quality loss if you do this, as the blue channel is not exactly the same as it would have been originally, but again this is what shaders do anyway.
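    As a quick numeric sanity check of that unpack math (a Python sketch of the same formula; names are mine):

    ```python
    import math

    def unpack_dxt5nm(r, g, b, a):
        """Rebuild a unit normal from a DXT5nm pixel given as 0-1 floats:
        X comes from alpha, Y from green, Z is reconstructed."""
        x = a * 2 - 1
        y = g * 2 - 1
        # saturate(dot(normal.xy, normal.xy)) == clamp(x*x + y*y, 0, 1)
        z = math.sqrt(1 - min(max(x * x + y * y, 0.0), 1.0))
        return (x, y, z)

    # a "flat" pixel decodes to the straight-up normal (0, 0, 1)
    print(unpack_dxt5nm(1.0, 0.5, 0.5, 0.5))  # -> (0.0, 0.0, 1.0)
    ```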

    But option 1 won't fix the problem!

    The conversion to "*nm", either compressed "DXTnm" or uncompressed truecolor "Linear nm", happens during the texture import process. If you modify the texture afterwards and don't do the full save to disk & reimport you need to do the conversion manually. The whole "normal map" designation is a purely editor texture importer thing that the game and rendering systems have no concept of, it's just a texture2D asset that happens to have been formatted in a special way.

    So, you should leave your texture asset marked as a Normal Map for the editor so it looks right, but always manually move the normal map's red channel to the texture asset's alpha when doing manual manipulation of it regardless how the texture was initially imported!
     
    fra3point likes this.
  7. fra3point

    fra3point

    Joined:
    Aug 20, 2012
    Posts:
    269
    Thank you for the great answer!! Now I understand lots of things about normal maps!!
    For now I'm not using a shader to convert DXTnm to RGB, so I wrote this function (for debugging, I don't make a copy of the source texture):

    Code (CSharp):
    1. private Texture2D DTXnm2RGBA(Texture2D tex) {
    2.         Color[] colors = tex.GetPixels();
    3.         for(int i=0; i<colors.Length;i++) {
    4.             Color c = colors[i];
    5.             c.r = c.a*2-1;  //red<-alpha (x<-w)
    6.             c.g = c.g*2-1; //green is always the same (y)
    7.             Vector2 xy = new Vector2(c.r, c.g); //this is the xy vector
    8.             c.b = Mathf.Sqrt(1-Mathf.Clamp01(Vector2.Dot(xy, xy))); //recalculate the blue channel (z)
    9.             colors[i] = new Color(c.r*0.5f+0.5f, c.g*0.5f+0.5f, c.b*0.5f+0.5f); //back to 0-1 range
    10.         }
    11.         tex.SetPixels(colors); //apply pixels to the texture
    12.         tex.Apply();
    13.         return tex;
    14.     }
    Note: For mobile development I had to use the red channel and not the alpha as normal.x. That's because of the different mobile encoding for normal maps, which is rgb=xyz, while on other platforms it's agb=xyz.
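    That platform difference boils down to which channel the X axis is read from. A rough sketch (Python, illustrative only; the helper name is mine):

    ```python
    def read_xy(pixel, mobile=False):
        """pixel is (r, g, b, a) as 0-1 floats. Mobile normal maps are
        plain rgb=xyz, so X is in red; desktop DXTnm keeps X in alpha."""
        r, g, b, a = pixel
        x_source = r if mobile else a
        return (x_source * 2 - 1, g * 2 - 1)

    # the same flat normal, stored both ways, decodes identically:
    print(read_xy((0.5, 0.5, 1.0, 1.0), mobile=True))   # -> (0.0, 0.0)
    print(read_xy((1.0, 0.5, 1.0, 0.5), mobile=False))  # -> (0.0, 0.0)
    ```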

    I also had to import the textures as "Read/Write enabled" and with no compression (TrueColor), because SetPixels works only with uncompressed textures. I'm not happy about this, but it's fine, anyway...

    EDIT: I simply made a copy of the original compressed texture in another ARGB32 (or RGB24 for mobile) texture, so I solved this last issue.
     
    Last edited: Jun 30, 2016
    dforstmaier likes this.
  8. radiantboy

    radiantboy

    Joined:
    Nov 21, 2012
    Posts:
    1,633
    So, any idea how you mark the texture type as "normal map" in the editor via script without wrecking the image? When I do it that way the normal goes red; if I do it in the editor it works fine.
     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    Not really. The easiest solution is to just not bother and ignore the warnings. The warnings are just something in the material inspector and don't do anything; they're there to catch people who have imported normal maps and didn't check the box. If you're generating or modifying textures from script, there isn't a "normal map" setting on the texture asset itself, only its import settings. And having that set causes it to clear some channels and do the swizzle. Since you presumably don't want that, don't set it.

    If the warning bothers you and you're using custom shaders, don't use texture properties named _BumpMap or _NormalMap or the [Normal] material property drawer.
     
    Last edited: Aug 20, 2019
  10. AFrisby

    AFrisby

    Joined:
    Apr 14, 2010
    Posts:
    223
    Just a note - since some people might encounter this, if you are manipulating normal maps at the low level (i.e. manipulating them in render textures), Unity changed this behavior sometime between 2017.2 and 2018.4 on Standalone devices.

    Previously, they would be represented as float4(1, y, 1, x) - now it is float4(x, y, ?, 1).

    Edit: this might be a spurious report - something weird is going on.
     
    Last edited: Sep 5, 2019
    fra3point likes this.
  11. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    2017.1 added support for BC5 normal maps, as well as optionally having both RGBA 1y1x packed normals (shown as "DXTnm", which is really just swizzled DXT5, or "Linear Nm 32 bit", which is swizzled RGBA32) and RG xy01 packed normals (BC5 is a two-channel RG-only format that always returns a 0.0 B and 1.0 A). For 5.6 and earlier they were actually packed as yyyx, but only the GA channels were used.

    For 2018.4 both are still supported. The HDRP defaults to BC5 normal maps. For the LWRP I've seen both BC5 and the older "DXTnm" used as the default format in different revisions, but I haven't looked at what it currently defaults to. AFAIK the built-in rendering paths still default to DXTnm, but again both 1y1x and xy01 are supported.

    From UnityCG.cginc, this is the function UnpackNormal calls on desktop:
    Code (csharp):
    1. // Unpack normal as DXT5nm (1, y, 1, x) or BC5 (x, y, 0, 1)
    2. // Note neutral texture like "bump" is (0, 0, 1, 1) to work with both plain RGB normal and DXT5nm/BC5
    3. fixed3 UnpackNormalmapRGorAG(fixed4 packednormal)
    4. {
    5.     // This do the trick
    6.     packednormal.x *= packednormal.w;
    7.     fixed3 normal;
    8.     normal.xy = packednormal.xy * 2 - 1;
    9.     normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
    10.     return normal;
    11. }
    Note on a note, the "neutral texture like "bump"" mentioned is actually a 4x4 (127, 127, 255, 255) or (0.498, 0.498, 1.0, 1.0) RGBA32 texture, and not the "(0, 0, 1, 1)" as described, though that's roughly the vector that's encoded in the default bump.
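    The packednormal.x *= packednormal.w trick is easy to verify numerically: with either layout, multiplying R and A yields the packed X. A Python sketch of just that step (function name is mine):

    ```python
    def unpack_rg_or_ag(r, g, b, a):
        """Decode X and Y from either packing: DXT5nm stores (1, y, ?, x),
        BC5 effectively returns (x, y, 0, 1), so x = r * a works for both."""
        x = r * a  # DXT5nm: 1 * x ; BC5: x * 1
        return (x * 2 - 1, g * 2 - 1)

    packed_x, packed_y = 0.75, 0.5
    print(unpack_rg_or_ag(1.0, packed_y, 1.0, packed_x))  # DXT5nm -> (0.5, 0.0)
    print(unpack_rg_or_ag(packed_x, packed_y, 0.0, 1.0))  # BC5    -> (0.5, 0.0)
    ```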
     
    fherbst and fra3point like this.
  12. mmkc

    mmkc

    Joined:
    Sep 12, 2019
    Posts:
    49
    I have tried to follow this discussion as best as I can but I'm struggling to get the results. I have built a system that modifies materials during run-time and part of that involves downloading the usual "purple" normal maps from the cloud and applying them to materials. I have converted my textures in the cloud to clear the R and B channels and put the previous R value into the A channel. For example, in RGBA (107, 153, 250, 255) becomes (0, 153, 0, 107). I am still seeing different results when applying this normal map from script versus applying the normal map in the editor. If I debug and look at the imported normal map pixel (the "correct" one), the value is "RGBA(1.000, 0.541, 0.549, 0.443)". Shouldn't R and B be zero?
     
  13. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    RGBA(107, 153, 250, 255) should become RGBA(255, 153, 153, 107) since Unity 2017.1. Really, the B value is somewhat arbitrary, and can be 0, 255, equal to G, or some other arbitrary data you want to pack into the texture, at the cost of some extra compression artifacts. The reason is, as shown in my last post, Unity multiplied the R and A channels together to get the encoded X, which is how Unity 2017.1 and beyond supports both RG and AG normal encoding.
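    In other words, a cloud-side converter would do something like this per pixel (a sketch; the helper is hypothetical, not a Unity API):

    ```python
    def to_unity_2017_layout(r, g, b):
        """Convert one standard 'purple' normal-map pixel (0-255 ints) to
        the post-2017.1 swizzle: R forced to white, B copies G (the value
        here is somewhat arbitrary), original R moved into alpha."""
        return (255, g, g, r)

    print(to_unity_2017_layout(107, 153, 250))  # -> (255, 153, 153, 107)
    ```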

    I wanted to touch on this value a bit more. The way the BC (aka DXTC) compression formats work, the red and blue channels use 5 bits each and the green channel uses 6 bits for encoding per-block color palettes. This means the green value has slightly better precision than red or blue, which is why the Y is left in the green channel. Unity actually stores "DXTnm" as (1.0, y, y, x), but the precision difference between the G and B means they might not be perfectly equal when sampled by the shader. I honestly don't know why they chose to duplicate the Y in the blue channel, but most likely it's because they used to store it as a greyscale value so all 3 channels held the Y, and changing the red to 1.0 was the smallest change to the existing code. It's possible some encoders do slightly better when data is duplicated across the RGB channels, but there's no technical reason the format would do better packed that way.
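    The 5-vs-6-bit endpoint precision is easy to demonstrate by quantizing the same channel value both ways (an illustrative sketch; real BC encoders also interpolate between the two endpoints within each block):

    ```python
    def quantize(value, bits):
        """Round an 8-bit channel value to the given endpoint bit depth and
        expand it back to 8 bits, as BC/DXT block endpoint colors are stored."""
        levels = (1 << bits) - 1
        return round(round(value / 255 * levels) / levels * 255)

    y = 153
    print(quantize(y, 6))  # green endpoint (6 bits) -> 154
    print(quantize(y, 5))  # blue endpoint (5 bits)  -> 156
    ```

    So the same Y duplicated into G and B can come back as slightly different values, which is roughly the G/B mismatch seen in the sampled pixel above.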
     
  14. mmkc

    mmkc

    Joined:
    Sep 12, 2019
    Posts:
    49
    Thank you so much for the very detailed answer! I've updated my app to convert the "purple" normal images to use 1 for alpha and make the G and B values the same:

    Code (CSharp):
    1. Bitmap normalFile = new Bitmap(normalFileName);
    2. Bitmap unityNormalFile = new Bitmap(normalFile.Width, normalFile.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
    3.  
    4. for (int x = 0; x < normalFile.Width; x++)
    5. {
    6.     for (int y = 0; y < normalFile.Height; y++)
    7.     {
    8.         Color normalPixel = normalFile.GetPixel(x, y);
    9.         unityNormalFile.SetPixel(x, y, Color.FromArgb(normalPixel.R, 255, normalPixel.G, normalPixel.G));
    10.     }
    11. }
    Then in the game, I'm loading the image from the web like so:

    Code (CSharp):
    1. byte[] data = client.DownloadData(uri);
    2. texture = new Texture2D(2, 2, UnityEngine.Experimental.Rendering.GraphicsFormat.RGBA_DXT5_UNorm, -1, UnityEngine.Experimental.Rendering.TextureCreationFlags.MipChain);
    3. texture.LoadImage(data);
    If I look at the first pixel, there is still a difference between the imported texture and the converted texture, but it's pretty close:

    Imported: RGBA(1.000, 0.541, 0.549, 0.443)
    Converted: RGBA(1.000, 0.510, 0.518, 0.443)
     
  15. radiantboy

    radiantboy

    Joined:
    Nov 21, 2012
    Posts:
    1,633
    I used texImporter.convertToNormalmap and texImporter.normalmap to get around this as far as I can see from my code.
     
  16. Willchenyang

    Willchenyang

    Joined:
    Jul 20, 2018
    Posts:
    14
    I have a similar problem, but I want to fix it at the shader level instead of modifying the texture (which increases the image render time). I am working with Unity 2018.4 now and I have some legacy 5.6 asset bundles that need to be supported. When the assets load, they have this normal map: see attached image. If I use the DTXnm2RGBA method you guys posted above, it works totally fine for me. However, I want to do this at the shader level. Is it possible? I tried to move this function inside
    UnpackScaleNormalRGorAG in the SpaceUnityStandardUtils.cginc but it is not working. Thanks
     

    Attached Files:

  17. fra3point

    fra3point

    Joined:
    Aug 20, 2012
    Posts:
    269
    You can write something like this (just translated from C#):
    Code (CSharp):
    1. fixed4 DTXnm2RGBA(fixed4 col)
    2. {
    3.     fixed4 c = col;
    4.     c.r = c.a*2-1;  //red<-alpha (x<-w)
    5.     c.g = c.g*2-1; //green is always the same (y)
    6.     fixed2 xy = fixed2(c.r, c.g); //this is the xy vector, can be written also just "c.xy"
    7.     c.b = sqrt(1-clamp(dot(xy, xy), 0, 1)); //recalculate the blue channel (z)
    8.     return fixed4(c.r*0.5f+0.5f, c.g*0.5f+0.5f, c.b*0.5f+0.5f, 1); //back to 0-1 range
    9. }
    Then, you can use this function in your fragment function to convert all pixels like this:
    Code (CSharp):
    1. fixed4 frag (v2f i) : SV_Target
    2. {
    3.     fixed4 col = tex2D(_MainTex, i.uv);
    4.     return DTXnm2RGBA(col);
    5. }
     
    Last edited: Jan 30, 2020
  18. Willchenyang

    Willchenyang

    Joined:
    Jul 20, 2018
    Posts:
    14
    Hi fra3point, thanks for your reply. I am trying to modify the standard shader directly. Looks like I can change the normal value there rather than converting the texture first.
    So I am trying to use
    Code (CSharp):
    1. half3 UnpackScaleNormalDXT5nm(half4 packednormal, half bumpScale)
    2. {
    3.     half3 normal;
    4.     normal.xy = (packednormal.wy * 2 - 1);
    5.     #if (SHADER_TARGET >= 30)
    6.         // SM2.0: instruction count limitation
    7.         // SM2.0: normal scaler is not supported
    8.         normal.xy *= bumpScale;
    9.     #endif
    10.     normal.z = sqrt(1.0 - saturate(dot(normal.xy, normal.xy)));
    11.     return normal;
    12. }
    But packednormal.w doesn't seem to have correct value.
     
  19. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    Is that the normal map as it exists in the asset bundle, or after you've converted it? Because either way it's not quite right. The red channel is way too dark, almost like it's been scaled or had some gamma correction applied. The green and blue channels look fine enough. Probably because you're generating the new textures using the wrong color space... but you shouldn't need to do any conversion if you're planning on modifying shaders anyway.

    Assuming this is after the conversion pass you've done: to convert pre-2017.1 normal maps to post-2017.1, the only difference is the red channel is white instead of a copy of the green channel. For the shader change, use the
    UnpackScaleNormalDXT5nm
    function you posted above in your shader instead of the
    UnpackScaleNormal
    function and do nothing to the textures and it'll work fine.
     
  20. Willchenyang

    Willchenyang

    Joined:
    Jul 20, 2018
    Posts:
    14
    Hi bgolus,
    Thank you for your answer. That normal map is wrong because I tried to load a 5.6 asset bundle in Unity 2018; if I open this asset bundle in Unity 5.6 it looks perfectly fine. So here is what I have after trying this function:
    Code (CSharp):
    1. half3 UnpackScaleNormalDXT5nm(half4 packednormal, half bumpScale)
    2. {
    3.     half3 normal;
    4.     normal.xy = (packednormal.wy * 2 - 1);
    5.     #if (SHADER_TARGET >= 30)
    6.         // SM2.0: instruction count limitation
    7.         // SM2.0: normal scaler is not supported
    8.         normal.xy *= bumpScale;
    9.     #endif
    10.     normal.z = sqrt(1.0 - saturate(dot(normal.xy, normal.xy)));
    11.     return normal;
    12. }
    Left is the one with the correct normal map, second is the one with the "blue" normal map.
    As you can see they look the same, but the render result is wrong.

    After I change UnpackScaleNormalDXT5nm function to this:
    Code (CSharp):
    1. half3 UnpackScaleNormalDXT5nm(half4 packednormal, half bumpScale)
    2. {
    3.     half3 normal;
    4. packednormal.w *= packednormal.x;
    5.     normal.xy = (packednormal.wy * 2 - 1);
    6.     #if (SHADER_TARGET >= 30)
    7.         // SM2.0: instruction count limitation
    8.         // SM2.0: normal scaler is not supported
    9.         normal.xy *= bumpScale;
    10.     #endif
    11.     normal.z = sqrt(1.0 - saturate(dot(normal.xy, normal.xy)));
    12.     return normal;
    13. }
    They look like this:
    As you can see, the one with the correct normal map looks correct, but the one with the "blue" normal looks different. So it seems the "blue" normal doesn't have the correct red channel. Any way I can fix this?

    If I use an additional shader pass that uses the UnpackScaleNormalDXT5nm function to convert the normal map to RGBA, and then use
    UnpackScaleNormalRGorAG, it works perfectly for me.

    Thanks.
     

    Attached Files:

    Last edited: Feb 6, 2020
  21. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    The "wrong.png" you posted, where is that coming from? I mean, I know it's the normal map from the 5.6 asset bundle, but how are you extracting it & saving it out? The reason I ask is whatever that texture is, it's not setup like any default Unity normal map. I would expect if it was exactly the asset as stored unaltered, it'd be a greyscale texture w/ alpha. The default behavior for 5.6 was to copy the red channel to the alpha, and then copy the green channel to the red and blue, resulting in a greyscale version of the green channel if you ignore the alpha. What I'm seeing there looks like what would happen if you tried to reconstruct the normal map in the 2017 "RGorAG" style multiplying the two together, which won't work.

    The fact that it does work for you when calling UnpackScaleNormalDXT5nm in a separate conversion pass... that makes no sense to me unless you're leaving out some details, because the output of that function (as posted above) would be in a -1 to 1 range, which UnpackScaleNormalRGorAG would also mangle. Are you using that function, then scaling back to 0.0 to 1.0 before outputting it to a render texture? What format / sRGB mode is that render texture?

    Also, what format are the "new" textures in when viewed in the inspector? Does it show them as DXTnm?
     
  22. Willchenyang

    Willchenyang

    Joined:
    Jul 20, 2018
    Posts:
    14
    Thanks for your help !
    The "wrong.png" is a screenshot I took from the inspector. That is how it looks. And you are right: after I save the texture as RGBA it is like this:
    apt2b_melrosesofa_fabric.png
    And when using UnpackScaleNormalDXT5nm in a separate pass, yes, I did convert it back to the 0-1 range. I've attached the shader.
    Code (CSharp):
    1. half4 CustomUnpackNormal(half4 packednormal){  
    2.                 half4 normal;
    3.                 normal.xy = packednormal.wy * 2 - 1;
    4.                 normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
    5.                 normal.xyz = normal.xyz*0.5+0.5;
    6.                 return half4(normal.xyz,1);
    7.             }
    This is the normal after convert:
    apt2b_melrosesofa_fabric_fixed.png
    I am using an RGBA render texture; after the conversion it shows RGBA instead of DXTnm in the inspector.

    Update: I was able to submit a ticket and get help from Unity. They were able to reproduce it and I am waiting for their response. Will update once they have a fix.
     

    Attached Files:

  23. Bersaelor

    Bersaelor

    Joined:
    Oct 8, 2016
    Posts:
    111
    So, in our app, what I ended up doing is using a slightly modified version of the default `Standard` shader (just create a new surface shader, and add a shader_feature and the normal map). The suggestion to do
    Code (CSharp):
    1.                 o.Normal = tex2D(_BumpMap, IN.uv_BumpMap)*2-1;
    2.  
    came from @Wolfram in this thread.

    Is this efficient? Is this a solution that more seasoned Unity devs would also use?
    It seems to work on WebGL, will try on mobile later tomorrow.
     

    Attached Files:

  24. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    That's what the built in
    UnpackNormal()
    function already does, at least on mobile.
     
  25. Bersaelor

    Bersaelor

    Joined:
    Oct 8, 2016
    Posts:
    111
    But `UnpackNormal` doesn't work with textures that are downloaded from the web, i.e. via `UnityWebRequestTexture.GetTexture`. I feel like manually converting all downloaded images to the normal format that Unity's default shaders unpack with `UnpackNormal` would create a lot of overhead.
    My app is basically a catalogue viewer for a large database of materials stored in the backend.
     
  26. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    As noted above, since 2017.1
    UnpackNormal()
    should work on normal maps that have been packed by the editor as DXT5nm, BC5, or "as is" without any issue. If it doesn't, it means the textures have an alpha channel, which as normal maps they should not.

    But my comment was more in response to:
    The
    normalTex * 2.0 - 1.0
    is the standard normal map expansion that's been used for the last 20+ years. If you look at the example functions above, they all have that. The only difference between most normal map packing techniques is which channels of the texture are used, and whether or not the z channel is reconstructed. In your case, if you're just loading the raw texture, you're not reconstructing the z, so that line is all you need.
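    That expansion is just a range remap from the texture's 0..1 to the axis's -1..1 (a trivial sketch in Python):

    ```python
    def expand(channel):
        """Standard normal-map expansion: 0..1 texture value -> -1..1 axis."""
        return channel * 2 - 1

    print([expand(v) for v in (0.0, 0.5, 1.0)])  # -> [-1.0, 0.0, 1.0]
    ```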
     
    Bersaelor likes this.
  27. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,342
    tldr: * 2 - 1 is fine.
     
    Bersaelor likes this.