
Why are detail normals inverted on UV1 when using built-in Unity object?

Discussion in 'Shaders' started by Invertex, Jun 17, 2016.

  1. Invertex

     Joined: Nov 7, 2013
     Posts: 1,495
    I noticed this when comparing Unity's shader to a shader I'm making that has detail normal blending. If a normal map that, for example, has a scratch in it is used on UV0 in the detail normals of the Standard shader, it appears properly as a crevice, but if it's set to UV1, Unity applies it subtractively, creating a bump. This only seems to happen on Unity's built-in objects like the Sphere, Cube, etc. Is there a reason for this?


     
  2. bgolus

     Joined: Dec 7, 2012
     Posts: 12,209
    Normal mapping relies on the orientation of the texture coordinates, or UVs. The orientation of those UVs is effectively stored in the mesh "tangents" in conjunction with the mesh's surface normals. Only one set of normals and tangents is stored per mesh, though, and they're usually calculated from the primary UV set (aka UV0).
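    To illustrate how tangents come out of the UVs, here's the generic per-triangle tangent formula (a sketch for illustration, not Unity's exact importer code): it solves the triangle's position edges against its UV edges, so the resulting tangent direction is entirely determined by how the UVs are laid out.

    ```csharp
    using UnityEngine;

    static class TangentExample
    {
        // Per-triangle tangent from positions p0..p2 and their UVs uv0..uv2.
        // Solves: edge1 = dUV1.x * T + dUV1.y * B
        //         edge2 = dUV2.x * T + dUV2.y * B   for the tangent T.
        static Vector3 TriangleTangent(
            Vector3 p0, Vector3 p1, Vector3 p2,
            Vector2 uv0, Vector2 uv1, Vector2 uv2)
        {
            Vector3 edge1 = p1 - p0, edge2 = p2 - p0;
            Vector2 dUV1 = uv1 - uv0, dUV2 = uv2 - uv0;
            float r = 1f / (dUV1.x * dUV2.y - dUV2.x * dUV1.y);
            // Flip or rotate the UVs and this direction flips/rotates with
            // them, which is why tangents built from UV0 won't match a
            // differently oriented UV1.
            return ((edge1 * dUV2.y - edge2 * dUV1.y) * r).normalized;
        }
    }
    ```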

    The built-in meshes all have their secondary UVs set up for lightmapping, which don't match the primary UVs. Imported meshes usually don't have a secondary UV set, and Unity doesn't automatically generate lightmap UVs for them unless you enable that option for each mesh; instead, Unity copies the first UV set across all of the UV sets. Because of this, your custom meshes likely work fine using either UV0 or UV1, since they are actually the same UVs and thus the mesh tangents work properly for both.
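    If you want to verify this on a given mesh, a quick check along these lines should work (a sketch; as far as I know `Mesh.uv2` returns an empty array when no second set was ever assigned):

    ```csharp
    using UnityEngine;

    static class UvCheck
    {
        // Returns true when the mesh has no distinct secondary UV set,
        // i.e. UV0-based tangents should also be valid when sampling on UV1.
        static bool SecondaryUvsMatchPrimary(Mesh mesh)
        {
            Vector2[] uv0 = mesh.uv;
            Vector2[] uv1 = mesh.uv2; // Mesh.uv2 is the second UV set (aka UV1)
            if (uv1.Length == 0) return true; // no separate set at all
            if (uv1.Length != uv0.Length) return false;
            for (int i = 0; i < uv0.Length; i++)
                if (uv0[i] != uv1[i]) return false;
            return true;
        }
    }
    ```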

    Unfortunately there's no easy fix for this when using the built-in Standard shader and built-in meshes. With your own meshes, if you want to use a custom secondary UV, keeping the orientations of both UV sets as close as possible will make things work better. Alternatively, you could use a custom shader that uses screen-space derivatives to calculate the tangents for the detail normal maps. There's actually some experimental code within Unity's various .cginc files that could be used toward this end, but it's sadly not a simple task. I might suggest filing a bug for it in the issue tracker.

    If you're curious about derivative normal mapping you can check out this page:
    http://www.thetenthplanet.de/archives/1180

    That's written in GLSL, and there are some oddities with this technique in Unity regarding the way it flips the projection space, but it does work. I'd personally love to see it implemented in Unity by default.
     
  3. Invertex
    Ah, OK, so it's just an issue with how the Unity assets' UV1s are set up. Alright, thanks.

    Yeah, it's not actually an issue for me; I was just curious about it. I implemented the reoriented normal mapping technique described here, which gives better results than Unity's current method at the same cost:
    http://blog.selfshadow.com/publications/blending-in-detail/
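    For reference, the reoriented normal mapping blend from that page, translated from the article's shader listing into C#-style vector math (a sketch of the math only; inputs are the raw 0..1 texture samples):

    ```csharp
    using UnityEngine;

    static class RnmBlend
    {
        // Reoriented normal mapping: rotates the detail normal onto the
        // basis implied by the base normal. n1/n2 are raw 0..1 map samples.
        static Vector3 BlendRnm(Vector3 n1, Vector3 n2)
        {
            Vector3 t = new Vector3(n1.x *  2f - 1f, n1.y *  2f - 1f, n1.z * 2f);
            Vector3 u = new Vector3(n2.x * -2f + 1f, n2.y * -2f + 1f, n2.z * 2f - 1f);
            Vector3 r = t * Vector3.Dot(t, u) / t.z - u;
            return r.normalized; // remap to 0..1 before storing in a texture
        }
    }
    ```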

    Though it raises the question: why are the UV1 tangents reversed on their objects for lightmapping if they don't need to be reversed on my objects for it?
     
  4. bgolus
    That article is about blending two normal maps together with the assumption that both textures already share the same base tangents. The reorienting in that technique refers to the theoretical surface normal direction of the normal map rather than the rotational orientation of the UVs. In the specific situation this post is about, none of the options on that page will help.

    As for your own meshes: are you using baked lighting, and are you either creating your own lightmap UVs or using the Unity importer-generated ones? If not, that's why it's working for you. If you enable lightmap UV generation for your meshes, you may find the detail normals are also wrong for you, even more so if you have baked lighting enabled, as that further modifies a mesh's UVs when generating the lightmap atlas.
     
  5. Invertex
    Oh, no, currently I'm not using baked GI, just the real-time GI. Though I had tried it, and it was causing all kinds of shading garbage on many of the meshes, so I guess that might explain why that was happening.

    What we're doing is using the upcoming material blending workflow, using Substance. So our UV0 is for a base normal map, blend map and AO, and then there are extra inputs in our shader for multiple tiling detail albedo/normal/spec maps that use UV1 with a different layout.

    But there will be instances where UV1 has overlapping UVs, so that definitely wouldn't be good for lightmapping... Is there a way to make Unity use UV2 for the lightmap instead?
     
  6. bgolus
    Yep, that "shading garbage" is caused by the secondary UVs on your meshes overlapping. Lightmap UVs have to be within the 0.0 to 1.0 range (they can't be set up for tiling textures), and every part needs to be uniquely mapped. The Generate Lightmap UVs checkbox in the mesh import options will do this for you. UV2 is actually already used by the realtime GI, and unlike the baked GI lightmaps it doesn't care what's in it to start with; it just blows it away with auto-generated UVs straight away.
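    If you'd rather trigger that unwrap from script instead of the importer checkbox, something along these lines should work (a sketch; `Unwrapping.GenerateSecondaryUVSet` is the editor API that writes a unique, 0..1 unwrap into the mesh's secondary UV channel):

    ```csharp
    using UnityEngine;
    using UnityEditor;

    static class LightmapUvTool
    {
        [MenuItem("Tools/Generate Lightmap UVs For Selection")]
        static void GenerateForSelection()
        {
            foreach (GameObject go in Selection.gameObjects)
            {
                MeshFilter mf = go.GetComponent<MeshFilter>();
                if (mf != null && mf.sharedMesh != null)
                {
                    // Unwraps the mesh into a unique 0..1 layout and stores
                    // it in the secondary UV channel used by the lightmapper.
                    Unwrapping.GenerateSecondaryUVSet(mf.sharedMesh);
                }
            }
        }
    }
    ```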

    Now you might ask, "why can't the baked GI and realtime GI use the same UVs?" The answer is ... it's complicated. They have different requirements. The realtime GI is generally much lower resolution, and I think the UVs are set up to keep everything roughly on a world-space grid. Baked GI is generally higher resolution, a level might be spread across multiple sets of lightmap textures, and the UV mapping is usually set up to try to keep a consistent texel-size-to-surface-area ratio. You can also control the baked GI resolution and the realtime GI resolution separately per object.
     
  7. Invertex
    Thanks, that makes sense. So is there no way for me to force the lightmapping to use some other UV set, like UV3/4? I would guess it would be done in the shader...? Or perhaps the easier solution would be to have our tiling UV set on UV3/4 instead...
     
  8. bgolus
    Unity is hardcoded on the native code side to use UV1 for the lightmap UVs (and UV2 for realtime GI), so changing the shader to use another UV set for lightmaps won't help there. You could use UV0 as you currently do and UV3 for your tiling details, especially since you're already using a custom shader. Or you could bake your own lightmaps outside of Unity, which a lot of people do with Unity 5.0 since the lightmap baker is so terrible; then you could use whichever UV set you want.

    Alternatively, if you're never going to use baked GI, you could just keep using UV1 and not worry about it; just don't use the built-in meshes, or have an option in your shader to use UV0 with a separate offset/scale like the Standard shader.
     
  9. Invertex
    Dang. Well, that's fine I guess; we'll likely stick to the realtime GI for now. I thought about creating an extension script that simply copies UV1 to UV3 to make the artists' lives easier, but for some reason it doesn't seem to actually do it.

    Code (csharp):
    using UnityEngine;
    using UnityEditor;
    using System.Collections.Generic;

    class UV1toUV3 : AssetPostprocessor
    {
        void OnPreprocessModel(GameObject obj)
        {
            MeshFilter importMesh = obj.GetComponent<MeshFilter>();
            List<Vector2> copyUV = new List<Vector2>();
            importMesh.sharedMesh.GetUVs(1, copyUV);
            importMesh.sharedMesh.SetUVs(3, copyUV);
        }
    }
    It just throws:
    I'm not quite sure how to interpret that error...
     
  10. bgolus
    The mesh doesn't exist in OnPreprocessModel, only in OnPostprocessModel. The GameObject doesn't exist at that point either.
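    A version using OnPostprocessModel along these lines should avoid the error. This is a minimal sketch, not tested in the editor; the GetComponentsInChildren loop is my addition, since a model file can contain more than one mesh:

    ```csharp
    using UnityEngine;
    using UnityEditor;
    using System.Collections.Generic;

    class UV1toUV3 : AssetPostprocessor
    {
        // OnPostprocessModel runs after the imported GameObject and its
        // meshes actually exist.
        void OnPostprocessModel(GameObject obj)
        {
            foreach (MeshFilter mf in obj.GetComponentsInChildren<MeshFilter>())
            {
                List<Vector2> copyUV = new List<Vector2>();
                mf.sharedMesh.GetUVs(1, copyUV);  // read UV1 (channel index 1)
                mf.sharedMesh.SetUVs(3, copyUV);  // write UV3 (channel index 3)
            }
        }
    }
    ```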
     
  11. Invertex
    Changing it to OnPostprocessModel does solve the error. I had tried changing it to Preprocess in hopes it would solve my actual problem, which is that once I check "Generate Lightmap UVs", it changes UV3 as well. I'm guessing that generating the lightmap UVs happens before OnPostprocessModel, so I just end up copying the lightmap UV1 to UV3 instead of my original UVs. I've been looking and haven't found any documentation on how to make the copy happen before that.