Texture Importer: Using 8 bits better..

Discussion in '5.5 Beta' started by jbooth, Aug 27, 2016.

  1. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    One improvement I'd really like to see from the texture preprocessor is the ability to access the raw 16bpc data from an image and modify it before it hits the compressor, as well as store data alongside that modification which stays with the texture and can easily be accessed in the shader.

    The basic use case is for things like normal maps, which are often heavily quantized on platforms that only support 8 bits per channel. However, on many surfaces with subtle gradients, the full range of the normal data is not used, or is clustered primarily around 0.5.

    Ideally, what I'd do is take the original 16bpc data and normalize it to make full use of the 8 bits of precision, or apply a non-linear transform that uses those bits more effectively. Then, I'd write out some data associated with the texture to be used in the shader, so I can remap those values back to the proper range.

    When applied to every texture, this is the type of thing that can make every pixel on screen look better, as well as reduce specular shimmer/noise/aliasing from heavily quantized normals, which is a huge issue with PBR rendering. But right now, the process for doing this in Unity would require pre-processing your textures in an external app and manually copying the data needed to remap the ranges into the material.

    If this could be applied/generated as part of the import step, then unpacked via a macro in the shader, much like UV transformations are handled currently, it would improve the overall look of every Unity game on every platform. For many users, this would make a bigger difference than support for formats like BC6/7, as those formats are only available on a limited set of platforms and consume considerably more memory.
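    Roughly what I'm picturing, as a sketch only (the names here, like _NormalRemap, are made up, and a real version would run on the 16bpc source data inside the importer):

        using UnityEngine;

        // Sketch only: rescale one channel so the narrow band it actually uses fills
        // the full 0..1 range, and return the scale/offset the shader needs to undo
        // the transform after sampling. A real version would run on the 16bpc source.
        public static class NormalRemap
        {
            public static void NormalizeChannel(float[] channel, out float scale, out float offset)
            {
                float min = 1f, max = 0f;
                foreach (float v in channel)
                {
                    if (v < min) min = v;
                    if (v > max) max = v;
                }
                scale = Mathf.Max(max - min, 1e-5f);
                offset = min;

                for (int i = 0; i < channel.Length; i++)
                    channel[i] = (channel[i] - offset) / scale;

                // In the shader, the inverse is just: original = sampled * scale + offset,
                // e.g. normal.xy = packed.xy * _NormalRemap.xy + _NormalRemap.zw
            }
        }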
     
    Noisecrime likes this.
  2. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Yes, overall this would be a useful thing. I think it would have to be an option though, since to make use of it, you have to make sure all your shaders are aware of it and do the proper range adjustment on the texture data after fetching it. While we could do this for all built-in shaders, there's a ton of people who write custom shaders, or who somehow have custom shaders but don't know how to adjust them to do what's needed.

    Anyway, will think about it! Very likely not for 5.5 though.

    I'm gonna do a "well, actually..." here :) That statement is true for mobile, but on PC/console, not so much. BC5 in particular ("the thing useful for normal maps") has been supported by all GPUs for the last 10 years, and even in APIs like DX9. And it takes the same amount of space as DXT5. Same with BC7: it takes the same amount of space as DXT5 (but yes, this one requires a more modern GPU).
     
  3. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I did stress "for many users". If you're shipping a PC/console-only game, then yes, you have those options available and would likely feel sad if you couldn't use them. Unfortunately, it's my experience that very few people pay attention to these kinds of details, so "Automatic" is what you'll see most people set things to.

    Yeah, it obviously needs to be opt-in in some fashion. I'd be more than happy to simply have the capability to extend the pipeline in this manner with nothing automatically done by Unity, but an "it just works for everyone who opts in" type of solution would be pretty awesome, since it would help those same people who don't futz with the texture compression settings. I was originally thinking this would be somewhat analogous to the macro for transforming texture coordinates that already exists in Unity, but it's a bit more complex than that, because it requires matching data between the shader and the texture, which doesn't really exist in that case. A safer route for Unity would likely be to just give users the capability to add this type of functionality and leave it at that.
     
  4. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,799
    Can't you already do it by using the postprocessor? You'd need to keep the original in Truecolor, do the normalization and store the values somewhere to be able to reverse it later, then duplicate the texture (save as a new png?) and compress it? (wouldn't this work? or am I missing something?)

    It would be a pretty awkward workflow though.
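    Something along these lines in an AssetPostprocessor is roughly what I mean (completely untested; the "_norm" suffix and the userData format are just placeholders, and I'm not sure where compression kicks in relative to the postprocessor, so it may still need the reimport trick):

        using UnityEditor;
        using UnityEngine;

        // Rough sketch of the postprocessor idea, not a tested workflow.
        public class NormalRangePostprocessor : AssetPostprocessor
        {
            void OnPostprocessTexture(Texture2D texture)
            {
                if (!assetPath.Contains("_norm")) return;   // only touch opted-in textures

                Color[] pixels = texture.GetPixels();
                float minR = 1f, maxR = 0f, minG = 1f, maxG = 0f;
                foreach (Color c in pixels)
                {
                    minR = Mathf.Min(minR, c.r); maxR = Mathf.Max(maxR, c.r);
                    minG = Mathf.Min(minG, c.g); maxG = Mathf.Max(maxG, c.g);
                }

                float rangeR = Mathf.Max(maxR - minR, 1e-5f);
                float rangeG = Mathf.Max(maxG - minG, 1e-5f);
                for (int i = 0; i < pixels.Length; i++)
                {
                    pixels[i].r = (pixels[i].r - minR) / rangeR;
                    pixels[i].g = (pixels[i].g - minG) / rangeG;
                }
                texture.SetPixels(pixels);
                texture.Apply(true);

                // Stash the de-normalization values with the asset so a later step
                // can copy them onto materials (the format here is arbitrary).
                assetImporter.userData = string.Format("{0},{1},{2},{3}", rangeR, minR, rangeG, minG);
            }
        }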
     
  5. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    I think the easiest way to do it in the current pipeline, automated, would be to do something like the following:

    Import the texture and set it to uncompressed, then write a flag into userData through some custom editor
    As a pre-build step, process your texture and reimport it with the option set to compress the texture. Store the de-normalization parameters in some dictionary based on texture.
    As a pre-build step, process all the materials in your game, setting the denormalization values on them based on the textures they use.
    Build your game
    Revert all the changes you made on the asset importers and materials

    We already do something similar to this for generating asset bundle variations in our pipeline (to get around the asset bundle variant system wanting 3 copies of every texture in our tree for 3 variant levels), and quite frankly, I hate having to integrate things this way. It tends to break from time to time, so we have to monitor our build machines' checkout lists to make sure everything got reverted correctly, and it relies on the cache server working 100% of the time since it's constantly re-importing textures.

    So yeah, it can be automated, but it's not exactly easy or error proof.
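    For what it's worth, the material pass in those steps would look something like this (very rough sketch; "_NormalRemap" is a made-up property that a matching custom shader would have to read, and the userData format is whatever the import step wrote):

        using UnityEditor;
        using UnityEngine;

        // Very rough sketch of the pre-build material pass described above.
        public static class NormalRemapBuildStep
        {
            [MenuItem("Build/Apply Normal Remap Values")]
            static void ApplyRemapValues()
            {
                foreach (string guid in AssetDatabase.FindAssets("t:Material"))
                {
                    string path = AssetDatabase.GUIDToAssetPath(guid);
                    Material mat = AssetDatabase.LoadAssetAtPath<Material>(path);
                    if (mat == null || !mat.HasProperty("_BumpMap")) continue;

                    Texture normal = mat.GetTexture("_BumpMap");
                    if (normal == null) continue;

                    AssetImporter importer = AssetImporter.GetAtPath(AssetDatabase.GetAssetPath(normal));
                    if (importer == null || string.IsNullOrEmpty(importer.userData)) continue;

                    // userData was written at import time, e.g. "rangeR,minR,rangeG,minG".
                    string[] parts = importer.userData.Split(',');
                    if (parts.Length != 4) continue;
                    mat.SetVector("_NormalRemap", new Vector4(
                        float.Parse(parts[0]), float.Parse(parts[2]),   // scales in xy
                        float.Parse(parts[1]), float.Parse(parts[3]))); // offsets in zw
                    EditorUtility.SetDirty(mat);
                }
                AssetDatabase.SaveAssets();
            }
        }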
     
  6. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,054
    I really like this idea, especially as it's just given me an idea for my current project, where I think the lack of precision was really hurting the final rendering of normal maps on mobile. The issue being that the normal map had such slight variations (varying from the baseline 128 by 1 or 2) that it just looked bad. Re-normalising should fix that, and I don't know why it never occurred to me before. Of course, whether I'll have time to implement it is a different matter, but thanks for the idea.
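    Just to put rough numbers on it for my own sanity (back-of-the-envelope, assuming an 8-bit target):

        using UnityEngine;

        // Quick sanity check on the precision gain for a channel that only spans 126..130.
        public static class RemapGainExample
        {
            public static void Log()
            {
                float usedRange  = (130 - 126) / 255f;  // the band the data actually occupies (~1.6%)
                float stepBefore = 1f / 255f;           // one 8-bit step across the full range
                float stepAfter  = usedRange / 255f;    // one 8-bit step after normalizing that band
                Debug.Log(stepBefore / stepAfter);      // ~64x finer quantization within the band
            }
        }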
     
  7. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    You can do this for just one texture pretty easily - we've done it in the past for particularly problematic textures. But it's the kind of thing I'd roll out project-wide if I had an easy way to automate it. Also keep in mind that in some specific cases a non-linear transform may be worth the extra shader cost, allowing you to get a better representation of the original data. For instance, a car normal map baked from a high-res model might contain mostly subtle gradients but the occasional hard angle for the bevels. In practice, normalization might not do much there, since those hard bevels set the range; but if you store the data with most of the precision around the 0.5 range and very little precision on the hard angles, you can get a very good gradient representation, and while the hard angles come out much rougher, it won't matter much because they're just there to catch the light.

    In the end, this is why I'd prefer Unity enable people to build this kind of pipeline themselves over trying to just solve it for everyone. It seems more 'the Unity way' anyway; but there are many options beyond just a simple normalization of the data, though certainly that helps in many cases. (Also, if you do it, process each channel independently.)
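    To give a concrete flavour of the non-linear option, something like a per-texture curve around 0.5 could work (sketch only; the gamma value would be chosen per texture and passed to the shader alongside the remap values, and none of this is an existing Unity feature):

        using UnityEngine;

        // Sketch of the non-linear idea: spend most of the 8 bits near 0.5 (flat areas)
        // and fewer on the extremes (hard bevels). gamma > 1 concentrates precision near 0.5.
        public static class NormalCurveEncode
        {
            // Encode at import time: expand small deviations from 0.5 so they survive quantization.
            public static float Encode(float x, float gamma)
            {
                float d = 2f * (x - 0.5f);   // -1..1, 0 = perfectly flat
                return 0.5f + 0.5f * Mathf.Sign(d) * Mathf.Pow(Mathf.Abs(d), 1f / gamma);
            }

            // Decode after sampling (this is what the shader-side macro would do).
            public static float Decode(float x, float gamma)
            {
                float d = 2f * (x - 0.5f);
                return 0.5f + 0.5f * Mathf.Sign(d) * Mathf.Pow(Mathf.Abs(d), gamma);
            }
        }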