One improvement I'd really like to see from the texture preprocessor is the ability to access the raw 16bpc data from an image and modify it before it hits the compressor, and to store data alongside that modification that stays with the texture and can be accessed easily in the shader.

The basic use case is normal maps, which are often heavily quantized on platforms that only support 8 bits per channel. On many surfaces with subtle gradients, the full range of the normal data isn't used, or is clustered around 0.5. Ideally, what I'd do is take the original 16bpc data and normalize it to make full use of the 8 bits of precision, or apply a non-linear transform that makes better use of those 8 bits. Then I'd write out some data associated with the texture for use in the shader, so I can remap the values back to their proper range.

Applied to every texture, this is the kind of thing that can make every pixel on screen look better, and it reduces the specular shimmer/noise/aliasing caused by heavily quantized normals, which is a huge issue with PBR rendering. Right now, though, doing this in Unity requires pre-processing your textures in an external app and manually copying the data needed to remap the ranges into the material. If this could be applied/generated as part of the import step, then unpacked via a macro in the shader, much like UV transformations are handled today, it would improve the overall look of every Unity game on every platform. For many users this would make a bigger difference than support for formats like BC6/BC7, since those formats are only available on a limited set of platforms and consume considerably more memory.
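The range-remapping idea above can be sketched in Python with numpy. This is an illustration, not an actual Unity import API: the function names and the simple per-channel min/max scheme are my own, and a real importer would store the offset/scale pair with the texture asset so the shader macro can apply the inverse.

```python
import numpy as np

def remap_for_quantization(normal_16bpc):
    """Remap 16-bit-per-channel data so quantization uses the full 8-bit range.

    Returns the quantized 8-bit texture plus the per-channel (offset, scale)
    the shader needs to undo the remap. Hypothetical helper, not a Unity API.
    """
    # Normalize raw 16-bit values to [0, 1].
    data = normal_16bpc.astype(np.float64) / 65535.0
    # Per-channel range actually used by this texture.
    lo = data.min(axis=(0, 1))
    hi = data.max(axis=(0, 1))
    scale = np.where(hi > lo, hi - lo, 1.0)
    # Stretch the used range to [0, 1] so all 256 steps per channel carry data.
    stretched = (data - lo) / scale
    quantized = np.round(stretched * 255.0).astype(np.uint8)
    return quantized, lo, scale

def shader_unpack(sample_8bpc, lo, scale):
    """The inverse remap a shader macro would apply: value * scale + offset."""
    return (sample_8bpc.astype(np.float64) / 255.0) * scale + lo
```

On a subtle-gradient normal map whose values cluster near 0.5, the stretched quantization reconstructs the original 16bpc values far more closely than quantizing the raw data directly, which is exactly the win described above.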