How to layer textures on specific faces?

Discussion in 'General Graphics' started by froztwolf_unity, Jul 1, 2021.

  1. froztwolf_unity

    froztwolf_unity

    Joined:
    Nov 21, 2018
    Posts:
    20
    Hi.
I'm creating some customizable dice and I'm having a hard time finding a good way to set up the materials and UVs for them.

    The die itself should have a background texture, but then some faces should be able to overlay a second texture to indicate the value of each face. (imagine putting stickers on physical dice to change what they do)

I'm modelling the dice in Blender, and I've had some success with copying the faces that should take the overlays into new meshes and applying a different material to each of them, but I feel that this must be a sub-optimal way to do things, especially when I get to 20-sided dice, both in terms of workflow and rendering.

I'm new to Blender, and intermediate at Unity at best, so I'm not sure if the magic trick I need to pull is in Blender or Unity, but I'd love to hear your ideas on how to do this.
    Keep in mind that I'm not comfortable writing shaders in HLSL, and I'd strongly prefer this to work across all render pipelines if possible.
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
Unity’s done a lot of work to make sure that’s not (easily) possible. :(


The most straightforward way to do this is to have a unique material per side. You can be slightly fancy and have a single material for the body, and then unique materials for the faces on cards that sit just above the surface, using an alpha-tested or alpha-blended material.

If you’re using URP or HDRP, you can use Shader Graph to set up a basic layered shader where you use a base texture for the dice body and some kind of alpha-blended texture on top, perhaps using a second UV, so you don’t need the second set of geometry.
     
  3. froztwolf_unity

    froztwolf_unity

    Joined:
    Nov 21, 2018
    Posts:
    20
    Thanks
I'm currently using the second option you mentioned, as I'd rather restrict to specific render pipelines than have extra geometry and extra draw calls. At least it's decently optimized and the workflow for myself isn't too bad.

But it requires a single texture atlas for all the values, instead of a per-face one, and because I'm using Shader Graph it will only work in URP and HDRP. So it won't be nearly as easy for end users as I'd hoped.
    The former I could fix if I spent the time making a tool that takes in the texture for each face and automatically updates the atlas based on the UV mapping. Sounds a bit tricky to make, though.

    The latter I guess I could fix if I learned how to write shaders in both GLSL and HLSL.

    Thanks for the reply. At least I know I'm not missing anything obvious.
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
A texture atlas is indeed the other option, and is the one that's going to be compatible with the most platforms. Another option would be to use a texture array, though that would exclude some lower-end mobile devices. For limiting the entire die to a single material & mesh, an atlas or array is the most efficient option.

For common 6-sided dice, you could also use a cubemap. Really, that would work for all styles of dice, though it gets a little more complicated for anything not using 6 sides.

Generating atlases in the editor, or even at runtime, isn't as scary as it sounds. At runtime you can take individual face textures, create a render texture to use as an atlas, and Graphics.Blit() the faces into the render texture. This has the unfortunate side effect of requiring you to use an uncompressed texture, as you can't use compressed formats in a render texture. Alternatively, you can use Graphics.CopyTexture(), which would let you copy from individual face textures to a larger atlas Texture2D with the same format, but there are complications with mip mapping due to compressed texture format block sizes.
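
    A minimal runtime sketch of the render-texture atlas idea. Note that Graphics.Blit() always fills the entire target, so this sketch uses Graphics.DrawTexture() under a pixel-space matrix to place each face into its own cell instead; the field names, face size, and 3-column layout are all placeholder assumptions:

    ```csharp
    using UnityEngine;

    // Sketch: pack individual face textures into a RenderTexture atlas at runtime.
    // Assumes all faces are square and the same size; layout is a simple grid.
    public class DiceAtlasBuilder : MonoBehaviour
    {
        public Texture2D[] faceTextures; // e.g. 6 textures for a d6, assigned in the inspector

        public RenderTexture BuildAtlas(int faceSize = 256, int columns = 3)
        {
            int rows = Mathf.CeilToInt(faceTextures.Length / (float)columns);
            var atlas = new RenderTexture(columns * faceSize, rows * faceSize, 0);
            atlas.Create();

            var previous = RenderTexture.active;
            RenderTexture.active = atlas;
            GL.PushMatrix();
            // Pixel-space projection so the Rects below are in atlas pixels.
            GL.LoadPixelMatrix(0, atlas.width, atlas.height, 0);
            GL.Clear(true, true, Color.clear);

            for (int i = 0; i < faceTextures.Length; i++)
            {
                int x = (i % columns) * faceSize;
                int y = (i / columns) * faceSize;
                Graphics.DrawTexture(new Rect(x, y, faceSize, faceSize), faceTextures[i]);
            }

            GL.PopMatrix();
            RenderTexture.active = previous;
            return atlas;
        }
    }
    ```

    The resulting RenderTexture is uncompressed, as noted above, which is the main cost of this approach.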

In the editor you can do the same CopyTexture(), but you can also force texture assets to be imported uncompressed, and then save them back out to disk as a .png image to be reimported, so it's more a creation tool than something that runs during the game.
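
    A sketch of the editor-side "save it back out" step, assuming the atlas has already been rendered into a RenderTexture (the class and method names are placeholders):

    ```csharp
    using System.IO;
    using UnityEngine;

    // Editor-tool sketch: read an atlas RenderTexture back to the CPU
    // and write it to disk as a .png so Unity can reimport it as an asset.
    public static class AtlasSaver
    {
        public static void SaveAtlasAsPng(RenderTexture atlas, string path)
        {
            var previous = RenderTexture.active;
            RenderTexture.active = atlas;

            // Read the GPU pixels into a CPU-side, uncompressed Texture2D.
            var readable = new Texture2D(atlas.width, atlas.height, TextureFormat.RGBA32, false);
            readable.ReadPixels(new Rect(0, 0, atlas.width, atlas.height), 0, 0);
            readable.Apply();

            RenderTexture.active = previous;
            File.WriteAllBytes(path, readable.EncodeToPNG());
            // In the editor, an AssetDatabase.Refresh() afterwards triggers the reimport.
        }
    }
    ```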

Texture arrays, specifically a Texture2DArray, are just like a Texture2D, but have individual layers that can be indexed. At runtime you can build these using the same CopyTexture() to copy from an existing Texture2D of the same format. You can also take a texture file that's been saved as an atlas and import it as an array in Unity 2020.2 and newer; otherwise you'd have to generate them in the editor the same way as you would for runtime, though they can be saved as a .asset file.
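
    The CopyTexture() route into a Texture2DArray might look like this sketch, which assumes every face texture has identical size, format, and mip count (a hard requirement of CopyTexture):

    ```csharp
    using UnityEngine;

    // Sketch: build a Texture2DArray from same-sized, same-format face textures
    // using GPU-side copies, so compressed formats work too.
    public static class FaceArrayBuilder
    {
        public static Texture2DArray Build(Texture2D[] faces)
        {
            var first = faces[0];
            var array = new Texture2DArray(
                first.width, first.height, faces.Length,
                first.format, first.mipmapCount > 1);

            for (int i = 0; i < faces.Length; i++)
            {
                // Copies all mip levels of the source into layer i on the GPU.
                Graphics.CopyTexture(faces[i], srcElement: 0, array, dstElement: i);
            }
            return array;
        }
    }
    ```

    The shader then samples the array with a per-face layer index, e.g. driven by a material property or vertex data.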




On the topic of writing a shader that works on all render pipelines (built-in, URP, and HDRP): currently the only way to do that is with Better Shaders, and even that only supports a limited selection of SRP versions. Amplify Shader Editor, a third-party node-based shader editor, supports making shaders for all of the render pipelines as well, but AFAIK you can't have your shader support more than one pipeline at a time. I might be wrong on that though. Even Unity's Shader Graph has somewhat broken support for targeting both URP and HDRP with a single shader, as different versions of Shader Graph don't always work across both SRPs, or even across all versions of the same SRP! They did try not to break that, but there are bugs that keep it from being 100% reliable. The "best" case apart from Better Shaders is that people ship the URP and HDRP shaders as zip files for users to manually extract as a workaround.

No need to learn GLSL. Just learn a little HLSL. A Surface Shader will do 90% of the work for you to get something working for the built-in renderer. Like Shader Graph, it mostly limits you to writing just the bit of code that defines the surface properties: albedo, normal, etc.
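
    As a rough illustration of how little code a Surface Shader needs for the layered-texture case described earlier in the thread (base texture plus an alpha-blended overlay on a second UV set) — property names here are placeholders, not anything from the project:

    ```shaderlab
    // Sketch of a built-in pipeline surface shader: base texture on UV0,
    // alpha-blended face overlay sampled from the second UV set.
    Shader "Custom/DiceLayered"
    {
        Properties
        {
            _MainTex ("Base (RGB)", 2D) = "white" {}
            _FaceTex ("Face Overlay (RGBA)", 2D) = "black" {}
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            CGPROGRAM
            #pragma surface surf Standard

            sampler2D _MainTex;
            sampler2D _FaceTex;

            struct Input
            {
                float2 uv_MainTex;   // first UV set
                float2 uv2_FaceTex;  // second UV set (uv2 prefix)
            };

            void surf (Input IN, inout SurfaceOutputStandard o)
            {
                fixed4 baseCol = tex2D(_MainTex, IN.uv_MainTex);
                fixed4 face = tex2D(_FaceTex, IN.uv2_FaceTex);
                // Alpha-blend the overlay over the base colour.
                o.Albedo = lerp(baseCol.rgb, face.rgb, face.a);
            }
            ENDCG
        }
        FallBack "Diffuse"
    }
    ```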
     
  5. froztwolf_unity

    froztwolf_unity

    Joined:
    Nov 21, 2018
    Posts:
    20
    I have a working solution with an atlas and shader graph that I'll use for the time being, but texture arrays are interesting indeed.
I'm not too worried about blitting them together; it's more about making sure that each texture is translated, rotated, and scaled to land in exactly the right place in the atlas. I'm trying to minimize seams, so each die has the smallest possible number of UV islands, which means a lot of rotations and a bit of distortion. (Something I may reconsider if I keep the atlas, as I want customization to be simple.)

    I'd have to transform each texture by hand to precisely match every face of every die. Might not be too bad if I can find a neat way of building tooling for it, but it's a lot of magic numbers and a ton of re-work if the meshes ever change.
    This would for sure be a creation tool. I'm not planning to support face customization at run-time. (although I'm sure it would be possible to make that work too)

For now I've built a shader with Shader Graph that works well in URP. I'll do something that works in HDRP and perhaps just not support the built-in pipeline. Still mulling it over, but the point is simplicity of customization for the end user, and I feel I'm hitting a place where I may need to trade that off against universality.

At least it looks nice now, and I can do pretty-looking showcases while I work on the other necessary features, like collision audio, support for a variety of different dice at the same time, etc.

    Thanks for the help :)