Mipmaps working incorrectly for 2D sprite?

Discussion in 'General Graphics' started by _eternal, Sep 10, 2018.

  1. _eternal

    _eternal

    Joined:
    Nov 25, 2014
    Posts:
    304
    I'm trying to find the best way to downscale 2D sprites, so I was experimenting with mipmaps. However, the results I'm seeing on-screen don't match with the texture data I'm getting from the sprite.

    In Update, I'm printing the texture's mipmapCount and loadedMipmapLevel, along with its resolution (Texture2D.width and Texture2D.height). When I disable mipmaps in the sprite's import settings, I get the expected result (looks like this):

    mipmap count == 1
    loaded mipmap level == 0
    resolution == 256x512 (this is native)

    Then, when I enable mipmaps in the import settings, the sprite gets noticeably blurry at native res, suggesting that it's rendering at a lower resolution. But when I check which mipmap level was loaded, it still says it's the first one.

    mipmap count == 10
    loaded mipmap level == 0
    resolution == 256x512 (this is native)

    If it helps, here are my import settings: https://my.mixtape.moe/dxaoiw.png

    Am I misunderstanding how loadedMipmapLevel works, or is there something else going on here? Moving the camera, changing the resolution, and even changing the camera to Perspective does not trigger a change in loadedMipmapLevel. It remains blurry even though it says that the native resolution mipmap is the one that's loaded.
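    For reference, a minimal sketch of the logging described above (assumes the script sits on the same GameObject as the SpriteRenderer; the class name is arbitrary):

    ```csharp
    using UnityEngine;

    // Logs the sprite texture's mip info each frame, as described in the post.
    public class MipDebugLogger : MonoBehaviour
    {
        void Update()
        {
            Texture2D tex = GetComponent<SpriteRenderer>().sprite.texture;
            Debug.Log($"mipmap count == {tex.mipmapCount}, " +
                      $"loaded mipmap level == {tex.loadedMipmapLevel}, " +
                      $"resolution == {tex.width}x{tex.height}");
        }
    }
    ```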
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    The loadedMipmapLevel is only information about what mipmap level has been loaded by the CPU and passed to the GPU. When rendering, the GPU decides what mip level to display. Understand that mipmapping’s primary goal is to reduce visible aliasing caused by undersampling; that is, to avoid rendering a texture such that it is displayed smaller than its resolution.

    To that end, when using bilinear filtering, if a 256x256 texture is displayed on geometry that’s only shown at 255x255 pixels across, the GPU will drop down to showing the first mip level. That means it’s a 128x128 texture stretched across 255x255 pixels.
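    As a rough illustration of the selection behaviour described above (a simplified model only; real GPUs compute this per pixel from screen-space derivatives, and rounding details vary by hardware and filter mode):

    ```csharp
    using UnityEngine;

    // Simplified model of GPU mip selection: the level of detail is roughly
    // log2 of the texel-to-pixel ratio, clamped to the available mip levels.
    // CeilToInt mirrors the "just under 1:1 drops to the next mip" behaviour
    // described in the post above.
    static class MipMath
    {
        public static int EstimateMipLevel(float texels, float screenPixels, int mipCount)
        {
            float lod = Mathf.Log(texels / screenPixels, 2f);
            return Mathf.Clamp(Mathf.CeilToInt(lod), 0, mipCount - 1);
        }
    }
    // EstimateMipLevel(256f, 256f, 9) -> 0 (exactly 1:1, base level)
    // EstimateMipLevel(256f, 255f, 9) -> 1 (just under 1:1, drops to mip 1)
    ```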

    Switching to trilinear can help a little, as can using a small mip bias (either using a custom shader or setting the bias on the texture via script). Unfortunately, the only way I’ve found to fix this completely is to use a custom shader which does in-shader supersampling of the texture. I do this for UI elements and world elements with important text for the VR titles I work on.
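    Setting the bias via script could look something like this (a sketch only; the -0.5f value is a placeholder to tune per project, and whether a runtime bias takes effect can depend on platform and import settings):

    ```csharp
    using UnityEngine;

    // Applies a small negative mip bias so the GPU favours a sharper
    // (higher-resolution) mip level when sampling the sprite's texture.
    public class SharpenSpriteMips : MonoBehaviour
    {
        void Start()
        {
            Texture2D tex = GetComponent<SpriteRenderer>().sprite.texture;
            tex.mipMapBias = -0.5f; // negative bias => sharper mips
        }
    }
    ```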
     
  3. _eternal

    _eternal

    Joined:
    Nov 25, 2014
    Posts:
    304
    Interesting, thanks. So is there no easy way to debug this via code to confirm which mip level is being displayed?

    I know there's a mipmaps view in Scene mode, so I can eyeball it and scroll until I'm at native res, but the real question is which version is being displayed in Game mode.

    Having said that, even without debugging this properly, I can kinda see how the mip levels work. The sprite quality changes when I change my game resolution, so it's probably working as intended.
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Unfortunately, the mipmap view is about the best you can get with what's built in, and it's not all that useful for this particular situation. It doesn't show the actual mip level, only an approximation of whether the base texture size is too high or too low a resolution at the current range. Because the GPU decides what mip level to show on each unique pixel independently, there's no single value you can get from C# to tell you which mip is being used. For a 3D object the answer could be "all of them".

    A better mip level viewer would require some custom shader work, maybe some carefully constructed debug textures. I've written / made these before, but it's a pain to get them to really match.
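    One way to build such a debug texture (a sketch under the assumption that you then assign it to the material or sprite by hand): fill each mip level with a distinct solid colour, so whatever colour shows up on screen tells you which mip the GPU sampled there.

    ```csharp
    using UnityEngine;

    // Builds a square texture whose mip levels are each a solid, distinct
    // colour: white = mip 0, red = mip 1, green = mip 2, and so on.
    static class MipDebugTexture
    {
        static readonly Color[] mipColors =
        {
            Color.white, Color.red, Color.green, Color.blue,
            Color.yellow, Color.cyan, Color.magenta, Color.gray, Color.black
        };

        public static Texture2D Create(int size)
        {
            var tex = new Texture2D(size, size, TextureFormat.RGBA32, mipChain: true);
            for (int mip = 0; mip < tex.mipmapCount; mip++)
            {
                int mipSize = Mathf.Max(1, size >> mip);
                var pixels = new Color[mipSize * mipSize];
                Color c = mipColors[Mathf.Min(mip, mipColors.Length - 1)];
                for (int i = 0; i < pixels.Length; i++) pixels[i] = c;
                tex.SetPixels(pixels, mip);
            }
            tex.Apply(updateMipmaps: false); // keep the hand-filled mips
            return tex;
        }
    }
    ```

    Trilinear filtering will blend adjacent colours, which is part of why it's hard to get such viewers to "really match" as noted above.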
     
    _eternal likes this.