Question Mipmap Streaming with Atlased Textures

Discussion in 'General Graphics' started by ConorsFlying, Sep 30, 2022.

  1. ConorsFlying

    ConorsFlying

    Joined:
    Apr 26, 2018
    Posts:
    8
    Hi,

    I have lots of models that share a single atlased texture. I am interested in using texture streaming, but I am wondering: will Unity be able to load in only the parts of the image that are near the player? Or will I need to use individual textures for each object to get texture streaming to work correctly?

     
  2. lyndon_unity

    lyndon_unity

    Unity Technologies

    Joined:
    Nov 2, 2017
    Posts:
    66
    Unity would load just the reduced mips if all of the objects using the atlased texture are able to render at dropped mip levels. As soon as one object needs full resolution, the mip streaming system will load the whole texture. The mip streaming system doesn't split the texture into sub-sections.

    The virtual texturing system does, however, work at the sub-texture level, so it could give you the support you need, depending on which platform you are running on.
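
    As an illustrative aside (not part of the original reply), here is a minimal sketch of how you could watch this behaviour at runtime, assuming Texture Streaming is enabled in Quality Settings and "Streaming Mipmaps" is ticked on the atlas texture's import settings:

    Code (CSharp):
    using UnityEngine;

    // Illustrative only: logs the mip level the streaming system wants for an atlas
    // versus the mip level currently loaded. 0 means full resolution, so as soon as
    // one renderer needs mip 0 the whole atlas is loaded at full resolution.
    public class AtlasMipLogger : MonoBehaviour
    {
        public Texture2D atlas; // hypothetical field: assign the atlased texture in the Inspector

        void Update()
        {
            if (atlas == null || !QualitySettings.streamingMipmapsActive)
                return;

            Debug.Log($"desired mip: {atlas.desiredMipmapLevel}, loaded mip: {atlas.loadedMipmapLevel}");
        }
    }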
     
  3. ConorsFlying

    ConorsFlying

    Joined:
    Apr 26, 2018
    Posts:
    8
    Thanks for the info! We are targeting the Meta Quest and using URP, so I don't think virtual texture streaming would work for us.

    Would the same be true about mip map streaming if we were using a texture array instead of an atlas?
     
  4. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Why not use Texture Arrays instead of an atlas? The cost is basically the same, you get mips, and it is still just a single GPU resource, like an atlas would be.

    I think it works with mip streaming?
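
    For illustration only (not from the original post), a minimal sketch of packing same-sized textures into a Texture2DArray instead of an atlas; it assumes all sources share the same size, format and mip count:

    Code (CSharp):
    using UnityEngine;

    public static class TextureArrayBuilder // hypothetical helper, not a Unity API
    {
        public static Texture2DArray Build(Texture2D[] sources)
        {
            var first = sources[0];
            // One GPU resource holding every source texture as a slice, with a mip chain.
            var array = new Texture2DArray(first.width, first.height, sources.Length, first.format, true);

            for (int slice = 0; slice < sources.Length; slice++)
            {
                // Copy each mip of each source texture into its slice on the GPU.
                for (int mip = 0; mip < first.mipmapCount; mip++)
                    Graphics.CopyTexture(sources[slice], 0, mip, array, slice, mip);
            }
            return array;
        }
    }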
     
  5. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    I'm not sure texture streaming works with arrays, but if it does it should have the same limitations as an atlas.

    When using an array, it's the shader that decides which slice to use for each pixel, on the GPU. Meanwhile, texture streaming runs on the CPU, using a limited set of precalculated heuristics based on mesh UVs to estimate a single texture density value for each renderer from its size on screen.

    Therefore, there's no way for the streaming system to know which slices an object is actually going to display, so it would have to load the necessary mips for all slices. It also cannot take into account any shader-calculated UVs that result in a different density than the one in the mesh (triplanar mapping, for example).
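
    If the automatic heuristics can't see what the shader is doing, one possible workaround (a sketch, not something suggested in this thread) is to drive the streamed mip level yourself per texture, for example from distance to the camera; the thresholds below are made up:

    Code (CSharp):
    using UnityEngine;

    public class ManualMipRequest : MonoBehaviour // hypothetical component
    {
        public Texture2D streamedTexture; // must have Streaming Mipmaps enabled on import
        public Transform viewer;

        void Update()
        {
            if (streamedTexture == null || viewer == null)
                return;

            float d = Vector3.Distance(viewer.position, transform.position);
            // 0 = full resolution; higher values drop resolution. Thresholds are arbitrary.
            streamedTexture.requestedMipmapLevel = d < 10f ? 0 : d < 30f ? 1 : 2;
        }

        void OnDisable()
        {
            if (streamedTexture != null)
                streamedTexture.ClearRequestedMipmapLevel(); // hand control back to the automatic streaming system
        }
    }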
     
  6. jubaerjams8548

    jubaerjams8548

    Joined:
    Jun 8, 2020
    Posts:
    41
    Hi, can you please give me some ideas on how you're implementing texture streaming for mobile? And what would be efficient values for these parameters for performance?
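
    For reference, these are the texture streaming parameters exposed through QualitySettings; a sketch with placeholder values only, not tested mobile recommendations:

    Code (CSharp):
    using UnityEngine;

    public class StreamingSettingsExample : MonoBehaviour // hypothetical example component
    {
        void Start()
        {
            QualitySettings.streamingMipmapsActive = true;            // master toggle
            QualitySettings.streamingMipmapsMemoryBudget = 128f;      // target texture memory in MB (placeholder value)
            QualitySettings.streamingMipmapsMaxLevelReduction = 2;    // how many mip levels may be dropped
            QualitySettings.streamingMipmapsRenderersPerFrame = 512;  // CPU time vs. responsiveness trade-off
            QualitySettings.streamingMipmapsAddAllCameras = false;    // when false, only cameras with a StreamingController drive streaming
        }
    }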
     

    Attached Files: