GPU performance VS different texture sizes

Discussion in 'Shaders' started by Shushustorm, Jul 1, 2020.

  1. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Hey everyone!

    I am wondering:

    If, for a material, I am using
    - a 2048px texture (Tex1)
    - a 1024px texture (Tex2) that performs calculations using the 2048px texture (let's say col = Tex1 + Tex2),
    will using a 1024px texture for Tex2 only reduce RAM usage?
    Or will it impact GPU frame time? If so, positively or negatively?

    My first thought would be: fewer pixels, fewer calculations, less frame time.
    Then again, the 1024px texture may have to be upscaled every frame (?), resulting in more calculations than if Tex2 were also 2048px.

    Best wishes,
    Shu
     
  2. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    Hi!
    Texture size should not have any effect on performance. Texture compression can have a positive impact: the less space a block occupies in memory, the more blocks can be cached at the same time.
    You can read more on the topic by searching for "texture filtering".
     
  3. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Thanks for your reply! Interesting! Including the compression topic! - I was actually thinking compression would reduce RAM usage but decrease performance due to constant decompression. And I also think I read something like that some time ago, but I guess that was about audio files, which probably use quite different procedures. I sure will read more on texture filtering!
     
  4. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    The compression algorithms used for GPU textures are quite different from regular image compression algorithms - they are optimized for random access and decompression is frequently done in hardware. For many applications memory bandwidth is a more limiting factor than GPU load, so this is usually a win in terms of performance.
    Keep in mind that these compressed formats can decrease visual quality, so if you're dealing with something that needs to be pixel-perfect, like UI, you're probably best off without compression.
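    To put rough numbers on the memory/bandwidth point (a back-of-envelope sketch in Python; the byte counts are the standard RGBA32 and DXT1/BC1 layouts, not Unity-specific measurements):

    ```python
    # Rough memory math for a 2048x2048 texture, sketching why block
    # compression helps: less data to move means more of it fits in cache.

    def uncompressed_bytes(width, height, bytes_per_pixel=4):
        # RGBA32: 4 bytes per pixel, stored raw.
        return width * height * bytes_per_pixel

    def dxt1_bytes(width, height):
        # DXT1/BC1 stores each 4x4 pixel block in 8 bytes, addressable at
        # random, which is what lets the GPU decode it in hardware per fetch.
        blocks_x = (width + 3) // 4
        blocks_y = (height + 3) // 4
        return blocks_x * blocks_y * 8

    w = h = 2048
    print(uncompressed_bytes(w, h))  # 16777216 bytes (16 MiB) uncompressed
    print(dxt1_bytes(w, h))          # 2097152 bytes (2 MiB): 8x less to move
    ```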
     
  5. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Thanks for the reply and the additional information! Very interesting!
    Now that I think about the actual shader code, it does make sense that this is not a texture size issue: the calculations run per pixel on screen, not per texture pixel, so the amount of work doesn't depend on the texture's resolution.

    I'm not sure that's going to be the case for my game, since I am not going to use that many texture sets per material, so I should get away with somewhat manageable memory usage. On the other hand, when available, I will probably see if I can get some more content on screen that uses the available memory. I do, however, plan on using quite a number of particles via VFX graph and some compute shaders (e.g. for flocking), if supported by the target platform. But that's going to be a late decision, since in that case, VFX quality can be scaled rather easily.

    That for sure! Also, not all formats are supported on every platform. That said, I did see some quite convincing results using crunch compression, depending on the texture. Some kept visual quality quite well while drastically reducing storage usage and read time from disk. As far as I remember, though, crunch was the least supported one. But it's been quite a while since I read about that; with current hardware (e.g. Nintendo Switch, PS4, PS5), it may be quite different.
     
    Last edited: Jul 1, 2020
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Crunch does not reduce RAM usage. A Crunched DXT1 uses the same amount of RAM on the GPU as a regular DXT1. It actually takes slightly more RAM on the CPU, at least when initially loaded. The difference is the Crunch format takes up less space in the build prior to loading, and can load from disk slightly faster.
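    A small sketch of this distinction in Python (the GPU-side math is standard DXT1; the on-disk crunch ratio is an illustrative assumption, since actual Crunch ratios vary per texture):

    ```python
    # Crunch only changes the on-disk/build size; at load time the crunched
    # data is transcoded back to plain DXT1, so GPU memory use is identical.

    def dxt1_gpu_bytes(width, height):
        # DXT1: 8 bytes per 4x4 block, regardless of how it was stored on disk.
        return ((width + 3) // 4) * ((height + 3) // 4) * 8

    def crunched_disk_bytes(width, height, crunch_ratio=0.4):
        # Hypothetical ratio: Crunch typically shrinks the DXT payload
        # further on disk; 0.4 is just for illustration.
        return int(dxt1_gpu_bytes(width, height) * crunch_ratio)

    plain_gpu = dxt1_gpu_bytes(1024, 1024)
    crunched_gpu = dxt1_gpu_bytes(1024, 1024)  # same: transcoded to DXT1 at load
    assert plain_gpu == crunched_gpu
    print(plain_gpu, crunched_disk_bytes(1024, 1024))  # GPU size vs build size
    ```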
     
  7. Shushustorm

    Shushustorm

    Joined:
    Jan 6, 2014
    Posts:
    1,084
    Thanks for the reply! You're absolutely right. I corrected that, but I guess I wasn't fast enough!
     
  8. OmarVector

    OmarVector

    Joined:
    Apr 18, 2018
    Posts:
    130
    @aleksandrk
    I know it's quite an old thread, but there is something I could never find any information about: does texture DPI have any impact on performance/memory usage?

    Will a 72 DPI image be treated the same way as a 300 DPI image?

    Thank you:)
     
  9. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    @OmarVector there's no such concept as texture DPI.
    DPI for an image makes sense when you want to print it on paper, for example.
    The only things that matter are:
    • Texture dimensions - these have a direct effect on memory usage, a texture that is 2x wider will use 2x memory.
    • Texture data format - this affects memory usage and performance: a texture that uses a narrower data format (for example, 8 bits per pixel vs 32 bits per pixel) will take less memory (4x less in the example) and will fit more pixels in the cache at once, so the GPU has to read data from memory less often.
    • Mipmaps - enabling them uses more memory (~34%) but increases performance when you need to render something that's further away from the camera in 3D.
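    The three factors above can be combined into a back-of-envelope memory estimate (a Python sketch; the mip-chain math is the standard halving-per-level scheme, not a Unity-specific allocator):

    ```python
    # Texture memory from the three factors above: dimensions,
    # bytes per pixel (data format), and the mipmap chain.

    def texture_bytes(width, height, bytes_per_pixel, mipmaps=False):
        total = width * height * bytes_per_pixel
        if mipmaps:
            # Each mip level is 1/4 the size of the previous one, so the
            # full chain adds roughly 1/3 extra (the ~34% mentioned above).
            w, h = width, height
            while w > 1 or h > 1:
                w, h = max(w // 2, 1), max(h // 2, 1)
                total += w * h * bytes_per_pixel
        return total

    base = texture_bytes(2048, 2048, 4)                  # RGBA32, no mips
    with_mips = texture_bytes(2048, 2048, 4, mipmaps=True)
    print(base, with_mips, with_mips / base)  # overhead works out to ~1.33x
    ```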
     
  10. OmarVector

    OmarVector

    Joined:
    Apr 18, 2018
    Posts:
    130
    Yeah, I'm sure about that. DPI is usually used for printing and has nothing to do with how Unity processes image files on the GPU, as far as I'm concerned; that's why I wanted to double-check with you to be sure.

    Thank you so much:))
     
  11. OmarVector

    OmarVector

    Joined:
    Apr 18, 2018
    Posts:
    130
    @aleksandrk
    Sorry to annoy you one last time

    The same concept applies to images that are imported as sprites, because Unity converts them to meshes on the canvas anyway, and the DPI of the image has nothing to do with how the images are rendered on the canvas. Right?
     
  12. tw00275

    tw00275

    Joined:
    Oct 19, 2018
    Posts:
    92
    I don't think sprites or textures use the DPI in the image's metadata, but there is something similar to DPI for sprites called Pixels Per Unit. This can be used to control how large sprites are on a UI element.

    For instance, if some of your sprites are 16x16 with the default Pixels Per Unit of 100 and you want your 32x32 pixel sprites to appear the same size, you'd set their Pixels Per Unit value to 200.
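    The arithmetic behind that example, as a quick Python sketch (world size = pixels / Pixels Per Unit; the function name is just for illustration):

    ```python
    # Pixels Per Unit math: a sprite's world-space size is its pixel size
    # divided by its PPU, so doubling both keeps the on-screen size the same.

    def world_size(pixels, pixels_per_unit):
        return pixels / pixels_per_unit

    # 16px sprite at PPU 100 and 32px sprite at PPU 200 are the same size.
    assert world_size(16, 100) == world_size(32, 200)
    print(world_size(16, 100))  # 0.16 world units
    ```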
     
  13. OmarVector

    OmarVector

    Joined:
    Apr 18, 2018
    Posts:
    130
    Yeah, I'm aware of that one. But unlike DPI, PPU is set inside Unity, whereas DPI is set while exporting the original assets... so what I want to be sure of is that the DPI of the image has nothing to do with how Unity renders UI elements :))
     
  14. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Texture DPI is 100% ignored at all times for games. The Unity texture importer or asset class doesn’t even access or store that data.

    Mobile platforms have a concept of DPI for resolution scaling or UI, but that’s for scaling things like buttons or similar touch interfaces and oddly less about textures.
     