
Odd behaviour in GPU profiler from image effects

Discussion in 'Image Effects' started by Zergling103, Oct 23, 2018.

  1. Zergling103

    Joined:
    Aug 16, 2011
    Posts:
    392
    I'm currently trying to optimize our game for 4K rendering, and I've noticed an odd behaviour that is making it difficult:

    When I render an image effect at 4K, it takes longer to compute than at 1080p, as expected. This is normal.

    However, the very strange thing is this: computation time also increases from 1080p to 4K for passes where neither the blit's render target nor the textures it samples from (if any) increase in size.

    For example, a pass that blits white (1,1,1,1, with no texture lookups) into a 1x1 render texture takes twice as long (according to the profiler) if the camera's render target is 4K vs. 1080p. That is, it appears the camera's pixel resolution is having a performance impact on what should be completely unrelated draw calls.
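
    Here's roughly how that test pass is set up (a simplified sketch rather than our actual code; the class name, the built-in "Unlit/Color" shader, and the 1x1 ARGB32 target are just stand-ins for the idea):

    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class WhiteBlitTest : MonoBehaviour
    {
        Material whiteMat;    // writes a constant colour, no texture lookups
        RenderTexture tinyRT; // fixed 1x1 target, independent of camera resolution

        void OnEnable()
        {
            // "Unlit/Color" is a built-in shader that outputs a flat colour.
            whiteMat = new Material(Shader.Find("Unlit/Color"));
            whiteMat.color = Color.white;
            tinyRT = new RenderTexture(1, 1, 0, RenderTextureFormat.ARGB32);
        }

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            // This blit's cost should depend only on the 1x1 destination,
            // yet the GPU profiler reports it taking roughly twice as long
            // when the camera renders at 4K instead of 1080p.
            Graphics.Blit(null, tinyRT, whiteMat);

            // Pass the camera image through unchanged.
            Graphics.Blit(src, dest);
        }

        void OnDisable()
        {
            if (tinyRT != null) tinyRT.Release();
            if (whiteMat != null) Destroy(whiteMat);
        }
    }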

    What's going on here?
     