Dimensions of color surface does not match dimensions of depth surface

Discussion in 'General Graphics' started by LazloBonin, Dec 3, 2020.

  1. LazloBonin (Joined: Mar 6, 2015 · Posts: 812)
    I'm struggling with the error in the title when rendering a complex custom camera stack.

    The complete setup would be hard to isolate, replicate, and post here, so I'm asking what this error means in general and how to fix it.

    The error is thrown when I call
    Camera.Render()
    on a camera on which I used
    Camera.SetTargetBuffers(RenderBuffer[] colorBuffer, RenderBuffer depthBuffer)


    However, I can confirm that right before the Render call, all my buffers have equal width and height (every color buffer as well as the depth buffer).

    In fact, here's the code I call right before rendering to confirm:

    Code (CSharp):
    if (initialized)
    {
        RenderTexture depth = //...
        RenderTexture[] targets = //...

        var depthBufferSize = new Vector2Int(depth.width, depth.height);

        foreach (var target in targets)
        {
            var targetSize = new Vector2Int(target.width, target.height);

            if (targetSize != depthBufferSize)
            {
                Debug.LogWarning($"Target {target} has a buffer size of {targetSize}, which mismatches depth buffer size of {depthBufferSize}.");
            }
        }
    }
    That LogWarning is never printed, but Unity still throws the error.

    What could be causing this? How do I debug it?
     
  2. LazloBonin (Joined: Mar 6, 2015 · Posts: 812)
  3. artofbeni (Joined: Mar 15, 2014 · Posts: 7)
    Hey! I was struggling with this same error for a while and came across this thread while looking for answers. I managed to track down the issue: in my case it was caused by creating the render texture in OnEnable() of my singleton class while also fetching that texture from the OnEnable() methods of the scripts that reference the singleton, e.g. via a method like MySingleton.Instance.GetRenderTexture().

    Even though that method theoretically ensures the texture is created before returning it (I made it so that if the render texture is null, it calls the create method), I would still get those errors. Making sure MySingleton.Instance.GetRenderTexture() is called outside the OnEnable() of the scripts that request it solved them for me; I moved the call to OnPreRender(). Not sure if this is relevant in your case, but hopefully it gives you some ideas to try at least.
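    A minimal sketch of the consumer side of that pattern (MySingleton and GetRenderTexture() are the names from the post; the rest is my own assumed scaffolding, not a confirmed implementation):

    ```csharp
    using UnityEngine;

    // Assumed consumer script: fetch the render texture lazily in
    // OnPreRender() instead of OnEnable(), as described above.
    public class RenderTextureConsumer : MonoBehaviour
    {
        private RenderTexture rt;

        // Too early on the first frame: OnEnable() ordering relative to
        // the singleton is not guaranteed, and this is where the
        // mismatch error was observed.
        // void OnEnable() { rt = MySingleton.Instance.GetRenderTexture(); }

        // Safer: by the time the camera is about to render, the engine
        // is in a valid state and the singleton has initialized.
        void OnPreRender()
        {
            if (rt == null)
            {
                rt = MySingleton.Instance.GetRenderTexture();
            }
        }
    }
    ```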
     
  4. LazloBonin (Joined: Mar 6, 2015 · Posts: 812)
    Thanks, this does seem to be the fix. Awake/OnEnable seems too early to create render textures in many regards (googling other RT problems pointed to the same undocumented quirk). But that's only true during the "first" frame of the game (e.g. late-enabling an object that creates a RenderTexture is fine).

    I wish there were a flag we could check to see whether the engine is in a valid state to create render textures. Right now this is completely unreliable and undocumented.

    For example, I wrote a custom camera mode that uses RenderTextures, and I can't figure out the editor-side equivalent of "after OnEnable", so I can't avoid the mismatch error at all.
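    One way to sketch the "don't create RTs on the first frame" workaround described above (this is my own assumed approach using a coroutine Start, not something confirmed in the thread):

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Assumed workaround: defer RenderTexture creation until after the
    // first frame has finished rendering, instead of creating it in
    // Awake()/OnEnable() where the size-mismatch error was observed.
    public class DeferredRenderTextureOwner : MonoBehaviour
    {
        private RenderTexture rt;

        IEnumerator Start()
        {
            // Wait until the end of the first frame so the engine has
            // completed its initial render setup.
            yield return new WaitForEndOfFrame();

            rt = new RenderTexture(Screen.width, Screen.height, 24);
            rt.Create();
        }
    }
    ```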
     
  5. junyu44 (Joined: Jul 27, 2017 · Posts: 2)
    I get the same error too. Wondering how to fix it.
     
  6. acnestis (Joined: Sep 11, 2019 · Posts: 15)