_CameraDepthTexture Android render problems.

Discussion in 'Shaders' started by FuzzyShan, Dec 22, 2015.

  1. FuzzyShan

    FuzzyShan

    Joined:
    Jul 30, 2012
    Posts:
    182
    1. EffectCamera.depthTextureMode = DepthTextureMode.Depth;
    2. Sampling _CameraDepthTexture in the shader to get depth.

    On PC I get the right depth texture; however, on a Samsung Note 2 and some other Android phones _CameraDepthTexture is all white.

    On a Samsung Galaxy S4 the depth texture is written (not all white), but it doesn't render anything else that's not in the depth texture.

    I was wondering if somebody could point me in the right direction. A precision problem might be the cause, but I figure that might not be the case, since I am not rendering things far away.
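    For reference, step 2 in a post-effect shader looks roughly like this (the fragment function and uv name are an illustrative sketch; _CameraDepthTexture and the helper macros come from Unity's include files):

    ```
    // Illustrative sketch: sampling Unity's built-in depth texture in a
    // post effect. SAMPLE_DEPTH_TEXTURE and Linear01Depth come from
    // Unity's shader includes (UnityCG.cginc / HLSLSupport.cginc).
    sampler2D _CameraDepthTexture;

    fixed4 frag (v2f i) : SV_Target
    {
        float rawDepth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);
        float depth01  = Linear01Depth(rawDepth); // 0 at camera, 1 at far plane
        return fixed4(depth01, depth01, depth01, 1); // visualize as grayscale
    }
    ```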
     
  2. Phantomx

    Phantomx

    Joined:
    Oct 30, 2012
    Posts:
    202
    I had the same problem; I think it's a bug on Unity's side. I found a workaround by creating my own depth texture with replacement shaders and a 16-bit depth RenderTexture.
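    A minimal C# sketch of that workaround, assuming a dedicated depth camera and a replacement shader named "Hidden/DepthReplacement" (both names are illustrative):

    ```
    // Sketch: render depth manually, bypassing the built-in _CameraDepthTexture.
    // Single-channel half-float target with a 16-bit depth buffer; on mobile,
    // check SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.RHalf)
    // and fall back to another format if needed.
    RenderTexture depthRT = new RenderTexture(Screen.width, Screen.height, 16,
                                              RenderTextureFormat.RHalf);
    depthCamera.CopyFrom(mainCamera);            // match position, FOV, clip planes
    depthCamera.targetTexture = depthRT;
    depthCamera.SetReplacementShader(Shader.Find("Hidden/DepthReplacement"),
                                     "RenderType");
    ```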
     
    a_p_u_r_o and SunnySunshine like this.
  3. FuzzyShan

    FuzzyShan

    Joined:
    Jul 30, 2012
    Posts:
    182
    Thanks. Do you know how the performance compares to the built-in depth texture on mobile phones? Creating another camera just to render depth to a RenderTexture would double the draw calls for the sake of this bug?
     
    Last edited: Dec 23, 2015
  4. Phantomx

    Phantomx

    Joined:
    Oct 30, 2012
    Posts:
    202
    Well, this is basically what Unity does behind the scenes, so it's pretty much the same thing. It will double the draw calls, but it doesn't cost twice the rendering, since your shaders will be much simpler: no lights, no texture calculations.
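    A depth-only replacement pass along those lines can indeed be very cheap. This is a sketch, not a confirmed implementation from the thread (the shader name is illustrative; COMPUTE_EYEDEPTH and _ProjectionParams are Unity's):

    ```
    Shader "Hidden/DepthReplacement" {
        SubShader {
            Tags { "RenderType" = "Opaque" }
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos   : SV_POSITION;
                    float  depth : TEXCOORD0;
                };

                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    COMPUTE_EYEDEPTH(o.depth); // linear eye-space depth
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target {
                    // _ProjectionParams.w is 1/farPlane, giving 0..1 depth.
                    float d = i.depth * _ProjectionParams.w;
                    return fixed4(d, d, d, 1);
                }
                ENDCG
            }
        }
    }
    ```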
     
  5. FuzzyShan

    FuzzyShan

    Joined:
    Jul 30, 2012
    Posts:
    182
    Right. I guess some smart people on the Asset Store have already figured out that passing depth into the alpha channel might be better. Thanks.
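    The alpha-channel idea amounts to writing depth in the regular forward pass instead of using a second camera; a rough sketch (assumes an RGBA render target, an i.eyeDepth interpolator filled with COMPUTE_EYEDEPTH, and that alpha isn't needed for blending):

    ```
    // Sketch: output linear 0..1 depth in the alpha channel of the main pass,
    // so no extra depth camera (and no doubled draw calls) is needed.
    fixed4 frag (v2f i) : SV_Target {
        fixed4 col = tex2D(_MainTex, i.uv);
        col.a = i.eyeDepth * _ProjectionParams.w; // 1/farPlane normalizes to 0..1
        return col;
    }
    ```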
     
  6. Phantomx

    Phantomx

    Joined:
    Oct 30, 2012
    Posts:
    202
    Yes, that is a better way to do it, but in my case the alpha channel was already used for something else... and it implies that you'll use custom shaders for everything.
     
  7. Linearch

    Linearch

    Joined:
    Apr 1, 2015
    Posts:
    33
    @Phantomx I'm having the same issue. _CameraDepthTexture is "empty" on my Android device. It works on my laptop.

    My Android device supports depth textures and a 24-bit depth buffer. I did uncheck "32-bit depth buffer" in Player Settings. I tried both 16-bit and 24-bit; neither works.

    I tried making a replacement shader to literally write depth onto a depth texture. I tried COMPUTE_EYEDEPTH, COMPUTE_DEPTH_01, and UNITY_TRANSFER_DEPTH + UNITY_OUTPUT_DEPTH, but none worked, or I just used them wrong, though I don't think so.
    Even an empty shader returning 0 works on my laptop to write to a depth texture, as long as I don't set ZWrite to Off.

    Would you mind sharing your replacement shader?

    Thanks.


    ---- edit ----
    Ok, so I found out how it's calculated in UnityCG.cginc.
    Do I just have to return the value from the fragment shader (SV_Target) to write it to a depth texture?
    I tried encoding it into RGBA, but it's way lacking in precision. I wonder if a depth texture can keep the precision.

    ---edit2---
    Just returning it seems to be fine, though I got better precision with RGBA encoding. Changing floats to halfs actually gave me way better precision. My device has a Mali GPU; it uses 16-bit floating point in the fragment shader. I didn't think using half from the start would be better than letting the shader (or GPU) convert it.
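    For the RGBA packing, UnityCG.cginc already ships EncodeFloatRGBA / DecodeFloatRGBA helpers that spread a [0,1) float across the four 8-bit channels. A sketch, assuming an i.eyeDepth interpolator from COMPUTE_EYEDEPTH (_MyDepthTex is an illustrative name):

    ```
    // Sketch: pack 0..1 depth into an RGBA8 target, unpack when sampling.
    float4 frag_encode (v2f i) : SV_Target {
        float depth01 = i.eyeDepth * _ProjectionParams.w; // normalize by far plane
        return EncodeFloatRGBA(depth01); // spreads precision over 4 channels
    }

    // In the effect shader that consumes it:
    // float depth01 = DecodeFloatRGBA(tex2D(_MyDepthTex, uv));
    ```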
     
    Last edited: May 3, 2016