Unity 5.5, custom matrix doesn't seem to write into z buffer

Discussion in 'Shaders' started by Tiny-Man, Dec 4, 2016.

  1. Tiny-Man

    Joined: Mar 22, 2014
    Posts: 482
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    The depth in 5.5 is inverted (for DirectX 11), which I think requires flipping culling, so "Cull Back" effectively means "Cull Front". You can either flip your projection matrix as well, or try setting the shader's culling to Front for DX11. Unfortunately, there's no great way of handling this purely in the shader that I know of; one option is two SubShaders, one with #pragma exclude_renderers d3d11 and one with #pragma only_renderers d3d11.
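    A rough sketch of that two-SubShader idea (the shader name, solid white output, and pass contents are placeholders, not from this thread) might look like this:

    Shader "Hypothetical/CullFlipExample"
    {
        SubShader
        {
            Pass
            {
                // Every platform except D3D11: normal culling.
                Cull Back

                CGPROGRAM
                #pragma exclude_renderers d3d11
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                float4 vert (float4 vertex : POSITION) : SV_POSITION
                {
                    return mul(UNITY_MATRIX_MVP, vertex);
                }

                fixed4 frag () : SV_Target
                {
                    return fixed4(1, 1, 1, 1);
                }
                ENDCG
            }
        }
        SubShader
        {
            Pass
            {
                // D3D11 only: with 5.5's inverted depth the custom matrix ends
                // up with flipped winding, so cull the opposite side.
                Cull Front

                CGPROGRAM
                #pragma only_renderers d3d11
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                float4 vert (float4 vertex : POSITION) : SV_POSITION
                {
                    return mul(UNITY_MATRIX_MVP, vertex);
                }

                fixed4 frag () : SV_Target
                {
                    return fixed4(1, 1, 1, 1);
                }
                ENDCG
            }
        }
    }

    Unity skips a SubShader whose passes exclude the active renderer and falls through to the next one, which is what makes the split work.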
     
  3. Tiny-Man

    Joined: Mar 22, 2014
    Posts: 482
    Unfortunately that didn't solve the issue. I'm 100% sure that it's something to do with the z testing; I can get the exact same result if I write ZTest Always in the shader. Quite an annoying issue :(
     
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    Oh, right, you probably also need ZTest GEqual if you're not doing the inverse z too.
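    In ShaderLab that's one more render-state line inside the D3D11 pass from the sketch above:

    Pass
    {
        Cull Front
        ZTest GEqual   // pass when incoming depth is >= the stored depth
        // ... CGPROGRAM block as before ...
    }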
     
  5. Tiny-Man

    Joined: Mar 22, 2014
    Posts: 482
    Nope, it just doesn't even render with that :( Confusing stuff. I'm not too familiar with this, and the person who wrote our original solution is on holidays for a while, so it's up to me to solve it :)
     
  6. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    Try going back to the original shader, but add this to the end of the script that creates the custom projection matrix:

    ret.z = 1.0 - ret.z;
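    One way to read that suggestion, assuming "ret" is the Matrix4x4 the script builds and it's handed straight to a shader rather than to Camera.projectionMatrix (all names below are hypothetical): Unity's built-in GL.GetGPUProjectionMatrix applies the full platform conversion, which on D3D11 in 5.5 includes the reversed z, so no manual "1 - z" flip is needed afterwards. A minimal C# sketch:

    using UnityEngine;

    public class CustomProjectionExample : MonoBehaviour
    {
        // Hypothetical material whose vertex shader reads _CustomProjection.
        public Material targetMaterial;

        void LateUpdate()
        {
            // Placeholder for however the original script builds its matrix.
            Matrix4x4 ret = Matrix4x4.Perspective(60f, 16f / 9f, 0.3f, 1000f);

            // Convert from Unity's OpenGL-style convention to the active
            // platform's conventions (reversed z on D3D11 in 5.5 included).
            Matrix4x4 gpuProj = GL.GetGPUProjectionMatrix(ret, false);

            targetMaterial.SetMatrix("_CustomProjection", gpuProj);
        }
    }

    The second argument of GL.GetGPUProjectionMatrix indicates whether you're rendering into a RenderTexture, which affects the y flip on Direct3D platforms.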