
Forward rendering + SSAO + Vertex offset = Problem

Discussion in 'Shaders' started by SunnySunshine, Jul 3, 2019.

  1. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    In a custom shader I've made, I'm offsetting vertices according to a compute buffer. With the "addshadow" compiler directive, this works fine for both casting and receiving shadows. However, when using SSAO, the camera effect is applied as if the object were not offset by the compute buffer, so the AO appears in the wrong place. This is because the "Hidden/Internal-DepthNormalsTexture" pass renders the unmodified object into the depth normals texture.
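
    For reference, the C# side of my setup looks roughly like this (simplified sketch, the names are mine):

    Code (CSharp):
    using UnityEngine;

    public class VertexOffsetSetup : MonoBehaviour
    {
        // Hypothetical buffer: one float3 offset per vertex, filled by a compute shader elsewhere.
        ComputeBuffer offsetBuffer;

        void Start()
        {
            var mesh = GetComponent<MeshFilter>().sharedMesh;
            offsetBuffer = new ComputeBuffer(mesh.vertexCount, sizeof(float) * 3);

            // The surface shader's vertex function reads _Offsets and displaces the vertex.
            // "#pragma addshadow" generates a shadow caster pass that does the same,
            // but the depth normals texture is rendered by Unity's replacement shader,
            // which knows nothing about this buffer.
            GetComponent<Renderer>().material.SetBuffer("_Offsets", offsetBuffer);
        }

        void OnDestroy()
        {
            offsetBuffer.Release();
        }
    }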

    When using the deferred rendering path, this problem is not present. It is only present in the forward rendering path.

    For reasons unconnected to this problem, I need to use forward rendering.

    I'm now at a bit of a loss, because as far as I know, there's no way to render a specific object with specific compute buffers into the depth normals texture.

    Summoning @bgolus .
     
  2. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    I suppose one could use a command buffer to render into the depth normals texture "manually", using the "AfterDepthNormalsTexture" CameraEvent. Not sure exactly how that works yet, since I've never set something like that up, but it seems like it should be possible.
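
    Something like this is what I'm picturing (untested sketch; the material would be one of my own whose pass applies the same vertex offset and outputs encoded depth + normals like Unity's internal shader does):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public class DepthNormalsInjection : MonoBehaviour
    {
        // Hypothetical material: applies the compute buffer offset in its vertex stage
        // and outputs EncodeDepthNormal(), like Hidden/Internal-DepthNormalsTexture.
        public Material customDepthNormalsMat;
        public Camera cam;

        CommandBuffer cb;

        void OnEnable()
        {
            cb = new CommandBuffer { name = "Custom depth normals" };
            // Draw this renderer into the camera's depth+normals texture
            // right after Unity has finished filling it.
            cb.SetRenderTarget(BuiltinRenderTextureType.DepthNormals);
            cb.DrawRenderer(GetComponent<Renderer>(), customDepthNormalsMat);
            cam.AddCommandBuffer(CameraEvent.AfterDepthNormalsTexture, cb);
        }

        void OnDisable()
        {
            if (cb != null)
                cam.RemoveCommandBuffer(CameraEvent.AfterDepthNormalsTexture, cb);
        }
    }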

    However, incorrect normals and depth would still have been written into the depth normals texture by Unity (prior to our command buffer injection). Is there a way to exclude a certain object from being rendered into the depth normals texture? If not, the only possibility I can see is to replace Unity's depth normals shader with a custom one that clips pixels according to some convention, for example TEXCOORD0.w being the value 10 or something like that. It feels very hacky, but I suppose it could work.
     
    Last edited: Jul 3, 2019
  3. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    Seems like all you need to do to prevent an object from being rendered into Unity's depth normals texture is to change the RenderType tag in its shader to something custom.
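
    If you'd rather not touch the shader source, I believe the tag can also be overridden from script, something like:

    Code (CSharp):
    using UnityEngine;

    public class ExcludeFromDepthNormals : MonoBehaviour
    {
        void Start()
        {
            // Overriding RenderType to a value that has no matching SubShader in
            // Hidden/Internal-DepthNormalsTexture makes the replacement rendering skip this object.
            // "NoDepthNormals" is just an arbitrary name I picked.
            GetComponent<Renderer>().material.SetOverrideTag("RenderType", "NoDepthNormals");
        }
    }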
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    Yep. This. You need to use a command buffer.

    The camera depth normals texture is rendered with a replacement shader. That renders everything in view using a single shader with multiple SubShaders, each tagged with a RenderType. Change the RenderType to “Transparent” or “Fred” or whatever you want, as long as it’s not one listed in that shader. Alternatively, if you’re rendering the object by having a regular mesh renderer with a compute buffer assigned, you could replace the built-in depth normals shader with your own via the Graphics settings.
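
    If you go the graphics settings route, I believe the scripting equivalent is roughly this (the shader name is just a placeholder for your own depth normals shader):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class CustomDepthNormalsShader
    {
        public static void Install()
        {
            // Placeholder shader: it would need the same RenderType SubShaders as the built-in one,
            // plus a variant that applies your compute buffer offset.
            var shader = Shader.Find("Hidden/MyDepthNormalsTexture");
            GraphicsSettings.SetShaderMode(BuiltinShaderType.DepthNormals, BuiltinShaderMode.UseCustom);
            GraphicsSettings.SetCustomShader(BuiltinShaderType.DepthNormals, shader);
        }
    }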
     
  5. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    Thanks. Guess I was on the right track. :)

    What exactly do you mean here by "assigning" a compute buffer to a mesh renderer? Do you simply mean fetching the compute buffer data and assigning it to the vertices of the mesh, or is there a way in the API to force Unity to draw a mesh using a compute buffer? If it's the former, I definitely want to avoid as much GPU -> CPU transfer as possible, but if it's the latter I'm intrigued.
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,339
    It's possible to assign a compute buffer on a material, like a texture or a float array, using the SetBuffer() function.
    https://docs.unity3d.com/ScriptReference/Material.SetBuffer.html

    This can be used by the shader in any way you see fit, like taking an existing mesh and deforming it using the buffer data per vertex. However, usually when you're using a compute buffer to drive geometry you're looking for dynamic vertex counts, like for instancing or procedural mesh generation, and are using DrawMeshInstancedIndirect, DrawProcedural, or similar functions.


    Either way, unless you're calling GetData on the compute buffer, you're not doing any GPU -> CPU work.
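
    For the indirect route, the usual pattern looks something like this (rough sketch, buffer and property names are placeholders):

    Code (CSharp):
    using UnityEngine;

    public class IndirectDrawExample : MonoBehaviour
    {
        public Mesh mesh;
        public Material material;   // its shader reads _Positions per instance
        public int instanceCount = 1024;

        ComputeBuffer positions;
        ComputeBuffer args;

        void Start()
        {
            positions = new ComputeBuffer(instanceCount, sizeof(float) * 3);
            material.SetBuffer("_Positions", positions);

            // Index count per instance, instance count, start index, base vertex, start instance.
            uint[] argData = { mesh.GetIndexCount(0), (uint)instanceCount, 0, 0, 0 };
            args = new ComputeBuffer(1, argData.Length * sizeof(uint), ComputeBufferType.IndirectArguments);
            args.SetData(argData);
        }

        void Update()
        {
            // No readback anywhere: both buffers stay on the GPU.
            Graphics.DrawMeshInstancedIndirect(mesh, 0, material,
                new Bounds(Vector3.zero, Vector3.one * 100f), args);
        }

        void OnDestroy()
        {
            positions.Release();
            args.Release();
        }
    }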
     
  7. SunnySunshine

    SunnySunshine

    Joined:
    May 18, 2009
    Posts:
    976
    Ah yeah ok, then we're on the same page. I thought you meant there was a way to send a buffer to a shader without using a material (assigning a compute buffer to a renderer or mesh), so that it would work with shader replacement. But in my scenario, a material has to be used for me to be able to set a buffer.

    I'm using the command buffer technique now and it's working perfectly.