Question Occlude / Stencil cut parts of objects

Discussion in 'Universal Render Pipeline' started by saz_at_evenflow, Apr 27, 2023.

  1. saz_at_evenflow

    saz_at_evenflow

    Joined:
    Mar 13, 2018
    Posts:
    10
    Question: How to perform a cutoff / intersect / occlude / stencil cut of some parts of objects (that are not "visible" due to being rendered over by other objects)?

    I am trying to write normal information for a set of objects (that belong to a specific layer) into a texture, that I can then use for future post processing calculations. This is done using a ScriptableRenderPass.

    The code looks as such:
    Code (CSharp):

    filteringSettings = new FilteringSettings(RenderQueueRange.opaque, layerMask);

    ...

    DrawingSettings drawSettings = CreateDrawingSettings(shaderTagsToRender, ref renderingData, SortingCriteria.BackToFront);
    drawSettings.overrideMaterial = normalsMaterial;
    drawSettings.perObjectData = PerObjectData.None;

    CoreUtils.SetRenderTarget(cmd, normalsTextureRef, normalsTextureRef);
    context.DrawRenderers(renderingData.cullResults, ref drawSettings, ref filteringSettings);
    This produces the following (while I am only interested in the part above ground):
    (Attachments: normals.PNG, natural.PNG)

    This is a simplified example. There will be many ground objects etc.
    It feels like this should be a common scenario and that the framework should support these kind of things (even if it means rendering the entire screen again).

    Any guidance? What would be your approach?
     
  2. saz_at_evenflow

    saz_at_evenflow

    Joined:
    Mar 13, 2018
    Posts:
    10
    For example, could I somehow perform a ZTest in my Shader Graph, comparing the fragment positions against the camera depth texture, and simply color the occluded parts black?
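    Roughly what I have in mind, as the body of a Shader Graph Custom Function node (just a sketch, assuming URP with Depth Texture enabled on the pipeline asset; the function name and parameter names here are made up):

    Code (HLSL):

    // Sketch of a manual depth test for a Custom Function node.
    // Requires URP's DeclareDepthTexture.hlsl (provides SampleSceneDepth)
    // and Depth Texture enabled on the URP asset.
    void DepthCull_float(float2 ScreenUV, float FragmentRawDepth, float3 Color, out float3 Out)
    {
        // Raw depth of whatever the camera already rendered at this pixel.
        float sceneRawDepth = SampleSceneDepth(ScreenUV);

        // Compare in eye space so the test works regardless of the
        // platform's depth-buffer direction (reversed-Z or not).
        float sceneEye = LinearEyeDepth(sceneRawDepth, _ZBufferParams);
        float fragEye  = LinearEyeDepth(FragmentRawDepth, _ZBufferParams);

        // If this fragment is behind the already-rendered scene, output black.
        Out = (fragEye > sceneEye + 0.001) ? float3(0.0, 0.0, 0.0) : Color;
    }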
     
  3. DevDunk

    DevDunk

    Joined:
    Feb 13, 2020
    Posts:
    4,455
    I don't know much about render passes, but I've done this often with a render feature:


    It doesn't integrate well with Shader Graph in my experience, though. Maybe 2022 or 2023 does support it; I haven't tested there.
     
  4. wwWwwwW1

    wwWwwwW1

    Joined:
    Oct 31, 2021
    Posts:
    637
    Hi, you can override the depth test settings (to ZTest Equal) in your custom renderer feature.

    Also, you need to set the depth render target to the camera's depth buffer, and set the render pass event to after opaque/transparent rendering.


    Example for URP 12 to override the depth test:
    Code (CSharp):

    private RenderStateBlock renderStateBlock = new RenderStateBlock(RenderStateMask.Nothing);

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        //...
        // "ZWrite Off" & "ZTest Equal".
        renderStateBlock.depthState = new DepthState(false, CompareFunction.Equal);
        // Tell URP that we changed the depth render state.
        renderStateBlock.mask |= RenderStateMask.Depth;
        //...
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        //...
        // You can also use ConfigureTarget(color, depth) in OnCameraSetup().
        // Change the depth target to "cameraDepthTargetHandle" if your URP version needs an RTHandle.
        CoreUtils.SetRenderTarget(cmd, normalsTextureRef, renderingData.cameraData.renderer.cameraDepthTarget);
        // Pass the RenderStateBlock to DrawRenderers.
        context.DrawRenderers(renderingData.cullResults, ref drawingSettings, ref filteringSettings, ref renderStateBlock);
        //...
    }
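    The ConfigureTarget alternative mentioned in the comments would look roughly like this (a sketch only, assuming a newer URP where targets are RTHandles; "normalsTextureHandle" is a hypothetical RTHandle you allocate yourself, e.g. via RenderingUtils.ReAllocateIfNeeded):

    Code (CSharp):

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        // Same depth-state override as above: no depth writes, ZTest Equal.
        renderStateBlock.depthState = new DepthState(false, CompareFunction.Equal);
        renderStateBlock.mask |= RenderStateMask.Depth;

        // Color attachment: our normals texture.
        // Depth attachment: the camera's own depth buffer, so ZTest Equal
        // compares against what the scene already rendered.
        ConfigureTarget(normalsTextureHandle, renderingData.cameraData.renderer.cameraDepthTargetHandle);
    }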
     
    DevDunk likes this.
  5. saz_at_evenflow

    saz_at_evenflow

    Joined:
    Mar 13, 2018
    Posts:
    10
    Thanks for the reply. I glanced at that video before, but it doesn't feel like a render feature will solve it for me, as I need to write this to a texture.
     
    DevDunk likes this.
  6. saz_at_evenflow

    saz_at_evenflow

    Joined:
    Mar 13, 2018
    Posts:
    10
    Thanks a lot! I had tried using the RenderStateBlock before in my attempts, but never passed along the cameraDepthTargetHandle as my render target.

    In addition, the normal DrawRenderers simply didn't work for me.
    I reverse-engineered the render feature functionality and instead used the RenderingUtils.DrawRendererListWithRenderStateBlock that Unity uses internally (source). This worked.
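    For reference, the public RendererList API in 2022-era URP can do roughly the same thing without calling internals. A sketch (field and method names assumed from the UnityEngine.Rendering.RendererListParams docs; verify against your URP version):

    Code (CSharp):

    // Build a RendererList that carries the RenderStateBlock, then draw it
    // through the command buffer instead of context.DrawRenderers.
    var tagValues   = new NativeArray<ShaderTagId>(1, Allocator.Temp);
    var stateBlocks = new NativeArray<RenderStateBlock>(1, Allocator.Temp);
    tagValues[0]    = ShaderTagId.none;   // apply the state block to all renderers
    stateBlocks[0]  = renderStateBlock;   // the ZTest Equal / ZWrite Off override

    var param = new RendererListParams(renderingData.cullResults, drawingSettings, filteringSettings)
    {
        tagValues   = tagValues,
        stateBlocks = stateBlocks
    };

    RendererList rendererList = context.CreateRendererList(ref param);
    cmd.DrawRendererList(rendererList);

    tagValues.Dispose();
    stateBlocks.Dispose();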
     
    wwWwwwW1 likes this.
  7. saz_at_evenflow

    saz_at_evenflow

    Joined:
    Mar 13, 2018
    Posts:
    10
    Well, I celebrated a bit too early.
    Turns out I get the reverse problem instead when I have a large object that I need the normals for, while some smaller object happens to sit on top of it. From what I can tell, my effect is simply not worth doing in post-processing.