Do the default UI & mask shaders write to the stencil buffer? If so, do they use a known WriteMask and ReadMask? I see a number of stencil properties in the default UI shader.
I just found this for HDRP. There are two "user bits":

UserBit0: 64 (0b1000000)
UserBit1: 128 (0b10000000)

That information can be found here: https://docs.unity3d.com/Packages/c...endering.HighDefinition.UserStencilUsage.html or in an example here: https://github.com/alelievr/HDRP-Cu.../Assets/CustomPasses/SeeThrough/SeeThrough.cs

The other bits are reserved by Unity, as listed here: https://github.com/Unity-Technologi...Runtime/RenderPipeline/HDStencilUsage.cs.hlsl

```hlsl
#define STENCILUSAGE_CLEAR                       (0)
#define STENCILUSAGE_REQUIRES_DEFERRED_LIGHTING  (2)
#define STENCILUSAGE_SUBSURFACE_SCATTERING       (4)
#define STENCILUSAGE_TRACE_REFLECTION_RAY        (8)
#define STENCILUSAGE_DECALS                      (16)
#define STENCILUSAGE_OBJECT_MOTION_VECTOR        (32)
// Note that several values repeat: the low bits are reused at
// different stages of the frame (e.g. after the opaque passes).
#define STENCILUSAGE_EXCLUDE_FROM_TAA            (2)
#define STENCILUSAGE_DISTORTION_VECTORS          (4)
#define STENCILUSAGE_SMAA                        (4)
#define STENCILUSAGE_WATER_SURFACE               (16)
#define STENCILUSAGE_AFTER_OPAQUE_RESERVED_BITS  (56)
#define STENCILUSAGE_USER_BIT0                   (64)
#define STENCILUSAGE_USER_BIT1                   (128)
#define STENCILUSAGE_HDRPRESERVED_BITS           (63)
```

These values can also be found in the internal enum `UnityEngine.Rendering.HighDefinition.StencilUsage`, which you can inspect by Ctrl+clicking the word "StencilUsage" in Visual Studio to decompile it.
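To illustrate how those user bits are meant to be used: because HDRP reserves the low six bits (mask 63), a shader that wants its own stencil marker should confine both its reads and writes to one of the user bits via `ReadMask`/`WriteMask`. A minimal ShaderLab sketch (not taken from the thread, just an example of the masking idea) might look like:

```shaderlab
// Sketch: mark pixels with UserBit0 (64) without touching HDRP's
// reserved bits (STENCILUSAGE_HDRPRESERVED_BITS = 63).
Stencil
{
    Ref       64        // STENCILUSAGE_USER_BIT0
    ReadMask  64        // only compare against our own bit
    WriteMask 64        // only ever write our own bit
    Comp      Always    // always pass the stencil test here
    Pass      Replace   // stamp UserBit0 into the buffer
}
```

A second shader (e.g. a custom pass, as in the SeeThrough example linked above) can then test `Ref 64`, `ReadMask 64`, `Comp Equal` to draw only where the marker was written, leaving HDRP's own stencil usage intact.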