It took a couple of hours, but I figured out how to make it work again. Just replace this line:

Code (cginc):
sceneDepth = 1.0 / ( (_VGZBufferParamA * tex2Dproj(_GrassDepthTex, UNITY_PROJ_COORD(IN.screenPos)).r) + _VGZBufferParamB );

with this:

Code (cginc):
sceneDepth = LinearEyeDepth(tex2Dproj(_GrassDepthTex, UNITY_PROJ_COORD(IN.screenPos)).r);

From what I understood from a quick search, the original code is supposed to improve the curve/distribution of the depth buffer's z-values so that it works better for close grass blades. However, after making the change above, I compared before-and-after screenshots, and it doesn't seem to be any worse. Did Unity perhaps implement a better depth-buffer distribution themselves?

[Screenshots: Before / After]

I know that in the changelog here [https://unity3d.com/unity/whats-new/unity-5.5.0], they do mention: "Graphics: Improve shadows precision for large game worlds. Use reverse-float depth buffer for improved depth buffer precision on modern GPUs. Particularly improves directional light shadowing for large view distances." However, that seems different from what your "custom LinearEyeDepth() parametrization" code does. Anyway, I'm certainly not a shader/rendering-pipeline expert, so the above is just a guess. But the fix does seem to restore at least the main functionality of the plugin until an official patch is made.
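For anyone curious what the replacement actually computes: Unity's built-in LinearEyeDepth(z) is 1.0 / (_ZBufferParams.z * z + _ZBufferParams.w), where _ZBufferParams is defined in UnityShaderVariables.cginc as (1 - far/near, far/near, x/far, y/far) for the conventional (non-reversed-Z) path. So the stock function is the same 1/(a*z + b) form as the old _VGZBufferParamA/_VGZBufferParamB line, just with Unity's own coefficients. Here's a small Python sketch of that math (the function and variable names are mine, just modeling the shader formula on the CPU for illustration):

```python
# Models Unity's LinearEyeDepth() for the conventional (non-reversed-Z)
# depth buffer, using the _ZBufferParams layout from
# UnityShaderVariables.cginc: (1 - far/near, far/near, x/far, y/far).
# All names here are illustrative, not part of any Unity API.

def zbuffer_params(near: float, far: float):
    x = 1.0 - far / near
    y = far / near
    return (x, y, x / far, y / far)

def linear_eye_depth(raw_z: float, near: float, far: float) -> float:
    # LinearEyeDepth(z) = 1 / (_ZBufferParams.z * z + _ZBufferParams.w)
    _, _, zc, wc = zbuffer_params(near, far)
    return 1.0 / (zc * raw_z + wc)

def raw_depth(eye_z: float, near: float, far: float) -> float:
    # Inverse mapping: the nonlinear value the depth buffer would store
    # for a given eye-space depth (0 at the near plane, 1 at the far plane).
    _, _, zc, wc = zbuffer_params(near, far)
    return (1.0 / eye_z - wc) / zc
```

With near = 0.3 and far = 100, linear_eye_depth(0.0, ...) comes out to 0.3 and linear_eye_depth(1.0, ...) to 100, and raw_depth/linear_eye_depth round-trip, which matches the intuition that the function just undoes the projection's nonlinear depth encoding. That's presumably why swapping in the stock function doesn't look any worse: the plugin's custom parameters were feeding the same kind of formula.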