Question Full Screen Shaders in URP?

Discussion in 'Universal Render Pipeline' started by Deleted User, Jan 21, 2022.

  1. Deleted User

    Deleted User

    Guest

    Hi. So I'm still just very bad at shaders. Long story short, I need a shader for a shockwave effect over top of the entire scene. I'm trying to follow this tutorial here, which led to this video here, which pointed to this link. I tried to follow along as best as I could, but I feel like the information is either way over my head or outdated. I can't seem to get a full screen shader to work following these instructions. I think my main problem starts when I'm supposed to add the blit. When I do, I get an error in the console that reads: "You can only call cameraColorTarget inside the scope of a ScriptableRenderPass. Otherwise the pipeline camera target texture might have not been created or might have already been disposed."

    I don't know what this means. I don't see anyone else who needs full screen shaders, or anyone following this tutorial, receiving this error, so I must be doing something wrong. I tried to find some information outside of this guy's tutorial. The most intriguing thing I found was this GitHub repository, which, if you scroll near the bottom, has a graph called Impact that is very similar to the shockwave shader graph I want. I was eager to try it out, but it turns out those shaders don't work anymore, since they were written for an old version of URP or something like that. So I'm wondering, after all this mindless running around without accomplishing anything: is there a better way to get full screen shaders working? Are there other resources or tutorials out there, preferably from somewhere other than this gamedevbill guy, that cover how to get full screen shaders working in URP, with up-to-date information that ACTUALLY works?
     
  2. Lo-renzo

    Lo-renzo

    Joined:
    Apr 8, 2018
    Posts:
    1,281
    MarekUnity and Deleted User like this.
  3. Deleted User

    Deleted User

    Guest

    I was able to get full screen shaders by setting my project up in a similar way to theirs. Thanks!
     
  4. drjmcdonald

    drjmcdonald

    Joined:
    Oct 4, 2016
    Posts:
    1
    Just in case anyone trips over this as well: I ran into the same issue following the same tutorial, i.e., "You can only call cameraColorTarget inside the scope of a ScriptableRenderPass. Otherwise the pipeline camera target texture might have not been created or might have already been disposed."

    The issue was that I had multiple cameras in the scene using the same URP ForwardRenderer: one the main Base camera, and another a stacked Overlay camera for UI.

    The solution was to make a copy of the ForwardRenderer, so you have one ForwardRenderer_Main and another ForwardRenderer_UI. Add both to the pipeline asset's Renderer List; you will see a '+' to add another ForwardRenderer.

    Then you can add the Blit feature to the ForwardRenderer of your choice. Set the main camera to use ForwardRenderer_Main and the UI camera to use ForwardRenderer_UI. That way the shockwave will only appear on the main scene and the UI will be unaffected. Hope this helps.
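
    For anyone who wants to see what "inside the scope of a ScriptableRenderPass" means in code, here is a minimal sketch of the kind of blit renderer feature the tutorial builds, written against URP 10–12 era APIs. Class, pass, and material names are illustrative, not the tutorial's exact code; the key detail is that cameraColorTarget is only read inside the pass's Execute, never in the feature's Create or a field initializer:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative sketch of a full-screen blit feature (URP 10-12 era APIs).
public class ShockwaveBlitFeature : ScriptableRendererFeature
{
    class ShockwavePass : ScriptableRenderPass
    {
        readonly Material material;

        public ShockwavePass(Material material)
        {
            this.material = material;
            renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            // Reading cameraColorTarget here, inside a ScriptableRenderPass,
            // is what the error message asks for; reading it in the feature's
            // Create() is what triggers the error.
            RenderTargetIdentifier source = renderingData.cameraData.renderer.cameraColorTarget;

            CommandBuffer cmd = CommandBufferPool.Get("ShockwaveBlit");
            int tempId = Shader.PropertyToID("_ShockwaveTemp");
            cmd.GetTemporaryRT(tempId, renderingData.cameraData.cameraTargetDescriptor);

            // Blit through a temporary target; blitting a texture onto itself is undefined.
            Blit(cmd, source, tempId, material);
            Blit(cmd, tempId, source);

            cmd.ReleaseTemporaryRT(tempId);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    public Material material;
    ShockwavePass pass;

    public override void Create() => pass = new ShockwavePass(material);

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (material != null)
            renderer.EnqueuePass(pass);
    }
}
```

    Fetching the target fresh each frame inside the pass also keeps the feature valid when the renderer is torn down and recreated.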
     
    Ruchir and thibaut_r like this.
  5. moatdd

    moatdd

    Joined:
    Jan 13, 2013
    Posts:
    74
    Since I was directed here via a web search engine, I'm going to do a favor for everyone who took the same route looking for an answer to the same question:

    This is Unity's URP 16 documentation for custom post-processing effects

    https://docs.unity3d.com/Packages/c...g/post-processing-custom-effect-low-code.html

    A short summary of the steps involved in case the link above breaks:

    Create your Shader
    1. Create > Shader Graph > URP > Fullscreen Shader Graph.
    2. Add a URP Sample Buffer node (this grabs the screen texture for your shader to process).
    3. In the URP Sample Buffer node's Source Buffer dropdown menu, select BlitSource.
    4. Do whatever you want to the input and send the result to the fragment's Base Color output (see the image in Unity's docs).
    5. Save your graph.


    Applying the Effect
    1. Create a Material and assign your Shader to it.
    2. Add a Full Screen Pass Renderer Feature to the URP Renderer you want to apply your shader to.
    3. Assign the material to the Full Screen Pass Renderer Feature's "Post Process Material".
    4. Set Injection Point to After Rendering Post Processing.
    5. Set Requirements to Color.

    BONUS ROUND: for people who prefer to create their shaders directly in HLSL code

    Set up a pass-through shader like so:


    Add all the parameters you need:

    Save and select the shader graph.

    View the Generated Shader.
    Grab everything and copy-paste it to a new shader.
    Edit that new shader.
    Rename it so that it doesn't get lost in the Shader Graphs menu.

    Find:
    Code (CSharp):
    SurfaceDescription SurfaceDescriptionFunction(SurfaceDescriptionInputs IN)

    Replace with this and edit it as necessary.
    Code (CSharp):
    // =================================================================
    // Frag Out
    // =================================================================

    SurfaceDescription SurfaceDescriptionFunction(SurfaceDescriptionInputs IN)
    {
        SurfaceDescription surface;
        const float2 inputUvs   = IN.NDCPosition.xy;
        const float4 inputColor = Unity_Universal_SampleBuffer_BlitSource_float(inputUvs);

        // Insert your own code, modifying inputColor
        float4 outputColor = float4(inputColor.r, inputColor.g, inputColor.b, 1);

        surface.BaseColor = outputColor.xyz;
        surface.Alpha     = 1;
        return surface;
    }
    YOU AREN'T DONE YET!

    There is ANOTHER instance of the SurfaceDescriptionFunction. You must replace that one as well with your edited code. In total, there are TWO instances of that function, one in the Blit pass and one in the DrawProcedural Pass. BOTH must be replaced.


    If this didn't work for you, post your updated solution below.
     
    Last edited: Mar 23, 2023
    Olmi likes this.
  6. moatdd

    moatdd

    Joined:
    Jan 13, 2013
    Posts:
    74
    Bonus Addendum: _BlitTexture access

    By default, shaders generated from a graph will contain the following function used to read the full-screen texture for processing.

    Code (CSharp):
    float4 Unity_Universal_SampleBuffer_BlitSource_float(const float2 uv)
    {
        uint2 pixelCoords = uint2(uv * _ScreenSize.xy);
        return LOAD_TEXTURE2D_X_LOD(_BlitTexture, pixelCoords, 0);
    }
    This function has shortcomings in pipelines that upscale or downscale image buffers, often creating unsightly aliasing artifacts on shapes or small text.

    This is because it converts the float2 uv into uint2 pixelCoords, in order to satisfy LOAD_TEXTURE2D_X_LOD's requirement for integer coordinates, and that conversion truncates away the subpixel information, promptly introducing a rounding bug.

    wibbly wibbly wobbly result:

    You can solve this issue by keeping the UV as floats: read the texture with a sampling function that takes float UVs instead:

    Code (CSharp):
    float4 Unity_Universal_SampleBuffer_BlitSource_float(const float2 uv)
    {
        return SAMPLE_TEXTURE2D(_BlitTexture, SamplerState_Trilinear_Repeat, uv);
    }
    result:
     
    Last edited: Mar 25, 2023
    saz_at_evenflow likes this.