
Multiple questions about the 2D Renderer. Been researching for weeks and can't find info / solution

Discussion in 'Universal Render Pipeline' started by Jamez0r, Jan 4, 2020.

  1. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    Hi guys, I don't want this to come off as a rant, but I've been trying to figure out how to configure my 2D project so that I can use the 2D Lights system and do a couple of other custom rendering things, and after two weeks of researching I still don't know how to do what I want to do. Everything I read is either outdated, doesn't work with URP (grabPass, etc.), or has seemingly zero information or examples available. I would seriously appreciate a bit of help here. I've spent days researching and learning techniques for custom rendering, only to find out that they can't be done, whether due to URP limitations or other reasons.

    Overall, what I'm looking to do is create a project that can:

    -----------------------------

    I'll try to narrow down my thoughts to a couple specific questions:

    1) In my previous post (the one about reflections) @CaptainScience gave me a nice detailed description of how I could create that effect. The immediate problem I have is that the "2D Renderer Data Asset" (the one that you drag into the Renderer List in the "UniversalRenderPipelineAsset") doesn't have any option to add Renderer Features:



    As opposed to the "Forward Renderer" which has an option to add them:



    Am I missing something here? Can I not do any custom rendering with the 2D Renderer? What's the deal? Can I use the 2D Lights with a Forward Renderer, so that I can add my own Renderer Features, or do they only work with the 2D Renderer? Is the 2D Renderer just a custom-made Forward Renderer, and could I copy its layout and then make my own Forward Renderer based on that? I can't find a single bit of information on this :(

    [MINOR POST UPDATE]

    I opened up both the "Forward Renderer.asset" and the "2D Renderer Data.asset" in notepad. Both of them have the "m_RendererFeatures: []" array. This would lead me to believe that I should be able to add RendererFeatures to the 2D Renderer... but how? :confused:





    ------------------------------

    2) GrabPass has been removed and has been "replaced" by _CameraOpaqueTexture. Please correct me if I'm wrong, but isn't this completely useless for a 2D game since everything is rendered during Transparent? Is there some new way to do the equivalent of a GrabPass during Transparent? Maybe after Transparent has finished rendering? To be super specific, what I'm looking to do is make a heat-distortion trail/arc effect when an enemy swings their weapon.
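
    Just to make it concrete, with the Forward Renderer I would expect to be able to write a Renderer Feature roughly like the sketch below, which copies the screen after the transparent pass into a global texture that a distortion shader could sample. This is untested, and the names (GrabAfterTransparentsFeature, _GrabbedTexture) are just mine, but it's exactly the kind of thing I can't see any way to add to the 2D Renderer:

    Code (csharp):
        using UnityEngine.Rendering;
        using UnityEngine.Rendering.Universal;

        // Untested sketch: copy the camera color target after transparents into a
        // global texture ("_GrabbedTexture" is just a name I made up) so a
        // heat-distortion shader can sample what is behind it.
        public class GrabAfterTransparentsFeature : ScriptableRendererFeature
        {
            class GrabPass : ScriptableRenderPass
            {
                RenderTargetHandle m_Temp;
                RenderTargetIdentifier m_Source;

                public GrabPass()
                {
                    m_Temp.Init("_GrabbedTexture");
                    renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
                }

                public void Setup(RenderTargetIdentifier source)
                {
                    m_Source = source;
                }

                public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
                {
                    CommandBuffer cmd = CommandBufferPool.Get("GrabAfterTransparents");
                    cmd.GetTemporaryRT(m_Temp.id, renderingData.cameraData.cameraTargetDescriptor);
                    cmd.Blit(m_Source, m_Temp.Identifier());                       // copy what's on screen so far
                    cmd.SetGlobalTexture("_GrabbedTexture", m_Temp.Identifier());  // make it available to shaders
                    context.ExecuteCommandBuffer(cmd);
                    CommandBufferPool.Release(cmd);
                }

                public override void FrameCleanup(CommandBuffer cmd)
                {
                    cmd.ReleaseTemporaryRT(m_Temp.id);
                }
            }

            GrabPass m_Pass;

            public override void Create()
            {
                m_Pass = new GrabPass();
            }

            public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
            {
                m_Pass.Setup(renderer.cameraColorTarget);
                renderer.EnqueuePass(m_Pass);
            }
        }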

    I've read probably like 20 posts about this. Some people had solutions, some didn't.
    At the end of this post, @Elringus responded with info that seems useful: https://forum.unity.com/threads/srp-and-grabpass-with-transparent.555892/
    At the end of this post, @phil_lira responded with good info: https://forum.unity.com/threads/the-scriptable-render-pipeline-how-to-support-grabpass.521473/

    But again, how can I do any of that with the 2D Renderer?

    --------------------------------

    3) Are there ANY examples of ANY sort of custom rendering for a 2D game using the 2D Renderer?

    This seems to be what everyone suggests looking at for examples using URP, but they are all 3D: https://github.com/Unity-Technologies/UniversalRenderingExamples
    This link shows FEATURES of the 2D Renderer, but nothing about custom rendering: https://github.com/Unity-Technologies/2d-renderer-samples

    --------------------------------

    Again, I'm sorry if this post is coming off as a rant, but this is the culmination of WEEKS of research and testing and never coming to a solution. Instead of reading official documentation, I feel like I've been trying to piece together tidbits of information from pages and pages of posts on this forum. It's very frustrating. I'm not trying to do anything out of the ordinary, I just want to use the 2D Lights system along with some run-of-the-mill effects.

    Thanks for reading my post, and again, sorry that this sounds so complain-y :oops:
     
    Last edited: Jan 4, 2020
  2. Shane_Michael

    Shane_Michael

    Joined:
    Jul 8, 2013
    Posts:
    158
    I haven't worked with the 2D renderer at all, but it does seem like it doesn't support custom renderer features, at least not yet. You do have all the source code for the render pipeline, though, so you can make a custom version of the renderer to do exactly what you need it to.

    You should have a local cache of the package somewhere (on Windows it is "C:\Users\Username\AppData\Local\Unity\cache\packages\"). You can find the Renderer2D.cs file and make a copy of it in your own project; call it CustomRenderer2D.cs. Then find these lines in the Setup() function:

    Code (csharp):
        ConfigureCameraTarget(m_ColorTargetHandle.Identifier(), BuiltinRenderTextureType.CameraTarget);

        m_Render2DLightingPass.ConfigureTarget(m_ColorTargetHandle.Identifier());
        EnqueuePass(m_Render2DLightingPass);
    That is the pass that is doing the main rendering. So find Render2DLightingPass.cs, make a copy of that too, and call it Render2DReflectionPass.cs. Then you can edit it to only render the things you want to have a reflection; you could filter by ShaderTagIds, layers, rendering layers, render queue range, etc., whatever is going to work for your setup. Then it will render all those things with the regular lights; you may need a second set of lights reflected about the ground plane so the lighting looks correct.
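
    Just to illustrate the filtering part (this sketch is untested, and the real Render2DLightingPass does a lot more to set up the light textures), a stripped-down pass that only draws renderers on a given layer could look something like this:

    Code (csharp):
        using UnityEngine;
        using UnityEngine.Rendering;
        using UnityEngine.Rendering.Universal;

        // Hypothetical stripped-down version of the reflection pass: draws only the
        // renderers on the given layer mask. I believe the 2D sprite shaders use the
        // "Universal2D" LightMode tag, but double-check that against the shader source.
        class Render2DReflectionPass : ScriptableRenderPass
        {
            static readonly ShaderTagId k_ShaderTag = new ShaderTagId("Universal2D");
            FilteringSettings m_FilteringSettings;

            public Render2DReflectionPass(LayerMask reflectorLayers)
            {
                m_FilteringSettings = new FilteringSettings(RenderQueueRange.all, reflectorLayers);
            }

            public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
            {
                DrawingSettings drawingSettings = CreateDrawingSettings(
                    k_ShaderTag, ref renderingData, SortingCriteria.CommonTransparent);
                context.DrawRenderers(renderingData.cullResults, ref drawingSettings, ref m_FilteringSettings);
            }
        }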

    Then back in your CustomRenderer2D you do something like this right after the regular lighting pass is enqueued:

    Code (csharp):
        m_ReflectionTargetHandle = CreateOffscreenColorTexture(context, ref cameraTargetDescriptor, FilterMode.Bilinear);
        m_Render2DReflectionPass.ConfigureTarget(m_ReflectionTargetHandle.Identifier());
        EnqueuePass(m_Render2DReflectionPass);
    And that should render your reflection geometry into that target. You may need one more pass to render out your transparent geometry; pass your m_ReflectionTargetHandle into the Setup() function of your transparent reflection pass and assign it to a global shader property so it is available when your shader needs it.
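
    For that last step, assigning the global texture is just a SetGlobalTexture call somewhere in the pass, for example (the property name is just an example):

    Code (csharp):
        // Inside the transparent reflection pass, e.g. at the start of Execute():
        CommandBuffer cmd = CommandBufferPool.Get("SetReflectionTexture");
        cmd.SetGlobalTexture("_ReflectionTex", m_ReflectionTargetHandle.Identifier());  // sample this in your shader
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);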

    That is just my take after inspecting the code so no guarantees on the details, but something like that should work. The important thing is that with the source code, you can follow the logic and see everything the renderer is doing and add any custom rendering you want in the right spot.

    Customizing things may cause problems when the pipeline is updated. Small changes you may be able to merge automatically if you're using source control, but any changes to the functionality you've customized may break the renderer so there will be some maintenance involved. It looks like the 2D renderer is still marked as "experimental" so I would expect there to be some changes. If they happen to add support for renderer features, though, you can refactor your code to use them and you won't need to modify the renderer class itself any more.

    It is quite late here, so hopefully that made some sense and is helpful.
     
  3. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205

    @Shane_Michael Seriously, thank you so much for the detailed help. I believe I understand pretty much everything I have to do, and what I need to research.

    To duplicate the original Renderer2D.cs to create the CustomRenderer2D.cs, I had to copy the "Universal RP" library from that unity/cache location directly into my project's Library folder. There appear to be a lot of "internal" classes for the library, so I couldn't create the CustomRenderer2D.cs file in my actual project (aka under the Assets folder), but instead had to create it within that localized /Library/Universal RP folder.

    The point I'm at right now is trying to figure out how to "hook up" my project to use the new CustomRenderer2D.cs that I created. If I was using the default setup, I would create a "2D Renderer Data.asset" (by going Create -> Rendering -> URP -> 2D Renderer) and then drag that into the "Renderer List" on the UniversalRenderPipelineAsset. So it seems I either need to figure out how to create my own version of that "2D Renderer Data.asset" to reference the new CustomRenderer2D.cs file, or possibly create my own version of the PipelineAsset.

    So hopefully my final question is this:

    Should I be researching how to create a Scriptable Render Pipeline? Or can I bypass that by somehow injecting my CustomRenderer2D.cs into the "Renderer List" of the default UniversalRenderPipelineAsset?

    If I do need to create my own Scriptable Render Pipeline, I found these links:
    https://blogs.unity3d.com/2018/01/31/srp-overview/
    https://catlikecoding.com/unity/tutorials/scriptable-render-pipeline/custom-pipeline/

    I feel like I'm 99% of the way there (well, besides doing the ACTUAL custom code, haha). Thanks again for any help!

    [ EDIT ]

    My issue may be that I'm not understanding how these .asset files work in Unity.

    When I open the UniversalRenderPipelineAsset.asset or the 2DRendererData.asset in Notepad, it almost seems like it's an "instance" of a class - it just contains the specific settings/data. I'm not sure how it works, or how it is created (i.e. how I could create one of these for my CustomRenderer2D). Going to Google it and try to figure it out :p, will keep this post updated if I make some progress!
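
    My current guess (completely untested, and the class/menu names are placeholders I made up) is that the .asset is just a serialized ScriptableObject, so I'd need a small data class whose Create() hands out my CustomRenderer2D, something like the sketch below. If Renderer2DData can't be subclassed from outside the package (a lot of it is internal), I assume I'd have to copy that class as well, like I did with the renderer:

    Code (csharp):
        using UnityEngine;
        using UnityEngine.Rendering.Universal;
        using UnityEngine.Experimental.Rendering.Universal;

        // Guess: a renderer data asset that creates my CustomRenderer2D instead of the
        // built-in Renderer2D. The CreateAssetMenu path is just a placeholder.
        [CreateAssetMenu(menuName = "Rendering/Custom 2D Renderer Data")]
        public class CustomRenderer2DData : Renderer2DData
        {
            protected override ScriptableRenderer Create()
            {
                return new CustomRenderer2D(this);
            }
        }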
     
    Last edited: Jan 6, 2020
  4. japhib

    japhib

    Joined:
    May 26, 2017
    Posts:
    65
  5. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    Thanks so much for commenting on my post to let me know about this! I still hadn't been able to get any of this to work (moved on to other stuff, figured I'd get back to it sometime later), so this is really exciting! Can't wait to give it a try!
     
  6. japhib

    japhib

    Joined:
    May 26, 2017
    Posts:
    65
    No problem! There was a bit of finagling I had to do but I was able to get a screen-space distortion to work eventually, so if you have other questions, I might know the answer :)
     
  7. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    That would be awesome! I'm probably going to take a look at everything this weekend. Gonna shoot you a direct message. Thanks man!
     
  8. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    I was able to get the reflections working by doing the following:
    1) Main camera that renders all Layers except the "Reflectors" layer
    2) Secondary camera that only renders the Reflectors layer. This camera is set up to follow the Main camera's position and orthographic size, and outputs to a RenderTexture. The RenderTexture is created through a script at Awake() and is set to the size of the screen (rough sketch of this setup after the list).
    3) Put the flipped-upside-down-sprites (the "fake reflections") on the Reflectors layer
    4) Stencil (sprite masks) on the puddles/water that defines where the reflections should be
    5) A full-screen-sized sprite set up with a custom shader that draws the RenderTexture to the screen. It also USES the stencil, and applies distortion (the watery/icy effect). This sprite is updated every frame to be centered on the camera, and is sized to the camera's view. Note: if you are using a dummy sprite for the _MainTex (the Lost Crypt example uses a random piece of the stone wall; my version used just a white square) and are using its .uv in your shader, make sure that dummy sprite doesn't get packed into a Sprite Atlas. I initially used a dummy sprite that got packed into an atlas, and in the scene view, and even the game view when the game wasn't running, everything looked correct. But as soon as I started the game, the reflections would just disappear. It took me about 3 hours to figure out that it was because the dummy sprite was getting packed into an atlas, so its .uv coords were no longer 0 to 1, which broke the calculations in my shader.
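
    In case it helps anyone, here is a stripped-down version of the setup script from step 2 (not my exact code, and the field names are just placeholders). It creates the screen-sized RenderTexture once at Awake(), points the secondary camera at it, and keeps that camera matched to the main camera:

    Code (csharp):
        using UnityEngine;

        // Simplified sketch: the reflection camera (culling mask = Reflectors layer only)
        // renders into a screen-sized RenderTexture that the full-screen sprite's
        // shader samples. Field names are placeholders.
        public class ReflectionCameraSetup : MonoBehaviour
        {
            public Camera mainCamera;
            public Camera reflectionCamera;     // only renders the Reflectors layer
            public Material reflectionMaterial; // material on the full-screen sprite

            RenderTexture m_ReflectionRT;

            void Awake()
            {
                // Create the RenderTexture once and reuse it every frame.
                m_ReflectionRT = new RenderTexture(Screen.width, Screen.height, 0);
                reflectionCamera.targetTexture = m_ReflectionRT;
                reflectionMaterial.SetTexture("_ReflectionTex", m_ReflectionRT);
            }

            void LateUpdate()
            {
                // Keep the reflection camera matched to the main camera's view.
                reflectionCamera.transform.position = mainCamera.transform.position;
                reflectionCamera.orthographicSize = mainCamera.orthographicSize;
            }

            void OnDestroy()
            {
                if (m_ReflectionRT != null)
                    m_ReflectionRT.Release();
            }
        }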

    I'm still trying to find info on what's actually going on behind the scenes when there are multiple cameras. I know the cameras have a "priority" which is supposed to determine rendering order, but what exactly does that do? Does the camera that gets rendered "first" render ALL layers and everything, and THEN the next camera starts and renders all layers? Or what? I googled for info on it and couldn't find any.
     
  9. Teky

    Teky

    Joined:
    Aug 23, 2017
    Posts:
    8
    Hi Jamez0r, are you experiencing any kind of FPS drop when you use a second camera to render your scene onto a sprite for the effect? I am using the same kind of setup and already have a distortion shader working when my player jumps, but I am seeing an FPS drop even when the second camera is merely enabled to produce the render texture. I would normally use Graphics.Blit() to access the camera image, but that isn't possible with the experimental 2D Renderer. Any ideas how to fix this?

    Thanks
     
  10. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    Are you saying that the distortion effect only happens when the player jumps? Or how is the player jumping relevant? What do you mean by fps drop? How much of an fps drop?

    I'm not getting any huge fps drop from the camera/render texture setup, but it clearly isn't a "free" operation. I'm on PC though and haven't tested on mobile or anything, so that could be totally different.

    Maybe try 'disabling' your new distortion setup, getting a baseline for your FPS, and then adding things in one step at a time. Maybe something like this:
    1) Add 2nd camera, run game - any effect on fps?
    2) Make 1st camera (or whichever one in your case) output to a Render Texture - any effect on fps?
    3) Set up the sprite that uses the Render Texture - any effect on fps?
    4) Apply your distortion shader - any effect on fps?

    The main thing off the top of my head that would cause a big fps drop is creating the RenderTexture every frame instead of clearing and re-using a single one. If your game is on mobile, though, I have no idea what to expect FPS-wise with something like this.
     
  11. Jamez0r

    Jamez0r

    Joined:
    Jul 29, 2019
    Posts:
    205
    You could also run the Profiler and look at your CPU/GPU to see what is causing the slowdown. A lot of the time that can at least point you in the right direction.
     
  12. Teky

    Teky

    Joined:
    Aug 23, 2017
    Posts:
    8
    Hi, indeed the effect is only used when my character jumps, creating a distortion wave around it.
    I disabled all the components involved in the effect (even the creation of the render texture), and I get the FPS drop purely as a consequence of enabling the second camera. This only happens on machines without a dedicated graphics card, so maybe it is normal. When I make a build and run it on a good computer, everything is fine (60 FPS); on a computer without a graphics card I get 30 FPS instead of 60, but only while the second camera is enabled, even without the effect or render texture.
    Maybe this is perfectly normal and I am just being greedy in wanting the game to work on not-so-good computers...

    Thanks anyway, I'll also try the profiler just in case!