Question: Capture Overlay UGUI Camera to Texture

Discussion in 'Universal Render Pipeline' started by Ferazel, Nov 17, 2020.

  1. Ferazel

    Ferazel

    Joined:
    Apr 18, 2010
    Posts:
    517
    I'm currently using 2020.1.6 with URP 8.2.0 with the 2D renderer.

    I have a gameplay camera (2D Renderer), a gameplay UI camera (3D renderer), and then a screen-space overlay UI layer. Why? There are various points in the game where we take a screengrab of the game and render it to a texture. It is desirable to have some of the UI (but not all of it) render in this screengrab so that we can point at/emphasize areas of the UI. I don't want to put the gameplay UI into gameplay space because I don't want the UI to be affected by post-processing, so it needs to be in its own camera space (?).

    What I hacked together today is the following, and I'm wondering if I'm missing something, because I don't feel it should be this difficult.

    To preface: I can't seem to render an Overlay camera to a texture (I can only get a BASE camera to render to a texture). Am I doing something wrong, or is that a limitation of URP? The method I'm using is setting the overlay camera's targetTexture property and doing a manual Camera.Render() call on it. I couldn't find the cameraOutput property mentioned in the docs (https://docs.unity3d.com/Packages/c...9.0/manual/rendering-to-a-render-texture.html). I'm guessing that documentation predates the camera stacking behavior.
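
    For reference, this is roughly what I'm attempting (a minimal sketch; the overlayCam reference and the render texture size are placeholders, not my actual setup):

    // Sketch: give the stacked Overlay camera a target texture and render it manually.
    // In my testing this only produces output when the camera is set as a Base camera.
    RenderTexture rt = new RenderTexture(Screen.width, Screen.height, 24);
    overlayCam.targetTexture = rt;   // overlayCam = the stacked Overlay UI camera (assumed reference)
    overlayCam.Render();             // manual render call; comes out empty for an Overlay camera
    overlayCam.targetTexture = null; // restore so the camera returns to normal stack rendering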

    So to get around this limitation I had to do the following (a rough sketch of the script follows the list):
    - GameplayUICamera: this camera is always enabled and renders the gameplay UI onto the camera stack.
    - GameplayUICameraScreenshot: a duplicate camera with its component disabled, with the same size/dimensions as the stacked overlay camera, set as a BASE camera.
    - I then have a script that generates a render texture for the gameplay camera. I render the gameplay camera to that texture via its .targetTexture property and a Camera.Render() call.
    - I then render the gameplay UI camera to a separate render texture (using the same render texture format as the gameplay camera's) by setting the GameplayUICameraScreenshot camera's .targetTexture property.
    - With both of these render textures I then perform a Graphics.Blit with a basic alpha-blend shader material. The destination of that blit becomes the texture I display.
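
    Roughly, the capture script looks like this (a sketch only; the camera references, alphaBlendMaterial, and texture sizes stand in for my actual setup and aren't meant to be copy-paste ready):

    // 1. Render the gameplay (Base) camera into its own render texture.
    RenderTexture gameplayRT = new RenderTexture(Screen.width, Screen.height, 24);
    gameplayCam.targetTexture = gameplayRT;
    gameplayCam.Render();
    gameplayCam.targetTexture = null;

    // 2. Render the UI through the disabled, Base-mode screenshot camera into a second texture.
    RenderTexture uiRT = new RenderTexture(Screen.width, Screen.height, 24, gameplayRT.format);
    gameplayUIScreenshotCam.targetTexture = uiRT;
    gameplayUIScreenshotCam.Render();
    gameplayUIScreenshotCam.targetTexture = null;

    // 3. Composite the UI over the gameplay image with a simple alpha-blend material.
    RenderTexture finalRT = new RenderTexture(Screen.width, Screen.height, 0);
    Graphics.Blit(gameplayRT, finalRT);               // copy the gameplay image
    Graphics.Blit(uiRT, finalRT, alphaBlendMaterial); // blend the UI on top (assumed material)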

    This seems extremely hacky. Is there something I can do to skip some of these steps? In my imaginary world, I could perform a camera render on an Overlay camera and it would render to texture everything UP to that camera in the stack. Another desirable would be rendering the UI camera onto the same render texture as the gameplay camera, so that I don't double the memory overhead with two fullscreen render textures that I then need to blend myself.

    Thanks for any assistance you can provide!
     
  2. Ferazel

    Ferazel

    Joined:
    Apr 18, 2010
    Posts:
    517
    I was eventually able to resolve this issue by digging into the Camera Stacking thread (https://forum.unity.com/threads/camera-stacking-for-urp.736277/page-3#post-6278039). What I ended up doing is quite a bit simpler.

    baseCam.targetTexture = screenRenderTexture;
    uiCam.targetTexture = screenRenderTexture;
    uiCam.Render(); // not sure why this is necessary but the first render will not render the UI
    baseCam.Render(); // Will render both the base and overlay cameras
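
    For fuller context, the surrounding code looks roughly like this (a sketch; the render texture creation and the restore step are assumptions added around the four lines above, not exactly my production code):

    // Create (or reuse) a screen-sized render texture for the capture.
    RenderTexture screenRenderTexture = new RenderTexture(Screen.width, Screen.height, 24);

    // Point the Base camera and the stacked Overlay UI camera at the same texture.
    baseCam.targetTexture = screenRenderTexture;
    uiCam.targetTexture = screenRenderTexture;

    // Prime the UI camera, then render the Base camera, which draws the whole stack.
    uiCam.Render();
    baseCam.Render();

    // Restore both cameras so they go back to rendering to the screen.
    baseCam.targetTexture = null;
    uiCam.targetTexture = null;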

    Just in case someone else is running into problems.
     
  3. weiping-toh

    weiping-toh

    Joined:
    Sep 8, 2015
    Posts:
    192
    My workaround is to have separate implementations of ScriptableRenderer for each of the different use cases. Similarly, my setup needs cameras that render UI which has to sit behind 3D objects located in different spaces, hence the need to render to render textures.

    In my workaround, I made a simple implementation of the ScriptableRenderer with only a couple of DrawObjectsPasses directed at a custom render texture, which is then shared with another renderer that can load it as a background texture. It definitely performs better than adding custom RenderFeatures to the cumbersome default ForwardRenderer.
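
    In case it helps, here is a rough skeleton of that idea (not my actual code; the class names, the sharedTarget field, and the pass configuration are assumptions, and the DrawObjectsPass constructor arguments shown are the URP 8.x ones, which differ in later versions):

    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    using UnityEngine.Rendering.Universal.Internal;

    // Renderer asset: add it to the URP asset's Renderer List and assign it to the UI camera.
    [CreateAssetMenu(menuName = "Rendering/UI Texture Renderer")]
    public class UITextureRendererData : ScriptableRendererData
    {
        public RenderTexture sharedTarget; // texture shared with the renderer that uses it as a background
        public LayerMask uiLayerMask = -1;

        protected override ScriptableRenderer Create()
        {
            return new UITextureRenderer(this);
        }
    }

    public class UITextureRenderer : ScriptableRenderer
    {
        readonly UITextureRendererData m_Data;
        readonly DrawObjectsPass m_DrawOpaque;
        readonly DrawObjectsPass m_DrawTransparent;

        public UITextureRenderer(UITextureRendererData data) : base(data)
        {
            m_Data = data;

            // Only two draw passes: opaques then transparents on the chosen layer.
            m_DrawOpaque = new DrawObjectsPass("Draw UI Opaques", true,
                RenderPassEvent.BeforeRenderingOpaques, RenderQueueRange.opaque,
                data.uiLayerMask, StencilState.defaultValue, 0);
            m_DrawTransparent = new DrawObjectsPass("Draw UI Transparents", false,
                RenderPassEvent.BeforeRenderingTransparents, RenderQueueRange.transparent,
                data.uiLayerMask, StencilState.defaultValue, 0);
        }

        public override void Setup(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            // Draw straight into the shared texture instead of the camera's own target.
            RenderTargetIdentifier target = m_Data.sharedTarget;
            ConfigureCameraTarget(target, target);

            EnqueuePass(m_DrawOpaque);
            EnqueuePass(m_DrawTransparent);
        }
    }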
     
  4. Aupuma

    Aupuma

    Joined:
    Feb 15, 2017
    Posts:
    42
    Thank you so much, I solved it simply by setting:
    uiCam.targetTexture = baseCam.targetTexture;
     
    wuGor and noio like this.