Problem with custom render texture resolutions and anti-aliasing

Discussion in 'High Definition Render Pipeline' started by iSpiegelball, Feb 28, 2020.

  1. iSpiegelball

     Joined: Jan 7, 2020
     Posts: 14
    Hey guys.

    I want to set up a super-resolution anti-aliasing rendering mechanism, and to do so I

    • have the scene camera render into a custom render texture (which has twice the resolution of the screen target)
    • display the render texture with Graphics.Blit()...
    For this I have a simple script that sets the camera's target texture to the custom RT at the beginning of each frame and clears it after rendering has finished (so that the Blit() call renders the result to the target display). This is my script:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    public class RTBlit : MonoBehaviour
    {
        // The custom render texture the main camera should render into.
        [SerializeField]
        private RenderTexture rt;

        private void Start()
        {
            RenderPipelineManager.beginCameraRendering += OnBeginCameraRendering;
            RenderPipelineManager.endCameraRendering += OnEndCameraRendering;
        }

        private void OnDestroy()
        {
            // Unsubscribe so the handlers do not outlive this component.
            RenderPipelineManager.beginCameraRendering -= OnBeginCameraRendering;
            RenderPipelineManager.endCameraRendering -= OnEndCameraRendering;
        }

        void OnBeginCameraRendering(ScriptableRenderContext src, Camera camera)
        {
            // Application.isPlaying (instead of EditorApplication.isPlaying)
            // avoids a UnityEditor dependency that would break player builds.
            if (!Application.isPlaying)
                return;

            // Redirect the main camera into the custom render texture.
            Camera.main.targetTexture = rt;
            Debug.Log(Camera.main.targetTexture);
        }

        void OnEndCameraRendering(ScriptableRenderContext src, Camera camera)
        {
            if (!Application.isPlaying)
                return;

            // The target texture has to be null for the Blit below to reach the screen.
            Camera.main.targetTexture = null;
            Debug.Log(Camera.main.targetTexture);
            Graphics.Blit(rt, null as RenderTexture);
        }
    }
    This mechanism works fine in Unity's built-in render pipeline (note that the script uses different callbacks there).
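
    For reference, the built-in-pipeline variant looks roughly like this, a minimal sketch assuming the static Camera.onPreRender/onPostRender callbacks (my actual script may differ in details):

    Code (CSharp):
    using UnityEngine;

    // Sketch of the built-in-pipeline variant: the static Camera callbacks
    // replace the RenderPipelineManager events, which only fire in SRPs.
    public class RTBlitBuiltin : MonoBehaviour
    {
        [SerializeField]
        private RenderTexture rt;

        private void OnEnable()
        {
            Camera.onPreRender += HandlePreRender;
            Camera.onPostRender += HandlePostRender;
        }

        private void OnDisable()
        {
            Camera.onPreRender -= HandlePreRender;
            Camera.onPostRender -= HandlePostRender;
        }

        private void HandlePreRender(Camera cam)
        {
            if (cam != Camera.main || !Application.isPlaying)
                return;
            cam.targetTexture = rt;
        }

        private void HandlePostRender(Camera cam)
        {
            if (cam != Camera.main || !Application.isPlaying)
                return;
            cam.targetTexture = null;
            Graphics.Blit(rt, null as RenderTexture);
        }
    }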

    In HDRP, however, I am running into two problems.

    1. Changing the render texture's resolution changes the position and size of the blitted result on screen.

    • Default resolution (screenshots: normalResSettings.PNG, normalResGame.PNG)
    • Double resolution (screenshots: doubleResSettings.PNG, doubleResGame.PNG)
    • Half resolution (screenshot: halfResGame.PNG)

    As you can see, the output is scaled inversely to the render texture's resolution.
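
    One idea that comes to mind, purely as a speculative sketch, is to compensate manually via the overload of Graphics.Blit that takes an explicit UV scale and offset, but that would only treat the symptom (and I am not even sure in which direction the factor would have to go):

    Code (CSharp):
    // Speculative sketch, not a fix: compensate for the scaling by hand.
    // The direction of the factor is a guess and may need to be inverted.
    float factor = (float)Screen.width / rt.width;
    Graphics.Blit(rt, null as RenderTexture, new Vector2(factor, factor), Vector2.zero);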

    2. Changing the anti-aliasing value of the render texture to anything except "None" results in a completely black image.
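
    In case it helps, a rough code equivalent of how the render texture is set up (a sketch; I configure the asset in the inspector, and the depth and format here are assumptions):

    Code (CSharp):
    // Sketch of the render texture setup; depth buffer and format are assumptions.
    var rt = new RenderTexture(Screen.width * 2, Screen.height * 2, 24, RenderTextureFormat.ARGB32);
    rt.antiAliasing = 2; // any value other than 1 ("None") yields the black image
    rt.Create();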

    Any help on that would be greatly appreciated.