
Question: Help making sense of URP RenderDoc capture

Discussion in 'Universal Render Pipeline' started by Zaflores, Mar 20, 2023.

  1. Zaflores

    Zaflores

    Joined:
    Sep 23, 2017
    Posts:
    10
    Hello!

    I have some questions about interpreting a RenderDoc capture of an XR project running on the Oculus Quest 2, using URP with OpenGL ES as the graphics API. This post is long because I wanted to include as much information as possible and head off questions I've already looked into.

    (some quick background)
    I'm currently working on an XR project for the Oculus Quest 2 and am trying to get fixed foveated rendering (FFR) working in my game, as I'm currently GPU bound. However, I was running into trouble getting FFR to work despite correctly setting the foveation level via script.
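
    For context, here is a minimal sketch of how the foveation level can be set via script, assuming the Oculus Integration's OVRManager API (the exact call differs if you go through the Oculus XR Plugin's Utils class instead, so treat this as illustrative rather than my exact script):

    Code (CSharp):
    using UnityEngine;

    // Sketch: request fixed foveated rendering via the Oculus Integration.
    // Assumes an OVRManager instance (from the Oculus Integration) exists in the scene.
    public class EnableFFR : MonoBehaviour
    {
        void Start()
        {
            OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High;
        }
    }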

    After doing some research, my understanding is that the problem lies in URP sometimes drawing to an intermediate texture rather than the texture that foveation is applied to, and then blitting to the final texture. My goal, as I understand it, is to draw to the final texture without needing an intermediate one. With this in mind, I loaded up an empty URP project to try to get this working, so I can transfer the settings/setup to my main project.

    The Project Setup
    • Imported URP, the XR integration packages, and the Oculus Integration from the Package Manager
    • Made a URP asset, assigned it, and adjusted its settings to stop URP from using an intermediate texture (photos of my setup below; a script sketch of the same settings follows this list)
    • Player Settings -> Resolution and Presentation -> Blit Type set to Never
    • Post Processing checkbox on the main camera set to false
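
    To make those settings concrete, here is a hedged sketch of applying the same changes via script. The properties on UniversalRenderPipelineAsset and UniversalAdditionalCameraData are the public URP ones, but whether each of them actually forces an intermediate texture is my interpretation, not something I've confirmed in the source:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    // Sketch: turn off the URP features that (as I understand it) commonly cause
    // URP to render into an intermediate texture instead of the XR swapchain.
    // The same settings can be changed on the URP asset and camera in the inspector.
    public class DisableIntermediateTextureTriggers : MonoBehaviour
    {
        void Start()
        {
            var asset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
            if (asset != null)
            {
                asset.supportsHDR = false;                 // HDR rendering needs an intermediate target
                asset.msaaSampleCount = 1;                 // MSAA can add a resolve step
                asset.supportsCameraDepthTexture = false;  // avoids the copy-depth pass
                asset.supportsCameraOpaqueTexture = false; // avoids the copy-color pass
            }

            // Post-processing on the camera also forces an intermediate texture.
            var camData = Camera.main.GetUniversalAdditionalCameraData();
            camData.renderPostProcessing = false;
        }
    }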


    Now, with this setup, I get the following RenderDoc capture:
    [RenderDoc capture screenshot]

    To me this looks possibly correct, possibly incorrect. One red flag I see is the 2048x2048 TempBuffer 3.
    But DrawOpaqueObjects is listed as drawing into XR Texture [1], and there's no final blit operation after everything is done (though I'm unsure whether that's what XR Mirror View is doing).

    As a test, I set the Intermediate Texture field under Compatibility from Auto to Always, to force an intermediate texture with a final blit.

    The RenderDoc capture for that setup looks like this:
    [RenderDoc capture screenshot]

    And what do you know, it looks the same!
    So now I'm left confused with a few questions, as I'm relatively new to looking at GPU captures and debugging these issues (and truthfully can't make full sense of what I'm looking at here):
    • Where is the final blit operation happening? XR Mirror View is the only operation I could imagine.
    • What is the intermediate texture: TempBuffer 3, or XR Texture [1]?
    • When looking at the DrawOpaqueObjects operation, it's shown as drawing into XR Texture [1], which is what I want (I'm assuming), or maybe I want it drawn into XR Texture [0]?
    I feel like I'm missing something here, but foveated rendering is extremely important for me to get working, so any guidance or advice about what I'm looking at would be extremely helpful.

    Best,
    Zach
     
    Last edited: Mar 20, 2023
  2. Zaflores

    Zaflores

    Joined:
    Sep 23, 2017
    Posts:
    10
    Well, if anyone would like to share their wisdom on the questions I asked at the bottom of my first post it would still be appreciated, but it turns out foveated rendering only works with Vulkan. While digging through the URP source I found a lot of code related to intermediate buffers that runs when the target graphics API is GLES3. So at least that's no longer blocking me!
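
    For anyone else hitting this, a small sanity check (just a sketch, nothing official) is to log the active graphics API at startup to confirm the build is really running Vulkan and hasn't fallen back to GLES3:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: log the active graphics API so it's obvious whether the build is
    // running Vulkan (which FFR apparently requires) or OpenGL ES 3.
    public class GraphicsApiCheck : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("Graphics API: " + SystemInfo.graphicsDeviceType);
            if (SystemInfo.graphicsDeviceType != GraphicsDeviceType.Vulkan)
            {
                Debug.LogWarning("Not running Vulkan; fixed foveated rendering may not apply.");
            }
        }
    }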