Does the Quest have a different Render Pipeline than the Rift?

Discussion in 'AR/VR (XR) Discussion' started by LB_Chris, Feb 24, 2020.

  1. LB_Chris

    LB_Chris

    Joined:
    Jan 29, 2020
    Posts:
    33
    Hey folks,

    I am working on an app for the Oculus Quest, and I am trying to take a screenshot of the scene without relying on an additional Camera.Render() call, for performance reasons. To accomplish that, I am trying to blit the camera's finished RenderTexture to my own custom RenderTexture within a CustomRenderPassFeature (after all rendering is done: m_ScriptablePass.renderPassEvent = RenderPassEvent.AfterRendering) by calling CommandBuffer.Blit(BuiltinRenderTextureType.CameraTarget, _customRenderTexture, ...).

    Everything works fine while testing on the Oculus Rift in Unity Play Mode, but once I build and run the APK on my Oculus Quest, some unexpected behaviour occurs:

    The main problem is that the image is displayed only in the upper right quarter of the texture; the other three quarters are black. Less importantly, on the Rift the displayed image is a single image of both eyes in a fisheyed view, whereas on the Quest the image only shows a "normal" view of a single camera/eye without any fisheye effect.

    So my main question is whether anyone knows if the blit on the Quest goes through some kind of different pipeline than on the Rift, or how else to explain this quite different behaviour on the Quest compared to the Rift. Or maybe it's just the Blit function that behaves differently? I am using URP, btw.

    An optional additional thing is that I use AsyncGPUReadback to get the image out of my custom render texture, which once again works perfectly fine on the Rift but not on the Quest (where it just produces some whitish colors). For debugging purposes I am now using Texture2D.ReadPixels(...) instead.
    // EDIT: AsyncGPUReadback is not supported on OpenGL. For anyone interested, here is a solution I have found but not yet tested: https://github.com/Alabate/AsyncGPUReadbackPlugin/blob/master/LICENSE
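
    Not from the original post, but for anyone reading along: one way to handle this at runtime is to check SystemInfo.supportsAsyncGPUReadback and fall back to the blocking ReadPixels path on OpenGL ES. A rough sketch, assuming _customRenderTexture is the custom render texture from above and ProcessPixels is a hypothetical handler:

    Code (CSharp):
    // Sketch only: pick async or synchronous readback depending on platform support.
    if (SystemInfo.supportsAsyncGPUReadback)
    {
        // Non-blocking path (available on Vulkan/D3D, but not on OpenGL ES).
        AsyncGPUReadback.Request(_customRenderTexture, 0, request =>
        {
            if (!request.hasError)
                ProcessPixels(request.GetData<Color32>()); // hypothetical handler
        });
    }
    else
    {
        // Blocking fallback for OpenGL ES (e.g. the Quest).
        RenderTexture prevActive = RenderTexture.active;
        RenderTexture.active = _customRenderTexture;
        var tex = new Texture2D(_customRenderTexture.width, _customRenderTexture.height,
                                TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, _customRenderTexture.width, _customRenderTexture.height), 0, 0);
        tex.Apply();
        RenderTexture.active = prevActive;
    }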

    A little code snippet of the CustomRenderPassFeature's Execute(...) method looks like this:

    Code (CSharp):
    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get();

        RenderTargetIdentifier prev = BuiltinRenderTextureType.CameraTarget;
        // _scale is (1, 1) and _offset is (0, 0). Playing around with those values
        // did not change the problem of the image only displaying in the top right quarter.
        cmd.Blit(BuiltinRenderTextureType.CameraTarget, _customRenderTargetIdentifier, _scale, _offset);
        cmd.SetRenderTarget(prev);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
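
    A possible explanation, not confirmed in this thread: on the Quest, URP typically renders with single-pass multiview into a 2D texture array, so BuiltinRenderTextureType.CameraTarget may not resolve to the texture the blit expects, while on the Rift (a double-wide eye texture) it happens to work. An untested sketch of handing URP's own camera color target to the pass from AddRenderPasses instead (SetSource is a hypothetical helper on the custom pass; whether this fixes the quarter issue on-device is unverified):

    Code (CSharp):
    // Untested sketch: pass URP's actual camera color buffer to the custom pass
    // instead of relying on BuiltinRenderTextureType.CameraTarget inside Execute().
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        // renderer.cameraColorTarget is the color buffer URP renders into; under
        // single-pass multiview on the Quest this may differ from CameraTarget.
        m_ScriptablePass.SetSource(renderer.cameraColorTarget); // SetSource is hypothetical
        renderer.EnqueuePass(m_ScriptablePass);
    }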
     
    Last edited: Feb 24, 2020
  2. vl4dimir

    vl4dimir

    Joined:
    Jun 28, 2012
    Posts:
    21
    @LB_Chris Any news on this? I'm running into the exact same issue. I tried messing with scale and offset, but those are only applied to the source texture, not the destination.