Oculus Quest GLES help

Discussion in 'General Graphics' started by VRHealthMain, Feb 25, 2020.

  1. VRHealthMain

    VRHealthMain

    Joined:
    May 24, 2017
    Posts:
    14
    TL;DR How can I read from the eye buffer on Oculus Quest/Go to a byte array without a rendertexture using OpenGL?

    Hi, I'm trying to cast the VR view to an external app. To do that I need to send a byte array of each frame (at about 10 fps). After finding out that Oculus Quest and Go don't support AsyncGPUReadback, the go-to answer was: attach a RenderTexture to a camera -> set it as the active RenderTexture -> render to it -> Texture2D.ReadPixels into a Texture2D -> Texture2D.Apply -> Texture2D.GetRawTextureData.
    While the method works, it caused a few issues: some performance problems that were mostly acceptable, and jittering in close moving objects (such as the controller objects or our gameplay objects). The jittering makes it very unpleasant to play. We know the jittering is connected to rendering to a RenderTexture, since we used to have that very problem when a wayward ForceIntoRT was left checked.
    I tried to find some way to do it in Unity, but nothing worked, so I turned to OpenGL, which I know nothing about. After many hours online and some banging my head against the wall, I managed to find a very partial not-quite-solution (I'm using IUnityInterface and IUnityGraphics):

    Code (C):
    static void ReadPixels(void* data, int textureWidth, int textureHeight)
    {
        GLint currentFBORead;
        GLint currentFBOWrite;

        // Save the current framebuffer bindings so they can be restored.
        glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &currentFBORead);
        glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &currentFBOWrite);

        // Read back from whatever framebuffer is currently being drawn to.
        glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBOWrite);
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, textureWidth, textureHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);

        // Restore the previous read binding.
        glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBORead);
    }
    So I set a RenderTexture active on a camera, and my ReadPixels turns it into a byte array. No need to read into a Texture2D, apply it, and read the raw data. Yay.
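    A side note on the `data` buffer passed into ReadPixels above: it has to be at least rowStride * height bytes, where the row stride is rounded up to GL_PACK_ALIGNMENT. This little helper (my own sketch, not part of the plugin) does the arithmetic; with GL_PACK_ALIGNMENT at 1 and GL_RGBA it is simply width * height * 4:

```c
#include <stddef.h>

/* Bytes per row after GL_PACK_ALIGNMENT rounding (hypothetical helper).
   With RGBA8 (4 bytes/pixel) the stride is width * 4 for any alignment
   up to 4; alignment matters for formats like GL_RGB. */
static size_t packed_row_stride(size_t width, size_t bytes_per_pixel, size_t alignment)
{
    size_t unpadded = width * bytes_per_pixel;
    return (unpadded + alignment - 1) / alignment * alignment;
}

/* Minimum size of the buffer glReadPixels will write into. */
static size_t readback_buffer_size(size_t width, size_t height,
                                   size_t bytes_per_pixel, size_t alignment)
{
    return packed_row_stride(width, bytes_per_pixel, alignment) * height;
}
```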
    But no matter what I do, I can't seem to find a way to read directly from the eye buffer (either eye), the screen, or the scene camera. I combed through the OVRPlugin in an attempt to see if there is any chance Oculus can throw me a bone (or a buffer to read from); nothing.
    So I turn to you guys. I'm a total noob at OpenGL, and I've been at it for a few weeks. How can I read the eye buffer into a byte array without a RenderTexture?
     
  2. VRHealthMain

    VRHealthMain

    Joined:
    May 24, 2017
    Posts:
    14
    Maybe someone at Unity can help?
     
  3. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I don't think this approach will be fast enough at all. Your jittering is most likely because the technique you are using is far too slow for mobile hardware.

    So even if you do it every 10 or 100 frames it will stall, causing the jitter you're seeing.
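    One way around that stall, assuming the Quest's GLES 3.0 context: bind a GL_PIXEL_PACK_BUFFER before glReadPixels so the copy runs asynchronously, and map the buffer that was filled on the previous capture. A rough, untested sketch (all names mine, error handling omitted):

```c
#include <GLES3/gl3.h>
#include <string.h>

static GLuint pbos[2];
static int pboIndex;

static void InitPBOs(int size)
{
    glGenBuffers(2, pbos);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER, size, NULL, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

/* Kick off an async readback into one PBO, then map the PBO filled on
   the previous capture, so the CPU never waits on this frame's render. */
static void ReadPixelsAsync(void* dst, int width, int height)
{
    int size = width * height * 4;

    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[pboIndex]);
    /* With a pack buffer bound, the last argument is an offset, not a
       pointer, and the call returns without draining the pipeline. */
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);

    pboIndex = (pboIndex + 1) % 2;
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[pboIndex]);
    void* src = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, size, GL_MAP_READ_BIT);
    if (src) {
        memcpy(dst, src, size);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}
```

    The trade-off is one frame of latency on the captured image, which should be fine at 10 fps.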

    Why are you casting to an external app, and what are the details?
     
  4. VRHealthMain

    VRHealthMain

    Joined:
    May 24, 2017
    Posts:
    14
    We are casting it over WebRTC to an external control app. Our users aren't always able to handle the games, and the supervisor often needs/wants to see what is being done, sometimes for the entire session.

    As far as we can see, rendering to a RenderTexture causes the jitter even if nothing else is done (no data sent or processed), and it jitters even when our frame rate is high. I hoped there was a way to access Oculus's eye buffer directly, to avoid adding another pass to the rendering and avoid the jittering.

    From my understanding, IUnityInterface and IUnityGraphics ensure that the function is called from the rendering OpenGL context, so glReadPixels is supposed to give me the correct buffer, but it's not working. How can I read from the right buffer? Or maybe someone has a better idea?
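    One thing worth checking on the WebRTC side, whatever the capture path ends up being: glReadPixels fills rows bottom-up, while most video pipelines expect top-down rows, so each captured frame usually needs a vertical flip before encoding. A small sketch (my own helper, not an Oculus or Unity API):

```c
#include <stdlib.h>
#include <string.h>

/* Swap rows top<->bottom in place; stride is width * bytes_per_pixel
   (assumes GL_PACK_ALIGNMENT of 1, i.e. no row padding). */
static void flip_rows_in_place(unsigned char *pixels, int width, int height,
                               int bytes_per_pixel)
{
    size_t stride = (size_t)width * (size_t)bytes_per_pixel;
    unsigned char *tmp = malloc(stride);
    if (!tmp)
        return;
    for (int y = 0; y < height / 2; ++y) {
        unsigned char *top = pixels + (size_t)y * stride;
        unsigned char *bot = pixels + (size_t)(height - 1 - y) * stride;
        memcpy(tmp, top, stride);
        memcpy(top, bot, stride);
        memcpy(bot, tmp, stride);
    }
    free(tmp);
}
```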
     
  5. VRHealthMain

    VRHealthMain

    Joined:
    May 24, 2017
    Posts:
    14
    Any way it could work? Or maybe somehow use a shader to piggyback on an existing pass and capture the screen? But how? And how would I get the byte array out of it? Is it even possible?