
Question: Optimal setup for URP & VR (Quest 2)

Discussion in 'Unity Render Streaming' started by LivingBrain-Till, Jul 26, 2023.

  1. LivingBrain-Till

    LivingBrain-Till

    Joined:
    Jul 12, 2019
    Posts:
    12
Hi, I recently discovered this package and I am wondering how to set it up for Oculus Quest and URP.

I got it working by using a second camera as the streaming camera. However, this has quite an impact on performance, since two cameras are rendering now.
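For reference, the second-camera setup looks roughly like this (a minimal sketch; class, field, and texture names are illustrative, not from the actual project):

```csharp
using UnityEngine;

// Hypothetical sketch of the "second camera as streaming camera" setup:
// a non-VR camera renders into a RenderTexture, which is then used as
// the video source for Unity Render Streaming.
public class StreamingCameraSetup : MonoBehaviour
{
    public Camera streamingCamera;       // the extra, non-headset camera
    public RenderTexture streamTexture;  // handed to the video stream sender

    void Start()
    {
        // Render into the texture instead of the headset's framebuffer.
        streamingCamera.targetTexture = streamTexture;

        // Keep this camera off the XR display entirely.
        streamingCamera.stereoTargetEye = StereoTargetEyeMask.None;
    }
}
```

The performance cost comes from this camera re-rendering the whole scene each frame in addition to the stereo VR rendering.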

    Is there a better / more performant way to set this up?

    Thanks in advance.
     
  2. KayH

    KayH

    Joined:
    Jan 6, 2017
    Posts:
    107
You could try to get the framebuffer of one of the eyes before it gets lens distorted (if that is at all possible). It would be better to ask in an Oculus forum, though, since that isn't related to this package at all.

    Of course, streaming your VR point of view is already supported by the regular system wide streaming feature of Quest.

    What is it that you want to achieve exactly?

    Anyway, you could try reducing the frame rate and/or resolution of the streaming camera for example.

    May I ask what firmware your Quest 2 is on?
     
    Last edited: Jul 26, 2023
  3. LivingBrain-Till

    LivingBrain-Till

    Joined:
    Jul 12, 2019
    Posts:
    12
The Oculus system-wide streaming feature is nice, but we tend to run into connectivity issues quite frequently on unstable internet connections.
Aside from that, we want to create a companion app and would like to add some bidirectional features in the future.

Furthermore, we also want to build for the Pico 4, which so far does not have a streaming solution.

Weirdly, reducing the frame rate and resolution did not do much performance-wise, so I think there might be a bottleneck somewhere else, which is why I assumed my setup might be off.

I did already try using a custom render pass that grabs the camera image at the end of the frame and saves it to a render texture. However, this has not brought significant performance benefits so far, which further makes me think my bottleneck is somewhere else.
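Roughly what I mean (a simplified sketch of the end-of-frame grab, assuming URP; the callback-based approach and the names here are illustrative, not my exact code):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch: copy the main camera's output into a RenderTexture
// after URP finishes rendering it, so the texture can feed the stream.
public class FrameGrabber : MonoBehaviour
{
    public RenderTexture streamTexture;

    void OnEnable()  => RenderPipelineManager.endCameraRendering += OnEndCamera;
    void OnDisable() => RenderPipelineManager.endCameraRendering -= OnEndCamera;

    void OnEndCamera(ScriptableRenderContext ctx, Camera cam)
    {
        if (cam != Camera.main) return;

        // Copy whatever the camera just rendered into the stream texture.
        // Note: in XR, cam.activeTexture may be an eye texture or null
        // depending on the rendering mode, so this needs verification.
        if (cam.activeTexture != null)
            Graphics.Blit(cam.activeTexture, streamTexture);
    }
}
```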

    Quest 2 is on 55.0
     
  4. KayH

    KayH

    Joined:
    Jan 6, 2017
    Posts:
    107
You have to actually stop the camera from updating every frame: disable it and have a timer in your update function that calls camera.Render() only every other frame. Otherwise, lowering the stream's frame rate has no impact on the render load caused by the camera.
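A minimal sketch of that throttling idea (the class name, field names, and the 30 fps target are placeholders, not a prescribed setup):

```csharp
using UnityEngine;

// Hypothetical sketch: disable the streaming camera so it no longer
// renders automatically, then call Render() manually at a reduced rate.
public class ThrottledStreamCamera : MonoBehaviour
{
    public Camera streamingCamera;
    public float targetFps = 30f;  // assumed stream frame rate

    float _nextRenderTime;

    void Start()
    {
        // Stop Unity from rendering this camera every frame.
        streamingCamera.enabled = false;
    }

    void Update()
    {
        if (Time.time >= _nextRenderTime)
        {
            _nextRenderTime = Time.time + 1f / targetFps;
            streamingCamera.Render();  // render only when the timer is due
        }
    }
}
```

With the camera disabled, the render load scales with targetFps instead of the headset's refresh rate.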

I don't have the same problems with render performance (I guess my game is well optimized and/or not as ambitious as yours), but before firmware version 55 the WebRTC solution itself (networking and en-/decoding) was causing frame drops from 72 fps to the low 20s, which is why I gave up.

    There's also an unresolved issue on github for this: https://github.com/Unity-Technologies/UnityRenderStreaming/issues/716

With the performance boost in version 55 that is actually resolved for me, but for some reason I'm not getting a stream from the Quest anymore. I stream in both directions, a video-chat feature. Super weird, especially since you don't have that problem.

Regarding your custom render pass: it's an additional pass, so it causes the same render load as another camera. You would only save resources if you could get the output from a render pass that is already done.
     
    Last edited: Jul 27, 2023
    LivingBrain-Till likes this.
  5. LivingBrain-Till

    LivingBrain-Till

    Joined:
    Jul 12, 2019
    Posts:
    12
    The advice with the camera.Render() call sounds promising, I'll give it a try. Thank you.

    Out of curiosity, how did you make your Android device the server? I am using Termux to run the build with npm on the phone.
     
  6. KayH

    KayH

    Joined:
    Jan 6, 2017
    Posts:
    107
I'm using LiteNetLib for the signalling process, and also to send some data I use for other purposes. My Android app is also made with Unity and shares a code base for WebRTC and LiteNetLib with my game, but it's a much simpler scene with just two textures for the sent and received streams.
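The signalling side looks roughly like this (a hedged sketch: the port, connection key, JSON framing, and event signature are assumptions and may differ between LiteNetLib versions, not my exact code):

```csharp
using LiteNetLib;
using LiteNetLib.Utils;
using UnityEngine;

// Hypothetical sketch: a LiteNetLib host that carries WebRTC signalling
// messages (SDP offers/answers, ICE candidates) as strings.
public class LiteNetSignaller : MonoBehaviour
{
    NetManager _server;

    void Start()
    {
        var listener = new EventBasedNetListener();
        _server = new NetManager(listener);
        _server.Start(9050);  // arbitrary port, an assumption

        listener.ConnectionRequestEvent += req => req.AcceptIfKey("signalling");
        listener.NetworkReceiveEvent += (peer, reader, channel, method) =>
        {
            string message = reader.GetString();  // e.g. a serialized SDP
            // ...hand the message to the WebRTC session here...
            reader.Recycle();
        };
    }

    void Update() => _server.PollEvents();  // pump network events each frame

    public void SendToPeer(NetPeer peer, string message)
    {
        var writer = new NetDataWriter();
        writer.Put(message);
        peer.Send(writer, DeliveryMethod.ReliableOrdered);
    }
}
```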
     
  7. LivingBrain-Till

    LivingBrain-Till

    Joined:
    Jul 12, 2019
    Posts:
    12
    Just wanted to come back to say thanks.

The camera.Render() trick improved the performance significantly, and it suffices for now.

I've given up on Termux and am now using Node.js for Mobile to run the Unity Render Streaming web client on my tablet.
     
  8. KayH

    KayH

    Joined:
    Jan 6, 2017
    Posts:
    107
    You're welcome.