
[Release] FMETP STREAM: All-in-One GameView + Audio Stream + Remote Desktop Control(UDP/TCP/Web)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    Thanks for your consideration of our product.
    1) Audio playback on WebGL should be fine; Microphone and in-game sound depend on Unity's audio classes. Our HTML example supports mic streaming via JavaScript, but we haven't integrated it into the WebGL build yet.
    2) Stream resolution and quality are adjustable.
    3) Desktop capture requires a standalone build (PC/Mac/Linux).
    4) We don't have a specific plan yet. Depending on how difficult your requested features are, we'll prioritize them in our to-do list.
     
  2. pratyakshagar

    pratyakshagar

    Joined:
    Aug 22, 2023
    Posts:
    3
    Thanks for replying!

    I am still slightly confused. If I have 10 players running around in a 3D environment, each with an AudioSource attached, can I pass each user's microphone audio to their specific AudioSource component (all on WebGL)?

    If yes, then how well does it work? My reason for asking is simple: I tried multiple methods to implement audio-streaming functionality (using OnAudioFilterRead and PCM callbacks) and they work fine on all platforms except WebGL. So I am curious how you were able to achieve this feat.
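
    For reference, this is roughly the pattern I mean (a minimal sketch; as I understand it, OnAudioFilterRead is simply not supported in WebGL builds, because Unity hands audio to the browser's Web Audio API instead of running its own mixer callbacks):

    Code (CSharp):
    using UnityEngine;

    // Typical PCM capture via OnAudioFilterRead.
    // Works in the Editor and in standalone/mobile builds,
    // but this callback never fires in a WebGL build.
    [RequireComponent(typeof(AudioSource))]
    public class PcmCapture : MonoBehaviour
    {
        // Called on the audio thread with interleaved samples in [-1, 1].
        void OnAudioFilterRead(float[] data, int channels)
        {
            // Copy or queue the samples here for encoding and network send.
        }
    }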
     
  3. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    For WebGL audio playback, you have to route the data to JavaScript natively.
    In our current version, only one audio stream is defined in our WebGL demo.
    It's possible to support multiple audio streams in WebGL, but it needs some modification with basic JavaScript knowledge.

    Similarly for mic support on WebGL: we've tested mic streaming via JavaScript in our HTML example, but we haven't converted it into a .jslib plugin for the Unity WebGL build yet.
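
    To illustrate what that native routing can look like on the C# side, here is a minimal sketch. The jslib function name FMPlayAudioChunk and its signature are hypothetical, for illustration only, not our actual plugin API:

    Code (CSharp):
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class WebGLAudioBridge : MonoBehaviour
    {
    #if UNITY_WEBGL && !UNITY_EDITOR
        // Binds to a function exported from a .jslib plugin in Assets/Plugins/WebGL.
        // "FMPlayAudioChunk" is a hypothetical name for illustration only.
        [DllImport("__Internal")]
        private static extern void FMPlayAudioChunk(float[] pcm, int length, int channels, int sampleRate);
    #endif

        // Forward a decoded PCM chunk to JavaScript for Web Audio playback.
        public void PlayChunk(float[] pcm, int channels, int sampleRate)
        {
    #if UNITY_WEBGL && !UNITY_EDITOR
            FMPlayAudioChunk(pcm, pcm.Length, channels, sampleRate);
    #endif
        }
    }

    Supporting multiple streams would mainly mean tagging each chunk with a stream ID and keeping one buffer queue per ID on the JavaScript side.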
     
  4. msdsmith

    msdsmith

    Joined:
    Feb 28, 2017
    Posts:
    8
    I'm trying to run a basic video stream from a Quest 2 to a PC. I followed the YouTube tutorial to add the FMETP Stream components to my VR application and created a separate scene for the PC. Streaming is working in the sense that I am getting an image on the PC that updates over time. Unfortunately, after a couple of seconds the image in the headset appears frozen, as if tracking were lost. The streamed image is still updating, so tracking is working, but the image shown in the headset stops updating.

    I have the latest FMETP Stream from the Unity asset store.
    I'm using Unity 2020.3.41
    I'm using Don't Destroy on Load for the camera rig since I'm moving between different scenes.
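
    (For clarity, the rig persistence is just the standard DontDestroyOnLoad pattern; a minimal sketch:)

    Code (CSharp):
    using UnityEngine;

    // Keep the camera rig alive across scene loads.
    public class PersistentRig : MonoBehaviour
    {
        void Awake()
        {
            DontDestroyOnLoad(gameObject);
        }
    }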

    Any suggestions for what might be going wrong?
     
  5. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    Could you please try our Quest 2 template and see if this can be reproduced?
    https://github.com/frozenmistadventure/fmetp_tutorial_questvr_stream
     
  6. msdsmith

    msdsmith

    Joined:
    Feb 28, 2017
    Posts:
    8
    OK, I got the template working.

    My receivers both work with either project so that doesn't appear to be an issue. Somehow the sender in my Unity 2020.3.41 project messes with the display in the Quest 2 headset.

    Are there any reasons it shouldn't work with that version of Unity?

    Besides the Unity version and having the rig as Don't Destroy on Load, the only key difference I see is that I'm using the XR Interaction Toolkit (XR Origin rig), not OVR. Are there any known issues with the XR Interaction Toolkit?
     
  7. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    It should be compatible with the XR Interaction Toolkit too, as far as we know.
    A few things to check:
    1) Is there only one main camera (VR)?
    2) Are you using MainCam mode or RenderCam mode in GameViewEncoder?
    3) Are you using URP or HDRP?
    4) Any error logs?
    5) If there is more than one camera, make sure the other one has its Target Eye set to None.
    6) If there is more than one camera, make sure the depth of the non-VR camera is lower than the VR camera's. (Points 5 and 6 are sketched below.)
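
    A rough illustration of points 5 and 6 in code (a sketch only; your exact camera setup may differ):

    Code (CSharp):
    using UnityEngine;

    // Example setup for an extra, non-VR camera alongside the VR camera.
    public class NonVRCamSetup : MonoBehaviour
    {
        public Camera vrCam;
        public Camera nonVRCam;

        void Start()
        {
            // 5) The extra camera must not render to the headset eyes.
            nonVRCam.stereoTargetEye = StereoTargetEyeMask.None;

            // 6) Lower depth renders first, so keep the non-VR camera
            //    below the VR camera.
            nonVRCam.depth = vrCam.depth - 1;
        }
    }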

    I don't see any big issue with Unity 2020 LTS, as our original project has been tested from Unity 2019 LTS onwards. (But things might be changing, since there have been lots of updates related to VR/AR.)

    Is there any particular reason for staying on Unity 2020 LTS? Unity 2023 has been out for a while already.
     
  8. msdsmith

    msdsmith

    Joined:
    Feb 28, 2017
    Posts:
    8
    1) only one camera
    2) main camera
    3) URP
    4) no errors that I have noticed, except initially on build. On build I got a lot of errors and got around them by deleting all the FMETP editor folders. Not sure why I had to do this, as I didn't have to with the template.

    I tried updating my project to the same Unity version as the template: 2022.1.22. That seems to solve the tracking problem I was having in the headset, but the received image on the PC now just shows as a white screen. The FMNetworkManager status window looks as before, with data being received, but no image is shown.
     
  9. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    This is a hardware-specific issue with URP on Quest VR. It's also a known bug on Unity's URP side.

    The alternative is RenderCam mode for Quest 2 with URP.

    To be honest, though URP has been out for a while, it's still far less stable than the original standard render pipeline, and its overall performance is actually heavier in many cases too.
     
  10. msdsmith

    msdsmith

    Joined:
    Feb 28, 2017
    Posts:
    8
    OK, thanks, I got a stream working with the RenderCam.
     
  11. rportelli

    rportelli

    Joined:
    Apr 26, 2020
    Posts:
    3
    Hello,

    I am trying to stream from a HoloLens 2, and I followed the sample GitHub repo; the only difference is that I am using URP. The app crashes when I connect to the server.
    Is there a way to make it work with URP?
     
  12. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    Are you testing our latest MRTK 3.0 template?
    https://github.com/frozenmistadventure/fmetp_tutorial_hololens2_mrtk3

    We haven't verified the HL2 setup with URP yet. What are the errors or logs?
    If the URP HL2 issue is caused by the MainCam capture mode, you probably need to duplicate the main camera and use RenderCam capture mode.
     
  13. noir662607004

    noir662607004

    Joined:
    Mar 29, 2018
    Posts:
    1
    Hi,
    I'm trying to stream from a Quest 2 (server) to PC (client) using FMNetworkManager and Game View Encoder / Decoder,
    but it seems like Game View Encoder's fast encode mode does not work on my headset. I've tried various parameter combinations for the Game View Encoder. From my troubleshooting, when fast encode mode is enabled, nothing gets encoded by the Game View Encoder. I checked that it works normally in the Unity Editor (PC); this only happens on the headset.
    For context, my setup in this project is:
    • Render pipeline: URP
    • Capture Mode is set to RenderCam (since MainCam mode doesn't work well with URP)
    • Render Cam is set to CenterEyeAnchor Camera, the only main camera
    • Output Format: FMMJPEG
    • Everything else same as the tutorial
    I have also tested the template https://github.com/frozenmistadventure/fmetp_tutorial_questvr_stream, and the same problem happens.
    I ran "QuestVR_StreamSender(UDP)" the VR app on the headset and ran "QuestVR_StreamReceiver(UDP)" in my Unity Editor on PC.
    - When fast encode mode is disabled on the VR app, the PC end is able to correctly receive and display the game view.
    - When fast encode mode is enabled on the VR app, the game view does not get encoded.

    Is there a way to fix fast encode mode + async encode for my headset? If not, are there any other ways to improve performance on the headset?
     
    Last edited: Oct 31, 2023 at 12:49 AM
  14. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    The CenterEyeAnchor camera from the VR rig might behave differently.
    In URP, we suggest duplicating a separate render cam for RenderCam mode, instead of using any existing camera from the VR rig (a rough sketch is below).
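
    For example, something like this at runtime (a sketch only; it assumes the duplicate follows the VR head and renders into a RenderTexture, and how you hand the camera to GameViewEncoder's RenderCam slot depends on the component's inspector setup):

    Code (CSharp):
    using UnityEngine;

    // Create a dedicated render camera that mirrors the VR head camera,
    // for RenderCam capture mode, instead of reusing the rig's own camera.
    public class RenderCamDuplicate : MonoBehaviour
    {
        public Camera vrHeadCam;       // e.g. the CenterEyeAnchor camera
        public RenderTexture targetRT; // texture for the encoder to read

        void Start()
        {
            var go = new GameObject("StreamRenderCam");
            go.transform.SetParent(vrHeadCam.transform, false); // follow the head

            var cam = go.AddComponent<Camera>();
            cam.CopyFrom(vrHeadCam);                        // match FOV, clear flags, etc.
            cam.stereoTargetEye = StereoTargetEyeMask.None; // keep it out of the HMD
            cam.targetTexture = targetRT;                   // render off-screen
            // Then assign this camera in GameViewEncoder's RenderCam slot.
        }
    }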

    For the ease of troubleshooting, could you please send us an example project via email?
    technical support: thelghome@gmail.com
     
  15. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    714
    We've verified the issue with the recent Oculus XR plugin and Oculus SDK.
    We've done some tests on Quest 2 and updated our Quest 2 template on GitHub.
    https://github.com/frozenmistadventure/fmetp_tutorial_questvr_stream/

    We noticed that our SDK can achieve 120+ fps when streaming 1920x1080@30fps.
    It's an amazing result with Oculus SDK v57, Oculus XR v4.0.0 and Unity 2022.3.8f1.
    (Our test result last year with Oculus SDK v37 maxed out at 90fps; probably that was a limit of the older SDK version?)

    Anyway, here are the temporary fixes and what we tested:
    - Stereo Rendering Mode notes for the latest Oculus Integration SDK (v57):
    • tested MultiPass, which supports these Capture Modes: MainCam (Built-in), RenderCam (Built-in, URP)
    • tested Multiview, which supports these Capture Modes: RenderCam (Built-in)
    • if you are working in URP, please refer to the scene "QuestVR_StreamSender(UDP)_renderCam(URP-reference)" and search for "RenderCam(GameViewEncoder)" in the scene.

    [Attached screenshot: quest2_129fps.jpg, showing the stream running at 129 fps on Quest 2]