
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    How does it look, then, if you are using our web-app? If it looks good, you can refer to our source code to see the difference.
     
  3. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Has anyone tried running this in the cloud? AWS or otherwise?
     
    UnityLoverr and arielfel like this.
  4. yosun

    yosun

    Joined:
    May 18, 2017
    Posts:
    5
    Using the maxed-out new MacBook 2019, Boot Camp Win 64...
    Does it not work using AMD? I'm up to date on Radeon 17.12.
    [WebRTC] The hardware codec driver not installed

    System.Exception: Cannot load. Incorrect path: Packages/com.unity.render-pipelines.high-definition/Runtime/RenderPipelineResources/ShaderGraph/AutodeskInteractiveTransparent.ShaderGraph Null returned.
    at UnityEngine.Experimental.Rendering.HDPipeline.ResourceReloader.Load (System.String path, System.Type type, System.Boolean builtin) [0x00036] in
     
    Last edited: Oct 6, 2019
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am sorry, AMD GPUs are not supported by this product.
    We might work on AMD GPU support in the near future.
     
  6. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
    Any idea when you will add multiple cameras?
     
  7. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We are working on adding the feature for multiple cameras.
    In our plan, this feature will be released in version 2.0.
     
    kayy and arielfel like this.
  8. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
    Last edited: Oct 10, 2019
  9. markj_pw

    markj_pw

    Joined:
    Oct 27, 2016
    Posts:
    8
    Kazuki, would you expect that uGUI buttons are able to be clicked through the remote stream? In my simple test, it doesn't get clicked. It does get clicked when I try it locally.
    My app has only one camera, for GUI (2D app), and a secondary camera only to display the visuals locally using HDRP Render Texture Blitter.

    Secondly, it appears that the WebRTC stream steals the audio output, such that I can hear the audio in the remote stream but not locally. Is there a "HDRP Render Texture Blitter" equivalent for audio output?

    Thanks
     
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It seems you have a problem setting up uGUI with "Input System".
    Could you check the document?
    https://github.com/Unity-Technologi...unity.inputsystem/Documentation~/UISupport.md

    Got it. We will add the feature to hear the audio locally.
     
  11. markj_pw

    markj_pw

    Joined:
    Oct 27, 2016
    Posts:
    8
    I do indeed have the InputSystemUIInputModule on the EventSystem object (I replaced the legacy component with it, via the button that appears on the Event System component when you have the legacy component on it).
    It does not give me a hover nor the click events when interacting through the Remote video stream. It does work through the local window.

    I have a zip of my project if this is something you can take a look at. It's 40MB.

    Thanks
     
  12. ruddycarpio

    ruddycarpio

    Joined:
    Oct 22, 2019
    Posts:
    1
    Is this compatible with Oculus Quest?
     
  13. andyRogerKats

    andyRogerKats

    Joined:
    Oct 3, 2016
    Posts:
    13
    I am working on streaming video from one Unity client to another. I want to stream the local render texture to a remote render texture without using the browser. How would I set up my render texture on the remote Unity client to listen to the video stream? Thanks in advance!
     
  14. markj_pw

    markj_pw

    Joined:
    Oct 27, 2016
    Posts:
    8
    I have investigated more information about the GUI event limitations.

    Unity is set to not process events while it does not have focus. Therefore while it will work if the game is in the foreground on the "server" machine and you are playing remotely on a different machine, it won't work if the game is not in the foreground, or at all if you are trying to test on your own workstation through a browser (both the browser and Unity cannot have focus at once).

    This seems like an unintended consequence regarding operating through RenderStreaming, where the user expects the game to be in "focus" if the web browser is on-screen. I could see how this could complicate the InputSystem code. Likely you'd want the concept of focus to be per-peer and the InputSystemUIInputModule would process events for those peers who are active and not for others (or locally).

    As a hack, I've made two modifications to the input code. First, I've commented out the condition in InputSystemUIInputModule::DoProcess() that forgoes processing events if the app doesn't have focus. This fixes it for standalone. Secondly, I commented out the early-out check for InputUpdateType.Editor in InputActionState::IInputStateChangeMonitor.NotifyControlStateChanged() which fixes it for running on the same machine while in the editor. This allows me to develop on my workstation the most efficiently. I do not know the extent of side effects these changes will have.

    Is there any chance this issue of focus would be resolved in a future update?

    Thanks
     
    maramak, Dirrogate and gdbbv like this.
  15. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    No. Oculus Quest is an Android-based platform, and this package doesn't support Android builds.
     
  16. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Currently, WebRTC can't output to a RenderTexture.
    You should use a browser to receive the video stream.
     
    andykats likes this.
  17. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thanks for your detailed investigation.
    I am confirming this with the Input System team.
     
  18. andykats

    andykats

    Joined:
    Oct 17, 2019
    Posts:
    2
    I would like to use the Render Streaming functionality but with Unity's default render pipeline. How would I remove the HDRP from this project? Thanks!
     
  19. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    If you are using the RenderStreaming template from Github, then you can:
    1. Remove the HDRP package using the package manager
    2. Delete all HDRP assets under the Assets/RenderPipeline/HDRP folder
    If you are starting your project from scratch, then you can just add the RenderStreaming package using the package manager.

    Note that you might need to create a non-HDRP version of HDRPRenderTextureBlitter.cs to blit the Render Texture to the screen.
     
    arielfel and andykats like this.
  20. andykats

    andykats

    Joined:
    Oct 17, 2019
    Posts:
    2
    I am experiencing video freezing and a grey screen when my (Chrome) browser is on a screen that is 2650x1600 or larger. After this happens, input key presses and button events (e.g. the light turning on and off) still come through to the Unity client from the browser.

    I am also not able to get any browser on any device in my local network to receive the video stream: on every device I try, I see a grey screen after clicking play in the browser. However, I have been able to get video streaming again by power cycling my server machine; just restarting the node server doesn't fix the issue. I might also add that when I refresh the browser and get the grey screen, I see a valid connection state in the js console (see image).

    My goal is to stream video at 2650x1600 at ~30fps or more. Is this achievable? Or am I hitting some bits-per-second capacity on the Unity client's upload stream? Thank you! 169.254.247.4 - Google Chrome 10_26_2019 9_00_29 AM_LI.jpg
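    For what it's worth, a back-of-the-envelope check of the bits-per-second question above (the 0.1 bits-per-pixel figure for H.264 is a rough rule of thumb, not a property of Render Streaming):

```javascript
// Back-of-the-envelope bitrate estimate for a video stream.
// bitsPerPixel is a ballpark compression factor (an assumption, not a
// Render Streaming setting): ~24 bpp for raw RGB video, ~0.1 bpp for
// typical H.264 at decent quality.
function estimateBitrateMbps(width, height, fps, bitsPerPixel) {
  return (width * height * fps * bitsPerPixel) / 1e6;
}

// 2650x1600 @ 30 fps:
console.log(estimateBitrateMbps(2650, 1600, 30, 24).toFixed(0));  // uncompressed: "3053" Mbps
console.log(estimateBitrateMbps(2650, 1600, 30, 0.1).toFixed(1)); // H.264 ballpark: "12.7" Mbps
```

    At roughly 13 Mbps encoded, the stream should fit comfortably within a typical local network, which suggests the grey screen may not be a simple upload-bandwidth limit.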
     
  21. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    This should not be happening. But to confirm, can you try using a smaller screen?
    Also, are you connecting to Unity from another computer, or are you connecting locally?
     
    andykats likes this.
  22. arielfel

    arielfel

    Joined:
    Mar 19, 2018
    Posts:
    33
    kazuki_unity729, sindharta_at_unity, and everyone related to the creation of this thing:
    Guys, this is amazing! Well done! Just set it up and it f***** works!! You are gods to me!
    I have a lot of questions and information I wish to gather about this amazing thing.
    First of all, where is the best place to ask them?
    ...and the top questions that are very important for me and my colleagues to know:
    When can Linux support be expected?
    Will it work headless (server build)?
    Any chance of an LWRP template? (It would be far more useful for most use cases.)

    Many thanks!
     
  23. arielfel

    arielfel

    Joined:
    Mar 19, 2018
    Posts:
    33
    Hi, can you elaborate on how to make it work on the regular pipeline or LWRP? (Maybe a small tutorial or something.)
    Thanks!
     
  24. tanakake

    tanakake

    Joined:
    Feb 6, 2019
    Posts:
    6
    You have to add ref everywhere, RTCSdpType starts with a capital letter so it can't be converted straight to JSON, and events can't be written with +=. Implementing against this API is getting frustrating.
     
    Last edited: Nov 1, 2019
  25. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    1. Linux support: currently working on it, but it's a bit early to say when we are going to deliver it
    2. It might work, but we are not going to support it.
    3. It's definitely possible to create an LWRP template, but the priority is a bit low for us at the moment.

    To make it work on LWRP, you can do the following (not tested):
    1. Remove HDRP package using package manager
    2. Delete all HDRP assets under Assets/RenderPipeline/HDRP folder
    3. Add LWRP package using package manager and setup the project
    4. Implement the LWRP version of HDRPRenderTextureBlitter
    BTW, the best place to ask would be on Github, although we do still check this forum.
     
    donov and arielfel like this.
  26. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thank you for the valuable feedback.
    - Regarding ref: it is required to pass structs by reference, and removing it would mean providing all public APIs as classes.
    We have no plans to change this at the moment.

    - Regarding RTCSdpType starting with a capital letter: it follows the C# naming conventions for enum types.
    We have no plans to change this at the moment.
    https://docs.microsoft.com/ja-jp/dotnet/standard/design-guidelines/capitalization-conventions

    - Regarding how events are written: I assume you would like us to use the event syntax.
    We would like to discuss this within the team.
     
  27. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Last edited: Nov 4, 2019
    arielfel likes this.
  28. arielfel

    arielfel

    Joined:
    Mar 19, 2018
    Posts:
    33
    Thanks, bro!
    We are still running tests with the HDRP and it still looks great!!!
    Sorry for asking kind of the same question, but it's going to be important to us very soon and it seems the answer is out there...
    Is it possible to use the default rendering pipeline (not HDRP or LWRP)?
    And do you have an implementation of the LWRP (or, if the answer to the above is positive, default render pipeline) version of HDRPRenderTextureBlitter?

    Again, many thanks, and you are doing something incredible here!
     
  29. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We have been building the Render Streaming tech stack in Tokyo, but since Obvioos has joined Unity, we are looking forward to collaborating with them in the near future.
     
    arielfel likes this.
  30. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You can use it with the legacy pipeline, but you have to select DirectX 11 in the graphics API options.
    Regarding support for "HDRPRenderTextureBlitter" on LWRP and the legacy pipeline, we have received requests from users. We will work on this in the near future.
     
    kayy and arielfel like this.
  31. arielfel

    arielfel

    Joined:
    Mar 19, 2018
    Posts:
    33
    Hi @Kazuki and @sindharta, thanks for the fast response; it is very encouraging and motivating for rapid tests.
    I can confirm from last night's tests:
    • We successfully streamed realtime webcam video input on a world-scale canvas across 3 different OSes: Windows, Linux, and Android (all with the Chrome browser :)
    • The server build option (the checkbox in Unity's build settings) failed, with a decent number of errors in the command prompt
    • Silent launching via the command prompt (-batchmode) also failed, but differently: the app itself runs but there is no rendering, meaning we get a black screen in the browser. We also confirmed this by capturing snapshots and saving PNG images using a render texture attached to the camera.
    Hopefully this information helps you; please continue this awesome work!

    BTW - apparently there is no need for the new Input System... all the tests above ran without the package :)
     
    Last edited: Nov 5, 2019
  32. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    Love the package @kazuki_unity729 !

    I had one question:

    I think I have a pretty good understanding of how the RemoteInput is captured and sent, but wanted to verify:

    Do you think it's possible for me to edit RemoteInput to also capture things like the gyroscope, or the 'position' the phone reports from ARKit?

    The gyroscope seems a bit more doable, but I'm unsure about ARKit... I would love some input if you have time. I don't want to waste time trying to extend RemoteInput if it isn't really meant to be extended to further inputs.
     
  33. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    RemoteInput basically just receives what is being sent by the browser. So if you can make the browser send the data that you want to transfer to Unity side, then yes, it's possible.
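    As a sketch of that browser-side approach (the data channel label, message shape, and listener wiring here are illustrative assumptions, not part of the Render Streaming protocol — the Unity side would need matching parsing code):

```javascript
// Pack a device-orientation sample into a small JSON message.
// The {type, alpha, beta, gamma} shape is an illustrative convention,
// not something RemoteInput understands out of the box.
function encodeOrientation(alpha, beta, gamma) {
  return JSON.stringify({ type: "gyro", alpha, beta, gamma });
}

// In the browser you would hook DeviceOrientationEvent and push samples
// over a WebRTC data channel (the channel name "input" is an assumption):
//
//   const channel = peerConnection.createDataChannel("input");
//   window.addEventListener("deviceorientation", (e) => {
//     if (channel.readyState === "open") {
//       channel.send(encodeOrientation(e.alpha, e.beta, e.gamma));
//     }
//   });

console.log(encodeOrientation(10.5, -20, 180));
```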
     
  34. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    Running into an issue where the output RenderTexture seems to be in the wrong color space. I'm using LWRP, if that makes a difference...

    Notice how the 2nd image is slightly lighter in color (which makes me think it's a gamma vs linear color space issue). It's subtle, but it's enough to be annoying that the colors don't match what's actually being rendered.

    Any thoughts @sindharta_at_unity or @kazuki_unity729 ?

    (Render Texture version)
    upload_2019-11-8_14-15-5.png

    (Unity Version)
    upload_2019-11-8_14-15-27.png
     
  35. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    After looking further into it, the renderTexture that is being sent has the same color as the game view, which is different than the remote texture on the web page.

    Perhaps it has something to do with how the webpage is interpreting the texture? It's a subtle but important difference between the colors....
     
  36. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    That's interesting. I am wondering if this is an issue related to URP. Could you please let us know:
    1. The Unity version you are using
    2. The version of the URP (LWRP) package you are using
    3. Whether you are using the linear or gamma workflow
     
    arielfel likes this.
  37. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    Unity 2019.2.9f1
    LWRP 6.9.1

    I would upgrade to URP, but it does not seem like the trivial namespace change it's advertised as (just from my brief look into it, they at least swapped in a whole new post-processing implementation in this "simple name change"). Also, we've noticed that upgrading the render pipelines, while fixing some issues, creates just as many new ones with each new version.

    But curious if this is specific to this LWRP implementation
     
  38. Creaturtle

    Creaturtle

    Joined:
    Jan 24, 2018
    Posts:
    33
    Hi, I'm making a MOBA that requires thousands of particles to be simulated and synchronized across all clients, has many physics calculations that must be synchronized every iteration, and has projectile motion with non-deterministic behavior (wind, electrostatic forces, or another player can all affect the motion).

    Would implementing this game using render streaming, such that all physics and effects are on the server and cameras from the simulated "world" server have their sound and video streams relayed to clients, be a viable approach to networking my game?
     
  39. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    Ran into another issue....

    If I use the RenderStreaming component, it seems to be muting the audio on the computer and sends it to the remote device.....which is cool.

    But I don't want it to mute the computer audio. If I don't send the audio stream over, the computer plays audio as expected. Is there a way to have both the computer and the remote stream both playing back audio?
     
    SenaJP likes this.
  40. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    Thanks. I have created this Github issue to track it. We'll investigate this.
     
  41. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The current version of the package supports only a single camera.
    In our plan, the multi-camera feature will be released in version 2.0.
    In your use case, it seems GPU performance will be a bottleneck.
    In addition, I am concerned about latency effects on user interaction.
     
  42. Creaturtle

    Creaturtle

    Joined:
    Jan 24, 2018
    Posts:
    33
    Is there any projected date on 2.0 release?

    Also, server GPU power will not be a problem.

    In terms of latency, how many ms can be expected?
     
  43. JohnKP-Mindshow

    JohnKP-Mindshow

    Joined:
    Oct 25, 2017
    Posts:
    56
    Some other notes:

    - Video stream will randomly cut out (although RemoteInput still works). Unclear what causes this.
    - The video stream was incredibly choppy when played back in Chrome on Windows. iOS Safari works great though.

    I can file these on GitHub if that's easier
     
    arielfel likes this.
  44. sindharta_at_unity

    sindharta_at_unity

    Unity Technologies

    Joined:
    Jul 4, 2019
    Posts:
    49
    That would be great. Please also attach videos, images, or even any additional code you are using when possible.
     
    arielfel likes this.
  45. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    • We are working on features for 2.0; the release is expected in Q1 2020.
    • The latency depends on the network environment, but you can expect roughly 100ms.
     
  46. bartburkhardt-sim-ci

    bartburkhardt-sim-ci

    Joined:
    Aug 21, 2017
    Posts:
    6
    This is really promising. I have two questions.

    How can I get the UI canvas and button interaction to work in the browser? I tried setting the Canvas render mode to Screen Space - Camera and setting the camera to the streaming camera. I do get the UI in the browser, but the UI is not visible on the host.

    I also have to set the streaming camera's target texture depth buffer to "At least 16 bits depth" in order to see textures and objects; can I set this somewhere else?
     
  47. polygonfuture

    polygonfuture

    Joined:
    Feb 4, 2016
    Posts:
    20

    Hi, I have gone through the github tutorial steps 3 times, each time getting "NET::ERR_CERT_INVALID" in Chrome and Safari.

    Can you please give further advice?
     
  48. polygonfuture

    polygonfuture

    Joined:
    Feb 4, 2016
    Posts:
    20
    Also, I would like to add that without this SSL certificate, I am unable to view anything on an iPad Pro. The screen remains a solid color; however, the html buttons and microphone do control the Unity scene.
     
  49. bartburkhardt-sim-ci

    bartburkhardt-sim-ci

    Joined:
    Aug 21, 2017
    Posts:
    6
    How can I get the remote mouse position? I do get the delta values using RemoteInput.RemoteMouse.delta.ReadValue(),
    but we need to be able to click on objects, so I was trying RemoteInput.RemoteMouse.position.ReadValue() and RemoteInput.RemoteMouse.position.x.ReadValue(), but those return 0,0.
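    One workaround, if you can modify the web page, is to send the absolute pointer position yourself over a data channel instead of relying on the delta. A sketch of the browser side (the element, channel, and message names are illustrative assumptions, not part of the Render Streaming input protocol):

```javascript
// Convert a mouse event's client coordinates into [0..1] normalized
// coordinates relative to the video element, so the Unity side can
// scale them to its own screen size before raycasting.
function normalizePointer(clientX, clientY, rect) {
  return {
    x: (clientX - rect.left) / rect.width,
    y: 1 - (clientY - rect.top) / rect.height, // flip: Unity's y axis grows upward
  };
}

// Usage in the browser (videoElement and channel are assumptions):
//
//   videoElement.addEventListener("mousedown", (e) => {
//     const rect = videoElement.getBoundingClientRect();
//     const p = normalizePointer(e.clientX, e.clientY, rect);
//     channel.send(JSON.stringify({ type: "pointer", ...p }));
//   });

console.log(normalizePointer(150, 50, { left: 100, top: 0, width: 200, height: 200 }));
```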
     
  50. bartburkhardt-sim-ci

    bartburkhardt-sim-ci

    Joined:
    Aug 21, 2017
    Posts:
    6
    How can we do this? I need the mouse position for clicking objects.