
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. ruturaj_unity

    ruturaj_unity

    Joined:
    Mar 26, 2020
    Posts:
    2
    Hello, first of all, this works beautifully, so thanks for that.
    But it only works on the local machine on which the server is created and on the Chrome browser on my Android phone.

    However, it doesn't work on any browser on other laptops. The iceConnectionState goes directly to 'disconnected' after checking, while the streaming is working well on the local machine at the same time.

    Sometimes I get the error 'Uncaught (in promise) DOMException: Failed to execute 'addIceCandidate' on 'RTCPeerConnection'. Error processing ICE.'
     
  2. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    You can start the packaged web server with the option -p <number>. See the web app docs for more information.
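    For example, assuming the downloaded web app executable is named webserver (webserver.exe on Windows):

    webserver -p 8080

    This serves the web app on port 8080 instead of the default port.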
     
    kazuki_unity729 likes this.
  3. ruturaj_unity

    ruturaj_unity

    Joined:
    Mar 26, 2020
    Posts:
    2

    I will answer my own question, as I have found the solution: the firewall on the server needs to be turned off, or at least it must allow the Unity Editor or your application through.
     
    kayy likes this.
  4. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am sorry for your trouble. You're absolutely right.
    As you said, the ability to change the bitrate or resolution dynamically is a necessary feature.

    1.
    I believe the best way to cleanly start and stop is to recreate the MediaStream instance, but unfortunately I have never tried that approach.

    2.
    We have no idea about that yet.
    I will add the feature to our plan and raise its priority.
     
  5. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    How do I set the encode FPS?
     
  6. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thanks for the post detailing your use case; I will use it as a reference from now on.
    I am not sure why Furioos would render Unity UI better than Unity Render Streaming.

    Could you give us screenshots to compare?
     
  7. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    The only way today in 1.1.2 appears to be via WebRTC.Initialize() and .Finalize(), which works so far. I played with removing and re-adding tracks but ran into trouble (I guess it was the 2nd call to Context.Initialize, but I don't remember exactly anymore). I found out that a call to GetComponent() using the string overload instead of the generic version

    _captureCamera.GetComponent("Cleaner");
    retrieves an accessible instance of the Cleaner component, although Cleaner is declared as internal and resides in a different assembly (com.unity.webrtc). That way I do not need to change the package code.
    Great, this will make things easier and probably faster.
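    A minimal sketch of that trick inside a MonoBehaviour (destroying the retrieved component here is only an example of what one might do with it):

    // GetComponent(string) returns a plain Component, so the internal
    // Cleaner type never has to be referenced at compile time.
    Component cleaner = _captureCamera.GetComponent("Cleaner");
    if (cleaner != null)
    {
        Destroy(cleaner); // e.g. remove it before tearing WebRTC down and re-initializing
    }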
     
    Last edited: Mar 30, 2020
    gtk2k likes this.
  8. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    To change the encoding FPS, I changed "frameRate = 45;" on line 94 of NvEncoder.h to "frameRate = 30;" or "frameRate = 60;", built it, and tried each DLL, but the encoding FPS did not change.
    When I changed the frame rate of the sending Unity application with "Application.targetFrameRate = 30;" or "Application.targetFrameRate = 60;", the encoding FPS changed.
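    A minimal example of that second approach (30 is just an example value):

    using UnityEngine;

    public class EncodeFrameRate : MonoBehaviour
    {
        void Start()
        {
            // The encoder follows the rate at which frames are produced,
            // so capping the application's frame rate caps the encode FPS.
            Application.targetFrameRate = 30;
        }
    }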
     
    SenaJP likes this.
  9. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It looks like the same issue.
    https://github.com/xamarin/xamarin-macios/issues/5920

    It might be that you did not allow permission to use media.
    As in the attached screenshot (Safari_permission.jpg), please select "Allow".
     
  10. eproca_SimRTR

    eproca_SimRTR

    Joined:
    Nov 8, 2018
    Posts:
    16
    Thanks for your response. Maybe I can give some insight into my use case: our Unity apps are related to AEC, typically with huge building models. Our clients either have machines that can easily run our apps or machines that cannot run them at all. There is no middle ground, no matter how hard we try to add graphics options aimed at less powerful machines.

    We need a practical option for the clients who cannot run our apps locally on their machines. There are already some cloud gaming platforms: Stadia, GeForce Now… maybe Shadow is the most flexible. But for many reasons we cannot use them for our projects. Currently only Furioos could be a practical solution, but the performance is not great.

    Because of all this we are very interested in Unity Render Streaming, and we are thinking about setting up our own solution in the future using cloud GPU servers from Google or AWS.

    We have not ported any of our apps to Unity Render Streaming yet; there are still many issues. We are making some small tests, but using our big architectural models to be realistic about performance, and we are running the web server on our normal workstations. It is too early for us to try it on a cloud GPU server.

    Our main issue at the moment is the UI. Just the fact that we cannot use a canvas with "Screen Space - Overlay" prevents the use of Unity Render Streaming in real projects. For the tests we are using "Screen Space - Camera", but the video encoding makes it look really bad when the camera moves (see the attached image).

    I am not sure it makes sense to compare Unity Render Streaming with Furioos. I understand that the latter is some sort of remote desktop via the browser (Shadow, for example, works as a remote desktop via their own app). Normal Unity apps developed to run locally with UI canvases in "Screen Space - Overlay" run directly in Furioos, without any specific settings. Leaving aside the performance issues, the UI looks exactly as if the app were running locally.
     

    Attached Files:

    • 01.jpg
    UnityLoverr and Cascho01 like this.
  11. Cascho01

    Cascho01

    Joined:
    Mar 19, 2010
    Posts:
    1,347
    Also doing ArchViz, and it's exactly the same here.
    Clients are aware of "streaming" and are asking for it more now.
    Furioos works great for me for some test projects, but getting these features in Unity, plus some more support/docs, would be great!
     
  12. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
  13. tmacharla

    tmacharla

    Joined:
    Aug 1, 2017
    Posts:
    1
    Hi @kazuki_unity729

    This is amazing and we have a specific case for our project.

    We are working on a project where an application sends WebRTC data (video and audio), and we are wondering if there is a way to receive this data in Unity.

    I went through the Unity Render Streaming documentation, and it looks like Unity serves as the video source (sending the rendered frames and audio) via WebRTC to browsers, but I couldn't figure out how to receive WebRTC data in Unity.

    1. Is it even possible to make Unity a receiver of WebRTC data?
    2. How do we go about doing it?
    3. Does Unity Render Streaming work in VR browsers (Oculus and Vive)?

    Kindly let us know if it is possible.

    Thank you.
     
  14. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  15. jimstewartson

    jimstewartson

    Joined:
    Jun 8, 2018
    Posts:
    2
    Hi!

    My application requires that Unity *receives* audio from a web browser.

    Is there any way to do this right now, even if it's a workaround? I.e., if I can't pull audio from the MediaTrack, would it be possible to send audio data over the data channel?

    Thanks!
     
  16. OLGV

    OLGV

    Joined:
    Oct 4, 2012
    Posts:
    57
    I got 'Unity Render Streaming - preview 1.2.3' working great while tested locally on the same machine.
    The flow is: Unity Editor (2019.3.7f1 in Play Mode) -> web server (showing a local 192.168.#.### address) -> Chrome (pointing to the IP the server shows), all on Windows 10.

    The only problem I have is with external devices (other computers and tablets). They connect successfully to that IP and see the server (the buttons show up, etc.), but no video feed comes through. Truth be told, even on the local machine there is a slight delay before the video feed pipes through.

    Could this be some Windows network connectivity wizardry?
    Is there a Mac version of the Webserver?

    Thanks!

    UPDATE:
    It seems to work in Firefox (on Mac), but not Chrome.
    It is also not working in Chrome, Firefox, or Safari on either iOS (iPhone & iPad) or Android (tablet).
    Truth be told, I am running over HTTP, not HTTPS (though that should work as long as Safari is avoided, according to the online docs).
     
    Last edited: Apr 4, 2020
  17. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    I am looking for a way to restart WebRTC after its initialization has failed. The idea is to use the software encoder on Windows PCs as a fallback if WebRTC.Initialize fails with the hardware encoder. Unfortunately, the second call to WebRTC.Initialize immediately crashes if the first one failed. I call Finalize and wait 5 seconds before calling Initialize again, but I cannot find a way to make it work.
    On the other hand, the procedure of terminating and restarting works fine in general when the first call succeeded, e.g. to change media stream parameters like resolution.

    Is there any other method that I need to call for cleaning up? Thanks

    Edit:
    The crash occurs in WebRTC.cs / VideoEncoderMethods / InitializeEncoder / GL.IssuePluginEvent(callback, (int)VideoStreamRenderEventId.Initialize);
    To simulate the situation of an unsupported hardware encoder on my NVIDIA card, I start 2 instances so that the maximum session limit is reached.

    Edit (2):
    If a WebRTC.Initialize call fails in the editor, it will cause a crash after stopping Play mode and starting it again, despite cleaning up with Finalize() in the first Play mode session.
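    A sketch of the fallback flow described above (EncoderType assumes the overload of WebRTC.Initialize that selects the software encoder; the try/catch is only illustrative, since the reported crash happens inside the native plugin and is not catchable here):

    using System.Collections;
    using Unity.WebRTC;
    using UnityEngine;

    public class WebRtcInitFallback : MonoBehaviour
    {
        IEnumerator Start()
        {
            if (!TryInitialize(EncoderType.Hardware))
            {
                WebRTC.Finalize();                   // clean up the failed attempt
                yield return new WaitForSeconds(5f); // give the native plugin time to settle
                TryInitialize(EncoderType.Software); // fall back to the software encoder
            }
        }

        static bool TryInitialize(EncoderType type)
        {
            try
            {
                WebRTC.Initialize(type);
                return true;
            }
            catch (System.Exception e)
            {
                Debug.LogWarning($"WebRTC.Initialize({type}) failed: {e.Message}");
                return false;
            }
        }
    }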
     
    Last edited: Apr 5, 2020
    SenaJP likes this.
  18. pengjunxi

    pengjunxi

    Joined:
    Apr 2, 2020
    Posts:
    1
    Thank you so much for your work. I'm trying to apply this technology to AR. Is it possible to make the background of the streamed image transparent, so that on the web side the background can be replaced with the client's camera feed? Thank you very much!
     
  19. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Receiving an audio stream is not supported yet.
    We have never tried using a DataChannel to stream audio.
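    For reference, an untested sketch of the data-channel workaround mentioned above (the RTCDataChannel is assumed to be created and opened elsewhere; this forwards raw, uncompressed PCM, which the browser side would still have to buffer and play back):

    using System.Collections.Concurrent;
    using Unity.WebRTC;
    using UnityEngine;

    public class AudioOverDataChannel : MonoBehaviour
    {
        public RTCDataChannel channel; // assumed to be created and open elsewhere
        readonly ConcurrentQueue<byte[]> pending = new ConcurrentQueue<byte[]>();

        // Audio thread: just copy the PCM block, don't touch WebRTC here.
        void OnAudioFilterRead(float[] data, int channels)
        {
            var bytes = new byte[data.Length * sizeof(float)];
            System.Buffer.BlockCopy(data, 0, bytes, 0, bytes.Length);
            pending.Enqueue(bytes);
        }

        // Main thread: flush the queued blocks over the data channel.
        void Update()
        {
            if (channel == null || channel.ReadyState != RTCDataChannelState.Open)
                return;
            while (pending.TryDequeue(out var bytes))
                channel.Send(bytes);
        }
    }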
     
  20. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  21. CocoaG33k

    CocoaG33k

    Joined:
    Sep 24, 2019
    Posts:
    5
    Hi there,

    I'm struggling to get version 1.2.3 of the package (with WebRTC 1.1.2) to work on Linux (x64, with Unity 2019.3.8f1). While I can get the sample scene to receive the button-press events from the WebApp (that is, when the editor doesn't crash on pressing "Play"), the camera stream isn't showing up in the browser (tried with Chrome and Firefox, both recent versions).

    Any suggestion on what the issue could be or how to best debug that problem?

    Thank you.
     
  22. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    In most cases, the cause of the issue you reported is a firewall.
    Could you try again after turning off the firewall?
     
  23. CocoaG33k

    CocoaG33k

    Joined:
    Sep 24, 2019
    Posts:
    5
    Hi, thanks for the reply.

    AFAIK I do not have a firewall enabled, but I followed the steps described in the ufw documentation anyway, and the result is the same :(
     
  24. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
    Adding

    this.video.muted = true;
    this.video.autoplay = true;

    to video-player.js seemed to resolve the issue of the streamed video consistently not updating in iOS Safari.
     
    kayy likes this.
  25. CocoaG33k

    CocoaG33k

    Joined:
    Sep 24, 2019
    Posts:
    5
    Anything else I should be considering? Thanks.
     
  26. vabyogamer

    vabyogamer

    Joined:
    Aug 20, 2019
    Posts:
    2
    Sir, I have a question, and I hope you answer it.
    The Unity blog says that using it requires an NVIDIA GPU; does that mean it can run on all NVIDIA GPUs?
    Thanks for your answer.
     
  27. Tobs-

    Tobs-

    Joined:
    Feb 12, 2016
    Posts:
    17
    Is there any progress on that matter? I am still not able to interact with UI elements via remote input.

    Cheers,
    Tobi
     
  28. ashwani_9

    ashwani_9

    Joined:
    Mar 31, 2019
    Posts:
    6
    Hi @kazuki_unity729 ,

    Is there any way to render the Unity UI just like the 3D models? I tried to integrate the Unity Render Streaming package into my existing project. It works fine with the 3D elements, but it cannot render the UI.
     
  29. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We should isolate the problem:
    - Does it work with the software encoder instead of the hardware encoder?
    - Are there errors in the browser console?
    - What are the contents of the SDP?
     
  30. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Render Streaming supports a software encoder, so it is not limited to NVIDIA GPUs.
    https://docs.unity3d.com/Packages/com.unity.webrtc@1.1/manual/index.html#requirements

    If you want to use the hardware encoder, see the support matrix published by NVIDIA.
    https://developer.nvidia.com/video-encode-decode-gpu-support-matrix#Encoder
     
  31. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Which "Render Mode" option are you selecting in the "Canvas" inspector?
    To draw UI into a Render Texture, you must use the "Screen Space - Camera" option.
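    A minimal sketch of that setup (streamingCamera here is assumed to be the camera whose output Render Streaming captures):

    using UnityEngine;

    public class StreamedCanvasSetup : MonoBehaviour
    {
        public Camera streamingCamera; // the camera whose image is streamed

        void Start()
        {
            var canvas = GetComponent<Canvas>();
            canvas.renderMode = RenderMode.ScreenSpaceCamera;
            canvas.worldCamera = streamingCamera;
            canvas.planeDistance = 1f; // keep the UI just in front of the camera
        }
    }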
     
  32. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Sorry for the inconvenience.
    We need to investigate the issue with the latest version of "Input System".

    Do you mean "UI elements" is "UGUI"?
    "UIElements" is not supported by "Input System" yet.
    https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html
     
  33. CocoaG33k

    CocoaG33k

    Joined:
    Sep 24, 2019
    Posts:
    5
    Hello,

    I'm not seeing any error in the browser console, aside from a warning:

    webrtc.peerconnection.datachannel_created - Unknown scalar.

    Looking at the "Network monitor" in firefox, I see no "media" data being received by the browser

    As for using the software encoder, I'm not sure I see a way to change that (I have 1.2.0-preview).

    Thanks.
     
  34. CocoaG33k

    CocoaG33k

    Joined:
    Sep 24, 2019
    Posts:
    5
    I actually got it to work. Something was wrong with my graphics driver. I had to reinstall it (an unrelated issue) and then it started working. Thanks for the support.
     
    kazuki_unity729 likes this.
  35. gino_pisello

    gino_pisello

    Joined:
    Jun 24, 2013
    Posts:
    33
    Hello! Is there a way to use Cinemachine to move the Render Streaming Camera?
     
  36. anchalsinha

    anchalsinha

    Joined:
    Jan 28, 2020
    Posts:
    3
    Is it possible to stream a live camera feed from the web browser to Unity? I have gotten everything set up to work in iOS Safari, but I wanted to know if I could send the rear camera stream from the iOS device to Unity.

    Thanks!
     
  37. anchalsinha

    anchalsinha

    Joined:
    Jan 28, 2020
    Posts:
    3
    Or if getting the camera feed is not possible, can I receive a continuous stream of data (i.e. a sensor value) over the WebRTC data channel? I notice that the input system is event based, but I could not figure out how to add an AttitudeSensor value over the data channel.
    Thanks for the support.
     
  38. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I believe you can use Cinemachine with it, but sorry, I don't have a sample to prove it.
    Could you tell me about the issue if you run into one?
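    For what it's worth, an untested sketch under the assumption that the streamed camera just needs a CinemachineBrain like any other camera, after which a virtual camera drives it as usual:

    using Cinemachine;
    using UnityEngine;

    public class StreamedCameraRig : MonoBehaviour
    {
        public CinemachineVirtualCamera vcam; // a virtual camera placed in the scene
        public Transform target;

        void Start()
        {
            // The streamed Camera carries the CinemachineBrain; the virtual
            // camera then controls its position and rotation every frame.
            vcam.Follow = target;
            vcam.LookAt = target;
        }
    }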
     
    Dirrogate likes this.
  39. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    This page shows how to use the rear camera on iOS:
    https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia

    However, this package doesn't support receiving a stream from the browser, and the attitude sensor is not supported either.
     
  40. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Hi @kazuki_unity729, great job by you and the team. I expect there will be a lot more requests from the Virtual Production (filmmaking) community for Unity Render Streaming / WebRTC.
    Unreal Engine has everything set up nicely as an in-engine plugin: Pixel Streaming.

    I'm trying to do a few things with WebRTC and Unity Render Streaming and would be very thankful if you could help or point me in the right direction:
    - I'm a non-coder, so I can't find the exact place in the example scene where the three "buttons" on the UI are defined (are they created by a script?): the light on/off, the audio stream, etc. I've found the array elements themselves.

    - I'd like to create my own UI "slider" to change the render camera's field of view / focal length, so I would appreciate an example of a slider and how an array could be set up for sliders.
    Currently, if I reassign the Audio array element to instead set the FOV of the render camera (by dragging the render cam object into the array), it does what I expect, but it only accepts one value for input since it's a click event (as in the attached pic).

    - Going forward, I'll experiment with assigning a "record" button to start the Unity Recorder plugin so the remote camera stream can be recorded as an mp4 video, or so the actual "remote camera" path can be recorded from an Oculus motion controller, etc.

    In short, I am trying to build a crude version of what UE4 has in their Virtual Camera project. Their UI is here: https://docs.unrealengine.com/en-US/Engine/Plugins/VirtualCameraPlugin/index.html
     


    SenaJP likes this.
  41. thelunz

    thelunz

    Joined:
    Jan 7, 2020
    Posts:
    2
    Hi - I have a need for efficient communication between a Unity app and a WebView that the app itself has spawned, and I managed to achieve this with a WebRTC data channel set up using v1.1.2 on macOS. It works really well, so thank you!

    Today I've been trying to build the same app for Android (with lots of Googling to bolster my Unity-newbie level), so far without success: I did persuade Unity to build something by adding the UNITY_STANDALONE symbol (and selecting IL2CPP, .NET 4.x, ARM64), but the resulting APK issues errors about being unable to find webrtc or libwebrtc.

    Arriving at this forum, it looks as if the WebRTC package simply doesn't support Android yet. Is that the case?
     
  42. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    @Dirrogate
    The three buttons are in the project in the "WebApp" folder.
    The WebApp project is a Node.js app, so install Node.js and run it.
    Swiping anywhere other than the three buttons moves the camera direction in real time, because the input is transmitted in real time over the WebRTC DataChannel.
    I think you can do what you want by adapting that real-time transmission code and writing the code for the slider (see the sketch below).
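    A rough, untested sketch of the Unity side, assuming a dedicated RTCDataChannel is opened for the slider (rather than reusing the package's input channel) and the web page sends the value as a 4-byte float:

    using System;
    using Unity.WebRTC;
    using UnityEngine;

    public class FovFromDataChannel : MonoBehaviour
    {
        public Camera renderCamera; // the streamed camera whose FOV the slider controls

        public void Bind(RTCDataChannel channel)
        {
            channel.OnMessage = bytes =>
            {
                // The slider value arrives as a single little-endian float.
                float fov = BitConverter.ToSingle(bytes, 0);
                renderCamera.fieldOfView = Mathf.Clamp(fov, 10f, 120f);
            };
        }
    }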

    @thelunz
    Unfortunately, yes.
    See the README of the repository for the WebRTC package used by Unity Render Streaming for more information.
    https://github.com/Unity-Technologies/com.unity.webrtc
     
    Dirrogate likes this.
  43. thelunz

    thelunz

    Joined:
    Jan 7, 2020
    Posts:
    2
    Thank you for the confirmation (I had seen that current list, but am not familiar enough with today's software build processes - having sheltered in the embrace of live programming environments for the last 30 years - to know what it implied about the possibilities of other ports).

    Is there the prospect of an Android port in the future? Is it even possible - or are there fundamental limitations in the Android system services that get in the way?
     
  44. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Thanks for pointing me in the right direction, @gtk2k.
    I still can't find the location/directory of the webapp; I'm assuming it's packaged inside the .tgz bundles. I'll explore more.

    The WebRTC / Render Streaming package does not come with proper documentation; in fact, I had to follow the instructions from here to get it started: https://www.gamefromscratch.com/post/2019/09/18/Unity-Release-Render-Streaming-Over-WebRTC.aspx

    I wish Unity would streamline their offerings more from a creative/artist perspective.
    Overall, a much more intuitive engine than UE4 (in my opinion), but it needs more "supervisory" heads consolidating and guiding the many efforts.
     
  45. tanakake

    tanakake

    Joined:
    Feb 6, 2019
    Posts:
    6
    Dirrogate likes this.
  46. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Thank you, Tanakake.
    I'm a complete newbie to this (and have no coding knowledge), but I can learn by copying, pasting, and trying things out.
    Can you, or someone, guide me on the following:

    • I've successfully copied the original demo to my own project, and it works, but I do not know where to change the three GUI buttons to create the sliders/buttons I'd like to use.
    This project is for virtual cinematography, and the fact that WebRTC sends the entire camera image wirelessly and can receive touch clicks back is invaluable.

    In fact, Unreal Engine's "VCam" is built on similar principles. They call it "Pixel Streaming", and they use ARKit to send positional data back to the engine to drive the UE4 camera.

    As can be seen from this video:


    I'm building everything from scratch within Unity, and because I'm no coder, I'd appreciate some help.
    Unity itself can benefit from feedback from a filmmaker's perspective.

    I'm hoping there is an easier solution, such as Unity's GUI/Canvas or an HTML/CSS way of creating buttons and sliders and assigning them to Unity component functions.

    Currently I'm using the OSC protocol to achieve this.
    Doing it via WebRTC would be great.




     
  47. herohki

    herohki

    Joined:
    Nov 20, 2018
    Posts:
    5
    Hello there,

    May I ask if it is possible, or if there is any method, to stream a Unity .exe directly? For example, importing an .exe Unity program and streaming it on the web.
    Any theory or idea would be appreciated.
     
  48. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    Hello,
    Thank you for all the work on this package. We have an example up on an AWS server and it works great.

    The issue we are fighting now sounds exactly like kayy's issue above (#127). After the first run, the second run crashes the editor silently. The same crash happens in your example project and after we integrated WebRTC streaming into an existing project. I am not seeing anything in the Editor logs; I tried a try/catch around Awake / OnDestroy in RenderStreaming. No change.

    Kayy's post
    https://forum.unity.com/threads/unity-render-streaming-introduction-faq.742481/page-5#post-5672407

    Player logs do show these:

    Releasing render texture that is set to be RenderTexture.active!
    (Filename: Line: 900)
    Culling group was not disposed. You have to call Dispose explicitly from the main thread. This will likely result in a crash.
    (Filename: Line: 70)

    I tried to make sure the render texture in HDRPRenderTextureBlitter is cleaned up, just in case. Same behavior.

    Thank you,
    Matt
     
  49. Tobs-

    Tobs-

    Joined:
    Feb 12, 2016
    Posts:
    17
    Hi folks,
    I am quite happy to see how much effort has flowed into this project, and it is turning out really well as far as I can see :) Keep up the great work!

    Now I have a quick question about the RenderStreaming SimpleScene. Why are there three cameras? I know that to demonstrate the multi-camera feature you would have two cameras with the CameraCapture script attached to them, but what does the third (main) camera do? It seems to be necessary. How does the whole HDR stream capturing work? Could someone explain it to me?

    Cheers,
    Tobi
     
  50. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Please let me know which version you are using.
    It might be fixed in the latest version.