
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    I just updated the WebRTC package from version 1.1.2 to the most recent 2.0.5 and it's running :)

    But I had to remove all checks for WebRTC.CodecInitializationResult == CodecInitializationResult.Success. The console always showed this error:

    System.EntryPointNotFoundException: ContextGetCodecInitializationResult
    at (wrapper managed-to-native) Unity.WebRTC.NativeMethods.ContextGetCodecInitializationResult(intptr)
    at Unity.WebRTC.Context.GetCodecInitializationResult () [0x00001] in C:\........\MyApp\Library\PackageCache\com.unity.webrtc@2.0.5-preview\Runtime\Scripts\Context.cs:62
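
    For reference, a minimal sketch of the kind of check that no longer works (hypothetical helper name, built around the WebRTC.CodecInitializationResult property mentioned above):

    Code (CSharp):
    using Unity.WebRTC;

    public static class EncoderStatusCheck
    {
        // 1.x-era style check; in 2.0.5-preview reading this property throws the
        // System.EntryPointNotFoundException shown above.
        public static bool IsEncoderReady()
        {
            return WebRTC.CodecInitializationResult == CodecInitializationResult.Success;
        }
    }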


    Is there any other way to get info about the current status?
    Thanks

    UPDATE:
    I tested this with the software and hardware encoders in the Unity editor and in a Windows standalone build. As I expected, the result is the same in all scenarios, as there seems to be a problem with the DLL.

    I found that stability has improved compared with 1.1.x, as the Editor no longer crashes when WebRTC.Init fails. But the behaviour is out of control when I try to exceed the maximum number of encoder sessions:
    the browser shows a black screen while the connection seems to be active; just the video track is not available.
     
    Last edited: Aug 6, 2020
  2. bisc67

    bisc67

    Joined:
    Apr 22, 2016
    Posts:
    8
    I've been finding the WebRTC package exceptionally unreliable. I've been getting hard crashes, e.g. weird memory reference exceptions, in Unity.

    If I single-step through the code, it'll work about half the time. At points, as soon as I try to CreateOffer, and often just when doing new RTCPeerConnection, it hard crashes Unity. There is no useful stack frame to determine what is wrong.

    This is the code I'm executing below. Sometimes it'll crash when the RTCConfiguration is being created; sometimes it crashes in the ConnectServer coroutine, when it's trying to create the offer.

    I've been trying to get this working for quite some time. Any help would be greatly appreciated.

    [attached screenshot: the code being executed]
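
    Since the attached code is only a screenshot, here is a rough sketch of the flow described above: a coroutine that builds an RTCConfiguration, constructs the RTCPeerConnection, and then creates the offer. Names are hypothetical, and the exact CreateOffer overload differs between 2.x preview releases:

    Code (CSharp):
    using System.Collections;
    using Unity.WebRTC;
    using UnityEngine;

    public class OfferExample : MonoBehaviour
    {
        RTCPeerConnection pc;

        // Assumes WebRTC.Initialize() has already been called elsewhere.
        IEnumerator ConnectServer()
        {
            // The crash reportedly happens around here, while the configuration
            // or the peer connection is being created...
            var config = default(RTCConfiguration);
            config.iceServers = new[]
            {
                new RTCIceServer { urls = new[] { "stun:stun.l.google.com:19302" } }
            };
            pc = new RTCPeerConnection(ref config);

            // ...or here, when creating the offer.
            var op = pc.CreateOffer();
            yield return op;
            if (op.IsError)
            {
                Debug.LogError("CreateOffer failed");
                yield break;
            }

            var desc = op.Desc;
            yield return pc.SetLocalDescription(ref desc);
            // The offer would then be sent to the signaling server.
        }
    }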

    Below is the Unity stacktrace. There is no useful information in the log.

    [attached screenshot: Unity stack trace]
     
    Last edited: Aug 6, 2020
  3. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
    Which Unity version are you using? It only works on 2019.3; on other versions it crashes.
     
  4. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
    Hello, everyone!
    I'm new to working with WebRTC technology.
    For now, I am testing with a local signaling server and have got peer-to-peer data and audio transmission working.
    But I'm stuck at video transmission...
    After I add a video stream, how can I get its data on the other peer and turn it into a Texture2D or RawImage?
     
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thanks for sharing the issues.
    Do you mean "the maximum of encoder sessions" refers to a limitation of the NVIDIA codec?
     
  6. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I believe your question is about the video decoding feature.
    Video decoding is on the roadmap and is planned for release in version 2.3.
     
  7. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
    Yes, Kazuki, you are right. Is there any information on approximately when this release will be?
     
  8. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    Yes. Before updating to 2.0.5 I had a limit of 2 concurrent sessions when using the hardware encoder. Starting a third one resulted in CodecInitializationResult.EncoderInitializationFailed.

    This does not occur when I use the software encoder.

    Therefore (and in general) it's a good idea to get information about the status of WebRTC, which I did in the past by reading WebRTC.CodecInitializationResult.
    As this does not work in 2.0.5, I skipped these checks and everything works fine, but every session above the limit shows an empty video stream and I cannot inform the user about the reason. Our server with a Quadro RTX 5000 does not have this limit, and 8 sessions in parallel are fine.
     
  9. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    8 sessions in parallel! Oh my goodness!
    I am interested in what kind of software you are developing.

    OK, I got it. I will consider the design of an API to get the status of the encoder.
     
    kayy likes this.
  11. princemioRocker

    princemioRocker

    Joined:
    Apr 1, 2020
    Posts:
    3
    Hey guys - thanks for this amazing plugin. It really opens up massive possibilities! I wonder if this issue of spontaneous video quality drops and artefacts will be fixed any time soon. Here are two different issues on GitHub - but I feel that they actually relate to the same phenomenon:
    1) https://github.com/Unity-Technologies/com.unity.webrtc/issues/127
    2) https://github.com/Unity-Technologies/com.unity.webrtc/issues/100

    Both examples show heavy temporary artefacts while the bitrate is stable. Thank you again and hope to hear from you

    best
     
  12. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We recognize the issue as high priority, but we need more time to fix it.
     
  13. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
  14. JHEBox

    JHEBox

    Joined:
    Aug 16, 2020
    Posts:
    4
    Hi there,

    I have two questions about the render streaming package. Firstly, I notice in the SimpleRenderStreaming scene on the RenderStreaming object, you can add button click events, and the Element Ids line up with the 3 pre-made buttons which appear on play (light on, light off, and play audio). I can also see in the RenderStreaming script that an OnButtonClick method exists. However, I can't see where these 3 button gameobjects actually are or how they can be edited, for example moving the placement of the buttons, their colours, or their labels. Where can I find the button gameobjects so that they can be edited?

    Secondly, and this really is a noob question so please accept my apologies, I am struggling with the documentation to figure out how to make the web application publicly accessible (i.e., stream over something other than localhost): https://docs.unity3d.com/Packages/c...ing@2.0/manual/en/webapp.html#command-options. I'm a complete novice at servers and this sort of thing, but if anyone could point me in the right direction on how to get started making the stream accessible from, for example, a different PC, that would be great, because I'm struggling to get my head around anything online!

    Thanks :)
     
  15. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
    Hi, you need your own signaling server to do this. There are some samples in the official Unity repositories. But generally, you need to send the RTCSessionDescription struct via JSON (or another way) to the other PC, then get a mirrored answer back. Then repeat this with the RTCIceCandidate struct, and the connection should be established.
    The next stage depends on what you want to transfer (data, audio, or video).
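
    A rough sketch of that exchange on the offering side (the SendToRemotePeer helper is a placeholder for whatever transport you choose - HTTP, WebSocket, etc.):

    Code (CSharp):
    using System.Collections;
    using Unity.WebRTC;
    using UnityEngine;

    public class ManualSignalingExample : MonoBehaviour
    {
        // Placeholder for your own transport (HTTP, WebSocket, ...).
        void SendToRemotePeer(string json) { Debug.Log(json); }

        IEnumerator Connect(RTCPeerConnection pc)
        {
            // 1. Create the offer and set it as the local description.
            var offerOp = pc.CreateOffer();
            yield return offerOp;
            var offer = offerOp.Desc;
            yield return pc.SetLocalDescription(ref offer);

            // 2. Send the RTCSessionDescription to the other PC, e.g. as JSON.
            SendToRemotePeer(JsonUtility.ToJson(offer));

            // 3. When the mirrored answer comes back, apply it with
            //    pc.SetRemoteDescription(ref answer).
            // 4. Exchange RTCIceCandidate the same way: candidates gathered
            //    locally arrive via pc.OnIceCandidate, and candidates received
            //    from the other PC are passed to pc.AddIceCandidate(...).
        }
    }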
     
  16. dyhmichail

    dyhmichail

    Joined:
    Jan 16, 2016
    Posts:
    16
    [attached screenshot]
    Hello everyone! Does somebody know how to get webrtc.dll when building for Android?
    I get the error Lib = null when making an Android build.
     

  17. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Last edited: Aug 18, 2020
  18. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, we are checking the issue which you reported, but we cannot reproduce it.
    Could you give us more details about it?
     
  19. Uee

    Uee

    Joined:
    Nov 18, 2015
    Posts:
    1
    Hi, does anyone have a tutorial or steps I can follow to get Unity Render Streaming to work remotely, please?
     
    Last edited: Aug 24, 2020
  20. JHEBox

    JHEBox

    Joined:
    Aug 16, 2020
    Posts:
    4
    I got the stream working on other devices as long as they're all on the same network (still not sure how signaling servers work, as I couldn't find the official Unity samples which dyhmichail mentioned). Does anybody have any information that could help with my first query, about how to edit the "lights off/on" buttons? As said, I can see how to change the events, but I can't find the buttons themselves!
     
  21. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    These buttons are created and defined in app.js. See also register-events.js for processing.
    Their input is transferred via the WebRTC data channel, as described in the input documentation.

    In the Unity part this is processed in RemoteInput.cs.
     
    polyverseme and JHEBox like this.
  22. JHEBox

    JHEBox

    Joined:
    Aug 16, 2020
    Posts:
    4
    Brilliant, I think I now understand how everything is processed at least, so thank you!

    I can see how I would edit the buttons in app.js; however, I am struggling to find app.js locally on my machine. I thought it would be contained in the package files in the Unity project, but I can't seem to find any documentation about the file path and haven't found the files whilst digging through. Do you know what the file path would be to find the webapp files on my machine?
     
  23. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    It should reside under WebApp/public/scripts/app.js. Everything under WebApp/public is served as static content by the Node.js server. Thus if you change anything, you need to call WebApp/run.bat (or a similar script if you are on macOS or Linux); it just needs these 2 lines:
    call npm run build
    call npm run start
     
    kazuki_unity729 likes this.
  24. JHEBox

    JHEBox

    Joined:
    Aug 16, 2020
    Posts:
    4
    Right okay, please accept my apologies for having no idea what I'm doing but I think I'm starting to get it... So the WebApp IS "webserver.exe". However, when you download it from within Unity (Edit/RenderStreaming/Download WebApp) it comes as an .exe, so you can't edit the app.js script because the WebApp has already been built? So you need to download the source code separately, change the scripts to your needs, and then build a new webserver.exe?
     
  25. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    Exactly :)
     
  26. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    I am working on a multi-platform project where only the desktop versions will support render streaming. I ended up with a couple of pre-processor statements around all those methods that rely on WebRTC-specific types like EncoderType etc., since they cannot be hidden by abstraction.
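
    For illustration, a minimal sketch of that pattern (hypothetical class name, guarding on desktop defines and assuming the 2.x WebRTC.Initialize(EncoderType) overload):

    Code (CSharp):
    using UnityEngine;
    #if UNITY_STANDALONE || UNITY_EDITOR
    using Unity.WebRTC;
    #endif

    public class StreamingBootstrap : MonoBehaviour
    {
        void Start()
        {
    #if UNITY_STANDALONE || UNITY_EDITOR
            // WebRTC-specific types such as EncoderType only exist on desktop targets.
            WebRTC.Initialize(EncoderType.Hardware);
    #else
            Debug.Log("Render streaming is not available on this platform.");
    #endif
        }
    }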

    I know that some platforms like Android are not supported by WebRTC. Are there any plans to provide a dummy library so that builds for targets such as Android, UWP, ... don't fail with errors like 'The type or namespace name 'WebRTC' does not exist'?

    Thanks
     
  27. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    OK, I am investigating a way to disable building the WebRTC assembly when targeting a platform which is not supported by the package.
     
    kayy likes this.
  28. JianyingWan

    JianyingWan

    Joined:
    Aug 27, 2020
    Posts:
    2
    Hi, I have a question about the UI canvas. I need to show a data panel on the screen, like this:
    [attached screenshot: data panel overlay]
    The canvas render mode is Screen Space - Overlay, and it cannot be displayed in the remote browser.
     

  29. Arkade

    Arkade

    Joined:
    Oct 11, 2012
    Posts:
    655
    Hi, just started looking at the WebRTC package, but the samples show errors in Rider due to the presence of ZWSP (zero-width space) characters in several places in the sample source code. See the attached image. It's easy enough to go through and delete them, but I thought I'd notify here so others who see error messages like the following know what to look for (and so the authors can remove them :) ). In Rider, the error reports as lots of:

    Argument type 'Unity.WebRTC.RTCIceCandidate' is not assignable to parameter type 'RTCIceCandidate'

    HTH, R.

    (EDITED to remove mention of build problems on Windows which were just due to not switching to 64-bit build :D )
     

    Last edited: Aug 27, 2020
  30. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am afraid that the issue in which the UI cannot be displayed in the browser is by design.

    You need to select the option "Screen Space - Camera" for Render Mode.
    Please refer to the template project published in the GitHub Releases.

    https://github.com/Unity-Technologi...derstreaming-hd/Documentation~/en/tutorial.md
     
  31. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thank you for kindly reporting the issue.
    Rider is a great IDE for Unity developers. As a side note, I generally use VS2019. :p

    I fixed the issue in this PR.
    https://github.com/Unity-Technologies/com.unity.webrtc/pull/173
     
  32. JianyingWan

    JianyingWan

    Joined:
    Aug 27, 2020
    Posts:
    2
    Hi, I am using the 2.1.0 preview version now, with Unity 2019.4.9f1 (64-bit).
    I have a couple of questions.

    First, is there any way to dynamically change the render streaming size while the app is running? I found that the VideoStreamTrack class doesn't have any methods or properties to change the size.

    Another question: sometimes the WebRTC connection between the app and the browser is not stable; only restarting the app allows reconnecting, and refreshing the web page has no effect. And even if I don't restart the webserver, the connection only stays stable for a short while.
    [attached screenshot]

    Thanks for your answer.
     
    Last edited: Sep 3, 2020
  33. SenaJP

    SenaJP

    Joined:
    Jan 31, 2016
    Posts:
    4
    Hi

    Unity Render Streaming is a very fun package. I have been having a lot of fun playing it.

    I have successfully controlled the Unity Editor (Unity Render Streaming) running on a PC from my phone over the internet.

    https://twitter.com/_5ENA/status/1301099121495875590?s=20

    To put it simply, I did the following two things:

    1. Change the signaling to use a signaling server on the internet.
    2. Create a React Native Android application.

    For the signaling server, I used a free server called Ayame Lite.
    https://ayame-lite.shiguredo.jp/beta

    The source code of Ayame is available to the public, so you should be able to set up your own signaling server: https://github.com/OpenAyame/ayame

    I used this OSS to create the React Native app.
    https://github.com/react-native-webrtc-kit/react-native-webrtc-kit

    By checking and implementing the specification for Unity Render Streaming's DataChannel, you can send button events and touch events.

    The Unity Render Streaming project with a modified signaling process is available to the public.
    https://github.com/tarakoKutibiru/UnityRenderStreaming-Ayame-Sample

    The React Native side of the source code is still a bit buggy, so I'm hoping to release it to GitHub once I've fixed the bugs and cleaned up the source code.
     
    kayy, JHEBox and kazuki_unity729 like this.
  34. olindstrm

    olindstrm

    Joined:
    Aug 8, 2018
    Posts:
    8
    I have been trying to set this up for a few hours now but have failed to get it working.

    At the moment I am just trying to connect, over the internet, to an instance of the webserver and a build running on my local computer. I have port-forwarded port 80 and am using coturn running in Google Cloud as an ICE server. I am using WebSocket signaling as my signaling URL.

    Once I connect using my public IP on my phone (the computer and the phone are not on the same network), I can see the user interface on my phone, which tells me I have successfully connected to the webserver; however, I am not getting any video coming through. The log in the editor says that a client has connected, but on my phone I can only see the standard grey screen. After around 5-10 seconds the play button appears again on my phone, which I take to mean that the webserver disconnected it.

    What could I be doing wrong? Does it have something to do with the ICE server setup?
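
    As a reference point for the ICE setup, this is roughly what handing a TURN server to a peer connection looks like with the Unity.WebRTC types (hypothetical class name; the coturn URL and credentials below are placeholders):

    Code (CSharp):
    using Unity.WebRTC;

    public static class IceConfig
    {
        public static RTCConfiguration Build()
        {
            var config = default(RTCConfiguration);
            config.iceServers = new[]
            {
                new RTCIceServer
                {
                    // Placeholder coturn instance and credentials.
                    urls = new[] { "turn:turn.example.com:3478" },
                    username = "user",
                    credential = "pass"
                }
            };
            return config;
            // Used as: var pc = new RTCPeerConnection(ref config);
        }
    }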
     
  35. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am sorry for the trouble.

    We found a bug in the WebSocket signaling client and fixed it this week.
    This pull request is the fix for the bug.

    I am going to release the hotfix version 2.1.1 next week.
     
    SenaJP likes this.
  36. olindstrm

    olindstrm

    Joined:
    Aug 8, 2018
    Posts:
    8
    Thanks! Unfortunately this did not solve my problem. I have the exact same issue as before, except now it will occasionally throw this error whenever a client connects over the internet:
    Code (CSharp):
    Network Error: Unity.WebRTC.RTCError
    I still only see the gray screen with the buttons when connecting over the internet, except I don't get thrown out after a few seconds like before.

    EDIT: I should probably mention it all works fine on my local network.

    EDIT2: Oh, and also, sometimes the main camera does not work when I connect locally on my phone, yet the secondary camera (the one in the corner) works just fine.
     
    Last edited: Sep 9, 2020
  37. fourbb

    fourbb

    Joined:
    Jul 22, 2020
    Posts:
    13
    Hello, when I use the hardware encoder, the data channel can connect; however, I am not getting any video coming through. I have confirmed that my graphics card (GeForce RTX 2080 Ti) supports NVENC. I checked chrome://webrtc-internals and found no video stream. I guess the problem lies in the encoding part. What could cause the hardware encoding to fail? And where should I check to see if the encoding is successful?
     
    Last edited: Sep 10, 2020
  38. olindstrm

    olindstrm

    Joined:
    Aug 8, 2018
    Posts:
    8
    The issue with not being able to see the scene has been resolved. I had set up my TURN server incorrectly, sorry for the hassle.

    The WebSocket issue does still persist (I know it's getting fixed), and sometimes a user needs to refresh their feed multiple times before being able to see what is being rendered on their camera. This could perhaps be due to poor internet connectivity? Still investigating this.

    The WebRTC error still persists whenever I try to connect on my mobile device.
     
  39. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    We have a project where each user that connects to our game instance over WebRTC gets their own camera, audio listener, and player. Each camera has its own CameraStreamer, its VideoStreamTrack is given to that user, and its RemoteInput is routed to that player's control logic.

    I am unsure how to create a unique AudioStreamTrack for each AudioListener since each AudioStreamer directly calls WebRTC.Audio.Update(data, channel).

    Does WebRTC.Audio.CaptureStream() create different tracks, or is it the same track? If they are different, I'm just not clear on how to stream to that track from the OnAudioFilterRead function.

    The AudioRenderer solution in the docs goes through this same Update function.
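
    For reference, the single-input pattern being described looks roughly like this (hypothetical class name, based only on the Audio.Update / Audio.CaptureStream calls mentioned above; whether CaptureStream yields distinct tracks per call is exactly the open question):

    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    [RequireComponent(typeof(AudioListener))]
    public class SingleAudioCapture : MonoBehaviour
    {
        void Start()
        {
            // One global capture stream; its track(s) get added to the peer connection.
            var audioStream = Audio.CaptureStream();
        }

        void OnAudioFilterRead(float[] data, int channels)
        {
            // Every AudioStreamer pushes samples into the same global audio input.
            Audio.Update(data, channels);
        }
    }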

    Thanks for the help.
    Matt
     
  40. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Is the software encoder working properly?
    It is possible your issue is caused by the network environment.
     
  41. fourbb

    fourbb

    Joined:
    Jul 22, 2020
    Posts:
    13
    Hello, the software encoder is working fine.
    How should I check for and solve problems in the network environment?

    Thanks for your answer.
     
  42. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I understand your expectation.
    The latest version of the WebRTC package does not support streaming multiple audio tracks.

    We'll deal with it in the future.
     
  43. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Oh, so the software encoder is working properly; then this might not be a network issue.
    I guess the cause is either the graphics driver or the browser.
     
  44. fourbb

    fourbb

    Joined:
    Jul 22, 2020
    Posts:
    13
    There should be no problem with the graphics driver. When I use OBS Studio, NVENC hardware encoding works normally.
     
    Last edited: Sep 11, 2020
  45. LazyOnion

    LazyOnion

    Joined:
    Mar 6, 2018
    Posts:
    22
    Hello, great job to the whole team!! I have a question regarding the software encoder: is it feasible to make it work on Linux without many modifications?
     
  46. RocketPixel

    RocketPixel

    Joined:
    Jun 23, 2020
    Posts:
    5
    Has anyone had success getting Unity WebRTC working with SFUs? I would love to hear about it.
    We have had great success getting WebRTC scaling to about 8 users. Thank you Unity!
    I would like to figure out how to scale to 100.
    I am looking at AntMedia, Jitsi, Janus, and mediasoup, but it's not yet clear to me how to send a Unity stream out to these services; they seem written more to connect browser to browser, but I am hoping I am just ignorant here.
     
  47. RocketPixel

    RocketPixel

    Joined:
    Jun 23, 2020
    Posts:
    5
     
  48. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    This WebRTC package uses NVIDIA Codec SDK 9.0, which requires the NVIDIA driver versions below:
    - Windows: Driver version 418.81 or higher
    - Linux: Driver version 418.30 or higher
     
  49. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The latest package, v2.1, does not support the software encoder on Linux.
    We are developing the new version, v2.2, which will support it.
     
  50. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It is important for increasing the number of concurrent users, but unfortunately we have never tried to integrate with SFUs.