Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. dipak_unity777

    dipak_unity777

    Joined:
    Dec 1, 2021
    Posts:
    6
    We at Sariska have developed Unity native plugins for Android/iOS, built on the Jitsi architecture. You can check out the docs at docs.sariska.io.

    Furthermore, you can reach out to me at dipak@sariska.io, and I would be more than happy to help integrate the plugins for your use.
     
  2. Deleted User

    Deleted User

    Guest

    Hello, I was wondering if you could tell me when UWP ARM64 builds (HoloLens 2) will be supported.

    Since Microsoft stopped moving forward with MixedReality-WebRTC, this library is the most recent and furthest along.

    If it will take too long to support ARM64, I would be happy to learn from you where to look so I can develop it myself.

    Thank you.
     
  3. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Unfortunately, we have no schedule for supporting UWP ARM64.
    Please see the ticket below.
    https://github.com/Unity-Technologies/com.unity.webrtc/issues/15
     
  4. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    How can I build webrtc.dll from my webrtc fork? I noticed that it will download webrtc-win.zip from github in this script. I can get `webrtc.lib` from `src/out/Default/obj` directory. But where can I find `webrtcd.lib`?
     
  5. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    OK, I got the answer from this.
     
  6. dan_soqqle

    dan_soqqle

    Joined:
    Feb 2, 2022
    Posts:
    113
    Query: I was using the bidirectional sample to build two-way audio chat between users. It was using a private signaling channel.

    However, I now also need to broadcast a camera to the web. I was thinking of using the broadcast sample, but it uses a public channel on the signaling server.

    I don't quite understand how private vs. public works. How could I make this work?
    Could you advise me please? Thank you.
     
  7. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I understand your issue, but the signaling server needs the communication mode (private or public) to be selected via a command-line option.
    That is not convenient for many developers, so we are going to revise this design in the future. We cannot say exactly when we will fix it.
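    For reference, the mode is chosen when launching the signaling server. A sketch of the two launch modes follows; the exact flag names (`-w`, `-m`) are assumptions here, so run the server with `--help` to confirm them on your version:

    ```shell
    # Sketch: launching the Render Streaming web server in each mode.
    # Flag names are assumptions; check the --help output for your version.
    .\webserver.exe -w -m public    # broadcast-style: every peer can see every stream
    .\webserver.exe -w -m private   # pairs peers one-to-one via a shared connectionId
    ```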
     
  8. NeptuneEarth

    NeptuneEarth

    Joined:
    Dec 20, 2020
    Posts:
    15
    I can see that you've updated the WebRTC package on GitHub.
    Are we going to get updated Unity template files for the Built-in pipeline, URP, HDRP, and HDRP RTX?
     
    Last edited: Oct 4, 2022
  9. dan_soqqle

    dan_soqqle

    Joined:
    Feb 2, 2022
    Posts:
    113
    Thank you for this. However, I'm curious: what is the difference between private and public? I couldn't find any documentation on it. If it is available somewhere, feel free to share!
     
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  11. NeptuneEarth

    NeptuneEarth

    Joined:
    Dec 20, 2020
    Posts:
    15
    Thanks Kazuki.

    I'm getting an error while trying to download the latest (exp4) version.


    [Package Manager Window] Cannot perform upm operation: Unable to add package [com.unity.renderstreaming@3.1.0-exp.4]:
    Package [com.unity.renderstreaming@3.1.0-exp.4] cannot be found [NotFound].
    UnityEditor.EditorApplication:Internal_CallUpdateFunctions ()
     
  12. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am working on releasing that version right now. Sorry for the wait!
     
  13. NeptuneEarth

    NeptuneEarth

    Joined:
    Dec 20, 2020
    Posts:
    15
    Awesome, thank you :)
     
  14. goatstruck

    goatstruck

    Joined:
    Apr 29, 2015
    Posts:
    3
    Hi,

    Thanks for all the work creating and maintaining what looks to be a very promising package.

    I'm having some trouble using a WebCamStreamSender when deployed to my Android phone.

    Line 84 of WebCamStreamSender.cs creates a WebCamTexture to be used when creating a VideoStreamTrack. However, the constructor of VideoStreamTrack throws an exception:

    ArgumentException: This graphics format R8G8B8A8_UNorm is not supported for streaming, please use supportedFormat: B8G8R8A8_UNorm
      at Unity.WebRTC.WebRTC.ValidateGraphicsFormat (UnityEngine.Experimental.Rendering.GraphicsFormat format) [0x00000] in <00000000000000000000000000000000>:0
      at Unity.WebRTC.VideoStreamTrack..ctor (System.IntPtr texturePtr, System.Int32 width, System.Int32 height, UnityEngine.Experimental.Rendering.GraphicsFormat format, System.Boolean needFlip) [0x00000] in <0000000000000000000000000
      at Unity.WebRTC.VideoStreamTrack..ctor (UnityEngine.Texture texture, UnityEngine.RenderTexture dest, System.Int32 width, System.Int32 height, System.Boolean needFlip) [0x00000] in <00000000000000000000000000000000>:0
      at Unity.RenderStreaming.WebCamStreamSender.CreateTrack () [0x00000] in <00000000000000000000000000000000>:0
      at Unity.RenderStreaming.StreamSenderBase.get_Track () [0x00000] in <00000000000000000000000000000000>:0
      at Unity.RenderStreaming.SignalingHandlerBase.AddS

    Graphics device is null.
      Unity.WebRTC.VideoEncoderMethods:FinalizeEncoder(IntPtr, IntPtr)
      Unity.WebRTC.VideoStreamTrack:Dispose()
      Unity.WebRTC.MediaStreamTrack:Finalize()


    Replicating the code used in the constructor of VideoStreamTrack, I have verified that my device does support B8G8R8A8_UNorm. Inspecting m_webCamTexture.graphicsFormat shows that it is in R8G8B8A8_UNorm format.

    I can't work out how to specify a different texture format, or to configure the use of a codec to sort this out.

    I am using version 3.1.0-exp.3

    Any help would be greatly appreciated. Thanks in advance!
     
  15. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    @goatstruck
    We released the latest version of the Unity Render Streaming package last week.
    Could you try it?
     
  16. may1st

    may1st

    Joined:
    Jan 17, 2018
    Posts:
    5
    Hi,
    it's not working properly with 3.1.0-exp.4.

    start websocket signaling server ws://192.168.44.199
    start as public mode
    http://192.168.44.199:80
    http://127.0.0.1:80
    GET /videoplayer/index.html 200 4.071 ms - 1407
    GET /css/main.css 304 0.876 ms - -
    GET /videoplayer/css/style.css 200 1.630 ms - 1546
    GET /videoplayer/js/main.js 200 0.733 ms - 5151
    GET /js/config.js 304 0.406 ms - -
    GET /videoplayer/js/video-player.js 200 1.107 ms - 7768
    GET /videoplayer/js/register-events.js 200 1.153 ms - 10396
    GET /module/signaling.js 404 0.882 ms - 158
    GET /module/peer.js 404 0.281 ms - 153
    GET /module/logger.js 404 0.256 ms - 155
    GET /module/keymap.js 404 0.232 ms - 155
    GET /videoplayer/js/gamepadEvents.js 200 0.794 ms - 5226

    Could you give me some help?
     
  17. VRS3DGuru

    VRS3DGuru

    Joined:
    Sep 21, 2017
    Posts:
    12
    In a web browser, I am able to connect to the server and pick the receiver link, etc. The screen shows a big grey box in the middle.

    In unity I get this:

    Signaling: WS connection closed, code: 1006
    UnityEngine.Debug:Log (object)
    Unity.RenderStreaming.Signaling.WebSocketSignaling:WSClosed (object,WebSocketSharp.CloseEventArgs) (at Library/PackageCache/com.unity.renderstreaming@3.1.0-exp.4/Runtime/Scripts/Signaling/WebSocketSignaling.cs:262)
    WebSocketSharp.Ext:Emit<WebSocketSharp.CloseEventArgs> (System.EventHandler`1<WebSocketSharp.CloseEventArgs>,object,WebSocketSharp.CloseEventArgs)
    WebSocketSharp.WebSocket:close (WebSocketSharp.PayloadData,bool,bool,bool)
    WebSocketSharp.WebSocket:fatal (string,uint16)
    WebSocketSharp.WebSocket:fatal (string,System.Exception)
    WebSocketSharp.WebSocket:connect ()
    System.Threading._ThreadPoolWaitCallback:performWaitCallback ()
     
  18. goatstruck

    goatstruck

    Joined:
    Apr 29, 2015
    Posts:
    3
    The new version worked, thanks very much.

    The previous version I was using had a WebCamVideoSender component, and many of the signalling-related classes and structs were public.

    The new version uses a VideoStreamSender which can be configured to source video from a webcam, camera, texture, etc. In this version many of the signalling messages have been marked internal.

    In the future, will the package continue to use a more generic VideoStreamSender component, or will you use different components for each source type (e.g. WebCamVideoSender)?

    Also, do you have any plans to make the signalling bits public again? I had modified WebSocketSignaling to use the Native WebSocket package (https://github.com/endel/NativeWebSocket) and added a few extra events such as OnDisconnected. This is no longer possible in the new package due to the access level of classes like DescData and CandidateData.

    Thank you for your help.
     
  19. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I think the generic VideoStreamSender is better than many separate components for maintainability.

    We want to make the APIs more flexible and easier for developers to customize.
    The access levels were changed to reduce our maintenance cost. I can open them up again if developers want to use them to implement their own signaling process.
     
  20. goatstruck

    goatstruck

    Joined:
    Apr 29, 2015
    Posts:
    3
    Thanks. For now, a few extra events in WebSocketSignaling would help me out. Are you accepting pull requests on GitHub?
     
  21. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  22. NTyudina01

    NTyudina01

    Joined:
    Mar 26, 2019
    Posts:
    1
    What BoringSSL version is used in 2.4.0-EXP.10?
     
  23. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  24. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    Can I use a software encoder with the H264 codec? I noticed that webrtc offers an API for creating an H264 encoder that uses OpenH264 to encode. But why is H264 not among the software encoder candidates? What can I do to enable it? Thanks in advance.
     
  25. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    x264 is excluded from our package to avoid licensing issues.
    You need to build the library yourself to use x264. Please remove "rtc_use_h264=false" on this line.
    https://github.com/Unity-Technologi...ain/BuildScripts~/build_libwebrtc_win.cmd#L47
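    The regenerated gn arguments would then look roughly like this (a sketch, assuming a standard depot_tools/src checkout; `proprietary_codecs` and `ffmpeg_branding` are assumptions commonly paired with `rtc_use_h264`, so verify them against your webrtc branch):

    ```shell
    # Sketch: rebuild libwebrtc with the OpenH264 software encoder enabled.
    # Run inside the webrtc src/ directory with depot_tools on PATH.
    gn gen out/Default --args="is_debug=false target_cpu=\"x64\" rtc_include_tests=false rtc_build_examples=false rtc_use_h264=true proprietary_codecs=true ffmpeg_branding=\"Chrome\" symbol_level=0"
    ninja -C out/Default webrtc
    ```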
     
  26. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
  27. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    I'm using 2.4.0-exp.4 of com.unity.webrtc and the M89 version of webrtc itself. I forked the M89 version of webrtc and noticed the relevant setting in webrtc.gni.
    It builds successfully with arguments as follows:
    --args="is_debug=%%j is_clang=true target_cpu=\"%%i\" rtc_include_tests=false rtc_build_examples=false rtc_use_h264=true symbol_level=0 enable_iterator_debugging=false"
    But it failed during the LINK step with an LNK2019 error when I tried executing `BuildScript~/build_plugin_win.cmd` to build `webrtc.dll`.
    Is it caused by incompatibility between MSVC and clang? How can I solve it?
     
  28. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    I also tried to build the webrtc M92 version with "rtc_use_h264=true is_clang=true". I patched add_jsoncpp.patch into src/BUILD.gn, by the way, and I switched to the 2.4.0-exp.11 version of com.unity.webrtc. But it still failed during the link step. The errors are as follows:
    upload_2022-11-8_20-43-39.png
     
  29. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It looks like a failure to link the standard library.
    Recently we changed how libc++ is linked here:
    https://github.com/Unity-Technologies/com.unity.webrtc/pull/827/files
     
  30. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    Sorry, my environment is Windows 10. What's more, it builds fine when I use the arguments provided in build_libwebrtc_win.cmd, so the link errors above are caused only by "is_clang=true". I noticed that clang-cl.exe from VS2019 was used during the build process, so maybe it's a compatibility issue between the clang used by webrtc and the one used by com.unity.webrtc? How can I solve it?
     
    Last edited: Nov 9, 2022
  31. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
  32. dan_soqqle

    dan_soqqle

    Joined:
    Feb 2, 2022
    Posts:
    113
    Hi, I'm still looking at doing voice chat (kind of like for an MMO game) where it can be for groups but also area-based.

    I'm wondering if I should stick with Render Streaming over a connection (I can use the bidirectional sample with a connectionId for private chat), or whether I should do a custom WebRTC setup. What is the difference? I know Render Streaming uses WebRTC under the hood; or is it the same?

    What do you think?
     
  33. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We designed Unity Render Streaming to make WebRTC easier to use, but if the design doesn't match your use case, you should use the WebRTC package directly.
     
  34. dan_soqqle

    dan_soqqle

    Joined:
    Feb 2, 2022
    Posts:
    113
    Thank you. From what I observe, if Render Streaming works well for my case I should stick with it and not worry about the WebRTC internals. Thanks!
     
  35. AzeExMachina

    AzeExMachina

    Joined:
    Jan 30, 2016
    Posts:
    14
    Hi, I have a setup where I'm using WebRTC between two Unity clients on different machines. One has an NVIDIA card, so I'm using H264 hardware encoding by setting the codec preferences after adding the tracks. The other machine, however, doesn't have an NVIDIA card. Is there a way to decode the stream in this situation? I'm receiving this error:

    Session error code: ERROR_CONTENT. Session error description: Failed to set remote video description send parameters for m-section with mid='0'.

    Forcing the VP8 codec does make it work, but it doesn't use hardware encoding.

    Thank you very much for your work!
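    For context, setting codec preferences in com.unity.webrtc typically looks like the sketch below. It reorders the sender's codec capabilities so H264 is tried first while keeping the rest as fallbacks, which is gentler than forcing a single codec; this is a sketch against the package's documented API, not the poster's exact code:

    ```csharp
    // Sketch: prefer H264 on a sending transceiver (com.unity.webrtc),
    // while leaving other codecs available for negotiation fallback.
    using System.Linq;
    using Unity.WebRTC;

    public static class CodecPreference
    {
        public static void PreferH264(RTCRtpTransceiver transceiver)
        {
            var capabilities = RTCRtpSender.GetCapabilities(TrackKind.Video);
            // Put H264 first; keep the remaining codecs so the receiver
            // can still negotiate VP8/VP9 if it cannot decode H264.
            var ordered = capabilities.codecs
                .OrderByDescending(c => c.mimeType == "video/H264")
                .ToArray();
            transceiver.SetCodecPreferences(ordered);
        }
    }
    ```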
     
  36. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    @AzeExMachina
    Currently the H264 decoder is not supported without an NVIDIA GPU on Windows and Linux.
    We have a plan to support hardware H264 decoding on Intel and AMD chips.
     
  37. AzeExMachina

    AzeExMachina

    Joined:
    Jan 30, 2016
    Posts:
    14
    Thank you very much, is there any ETA on that?
     
  38. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Sorry, but we can't promise a date for when we will support it.
     
  39. dave_m_moore

    dave_m_moore

    Joined:
    Jul 28, 2016
    Posts:
    39
    Hi. I wonder whether you could help me please? I am trying to use Render Streaming/WebRTC to output a single video stream for presentation on a web browser. However, despite much investigation, I only get a blank view in the browser where the video should be. I have tried Chrome, Edge and IE all with the same results.
    I am using the following:
    • Unity Render Streaming 3.1.0-exp.4
    • WebRTC 3.0.0-pre.1
    • Unity 2022.1.23f
    Running on Windows 10 with Unity, webserver and browser running on the same PC.

    There are no obvious error messages. The webserver seems to run fine. I've tried both HTTP and WebSocket signaling.
    I have followed a number of online tutorials step by step, but always get the same result.
    Any ideas please?
    Thanks,
    Dave
     
  40. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Do mouse or keyboard input work? If not, it may be an issue in the signaling process.
     
  41. dave_m_moore

    dave_m_moore

    Joined:
    Jul 28, 2016
    Posts:
    39
    I don't need any input, only video output.
     
  42. kotoezh

    kotoezh

    Joined:
    May 21, 2013
    Posts:
    21
    Hi. Could you please advise a better way to stream only a single-eye image from a mobile VR device?
    If the stream source is the main XR camera, the image freezes on the mobile device itself. If the stream source is Screen, I get stereo image streaming, which I don't need. Creating an additional camera for streaming would hurt performance due to the extra rendering.
    I wonder if there is a way to use the rendered right- or left-eye camera image for streaming without freezing the image on the device itself? It could probably be done with the Texture streaming source, but I didn't manage to find the solution.
    Thanks!
     
  43. dan_soqqle

    dan_soqqle

    Joined:
    Feb 2, 2022
    Posts:
    113
    Hi, I've recently needed to run a bigger conference-type activity that may reach 100 people.
    In this context, I think I need a media server?

    Does that mean Render Streaming won't be feasible and I have to build my own signaling and media-server solution? I'm exploring Kurento. Or is there a way to still use Render Streaming?
     
  44. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Can you grab the texture of the one-eye image? You can stream a texture using VideoStreamSender.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/video-streaming.html
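    A minimal sketch of that approach follows. The `source`/`sourceTexture` property names are assumptions based on the 3.1 video-streaming docs, so verify them against the VideoStreamSender API in your installed package version:

    ```csharp
    // Sketch: stream a single-eye RenderTexture via VideoStreamSender
    // (Unity Render Streaming 3.1.x). Property names are assumptions;
    // check the package docs for your exact version.
    using Unity.RenderStreaming;
    using UnityEngine;

    public class EyeTextureStreamer : MonoBehaviour
    {
        [SerializeField] VideoStreamSender sender; // assigned in the Inspector
        [SerializeField] RenderTexture leftEyeRT;  // copy of the left-eye image

        void Start()
        {
            // Stream the texture instead of a Camera or the Screen,
            // so the on-device rendering is left untouched.
            sender.source = VideoStreamSource.Texture;
            sender.sourceTexture = leftEyeRT;
        }
    }
    ```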
     
    kotoezh likes this.
  45. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Unity Render Streaming doesn't support a case like streaming video to 100 people. You would need an SFU server to distribute the streams to that many viewers.
     
  46. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    I need to stream two cameras' RenderTextures to the web page. Can the two video streams be encoded in parallel?
     
  47. dave_m_moore

    dave_m_moore

    Joined:
    Jul 28, 2016
    Posts:
    39
    Hi. Could someone please confirm whether an NVIDIA GPU is required for the 'Render Streaming' function to work?
     
  48. harper_18

    harper_18

    Joined:
    Dec 13, 2022
    Posts:
    1
    Does anyone know why Android can't create an AudioStreamTrack? The error is "object reference not set to an instance of an object" in AudioTrackSource.ctor.
     
  49. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
    No, an NVIDIA GPU is required only if you use the hardware encoding functionality.
     
  50. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803