Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  2. ayamir

    ayamir

    Joined:
    Jun 18, 2022
    Posts:
    13
  3. kotoezh

    kotoezh

    Joined:
    May 21, 2013
    Posts:
    21
    Hello again. I am trying to build a standalone video streaming server on Android (HTTP is now on NanoHttpd), but cannot figure out where to get the web server signaling app for Android. I'd be grateful for any tips on how to get it working!
     
  4. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    That sounds strange; the encoding process runs in parallel.
    Can you take a screenshot of the Profiler?
     
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Sorry but the signaling server doesn't work on mobile devices.
     
    kotoezh likes this.
  6. rstump

    rstump

    Joined:
    May 31, 2017
    Posts:
    2
    Hi. I'm trying to stream a 2x2 grid of four cameras. The 'WebBrowserInput' sample scene has a stream with two cameras, one overlaying the other, but for the life of me I can't figure out how the screen-space coordinates are set up for the second camera. I thought it would be under the 'viewport rect', but those are set at default values, and all my experiments with using them haven't worked when using the video stream sender component.

    Can you point me in the right direction?

    Edit: After a lot of fiddling I found out that the post-processing layer on the cameras was somehow preventing it. I have it working now by disabling them.
     
    Last edited: Jan 10, 2023
  7. Mirr0rjade

    Mirr0rjade

    Joined:
    Jan 11, 2023
    Posts:
    1
    Hello!
    Is it possible to play the stream on a video player like VLC?
     
  8. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I have never tested it. I am not sure VLC supports WebRTC.
     
  9. Rocketeer-Cis

    Rocketeer-Cis

    Joined:
    Apr 5, 2022
    Posts:
    3
    I have a problem with the BidirectionalSample scripts.

    I have a VR scene containing a web viewer. The idea is that you have one host making a new WebRTC connection and the rest joining that room.

    I have now put in my scene a 'RenderStreaming' component (component from the BidirectionalSample) that must ensure that the host can create the room. But as soon as he clicks 'SetUp' in a build version, the whole image in the headset gets stuck.

    It is important to note that I did not select a webcam but a texture to send over.

    Could it be because I have the RenderStreaming(host) and the Receiver(other people) in the same scene?

    How can I fix this?

    **UPDATE 1**

    If I leave this (see image) blank, it works just fine. The only thing is that I don't get anything anymore, which makes it logical that it works. Why doesn't this work?

    **UPDATE 2**

    If I create a room in a completely different project and join the room from another project, it works. What is the problem here?

    In short: I want to make a room where the host can create a stream of a render texture. The other users are joining the same room and receive the room code from the host. If they fill in the code they can watch the stream from the host on a plane in the scene. For multiplayer we are using Photon. I think it's possible but I don't know how :D. Our signalling server is in Google Cloud

     

    Attached Files:

    Last edited: Jan 24, 2023
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Admittedly, we haven't tested enough on VR headsets yet. It sounds like a bug in our software.
    What kind of headset do you use?
     
  11. orixyz14

    orixyz14

    Joined:
    Jun 23, 2022
    Posts:
    2
    Hello, I am currently implementing a team audio chat system using WebRTC. However, there is a latency problem in the call. When the audio channel is first connected, everything works well, but after a while the receiver experiences a noticeable delay, and I found that the delay time is basically consistent with the size of the lengthSec parameter in the Microphone.Start method. I have tried many methods but have not solved this problem. Do you know any possible causes of this problem?
     
  12. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, we had an issue with audio streaming delay last year. Is it the same as your issue?
    https://github.com/Unity-Technologies/com.unity.webrtc/issues/477
     
  13. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
    @orixyz14
    Can you confirm if the cause is the same as the issue below?
    https://github.com/Unity-Technologies/com.unity.webrtc/issues/788
     
  14. XHadower

    XHadower

    Joined:
    Feb 24, 2020
    Posts:
    5
    How do I get the browser microphone in the Broadcast sample? Audio Stream Sender works well, but I'm stuck on Audio Stream Receiver. Thanks
     
  15. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You can see the usage of the Audio Stream Receiver component in the "Receiver" sample.
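    A minimal sketch of the kind of wiring the Receiver sample does, assuming the 3.1-era AudioStreamReceiver API (the targetAudioSource property and OnUpdateReceiveAudioSource event are assumptions here; verify the exact names against the sample):

```csharp
// Hedged sketch, not the official sample code: route a received WebRTC
// audio track into a scene AudioSource via AudioStreamReceiver.
using UnityEngine;
using Unity.RenderStreaming;

public class ReceiveAudioExample : MonoBehaviour
{
    [SerializeField] private AudioStreamReceiver receiver; // on the same prefab
    [SerializeField] private AudioSource output;           // scene audio output

    void Start()
    {
        // Play the incoming track through the assigned AudioSource once
        // the receiver updates it with the remote stream.
        receiver.targetAudioSource = output;
        receiver.OnUpdateReceiveAudioSource += source => source.Play();
    }
}
```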
     
  16. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
    If you want to send voice from the Receiver sample (on the WebApp page), you need to implement the microphone-sending process as in the Bi-directional sample.
     
  17. christoph-heich

    christoph-heich

    Joined:
    Apr 1, 2020
    Posts:
    3
    Hello, how am I supposed to enable the hardware H264 encoder for com.unity.webrtc? Unfortunately, currently only VP8, VP9, and AV1 are available.

    WebRTC.EnableHardwareEncoder has been removed and I am currently not sure how to enable hardware side support so that only H264 is used.

    I am not able to watch a stream using SRS and OBS, as the RTCSessionDescription does not send H264 at all.

     
  18. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The WebRTC package detects NvCodec automatically when using an NVIDIA graphics card. Can you check the Unity log to confirm which graphics card is detected?
     
  19. christoph-heich

    christoph-heich

    Joined:
    Apr 1, 2020
    Posts:
    3
    Thank you, that may be the issue, as I am not using an NVIDIA card. Are there any plans to support non-NVIDIA cards?
     
  20. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We have plans to support hardware encoders for AMD and Intel chips, but I can't promise when they will ship.
     
  21. DBarlok

    DBarlok

    Joined:
    Apr 24, 2013
    Posts:
    268
    How can I avoid pixelated streaming when the camera moves too fast? Or how can I tweak the settings to achieve the best graphics quality? I think I have already tried all the settings... where do I code that?
     
  22. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    First of all, video quality is a trade-off with bandwidth. You need to check the bandwidth between the two PCs.
    The resolution of the video stream is one of the parameters that affect video quality.
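    For reference, these knobs can also be adjusted from script. A hedged sketch assuming the VideoStreamSender methods SetTextureSize, SetFrameRate, and SetBitrate from the 3.1 package (the signatures and the kbps unit are assumptions; check the API reference):

```csharp
// Hedged sketch: raising the quality ceiling of a stream by increasing
// resolution, frame rate, and the allowed bitrate range. Higher values
// only help if the network between the peers has the bandwidth.
using UnityEngine;
using Unity.RenderStreaming;

public class QualityTuning : MonoBehaviour
{
    [SerializeField] private VideoStreamSender sender;

    void Start()
    {
        sender.SetTextureSize(new Vector2Int(1920, 1080)); // stream resolution
        sender.SetFrameRate(60f);                          // target frame rate
        sender.SetBitrate(3000, 10000);                    // min/max, assumed kbps
    }
}
```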
     
  23. blackdeng

    blackdeng

    Joined:
    Oct 19, 2017
    Posts:
    5
    I'm new to this and apologies for the dumb question. I'm trying to test Unity Render Streaming using a TURN server so I can stream Unity graphics to a remote browser. I don't quite understand how it works, though. Say I have a working TURN server on a cloud service somewhere, the Unity project runs on PC1, and the browser is on PC2 — where does the web app live? And as for the browser-side changes on this page https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/turnserver.html, does it mean I need to rebuild the web app after the changes? What URL shall I put in the browser?
     
  24. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Usually you can get the IP address of the PC with the ipconfig command on Windows. After you get the IP address, please check the connection between the PCs using the ping command.
    https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/ipconfig
     
  25. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
  26. blackdeng

    blackdeng

    Joined:
    Oct 19, 2017
    Posts:
    5
    So the WebApp is running on PC1 where the Unity app is, or PC2 where the browser side is?
     
  27. blackdeng

    blackdeng

    Joined:
    Oct 19, 2017
    Posts:
    5
  28. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  29. unity_80D9F7F0E68845633386

    unity_80D9F7F0E68845633386

    Joined:
    Jan 13, 2023
    Posts:
    1
    Hi, is it possible to save the streamed video on a server at runtime?
     
  30. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  31. DreamDelerium

    DreamDelerium

    Joined:
    Jan 14, 2017
    Posts:
    17
    Hello. I am looking for a technology that would allow me to host Unity on a cloud service (like Azure Remote Rendering) but allow users to stream to a web browser. Additionally, each user should have their own session (or have the option to join a session already in progress). Is this possible with this tool? If so, are there any tutorials or examples that could get me started? Thank you!

    I ran through one of the tutorials that let me build an example scene and run a server to receive a stream from the Unity server. I added the camera movement script. But what I noticed was that if I had multiple browsers open and one browser moved the camera, all browsers moved (as did the Unity application). I would want each user to have their own independent session (and also allow shared sessions). Hope that makes sense?
     
  32. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
    We do not provide hosting cloud services.

    As an application implementation, I think that the Multiplay sample will be helpful.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/sample-multiplay.html
     
  33. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    After updating from 2.3.3 to 3.0.0-pre.5 I noticed that almost every browser session falls back to VP8. The only way to force H.264 encoding seems to be setting the flag media.webrtc.hw.h264.enabled = true in Firefox's about:config.

    Chrome and Edge always end up with VP8 but never H.264. I tried to force it in Unity by
    Code (CSharp):
    var hwCodecs = RTCRtpSender.GetCapabilities(TrackKind.Video).codecs
        .Where(codec => codec.mimeType == "video/H264");
    transceiver.SetCodecPreferences(hwCodecs.ToArray());
    This has no effect in Chrome and produces a black screen in FF (when media.webrtc.hw.h264.enabled is not set). My SDP answer string from the Unity side contains only the four H.264 codecs:
        m=video 9 UDP/TLS/RTP/SAVPF 106 102 112 127


    Despite this H.264-only answer, VP8 is used successfully on Chrome.

    If I switch back to an older version with 2.3.3 calling WebRTC.Initialize(EncoderType.Hardware) always results in a H.264 encoded session (with the same web frontend setup).

    So what am I missing?
    Do I have to force this at the client side?

    Thanks

    PS: My environment: NVIDIA GeForce RTX 3050, 8 GB VRAM, Windows 11, all tests on localhost
     
  34. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    I did some more investigation on the codec selection and found out that the web client's codec preferences always win. We let the web client send the initial offer, and the Unity app sends the answer. If I set the preferred codecs in JavaScript on the web client's side, its first codec will be taken. Regardless of whether it's VP8 or H.264, it works that way, but the preferred codecs of the answering Unity app are ignored. The stream only works if the client's first codec is supported on the Unity side; the answerer's preferences are never considered.

    I checked that the local and remote descriptions are set correctly, and also the correct roles (receiver and sender capabilities). I only get one answer from the Unity app, and its SDP string appears fine to me.
    Furthermore, I see the client's preferred codec in the RTCStatsReport even when it was not in the list of preferred codecs, and thus the screen remains black.

    Is there anything else to consider?
     
  35. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, I am sorry that the change in the codec-selection behaviour surprised you. Codec selection is determined by the contents of the signaling between sender and receiver.
    We are going to add a codec-filtering feature to VideoStreamSender in Unity Render Streaming.
     
  36. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    Sounds like the only workaround would be to send a pre-offer request from the Unity App as sender to the web client (receiver) to instruct it which codecs to consider in its subsequent offer request.

    Is there any chance if I change the roles? That is, let the sender create the offer and rely on the browser reflecting the offered codec list (including the order of preference) in its answer SDP?
     
  37. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I think you can make the sender create the offer, but you need to write your own class in place of the Broadcast class.
     
  38. YousifKhalidDolby

    YousifKhalidDolby

    Joined:
    Nov 7, 2022
    Posts:
    1
    Hi, I would like to make a pull request to the Unity.WebRTC project, but I am unable to. The changes live in a fork I have made of the repository. I keep getting "Pull request creation failed. Validation failed: You can't perform that action at this time."
     
  39. projoy

    projoy

    Joined:
    May 12, 2022
    Posts:
    2
    Hi, I have encountered a problem and would appreciate your guidance.

    There is an input field in my Unity app. Now, I have sent the video screen to the browser through Render Streaming.

    Because in Render Streaming each keyboard stroke is an event, how can I type words into the input field in the browser video and send them to the Unity application?
     

    Attached Files:

  40. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I'm not sure about the issue on GitHub... Can you share the link to your change?
     
  41. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
    I think it's the same problem as below.
    https://github.com/Unity-Technologies/UnityRenderStreaming/issues/542

    We can't support it, but there are people who have implemented it in the comments below, so why not try it?
    https://github.com/Unity-Technologies/UnityRenderStreaming/issues/542#issuecomment-1503921091
     
  42. theglimpsegroup

    theglimpsegroup

    Joined:
    Oct 18, 2017
    Posts:
    2
    I'm having an issue with Webserver.exe. I'm trying to move a local render stream configuration to our AWS servers so I can demo the feature from outside of our local network.
    - What works: A standalone Unity app that streams a view of the camera to web clients running Chrome, using Webserver.exe to perform the Signaling Server tasks.
    - The problem: I move the standalone executable and Webserver.exe to an AWS server (with GPU). When Webserver.exe runs, it displays the internal address of the AWS server instead of its external address. I cannot reach it from external browsers.
    - The Question: Is there some way to force Webserver.exe to use a specific network interface? Is there source code somewhere for Webserver.exe?
     
  43. projoy

    projoy

    Joined:
    May 12, 2022
    Posts:
    2
  44. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You can see here if you want to get source code.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/customize-webapp.html

    And please read this for understanding your network issue.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/turnserver.html
     
  45. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  46. R43te

    R43te

    Joined:
    May 19, 2023
    Posts:
    2
    Hello !

    My team and I are working on an augmented reality project
    We are using Unity Render Streaming and the AR Foundation sample to stream our 3D model from a Unity server to AR glasses (Lenovo A3).
    The Unity client is running on an Android phone connected to the glasses.
    The streaming is working, but the display in the glasses is a small 2D window with the stream, because we stream into a RawImage.

    Is there a way we can fully stream the 3D model from the scene?

    Thanks for the help
     
  47. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, do you mean that you want to send mesh data to another peer?
    VideoStreamTrack doesn't support mesh data because mesh data is not a texture. You would need to implement it with a DataChannel.
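    To illustrate the DataChannel route, a hedged sketch using the com.unity.webrtc RTCDataChannel API. The "mesh" channel label, the vertex-only serialization, and the omission of chunking are all simplifications for illustration; a real implementation must split large meshes into messages below the data channel's size limit and deserialize on the receiving peer:

```csharp
// Hedged sketch: serialize mesh vertices to bytes and send them over a
// WebRTC data channel instead of a video track.
using System;
using UnityEngine;
using Unity.WebRTC;

public class MeshSender : MonoBehaviour
{
    private RTCDataChannel channel;

    public void CreateChannel(RTCPeerConnection pc)
    {
        // Label "mesh" is an arbitrary choice for this example.
        channel = pc.CreateDataChannel("mesh");
    }

    public void SendMesh(Mesh mesh)
    {
        // Only vertices here; a full solution would also send triangles,
        // normals, and UVs, split into appropriately sized chunks.
        var vertices = mesh.vertices;
        var floats = new float[vertices.Length * 3];
        for (int i = 0; i < vertices.Length; i++)
        {
            floats[i * 3]     = vertices[i].x;
            floats[i * 3 + 1] = vertices[i].y;
            floats[i * 3 + 2] = vertices[i].z;
        }
        var bytes = new byte[floats.Length * sizeof(float)];
        Buffer.BlockCopy(floats, 0, bytes, 0, bytes.Length);
        channel.Send(bytes);
    }
}
```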
     
  48. camille_viot_avatar_medical

    camille_viot_avatar_medical

    Joined:
    Jun 20, 2023
    Posts:
    12
    Hi @kazuki_unity729
    I have a rather general question. Codecs have usually some knobs to tweak them depending on the use case. For example some of VP9 are given here: https://developers.google.com/media/vp9/settings
    What I would like to do is be able to change some of them to see how quality/latency/network resiliency behave in different situations and hardware configurations.
    But I miss the big picture. There are many parts here. Notably:
    • The Unity Render Streaming package and the Unity Web RTC package
    • The Unity WebRTC plugin
    • The WebRTC library, which itself contains libraries (such as libvpx, for instance)
    • Software libraries for hardware encoding such as NvCodec
    Could you explain to me how all of these things relate to each other and how they are organized? Does WebRTC (from a spec and protocol point of view) allow this kind of tweaking? If yes, what APIs allow it? Are there any examples (Unity or otherwise) that I could run to try this?
     
  49. R43te

    R43te

    Joined:
    May 19, 2023
    Posts:
    2
    Hello !

    Thank you for your quick answer.
    We would like to stream our FBX 3D model from our edge server to another peer, as in this photo.
    I think using the mesh is a good idea.
    Can you give tips on how I can implement it with a DataChannel?
     

    Attached Files:

  50. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You can find information about video codecs in the package document.
    https://docs.unity3d.com/Packages/c...g@3.1/manual/video-streaming.html#video-codec

    Unity Render Streaming uses video codecs via the WebRTC API. Please read the document for information about video codecs in the WebRTC package.
    https://docs.unity3d.com/Packages/c...ual/videostreaming.html#selecting-video-codec
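    As a concrete example of what is tweakable at the WebRTC API level, a hedged sketch using RTCRtpSender.GetParameters()/SetParameters() from com.unity.webrtc. The values are illustrative only, and codec-internal settings (like the libvpx presets on that VP9 page) are not exposed at this layer:

```csharp
// Hedged sketch: per-encoding knobs reachable through the sender's RTP
// parameters — bitrate cap, frame-rate cap, and resolution downscaling.
using Unity.WebRTC;

public static class SenderTuning
{
    public static void Tune(RTCRtpSender sender)
    {
        RTCRtpSendParameters parameters = sender.GetParameters();
        foreach (var encoding in parameters.encodings)
        {
            encoding.maxBitrate = 5_000_000ul;   // bits per second (illustrative)
            encoding.maxFramerate = 30u;         // cap the encoded frame rate
            encoding.scaleResolutionDownBy = 1.0; // 2.0 would halve each dimension
        }
        sender.SetParameters(parameters); // apply the modified parameters
    }
}
```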