
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. wechat_os_Qy005by_pzfV0HmWZxb1lf6fQ

    wechat_os_Qy005by_pzfV0HmWZxb1lf6fQ

    Joined:
    Jan 26, 2021
    Posts:
    7
    How can I switch cameras in the same scene? I tried to change the demo to achieve this, but it didn't work. My idea is to add a dictionary (Camera -> VideoStreamTrack) in RenderStreaming.cs, update the stream when the corresponding camera is activated, and stop streaming from the other cameras.

    I hope you can give me some advice. Thank you!
     
  2. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    I think you should create a VideoStreamTrack from a RenderTexture and render the camera's image into that RenderTexture.
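    A minimal sketch of what that could look like (illustrative only; the class and field names are mine, but the VideoStreamTrack(string, RenderTexture) constructor and Graphics.Blit usage match the sample code posted later in this thread):

    ```csharp
    using Unity.WebRTC;
    using UnityEngine;

    // Sketch: stream one shared RenderTexture and switch which camera feeds it.
    public class CameraSwitcher : MonoBehaviour
    {
        [SerializeField] private Camera camA; // assumed: both cameras render
        [SerializeField] private Camera camB; // into their own targetTexture
        public VideoStreamTrack Track { get; private set; }
        public bool useCamA = true;

        private RenderTexture shared;

        void Start()
        {
            // One track bound to one RenderTexture for the whole session.
            shared = new RenderTexture(1280, 720, 24, RenderTextureFormat.BGRA32, 0);
            Track = new VideoStreamTrack("camera", shared);
        }

        void Update()
        {
            // Copy the active camera's output into the streamed texture each frame.
            var src = useCamA ? camA.targetTexture : camB.targetTexture;
            Graphics.Blit(src, shared);
        }
    }
    ```

    This way the peer connection keeps a single track, so switching cameras never requires renegotiation.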
     
  3. wechat_os_Qy005by_pzfV0HmWZxb1lf6fQ

    wechat_os_Qy005by_pzfV0HmWZxb1lf6fQ

    Joined:
    Jan 26, 2021
    Posts:
    7
    I tried that, but it didn't work.

    As far as I can tell, the VideoStreamTrack in RenderStreaming.cs is never changed. How should I handle that part, or is my approach wrong?
     
  4. Kahlis

    Kahlis

    Joined:
    Jan 14, 2017
    Posts:
    6
    How do I create a receiver for Android?
     
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Android is not supported yet. We are working on it at the moment.
     
    SenaJP likes this.
  6. fourbb

    fourbb

    Joined:
    Jul 22, 2020
    Posts:
    13
    I have tried creating a VideoStreamTrack from a RenderTexture and rendering the camera image into that RenderTexture,
    but it didn't work.
    I found that I had to blit the image from that RenderTexture to the target texture of a camera, and create the VideoStreamTrack from that camera, for it to succeed.
     
  7. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    multi camera sample
    Code (CSharp):
    using System.Collections;
    using Unity.WebRTC;
    using UnityEngine;
    using UnityEngine.UI;

    public class WebRTCManager : MonoBehaviour
    {
        [SerializeField] private Camera camA;
        [SerializeField] private Camera camB;
        [SerializeField] private RawImage receiveImage;
        [SerializeField] private GameObject cube;
        [SerializeField] private Button btn;
        private RenderTexture renderTexture;
        private VideoStreamTrack renderTextureVideoTrack;
        private RTCPeerConnection pc1;
        private RTCPeerConnection pc2;
        private bool isCamA;
        private RTCOfferOptions offerOptions = new RTCOfferOptions
        {
            iceRestart = false,
            offerToReceiveAudio = false,
            offerToReceiveVideo = false
        };
        private RTCAnswerOptions answerOptions = new RTCAnswerOptions
        {
            iceRestart = false
        };

        void Start()
        {
            isCamA = false;
            btn.onClick.AddListener(() =>
            {
                isCamA = !isCamA;
                Debug.Log($"isCamA:{isCamA}");
            });
            WebRTC.Initialize(EncoderType.Software);
            renderTexture = new RenderTexture(1280, 720, 24, RenderTextureFormat.BGRA32, 0);
            var conf = new RTCConfiguration
            {
                iceServers = new[] { new RTCIceServer { urls = new[] { "stun:stun.l.google.com:19302" } } }
            };
            pc1 = new RTCPeerConnection(ref conf);
            pc2 = new RTCPeerConnection(ref conf);
            renderTextureVideoTrack = new VideoStreamTrack("render_texture", renderTexture);
            pc1.OnIceCandidate = candidate => pc2.AddIceCandidate(candidate);
            pc2.OnIceCandidate = candidate => pc1.AddIceCandidate(candidate);
            pc1.AddTrack(renderTextureVideoTrack, new MediaStream());
            pc2.OnTrack = ev =>
            {
                Debug.Log("ontrack");
                var track = ev.Track as VideoStreamTrack;
                receiveImage.texture = track.InitializeReceiver(1280, 720);
            };
            StartCoroutine(signalingProc());
            StartCoroutine(WebRTC.Update());
        }

        IEnumerator signalingProc()
        {
            var op1Offer = pc1.CreateOffer(ref offerOptions);
            yield return op1Offer;
            if (!op1Offer.IsError)
            {
                var offer = op1Offer.Desc;
                var op1SetOffer = pc1.SetLocalDescription(ref offer);
                yield return op1SetOffer;
                if (!op1SetOffer.IsError)
                {
                    var op2SetOffer = pc2.SetRemoteDescription(ref offer);
                    yield return op2SetOffer;
                    if (!op2SetOffer.IsError)
                    {
                        var op2Answer = pc2.CreateAnswer(ref answerOptions);
                        yield return op2Answer;
                        if (!op2Answer.IsError)
                        {
                            var answer = op2Answer.Desc;
                            var op2SetAnswer = pc2.SetLocalDescription(ref answer);
                            yield return op2SetAnswer;
                            if (!op2SetAnswer.IsError)
                            {
                                var op1SetAnswer = pc1.SetRemoteDescription(ref answer);
                                yield return op1SetAnswer;
                                if (op1SetAnswer.IsError)
                                {
                                    Debug.LogError($"op1SetAnswer error:{op1SetAnswer.Error.message}");
                                }
                            }
                            else
                            {
                                Debug.LogError($"op2SetAnswer error:{op2SetAnswer.Error.message}");
                            }
                        }
                        else
                        {
                            Debug.LogError($"op2Answer error: {op2Answer.Error.message}");
                        }
                    }
                    else
                    {
                        Debug.LogError($"op2SetOffer error: {op2SetOffer.Error.message}");
                    }
                }
                else
                {
                    Debug.LogError($"op1SetOffer error: {op1SetOffer.Error.message}");
                }
            }
            else
            {
                Debug.LogError($"op1Offer error: {op1Offer.Error.message}");
            }
        }

        private void OnDestroy()
        {
            renderTextureVideoTrack.Dispose();
            renderTextureVideoTrack = null;
            pc1.Close();
            pc2.Close();
            pc1.Dispose();
            pc2.Dispose();
            pc1 = null;
            pc2 = null;
            receiveImage.texture = null;
        }

        void Update()
        {
            cube.transform.Rotate(1, 2, 3);
            if (isCamA)
                Graphics.Blit(camA.targetTexture, renderTexture);
            else
                Graphics.Blit(camB.targetTexture, renderTexture);
        }
    }
     
    Last edited: Mar 12, 2021
  8. Chismer

    Chismer

    Joined:
    May 6, 2016
    Posts:
    4
    Hello! I am testing the URP example from version 3.0 with Unity 2019.4.5, and when I connect the camera it loses the layers and only the sky is visible. The non-URP example behaves well, however. What am I doing wrong?

     
    Last edited: Mar 13, 2021
  9. mspinluca

    mspinluca

    Joined:
    Feb 7, 2021
    Posts:
    2
    If you enable the depth buffer on the camera streamer it will work correctly.
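    In other words, the fix is in the depth argument when creating the RenderTexture used as the streaming camera's target. A sketch (the factory name is mine; the 24-bit depth matches the sample code elsewhere in this thread):

    ```csharp
    using UnityEngine;

    public static class StreamTextureFactory
    {
        // A target RenderTexture with no depth buffer can cause a URP camera
        // to lose depth-tested geometry, leaving only the skybox visible.
        // Allocating depth bits (e.g. 24) avoids that.
        public static RenderTexture Create(int width, int height)
        {
            return new RenderTexture(width, height, 24, RenderTextureFormat.BGRA32, 0);
        }
    }
    ```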

    Unrelated, but does anyone know the purpose of the render texture blitter component in the Main Camera in the Broadcast scene?
     
  10. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    When the main camera's target texture is set, the camera no longer renders to the screen, and the screen turns black. I think that component is there to render the texture back to the screen, so the screen does not turn black even when a target texture is set.
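    One plausible implementation of such a blitter (a sketch, not the package's actual component): copy the camera's target texture to the back buffer at the end of each frame.

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Keeps the screen visible even though the camera renders into a targetTexture.
    [RequireComponent(typeof(Camera))]
    public class RenderTextureBlitter : MonoBehaviour
    {
        private Camera cam;

        IEnumerator Start()
        {
            cam = GetComponent<Camera>();
            var wait = new WaitForEndOfFrame();
            while (true)
            {
                yield return wait;
                // Blitting with a null destination targets the back buffer (the screen).
                if (cam.targetTexture != null)
                    Graphics.Blit(cam.targetTexture, (RenderTexture)null);
            }
        }
    }
    ```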
     
  11. DuckStock

    DuckStock

    Joined:
    Jul 25, 2016
    Posts:
    9
    Is this functionality in the works? I'm a little confused. Or is there a known workaround, such as the ability to switch codecs? It appears to work fine in the web client... Apologies for my ignorance. Not being able to access hardware performance gains looks like it will be a blocker for the project we're working on.
     
  12. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Supporting an H.264 hardware decoder is in our plans, but I cannot say for sure.
     
  13. DuckStock

    DuckStock

    Joined:
    Jul 25, 2016
    Posts:
    9
    Thank you as always :).

    I also just came across this readme in the unity webrtc repo https://github.com/Unity-Technologi...4e73630f89b2/Documentation~/videostreaming.md That explains the limitation in a little more detail.

    Currently, HardwareDecoder is not supported.

    We plan about HardwareDecoder support in 2.x release.
    The major browsers that support WebRTC can use H.264 and VP8,
    which means most browsers can receive video streaming from Unity.

    Currently, only SoftwareDecoder can be used,
    in which case VP8 or VP9 can be used as a codec.

    Which explains why it's working in browser. I am building a client that runs on an ios device. The app is dependent on arkit, so a standalone client will be essential. I guess maybe I could try rendering to a browser element inside of unity, instead of a render texture? This seems kind of hacky, but maybe possible?

    On the off chance that anyone has solved a similar problem (decoding the H.264 render stream in a Unity iOS build), I would love some tips!
     
  14. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You mean using WebView on Unity for the video receiver, right?
    I have never tried that but it might be possible.

     
  15. mmacmullin

    mmacmullin

    Joined:
    May 1, 2018
    Posts:
    1
    Is there a way to set the signaling url at run time?
     
  16. petak_core

    petak_core

    Joined:
    Nov 19, 2012
    Posts:
    57
    Hello Kazuki,

    I'm able to run the Unity Render Streaming examples - Broadcast + Receiver and WebBrowserInput.
    I'm using the latest version of URS (UnityRenderStreaming), 3.0.1.

    I would like to ask whether it is possible to send "signals" back to Unity with Unity GUI, or do I have to use HTML/JS buttons in the web browser?

    When I tried your modification for EventSystem, for enabling UGUI:
    https://forum.unity.com/threads/unity-render-streaming-introduction-faq.742481/page-10#post-6492607

    I'm able to click a Unity button, but when I take focus off the web browser (switch to another app) and then focus the web browser again, I'm not able to click that button again.

    Thank you for any advice on how to enable UGUI in Render Streaming; my goal is to use UGUI to send signals back to the broadcaster.
     
  17. HenryBech

    HenryBech

    Joined:
    Oct 7, 2020
    Posts:
    4
    Is there something like a timeline or can you tell if it will be available before fall 2021? I am currently looking for a solution to stream from Android and UnityRenderStreaming looks great.
     
  18. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The RenderStreaming class provides a Run method. Does this method meet your needs?
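    A rough sketch of using it for a runtime-chosen signaling URL (an assumption, not verified against the package: I'm guessing that Run accepts an ISignaling instance and that WebSocketSignaling takes a URL, an interval, and a SynchronizationContext; check the package's API reference, since the signatures vary between versions):

    ```csharp
    using Unity.RenderStreaming;
    using Unity.RenderStreaming.Signaling;
    using UnityEngine;

    public class RuntimeSignaling : MonoBehaviour
    {
        [SerializeField] private RenderStreaming renderStreaming;

        // Hypothetical: start streaming against a URL chosen at runtime
        // instead of the one configured in the inspector.
        public void Connect(string url)
        {
            var signaling = new WebSocketSignaling(
                url, 5f, System.Threading.SynchronizationContext.Current);
            renderStreaming.Run(signaling: signaling);
        }
    }
    ```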
     
    mmacmullin and Asraas like this.
  19. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Asraas likes this.
  20. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We are just working on the Android platform support.
    It will be released in April.
     
    Asraas and HenryBech like this.
  21. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Mmm, I have never had this issue, I will check on it.
     
    petak_core and Asraas like this.
  22. Asraas

    Asraas

    Joined:
    Aug 14, 2018
    Posts:
    5
    First of all, thanks for the great work @kazuki_unity729
    Second: is it possible to build a normal AR app (so the iPad can see the environment with its camera) while a host Windows client edits the environment and moves some objects, for example?
     
    UnityLoverr likes this.
  23. Asraas

    Asraas

    Joined:
    Aug 14, 2018
    Posts:
    5
    (In the current AR sample it's not possible to see the iPad camera view... am I right?)
     
    UnityLoverr likes this.
  24. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Yes, that's right.
    This sample demonstrates controlling the translation and rotation of a camera on a PC from an iOS device.
     
    UnityLoverr and Asraas like this.
  25. Asraas

    Asraas

    Joined:
    Aug 14, 2018
    Posts:
    5
    Thanks for the answer, but would it be possible? And how?
     
    UnityLoverr likes this.
  26. ooxxno1

    ooxxno1

    Joined:
    Nov 9, 2020
    Posts:
    2
    I started using Render Streaming version 1.0; the picture is very clear and very smooth, but only three programs can run concurrently. When opening a fourth program, it prompts an initialization failure (even with two RTX 3080 graphics cards, I can only open three programs), and it cannot display 4K. The latest version, 3.0, solves those two problems, but the picture is far less smooth than version 1.0: even with only one program open there is still stutter when the camera moves, and the latency is higher than in version 1.0.
     
    UnityLoverr likes this.
  27. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I guess your issue is the same as this one.
    We will add an API for updating the SDP manually in the near future.
    https://github.com/Unity-Technologies/UnityRenderStreaming/issues/450
     
    UnityLoverr likes this.
  28. ooxxno1

    ooxxno1

    Joined:
    Nov 9, 2020
    Posts:
    2
    Is there any solution at present? Or, in Render Streaming version 1.0, how can I solve the problem that the same computer can only run three programs? I used two graphics cards, with each program connected to a server, and still can't open a fourth program; I don't know what is limiting WebRTC initialization for me.
     
    UnityLoverr likes this.
  29. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    Last edited: Apr 2, 2021
    UnityLoverr likes this.
  30. kannan-xiao4

    kannan-xiao4

    Unity Technologies

    Joined:
    Nov 5, 2020
    Posts:
    76
    Hi, @_petak_ .


    I checked this issue, but I can't reproduce it.

    Can you share more details about your environment?
    (the scene you're using, the host machine OS, the browser and its OS...)
     
    UnityLoverr likes this.
  31. UnityLoverr

    UnityLoverr

    Joined:
    Mar 29, 2021
    Posts:
    10
    A short question for those who have already successfully run the Unity Render Streaming samples on an AWS EC2 server: do you have to change any other settings? I start the web server and the Broadcast scene on the AWS server, but cannot get a connection from my Receiver scenes with the correct server IP. Tips and experience would be very helpful, thank you.
     
  32. Valiner

    Valiner

    Joined:
    Nov 20, 2018
    Posts:
    1
    Hello, I have a question: is it possible to build an AR app in Unity and stream it to the browser using the Render Streaming tool?
     
    UnityLoverr likes this.
  33. KimZigbang

    KimZigbang

    Joined:
    Apr 24, 2020
    Posts:
    9
    Hello, I'm using the WebRTC 2.3.3-preview package. I succeeded in sending my webcam data via a video stream. Is there any way to send microphone audio data via an audio stream? I'm testing the WebRTC package as a video conference solution.
     
  34. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It should be possible.
    Launching a TURN server might be needed.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.0/manual/turnserver.html
     
    UnityLoverr likes this.
  35. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    As far as I know, Unity does not support AR features on the Web platform.
    Unity uses ARKit and ARCore to provide AR features.

    I am not sure it will work, but this looks like a useful tool for your needs.
    https://github.com/google-ar/WebARonARKit
     
    UnityLoverr likes this.
  36. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Unfortunately, native audio stream rendering is not supported yet.
     
  37. petak_core

    petak_core

    Joined:
    Nov 19, 2012
    Posts:
    57
    Hi @kannan-xiao4

    I tried creating my own Unity UI Button input, and now it works. Sorry for my mistake!
     
    kannan-xiao4 likes this.
  38. petak_core

    petak_core

    Joined:
    Nov 19, 2012
    Posts:
    57
    Hello guys,

    I have an issue with WebBrowserInput: I would like to remove the top-left camera preview in the web browser.

    In the RenderStreaming GameObject, I removed Element3 and Element4 from the Broadcast component; then I see only a black screen in my web browser with the three basic buttons (Light On / Light Off / Play Audio).

    Do you have any tips on how to disable/remove this top-left camera stream from the WebBrowserInput example?
    Can it be done on the Unity side, or do I have to modify the web/JS side?

    Thank you for any advice.
     
  39. Asraas

    Asraas

    Joined:
    Aug 14, 2018
    Posts:
    5
    How can I use Unity UI buttons in the Broadcast/Receiver scenes? With the WebBrowserInput scene I'm able to interact with the Unity UI buttons through the remote canvas from the browser, but in the Receiver I'm not able to use the UI buttons from the Broadcast. Where is my mistake?
     
    Last edited: Apr 13, 2021
  40. UnityLoverr

    UnityLoverr

    Joined:
    Mar 29, 2021
    Posts:
    10
    Does anybody else have the problem that running over http it won't work, but over ws it does? Am I missing something important?
     
    Last edited: Apr 14, 2021
  41. UnityLoverr

    UnityLoverr

    Joined:
    Mar 29, 2021
    Posts:
    10
    I had permission problems pinging IPs other than my own, but I was able to get pinged...
     
  42. GameeDev

    GameeDev

    Joined:
    Apr 15, 2021
    Posts:
    3
    Hey guys, has anybody tried to use AR object tracking via VisionLib or other AR libraries with the Unity Render Streaming package? :)
     
  43. UnityLoverr

    UnityLoverr

    Joined:
    Mar 29, 2021
    Posts:
    10
    If I understand correctly, this is how it works: when the web server is started you get the overview page, and when you click the "WebBrowserInput" or "Bidirectional" link, the Unity app (or rather the packages) only takes effect then, and only then is the ICE protocol used.
    In my case, I can't even reach the web server home page. So TURN won't help, I guess?
     
  44. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You need to edit Unity and JS both.
    This might be helpful to customize the web app.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.0/manual/customize_webapp.html
     
  45. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  46. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  47. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  48. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803

    Oh, I see. You cannot access the web page.
    I need more info to help you; I would like to check your ping first.
     
  49. GameeDev

    GameeDev

    Joined:
    Apr 15, 2021
    Posts:
    3
  50. UnityLoverr

    UnityLoverr

    Joined:
    Mar 29, 2021
    Posts:
    10
    I'm sorry, but you can't ping the AWS instance, because the instance doesn't run overnight (my night) because of $$$. And, as I can see from our response times, we are in different time zones...

    I downloaded the web server from git. In cmd I navigate to webserver.exe and start it (I tried -w and different ports with -p); yes, I checked via netstat -ano that the port is available, and the instance's security groups are set up. I can connect locally, but I can't access the web server from outside my network. When I start the web server, the command line says it is running on my private IP address. Should it automatically be available on my public IP? I haven't found a way to change the IP in the web server scripts.

    I'm really sorry if I missed something important.