
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The one camera in the sample scene is for displaying the picture in the Game View.
    By Unity's design, a camera with a Target Texture assigned doesn't render to the display.
     
  2. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    @kazuki_unity729
    Would it be possible to interact with screen space overlay UI elements like buttons and sliders (or maybe camera space UI)?
    I'd really appreciate an example if so.

    This would help immensely for non-coders designing a GUI over the remotely rendered screen. I've tried with the RenderStreaming example, but screen space UI is not transmitted to the remote render camera (see attached WebRTC_UI_Canvas.JPG).
     
    SenaJP likes this.
  3. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    I am on version 1.1.2 and looking forward to the next release. I am sure you have heard this question too many times, but is the 2.0 release coming soon?

    Thank you.
     
  4. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We will release the latest version this week.
    We have already published the development version (2.0.0-preview.x), which you can download from the Package Manager.
     
    kayy likes this.
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Currently, it is not possible to interact with uGUI because the new Input System doesn't support uGUI interaction without screen focus.
    We have already asked the team to fix it, but I am not sure when it will be fixed.
     
    LostPanda and Dirrogate like this.
  6. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    As mentioned before, HDRP is not a required part of adding WebRTC support to your application, but a non-HDRP texture blitter then needs to be implemented. I took a pass at it below. OnRenderImage seems to be the closest event to HDRP's customRender event. Any feedback on this would be welcome. Is there another version of this I am not aware of (maybe coming with this week's release)?

    Code (CSharp):
    [RequireComponent(typeof(Camera))]
    public class LegacyRenderTextureBlitter : MonoBehaviour
    {
        [SerializeField] Camera m_rtCamera = null;

        Camera m_cam;
        bool _enableBlit = false;

        private void OnEnable()
        {
            m_cam = GetComponent<Camera>();

            // This camera renders nothing itself; it exists only to blit.
            m_cam.clearFlags = CameraClearFlags.Nothing;
            m_cam.cullingMask = 0;
            _enableBlit = true;
        }

        private void OnDisable()
        {
            _enableBlit = false;
        }

        private void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            if (_enableBlit)
            {
                // Copy the streaming camera's target texture to the backbuffer.
                Graphics.Blit(m_rtCamera.targetTexture, (RenderTexture)null);
            }
        }
    }
     
    gdbbv likes this.
  7. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    With the latest release (com.unity.webrtc@2.0.0-preview.3) I still get the crash on the second run in the editor. Here is the information I have collected from the logs and crash dump files. If you want the actual logs / dmp file, please reach out. I am walking the RenderStreaming.cs implementation to see how far it gets before the crash. Keep up the good fight. :)

    Bottom of the Editor Log:

    ========== OUTPUTTING STACK TRACE ==================
    0x00007FFF932946D5 (webrtc) UnityPluginUnload
    0x00007FF7EF31952E (Unity) GfxDeviceWorker::RunCommand
    0x00007FF7EF32131B (Unity) GfxDeviceWorker::RunExt
    0x00007FF7EF3216D8 (Unity) GfxDeviceWorker::RunGfxDeviceWorker
    0x00007FF7F01ABA83 (Unity) Thread::RunThreadWrapper
    0x00007FF80D497BD4 (KERNEL32) BaseThreadInitThunk
    0x00007FF80DFACED1 (ntdll) RtlUserThreadStart
    ========== END OF STACKTRACE ===========

    From the error.log file:

    webrtc.dll caused an Access Violation (0xc0000005)
    in module webrtc.dll at 0033:932946d5.

    Stack Trace of Crashed Thread 18244:
    0x00007FFF932946D5 (webrtc) UnityPluginUnload
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF7EF31952E)
    0x00007FF7EF31952E (Unity) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF7EF32131B)
    0x00007FF7EF32131B (Unity) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF7EF3216D8)
    0x00007FF7EF3216D8 (Unity) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF7F01ABA83)
    0x00007FF7F01ABA83 (Unity) (function-name not available)
    0x00007FF80D497BD4 (KERNEL32) BaseThreadInitThunk
    0x00007FF80DFACED1 (ntdll) RtlUserThreadStart
     
  8. Tobs-

    Tobs-

    Joined:
    Feb 12, 2016
    Posts:
    17
    Hi,
    I have one more question: is it possible to switch the captured stream from one camera to another? I have multiple cameras in my scene and want to switch between them while the connected user stays the same. How would I accomplish that?
     
  9. fffMalzbier

    fffMalzbier

    Joined:
    Jun 14, 2011
    Posts:
    3,276
    Is there a way to host it on a server without a display attached? Every attempt to run it on a Google GPU server (Windows Server with a GRID GPU) just crashes the application. Running the application with -batchmode works only until someone connects.
     
    kayy likes this.
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    It looks great!
    As you said, it seems some developers still need legacy render pipeline support.
    OK, we will add a sample for the legacy pipeline in the near future.
     
  11. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Please attach the player log.
    See the manual if you want to know where the log file is located.
    https://docs.unity3d.com/Manual/LogFiles.html
     
  12. fffMalzbier

    fffMalzbier

    Joined:
    Jun 14, 2011
    Posts:
    3,276
    I currently do not have access to the server machine anymore.
    But I get the same behavior with "-batchmode" on my local machine; I attached the log file.
     

    Attached Files:

  13. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    According to this manual, the "-batchmode" argument is used for switching to "headless mode".
    https://docs.unity3d.com/Manual/CommandLineArguments.html

    Maybe this parameter does not work here.
    In the past I posted about "headless mode"; please see the post.
    https://forum.unity.com/threads/unity-render-streaming-introduction-faq.742481/page-3#post-5340756
     
  14. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    kayy likes this.
  15. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
    Thank you! So the latest versions of the packages are OK to use now? They are 2.0.0-preview.4 from April 22 for Render Streaming and 2.0.0-preview.3 from April 20 for WebRTC, or will a new version from today, April 30, show up in the Package Manager soon?
     
  16. steffanPL

    steffanPL

    Joined:
    Oct 9, 2014
    Posts:
    40
    Great job so far, really useful tool!
    Do you already support broadcasting from iOS?
    If not, is it planned already? What would you recommend as an alternative solution for now?
     
  17. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I had misunderstood the version naming rule in the Package Manager.

    As you said, "2.0.0-preview" from April 30 is the latest version.
    I will fix it immediately so that the latest version appears at the top of the list.
     
  18. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am sorry, but the package doesn't support the iOS platform.
    The release date hasn't been decided.
     
  19. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    Hey there, I tried out RenderStreaming 2.0.1 today.
    I can get it to stream to local Chrome, but when I try connecting from Android Chrome, I get an exception in Unity:
    Code (CSharp):
    Network Error: Unity.WebRTC.RTCError
    UnityEngine.Debug:LogError(Object)
    Unity.RenderStreaming.RenderStreaming:OnOffer(ISignaling, DescData) (at Assets/Scripts/RenderStreaming.cs:191)
    Unity.RenderStreaming.Signaling.HttpSignaling:HTTPGetOffers() (at Assets/Scripts/Signaling/HttpSignaling.cs:251)
    Unity.RenderStreaming.Signaling.HttpSignaling:HTTPPooling() (at Assets/Scripts/Signaling/HttpSignaling.cs:98)
    System.Threading.ThreadHelper:ThreadStart()
    Others (e.g. on GitHub here) suggested that this might be a firewall issue, but disabling the firewall did not resolve the issue. I tried connecting from multiple different devices, and always get that exception.
    I did put the URL into the "Url Signaling" field.

    I tried:
    - Safari on latest iPad Pro
    - Chrome on latest Android
    - Chrome on Quest

    Same exception everywhere. Any ideas?
     
    Last edited: May 4, 2020
  20. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    Hi @kazuki_unity729
    I know this is probably beyond the scope of support for this asset, but I would be grateful if you (or anyone on this thread) could show me how to get a slider set up for interaction.
    I tried my own code, but it just sends a single value to the Button click event array on the Render Streaming script.

    This is what I am using in the WebApp folder (public/scripts/app.js) to try to create a slider that changes the focal length on the Camera component:

    Code (JavaScript):
    // add SLIDER instead of the Blue button
    var elementBlueButton = document.createElement('input');
    elementBlueButton.id = "blueButton";
    elementBlueButton.type = 'range';
    elementBlueButton.min = 10;
    elementBlueButton.max = 150;
    elementBlueButton.value = 45;
    elementBlueButton.step = 1;
    elementBlueButton.innerHTML = "Buttn 1";
    document.body.appendChild(elementBlueButton);
    elementBlueButton.addEventListener("click", function() {
      sendClickEvent(videoPlayer, 1);
    });
    Any help would be appreciated.
    Kind Regards
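A likely reason the code above only ever sends a single value is that the listener fires on "click" and always passes the constant 1. A minimal sketch of an alternative, assuming the sample's sendClickEvent can carry an arbitrary integer (not verified against the package), would forward the slider's current value on every "input" event:

```javascript
// Hypothetical helper: send a range input's current value through the
// sample's sendClickEvent channel instead of a constant.
function attachSlider(slider, videoPlayer, sendClickEvent) {
  slider.addEventListener("input", function () {
    // "input" fires on every change while dragging the slider.
    sendClickEvent(videoPlayer, parseInt(slider.value, 10));
  });
}
```

The Unity-side handler registered for clicks would then receive the focal length value itself rather than a fixed button index.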
     
    kazuki_unity729 likes this.
  21. Ajaydharan

    Ajaydharan

    Joined:
    May 16, 2017
    Posts:
    7
    HI @kazuki_unity729

    I am trying to use Unity Render Streaming WebRTC and Microsoft WebRTC in two different Unity applications. Since Unity Render Streaming doesn't support receiving remote streams, I used the Microsoft WebRTC package in my Unity application to render them. I used the signaling server webapp from Unity Render Streaming and am able to connect properly.

    The Unity application that uses MS WebRTC acts as a remote client. When I issue an "offer" request from my client, the Unity Render Streaming application fails to set the local description when trying to post the answer for the received offer. I only get "RTCError" when setLocalDescription is called, so I don't have enough error information to proceed. I used the sample scene in the Unity Render Streaming application.

    Do you have any insight on this issue? I am able to open it in browsers properly, but not in other clients. Any help here would be much appreciated.
     
  22. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    Is anyone else seeing inconsistency with the latest 2.0.1 version of the webserver? Some applications work just fine; others only work on the first connect (then I need to restart the webserver) or won't stream video at all, even though I see the connection in the console. Using an older version of the webserver works every time. :) Any info on how to debug this would also help. Thank you.
     
    gdbbv likes this.
  23. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Have you tried the sample?
    You can see how to use the samples in the docs.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@2.0/manual/en/tutorial.html
     
  24. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Unfortunately, we have never tested that combination.
    I can only suggest that you compare the SDPs generated by the MS package and by browsers.
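For that comparison, one quick check is which codecs each side advertises in its SDP. A small helper (my own sketch, not part of the package) that scans the a=rtpmap: lines:

```javascript
// List codec names advertised in an SDP string by scanning "a=rtpmap:" lines.
function listCodecs(sdp) {
  const codecs = [];
  for (const line of sdp.split(/\r?\n/)) {
    const match = line.match(/^a=rtpmap:\d+ ([A-Za-z0-9._-]+)\//);
    if (match) codecs.push(match[1]);
  }
  return codecs;
}
```

If one side's SDP lists H264 and the other side's doesn't (or vice versa), negotiation fails with only a generic error, which matches the symptom described above.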
     
  25. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
    Also, the mouse event tracking and feedback of the app in version 2.0.1 is much more snaggy / delayed, with big gaps in frames, compared to the 1.2.3 version with our project. It could be so many things (I know), but it is definitely consistent and noticeably less smooth an interaction under the same network and hardware conditions. Any ideas from the team on where to look for possible causes?
     
  26. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
  27. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I guess the cause of the delay is that the number of streams has increased compared to before.
    Could you try it after changing the number of streams to one?
     
  28. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I found that the Unity command-line arguments -screen-width and -screen-height don't work on Unity 2019.2 or earlier; they work correctly on Unity 2019.3.
    Moreover, you may need to add the argument -screen-fullscreen 0 in some cases.

    Please try again with arguments below.

    Code (CSharp):
    -screen-fullscreen 0 -screen-width 1280 -screen-height 720
    I posted a comment on this issue.
    https://github.com/Unity-Technologies/UnityRenderStreaming/issues/223#issuecomment-610203248
     
  29. Ajaydharan

    Ajaydharan

    Joined:
    May 16, 2017
    Posts:
    7
    Thanks @kazuki_unity729

    I was able to identify the cause: the SDP message sent from Microsoft Mixed Reality doesn't contain the H264 video RTP type. Is it mandatory for Unity Render Streaming to always use the H264 codec? As an alternative, can it work with the VP8 or VP9 codecs?

    Also, is it possible to get the list of SDP values required by Unity Render Streaming? Since the error we received is so generic, we couldn't debug it easily. This info would help us proceed further.
     
  30. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    If what you are saying is correct, I believe you need to use the VP8 codec.
    Our WebRTC package can use the VP8 codec when you select the software encoder instead of the hardware encoder.
     
    Ajaydharan likes this.
  31. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
    Thank you for the suggestion. I suspected that the two streams might be the problem. In our project, after updating the packages to 2.0.1, our scene only has a single camera with CameraStreamer.cs attached to it, not two. However, having a single camera stream "breaks" the 2.0.1 web app at the moment, so I had to change the web app code and recompile it to make it work again, changing line 40 in video_player.js to

    this.maxVideoTrackLength = 1;

    Otherwise it wasn't displaying any streams, as it is hard-coded to expect two streams coming from Unity. I understand this is a preview package, but it would have been nice for the updated 2.0.1 webapp to work without additional changes in the single streaming camera case as well as the multiple camera case.

    Is there somewhere else in the code (especially on the Unity side) where two streams are assumed (even if only one camera has CameraStreamer.cs attached), which would make the system "work harder" than in 1.2.3? So far, with the single camera and the change in video_player.js, we still get the snagginess/delay. The snagginess/delay is also there using the old 1.2.3 version of webserver.exe (which still works without any changes) with the 2.0.1 packages and the new Unity example scripts. This tells me something new on the Unity side, not the web app side, is causing the issue.
     
    polyverseme likes this.
  32. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    @kazuki_unity729 any more feedback about what I wrote above? The samples don't seem to work for both me and a number of other people due to that error. What Unity + URP version has this been tested against? Is there anything that needs to be configured besides what's mentioned in the samples? The goal is a simple single stream from desktop to Android.
     
  33. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We are investigating the issue you reported but have not been able to reproduce it.
    Could you tell me the browser version?

    We tested the combination below.
    Browser: Android Chrome 81.0.4044.138
    OS: Android 8
     
    Last edited: May 11, 2020
  34. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    I also replied on GitHub with the exact details you asked for (I'm soraryu there); here they are for completeness:
    Android 10 QkQ1.190825.002
    Chrome 81.0.4044.138
     
  35. sinaari

    sinaari

    Joined:
    Jan 28, 2019
    Posts:
    47
    Hi! I am following the instructions in the manual:
    1. Launching webserver.exe and taking note of one of the displayed addresses
    2. Creating a Render Streaming object in the scene and setting the address there
    3. Adding a Camera Streamer class to the camera
    and then when I launch the application, I am getting the following error:

    Code (CSharp):
    Signaling: HTTP request error System.Net.WebException: The remote server returned an error: (405) Method Not Allowed.
      at System.Net.HttpWebRequest.EndGetResponse (System.IAsyncResult asyncResult) [0x00058] in <ae22a4e8f83c41d69684ae7f557133d9>:0
      at System.Net.HttpWebRequest.GetResponse () [0x0000e] in <ae22a4e8f83c41d69684ae7f557133d9>:0
      at Unity.RenderStreaming.Signaling.HttpSignaling.HTTPGetResponse (System.Net.HttpWebRequest request) [0x0003c] in D:\path-to-the-project\Assets\Scripts\Signaling\HttpSignaling.cs:119
    UnityEngine.Debug:LogError(Object)
    Unity.RenderStreaming.Signaling.HttpSignaling:HTTPGetResponse(HttpWebRequest) (at Assets/Scripts/Signaling/HttpSignaling.cs:133)
    Unity.RenderStreaming.Signaling.HttpSignaling:HTTPDelete() (at Assets/Scripts/Signaling/HttpSignaling.cs:208)
    Unity.RenderStreaming.Signaling.HttpSignaling:HTTPPooling() (at Assets/Scripts/Signaling/HttpSignaling.cs:109)
    System.Threading.ThreadHelper:ThreadStart()
    I've managed to print out some info about the request that's causing this problem; it's this:

    Code (CSharp):
    request = [PUT] http://127.0.0.1/signaling, host = 127.0.0.1, headers = Content-Type: application/json
    I have no idea what the cause might be, since it looks like I'm doing what's written in the manual and shown in the video. Do you have any guess what might be wrong here?
     
  36. Maktech

    Maktech

    Joined:
    Jan 15, 2015
    Posts:
    31
    I've been fighting hard trying to convert an existing application to support WebRTC and have been running into multiple issues. I am aware the primary application needs focus to accept any input into uGUI. I have been running my application and the WebApp on an AWS VM, with my local machine connecting to the application over WebRTC.

    Here are the issues I have run into:
    1. To get the mouse position in the local browser and the mouse position in the application to line up correctly, I converted the webserver to pass the mouse position as a screen percentage, using the short max value as 100%. "ProcessMouseEvent" converts that percentage based on the current screen size. I have a dot following the mouse position, and they line up correctly. I am assuming this is a viable solution; is there anything I am missing with it?
    2. When running the application with render streaming:
      1. On my local machine I get inconsistent behavior when attempting to highlight or click buttons. Only one button I have created highlights and clicks as expected. The rest only seem to show highlighting when I move the mouse very quickly and happen to time the click just right.
      2. When streaming, I get no button highlighting or button interaction.
      3. If I remove all the GraphicRaycaster objects in the editor and replace them while running, all the buttons work. I have attempted to do that programmatically on scene load; no luck.
    Beyond frustrating. I know it's early days and we are working with multiple new technologies (WebRTC, the new Input System). There is a ton of potential here. :)
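The coordinate scheme in point 1 can be sketched as a pair of pure functions (the names are mine; the real change lives in the modified webserver and "ProcessMouseEvent"). This assumes the short max value stands for 100% of a screen dimension:

```javascript
// Encode a browser mouse coordinate as a fraction of the video size in
// the 16-bit short range; decode it against the app's own resolution.
const SHORT_MAX = 32767;

function encodeCoord(pixel, browserSize) {
  return Math.round((pixel / browserSize) * SHORT_MAX);
}

function decodeCoord(encoded, appSize) {
  return Math.round((encoded / SHORT_MAX) * appSize);
}
```

A point halfway across a 1280-px video decodes to halfway across a 1920-px application window, so the cursor lines up regardless of the resolution mismatch.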
     
  37. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    @kazuki_unity729 is it currently possible to show an integration of the gyro and accelerometer as "mouse" movement? This would be a much-needed way to "look around a scene" being streamed by the Unity WebRTC and Render Streaming packages.
    https://developers.google.com/web/updates/2017/09/sensors-for-the-web

    It's not only useful for car configurator / exhibition / real estate demos; for me, it gives a way to pan a camera around for virtual filmmaking.
     
  38. gdbbv

    gdbbv

    Joined:
    Apr 20, 2018
    Posts:
    11
    Sorry for your frustration; it is TOTALLY justified and you are not the only one. We have all gone through it... Unity's new Input System doesn't respond without app focus. See the issue and some answers / partial solutions described in this post on the Unity Render Streaming forum:

    https://forum.unity.com/threads/unity-render-streaming-introduction-faq.742481/page-2#post-5096528

    You need to create a local version of the Unity Input System package and apply those changes to the files of the package, and also find some other places that deal with focus. I found that, in addition to the files mentioned in the post above, a change to InputManager.cs was needed:
    Code (CSharp):
    private bool gameIsPlayingAndHasFocus => true;
    With these changes in place, converting all UI events to the new Input System, and specifying Render Mode -> Screen Space - Camera for all Canvas objects, you should be able to get something going. We have, but it has been a battle, and the 2.0.1 release seems to have made things worse: the app is less responsive (more snaggy) than it was with 1.2.3 (see my post above).
     
    Last edited: May 12, 2020
  39. dbillings

    dbillings

    Joined:
    Mar 3, 2020
    Posts:
    6
    Has anyone had success using this remotely, such as on AWS or otherwise? I'm trying to implement render streaming in a remote viewer, but so far I can't get it to work with anything other than the localhost IP. If anyone has had luck doing this, could they share what they did to make it work?
    Thanks!

    To clarify, my problem is that I'm able to access it via the public IP address, but the video is still grey and the buttons are nonfunctional.
     
    Last edited: May 12, 2020
  40. lopa421

    lopa421

    Joined:
    Mar 5, 2020
    Posts:
    8
    I keep getting the following error whenever I try to click Play in the Render Streaming example scene (using v2.0.1):
    (screenshot attached: upload_2020-5-13_9-14-59.png)
     
  41. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Could you send me the editor log?
     
  42. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I am not sure about your network environment, so I cannot advise precisely.
    Generally, a system deployed on a public network may need to use a TURN server.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@2.0/manual/en/turnserver.html
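For reference, on the browser side a TURN server is supplied through the standard RTCConfiguration; the hostname and credentials below are placeholders, not values from the package:

```javascript
// Placeholder ICE server list for an RTCPeerConnection configuration.
const config = {
  iceServers: [
    { urls: "stun:stun.l.google.com:19302" },
    {
      urls: "turn:turn.example.com:3478",
      username: "username",
      credential: "password"
    }
  ]
};
```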
     
  43. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thanks for your suggestion. I am a little surprised that there is demand for this product in the filmmaking industry.
    We have recently been discussing support for gyroscope input.
     
    Dirrogate likes this.
  44. Buguitus

    Buguitus

    Joined:
    Aug 23, 2019
    Posts:
    1
    Hi there. I've implemented my own signaling server with NodeJS & Socket.IO. Everything is good: I'm receiving video tracks in the browser (I'm streaming the camera as in the example). But how do I render the incoming MediaStreamTrack received via WebRTC onto a texture?

    I do receive the tracks in the OnTrack delegate of the RTCPeerConnection, but I can't seem to find a way to get to any decoded video data, such as a RenderTexture or a Texture of some sort.

    In other words, is this only useful for sending video tracks and displaying them in a browser (which I successfully did), and not the other way around?

    The track event only has access to the MediaStreamTrack, which in turn does not expose anything useful.
    If anyone knows how to at least render incoming video to a Texture, it would be super appreciated. I'll keep investigating.

    Thanks.
    Warm regards, Alan
     
  45. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Sorry for the late reply.
    The latest version of the package cannot use the software encoder as a fallback, because the encoder type cannot be changed per stream.
    We recognize the issue and will fix it in the future.
     
    kayy likes this.
  46. Dirrogate

    Dirrogate

    Joined:
    Feb 18, 2014
    Posts:
    157
    @kazuki_unity729 Yes, it's a very much needed feature, and if gyro input is being considered, I'd recommend implementing ARCore and ARKit (via the Unity XR framework) to give proper 6DoF input versus the 3DoF of a gyro.

    Sadly, my handicap is coding, or I would have already integrated these aspects that exist within Unity.
     
    kazuki_unity729 likes this.
  47. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    I want to do "Vanilla ICE" negotiation.
    Is there any way to know when the collection of ICE candidates has finished?
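In browser WebRTC (the Unity package's API surface may differ by version), the end of candidate gathering is signalled by iceGatheringState reaching "complete", or equivalently by onicecandidate firing with a null candidate. A small promise-based helper, sketched against that event model:

```javascript
// Resolve once the peer connection has finished gathering ICE candidates,
// so the complete SDP can be sent in a single (Vanilla ICE) message.
function waitForIceGathering(pc) {
  return new Promise(function (resolve) {
    if (pc.iceGatheringState === "complete") {
      resolve();
      return;
    }
    pc.addEventListener("icegatheringstatechange", function check() {
      if (pc.iceGatheringState === "complete") {
        pc.removeEventListener("icegatheringstatechange", check);
        resolve();
      }
    });
  });
}
```

After awaiting this, pc.localDescription.sdp already contains every gathered candidate.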
     
  48. salvolannister

    salvolannister

    Joined:
    Jan 3, 2019
    Posts:
    50
    I wanted to use this package to stream my VR application, so in my main scene I added a Render Streaming camera, the render script, and the URP Render Texture Blitter on my main camera. It does stream the scene, but I see all blue in the VR headset. How can I solve this? Does this plugin actually support VR streaming?
     
    Last edited: Sep 18, 2020
  49. lopa421

    lopa421

    Joined:
    Mar 5, 2020
    Posts:
    8
    I solved it by unchecking Hardware Encoder Support on the Render Streaming component.
     
    kazuki_unity729 likes this.
  50. ashwani_9

    ashwani_9

    Joined:
    Mar 31, 2019
    Posts:
    6
    I'm using both a "Screen Space - Camera" and a world space canvas, but it's still not able to render at the browser end.