Question Microsoft's MixedReality-WebRTC Library: Video Renderer Component doesn't have a Source Property

Discussion in 'AR' started by isaako, Oct 26, 2020.

  1. isaako

     Joined: Nov 25, 2016
     Posts: 15
    Hi Everyone,

     The Video Renderer component from Microsoft's MixedReality-WebRTC library doesn't have a Source field where I can specify where the renderer should take its video from. I want it to take the video from another component that represents the webcam, since I'm trying to make a basic WebRTC connection. I'm following the tutorial at https://microsoft.github.io/MixedReality-WebRTC/manual/unity/helloworld-unity-localvideo.html ; in that tutorial the Video Renderer component has a Source field, but mine doesn't.

    Any help would be greatly appreciated.
     
  2. dlebel_cimmi

     Joined: Oct 1, 2020
     Posts: 1
     I also ran into this issue. This changed in the newer MixedReality-WebRTC release, but the tutorial was not updated. Previously the source was specified on the Video Renderer, as described in the tutorial, but the renderer no longer has that Source attribute. Instead, you specify what happens when the source stream starts and/or stops from the source component itself.

     So go to the component that holds your WebcamSource. You should see a Video Stream Started list. Add one action to that list, select the GameObject that holds your VideoRenderer, and set VideoRenderer.StartRendering. Do the same for StopRendering in the Video Stream Stopped list. This solved the issue for me.
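
     If you prefer to wire this up from a script instead of the inspector, here is a minimal sketch of the same idea. It assumes the Video Stream Started / Video Stream Stopped lists are UnityEvents that pass an IVideoSource (which appears to be the case in the 2.0 Unity integration), so the renderer's StartRendering/StopRendering can be registered directly; adjust if your package version differs.

     Code (CSharp):
     using Microsoft.MixedReality.WebRTC.Unity;
     using UnityEngine;

     public class LocalVideoWiring : MonoBehaviour
     {
         // Assign both references in the inspector.
         public WebcamSource webcamSource;
         public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;

         private void Awake()
         {
             // Script equivalent of adding VideoRenderer.StartRendering /
             // StopRendering to the Video Stream Started / Stopped lists.
             webcamSource.VideoStreamStarted.AddListener(videoRenderer.StartRendering);
             webcamSource.VideoStreamStopped.AddListener(videoRenderer.StopRendering);
         }
     }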
     
  3. isaako

     Joined: Nov 25, 2016
     Posts: 15
     Thanks! I was able to add an action to the Video Stream Started and Video Stream Stopped lists in the WebcamSource. What I added were functions from a script, but they are empty because I don't know what to put inside them yet. My understanding is that these functions are invoked when the WebcamSource starts or stops, respectively; correct me if I'm wrong.

     Then I tried creating another script containing VideoRenderer.StartRendering, but I got errors. I also tried using something I saw in Microsoft's documentation, StartRendering(IVideoSource source), but I got errors again. I think I'm confused because I don't know what triggers what. For example, does VideoRenderer.StartRendering trigger the Video Stream Started event in the WebcamSource? Also, I don't know how to set up VideoRenderer.StartRendering; could you show part of your code, please?
     
  4. isaako

     Joined: Nov 25, 2016
     Posts: 15
     Ok, I made some changes and I think I'm getting closer. Below is my code, but I'm still getting an error in the part where I pass the source to the VideoRenderer.StartRendering() call. The error says: error CS1503: Argument 1: cannot convert from 'Microsoft.MixedReality.WebRTC.Unity.WebcamSource' to 'Microsoft.MixedReality.WebRTC.IVideoSource'. What is the IVideoSource type? What would be an example of something of that type?

    Code (CSharp):
     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;
     using Microsoft.MixedReality.WebRTC.Unity;
     using System.Diagnostics;

     public class WebCam_Script : MonoBehaviour
     {
         public GameObject LocalVid;

         // Start is called before the first frame update
         void Start()
         {
         }

         // Update is called once per frame
         public void Update()
         {
             if (Input.GetKeyDown("space"))
             {
                 print("space key was pressed");
                 StopWebCameo();
             }
         }

         public void StartWebCameo()
         {
             print("WebCam Started??");
             StartRendeo();
         }

         public void StartRendeo()
         {
             WebcamSource wcam = LocalVid.GetComponent<WebcamSource>();
             VideoRenderer.StartRendering(wcam);
         }

         public void StopWebCameo()
         {
             //WebcamSource wcam = GetComponent<WebcamSource>();
             //VideoRenderer.StopRendering(wcam);
         }
     }
     
  5. unity_H9rj9tkQr45s3Q

     Joined: Jul 28, 2020
     Posts: 1
    @isaako This might help.

     1. Create a new script with methods to start and stop rendering, passing the Source of your Microsoft.MixedReality.WebRTC.Unity.WebcamSource to the renderer.

    Code (CSharp):
     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;

     public class MyScript : MonoBehaviour
     {
         public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;
         public Microsoft.MixedReality.WebRTC.Unity.WebcamSource webcamsource;

         public void startVideoStream()
         {
             videoRenderer.StartRendering(webcamsource.Source);
         }

         public void stopVideoStream()
         {
             videoRenderer.StopRendering(webcamsource.Source);
         }
     }
     2. Attach this new script to some GameObject and assign your VideoRenderer and WebcamSource as references.

    upload_2020-12-2_19-2-42.png

     3. Go to your WebcamSource component's Video Stream Started/Stopped lists, add your script's GameObject, and assign the respective methods.

    upload_2020-12-2_19-4-2.png

     This should give no errors and should make the tutorial work.
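
     (A side note on the earlier IVideoSource question, as far as I understand it: the Unity components are not video sources themselves, they wrap one. WebcamSource exposes that wrapped object through its Source property, and that wrapped object is what StartRendering expects, which is why passing the component itself produced the CS1503 error above. A tiny sketch, assuming Source implements IVideoSource, which is what the working call above relies on:)

     Code (CSharp):
     using Microsoft.MixedReality.WebRTC;
     using UnityEngine;

     public class SourceTypeExample : MonoBehaviour
     {
         public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;
         public Microsoft.MixedReality.WebRTC.Unity.WebcamSource webcamSource;

         public void RenderLocal()
         {
             // The WebcamSource component is not an IVideoSource itself;
             // the object it wraps and exposes via Source is.
             IVideoSource source = webcamSource.Source;
             videoRenderer.StartRendering(source);
         }
     }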
     
  6. isaako

     Joined: Nov 25, 2016
     Posts: 15
     Thanks @unity_H9rj9tkQr45s3Q, that worked perfectly for the local peer. Then I worked on the remote peer and tried to establish the WebRTC connection, but the remote video was not displayed (the WebRTC connection seems fine; the offer and answer messages look good), so I wonder if my remote peer setup is wrong. I did pretty much the same as I did for the local peer. First I created this receiver script:

    Code (CSharp):
     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;
     using Microsoft.MixedReality.WebRTC.Unity;
     using System.Diagnostics;

     public class ReceiverScript : MonoBehaviour
     {
         public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;
         public Microsoft.MixedReality.WebRTC.Unity.VideoReceiver videoReceiver;

         public void startVideoStream()
         {
             videoRenderer.StartRendering(videoReceiver.VideoTrack);
         }

         public void stopVideoStream()
         {
             //videoRenderer.StopRendering(webcamsource.Source);
         }
     }
     Then I attached it to the same GameObject that contains the previous script (the local video script) and assigned as references a new VideoRenderer and a VideoReceiver that were created on a RemoteVideoPlayer object.

    upload_2021-2-19_21-10-56.png

     And then, in that RemoteVideoPlayer object, I assigned the respective methods in the VideoReceiver component.

    upload_2021-2-19_21-11-41.png

     But it doesn't work. Maybe the problem is that my PeerConnection component is missing some information? I don't know what else to add to that component.

    upload_2021-2-19_21-16-48.png
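
     One thing I'm now wondering about is whether the PeerConnection component needs a video transceiver (media line) that pairs the sender with the VideoReceiver; without that pairing the remote track would have nowhere to be routed, even if the offer/answer exchange looks fine. Something like the sketch below, assuming the 2.0-style Unity API (PeerConnection.AddMediaLine, MediaLine.Source/Receiver); the same pairing should also be configurable in the component's Transceivers list in the inspector.

     Code (CSharp):
     using Microsoft.MixedReality.WebRTC;
     using Microsoft.MixedReality.WebRTC.Unity;
     using UnityEngine;

     public class VideoMediaLineSetup : MonoBehaviour
     {
         // Fully qualified to avoid the name clash with the core-library PeerConnection.
         public Microsoft.MixedReality.WebRTC.Unity.PeerConnection peerConnection;
         public WebcamSource webcamSource;   // what this peer sends (leave unassigned on a receive-only peer)
         public VideoReceiver videoReceiver; // where the other peer's video is delivered

         private void Awake()
         {
             // Must run before the offer/answer exchange so the transceiver is negotiated.
             MediaLine videoLine = peerConnection.AddMediaLine(MediaKind.Video);
             videoLine.Source = webcamSource;
             videoLine.Receiver = videoReceiver;
         }
     }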
     
  7. Julian243

     Joined: Apr 14, 2021
     Posts: 2
     Any updates on this? I'm still stuck on the remote peer part.
     
  8. Michael-N3rkmind

     Joined: Aug 16, 2015
     Posts: 24
    Also curious about this