Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25
    Hello again,

    yet another question: are you going to implement AR Foundation support for Android?
    I tried to build with AR enabled, but I get runtime exceptions during the WebRTC initialization...

    Please let me know...

    Kind regards!
     
  2. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    We have not tested it yet. What exception did you get?
     
  3. Deleted User

    Deleted User

    Guest

    Hi,

    we would like to use the VideoPlayer sample of the WebRTC package, but we don't need the second PIP camera and all the additional information, header, etc. How can we get rid of it? Editing the HTML makes no difference. We have tried emptying the browser cache, but no luck. Any idea?

    Thanks for all.
    Greetings, Matthias
     
  4. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25
    I finally got it working using Unity 2020.3.11f1 and AR Foundation v.4.0.12.
    Anything along those versions should be good I suppose.

    So now I am able to stream my android phone's AR camera content to my desktop PC.


    HOWEVER, the main problem I am still having is that I cannot receive the stream when the peers are on different networks.
    The traffic is appearing on the webserver app, and my video container on the Windows receiver app turns grey when trying to connect...
    but I never get the actual texture/video stream.

    I have tried multiple STUN servers (public ones).
    I have not tried any TURN server...

    ??
     
  5. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    These docs might be useful for you:
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/customize_webapp.html
     
  6. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    This doc explains how to launch a TURN server:
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/turnserver.html
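    For reference, a minimal sketch of what a TURN entry can look like on the Unity side (the server URL and credentials below are placeholders; the fields come from com.unity.webrtc's RTCIceServer, and the entry would be added to the ICE server list used by Render Streaming):

    Code (CSharp):
    using Unity.WebRTC;

    // Placeholder TURN entry; replace the host, username and credential with
    // the values of the TURN server you launched (e.g. with coturn).
    var turnServer = new RTCIceServer
    {
        urls = new[] { "turn:turn.example.com:3478?transport=udp" },
        username = "username",
        credential = "password",
        credentialType = RTCIceCredentialType.Password
    };
    // Add turnServer to the ICE server list of your Render Streaming setup
    // (alongside, or instead of, the default STUN entry).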
     
  7. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25
  8. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25
    So I noticed a kind of weird behavior in my project.

    I am giving RenderStreaming.iceServers[0].urls[0] a garbage value like "aaa", and when I am on the same network the stream still works, meaning I can send and receive the video...

    This makes me wonder whether there is a fallback/default STUN URL...?

    How can I check which STUN/TURN server I am actually connected to when I establish a peer connection?

    I am trying to debug RenderStreamingInternal.cs and others but haven't figured it out yet...
     
  9. Piflik

    Piflik

    Joined:
    Sep 11, 2011
    Posts:
    293
    I tried the new version today, and while the program no longer takes number_of_connections x streaming_timeout to close, as soon as I try to establish a connection between my applications, one of them crashes.
     
  10. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    You can get the selected candidate pair via pc.GetStats(), and you can determine the connection type from the candidateType value of the candidate pair's remote candidate.

    candidateType values:
    "host" -> local network (local P2P connection, no STUN/TURN used)
    "srflx" / "prflx" -> STUN
    "relay" -> TURN

    Code (CSharp):
    // Requires: using System.Linq; using UnityEngine; using Unity.WebRTC;
    private IEnumerator getStats()
    {
        var op = pc1.GetStats();
        yield return op;

        if (op.IsError)
        {
            Debug.LogError($"RTCPeerConnection.GetStats failed: {op.Error}");
            yield break;
        }

        var report = op.Value;
        RTCIceCandidatePairStats activeCandidatePairStats = null;
        RTCIceCandidateStats remoteCandidateStats = null;

        // Find the candidate pair currently selected by the transport.
        foreach (var transportStatus in report.Stats.Values.OfType<RTCTransportStats>())
        {
            if (report.Stats.TryGetValue(transportStatus.selectedCandidatePairId, out var tmp))
            {
                activeCandidatePairStats = tmp as RTCIceCandidatePairStats;
            }
        }
        if (activeCandidatePairStats == null || string.IsNullOrEmpty(activeCandidatePairStats.remoteCandidateId))
        {
            yield break;
        }
        // Look up the remote candidate of the selected pair and log its type
        // ("host", "srflx", "prflx" or "relay").
        foreach (var iceCandidateStatus in report.Stats.Values.OfType<RTCIceCandidateStats>())
        {
            if (iceCandidateStatus.Id == activeCandidatePairStats.remoteCandidateId)
            {
                remoteCandidateStats = iceCandidateStatus;
                Debug.Log(remoteCandidateStats.candidateType);
            }
        }
    }
     
    Last edited: Jun 24, 2021
  11. Ibrahim-Th

    Ibrahim-Th

    Joined:
    Apr 28, 2021
    Posts:
    3
    I think there is a problem after upgrading to the latest release, 3.1.0-exp.1: if you change the signaling type to HTTP, it reverts to WS, which is the default. Has anybody else faced this problem?
     
  12. kimminwoo0807

    kimminwoo0807

    Joined:
    Mar 26, 2018
    Posts:
    11
    I'm looking at the WebApp code included in the samples from Unity Render Streaming.

    I'm trying to add the functionality of the "VideoPlayer" sample to a webcam video chat web page implemented with PeerJS and Socket.io. However, I found that the WebApp sample project is implemented in TypeScript and the RTC functionality is implemented in quite some detail.

    Is there a way to re-implement this sample code with PeerJS and WebSocket, or to insert the code itself and use it as-is?
     
  13. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    You can see this document on how to customize the web app:
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/customize_webapp.html
     
  14. Kahlis

    Kahlis

    Joined:
    Jan 14, 2017
    Posts:
    6
    Hi, I've been waiting for the 3.1 version for a long time, and recently I saw the update on GitHub. But I need to know how to create a receiver for Android; in my version I'm getting a null reference exception when trying to start the receiver on the Android phone.
     
    Abdul_Malik_Badshah likes this.
  15. ModeLolito

    ModeLolito

    Joined:
    Jun 16, 2016
    Posts:
    19
    Hi, I have currently succeeded in doing multiple connections with video streaming, but I have some artifact and latency issues. I read that with WebRTC we can use different architectures to stream data, and I think my issue is that WebRTC currently uses a mesh architecture. Do you know if it is possible to use a simulcast architecture to send different video streams according to the bandwidth of my peers?
    https://bloggeek.me/webrtc-multiparty-architectures/
    Simulcast is an approach where the user sends multiple video streams towards the SFU. These streams are compressed data of the exact same media, but in different quality levels – usually different resolutions and bitrates.
     
  16. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Could you attach the exception log? Thanks.
     
  17. gft-ai

    gft-ai

    Joined:
    Jan 12, 2021
    Posts:
    44
    Hi, I am very new to this whole WebRTC thing, so I am trying to understand and figure out a way to come up with a solution for my needs. But I just want someone's advice on whether what I am trying to do is possible, which is:

    I want to have Jitsi (a video conferencing platform like Zoom) streaming a video feed that I want to receive and display in my Unity app in real time. So it's basically a video conference call, but instead of having the video call on one platform, I want these two different platforms to be able to communicate with each other.

    I am too new to this area, so even going through the sample code (in both Jitsi and Unity) is proving quite difficult for me to understand. So if anyone could give me some guidance, that would be great. Thank you!
     
  18. tsutomunmun

    tsutomunmun

    Joined:
    Oct 8, 2020
    Posts:
    5
    Hi, thanks for providing a great product!

    I'm now struggling with re-connecting (or keeping connected) the streaming and the data channel.
    (I'm modifying the sample Webapp.)

    What I want to do is to keep the connection alive over a scene transition in Unity (see also the sketch at the end of this post).

    Sorry if this is a stupid question, but could you give me some advice on how to implement it?
    I did some rough investigation (below) but am struggling to find a hint and have not succeeded yet.



    [Environment]
    Render Streaming package and sample (Preview 3.0.1)
    Unity 2019.4.7f1, Windows 10 64-bit
    Google Chrome version: 91.0.4472.114


    [What I confirmed and tried during the investigation]

    When the scene changes in Unity, in video-player.js, inside async setupConnection(useWebSocket)

     (Omitted)

    this.channel.onerror = function (e) {
        Logger.log("The error " + e.error.message + " occurred\n while handling data with proxy server.");
    };

    is called, and then

    this.pc.oniceconnectionstatechange = function (e) {
        Logger.log('iceConnectionState changed:', e);

    is called and it shows "disconnected".


    I call setupConnection to reconnect by implementing something like the following:

    async setupConnection(useWebSocket, recoonect=false) {

        const _this = this;

        if (recoonect == false) {
            (Omitted)
            // skipped when calling for reconnect
        }

        // setup signaling
        await this.signaling.start();
        this.connectionId = uuid4();

        // Add transceivers to receive multi stream.
        // It can receive two video tracks and one audio track from the Unity app.
        // This operation is required to generate the offer SDP correctly.
        this.pc.addTransceiver('video', { direction: 'recvonly' });
        this.pc.addTransceiver('video', { direction: 'recvonly' });
        this.pc.addTransceiver('audio', { direction: 'recvonly' });

        // create offer
        const offer = await this.pc.createOffer(this.offerOptions);

        // set local sdp
        const desc = new RTCSessionDescription({ sdp: offer.sdp, type: "offer" });
        await this.pc.setLocalDescription(desc);
        await this.signaling.sendOffer(this.connectionId, offer.sdp);

    };


    The result is that:
    1) video streaming recovers after the Unity scene transition (OK)
    2) data cannot be sent back to Unity from the browser via a click event (not OK)

    (log generated by the code below)
    video-player.js sendMsg() this.channel: [object RTCDataChannel] this.channel.readyState: closed
    Attempt to sendMsg message while connection closed.

    ================ video-player.js ===============================================
    sendMsg(msg) {
        console.log("video-player.js sendMsg() called");
        console.log("video-player.js sendMsg() this.channel: " + this.channel + " this.channel.readyState: " + this.channel.readyState);

        if (this.channel == null) {
            return;
        }
        switch (this.channel.readyState) {
            case 'connecting':
                Logger.log('Connection not ready');
                console.log('Connection not ready');
                break;
            case 'open':
                console.log('sendMsg message msg = ' + msg);
                this.channel.send(msg);
                break;
            case 'closing':
                Logger.log('Attempt to sendMsg message while closing');
                console.log('Attempt to sendMsg message while closing');
                break;
            case 'closed':
                Logger.log('Attempt to sendMsg message while connection closed.');
                console.log('Attempt to sendMsg message while connection closed.');
                break;
        }
    };
    }

    ===============================================================


    I think this is natural, because in the reconnection phase

    the program does not go through the states below:
    pc.oniceconnectionstatechange: checking
    pc.oniceconnectionstatechange: connected

    and
    "this.channel.onopen" is never called.



    The reason I checked the approach above is that
    simply calling setupConnection(useWebSocket) did not work.

    ==================================================
    async setupConnection(useWebSocket, recoonect=false) {

    const _this = this;

    if (recoonect==false) {
    (Omitted)
    //skip when calling for reconnect
    }
    ==================================================

    An error is raised at
    await _this.pc.setRemoteDescription(desc); or
    await _this.pc.addIceCandidate(iceCandidate);

    which are configured by
    this.signaling.addEventListener('answer', ......
    this.signaling.addEventListener('candidate', ......



    I tried
    this.signaling.removeEventListener('answer', ......
    this.signaling.addEventListener('candidate', ......

    at the top of async setupConnection(useWebSocket),

    but it did not work (at least with my implementation).

    Maybe I need to know how to trigger "this.channel.onopen"?


    Thanks in advance!
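    One thing I have not tried yet on the Unity side: simply keeping the GameObject that carries the Render Streaming components alive across the scene change, so the peer connection and data channel are not torn down in the first place. A rough sketch (assuming the component tolerates surviving a scene load):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical helper: attach this to the GameObject that holds the
    // RenderStreaming component so it is not destroyed on scene transitions.
    public class KeepStreamingAlive : MonoBehaviour
    {
        private void Awake()
        {
            DontDestroyOnLoad(gameObject);
        }
    }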
     
  19. tsutomunmun

    tsutomunmun

    Joined:
    Oct 8, 2020
    Posts:
    5
    Self-reply to my previous post above: I made some progress, but it is not solved yet.
    (I need to sleep, so I don't want to waste the time of anyone who is kind enough to check this and give me advice.)

    I revised some things in "async setupConnection()" in video-player.js; for example, to enable removeEventListener, I stored the anonymous functions in variables.

    It seems to work. The reason I think so is that data (around 3 Mbps) keeps streaming from the Unity-side PC to the other PC (client browser side) over the internet (each PC is connected to a different service provider) after the Unity scene change, and sending a message back from the client browser to Unity via a click event succeeded.

    However, the streamed video is not displayed in the video player in the browser after the Unity scene changes.

    Something is wrong, but I have not figured out what it is yet.

    Maybe I need to restart video.play()?
    After the Unity scene change, when I reload the browser and press the play button, it seems to run normally.
     
  20. tsutomunmun

    tsutomunmun

    Joined:
    Oct 8, 2020
    Posts:
    5
    Second self-reply to my previous post above: I made some progress.

    I checked whether I can replace the data-channel functionality with a separate, customized WebSocket connection (which requires another server to run, meaning extra implementation and operational overhead), and it may work. (I say "may" because so far I have only done a basic implementation as a check.)

    Due to the development schedule I will probably take the option above, but I would still appreciate any advice on how to handle the data channel properly (for the near future).

    Thanks in advance!
     
  21. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
  22. xavierpuigf

    xavierpuigf

    Joined:
    Mar 17, 2018
    Posts:
    4
    Hi, has anyone succeeded in using Render Streaming on a headless server? I modified the WebBrowserInput example and can do streaming on a server with NVIDIA GeForce GTX 1080 GPUs by using an X server.

    While it works with a single simulator instance, when I open multiple instances, render streaming stops working on some of them and I just stream a black screen. I can still receive input from the browsers with multiple instances. Currently, the screen turns black after I add the third streaming track.

    Looking around, I figured this may be due to hardware encoding, which has a limit on the number of tracks allowed on the GPUs. This is still a bit confusing to me because the GPU is not fully occupied, and I have 8 GPUs in the server, which I suppose should allow a larger number of tracks.

    I tried switching to software encoding for WebRTC, but I get the following error, which kills the Unity executable. If anyone has any leads it would be really helpful. This project is really useful for my needs, and it would be great to be able to launch multiple instances from a single server.

    Code (Boo):
    =================================================================
    Got a SIGSEGV while executing native code. This usually indicates
    a fatal error in the mono runtime or one of the native libraries
    used by your application.
    =================================================================

    Caught fatal signal - signo:11 code:1 errno:0 addr:0xc
    Obtained 12 stack frames.
    #0  0x007f26fc798980 in funlockfile
    #1  0x007f26f7a7171d in _nv044glcore
    #2  0x007f26f79dd134 in _nv044glcore
    #3  0x007f26fd6088fc in ApiGLES::SetVertexArrayAttrib(unsigned int, unsigned int, VertexFormat, unsigned char, unsigned int, void const*)
    #4  0x007f26fd5e861b in SetVertexStateGLES(ShaderChannelMask, VertexChannelsInfo const&, GfxBuffer* const*, unsigned int const*, int, unsigned int, unsigned long)
    #5  0x007f26fd5f4414 in GfxDeviceGLES::DrawBuffers(GfxBuffer*, unsigned int, GfxBuffer* const*, unsigned int const*, int, DrawBuffersRange const*, int, VertexDeclaration*)
    #6  0x007f26fd5b4c1b in GfxDeviceWorker::RunCommand(ThreadedStreamBuffer&)
    #7  0x007f26fd5b546b in GfxDeviceWorker::RunExt(ThreadedStreamBuffer&)
    #8  0x007f26fd5ab0b5 in GfxDeviceWorker::RunGfxDeviceWorker(void*)
    #9  0x007f26fd9c35aa in Thread::RunThreadWrapper(void*)
    #10 0x007f26fc78d6db in start_thread
    #11 0x007f26fc4b671f in clone
     
  23. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    The GTX series has a restriction on the number of concurrent encoder sessions.
    https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new#Encoder

    Could it be that your situation is related to this restriction?
     
  24. xavierpuigf

    xavierpuigf

    Joined:
    Mar 17, 2018
    Posts:
    4
    Thank you! Indeed, this seems to match my experience, since I was only allowed 3 sessions at the same time. Do you know if it is possible to sidestep this limit via software encoding? I am not sure whether software encoding can work on headless servers; I've been getting the above error every time I try to run software encoding.
     
  25. xavierpuigf

    xavierpuigf

    Joined:
    Mar 17, 2018
    Posts:
    4
    Alternatively, is there a way to sidestep the video streaming limitations and just send images from the simulator to the web client at certain times (see the sketch at the end of this post)? I have been really stuck on these hardware limitations.

    Thank you for all the help!
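    To make it concrete, something like the sketch below is what I have in mind: occasionally pushing a JPEG over the existing data channel instead of an encoded video track (channel is whatever RTCDataChannel the connection already has; a full-resolution image may exceed the per-message size limit and need chunking):

    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    // Rough sketch: read a camera's RenderTexture into a Texture2D, encode it
    // as JPEG and push the bytes over an already-open data channel.
    public static class FrameSender
    {
        public static void SendFrame(RenderTexture source, RTCDataChannel channel)
        {
            var previous = RenderTexture.active;
            RenderTexture.active = source;

            var tex = new Texture2D(source.width, source.height, TextureFormat.RGB24, false);
            tex.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
            tex.Apply();

            RenderTexture.active = previous;

            byte[] jpg = tex.EncodeToJPG(75);
            Object.Destroy(tex);

            // Note: data channel messages have a size limit, so a large frame
            // may need to be split into several chunks before sending.
            channel.Send(jpg);
        }
    }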
     
  26. Kahlis

    Kahlis

    Joined:
    Jan 14, 2017
    Posts:
    6
    I found the error. All I had to do was set the Scripting Backend to IL2CPP and build the game for ARM64. Finally, I disabled "Hardware Encoder Support" because my PC doesn't support it. Thank you for the attention.
     
    Last edited: Jul 10, 2021
  27. chrisming999

    chrisming999

    Joined:
    Jan 15, 2021
    Posts:
    10
    Help:
    When I use the hardware encoder, the image quality in the browser is extremely poor,
    but with the software encoder it is OK.

    The streaming size of the camera streamer is set to 2560*1440.

    What can I do?
    Thank you
     
  28. chrisming999

    chrisming999

    Joined:
    Jan 15, 2021
    Posts:
    10
    Nvidia RTX 2080 Ti
     
  29. chrisming999

    chrisming999

    Joined:
    Jan 15, 2021
    Posts:
    10
    Remove the Render Streaming component, then add the component again.
     
    Ibrahim-Th likes this.
  30. chrisming999

    chrisming999

    Joined:
    Jan 15, 2021
    Posts:
    10
    When I use a 2070, the hardware encoder quality is poor/low. Can you tell me what to do?
     
  31. wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    Joined:
    Jul 20, 2021
    Posts:
    6
    upload_2021-7-20_11-30-33.png
    My project has this error. I can't solve it, so I commented the line out. Maybe there is a solution? I used the renderStreaming 2.2.2 package.
     

  32. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    Have you executed setRemoteDescription(ref answer); twice?
     
  33. wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    Joined:
    Jul 20, 2021
    Posts:
    6
    I'm not sure; I just ran the sample project and viewed the screen in the browser. Then I got the error.
     
  34. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    Did you run pc.CreateOffer(ref options); and pc.SetLocalDescription(ref offer); before pc.SetRemoteDescription(ref answer);?
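    For reference, the expected order on the offering side looks roughly like this (coroutine sketch; pc, options and answer are whatever you already have, and type names assume the 2.x WebRTC package):

    Code (CSharp):
    using System.Collections;
    using Unity.WebRTC;

    // Sketch of the expected call order: create and set the local offer first,
    // exchange SDP via signaling, and only then set the remote answer.
    private IEnumerator CreateOfferAndSetLocal(RTCPeerConnection pc, RTCOfferOptions options)
    {
        var offerOp = pc.CreateOffer(ref options);
        yield return offerOp;

        var offer = offerOp.Desc;
        var localOp = pc.SetLocalDescription(ref offer);
        yield return localOp;

        // ...send the offer SDP to the remote peer; when the answer arrives:
        // var remoteOp = pc.SetRemoteDescription(ref answer);
        // yield return remoteOp;
    }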
     
  35. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  36. chrisming999

    chrisming999

    Joined:
    Jan 15, 2021
    Posts:
    10
  37. orangetech

    orangetech

    Joined:
    Sep 30, 2017
    Posts:
    50
    How can I connect peers directly by IP:port instead of using an ICE server?
     
  38. hdtqgeneral

    hdtqgeneral

    Joined:
    Jul 13, 2019
    Posts:
    3
    I have this weird situation: I have 2 computers on 2 different networks (network A and B). When I stream from computer A, computer B displays only a gray color (IceCandidateState goes from "Checking" to "Failed"). However, when I stream from computer B, computer A receives it properly (IceCandidateState goes to "Completed" immediately). What could be the problem?

    - I use sample scenes on Editor: 2021.1.11f with UnityRenderStreaming: 3.1.0 exp 1.
    - Signaling runs on network B.
     
  39. Ibrahim-Th

    Ibrahim-Th

    Joined:
    Apr 28, 2021
    Posts:
    3
    I had the same problem, and I added the Editor (2021.1.11f) and the webserver to
    `Allow apps to communicate through Windows Defender Firewall`
    for both private and public networks. Now it works for all networks.
     
  40. wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    Joined:
    Jul 20, 2021
    Posts:
    6
    Help! In the web browser, how can I interact with uGUI? I tried to add the UI, but I couldn't click on it.

    -I use sample scenes on Editor: 2020.3.4f1c1 with UnityRenderStreaming: 2.2.2
     
    yukinrius likes this.
  41. kimminwoo0807

    kimminwoo0807

    Joined:
    Mar 26, 2018
    Posts:
    11
    What source code files should I look at to send and receive custom WebSocket messages between the Unity Render Streaming application and the WebApp?
     
  42. gft-ai

    gft-ai

    Joined:
    Jan 12, 2021
    Posts:
    44
    Hi guys,
    Has anyone tried running a built app that is using Render Streaming & WebRTC on a machine with no Nvidia GPU? It just crashes straight away when I try to open it.

    The built app works on a machine with Nvidia GPU though.

    Both are running on Ubuntu 20.04 and the app was built on Unity 2020.3.15 LTS
    Renderstreaming: 3.1.0-exp.1
    WebRTC: 2.4.0-exp.3
     
    xavierpuigf likes this.
  43. gft-ai

    gft-ai

    Joined:
    Jan 12, 2021
    Posts:
    44
    I don't think there is an example for sending & receiving custom websocket messages.

    You will just have to figure it out and write it yourself, I think.
     
  44. kimminwoo0807

    kimminwoo0807

    Joined:
    Mar 26, 2018
    Posts:
    11
    Do you mean creating a separate WebSocket for sending and receiving messages?

    Is it okay to create multiple WebSocket objects?
     
  45. gft-ai

    gft-ai

    Joined:
    Jan 12, 2021
    Posts:
    44
    I guess you could create a separate websocket for your custom messages, but it really depends on what you are trying to achieve with it. If you just need a websocket connection without the WebRTC part, then it would be pretty straightforward. But if you need to send your custom messages with/via the WebRTC part, then I think you will have to go through the WebRTC source code and change/add the messaging structure to send from/to Unity. Of course, you need to implement the same on the webapp side.

    I guess creating multiple websocket objects would be OK (I am not an expert on this, so take it with a pinch of salt) as long as the port numbers are correctly set? Again, my opinion could be completely invalid depending on what you are trying to achieve. Good luck though!
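    Something along these lines is what I mean by a separate websocket for custom messages (just a rough sketch using .NET's ClientWebSocket; the endpoint is a placeholder, and you would need a matching handler on the web app side):

    Code (CSharp):
    using System;
    using System.Net.WebSockets;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;

    // Minimal sketch: open a separate WebSocket connection, independent of the
    // Render Streaming signaling socket, and send one custom text message.
    public static class CustomMessageSocket
    {
        public static async Task SendAsync(string message)
        {
            using (var ws = new ClientWebSocket())
            {
                // Placeholder endpoint for your own message server.
                await ws.ConnectAsync(new Uri("ws://localhost:8080/custom"), CancellationToken.None);

                var bytes = Encoding.UTF8.GetBytes(message);
                await ws.SendAsync(new ArraySegment<byte>(bytes), WebSocketMessageType.Text, true, CancellationToken.None);

                await ws.CloseAsync(WebSocketCloseStatus.NormalClosure, "done", CancellationToken.None);
            }
        }
    }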
     
  46. youyng

    youyng

    Joined:
    Mar 12, 2020
    Posts:
    1
    Hi, I'm using the WebBrowserInput sample of Render Streaming with multiple players. I found that it works well when connecting and playing, but if I close one of the players' connections by shutting down the browser, the other player disconnects too. The Unity editor reports the error: Cannot queue state event for device 'Keyboard:/Keyboard1' (the other player's device) because device has not been added to system. How can I solve this problem? OS: Windows 10, packages: renderstreaming 3.0.1, webrtc 2.3.3, input system 1.0.2. Unity Editor: 2019.4.14f1. In addition, how can I add a third user with a single camera streamer?

    Code (CSharp):
    error code:
    InvalidOperationException: Cannot queue state event for device 'Keyboard:/Keyboard1' because device has not been added to system

    UnityEngine.InputSystem.InputSystem.QueueStateEvent[TState] (UnityEngine.InputSystem.InputDevice device, TState state, System.Double time) (at C:/RenderStreamingProject/Third-packages/com.unity.inputsystem@1.0.2/InputSystem/InputSystem.cs:2255)

    Unity.RenderStreaming.RemoteInput.ProcessKeyEvent (Unity.RenderStreaming.KeyboardEventType state, System.Boolean repeat, System.Byte keyCode, System.Char character) (at C:/RenderStreamingProject/Third-packages/com.unity.renderstreaming@3.0.1-preview/Runtime/Scripts/RemoteInput.cs:337)

    Unity.RenderStreaming.RemoteInput.ProcessInput (System.Byte[] bytes) (at C:/RenderStreamingProject/Third-packages/com.unity.renderstreaming@3.0.1-preview/Runtime/Scripts/RemoteInput.cs:170)

    Unity.WebRTC.RTCDataChannel+<>c__DisplayClass33_0.<DataChannelNativeOnMessage>b__0 () (at C:/RenderStreamingProject/Third-packages/com.unity.webrtc@2.3.3-preview/Runtime/Scripts/RTCDataChannel.cs:184)

    Unity.WebRTC.WebRTC.SendOrPostCallback (System.Object state) (at C:/RenderStreamingProject/Third-packages/com.unity.webrtc@2.3.3-preview/Runtime/Scripts/WebRTC.cs:397)

    UnityEngine.UnitySynchronizationContext+WorkRequest.Invoke () (at <9674024ab0e74d27bbe9eaa30dab34d1>:0)

    UnityEngine.UnitySynchronizationContext:ExecuteTasks()
     

  47. wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    wechat_os_Qy04DY_qwO6Ahm9rkjac3qJ08

    Joined:
    Jul 20, 2021
    Posts:
    6
    Hello, how can I click the button on the web page? Looking forward to your reply, thanks!
     
  48. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Could you tell me what kind of messages you want to send?
    The WebSocket is used for exchanging SDPs between peers in Unity Render Streaming.
     
    kimminwoo0807 likes this.
  49. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, I would like to know more details about the issue.
    Do you have logs?
     
  50. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Thanks for your feedback.
    We will check the issue.