
Official Unity Render Streaming Introduction & FAQ

Discussion in 'Unity Render Streaming' started by kazuki_unity729, Sep 10, 2019.

  1. GameeDev

    GameeDev

    Joined:
    Apr 15, 2021
    Posts:
    3
    Sorry for the maybe stupid question, but where exactly in the code can I transmit information? For example, I'm in the Bidirectional scene and want to send a bool or something on a click. I'm a newbie at this and couldn't find it.
    Thanks
     
  2. jojizaidi

    jojizaidi

    Joined:
    Jun 30, 2014
    Posts:
    10
    I want to run my RenderStream build on an EC2 instance. I added it to my startup so that whenever I launch the instance the build should run. The issue is that, because of EC2 behavior, I have to remote desktop into the instance, and only then will my build run and provide the stream. In Unreal they have the -renderOffScreen option, which bypasses this issue.
    Any thoughts on how I can handle this in Unity?
     
  3. Piflik

    Piflik

    Joined:
    Sep 11, 2011
    Posts:
    293
    Is there documentation on how to shut down Render Streaming correctly?

    In my project I want to start and stop the WebApp from Unity (I have the .exe in StreamingAssets and start it via Process.Start()). When I quit, I call DeleteConnection() on the SingleConnection component and Stop() on the RenderStreaming component, before killing the WebApp process. However, the Unity Editor freezes for a bit before exiting play mode, and the WebApp stays alive during this time, too. Experiments indicate that this time is pretty much exactly the Interval property of the RenderStreaming component, which determines the polling interval to the signaling server.

    If I kill the process immediately, I also get 1006 as the close code; when I do it with a small delay I get 1005.

    Looking at the source code, I was under the impression that RenderStreaming.Stop() would disconnect from the signaling server, but that doesn't seem to work, at least not until the next polling interval.

    Is there a way to disconnect immediately?
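
    For reference, a rough sketch of the shutdown order I mean (the DeleteConnection argument and the exact component APIs are assumptions on my side and may differ from the actual package):
    Code (CSharp):
    using System.Diagnostics;
    using UnityEngine;
    using Unity.RenderStreaming;

    public class StreamingShutdown : MonoBehaviour
    {
        [SerializeField] SingleConnection connection;
        [SerializeField] RenderStreaming renderStreaming;

        Process webAppProcess;   // started earlier via Process.Start()
        string connectionId;     // id used when the connection was created

        void OnApplicationQuit()
        {
            // Tear down the connection and signaling before killing the web app.
            connection.DeleteConnection(connectionId);   // assumed signature
            renderStreaming.Stop();
            if (webAppProcess != null && !webAppProcess.HasExited)
                webAppProcess.Kill();
        }
    }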
     
  4. sahithigurram25

    sahithigurram25

    Joined:
    May 13, 2020
    Posts:
    4
    Hello. I have tried the DataChannel sample of Unity WebRTC. The only change I made was to the offer-answer negotiation between the peer connections: instead of passing the desc directly, I printed the offer desc to the console in the Unity editor, copied it to a text box in the Unity UI, and assigned the remote description by reading the text box from the script. The offer and answer negotiation worked, but the data channel doesn't seem to open. So I wanted to manually negotiate that too by assigning RTCDataChannelInit.id myself. But when I pass this to RTCPeerConnection.CreateDataChannel, the resulting RTCDataChannel.Id is still -1; it doesn't change, and I don't know why this is happening. It would be helpful if anyone can suggest a possible reason. Thanks in advance.
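
    For reference, a minimal sketch of what I mean by a manually negotiated channel, assuming RTCDataChannelInit exposes the negotiated and id fields from the W3C dictionary (for a non-negotiated channel, the id is only assigned once the SCTP transport is established, which may be why it still reads -1 before the connection is up):
    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    public class NegotiatedChannelExample : MonoBehaviour
    {
        RTCPeerConnection pc;
        RTCDataChannel channel;

        void Start()
        {
            pc = new RTCPeerConnection();
            // Both peers must create the channel with negotiated = true and the same id;
            // no in-band open message is sent for a negotiated channel.
            var options = new RTCDataChannelInit { negotiated = true, id = 0 };
            channel = pc.CreateDataChannel("manual", options);
            channel.OnOpen = () => Debug.Log("data channel open");
        }
    }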
     
  5. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,601
  6. Dev_Sebas

    Dev_Sebas

    Joined:
    Jul 10, 2014
    Posts:
    19
    Hi @kazuki_unity729, sorry for the noob question.

    When using the Broadcast or SingleConnection components, we can add and specify components for the stream (CameraStreamer, AudioStreamer, ...), but it's quite difficult to drag and drop the desired component (Unity picks the Transform component). The solution for this seems to be creating a script to correctly setup the Broadcast/SingleConnection with the desired components. Is there any trick to avoid this and have a simpler way to assign the components? Will it change in the future?

    Thanks in advance,
    Sebastião Rocha
     
  7. tdowdle

    tdowdle

    Joined:
    Feb 10, 2021
    Posts:
    3
    Did you ever resolve this issue? I'm getting the same error trying to communicate with a Janus WebRTC server.
     
  8. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    RTCDataChannel is used for transmitting binary data or text.
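
    A minimal sketch of sending and receiving over a channel, assuming you already have an RTCDataChannel (e.g. from CreateDataChannel or the OnDataChannel callback):
    Code (CSharp):
    using System.Text;
    using Unity.WebRTC;
    using UnityEngine;

    public class DataChannelMessaging : MonoBehaviour
    {
        RTCDataChannel channel;

        public void Setup(RTCDataChannel dataChannel)
        {
            channel = dataChannel;
            // OnMessage delivers the payload as a byte[].
            channel.OnMessage = bytes => Debug.Log(Encoding.UTF8.GetString(bytes));
        }

        public void SendFlag(bool value)
        {
            // Encode a bool as a single byte.
            channel.Send(new byte[] { (byte)(value ? 1 : 0) });
        }

        public void SendText(string text)
        {
            channel.Send(text); // a string overload also exists
        }
    }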
     
  9. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Do you want to use it with video streaming on the AWS instance?
     
  10. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    OK, you mean the RenderStreaming.Stop() method doesn't work correctly, right?
    I will investigate the issue. Thanks.
    If you could share screenshots of the issue, that would be very helpful.
     
  11. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Have you tried the "DataChannel" scene in the WebRTC sample?
    This is for a start point to use DataChannel.
    https://docs.unity3d.com/Packages/com.unity.webrtc@2.4/manual/sample.html
     
  12. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Yes, we have updated this repository rapidly since the first release in 2019,
    so the tutorial you mentioned has become outdated. Please refer to the tutorial document on this site.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.0/manual/tutorial.html
     
  13. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Sure, I think the component design on the inspector window should be improved as you said.
    Thanks for your feedback.
     
    Dev_Sebas likes this.
  14. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    This error occurs when there is no codec supported by both peers.
    Could you try to change the parameter "Hardware Encoder Support" on the inspector window?
     
  15. kayy

    kayy

    Joined:
    Jul 26, 2011
    Posts:
    110
    Long time ago. According to my repository's commit message, I added the following lines before calling pc.createOffer():
    Code (JavaScript):
    this.pc.addTransceiver('video', { direction: 'recvonly' });
    this.pc.addTransceiver('audio', { direction: 'recvonly' });
     
  16. Piflik

    Piflik

    Joined:
    Sep 11, 2011
    Posts:
    293
    Not sure how a screenshot would help. I can attach a minimal repro project, though. You'd just have to put the webserver.exe from the Render Streaming repository into StreamingAssets (including it in the zip file made it too big to attach here).

    You just need to go into play mode and exit again. It will take 5 seconds to exit play mode. If you increase the Interval of the RenderStreaming component, the time before Unity exits will increase accordingly.
     

    Attached Files:

  17. jojizaidi

    jojizaidi

    Joined:
    Jun 30, 2014
    Posts:
    10
    I resolved it by enabling Windows autologon and then adding the Unity build to startup in the Registry Editor.
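
    For what it's worth, a minimal sketch of adding the build to the current user's Run key from C# (the value name here is just an example):
    Code (CSharp):
    using Microsoft.Win32;

    static class StartupRegistration
    {
        // Adds the build executable to the HKCU Run key so it launches at user logon.
        public static void Register(string exePath)
        {
            using (var key = Registry.CurrentUser.OpenSubKey(
                @"Software\Microsoft\Windows\CurrentVersion\Run", writable: true))
            {
                key.SetValue("RenderStreamingBuild", "\"" + exePath + "\"");
            }
        }
    }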
     
  18. tsutomunmun

    tsutomunmun

    Joined:
    Oct 8, 2020
    Posts:
    5
    Hi, I'm a newbie (since a few days ago) to render streaming and very impressed by the great work. Thanks.

    Now I've run into a Unity Editor crash when I send a large render texture (4096 by 2048).
    Could someone give me advice on which direction I should go next?

    [My short-term goal / what I want to do]
    Sending a render texture or binary data (JPEG image) of at least 4096 by 2048
    (depending on how much the image quality deteriorates).
    The purpose is to use it for VR on the receiver side (user-interactive content),
    which means the render texture to be sent is a stereo image.

    In another test project, I successfully sent binary data encoded as JPEG (4096 by 2048)
    from one Unity Editor to another Unity Editor on a different PC via a GCP Node-RED server
    through WebSocket (using websocket-sharp)
    --- this is not a real-time encoded video, just sequential images sent each frame.

    However, there are some problems
    (it requires throughput over 50 Mbps or so, adds latency, and achieves only a low frame rate (10 fps or so), etc.).
    That's one of the reasons why I tried Render Streaming this time.


    [What I have done so far]
    Created a new 3D Unity project (Unity 2019.4.7f1, Windows 10 64-bit).
    Imported the Render Streaming package and samples (preview 3.0.1).
    Set up the WebBrowserInput scene in the Unity Editor.
    RenderStreaming object (Inspector -> Render Streaming script):
    Signaling Type: WebSocketSignaling
    Signaling URL: ws://(external IP address of the Windows server on GCP)
    Ice Servers: 2
    Ice Servers[0]: stun:stun.l.google.com:19302
    Ice Servers[1]: turn:(external IP address of the TURN server on GCP):3478?transport=tcp
    (also set the username and credential (password))
    Created two Render Textures and attached them to Render Streaming Camera and Render Streaming Camera (1).
    When the Render Texture sizes were 2048 by 4096 and 200 by 200 (intentionally small as an experiment),
    the Unity Editor crashed at a later step.
    When both Render Texture sizes were 2048 by 2048, sending video succeeded.
    The Streaming Sizes of the Camera Streamer (Script) are the same as the Render Texture sizes.

    Used hardware encoding (Nvidia GeForce GTX 1070 8 GB).

    Packing webserver.exe:
    Set the TURN server configuration to the external IP address of the TURN server on GCP.
    Ran pack_webapp.cmd on the local PC.
    Setting up the TURN server on GCP:
    I followed the tutorial below.
    https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.0/manual/turnserver.html
    Setting up the web server on GCP:
    I created a Windows Server VM instance.
    Accessed it via RDP and installed Node.js 14.16.1 LTS with the .msi installer (64-bit).
    Copied the webserver.exe packed above from the local PC to this Windows server on GCP.
    Ran webserver.exe from the command prompt with "webserver -w".

    Accessed the web server (video player page) from another PC (Windows laptop) on a different Internet provider (UQ WiMAX).
    (browser: Google Chrome)

    In the attempt where both Render Texture sizes were 2048 by 2048, sending video succeeded.
    First I ran the web server, started (played) the Unity Editor scene, then pressed the play button in the browser.
    Based on the Windows 10 Task Manager,
    bandwidth is 3.0 Mbps to 4 Mbps when I move the sphere rapidly.
    With the default size (1280 by 720), it is about 1.5 Mbps.

    In the attempt where the Render Texture sizes were 2048 by 4096 and 200 by 200,
    the Unity Editor crashed just after pressing the play button in the browser.


    [What I am considering now]
    Because sending two Render Textures of 2048 by 2048 succeeded,
    combining the two images on the receiver side to restore the cubemap could be one option.
    However,
    synchronizing the images for both eyes on the receiver side would be one of the technical challenges.
    Maybe there are better alternative options.

    I'd appreciate any advice on avoiding the crash, on limitations and bottlenecks,
    or on a better way to achieve my goal.

    Thanks in advance.
     
  19. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I appreciate you sharing the details.

    I am not sure about the issue, but I guess it is caused by a hardware codec limitation.
    I am going to check the issue and fix the crash, but I feel it will be difficult to solve if it is caused by the hardware limitation.


     
    tsutomunmun likes this.
  20. tsutomunmun

    tsutomunmun

    Joined:
    Oct 8, 2020
    Posts:
    5
    Hi @kazuki_unity729,
    Thanks for the quick reply.

    Following your point about the hardware limitation, I tested the software codec (just unchecked "Hardware Encoder Support") and successfully streamed from the Unity Editor to the browser at sizes up to at least 4096 by 4096.

    Regards
     
  21. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    Hi @kazuki_unity729,
    How do I modify the video stream bitrate in Unity? The video quality is not very good.
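
    In case it helps, a rough sketch of one possible approach, assuming access to the underlying RTCPeerConnection so the RTCRtpSender encoding parameters can be adjusted:
    Code (CSharp):
    using Unity.WebRTC;

    public static class BitrateUtil
    {
        // Caps the encoder bitrate (in bits per second) on every sender.
        public static void SetMaxBitrate(RTCPeerConnection pc, ulong bitrateBps)
        {
            foreach (var sender in pc.GetSenders())
            {
                var parameters = sender.GetParameters();
                foreach (var encoding in parameters.encodings)
                    encoding.maxBitrate = bitrateBps;
                sender.SetParameters(parameters);
            }
        }
    }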
     
  22. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  23. yibo_grace

    yibo_grace

    Joined:
    Aug 17, 2020
    Posts:
    2
    Hello, when will the Android platform be supported?
     
  24. Williano7

    Williano7

    Joined:
    Sep 23, 2019
    Posts:
    5
    I am using the web server of Unity Render Streaming to send text over the data channel.

    How can I get the peer connection object in Unity so that a different script attached to a game object can read the text I'm sending?
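
    In case it's useful, a rough sketch of one way to surface received text to other scripts, assuming you can hook the data channel from the underlying RTCPeerConnection (with the Render Streaming components the connection is managed for you, so the exact hook point may differ):
    Code (CSharp):
    using System;
    using System.Text;
    using Unity.WebRTC;
    using UnityEngine;

    public class TextChannelReceiver : MonoBehaviour
    {
        // Other scripts subscribe here to be notified of incoming text.
        public event Action<string> OnTextReceived;

        public void Attach(RTCPeerConnection pc)
        {
            pc.OnDataChannel = channel =>
            {
                channel.OnMessage = bytes =>
                    OnTextReceived?.Invoke(Encoding.UTF8.GetString(bytes));
            };
        }
    }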
     
  25. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    Thank you very much. One more question:
    The input events I send from the web side do not trigger UI click events; the mouse loses focus and its position is always 0. I set up the EventSystem according to the instructions on GitHub. Keyboard events are triggered normally, but UI events aren't. How do I trigger UI events?
    Thank you for your reply.
     
  26. yuigahamayuki

    yuigahamayuki

    Joined:
    May 19, 2021
    Posts:
    2
    Hi kazuki-san,
    When trying the AR Foundation demo, I ran into a problem where the demo running on iOS could not connect to the web server.

    Things I have tried:
    1. Starting the server using WebSocket: macOS Chrome, Windows Chrome, iOS Safari, and the demo running in Unity all successfully connect to the web server. As for the AR Foundation demo running on iOS, the following logs show in the Xcode output:

    Signaling: Connecting WS
    No route to host Signaling: WS connection closed, code: 1006

    2. Starting the server using HTTP: the situation is the same as in the first step, except this time the following logs show in the Xcode output when running the AR Foundation demo on iOS:

    Signaling: Connecting HTTP
    Signaling: HTTP request error System.Net.WebException: Error: ConnectFailure (No route to host) --->

    (I also tried self-signed HTTPS, it still fails.)

    Environments:
    Unity version: 2020.3.8f1
    Unity Render Streaming version: 3.0.1-preview
    AR Foundation version: 4.1.7
    ARKit XR Plugin version: 4.1.7
    Xcode version: 12.5 (12E262)
    iOS OS version: 14.5.1

    Both the camera usage and local network usage permission prompts appear successfully.

    I would appreciate it if you could help me. Thanks.
     
  27. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    @yuigahamayuki
    Did you leave the signaling URL as "localhost"?
     
  28. Tarik_Boukhalfi

    Tarik_Boukhalfi

    Joined:
    Jul 9, 2012
    Posts:
    7
    Hello @kazuki_unity729,

    Is there a release date for version 3.1 (Android support)?

    Also, what ports are used for the video stream, audio stream, and control?
    Are there any guidelines for deploying to VMs?

    Thanks!

    Tarik
     
  29. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  30. yuigahamayuki

    yuigahamayuki

    Joined:
    May 19, 2021
    Posts:
    2
    I specified the signaling URL in Unity as "ws://${ip}:{port}" or "http://${ip}:{port}", using the address of the computer running the web server, and the log shown in Xcode shows the IP and port set in Unity.
     
  31. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    @kazuki_unity729 The Unity InputField can't receive input from the web. How can I solve this? Thank you for your reply.
     
  32. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    @kazuki_unity729 Can you give me some advice? Thank you.
     
  33. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
  34. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    Thank you for your reply.
    I have a login UI in my Unity project, and I want to show the project in the web browser using Unity Render Streaming. UI clicks and keyboard actions work normally, but the InputField cannot be triggered from the web. Right now I send a UnityEngine.Event to InputField.ProcessEvent, which seems to solve the problem, but I'm wondering if there is a better way. Also, please consider the IME.
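
    For reference, a rough sketch of this workaround, forwarding a received character into the standard UnityEngine.UI.InputField (IME composition is not handled here):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;

    public class RemoteTextInput : MonoBehaviour
    {
        [SerializeField] InputField inputField;

        // Forward a character received from the browser to the InputField.
        public void SendCharacter(char c)
        {
            var keyEvent = new Event { type = EventType.KeyDown, character = c };
            inputField.ProcessEvent(keyEvent);
            inputField.ForceLabelUpdate();
        }
    }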
     

    Attached Files:

    • 11.png
  35. Tarik_Boukhalfi

    Tarik_Boukhalfi

    Joined:
    Jul 9, 2012
    Posts:
    7
    Is there a release date for version 3.1 (Android support)?

    Also, what ports are used for the video stream, audio stream, and control?
    Are there any guidelines for deploying to VMs?

    Thanks!

    Tarik
     
  36. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I see.
    Indeed, IME input via browsers has not been tested yet.
    I will check it. Thanks.
     
  37. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    v3.1 will be released this month.
    You can check the ports using the "RTCIceCandidate" class.
    The WebRTC package contains a "TrickleIce" sample that is useful for understanding how to use it.
    There are no guidelines for VMs, unfortunately. Thanks.
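
    A minimal sketch of logging each gathered candidate (the raw candidate string contains the address, port and protocol), assuming access to the RTCPeerConnection:
    Code (CSharp):
    using Unity.WebRTC;
    using UnityEngine;

    public static class CandidateLogger
    {
        public static void Attach(RTCPeerConnection pc)
        {
            // Each candidate line includes the IP address, port and protocol in use.
            pc.OnIceCandidate = candidate => Debug.Log(candidate.Candidate);
        }
    }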
     
    Tarik_Boukhalfi likes this.
  38. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    Thank you for your reply!
     
  39. Piflik

    Piflik

    Joined:
    Sep 11, 2011
    Posts:
    293
    @kazuki_unity729 I just noticed that this delay seems to be per connection. I haven't timed it exactly, but I am currently trying to stress test our system and I opened multiple connections. When I exited play mode, it took several minutes until Unity was responsive again.

    Edit: I did time it now and it is pretty much exactly numConnections x StreamingInterval
     
    Last edited: May 26, 2021
  40. Tarik_Boukhalfi

    Tarik_Boukhalfi

    Joined:
    Jul 9, 2012
    Posts:
    7
    Thank you
     
  41. peab0dy

    peab0dy

    Joined:
    May 26, 2021
    Posts:
    1
    @kazuki_unity729 We're working on a Unity project that heavily utilizes Render Streaming, so thank you very much for your work on this project. A question from the team: loading/changing scenes seems to freeze up our game for around 10 seconds, and it appears to be related to Render Streaming. Any thoughts or troubleshooting tips? Thanks again.
     
  42. wanfei

    wanfei

    Joined:
    Dec 22, 2018
    Posts:
    19
    @kazuki_unity729 Can I get the buffer data from the VideoStream (on the Unity receive side)? Right now I get the buffer via Texture -> Texture2D -> GetRawTextureData(), which is not a good way.
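
    In case it helps, a rough sketch of reading the pixels back with AsyncGPUReadback instead of a blocking Texture2D copy (whether this fits depends on what the buffer is needed for):
    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.Rendering;

    public class VideoFrameReadback : MonoBehaviour
    {
        // Request the received video texture's pixels without blocking the main thread.
        public void RequestFrame(Texture receivedTexture)
        {
            AsyncGPUReadback.Request(receivedTexture, 0, TextureFormat.RGBA32, OnReadback);
        }

        void OnReadback(AsyncGPUReadbackRequest request)
        {
            if (request.hasError)
                return;
            NativeArray<byte> data = request.GetData<byte>();
            // ... use the raw pixel data here ...
        }
    }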
     
  43. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    Hi, we found the issue you mentioned and fixed it.
    The next release will contain this fix. Thank you.
     
  44. MastersOfUs

    MastersOfUs

    Joined:
    Aug 29, 2015
    Posts:
    17
    Hello, we are experiencing a crash on a specific machine that seems to be caused by the WebRTC package.

    ========== OUTPUTTING STACK TRACE ==================
    0x00007FF89C9E4B89 (KERNELBASE) RaiseException
    0x00007FF851338505 (webrtc) VideoTrackRemoveSink
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF850A8BC38)
    0x00007FF850A8BC38 (webrtc) (function-name not available)
    0x00007FF850A97B1F (webrtc) VideoTrackRemoveSink
    0x00007FF850A954C7 (webrtc) VideoTrackRemoveSink
    0x00007FF850A90BBD (webrtc) UnityPluginUnload
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF850A8EC4C)
    0x00007FF850A8EC4C (webrtc) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF8437C272B)
    0x00007FF8437C272B (UnityPlayer) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF8437C886D)
    0x00007FF8437C886D (UnityPlayer) (function-name not available)
    ERROR: SymGetSymFromAddr64, GetLastError: 'Attempt to access invalid address.' (Address: 00007FF8437C8965)
    0x00007FF8437C8965 (UnityPlayer) (function-name not available)
    0x00007FF844173BBD (UnityPlayer) UnityMain
    0x00007FF89DF87034 (KERNEL32) BaseThreadInitThunk
    0x00007FF89F242651 (ntdll) RtlUserThreadStart
    ========== END OF STACKTRACE ===========

    The error occurs on a 3080 with Nvidia driver 466.27. On our other machine with a 2070 it works fine with the exact same build and the same driver. How could we debug this further or solve the issue?

    EDIT: Github issue with more info: https://github.com/Unity-Technologies/UnityRenderStreaming/issues/485
     
    Last edited: May 31, 2021
  45. MastersOfUs

    MastersOfUs

    Joined:
    Aug 29, 2015
    Posts:
    17


    After further troubleshooting, I posted an issue on the GitHub repo, which contains a lot more information: https://github.com/Unity-Technologies/UnityRenderStreaming/issues/485
     
  46. MastersOfUs

    MastersOfUs

    Joined:
    Aug 29, 2015
    Posts:
    17
    The workaround was using the software encoder. I hope this can be fixed so we can use the hardware encoder on RTX 3000 cards :)
     
  47. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25
    Hello everyone,

    I am a bit confused about what works and what doesn't with version 3.0.1 of Unity Render Streaming.

    So the main question: I want to broadcast from Android and receive on a Windows PC. Is this possible?

    If not, is it possible to do so from iOS to Windows?

    And lastly, I read that Android support will be released with version 3.1...
    Any dates on this one, and will it include both broadcast and receive?

    Thanks!
     
  48. kazuki_unity729

    kazuki_unity729

    Unity Technologies

    Joined:
    Aug 2, 2018
    Posts:
    803
    I released version 3.1 yesterday. Please try it.
     
  49. sotokangr

    sotokangr

    Joined:
    Jun 3, 2010
    Posts:
    25


    Thanks a lot for that!! It is working. I can stream from Android and receive on Windows.


    I have another issue now:
    I am able to broadcast and receive when I have my devices connected to the same network.
    However, I cannot manage to receive the stream when my devices are connected to different networks.


    So my setup is: I have the webserver app running on an AWS Windows virtual machine.
    Then I start a broadcaster (modified a bit) from my Android phone.
    Next, I start a receiver (modified a bit) on my Windows PC.
    Everything works if the Windows PC and the Android phone are on the same Wi-Fi.
    But if I switch the Android phone to the mobile network and keep my Windows PC on the home Wi-Fi,
    then the receiver will not show the stream.

    I checked on the webserver app and all the traffic seems legit.

    Any idea...?
     
    Ibrahim-Th likes this.
  50. Ibrahim-Th

    Ibrahim-Th

    Joined:
    Apr 28, 2021
    Posts:
    3
    Hello everyone,

    I'm trying to run the broadcast example. If I run two clients, it runs smoothly on a local device. However, it shows a black screen if it runs remotely, and I cannot control the scene from the browser if I have more than two clients. Also, I have turned off the Windows Firewall.

    Update:
    It works when I upgrade Unity Render Streaming to the latest release, 3.1.0-exp.1.
    Environment:

    Win 10
    Unity 2020.3.7f1
    Unity Render Streaming 3.0.1-preview
    WebRTC 2.3.3
    Chrome Version 91.0.4472.77 (Official Build) (64-bit)
    HTTPS server

    The browser log

    PUT /signaling 200 0.371 ms - 52
    GET / 200 0.643 ms - 1674
    GET /css/main.css 200 0.601 ms - 1563
    GET /js/main.js 200 0.575 ms - 597
    GET /js/config.js 200 0.484 ms - 380
    GET /config 200 0.134 ms - 61
    GET /signaling/offer?fromtime=-29023 200 0.300 ms - 18878
    GET /signaling/answer?fromtime=-29023 200 0.256 ms - 14
    GET /signaling/candidate?fromtime=-29023 200 0.135 ms - 17
    POST /signaling/candidate 200 0.316 ms - 2
    POST /signaling/candidate 200 0.770 ms - 2
    POST /signaling/candidate 200 0.301 ms - 2
    POST /signaling/candidate 200 0.294 ms - 2
    POST /signaling/candidate 200 0.254 ms - 2
    POST /signaling/answer 200 0.308 ms - 2
    POST /signaling/candidate 200 0.297 ms - 2
    POST /signaling/candidate 200 0.284 ms - 2
    POST /signaling/candidate 200 0.274 ms - 2
    POST /signaling/candidate 200 0.292 ms - 2
    POST /signaling/candidate 200 0.715 ms - 2
    POST /signaling/candidate 200 0.259 ms - 2
    POST /signaling/candidate 200 0.279 ms - 2
    POST /signaling/answer 200 0.346 ms - 2
    POST /signaling/candidate 200 0.287 ms - 2
    POST /signaling/candidate 200 0.259 ms - 2
    POST /signaling/candidate 200 0.266 ms - 2
    POST /signaling/candidate 200 0.267 ms - 2
    POST /signaling/candidate 200 0.308 ms - 2
    POST /signaling/candidate 200 0.292 ms - 2
    POST /signaling/candidate 200 0.338 ms - 2
    POST /signaling/candidate 200 0.340 ms - 2
    POST /signaling/candidate 200 0.260 ms - 2
    POST /signaling/candidate 200 0.312 ms - 2
    POST /signaling/candidate 200 0.287 ms - 2
    POST /signaling/candidate 200 0.255 ms - 2
    GET /videoplayer/index.html 200 0.527 ms - 1370
    GET /signaling/offer?fromtime=1623658757000 200 0.142 ms - 13
    GET /signaling/answer?fromtime=1623658757000 200 0.163 ms - 14
    GET /signaling/candidate?fromtime=1623658757000 200 0.243 ms - 3010
    GET /css/main.css 200 0.976 ms - 1563
    GET /videoplayer/css/style.css 200 0.533 ms - 1448
    GET /videoplayer/js/main.js 200 0.493 ms - 4885
    GET /videoplayer/js/video-player.js 200 0.622 ms - 7576
    GET /js/register-events.js 200 0.559 ms - 12130
    GET /js/config.js 200 0.539 ms - 380
    GET /js/gamepadEvents.js 200 0.604 ms - 5306
    GET /js/logger.js 200 1.449 ms - 436
    GET /js/signaling.js 200 0.496 ms - 7688
    GET /config 200 0.215 ms - 61
    GET /videoplayer/images/Play.png 200 0.493 ms - 483163
    PUT /signaling/ 200 0.255 ms - 52
    GET /videoplayer/images/FullScreen.png 200 0.688 ms - 19164
    GET /signaling/offer?fromtime=1623658904594 200 0.156 ms - 13
    GET /signaling/answer?fromtime=1623658904595 200 0.101 ms - 14
    GET /signaling/candidate?fromtime=1623658904597 200 0.110 ms - 17
    POST /signaling/offer 200 0.316 ms - 2
    POST /signaling/candidate 200 0.242 ms - 2
    POST /signaling/candidate 200 0.908 ms - 2
    POST /signaling/candidate 200 0.308 ms - 2
    POST /signaling/candidate 200 0.223 ms - 2
    POST /signaling/candidate 200 0.404 ms - 2
    POST /signaling/candidate 200 0.303 ms - 2
    POST /signaling/candidate 200 0.233 ms - 2
    POST /signaling/candidate 200 0.166 ms - 2
    GET /signaling/offer?fromtime=1623658762000 200 0.263 ms - 10145
    GET /signaling/answer?fromtime=1623658762000 200 0.135 ms - 14
    GET /signaling/candidate?fromtime=1623658762000 200 0.164 ms - 17
    POST /signaling/candidate 200 0.307 ms - 2
    POST /signaling/candidate 200 0.315 ms - 2
    POST /signaling/candidate 200 0.360 ms - 2
    POST /signaling/candidate 200 0.326 ms - 2
    POST /signaling/answer 200 0.341 ms - 2
    POST /signaling/candidate 200 0.318 ms - 2
    POST /signaling/candidate 200 0.266 ms - 2
    POST /signaling/candidate 200 0.248 ms - 2
    POST /signaling/candidate 200 0.260 ms - 2
    POST /signaling/candidate 200 0.250 ms - 2
    POST /signaling/candidate 200 0.272 ms - 2
    POST /signaling/candidate 200 0.691 ms - 2
    POST /signaling/candidate 200 0.257 ms - 2
    GET /signaling/answer?fromtime=1623658766000 200 0.184 ms - 6231
    GET /signaling/candidate?fromtime=1623658766000 200 0.149 ms - 17
    GET /signaling/offer?fromtime=1623658767000 200 0.176 ms - 13
    GET /signaling/answer?fromtime=1623658767000 200 0.158 ms - 14
    GET /signaling/candidate?fromtime=1623658767000 200 0.195 ms - 1559
    GET /signaling/answer?fromtime=1623658769000 200 0.191 ms - 14
    GET /signaling/candidate?fromtime=1623658769000 200 0.142 ms - 17
    GET /signaling/answer?fromtime=1623658772000 200 0.168 ms - 14
    GET /signaling/candidate?fromtime=1623658772000 200 0.150 ms - 17
    GET /signaling/offer?fromtime=1623658772000 200 0.181 ms - 13
    GET /signaling/answer?fromtime=1623658772000 200 0.135 ms - 14
    GET /signaling/candidate?fromtime=1623658772000 200 0.190 ms - 17

    I really appreciate any help you can provide.
     
    Last edited: Jun 26, 2021