
[Release] FMETP STREAM: All-in-One GameView + Audio Stream + Remote Desktop Control(UDP/TCP/Web)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. girishd

    girishd

    Joined:
    Apr 25, 2010
    Posts:
    1
    Hi @thelghome,

    I'm having similar issues with the audio. I'm using a sample rate of 12000, StreamFPS of 3 and forcing mono but still get some "cracks" & gaps every second or two. Any other tips to solve this or any specific 3rd party JavaScript plugins you would recommend?
     
  2. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Multiple desktops/monitors are supported, but not per-app windows.
    By default, only the requested desktop will be shown and rendered. You can use multiple GameViewEncoders with different Desktop indices and label IDs, and decode them with matching label IDs on the decoder side.
     
  3. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    There was a minor bug related to reading the connection status across different threads. We added exception handling for it, as it's rarely triggered.
     
  4. kintovt

    kintovt

    Joined:
    Jan 5, 2017
    Posts:
    13
    Hi! Is 360 stream for youtube possible from unity?
     
  5. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    It's a work in progress; more news will be announced soon (hopefully this summer).
    We've already made progress on Android support in our internal tests; the next stage is iOS, Mac, and PC support.

    Some donations last year helped us do the research on Android. We are still looking for budget for the other platforms: iOS, Mac, and PC. Revenue from the Asset Store is still far too low and can't even cover our bills, so our development progress is unfortunately a bit slow. But we will do our best to invest our spare time and make all these features come true, targeting this summer or late 2023.

    To All FMETP STREAM users:
    Please consider donating via PayPal, which helps us accelerate development of the YouTube, H.264, and RTSP related features.
    donation link: https://paypal.me/FMSolution
     
  6. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    39
    Hello,

    1. AWS (general public) server
    2. Multi-Access Edge Computing (MEC) Server
    - An MEC server places cloud computing resources close to mobile devices in 5G mobile networks, reducing network latency and improving service performance.

    Screen streaming over WebSocket to HoloLens 2 is in progress from the above two servers.

    Using this asset: 1. it works fine with the AWS server, but 2. a problem occurs with the MEC server.

    The HoloLens 2 app is minimized the moment it is set to the MEC server and streaming is turned on.

    Even if it is maximized again, the FPS stays at 1, and it is minimized again after 3 to 5 seconds.

    1. With the MEC server, streaming works well in the Unity Editor.
    2. Streaming doesn't work on HoloLens 2 only when using the MEC server.
    3. REST API calls work fine on HoloLens 2 with the MEC server; only WebSocket streaming doesn't work.

    Please note the difference between the MEC and AWS server approaches:

    1. When accessing the public server, I connect to the instance's application via the instance's external IP.
    2. When accessing the MEC server, I must connect from a network with a specific IP. The MEC instance's application is therefore reached through a bastion host, the instance responsible for the MEC server's security.

    Do you have any guesses as to the cause?
     
  7. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    In general, it's most likely a port-forwarding issue if you have a bastion host.
    The key is how to reach the node.js server instance and route the data correctly back to your target app client.
     
  8. LichenShare

    LichenShare

    Joined:
    Mar 24, 2019
    Posts:
    2
    First of all, Thank You! FMETP Stream 2.0 is a core element of my current project. I found your work through the StereoPi project.
    I'm about to upgrade to FMETP Stream 3.0, and I see that you've released FM Network 3.0 - Congratulations! Should I expect that FMETP Stream 3.0 will soon be updated to include FM Network 3.0?
    Either way, I'm delighted to continue to support your work. I look forward in particular to additional codecs, as my primary use case previews & captures video (and soon still image) data streamed from a range of (currently Raspberry Pi & Android) sources.
    Thanks again,
    Mark Goldberg
    LichenShare Media
     
    Last edited: Feb 24, 2023
  9. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    You are welcome, and glad to know that it helps your projects too.
    We don't have a specific plan for merging FM Network 3.0 yet; we will probably keep it separate for a while first.

    - FM Network 2.0 is now considered LTS, receiving minor bug fixes only.
    - FM Network 3.0 is a new release focusing on the networked object sync feature for scene objects.
    It should be more efficient than 2.0: you can choose position, rotation, or scale, and set the SyncFPS per GameObject.

    Meanwhile, we are still working on more codecs. Though time and budget are very limited, we will do our best to make it happen asap.

    PS: If you are satisfied with our product, please consider writing a review on the store. It means a lot to us :)
     
  10. LichenShare

    LichenShare

    Joined:
    Mar 24, 2019
    Posts:
    2
    Thanks so much for this prompt & informative reply! :)
     
  11. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    39
    I have successfully connected the following configuration using FM Network 3.0 assets:

    • Admin app: 1 Android Galaxy Tab 8 device
    • Content app: 20 Android Oculus Quest 2 VR devices
    The admin app sends simple commands to the 20 VR devices, and screen streaming is done one VR device at a time.

    In this environment, how many more content app (VR) devices can be added?
    (We want to increase it up to 100 units. Is that possible?)
     
  12. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Thanks for sharing your test results with us.

    To be honest, we don't have 100 units to test with, but I imagine the bottleneck would be your router's connection capacity.

    I suggest sending each command 5 times to your 20 units to simulate 100 devices, and checking whether there is any performance drop or packet loss.

    PS: Your result is very helpful to us too, as we have another project with 20+ iPad minis in a 360 projection/LED wall system. (Ordered but not arrived yet... at least you have a proven result with 20 devices.)
     
  13. LeonLai

    LeonLai

    Joined:
    May 9, 2021
    Posts:
    1
    Hello, may I ask if there is support for RTSP streaming?
    I want to connect to rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4, but I can't find any way to do it.
     
    Last edited: Mar 22, 2023
  14. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Unfortunately, RTSP is still a work in progress and not supported yet.
     
  15. yme2878

    yme2878

    Joined:
    Apr 7, 2020
    Posts:
    3
    I am using FMETP in Unity 2020.3.12 URP.
    When I play scene A, the data transfers fine.
    However, after playing scene B, when I load scene A again, the encoder preview image turns white as shown in the attached picture, and the Render Texture is no longer transferred. (Text is transferred without any problem.)

    Do I need to change the settings of UnityEngine.RenderTexture?
    Or is there another solution?
     
  16. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    When you switch scenes, please check whether any component is destroyed.
    For example, the referenced Render Cam, Main Cam, or Network Manager might be destroyed when you unload one of your scenes.
     
  17. yme2878

    yme2878

    Joined:
    Apr 7, 2020
    Posts:
    3
    [Demo_NetworkingMain -> Demo_FMNetworkStreamingRenderCam]
    Since the scene change is triggered by a button press, the Render Cam, Main Cam, and Network Manager are loaded normally without being destroyed.

    What we know now is that with the OpenVR Loader enabled, switching scenes causes the problem.
    With the OpenVR Loader disabled, we can switch scenes without any issues.

    But I need a solution because I have to use a VR device.
     
  18. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    As far as I know, many customers use our plugin in VR; it shouldn't be a problem.

    Could you please specify the exact trouble? Are those components destroyed or not with the OpenVR Loader?

    Just my guess:
    When you switch scenes and only one camera is left, OpenVR might have forced your render camera to become the VR camera.
     
    Last edited: Mar 30, 2023
  19. yme2878

    yme2878

    Joined:
    Apr 7, 2020
    Posts:
    3
    Upon further checking, the same issue occurs when installing FMETP and SteamVR (VR Loader enabled) in a new project, regardless of Unity version.

    Switching from Demo_NetworkingMain to Demo_FMNetworkStreamingMainCam works, as in the attached image.

    However, when I switch to Demo_FMNetworkStreamingRenderCam, it doesn't work.

    I want to use the RenderCam feature.
    Is there anything I should keep in mind when using RenderCam mode in a VR environment?
     
  20. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    You have to assign a separate render camera when using RenderCam mode.

    By the way, have you tried our Quest 2 template on GitHub? It should be compatible with MainCam mode.
    https://github.com/frozenmistadventure/fmetp_tutorial_questvr_stream
     
  21. Shamantiks

    Shamantiks

    Joined:
    Mar 11, 2022
    Posts:
    10
    I'm looking for a solution to render a non-primary virtual cam using Cinemachine into a standalone-build recorder such as Nvidia Shadowplay or OBS Studio (Unity in-editor recording solutions won't work for me). I need one cam following my player during actual single-player gameplay, and a second VC watching from nearby to record the action. Is it possible to render two selectable Cinemachine VCs from one standalone Unity build into OBS Studio/Shadowplay as separate sources using FMETP Stream?
     
  22. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    If OBS supports MJPEG (UDP) source input, it should be compatible; that's the simplest solution.

    Meanwhile, there is another interesting use case mentioned by other customers:
    1) display the stream on an HTML webpage
    2) have OBS capture that webpage as a source input
     
  23. darkmax

    darkmax

    Joined:
    Feb 21, 2012
    Posts:
    83
    I'm also interested in the RTSP protocol. Do you have an estimated time frame or date for when it will be implemented?
     
  24. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Thanks for letting us know your thoughts; we will try to push it to as high a priority as possible.
    We can't promise a release date yet, but it's still on our 2023 to-do list.

    *We've made good progress on Android, but we would rather announce this feature as an all-in-one cross-platform solution. That means we need time to cover the other platforms (at least Mac/PC/iOS/Android).

    PS: We are a small team of developers creating this live-stream solution in our spare time.
    The only way to help us boost development is via PayPal donation, as Asset Store income is honestly very minimal.
    https://paypal.me/FMSolution
     
    Last edited: Apr 18, 2023
  25. darkmax

    darkmax

    Joined:
    Feb 21, 2012
    Posts:
    83
    OK, it's good to hear that it's on the to-do list for this year.
    I hope it will also be available for UWP/HoloLens. In my current HoloLens setup (tested using WebSockets and a webpage), when several people are on the same call the latency increases badly, to the point of being unusable. After a latency spike it doesn't seem to recover: the voice call and video stay delayed the whole time, until the server-client connection is reset.
     
  26. ludm80

    ludm80

    Joined:
    Mar 10, 2020
    Posts:
    6
    Hello,

    Thank you for this great asset.

    I have a question. I have set up FMETP on a LAN like this:
    one server on a PC or phone,
    clients on VR headsets.
    When several clients send their headset views at the same time, is it possible to choose which client to display?
     
  27. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    39
    Hello,

    I want to send Korean characters over UDP. Currently, only English strings work. Please show me sample code to send Korean directly!
     
  28. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    It's possible, and a common use case too (as a monitor app).
    On the server, you can get the list of connected clients with their IP info.
    You can then write your own script that sends string messages to trigger those features.
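    As a rough, untested sketch of that idea: the ConnectedClients/IP fields below match the FMNetwork API shown later in this thread, but the SendToTarget call and the "START_STREAM" command string are assumptions — please check your FMNetworkManager for the exact send method name.

```
using UnityEngine;

public class StreamMonitor : MonoBehaviour
{
    // Ask one specific client (by list index) to start streaming its view.
    public void RequestStream(int clientIndex)
    {
        var clients = FMNetworkManager.instance.Server.ConnectedClients;
        if (clientIndex < 0 || clientIndex >= clients.Count) return;

        string targetIP = clients[clientIndex].IP;
        // Hypothetical command string; your own client-side script decides what it means.
        FMNetworkManager.instance.SendToTarget("START_STREAM", targetIP);
    }
}
```

    On the client side, register a string-message listener that enables its GameViewEncoder when this command arrives.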
     
  29. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    By default, the message is encoded as ASCII.
    Do you think Unicode support would solve your problem? (It would increase the bandwidth, though.)

    Meanwhile, an alternative is to convert your Korean string like this:
    Code (CSharp):
    using System;
    using System.Text;

    string inputMessage = "Hello FMETP! 안녕하세요 FMETP!";
    string EncodedMessageAscii = Convert.ToBase64String(Encoding.Unicode.GetBytes(inputMessage));
    string DecodedMessage = Encoding.Unicode.GetString(Convert.FromBase64String(EncodedMessageAscii));
    Debug.Log(EncodedMessageAscii);
    Debug.Log(DecodedMessage);
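    To show why this works without Unicode support in the transport, here is a plain C# round trip (no Unity APIs): plain ASCII encoding replaces each Korean character with '?', while Base64 of the UTF-16 bytes is pure ASCII and survives an ASCII-only channel intact.

```csharp
using System;
using System.Text;

class Base64UnicodeDemo
{
    static void Main()
    {
        string input = "Hello FMETP! 안녕하세요 FMETP!";

        // Plain ASCII encoding loses every non-ASCII character ('?' replacement).
        string asciiRoundTrip = Encoding.ASCII.GetString(Encoding.ASCII.GetBytes(input));
        Console.WriteLine(asciiRoundTrip == input); // False

        // Base64 of the UTF-16 (Unicode) bytes is pure ASCII, so it survives the transport.
        string encoded = Convert.ToBase64String(Encoding.Unicode.GetBytes(input));
        string decoded = Encoding.Unicode.GetString(Convert.FromBase64String(encoded));
        Console.WriteLine(decoded == input); // True
    }
}
```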
     
    Last edited: Apr 21, 2023
  30. raymondtsang_amvr

    raymondtsang_amvr

    Joined:
    Feb 26, 2019
    Posts:
    6
    Hi, my current iOS build cannot show JPG images after importing this plugin.
    I found it is related to turbojpeg, but I cannot find any solution for it...

    Debug message from Xcode:
    JPEG parameter struct mismatch: library thinks size is 632, caller expects 600
     
    Last edited: Apr 26, 2023
  31. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    You may use our JPEG reading method instead.
    https://frozenmist.com/docs/apis/fmetp-stream/jpeg-encoder-example/
    Code (CSharp):
    ReceivedTexture2D.FMLoadJPG(ref ReceivedTexture2D, inputByteData);
     
  32. raymondtsang_amvr

    raymondtsang_amvr

    Joined:
    Feb 26, 2019
    Posts:
    6

    I don't know how to rewrite this using your encoder/decoder:
    Code (CSharp):
    UnityWebRequest request = UnityWebRequestTexture.GetTexture(MediaUrl);
    yield return request.SendWebRequest();
    var projectImageTexture = ((DownloadHandlerTexture)request.downloadHandler).texture;
     
  33. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662

    You can download the byte[] instead of using GetTexture().
    Reference link: https://forum.unity.com/threads/www-is-obsolete-use-unitywebrequest-how-to-get-a-byte-array.556564/

    Code (CSharp):
    UnityWebRequest www = UnityWebRequest.Get(url);
    yield return www.SendWebRequest(); // www.Send() is obsolete in recent Unity versions

    // www.downloadHandler.data is the byte array
    byte[] inputByteData = www.downloadHandler.data;
    ReceivedTexture2D.FMLoadJPG(ref ReceivedTexture2D, inputByteData);
    I haven't tested the above script yet, but the idea should be the same.
     
  34. Yujiyeon

    Yujiyeon

    Joined:
    Apr 28, 2022
    Posts:
    1
    Hello,

    I am developing a VR service using Vive with Unity 2020.3.
    I want to use the SteamVR asset together with your FMETP Stream asset.
    The sender project includes the SteamVR asset; the receiver project has nothing to do with it.

    A problem arises: when testing the sender (server) with the SteamVR asset, the encoder works normally when the Editor starts, but the encoder preview turns white after reloading the scene and it no longer streams.

    I think I should use RenderCam mode because I use a Vive instead of an Oculus. I'd like you to test this yourself. Is there a solution?

     
  35. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    It depends on your needs: if you want to share the main VR camera, use "MainCam" mode with the main VR camera selected.
    If you want to share a third-person view, "RenderCam" mode with an extra camera (non-VR, no target display) is required.

    I'm not sure how to reproduce your bug on my side. Does this only happen with SteamVR imported?
    My first thought (if you are using RenderCam mode):
    Check your camera settings: set the target display to none and the depth index to -10, and make sure it's not rendering to the VR lenses. Since it only shows white after you reload the scene, the camera is likely being claimed accidentally as a VR stereo (eye) camera. (This may be an editor bug too.)

    To verify, it's better to use a newer version of the Unity Editor or build the app.
    *Some AR/VR-related bugs tend to resolve themselves in newer Unity versions.
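    If it helps, those settings can also be enforced from a small startup script. This is an untested sketch using standard Unity Camera properties; `renderCam` is whatever camera you assigned to the encoder's RenderCam mode, and since Unity has no literal "none" target display, parking the camera on an unused display index is the usual workaround.

```
using UnityEngine;

public class RenderCamGuard : MonoBehaviour
{
    // Assign the camera used by the encoder's RenderCam mode in the Inspector.
    public Camera renderCam;

    void Awake()
    {
        renderCam.targetDisplay = 7;                          // park it on an unused display index
        renderCam.depth = -10f;                               // render before the main/VR camera
        renderCam.stereoTargetEye = StereoTargetEyeMask.None; // never let VR claim it as an eye camera
    }
}
```

    Running this after every scene load should stop the camera from being re-claimed as a stereo camera.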
     
  36. wechat_os_Qy02fzyOGnzWoIhqek1ZhmB5s

    wechat_os_Qy02fzyOGnzWoIhqek1ZhmB5s

    Joined:
    Aug 6, 2020
    Posts:
    3
    Hi, how do I find the other clients/servers connected to the same server and get their IDs (or label IDs)?
     
    Last edited: May 9, 2023
  37. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Here is an example of getting the IP or WebSocket ID directly:
    Code (CSharp):
    string _clientIP = FMNetworkManager.instance.Server.ConnectedClients[0].IP;
    string _serverIP = FMNetworkManager.instance.Client.ServerIP;
    string _clientWSID = FMWebSocketManager.instance.ConnectedClients[0].wsid;
    Meanwhile, you can register events to monitor the connection status. Below are the public events available on both the FMNetwork and FMWebSocket systems.
    Code (CSharp):
    public UnityEventString OnFoundServerEvent = new UnityEventString();
    public UnityEventString OnLostServerEvent = new UnityEventString();
    public UnityEventString OnClientConnectedEvent = new UnityEventString();
    public UnityEventString OnClientDisconnectedEvent = new UnityEventString();
    public UnityEventInt OnConnectionCountChangedEvent = new UnityEventInt();
     
  38. wechat_os_Qy02fzyOGnzWoIhqek1ZhmB5s

    wechat_os_Qy02fzyOGnzWoIhqek1ZhmB5s

    Joined:
    Aug 6, 2020
    Posts:
    3

    Thank you very much for your answer. Since I need to use a cloud server for remote communication, I need to use FMSocketIOManager for the connection. Does FMSocketIOManager have an interface similar to ConnectedClients to get the IP?
     
  39. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Unfortunately, the FMSocketIOManager system doesn't support those features.
     
  40. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    You can use MicEncoder to stream microphone input.

    We haven't integrated desktop system-sound capture yet, but we will work on it.
    PS: The alternative is routing your system sound as a mic input for Unity.
     
  41. rlarbwla113

    rlarbwla113

    Joined:
    Aug 26, 2022
    Posts:
    5
    Thank you for your answer.
    So desktop-to-desktop system sound cannot be transmitted.
    Is microphone communication between desktops possible if I use the MicEncoder?
     
  42. uaena

    uaena

    Joined:
    Apr 16, 2015
    Posts:
    12
    Main Camera capture mode does not work in URP. I tested on both 2021.3.11 and 2022.1.22, with URP 12.1.7 and 13.1.8 respectively. I checked the preview and it keeps returning an empty image, while RenderCam and full-screen modes work.
     
  43. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Yes, simply use MicEncoder and AudioDecoder for this purpose.
     
  44. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    We will look into those versions further.
    May I know if you are running VR MainCam mode in URP?
     
  45. uaena

    uaena

    Joined:
    Apr 16, 2015
    Posts:
    12
    Yes, I also tested with the QuestVR Stream Tutorial
     
  46. rlarbwla113

    rlarbwla113

    Joined:
    Aug 26, 2022
    Posts:
    5
    Thank you for your answer.
    I haven't fully mastered how to use it yet, so I have one more question.
    Looking at the parameters, there's a label. Can I understand it simply as a room concept?
    Do I have to set the same value as the person I want to communicate with?
    e.g. Desktop (player1, 5467) - Desktop (player2, 5467)
    Should my label value and the other person's be the same?
     
  47. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    About URP with Quest 2: there are some conflicts when grabbing the main (VR) camera frame via MainCam mode. It's unfortunately a hardware-specific issue and a known bug for now; we are still looking for a solution.
     
  48. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    You can think of the label ID as a key number that pairs an encoder with a decoder.
    For example:
    Player 1:
    MicEncoder (label: 1234)
    AudioDecoder (label: 2234)

    Player 2:
    MicEncoder (label: 2234)
    AudioDecoder (label: 1234)

    In this scenario, Player 1's mic (1234) streams to Player 2's AudioDecoder (1234).
    Conversely, Player 2's mic (2234) streams to Player 1's AudioDecoder (2234).

    If you have more than two players, you need more labels to pair those streams.
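    The pairing rule above can be modelled in plain C# (no FMETP APIs; all names here are illustrative only): data sent under a label is delivered only to the decoders registered with the same label.

```csharp
using System;
using System.Collections.Generic;

class LabelRouter
{
    // label ID -> decoder callbacks registered under that label
    readonly Dictionary<int, List<Action<byte[]>>> decoders =
        new Dictionary<int, List<Action<byte[]>>>();

    public void RegisterDecoder(int label, Action<byte[]> onData)
    {
        if (!decoders.TryGetValue(label, out var list))
            decoders[label] = list = new List<Action<byte[]>>();
        list.Add(onData);
    }

    // An encoder sends data tagged with its label; only matching decoders receive it.
    public void Send(int label, byte[] data)
    {
        if (decoders.TryGetValue(label, out var list))
            foreach (var onData in list) onData(data);
    }
}
```

    With Player 1's MicEncoder sending under 1234 and Player 2's AudioDecoder registered under 1234, only that decoder receives the audio; the reverse direction uses 2234.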
     
  49. rlarbwla113

    rlarbwla113

    Joined:
    Aug 26, 2022
    Posts:
    5
    Thank you very much for your reply.
    While talking, we can hear the voice, but sometimes you hear the voice disconnected. What should we do?
    Can you tell me?
     
  50. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    662
    Could you please specify what you mean by "you hear the voice disconnected"?