
[Release] FMETP STREAM: All-in-One GameView+Audio Stream (UDP/TCP/WebSockets/HTML)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. jfrs20

    jfrs20

    Joined:
    Jul 8, 2020
    Posts:
    7
    Hi,

    so I bought the package but I'm not too happy with it so far. I do not see how it works for me (without analysing code). It seems the documentation does not match the files inside.

    When importing the package it wants me to overwrite my project files. Why does it do that? Shouldn't it be packaged without project files?

    There are a couple of scenes (Demo_FMNetworkPCStream, Demo_FMNetworkStreaming, Demo_FMNetworkStreamingMainCam, Demo_FMNetworkBasic, ...) which all look similar. How do I know what they are supposed to do? Not to mention the archived scenes.

    I should start with "Demo_NetworkingMain" - but what can I do with it?

    There are other places where identifiers and names are not really clear to me.

    Where does the "Network Manager" fit in the overview picture? How could it be replaced by another networking system? I already have a working Websocket.
    What does it mean: "Connect as server", "Connect as client"? Connect to where?
    What does it mean: "SendToAll, SendToOthers, SendToServer, SendToTarget"?

    Also, the installation of TestServer.zip (which is actually TestServer_v2.0.0.zip) is messed up. Why is there no package.json? Why install socket.io without a package.json? Running "npm init" after "npm install socket.io" does not make sense.

    Which components and files are necessary for simply streaming the view of a camera? Am I right that it's streaming MJPEG? Am I right that I should use WebSockets? How can I exchange messages in parallel with the GameView stream in order to control the camera? Lots of questions.
     
    Last edited: Sep 29, 2021
  2. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    Sorry for any confusion this caused.
    Most of our step-by-step tutorials can be found on our YouTube channel. They are easy to follow, and no coding is needed.


    For new users, we recommend the AR Live Streaming tutorial. The inspector shown there may be slightly outdated, but we are working on a new series of tutorials for FMETP STREAM V2 at the moment.


    As our package is published in the "Unity Template" category, including the project settings is a publishing requirement.
    The Unity Asset Store team requires those settings to be added, and we have no option to exclude them.
    (If we moved our template to another category, the license agreement would charge you "per seat", which would generally cost a team more.)


    However, you can untick the project settings items when you import the package.

    We included demos for two major networking systems: FMNetworkUDP (for LAN) and FMWebSocket (for Internet streaming).

    For FMWebSocket and the node.js setup, this is the most straightforward setup on a new computer.
    Running npm init before installing express is important; otherwise you will get an error, as expected.
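    As a generic sketch of that setup (the folder name and the `server.js` filename here are just examples; the actual server script ships in the asset's TestServer.zip), the usual order on a fresh machine is:

```shell
mkdir fm-server && cd fm-server    # example folder name
npm init -y                        # create package.json first
npm install socket.io express      # dependencies used by the bundled server script
node server.js                     # server.js comes from the asset's TestServer.zip
```

    Running `npm install` before `npm init` can leave the installed dependencies without a `package.json` to record them, which is where the error mentioned above comes from.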


    For technical support, please feel free to reach us via email: thelghome@gmail.com

    Edit:
    Online docs & APIs are also available on our webpage now:
    https://frozenmist.com/docs/apis/
     
    Last edited: Oct 7, 2021
  3. FraPev

    FraPev

    Joined:
    May 19, 2021
    Posts:
    5
    Hi! I'm trying to figure out if your asset can help me. I need to send the view of N cameras (more than three) from a Unity server to a Python client on another device in real time, using socket communication. Does your asset allow this in real time? Does it allow streaming to another device's display, or only to websockets?
     
  4. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    It's possible to stream multiple cameras by assigning a different label ID to each encoder/decoder pair.
    For local networks, we have the FMNetworkUDP system; the FMWebSocket solution targets Internet streaming.

    We don't include a Python client example, but some customers have created their own Python UDP wrappers.
    They refer to the data structure of our C# networking system and decoder for devices like the Raspberry Pi, etc.
     
  5. Legendary_keith

    Legendary_keith

    Joined:
    Apr 21, 2015
    Posts:
    17
    Hello @thelghome,

    I got 2 errors right after I imported this asset (see the images below).
    It says that these websockets (Photon 2 & FM) both have the same name, but if I change it, I get a lot more errors.

    What should i do?

    Thanks in advance,
    Keith
     

    Attached Files:

  6. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    It's a common issue caused by a duplicated library.
    For PUN2 users, you may consider removing the FMWebSocket folder completely.

    Otherwise, the alternative is removing PUN2's websocket-sharp lib.

    Both should work.
     
  7. Lordmin

    Lordmin

    Joined:
    Mar 9, 2017
    Posts:
    62
    We want one presenter to share a screen with hundreds of spectators.

    For example, the screen could be video streaming, it could be a ppt file, or it could be a word file.

    The presenter will proceed with the presentation by sharing the screen.

    In this way, when one person shares a screen, such as a video or a Word document, with several people,

    how many people can it be shared with in real time?

    Our initial target was from 1 to 1,000 people.

    The platform will be released cross-platform on Android, iOS, and Windows 10.
     
  8. FraPev

    FraPev

    Joined:
    May 19, 2021
    Posts:
    5
    Good morning, I was actually referring to your FMETP STREAM V2 asset. I bought the "FMNetworkUDP system" asset, but it only sends one piece of data at a time between a server and a client, and I need to make a "video" stream of the virtual cameras. Right now I render what the cameras see to render textures, encode the frames as PNG, and send them to the client (Python), which shows them on the display of another computer; the problem is that I have a latency of about 200 ms. I was wondering if your asset could help with this. Does it allow me to send frames/images in real time to another PC (not on localhost)?
    Thanks in advance
     
  9. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    We don't have a limit on the connection count; it depends on your cloud hosting plan and server capability.
    Our tool lets you stream a PC desktop and anything you put in a Unity game scene.
     
  10. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    This demo video should give you some idea of our latency performance:


    FM Network UDP is for local networks.
    FMETP STREAM includes FM WebSocket for Internet streaming support.
    Multi-camera streaming is also supported, and our latest Point Cloud 3D Stream (beta) is included in this package.

    You may consider upgrading to FMETP STREAM for better streaming performance.
     
  11. dvi_unity

    dvi_unity

    Joined:
    Sep 28, 2021
    Posts:
    4
    Hi,

    I bought the asset and tested streaming from an Oculus Quest 2 to the Unity Editor by following the steps in the YouTube tutorial, and it works like a charm with UDP. But my goal is to stream from the Quest 2 to WebGL, to integrate with React later. So I have to move to socket.io, but I cannot find a tutorial for setting that up.

    Would you release a tutorial for setting up streaming from the Quest 2 with socket.io? I only found the older one; I tried to adapt it to the current version but could not make it receive the stream from the Quest 2.
     
  12. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    No problem; we may consider releasing another template specific to the WebSocket solution in the near future.
    For now, the major change is switching FMNetworkManager to FMSocketIOManager.

    Please be reminded that you will need to host a node.js server by following our tutorial:
    https://frozenmist.com/docs/apis/fm-websocket/

    You will also have to change FMSocketIOManager's Server IP to the IP of your node.js server. The default is "127.0.0.1", but that is not reachable from the Quest 2; you have to find your node.js server's IP and use it instead.
     
    dvi_unity likes this.
  13. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
  14. dvi_unity

    dvi_unity

    Joined:
    Sep 28, 2021
    Posts:
    4
    Thanks for the reply. I just had time to test: I tried changing the IP to the IP of my local machine and ran the node server locally, as you mentioned. It seems the Quest 2 connected(?), but I didn't see any video on the demo receiver.

    Have I done anything wrong? Or do I have to host the node server on a real server rather than localhost? (Sorry, I'm not good at backend; just a newbie.)

    Also, have I set anything wrong? I'm not sure which one should be the server and which the client to stream from Quest 2 to WebGL.

    Moreover, can the video streaming quality be adjusted?
    Currently, the video quality is quite poor and lower than 720p.

    Thank you so much for your quick response.
     
  15. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    You could follow this tutorial; the setup is very similar: GameViewEncoder -> FMSocketIOManager


    For your testing, you can set your Quest 2 as the Server in FMSocketIOManager.
    In the WebGL build, you can set it as the Client.

    If it works, you should always be able to reach the HTML demo via http://yourIP:3000
    In the demo HTML, you have to type your node.js server IP manually; then you will see the stream from the Quest 2 immediately.

    The WebGL build requires hosting on a standard web server (either locally on the same Wi-Fi, or on the internet).
    Personally, I test my WebGL builds via XAMPP or MAMP.

    The stream resolution, compression quality, and stream FPS are all adjustable at runtime.

    If it works in local testing but not via Wi-Fi, the most likely fix is disabling the firewall in your system settings.

    For ease of troubleshooting, please always check our demo scene first. This is important for us to understand and investigate the issue, and helps us solve it sooner.

    PS: please feel free to write us an email for technical support: thelghome@gmail.com
     
    Last edited: Oct 24, 2021
  16. Nasrul97

    Nasrul97

    Joined:
    Jun 23, 2021
    Posts:
    2
    Hi, I'm trying to add this to my HoloLens 2 project. Does this asset work with OpenXR and newer SDKs? Thanks.
     
  17. zhishi

    zhishi

    Joined:
    Oct 10, 2018
    Posts:
    2
    Hello, how do you capture the camera picture? I'm a novice and would appreciate your guidance.
     
  18. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    We haven't found any conflicts yet, so it should be compatible.
    You can either set up another render camera or use the default Main Camera for the stream.

    The logic is very simple:
    (RenderCam, or MainCam) -> GameViewEncoder -> NetworkManager -> ...other side... -> GameViewDecoder
     
  19. Nasrul97

    Nasrul97

    Joined:
    Jun 23, 2021
    Posts:
    2
    Okay, but I've encountered an annoying problem when testing this asset with my setup. The HoloLens screen flickers continuously when I run the app; however, on the desktop side, it works fine. I've tried both the UDP and WebSocket methods, but the result is the same. I also changed the capture mode from the Main Camera to RenderCam, but it had no effect. Any idea why? Here's a link to my recorded HoloLens screen: https://drive.google.com/file/d/1dlHYUeRZQGSZNPukhzz39TyMUHaFDwNd/view?usp=sharing
    I can send you my project files if it's needed.

    My Setup:
    - Unity 2020.3.21
    - MRTK Foundation 2.7.2
    - Mixed Reality OpenXR Plugin 1.1.1
    - Win 10 SDK 10.0.19041
    - Hololens 2 (OS ver 20348.1432)
     
    Last edited: Oct 28, 2021
  20. zhishi

    zhishi

    Joined:
    Oct 10, 2018
    Posts:
    2
    Hello, how can I set up the targetprojectmatrix script? I added this script to my RenderCam, but I can't see the camera's picture after building and publishing.
     
  21. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    You have to assign your target camera and reference camera to the component in the Inspector.
    They shouldn't be null.
     
  22. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    This looks like a camera Clear Flags issue; you may try different options in the Camera settings.
    Would you mind sending us an example project that reproduces this issue? We'd like to investigate it.

    tech support: thelghome@gmail.com
     
  23. filmengineers

    filmengineers

    Joined:
    May 14, 2019
    Posts:
    2
    Quick questions:
    1. Does FMStream use multithreading to avoid FPS drops during gameplay?
    2. Can we stream a Render Texture instead of a RenderCam / Camera?
    3. Can we add metadata to each streaming frame that is being sent?
    Thanks in advance.
     
  24. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    A1: Yes, we have a multi-threaded encoding option in our encoders.
    A2: Yes, TextureEncoder is included for this purpose; it's also compatible with webcam textures and Texture2D input sources.
    A3: It's possible; you can modify our encoder and decoder scripts, which are written in C#.

    Hope this addresses your concerns.
     
    filmengineers likes this.
  25. inod_clement

    inod_clement

    Joined:
    Nov 9, 2016
    Posts:
    15
    Hi!
    I've had the same problem without FMETP; it's a Unity bug in Unity 2020.3.21...
    upload_2021-11-9_17-1-40.png
     
    thelghome likes this.
  26. Tubbritt

    Tubbritt

    Joined:
    Nov 30, 2015
    Posts:
    35
    Would it be possible to add this feature on Android?
    That is, to be able to stream anything a user does on an Android device to a VR headset.

    Or is there a way to receive a stream from a standard service found on Android devices, such as Screen Mirror or Smart View?
     
    Last edited: Nov 14, 2021
  27. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    There are some technical concerns, but we will investigate the possibility.
     
  28. Tubbritt

    Tubbritt

    Joined:
    Nov 30, 2015
    Posts:
    35
    Thank you for considering it.

    Regards
    James
     
  29. WilliamDrye

    WilliamDrye

    Joined:
    Jun 30, 2020
    Posts:
    7
    Hello, I used this asset to stream the view of a VR application (running on Oculus Quest 2) to a UWP application. It worked well enough, but I had to use an additional camera (the GameViewEncoder's RenderCam), which obviously meant I had to sacrifice a lot of framerate.
    Another customer is asking for the same feature, but this time I'm quite reluctant to use FMETP again due to the performance loss caused by the RenderCam. Is there any way to stream a VR headset's view without adding a camera?
     
  30. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    In GameViewEncoder, there is a MainCam Mode option.
    You may try this mode and compare performance. In this mode, you can also test the difference by assigning either centerEye, leftEye, or rightEye.
     
    Last edited: Nov 16, 2021
  31. WilliamDrye

    WilliamDrye

    Joined:
    Jun 30, 2020
    Posts:
    7
    Thanks. If I remember correctly, we tried this already, but there was a VR-related bug when using MainCam mode. However, we were using the old asset (V1), and I just realized you published V2 on the Asset Store. I'm going to ask my boss if we can purchase it to see if it works now.
     
    Last edited: Nov 16, 2021
  32. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    In V2, we've fixed the VR single-pass bug on the Quest 2. You may consider it if this relates to your previous issue.
     
  33. VastnessVR

    VastnessVR

    Joined:
    Nov 21, 2017
    Posts:
    24
    Hi,
    I have been using your product for a while, and I must say it is brilliant.
    I have one issue where I am using FMWebSockets:
    it works fine, but every time I send something, it comes out twice on the other end.
    E.g. I send a string to the server, and the server gets 2 of the same string.

    Any ideas?
     
  34. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    It seems likely that you added "OnReceivedStringDataEvent(String)" twice via script, or actually sent it twice accidentally.
     
  35. VastnessVR

    VastnessVR

    Joined:
    Nov 21, 2017
    Posts:
    24
    Thanks,
    I tested with the demo program and that worked fine,
    but I have searched and cannot find a second "OnReceivedStringDataEvent(String)".
    I even changed the name to "OnReceivedStringDataEventFM(String)", but still no luck.
    I might just send that part as bytes instead.
     
  36. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    [updates] FMETP STREAM v2.120 is available.
    - native Apple Silicon (M1) support (tested on M1 and M1 Max CPUs)
    - upgraded encoder lib
     
  37. MetaJordan

    MetaJordan

    Joined:
    Nov 16, 2021
    Posts:
    1
    Hello, your plugin has been amazing for my project; great work! Just curious, though: is macOS desktop support still on your to-do list?
     
  38. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    It's making good progress; we've technically achieved macOS desktop capture and streaming.
    Unfortunately, we still have concerns about whether to add this new feature to FMETP STREAM soon. We've noticed an increasing number of users using it illegally without purchasing a license, which obviously affects our revenue too.

    There are a few thoughts at the moment
    (ideally, it has to be able to support our team maintaining it regularly in the long term):
    1. embed this feature in FMETP STREAM V2 as a regular update
    2. provide an add-on plugin for those who want to buy it if needed
    3. bundle it with other features in a major update (V3, etc.)
    4. seek funding for adding this feature
    5. or just charge an implementation fee for projects as freelance work

    Thus, if you have any feature requests, you may reach us via email and we'll see what's the best way we can help.
    technical support: thelghome@gmail.com
     
    Last edited: Dec 9, 2021
  39. kirkokuev

    kirkokuev

    Joined:
    Aug 9, 2014
    Posts:
    13
    Hello, can you please advise on my problem?
    I have streaming implemented with your asset over a LAN connection with UDP.
    I have the following configuration:
    Pico VR device (FM Network Manager as Server, sendToOthers) > Android tablet (FM Network Manager as Client)
    The Pico VR device captures the camera view and sends it to the Android tablet.
    This works perfectly the first time, but when I stop streaming and re-enable it without closing the app, it shows only a white texture. The only way to re-enable streaming is to restart the server application.
    When I enable debug output on the client, it shows "is connected: true" and the byte count increasing, but the streaming output is just white.
    Sometimes, on trying to reconnect, it starts switching between true and false without showing streaming output on the client.
     
  40. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    The simple way to stop and resume the stream is by disabling and re-enabling the encoder component.
    If you have an unstable connection, please make sure you have only one FMNetworkManager in the scene, that both devices are on the same local network, and that your Wi-Fi is stable.
     
  41. kirkokuev

    kirkokuev

    Joined:
    Aug 9, 2014
    Posts:
    13
    Ok, I think I solved the issue. It seems that
    FMNetworkManager.Action_InitAsServer();
    adds more and more servers on each call, and this stalls the streaming after two or more components.
    I added a check for an existing component, and now I remove servers when the connection is disabled.
     
  42. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    Glad to hear it's been solved.

    In general, it's not necessary to call init twice.
    If you don't need to change the server/client type, the simplest way to pause/resume the networking system is disabling/re-enabling the FMNetworkManager GameObject.

    In our latest version, we could not reproduce the bug of invoking "FMNetworkManager.Action_InitAsServer();" multiple times.
    If you have an example scene that shows this bug, we'd like to investigate it further.

    technical support: thelghome@gmail.com
     
  43. DrEchoes

    DrEchoes

    Joined:
    May 23, 2013
    Posts:
    1
    Hi,
    I've been using the FMETP Stream package with Hololens devices and it works great!

    I'm using the local network UDP architecture, and I was just wondering about the large file example and the string data events. I saw you mentioned earlier that it was reliable, but is it really? I guess on a local network there won't be many issues, but since the underlying technology is UDP, I'm wondering if I should implement my own TCP connection on top of it to make sure any critical data is delivered correctly.

    Thanks in advance!
     
  44. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    FMNetworkUDP is optimised UDP that cuts large data into small chunks, but it's still not a reliable-UDP solution.
    For critical data, we suggest you use TCP instead.
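    As a generic illustration of that suggestion (not FMETP-specific code; ports and message contents are made up), a reliable side channel for critical control messages can be a plain TCP socket running alongside the UDP video stream:

```python
import socket
import threading

def tcp_echo_server(port, host="127.0.0.1"):
    """Accept one connection and echo one message back.

    Stands in for a reliable control channel next to the UDP video
    stream; TCP itself handles retransmission and ordering.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    conn, _addr = srv.accept()
    data = conn.recv(4096)
    conn.sendall(data)  # delivered completely and in order, or the call fails
    conn.close()
    srv.close()

def send_critical(msg: bytes, port, host="127.0.0.1") -> bytes:
    """Send a critical message over TCP and return the server's reply."""
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect((host, port))
    cli.sendall(msg)
    reply = cli.recv(4096)
    cli.close()
    return reply
```

    Unlike a UDP datagram, such a message either arrives intact or the connection fails visibly, which is the property you want for critical commands.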
     
  45. Kikki1024

    Kikki1024

    Joined:
    Oct 27, 2021
    Posts:
    4
    Hi, I bought the package, but I can't figure out the webcam stream feature. I run the FM Server and Client on my PC at the same time, with one USB webcam attached to the PC. I check the WebCam button in the scene, but I can't see any webcam texture streamed in the Received Result image. My English is a little rusty; please help me.
     
  46. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    Have you tried running our demo scene "Demo_FMNetworkStreaming" to see if it works?

    We also noticed a small webcam init bug on Windows 11.
    It's been fixed in our latest version: v2.122.

    For further technical support, please feel free to write us an email: thelghome@gmail.com
     
  47. Kikki1024

    Kikki1024

    Joined:
    Oct 27, 2021
    Posts:
    4
    Yes, I did. "Stream My View" works fine, but WebCam does not. I checked the components in the scene and only found a WebcamDemo component; I can't find any WebCamTexture encoder or decoder component. Any ideas?
     
  48. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    510
    It still requires a GameViewEncoder or TextureEncoder for streaming.
    WebcamManager inits the webcam texture when enabled. In our demo, GameViewEncoder captures the whole scene, including the webcam texture shown as a 3D quad at the far clipping plane of your main camera.

    Screenshot 2021-12-27 at 7.02.39 PM.png
     
  49. Kikki1024

    Kikki1024

    Joined:
    Oct 27, 2021
    Posts:
    4
    Thanks for the reply, I think I have figured it out.
     
    thelghome likes this.
  50. VastnessVR

    VastnessVR

    Joined:
    Nov 21, 2017
    Posts:
    24
    Hi,
    I have multiple VR headsets that connect to the node server via WebSockets as Clients when the program runs.
    I have a viewer program that connects to the node server via WebSockets as a Server; it can change variables on a connected headset by sending strings, and view its screen when the Game View Encoder is turned on.

    The viewer program only needs to connect to one headset, and most of the time only one headset is on.
    But sometimes a second headset is on, and any changes made in the viewer program end up applied on both headsets.

    So my question is: with UDP you have a Label ID to separate data from different headsets.
    What do I use with WebSockets?
    Do I use different ports?
    If so, how does that work with the node server?
    Can I set up a range of ports, have each headset use a different port, and have the viewer program pick the headset to connect to?

    Any help would be appreciated.

    Rod
     