
[Release] FMETP STREAM: All-in-One GameView + Audio Stream + Remote Desktop Control(UDP/TCP/Web)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    Could you provide more information to us?
    - Unity3D version, Build Platforms
    - Your system OS version
    - Version of our asset
    - Any related settings

    If possible, could you please send us screenshots/videos of your issue?
    We will try to reproduce the issue on our side immediately.
    technical support: thelghome@gmail.com
     
  2. Donkrokodil

    Donkrokodil

    Joined:
    Aug 16, 2019
    Posts:
    9
    The problem was because I was using Unity 2018.3.7f1 (64-bit). I updated to Unity 2019.3.14f1 (64-bit) and it is OK now.
    But if I use Unity 2018.3.7f1 (64-bit), it does not work.
    My OS: Windows 10
    Asset version: your current version
     
  3. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    Thanks for your information; it's very useful to us.
    In general, we mainly test the assets on Unity 2017 LTS & the latest version of Unity3D.
    For older versions of Unity3D, we recommend that you always stay on LTS releases.
     
  4. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    [v1.310] Added Async Fast Mode, which should solve the performance issue with high-resolution streaming on mobile VR devices.
    @Juruhn Thanks for pointing out the performance issue; we just found a solution for you in v1.310.
    Performance test on Oculus GO, streaming 720P at 24fps:
    Headset performance 50+ FPS

    First testing on Mac:
     
    Last edited: May 24, 2020
  5. trungleau17

    trungleau17

    Joined:
    Jan 2, 2020
    Posts:
    3
    Hi, I wonder if it's possible to create an application using this asset to stream my Android device's screen to Unity? I want to stream my device's screen to Unity while it's running another app.
     
  6. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    For device screen view, only the Win10 desktop view is supported in the current version; thus, the Android device screen is not available yet. Native screen capture for other platforms is still a work in progress.
     
  7. trungleau17

    trungleau17

    Joined:
    Jan 2, 2020
    Posts:
    3
    Glad to hear it, thank you
     
  8. Dinozavrr

    Dinozavrr

    Joined:
    May 14, 2020
    Posts:
    1
    Hi!

    How can I save the audio stream from Unity as an audio file on the server side?
    With video I can do it because it's JPEG frames, but how do I do it with audio?
     
  9. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    After decoding, the raw data is PCM. Thus, you can store all the samples in an array and then convert them into another format.
     
  10. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    862
    Hi. I write here because I can't find the support thread for FM WebSocket.

    How do I connect to a wss server? FMSocketIOManager automatically names the URL "ws://".
     
  11. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    There is a toggle for "Ssl Enabled" in the manager. It will fill in "wss://" when it's toggled on.
     
    cecarlsen likes this.
  12. PixelTama

    PixelTama

    Joined:
    Sep 26, 2019
    Posts:
    2
    Hello @thelghome. Is it possible to stream from multiple mobile devices' webcams to desktop Unity? Like creating a CCTV split screen.
     
  13. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    Yes, it's possible. You can assign a unique Label ID to each webcam stream.
     
    PixelTama likes this.
  14. spacedigital

    spacedigital

    Joined:
    Mar 28, 2020
    Posts:
    1
    Hi. I'd like to make a video chat app, 1 to 1 and 1 to many. Is it possible with your plugin?
    How about the server? Do I need to make my own, or do you provide tools for it?
     
  15. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    Yes, it’s possible.
    You may check out our YouTube channel for more examples.

    We have demo scenes for WebSocket and UDP, and it's also compatible with other networking systems like Photon, Mirror, etc.

    If you follow the FMWebSocket demo, it requires a node.js server setup, and we have step-by-step tutorials on how to set it up.

    Or please feel free to email us for further questions.
    technical support: thelghome@gmail.com
     
    spacedigital likes this.
  16. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
  17. Donkrokodil

    Donkrokodil

    Joined:
    Aug 16, 2019
    Posts:
    9
    Hi!
    When I use your asset (WebSocket with socket.io) there is a huge delay when I run it on an Android mobile. It is about 2-3 fps. Even on a single local network there is a huge delay.
    I use the node.js server from your example.
    In the Unity Editor it is perfect and fast, but in the mobile app there is a problem. How can I solve this?

    Fast mode, Async mode and gzip are turned on. Stream FPS is set to 20 and encoding quality to 35.
     
  18. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    Is the delay only happening on WebSocket?

    What's your stream resolution, and which Android phone did you test on?
    Please send us the details via email: thelghome@gmail.com
     
  19. aceakle

    aceakle

    Joined:
    Oct 22, 2017
    Posts:
    11
    Can you use FMETP STREAM to connect to something like a YouTube Live server and then broadcast your game to YouTube Live?
     
  20. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    In the current version you can stream to HTML, but RTSP streaming for YouTube isn't supported yet.
    Hope this answers your question.
     
  21. RNextStudios

    RNextStudios

    Joined:
    Jun 10, 2020
    Posts:
    1
    Hi @thelghome

    We are evaluating your tool for use in our project (by the way, it's awesome!), but we wonder whether with your tool we can stream our (WebGL, Mac, Windows, iOS, Android) game to our server and store the video from that stream on the server. Is that possible?

    Thanks!
     
  22. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    In our examples, we provide a WebSocket server demo for streaming over the Internet.
    Some third-party alternatives like Photon and Mirror are also compatible.

    The frame data is received completely; you may need to write some scripts to convert it into your target video format.

    Hope this resolves your concerns.
     
  23. WrongTarget

    WrongTarget

    Joined:
    Nov 18, 2013
    Posts:
    43
    Hi Folks!
    Parsing doesn't seem to be working. Any text I emit with socket.io from my server is received in the client like this: ["test string"]

    Any idea on what could be going on?
     
  24. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    In general, you can receive data via "OnReceivedByteDataEvent", "OnReceivedStringDataEvent" and "OnReceivedRawMessageEvent".

    If you send messages from your socket.io server directly, you have to match the format/data structure used in our example node.js server.
    Code (JavaScript):
    //emit type: all;
    case 0: io.emit('OnReceiveData', { DataString: data.DataString, DataByte: data.DataByte }); break;
    //emit type: server;
    case 1: io.to(serverID).emit('OnReceiveData', { DataString: data.DataString, DataByte: data.DataByte }); break;
    //emit type: others;
    case 2: socket.broadcast.emit('OnReceiveData', { DataString: data.DataString, DataByte: data.DataByte }); break;

    Otherwise, you can only get the raw data, without parsing, from "OnReceivedRawMessageEvent", which includes everything.
     
    Last edited: Jun 24, 2020
  25. WrongTarget

    WrongTarget

    Joined:
    Nov 18, 2013
    Posts:
    43
    I'm not sure I follow. Those examples emit a message based on data they're receiving from the ws connection.
    I just have a simple string that I want to emit from my server directly to the client in Unity. How can I do that?


    EDIT: I get it now.
    For future reference:
    If you'd like to receive the actual string and not the unparsed raw message, you'll need to add your handlers under OnReceivedStringDataEvent in the inspector. But for that event to be invoked at all, you can't simply emit the string like this:

    Code (JavaScript):
    io.sockets.emit("myString")
    It needs to match the format expected by FM, which is:

    Code (JavaScript):
    io.sockets.emit('OnReceiveData', { DataString: "myString" });
    Good tool, but the documentation needs a lot of work.
     
    Last edited: Jun 24, 2020
    thelghome likes this.
  26. vcasas

    vcasas

    Joined:
    Oct 6, 2017
    Posts:
    6
    Hi, instead of sending the camera, can I transmit a dynamic texture that has transparency?
     
  27. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    In theory, you can, but you may need to modify the core script a bit.
    I did some experiments with another customer on RGBX & RGBA encoders. It kind of works, but we found some problems.

    Thus, my suggested method would be to convert your alpha channel into a green screen.
    This is one of our experiments: https://twitter.com/LeosonCheong/status/1266787769948368897

    An alternative method would be to convert your RGBA image into a Left-Right 3D structure:
    the left half is the normal RGB, and the right half is the alpha as grey-scale RGB.
    Then use a customised shader to convert them back to RGBA on the other side.

    The green screen solution is recommended, as it doesn't increase your streaming traffic.
    Hope these suggested solutions work for your case.
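    The left-right packing idea could be sketched as follows. This is only an illustration of the concept (not code from the asset), operating on a flat RGBA pixel array such as one read back from a texture:

```javascript
// Pack an RGBA image into a double-width RGB image:
// left half = original RGB, right half = alpha encoded as grey-scale RGB.
// A shader on the receiving side would recombine the halves into RGBA.
function packRgbaLeftRight(rgba, width, height) {
  const out = new Uint8Array(width * 2 * height * 3); // RGB, double width
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const src = (y * width + x) * 4;              // RGBA source index
      const left = (y * width * 2 + x) * 3;         // RGB in left half
      const right = (y * width * 2 + width + x) * 3; // grey alpha in right half
      out[left] = rgba[src];          // R
      out[left + 1] = rgba[src + 1];  // G
      out[left + 2] = rgba[src + 2];  // B
      const a = rgba[src + 3];        // alpha written as grey
      out[right] = a;
      out[right + 1] = a;
      out[right + 2] = a;
    }
  }
  return out;
}
```

    The packed RGB image can then go through the normal encoder unchanged, at the cost of double the width (and therefore more traffic than the green screen approach).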
     
  28. atelios

    atelios

    Joined:
    Dec 5, 2017
    Posts:
    26
    Hi, your package works great so far we are using it on the HoloLens and Mobile streaming to the desktop.

    However, one weird thing I have encountered: when I call the Close() function when a user "logs out" in our application, all init functions appear to be called again in the correct order, but the SocketIOComponent's WebSocketState is stuck on Connecting and it does not appear to connect anymore!?

    I also don't understand why both Init() and Connect() call Init() and keep adding this component. Is that necessary?

    Is there any way to see what is going on here? I'm having a hard time figuring out what is causing this.
     
  29. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    If you only want to know the status of the connection, you may use "FMSocketIOManager.instance.Ready".
    It returns true if connected successfully.

    When you close the connection, Init() is required before Connect().
    The Connect() function checks whether it's initialised, and will trigger Init() if you haven't called it yet.

    I also did a quick test of Close & Connect many times in the Editor without problems. You may send us an email for technical support if necessary. Email: thelghome@gmail.com

    Also, please check your version of FMETP STREAM and make sure that you are using the latest version in your project.
     
  30. donnacodes

    donnacodes

    Joined:
    Sep 21, 2017
    Posts:
    7
    Hi, and thank you for a great package. I already have a few cases where it's working very well. One thing I cannot seem to find in the written docs, but do see in your videos, is interaction with webcams. In the example you use 2 phones. What I'm trying to do is: at certain points in the game, Unity is supposed to trigger the webcam of the player and then stream it to an external websocket server to be delivered to another person, i.e. streaming out with a webcam.
    I then also need to be able to stream a webcam from the user on the receiving end back into the game.
    I understand the logic of using WebSockets, so I think I'm fine there. What I'm not sure about is how to connect the player's webcam to FMETP in Unity. Is this possible, or are cameras limited to the gameplay cameras? Thanks
     
  31. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    For the webcam demos, the webcam texture is placed on a 3D plane and captured by a render camera.
    This is the most lightweight solution for low-end devices, with adjustable resolution.

    You may also refer to the FMNetworkStream demo, which includes the core part of the mobile webcam chat.
    The networking manager in the UDP demo can be replaced by WebSocket in your case.

    If you have further questions, please feel free to email us. We hope you enjoy this little tool.
     
    donnacodes likes this.
  32. donnacodes

    donnacodes

    Joined:
    Sep 21, 2017
    Posts:
    7
    @thelghome thank you very much for a quick and detailed reply :)
     
    thelghome likes this.
  33. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    I made a short connection-testing video, which demonstrates the correct behaviour of Close() & Connect().
    Hope this resolves your concerns. Again, updating to the latest version may fix some of the buggy behaviour you found.
     
  34. atelios

    atelios

    Joined:
    Dec 5, 2017
    Posts:
    26
    Hi, we already figured it out. We added some custom events, and in the scene we are using, the events are only registered in Start via the coroutine. So we just had to make sure all our custom On() handlers were registered again when reconnecting.
     
    thelghome likes this.
  35. jpingen

    jpingen

    Joined:
    Jan 28, 2019
    Posts:
    22
    Hi,
    I'm looking to implement video recording in an Oculus Go application we've developed. The main bottleneck here is performance; most recording solutions I've tried so far basically ruined the framerate. Is there any demo scene/package I can try before purchasing FMETP STREAM?
     
  36. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    FMETP STREAM supports live streaming from the Oculus Go to other Unity3D applications or HTML via WiFi, but a recording feature is not included.
    In theory, it must have some impact on framerate due to the encoding & streaming of data.

    This video can give you an idea of how much impact on framerate there is on the Oculus GO.
    Keeping 50+ fps on the Oculus GO at 720P is the best we can achieve.
     
  37. jpingen

    jpingen

    Joined:
    Jan 28, 2019
    Posts:
    22
    Thanks for the quick reply! I understand recording is not a feature included in your package. Since recording performance on an Oculus Go appears to be quite an issue, I have been looking at different solutions, such as streaming to another application and capturing it there. Which is where FMETP comes in :)

    I have seen the video you posted, though it is a very simple scene. What would the performance be in a more complex scene that is already pushing the limits? Is there any way I can give this a try?
     
  38. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    I think you can estimate that the streaming task adds roughly 3-4 ms per frame on the Oculus Go.

    Let's assume the default maximum fps on the GO is 60fps:
    (1/50fps) - (1/60fps) ≈ 0.00333 seconds

    This should give you a clearer idea than testing random scenes.
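    The estimate above is just the difference in frame times at the two observed frame rates, which can be checked with a one-liner (the 50/60 fps figures are taken from the Oculus Go test mentioned earlier):

```javascript
// Per-frame streaming overhead, estimated from observed frame rates:
// frame time at the observed fps minus frame time at the baseline fps.
function streamingOverheadMs(observedFps, baselineFps) {
  return (1 / observedFps - 1 / baselineFps) * 1000;
}

console.log(streamingOverheadMs(50, 60).toFixed(2) + ' ms'); // ≈ 3.33 ms per frame
```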
     
  39. DreamHover

    DreamHover

    Joined:
    Sep 25, 2017
    Posts:
    2
    Hi,
    I got some errors as follows,

    Environment : Windows 10
    Web Browser : Chrome
    Unity version : 2019.3.2f1
    Build target : WebGL
    Player Setting:
    Api Compatibility Level : .Net Standard 2.0
    C++ Compiler Configuration : Release
    Strip Engine Code : unchecked
    Publishing Settings:
    Compression Format : Disabled

    Uncaught abort("To use dlopen, you need to use Emscripten's linking support, see https://github.com/kripken/emscripten/wiki/Linking") at Error
    at jsStackTrace (blob:http://127.0.0.1:3000/be941df2-5d3e-42ca-b18e-4c0b6fe8e2ab:8:22313)
    at Object.stackTrace (blob:http://127.0.0.1:3000/be941df2-5d3e-42ca-b18e-4c0b6fe8e2ab:8:22484)
    at Object.onAbort (http://127.0.0.1:3000/Build/UnityLoader.js:4:11118)
    at abort (blob:http://127.0.0.1:3000/be941df2-5d3e-42ca-b18e-4c0b6fe8e2ab:8:518112)
    at _dlopen (blob:http://127.0.0.1:3000/be941df2-5d3e-42ca-b18e-4c0b6fe8e2ab:8:192203)
    ...

    Any ideas?
    Thank you.
     
  40. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    It seems like you are building the scenes from the UDP demo, which isn't supported for WebGL due to its multi-threading related code.
    Only the WebSocket demo scenes work in a WebGL build.
    Or please send us an email for further support: thelghome@gmail.com
     
  41. DreamHover

    DreamHover

    Joined:
    Sep 25, 2017
    Posts:
    2
    Thanks for quick reply.
    Okay, I'll try using WebSocket.
     
  42. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    We've made a tutorial video for WebSocket, as many new users want to know how to emit their own events between FMWebSocket and socket.io. Hope it helps.
     
  43. Apfelbeck

    Apfelbeck

    Joined:
    Oct 25, 2015
    Posts:
    6
    I need to display a video that's in a UDP stream, and I need to be able to switch between multiple video streams.

    Will FMETP work for my use case? Ideally I would like to take a video streamed at a URL like udp://127.0.0.1:1000/viewTest1 and render it to a texture.
     
  44. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    FMETP STREAM is a solution for streaming between Unity3D apps; it supports streaming via a WebSocket URL, but only with our own encoder/decoder format.
     
  45. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    [Thanks to everyone trying our assets]
    Since the old promotion video is a bit outdated, we've made a new one showing many experiments from the past year.

    We didn't expect the live streaming solution to become so popular last year; we had just released it as an interesting feature. The asset's original name was FM Exhibition Tool Pack, with multiple features, and it was renamed FMETP STREAM last year to give a proper idea of what this tool is for.

    In 2020, we continue working hard to help others, and hopefully this will be one of the worthiest options for indie developers.
     
  46. DenisTribouillois

    DenisTribouillois

    Joined:
    Jun 14, 2016
    Posts:
    3
    Hi, before I buy the asset I would like to make sure it would help with my use case.
    I'm building a VR application for the HTC Vive, and I'd like to add a second user in the VR 3D world using an Oculus Quest.
    Running the application on the Quest is not possible for performance reasons.
    So would it be possible to add a camera to the 3D scene that live-streams the video (both eyes) to the Quest with the plugin?
    Then, going one step further, would it be possible to send position data from the Quest to update the position of the camera, so the POV is no longer static? Would the latency be low enough for the "head tracking"?
    Thank you for the help
     
  47. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    It seems that you are going to create a remote VR rendering solution for the Quest. Technically, all your ideas are achievable with our assets.
    In the general case there will be around 50-100 ms latency; it might be less if you have a good router environment.
    However, if you are targeting left-right eyes with correct position tracking, your streaming traffic doubles and the load on the hardware will be heavier.

    In your case, FMETP STREAM could be a good starting point for research-based development, as the core source is written in C#.
    But we cannot guarantee your results, because it's a very unique use case and the performance can vary.

    Hope the above info helps you.
     
    DenisTribouillois likes this.
  48. SkyWalkerMK

    SkyWalkerMK

    Joined:
    Jul 27, 2016
    Posts:
    8
    Hi. I have an AR mobile app. I'd like to stream my phone's screen in an AR scene to another phone, and I want it to work on both iOS and Android phones. Does your plugin support this, and if so, how is the performance?
     
  49. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    742
    FMETP STREAM can capture any Unity3D scene camera view and share it between Unity3D apps, WebGL, HTML, etc.
    Cross-platform streaming between iOS and Android is no problem.
    I think this video can give you an idea of the performance; it's an old test with an iPhone 5.
    The latest version of FMETP STREAM performs better than this old demo, and I'm sure AR phones are more powerful than an iPhone 5 too.
     
  50. SkyWalkerMK

    SkyWalkerMK

    Joined:
    Jul 27, 2016
    Posts:
    8
    Hi. I have bought FMETP STREAM. When I import it, I get the errors in the attachment. I am using Unity 2019.4. Can you let me know what the problem might be?
     

    Attached Files: