[Release] FMETP STREAM: All-in-One GameView+Audio Stream (UDP/TCP/WebSockets/HTML)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Please refer to this main scene: "Demo_NetworkingMain"

    The FMNetwork demo scenes are mainly for local networks.
    -> FMNetwork Basic: Server-Client testing, for sending string, byte[] in different ways
    -> FMNetwork Stream: Server-Client testing, for streaming Video, Webcam, Audio, Mic
    -> Point Cloud Stream: capture 3D objects in the scene and stream to others as point cloud format

    The FMWebsocket demo scene requires a node.js setup; you may refer to this guide page:
    https://frozenmist.com/docs/apis/fm-websocket/
    -> Websocket Network: Internet streaming testing, for sending string, byte[]..etc
    -> Websocket Stream: Internet streaming testing, for Video, Webcam, Audio, Mic, and compatible with WebGL, HTML..etc

    Network Action & TCP Stream are deprecated demos that mixed the use of UDP and TCP.

    For new users, you may refer to this minimal setup guide for the server and client.
    https://frozenmist.com/docs/apis/fmetp-stream/stream-guide/

    Screenshot 2022-05-20 at 3.47.15 PM.png
     
  2. whitegreenstudios88

    whitegreenstudios88

    Joined:
    Dec 12, 2021
    Posts:
    8
    Hi, how do I bypass the camera component to send webcam data? The GameViewEncoder needs a main camera or a render camera, but I can't position the camera at the exact location. Thanks.

    I saw something called TextureEncoder. May I ask how to decode it, and can it stream as byte[]?
     
    Last edited: May 21, 2022
  3. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    You may combine TextureEncoder and WebcamManager in this way:
    Screenshot 2022-05-22 at 11.34.29 PM.png
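    The screenshot combination above can be sketched roughly like this. The component name TextureEncoder comes from the asset, but the exact field and the StreamTexture() call below are assumptions about its API, not confirmed signatures:

```csharp
using UnityEngine;

// Sketch: feed a WebCamTexture straight into FMETP's TextureEncoder,
// with no scene Camera involved at all.
// "TextureEncoder" is the asset's component; "StreamTexture" is an
// assumed method name - check the component's inspector/API docs.
public class WebcamToTextureEncoder : MonoBehaviour
{
    public TextureEncoder Encoder; // assign the FMETP component in the Inspector
    private WebCamTexture webcam;

    void Start()
    {
        // Grab the default webcam device at 1280x720, 30fps.
        webcam = new WebCamTexture(1280, 720, 30);
        webcam.Play();
    }

    void Update()
    {
        // Push the latest webcam frame into the encoder each frame.
        if (webcam.didUpdateThisFrame)
            Encoder.StreamTexture(webcam); // assumed API
    }
}
```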
     
  4. whitegreenstudios88

    whitegreenstudios88

    Joined:
    Dec 12, 2021
    Posts:
    8
    Thanks, I have it working, transmitting only the webcam view. I saw a PUN2 script on your website; may I know how to implement it? Do I need to put the script into a Resources prefab? Is there any example of how to implement it? Thanks.

    And is it possible to do a video conference with more than two people?
     
    Last edited: May 24, 2022
  5. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    The general idea is streaming byte[] via PUN2.
    It's possible to do it via SerialStream() or RaiseEvent(), and you may also define who is the encoder and decoder by the PhotonView's ownership, etc.
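    A minimal sketch of the RaiseEvent() route: the Photon calls below are real PUN2 API, but the way the encoder hands over its byte[] chunks (and the decoder method name in the comment) are assumptions about the asset's hooks:

```csharp
using Photon.Pun;
using Photon.Realtime;
using ExitGames.Client.Photon;

// Sketch: broadcast encoder output as byte[] via PUN2's RaiseEvent,
// and receive it on every other client.
public class FMStreamViaPUN2 : MonoBehaviourPunCallbacks, IOnEventCallback
{
    private const byte StreamEventCode = 199; // any free custom event code below 200

    // Wire this to the encoder's byte[] output event (assumed hook name).
    public void Action_SendData(byte[] data)
    {
        var options = new RaiseEventOptions { Receivers = ReceiverGroup.Others };
        // Unreliable delivery behaves more like UDP, which suits video frames.
        PhotonNetwork.RaiseEvent(StreamEventCode, data, options,
            new SendOptions { Reliability = false });
    }

    public void OnEvent(EventData photonEvent)
    {
        if (photonEvent.Code != StreamEventCode) return;
        byte[] data = (byte[])photonEvent.CustomData;
        // Forward to the decoder here, e.g. decoder.Action_ProcessData(data); (assumed API)
    }
}
```

    MonoBehaviourPunCallbacks registers this script as a callback target automatically, so OnEvent fires on every client in the room.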
     
  6. whitegreenstudios88

    whitegreenstudios88

    Joined:
    Dec 12, 2021
    Posts:
    8
    Hi, regarding the TextureEncoder: when streaming from the Editor, all other players in WebGL browsers can receive the webcam texture. But if the TextureEncoder runs in a WebGL browser, other browsers or the Editor cannot receive the webcam texture. Please help. My TextureEncoder uses FMNetworkManager.SendToOthers(byte[]).

    And another question: I am doing a video conference for more than two people. Does the pairing of encoder and decoder labels matter? Thanks.
     
  7. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    I am afraid that Unity's webcam class isn't compatible with WebGL. You may need to search for a WebGL webcam plugin to get webcam input.

    If the client receives multiple video streams, different label IDs are required to identify them. For PUN2 users, I'd suggest using the PhotonView ID for each encoder/decoder.
     
  8. havokentity

    havokentity

    Joined:
    Sep 25, 2017
    Posts:
    24
    Hello,

    Does your plugin support sending a video stream from a Unity desktop application to a HoloLens 2 application texture? I want to broadcast captured frames in real time from a 3D desktop application built in Unity and display the stream on a hologram in the HoloLens 2.

    Thank you
     
  9. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Yes, it's compatible!
     
  10. ZettanoZrei

    ZettanoZrei

    Joined:
    Sep 28, 2018
    Posts:
    1
    Hello. I would like to know if it is possible to solve this problem with your asset: I need to stream video from Unity (running in minimized mode) to a WinForms application. In that app, I can display webcam video or take a link to an MJPEGStream source.
     
  11. Beloudest

    Beloudest

    Joined:
    Mar 13, 2015
    Posts:
    239
    Hi, I've been trying to get HTTPS to work with websockets. I modified the index.js node file on the web server to be set up for HTTPS. All seemed to work fine: I could reach the server's web page, connect to the server, and see the connection in the SSH console logging. For some reason I couldn't get the Unity client to connect in SSL mode; I would get a message stating that a series of connection attempts had failed. Do you have any advice on what I need to do to make HTTPS work with websockets? Is there a way to get more detailed logs of the cause of the connection timeout? Thanks in advance.
     
  12. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    There are a few different SSL options in the FMSocketIOManager, which have to match your server's SSL type.
     
  13. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    It's not designed for this usage, but you can still generate an MJPEG stream (UDP) with some modification of our Encoder, which is written in C#.
    We may also consider this as a new feature in coming updates.
     
  14. havokentity

    havokentity

    Joined:
    Sep 25, 2017
    Posts:
    24
    I've bought a copy of your asset, and it successfully streams video from the desktop to my HoloLens 2 application. Thank you.
     
  15. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Awesome, glad to know that it fits your need :)
     
  16. havokentity

    havokentity

    Joined:
    Sep 25, 2017
    Posts:
    24
    Hello,

    I tried enabling async, but sometimes it doesn't broadcast more than one frame. Sometimes there's image corruption on the HoloLens. It works great without async, though. Is there any ETA for when the async feature will be finished?

    (I also tried it on PC, and it throws this error:
    texture is smaller than 8 x 8, wrong data)

    Thank you
     
  17. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    It could be due to incomplete data.
    We are still rushing to finish the coming major updates at the moment, which may involve HL2 performance and stability.
    Would you mind reaching us via email with your invoice number? We could share our latest beta with you for related bugs.

    technical support: thelghome@gmail.com
     
  18. havokentity

    havokentity

    Joined:
    Sep 25, 2017
    Posts:
    24
    Thank you for your response. I've sent you an email with my invoice number.
     
  19. Beloudest

    Beloudest

    Joined:
    Mar 13, 2015
    Posts:
    239
    I was using an IP address instead of the hostname for the connection, which was making the SSL cert fail along the way. All sorted now, thanks.
     
  20. MidnightCoffeeInc

    MidnightCoffeeInc

    Joined:
    Feb 28, 2017
    Posts:
    419
    Can't wait for the update! Is this an update to V2 or a new asset that will require a separate purchase?
    Thanks!
     
  21. samanthalynne

    samanthalynne

    Joined:
    Jan 27, 2022
    Posts:
    1
    Hello there! First, thanks for making this asset, it's very cool!

    Sorry if this isn't the right place to ask this; I could just be asking silly questions because I'm a complete noob at anything networking related, ha. I'm trying to stream the game view from a Quest 2 to a Pixel 5. Ideally I wanted them to be able to connect from anywhere, i.e. not on the same network, so I tried to copy the websocket demo. Despite knowing nothing about node or websockets, I finally managed to get the demo node server hosted on Heroku and get the Quest and Pixel talking to each other, hooray! Unfortunately it's very laggy. I tried as best as I could to optimize: the Game View Decoder and Encoder have Fast Decode and Async checked, and the resolution on the Encoder is 1170 by 540, Quality 34, StreamFPS 24. The quality is less than I'd like, but it's still lagging and choppy. There isn't anything else going on in the scene to slow things down; the Quest server side just has a few rotating cubes, and the Pixel client side just has a quad displaying what it's receiving. Is this just the kind of quality that can be expected in this situation, or should it be possible to get smooth casting using websockets?

    I tried testing with the Demo_FMNetworkStreaming scene instead, this time with my PC and the Pixel. I tried it just as-is, with AutoNetworkDiscovery checked, and I also tried inputting the IP address of my computer as displayed in the demo scene on the PC and unchecking AutoNetworkDiscovery, but when I choose Server on my PC and Client on my phone, they don't connect. I feel like I'm missing something obvious here; is there something else I have to do or set to get them to connect?

    Thanks for your time
     
  22. DrBlackAdder

    DrBlackAdder

    Joined:
    Sep 18, 2017
    Posts:
    2
    I'm trying to implement my own JPG encode/decode as suggested in your experiments list:

    ===================New Experiment Updates===================
    We did some testing by replacing the default Unity3D encode jpg method, which you can implement on your side too.

    I'm getting lost on where I would implement this. Any help would be greatly appreciated.
     
  23. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    This would be a major update; recent V2 owners may get a free or discounted upgrade depending on the purchase date.
     
  24. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Have you reached us via email before? For FMWebsocket, performance may be affected if you have an unstable connection between Unity (server, clients) and the node.js server.

    We do have some optimisations in our coming V3; we could also share the latest beta version for your testing purposes.
     
  25. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Thanks for pointing this out; we've added detailed documentation about the fast JPEG encoder for your needs.
    https://frozenmist.com/docs/apis/fmetp-stream/jpeg-encoder-example/
     
    Last edited: Jun 15, 2022
  26. DrBlackAdder

    DrBlackAdder

    Joined:
    Sep 18, 2017
    Posts:
    2
  27. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    I find that when using GameViewEncoder in FullScreen mode at full size, the client doesn't receive the RawImage when the server's async encoder toggle is on.
     
  28. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Thanks for reporting this issue; it was a mismatched-resolution bug.
    We've just pushed an update (v2.430) fixing this bug.
     
    Last edited: Jun 21, 2022
  29. manurocker95

    manurocker95

    Joined:
    Jun 14, 2016
    Posts:
    161
    Is that fixed in the new beta 3? Because beta 3 works like a charm, so any update is appreciated :3
     
  30. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    This bug has been solved, but I find that FullScreen mode's encoded size is too large (94488), so my FPS is very low. What can I do about this? I enabled fast encode mode, async mode, and GZip mode, but they weren't very helpful.
     
  31. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    If your need is not very urgent, you may wait for our official V3 release, hopefully during the summer period.
    But you may also reach us again via email for the newer beta.
     
  32. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    According to the render queue order, Full Screen capture mode is slower than the other modes, as expected, because it needs to wait and block the main thread for a tiny moment until the Main Camera and GUI rendering are completed.

    In general, RenderCam mode is more efficient, as it doesn't affect your main camera much. MainCam mode may perform better if you have a huge number of draw calls.

    If your GUI Canvas is in the 3D world, then you can use RenderCam or MainCam mode to capture it with better performance.

    You may also reach us for the V3 beta test if you purchased V2 recently. It may perform better than V2 overall.
    technical support: thelghome@gmail.com
     
  33. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    Does that mean RenderCam at 1920x1080 is faster than Full Screen at 1920x1080?
     
  34. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Yes, with much less impact on your frame rate.
    What's your target device/OS, and Unity version? You may benefit from the AsyncGPUReadback encoder option, which should be compatible on most platforms since Unity 2021.
     
  35. dhruv_unity154

    dhruv_unity154

    Joined:
    Feb 11, 2022
    Posts:
    14
    Hello,
    I've implemented streaming the view display to a browser (using TestServer2_0_0) from an Oculus device via FMSocketIOManager, which uses the TCP protocol. It's working well, but I sometimes face dropped FPS on the Oculus due to packet drops on the network.

    Can I use the UDP protocol via FMNetworkManager for the same purpose?
     
  36. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    They are two different networking systems, unfortunately.

    If you have a local computer, you may set up both FMNetworkServer and FMWebsocket on that computer to receive streams from the Quest first.
    Then route the data from FMNetwork to FMWebsocket for broadcasting.

    If the above solution doesn't fit your case, you may need to switch to PUN2 (UDP) or other WebRTC-based (UDP) networking systems as alternatives.
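    The relay idea above might look roughly like this. FMNetworkManager is named earlier in the thread; the received-data hook and the FMWebSocketManager.Send() call are assumptions about the asset's API, so check the actual component events before wiring it up:

```csharp
using UnityEngine;

// Relay sketch: a desktop app receives the Quest's stream via FMNetwork (UDP, LAN)
// and re-broadcasts it via FMWebsocket to browsers, since the two networking
// systems don't talk to each other directly.
public class FMNetworkToWebsocketRelay : MonoBehaviour
{
    public FMNetworkManager NetworkManager;     // receives byte[] from the Quest
    public FMWebSocketManager WebSocketManager; // broadcasts to web clients

    // Wire this method to FMNetworkManager's received-byte-data event
    // in the Inspector (event name assumed).
    public void OnReceivedByteData(byte[] data)
    {
        // Forward the encoded frame as-is; the browser-side decoder
        // doesn't care which transport carried it.
        WebSocketManager.Send(data); // assumed API
    }
}
```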
     
  37. manurocker95

    manurocker95

    Joined:
    Jun 14, 2016
    Posts:
    161
    I only use the UDP protocol, and it works just fine on Meta Quest 2 and HoloLens 2. There are some drops, but they are usually due to heavy CPU overload plus the internet connection.
     
  38. InnovationCentre

    InnovationCentre

    Joined:
    Mar 3, 2022
    Posts:
    1
    Hi,

    I'm considering purchasing this plugin for a work project. I'm looking to livestream a camera feed into a HoloLens 2. The HoloLens 2 application would be built in Unity (it will likely display the camera feed on a render texture), but the camera feed would just be raw camera data sent via TCP. Would this plugin be able to support that?

    Thanks
     
  39. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Yes, it's compatible with raw camera feeds as MJPEG over UDP or TCP.
    https://frozenmist.com/docs/apis/fmetp-stream/gstreamer-example/
     
  40. Cl3pt0man1cx

    Cl3pt0man1cx

    Joined:
    Jan 15, 2019
    Posts:
    3
    Hi, is it possible to stream from the encoder directly to VLC and open it via a network stream in VLC?
     
  41. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    We may share our latest beta V3 for recent purchases, which includes a minimal setup of UDP (MJPEG).

    Please reach us via email for technical support: thelghome@gmail.com
     
  42. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    When I use the curved screen, the transmitted picture is distorted. What is a good solution?
     
  43. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    Do you have screenshots? We'd like to investigate if you have more information.
     
  44. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    I have a model of a curved screen. I want to synchronize the picture on this curved screen to other curved screens. If an orthographic camera is used, the camera cannot cover the curved screen exactly. If a perspective camera is used, the synchronized picture is deformed.
     
  45. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    The model looks like this
    upload_2022-7-12_9-35-17.png
     
  46. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    For your case, you could try the "TextureEncoder" instead of the GameViewEncoder, which allows you to stream a texture instead of a camera view.

    Your trouble is a UV mapping issue, which is actually not very related to FMETP STREAM.
    From my personal experience, I would use an orthographic camera to capture, then modify the UV of the curved screen with a flat-projection UV (only if you really want to capture and remap it at runtime without a texture input for the TextureEncoder). You could also assign a RenderTexture to an orthographic camera, which will give you the "correct" aspect ratio in flat projection, then stream the RenderTexture via the TextureEncoder. But this method involves distortion, which requires either a UV adjustment or a shader on the decoder side.

    There are many technical solutions for curved screen capture, like stitching with multiple cameras, panorama capture, and also UV mapping with some math in a shader for a pixel-perfect result without distortion.

    It's a very common and interesting topic related to projection mapping, once you learn some basic theory about UVs.
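    The RenderTexture approach described above, as a rough sketch: Camera.targetTexture and RenderTexture are standard Unity API, while the TextureEncoder.StreamTexture() call is an assumed method name for the asset's component:

```csharp
using UnityEngine;

// Sketch: render the curved screen with an orthographic camera into a
// RenderTexture (flat projection), then stream that texture instead of
// a camera view. "StreamTexture" is an assumed TextureEncoder API.
public class CurvedScreenCapture : MonoBehaviour
{
    public Camera OrthoCamera;     // framed on the curved screen
    public TextureEncoder Encoder; // FMETP component
    private RenderTexture rt;

    void Start()
    {
        rt = new RenderTexture(1920, 1080, 24);
        OrthoCamera.orthographic = true;
        OrthoCamera.targetTexture = rt; // camera now renders into rt only
    }

    void Update()
    {
        // Stream the flat-projection capture; the decoder side then
        // handles any UV remapping onto its own curved screen.
        Encoder.StreamTexture(rt); // assumed API
    }
}
```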
     
    Last edited: Jul 12, 2022
  47. yangyueyue

    yangyueyue

    Joined:
    Aug 1, 2018
    Posts:
    15
    thanks
     
  48. cdev1

    cdev1

    Joined:
    Oct 19, 2021
    Posts:
    9
    Hi, I installed Node.js and ran node with the path of index.js, but I'm getting an error like this:

    node:internal/modules/cjs/loader:959
    throw err;
    ^

    Error: Cannot find module 'express'
    Require stack:
    - C:\Users\CYMAX03\Documents\UNity_Projects\TestServer_v1.2\index.js
    at Module._resolveFilename (node:internal/modules/cjs/loader:956:15)
    at Module._load (node:internal/modules/cjs/loader:804:27)
    at Module.require (node:internal/modules/cjs/loader:1022:19)
    at require (node:internal/modules/cjs/helpers:102:18)
    at Object.<anonymous> (C:\Users\CYMAX03\Documents\UNity_Projects\TestServer_v1.2\index.js:1:15)
    at Module._compile (node:internal/modules/cjs/loader:1120:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1174:10)
    at Module.load (node:internal/modules/cjs/loader:998:32)
    at Module._load (node:internal/modules/cjs/loader:839:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:81:12) {
    code: 'MODULE_NOT_FOUND',
    requireStack: [
    'C:\\Users\\CYMAX03\\Documents\\UNity_Projects\\TestServer_v1.2\\index.js'
    ]
    }

    Node.js v18.6.0

    How do I solve this so I can stream the game view?
    Thanks!
     
  49. luxertlee

    luxertlee

    Joined:
    Dec 31, 2013
    Posts:
    2
    I bought the asset a few days ago.
    Can the server and client run on the same computer during development?

    I build and run the server, then run the client in the Unity Editor,
    but the connection keeps dropping.
     
  50. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    588
    It's because the "express" module is missing.
    Please follow the guide page, in particular the commands in the part "3. Install express".
    https://frozenmist.com/docs/apis/fm-websocket/
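    Assuming the standard npm workflow, the missing module is installed from the folder that contains index.js (the path below is the one from the error message):

```shell
# Run from the folder that contains index.js
cd C:\Users\CYMAX03\Documents\UNity_Projects\TestServer_v1.2
npm install express   # installs the missing module into ./node_modules
node index.js         # should now start without MODULE_NOT_FOUND
```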
     