Stream video through network

Discussion in 'Multiplayer' started by Quentin-G, Apr 5, 2017.

  1. Quentin-G

    Joined:
    Dec 20, 2012
    Posts:
    29
    Hi everyone,

    I know this question has been addressed on several occasions, but I can't find a complete solution to the problem. First, a bit of background: I currently have an application running in a single window. The application has two modes. The first uses the virtual environment to render some content and display it on a texture. In the second mode, the texture is filled by a C++ plugin (using GetNativeTexturePtr) from a video stream.
    Now comes the problem: I need to display this same texture in an external window. After some research, I realized that Unity doesn't handle additional windows (that I could move to another screen, for instance) and that I have to create another application and network the two. Since in the second mode the texture is not rendered by Unity, I can't just have a duplicate environment to display the same texture. So now I need to stream the video over the network.

    I'm not really familiar with networking in Unity, but it seems quite doable. Moreover, since the server and client will be running on the same computer, I assume there should be no lag (am I right?). So far, I've created a server and a client class along with a TextureMessage class (which inherits from MessageBase). I'm able to transfer all the information related to the texture (size, format, ...), but when I try to send the texture itself (converted to a byte[] with GetRawTextureData), I get a "channelBuffer limit reached" error. I tried raising the limit to 500 but it changes nothing (even for really small images).
    It seems I need to look at serialization, but I don't really know where to start. Am I going in the right direction?

    If anyone could help me out or point me in the right direction, that would be awesome :).

    Thanks a lot in advance.

    Here is my current code:
    Message class:
    Code (CSharp):
    public class TextureInfoMessage : MessageBase
    {
        public byte[] textureData;
        public int width;
        public int height;
        public TextureFormat format;
        public bool mipmap;

        public TextureInfoMessage() {}

        public TextureInfoMessage(byte[] d, int w, int h, TextureFormat f, bool m)
        {
            textureData = d;
            width = w;
            height = h;
            format = f;
            mipmap = m;
        }
    }

    Server:
    Code (CSharp):
    public class StreamServer : MonoBehaviour {

        // Use this for initialization
        void Start () {
            NetworkServer.Listen(4444);
        }

        // Update is called once per frame
        void Update () {
            if (NetworkServer.connections.Count > 0)
            {
                TextureInfoMessage msg = MyStreamingClass.getTexture();
                // TODO: get connectionId properly
                NetworkServer.connections[1].SetChannelOption(Channels.DefaultReliable, ChannelOption.MaxPendingBuffers, 500);
                NetworkServer.SendToClient(NetworkServer.connections[1].connectionId, 1000, msg);
            }
        }
    }

    Client:
    Code (CSharp):
    public class StreamClient : MonoBehaviour {

        private bool connected = false;
        private bool initialized = false;
        private int width;
        private int height;
        private TextureFormat format;
        private bool mipmap;
        NetworkClient myClient;
        RawImage image;

        // Use this for initialization
        void Start () {
            myClient = new NetworkClient();
            myClient.RegisterHandler(MsgType.Connect, OnConnected);
            myClient.RegisterHandler(1000, getInfoTexture);
            myClient.Connect("127.0.0.1", 4444);
            image = GetComponent<RawImage>();

            image.texture = new Texture2D(2, 2, TextureFormat.RGB565, false);
        }

        void OnGUI()
        {
            GUILayout.Label(connected ? "OK" : "Not OK");
            GUILayout.Label("Width: " + width);
            GUILayout.Label("Height: " + height);
            GUILayout.Label("mipmap: " + mipmap);
        }

        void OnConnected(NetworkMessage netMsg)
        {
            connected = true;
        }

        void getInfoTexture(NetworkMessage msg)
        {
            TextureInfoMessage msg2 = msg.ReadMessage<TextureInfoMessage>();
            // Recreate the texture from the incoming message's dimensions and
            // format (the original used the stale cached values here).
            if (!initialized || width != msg2.width || height != msg2.height || format != msg2.format)
                image.texture = new Texture2D(msg2.width, msg2.height, msg2.format, msg2.mipmap);
            width = msg2.width;
            height = msg2.height;
            format = msg2.format;
            mipmap = msg2.mipmap;
            ((Texture2D)image.texture).LoadRawTextureData(msg2.textureData);
            ((Texture2D)image.texture).Apply(); // upload the new pixels to the GPU
            initialized = true;
        }
    }
     
    Last edited: Apr 5, 2017
  2. aabramychev

    Unity Technologies

    Joined:
    Jul 17, 2012
    Posts:
    574
    Yes, it is. But I would suggest upgrading to 5.6 and trying again. You should get some relief...
     
  3. Quentin-G

    Joined:
    Dec 20, 2012
    Posts:
    29
    Hello aabramychev and thanks for the reply.

    I updated to 5.6 but it seems it didn't solve the problem. In addition to the error on the channel buffer limit, I now also get this log:
    "no free events for message in the queue"

    Any hints for me?
     
  4. donnysobonny

    Joined:
    Jan 24, 2013
    Posts:
    220
    So looking at your code, there are a few issues here. Firstly, your MessageBase is pretty much redundant. Have a look at the documentation to see a fully implemented MessageBase: https://docs.unity3d.com/ScriptReference/Networking.MessageBase.html. Notice that it implements the Serialize and Deserialize methods, where properties of the class are converted/deconstructed to/from a byte array.

    When you pass the instance of your TextureInfoMessage to NetworkServer.SendToClient, it calls the Serialize method, which you haven't implemented... In your StreamClient class, the call to NetworkMessage.ReadMessage calls Deserialize, again which you haven't implemented.

    I think there is some auto-serialization within MessageBase classes, but you definitely don't want to rely on that. So I would recommend that if you want to use a MessageBase, make sure to implement its serialization.
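    For example, a hand-rolled Serialize/Deserialize pair for the TextureInfoMessage above might look like this (a sketch following the pattern in the linked docs, not tested code):

    Code (CSharp):
    public class TextureInfoMessage : MessageBase
    {
        public byte[] textureData;
        public int width;
        public int height;
        public TextureFormat format;
        public bool mipmap;

        // Called by UNet when the message is sent.
        public override void Serialize(NetworkWriter writer)
        {
            writer.WriteBytesAndSize(textureData, textureData.Length);
            writer.Write(width);
            writer.Write(height);
            writer.Write((int)format);
            writer.Write(mipmap);
        }

        // Called by NetworkMessage.ReadMessage on the receiving side.
        public override void Deserialize(NetworkReader reader)
        {
            textureData = reader.ReadBytesAndSize();
            width = reader.ReadInt32();
            height = reader.ReadInt32();
            format = (TextureFormat)reader.ReadInt32();
            mipmap = reader.ReadBoolean();
        }
    }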


    Secondly, and very likely tied into the error that you are getting, your StreamServer class is sending a message to a connection every frame!! If you haven't set a target frame rate via Application.targetFrameRate, this could potentially mean that you are sending thousands of messages to that connection per second. Normally, the highest you would ever go with send rates is around 20 messages per second, and that's for high-end games targeting PC/console where there will be a small number of players sending very small messages. So the main issue that you have right now is how often you are sending messages.

    A couple of minor notes:
    • Leave the channel options alone. You really shouldn't need to tweak these unless you know exactly what you are doing. The default 16 pending buffers should be way more than enough.
    • Also note that NetworkServer.SendToClient doesn't allow you to specify the channel, and by the looks of things you've already discovered that this means it gets sent on the default reliable channel. If you are sending a byte-encoded image, you'll very easily be sending more bytes than you'll want to send in a single packet. This means that you ideally want to send that message on a ReliableFragmented channel (which splits large messages into smaller fragments).
    • The NetworkServer class is a little limited. It's built for convenience and expects that you have the two default channels set up (ReliableSequenced and Unreliable). You're going to want to set up an additional ReliableFragmented channel, and use the Send* methods on the NetworkConnection, which you could do directly on the connection that you're getting from NetworkServer.connections[1]. For example: https://docs.unity3d.com/ScriptReference/Networking.NetworkConnection.SendByChannel.html, where you can send your MessageBase instance and specify your custom fragmented channel. There's a minimal sketch of this setup after this list.
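    A minimal sketch of that channel setup (assuming the Unity 5.x UNet HLAPI; the class name FragmentedServer is invented, while message id 1000 and TextureInfoMessage are taken from the code above):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Networking;

    public class FragmentedServer : MonoBehaviour
    {
        int fragmented; // channel id returned by AddChannel

        void Start()
        {
            ConnectionConfig config = new ConnectionConfig();
            config.AddChannel(QosType.Reliable);     // channel 0 (default)
            config.AddChannel(QosType.Unreliable);   // channel 1 (default)
            fragmented = config.AddChannel(QosType.ReliableFragmented);

            NetworkServer.Configure(new HostTopology(config, 1));
            NetworkServer.Listen(4444);
        }

        void SendTexture(TextureInfoMessage msg)
        {
            // SendByChannel on the connection lets you pick the channel;
            // NetworkServer.SendToClient always uses the default reliable one.
            NetworkServer.connections[1].SendByChannel(1000, msg, fragmented);
        }
    }

    Note that the connecting NetworkClient needs the same channel list, in the same order, for the channels to match up.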
    Hopefully this helps. Good luck!
     
  5. Quentin-G

    Joined:
    Dec 20, 2012
    Posts:
    29
    Hello donnysobonny,

    Thanks a lot!!! Things got much clearer thanks to your post. And good news: after some time, I'm finally getting somewhere. I still have a few issues, but I'm able to transfer small images now.

    So I made most of the changes you suggested. I removed my changes to the channel options. I made sure I'm not sending too many images (25 images/sec now). I added a ReliableFragmented channel and, as you suggested, I am now sending my messages with the SendByChannel function of my connection. I also switched to the EncodeToPNG function instead of the GetRawTextureData function, and now I only need to store a byte[] in my message (which the default Serialize function handles easily).

    So now I'm able to transfer small images, but as soon as an image gets too big I receive this error message:
    "NetworkWriter WriteBytes: buffer is too large (143314) bytes. The maximum buffer size is 64K bytes."
    I thought using a ReliableFragmented channel would fix it :/. The other issue is that the framerate is really low.
    Would you have any hints for me? Should I switch back to GetRawTextureData? I guess it might be faster than EncodeToPNG, but then the amount of data to transfer would certainly be much larger?
    Do you think that implementing a Serialize function might change things?

    Thanks a lot again,

    Here is a quick update on my code:
    Code (CSharp):
    public class TextureInfoMessage : MessageBase
    {
        public byte[] textureData;
        public TextureInfoMessage() {}
        public TextureInfoMessage(byte[] d) { textureData = d; }
    }
    Code (CSharp):
    public class StreamServer : MonoBehaviour {
        float lastTime;
        int channel;

        void Start () {
            ConnectionConfig config = new ConnectionConfig();
            config.AddChannel(QosType.Reliable);
            config.AddChannel(QosType.Unreliable);
            channel = config.AddChannel(QosType.ReliableFragmented);
            HostTopology topology = new HostTopology(config, 1);
            NetworkServer.Configure(topology);
            NetworkServer.Listen(4444);
            lastTime = Time.time;
        }

        void Update () {
            // Throttle to ~25 images per second instead of one per frame.
            if (NetworkServer.connections.Count > 0 && Time.time - lastTime > 0.04f)
            {
                lastTime = Time.time;
                TextureInfoMessage msg = MyStreamClass.getTexture();
                NetworkServer.connections[1].SendByChannel(1000, msg, channel);
            }
        }
    }
    Code (CSharp):
    public class StreamClient : MonoBehaviour {
        private bool connected = false;
        NetworkClient myClient;
        RawImage image;

        void Start () {
            myClient = new NetworkClient();
            // The channel configuration must match the server's.
            ConnectionConfig config = new ConnectionConfig();
            config.AddChannel(QosType.Reliable);
            config.AddChannel(QosType.Unreliable);
            config.AddChannel(QosType.ReliableFragmented);
            HostTopology topology = new HostTopology(config, 2);
            myClient.Configure(topology);
            myClient.RegisterHandler(MsgType.Connect, OnConnected);
            myClient.RegisterHandler(1000, getInfoTexture);
            myClient.Connect("127.0.0.1", 4444);
            image = GetComponent<RawImage>();
            image.texture = new Texture2D(2, 2, TextureFormat.RGB565, false);
        }

        void OnConnected(NetworkMessage netMsg) { connected = true; }

        void getInfoTexture(NetworkMessage msg)
        {
            TextureInfoMessage msg2 = msg.ReadMessage<TextureInfoMessage>();
            // LoadImage decodes the PNG, resizes the texture and uploads it.
            ((Texture2D)image.texture).LoadImage(msg2.textureData);
        }
    }
     
    Last edited: Apr 7, 2017
  6. donnysobonny

    Joined:
    Jan 24, 2013
    Posts:
    220
    Okay great, you are on the right path!

    One thing before I continue: Time.time is okay to use in your situation, however it's not as consistent as you might like, since you're sort of treating a frame as a fixed unit of measurement, which it is not. I prefer to use Time.realtimeSinceStartup. Something like this:

    Code (CSharp):
    public float interval = 0.04f;
    float next = 0f;

    void Update () {
        if (Time.realtimeSinceStartup > this.next) {
            this.next = Time.realtimeSinceStartup + this.interval;

            // stuff you need to do each interval...
        }
    }
    Just a little tip...

    Okay so, your next challenge is going to be a little harder to solve. Ultimately, UNet has a set of buffers to store each message in before it gets processed and sent. These buffers are basically fixed-size byte arrays, and as we know already, a fixed-size byte array must be constructed with its intended size. 16-bit values are a more efficient type of integer to send over the network and store internally, so UNet internally creates its buffers at the maximum value of an unsigned 16-bit integer, which is 65535.

    So what this means is, when you call a Send* method, UNet has to store the byte[] somewhere. It does this within a buffer, which has a maximum size of 65535 bytes. The "fragmentation" happens after this, by sending that buffered array in fragments. So your issue now is that UNet cannot store the data that you are passing to Send* within a buffer...

    So ultimately, what you are going to have to do is split the byte[] that you get from encoding the image into smaller chunks, and send those chunks in separate Send* messages, so that you feed the data into the buffers in smaller amounts. You don't want to have too many chunks here, so I would recommend fairly large chunks, maybe around 40k, to give some head-room to the buffers while keeping the chunk count down. At some point, though, you're definitely going to want to consider limiting the size of the images and/or compressing them to ensure they are as small as they can be in memory.

    The challenges you're going to face now:
    • You need to notify your client, before you send any chunks, that you are starting to send chunks.
    • At the same time, it's possible that you might be sending chunks for more than one image at a time. So you need to send a unique ID along with the above notification, so that the client can ready itself for a new set of chunks for a new image.
    • Start sending the chunks, along with the ID, so that the client can identify the chunks as part of a single image (in case you are sending chunks for more than one image at the same time).
    • You will also need to sequence the chunks yourself. I'm not 100% sure, but I don't think the fragmented channel orders the messages correctly, and latency can cause messages to arrive unordered. So as well as an ID, you will need to provide another number which signifies the order in which the chunk was sent.
    • Finally, you need to notify the client, after the final chunk has been sent, that there are no more chunks to send. You will send the ID used above to identify which image this relates to.
    • At this point, take all of the byte[] arrays that came in from the chunks above, merge them into a single byte[] in the order they should be merged (based on the sequence number sent above), and then finally call Texture2D.LoadImage on the merged array.
    As I said, it's going to be a tricky challenge to solve, but the above should cover all of the problems you could run into.
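    A rough sketch of that scheme (names like ImageChunkMessage and message id 1001 are invented for illustration; the start/end notifications are folded into a totalChunks field carried on every chunk):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Networking;
    using UnityEngine.UI;

    // One chunk of one image; UNet's default serialization handles these fields.
    public class ImageChunkMessage : MessageBase
    {
        public int imageId;      // which image this chunk belongs to
        public int sequence;     // position of the chunk within its image
        public int totalChunks;  // how many chunks make up the image
        public byte[] data;      // payload, kept well below the 65535-byte buffer
    }

    public class ChunkedStreaming : MonoBehaviour
    {
        const int ChunkSize = 40960; // ~40k, leaving head-room in UNet's buffers

        public RawImage image;
        Dictionary<int, byte[][]> pending = new Dictionary<int, byte[][]>();

        // Sender: slice the encoded image and send each slice separately.
        public void SendImage(NetworkConnection conn, int channel, int imageId, byte[] png)
        {
            int total = (png.Length + ChunkSize - 1) / ChunkSize;
            for (int i = 0; i < total; i++)
            {
                ImageChunkMessage m = new ImageChunkMessage();
                m.imageId = imageId;
                m.sequence = i;
                m.totalChunks = total;
                int offset = i * ChunkSize;
                int length = Mathf.Min(ChunkSize, png.Length - offset);
                m.data = new byte[length];
                System.Buffer.BlockCopy(png, offset, m.data, 0, length);
                conn.SendByChannel(1001, m, channel);
            }
        }

        // Receiver: collect chunks per image id, merge in sequence order once complete.
        public void OnChunk(NetworkMessage netMsg)
        {
            ImageChunkMessage m = netMsg.ReadMessage<ImageChunkMessage>();
            if (!pending.ContainsKey(m.imageId))
                pending[m.imageId] = new byte[m.totalChunks][];
            byte[][] parts = pending[m.imageId];
            parts[m.sequence] = m.data;

            if (System.Array.TrueForAll(parts, p => p != null))
            {
                pending.Remove(m.imageId);
                int totalLength = 0;
                foreach (byte[] p in parts) totalLength += p.Length;
                byte[] merged = new byte[totalLength];
                int pos = 0;
                foreach (byte[] p in parts)
                {
                    System.Buffer.BlockCopy(p, 0, merged, pos, p.Length);
                    pos += p.Length;
                }
                ((Texture2D)image.texture).LoadImage(merged);
            }
        }
    }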

    Hopefully this helps, keep us posted if you get stuck!
     
  7. donnysobonny

    Joined:
    Jan 24, 2013
    Posts:
    220
    Sorry, I missed this bit!

    I don't personally know a huge amount about the encode methods that you are using, so what I would recommend is to use the Profiler and monitor the different encoding methods. It will show you their resource cost. It will also show you what else is causing the framerate drop.

    You shouldn't see the amount of data being sent over the network causing a drop in framerate (at least not a significant one), but the Profiler will be able to confirm this for you.
     
  8. Quentin-G

    Joined:
    Dec 20, 2012
    Posts:
    29
    Hello all,

    First, thanks a lot donnysobonny for your help.
    After some trouble with Unity's networking system (I couldn't get rid of the lag, even for very small pictures), I ended up using good old C# sockets with TcpClient and TcpListener (so I don't have to worry about the order in which the frames arrive). I also ended up swapping my client and server (the client is now the one sending the video stream).

    So if anyone needs to do the same, here is how I did it:
    On the client side, I
    - encode each frame into a byte array using the EncodeToPNG function
    - compute the number of necessary chunks
    - send the total size of the byte array and the number of chunks
    - send each chunk progressively

    On the server side, I
    - read the total size of the byte array and allocate a buffer the size of my encoded image
    - read the number of chunks
    - read each chunk and progressively write it into my buffer (automatically in the right order, since we are using the TCP protocol)
    - load the texture from the buffer with the LoadImage function
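    A hypothetical sketch of this length-prefixed protocol (the actual code was not posted; all names here are invented). Because TCP is a reliable, ordered byte stream, a blocking read loop is enough to reassemble each frame:

    Code (CSharp):
    using System.Net.Sockets;
    using UnityEngine;

    // Sender side: prefix each PNG-encoded frame with its 4-byte length.
    public class FrameSender
    {
        private readonly NetworkStream stream;

        public FrameSender(string host, int port)
        {
            stream = new TcpClient(host, port).GetStream();
        }

        public void SendFrame(Texture2D frame)
        {
            byte[] png = frame.EncodeToPNG();
            byte[] header = System.BitConverter.GetBytes(png.Length);
            stream.Write(header, 0, header.Length);
            stream.Write(png, 0, png.Length);
        }
    }

    // Receiver side: read the 4-byte length, then exactly that many bytes.
    public class FrameReceiver
    {
        private readonly NetworkStream stream;

        public FrameReceiver(NetworkStream s) { stream = s; }

        public byte[] ReadFrame()
        {
            int size = System.BitConverter.ToInt32(ReadExactly(4), 0);
            return ReadExactly(size);
        }

        private byte[] ReadExactly(int count)
        {
            byte[] buffer = new byte[count];
            int read = 0;
            while (read < count)
            {
                int n = stream.Read(buffer, read, count - read);
                if (n <= 0) throw new System.IO.IOException("connection closed");
                read += n;
            }
            return buffer;
        }
    }

    The received byte[] would then be passed to LoadImage from Unity's main thread, since the texture APIs are main-thread only.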

    Hope it will help someone.
    Thanks again.
    Best regards.
     
    Nems and donnysobonny like this.
  9. Quentin-G

    Joined:
    Dec 20, 2012
    Posts:
    29
    Quick additional comment,

    Using EncodeToPNG was not such a brilliant idea; it turns out to be quite time-consuming on large images. I ended up using the GetRawTextureData and LoadRawTextureData functions. No encoding needed there... hence no lag :).
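    For reference, a minimal sketch of the raw-data variant (hypothetical helpers; both sides must create their textures with the same width, height and TextureFormat):

    Code (CSharp):
    using UnityEngine;

    // Hypothetical helper pair for the raw-pixel variant. Unlike EncodeToPNG,
    // GetRawTextureData does no compression, so the payload is larger but
    // there is no per-frame encode/decode cost.
    public static class RawFrames
    {
        public static byte[] Capture(Texture2D frame)
        {
            return frame.GetRawTextureData();
        }

        public static void Apply(Texture2D target, byte[] raw)
        {
            target.LoadRawTextureData(raw); // fails if size/format don't match
            target.Apply();                 // upload the new pixels to the GPU
        }
    }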

    Thanks again.
     
  10. Shrikky23

    Joined:
    Jun 30, 2013
    Posts:
    24
    Hey Quentin, I am trying to do the same thing. With GetRawTextureData you must be getting a serialized byte array; are you sending that array over the network and deserializing it with LoadRawTextureData? Also, how do you send the size first and the data next? I believe you would be using some kind of header system? Please let me know; any code shared related to this problem, for understanding purposes, would be highly appreciated.
     
  11. loungeintruder

    Joined:
    Oct 30, 2014
    Posts:
    6
    Hi Quentin, I am trying to follow what you did with TcpClient and TcpListener but am hitting a few walls. Is there any possibility that you would be willing to share some or all of the code you used to do this?
     
  12. rajanerve

    Joined:
    Nov 17, 2017
    Posts:
    17
    Hello Shrikky23, I am trying to do the same now. If you succeeded in doing it, please share the code.
    Your help will be much appreciated :)
     
  13. sharimken

    Joined:
    Jul 30, 2015
    Posts:
    14
    Amazing, using raw data does increase the performance a lot!! But when streaming over WiFi it's slow, due to the network speed...
    I found a simple solution with GetRawTextureData and how to load it correctly: you don't have to serialize the bytes yourself.
    You just need to make sure the texture format on the client side is the same as on the server side.
    For example:
    sentTexture = new Texture2D(width, height, TextureFormat.ARGB32, false);
    receivedTexture = new Texture2D(width, height, TextureFormat.ARGB32, false);

    I am a beginner in TCP networking, but after searching for weeks and combining different samples from the Internet, I finally have some draft code for streaming render textures from Mac/PC to iOS for my recent project.
     
    Last edited: Jun 8, 2018
  14. bvicil

    Joined:
    Feb 7, 2017
    Posts:
    7
    @leo238991 Would you please share your piece of code? I am working on a small web conference system with 4 players and a host. I can capture the host's webcam, but just that. I figured out the algorithm, but I cannot code it, because I am new to UNet.
     
  15. sharimken

    Joined:
    Jul 30, 2015
    Posts:
    14
    In my streaming solution I didn't use UNet; I used a TCP server/client. I would like to share, but my code is too messy right now. I am still working on this project for my boss and I cannot post it directly. However, there are some useful links you may check out:
    https://stackoverflow.com/questions/42717713/unity-live-video-streaming/42727918
    This thread inspired me a lot! I knew nothing about TCP at the beginning, but it basically shows you how to send webcam images, multi-threading...

    By the way, have you checked the Asset Store? I saw there is a plugin doing the exact thing you need, and they have a demo version.
     
  16. bvicil

    Joined:
    Feb 7, 2017
    Posts:
    7
    Thanks for your reply. We bought an asset and solved our problem.
     
  17. Max-Bot

    Joined:
    Sep 25, 2013
    Posts:
    83
    A solution for this kind of task is published on the Asset Store: GIGA Video Streamer.
    It integrates in two clicks.
    Open code.
     
  18. Toshihiko

    Joined:
    Aug 23, 2017
    Posts:
    2
    Can you tell me which asset you bought?
    Thanks.
     
  19. bvicil

    Joined:
    Feb 7, 2017
    Posts:
    7
    @Toshihiko
    - for video conferencing we use WebRTC
    - for screen sharing we use NatCorder
     
  20. thelghome

    Joined:
    Jul 23, 2016
    Posts:
    743
    Local live streaming for multiple clients is one of our plugin's features; please check it out.
    It also supports automatic network discovery on the local network.
    FM Exhibition Tool Pack
    https://assetstore.unity.com/packages/slug/143080

    As RPC and the UNet system will be deprecated in the future, we also added a custom Network Action feature in the next update, which was submitted just today.

    PS: All source code is written in C#, free to modify for your specific case.
     
    Last edited: May 22, 2019
  21. sharimken

    Joined:
    Jul 30, 2015
    Posts:
    14
    I was looking for a solution a year ago, and I also solved it a year ago: there was a project that required live streaming a 360 panorama reflection map.
    It may be a bit late to mention this, but I contributed to the asset called "FM Exhibition Tool Pack" which was released recently. The core scripts are very stable and have been used in many public exhibitions.
     
    Last edited: May 22, 2019
  22. IgnacioMartinezCT

    Joined:
    May 7, 2019
    Posts:
    11
    Hi @bvicil, how are you? Do you need both plugins to share the screen and stream it, or is it possible to stream a scene camera with WebRTC?

    Thanks
     
  23. thelghome

    Joined:
    Jul 23, 2016
    Posts:
    743
    FM Exhibition Tool Pack | Forum
    If you are still looking for in-game camera streaming, our plugin may help you.
     
  24. IgnacioMartinezCT

    Joined:
    May 7, 2019
    Posts:
    11
    Hi, but I understood that your plugin is for local streaming, isn't it? I need remote streaming.
     
  25. thelghome

    Joined:
    Jul 23, 2016
    Posts:
    743
    Edited: Our plugin now supports live streaming via WebSocket. We have live streaming demos in both WebGL and HTML, and all native builds.
     
    Last edited: Aug 18, 2019
  26. pate0m1

    Joined:
    Jan 19, 2016
    Posts:
    3
    Hi, is this supported on Android also?

    I have an existing demo which connects remote ends via WebSockets (websocat under Linux) and I'd like to extend this so that the Linux server forwards the video stream to a WebSocket on a RealWear headset.

    Many thanks,
    Mark
     
  27. thelghome

    Joined:
    Jul 23, 2016
    Posts:
    743
    Yes, it's supported! You can stream your headset's game view to a web browser via WebSocket.
    We tested it on an Oculus Go (Android) too.
     
  28. pate0m1

    Joined:
    Jan 19, 2016
    Posts:
    3
    Hi,

    Actually, I want to stream the other way: from the Linux server to the headset (also Android) and view the video there. Is that possible?

    Many thanks,
    Mark
     
  29. thelghome

    Joined:
    Jul 23, 2016
    Posts:
    743
    Currently, our WebSocket demo works with the following logic:
    Headset (view) => WebSocket => all other devices (Android/iOS/Mac/PC/other headsets, etc.)

    Or do you want to stream pre-recorded videos (mp4, etc.) to Android?
     
    Last edited: Sep 10, 2019