
[Release] FMETP STREAM: All-in-One GameView + Audio Stream + Remote Desktop Control(UDP/TCP/Web)

Discussion in 'Assets and Asset Store' started by thelghome, Apr 30, 2019.

  1. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Yes, all the encoders and decoders are paired by Label ID.

    There are a few ways to achieve it:
    1) If you have your own string commands or byte[] data, you may add your own metadata for filtering between headsets (e.g., prepending the serial number of your headset to the string command).
    2) You may also separate the stream data by different ports, by running multiple instances of the node.js server.
    3) Socket.io supports a room feature; you can modify our node.js server example.
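    As a sketch of option 1, here is a tiny hypothetical helper (not part of FMETP's API; all names are illustrative) that prefixes each outgoing byte[] with a headset ID, so each receiver can drop data that isn't addressed to it:

    ```csharp
    using System;
    using System.Linq;
    using System.Text;

    public static class StreamTag
    {
        // Pack as: [1 byte id length][id bytes][payload]. IDs must be <= 255 bytes.
        public static byte[] Tag(string deviceId, byte[] payload)
        {
            byte[] id = Encoding.UTF8.GetBytes(deviceId);
            byte[] packed = new byte[1 + id.Length + payload.Length];
            packed[0] = (byte)id.Length;
            Buffer.BlockCopy(id, 0, packed, 1, id.Length);
            Buffer.BlockCopy(payload, 0, packed, 1 + id.Length, payload.Length);
            return packed;
        }

        // Returns the payload only if the tag matches this receiver's ID, else null.
        public static byte[] TryUntag(string expectedId, byte[] packed)
        {
            int idLen = packed[0];
            string id = Encoding.UTF8.GetString(packed, 1, idLen);
            if (id != expectedId) return null; // not for this headset
            return packed.Skip(1 + idLen).ToArray();
        }
    }
    ```

    You would call Tag() before handing the byte[] to the sender, and TryUntag() in the receive callback before decoding.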
     
    Last edited: Jan 10, 2022
    VastnessVR likes this.
  2. VastnessVR

    VastnessVR

    Joined:
    Nov 21, 2017
    Posts:
    24
    Thanks,
    Sorry I missed that the Encoders have the Label ID.
    That will be OK for the GameViewEncoder, and I'll add an ID at the start of the stream I send between the headset and the viewer app.

    Cheers
     
    thelghome likes this.
  3. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    41
    I am currently using the v1 version.

    My UWP (HoloLens 2) and Windows 10 (PC) apps communicate with each other over UDP.

    When running the app for the first time, it frequently fails to connect, as shown in the screenshot below.

    upload_2022-1-13_18-10-44.png

    Symptom 1: The red part says no connection, but the blue part is connected fine.
    Symptom 2: It occurs most often when the device is first launched.

    Q1. Is there any way to solve this? Would buying v2 solve it?
    Q2. Can I keep using my v1 code if I purchase v2?

    I look forward to your answers. Thank you.
     

    Attached Files:

  4. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    We've made many major improvements in FMETP STREAM V2, including fixes for some known bugs.
    It's hard to troubleshoot the bugs in V1; we recommend that V1 owners consider the 50% off upgrade.

    The code from V1 should be compatible with V2, but we do recommend backing up your project before upgrading.
    A clean installation is also recommended: remove the root folder of the existing plugin (V1) before importing the new version (V2).

    PS: You may also check our release notes:
    https://assetstore.unity.com/packages/templates/packs/fmetp-stream-v2-202537#releases
     
    Last edited: Jan 13, 2022
  5. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    41
    Is there a guide for updating the version from V1 to V2?
     
  6. libra34567

    libra34567

    Joined:
    Apr 5, 2014
    Posts:
    62
    Dear sir/madam,
    I recently bought your asset and gave it a test in an empty project. Everything works fine and it streams great!
    But I ran into an issue importing it into an existing project: it seems to modify pretty much all of my project settings.

    I wonder which set of project-settings changes is actually required for your asset to work nicely?

    Thanks in advance!
     
  7. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    By default, most public functions and variables are compatible.
    Some folder names and the folder structure may have changed, though; thus, removing the old package before importing is recommended.
     
  8. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Simply unticking this toggle will skip the project settings.
    Those project settings are required by the Asset Store upload process, due to the plugin's "Template" category.

    But those settings are not actually required by FMETP STREAM V2 itself.
    Screenshot 2022-01-15 at 6.38.01 PM.png
     
  9. libra34567

    libra34567

    Joined:
    Apr 5, 2014
    Posts:
    62
    Thank you! It's great to know. :)
     
  10. CloudyVR

    CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    715
    Hi, I just purchased your asset. I am trying to stream video from a Unity game on a PC to an Oculus Quest 2 HMD.

    I managed to get everything working, but the latency is ~0.5 seconds. My texture resolution is 960x540, and I keep the quality slider near 5%.

    Is there any way to get even a small performance gain? I need the video stream to be as close to real time as possible, and even at the lowest settings there is still a noticeable delay.

    I don't notice any differences in latency from medium settings to lowest settings.

    Also, I'm using RenderTextures, but I notice GameViewEncoder has additional options like ColorReductionLevel; would that perhaps be any faster?

    Would I be able to increase the number of threads used by decoding, to decrease processing time?

    Is there any better solution than RenderTextures that might be faster? Is there anything I could optimize on Android to make the decoding a little faster? I am completely okay with grayscale images, but switching color modes didn't make any difference and the latency was still ~0.5 seconds.

    I am very grateful for any advice that might help improve the latency even slightly!! Thanks
     
    Last edited: Jan 18, 2022
  11. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    May I know if you are streaming via the local network, or via the Internet?
    There is a latency test (WiFi) we did last year; it should give more or less a similar result for Quest 2.


    Using GameViewEncoder is always recommended instead of TextureEncoder (RenderTexture directly), as we do have some optimised settings.

    You may also refer to our Quest 2 template scene's setup. If you are in a good WiFi environment, it shouldn't be ~0.5 seconds on a local network.
    https://github.com/frozenmistadventure/fmetp_tutorial_questvr_stream


    You're also welcome to reach us directly via email for any suggested setup for your project.
    technical support: thelghome@gmail.com
     
    CloudyVR likes this.
  12. CloudyVR

    CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    715
    Hi! I really appreciate your quick reply and excellent tech support !!

    My PC is an i9-9900 on a Z390 board with a 2080 GPU, and I use a Linksys WRT3200ACN router. I am also far removed from other RF interference.

    My Quest is connected via 5 GHz gigabit WiFi, and my PC is directly connected to the router's ethernet using a CAT6 cable.

    I am able to use a program called VirtualDesktop (h.264) to stream to the Quest 2, and the latency of HD video appears perceptibly instant. So I am confident there is enough bandwidth and that my network is stable.

    I also tried switching to GameViewEncoder and streamed a scene camera. I think I might have noticed an ever-so-slight improvement, but the latency still feels like it's between 0.25 and 0.5 seconds. I still need to actually measure it, though!

    I was wondering, as my application does not require color, would it be possible to encode as grayscale only (single channel)? Could that possibly reduce the data throughput and processing time by ~1/3?

    I need the rendered video on the Quest to have split-second latency, very close to real time, for an AR application! Any latency is very noticeable, so the timeliness of the real-time video matters more to me than image quality or color data. Any way to improve that, I am all ears!

    Thanks again very much for your help!
     
    Last edited: Jan 19, 2022
  13. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    There is an option called "Async GPU Readback" for better performance, but it causes extra latency.
    Please make sure this experimental option is disabled.

    In the Chroma Subsampling option, greyscale should be faster for the encoding too.

    You may also send us a dummy demo scene or some screenshots, so we can investigate it in a better way.
    technical support: thelghome@gmail.com
     
  14. CloudyVR

    CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    715
    Hello! I think I have made slight improvements, but there is still a bit of lag.

    I am looking through the FMExtentionsMethod.cs extensions where
    FMJPGToRawTextureData
    is called to decode the JPG. But I notice there are only 4 supported formats, and they are all RGB-based (multi-channel):
    Code (CSharp):
    case TextureFormat.RGB24:
        tjPixelFormat = TJPixelFormat.RGB;
        break;
    case TextureFormat.RGBA32:
        tjPixelFormat = TJPixelFormat.RGBA;
        break;
    case TextureFormat.ARGB32:
        tjPixelFormat = TJPixelFormat.ARGB;
        break;
    case TextureFormat.BGRA32:
        tjPixelFormat = TJPixelFormat.BGRA;
        break;
    Does this mean that even when setting the Chroma Subsampling option to "grey", it is still encoding and decoding as RGB??? :eek:

    I really want to use grayscale instead of sending the three R, G, B channels; can I use single-channel grayscale?? I see that
    TJPixelFormat
    contains an enum value for Gray (single channel, I believe):
    Code (CSharp):
    public enum TJPixelFormat
    {
        RGB,
        BGR,
        RGBX,
        BGRX,
        XBGR,
        XRGB,
        Gray, //<<<<< How can I encode/decode using this format??
    but
    FMJPGToRawTextureData
    can't be edited, so there is no way for me to specify this format for decoding or encoding!

    I printed/logged the number of chunks being sent by the GameViewEncoder, and I can see clearly that enabling or disabling the Chroma grey option makes almost no change to the number of chunks or bytes transmitted. I would expect gray data to be one third the size of color data?

    Does your encoder/decoder support true single-channel grayscale formats? Or could grayscale somehow be implemented? I really need lower latency and more real-time performance for my project, and I would be happy to contribute or donate (if you accept donations?) for this feature to be added!! Using h.264 is currently not an option for me. Please, can you help me get more speed out of the encoder and decoder?

    I think it would help drastically if I could enable true single-channel grayscale, so the encoder/decoder has less to process (currently it is sending/receiving all RGB channels and then converting back to grayscale..?). Wouldn't a single channel give better real-time video performance? :)

    Thanks
     
    Last edited: Jan 20, 2022
  15. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    For your request, we added a "Mono" option to the GameViewDecoder.
    Please download FMETP STREAM (v2.140) for this feature.

    The GameViewEncoder has already supported the Grey format since previous versions. You may test our new option in the GameViewDecoder.

    And thanks for considering a donation; we have a "Buy Me A Coffee" button at the bottom of our home page.
    Any contribution to support our development will be very much appreciated: https://frozenmist.com/
    Screenshot 2022-01-21 at 6.34.54 PM.png
     
    CloudyVR likes this.
  16. CloudyVR

    CloudyVR

    Joined:
    Mar 26, 2017
    Posts:
    715
    Thank you so much for spending time on that update and adding R8 format! I noticed much faster decoding performance on Quest when using the monochrome format!! I am amazed how well this plugin works for AR. I made a small donation, really wish I could give more. :)
     
    thelghome likes this.
  17. mess0002

    mess0002

    Joined:
    Jan 22, 2015
    Posts:
    2
    Hi! I've just started using FMETP and I'm trying to stream the HoloLens 2 camera over the network. I've seen you mention using a WebCamTexture and putting it on a quad, then streaming a second 'render' camera. Using WebCamTexture is very bad for performance, though; the framerate drops below 10 FPS. Do you know if there's a better way to stream the HoloLens camera?
     
  18. Nigiri-Meshi

    Nigiri-Meshi

    Joined:
    Dec 24, 2020
    Posts:
    9
    I have a question.
    I'm trying to send an mp4 file that is over 2GB.
    I considered sending it with "Action_SendLargeByte", but it cannot be done due to the limitation of the byte[] argument.

    Does FMNetwork (including LargeFileEncoder / LargeFileDecoder) not support files larger than 2GB?
    (E.g., passing byte[] chunks of the file, split with a FileStream, to the argument of Action_SendLargeByte?)


    Asset:FMETP STREAM All-in-One 2.140
    OS:Windows10 64bit(Server/Client)
    Unity:2021.2.0f1
     
  19. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Do you have screenshots of your current setup (Encoder/Decoder, etc.)?
    According to previous customers' feedback, it shouldn't be below 10 FPS. We'd like to investigate your case and see if it can be reproduced on our side.
    For technical support, please feel free to email us: thelghome@gmail.com
     
  20. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    For huge data like 2+ GB, there might be some limitations in the current version of FMNetwork.
    We will investigate this issue and see if we could provide a fallback solution via TCP, which would be a reliable alternative.
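    Pending such a fix, the chunked approach the poster describes can be sketched in plain C# (illustrative only, not FMETP API): read the file with a FileStream in pieces so no single byte[] approaches the 2 GB limit, then hand each chunk to your sender with your own reassembly protocol on the receiving side.

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.IO;

    public static class ChunkedReader
    {
        // Yields the file as a sequence of byte[] chunks of at most chunkSize bytes.
        public static IEnumerable<byte[]> ReadChunks(string path, int chunkSize)
        {
            using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
            {
                var buffer = new byte[chunkSize];
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    var chunk = new byte[read]; // last chunk may be smaller
                    Buffer.BlockCopy(buffer, 0, chunk, 0, read);
                    yield return chunk;
                }
            }
        }
    }
    ```

    Each yielded chunk could then be passed to a sender such as Action_SendLargeByte, though note the thread above reports that very large transfers over UDP are unreliable anyway.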
     
  21. mess0002

    mess0002

    Joined:
    Jan 22, 2015
    Posts:
    2
    Thanks for the reply; I'll send through an email shortly with a few screenshots, but I'm mostly using your setup from the Demo_FMNetworkStreaming scene. I've seen a few other posts about low framerates with the HoloLens WebCamTexture, like this one, but they're quite old.
     
  22. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    We've just scanned your project files really quickly. A few notes below, but you may also refer to our detailed reply via email.
    1) Disabling the "UseFrontCam" option in WebcamManager may work. Our recent test on HoloLens 2 used Unity 2021; you may also compare the difference against your Unity 2020 project.
    2) Your own webcam player script didn't assign the resolution and framerate for the webcam texture; it may return heavy raw data and cause low fps.
    The suggested format below may help:
    Code (CSharp):
    webCamTexture = new WebCamTexture(devicesName, width, height, 30);
    webCamTexture.requestedFPS = 30;
    webCamTexture.Play();
     
  23. Nigiri-Meshi

    Nigiri-Meshi

    Joined:
    Dec 24, 2020
    Posts:
    9
    Thank you for your answer.

    I have some additional questions.
    (Different from >620)

    1. Is "FM Network Action" the one found in "FMCore/Scripts/Networking/Network Action Server (or Network Action Client)"?

    2. The manual states that FM Network Action is deprecated. Is this just an older version?
    Or is there some other reason?

    3. Is it okay to use "Network Action Server" when transferring files within the LAN?
    Or is it better to do it with "FM Web Socket" (FM Network IO Manager)?
    (The file is less than 1GB.)

    4. Can "FM Network Manager" and "Network Action Server" be mixed?



    Asset:FMETP STREAM All-in-One 2.140
    OS:Windows10 64bit(Server/Client)
    Unity:2021.2.0f1
     
  24. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Our "TCP Network Action" examples are deprecated, as FMNetworkUDP replaces them with easier setup and better performance.

    You may consider FMWebsocket for your needs, as it's also a TCP-based protocol.
    However, it's not recommended to mix the old networking system with FMNetworkUDP.
     
  25. Nigiri-Meshi

    Nigiri-Meshi

    Joined:
    Dec 24, 2020
    Posts:
    9
    Thank you for your answer.

    So, for file transfer within the LAN, is it okay to use either FMNetworkUDP or FMWebsocket?

    * Currently, a 200MB file transfer with FM Network UDP does not work well (is my program wrong?), so I am wondering if I should change to FM Websocket.

    * File transfers up to 50KB succeed.
     
    Last edited: Feb 4, 2022
  26. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    FMWebsocket would be recommended for this case, as UDP may suffer packet loss during large file transfers.
     
  27. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    41
    FMServer_Bug.png

    Hello!
    I bought the latest version to update the asset; the version is 1.329.0.

    Currently, the app is being connected in the following environment.

    FMServer : 1 PC (Windows 10)
    FMClient : 2 HoloLens2 (UWP)

    As shown in the attached image, a connection bug occasionally occurs in FMServer.cs.

    The connection bug is as follows.

    1. Yellow part: IsConnected is not checked.
    2. Yellow part: ConnectionCount is 0.
    3. Green part: This is well connected.

    Any guesses as to why this bug is happening in FMServer.cs?

    It happens intermittently, but it seems to happen with a high probability the first time I turn on the device the next day.


     
  28. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Please be reminded that the latest version on the Store is V2, in which the connection-related bug should be solved.
     
  29. Nigiri-Meshi

    Nigiri-Meshi

    Joined:
    Dec 24, 2020
    Posts:
    9
    Thank you for your answer.
    I will go with FMWebSocket.

    I have another question:
    ・Is it possible to communicate between servers?
    I want to synchronize text data between servers.
    * If I simply add two FM Networks and set one as Client and one as Server, an error occurs.
     
    Last edited: Feb 8, 2022
  30. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    I bought and tested the Demo_FMNetworkStreaming case scenario. It shows that another client has been connected, but the video is not transmitted. Why?
     

    Attached Files:

  31. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    I find I can transfer video images on a PC, but not on a Mac. Why?
     
  32. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    In the same case scenario, video transmission between two Macs, or between a Mac and a phone, is not possible; music and strings are OK, and video transmission on PC works fine.
     
  33. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    If you have multiple FMNetworks, you may assign a different listener port to each additional server/client for your "Servers".
     
  34. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    May I have your Unity version, Mac specs, and macOS version? It may help us reproduce this issue on our side.

    Edited: Thanks for reporting this bug; it's been solved in the latest version. Please see the post below.
     
    Last edited: Feb 8, 2022
  35. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    We just released a quick fix for this bug, which was related to a wrong ".meta" info issue in the package.
    Please update to the latest version, and it should now work correctly for macOS builds.
    [v2.142] -fixed meta issue on Mac OS Build
    Screenshot 2022-02-08 at 8.23.49 PM.png
     
  36. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    Hello, thanks for your quick fix; connection and video transmission between macOS programs now work normally.
    However, macOS does not work with mobile iOS or Android, and likewise PC with Android, or PC with iOS. What is the reason for this?
    When the phone is a client, the connection state often flips between true and false, and the computer's connection count is always 0.
    When the phone is a server, the connection count displays 1, but the computer has not successfully connected.

    This is in the FMNetworkStreaming scenario; WebSocket seems to have a similar problem. Can you help fix it?
     

    Attached Files:

  37. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Could you check your firewall settings on macOS? It seems likely that your firewall is blocking the network data.
     
  38. dawnr23

    dawnr23

    Joined:
    Dec 5, 2016
    Posts:
    41
    Even after updating to 1.383.0, the latest version currently supported for v1, the same phenomenon occurs intermittently.

    Do you expect the problem to be resolved if I purchase and update to the newly uploaded v2?
     
  39. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    The firewall on the computer is off.
     
  40. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    There are many improvements and bug fixes in V2, including the connection-count bug.
     
  41. MagicK47

    MagicK47

    Joined:
    Sep 5, 2017
    Posts:
    30
    Thanks for your reply; we found that another mobile phone works OK.
     
  42. Bambivalent

    Bambivalent

    Joined:
    Jan 25, 2017
    Posts:
    16
    Hi, I have a problem. I'm trying to build a pseudo-Skype video chat app over WiFi for Windows, using your test scenes. While video streaming is super smooth both in the editor and in the Windows build, even at maxed-out quality, FPS, and resolution (hence no network problems), the microphone stream is super laggy, lagging 1-3 seconds behind the video stream. Is there anything to consider?
    Thank you for your asset. Best regards
     
    Last edited: Feb 22, 2022
  43. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Since our audio playback uses Unity's default playback system, it has at least ~0.5 seconds of latency as a limitation on Unity's side. The best solution for now is adding some delay in GameViewDecoder, in order to synchronise the video with the audio manually.

    If the latency is more than 1 second, you may modify the "DSP Buffer Size" in the audio settings.
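    For reference, the DSP buffer size can also be changed from script. A minimal Unity sketch, where the value 256 is an illustrative assumption: smaller buffers trade more CPU load and possible crackling for lower audio latency.

    ```csharp
    using UnityEngine;

    public class LowLatencyAudio : MonoBehaviour
    {
        void Awake()
        {
            // Read the current audio configuration, shrink the DSP buffer,
            // and re-initialise the audio system with the new value.
            AudioConfiguration config = AudioSettings.GetConfiguration();
            config.dspBufferSize = 256; // illustrative; tune for your target hardware
            AudioSettings.Reset(config);
        }
    }
    ```

    Note that AudioSettings.Reset() restarts the audio system, so it is best done once at startup rather than mid-stream.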
     
  44. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    @thelghome Hi! I just purchased v2 and I'm having a lot of fun with it. I am having some trouble that maybe you can help me with:

    1) I just want to note that at the moment GameViewEncoder doesn't set the render cam's target texture back to the original one (if present), which breaks connections to other render textures. I fixed this by creating another RenderTexture variable and switching back and forth around the render to the sent texture, as seen in the code below.

    Code (CSharp):
    // This is a new variable
    ort = RenderCam.targetTexture;
    RenderCam.targetTexture = rt;
    RenderCam.Render();

    // apply color adjustment for bandwidth
    if (ColorReductionLevel > 0)
    {
        MatColorAdjustment.SetFloat("_Brightness", brightness);
        Graphics.Blit(rt, rt, MatColorAdjustment);
    }
    // make sure to set it back
    RenderCam.targetTexture = ort;
    2) I'm trying to create a system that lets the user either use auto discovery to connect to the server, or manually type in an IP address on the local network. The problem is, it seems to connect automatically regardless of what's typed into the input field I created, even if Auto Network Discovery is false. Any ideas?
    I'm setting the ClientSettings variables.

    3) Is there a way for the client to force-disconnect from the server, forcing the server to call its OnDisconnect events?

    4) Is it possible to create an intentional delay between the server and the client? I realize this is a strange question, but I want to be able to dynamically set a delay in seconds. It's used for film production setups and needs to be applied to all sent data, not just the picture: transform sync and other variables as well.

    Thanks so much!!
     
    Last edited: Feb 24, 2022
  45. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    Ans1: By default, GameViewEncoder will override the existing RenderTexture when grabbing the frame.
    Ans2: Thanks for reporting this bug to us; we just released a quick fix for this issue in FMETP STREAM V2.200.
    Ans3: Yes, we added new commands for your needs; please refer to this page: https://frozenmist.com/docs/apis/fm-network-udp/
    Ans4: It's not a common use case, but you may queue all our command callbacks in your own script, and add the delay manually with a Coroutine or Invoke().
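    The queuing idea in Ans4 could be sketched roughly like this: a hypothetical generic buffer (not FMETP API; names are illustrative) that holds received items and only releases them after a configurable delay. You would enqueue from the receive callbacks and drain it every Update().

    ```csharp
    using System;
    using System.Collections.Generic;

    public class DelayBuffer<T>
    {
        private readonly Queue<(DateTime arrival, T item)> queue = new Queue<(DateTime, T)>();
        public TimeSpan Delay;

        public DelayBuffer(double delaySeconds) { Delay = TimeSpan.FromSeconds(delaySeconds); }

        // Call from your received-data events (e.g. byte[] or string commands).
        public void Enqueue(T item, DateTime now) => queue.Enqueue((now, item));

        // Call periodically; yields items whose delay has elapsed, in arrival order.
        public IEnumerable<T> Release(DateTime now)
        {
            while (queue.Count > 0 && now - queue.Peek().arrival >= Delay)
                yield return queue.Dequeue().item;
        }
    }
    ```

    Passing the clock in explicitly keeps the buffer testable; in Unity you would call it with DateTime.UtcNow (or track Time.time instead).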
     
  46. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    Awesome, thanks so much for the reply and answers!
    I would recommend adding an option to set the render texture back to its default after rendering, as shown in my example code above. That's just taken and modified from the GameViewEncoder script. Just my two cents; maybe others will agree.

    I just updated to the new version and I'm getting two errors when starting the app:
    This one happens when I try to display the server's local IP on the UI using
    ipText.text = networkManager.LocalIPAddress();
    in Start():
    SocketException: A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied.

    System.Net.Sockets.Socket.Disconnect (System.Boolean reuseSocket) (at <b5226d7672da4aeaa36d32bf1166a63b>:0)
    FMNetworkManager.LocalIPAddress () (at Assets/FMETP_STREAM/FMNetwork/Scripts/FMNetworkManager.cs:42)
    testScript.Start () (at Assets/Scripts/testScript.cs:14)

    And this one happens endlessly as the app is running on both the server and client.
    SocketException: A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied.

    System.Net.Sockets.Socket.Disconnect (System.Boolean reuseSocket) (at <b5226d7672da4aeaa36d32bf1166a63b>:0)
    FMNetworkManager.LocalIPAddress () (at Assets/FMETP_STREAM/FMNetwork/Scripts/FMNetworkManager.cs:42)
    FMNetworkManager.get_ReadLocalIPAddress () (at Assets/FMETP_STREAM/FMNetwork/Scripts/FMNetworkManager.cs:97)
    FMNetworkManager.Update () (at Assets/FMETP_STREAM/FMNetwork/Scripts/FMNetworkManager.cs:484)
     
    Last edited: Feb 24, 2022
  47. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    It's a minor exception; we just released a quick fix, FMETP STREAM V2.201.
    We don't recommend having an existing RenderTexture on your RenderCamera,
    because your original RenderTexture may lose some rendered frames while GameViewEncoder is encoding the frame.
     
  48. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    Thanks for the quick fix and reply. I will experiment more with the Encoder's render texture to see if there are any issues.
     
    thelghome likes this.
  49. JudahMantell

    JudahMantell

    Joined:
    Feb 28, 2017
    Posts:
    476
    Ah, I see what you mean now: when switching the render texture back, the raw image it's displayed on becomes extremely jittery, and any motion blur applied makes the streamed texture look very blurry.

    The TextureEncoder still won't work for me because I can't downsample the image like I can with GameViewEncoder, and it slows down the app significantly (not to mention a worse-looking picture, color-wise).
    Is there any way around this so I can use the GameViewEncoder? Any help would be greatly appreciated!
     
    Last edited: Feb 26, 2022
  50. thelghome

    thelghome

    Joined:
    Jul 23, 2016
    Posts:
    741
    In theory, you may write your own script to clone your high-res RenderTexture into a smaller resolution via Graphics.Blit(),
    and only execute that function when you need it.

    But normally, if you don't have many draw calls in the scene, simply setting up two render cameras would work better,
    because the GameViewEncoder camera only renders the frame when needed, instead of every frame.
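    A minimal Unity sketch of the first suggestion, with illustrative names (nothing FMETP-specific): clone the high-res RenderTexture into a half-resolution copy via Graphics.Blit(), and only when a frame is actually needed.

    ```csharp
    using UnityEngine;

    public class DownsampleSource : MonoBehaviour
    {
        public RenderTexture highRes;   // your original full-res texture
        private RenderTexture lowRes;   // smaller copy handed to the encoder

        void Awake()
        {
            // Half resolution as an example; pick whatever the encoder needs.
            lowRes = new RenderTexture(highRes.width / 2, highRes.height / 2, 0);
        }

        // Call this only when a new frame is needed, not every Update().
        public RenderTexture GrabDownsampled()
        {
            Graphics.Blit(highRes, lowRes); // GPU-side scale, cheap
            return lowRes;
        }
    }
    ```

    The Blit stays on the GPU, so the cost is one extra draw rather than a CPU-side readback.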