
Video player render to texture performance issue in WebGL on Chrome

Discussion in 'WebGL' started by QuentinWarnantSumo, Aug 28, 2019.

  1. QuentinWarnantSumo

    QuentinWarnantSumo

    Joined:
    Jun 25, 2019
    Posts:
    3
    Hello,

    I'm doing some R&D with the WebGL pipeline. I've set up a simple scene with a video player streaming a video and rendering to a render texture. I also have a quad in the scene with a material rendering the render texture.

    This all works, but I noticed some pretty bad performance on the texSubImage2D step in Google Chrome.
    By comparison, Firefox and Edge perform very smoothly.

    Chrome takes 32.9 ms:
    [attachment: upload_2019-8-28_11-4-52.png]

    By comparison, Firefox takes about 3 ms:
    [attachment: upload_2019-8-28_11-13-31.png]

    Edge takes about 6 ms:
    [attachment: upload_2019-8-28_11-20-15.png]



    I'm conscious this may not be a "Unity" issue but rather a browser implementation issue; unfortunately, I can't find much towards a solution on Google's bug-tracking board. One link I found that may be pertinent:
    https://bugs.chromium.org/p/chromium/issues/detail?id=612542

    So my question is: is there a recommended method I can use in Unity that gets past this performance hit on Chrome?

    PS: I'm currently on version 2019.1.7f1

    Thanks,
    Quentin
     
  2. Uli_Okm

    Uli_Okm

    Joined:
    Jul 10, 2012
    Posts:
    94
    Do you really need to use a render texture?
    You can get the original video texture through code, so you don't need a render texture as a "bridge" between them.
    This is an example of how you can do this:

    Code (CSharp):
    // Set an object to use the video as its texture
    targetImage.texture = videoPlayer.texture;

    // Note that by doing this, you will need to manually adjust sizes to keep the video's aspect ratio if needed
     
  3. QuentinWarnantSumo

    QuentinWarnantSumo

    Joined:
    Jun 25, 2019
    Posts:
    3
    Hi there,

    Thanks for your answer.

    I imagine you're talking about a RawImage component. I've just tried it and seen no difference.
    I also made sure the video player's Render Mode was set to API Only.

    In fact, I've even removed any code that reads the video player's texture, meaning I can't see the result of the video, but as soon as I trigger Play, I see the framerate drop, and when profiling the frames I can still see this texSubImage2D call taking around 30 ms. Although upon closer inspection, the 30 ms seems to be inflated by Chrome's profiling tool.

    Still, if I add an FPS counter, I'm not getting a great framerate:

    1 video playing:
    [attachment: upload_2019-8-30_9-40-22.png]

    2 videos playing:
    [attachment: upload_2019-8-30_9-42-40.png]

    (Note that this average is taken over a long period of time, and thus isn't representative of just the period while the video was playing.)
     
  4. QuentinWarnantSumo

    QuentinWarnantSumo

    Joined:
    Jun 25, 2019
    Posts:
    3
    An update on this thread: I had a look at the performance of video-render-to-texture in other WebGL examples in Chrome and noticed these examples, amongst others, performing really well:
    https://threejs.org/examples/#webgl_materials_video
    http://scottmcdonnell.github.io/pixi-examples/index.html?s=basics&f=video.js&title=Video

    0.7 ms on the texImage2D step:
    [attachment: upload_2019-9-2_10-10-42.png]

    Which makes me think it's not Chrome's implementation that's at fault, but rather how Unity has implemented the video-to-texture step.

    However, I noticed that both of these other video players (three.js & PixiJS) end up with a texImage2D call instead of texSubImage2D, and I wonder if this is relevant to the performance difference.
     
  5. jukka_j

    jukka_j

    Unity Technologies

    Joined:
    May 4, 2018
    Posts:
    944
    Any chance you might be able to modify the build output to use texImage2D() instead of texSubImage2D() and compare how that behaves?

    The trouble with these two functions is that different GL driver vendors and different browsers optimize them differently, resulting in different performance characteristics. Some vendors state that texSubImage2D() is the preferred path, since it hints to the driver that reallocation of texture resources is not desired (upload into an existing resource); others say that texImage2D() is the preferred path, since it hints to the driver that the old resource is no longer needed. I recall asking Qualcomm reps at a GDC booth a few years ago, who suggested back then that manually double-buffering texSubImage2D() and bufferSubData() calls would be the fast path, so there is variance there. Layering browsers on top can make this even more complex behavior-wise.

    If it looks like texImage2D() calls are faster on Firefox and Chrome (and preferably also in some Safari scenario), we should definitely migrate to using that instead. If texImage2D() slows down some browsers, then it becomes a bit more difficult; we may need different paths on different browsers.
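    One way to experiment with this without hand-editing the generated build output is to wrap the WebGL context's texSubImage2D in a small shim before Unity starts rendering. This is only a sketch: the patchVideoUploads name, the duck-typed videoWidth check for spotting video sources, and the assumption that the build uses the 7-argument TexImageSource overload are all illustrative, not anything Unity documents:

    ```javascript
    // Sketch: redirect full-frame, video-sourced texSubImage2D uploads to
    // texImage2D. Assumption: this is applied to the WebGL context before
    // Unity starts issuing texture uploads.
    function patchVideoUploads(gl) {
      const originalTexSubImage2D = gl.texSubImage2D.bind(gl);
      gl.texSubImage2D = function (target, level, xoffset, yoffset, format, type, source) {
        // Duck-type check for an HTMLVideoElement-like source (has videoWidth).
        const isVideo = source && typeof source.videoWidth === 'number';
        if (isVideo && xoffset === 0 && yoffset === 0) {
          // Full-texture video upload: use texImage2D, which this thread
          // measured as the fast path in Chrome.
          gl.texImage2D(target, level, format, format, type, source);
          return;
        }
        // Everything else keeps the original behavior.
        originalTexSubImage2D(target, level, xoffset, yoffset, format, type, source);
      };
    }
    ```

    In a real page you would call patchVideoUploads on the canvas's context (right after canvas.getContext("webgl")) before the Unity loader runs, and verify in the profiler that the texImage2D path is actually hit.
    
    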
     
  6. logmaor

    logmaor

    Joined:
    Apr 10, 2019
    Posts:
    2
    I also encountered this problem in Chrome when using texSubImage2D() to upload a texture from an HTML video element.
    A potential solution is creating the texture in WebGL with createTexture() and using texImage2D(), instead of passing a reference from a Unity texture. Then, on the Unity side, you can use Texture2D.CreateExternalTexture to get the texture into Unity and attach it to a material, or do whatever you want with it. However, I can't find any documentation on how to get the native texture pointer in WebGL.

    Does anybody have any idea how to get a native texture pointer in WebGL?
     
  7. logmaor

    logmaor

    Joined:
    Apr 10, 2019
    Posts:
    2
    So I figured out how to pass textures from the WebGL context to Unity with some trial and error.

    In JavaScript, create a texture, give it an integer name, and add it to GL.textures:
    Code (JavaScript):
    const texture = GLctx.createTexture();
    texture.name = texId;
    GL.textures[texId] = texture;
    where texId is some integer ID under which the texture lives. You have to make sure it's not already used by another texture.

    Then on the Unity side you can do something like this:

    Code (CSharp):
    Texture texture = Texture2D.CreateExternalTexture(width, height, TextureFormat.RGBA32, false, false, new System.IntPtr(texId));
    Now we can use gl.texImage2D instead of gl.texSubImage2D to update the texture in WebGL, which is way faster in Chrome/Chromium and slightly faster in Firefox.
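    The browser-side half of this approach could be sketched roughly as below. Note the assumptions: GL.textures and GLctx are internal names exposed by Unity's Emscripten-generated runtime, not a public API, and the linear scan for a free slot is a simplified stand-in for the runtime's own ID allocation; the function names are illustrative:

    ```javascript
    // Sketch: reserve an integer ID that no existing Unity texture is using,
    // create a WebGL texture under it, and return the ID for C# to wrap via
    // Texture2D.CreateExternalTexture(new IntPtr(texId)).
    function allocateExternalTexture(GL, GLctx) {
      let texId = GL.textures.length;
      while (GL.textures[texId]) texId++; // skip any occupied slots
      const texture = GLctx.createTexture();
      texture.name = texId;
      GL.textures[texId] = texture;
      return texId;
    }

    // Sketch: re-upload the current video frame with texImage2D (the fast
    // path in Chrome per this thread) instead of texSubImage2D. Assumes the
    // video element is ready to be sampled (readyState >= HAVE_CURRENT_DATA).
    function updateFromVideo(GL, GLctx, texId, video) {
      GLctx.bindTexture(GLctx.TEXTURE_2D, GL.textures[texId]);
      GLctx.texImage2D(GLctx.TEXTURE_2D, 0, GLctx.RGBA, GLctx.RGBA,
                       GLctx.UNSIGNED_BYTE, video);
    }
    ```

    updateFromVideo would be called once per frame (e.g. from requestAnimationFrame or a video frame callback) while the video is playing.
    
    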
     
  8. Utarastas

    Utarastas

    Joined:
    Jul 27, 2017
    Posts:
    11
    Does this still work? I have been trying for a while to use JavaScript to create a texture and then point to it from the Unity side. The code runs and the video is displayed, but I can still see texSubImage2D being called in the Chromium performance window and still observe around 10-12 FPS on our target device (Raspberry Pi 4B, WebGL build, Chromium, 720p), meanwhile examples like this: https://threejs.org/examples/?q=video#webxr_vr_video run fine (so does YouTube at 1080p).
     
    Last edited: Mar 6, 2021
  9. Marks4

    Marks4

    Joined:
    Feb 25, 2018
    Posts:
    491
    Any solution to this? How does this issue not happen with Unity's WebCamTexture? There is no FPS drop when using Unity's WebCamTexture.