WebGL transparent background

Discussion in 'Unity 5 Pre-order Beta' started by mikelis, Dec 9, 2014.

  1. mikelis

    mikelis

    Joined:
    Jan 7, 2013
    Posts:
    5
    I work at a broadcast company, and we have decided to use Unity for real-time graphics; a transparent WebGL canvas background is very important for us.
    Is a transparent WebGL background supported?
     
  2. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,531
    I assume you mean rendering WebGL content on top of other HTML elements, and showing those elements through the transparent parts of the WebGL content.

    This is not possible right now, but it would probably not be difficult to add by making the alpha channel in the frame buffer transparent. The main difficulty would be getting your transparency information into the alpha channel (as many shaders use it for other purposes), which might mean writing your own shaders (use the alpha render mode in the Scene view to see what is written there).

    Could you explain your use case a bit more? Before we consider adding this, I'd like to know why people might want to use it, because even if it might be easy to add, it would be another feature we have to keep supporting and testing in the future.
     
  3. mikelis

    mikelis

    Joined:
    Jan 7, 2013
    Posts:
    5
    Yes, that's correct.
    Thanks for the info.

    This video shows what we do:


    Currently we are using Flash, but as we progress we want to use more advanced technologies and build 3D panels. I have tested some Unity WebGL demo games in live graphics software, and the performance is awesome.
    Some of our new projects are built with three.js (with a transparent canvas), but that is development overhead for us, because we are a small team. So I see great potential for Unity in that field.
     
  4. rmast_taittowers

    rmast_taittowers

    Joined:
    May 19, 2014
    Posts:
    2
    What are you ultimately mixing your video in? If your broadcast switcher supports masking from a luma mask, it might be more straightforward to render two images per frame -- your graphics, and the mask.
     
  5. mikelis

    mikelis

    Joined:
    Jan 7, 2013
    Posts:
    5
    We are using CasparCG for live graphics and a TriCaster for mixing. Rendering two images per frame seems hacky. Tell me more about it.
     
  6. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,531
    I see. But why are you using WebGL for this? Couldn't you run this more efficiently in a standalone build? There you could bring the video in using a frame-grabbing device, render it to a WebCamTexture, and then do whatever you want with it. Even on the web, it might be nicer to find a way to get the video feed into Unity than to find a way to render Unity transparently onto the video feed. That way you would not have to worry about shader alpha, and you would have more freedom to use the video in different ways. (Neither is easily supported out of the box by Unity; I'm just trying to understand the use case better.)
     
  7. mikelis

    mikelis

    Joined:
    Jan 7, 2013
    Posts:
    5
    Yes, standalone is more efficient, but our use case is a little bit different:
    The WebGL or Flash content is given to the live graphics system, which extracts the RGB and alpha channels. Both channels are then sent to the TriCaster over the network (iVGA) and mixed on top of the video. The final video is then streamed, for example, to YouTube.
    This workflow solves multiple problems: the hardware setup is easy, the live graphics are independent of the video, and if the graphics crash, the video stream continues undisturbed.
     
  8. Grigaluns

    Grigaluns

    Joined:
    Dec 9, 2014
    Posts:
    1
    Hello,

    I'm a colleague of Mikelis.

    rmast_taittowers
    Yes, our broadcast switchers support luma keying, and it is actively used throughout the process. The CasparCG playout server takes care of generating both video streams (fill + key) and keeping them in perfect sync. There are two transport mechanisms that can be used to get graphics from the rendering machine to the switcher: the first uses two SDI cables -- a Blackmagic DeckLink 4K Extreme card on the rendering machine outputs the two video streams, and the TriCaster mixes them. The other does basically the same thing, only over the network -- NewTek (the switcher vendor) has provided the protocol and tools to make that possible.

    So the basic concept is correct -- render two frames and then forward them to the switcher. Actually implementing it would involve either using the DeckLink SDK or reverse engineering NewTek's iVGA protocol; neither is impossible, of course, but we'd like to avoid the extra development effort if possible. At the moment we can afford the overhead of running Unity's WebGL export in CasparCG's built-in browser; the only thing holding us back is a transparent background.

    jonas
    Rendering video to a texture will have to do for now for development and demo purposes, but I doubt it will be good enough for production -- audio sync/delayed video, and framerate mismatches between the input video and the output video + graphics, are two problems that come to mind. Not touching the source video is the easiest way to ensure it doesn't lose any quality, which is extremely important for us. It does open the door to more impressive effects, though, so we might want to return to this in the future.
     
  9. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,531
    Ok. I get the idea, but I am still reluctant to consider adding this as a feature in Unity, because it is very much a niche case we'd need to support, maintain, and QA, and because, given the shader output issues I explained above, it would not be easy to use.

    That said, because it's so easy, I'll offer you an (admittedly hackish) solution you could try for now. Take the code below, put it into a .jslib file, and drop that into your project. It overrides the implementation of glClear to do nothing when only the alpha channel is being cleared, thus skipping the step where we clear the alpha buffer and giving you transparency in WebGL.

    Code (javascript):

    var LibraryGLClear = {
        glClear: function(mask)
        {
            if (mask == 0x00004000) // GL_COLOR_BUFFER_BIT
            {
                var v = GLctx.getParameter(GLctx.COLOR_WRITEMASK);
                if (!v[0] && !v[1] && !v[2] && v[3])
                    // We are trying to clear alpha only -- skip.
                    return;
            }
            GLctx.clear(mask);
        }
    };

    mergeInto(LibraryManager.library, LibraryGLClear);
     
  10. mikelis

    mikelis

    Joined:
    Jan 7, 2013
    Posts:
    5
    Thanks, it works as expected :)