
Playing 360 videos with the VideoPlayer

Discussion in '5.6 Beta' started by DominiqueLrx, Mar 16, 2017.

  1. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
    Hi everyone!

    I finally took a bit of time to produce a simple 360 VideoPlayer example project with instructions.

    For the Impatient
    1. Download the Unity package here: https://oc.unity3d.com/index.php/s/kceb0rAE9fDcdqE
    2. Import it into Unity using Assets -> Import Package -> Custom Package ...
    3. Open the Scene.
    4. Hit play.
    5. Move the mouse to look around.
In 5.6, the VideoPlayer doesn't offer workflow enhancers to streamline the use of 360 videos, so everything must be done by hand. There are a handful of manual steps, all explained below in enough detail for you to understand exactly what is happening and why.

    What's a 360 Video?

    What we call a 360 video can take many forms. One of the most common forms is a movie with a 2:1 aspect ratio, often 3840x1920, that uses an equirectangular projection to unwrap a sphere into a rectangle, as described here: https://en.wikipedia.org/wiki/Equirectangular_projection

When viewed without being mapped back onto a sphere, these videos look deformed, especially at the top and bottom, which correspond to the sphere's poles.

    Creating a Projection Screen Inside Unity

As mentioned above, for a 360 video to look right, it must be viewed applied to the inside of a sphere. This can be done in Unity but typically poses two (solvable) problems:
    1. One cannot directly apply a texture to the inside of a sphere.
    2. Even once the first problem is overcome with normal-inversion tricks, applying a texture inside a sphere leaves visible artifacts at the poles.
    Both of these problems can be overcome with a relatively simple shader; see the details further down. So our projection "room" will be composed of a camera sitting at the center of a sphere that uses this shader.

    Looking Around

    Since our projection screen wraps the camera completely, the camera can only look at a small portion of its inside surface at any given time. The demo project includes a script that lets you control the camera orientation using the mouse.

Future VR integration might automate this so one doesn't have to add these inspection tools to the scene; this would be conceptually similar to the Scale control at the top of the Game View.
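
    For reference, here is a minimal sketch of what such a mouse-look script could look like (the actual MouseFollow.cs bundled in the package may differ in details):
    Code (CSharp):
    using UnityEngine;

    // Minimal mouse-look sketch: accumulates mouse movement into yaw/pitch.
    public class MouseLook : MonoBehaviour
    {
        public float sensitivity = 2.0f;

        float yaw;
        float pitch;

        void Update()
        {
            yaw += Input.GetAxis("Mouse X") * sensitivity;
            pitch -= Input.GetAxis("Mouse Y") * sensitivity;
            pitch = Mathf.Clamp(pitch, -89f, 89f);
            transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
        }
    }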


    Playing The Video

Once you have the camera (with its mouse control) and the sphere-screen prepared, playing back a video in this scene is just a matter of importing a 360 video into Unity and dragging and dropping the resulting Video Clip onto the sphere.

    This drag-and-drop is a workflow helper that will automate for you:
    1. The creation of the Video Player
    2. The initialization of the Video Player's Render Mode to "Material"
    3. The selection of the _MainTex texture parameter in the first material of the sphere's renderer as the target for the video playback.
    Future VR integration in the Video Player might augment the offered Render Mode options to recognize that the video is 360 footage and add options for the interaction with the camera such as projection type, field of view, etc. It might also free you from having to create a sphere and render directly into the scene.
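
    For those who prefer to do this from script, here is a sketch of the equivalent setup. It assumes a VideoClip asset is assigned and the script sits on the sphere; the property names follow the 5.6 VideoPlayer API:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: what the drag-and-drop automates, done manually from script.
    public class SetupSphereVideo : MonoBehaviour
    {
        public VideoClip clip;

        void Start()
        {
            // 1. Create the Video Player.
            var vp = gameObject.AddComponent<VideoPlayer>();

            // 2. Set the Render Mode to target a material.
            vp.renderMode = VideoRenderMode.MaterialOverride;

            // 3. Target _MainTex in the first material of this renderer.
            vp.targetMaterialRenderer = GetComponent<Renderer>();
            vp.targetMaterialProperty = "_MainTex";

            vp.clip = clip;
            vp.Play();
        }
    }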

    Doing It From Scratch

    The Unity package downloaded earlier has a working example of all the pieces assembled together. But if you want to redo each step for yourself to better understand how everything interacts, here is a step-by-step list of operations:
    1. Create a new scene
2. Copy the 360 equirectangular shader (360Equirectangular.shader) into the Assets folder. Have a look inside to familiarize yourself with the simple math involved.
    3. Create a new Material named InsideSphere.
    4. Change the InsideSphere material's shader to 360/Equirectangular so it uses the shader you imported earlier.
    5. Create a Sphere using the GameObject->3D Object->Sphere menu.
    6. In the Sphere's Mesh Renderer, set the material to use the InsideSphere material created above.
    7. Set the Main Camera Position to 0, 0, 0 so it sits in the center of the sphere.
    8. Copy the MouseFollow.cs script into your Assets folder.
    9. Drag-and-drop the MouseFollow script onto the Main Camera so the mouse movements will control the camera orientation.
    10. Copy the 360_test_foggy_park_001.mp4 video into the Assets folder.
    11. Drag and drop the Video Clip onto the Sphere in the Hierarchy view. This will add a VideoPlayer component to the Sphere with the clip already set as its source, and that targets the current game object's renderer.
    If the 360 video clip has audio in it and you want to hear it:
    1. Set the VideoPlayer's Audio Output Mode to Audio Source
    2. Add an AudioSource component on the sphere
    3. Drag and drop the Audio Source (by clicking on the Audio Source title label) into the Audio Source field in the VideoPlayer editor.
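
    The same audio hookup can also be done from script; a sketch, assuming the VideoPlayer from the steps above is on the same GameObject:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: route the clip's first audio track through an AudioSource.
    public class SetupVideoAudio : MonoBehaviour
    {
        void Start()
        {
            var vp = GetComponent<VideoPlayer>();
            var audioSource = gameObject.AddComponent<AudioSource>();

            vp.audioOutputMode = VideoAudioOutputMode.AudioSource;
            vp.EnableAudioTrack(0, true); // must be set before preparation
            vp.SetTargetAudioSource(0, audioSource);
        }
    }
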
    The Equirectangular Shader

The shader used to texture the video onto the inside of the sphere (360Equirectangular.shader) uses a standard cartesian-to-spherical coordinate conversion.

    The fragment's normal, rather than its position, is used to produce texture coordinates; this is exactly the inverse of the operation performed when the 360 camera's pixels were unwrapped into a rectangular frame. The calculation is simplified by normalizing the sphere radius to 1, since ultimately we only need the inclination and azimuth angles, which are converted back to [0, 1] values to use as texture sampling coordinates. This saves a few divisions.

    The trick that performs the normal inversion hinted at earlier in this post is taken care of by the Cull Front directive at the top of the shader pass. This is one way to have the objects rendered "inside out" as described here: https://docs.unity3d.com/Manual/SL-CullAndDepth.html
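
    For reference, here is a minimal sketch of what such a shader can look like. The ToRadialCoords helper and the _MainTex name match the snippet quoted later in this thread, but the actual 360Equirectangular.shader in the package may differ in details:
    Code (CSharp):
    Shader "360/Equirectangular"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Pass
            {
                // Render the inside of the sphere by culling front faces.
                Cull Front

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float3 normal : TEXCOORD0;
                };

                v2f vert (appdata_base v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.normal = v.normal;
                    return o;
                }

                // Convert a direction on the unit sphere to equirectangular UVs.
                inline float2 ToRadialCoords(float3 coords)
                {
                    float3 n = normalize(coords);
                    float latitude = acos(n.y);        // inclination, [0, pi]
                    float longitude = atan2(n.z, n.x); // azimuth, [-pi, pi]
                    float2 sphereCoords = float2(longitude, latitude)
                                        * float2(0.5 / UNITY_PI, 1.0 / UNITY_PI);
                    return float2(0.5, 1.0) - sphereCoords; // remap to [0, 1]
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    return tex2D(_MainTex, ToRadialCoords(i.normal));
                }
                ENDCG
            }
        }
    }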

    Final Note

    I'll probably convert this into a blog post when Unity 5.6 is officially released. Feel free to make suggestions on how this could be improved to be clearer or more useful.

    Dominique
    A/V developer at Unity
     
  2. MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,091
I ran the demo you set up.
    The shader seems to be problematic, as it shows the video as one big green (or light brown) sphere.
    When the material's shader is set to Standard, it shows the video's texture (but of course not correctly).
     
  3. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
    What platform did you try this on? I've done this on OSX in case it helps. I'll try it on Windows when I have time.
     
  4. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
    Hi again,

I updated the content. On Windows, the project's Graphics API needed to be set to DX9, and I updated the bundled movie; the original one didn't play on my Windows machine, and we'll figure out why later on.

    Sorry for the trouble!

    Dominique
     
  5. MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,091
    No worries!
    Happy to report the issues, I was indeed using a Windows 10 machine.
    Got my fingers crossed for the compatibility and stability of the video player on all platforms.
     
  6. morfaine

    Joined:
    Oct 1, 2015
    Posts:
    9
Thanks for implementing the new player. I've been testing 360 video playback with my own shader and code, and it's working well. I'd like to stream the content from a URL for one of my projects, and I have some questions.

    1. Will there be any way to retrieve the video metadata (length, height, width, etc.) when streaming from a URL?
    2. When seeking while streaming, does the player need to download the video up to that point before being able to seek forward? If the video is an hour long, will the player need to buffer an hour of video before being able to seek that far ahead?

    Many Thanks,
    Morfaine
     
  7. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
    Hi Morfaine!

    You have two good questions and I think I can help with both.
When streaming from a URL, the video information becomes available once preparation completes. The VideoPlayer has API entry points exactly for the purpose of waiting for this information to become ready. Of course, the information is available immediately when you are playing a VideoClip, but not for a URL, as you have found out. Here's an example of what you could do:
    Code (CSharp):
public class WaitForInfo : MonoBehaviour {

        void Start()
        {
            var vp = GetComponent<UnityEngine.Video.VideoPlayer>();
            vp.prepareCompleted += Prepared;
            vp.Prepare();
        }

        void Prepared(UnityEngine.Video.VideoPlayer vp)
        {
            var width = vp.texture.width;
            var height = vp.texture.height;
            var duration = vp.frameCount / vp.frameRate;

            // Fake entry point...
            UpdateTheUserInterfaceWithThePlayerInfo(width, height, duration);

            // Play immediately if this is what you want.
            vp.Play();
        }
    }
Note that for now there is no notification about buffering, so while you wait for the information to become available (which could take arbitrarily long for a URL), all you can do is implement some form of spinning cursor with a timeout to avoid waiting forever. We'll improve this in a subsequent release.
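
    Until then, a sketch of such a wait-with-timeout (the timeout value is a placeholder, and drawing an actual spinner is left to your own UI code):
    Code (CSharp):
    using System.Collections;
    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: wait for Prepare() with a timeout, since there is no
    // buffering notification yet.
    public class PrepareWithTimeout : MonoBehaviour
    {
        public float timeoutSeconds = 30f;

        IEnumerator Start()
        {
            var vp = GetComponent<VideoPlayer>();
            vp.Prepare();

            float elapsed = 0f;
            while (!vp.isPrepared && elapsed < timeoutSeconds)
            {
                // Update your spinning cursor here.
                elapsed += Time.deltaTime;
                yield return null;
            }

            if (vp.isPrepared)
                vp.Play();
            else
                Debug.LogWarning("Video preparation timed out.");
        }
    }
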
As for seeking: this part of the implementation is handled by the native libraries we use, so we have no control over it. Some of the implementations we have seen use mechanisms such as HTTP range requests to fetch arbitrary parts of the file without having to download it completely. This also depends on how the server supplying the content is implemented, so you will have to experiment to find out. We will probably gain more information about this over time and be able to share it in per-platform notes at some point.

One thing I can tell you, however, is that our software VP8 implementation will download the whole file before playing back. This implementation is used on all platforms except Android. We'll of course improve this later on, but at least here you know exactly what to expect!

    Hope this helps,

    Dominique
    A/V developer at Unity
     
  8. morfaine

    Joined:
    Oct 1, 2015
    Posts:
    9
    Many thanks for the reply Dominique.

I was expecting the VideoPlayer.clip property on the VideoPlayer instance to be updated to a new VideoClip instance containing all the metadata once the isPrepared flag was true. I see from your example code that I need to query the VideoPlayer's texture and frame properties instead. Thanks for the heads up.

Thanks for the info. Will you be implementing HLS or DASH support in the new VideoPlayer? This would help with the seek features I'm interested in.
     
  9. mattSydney

    Joined:
    Nov 10, 2011
    Posts:
    171
Can you include a video with sound? I would like to see a 360 video working with correctly set up spatial audio, please.
     
  10. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
Very good point, Matt. This video is a placeholder for actual original content being produced; I'll make sure the next version has a usable audio track in it.

    Dominique
     
    mattSydney likes this.
  11. jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
How would you go about doing this for stereo content in VR, where one part of the video needs to go specifically to one eye and another part specifically to the other?

    I am having trouble figuring out how to essentially cut the frame into two sections that are then rendered by the proper eye cameras.
     
  12. DominiqueLrx

    Unity Technologies

    Joined:
    Dec 14, 2016
    Posts:
    260
    Good question, and good idea for a follow-up demo scene.

    I haven't tried this myself, but this blog post explains how to do it with the MovieTexture, essentially using the tiling features of texture parameter inputs: http://bernieroehl.com/360stereoinunity/

So here, you would:

    1. create one sphere per eye
    2. have the VideoPlayer produce its output into a RenderTexture (it has a render mode for this)
    3. use the 360 Equirectangular shader described in my demo on each sphere
    4. use the RenderTexture as input to both spheres, with each sphere's texture parameter tiling controls set as described in the blog post linked above (a script sketch follows below)
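
    A sketch of what steps 2 and 4 could look like from script, assuming top-bottom packed footage and two spheres already using the equirectangular material (note that the shader must apply _MainTex's tiling/offset, i.e. use TRANSFORM_TEX, for the tiling to take effect):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: one VideoPlayer rendering into a RenderTexture shared by two
    // spheres, each sphere sampling one half of a top-bottom stereo frame.
    public class StereoSphereSetup : MonoBehaviour
    {
        public VideoPlayer videoPlayer;
        public Renderer leftSphere;
        public Renderer rightSphere;

        void Start()
        {
            var rt = new RenderTexture(3840, 1920, 0);
            videoPlayer.renderMode = VideoRenderMode.RenderTexture;
            videoPlayer.targetTexture = rt;

            // Top half for the left eye, bottom half for the right eye.
            leftSphere.material.mainTexture = rt;
            leftSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
            leftSphere.material.mainTextureOffset = new Vector2(0f, 0.5f);

            rightSphere.material.mainTexture = rt;
            rightSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
            rightSphere.material.mainTextureOffset = new Vector2(0f, 0f);

            videoPlayer.Play();
        }
    }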

    Post your results here if you have time to try this out!

    Dominique
     
  13. Selzier

    Joined:
    Sep 23, 2014
    Posts:
    652
I'm getting an error when importing the mp4.

    I couldn't find any 360-specific codecs, so I downloaded the VLC 360 player and updated my video codecs with K-Lite, but I still have the error. It seems to be related to the audio track only. I can play the video with Windows Media Player, VLC, etc., but there is no audio.
    - Windows 7 64bit
    - Unity 5.6.0f1

    Any ideas?
     
  14. Selzier

    Joined:
    Sep 23, 2014
    Posts:
    652
I'm going through the Equirectangular Shader, but here's the problem I'm seeing right now.

    - The airshow.mp4 gave me encoding errors; see my previous post.

    - I rendered a 360 video with VR Panorama 360 PRO Renderer (https://www.assetstore.unity3d.com/en/#!/content/35102) and imported the MP4 into Unity; I get an encoding error about the video track and 1cva.

    - I uploaded the video to YouTube and then used YouTube's option to "Download MP4". This file works in Unity (no errors), but the scene does not look correct (upside down, etc.) and rotation does not work correctly.

    - I downloaded the "Surfers" video from VLC: http://people.videolan.org/~jb/Builds/360/ This video again gave me no errors, and its audio even plays in Unity, but I have the same upside-down and no-rotation problem.
    Thanks for any help!
     
  15. Selzier

    Joined:
    Sep 23, 2014
    Posts:
    652
I've got it working using basically an Unlit > Texture shader, and everything works great in the Editor.

    The only problem now is that when I export to Android, the video player only shows up as black, so I don't see the video. Again, everything looks and works great in the Editor, but on Android it only shows a black texture. Thanks!
     
  16. ProgrammerSteve

    Joined:
    Jul 13, 2014
    Posts:
    1
I've got this to work on Windows using DX11 by changing the shader's frag function. The argument should be v2f instead of float3. It still works on OSX as well.

    Code (CSharp):
fixed4 frag (v2f i) : SV_Target
    {
        float2 equirectangularUV = ToRadialCoords(i.normal);
        return tex2D(_MainTex, equirectangularUV);
    }
     
    antonheidar, nwxp, VINCENTEGG and 3 others like this.
  17. rschu

    Joined:
    Nov 22, 2016
    Posts:
    5
    Does the new Video Player also support HLS or DASH streams for 360?
     
  18. kru64

    Joined:
    Mar 25, 2017
    Posts:
    2
Selzier, if you check the properties of the file Surfers_360.mp4, you see 'width of frame=2048, height=1024'.
    But if you check the properties of airshow.mp4, you see 'width of frame=, height='. This metadata is absent from the file, and that's the reason Unity can play some files but not others.
     
  19. theblitz

    Joined:
    Aug 22, 2016
    Posts:
    11
I tried to use the video given in the sample code (airshow.mp4), but it simply doesn't work. Something is wrong with the file.
    I tried using a video I downloaded from elsewhere, but for some reason it shows upside down in Unity, and when I build for Android it doesn't show at all.

    I have managed to get this to work in 5.5 with a simple sphere and texture, but not here.
     
  20. saik007

    Joined:
    May 17, 2016
    Posts:
    9
Guys, I'm using Unity 5.6.0b9, and all I want to know is whether the new video player in Unity 5.6 supports Android or not. I've tried it several times, but the Unity documentation currently says it only supports Standalone and WebGL, and Unity has acknowledged this as a bug: https://issuetracker.unity3d.com/issues/android-videoplayer-not-playing-video

    All I want to know is: for now, does Unity 5.6.0b9 support video playback on Android?


    Regards
    sai.
     
  21. saik007

    Joined:
    May 17, 2016
    Posts:
    9


I don't think we need two spheres for VR.
    I already made an app that plays 360 videos in VR mode for Google Cardboard using an inverted sphere and EasyMovieTexture; all you have to do is drag and drop the GvrMain prefab and it handles everything for you.
     
  22. spacefrog

    Joined:
    Jun 14, 2009
    Posts:
    734
This is the approach that works with GoogleVR, since it creates two cameras itself, which you can set up to cull the other eye's content.
    But AFAIK Unity's built-in native VR requires you to create a separate camera for each eye, each culling everything except that eye's content. A third camera, which renders the common content (what appears the same for each eye) and is used for input raycasting, should render on top of the L/R eye cameras (set the cameras' depth values accordingly, and be sure to set the third camera to clear depth only).
    You also have to set the "Target Eye" property of those cameras correctly, of course: "Both" for the third camera, and "Left" and "Right" for the separate eye cameras.
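
    A sketch of the same setup from script (the layer names here are hypothetical; use whatever layers hold your per-eye spheres):
    Code (CSharp):
    using UnityEngine;

    // Sketch: assign each camera to one eye and cull the other eye's layer.
    public class EyeCameraSetup : MonoBehaviour
    {
        public Camera leftCam;
        public Camera rightCam;
        public Camera commonCam; // renders content common to both eyes

        void Start()
        {
            leftCam.stereoTargetEye = StereoTargetEyeMask.Left;
            leftCam.cullingMask = ~(1 << LayerMask.NameToLayer("RightEyeOnly"));

            rightCam.stereoTargetEye = StereoTargetEyeMask.Right;
            rightCam.cullingMask = ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));

            commonCam.stereoTargetEye = StereoTargetEyeMask.Both;
            commonCam.depth = Mathf.Max(leftCam.depth, rightCam.depth) + 1; // on top
            commonCam.clearFlags = CameraClearFlags.Depth; // clear depth only
        }
    }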
     
    ShantiB95 likes this.
  23. MaskedMouse

    Joined:
    Jul 8, 2014
    Posts:
    1,091
I don't understand: what is the difference between using an equirectangular shader and using a sphere with inverted normals and an unlit shader?
     
  24. saik007

    Joined:
    May 17, 2016
    Posts:
    9


If you don't have access to any modeling software, or you don't know how to work in one, you can use shader programming to create an equirectangular shader. Otherwise, you can simply invert the normals of the sphere in Maya or Max and import it into Unity. I think this can also be done in Unity with some code, but it's nicer to do it in a modeling package.
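
    For completeness, a sketch of inverting a sphere's normals from code in Unity, rather than in a modeling package:
    Code (CSharp):
    using UnityEngine;

    // Sketch: flip a mesh inside out by inverting normals and winding order.
    public class InvertMesh : MonoBehaviour
    {
        void Start()
        {
            var mesh = GetComponent<MeshFilter>().mesh;

            // Invert normals so the inside is lit/textured as the front.
            var normals = mesh.normals;
            for (int i = 0; i < normals.Length; i++)
                normals[i] = -normals[i];
            mesh.normals = normals;

            // Reverse triangle winding so back-face culling shows the inside.
            var triangles = mesh.triangles;
            for (int i = 0; i < triangles.Length; i += 3)
            {
                int tmp = triangles[i];
                triangles[i] = triangles[i + 2];
                triangles[i + 2] = tmp;
            }
            mesh.triangles = triangles;
        }
    }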
     
  25. saik007

    Joined:
    May 17, 2016
    Posts:
    9


I know this is how Unity works to split the screen for Google Cardboard; that's the reason I mentioned there's no need to create two spheres and two cameras and then handle the coding, camera arrangement, and culling masks. We can just import the GVR SDK and, using the GvrMain prefab, do it with ease. That's what I meant to say.