
[Released] AVPro Video - complete video playback solution

Discussion in 'Assets and Asset Store' started by AndrewRH, Feb 12, 2016.

  1. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    That's strange. Are you using the latest version of AVPro Video? Perhaps you could send us your project if the problem persists?

    Thanks,
     
  2. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Just wondering if you have tried the "offset" feature in the MediaPlayer (under Platform Specific > Android)? This lets you embed your video files within other files, which makes it harder for people to "rip" your videos. This means you can have MP4 video files in your StreamingAssets folder that will not play in other players, but will play in yours.

    I think this might be an easier way to achieve file hiding than putting the video files into the main APK (which is just a ZIP file and can be easily read).

    I hope that helps.
     
    hema_dubal likes this.
  3. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    I'm not totally sure as I haven't evaluated Easy Movie in quite some time. I just know that AVPro Video supports many platforms, has been optimised very well, has a really strong set of options and our email support system aims to resolve all customer issues very quickly.

    Again, I'm not totally sure about the differences with Unity's Video Player component. I've heard that it doesn't handle streaming as well... We really should draw up a comparison table with these other players, but we've been too busy developing the plugin. We use AVPro Video ourselves in many of our projects and we mostly prefer it over other options because we know that any customisations we require can easily be added - the same goes for customisations that our users request. I think we have plenty of mileage left in our asset. We have some exciting advanced features coming soon :)

    Thanks,
     
  4. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Ahhhh that is a shame :( We'll get to testing OES soon and see what state it's in. Unity 5.3.6 was perhaps the heyday of OES support :) Thanks, we'll investigate further and see where we can take this.
     
  5. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Which platform is this on?
    On some platforms we don't support videos where the SAR and DAR differ. Your video is using DAR scaling, so the true resolution is 960x540 and it just gets scaled to 1920x540. It's probably best to have your video files without any DAR scaling if possible.

    Thanks,
     
  6. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Hi,

    I actually have no idea whether Blu-ray is supported by AVPro Video...we've never tested it. There is a free trial version on the website if you would like to test it yourself.

    Thanks,
     
  7. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Hi Tom,

    That sounds like a good idea. We'll see if we can add it to an upcoming version. If you look inside MediaPlayer.cs at UpdateEvents you may be able to add support for it yourself...

    Thanks,
     
  8. 265lutab

    265lutab

    Joined:
    Sep 15, 2014
    Posts:
    155
    We want to avoid the hosting costs. We do have it working now by asking for external read and write permissions in the app.
     
  9. hema_dubal

    hema_dubal

    Joined:
    Oct 27, 2016
    Posts:
    6
    Hi Andrew, thanks for your reply. I am unable to work out how to use this feature. Is it the case that I set a particular offset on a video file, and the video then automatically gets hidden at that offset inside another file? And is that other file what a user would find on unzipping the APK, so the video becomes unplayable to them?
     
  10. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Hi Hema,
    To use the offset feature you have to prepare your video file (eg MP4) so that it has extra data at the beginning. For example, if you have a dummy junk file and your video file, you append your video file to the dummy junk file. If you then try to play back the combined file normally, it will not play, but if you use AVPro Video and specify the offset as the size in bytes of the dummy junk file, then it will play.
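    A small helper along these lines could build such a file (just a sketch, not part of AVPro Video - the file names are placeholders):
    Code (CSharp):
    using System.IO;

    public static class OffsetFileBuilder
    {
        // Appends the video to a dummy "junk" file and returns the offset
        // (the junk file's size in bytes) to enter in the MediaPlayer under
        // Platform Specific > Android.
        public static long Build(string junkPath, string videoPath, string outputPath)
        {
            byte[] junk = File.ReadAllBytes(junkPath);
            byte[] video = File.ReadAllBytes(videoPath);

            using (FileStream output = File.Create(outputPath))
            {
                output.Write(junk, 0, junk.Length);
                output.Write(video, 0, video.Length);
            }

            return junk.Length;   // this value is the offset to specify
        }
    }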
    Thanks,
     
  11. danielesuppo

    danielesuppo

    Joined:
    Oct 20, 2015
    Posts:
    331
    Hi Hema, unfortunately I didn't find any workaround. I'm simply using lower-resolution videos, but my mobile still gets very warm as well...
    I sent a test project directly to RenderHeads that stops working on my mobile, and they told me they didn't have this kind of issue (with the same device, a Samsung S6), so I'm thinking this problem could be related to:
    - Android OS
    - Oculus services and applications
    - ... ???

    So, no luck for now, sorry

    Daniele
     
  12. jkqs

    jkqs

    Joined:
    Mar 15, 2016
    Posts:
    13
    I ran the plugin demo on Mac but no picture is shown. What could be the reason?
     
  13. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    I'm not sure... could you provide more information, such as your Unity version, macOS version, console log, etc.?
    Thanks,
     
  14. lXg_99

    lXg_99

    Joined:
    Dec 15, 2015
    Posts:
    12
    I upgraded to the latest version of AVPro now, because the Player definition was not in my older ApplyToMesh.cs. When I try to call it from my controller script, .Player is not accessible like you wrote. How and where should I call it?

    From your explanation it seems that the old video hasn't been unloaded when I enter a player again - and I don't want to hold a frame, I want to get rid of the one still popping up before the video plays. I issue a _mediaPlayer.Control.CloseVideo() when playback is finished or back is pressed - is that the right way to do it? Do I have to wait until I catch a Closing event?
     
  15. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Player should be accessible - it is a public property.

    There is no need to call CloseVideo() because it will be called automatically when you call OpenVideoFromFile() on the MediaPlayer component. Also, you shouldn't call CloseVideo() or OpenVideoFromFile() on the Control interface - just call them on the MediaPlayer.

    If you call _mediaPlayer.CloseVideo() there shouldn't be any need to wait for the Closing event.
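    For example (a rough sketch - the exact OpenVideoFromFile signature may differ slightly between versions):
    Code (CSharp):
    using RenderHeads.Media.AVProVideo;
    using UnityEngine;

    public class VideoSwitcher : MonoBehaviour
    {
        public MediaPlayer _mediaPlayer;

        public void PlayNext(string path)
        {
            // OpenVideoFromFile closes any currently open video first,
            // so no explicit CloseVideo() call is needed here.
            _mediaPlayer.OpenVideoFromFile(MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder, path, true);
        }

        public void BackToMenu()
        {
            _mediaPlayer.CloseVideo();   // no need to wait for the Closing event
        }
    }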

    Hope that helps!

    Thanks,
     
  16. dixoncb

    dixoncb

    Joined:
    Jan 31, 2017
    Posts:
    4
    I'd like to extend our new WebRTC environment to UWP, so that I can play real-time video and audio sources from other devices in HoloLens. I can't see how this would map to the current media location schema. Are there any plans or workarounds? As for codecs, H264 is preferred but not essential.

    Thanks again ....
     
    amunyoz likes this.
  17. Cambesa

    Cambesa

    Joined:
    Jun 6, 2011
    Posts:
    119
    Hi Andrew,

    I am trying to colorize a playing video but it does not seem to work using the fast OES path.

    When I run the video on my PC with a blue tint, the video is indeed blue because it is using the fallback shader mentioned at the bottom of "AVProVideo/Unlit/OpaqueFixed (texture+color support) - Android OES ONLY", but when I run it on Android, where the OES shader is used, the video keeps the colours it originally has.

    I tried multiplying gl_FragColor = texture2D(_MainTex, texVal.xy); by _Color, but it causes weird fractal-like effects, as if the memory of other parts of the app gets used as the texture instead of the video.

    Can you think of a reason why it does this?

    Using Unity 5.5.3f1 and the newest AVPro(1.6.12, Jul 25, 2017)
    It was also happening with older AVPro versions.

    Kind regards,

    Cambesa
     
  18. plmx

    plmx

    Joined:
    Sep 10, 2015
    Posts:
    308
    I was wondering about rendering (standard, non-VR) video in a VR project (for the Vive). I've tried the demo version (with Apply-To-Mesh) and it shows substantial aliasing when scaled (i.e. viewed from a distance). In fact the video looks like a Unity UI image in which mip-maps are not enabled. Anything I can do to improve this, i.e. is there a similar option in your asset?
     
  19. BirdInTheCity

    BirdInTheCity

    Joined:
    Jun 9, 2017
    Posts:
    10
    I'm using AVPro for an in-store display that plays approximately 40 short videos chosen by the user.

    Right now, the videos are pausing the main thread for about 2 seconds to load/open the video before each one plays. Is there a recommended way to remove this lag? Or perhaps force the loading on a separate thread?

    To be clear, this will only be deployed in-store on a top-of-the-line Windows 10 machine, so it's likely not a limitation of the system. Also, we can get additional RAM if it just makes sense to preload all the videos at the beginning, but I wanted to see if there was a more straightforward solution. Any thoughts/advice?
     
  20. MarkHenryC

    MarkHenryC

    Joined:
    Nov 30, 2009
    Posts:
    67
    On a similar note: any thoughts on support for DRM (or Digital Rights Management, as the forum won't allow a search on a 3-letter acronym)?
     
  21. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    The OES path is broken on newer Unity versions. 5.5.3 should be okay though...

    It sounds like your shader didn't compile properly on the GLSL device. If you added a multiply by _Color, you would then also need to declare _Color at the top in the Properties area, and also in the shader as a uniform Color variable. In theory it should work then. If you're having problems perhaps you can share your shader here?

    Thanks,
     
  22. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Yes, by default there is no mip-map generation. You can enable this on the MediaPlayer by going to "Platform Specific" > "Windows" and enabling the "Generate Texture Mips" option. Make sure to set your texture filtering mode to Trilinear (in the "Properties" section) for best results.

    Thanks,
     
    plmx likes this.
  23. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    2 seconds is a long time. I wonder if it is related to the way your videos are encoded. On Windows 10 it should be using our Media Foundation + hardware decoding path, so it should be very quick and asynchronous.

    Could you tell us more about the codec used, resolution, file size, and also the type of storage device (SSD / HDD) and its read speed in MB/s?

    If the video resolution isn't too high then preloading them with 40 MediaPlayers might be just fine...

    You may also want to stop any virus scanners or Windows Defender from scanning the files.

    Thanks,
     
  24. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Supporting DRM is quite a pain. We're looking into it, but I can't make any promises.

    What kind of DRM are you interested in having support for?

    Thanks,
     
  25. BirdInTheCity

    BirdInTheCity

    Joined:
    Jun 9, 2017
    Posts:
    10

    Ahh... that did it! I was using DirectShow as the API because the video resolution is 1920x1200 for these videos, and I got a "Loading Failed" error because of it. I tried resizing a video to 1920x1080 and it loaded seamlessly with Media Foundation.

    Is there a way to get around the 1920x1080 resolution limit with Media Foundation?
     
    Last edited: Aug 8, 2017
  26. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    That's odd...Media Foundation should support up to 4K for H.264 and 8K for H.265. What GPU do you have?
     
  27. BirdInTheCity

    BirdInTheCity

    Joined:
    Jun 9, 2017
    Posts:
    10
    The production computer is currently equipped with four Nvidia 1080 cards - though I don't think it's even capable of using more than one right now, since SLI mode isn't even enabled! (Someone in systems went on a spending spree!)

    Maybe this is a limitation of using Hap codec? Perhaps we should switch to h264 or something?
     
  28. BirdInTheCity

    BirdInTheCity

    Joined:
    Jun 9, 2017
    Posts:
    10
    Guess it was the codec. All problems resolved with H.264. Thanks for your guidance! Great product!
     
  29. WelchCompositions

    WelchCompositions

    Joined:
    Sep 30, 2013
    Posts:
    29
    I want to use an Image or Raw Image to display my video. Is this possible?

    Apply To Material doesn't seem to work. I can see the Image's material component at the bottom of the Inspector showing the video playing on it, but the Image in the scene doesn't show anything except the default texture.
     
  30. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Glad to hear it's resolved. I hadn't realised you were using Hap. For Hap, selecting the DirectShow API is required, as we don't yet support it via the Media Foundation API. Hap shouldn't have a problem with those videos, though, as long as their resolution is a multiple of 4. Best of luck,
     
  31. 265lutab

    265lutab

    Joined:
    Sep 15, 2014
    Posts:
    155
    I’m creating a Gear VR app that plays a large video that is 650MB. When submitting the app to the Oculus store their testing found that it would not play the video on a Galaxy S6 running Lollipop. The problem doesn’t seem to be happening on other Gear VR devices, but there must be a ‘correct’ way to do this that we have not found.

    I’m currently storing the video in the StreamingAssets folder and playing the video using AVPro. I have read that for large video files the StreamingAssets folder can cause problems because the videos are packed into a JAR file and have to be loaded into RAM to be played. Looking at other forum posts I have seen most people suggest using the PersistentDataPath, but I’m not sure how to put the video file into the PersistentDataPath on Android. I’m also not sure this is the best practice for this issue.


    Here are the links to a couple other forum posts I was looking at:
    https://forum.unity3d.com/threads/l...do-play-on-google-cardboard-any-ideas.395779/

    https://forums.oculus.com/community...-vr-but-do-play-on-google-cardboard-any-ideas
     
  32. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    We include a GUI component which you should use: DisplayUGUI
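    Something along these lines (a sketch - the _mediaPlayer field name is assumed from AVPro 1.x, and it can also simply be set in the Inspector):
    Code (CSharp):
    using RenderHeads.Media.AVProVideo;
    using UnityEngine;

    public class HookUpDisplayUGUI : MonoBehaviour
    {
        public MediaPlayer mediaPlayer;   // the MediaPlayer playing the video
        public DisplayUGUI display;       // DisplayUGUI sits under a Canvas, in place of a RawImage

        void Start()
        {
            display._mediaPlayer = mediaPlayer;   // assumed public field name
        }
    }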

    Thanks,
     
  33. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Yes you're right - it's probably down to a memory issue.
    Most people with large videos for Android VR apps host them online and, when the app starts for the first time, have a download phase where the video file is cached into the persistent data folder on the device.
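    Something like this could do the caching (an untested sketch using standard Unity networking APIs; DownloadHandlerFile needs a newer Unity version, and the URL/file name are placeholders):
    Code (CSharp):
    using System.Collections;
    using System.IO;
    using UnityEngine;
    using UnityEngine.Networking;

    public class VideoCache : MonoBehaviour
    {
        // Placeholder URL - host the video wherever suits you
        public string url = "https://example.com/large-video.mp4";

        public IEnumerator DownloadIfMissing()
        {
            string cachedPath = Path.Combine(Application.persistentDataPath, "large-video.mp4");
            if (!File.Exists(cachedPath))
            {
                using (UnityWebRequest request = UnityWebRequest.Get(url))
                {
                    // Streams straight to disk instead of buffering the whole file in memory
                    request.downloadHandler = new DownloadHandlerFile(cachedPath);
                    yield return request.SendWebRequest();
                }
            }
            // cachedPath can now be opened by the MediaPlayer as an absolute path
        }
    }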
    I hope this helps you.
     
  34. 265lutab

    265lutab

    Joined:
    Sep 15, 2014
    Posts:
    155
    Thank you. Do you know if it is possible to store the video file in the APK and then move it to the persistent data folder? If possible I would like to avoid hosting my video elsewhere.
     
  35. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Yes, I think you should be able to access it via the WWW class...then you could read it out. The problem with that is that you would need to load the whole file into managed memory (in a byte[] array). It might be worth trying it... There's a thread here:
    https://forum.unity3d.com/threads/c...mingassets-to-android-persistent-data.337769/
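    For reference, a rough sketch of that WWW approach (the file name is illustrative):
    Code (CSharp):
    using System.Collections;
    using System.IO;
    using UnityEngine;

    public class CopyFromStreamingAssets : MonoBehaviour
    {
        // e.g. StartCoroutine(CopyVideo("big-video.mp4"));
        public IEnumerator CopyVideo(string fileName)
        {
            string srcPath = Path.Combine(Application.streamingAssetsPath, fileName);
            string dstPath = Path.Combine(Application.persistentDataPath, fileName);

            if (!File.Exists(dstPath))
            {
                // On Android, streamingAssetsPath points inside the APK, so WWW is
                // needed to read it; the whole file ends up in a managed byte[].
                WWW www = new WWW(srcPath);
                yield return www;
                File.WriteAllBytes(dstPath, www.bytes);
            }
        }
    }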
     
  36. GamerPET

    GamerPET

    Joined:
    Dec 25, 2013
    Posts:
    370
    Hello guys! Guess who's back! :D

    I'm doing another small project and I want to take advantage of the new and amazing PlayMaker actions.
    I have a small problem.

    I'm going to use a couple of smaller videos. Some loop, some don't. I need a way to seamlessly transition from one video to another. In the past I had a couple of different GameObjects with MediaPlayers, so I loaded all the files, and when I had to play one I just changed which MediaPlayer was on the sphere and executed a "Play".

    Problem is that if I use only one MediaPlayer, there is a very small but noticeable lag when the video loads.

    I think we discussed in the past that the way to do it is have like 2 spheres. 1 Sphere plays the video, the other sphere loads the next one, then they swap.

    Any way to do this with the new playmaker actions... or do I have to get creative again?

    Thanks!


    /PET

    <3 <3
     
  37. lXg_99

    lXg_99

    Joined:
    Dec 15, 2015
    Posts:
    12
    Regarding the Player call you proposed:


    Not closing a video before switching back to the menu view has sometimes led to hearing the video still playing - I'd like to have a clean, empty MediaPlayer when I return to the menu. Keeping the last played video on pause is not clean, and that's why I want to close it before the user selects anything new.

    What do you mean by the Control interface? In the three different player modules I use control scripts based on the simpleController script to do all the module-based player commands - are you saying that I should put all of that in the one MediaPlayer at the top level of the project and not use control scripts?
     
  38. Cambesa

    Cambesa

    Joined:
    Jun 6, 2011
    Posts:
    119
    We are using Unity 5.5.3 to make use of the OES path; thanks to that we can play 4K videos for hours in Gear VR without the app crashing from overheating (running on the S6).

    The shader I'm using is:
    Code (CSharp):
    Shader "AVProVideo/Unlit/OpaqueFixed (texture+color support) - Android OES ONLY"
    {
        Properties
        {
            _MainTex ("Base (RGB)", 2D) = "black" {}
            _ChromaTex("Chroma", 2D) = "gray" {}
            _Color("Main Color", Color) = (1,1,1,1)

            [KeywordEnum(None, Top_Bottom, Left_Right)] Stereo("Stereo Mode", Float) = 0
            [Toggle(APPLY_GAMMA)] _ApplyGamma("Apply Gamma", Float) = 0
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" "IgnoreProjector"="False" "Queue"="Geometry" }
            LOD 100
            Lighting Off

            Pass
            {
                GLSLPROGRAM

                #pragma only_renderers gles gles3
                #extension GL_OES_EGL_image_external : require
                #extension GL_OES_EGL_image_external_essl3 : enable
                precision mediump float;

                #ifdef VERTEX

                #include "UnityCG.glslinc"
                #define SHADERLAB_GLSL
                #include "AVProVideo.cginc"

                varying vec2 texVal;
                //uniform vec4 _MainTex_ST;
                uniform fixed4 _Color;

                void main()
                {
                    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                    //texVal = TRANSFORM_TEX(gl_MultiTexCoord0.xy, _MainTex);
                    texVal = gl_MultiTexCoord0.xy;
                    //texVal.x = 1.0 - texVal.x;
                    texVal.y = 1.0 - texVal.y;
                }
                #endif

                #ifdef FRAGMENT

                varying vec2 texVal;

                uniform samplerExternalOES _MainTex;

                void main()
                {
    #if defined(SHADER_API_GLES) || defined(SHADER_API_GLES3)
                    gl_FragColor = texture2D(_MainTex, texVal.xy) * _Color;
    #else
                    gl_FragColor = vec4(1.0, 1.0, 0.0, 1.0);
    #endif
                }
                #endif

                ENDGLSL
            }
        }

        Fallback "AVProVideo/Unlit/Opaque (texture+color+fog+stereo support)"
    }
    The shader without the OES path works, and I changed the OES shader in a similar way, by multiplying the video colour by a "uniform fixed4 _Color", but it causes fractalesque effects. By removing the "uniform fixed4 _Color" the shader works as intended again - without the colorization effect of course, but with no fractalesque effects anymore.

    Both "uniform fixed4 _Color" and "uniform Color _Color" cause the fractalesque result.
    "uniform vec4 _Color" is also causing the same problem.

    Kind regards,

    Yorick
     
  39. 265lutab

    265lutab

    Joined:
    Sep 15, 2014
    Posts:
    155
    What is the best format for large videos played on Android using AVPro?
     
  40. Cambesa

    Cambesa

    Joined:
    Jun 6, 2011
    Posts:
    119
    We are using MP4 with H.264 encoding at a bitrate of around 2 Mb/s. Not sure if it is the best, but it is a really good one; use the fast OES path to keep the device from overheating and crashing. Newer devices like the Samsung Galaxy S6 and later can play 4K videos without problems; older devices like the Samsung Galaxy S5 can play 1080p videos.
     
  41. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Hi PET!
    Yes, you would have to use multiple MediaPlayers. Two should be enough - you can ping-pong between them.
    You can just have one sphere. Let the ApplyToMesh point to the MediaPlayer that is loaded and playing. The other MediaPlayer can then load the next video and, once it has received its first frame, just point the ApplyToMesh component at that MediaPlayer.
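    Roughly like this (a sketch only - property and method names are taken from this thread and may differ slightly between versions):
    Code (CSharp):
    using RenderHeads.Media.AVProVideo;
    using UnityEngine;

    public class PingPongPlayers : MonoBehaviour
    {
        public ApplyToMesh applyToMesh;   // the single sphere's ApplyToMesh
        public MediaPlayer playerA;
        public MediaPlayer playerB;

        private MediaPlayer _active;
        private MediaPlayer _loading;
        private bool _swapPending;

        void Start()
        {
            _active = playerA;
            _loading = playerB;
            applyToMesh.Player = _active;
        }

        public void PreloadNext(string path)
        {
            // Assumed signature - load but don't auto-play yet
            _loading.OpenVideoFromFile(MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder, path, false);
            _swapPending = true;
        }

        void Update()
        {
            // Swap as soon as the loading player has produced its first frame
            if (_swapPending && _loading.TextureProducer != null && _loading.TextureProducer.GetTexture() != null)
            {
                _loading.Control.Play();
                applyToMesh.Player = _loading;

                MediaPlayer tmp = _active;
                _active = _loading;
                _loading = tmp;
                _swapPending = false;
            }
        }
    }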

    I hope that helps. Sorry I don't have a PlayMaker action for this! :)
     
  42. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    You're trying to access a static member of MediaPlayer in your code, but Player is a non-static member, so you need an instance of the MediaPlayer class. Eg:

    MediaPlayer mp;
    mp.Player = null;

    I hope that helps...

    By the Control interface I mean that, when you're using the MediaPlayer class, there are some interfaces, like:

    mp.Control
    mp.TextureProducer

    which contain methods. You can call mp.Control.CloseVideo() and mp.Control.OpenVideoFromFile(), but these are two special cases where you shouldn't use the Control interface and should instead call these methods directly on the MediaPlayer instance, eg: mp.CloseVideo().

    Thanks,
     
  43. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Yes, that is strange... I know GLSL can be very fickle on Android. Perhaps you could have a look at the adb logcat output from a Development Build running on Android to see what errors it reports?

    Thanks,
     
  44. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    MP4 file with H.265, 3840x1920 @ 30fps or 2048x2048 @ 60fps... Use the full colour range (not limited range), and you may have to disable some advanced codec features if your video has a lot of motion. 14 Mb/s is what we've had success with.
     
  45. lXg_99

    lXg_99

    Joined:
    Dec 15, 2015
    Posts:
    12
    Thank you for your patience,

    I now understand what you meant by the Control interface and will get rid of the .Control for all CloseVideo() calls.

    [edit] After getting rid of the .Control and building the app, it seems like that was the whole problem - I can't see any leftover frames on a quick check anymore!

    I'm sorry, but I still don't get how to nullify the Player - I'll get:
     
    Last edited: Aug 11, 2017
  46. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Sorry, not _mediaPlayer.Player - it should be on the ApplyToMesh component, so:

    public ApplyToMesh apply;
    apply.Player = null;

    Hope that helps.
     
  47. jinghua1988

    jinghua1988

    Joined:
    Jul 26, 2016
    Posts:
    7
    Hi,

    I have a problem when I use the AVPro free demo under the following conditions.
    AVPro version: 1.6.12
    Device: Samsung Galaxy Tab S2, Android 6.0.1
    I put an InputField in front of my video, and if the virtual keyboard shows and hides a few times, my video plays unsmoothly.

    Thanks.
     
  48. AndrewRH

    AndrewRH

    Joined:
    Jan 24, 2012
    Posts:
    2,806
    Hi,
    You need to enable the multithreaded rendering option in Player Settings.
    Thanks,
     
  49. 265lutab

    265lutab

    Joined:
    Sep 15, 2014
    Posts:
    155
    I'm trying to use some level of DRM with the videos I'm playing in Unity. It doesn't need to be a complex solution - just something so that general users can't easily watch the videos outside of the app I'm making.
     
  50. DMMsys

    DMMsys

    Joined:
    Aug 4, 2017
    Posts:
    1
    Is HTTP Live Streaming (HLS) now supported in AVPro Video (Windows)? If so, does it support encryption (AES)?