Discussion in 'Assets and Asset Store' started by AndrewRH, Feb 12, 2016.
Sometimes an hour, sometimes 2 days...
Cheers, fingers crossed it isn't 2 days
I'm getting media files in the following way: a separate track for audio (mp3) and a separate track for video (mp4). Does your player support this situation where 2 files work as one?
It's live now
Hi, I would recommend muxing the video and audio into the same file. If you must have separate files for some reason, then no, AVPro Video doesn't have any built-in features to play them as one. Each MediaPlayer component can only play a single file/url, so you would have to create two MediaPlayer components and possibly add some logic to try to keep them in sync.
We do include a script called "PlaybackSync" in the Demos folder which shows an attempt at syncing multiple MediaPlayers. In PlaybackSync you specify a master MediaPlayer and an array of slaves. In your case you would make the audio the master, as it will adjust the position of the slaves to match the master, and skips in video are less noticeable than skips in audio.
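To make the idea concrete, minimal master/slave drift correction could look something like this (a hypothetical simplified sketch, not the actual PlaybackSync demo script; it assumes the AVPro Video MediaPlayer API with Control.GetCurrentTimeMs() and Control.Seek(), and a drift tolerance you would tune per project):

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Hypothetical sketch: keep slave MediaPlayers within a tolerance of the master
public class SimpleSync : MonoBehaviour
{
    public MediaPlayer _master;
    public MediaPlayer[] _slaves;
    public float _toleranceMs = 50f;

    void Update()
    {
        if (_master == null || _master.Control == null) return;
        float masterTimeMs = _master.Control.GetCurrentTimeMs();
        foreach (MediaPlayer slave in _slaves)
        {
            if (slave == null || slave.Control == null) continue;
            // If the slave has drifted too far from the master, snap it back
            if (Mathf.Abs(slave.Control.GetCurrentTimeMs() - masterTimeMs) > _toleranceMs)
            {
                slave.Control.Seek(masterTimeMs);
            }
        }
    }
}
```

Seeking like this causes a visible skip whenever the drift exceeds the tolerance, which is another reason to make the audio track the master.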
I hope that helps you. Thanks,
Thanks, I believe if you could add that audio-only support then it will be a breeze.
Hello @AndrewRH .
What's the status of a Spatial Audio for a 360 video ... for Android?
Can it be done?
Thanks. I'll be updating and releasing a new version of Whirligig very soon. Was going to be Monday but I ended up putting it off and getting into something else, so hopefully Friday.
I have a question for you and the forum in general. I would like the video to produce light that is projected into the room, lighting the room. Up until now I've been using projectors and blur maps etc. to achieve this, but in truth they are rubbish: they don't generate shadows and there is always harsh banding going on. Does anyone recommend any alternative ways to achieve this, whether it be a plugin or some other way? It's a shame that animated light cookies with colour don't work.
Are videos captured in portrait mode supported by the latest release (1.5.25)?
We've had the same issue. As a warning to others, I'd recommend never counting on the MediaPlayerEvent.FirstFrameReady event to start a video on Android. This seems to be a relatively common issue on esoteric devices, as we've had quite a few complaints of videos not starting.
MediaPlayerEvent.ReadyToPlay does seem to fire just fine when auto-play is off.
Andrew, in the normal sequence of events (for non-auto-play videos), ReadyToPlay should always fire before FirstFrameReady, right? So we can safely rely on using ReadyToPlay to start the videos?
Videos with auto play enabled also seem to work just fine on their own.
To help debug: CanPlay() actually does return true. I've tested on a Sony Xperia Z Ultra with this change in MediaPlayer.cs; the logs are attached:
private bool CanFireEvent(MediaPlayerEvent.EventType et, bool hasFired)
{
    bool result = false;
    if (m_events != null && m_Control != null && !hasFired)
    {
        Debug.Log(et + ": " + m_Control.CanPlay() + ", " + m_Texture.GetTextureFrameCount());
        // ... rest of the original method unchanged
    }
    return result;
}
There are no more media player events fired after the logs. It's m_Texture.GetTextureFrameCount() returning 0 that causes the FirstFrameReady event not to be fired.
There are two media players in the scene, but OpenVideoFile is only called on one of them at the time of these logs (I should have marked which media player the logs are coming from, but I don't physically have the device, so it's complicated to update the apk).
The only error in the logs is: "ERROR: MediaPlayer : Should have subtitle controller already set", which I think I've seen with videos that play fine as well, so I don't think it's related. We don't have any subtitles or anything.
How would you play an animated gif or loading bar-icon while a video is being buffered for playback?
Hello.... Question for you... And Thanks...
I am creating, with a team, a project for the Boulder International Film Festival where I have created a massive virtual world of Boulder, Colorado with the Front Range of the Rockies. We are attempting to locate multiple (maybe 12) 360 vids scattered across the terrain, where from the main menu the user can teleport across the terrain to specific 360 vid spheres. I presently have 3 spheres located, and all three vids fire up immediately and therefore have a major impact on the overall frame rate. Please tell me, is there a distance-from-the-sphere on/off trigger for the vids presently available in your product? If not, I imagine this has already been done, so could you point me in a direction to manage this please? I have tried setting the spheres with the Unity LOD system and the sphere meshes work with the LOD system, but I can still hear that the videos are all playing and the frame rate remains low.
I also need an HTTP cookie support. Is there any update for this topic?
We will add an audio-only mode soon, thanks
It isn't currently supported with our plugin no. Perhaps in the future but I don't have any more information about it right now. The solution at the moment would be to use Unity to play the audio separately - this should take care of the 3d transforms.
Hmmm, I don't know of a way... I also would have thought of projectors, or perhaps trying a godrays effect... Good luck
Are videos captured in portrait mode supported by the latest release (1.5.25)?
Portrait orientation detection on iOS isn't currently supported. We're adding support for this in the next update. Thanks,
You could do this in a number of ways - cycling a series of sprites, running an animated shader or using the GUI system Image component and animating the radial cutoff value. This isn't a feature of AVPro Video so you would have to add this yourself. Perhaps in the future we can include an example one.
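For the radial cutoff approach, a minimal version might look like this (a hypothetical sketch, assuming a UI Image with Image Type set to Filled / Radial 360 and the AVPro Video Control.IsBuffering() query; the component and field names are made up for illustration):

```csharp
using UnityEngine;
using UnityEngine.UI;
using RenderHeads.Media.AVProVideo;

// Sketch: animate a radial-filled UI Image while the video is buffering
public class BufferingSpinner : MonoBehaviour
{
    public MediaPlayer _mediaPlayer;
    public Image _spinner;   // Image Type: Filled, Fill Method: Radial 360
    public float _speed = 1.5f;

    void Update()
    {
        bool buffering = (_mediaPlayer != null &&
                          _mediaPlayer.Control != null &&
                          _mediaPlayer.Control.IsBuffering());
        _spinner.enabled = buffering;
        if (buffering)
        {
            // Animate the radial cutoff value from 0 to 1 repeatedly
            _spinner.fillAmount = Mathf.Repeat(Time.time * _speed, 1f);
        }
    }
}
```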
Apparently the error about subtitles is just a quirk of Android and can be ignored. Reference is here:
Yes normally ReadyToPlay should fire before FirstFrameAvailable. It can take a few frames for the first frame to make its way through the video decoder pipeline and get uploaded as an available texture for display.
Yes, Android doesn't play any video frames when a video isn't set to auto-play. We actually had to manually add a Seek(0) in there to get it to display the initial frame. In the latest version of AVPro Video (1.5.24 and above) we have removed this and made it optional, as it does cause extra overhead. This option is now in the Platform Specific > Android section. It may be that other Android devices work slightly differently; this is something we'll look into as we do more testing on a wider range of hardware.
Which version of AVPro Video were you testing this on?
I recommend that all videos have their "auto-start" option disabled. You would then need to write a little script to trigger the video when the user enters each zone and pause+rewind it when they leave. This should be super simple to code. I don't know of any existing code for this, but it does sound useful, so here it is.
This script (VideoTrigger.cs) below works with colliders with the Trigger flag ticked. Hope it's useful:
using UnityEngine;
using RenderHeads.Media.AVProVideo;

/// Causes a video to play when the trigger collider is entered and rewind+pause when it is exited
/// Audio is faded up and down too
public class VideoTrigger : MonoBehaviour
{
    [SerializeField] private MediaPlayer _mediaPlayer;

    private float _fadeTimeMs = 500f;
    private float _fade;
    private float _fadeDirection;

    private void OnTriggerEnter(Collider other)
    {
        if (_mediaPlayer != null)
        {
            // Start playback and begin fading the audio up
            _mediaPlayer.Control.Play();
            _fadeDirection = 1f;
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (_mediaPlayer != null)
        {
            // Begin fading the audio down
            _fadeDirection = -1f;
        }
    }

    private void Update()
    {
        if (_fadeDirection != 0f)
        {
            // Fade the value
            float speed = 1000f / _fadeTimeMs;
            _fade += Time.deltaTime * _fadeDirection * speed;
            if (_fade <= 0f)
            {
                // Complete the fade down: pause and rewind to the first frame
                _fadeDirection = 0f;
                _mediaPlayer.Control.Pause();
                _mediaPlayer.Control.Seek(0f);
            }
            else if (_fade >= 1f)
            {
                // Complete the fade up
                _fadeDirection = 0f;
            }
            _fade = Mathf.Clamp01(_fade);

            // Set the volume
            if (_mediaPlayer != null && _mediaPlayer.Control != null)
            {
                _mediaPlayer.Control.SetVolume(_fade);
            }
        }
    }
}
HTTP cookie support is coming in the next update. Thanks,
You are the Best
Andrew... Thank you very much for your response and assistance...
As a windows build it works perfectly... Way awesome...
No doubt I will be using your product often...
We had the issues with a slightly older version of the plugin, but then I also tested with 1.5.24 and the issue was still there for certain devices (with the 'show poster frame' option enabled).
I ended up adding a "FirstFrameAvailable" callback 0.2 seconds after "ReadyToPlay" (with a check for the "FirstFrameAvailable" event to only be called once for each "OpenVideo ()"). This worked out fine.
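That workaround could be sketched roughly like this (a hypothetical reconstruction of the approach described above, not the poster's actual code; it assumes the MediaPlayer.Events callback signature from AVPro Video 1.x):

```csharp
using System.Collections;
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Sketch: treat the video as ready 0.2s after ReadyToPlay fires,
// in case FirstFrameReady never arrives on some Android devices
public class FirstFrameFallback : MonoBehaviour
{
    public MediaPlayer _mediaPlayer;
    private bool _firstFrameHandled;

    void Start()
    {
        _mediaPlayer.Events.AddListener(OnMediaPlayerEvent);
    }

    // Call this before every OpenVideoFromFile() so the guard resets per video
    public void ResetForNewVideo()
    {
        _firstFrameHandled = false;
    }

    private void OnMediaPlayerEvent(MediaPlayer mp, MediaPlayerEvent.EventType et, ErrorCode code)
    {
        switch (et)
        {
            case MediaPlayerEvent.EventType.ReadyToPlay:
                StartCoroutine(FallbackAfterDelay(0.2f));
                break;
            case MediaPlayerEvent.EventType.FirstFrameReady:
                HandleFirstFrame();
                break;
        }
    }

    private IEnumerator FallbackAfterDelay(float seconds)
    {
        yield return new WaitForSeconds(seconds);
        HandleFirstFrame();
    }

    private void HandleFirstFrame()
    {
        // Guard so this only runs once per opened video
        if (_firstFrameHandled) return;
        _firstFrameHandled = true;
        _mediaPlayer.Control.Play();
    }
}
```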
I've been developing a 360 video / image sphere application in Unity with the help of AVPro Video. In this app multiple videos and images can be chosen to view in 360 with the help of the Google VR SDK. I have a listview with which you can select the videos / images separately, and a 3D view in which you can select the videos with the GVR Reticle.
All is working fine; on Android (Samsung Galaxy S7 Edge) I can even play 4K videos. But when I try to run the same app on iOS (iPhone 6), the app crashes on playing the videos. Even when playing HD videos, the app crashes sometimes.
Is it possible to play 4K .mp4 files with AVPro Video on iOS? If so, am I missing some specific settings to make it work? If not, what is the highest resolution on iOS?
The video files are .mp4, H264, AAC codec. I've tried 50fps and 25fps. I'm using the latest Unity 5.5 and the iOS trial package of AVPro Video.
Hope someone can help me with this issue.
Hey folks! Working with AV Pro for a client who's making a documentary project on an (Android) Samsung Tab A (2016). I've found AV Pro to be a fantastic plugin, and it's really helped in the production of this documentary.
...I have run into one problem, however!
In one part of the project, a video is triggered upon completing a jigsaw - and when the video is completed (either by playing until the end or when playback is cancelled with mp.CloseVideo()), a new button scales into view (in 0.22 seconds, triggered by MediaPlayerEvent.EventType.Closing) and draws the user's eye. Upon tapping the new button, it unfolds a menu of more videos to play.
This behavior works perfectly in-editor. However, when played on the target Android tablet, the button scales into view as soon as the jigsaw is completed, BEFORE the jigsaw video has begun playing.
This means that upon completing the jigsaw, something appears to move above the jigsaw for half a second before the video plays (which could be two or three minutes long), and when we return to the menu there's nothing to draw the user's eye to the new button.
Does anybody know how I could solve this problem and play the button scaling animation AFTER the video has played/is closed? Would be much appreciated!
I use and love AVProVideo. But with the new Video object in 5.6 soon to be released, can anyone summarize the advantages, if any, that AVPro Video offers over the new Video object in Unity? Both for mobile and desktop deployment.
Is there a way to get an audio clip from an AVPro media player? I saw GrabAudio(), but cannot figure out how to grab the clip so I can edit it.
thanks in advance for your hard work for this nice plugin...
Actually, we have a problem for Mac...
We use AVProVideo in a canvas, load a video and wait for the "ReadyToPlay" event...
Actually, the video is displayed as a black screen with no audio... Then, if I lose and regain focus in the editor window (same for standalone Mac builds with alt+tab and back), the video plays correctly...
The issue doesn't affect windows editor / standalone windows builds...
If you need more info, just ask...
Unity Version 5.3.5
Edit: Tested also with latest AvPro version but same problem...
I second this request. The ability to continue playing audio while a video is culled/minimized is a very useful feature. Please include it in the next release!
According to Apple's specs, you can play 4K starting with iPhone 6S, the more ancient models support up to a FullHD resolution.
Hi, is it possible to get a regular 360 video (not stereoscopic) to play inside a VR headset (Cardboard)? So it will artificially create the two cameras like you see in a lot of VR players on the App Store. I downloaded the trial version and tried it on my iPhone, and although the 08_Demo_360SphereVideo played fine, it is not in VR mode. I tried it on Android (Galaxy S6 Edge) and just got a white screen (the video wouldn't play).
So, in case it's useful to anyone, solved my problem -
1) Set a flag to true in the event 'Started'
2) Check if flag is true in event 'Closed' and/or 'Finished'
3) If so, do animation and turn on pips etc. Set flag to false.
Thanks to Renderheads for advice.
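The three steps above could be wired up roughly like this (a hypothetical sketch; the event names Started, Closing and FinishedPlaying are from the AVPro Video MediaPlayerEvent.EventType enum, and the animation call is a placeholder):

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Sketch: only react to Closed/Finished if the video actually started playing
public class VideoFinishedWatcher : MonoBehaviour
{
    public MediaPlayer _mediaPlayer;
    private bool _hasStarted;

    void Start()
    {
        _mediaPlayer.Events.AddListener(OnMediaPlayerEvent);
    }

    private void OnMediaPlayerEvent(MediaPlayer mp, MediaPlayerEvent.EventType et, ErrorCode code)
    {
        switch (et)
        {
            case MediaPlayerEvent.EventType.Started:
                // 1) Set a flag once playback has really begun
                _hasStarted = true;
                break;
            case MediaPlayerEvent.EventType.Closing:
            case MediaPlayerEvent.EventType.FinishedPlaying:
                // 2) Check the flag, so premature events are ignored
                if (_hasStarted)
                {
                    // 3) Do the animation, show the button etc., then reset
                    _hasStarted = false;
                    PlayButtonAnimation();
                }
                break;
        }
    }

    private void PlayButtonAnimation() { /* scale the button into view */ }
}
```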
You say "attempt at syncing" which sounds like it's not quite optimal? For the scenario I'm hoping to use your software for I need precise sync of several videos (they must all be on the same frame number at each render cycle). Is this doable, or can you describe what blocks this?
Hello all, hello Renderheads.
In post #780 there was a short description of how to make GVR work with a movie-sphere (for 360).
I am trying to make this work and build it to an Android phone (S7 Edge). Everything runs perfectly on my Mac, but as soon as I build it to the phone I get plenty of error messages. The scene only consists of the GVR Head demo + Renderheads sphere demo. What am I doing wrong? Many thanks for all support!
iOS devices generally have less RAM than Android devices...how much RAM does your iOS device have? For 4K playback we've seen memory being a huge issue and basically if your device only has 1GB of memory then 4K playback isn't really possible - maybe if you freshly reboot your device so it has nothing loaded in memory.
Yes, Unity are improving their video support - finally! I'm not yet sure about all of the advantages, but there are a few things that spring to mind:
Reasons to still use AVPro Video over Unity 5.6 video support:
1) Support for the Hap codec
2) UWP and Hololens support (perhaps they will fix this by the final 5.6 release)
3) Good streaming support (I believe Unity are still working to improve their streaming support)
4) Built-in shaders for displaying stereo videos and also SBS/TB transparent videos
5) Subtitle support
6) Windows DirectShow codec support
7) Support for slowing down the video playback for video capture
8) Rapid updates by RenderHeads for any feature requests by users
I'm not sure what else there is yet. I guess it will all be made clear in the near future. We'll continue to release new features in the plugin, as we use it internally ourselves and value being able to update our video features whenever we want instead of relying on Unity updates. We'll probably be shifting focus more to the high-end user and start offering some features beyond what the new Unity video component offers.
There is no way to do this directly. The only thing you can do, on Windows (8 and above), is select the option to use Unity Audio and use the AudioOutput component. You can then use a script to capture all of the audio that is played in realtime. But there is no way to just grab all of the audio out of the file without playing it live.
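On that Unity Audio path, the realtime samples can be captured with Unity's standard OnAudioFilterRead hook on the same GameObject as the AudioSource, then baked into an AudioClip afterwards (a sketch using only standard Unity APIs; the AudioCapture name is made up for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: record the audio that Unity plays (e.g. fed by an AudioOutput
// component) into a buffer, then bake it into an AudioClip afterwards
public class AudioCapture : MonoBehaviour
{
    private readonly List<float> _samples = new List<float>();
    private int _channels = 2;

    // Called by Unity on the audio thread for every block of played audio
    void OnAudioFilterRead(float[] data, int channels)
    {
        _channels = channels;
        lock (_samples)
        {
            _samples.AddRange(data);
        }
    }

    // Call after playback to turn the captured samples into an editable clip
    public AudioClip CreateClip()
    {
        lock (_samples)
        {
            AudioClip clip = AudioClip.Create("captured",
                _samples.Count / _channels, _channels,
                AudioSettings.outputSampleRate, false);
            clip.SetData(_samples.ToArray(), 0);
            return clip;
        }
    }
}
```

Note the buffer grows unbounded while playing, so clear it between recordings for long videos.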
That's strange. Are you publishing from a Windows PC? There have been problems in the past publishing on Windows for Mac....
Which version of OS X are you running? does it happen with our sample scenes+videos too?
Yes, it supports playing non-stereo videos in stereo. You just need to run the 360SphereDemo and enable VR mode in the Unity Player Settings and you're done. It should work fine... If you're still having trouble, please let us know more details (Unity, AVPro Video and Android versions, log file etc). Thanks,
No problem. Glad that it worked. We'll tidy up that event state in the next release.
The syncing sample script simply tries to play multiple sources and keep them as close as possible on the same timeline. To get frame accurate syncing you would need to add another layer to this. You would need to buffer rendered frames and store their timestamp value, and then once enough frames were in the buffer, start to display them at the correct interval. This way you can guarantee that each video is displaying the same frame. This does add a bit of latency though, and extra memory usage and complexity. Hopefully we can implement an example of this soon and include it in the release. Thanks,
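That extra buffering layer could take roughly this shape (a hypothetical sketch of the concept, not code shipped with the plugin; each video source would push frame copies in, and a shared clock would decide which buffered frame each output displays):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: buffer rendered frames with their timestamps so multiple videos
// can be presented in lock-step against one shared clock
public class FrameBuffer
{
    private readonly Queue<KeyValuePair<float, RenderTexture>> _frames =
        new Queue<KeyValuePair<float, RenderTexture>>();

    public int Count { get { return _frames.Count; } }

    public void Push(float timeMs, Texture source)
    {
        // Copy the decoder output so it isn't overwritten by the next frame
        // (this is the extra memory cost mentioned above)
        RenderTexture rt = new RenderTexture(source.width, source.height, 0);
        Graphics.Blit(source, rt);
        _frames.Enqueue(new KeyValuePair<float, RenderTexture>(timeMs, rt));
    }

    // Returns the newest buffered frame at or before clockMs (null if none yet);
    // older frames are released as they are skipped past
    public RenderTexture PopFrameFor(float clockMs)
    {
        RenderTexture result = null;
        while (_frames.Count > 0 && _frames.Peek().Key <= clockMs)
        {
            if (result != null) result.Release();
            result = _frames.Dequeue().Value;
        }
        return result;
    }
}
```

A scheduler would wait until every video's FrameBuffer has a few frames queued before starting the shared clock, which is where the added latency comes from.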
Well, to get it working in GVR you don't actually need any GVR demos... You can create a blank project, import AVPro Video, enable VR mode, copy your GearVR oculussig file into the /Plugins/Android/assets/ folder and it should deploy. This is because modern Unity versions have native VR support.
I think in your case the 2 plugins are causing a conflict because they both contain an android manifest file. If you want to use both plugins, then try installing AVPro Video, deleting our android manifest file, and then installing the GVR Head demo.
Thanks. The HAP codec is a must for any desktop apps
Hey, I've been testing on iOS, and just discovered subtitles don't work on Android when invoked from a script. They did when I downloaded the trial version, but they don't anymore--I discovered this right after purchasing the pro version.
- So far I've only tested with a GameObject that hasn't been started yet (to prevent loading the wrong movie).
- It works when not invoked by a script, but this not my use-case.
- Everything is fine in the editor. (Is it a path validation bug?)
- It worked in the trial version, version 1.5.18
- When I installed the iOS version over that (version 1.5.22), it stopped working (yes, I'm still talking about Android)
- No subsequent version has worked, including the full Android version.
Edit: Andrew pointed out that my test case was a little over-complicated, with too much error-checking. The error-checking was screwing it up! (And yes, my code was running into a little difference between the Android runtime and the editor runtime, but both do work.)
Hi, I'm using your plugin for a WebGL project. It's working perfectly fine in Chrome, Safari and Edge, but not in Firefox.
In Firefox, there is no video feed, but the audio still plays. Before the latest update of the plugin, there was just a grey screen being shown instead of the video feed. After the latest update, the video object is not visible at all. Don't know if that helps to pinpoint the issue.
Fyi, it's the same thing in Explorer, but that's less important, since we're not targeting that browser.
I'm using a simple setup with just the Media Player script attached to a gameobject. Nothing fancy.
Have you encountered this problem before, and do you have any idea on how to make it work?
Yes, this is because we changed the Android version to load using a coroutine, so EnableSubtitles() actually returns false on Android as it hasn't loaded the subtitles at that point. We'll change it to return true as I think this makes more sense. Thanks,
We're about to release a new version of the plugin that fixes these browser issues. Look out for the new version in the next day or two. Thanks,
Whoa, sounds great! Thanks!
It's good to see how active this forum is! I need some help with loading videos dynamically.
It's a Windows desktop application with 3 MediaPlayers in the scene, one per user (it's an interactive table which admits three people). I have to handle 45 videos and I don't know the best way to do this. I'm pretty sure that I can't load all of them at the beginning, and if I try to load a video using OpenVideoFromFile() the app freezes for a second... If you could send me a "how to" or something like that I would be eternally grateful.
Excuse my awful English, and thank you.
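One common way to hide that load hitch (a hypothetical sketch, not an official AVPro Video feature) is to double-buffer: keep a spare MediaPlayer per user, open the next video on it in advance with auto-play off, and swap players once it signals ReadyToPlay:

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Sketch: double-buffered players so opening a file never stalls the visible one
public class VideoPreloader : MonoBehaviour
{
    public MediaPlayer _visiblePlayer;
    public MediaPlayer _preloadPlayer;

    // Start loading the next video in the background, with auto-play off
    public void Preload(string path)
    {
        _preloadPlayer.OpenVideoFromFile(
            MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder, path, false);
    }

    // Once the preloaded video is ready (e.g. on its ReadyToPlay event),
    // swap the two players and start playback immediately
    public void Swap()
    {
        MediaPlayer tmp = _visiblePlayer;
        _visiblePlayer = _preloadPlayer;
        _preloadPlayer = tmp;
        _visiblePlayer.Control.Play();
    }
}
```

With 45 videos you would only ever keep two open per user, preloading whichever video the UI suggests is likely to be picked next.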