Discussion in 'Assets and Asset Store' started by AndrewRH, Feb 12, 2016.
Which platform is this on? Which OS version?
Hello, we're developing an Oculus application and just upgraded AVPro from 1.6.12 to 1.7.5. After the upgrade, AVPro's audio no longer comes out of the device chosen in Oculus Home's audio playback setting. In 1.7.5 the audio output is determined by Windows' own audio playback setting, not Oculus Home's.
Is there a way to make sure AVPro's sound comes out of the device set in Oculus Home's audio playback setting?
That's surprising to hear. Please could you let me know:
1) Check the console output - it should print the exact version being used. Please could you paste it here, as sometimes there are issues when upgrading and DLL files can remain on old versions.
2) Which operating system version are you using?
3) In the MediaPlayer > Platform Specific > Windows, which Video API are you using? Do you have the "Use Unity Audio" option selected and are you using the AudioOutput component? This is what we recommend you use when you need to output to Oculus audio devices.
4) Which version of Unity are you using?
Hopefully after answering those we'll be able to get to the bottom of this issue.
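For reference, a minimal sketch of the setup described in point 3: routing AVPro's audio through Unity via the AudioOutput component, so playback follows whichever device Unity (and therefore Oculus Home) has selected. This assumes the AVPro Video 1.x API; the exact member for assigning the player (ChangeMediaPlayer here) is an assumption from the 1.x scripts, so double-check against the component in the inspector and the PDF documentation.

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Sketch: route a MediaPlayer's audio through a Unity AudioSource.
// Requires "Use Unity Audio" to be ticked on the MediaPlayer.
[RequireComponent(typeof(AudioSource))]
public class OculusAudioRouting : MonoBehaviour
{
    public MediaPlayer mediaPlayer;   // assign in the inspector

    void Start()
    {
        // AudioOutput pulls decoded samples from the MediaPlayer and
        // feeds them into the AudioSource on this GameObject, so the
        // OS/VR runtime decides which physical device they go to.
        var output = gameObject.AddComponent<AudioOutput>();
        output.ChangeMediaPlayer(mediaPlayer);  // assumed 1.x method name
    }
}
```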
Hi, I'm getting some issues on iOS where I occasionally get flashes of red and green blocky pixels. I'm using the VR sphere shaders for playback. The flashes appear when I swap videos. I also had a situation where a particular video would constantly play with the red and green artifacts, and it vanished after re-uploading the video via iTunes file transfer. I assume the second issue was a corrupt video, but I still get the red and green flash when the videos swap. Do you know what this is and how to stop it? Thank you.
It's a Cardboard app, by the way, with 4K video.
Is it possible to see a video/screenshot of these artifacts? It's quite hard to imagine what they could be.
Thank you for the reply. All the settings you asked about were fine, but after testing on different machines the issue didn't reproduce. Sorry for the false alarm, and thank you again for the quick reply!
I played a movie shot with the "leftright" setting in the demo sample scene "12 _ Demo _ 180 Sphere Video" and it looked like this. It should be a square, not a rectangle.
I see... perhaps you could email us your video so we can debug this?
Did you set the stereo packing mode on the MediaPlayer visual options?
Also which platform is this on?
This is the same issue I pointed out in my post:
Here is some footage from a 180 camera. I've placed a square in the centre. Load it into the 180 scene and you can see the issue in the Unity editor.
Cheers - Nick
Thanks Nick, we will look into this more. I don't think the video link works, or it hasn't been shared properly? It would be great to use your video as a test as well.
Thanks Andrew. Could you try this link please: https://1drv.ms/u/s!AoiFe2In4__os9MCemiz3SJY9ZV-9A
And this is the link to the original 6k video file as shared on 360rumors.com:
Thanks, that worked. I will test these this week.
Hello, we're working on a 360 streaming video app. This revisits a question from earlier about an m3u8 video's starting resolution. In the previous discussion we covered how Apple's HLS picks the first video resolution listed in the root m3u8 playlist.
We tested this by creating an m3u8 master playlist and making the first entry point to a higher-resolution sub-playlist. Contrary to what the HLS documentation says, moving the higher-resolution entry to the top of the playlist didn't improve the initial playback resolution. Here's a test m3u8 we've used.
The text of the m3u8 above is pasted below. As you can see, the first entry is 2160x1080, but AVPro started at 720x360 and then quickly changed to 3840x1920.
We're confused, since this doesn't seem to work as the HLS documentation describes. Do you have any insight on this? Would it even be possible for AVPro to select the initial resolution of an m3u8, if the AVPro team decided to implement it? By the way, thank you for the great support as always!
Basically what we don't want is the first impression of a video starting low resolution like screenshots below.
at the first frame : 1.png
at ~7 sec : 2.png
at ~14 sec : 3.png
Thanks for your detailed reply, this is very useful. Which platform(s) are you trying to do this on?
We're developing mainly for the Oculus platform, and possibly for GearVR later on.
Thank you for the reply, but we still need specific answers to the questions. In case you missed them, I've re-arranged them below. Could you please read through and answer them?
We're developing a VR app for Oculus/GearVR that plays 4K 360 streaming videos provided by our client.
From the last discussion, you've mentioned that Apple HLS documentation says "The first entry in the master playlist will be played at the initiation of a stream and is used as part of a test to determine which stream is most appropriate."
Even though we tried AVPro with a test m3u8 playlist that has a higher-resolution video (2160x1080) as the first entry, the AVPro player still played the lower-resolution 720x360 first and only moved to a higher resolution after a few seconds.
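For illustration, a master playlist of the shape described above might look like this (the URIs and bandwidth figures below are made up for the example); per Apple's documentation, the 2160x1080 variant listed first should be the one played at stream initiation:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=2160x1080
hi/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=720x360
lo/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=16000000,RESOLUTION=3840x1920
max/index.m3u8
```

In our tests, the player still began with the 720x360 variant regardless of this ordering.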
Question: Does this mean AVPro is not following the Apple HLS guidelines, or is this something AVPro doesn't have control over?
Question: If AVPro can fix this issue, is it already on your roadmap, and if so, when would it land?
From our development experience, as more and more client developers need to support 4K 360 videos, this exact issue will become problematic for many of them, since the initial playback resolution significantly affects the user experience. It's very underwhelming to watch a 360 video that starts in low resolution.
We searched for other player options, but it was hard to find a video player that doesn't have this very same issue. If the AVPro player fixes this, I believe it will be a real winner in the market.
Thank you for reading and we'll wait for your reply.
Thanks for your messages. So, on the Windows platform (Oculus Rift), we currently aren't able to support loading the first listed stream in the HLS manifest. We would like to add this feature, but it is going to take some time, as it is something we don't currently have control over, and getting control over it is a lot of work. It is on the roadmap, but we don't have a solid date; I would hope within the next 3 months.
With Android (GearVR), I think it might already be supported. You would need to test it with both of the Video API options (MediaPlayer and ExoPlayer), as I'm not sure how these handle stream order by default.
Feel free to email our unity support team if you would like to go into this with more detail.
Hi @AndrewRH !
Thanks for your previous answers.
I'm trying to quickly start playback of a video from a specific timestamp, but I can't find anything in the API to start playing from a given time.
My current solution is:
Calling Play, then waiting for the ReadyToPlay event, then seeking to my timestamp.
But this is not performant at all, since it has to prepare the video twice.
Can you recommend a better solution to achieve this?
Also, I have a problem on iOS when switching videos using the same video player: instead of showing the default texture on my material, the texture becomes green - see the screenshot here: https://ibb.co/j7CGVx
This issue is fixed by deactivating the YpCbCr420 option.
And just to confirm, the video player is using an AVPro shader (AVPro/Unlit/Opaque (texture+color+fog+stereo support)).
Hi, I have just a little question.
I use a HAP video and I need to route the sound through Unity, because I'm recording a video with its sound.
I looked at the documentation and used the AudioOutput component, but I haven't succeeded in routing the sound through Unity.
I'm on Windows 10, and I use Unity 2017.3.0f3.
Thanks for your question. Unfortunately our Hap decoder requires that the DirectShow API is used, but the AudioOutput component doesn't support DirectShow.
Maybe you could encode your Hap video to MP4+H.264 and then switch the video API to Media Foundation in order to do your capture?
Hmm, perhaps you could load the video without playing it, wait for ReadyToPlay, then do a seek, and then call Play? I'm not sure if this will improve the performance.
If you are having to load many videos, it may be faster to combine all of the videos together. That way you can just seek to start the playback of a new video segment which will be much faster than opening a new video.
The amount of time that seeking takes is also determined by how many key-frames your video is encoded with. The more key-frames you use, the faster the seeking will be.
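The load-then-seek-then-play flow suggested above might be sketched like this. This is a sketch against the AVPro Video 1.x API as I understand it (OpenVideoFromFile, the Events listener signature, and millisecond-based Seek are assumptions to verify against your plugin version), and the file path is hypothetical:

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Open the video without auto-playing, wait for ReadyToPlay,
// seek to the desired timestamp, then start playback once.
public class SeekThenPlay : MonoBehaviour
{
    public MediaPlayer mediaPlayer;
    public string videoPath = "MyVideo.mp4";  // hypothetical path
    public float startTimeMs = 30000f;        // where playback should begin

    void Start()
    {
        mediaPlayer.Events.AddListener(OnMediaPlayerEvent);
        // autoPlay = false: prepare the video but don't start it yet
        mediaPlayer.OpenVideoFromFile(
            MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder,
            videoPath, false);
    }

    void OnMediaPlayerEvent(MediaPlayer mp, MediaPlayerEvent.EventType evt, ErrorCode error)
    {
        if (evt == MediaPlayerEvent.EventType.ReadyToPlay)
        {
            mp.Control.Seek(startTimeMs);  // jump to the timestamp first...
            mp.Control.Play();             // ...then play, avoiding the double prepare
        }
    }
}
```

Compared to Play-then-Seek, this avoids decoding and displaying frames from the start of the file before the seek lands.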
Thanks for reporting the issue with the default texture. You are correct that disabling the YpCbCr option fixes this. We need to find a way to support normal default textures when using this mode, which is a performance and memory optimisation.
That was helpful. We will use the email(email@example.com) for further questions. Thank you for the reply!
We are developing for HoloLens and are rather often asked to create 360 videos for it. See Microsoft Studios' HoloTour.
I am pretty sure they are faking it big time, as nothing above the horizon moves (static sky) while everything below is moving. Probably the video is only playing on half of a sphere.
Anyway, my tests with spherical videos on this platform were rather disappointing. Using Unity's built-in VideoPlayer, every video above HD resolution would only show the first frame. HD works, but ends up super pixelated.
- Why and how would we benefit from AVPro? Or why is it faster anyway...
- Does anybody know of a way to compress such videos in a HoloLens-friendly way? I could not find anything on the internet regarding this.
- I downloaded 4K 360 videos from YouTube (since at the moment we don't have anything to test with). Strangely enough, all those videos are split horizontally, and it seems neither Unity's player nor AVPro can handle this.
- I tried to attach a screenshot of such a file as seen in a video editor.
I would appreciate it if anybody could shed some light on this (sorry if this was asked before; after searching for a long time I still could not find anything).
Hello, everyone. This is Jody from Magnopus. I'm taking over for now from Dooyul for video related questions. Right now I have a task to eliminate some pretty obvious lag (causing async time warp) on a VR app when playing a video using AVPro. What I observe is that the WindowsMediaPlayer in AVPro calls Native.OpenSource and it takes 472ms to complete which is way too long. It seems like it might be better as a thing that happens async and has a callback so it doesn't have to lock up the main thread. Any suggestions? Is there a fix already in the works?
AVPro can't play H.265 (HEVC) anymore?
I just updated the plugin to 1.7.5 (Unity 2017.3) and I get the following error with a file from my old project, which played well in the previous version.
Media detail -
Video : HEVC, 5760x1200, 24FPS
Audio : AAC 48000, stereo, 192kbps
[AVProVideo] Error: Loading failed. Codec not supported or video resolution too high or insufficient system resources.
RenderHeads.Media.AVProVideo.MediaPlayer:UpdateErrors() (at Assets/AVProVideo/Scripts/Components/MediaPlayer.cs:1584)
RenderHeads.Media.AVProVideo.MediaPlayer:Update() (at Assets/AVProVideo/Scripts/Components/MediaPlayer.cs:957)
Thanks for getting in touch and for reporting this issue in such detail.
So Microsoft has changed their built-in HEVC/H.265 support in the latest Windows 10 updates, which has caused a lot of problems.
In theory though they fixed the issues in the 1709 version of Windows 10, so I'm not sure why your video isn't playing. Please try rebooting your system to make sure all Windows Update tasks have completed.
If this doesn't help then please also try upgrading your GPU (NVIDIA) drivers, as the GPU is what does the actual video decoding.
There is an interesting article about the issue here:
They basically say that it should all work fine, but "Users who do not want to wait for the update to rollout into their system can install the HEVC Video Extension app from the Microsoft store.", so you could try this too.
I've also sent you a simple HEVC test video here - please could you check whether this video plays, as perhaps they changed something that only affects very high resolution playback (5760x1200 is quite high).
Otherwise please could you send us your video (or a clip from it) so that we can test it here.
Let me know how it goes.
I wanted to follow up on this thread about color space issues. We've tracked this issue down to the NVIDIA Control Panel's default settings. By default it is set to Limited (16-235) range, with the option for the video player to override this setting programmatically. End users can manually change this setting and force either Limited or Full range, but it would be ideal for AVPro to expose this option to be set in Unity.
On a related note, the NVIDIA Control Panel also defaults the hue adjustment to 179 degrees instead of 180, which does actually introduce a slight hue shift. I've reached out to NVIDIA about this as it seems like a bug, but it should be able to be similarly controlled programmatically.
These settings can be controlled via the NVAPI, although they are only exposed in the NDA version. See this post for reference.
We've been battling these issues on our own Media Foundation powered hardware accelerated video player implementation and I wanted to share the insight as it was a long, frustrating process to figure it all out.
If you'd like an intro to our contacts at NVIDIA to get the ball rolling on the NDA version of the NVAPI I'm happy to do it.
Hi Jody. That call should already be async. Perhaps you could share an example piece of media/URL so we can test it?
Thanks for getting in touch. Let me answer your questions:
1) Yes it is generally faster, because we target each platform specifically instead of making a generic solution for all platforms. We're also able to add new features more easily, so customer requested features can more easily make it into the plugin.
2) I'm not sure exactly which settings are best for Hololens. We have some general guidelines in the AVPro Video PDF documentation for video encoding. Generally for low powered devices you want to make the videos as easy to decode as possible. This includes things like disabling CABAC (for H.264), disabling B-frames, limiting Reference frames to a small number, disabling deringing, using slices, and generally using as low profile as possible.
3) I'm not familiar with this issue - perhaps you could send us a test video that has this issue?
Oh thanks very much for sharing this! This is a very handy piece of information.
I don't think we want to be working with the NVAPI at this stage, but thanks for the tips.
AVPro Video version 1.8.0 has just been released!
You can find the updated version on the Asset Store and the free trial version / demos on our website.
Android
Performance improvement by minimising texture copy operations
Upgraded ExoPlayer from 2.5.4 to 2.6.0
Upgraded Facebook Audio 360 from 1.3.0 to 1.4.0
New method GetCurrentDateTimeSecondsSince1970() for getting the absolute date-time of a video stream
New method GetSeekableTimeRanges() returns an array of seekable time ranges
Fixed bug where only up to 255 video player instances could be created
Fixed issue where camera set to skybox clear mode would make OES videos not appear, by changing shader to write to the depth buffer, just like the other VR shaders do
OES mode now uses less memory and starts faster
Fixed bug when using Facebook Audio 360 where sometimes the audio or video wouldn’t play
macOS, iOS and tvOS
New method SeekWithTolerance() offers fine seek control
New method GetCurrentDateTimeSecondsSince1970() for getting the absolute date-time of a video stream
New method GetSeekableTimeRanges() returns an array of seekable time ranges
Fixed YCbCr support in shader used by 180 video demo
Improved adaptive stream resolution switch seamlessness by eliminating the black frame flash
Windows
Added DirectShow chunk-based file loading method to allow for loading large files into memory without making large managed buffer allocations. See LoadFromBufferInChunks.cs
Fixed a bug when loading from buffer using DirectShow where memory was deleted twice
Fixed a D3D9 rendering bug that caused only the skybox to render on some newer versions of Unity due to renderstate restore issues
Upgraded Facebook Audio 360 from 1.3.0 to 1.4.0
Various improvements when using the AudioOutput component, including better AV sync and fixed glitching. Also added manual sample rate setting for streaming videos.
Improved UWP file loading demo script NativeMediaOpen.cs
Changed the way textures are updated to hopefully be more compatible
DisplayUGUI component now exposes Raycast Target field so videos can allow touch/mouse input to pass through
Added new events for video resolution change, stream stalling, buffering and seeking
Added SSL/HTTPS support to StreamParser
Various code refactoring and cleanup
I'm having issues with multiple audio tracks in a HAP-encoded AVI that I am not able to resolve with your documentation: .Info.GetAudioTrackCount() always returns only one track for a file that contains two audio streams, and it does not respond to .Control.SetAudioTrack(1).
As it's a stereo HAP video, I am using DirectShow as my video API in OpenGL and a non-headmounted stereo display.
Also, I've read in a previous post that this feature was not working on Windows 7 at one point. I'm on AVPro Video 1.8, Unity 2017.3.1f1, Windows 10 (x64), but the final app needs to run on Windows 7 (x64) - would that still be an issue for multiple audio tracks?
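For clarity, the calls in question look roughly like this. GetAudioTrackCount() and SetAudioTrack() are the methods named above; the surrounding script and the assumption that the player has finished loading are mine:

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

// Sketch: enumerate audio tracks and switch to the second one.
// Assumes the MediaPlayer has already loaded the HAP AVI.
public class AudioTrackSwitcher : MonoBehaviour
{
    public MediaPlayer mediaPlayer;

    public void SelectSecondTrack()
    {
        int trackCount = mediaPlayer.Info.GetAudioTrackCount();
        Debug.Log("Audio tracks reported: " + trackCount);

        // Expected: 2 for a file with two audio streams. If only 1 is
        // reported, SetAudioTrack(1) has nothing to switch to - which
        // is exactly the symptom described above.
        if (trackCount > 1)
        {
            mediaPlayer.Control.SetAudioTrack(1);  // tracks are zero-indexed
        }
    }
}
```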
This is bizarre and I hope someone can help. I noticed that AVPro on a GearVR only renders the right eye of my stereo media. The exact same codebase (aside from the Oculus modifications) works perfectly on Daydream. Has anybody else had an issue getting stereoscopic media to play on GearVR?
Hi, I've been watching this project for a while and am very glad to see it is still actively supported (you are maybe the most responsive dev I've ever seen).
Our team is very interested in VR (specifically 180x180 SBS video), and it looks like the asset now has more support for VR. Are there any sample projects that demonstrate how to render these kinds of videos with different projection types (180-degree domes, spherical projection, etc.)?
Hmm, that is strange. It should work. Perhaps you could send us a copy of the video (or a clip from it) so we can test?
That's very strange.
Are you using the VR single-pass rendering mode? Are you using OES mode? Which Video API setting are you using - ExoPlayer or MediaPlayer?
Perhaps you could send us over your project, or a cut-down version that demonstrates the issue? Also please let us know which version of Unity and AVPro Video you're using, and the Android device type and Android OS version. With more information we should be able to reproduce the issue and then offer a solution or fix.
Thanks, we do try
Yes, we have included demo scenes in the package for 360 spherical, 360 cubemap and 180 projections. You can download the free trial version so you can test drive it yourself. You can enable stereo mode by selecting the type of stereo encoding (SBS or TB) in the Visual tab of the MediaPlayer component.
Is AVPro able to automatically render a side-by-side view of the left and right eyes from a simple 360° video? I've tried multiple configurations and read the docs a few times, but I'm not able to make it work.
Will try and package up a portion of the project to send to you. I am using Multi-pass rendering, not using OES mode, and using ExoPlayer. Unity 2017.3.0f3 and AVPro 1.8.0. Android is a Samsung S8 running Android 7.0. Videos are H.264 at 2048x2048 which work great on Daydream. I suspect it might be something more to do with Oculus, but covering all my options here in case a tweak in AVPro fixes it.
Which platform is this on? If it's on Windows then the video may contain stereo metadata. Our Windows implementation currently doesn't support videos with stereo metadata embedded, so you would need to encode without this. On Windows only one of the eyes is rendered for these videos.
But yes, in all other cases the stereo videos are fully supported and render both views by default, unless you select the Stereo Packing Mode in the MediaPlayer > Visual section, in which case it will render the appropriate eye for stereo VR rendering.
Let me know.
Yes, if you can get it to happen in a blank project with one of our sample scenes, then that gives us the best chance of fixing it easily. Even better, use one of our left/right or top/bottom sample videos instead of your own, assuming the issue isn't related to your video file (doubtful).
Hi Andrew, thank you for your quick response.
I only target Android devices. My videos are not stereo videos, they are plain 360° videos. But I need to make them available on Cardboard or Daydream by simulating the stereo (like GoogleVR SDK does).
I'm not sure what you mean by simulating stereo... I'm not sure anything does that? Please share a link if you have one.
But our plugin will simply display the video to both eyes when using VR. Perhaps that is what you meant? Or could you explain more (with a screenshot ideally) about how it isn't working as expected?
How do I disable all debug logs?
I think there's a misunderstanding. You can find plenty of plain 360° videos on YouTube that you can watch with a headset. They are filmed in mono, but to watch them in your headset they are "converted" to stereo on the fly (this is simulated, not really 3D/stereo) ->
So when I set the Unity settings to VR Supported, for Cardboard for example, import 08_Demo_360SphereVideo, and set the target eye of the main camera to Both, I still see the plain video. What else do I have to do? I've tried the UpdateStereoMaterial script and the Top/Bottom and Left/Right stereo modes... When I enable stereo debug tinting, the whole screen is reddish.
Sorry, I'm kind of a beginner with Unity.
Here with some questions again, this time regarding your online streaming options:
1. Where is the cache for the streamed videos? We're currently making an app that needs to stream videos from online, but also have the option of saving them locally. Can I somehow abuse your cache for this and do something like copying the cached video to StreamingAssets?
2. I currently have a bug where, after a couple of playthroughs in the editor, AVPro just stops playing the streams. Even when I manually press "Load" from the inspector button, it doesn't do anything. This might be a bug of my own making, but it persists even after I stop play mode and run the app again. The only way to solve it is restarting Unity. I haven't seen the bug in a build yet, but I haven't done extensive testing on builds either.
Thanks for your time!
You can do this by going to the Global Settings section in the Media Player component, and ticking the option "Disable Logging". This will still log any errors though.
Hope that helps,
Sorry, I still don't understand the problem.
What do you mean by "the plain video"? Could you share a screenshot with us showing the issue?
1) I'm afraid we don't have access to the stream data that could be cached. This would be a LOT of work to implement, as it wasn't a feature the player was designed for.
2) Hmm, this isn't a bug we know of. Which versions of AVPro Video and Unity are you using, and which platform is this on? Also, what video formats are you loading? If you can replicate the issue in a simple scene, or in one of our demo scenes, that gives us the best chance of recreating and fixing it.
AVPro Video version 1.8.2 has just been released!
You can find the updated version on the Asset Store and the free trial version on our website.
Improved accuracy of the Stalled state/event which triggers when the player is unable to decode the next frame, usually due to network connection issues
Improved accuracy of IsBuffering state/event which occurs when data is being downloaded over the network
Fixed a bug with HLS live streams where pausing the stream or a dropped connection would stop streaming indefinitely. The stream now resumes correctly, or throws an error if the stream cannot be resumed
Please report any issues here on the forum, or to our customer support email.