Discussion in 'Assets and Asset Store' started by AndrewRH, Feb 12, 2016.
I think you should use English first.
Thanks. We have responded on GitHub. That is where we will address your issue from now on.
Would it be possible to post an issue to github? We will be able to gather and correlate all information better that way, hopefully help find a solution quicker.
To better help you, we will need more information. The quickest and best way to achieve this is to post an issue to the github issues board.
We will respond via GitHub rather than pollute the forum with back-and-forth questions!
I am afraid we can only handle support requests in English.
I believe I have seen that you posted this to the unitysupport@ support email address. We will try to help you via that channel.
Indeed. We can only offer support in English. Apologies for this. It's not cool, but we do not have a full complement of linguistic support staff at RenderHeads.
I see you have already posted up a support request for this issue to github...
That is the way you will receive support the quickest. By cross posting here you are only slowing down how quickly we can get to and help with your issue. Sorry to be blunt!
I believe this is the same issue as: https://github.com/RenderHeads/UnityPlugin-AVProVideo/issues/243
Please post any additional information you may have to that github issue.
The simple answer is...yes, but with caveats.
I assume you are targeting Windows/macOS...
If your hardware can do it, then you should have no problems...it should 'just work'. If you are piping the audio through Unity, then that should also work for you.
We would always recommend downloading the free (watermarked) trial version to check AVPro Video works for your use-case before purchasing a license. You can grab that here: http://downloads.renderheads.com/2019/UnityPlugin-AVProVideo-Latest-Trial.unitypackage
Thank you very much. Actually I have purchased all of the AVPro products on the Asset Store. Thank you for your news. I have no experience with 5.1 and 7.1 sound, so I would like to ask specifically: besides the hardware requirements, is there a special setting in the software (e.g. AVPro or Windows 10)? Are there any special requirements for the video format or the audio? Thanks again.
In addition, I have always wondered whether AVPro provides all of the Windows DLL source code files. If the price is appropriate, I would like to buy it. Thanks.
Hello, I have posted my question on GitHub, please have a look, thank you.
Anyway, you have not responded to my questions on either the forum or GitHub!
1. On the Windows and Android platforms I sometimes do not receive the FirstFrameReady event, so I cannot use this event to trigger seamless playback;
2. Because the media file I play has a relatively large resolution (7680 x 1080, H.265 encoded): while one player is already playing and near the end, I preload another media file, triggered by FirstFrameReady, but when the second player loads and initializes there is an error. The prompt is as follows:
Android Player (Droidlogic_X96Air_P2@192.168.15.52) (AVProVideo) Error: Failed to load. The file could not be found, the codec is not supported, the video resolution is too high, or the system resources are insufficient.
3. In addition, when the player opens a non-existent file through the OpenVideoFromFile() function, the function still returns true and does not trigger the MediaPlayerEvent.EventType.Error event;
4. If the media does not load successfully, the _MediaPlayer.CloseVideo() function will not trigger the MediaPlayerEvent.EventType.Closing event, which causes some trouble when processing playlist playback;
I am afraid we do not offer a license to access the source code.
As long as you encode your audio in accordance with one of the standard channel layouts, you should be all good.
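For example, with ffmpeg you could produce a standard 6-channel (5.1) AAC track like this (the file names here are just placeholders):

```shell
# Copy the video stream untouched and re-encode the audio as 6-channel (5.1) AAC
ffmpeg -i input.mov -c:v copy -c:a aac -ac 6 output.mp4
```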
That is not on the AVPro Video issues board. Please post it here: https://github.com/RenderHeads/UnityPlugin-AVProVideo
I am trying to play videos on Android using AWS CloudFront. This works so far on PC and web, but not Android for some reason. I am just getting the error below. I will note that loading the S3 URL works just fine, but not CloudFront.
"[AVProVideo] Error: Loading failed. File not found, codec not supported, video resolution too high or insufficient system resources."
I know that error is kind of a catch-all (I would love to see better error messages), so any help would be most appreciated. I can also set up a test project on GitHub later if you would like.
I can also download the video using the WWW class and play it in other ways on Android, which is why I am posting here, as it seems to only be a problem with AVPro loading it.
Hi Renderheads Team!
I have posted the issue to github
Hello, I want to create a gallery with the ability to play 360 video using this asset. Is there a way to create thumbnails for the gallery?
I am using AVProVideo on iOS, to make the video manipulation app Surrealizer.
It worked perfectly when I released it Feb 4 2019. I have not made any updates to it since release, but recently discovered that it doesn't work properly anymore.
It should work by accessing a video from the device video library and then manipulating it during playback, but it doesn't seem to want to load the video any more.
I can only attribute this failure to some kind of change caused by iOS updates, leading to some kind of error in video addressing. I am getting the video library file address via a third-party plugin and directing that to the media player with
mediaPlayer.OpenVideoFromFile(MediaPlayer.FileLocation.AbsolutePathOrURL, vidPath, true);
where vidPath is the video file address.
However it won't play, but it did 12 months ago.
Previously I was using:
mediaPlayer.m_VideoPath = vidPath;
Any idea why this might not work any more?
:SOLVED: Apparently in my case, the file address I was getting was not correctly prefixed. It seems that now when you reference iOS or Android files you have to prefix the address with "file://"
( "file://" + vidPath )
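For what it's worth, the fix can be sketched like this (assuming vidPath is the absolute path returned by the gallery plugin):

```csharp
// iOS/Android now need an explicit file scheme on absolute paths
string url = vidPath.StartsWith("file://") ? vidPath : "file://" + vidPath;
mediaPlayer.OpenVideoFromFile(MediaPlayer.FileLocation.AbsolutePathOrURL, url, true);
```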
Does AVPro support 5.2K 3D-180 Video with Ambisonic Spatial Audio on the Oculus GO?
Is it possible to pass a c# stream for playback? Or is it possible to play a stream with self signed certificate and http authentication?
Is there a list that states which resolutions are supported by the different chipsets on the Android platform? A link to the Kodi Wiki is provided in the documentation, but that contains no info about resolutions. The S8, for example, supports nearly 4K, but the latest Samsung A20 can't even do 2K.
Also, the provided Facebook360 demo video doesn't work on Android; there is audio but the video is black - am I missing something?
These are my own findings (in the context of 360VR stereoscopic content) that I came up with; more than happy to be corrected:
Now I need to check the supported resolution at runtime, and my idea is to include a 1-second blank clip for each of these resolutions, go from the highest to the lowest, and find out which one works.
Also, in my scenario all videos except the initial one will be downloaded, so I also need to figure out how to handle this, since I don't want to just use the lowest resolution and give a bad impression on a device that could support higher resolutions.
Nearly 15 days later, I still haven't seen your reply on GitHub. And why do you consider my questions 'pollution' of the forum?
Interesting. I actually switched from email to this forum because they stopped answering me about 2 weeks ago.
This is kinda scary, because I would like to buy this, but I can't drop 400€ on something that isn't supported anymore or is hanging on life support.
Hello RenderHeads Team,
I would like to know if there is a way to have different blending modes (additive, subtract, exclusion ...) with your AVPro product?
Thank you for your feedback.
I've found a solution for blending AVPro video modes. I used camera filter assets (like, for example, a camera filter pack without ads).
For now I used one camera per piece of footage, but that works fine.
Good news for us graphic designers!
Hello! Do you have any advice on how to properly use multiple cameras in the scene? I have a spectator cam that tracks the movements of the main camera; the spectator cam outputs to a second display, not the HMD. I'm having an issue where my stereo 360 video alternates between the two eyes in the spectator cam view, while the main camera is totally fine. I'm assuming it has something to do with the fact that only the main camera is assigned in the "update stereo material" script settings, and I'm also using single pass instanced stereo rendering mode. Thank you!
Is there any way to change from stereoscopic rendering to monoscopic rendering at runtime? I am setting material.SetFloat("ForceEye", 1), for example, during runtime. It does not actually set the force eye mode to Right if I use 1, or Left if I use 2. Once I call the above function, I check whether the value is actually changing in the shader with material.GetFloat("ForceEye"). I am getting the correct value I set it to, even though it is not actually changing from stereoscopic to monoscopic rendering during runtime. I'd like the ability to change to monoscopic rendering during something like a pause menu. Any help or advice is appreciated. If it is not possible, let me know so I can make a feature request on the GitHub page. Thank you.
Where can I find the AVPro Video documentation? The links on Renderheads homepage (http://renderheads.com/products/avpro-video/) are broken and the documentation is not included in the asset store package.
I would need to find out how to read metadata from an HLS stream - like what is the current timestamp of the segment.
There seems to be a lag in the AVPro Video starting in my scene, causing the video to get slightly out of sync (about 3-5 frames) with the other elements in the scene. (This is a music app, that needs the track in the video to be exactly synced with the elements in the scene)
I am already using Coroutines to make sure that the FinishedSeeking and Started video events have fired before any other playback in the scene is started. Here's an outline of how my Start Playback function is structured:
bool finishedSeeking = false;
bool videoStarted = false;

private IEnumerator Seek()
{
    finishedSeeking = false;
    videoStarted = false;
    // (the seek and play are triggered here)
    yield return new WaitUntil(() => finishedSeeking);
    yield return new WaitUntil(() => videoStarted);
    //Start other elements in scene (Audio track, synthesizer, etc.)
    //They are still slightly ahead of the video track, enough to be a problem.
}

public void OnVideoEvent(MediaPlayer mediaPlayer, MediaPlayerEvent.EventType eventType, ErrorCode error)
{
    Debug.Log("VideoEvent: " + eventType.ToString() + " Error: " + error.ToString());
    if (eventType == MediaPlayerEvent.EventType.FinishedSeeking)
        finishedSeeking = true;
    else if (eventType == MediaPlayerEvent.EventType.Started)
        videoStarted = true;
}
However, there is still about a 0.13 second lag between the AVPro video and the other elements.
My scene previously used Unity Video Player, and there was no detectable lag.
Is there a better way to manage seeking, or otherwise make sure there's no loss of synchronization when starting or seeking the video?
I want to use mediaPlayer.OpenVideoFromBuffer - currently the only video API supported for that is DirectShow. But there is a difference in how the video looks when I use "Media Foundation" (opening the video from a file) and when I read it from the buffer using DirectShow - any advice?
I'm using MP4 video file on Windows 10.
What do you mean by "there is a difference"?
Or are you asking whether there is any visual difference? If so, then there shouldn't be any visual difference.
You could try to encode your videos in a way that makes seeking and decoding in general faster. Most H.264/H.265 encoders (eg ffmpeg) have options for this such as "-tune fastdecode" which will make the files slightly larger, but will also make them faster to play back/seek. You could also add more keyframes to your video, and make sure that you're seeking to a keyframe (fastest) rather than an intermediate (B/P) frame, as these take longer to decode unless the subsequent frames have already been decoded.
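For example, with ffmpeg (the file names and the keyframe interval here are just placeholders):

```shell
# Tune H.264 for fast decoding and insert a keyframe every 30 frames;
# the audio track is copied through unchanged
ffmpeg -i input.mp4 -c:v libx264 -tune fastdecode -g 30 -c:a copy output.mp4
```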
I hope this helps. Thanks,
The PDF documentation is included in the package, and is also linked on our website and the Unity asset store. I believe we had a problem with our website host last week that caused links to not work. It should be working now.
As far as I know we don't really have any means to query the current timestamp of a segment. Though I'm not completely sure what you mean by that.
Yes you should be able to change the rendering mode at runtime.
You should set it on the UpdateStereoMaterial component itself via:
updateStereoMaterial.ForceEyeMode = StereoEye.Left;
Hope that helps,
Hmm good question...
What if you set the spectator camera Target Eye property to left?
Glad to hear it works well
We don't have any list of supported resolutions/frame-rates etc. The market is just too fragmented, with so many new devices, and often the information isn't easily available.
The 1-second blank clip is a good idea and one we have seen used before. You can have a few of these with different specs and see which one plays, and also validate that they can achieve the desired playback frame-rate. Of course it's not a perfect test, because the bit-rate will be very low, and often bit-rate can be a factor in playback performance.
But yes, it's a good idea to do this as we haven't found any other reliable way to know the system limits.
I believe it should support this. The Ambisonic audio would have to be using our Facebook 360 system though as we don't support normal ambisonic videos directly without this encoding step.
You could try the free trial version to make sure it works as you intend.
We have a demo scene in the project called 06_Demo_FrameExtract which shows how to do this.
Basically though, it just involves opening the video as normal and extracting a frame.
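A rough sketch of the extraction step (assuming the video is open and the first frame is ready; mediaPlayer.TextureProducer.GetTexture() is AVPro's accessor for the current frame, and the rest is standard Unity texture copying):

```csharp
using UnityEngine;
using RenderHeads.Media.AVProVideo;

public class ThumbnailGrabber : MonoBehaviour
{
    public MediaPlayer mediaPlayer;

    // Copies the current video frame into a readable Texture2D for use as a gallery thumbnail
    public Texture2D GrabThumbnail(int width, int height)
    {
        Texture frame = mediaPlayer.TextureProducer.GetTexture();
        if (frame == null) return null;

        // Blit into a temporary RenderTexture at the thumbnail size, then read it back
        RenderTexture rt = RenderTexture.GetTemporary(width, height);
        Graphics.Blit(frame, rt);

        RenderTexture prev = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D thumb = new Texture2D(width, height, TextureFormat.RGB24, false);
        thumb.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        thumb.Apply();
        RenderTexture.active = prev;
        RenderTexture.ReleaseTemporary(rt);
        return thumb;
    }
}
```

Note that on some platforms the video texture is vertically flipped, so you may need to flip the result depending on where you display it.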
Hmmm....I'm not sure why it wouldn't work. I'm actually not familiar with CloudFront. Did you ever find a solution? Is it just a normal MP4 video file, or something like HLS?
Do you mean a C# Stream class object? If so - no that is not supported. The closest we support to that is on UWP where you can pass in a IRandomAccessStream.
I believe playing a self signed certificate may be supported on some platforms. We have seen it working on all Apple platforms (macOS, iOS, tvOS) and Windows but currently not Android.
As for HTTP authentication - this is something we haven't tested. If you are using BASIC auth and just passing the user/pass via the URL, then it might work....
Sorry that's all the information I have on this right now.
Is there any way to protect video files on PC, Mac & Linux Standalone?
Is there a way on Android to check/test whether a file can be played before actually playing it? The reason is that before downloading the videos, we would like to know which resolution the device supports, so that we can deliver the best resolution.
There isn't really a way to do this. You could perhaps rename the files to something like ".dat"...
There isn't a way to do this. You could try to maintain a list of all Android hardware limitations, but I think this is far too difficult. We generally will have some dummy black videos and just try to play them on startup to determine which resolutions are supported.
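A rough sketch of that startup probe (the clip names are hypothetical; it walks from the highest-resolution dummy clip downwards and stops at the first one that renders a frame):

```csharp
using System.Collections;
using UnityEngine;
using RenderHeads.Media.AVProVideo;

public class ResolutionProbe : MonoBehaviour
{
    public MediaPlayer mediaPlayer;

    // Hypothetical dummy clips in StreamingAssets, highest resolution first
    private readonly string[] _probeClips = { "probe-3840.mp4", "probe-2560.mp4", "probe-1920.mp4" };

    private bool _done, _success;

    public IEnumerator FindBestClip(System.Action<string> onResult)
    {
        mediaPlayer.Events.AddListener(OnVideoEvent);
        foreach (string clip in _probeClips)
        {
            _done = _success = false;
            mediaPlayer.OpenVideoFromFile(MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder, clip, true);
            // Wait until the clip either renders a frame or errors out
            yield return new WaitUntil(() => _done);
            if (_success) { onResult(clip); yield break; }
        }
        onResult(null); // none of the probe clips played
    }

    private void OnVideoEvent(MediaPlayer mp, MediaPlayerEvent.EventType eventType, ErrorCode error)
    {
        if (eventType == MediaPlayerEvent.EventType.FirstFrameReady) { _success = true; _done = true; }
        else if (eventType == MediaPlayerEvent.EventType.Error) { _done = true; }
    }
}
```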
With my AVProVideo for Windows (recently bought, freshly installed; Unity at 2019.3.3 atm) I'm setting up a project with a 360-degree surround video. The application should be able to switch between using a VR HMD (HTC Vive Pro) and the monitor. Most things seemed to run fine until suddenly I could not use my mouse to look around any more when not using the XR device, whether I just tell Unity to disable it (XRSettings.enabled = false) or unplug the Vive completely.
Using one of your demo scenes (I imported the whole package into my project) does not work (anymore) either.
Enabling the XR and looking around with the Vive works fine.
Any idea where my mouse control suddenly went?
Plus one question: does AVProVideo support switching the audio playback device? When I leave the Vive connected but disable XR in Unity, I still hear the audio via the Vive headset instead of the speakers or other headphones.