Discussion in 'Assets and Asset Store' started by Lanre, Nov 18, 2017.
PM me with your invoice number and I'll share NatCorder 1.7.
Can you share your code?
Looks like you're using NatReader, which is pre-release. You can upgrade to the latest commit by removing the `com.natsuite.natreader` entry in your `Packages/package-lock.json` file. This isn't a NatCorder bug.
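For reference, the entry to remove in `Packages/package-lock.json` typically looks something like this (field values here are illustrative; yours will differ):

```json
{
  "dependencies": {
    "com.natsuite.natreader": {
      "version": "https://github.com/natsuite/NatReader.git",
      "source": "git"
    }
  }
}
```

Deleting the entry causes Unity to re-resolve the package on the next domain reload, pulling the latest commit.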
I have sent a PM. Thanks.
Getting this same error despite having "Prepare iOS for Recording" enabled. Any idea what might be causing it?
Can you share the full logs from Xcode in a .txt attachment?
So we tracked the above down to a native plugin that created a `sharedInstance` of AVAudioSession. Apparently this somehow conflicted with the NatDevice recording (?). Removing the other plugin solved the issue.
That's probably not the explanation. The `sharedInstance` is just that: a singleton shared across the entire app, no matter where you create or access it from. And creating a new `AVAudioSession` will actually just return the shared instance.
I don't think it was the shared instance itself that was the issue, but what they were accessed for. More likely an issue with the other plugin than anything to do with yours.
I see, yeah that's plausible. Let me know if I can help.
What could be the issue if I get squished videos with the CameraInput set to Camera.main?
It looks like the recorder records the whole screen and then squishes the image.
Thanks in advance, Mark
The only known scenario is when recording with augmented reality (ARFoundation, Vuforia, and so on). AR packages have to modify how the game cameras are rendered, so if you try to record them with a different aspect ratio, you will get squishing.
I am using AR Foundation, so that must be the issue then.
Is there any workaround for this or do I have to go with a resolution that is always based on the screen resolution so I have the same aspect ratio always as the device?
You'll have to record at a resolution that has the same aspect ratio as the screen.
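A minimal sketch of that, assuming NatCorder 1.8's `NatSuite.Recorders` API (class and constructor names per its public docs; verify against your installed version): derive the recording width from the screen aspect ratio so the AR camera frames are not squished.

```csharp
using UnityEngine;
using NatSuite.Recorders;
using NatSuite.Recorders.Clocks;
using NatSuite.Recorders.Inputs;

public class ARRecordingController : MonoBehaviour {

    MP4Recorder recorder;
    CameraInput cameraInput;
    IClock clock;

    public void StartRecording () {
        // Pick a recording height, then derive the width from the screen
        // aspect ratio so the recorded frames match what's on screen.
        var height = 1280;
        var width = height * Screen.width / Screen.height;
        width &= ~1; // video encoders generally require even dimensions
        recorder = new MP4Recorder(width, height, 30);
        clock = new RealtimeClock();
        cameraInput = new CameraInput(recorder, clock, Camera.main);
    }

    public async void StopRecording () {
        cameraInput.Dispose();
        var path = await recorder.FinishWriting();
        Debug.Log($"Saved recording to: {path}");
    }
}
```
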
Hi, I tried out the CropTextureInput and WatermarkTextureInput.
Is it possible to use both at the same time? I could not figure that out. Are there any examples anywhere?
With CropTextureInput I always get a black video somehow. Is that one working?
When I build for standalone macOS with the "Intel 64-bit + Apple silicon" architecture on Unity 2020.3.16f and NatCorder 1.8.1, I get a "DllNotFoundException: NatCorder". With the "Intel 64-bit" architecture it works like a charm without any exceptions.
I already deleted the whole package and reimported it.
The watermark and crop texture inputs in NCPX are still WIP. They aren't ready for use yet. The API has not been finalized. Once it is, there will be a way to use both simultaneously.
Hm I'll check this out. NatCorder is built with both Intel and arm64 slices.
It seems like I am experiencing some kind of memory leak on iOS. This happens when I enable audio input with video input at the same time. Here is the memory consumption of the ReplayCam scene with record microphone enabled.
This test project was built with Unity 2020.3.17f and NatCorder version 1.8.1.
I have been experiencing a small color difference with NatCorder (1.8.1) on Android. (Tested on a Galaxy S8, Android 9; iOS works perfectly.) The colors recorded do not seem to be a 1:1 match with what is shown on screen, most noticeably with the lighter colors: yellow becomes more orange, and orange becomes more reddish. I have tried both gamma and linear space with the same result. I have also tried to blit the colors from linear to gamma using a shader, without the desired result.
Am I doing something wrong? Is there any way to solve this?
I have added an example that makes the problem a bit clearer. (I hope.)
Hey there, I've been able to reproduce this. I'll work on adding a fix. Thanks for bringing it to my attention.
Hey Martin, I'm not entirely sure what could be causing this. First off, NatCorder doesn't (officially) support linear color, so you'll want to stick to gamma. This could be an issue with the encoder, specifically the color transfer function used in converting RGBA32 to YUV for encoding. Can you try verifying this on a different Android device to see if it's device-specific?
I can confirm the same happens on a Pixel 4 and galaxy s21. They all return the same color difference as the previous example.
That’s great. So do you have an estimate of when the fix would be out? And in the meantime, do you have any quick fixes or suggestions that I can try out?
I'll send you a DM on this.
No ETA for the fix, but I'll look into it this weekend. I don't know of a workaround because I'm not sure where the leak is coming from.
The iPhone 13 Pro Max and iPhone 13 Pro cannot get a preview image.
I get a corrupted image, but CapturePhoto works with no problem.
Is there any update about supporting HDR?
I'm working to get an iPhone 13 right now. I'll get back to you on this as soon as I get the device. I apologize for the delay.
What was the issue?
It is possible to record HDR cameras, but it requires extra setup:
1. Modify `CameraInput` and, in the constructor, change the format of the `frameDescriptor` to `RenderTextureFormat.ARGBFloat`.
2. Add a tonemapper to your game camera.
If you can send me a repro project, I can look into adding automatic support to NatCorder so you don't need to perform extra steps.
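For step 1, the change lives in NatCorder's own `CameraInput` source; the member names below are approximate, since the internal field layout may differ between versions:

```csharp
// Inside NatCorder's CameraInput constructor (package source), find where the
// RenderTextureDescriptor for camera frames is created and switch its color
// format to a float format so HDR values survive the readback.
frameDescriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGBFloat, 24);
```
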
I am using the MP4Recorder to record an AR scene with a CameraInput. I have some questions regarding the video size:
1. Is the bitrate a reference only? I set the bitrate to 3_600_000 and took a few videos. The bitrates of the three videos are 3397kbps, 3746kbps, and 4316kbps. Does that mean that the bitrate changes according to the video content? If so, is there a maximum effective bitrate when it is set to 3_600_000?
2. I have a limit on file transfer, so I would like to keep the video file size under 30MB. I am not sure what would be the best way to do it. Do you have any suggestions?
Is it possible to turn down the video quality after the video is created? Or would it be possible to write the video file with settings different than the declared value?
Thank you for reading the questions. Hope you have a nice weekend :)
Hey there, NatCorder uses variable bitrate across the MP4 and HEVC recorders. So yes, the effective bitrate of the video depends on the visual complexity of what you are recording. NatCorder doesn't support constant bitrate recording.
Hm this is a tough one. It's practically impossible to create a hard upper bound. What you can do is configure recording so that it's unlikely that the video will ever go beyond that size. For this, the most effective control is the recording resolution and duration. Other configurations like the bitrate and keyframe interval have less of an effect.
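As a back-of-the-envelope check when budgeting those controls: file size is roughly bitrate × duration / 8. The numbers below are illustrative, not NatCorder guarantees; since the bitrate is variable, leave generous headroom under your cap.

```csharp
// Rough file-size estimate for a VBR recording.
const long videoBitrate = 3_600_000;  // bits per second (configured)
const long audioBitrate = 64_000;     // bits per second (assumed AAC rate)
const long duration = 55;             // seconds
var estimatedBytes = (videoBitrate + audioBitrate) * duration / 8;
var estimatedMB = estimatedBytes / (1024f * 1024f);
// ≈ 24 MB nominal, which keeps a 30 MB cap safe even if the effective
// bitrate overshoots the configured value by ~20%.
```
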
Not at all; NatCorder has no video editing or transcoding functionality. It is strictly a video recorder, so its work is done once you've finished recording a video.
We are encountering a native crash on Android when committing audio samples from a fixed-length array on another thread. This code works fine on every other platform (including iOS). The native crash indicates JNI throwing BufferOverflowException.
If we loop over each video frame, committing a subset of the audio samples on each iteration, NatCorder works fine.
See this <standalone repro script> that will crash on Android 11.
On Android the codec provides fixed-size buffers that clients (e.g. NatCorder) then fill with data and send back for encoding. Because of this, you can't commit an entire waveform in one call to `CommitSamples`. Instead, you have to chop it up and commit.
The default size of the buffer that the encoder provides is typically small (4096 or 8192 bytes, can't remember). NatCorder bumps this up to 16384 bytes. So in C#, the max size of a sample buffer you can commit is 8192 samples (16384 bytes divided by two bytes-per-short).
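So on the C# side, a long waveform has to be committed in chunks of at most 8192 samples. A minimal sketch under that constraint (the class, field, and method names other than `CommitSamples` are mine, and timestamp handling is simplified; a production version should advance the timestamp per chunk based on the sample rate and channel count):

```csharp
using System;
using NatSuite.Recorders;

class ChunkedAudioCommitter {

    // Android's encoder buffer is 16384 bytes = 8192 16-bit samples per commit.
    const int MaxSamplesPerCommit = 8192;
    readonly IMediaRecorder recorder;

    public ChunkedAudioCommitter (IMediaRecorder recorder) => this.recorder = recorder;

    public void CommitWaveform (float[] waveform, long timestamp) {
        for (var offset = 0; offset < waveform.Length; offset += MaxSamplesPerCommit) {
            // Copy out a chunk no larger than the encoder buffer allows.
            var count = Math.Min(MaxSamplesPerCommit, waveform.Length - offset);
            var chunk = new float[count];
            Array.Copy(waveform, offset, chunk, 0, count);
            recorder.CommitSamples(chunk, timestamp);
        }
    }
}
```
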
Hey there, just wanted to follow up on this. The memory leak seems to be something that's pretty hard to find. There's no explicit leak in NatCorder's own encoding code. It'll take longer to get this fixed, but I'm working on it.
Understood, thank you!
Hi, I want to use the `MTLTextureInput` on iOS, but the class is private. I can use the `GLESTextureInput` fine, though.
How is `MTLTextureInput` intended to be used, and why is the class not public like `GLESTextureInput`?
I am using the latest versions of both.
The MTLTextureInput will be removed entirely soon. Unity already supports async GPU readbacks on Metal, so you can use the AsyncTextureInput instead.
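For context, the Unity `AsyncGPUReadback` pattern underlying that suggestion looks roughly like this (a sketch only; in practice you would just use the `AsyncTextureInput` and let it manage readbacks for you):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using NatSuite.Recorders;

static class TextureCommitter {

    // Read a texture back from the GPU asynchronously (supported on Metal)
    // and commit the pixel data to the recorder once the readback completes.
    public static void CommitTexture (IMediaRecorder recorder, Texture texture, long timestamp) {
        AsyncGPUReadback.Request(texture, 0, TextureFormat.RGBA32, request => {
            if (!request.hasError)
                recorder.CommitFrame(request.GetData<byte>().ToArray(), timestamp);
        });
    }
}
```
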