
NatCorder - Video Recording API

Discussion in 'Assets and Asset Store' started by Lanre, Nov 18, 2017.

  1. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Hey @Mr-Mechanical the System.IO.File API is what I'd use. Deciding when to delete videos, and keeping track of them, is where you'd need to put some thought. You could walk the persistent data path directory; or better yet always record to the same file name so you don't get tons of videos.
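    For reference, here's a minimal sketch of the "walk the persistent data path" approach. It assumes recordings end up as .mp4 files directly in Application.persistentDataPath, and the age cutoff is purely illustrative:
    Code (CSharp):
    using System;
    using System.IO;
    using System.Linq;
    using UnityEngine;

    public static class RecordingCleanup {

        // Hypothetical helper: delete recorded MP4 files older than the given age.
        public static void DeleteOldRecordings (TimeSpan maxAge) {
            var cutoff = DateTime.Now - maxAge;
            var stale = Directory
                .GetFiles(Application.persistentDataPath, "*.mp4")
                .Where(path => File.GetLastWriteTime(path) < cutoff);
            foreach (var path in stale)
                File.Delete(path);
        }
    }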
     
    Mr-Mechanical likes this.
  2. 254926964

    254926964

    Joined:
    Mar 19, 2021
    Posts:
    2
    Hi Lanre!
    I have a question about the example ReplayCam.
    I put a Sprite in the scene and it is always moving. When I start recording, the sprite pauses on screen, but the video still records the sprite's path. When I end recording, the sprite starts moving again. How can I fix this? Thanks.
     
  3. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This sounds like a bug that used to happen on Metal (iOS or macOS) where the screen would freeze when recording was enabled. Unity fixed it in 2019.2. What version of Unity are you using?
     
  4. 254926964

    254926964

    Joined:
    Mar 19, 2021
    Posts:
    2
    I updated Unity to 2020 and the problem is solved. Thanks for your help. :)
     
    Lanre likes this.
  5. bramble-operations

    bramble-operations

    Joined:
    Aug 27, 2019
    Posts:
    8
  6. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
  7. bandyer

    bandyer

    Joined:
    Feb 9, 2021
    Posts:
    1
    Hi. Is 1.8.0 available? The Asset Store is still on 1.7.3.
     
  8. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Not just yet, though it has been in a beta for quite some time. You can always PM or email me to get the build.
     
  9. KalurielPrime

    KalurielPrime

    Joined:
    May 11, 2014
    Posts:
    5
    @Lanre

    In the documentation it states that timestamps should be monotonic; does this mean that both video and audio commits need to be in chronological order?

    It is also stated that "the very first timestamp for either an audio or video frame must be zero". Is this for both the first video and the first audio frame? And in the case of both initial commits being at 0, does this mean we should pad with silence if there is no audio initially?

    Thanks
     
  10. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Yes, frames within a given track must be strictly monotonic (i.e., current frame timestamp > previous frame timestamp). This doesn't hold across tracks (video and audio), so one track can be 'behind' another, temporally speaking.
    It's either-or, so any one of them.
    Though I haven't tested this, if the time difference is negligible (only a few frames), then no need to commit silence.
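    To illustrate the rule, here's a rough sketch that guards each track independently; the recorder field and the timestamp bookkeeping are mine, not part of NatCorder:
    Code (CSharp):
    // Each track is checked against its own last timestamp. Video can lag audio
    // (or vice versa), but within a track timestamps must strictly increase.
    long lastVideoTimestamp = -1L;
    long lastAudioTimestamp = -1L;

    void TryCommitFrame (byte[] pixelBuffer, long timestamp) {
        if (timestamp <= lastVideoTimestamp)
            return; // would break strict monotonicity for the video track
        recorder.CommitFrame(pixelBuffer, timestamp);
        lastVideoTimestamp = timestamp;
    }

    void TryCommitSamples (float[] sampleBuffer, long timestamp) {
        if (timestamp <= lastAudioTimestamp)
            return; // would break strict monotonicity for the audio track
        recorder.CommitSamples(sampleBuffer, timestamp);
        lastAudioTimestamp = timestamp;
    }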
     
  11. KalurielPrime

    KalurielPrime

    Joined:
    May 11, 2014
    Posts:
    5
    Lanre likes this.
  12. KalurielPrime

    KalurielPrime

    Joined:
    May 11, 2014
    Posts:
    5
    @Lanre

    I've done some debugging on the video issues we've seen and thought you might be interested.

    We've had issues on Android with our audio buffer going over the max of 2048 samples per channel, and my predecessor resolved this by committing the remaining samples on the next frame. This seemed to work fine, as the audio just continues on as if there is no gap between audio commits.

    However, the time jump caused the video and audio to come out of sync and produced a shorter video. This was easy to resolve by doing multiple commits per frame, with each chunk's timestamp offset by the number of nanoseconds covered by the samples committed before it.

    However, this also highlighted another issue - not committing audio frames when there is silence. If I commit audio as follows

    Code (CSharp):
    VIDEO 0s --------------------------------------------------------------------- 40s
    AUDIO 0s -- no commit -- 10s -- commit -- 20s -- no commit -- 30s -- commit -- 40s
    Then on Windows / iOS / macOS, I'll end up with something like this, with silence filling up the end of the video.
    Code (CSharp):
    VIDEO 0s ----------------------------------------------------------------- 40s
    AUDIO 0s -- no audio -- 10s -- audio -- 20s -- audio -- 30s -- no audio -- 40s
    However on Android, it just shortens the video regardless of the video commits and skips the initial period where no audio samples were committed.
    Code (CSharp):
    VIDEO 10s -------------------------- 30s
    AUDIO 0s -- audio -- 10s -- audio -- 20s
    I'm going to try padding the audio with the appropriate number of zero samples when there is silence between commits.
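    For anyone hitting the same per-commit limit, this is roughly what the multiple-commits-per-frame approach looks like. The kNumAudioChannels / kAudioSampleRate constants, the Recorder field, and the 2048-sample cap are placeholders for your own project's values:
    Code (CSharp):
    // Split an oversized interleaved buffer into chunks under the per-commit cap,
    // offsetting each chunk's timestamp by the duration of the samples before it.
    void CommitAudioChunked (float[] buffer, long timestamp) {
        int maxSamplesPerCommit = 2048 * kNumAudioChannels; // per-commit cap (illustrative)
        for (int offset = 0; offset < buffer.Length; offset += maxSamplesPerCommit) {
            int count = System.Math.Min(maxSamplesPerCommit, buffer.Length - offset);
            var chunk = new float[count];
            System.Array.Copy(buffer, offset, chunk, 0, count);
            // Nanoseconds covered by the sample frames committed before this chunk
            long offsetNs = (long)((offset / kNumAudioChannels) * 1000000000.0 / kAudioSampleRate);
            Recorder.CommitSamples(chunk, timestamp + offsetNs);
        }
    }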
     
    Last edited: Mar 29, 2021
  13. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Are you limiting your sample buffers to not be longer than 2048 samples per channel? And how many channels is your audio data? On Android, the MP4 and HEVC recorders can accept sample buffers that have up to 16384 samples (so divide by the number of channels you have). This was increased from 8192 samples over 17 months ago.
    Yup sounds like this should work.
    This is pretty odd, and is likely an implementation detail from Android's `MediaMuxer`. NatCorder will wait until all encoded chunks from both encoders have been sent to the muxer before finalizing the session. So it must be that the muxer is truncating the streams to the shorter of the two.
     
  14. KalurielPrime

    KalurielPrime

    Joined:
    May 11, 2014
    Posts:
    5
    No dropped samples, and we have two channels. The plugin is told our sample rate and, without looking at our source, I'm guessing it outputs at that rate.

    It will, however, occasionally feed us more samples, so it could just be giving us audio frames further into the future.

    I think the change to limit commits to 2048 samples per channel was made quite a while back, in early development, when an issue appeared; since the limit has been increased, I'll give 8192 a go.
     
    Last edited: Mar 29, 2021
    Lanre likes this.
  15. MasterControlProgram

    MasterControlProgram

    Joined:
    Apr 18, 2015
    Posts:
    38
    We're currently seeing an issue where a video recorded on a high end / latest iOS device records at the transcoding profile High@L5. This video then, when loaded on an Amazon Fire device, fails to play because that transcoding profile is not supported. The highest profile that device supports is High@L4. Is there a way to set the transcoding profile in Natcorder to something that's not the device's default?
     
  16. xmarty

    xmarty

    Joined:
    Feb 2, 2015
    Posts:
    3
    Hi,

    The first frame is black on Android.
    I use Unity 2019.4 / NatCorder 1.7.3 with NatRender (GLESReadback).
    When readback.Request is called, the NativeArray<byte> is filled with zeros.
     
  17. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Hey there. This is an interesting one; I've never had it come up before. I'll look into exposing this as a setting, though I can't make any guarantees that it will be exposed. What I could do is make a custom build of libNatCorder.a (the iOS library) that hard-codes the encoding profile.
     
  18. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This sounds like a bug. Do you mind sharing your readback code? Also, when NatCorder 1.8.0 is released, you wouldn't have to use NatRender directly, as the NatCorder Performance Extensions (NCPX) library provides these constructs.
     
  19. MasterControlProgram

    MasterControlProgram

    Joined:
    Apr 18, 2015
    Posts:
    38
    Thanks Lanre for the reply! A custom build of libNatCorder.a would be greatly appreciated. Would you be able to set the iOS profile to High@L4? That should work fine on the devices we're having issues with. And yes, in the future, it'd be great to have this configurable if possible.

    Also, some context might be helpful! Our app allows you to record videos that are then uploaded to our server. A section of our app allows you to view these saved videos. These videos are associated with the user's account, so they can login to the same account on a different device. Thus, if they make a video on a newer iOS device, switch over to a lower end Kindle device using the same account, and try to view saved video - playback fails.

    Thanks again!
     
  20. MasterControlProgram

    MasterControlProgram

    Joined:
    Apr 18, 2015
    Posts:
    38
    We're investigating adjusting the resolution of the video, which might force it down to a lower transcoding profile, Lanre. Will keep you posted!
     
  21. xmarty

    xmarty

    Joined:
    Feb 2, 2015
    Posts:
    3
    I've used your NatRender (https://github.com/natsuite/NatRender)
    and the NatCorder ReplayCam sample scene.
    Project Setting:
    Auto Graphic API -> Off
    Graphics API-> [OpenGLES3]
    Multithreaded Rendering-> On
    ScriptBackEnd-> IL2CPP
    Target Architectures->[ARM64]

    CameraInput.cs Constructor
    Code (CSharp):
    #if UNITY_EDITOR
                readback = new AsyncReadback();
    #elif UNITY_ANDROID
                readback = new GLESReadback(width, height, true);
    #elif UNITY_IOS
                readback = new MTLReadback(width, height, true);
    #endif
    CameraInput.cs OnFrame
    Code (CSharp):
    readback.Request(frameBuffer, buffer => {
        if (pixelBuffer != null) {
            buffer.CopyTo(pixelBuffer);
            recorder.CommitFrame(pixelBuffer, timestamp);
        }
    });
     
  22. lc0034

    lc0034

    Joined:
    Jul 13, 2017
    Posts:
    5
    Hi.
    There is an error.

    Unity version : 2020.2.4f1

    Platform : PC (Windows standalone)

    My source :
    mediaRecorder = new MP4Recorder(1280, 720, 30f, 48000, 2);

    error log :
    : Unable to load DLL 'NatCorder': The specified module could not be found.
    at NatSuite.Recorders.MP4Recorder+<>c__DisplayClass2_0.<.ctor>b__0 (NatSuite.Recorders.Internal.Bridge+RecordingHandler callback, System.IntPtr context) [0x00000] in <00000000000000000000000000000000>:0
    .
    .

    Help me...
     
  23. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This would be the preferable solution so you don't keep needing custom NatCorder builds going into the future. Another option is to transcode videos on your server, an approach which is very common.
     
  24. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Can you PM me with your invoice number and email address? I'll send you the NatCorder 1.8.0 beta build which you can use with NCPX. If the issue is reproducible with NCPX, I'll open a bug report.
     
  25. MasterControlProgram

    MasterControlProgram

    Joined:
    Apr 18, 2015
    Posts:
    38
    Hey Lanre! Just a follow-up. Resolution is the fix here, it seems. Basically, on higher-end devices we were recording at a higher resolution. This pushes the transcoding profile to the more modern High@L5, which supports the higher resolution but is not compatible with lower-end devices. If we record at a lower resolution, it automatically gets assigned the lower High@L4 transcoding profile, which works on the lower-end device. And yes, for existing videos that have this problem, we're going to do some backend transcoding for cross-platform support. Thanks a lot for your help!
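    For anyone else running into this, the workaround is roughly the sketch below. The MP4Recorder constructor matches the one used earlier in this thread; the 1920x1080 cap is only illustrative, since the exact resolution that keeps the encoder at High@L4 varies by device:
    Code (CSharp):
    // Clamp the recording resolution so the hardware encoder picks a lower,
    // more widely supported H.264 level. The 1080p cap is illustrative.
    int maxWidth = 1920, maxHeight = 1080;
    float scale = Mathf.Min(1f, Mathf.Min((float)maxWidth / Screen.width, (float)maxHeight / Screen.height));
    int width = Mathf.RoundToInt(Screen.width * scale) & ~1;   // keep dimensions even for the encoder
    int height = Mathf.RoundToInt(Screen.height * scale) & ~1;
    var recorder = new MP4Recorder(width, height, 30f, 48000, 2);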
     
  26. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    I will send you the package. Please remove your invoice number because this information should not be public.
     
  27. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This sounds good. I'm glad you found an easy workaround!
     
  28. xmarty

    xmarty

    Joined:
    Feb 2, 2015
    Posts:
    3
    I tested by applying the 1.8.0 beta package with NCPX,
    using the GLESTextureInput in the ReplayCam example.
    However, the black-frame problem was not solved.
     
  29. lc0034

    lc0034

    Joined:
    Jul 13, 2017
    Posts:
    5
    OK. Thank you. I will wait.
     
    Lanre likes this.
  30. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    In this case open an issue on the NCPX repo. Make sure to add information about what device and OS version you are running on when you create the issue.
     
  31. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    For some reason, WeTransfer is failing to send you the file. Do you have another email address I can send it to?
     
  32. lc0034

    lc0034

    Joined:
    Jul 13, 2017
    Posts:
    5
    jck9906@naver.com
    Thank you
     
    Lanre likes this.
  33. lc0034

    lc0034

    Joined:
    Jul 13, 2017
    Posts:
    5
    Sorry. It's my fault.
    I changed the Architecture (in Build Settings) from x86 to x86_64 and it works fine.
    Thank you!
     
    Lanre likes this.
  34. Somshekar

    Somshekar

    Joined:
    Dec 16, 2016
    Posts:
    9
    Crash on Android - signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr (Flutter + Unity)
    E/AndroidRuntime(25654): FATAL EXCEPTION: UnityMain
    E/AndroidRuntime(25654): Process: com.example.example_flutter_natcorder, PID: 25654
    E/AndroidRuntime(25654): java.lang.Error: *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    E/AndroidRuntime(25654): Version '2019.4.22f1 (9fdda2fe27ad)', Build type 'Release', Scripting Backend 'il2cpp', CPU 'arm64-v8a'
    E/AndroidRuntime(25654): Build fingerprint: 'OnePlus/OnePlus3/OnePlus3T:9/PKQ1.181203.001/1911042108:user/release-keys'
    E/AndroidRuntime(25654): Revision: '0'
    E/AndroidRuntime(25654): ABI: 'arm64'
    E/AndroidRuntime(25654): Timestamp: 2021-04-09 11:25:06+0530
    E/AndroidRuntime(25654): pid: 25654, tid: 25796, name: UnityMain >>> com.example.example_flutter_natcorder <<<
    E/AndroidRuntime(25654): uid: 10411
    E/AndroidRuntime(25654): signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
    E/AndroidRuntime(25654): x0 0000000000000000 x1 00000000000064c4 x2 0000000000000006 x3 0000000000000008
    E/AndroidRuntime(25654): x4 000000000a013128 x5 000000000a013128 x6 000000000a013128 x7 00000000001fffff
    E/AndroidRuntime(25654): x8 0000000000000083 x9 ac39d5a6e35d00c0 x10 fffffff87ffffbdf x11 ac39d5a6e35d00c0
    E/AndroidRuntime(25654): x12 ac39d5a6e35d00c0 x13 fffffff87ffffbdf x14 00000000000000cb x15 0000007d6e3788c8
    E/AndroidRuntime(25654): x16 0000007d6e36c2b0 x17 0000007d6e27e088 x18 0000007ce8800080 x19 0000000000006436
    E/AndroidRuntime(25654): x20 00000000000064c4 x21 0000000000000083 x22 0000007cc8d25800 x23 0000007b8b8f8000
    E/AndroidRuntime(25654): x24 0000000000000014 x25 00000000ffffffff x26 0000000000000012 x27 0000000000000005
    E/AndroidRuntime(25654): x28 0000007ce80a5107 x29 0000007cbd7a67f0
    E/AndroidRuntime(25654): sp 0000007cbd7a67b0 lr 0000007d6e26f4d0 pc 0000007d6e26f4f0
    E/AndroidRuntime(25654):
    E/AndroidRuntime(25654): backtrace:
    E/AndroidRuntime(25654): #00 pc 00000000000224f0 /system/lib64/libc.so (abort+112) (BuildId: 055dd78cff796ef05668266325ca65a8)
    E/AndroidRuntime(25654): #01 pc 000000000046e080 /system/lib64/libart.so (art::Runtime::Abort(char const*)+1208) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #02 pc 00000000000095a8 /system/lib64/libbase.so (android::base::LogMessage::~LogMessage()+720) (BuildId: 2865ab4ce5ce5dc6a4f92b1fb8e61a31)
    E/AndroidRuntime(25654): #03 pc 00000000002e9e6c /system/lib64/libart.so (art::JavaVMExt::JniAbort(char const*, char const*)+1660) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #04 pc 00000000002ea0e8 /system/lib64/libart.so (art::JavaVMExt::JniAbortF(char const*, char const*, ...)+196) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #05 pc 000000000049ef7c /system/lib64/libart.so (art::Thread::DecodeJObject(_jobject*) const+700) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #06 pc 00000000000fcd10 /system/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::CheckInstance(art::ScopedObjectAccess&, art::(anonymous namespace)::ScopedCheck::InstanceKind, _jobject*, bool)+96) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #07 pc 00000000000fc004 /system/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::CheckPossibleHeapValue(art::ScopedObjectAccess&, char, art::(anonymous namespace)::JniValueType)+580) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #08 pc 00000000000fb5a8 /system/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::Check(art::ScopedObjectAccess&, bool, char const*, art::(anonymous namespace)::JniValueType*)+628) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #09 pc 00000000000ebb4c /system/lib64/libart.so (art::(anonymous namespace)::CheckJNI::GetObjectClass(_JNIEnv*, _jobject*)+688) (BuildId: f1eeccb4c9a016b8729a9ad774977d5d)
    E/AndroidRuntime(25654): #10 pc 000000000004a720 <anonymous:0000007cc88ff000>
    E/AndroidRuntime(25654):
    E/AndroidRuntime(25654): at libc.abort(abort:112)
    E/AndroidRuntime(25654): at libart.art::Runtime::Abort(char const*)(Abort:1208)
    E/AndroidRuntime(25654): at libbase.android::base::LogMessage::~LogMessage()(~LogMessage:720)
    E/AndroidRuntime(25654): at libart.art::JavaVMExt::JniAbort(char const*, char const*)(JniAbort:1660)
    E/AndroidRuntime(25654): at libart.art::JavaVMExt::JniAbortF(char const*, char const*, ...)(JniAbortF:196)
    E/AndroidRuntime(25654): at libart.art::Thread::DecodeJObject(_jobject*) const(DecodeJObject:700)
    E/AndroidRuntime(25654): at libart.art::(anonymous namespace)::ScopedCheck::CheckInstance(art::ScopedObjectAccess&, art::(anonymous namespace)::ScopedCheck::InstanceKind, _jobject*, bool)(CheckInstance:96)
    E/AndroidRuntime(25654): at libart.art::(anonymous namespace)::ScopedCheck::CheckPossibleHeapValue(art::ScopedObjectAccess&, char, art::(anonymous namespace)::JniValueType)(CheckPossibleHeapValue:580)
    E/AndroidRuntime(25654): at libart.art::(anonymous namespace)::ScopedCheck::Check(art::ScopedObjectAccess&, bool, char const*, art::(anonymous namespace)::JniValueType*)(Check:628)
    E/AndroidRuntime(25654): at libart.art::(anonymous namespace)::CheckJNI::GetObjectClass(_JNIEnv*, _jobject*)(GetObjectClass:688)
    E/AndroidRuntime(25654): at Unknown.0x7cc8949720(Unknown Source:0)
    E/AndroidRuntime(25654): FATAL EXCEPTION: UnityMain
    E/AndroidRuntime(25654): Process: com.example.example_flutter_natcorder, PID: 25654
    E/AndroidRuntime(25654): java.lang.Error: FATAL EXCEPTION [UnityMain]
    E/AndroidRuntime(25654): Unity version : 2019.4.22f1
    E/AndroidRuntime(25654): Device model : OnePlus ONEPLUS A3003
    E/AndroidRuntime(25654): Device fingerprint: OnePlus/OnePlus3/OnePlus3T:9/PKQ1.181203.001/1911042108:user/release-keys
    E/AndroidRuntime(25654):
     
  35. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Can you provide some context? When does the crash happen? What device and OS version? Can you share your recording code?
     
  36. lc0034

    lc0034

    Joined:
    Jul 13, 2017
    Posts:
    5
    Hi.
    Frame drops are too severe during recording.
    The first 1-2 seconds are fine, but after that the app freezes (same in the editor).

    Unity version : 2020.2.4f1
    Platform : PC (Windows standalone)
    Renderpipeline : URP

    My source :
    private void StartRecord()
    {
        mediaRecorder = new MP4Recorder(1280, 720, 30f, 48000, 2);
        realtimeClock = new RealtimeClock();
        cameraInput = new CameraInput(mediaRecorder, realtimeClock, Camera.main);
        audioInput = new AudioInput(mediaRecorder, realtimeClock, audioListener);
    }
     


    Last edited: Apr 15, 2021
  37. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    Hi @Lanre, we're having some issues pausing on Android. If we stop and start committing several times, then regardless of the next commit timestamp, a long delay is inserted into the video, as if we had jumped a few seconds forward on the next committed timestamp.
    There doesn't seem to be a consistent way to reproduce it; the simplest way is just to start and stop several times.
     
  38. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    There is usually freezing when you create a recorder that expects audio (as in your case) but don't commit any audio samples to the recorder. This is a weird Windows-only behaviour, and it's not clear why MediaFoundation does this. So for you, I would check that the audio listener is actually receiving/processing audio samples. Create a script, add an OnAudioFilterRead method, log the sample buffer size, and attach the script to your audio listener (and play to see the logs).
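    Something like this minimal probe, attached to the same GameObject as the AudioListener, will show whether samples are actually flowing:
    Code (CSharp):
    using UnityEngine;

    // Attach to the GameObject that holds the AudioListener to confirm that
    // audio samples are being processed.
    public class AudioProbe : MonoBehaviour {

        void OnAudioFilterRead (float[] data, int channels) {
            // Called on the audio thread with the interleaved sample buffer.
            Debug.Log($"Received {data.Length} samples across {channels} channels");
        }
    }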
     
  39. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This happens if you aren't 'pausing' timestamps. Can you share your pause/resume code?
     
  40. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    Our timestamps aren't driven by a realtime clock, but by the number of audio samples output by the audio engine we use. So when we pause, our audio engine is just not outputting samples anymore.

    Code (CSharp):
    private void CommitAudioBuffer(float[] buffer)
    {
        Recorder.CommitSamples(buffer, Timestamp);
        int samples = buffer.Length / kNumAudioChannels;
        Timestamp += SamplesToNanoseconds(samples, kAudioSampleRate);
    }
    Other than this, the only other thing we do is make sure the timestamps are monotonic, so samples / frames won't be committed if the timestamp hasn't incremented.

    A single pause seems fine (though I can't guarantee it without more testing), but if we pause and unpause several times, that is when delays in the video and audio are introduced. The delays don't occur on every subsequent pause once they begin; it is seemingly random when they are added.
    I don't believe they are silence in the audio samples, as the video itself freezes too, then continues from where it was before the pause (still in sync with the audio), and no video frames after unpausing have been lost.
     
  41. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    How are you handling pausing video frames? If the pauses in the video aren't exactly at the boundaries of when you pause recording in code, then that could be a separate issue. Can you share a video with the pause?
     
  42. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    CommitVideoFrame is called at the end of a renderpass where we do extra rendering for the video frame. I'll look at producing a video with the issue that I can share.

    Code (CSharp):
    public void CommitVideoFrame(ref CommandBuffer cmd, RenderTexture framebuffer)
    {
        long timestamp = Timestamp;

        timestamp = (VideoTimestamp == -1) ? 0 : timestamp; // first frame should always start at 0

        // this is here to be sure our video timestamps are monotonic
        if (timestamp == VideoTimestamp)
        {
            return;
        }

        VideoTimestamp = timestamp;

        if (SystemInfo.supportsAsyncGPUReadback)
        {
            cmd.RequestAsyncReadback(
                framebuffer,
                0,
                request =>
                {
                    var nativeArray = request.GetData<byte>();
                    CommitVideoFrame(nativeArray, timestamp);
                }
            );
        }
        else
        {
            UnsupportedAsyncReadback_CommitVideoFrameAsyncEntry(framebuffer, timestamp);
        }
    }

    private async void UnsupportedAsyncReadback_CommitVideoFrameAsyncEntry(RenderTexture framebuffer, long timestamp)
    {
        // Wait for the next frame to readback so we don't delay the current frame
        await new WaitForEndOfFrame();
        await new WaitForUpdate();

        // Read pixels from RenderTexture
        var oldRT = RenderTexture.active;
        {
            RenderTexture.active = framebuffer;
            UnsupportedAsyncReadbackTexture.ReadPixels(new Rect(0, 0, UnsupportedAsyncReadbackTexture.width, UnsupportedAsyncReadbackTexture.height), 0, 0);

            var nativeArray = UnsupportedAsyncReadbackTexture.GetRawTextureData<byte>();
            CommitVideoFrame(nativeArray, timestamp);
        }
        RenderTexture.active = oldRT;
    }

    private void CommitVideoFrame(Unity.Collections.NativeArray<byte> nativeArray, long timestamp)
    {
        if (nativeArray.IsCreated)
        {
            unsafe
            {
                IntPtr nativeArrayPtr = new IntPtr(
                    Unity.Collections.LowLevel.Unsafe.NativeArrayUnsafeUtility.GetUnsafePtr(nativeArray)
                );

                if (nativeArrayPtr != IntPtr.Zero)
                {
                    Recorder.CommitFrame(nativeArrayPtr, timestamp);
                }
            }
        }
    }
     
  43. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    I should note that I like how your code is crafted to avoid making allocations; looks like you've hand-optimized recording. Anyway, I'm not sure that this is the relevant recording code. What I'm specifically looking to see is how you handle pausing and resuming, though it seems like since you're not using recorder inputs or clocks, you simply suspend sending frames to the recorder. How are the input timestamps calculated? I don't see any incrementing; only (re)assignment.
     
  44. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    Ah sorry, this was mentioned in my second reply

    Though I did miss the samples to nanoseconds method
    Code (CSharp):
    public static long SamplesToNanoseconds(int samples, int sampleRate)
    {
        return (long)(((double)samples * 1000000000.0) / (double)sampleRate);
    }
    So our audio engine is just paused and it no longer outputs samples while paused. We had to stop using the realtime clock as any hiccups in Unity would cause gaps in our audio samples which caused the video and audio to get out of sync enough to be noticeable, even if it only happened once at the beginning of recording.
     
  45. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This is for audio timestamps and it checks out. How about video?
     
  46. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    Video is driven by audio, so it uses the audio timestamp. It's not obvious because my quick edit just named it Timestamp instead of AudioTimestamp, but you can see that long timestamp is assigned from it.

     
  47. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    This doesn't seem correct. Your audio timestamps are based on the sample rate (44.1KHz I think). Using this as a video timestamp would mean your video stream is running at 44,100 frames per second. Your video timestamps need to be computed independently, based on the frame rate of the app (or however often you are committing video frames).
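    In other words, something along the lines of this sketch, where video time advances with the wall-clock recording time rather than with the audio sample count (the class here is illustrative, not a NatCorder type):
    Code (CSharp):
    using System.Diagnostics;

    // Drive video timestamps independently of the audio track, and stop the
    // clock while recording is paused so timestamps don't jump forward.
    public class PausableVideoClock {

        readonly Stopwatch stopwatch = new Stopwatch();

        public void Resume () => stopwatch.Start(); // call when recording starts or resumes
        public void Pause () => stopwatch.Stop();   // elapsed time stops accumulating while paused

        // Nanoseconds of recorded (unpaused) time; TimeSpan ticks are 100 ns each.
        public long Timestamp => stopwatch.Elapsed.Ticks * 100L;
    }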
     
  48. shodgson_nl

    shodgson_nl

    Joined:
    Jan 8, 2018
    Posts:
    8
    Sorry, I don't quite understand what you mean. We're only submitting video at most once per frame; the timestamp used for the committed video frame is calculated from the total number of audio samples we have committed overall and the sample rate (48k :)), to determine how many nanoseconds' worth of time that is.

    So if, before a video frame, we have committed 48k samples (it's actually a little more complicated, as we also use pending samples from our buffer in the calculation), we know it should have a timestamp of 1 second, which is 1e9 nanoseconds. This means our video frames are in sync with the audio engine's output.
     
  49. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Yeah this is what I don't understand. You have to confirm that your synthesized video timestamps aren't causing uneven timing in the video. This could easily be your culprit.
     
  50. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,614
    Yup, NatCorder supports ARFoundation (ARCore, ARKit), Wikitude, and Vuforia. I have an example project for ARFoundation with NatCorder in the works.
    How you choose to start and stop recording is up to you. NatCorder is used programmatically, giving you full flexibility.
    NatCorder always saves recordings to the app's private documents directory. You can use the free NatShare plugin to copy the recording to the camera roll on iOS and Android.
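    If it helps, copying a finished recording to the camera roll with NatShare looks roughly like this. This assumes NatShare's SavePayload API (1.2.x), so double-check the calls against the NatShare version you actually install:
    Code (CSharp):
    using UnityEngine;
    using NatSuite.Sharing; // NatShare -- assumed namespace/API, verify against your version

    public class RecordingSaver : MonoBehaviour {

        public async void SaveToCameraRoll (string videoPath) {
            var payload = new SavePayload();
            payload.AddMedia(videoPath);
            var success = await payload.Commit();
            Debug.Log($"Saved recording to camera roll: {success}");
        }
    }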
     