
NatCorder - Video Recording API

Discussion in 'Assets and Asset Store' started by Lanre, Nov 18, 2017.

  1. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    WebM is just a container format, and NatCorder only produces a WebM file on WebGL (encoded with the VP8 codec in most cases).
     
  2. pocketpair

    pocketpair

    Joined:
    Jul 7, 2015
    Posts:
    72
    Hello, I'm the developer of Overdungeon ( https://store.steampowered.com/app/919370/Overdungeon/ ) and trying to introduce NatCorder into Overdungeon.

    Unfortunately, on Mac the captured video looks strange. Please have a look.




    1. Captured on Mac (with the OS screen-capture function).
    2. Captured by NatCorder on Mac.
    Actual Video:
    https://imgur.com/a/ekPb3Mj

    It looks like a blue filter has been applied. Do you have any idea why this happens?
    On Windows it works fine. (We haven't tried on iPhone or Android.)

    Environment:
    Unity version: 2019.1.7f
    On Windows 10
    On MacOS 10.14.4

    Player settings:





    By the way, we modified some code to optimize with AsyncGPUReadbackRequest.
    It works on Windows, but with the modification it doesn't work on Mac.
    https://gist.github.com/urokuta/b8862be31319f095541af88245c4d6f0

    Anyway, please let me know if you have any ideas.
    Thank you for your development and your support.
     
    Last edited: Jun 19, 2019
  3. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    The issue you are facing relates to swizzling. Email me with your invoice number for the 1.6.0 build (which uses AsyncGPUReadback).
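    For what it's worth, a quick way to sanity-check a swizzling problem on the client side is to swap the red and blue channels before committing; if the tint disappears, the pixel data is arriving as BGRA rather than RGBA. This is only a diagnostic sketch, not the actual 1.6.0 fix, and the `texture`, `videoRecorder`, and `recordingClock` names are placeholders.
    Code (CSharp):
    // Diagnostic only: swap R and B to test for a BGRA/RGBA mismatch.
    Color32[] pixels = texture.GetPixels32();
    for (int i = 0; i < pixels.Length; i++) {
        byte r = pixels[i].r;
        pixels[i].r = pixels[i].b;
        pixels[i].b = r;
    }
    videoRecorder.CommitFrame(pixels, recordingClock.Timestamp);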
     
  4. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    I believe this question must have been asked before:

    Is there any workaround for Windows 7 standalone? Maybe a fallback to GIF recording if the machine is on Win7.

    There's still about a third of users on Win7, so I think this issue must be taken care of...
     
  5. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Nope. We simply don't support Windows 7 (or Windows 8 for that matter). NatCorder requires Windows 10, 64-bit.
     
  6. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    On NatCorder 1.6, how do you supply the raw data array to the encoder?
    I am converting my texture into a byte[], but the output is strange.



    As a side note.

    VideoRecorder.Commit() needs to be able to take a NativeArray<byte> too (avoiding the conversion to byte[]).

    And VideoRecorder needs an option for a variable framerate. Previously I could achieve this behaviour by setting the video framerate in the options to 0; now doing so creates a blank file.
     
    Last edited: Jun 24, 2019
  7. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    How are you converting to a byte[]? Can you share your code? The video indicates that there is a mismatch between the width that the recorder expects, and the width of the pixel buffer you are committing.
    At some point, we need to extract a byte[] from the NativeArray<byte>, so we leave that to the client. The front-end API is designed to not be Unity-specific.
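    For example, something as simple as the following, assuming `request` is a completed AsyncGPUReadbackRequest (the variable names are just for illustration):
    Code (CSharp):
    // NativeArray<byte>.ToArray() allocates a managed copy for the recorder.
    byte[] pixelBuffer = request.GetData<byte>().ToArray();
    videoRecorder.CommitFrame(pixelBuffer, recordingClock.Timestamp);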
    The H.264 codec doesn't allow this. You must provide a valid framerate setting.
     
  8. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    Here is the code
    Code (csharp):
    recordingClock = new RealtimeClock();
    videoRecorder = new MP4Recorder(
        Screen.width,
        Screen.height,
        30,
        0,
        0,
        OnReplay,
        5000000,
        1
    );
    then to commit the frame data
    Code (csharp):
    NativeArray<byte> rawData = req.GetData<byte>();
    rawData.CopyTo(rawTextureData); // same size as the req buffer
    videoRecorder.CommitFrame(rawTextureData, recordingClock.Timestamp);
    I checked, and req.width and req.height match Screen.width and Screen.height.

    In an earlier post you said that version 1.6 uses AsyncGPUReadback, but that feature does not work on OpenGL ES.

    I am currently trying to figure out a way to reduce the texture read time from the GPU, but I am starting to think it may not be easily achievable.
     
  9. sapiains

    sapiains

    Joined:
    Feb 19, 2015
    Posts:
    2
    Hi, I bought NatCorder last week to record movement into a texture. The asset fit well into the project, but I need the resulting video to be mirrored horizontally.

    Is it possible to apply a horizontal mirror when recording the video?

    Thanks!
     
  10. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    What exactly is `req`? This object likely has the answer to the issue at hand. On Android (or more accurately, OpenGL and GLES), you can use Pixel Buffer Objects to effectively achieve the same functionality as AsyncGPUReadback.
     
  11. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You can do this by blitting your frame data to the NatCorder frames (from AcquireFrame), but with a custom shader that does the horizontal flip. The GreyWorld example does something like this to record a greyscale version of the webcam.
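    Roughly, something like the following; `flipMaterial` would use a simple shader that samples the source with uv.x mirrored, e.g. float2(1.0 - uv.x, uv.y). The material and source texture names here are placeholders, not part of NatCorder.
    Code (CSharp):
    // Blit through a horizontal-flip material into the recorder's frame.
    var frame = videoRecorder.AcquireFrame();
    Graphics.Blit(sourceTexture, frame, flipMaterial);
    videoRecorder.CommitFrame(frame, recordingClock.Timestamp);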
     
  12. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    req is an AsyncGPUReadbackRequest, from before I found out it is not supported on Android.
    Anyway, I am not using that anymore.

    Code (csharp):
    textureToEncode.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    textureToEncode.Apply();

    videoRecorder.CommitFrame(textureToEncode.GetPixels32(), data.timeStamp); // same result
    videoRecorder.CommitFrame(textureToEncode.GetPixels(), data.timeStamp); // totally corrupted
    videoRecorder.CommitFrame(textureToEncode.GetRawTextureData(), data.timeStamp); // same result
    Besides all this, if I don't exit Play Mode between stopping a recording and starting a new one, Unity will crash.

    On another side note, PBOs are only supported on OpenGL ES 3 devices; that graphics API is supported from Android 5.0 (API level 21), but there is no guarantee it is implemented.
     
  13. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    How do you initialize `textureToEncode`? PBOs are supported on OpenGL ES 3, which has been available on Android devices since API level 18. NatCorder itself does not support OpenGL ES 2.
     
  14. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    Code (csharp):
    textureToEncode = new Texture2D(Screen.width, Screen.height, TextureFormat.RGBA32, false);
    textureToEncode.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    I'm also getting the following when using the ReplayCam sample (Samsung Galaxy S5):


    AndroidJavaException: java.lang.NoSuchMethodError: no non-static method with name='address' signature='()J' in class Ljava.lang.Object;
    06-25 15:54:13.502 26114 27004 E Unity : java.lang.NoSuchMethodError: no non-static method with name='address' signature='()J' in class Ljava.lang.Object;
    06-25 15:54:13.502 26114 27004 E Unity : at com.unity3d.player.ReflectionHelper.getMethodID(Unknown Source)
    06-25 15:54:13.502 26114 27004 E Unity : at com.unity3d.player.ReflectionHelper.nativeProxyInvoke(Native Method)
    06-25 15:54:13.502 26114 27004 E Unity : at com.unity3d.player.ReflectionHelper.a(Unknown Source)
    06-25 15:54:13.502 26114 27004 E Unity : at com.unity3d.player.ReflectionHelper$1.invoke(Unknown Source)
    06-25 15:54:13.502 26114 27004 E Unity : at java.lang.reflect.Proxy.invoke(Proxy.java:393)
    06-25 15:54:13.502 26114 27004 E Unity : at $Proxy11.onReadback(Unknown Source)
    06-25 15:54:13.502 26114 27004 E Unity : at com.yusufolokoba.natrender.AsyncGPUReadback.dispose(AsyncGPUReadback.java:90)
    06-25 15:54:13.502 26114 27004 E Unity : at com.yusufolokoba.natrender.AsyncGPUReadback.access$300(AsyncGPUReadback.java:13)
    06-25 15:54:13.502 26114 27004 E Unity : at com.yusufolokoba.natrender.AsyncGPUReadback$1.run(AsyncGPUReadback.java:108)
    06-25 15:54:13.502 26114 27004 E Unity : at android.os.Handler.handleCallback(Handler.java:739)
    06-25 15:54:13.502 26114 27004 E Unity : at android.os.Handler.dispatchMessage(Handler.java:95)
    06-25 15:54:13.502 26114 27004 E Unity : at android.os.Looper.loop(Looper.java:158)
    06-25 15:54:13.502 26114 27004 E Unity : at android.os
     
  15. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Delete and reimport NatCorder 1.6.0 in your project. The issue you are facing is weird.
     
  16. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    I deleted and reimported the 1.6.0 version, still no luck.
    I will simply disable the recording feature as this is taking far too much time to implement.
     
    Lanre likes this.
  17. distastee

    distastee

    Joined:
    Mar 25, 2014
    Posts:
    66
    Hey Lanre, any ETA for the new release? I'm hoping to upgrade to the new NatCorder/NatMic to get rid of the hitch when I first start recording videos.
     
  18. SominHC

    SominHC

    Joined:
    Jul 20, 2018
    Posts:
    8
    Hi Lanre!
    I've been trying to use NatCorder to capture gameplay highlights that players can share after the game. This works great when I can predictably say that the next x seconds will be interesting; quite often, though, I can't tell in advance.
    Is there a way to use NatCorder so that it buffers the last x seconds and saves them to disk once I tell it to? I don't think it's an existing feature, judging from the docs, but maybe there is a way to do it by hand that I'm not seeing?

    cheers
     
  19. J_Xiaopi

    J_Xiaopi

    Joined:
    Nov 9, 2018
    Posts:
    13
    Hi,
    I bought NatCorder. Can it record the screen instead of the camera?
     
  20. Ethosh_Unity

    Ethosh_Unity

    Joined:
    Sep 26, 2018
    Posts:
    1
    Will this package work with ARKit and ARCore?
     
  21. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    The builds are ready, we just aren't releasing them yet. You can always email me with your invoice number and I'll share them with you.
     
  22. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    For most applications, recording the camera is synonymous with recording the screen (after all, what is shown on screen is what the game camera sees). Are you facing any issues?
     
  23. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Yup.
     
  24. J_Xiaopi

    J_Xiaopi

    Joined:
    Nov 9, 2018
    Posts:
    13
    Thanks for your reply. I have solved that problem. I have a new question:
    when I was using NatMicCorder-Demo-master, I had a microphone delay problem. I hope you can help me; it means a lot to me.
     


  25. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    This will be fixed in the next NatMic release.
     
  26. J_Xiaopi

    J_Xiaopi

    Joined:
    Nov 9, 2018
    Posts:
    13
    OK, I sent you an email about the beta! ;)
     
    Last edited: Jul 1, 2019
  27. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Email me with your invoice number.
     
  28. J_Xiaopi

    J_Xiaopi

    Joined:
    Nov 9, 2018
    Posts:
    13
    The invoice number is included in the email. ;)
     
    Lanre likes this.
  29. paulcasti

    paulcasti

    Joined:
    Jul 1, 2019
    Posts:
    1
    How can I try out this plugin before buying?
     
  30. Aidan-Wolf

    Aidan-Wolf

    Joined:
    Jan 6, 2014
    Posts:
    59
    Hi @Lanre,

    How do I film in landscape on mobile? It seems that it still records in portrait, with the width and height flipped, regardless of whether the device is in landscape mode.
     
  31. Aidan-Wolf

    Aidan-Wolf

    Joined:
    Jan 6, 2014
    Posts:
    59
    Never mind, I reworked the GreyWorld sample code to record a proper landscape video in my mobile AR app.

    If you or anyone else is interested, I can clean up the code and share it here. Landscape should be an option in the official plugin anyway.
     
    Lanre likes this.
  32. pedrobarrosbrasil

    pedrobarrosbrasil

    Joined:
    Apr 4, 2019
    Posts:
    8
    Could you help me out?
    After the recording is done, I want to display the recorded video and give the user the option to discard it or share it using NatShare. I'm a bit of a newbie, so it's not working the way I wanted. I know it's simple, but I just can't figure it out!
     
  33. Develoop

    Develoop

    Joined:
    Dec 22, 2013
    Posts:
    11
    The latest version of NatCorder, 1.5.1, supports a minimum API level of Android 6.0. Are you planning to make a version that supports Android 5.0? And if your answer is YES, when will it be available? :)
     
  34. HappyShip

    HappyShip

    Joined:
    Jun 15, 2017
    Posts:
    7
    I'm very interested in this! Would you mind sharing it?
     
  35. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    There isn't a concept of a landscape option, because NatCorder really doesn't have anything to do with the app orientation. Simply set your recording resolution to a landscape resolution (width > height), then use a CameraRecorder. The CameraRecorder renders the camera to a RenderTexture that matches the recording resolution, so it's like displaying on a virtual screen that is landscape.
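    For instance, a sketch along these lines, using the constructor and CameraInput seen elsewhere in this thread; the exact arguments and the `OnRecordingComplete` callback are assumptions for illustration.
    Code (CSharp):
    // Landscape is simply a recording resolution where width > height.
    var recordingClock = new RealtimeClock();
    var videoRecorder = new MP4Recorder(1280, 720, 30, 44100, 2, OnRecordingComplete);
    var cameraInput = new CameraInput(videoRecorder, recordingClock, Camera.main);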
     
  36. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    This is more UI stuff that doesn't have anything to do with NatCorder. How you implement it is entirely up to your app design.
     
  37. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Yup, NatCorder 1.6.0 drops the Android requirement to API level 21 (Android 5.0). The build is ready, we just haven't released it yet. Email me with your invoice number and I'll share it with you.
     
    Develoop likes this.
  38. J_Xiaopi

    J_Xiaopi

    Joined:
    Nov 9, 2018
    Posts:
    13
    Hi @Lanre,
    I sent it to olokobayusuf@gmail.com. I am not sure if that is the right address. If you received my email, please let me know. Thank you! ;)
     
  39. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Just responded.
     
  40. Aidan-Wolf

    Aidan-Wolf

    Joined:
    Jan 6, 2014
    Posts:
    59
    Hi Lanre,

    The app freezes completely for about 1.5-2.0 seconds every time I start recording when using NatCorder + NatMic. Is this expected behavior? I'm on an iPhone XS recording at half resolution.

    Update - here is the code:
    Code (CSharp):
    private void Start() {
        audioDevice = AudioDevice.Devices[0];
    }

    public void StartRecording() {
        recordingClock = new RealtimeClock();
        videoRecorder = new MP4Recorder(
            562,
            1218,
            30,
            44100,
            2,
            ShowVideo
        );
        cameraInput = new CameraInput(videoRecorder, recordingClock, Camera.main);
        micInput = new MicAudioInput(videoRecorder, recordingClock);
        audioDevice.StartRecording(sampleRate, channelCount, micInput);
    }
     
  41. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Can you share the logs? Also, what happens in the Xcode profiler?
     
  42. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    NatCorder 1.6.0 is Live!

    Android 21 support, experimental support for HEVC (on iOS, macOS, and Windows), commit raw pixel data (no more GC!), thread-safe recorders (record in a background thread), and more.

    Requires Android 21+, iOS 11+, Windows 10 64-bit, macOS 10.13+, Unity 2018.3+.
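    Purely as a sketch of what "record in a background thread" can look like with the raw pixel data API: a single worker that drains a queue preserves frame order. The queue, recorder, and clock names are illustrative, and this assumes it is safe to call CommitFrame off the main thread as described above.
    Code (CSharp):
    using System;
    using System.Threading;
    using System.Collections.Concurrent;

    // Single worker thread that commits frames in the order they were queued.
    var frameQueue = new BlockingCollection<Action>();
    new Thread(() => {
        foreach (var commit in frameQueue.GetConsumingEnumerable())
            commit();
    }).Start();

    // On the main thread, once a readback has completed:
    var pixels = request.GetData<byte>().ToArray();
    var timestamp = recordingClock.Timestamp;
    frameQueue.Add(() => videoRecorder.CommitFrame(pixels, timestamp));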
     
    sam598 and Menion-Leah like this.
  43. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    So why can't 1.6 accept RenderTextures anymore?

    I understand that hardware encoders can block a thread, and in some specific use cases adding the ability to submit a byte array can be beneficial to a multithreaded application. But why remove the option for RenderTextures entirely?

    In most hardware encoders providing a GL/Metal/Surface texture to the encoder is much faster than a bytebuffer. And on the Unity side of things (and without knowing how the plugin is working internally) I would imagine:
    RenderTexture -> HardwareEncoder
    would be much faster than:
    RenderTexture -> Texture2D -> Colors32 -> ByteBuffer -> HardwareEncoder

    Even if the performance loss is negligible, this wonderful piece of code:

    Code (CSharp):
    var frame = videoRecorder.AcquireFrame();
    Graphics.Blit(renderTexture, frame);
    videoRecorder.CommitFrame(frame, timestamp);
    now becomes:

    Code (CSharp):
    readbackBuffer = readbackBuffer ?? new Texture2D(videoRecorder.pixelWidth, videoRecorder.pixelHeight, TextureFormat.RGBA32, false, false);
    RenderTexture.active = renderTexture;
    readbackBuffer.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0, false);
    readbackBuffer.GetRawTextureData<byte>().CopyTo(pixelBuffer);
    videoRecorder.CommitFrame(pixelBuffer, timestamp);
    Is there no way to add this back to MP4Recorder? Especially since all the necessary code seems to exist in CameraInput.cs
     
  44. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Memory stability and performance. First, I should clarify that it is misleading to say that NatCorder doesn't 'accept' RenderTextures. RenderTextures aren't part of the contract; video frames are. As for RenderTextures, we had a few reasons to remove them:
    - When doing offline recording (recording in a for-loop), using RenderTextures becomes quite a pain because of all the GPU allocations. In most cases, this blows up. Compare with a `byte[]` or `Color32[]`, where the .NET runtime can perform GC under memory pressure.
    - Following from the above point, the memory pressure is hard to track with RenderTextures, and on all platforms except Android we have to perform a readback of pixel data from the RenderTexture before sending it to the encoder anyway. Pushing the readback responsibility out of the `IMediaRecorder` implementations makes things super simple.
    - Following from the above point, we wanted to allow developers to choose their app's performance characteristics during recording. There are two ways to read back pixel data from the GPU: synchronously (using Texture2D.ReadPixels) and asynchronously (using AsyncGPUReadback); see the sketch after this list. The former means a high CPU cost but extremely predictable memory, whereas the latter means practically zero CPU cost but very volatile (GPU and system) memory pressure. So if a developer has a GPU-bound app, they'd want a synchronous readback to prevent memory pressure from exploding, and vice versa. Handling this process inside the recorder meant that the developer couldn't choose; now they can.
    - When recording from a Texture2D, WebCamTexture, NatCam, or NatExtractor, the previous system meant having to upload frame data to a RenderTexture (typically by blitting). This was wasteful, as we were going to perform a readback to get that data anyway.
    - Finally, the actual process of committing a RenderTexture hasn't changed. All we did was shift the readback out of the `IMediaRecorder` implementation, as all the points above show it has significant advantages.
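
    To make the synchronous/asynchronous trade-off above concrete, here is a rough sketch of both paths (the `readbackBuffer`, `renderTexture`, `recorder`, and `clock` names are illustrative; `readbackBuffer` is assumed to be an RGBA32 Texture2D allocated at the recording resolution):
    Code (CSharp):
    // Synchronous readback: predictable memory, but stalls the CPU waiting on the GPU.
    RenderTexture.active = renderTexture;
    readbackBuffer.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    recorder.CommitFrame(readbackBuffer.GetRawTextureData(), clock.Timestamp);

    // Asynchronous readback: practically zero CPU cost, but less predictable memory pressure.
    AsyncGPUReadback.Request(renderTexture, 0, request => {
        recorder.CommitFrame(request.GetData<byte>().ToArray(), clock.Timestamp);
    });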

    Because Unity 2018.3 makes it easy to get pixel data from a RenderTexture. See AsyncGPUReadback.
    While it seems plausible, this is false. On macOS, iOS, Windows, and WebGL the encoder expects a pixel buffer. On Metal specifically (iOS and macOS), you would have to blit the current frame texture to another texture that is backed by a CoreVideo Metal texture cache (so that you can access the pixel data in shared memory). The extra draw call, and the added memory unpredictability when committing a lot of frames, make this a suboptimal choice. We implemented this originally and moved away from it a long time ago. Android could have been the exception, but alas it wasn't, because we would have to keep the committed RenderTexture alive until we knew it had been sent to the encoder (blitted to the encoder surface). This is complicated by Unity's unclear semantics around the lifecycle of RenderTextures obtained from AcquireFrame. What we ended up doing was using PBOs, which are neither better nor worse than what was in place before (performance- and memory-wise), but have the advantage of very clear lifetime semantics that we could build upon.
    Yes, but similarly this complicated piece of code:
    Code (CSharp):
    if (webcamTexture.didUpdateThisFrame) {
        var encoderFrame = videoRecorder.AcquireFrame();
        Graphics.Blit(webcamTexture, encoderFrame);
        videoRecorder.CommitFrame(encoderFrame, timestamp);
    }
    becomes this simple piece of code:
    Code (CSharp):
    if (webcamTexture.didUpdateThisFrame)
        videoRecorder.CommitFrame(webcamTexture.GetPixels32(), timestamp);
    And the simplifications when considering recording from, say, NatExtractor (which, combined with NatCorder, makes a full transcoding stack) are much more significant.
    The goal was to shift this to the client so that they have control over the performance and memory behaviour of the recording process (and also have new ways to squeeze out performance, like multithreading). A simpler thing for you to do is use AsyncGPUReadback:
    Code (CSharp):
    AsyncGPUReadback.Request(renderTexture, 0, request => {
        mediaRecorder.CommitFrame(request.GetData<byte>().ToArray(), clock.Timestamp);
    });
     
    Menion-Leah likes this.
  45. jasminepark

    jasminepark

    Joined:
    Aug 21, 2018
    Posts:
    3
    Hi, I would like to record my screen while masking out certain UI elements (e.g. buttons) on Android. Currently it is recording everything on my screen. Is this possible?
     
  46. HEROTECH70

    HEROTECH70

    Joined:
    May 24, 2017
    Posts:
    74
    As I said before, AsyncGPUReadback is not guaranteed to be implemented: my phone supports Android API level 21,
    but if I try to use it I get an unsupported message in the logs.
     
  47. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You can create a second camera and use layer masks to ensure that the UI elements are only visible to the second camera. Then record the primary camera (which wouldn't be able to see the layered objects).
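    A minimal sketch of that setup; the layer name "OverlayUI" and the camera references are illustrative.
    Code (CSharp):
    // Put the UI you want hidden from the recording on its own layer.
    int uiLayer = LayerMask.NameToLayer("OverlayUI");
    uiCamera.cullingMask = 1 << uiLayer;           // the second camera renders only that layer
    mainCamera.cullingMask &= ~(1 << uiLayer);     // the recorded camera never sees it

    // Then record only the primary camera:
    var cameraInput = new CameraInput(videoRecorder, recordingClock, mainCamera);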
     
  48. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    We have our own implementation of AsyncGPUReadback for OpenGL ES that is almost the same as Unity's. See CameraInput.cs.
     
  49. frimarichard

    frimarichard

    Joined:
    Jul 24, 2017
    Posts:
    31
    Hi, I'm trying to migrate from 1.5.1 to 1.6.0, and I'm hitting an AndroidJavaException when creating an instance of the MP4Recorder class. Any ideas?

    Unity: 2018.3.12f1.
    Device: OnePlus 6, Android version 9.

    Code (CSharp):
    new MP4Recorder(RECORD_RESOLUTION, recordHeight, 30, audioSampleRate, audioChannels, callback)
     
  50. Aidan-Wolf

    Aidan-Wolf

    Joined:
    Jan 6, 2014
    Posts:
    59
    It's the same normal logs:

    NatCorder: MP4 recorder finishing
    NatCorder: Prepared video encoder at resolution 562x1218@30Hz with average bitrate 5909760 and keyframe interval 3s
    NatCorder: Prepared audio encoder for 2 channels at 44100Hz
    NatMic: Microphone Apple: AURemoteIO started recording
    [Technique] World tracking performance is being affected by resource constraints [1]


    I'm really just asking for a straight answer because I'm on a deadline and it's impossible to miss this lag (I'm using sample code). So, is there something wrong with the plugin, or is this a limitation of the phone/OS/etc.? My assumption is that it's not keeping the audioDevice stream open, but reopening it every time and causing a bad IO delay.

    - Blazing fast. NatCorder is heavily optimized for performance, following from design breakthroughs.
    - AR support. NatCorder has full support for Vuforia, ARCore, and ARKit.

    [Technique] World tracking performance is being affected by resource constraints [1]
    [Technique] World tracking performance is being affected by resource constraints [1]
    [Technique] World tracking performance is being affected by resource constraints [1]
     
    Last edited: Jul 5, 2019