
NatCorder - Video Recording API

Discussion in 'Assets and Asset Store' started by Lanre, Nov 18, 2017.

  1. Gango_98_

    Gango_98_

    Joined:
    Aug 10, 2020
    Posts:
    3


    Bump, please. I need help with this.
     
  2. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You are specifying an audio format in the constructor (44100 sample rate, 1 channel count) but you aren't committing any audio. If you don't want to commit audio, remove those arguments from the constructor.
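    For example, a minimal sketch using the constructor shapes that appear later in this thread (the resolution and frame rate here are placeholders):

    Code (CSharp):
    // Video-only: no sample rate or channel count, so no audio is expected
    var videoOnly = new MP4Recorder(1280, 720, 30);

    // Audio + video: only use this form if you will also call recorder.CommitSamples
    var withAudio = new MP4Recorder(1280, 720, 30, 44100, 1);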
     
  3. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Would you suggest using the Unity AudioMixer component for this? It sounds perfect for the job.

    https://docs.unity3d.com/Manual/AudioMixer.html

    I am stumped by how hard it is for us to record both the mic and in-game audio together, as I can't find much reference material online for it.
     
    Last edited: Jun 21, 2021
  4. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I'm not familiar with the AudioMixer API so I don't know if/how you can extract the mixed audio from it. I recommend looking for free/open-source audio/DSP libraries that you can use for mixing.
     
    ROBYER1 likes this.
  5. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    How would you advise we grab the sampleBuffer from the microphone audioDevice when we initialise it, so that we can combine it with the game audio from an AudioSource? Something like this, to combine the byte data of the mic audio with the game audio and ping it back to NatCorder:

    Code (CSharp):
    audioDevice.StartRunning((sampleBuffer, timestamp) => { /* combine with game audio here */ });

    https://answers.unity.com/questions/697734/how-to-mix-audio-tracks-in-unity3d.html
     
  6. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    NatDevice provides the raw PCM data from the microphone, and so does Unity with its `OnAudioFilterRead` callback:
    Code (CSharp):
    audioDevice.StartRunning(OnSampleBuffer);

    // Audio from the microphone
    void OnSampleBuffer (float[] sampleBuffer, long timestamp) {

    }

    // Audio from Unity
    void OnAudioFilterRead (float[] sampleBuffer, int channels) {

    }
    You'll have to mix the audio from both of those sources. You'll also need to make sure that the audio formats match (sample rate and channel count from the mic are the same as Unity's audio engine).

    When you mix the audio from both sources, call `recorder.CommitSamples`.
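    As a rough illustration (not NatCorder API; a naive additive mix that assumes both buffers already share the same sample rate, channel count, and length):

    Code (CSharp):
    // Naive additive mix of two same-format sample buffers.
    // Clamping keeps the result in the valid [-1, 1] range.
    float[] MixBuffers (float[] micSamples, float[] gameSamples) {
        var mixed = new float[micSamples.Length];
        for (var i = 0; i < mixed.Length; i++)
            mixed[i] = Mathf.Clamp(micSamples[i] + gameSamples[i], -1f, 1f);
        return mixed; // pass this to recorder.CommitSamples with a timestamp
    }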
     
    ROBYER1 likes this.
  7. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Thanks for the quick response, this was really helpful. Hopefully we should be able to crack it from here.
     
  8. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Sure thing. There's a lot of stuff to look out for, around synchronized access (both of those methods get invoked on separate threads) and timing (both methods can get called at different times). But if you can navigate these you should be good.
     
    ROBYER1 likes this.
  9. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    In our scenario it is just a video recording of a 1:1 VoIP call in-game, using game audio for the incoming user's voice and the mic for outgoing audio, both to be combined.

    As there is already a lot of latency on the call, the audio matching up perfectly isn't an issue here. I just need a way of combining the mic audio with another audio source, which was way harder to do than I would have liked!
     
    Lanre likes this.
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Are you still using this to record both game and mic sound together? Just trying your sampleBuffer fix on iOS, as we had the same issue.

    I would also like to know how you landed at such a specific buffer size - why 16384?

    Edit: The fix worked only sometimes; other times the buffers would still overflow somehow!

    Posting back here to help others, as this is a bit outside of my comfort zone; I am trying another fix that I thought of first.

    From Lanre on Discord:

    "You'll likely need to rearchitect the MixerDevice to use a new mental model for mixing. Currently, it accumulates buffers from one source (I think from the audio device, can't remember) then performs mixing in the callback of the other source. Mentally, a cleaner approach is to have both sources commit audio samples into circular buffers. Using circular buffers makes it impossible to have a buffer overflow; instead old samples are discarded. Then have a third component whose job it is to dequeue samples from both sources simultaneously, mix them, then push them out.
    You should reimplement this from scratch, only using MixerDevice as a guide."
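    A minimal sketch of that architecture (my own reading of the quote above; `commit` stands in for `recorder.CommitSamples`, and both sources are assumed to share one audio format):

    Code (CSharp):
    using System;

    public sealed class CircularMixer {

        readonly float[] micRing, gameRing;
        readonly object fence = new object();
        readonly Action<float[]> commit;
        int micRead, micCount, gameRead, gameCount;

        public CircularMixer (int capacity, Action<float[]> commit) {
            micRing = new float[capacity];
            gameRing = new float[capacity];
            this.commit = commit;
        }

        // Called from the microphone callback thread
        public void PushMic (float[] samples) { lock (fence) Push(micRing, ref micRead, ref micCount, samples); }

        // Called from Unity's audio thread (e.g. OnAudioFilterRead)
        public void PushGame (float[] samples) { lock (fence) Push(gameRing, ref gameRead, ref gameCount, samples); }

        // The third component: dequeues from both buffers in lockstep,
        // mixes, and pushes the result out. Call this periodically.
        public void Drain () {
            lock (fence) {
                var available = Math.Min(micCount, gameCount);
                if (available == 0) return;
                var mixed = new float[available];
                for (var i = 0; i < available; i++) {
                    var sum = Pop(micRing, ref micRead, ref micCount) + Pop(gameRing, ref gameRead, ref gameCount);
                    mixed[i] = Math.Max(-1f, Math.Min(1f, sum)); // clamp to avoid clipping
                }
                commit(mixed);
            }
        }

        // Writing into a full ring overwrites the oldest sample, so overflow is impossible
        static void Push (float[] ring, ref int read, ref int count, float[] samples) {
            foreach (var s in samples) {
                ring[(read + count) % ring.Length] = s;
                if (count < ring.Length) count++;
                else read = (read + 1) % ring.Length;
            }
        }

        static float Pop (float[] ring, ref int read, ref int count) {
            var value = ring[read];
            read = (read + 1) % ring.Length;
            count--;
            return value;
        }
    }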
     
    Last edited: Jun 22, 2021
    Lanre likes this.
  11. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Looking into similar ring buffer / circular buffer approaches for passing audio, I stumbled across this script in the Agora Unity SDK (a video-call plugin we are using in our app). I wonder if this is a good starting point for a ring-buffer-based MixerDevice rewrite. I had a chop at it so far and got it to pass a buffer from the AudioListener in Unity to the ring buffer, but after a short time it fills the ring buffer, because we aren't dequeueing anything or taking things out of the buffer as they go in.

    I'm really stumped with this, as there is very little documentation on how to do such a thing with audio and ring buffers, but I feel this script is quite close; see the ring buffer file attached.

    Agora Sample Ring Buffer:
    Code (CSharp):
    using System;
    using System.Runtime.InteropServices;
    using System.Threading;
    using UnityEngine;
    using UnityEngine.UI;
    using agora_gaming_rtc;
    using RingBuffer;

    namespace CustomAudioSink
    {
        public class CustomAudioSinkSample : MonoBehaviour
        {
            /*
            [SerializeField] private string APP_ID = "YOUR_APPID";

            [SerializeField] private string TOKEN = "";

            [SerializeField] private string CHANNEL_NAME = "YOUR_CHANNEL_NAME";
            public Text logText;
            //private Logger logger;
            */
            private IRtcEngine mRtcEngine = null;

            private IAudioRawDataManager _audioRawDataManager;

            private int CHANNEL = 1;
            private int SAMPLE_RATE = 44100;
            private int PULL_FREQ_PER_SEC = 100;

            private int count;

            private int writeCount;
            private int readCount;

            private RingBuffer<float> audioBuffer;
            private AudioClip _audioClip;

            private Thread _pullAudioFrameThread;
            private bool _pullAudioFrameThreadSignal = true;

            private bool _startSignal;

            private bool started = false;

            public AudioSource mainAudioSource;

            // Start is called before the first frame update
            void Start()
            {
                //var ifValid = CheckAppId();
                //InitRtcEngine();
                //JoinChannel();

                //var aud = mainAudioSource;

                //if (ifValid)
                //StartPullAudioFrame("externalClip");
            }

            /*
            void Update()
            {
                //PermissionHelper.RequestMicrophontPermission();
            }


            bool CheckAppId()
            {
                //logger = new Logger(logText);
                //return logger.DebugAssert(APP_ID.Length > 10, "Please fill in your appId in Canvas!!!!!");
                return true;
            }

            void InitRtcEngine()
            {
                mRtcEngine = IRtcEngine.GetEngine(APP_ID);
                mRtcEngine.SetExternalAudioSink(true, SAMPLE_RATE, CHANNEL);
                mRtcEngine.SetLogFile("log.txt");
                mRtcEngine.OnJoinChannelSuccess += OnJoinChannelSuccessHandler;
                mRtcEngine.OnLeaveChannel += OnLeaveChannelHandler;
                mRtcEngine.OnWarning += OnSDKWarningHandler;
                mRtcEngine.OnError += OnSDKErrorHandler;
                mRtcEngine.OnConnectionLost += OnConnectionLostHandler;
            }

            void JoinChannel()
            {
                mRtcEngine.JoinChannelByKey(TOKEN, CHANNEL_NAME, "", 0);
            }
            */
            public void StartPullAudioFrame(string clipName)
            {
                started = true;
                _audioRawDataManager = AudioRawDataManager.GetInstance(mRtcEngine);

                var bufferLength = SAMPLE_RATE / PULL_FREQ_PER_SEC * CHANNEL * 1000; // 10-sec-length buffer
                audioBuffer = new RingBuffer<float>(bufferLength);

                _pullAudioFrameThread = new Thread(PullAudioFrameThread);
                _pullAudioFrameThread.Start();

                _audioClip = AudioClip.Create(clipName,
                    SAMPLE_RATE / PULL_FREQ_PER_SEC * CHANNEL, CHANNEL, SAMPLE_RATE, true,
                    OnAudioRead);
                mainAudioSource.clip = _audioClip;
                mainAudioSource.loop = true;
                mainAudioSource.Play();
            }

            void OnApplicationQuit()
            {
                Debug.Log("Clearing redirected audio buffer");
                if (started == true)
                {
                    _pullAudioFrameThreadSignal = false;
                    _pullAudioFrameThread.Abort();
                    audioBuffer.Clear();
                }
            }

            private void OnDestroy()
            {
                Debug.Log("Clearing redirected audio buffer");
                if (started == true)
                {
                    _pullAudioFrameThreadSignal = false;
                    _pullAudioFrameThread.Abort();
                    audioBuffer.Clear();
                }
            }

            /*
            void OnJoinChannelSuccessHandler(string channelName, uint uid, int elapsed)
            {
                //logger.UpdateLog(string.Format("sdk version: {0}", IRtcEngine.GetSdkVersion()));
                //logger.UpdateLog(string.Format("onJoinChannelSuccess channelName: {0}, uid: {1}, elapsed: {2}", channelName,
                //uid, elapsed));
            }

            void OnLeaveChannelHandler(RtcStats stats)
            {
                //logger.UpdateLog("OnLeaveChannelSuccess");
            }

            void OnSDKWarningHandler(int warn, string msg)
            {
                //logger.UpdateLog(string.Format("OnSDKWarning warn: {0}, msg: {1}", warn, msg));
            }

            void OnSDKErrorHandler(int error, string msg)
            {
                //logger.UpdateLog(string.Format("OnSDKError error: {0}, msg: {1}", error, msg));
            }

            void OnConnectionLostHandler()
            {
                //logger.UpdateLog(string.Format("OnConnectionLost "));
            }
            */
            private void PullAudioFrameThread()
            {
                var avsync_type = 0;
                var bytesPerSample = 2;
                var type = AUDIO_FRAME_TYPE.FRAME_TYPE_PCM16;
                var channels = CHANNEL;
                var samples = SAMPLE_RATE / PULL_FREQ_PER_SEC * CHANNEL;
                var samplesPerSec = SAMPLE_RATE;
                var buffer = Marshal.AllocHGlobal(samples * bytesPerSample);
                var freq = 1000 / PULL_FREQ_PER_SEC;

                var tic = new TimeSpan(DateTime.Now.Ticks);

                while (_pullAudioFrameThreadSignal)
                {
                    var toc = new TimeSpan(DateTime.Now.Ticks);
                    if (toc.Subtract(tic).Duration().Milliseconds >= freq)
                    {
                        tic = new TimeSpan(DateTime.Now.Ticks);
                        _audioRawDataManager.PullAudioFrame(buffer, (int)type, samples, bytesPerSample, channels,
                            samplesPerSec, 0, avsync_type);

                        var byteArray = new byte[samples * bytesPerSample];
                        Marshal.Copy(buffer, byteArray, 0, samples * bytesPerSample);

                        var floatArray = ConvertByteToFloat16(byteArray);
                        lock (audioBuffer)
                        {
                            audioBuffer.Put(floatArray);
                        }

                        writeCount += floatArray.Length;
                        count += 1;
                    }

                    if (count == 100)
                    {
                        _startSignal = true;
                    }
                }

                Marshal.FreeHGlobal(buffer);
            }

            private static float[] ConvertByteToFloat16(byte[] byteArray)
            {
                var floatArray = new float[byteArray.Length / 2];
                for (var i = 0; i < floatArray.Length; i++)
                {
                    floatArray[i] = BitConverter.ToInt16(byteArray, i * 2) / 32768f; // -Int16.MinValue
                }

                return floatArray;
            }

            private void OnAudioRead(float[] data)
            {
                if (!_startSignal) return;
                for (var i = 0; i < data.Length; i++)
                {
                    lock (audioBuffer)
                    {
                        data[i] = audioBuffer.Get();
                    }

                    readCount += 1;
                }

                Debug.LogFormat("buffer length remains: {0}", writeCount - readCount);
            }
        }
    }
    My attempt so far:

    Code (CSharp):
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Threading;
    using UnityEngine;
    using RingBuffer;

    public class AudioListenerReader : MonoBehaviour
    {

        public RingBuffer<float> audioBuffer;

        public bool recording = false;

        private int CHANNEL = 1;
        private int SAMPLE_RATE = 44100;
        private int PULL_FREQ_PER_SEC = 100;

        private int count;

        private int writeCount;
        private int readCount;

        private Thread _pullAudioFrameThread;
        private bool _pullAudioFrameThreadSignal = true;
        private bool _startSignal; // fix: this field was missing in my first attempt

        #region General functions

        void OnApplicationQuit()
        {
            Debug.Log("Clearing redirected audio buffer");

            _pullAudioFrameThreadSignal = false;
            _pullAudioFrameThread.Abort();
            audioBuffer.Clear();
        }

        private void OnDestroy()
        {
            Debug.Log("Clearing redirected audio buffer");

            _pullAudioFrameThreadSignal = false;
            _pullAudioFrameThread.Abort();
            audioBuffer.Clear();
        }

        #endregion


        public void startRecording()
        {
            Debug.Log("recording listener");
            audioBuffer = new RingBuffer<float>(4096);
            recording = true;
            _pullAudioFrameThread = new Thread(PullAudioFrameThread);
            _pullAudioFrameThread.Start();
        }

        public void stopRecording()
        {
            recording = false;
            clearBuffer();
        }

        public void clearBuffer()
        {

        }

        private void PullAudioFrameThread()
        {
            // fix: dropped the unused Marshal buffer and locals copied over from the Agora sample
            var samples = SAMPLE_RATE / PULL_FREQ_PER_SEC * CHANNEL;
            var freq = 1000 / PULL_FREQ_PER_SEC;

            var tic = new TimeSpan(DateTime.Now.Ticks);

            while (_pullAudioFrameThreadSignal)
            {
                var toc = new TimeSpan(DateTime.Now.Ticks);
                if (toc.Subtract(tic).Duration().Milliseconds >= freq)
                {
                    tic = new TimeSpan(DateTime.Now.Ticks);

                    // fix: OnAudioRead was being called with no arguments;
                    // drain one frame's worth of samples into a local buffer instead
                    var floatArray = new float[samples];
                    OnAudioRead(floatArray);

                    count += 1;
                }

                if (count == 100)
                {
                    _startSignal = true;
                }
            }
        }

        // Audio from Unity
        void OnAudioFilterRead(float[] sampleBuffer, int channels)
        {
            if (recording == true)
            {
                Debug.Log("Samplebuffer length is: " + sampleBuffer.Length);
                //  Debug.Log("Unity core samplebuffer" + sampleBuffer.Length);
                lock (audioBuffer)
                {
                    audioBuffer.Put(sampleBuffer);
                }
                writeCount += sampleBuffer.Length; // fix: count writes where they actually happen
            }
        }

        private void OnAudioRead(float[] data)
        {
            if (recording == true)
            {
                for (var i = 0; i < data.Length; i++)
                {
                    lock (audioBuffer)
                    {
                        data[i] = audioBuffer.Get();
                    }

                    readCount += 1;
                }

                Debug.LogFormat("buffer length remains: {0}", writeCount - readCount);
            }
        }
    }
     

    Attached Files:

  12. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    270
    @Lanre great plugin, A++. I just wanted to know how/where the video is stored at runtime so I can send the final video to a webserver (NOT STREAMING ;)). I took a look at StopRecording() and it wasn't quite apparent to me. I'm looking to send it over to a webserver after recording, and was looking at where it is stored.

    I read this from a previous post:
    "NatCorder gives you a path to the recorded video and you can take it from there."

    I assume I just use {Path} and it's good to go. Attempting that, MIGHT BE THAT EASY lol
     
    Last edited: Jun 22, 2021
    ROBYER1 likes this.
  13. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Yo! The `FinishWriting` method returns the path to the recording. You can then send the file to your webserver.
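    For example (a sketch, not NatCorder API; the endpoint and form field are placeholders, and `recorder` is assumed to be your active recorder):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Networking;

    public async void StopRecordingAndUpload () {
        var path = await recorder.FinishWriting();
        var form = new WWWForm();
        form.AddBinaryData("video", System.IO.File.ReadAllBytes(path), System.IO.Path.GetFileName(path), "video/mp4");
        using (var request = UnityWebRequest.Post("https://example.com/upload", form)) {
            var operation = request.SendWebRequest();
            while (!operation.isDone)
                await System.Threading.Tasks.Task.Yield();
            Debug.Log(string.IsNullOrEmpty(request.error) ? "Upload complete" : request.error);
        }
    }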
     
    Blarp likes this.
  14. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Hey! I really need help with the implementation of recording on iOS. As soon as the recording starts, the application crashes. Everything works on Android.
     

    Attached Files:

  15. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    The log indicates that you are passing in a wrong value for the keyframe interval:
    Code (CSharp):
    AVVideoCompressionPropertiesKey dictionary must specify a positive value for AVVideoMaxKeyFrameIntervalKey
     
  16. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Thanks for your reply! Is it possible that this code works on android but not on ios?

    Code (CSharp):
    public class ScreenRecorder : MonoBehaviour
    {
        [SerializeField] private Camera _camera;

        private MP4Recorder _mP4Recorder;
        private HEVCRecorder _hEVCRecorder;
        private CameraInput _cameraInput;
        private AudioInput _audioInput;

        public void StartRecord()
        {
            //_mP4Recorder = new MP4Recorder(720, 1280, 0.1f);
            _hEVCRecorder = new HEVCRecorder(720, 1280, 0.1f);
            RealtimeClock clock = new RealtimeClock();
            _cameraInput = new CameraInput(_hEVCRecorder, clock, _camera);
            _audioInput = new AudioInput(_hEVCRecorder, clock, _camera.GetComponent<AudioListener>());
        }

        public async void StopRecord()
        {
            _cameraInput.Dispose();
            _audioInput.Dispose();
            string path = await _hEVCRecorder.FinishWriting();
            NativeGallery.SaveVideoToGallery(path, "Tazovsky Hearts", "Recording");
        }
    }
     
  17. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    What part doesn't work? I would suspect that the NativeGallery.SaveVideoToGallery... call is the main culprit; check that you are using a valid path and have the right permissions to write there on iOS.
     
  18. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    It all worked! I changed the frame rate to 0.5f. I have no idea how that helped or what this parameter is responsible for; the main thing is that it records and saves everything.
     
    ROBYER1 likes this.
  19. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    But, for some reason, the sound is not recorded ... :(
     
  20. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    Are you recording the microphone or in game audio?

    If it is the mic, use something like this:

    Code (CSharp):
    audioDevice = query.currentDevice as AudioDevice;
    audioDevice.sampleRate = AudioSettings.outputSampleRate;
    audioDevice.channelCount = (int)AudioSettings.speakerMode;

    clock = new RealtimeClock();

    audioDevice.StartRunning((sampleBuffer, timestamp) => recorder.CommitSamples(sampleBuffer, clock.timestamp));
     
  21. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Thanks for helping to solve the problem :)
    I am recording audio from the application; I haven't bought NatDevice yet.
     
    ROBYER1 likes this.
  22. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    To record the game sound, link the audioInput to your audio listener in your scene, something like
    Code (CSharp):
    audioInput = new AudioInput(recorder, mainAudioListener);
     
  23. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    The MP4 and HEVC recorders specify the keyframe interval in seconds, whereas the native recorders specify them in frames. So NatCorder converts the value you pass in to frames by multiplying your keyframe interval by your frame rate. The default keyframe interval in the constructor is either 2 or 3. But for some reason, you are passing in 0.1 as your frame rate (which looks erroneous). So:
    Code (CSharp):
    round(3 * 0.1) = round(0.3) = 0
    The native encoder receives 0 for the keyframe interval, throws an exception, and the app crashes. On Android, it works because if the encoder receives 0, it will generate a video with all keyframes, which takes up a lot more storage than a 'normal' video.

    TL;DR: Passing 0.1 as the frame rate is erroneous. Your app is likely running at 30FPS or 60FPS, so pass in 30 or 60.
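    Applied to the constructor in the snippet above, that would look like:

    Code (CSharp):
    // frame rate is in frames per second, not seconds per frame
    _hEVCRecorder = new HEVCRecorder(720, 1280, 30);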
     
  24. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    If you want to record audio, you have to pass a valid audio format (sample rate and channel count) to the constructor. See the docs.
     
  25. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    OMG, thx Lanre, thx ROBYER1. All is working!

    Oh, this is a wonderful feeling when you enter the gallery, and there is a recorded video with sound!
     
  26. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I'm glad you got it working!
     
  27. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Guys, maybe you know how to hide the UI when recording? Or do you have a link to a tutorial? I heard (maybe even in this thread) that a second camera needs to be created. The situation is further complicated by the fact that some of the UI nevertheless needs to be recorded.
     
  28. Gango_98_

    Gango_98_

    Joined:
    Aug 10, 2020
    Posts:
    3
    Thank you, it was my fault
     
    Lanre likes this.
  29. enguerrandDesmet

    enguerrandDesmet

    Joined:
    Mar 30, 2021
    Posts:
    1
    Hello, I'm trying to use NatCorder (version 1.8.0) to re-encode a video extracted frame by frame with Unity's VideoPlayer, in order to add a post-process to this video (via objects in front of a camera and a shader).

    I used the implementation of CameraInput to make a custom input for this VideoPlayer and apply my effects. However, Unity freezes when calling "_recorder.CommitFrame(_pixelBuffer, timestamp);", always at about the 18th frame commit. I can't find out why; it seems to be a native library issue.

    I tried two options: either use both input systems as proposed in CameraInput ("SystemInfo.supportsAsyncGPUReadback ? (ITextureInput) new AsyncTextureInput(_recorder) : new TextureInput(_recorder);"), or redo a method to commit the frame myself in both cases. (I did my tests only on a machine that supports AsyncGPUReadback, which is what I want.)

    Here is my code:

    initialisation part:

    Code (CSharp):
    _clock = new FixedIntervalClock(1 / _videoPlayer.frameRate);
    _recorder = new MP4Recorder((int)_videoPlayer.width, (int)_videoPlayer.height, _videoPlayer.frameRate);
    _input = SystemInfo.supportsAsyncGPUReadback ? (ITextureInput) new AsyncTextureInput(_recorder) : new TextureInput(_recorder);

    var (width, height) = _recorder.frameSize;
    var frameDescriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGB32, 24);
    frameDescriptor.sRGB = true;
    _frameBuffer = RenderTexture.GetTemporary(frameDescriptor);
    _readbackBuffer = SystemInfo.supportsAsyncGPUReadback ? null : new Texture2D(_frameBuffer.width, _frameBuffer.height, TextureFormat.RGBA32, false, false);
    _pixelBuffer = new byte[_frameBuffer.width * _frameBuffer.height * 4];
    Main loop:

    Code (CSharp):
    private IEnumerator OnFrame () {
        var endOfFrame = new WaitForEndOfFrame();
        var wait = new WaitForSeconds(0.5f);
        _frameCount = -1;
        _videoPlayer.frame = -1;

        while ((ulong)++_frameCount < _videoPlayer.frameCount) {
            Debug.Log($"[EncodingHandler] OnFrame: {_frameCount.ToString()}");

            yield return endOfFrame;

            // seek and get videoPlayer texture
            _videoPlayer.StepForward();
            _videoPlayer.Play();

            // wait for _videoPlayer for safety
            // TODO: find a better way to wait for a videoPlayer frame (frameReady/seekCompleted event?)
            yield return wait;

            Texture vpTexture = _videoPlayer.texture;

            Graphics.Blit(vpTexture, _frameBuffer); // copy to frameBuffer

            _camera.targetTexture = _frameBuffer; // resetting targetTexture isn't needed
            _camera.Render(); // render camera overlay objects

            // Render post-processing effect (essentially a blit with some materials)
            ApplyColorEffect(_frameBuffer);

            _input?.CommitFrame(_frameBuffer, _clock?.timestamp ?? 0L); // commit frameBuffer
        }

        StopRecording();
    }
    My alternative method for committing a frame:

    Code (CSharp):
    // CommitFrame(_frameBuffer, _clock.timestamp);
    public unsafe void CommitFrame (Texture frameBuffer, long timestamp) {
        if (SystemInfo.supportsAsyncGPUReadback)
        {
            // AsyncGPUReadback.Request(_frameBuffer, 0, request => _recorder?.CommitFrame(
            //     NativeArrayUnsafeUtility.GetUnsafeBufferPointerWithoutChecks(request.GetData<byte>()),
            //     timestamp
            // ));

            AsyncGPUReadback.Request(frameBuffer, 0, request =>
            {
                if (_pixelBuffer != null)
                {
                    request.GetData<byte>().CopyTo(_pixelBuffer);
                    _recorder.CommitFrame(_pixelBuffer, timestamp);
                }
            });
        }
        else
        {
            // fallback if AsyncGPUReadback isn't supported
            var prevActive = RenderTexture.active;
            RenderTexture.active = _frameBuffer;
            _readbackBuffer.ReadPixels(new Rect(0, 0, _frameBuffer.width, _frameBuffer.height), 0, 0, false);
            _readbackBuffer.GetRawTextureData<byte>().CopyTo(_pixelBuffer);
            _recorder.CommitFrame(_pixelBuffer, timestamp);
            RenderTexture.active = prevActive;
        }
    }
    If you need more details, don't hesitate to ask. Hopefully someone can find my problem. If you have any tips, or a better or more efficient way to do this process, I welcome any advice (I am starting out with NatCorder and Unity).

    Kind regards.
     
  30. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    270
    EDIT: Figured it out. I had to use the same clock in this method and in the StartRecording method; if I make two clocks, it obviously won't work. So I just made the clock a public variable and both methods pull from that.

    In ReplayCam, how do you switch to a different camera mid-recording?

    I have two camera angles, and I want to record with the first one and then switch over to the other while it's recording.

    In the middle of recording, I dispose and add the new camera to the input, but there is a noticeable number of frames missing in between, as if there was a delay in this action.

    Code (CSharp):
    public void CameraIn()
    {
        cameraInput.Dispose();
        var clock = new RealtimeClock();
        cameraInput = new CameraInput(recorder, clock, theCamera);
    }
     
    Last edited: Jun 24, 2021
  31. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Create a second camera that can only see your UI canvas (render mode: Screen Space - Camera). Then only record the main camera.
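    A rough sketch of that setup (assuming the UI you want hidden lives on Unity's built-in "UI" layer; `mainCamera`, `uiCamera`, and `hudCanvas` are placeholders):

    Code (CSharp):
    uiCamera.cullingMask = LayerMask.GetMask("UI");      // UI camera sees only the UI layer
    mainCamera.cullingMask &= ~LayerMask.GetMask("UI");  // main camera ignores it
    hudCanvas.renderMode = RenderMode.ScreenSpaceCamera; // canvas is rendered by the UI camera
    hudCanvas.worldCamera = uiCamera;

    // Record only the main camera, so that UI never appears in the video
    cameraInput = new CameraInput(recorder, clock, mainCamera);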
     
  32. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Nothing looks out of place in your code. Are you recording on Windows? Also, your alternate code effectively does the same thing as the main code. NatCorder 1.8 simply shifted the sync and async readbacks into their own classes for better encapsulation; they're functionally equivalent.

    For testing, I recommend sticking to the normal `TextureInput` which does synchronous readbacks. It's gonna have worse performance, but it's better for debugging. Does it freeze with the `TextureInput`?
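    For example, temporarily replace the ternary in your initialisation with:

    Code (CSharp):
    _input = new TextureInput(_recorder); // synchronous readbacks, easier to debug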
     
  33. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Dispose and recreate the camera input with your new game camera, but use the same clock as before. Your platform is more forgiving; other platforms will throw an exception or crash. Creating a new clock means that the first frame from the new camera gets committed with timestamp zero, but you don't want that. You want time to continue as normal.
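    A sketch based on the snippet in your post (`theOtherCamera` is a placeholder for your second camera):

    Code (CSharp):
    RealtimeClock clock; // one clock for the whole recording session

    public void StartRecording ()
    {
        clock = new RealtimeClock();
        cameraInput = new CameraInput(recorder, clock, theCamera);
    }

    public void CameraIn ()
    {
        cameraInput.Dispose();
        // Reuse the session clock so timestamps keep advancing
        cameraInput = new CameraInput(recorder, clock, theOtherCamera);
    }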
     
  34. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Hello again!
    Your help is very much needed. When recording a video with sound on Android everything is fine, but when recording the same video on iOS, the sound is greatly accelerated.
    Code (CSharp):
    public void StartRecord()
    {
        _mP4Recorder = new MP4Recorder(720, 1280, 30, 48000, 2);
        RealtimeClock clock = new RealtimeClock();
        _cameraInput = new CameraInput(_mP4Recorder, clock, _camera);
        _audioInput = new AudioInput(_mP4Recorder, _camera.GetComponent<AudioListener>());
    }

    public async void StopRecord()
    {
        _cameraInput.Dispose();
        _audioInput.Dispose();
        string path = await _mP4Recorder.FinishWriting();
        NativeGallery.SaveVideoToGallery(path, "Tazovsky Hearts", "Recording");
    }
     
  35. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You are hard-coding the audio format (sample rate and channel count). When recording audio from Unity (like you are because you are using an `AudioInput`), you must use Unity's audio engine format.
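    Concretely, following the pattern used elsewhere in this thread:

    Code (CSharp):
    var sampleRate = AudioSettings.outputSampleRate;   // Unity's engine sample rate
    var channelCount = (int)AudioSettings.speakerMode; // Unity's engine channel count
    _mP4Recorder = new MP4Recorder(720, 1280, 30, sampleRate, channelCount);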
     
  36. FenrirGameStudio

    FenrirGameStudio

    Joined:
    Dec 4, 2017
    Posts:
    7
    Hey guys,
    I've spent 3 days trying to implement my own solution for recording video using:
    • The Update method
    • The LateUpdate method
    • A Coroutine
    • OnPostRender
    • Making the methods Async
    • Threading
    • Callbacks from the ScriptableRenderPipeline
    • AsyncGPUReadback
    • Compute shaders with compute buffers
    And many different ways to get the pixels from a screenshot. However, I've gone from an average of 250 FPS to at most 15 while exporting everything as a video.

    I guess there's some sort of magic to your plugin that I just haven't realized, or maybe I'm doing something wrong in my code.

    In any case, will this plugin help me keep a frame rate of at least 60 FPS while rendering out the videos?

    Best regards
    Kenneth 'Light' Berle
     
  37. Dreamport_Developer

    Dreamport_Developer

    Joined:
    Aug 21, 2018
    Posts:
    9
    Hello again everyone!

    So I have been facing the problem of combining sound from the microphone and from the game. After reading the last few pages, I realized that the problem is pressing, and perhaps even solvable. But how to do it (using only NatCorder and NatDevice) I still don't get...

    Is there some simple and straightforward example where both the sound from the game and the sound from the microphone are recorded with MP4Recorder?
     
  38. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    That's a pretty steep drop. Are you using NatCorder for recording, or are you implementing your own recording solution from scratch? What device, OS version, and graphics API are you running on?

    It's impossible to make any such guarantees without knowing the specific rendering behaviour of your app. Have you profiled the app's CPU and GPU frame time to see where all of the rendering time is being spent? NatCorder has a comprehensive list of performance best practices, but the most important factor is your app itself, without any recording.
     
  39. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Combining mic and game audio isn't supported by either NatCorder or NatDevice. It turns out to be quite complicated, and because I am not an audio engineer, I have not been able to write a proper implementation.
     
  40. GavinChen

    GavinChen

    Joined:
    Aug 14, 2018
    Posts:
    7
    I am using the package in a build on Windows 10 x64 21H1.
    When recording MP4 it crashes.
    Can you help me fix the bug?
     

    Attached Files:

  41. GavinChen

    GavinChen

    Joined:
    Aug 14, 2018
    Posts:
    7
    When I click the button, it calls the function "StartRecording()" and crashes.

    Code (CSharp):
    void StartRecording()
    {
    #if UNITY_EDITOR
        return;
    #endif

        try
        {
            string _name = StageManager.GetStage<RecordModeStage>().vedioFileName + DateTime.Now.ToString("_MM_dd_yyyy HH_mm_ss") + ".mp4";
            OutputVideoPath = Path.Combine(GetFolderPath(), _name);

            // Start recording
            var frameRate = 30;
            var sampleRate = AudioSettings.outputSampleRate;
            var channelCount = (int)AudioSettings.speakerMode;
            Vector2Int _vec = StageManager.GetStage<RootStage>().GetScreenSize();
            var clock = new RealtimeClock();
            recorder = new MP4Recorder(_vec.x, _vec.y, frameRate, sampleRate, channelCount);
            // Create recording inputs
            cameraInput = new CameraInput(recorder, clock, Camera.main);
            audioInput = new AudioInput(recorder, clock, Camera.main.GetComponent<AudioListener>());
        }
        catch (Exception exception)
        {
            UIManager.GetInstance().CreateScrollableMsgBox(exception.Message);
        }
    }
     
  42. GavinChen

    GavinChen

    Joined:
    Aug 14, 2018
    Posts:
    7

    There is a situation: on the same Windows 10 computer, I created two local users (one with a Chinese name and the other with an English name).
    The user with the Chinese name crashes when recording.
    The user with the English name records normally.

    Is there any way to change the temp MP4 path?
     
  43. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Hey there, can you share your recording code? Are you able to reproduce this crash in the ReplayCam example? The stack trace shows a call to `FinishWriting` while creating the recorder, which is incorrect.
     
  44. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I see, it looks like the recording path is the cause. You can set the recording path by modifying the code. Open MP4Recorder.cs and find the line that says `Utility.GetPath`. I will add a fix for this in the next release.
     
  45. GavinChen

    GavinChen

    Joined:
    Aug 14, 2018
    Posts:
    7
    I have three Windows x64 computers having problems with recording.
    Two of them crash outright; for the other, the cause is unknown.

    I modified "MP4Recorder.cs" (as attached), and the crash problem is temporarily solved.

    The other machine triggers a catch when recording, and the error message is just "NatCorder".
    How can I fix this problem?
     

    Attached Files:

    • aaa.png
  46. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You will need to specify a full path if you are providing the recording path to the recorder. You cannot just provide a file name. Can you provide stack traces from the crashes?
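    For example (a sketch; the directory and file name are placeholders):

    Code (CSharp):
    var path = System.IO.Path.Combine(Application.persistentDataPath, "recording.mp4"); // a full path, not just a file name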
     
  47. VladK02

    VladK02

    Joined:
    May 11, 2021
    Posts:
    11
    Hello, thank you for your very useful asset. I am having some problems with the recording procedure: I am trying to start and stop recording when the user presses the corresponding buttons. On Windows, after pressing the stop button, the Unity editor crashes completely. Could you please tell me why this happens?
    I have attached my code.

    Also, I have tried the ReplayCam example on my Android device, but it seems that the sound does not record (I have turned it on). Are there any special settings for sound recording on Android devices?
     

    Attached Files:

  48. GavinChen

    GavinChen

    Joined:
    Aug 14, 2018
    Posts:
    7
    I have provided them in previous posts #3842, #3843, #3844, and #3847.
     
  49. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Hey there, can you share the crash logs from Windows? This seems to be the second report of crashing on Windows that I've gotten with the 1.8.1 update. Does the ReplayCam example also crash?
    Make sure that you have enabled microphone permissions for your app.
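    On Android, you can check and request the permission at runtime with Unity's built-in API (a sketch):

    Code (CSharp):
    using UnityEngine.Android;

    if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        Permission.RequestUserPermission(Permission.Microphone);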
     
  50. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I'll try to repro this and get back to you.