NatDevice - Media Device API

Discussion in 'Assets and Asset Store' started by Lanre, Dec 17, 2015.

Should we add exposure controls in v1.3? This means dropping support for iOS 7

Poll closed Jun 10, 2016.
  1. Yes

    9 vote(s)
    75.0%
  2. No

    3 vote(s)
    25.0%
  1. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Don't I need to have the mentioned mobile phone in front of me?
    Do you have a tutorial link for this?
    It is very important for us to solve this problem.
     
  2. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Yes, you (or anyone who collects the logs) will need the physical device. I don't have a tutorial link; follow the official Android documentation that I linked.
     
  3. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Hello,
    Log file added.
    Only this phone has the problem; other phones with the same resolution work properly.
     

    Attached Files:

  4. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    You are right that this is device-specific. The logs show that the preview resolution for both cameras is 1280x720, but for some reason the preview data is still stretched. A bug like this is practically impossible to fix, given that the numbers (resolutions) are correct but the preview data from the native camera framework seems to be skewed. There is no way for NatDevice to detect it and correct it.
     
  5. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Is it possible that this is a problem on more devices?
    We have purchased this package and expect it to work properly; this is not good! Do you really not know of a way to solve this problem?
     
  6. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    I haven't gotten any reports of an issue like this over the past 3+ years, so the overwhelming chances are that it is specific to that device. And as I mentioned earlier, the preview resolution reported by the camera is correct. In that case, there is literally no way of detecting that the preview is stretched unless you look at it, so there isn't a fix I can roll out to address this issue.
     
  7. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39

    Thanks. Let me know if you find a way.
    My email: mahdi.bazei@gmail.com
     
    Lanre likes this.
  8. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Hello,
    I use the NatDevice and NatCorder plugins in the same project.
    Both contain a duplicate file called NatRender.aar, and I have to delete one of them to get the Android build to succeed.
    If I delete NatRender.aar in the NatDevice folder, the camera will not work.
    If I delete NatRender.aar in the NatCorder folder, recording will not work.
    The two NatRender.aar files also have different sizes.
    Please help me, thanks.
     
  9. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Delete both NatCorder and NatDevice from your project, then import both of them from the Asset Store. The latest versions (NatCorder 1.7.3, NatDevice 1.0.2) implicitly resolve the clashes by using a shared folder structure.
     
  10. Deleted User

    Deleted User

    Guest

    Can't iterate through the list of devices; any idea why? Is enumeration not implemented?

    Capture d’écran 2020-08-24 à 14.54.11.png
     
  11. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    This feature is coming in the next update. For now, you have to use `query.devices`.
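    As a quick sketch, here's one way to iterate the array directly (using the `MediaDeviceQuery` pattern that appears later in this thread; adjust the criteria to whatever you need):
    Code (CSharp):
    using NatSuite.Devices;
    using UnityEngine;

    public class ListDevices : MonoBehaviour {
        void Start () {
            // Query audio devices; swap the criteria for camera devices as needed.
            var query = new MediaDeviceQuery(MediaDeviceQuery.Criteria.AudioDevice);
            // Iterate the backing array instead of the query object itself.
            foreach (var device in query.devices)
                Debug.Log($"Found device: {device}");
        }
    }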
     
  12. AdminOh

    AdminOh

    Joined:
    Feb 11, 2016
    Posts:
    23
    Hello @Lanre

    I have sent you multiple emails but I could not get an answer from you. Could you give us an update? Thanks.
     
  13. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Hey there, sorry for the late response. I just responded.
     
  14. GilbertLau

    GilbertLau

    Joined:
    Dec 3, 2017
    Posts:
    26
    Hi, I got the following errors when running on Android. I'm not doing anything special, just `texture2d = camdevice.startrunning();`. Any hint on what could cause this?

    upload_2020-8-28_11-54-27.png
     
  15. morph_jhKim

    morph_jhKim

    Joined:
    Aug 28, 2020
    Posts:
    4
    When I use MixerDevice I get the following error.
    It only occurs on iOS, not on Android.


    NatDevice Error: Sample buffer delegate raised exception: System.ArgumentException: Destination array was not long enough. Check destIndex and length, and the array's lower bounds
    at System.Array.Copy (System.Array sourceArray, System.Int32 sourceIndex, System.Array destinationArray, System.Int32 destinationIndex, System.Int32 length) [0x000da] in <437ba245d8404784b9fbab9b439ac908>:0
    at System.Collections.Generic.List`1[T].CopyTo (System.Int32 index, T[] array, System.Int32 arrayIndex, System.Int32 count) [0x00013] in <437ba245d8404784b9fbab9b439ac908>:0
    at NatSuite.Devices.MixerDevice+<>c__DisplayClass10_0.<StartRunning>b__1 (System.Single[] sampleBuffer, System.Int64 timestamp) [0x0001e] in C:\Users\xxx\Desktop\APPTest\NatCorderDevice\Assets\NatDevice\Plugins\Managed\MixerDevice.cs:75
    at NatSuite.Devices.Internal.NativeAudioDevice+<>c__DisplayClass14_0.<StartRunning>b__0 (System.Single[] sampleBuffer, System.Int64 timestamp) [0x00002] in C:\Users\xxx\Desktop\APPTest\NatCorderDevice\Assets\NatDevice\Plugins\Managed\Internal\NativeAudioDevice.cs:54
    UnityEngine.Debug:LogError(Object)
    NatSuite.Devices.Internal.<>c__DisplayClass14_0:<StartRunning>b__0(Single[], Int64) (at Assets/NatDevice/Plugins/Managed/Internal/NativeAudioDevice.cs:55)
    NatSuite.Devices.Internal.NativeAudioDevice:OnSampleBuffer(IntPtr, IntPtr, Int32, Int64) (at Assets/NatDevice/Plugins/Managed/Internal/NativeAudioDevice.cs:76)
     
  16. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    It seems that the native image provided by the camera is null. Can you capture the full, unfiltered logs from logcat in a .txt attachment? I'll need to see everything from app start to when this error pops up.
     
  17. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    It looks like you are using an outdated version of NatDevice. Upgrade to the latest version on the Asset Store.

    EDIT: Delete NatDevice in your project before importing the update.
     
  18. GilbertLau

    GilbertLau

    Joined:
    Dec 3, 2017
    Posts:
    26

    Here it is. Thanks in advance. It is hurting my performance on a low-end machine.
     

    Attached Files:

    • log.txt
      File size:
      99.8 KB
      Views:
      295
  19. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    What device does this happen on? The logs mention a USB camera. NatDevice does not support USB OTG devices. It only supports built-in cameras.
     
  20. GilbertLau

    GilbertLau

    Joined:
    Dec 3, 2017
    Posts:
    26

    I am running it on an RK3399 development board. Yes, the camera is connected via USB, and in fact I can get the camera images without issue, but it keeps throwing this exception. Would you mind adding a null check for this? It would be very helpful.
     
  21. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    There isn't a need for a null check. The implementation is catching the exception and logging it to the console. It is an explicit exception that isn't supposed to happen, so silencing it with a null check would be dangerous.
     
  22. pzo

    pzo

    Joined:
    Aug 27, 2020
    Posts:
    3
    Hi, I'm testing a few technologies (Unity, UE4, Qt) for my project and wondering if Unity + NatDevice would be a good fit.
    I have read this thread (the first 10 and last 10 pages), but I still have a few questions before committing to Unity and buying the asset.

    Overview of my use case: I'm doing motion tracking and need a high frame rate stream for image processing (OpenCV) with the lowest latency possible.

    1) Does NatDevice still include all source code? I know that was the case with NatCam, but I want to be sure NatDevice provides it as well (on the Asset Store, the package content shows only binary blobs [libNatDevice.a, NatDevice.aar]). Most likely I would need to tweak and/or add some extra functionality not currently available.

    2) Does NatDevice allow setting the frame rate to e.g. 120+ fps? I don't need to render anything at 120 fps on screen, but I definitely need to process natively at around ~120 fps to reduce latency.

    3) Does NatDevice support a depth camera stream and/or depth in PhotoCapture?

    4) Does `IAudioDevice` allow choosing a built-in microphone on iOS (iPhones have 3 microphones: bottom, front, back)?
    On iOS those are iterated and chosen via data sources:
    `var dataSources: [AVAudioSessionDataSourceDescription]? { get }` (AVAudioSessionPortDescription)

    5) If 4) is true, does IAudioDevice allow choosing the polar pattern of the microphone? (I guess probably not.)
    `var supportedPolarPatterns: [AVAudioSession.PolarPattern]?` (AVAudioSessionDataSourceDescription)

    6) Is there a native delegate, similar to NatCam's, for processing audio in native code (again, for reduced latency)?
    Basically the native equivalent of the following:
    `delegate void SampleBufferDelegate (float[] sampleBuffer, long timestamp);`
     
  23. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    1) The native sources are not available to developers without a separate (paid) agreement. This is the case with both NatDevice and NatCorder.
    2) We don't advertise this, but at least on iOS you can set the preview to 120 FPS (and I believe at least one dev has hit 240 FPS). It's not a priority for 'normal' uses of the API, so I haven't had reason to focus on it.
    3) No depth camera, but it does support photo capture.
    4) No, this isn't offered. This is what we expose as audio devices.
    5) By extension of the above, nope.
    6) Every API in the NatSuite framework has a single C header (in this case, NatDevice.h) that all platform implementations must conform to. All the .NET front-ends simply expose a native bridge with syntactic C# 7 sugar. The native headers are always platform-agnostic, exposing opaque pointers and having no dependencies beyond one or two standard C headers (think `stdint.h` and `stdbool.h`).

    That being said, you shouldn't need to actually use the header. For audio, all the C# bridge does is marshal the audio samples to .NET and send them to the client-provided delegate:
    Code (CSharp):
    [MonoPInvokeCallback(typeof(Bridge.SampleBufferHandler))]
    private static void OnSampleBuffer (IntPtr context, IntPtr sampleBuffer, int sampleCount, long timestamp) {
        var samples = new float[sampleCount];
        Marshal.Copy(sampleBuffer, samples, 0, sampleCount);
        (((GCHandle)context).Target as Action<float[], long>)(samples, timestamp);
    }
    You can hack this code to simply trampoline right back into your own native code, or do whatever else you choose.
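    For instance, a rough sketch of such a trampoline (where `mynativelib` and `MyNativeOnSamples` are placeholders for your own native library and function) might look like this:
    Code (CSharp):
    // Same file and attributes as above; requires System.Runtime.InteropServices.
    // Use "__Internal" as the library name on iOS.
    [DllImport("mynativelib")]
    private static extern void MyNativeOnSamples (IntPtr sampleBuffer, int sampleCount, long timestamp);

    [MonoPInvokeCallback(typeof(Bridge.SampleBufferHandler))]
    private static void OnSampleBuffer (IntPtr context, IntPtr sampleBuffer, int sampleCount, long timestamp) {
        // Trampoline: forward the raw native buffer with no Marshal.Copy and no managed allocation.
        MyNativeOnSamples(sampleBuffer, sampleCount, timestamp);
    }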
     
  24. morph_jhKim

    morph_jhKim

    Joined:
    Aug 28, 2020
    Posts:
    4
    Thanks for the reply, but your answer didn't solve the problem.
    I am currently using the latest version of NatDevice (1.0.2).
    This error only occurs when using MixerDevice on an iPhone; the error is as follows:

    NatDevice Error: Sample buffer delegate raised exception: System.ArgumentException: Destination array was not long enough. Check destIndex and length, and the array's lower bounds
    at System.Array.Copy (System.Array sourceArray, System.Int32 sourceIndex, System.Array destinationArray, System.Int32 destinationIndex, System.Int32 length) [0x00000] in <00000000000000000000000000000000>:0
    at NatSuite.Devices.MixerDevice+<>c__DisplayClass10_0.<StartRunning>b__1 (System.Single[] sampleBuffer, System.Int64 timestamp) [0x00000] in <00000000000000000000000000000000>:0
    at NatSuite.Devices.SampleBufferDelegate.Invoke (System.Single[] sampleBuffer, System.Int64 timestamp) [0x00000] in <00000000000000000000000000000000>:0
    at NatSuite.Devices.Internal.NativeAudioDevice+<>c__DisplayClass15_0.<StartRunning>b__0 (System.Single[] sampleBuffer, System.Int64 timestamp) [0x00000] in <00000000000000000000000000000000>:0
    at System.Action`2[T1,T2].Invoke (T1 arg1, T2 arg2) [0x00000] in <00000000000000000000000000000000>:0
    at NatSuite.Devices.Internal.NativeAudioDevice.OnSampleBuffer (System.IntPtr context, System.IntPtr sampleBuffer, System.Int32 sampleCount, System.Int64 timestamp) [0x00000] in <00000000000000000000000000000000>:0
    System.Action`2:Invoke(T1, T2)
    NatSuite.Devices.Internal.NativeAudioDevice:OnSampleBuffer(IntPtr, IntPtr, Int32, Int64)

    The code I used is below.
    Code (CSharp):
    recordingClock = new RealtimeClock();
    videoRecorder = new MP4Recorder(
        previewCamera.targetTexture.width,
        previewCamera.targetTexture.height,
        30,
        AudioSettings.outputSampleRate,
        (int)AudioSettings.speakerMode,
        (int)(960 * 540 * 11.4f),
        keyFrameInterval
    );

    var deviceQuery = new MediaDeviceQuery(MediaDeviceQuery.Criteria.AudioDevice);
    micAudioDevice = deviceQuery.currentDevice as AudioDevice;

    var audioListener = ARCameraManager.Instance.ARCamera.GetComponent<AudioListener>();
    mixerAudioDevice = new MixerDevice(micAudioDevice, audioListener);

    mixerAudioDevice.StartRunning((sampleBuffer, timestamp) => videoRecorder.CommitSamples(sampleBuffer, recordingClock.timestamp));
     
  25. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    In that case, you can manually modify the MixerDevice implementation to resolve this. Open MixerDevice.cs in NatSuite > Plugins > Managed > Devices and in the StartRunning method, increase the size of the `copyBuffer` array to 16384.
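    The change is roughly this (the exact declaration in your copy of MixerDevice.cs may look slightly different):
    Code (CSharp):
    // Inside MixerDevice.StartRunning: enlarge the staging buffer so that
    // larger iOS sample buffers fit. 16384 floats gives plenty of headroom.
    var copyBuffer = new float[16384];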
     
    morph_jhKim likes this.
  26. morph_jhKim

    morph_jhKim

    Joined:
    Aug 28, 2020
    Posts:
    4

    I solved the problem the way you suggested. Thank you for your help.
     
    Lanre likes this.
  27. Jove25

    Jove25

    Joined:
    Mar 8, 2019
    Posts:
    12
    Hello Lanre!

    Could you add a property/function to NatSuite to get the brightness value from the camera? I can send you native code that implements this.
    Or can I somehow get access to the NatSuite API / inherit your camera classes and add it myself (in my native plugin)?

    Thanks.
     
    Last edited: Sep 1, 2020
  28. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Hi there. What exactly are you referring to by the brightness value? If you are referring to the EXIF metadata, then this isn't supported and won't be added because as far as I can tell, Android doesn't expose similar data. For camera features in NatDevice, we require feature parity between iOS and Android.
     
  29. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Does NatDevice not work in Unity editor mode?
     
  30. pzo

    pzo

    Joined:
    Aug 27, 2020
    Posts:
    3
    Thanks for the extensive reply. Do you mind sharing the price for the NatDevice source code? Or at least a ballpark figure that an indie dev can expect if they need access to the source.
     
  31. Jove25

    Jove25

    Joined:
    Mar 8, 2019
    Posts:
    12
    I need to get information about the lighting level on iOS.

    On Android, I get this information from a light sensor (via my own Java plugin). On iOS, this information can be obtained using IOKit.framework. Although it is a public framework, Apple discourages developers from using it, and any apps using it will be rejected from the App Store. So I can't use IOKit.framework.

    The alternatives are:
    1. Get the exposure (both iOS and Android).
    2. Get the kCGImagePropertyExifBrightnessValue key from the EXIF dictionary (iOS only).

    In both cases, I cannot get this information, because the camera is already initialized by NatSuite, and when I try a second initialization in my own plugin, NatSuite stops working.
    If parity between iOS and Android is a prerequisite, please add the current exposure to NatSuite.
     
  32. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    NatDevice works in the editor. On standalone platforms, NatDevice will fall back to WebCamTexture for camera support. But it has native microphone support like other platforms.
     
  33. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    I'll send you a PM.
     
  34. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Getting the exposure time is also not feasible. On Android, the key is optional and only supported on cameras with full camera2 support, so you can't even expect it to be present on all devices your app might run on.

    I'm not sure what exactly you are trying to compute, but a much easier thing to do would be to compute a reduction on the preview frame itself. Taking the mean of pixel intensities would be a good starting point; from here you can build more involved reductions based on the frame histograms.
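    As a minimal CPU-side sketch of that kind of reduction (it assumes the preview texture you get back is readable; a compute shader or AsyncGPUReadback version would be much cheaper per frame):
    Code (CSharp):
    using UnityEngine;

    static class PreviewMetrics {
        // Average luma of the current preview frame, normalized to [0, 1].
        // `previewTexture` is the readable Texture2D you get from the camera device.
        public static float MeanIntensity (Texture2D previewTexture) {
            var pixels = previewTexture.GetPixels32();
            double sum = 0;
            foreach (var p in pixels)
                sum += 0.299 * p.r + 0.587 * p.g + 0.114 * p.b; // Rec. 601 luma
            return (float)(sum / (pixels.Length * 255.0));
        }
    }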
     
  35. Jove25

    Jove25

    Joined:
    Mar 8, 2019
    Posts:
    12
    It's okay if some (probably old) Android phones don't have this information; you could return null in such cases.
    In that case, I can use data from the light sensor on Android.

    I need real-time lighting information. Calculating it for each frame takes a lot of time and may give an incorrect result.
     
  36. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    I don't think this feature fits well with NatDevice. The camera API isn't designed to provide analytic information about incoming frames; as such, it would be the kind of feature only you and perhaps an incredibly small number of other developers might find useful.

    You can use a compute shader to calculate a reduction on the camera frame. It is going to be much faster than writing a for-loop, and it will always give you correct results (after all, the reduction is computed directly on what the camera sees at that very moment in time). I won't be building this feature into NatDevice because it is out of scope.
     
  37. Jove25

    Jove25

    Joined:
    Mar 8, 2019
    Posts:
    12
    This will not give me the correct result, because the brightness of the picture is not equal to the lighting level.
    In bright light, an image with a black background will give a low brightness value, while in low light with a white background it will show a high brightness.

    There are only two ways to get this data correctly: from the light sensor for the screen (on the front of the phone) or from the light sensor of the main camera. That is what they are for: if it were possible to calculate the lighting level algorithmically, they would not be needed.
    The first cannot be used on iOS; the second conflicts with NatSuite's use of the camera.
     
  38. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    Hi

    We’re switching from NatMic to NatDevice because we had an occasional problem with audio recording not starting.

    We're encountering a few problems though. The one we've been able to isolate so far is that AVAudioIONode.isVoiceProcessingEnabled is being called, which is an iOS 13 API, even on devices running other iOS versions.
     
  39. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    Android doesn't expose any brightness level information. What it does expose is the sensor's exposure time, which varies with auto exposure and other factors.

    It looks like your best bet is to write a native plugin that fetches the brightness level as you need it. On Android, it is apparently exposed as a separate sensor, not in any way tied to the camera. As far as NatDevice is concerned, this is out of scope (and quite the undertaking) so I won't be exposing frame metadata.
     
  40. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    The next update fixes this issue. I'll be submitting the update to Unity this weekend.
     
  41. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    Thanks.
    Another issue we now have is with AirPods. We get this exception, even though we are not trying to specify the sample rate:

    Uncaught exception: com.apple.coreaudio.avfaudio: required condition is false: format.sampleRate == hwFormat.sampleRate
     
  42. Jove25

    Jove25

    Joined:
    Mar 8, 2019
    Posts:
    12
    Lanre, please read my messages above carefully, so as not to give more silly advice.
    NatCam blocks access to the camera API, so I cannot write a native camera plugin that works alongside it.
    For Android there is a workaround: use information from the screen's light sensor (I have already developed this plugin). There is no such possibility on iOS.
    The only options left for me are to stop using NatCam, or to decompile and modify it myself.
     
  43. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    I got your email, and I'll send the build later today. I'm not able to reproduce this error with AirPods Pro. Are you using the latest version of NatDevice from the Asset Store?
     
  44. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    My point is exactly this: on iOS, you have to write a native plugin that fully substitutes for NatDevice. On Android, the light sensor and the camera are completely independent, so NatDevice will not be affected.

    You should not (try to) decompile NatDevice: it's a violation of the license, and it's likely not worth your time. As far as NatDevice is concerned, reporting light sensor data is out of scope: it deviates from the NatDevice specification, and it lacks parity on Android.
     
  45. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    Thanks.
    Yes, we are. We've also noticed that our code for detecting the headset/AirPods being disconnected (we pause the game when that happens) is being triggered incorrectly (when no disconnection has occurred), so something seems to have changed.
    One slightly unusual thing we do is set the AVAudioSession category to AVAudioSessionCategoryPlayback.
     
  46. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    NatDevice requires `AVAudioSessionCategoryPlayAndRecord`, and sets this category every time a `MediaDeviceQuery` is created from Unity.

    You're probably disconnecting the AirPods when you set the category, by not specifying the audio options when updating the AVAudioSession. NatDevice sets the following:
    Code (ObjectiveC):
    const NSUInteger AUDIO_OPTIONS = AVAudioSessionCategoryOptionAllowBluetoothA2DP | AVAudioSessionCategoryOptionAllowBluetooth | AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionMixWithOthers;
    I don't recommend changing the category unless it is absolutely crucial for your app to function, keeping in mind that it might mess with NatDevice's audio recording.
     
  47. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    That's fine. We were setting AVAudioSessionCategoryPlayback in a subclass of UnityAppController so that the game audio experience was consistent from the beginning. We can switch to AVAudioSessionCategoryPlayAndRecord.

    This feels more concerning. Our app is a music game, and it doesn't make sense for its audio to be mixed with other apps. But as you highlight, we weren't setting these options at all, and I don't know what the default values are on iOS.
     
  48. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,970
    You can still reset the category once you've created the MediaDeviceQuery, so you can disable mixing.
     
    dri_richard likes this.
  49. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    OK great, I’ll try this tomorrow.
     
  50. dri_richard

    dri_richard

    Joined:
    Mar 10, 2017
    Posts:
    153
    I've done some experimenting and the results are a little complex:

    I modified the HotMic sample to play a looping AudioClip, to mimic our game which has constant audio output, and then built it for iOS.
    If I have AirPods connected when I launch, the audio plays through them. When I press the record button, the audio output reverts to the phone speakers. Notably, this change doesn't happen at the MediaDeviceQuery, but when the device starts running.
    I made a further modification, to use the second audio device in the query rather than the first. Now audio continues to play through the AirPods when the recording starts.

    My conclusion is that audio output through AirPods stops if we record through the phone's built-in microphone.

    However, with NatMic it 'just worked'. My guess is that, with AirPods connected, we were lucky that our query for a microphone with echo cancellation found it first. NatDevice doesn't allow for that kind of query, so the first microphone found has changed.
    So my question is: how can we find the AirPods through NatDevice? Or perhaps we should use the microphone whose Unique ID is not "Built-In Microphone"?
    So my question is, how can we find the AirPods through NatDevice? Or perhaps we should use the microphone whose Unique ID is not "Buiilt-In Microphone"