NatCam doesn't support linear lighting. On some devices (certain S7 Edge and S8 models), the graphics driver introduces lag when NatCam reads data back from the GPU. The current workaround is to disable the `PreviewData` flag in NatCam > Plugins > Managed > Platforms > Android > NatCamAndroid.cs. Note that this workaround disables the CaptureFrame function. The full fix is coming in the next release.
Two more questions. 1) Taking a photo takes a couple of seconds; is that to be expected? (Samsung S8+) 2) Is it possible to include UI in a photo?
Depending on the photo resolution and the lighting conditions, yes. As for the UI: no. Photos are captured directly from the camera, so they don't include anything Unity renders on top.
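If you do need the UI in the shot, one workaround (standard Unity, not a NatCam API; `FrameGrabber` is a hypothetical name here) is to screenshot the rendered frame, preview, UI and all, by reading back the screen at the end of the frame:

```csharp
using System.Collections;
using UnityEngine;

public class FrameGrabber : MonoBehaviour {

    // Call StartCoroutine(CaptureWithUI()) from e.g. a button handler
    public IEnumerator CaptureWithUI () {
        // Wait until everything, including the UI canvas, has been rendered
        yield return new WaitForEndOfFrame();
        var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();
        // `tex` now holds the full frame: camera preview plus UI overlays
    }
}
```

This is lower quality than a native photo (it's limited to screen resolution), but it captures exactly what the user sees.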
Hi, I'm on NatCam 1.5f3 and I have a problem with MiniCam on Android: if I set the preview resolution to Highest and the photo resolution to Medium it works, but with a higher photo resolution (HD, Full, or Highest) I get a purple capture on Android. Any ideas, please?
OpenGL ES doesn't support textures that large, so the texture isn't created (and renders purple). You must reduce your photo resolution.
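If you hit this, you can drop the photo resolution with either a preset or explicit dimensions, using the resolution API shown elsewhere in this thread (a sketch; the exact supported sizes vary per device):

```csharp
// Reduce the photo resolution below the GPU's maximum texture size
NatCam.Camera.PhotoResolution = new CameraResolution(1920, 1080);
// Or use a preset instead:
// NatCam.Camera.PhotoResolution = CameraResolution._1920x1080;
```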
Hello @Lanre , as promised I sent you an email with the crash/issue reports triggered by NatCam and NatCorder. Feel free to email me back if you need more information.
Hello. I'm using the webcam on an Orbbec Astra Pro (a depth sensor similar to the Kinect) and having a problem. The Orbbec site lists it at 30 FPS, and it runs at 30 FPS in the Windows Camera app, but not in Unity: when I load it with WebCamTexture, the maximum is just 10 FPS. Can NatCam solve this? If you aren't sure, could you send me an example program to test? Thank you.
NatCam does not have native support on Windows; it falls back to WebCamTexture. So you will not see any improvement by using NatCam.
Hi Lanre. I ran the MiniCam example on iOS (iPhone 6, iPhone 8), but the camera image is displayed rotated 180 degrees in landscape orientation. This problem also seems to occur in NatCam 2.0f2.
Hello @Lanre ! I have a project that worked with NatCam 2.0f1 on my Sony Xperia Z3 Compact, but crashes on start with NatCam 2.0f2. What can I give you to help diagnose the problem?
Yes, you can enable development build. NatCam will always log verbosely, so just send me the logs from logcat.
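For reference, Unity's messages can be pulled out of the device log with adb (assuming adb is on your PATH and the device is connected over USB debugging):

```shell
# Show only Unity-tagged messages from the connected device
adb logcat -s Unity
```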
We are facing the same problem; also, when the screen orientation is PortraitUpsideDown, the camera preview is 90° off on both Android and iOS.
I've just confirmed that the problem is fixed in the new version. Thank you for the quick response.
Hi @Lanre, I updated from NatCam 2.0f2 to NatCam 2.0f3 and now the camera image is delayed; NatCam 2.0f2 is OK. Can you help with this?
Hello @Lanre! I'm trying to record a video with sound, so I created a script from the example and attached it to the listener. But when I run the application in the editor, I get these errors:

Internal_CreateGameObject can only be called from the main thread. Constructors and field initializers will be executed from the loading thread when loading a scene. Don't use this function in the constructor or field initializers, instead move initialization code to the Awake or Start function.
UnityEngine.GameObject:.ctor(String)
NatCamU.Dispatch.DispatchUtility:.cctor()
NatCorderU.Core.NatCorder:.cctor() (at Assets/NatCorder/Plugins/Managed/NatCorder.cs:118)
NatCorderU.Core.NatCorder:.cctor() (at Assets/NatCorder/Plugins/Managed/NatCorder.cs:118)
AudioGetter:OnAudioFilterRead(Single[], Int32) (at Assets/NatCorder/Examples/ReplayCam/AudioGetter.cs:48)
AudioGetter:OnAudioFilterRead(Single[], Int32) (at Assets/NatCorder/Examples/ReplayCam/AudioGetter.cs:47)

And:

UnityException: Internal_CreateGameObject can only be called from the main thread. Constructors and field initializers will be executed from the loading thread when loading a scene. Don't use this function in the constructor or field initializers, instead move initialization code to the Awake or Start function.
UnityEngine.GameObject..ctor (System.String name) (at /Users/builduser/buildslave/unity/build/artifacts/generated/bindings_old/common/Core/GameObjectBindings.gen.cs:393)
NatCamU.Dispatch.DispatchUtility..cctor ()
Rethrow as TypeInitializationException: An exception was thrown by the type initializer for NatCamU.Dispatch.DispatchUtility
NatCorderU.Core.NatCorder..cctor () (at Assets/NatCorder/Plugins/Managed/NatCorder.cs:118)
Rethrow as TypeInitializationException: An exception was thrown by the type initializer for NatCorderU.Core.NatCorder
AudioGetter.OnAudioFilterRead (System.Single[] samples, Int32 channels) (at Assets/NatCorder/Examples/ReplayCam/AudioGetter.cs:47)

Script AudioGetter.cs:
Code (CSharp):
using UnityEngine;
using System.Collections;
using NatCorderU.Core;

/*
 * We attach this script to an AudioSource/AudioListener
 * and forward the audio to NatCorder
 */
[RequireComponent(typeof(AudioSource))]
public class AudioGetter : MonoBehaviour, IAudioSource {

    public int sampleRate {
        get { return AudioSettings.outputSampleRate; }
    }

    public int sampleCount {
        get {
            int buffLen, numBuffer;
            AudioSettings.GetDSPBufferSize(out buffLen, out numBuffer);
            return buffLen;
        }
    }

    public int channelCount {
        get { return (int)AudioSettings.speakerMode; }
    }

    public void Dispose () {
        Debug.Log("Disposed, but not implemented");
    }

    private void OnAudioFilterRead (float[] samples, int channels) {
        long timestamp = (long)(AudioSettings.dspTime * 1e+9f);
        NatCorder.CommitSamples(samples, timestamp);
        Debug.Log("tick");
    }
}

Script ReplayCam:
Code (CSharp):
namespace NatCorderU.Examples {

    using UnityEngine;
    using UnityEngine.UI;
    using System.Collections;
    using Core;
    using UnityEngine.Video;

    public class ReplayCam : MonoBehaviour {

        //public VideoPlayer videoPlayer;
        public VideoViewer videoViewer;
        //public Animator recAnimator;
        public bool recordMicrophoneAudio;
        public AudioSource audioSource;
        //public AudioListener mainListener;
        public Text textShowPath;
        public bool process;
        public Transform listenerTransform;

        void Start () {
            videoViewer = GetComponent<VideoViewer>();
            process = false;
        }

        public void StartRecording () {
            // Create a recording configuration
            const float DownscaleFactor = 2f / 3;
            var configuration = new Configuration((int)(Screen.width * DownscaleFactor), (int)(Screen.height * DownscaleFactor));
            // Start recording with microphone audio
            if (recordMicrophoneAudio) {
                StartMicrophone();
                Replay.StartRecording(Camera.main, configuration, OnReplay, audioSource, true);
                //Replay.StartRecording(Camera.main, configuration, OnReplay, mainListener, true);
            }
            // Start recording without microphone audio
            else {
                AudioGetter audioGetter = listenerTransform.GetComponent<AudioGetter>();
                Replay.StartRecording(Camera.main, configuration, OnReplay, audioGetter);
                //Replay.StartRecording(Camera.main, configuration, OnReplay);
            }
            //recAnimator.SetBool("ShowIndicator", true);
            process = true;
        }

        private void StartMicrophone () {
            #if !UNITY_WEBGL || UNITY_EDITOR // No `Microphone` API on WebGL :(
            // If the clip has not been set, set it now
            if (audioSource.clip == null) {
                audioSource.clip = Microphone.Start(null, true, 60, 48000);
                while (Microphone.GetPosition(null) <= 0) ;
            }
            // Play through audio source
            audioSource.timeSamples = Microphone.GetPosition(null);
            audioSource.loop = true;
            audioSource.Play();
            #endif
        }

        public void StopRecording () {
            if (recordMicrophoneAudio)
                audioSource.Stop();
            Replay.StopRecording();
            //recAnimator.SetBool("ShowIndicator", false);
            process = false;
        }

        void OnReplay (string path) {
            Debug.Log("Saved recording to: " + path);
            // textShowPath.text = path;
            videoViewer.ViewAndPlayVideo(path);
            //https://forum.unity.com/threads/native-gallery-for-android-ios-open-source.519619/
            //NativeGallery.SaveVideoToGallery(path, "Beavers AR", "video{0}.mp4");
            // Playback the video
            #if UNITY_IOS
            //Handheld.PlayFullScreenMovie("file://" + path);
            #elif UNITY_ANDROID
            //Handheld.PlayFullScreenMovie(path);
            #endif
        }
    }
}
You're posting on the wrong thread. Anyway, the fix is simple: create the AudioGetter right before StartRecording and pass it in:
Code (CSharp):
AudioGetter getter = sceneListener.AddComponent<AudioGetter>();
Replay.StartRecording(..., getter);
We see delay and distortion when viewing fast-moving objects through the front-facing camera. This appeared after updating from NatCam 2.0f1 to NatCam 2.0f3. For example, when I record my finger moving quickly in front of the front-facing camera, the displayed image is delayed and distorted compared with the real movement. NatCam 2.0f1 works well. I compiled with Scripting Backend = IL2CPP, running Android OS 8.0.0 / API-26 (HUAWEISTF-AL10/341(C00)). For more details, see the following two demos:
https://github.com/EnoxSoftware/Nat....1/NatCamWithOpenCVForUnityExample_v1.0.1.apk
https://github.com/EnoxSoftware/Nat...2/NatCamWithOpenCVForUnityExample_v1.0.2.apk
NatCamWithOpenCVForUnityExample_v1.0.1.apk uses NatCam 2.0f1 and works well, without delay or distortion, but NatCamWithOpenCVForUnityExample_v1.0.2.apk has the delay and distortion issues. The following information is from the two samples running on my phone.

### System Info ###
OpenCVForUnity version = opencvforunity 2.2.8 (3.4.1-dev)
Build Unity version = 2017.3.0f3
Build target = Android
Scripting backend = IL2CPP
operatingSystem = Android OS 8.0.0 / API-26 (HUAWEISTF-AL10/341(C00))
iPhone.generation =
deviceModel = HUAWEI STF-AL10
deviceName = <unknown>
deviceType = Handheld
graphicsDeviceName = Mali-G71
graphicsDeviceVendor = ARM
processorType = ARMv7 VFPv3 NEON
graphicsMemorySize = 2048
systemMemorySize = 5729
graphicsDeviceID = 0
graphicsDeviceType = OpenGLES3
graphicsDeviceVendorID = 0
graphicsDeviceVersion = OpenGL ES 3.2 v1.r8p0-00cet0.c14e014a87b26dd123ddd0d6c57eeff3
graphicsMultiThreaded = False
graphicsShaderLevel = 50
copyTextureSupport = Basic, Copy3D, DifferentTypes, TextureToRT, RTToTexture
supportsAccelerometer = True
supportsGyroscope = True
supportsVibration = True
supportsLocationService = True
###################

### System Info ###
OpenCVForUnity version = opencvforunity 2.2.9 (3.4.1-dev)
(all remaining fields identical to the block above)
###################
Hi there. Sorry about this, I didn't get an email notification of your first post. I cannot reproduce any delay or distortion in either of the APKs. I'm able to achieve a smooth 30FPS with a 1920x1080 preview on my S7 Edge. Can you email me a video showing the delay you describe, along with full logs from logcat?
A couple of questions: 1 - When I ask NatCam for the vertical FOV of the camera, I get a strange value. The correct value, according to the formula given at https://stackoverflow.com/questions...ivalent-of-camera-parameters-gethorizontalvie (and verified experimentally), is 41.25078, but NatCam says it's 53.2988. Perhaps it doesn't take into account some aspect ratio correction, but I haven't been able to reverse-engineer that number to determine why it's incorrect. Where is NatCam getting this vertical FOV number from? The horizontal FOV it gives seems to be correct. Correct FOV matters for any AR/VR application, so the world rotation matches the camera preview rotation; otherwise objects appear to drift relative to the world when rotating the device. 2 - Is there a way to figure out which device NatCam chooses when you select RearCamera and there are multiple available? i.e., which WebCamDevice does it correspond to in the WebCamTexture.devices array? Is it always just the first one, or can it vary? Thanks!
If you are on Android, the FOV value comes directly from the camera API. NatCam uses the camera1 API, not camera2, to ensure greater compatibility. NatCam doesn't rely on WebCamDevice to identify cameras. To populate DeviceCamera.FrontCamera and DeviceCamera.RearCamera, NatCam uses the first camera it finds with that position. You can see the implementation in the static constructor of the DeviceCamera class.
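If the reported vertical FOV looks off, the relation from the linked Stack Overflow answer can be applied by hand; a sketch (assumes the horizontal FOV is correct and the sensor has square pixels, so the two FOVs are related only by the preview's aspect ratio):

```csharp
using UnityEngine;

static class FovUtility {
    // Derive the vertical FOV (degrees) from the horizontal FOV
    // and the preview's width/height in pixels
    public static float VerticalFov (float horizontalFovDegrees, float width, float height) {
        float hFovRad = horizontalFovDegrees * Mathf.Deg2Rad;
        float vFovRad = 2f * Mathf.Atan(Mathf.Tan(hFovRad / 2f) * height / width);
        return vFovRad * Mathf.Rad2Deg;
    }
}
```

The assumption can fail on devices that crop the sensor differently per resolution, so verify against a known target as the poster did.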
I'm using NatCam to preview on a texture, and I need to release it temporarily and then start playing again, so that I can switch to another native plugin (Voxelbusters Cross Platform Native Plugins) to perform some functions. When I call NatCam.Release(), I can call VB CPNP and that's fine, but how do I start NatCam again afterwards? NatCam.Play() seems to throw an exception.
This is most likely because VB CPNP does not properly release the camera (what NatCam.Release does). Reach out to their customer support.
Oh? But I get an exception even if I don't call anything else. To test, I used two UI buttons: one calls NatCam.Release(), the other NatCam.Play(). Should that work? The second button throws an exception. (Thanks so much for the fast reply. I'm not at my computer, but I can post the exception stack trace when I get back; I just wanted to make sure I'd implemented the correct sequence of calls.)
Here's the stack trace from logcat on a Samsung Tab 4 tablet:
Code (CSharp):
E [ 5338] Unity AndroidJavaException: java.lang.NullPointerException
E [ 5338] Unity java.lang.NullPointerException
E [ 5338] Unity     at com.yusufolokoba.natcam.NatCam.play(NatCam.java:61)
E [ 5338] Unity     at com.unity3d.player.UnityPlayer.nativeRender(Native Method)
E [ 5338] Unity     at com.unity3d.player.UnityPlayer.c(Unknown Source)
E [ 5338] Unity     at com.unity3d.player.UnityPlayer$e$2.queueIdle(Unknown Source)
E [ 5338] Unity     at android.os.MessageQueue.next(MessageQueue.java:207)
E [ 5338] Unity     at android.os.Looper.loop(Looper.java:131)
E [ 5338] Unity     at com.unity3d.player.UnityPlayer$e.run(Unknown Source)
E [ 5338] Unity     at UnityEngine.AndroidJNISafe.CheckException () [0x00000] in <filename unknown>:0
E [ 5338] Unity     at UnityEngine.AndroidJNISafe.CallVoidMethod (IntPtr obj, IntPtr methodID, UnityEngine.jvalue[] args) [0x00000] in <filename unknown>:0
E [ 5338] Unity     at UnityEngine.AndroidJavaObject._Call (System.String methodName, System.Object[] args) [0x00000] in <filename unknown>:0
E [ 5338] Unity     at UnityEngine.AndroidJavaObject.Call (System.String methodName, System.Object[] args) [0x00000] in <filename unknown>:0
E [ 5338] Unity     at NatCamU.Core.Platforms.NatCamAndroid.Play () [0x00000] in <filename

Here is the stack trace running in the Unity Editor:
Code (CSharp):
IndexOutOfRangeException: Array index is out of range.
NatCamU.Core.Platforms.NatCamLegacy.Play () (at Assets/NatCam/Plugins/Managed/Platforms/NatCamLegacy.cs:74)
NatCamU.Core.NatCam.Play (NatCamU.Core.DeviceCamera camera) (at Assets/NatCam/Plugins/Managed/NatCam.cs:75)
InkMage.DeviceCameraController.Resume () (at Assets/_Project/Scripts/Camera/DeviceCameraController.cs:118)
UnityEngine.Events.InvokableCall.Invoke () (at C:/buildslave/unity/build/Runtime/Export/UnityEvent.cs:166)
UnityEngine.Events.UnityEvent.Invoke () (at C:/buildslave/unity/build/Runtime/Export/UnityEvent_0.cs:58)
UnityEngine.UI.Button.Press () (at C:/buildslave/unity/build/Extensions/guisystem/UnityEngine.UI/UI/Core/Button.cs:36)
UnityEngine.UI.Button.OnPointerClick (UnityEngine.EventSystems.PointerEventData eventData) (at C:/buildslave/unity/build/Extensions/guisystem/UnityEngine.UI/UI/Core/Button.cs:45)
UnityEngine.EventSystems.ExecuteEvents.Execute (IPointerClickHandler handler, UnityEngine.EventSystems.BaseEventData eventData) (at C:/buildslave/unity/build/Extensions/guisystem/UnityEngine.UI/EventSystem/ExecuteEvents.cs:50)
UnityEngine.EventSystems.ExecuteEvents.Execute[IPointerClickHandler] (UnityEngine.GameObject target, UnityEngine.EventSystems.BaseEventData eventData, UnityEngine.EventSystems.EventFunction`1 functor) (at C:/buildslave/unity/build/Extensions/guisystem/UnityEngine.UI/EventSystem/ExecuteEvents.cs:261)
UnityEngine.EventSystems.EventSystem:Update()
The first button literally just calls NatCam.Release() and the second NatCam.Play().
Chances are that you never set a camera for NatCam to play from, so NatCam.Camera is null. You can pass in a camera to NatCam.Play like so: `NatCam.Play(DeviceCamera.RearCamera);`
I do set the camera in Awake.
Code (CSharp):
private void Awake () {
    Log("Awake");
    _background = GetComponent<RawImage>();
    Assert.IsNotNull(_background, "DeviceCameraImage requires a RawImage component");
    _aspectRatioFitter = GetComponent<AspectRatioFitter>();
    Assert.IsNotNull(_aspectRatioFitter, "DeviceCameraImage requires a AspectRatioFitter component");
    if (DeviceCamera.RearCamera == null) {
        if (DeviceCamera.FrontCamera == null) {
            LogWarning("No device camera is available");
        } else {
            LogWarning("No rear camera defined - switching to front camera");
            NatCam.Camera = DeviceCamera.FrontCamera;
        }
    } else {
        Log("Found rear camera");
        NatCam.Camera = DeviceCamera.RearCamera;
    }
}

private void Start () {
    Log("Start - NatCam.Camera: " + NatCam.Camera);
    if (NatCam.Camera != null) {
        // Set the preview resolution to Full HD
        NatCam.Camera.PreviewResolution = CameraResolution._1920x1080;
        // Set the photo resolution to highest
        NatCam.Camera.PhotoResolution = CameraResolution.Highest;
        // Start the camera preview
        NatCam.Play(NatCam.Camera);
        // Get notified when the preview actually starts
        NatCam.OnStart += OnStartCamera;
    }
    // See whether the camera can be flipped
    if (DeviceCamera.FrontCamera != null && DeviceCamera.RearCamera != null) {
        Log("OnStartCamera - Invoking Flip Camera ENABLED");
        _flipCameraEnabled.Invoke();
    } else {
        Log("OnStartCamera - Invoking Flip Camera DISABLED");
        _flipCameraDisabled.Invoke();
    }
}
Do I need to do this explicitly when calling Play?
PS: I've just tested this; before NatCam.Release(), NatCam.Camera is set to a value, and straight after NatCam.Release() it is null. Should I go through all the steps in my Awake routine again after NatCam.Release()?
After calling Release, the camera is nullified, since NatCam actually releases the hardware cameras. Also, I don't recommend calling NatCam from Awake: NatCam's functions should be called from the main thread (the only exception is CaptureFrame, which is designed to be called from any thread), and Unity doesn't guarantee that Awake is called on the main thread.
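A minimal sketch of that recommendation, doing all camera selection and playback from Start (class name is hypothetical; assumes the NatCamU.Core namespace used elsewhere in the thread):

```csharp
using UnityEngine;
using NatCamU.Core; // assumed namespace for NatCam, DeviceCamera, etc.

public class CameraStarter : MonoBehaviour {

    // All NatCam calls live in Start, which runs on the main thread
    void Start () {
        NatCam.Camera = DeviceCamera.RearCamera ?? DeviceCamera.FrontCamera;
        if (NatCam.Camera != null)
            NatCam.Play(NatCam.Camera);
        else
            Debug.LogWarning("No device camera is available");
    }
}
```

After a Release, the same selection logic must run again before Play, since NatCam.Camera is nulled out.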
OK - got it. As you can see, I was only setting NatCam.Camera there, but I've now moved that to Start. Should I remove the OnStart listener before Release and reinstate it after? I saw in the docs that you add the listener after NatCam.Play; is there a reason it isn't registered before? With this version, it works in the Editor but not on my Android device. Stack trace:
Code (CSharp):
E [ 7941] Unity NatCam Error: Failed to open camera 0 with error: Fail to connect to camera service
D [ 7941] dalvikvm GC_FOR_ALLOC freed 45K, 29% free 8128K/11348K, paused 37ms, total 37ms
D [ 7941] dalvikvm GC_FOR_ALLOC freed 454K, 32% free 8128K/11804K, paused 29ms, total 29ms
E [ 7941] Unity AndroidJavaException: java.lang.NullPointerException
E [ 7941] Unity java.lang.NullPointerException
E [ 7941] Unity     at com.yusufolokoba.natcam.NatCam.play(NatCam.java:63)
E [ 7941] Unity     at com.unity3d.player.UnityPlayer.nativeRender(Native Method)
E [ 7941] Unity     at com.unity3d.player.UnityPlayer.c(Unknown Source)
E [ 7941] Unity     at com.unity3d.player.UnityPlayer$e$2.queueIdle(Unknown Source)
E [ 7941] Unity     at android.os.MessageQueue.next(MessageQueue.java:207)
E [ 7941] Unity     at android.os.Looper.loop(Looper.java:131)
E [ 7941] Unity     at com.unity3d.player.UnityPlayer$e.run(Unknown Source)
E [ 7941] Unity     at UnityEngine.AndroidJNISafe.CheckException () [0x0008c] in /Users/builduser/buildslave/unity/build/Runtime/Export/AndroidJNISafe.cs:24
E [ 7941] Unity     at UnityEngine.AndroidJNISafe.CallVoidMethod (IntPtr obj, IntPtr methodID, UnityEngine.jvalue[] args) [0x00011] in /Users/builduser/buildslave/unity/build/Runtime/Export/AndroidJNISafe.cs:357
E [ 7941] Unity     at UnityEngine.AndroidJavaObject._Call (System.String methodName, System.Object[] args) [0x00038] in /Users/builduser/buildslave/unity/build/Runtime/Export/AndroidJavaImpl.cs:254
E [ 7941] Unity     at UnityEngine.AndroidJava

Here's what I have now in the code for the two buttons, Pause and Resume:
Code (CSharp):
public void Pause () {
    NatCam.OnStart -= OnStartCamera;
    NatCam.Release();
}

public void Resume () {
    Debug.Log("BEFORE PLAY");
    if (DeviceCamera.RearCamera == null) {
        if (DeviceCamera.FrontCamera != null) {
            Debug.Log("FRONT CAMERA");
            NatCam.Camera = DeviceCamera.FrontCamera;
        }
    } else {
        Debug.Log("REAR CAMERA");
        NatCam.Camera = DeviceCamera.RearCamera;
    }
    if (NatCam.Camera != null) {
        // Set the preview resolution to Full HD
        // NatCam.Camera.PreviewResolution = CameraResolution._1920x1080;
        // Start the camera preview
        NatCam.Play(NatCam.Camera);
        // Get notified when the preview actually starts
        NatCam.OnStart += OnStartCamera;
    }
    Debug.Log("AFTER PLAY");
}
Sorry, Lanre!! I think I tested the wrong version last time. Apologies! I built it again and it's fine on the Android device too.
No, you're fantastic. It's a great plugin - exactly what we needed for our project. Thank you so much.
You can set the preview resolution of the camera like so:
Code (CSharp):
NatCam.Camera.PreviewResolution = new CameraResolution(width, height);
Note that the camera isn't guaranteed to support the resolution you set; in that case, NatCam will use the closest resolution the camera actually supports. So if your new device doesn't support the iPad's camera resolution, the resolution on the new device will be different.
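To see which resolution the camera actually settled on, you can inspect the preview texture once the preview starts; a sketch (assumes NatCam exposes the preview as a texture via NatCam.Preview and that OnStart fires once the preview is live; check the docs for your NatCam version):

```csharp
NatCam.Camera.PreviewResolution = new CameraResolution(1280, 720);
NatCam.OnStart += () => {
    // The texture dimensions reflect the resolution the camera actually chose
    Debug.Log("Preview is " + NatCam.Preview.width + "x" + NatCam.Preview.height);
};
NatCam.Play(NatCam.Camera);
```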
Hi @Lanre, I've noticed some null references inside NatCam when using it on my Galaxy S8. This happens when I take a picture or begin video capture; NatCam works normally in the Unity Editor. If needed, I can email you the script.

08-10 13:02:57.977 8393 8494 E Unity : java.lang.NullPointerException: Attempt to invoke virtual method 'java.util.List android.hardware.Camera$Parameters.getSupportedPictureSizes()' on a null object reference
08-10 13:02:57.977 8393 8494 E Unity :     at com.yusufolokoba.natcam.Utilities.ClosestPhotoResolution(Utilities.java:27)
08-10 13:02:57.977 8393 8494 E Unity :     at com.yusufolokoba.natcam.NatCamDevice.setPhotoResolution(NatCamDevice.java:141)
08-10 13:02:57.977 8393 8494 E Unity :     at com.unity3d.player.UnityPlayer.nativeRender(Native Method)
08-10 13:02:57.977 8393 8494 E Unity :     at com.unity3d.player.UnityPlayer.c(Unknown Source:0)
08-10 13:02:57.977 8393 8494 E Unity :     at com.unity3d.player.UnityPlayer$e$2.queueIdle(Unknown Source:72)
08-10 13:02:57.977 8393 8494 E Unity :     at android.os.MessageQueue.next(MessageQueue.java:394)
08-10 13:02:57.977 8393 8494 E Unity :     at android.os.Looper.loop(Looper.java:142)
08-10 13:02:57.977 8393 8494 E Unity :     at com.unity3d.player.UnityPlayer$e.run(Unknown Source:32)
08-10 13:02:57.977 8393 8494 E Unity :     at UnityEngine.AndroidJNISafe.CheckException () [0x0008c] in /Users/builduser/buildslave/unity/build/Runtime/Export/AndroidJNISafe.cs:24
08-10 13:02:57.977 8393 8494 E Unity :     at Unity