NatDevice - Media Device API

Discussion in 'Assets and Asset Store' started by Lanre, Dec 17, 2015.

Should we add exposure controls in v1.3? This means dropping support for iOS 7

Poll closed Jun 10, 2016.
  1. Yes

    9 vote(s)
    75.0%
  2. No

    3 vote(s)
    25.0%
  1. YHS

    YHS

    Joined:
    Aug 10, 2014
    Posts:
    31
    Hi Lanre, I recently updated my project to NatCam 2.0. What happens is: after updating the old API and building to an iPhone 6 (iOS 11), a weird glitch appears when I switch rapidly from the front camera to the rear camera. Switching back to the front camera is totally fine; the problem only shows up when coming from the front camera.

    I tried your MiniCam example and it works fine, so I think the problem is in my old scene, maybe a script or object. I wanted to rebuild it based on your example, but then I found that if I drag any old object from my scene into MiniCam (for example, the "Preview" object in the hierarchy), MiniCam shows the glitch on the rear camera too. Even if I remove the object from the hierarchy, the glitch still happens.

    Here is the info I have from Xcode so far. The glitch shows up even without the "Thread 1: EXC_BAD_ACCESS" error being raised.
    I would really appreciate any advice. Thank you.

     

    Attached Files:

    Last edited: May 11, 2018
  2. YHS

    YHS

    Joined:
    Aug 10, 2014
    Posts:
    31

    I found the problem, but I'm not really sure of the reason. If I check "use front camera" on the MiniCam component, then launch the app and switch to the back camera, the glitch appears. But if I uncheck "use front camera" and the app launches with the back camera from the very beginning, the glitch doesn't happen.
     
  3. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Can you send me an email? The sensor on the 6P is physically inverted, so we have to correct for it. I believe it is the front camera that has this issue. I'll send you updated libraries with the fixes.
     
  4. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I think this is a bug. I'll look into it. Email me so I can send you an updated library with the fix.
     
  5. YHS

    YHS

    Joined:
    Aug 10, 2014
    Posts:
    31
    Hi Lanre

    I've run into another issue. Since my app posts pictures to Facebook, the app needs to be reviewed by FB, and they only accept iOS Simulator builds. What I am doing now is using Unity (Player Settings -> Target SDK set to "Simulator") to build an Xcode project.

    In Xcode, when I run the simulator, the error in the image below occurs.

    As for the FB post function, I use the FB SDK; it just captures the screen and posts it. NatCam just helps me take the photo before that. I also tried the "mainCam" example, but the result is the same.
    Maybe you can give me more advice on this too.

    Thank you
     

    Attached Files:

    Last edited: May 13, 2018
  6. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    If you want to build for the simulator, you have to force NatCam to use the NatCamLegacy backend instead of the NatCamiOS backend. To do so, open NatCam.cs in NatCam > Plugins > Managed > NatCam.cs and change this line (line 174):
    Code (CSharp):
    1. #elif UNITY_IOS
    2. new NatCamiOS();
    to this:
    Code (CSharp):
    1. #elif UNITY_IOS
    2. new NatCamLegacy();
    Make sure to revert this change when building for the device.
     
  7. kennyallau

    kennyallau

    Joined:
    Aug 5, 2014
    Posts:
    8
    Hi @Lanre ,

    Is there a way for Natcam to overwrite a saved video file rather than generating new files after recording?
     
  8. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    NatCam doesn't record videos anymore. When it did, it created a new file. There was no way to set the name of the recording file. What you can do is to manually delete old videos or rename the new ones using the System.IO.File API.
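    For example, a minimal sketch of replacing an old recording with a newly saved one using System.IO.File (the file name and path here are just hypothetical placeholders):
    Code (CSharp):
    1. using System.IO;
    2. using UnityEngine;
    3.  
    4. static void ReplaceRecording (string newRecordingPath) {
    5.     // Hypothetical fixed name; pick whatever single file name you want to reuse.
    6.     string fixedPath = Path.Combine(Application.persistentDataPath, "recording.mp4");
    7.     if (File.Exists(fixedPath))
    8.         File.Delete(fixedPath);             // remove the previous video
    9.     File.Move(newRecordingPath, fixedPath); // rename/move the new video into its place
    10. }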
     
  9. shawww

    shawww

    Joined:
    Sep 30, 2014
    Posts:
    43
    @Lanre I used NatCam in the past for a project, and it was great. I have a new project where I need to record 4K @ 60 FPS using the new HEVC codec available on iOS. I was thinking of buying NatCorder and using both together...

    So, two questions: 1. Will this support a 4K camera running at 60 FPS on, say, an iPhone X? And 2. Can I use NatCam and NatCorder together to record that video stream / is there an example of NatCam + NatCorder in use?
     
  10. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hello @Lanre ,

    Upon submission of an app to the Google Play Store, they detected that the app crashed on specific devices due to the NatCam thread. Below follow the crash stack traces for the NatCam thread (the app's process ID is redacted).


    Huawei Mate 9
    FATAL EXCEPTION: NatCam Camera Thread
    Process: xx.xx.xxxx, PID: 11228
    java.lang.Error: FATAL EXCEPTION [NatCam Camera Thread]
    Unity version : 2017.3.1p1
    Device model : HUAWEI MHA-L29
    Device fingerprint: HUAWEI/MHA-L29/HWMHA:7.0/HUAWEIMHA-L29/C567B138:user/release-keys
    Caused by: java.lang.RuntimeException: Camera is being used after Camera.release() was called
    at android.hardware.Camera.setPreviewTexture(Native Method)
    at com.yusufolokoba.natcam.NatCamDevice.close(NatCamDevice.java:79)
    at com.yusufolokoba.natcam.NatCamDevice.open(NatCamDevice.java:62)
    at com.yusufolokoba.natcam.NatCam$3.run(NatCam.java:134)
    at android.os.Handler.handleCallback(Handler.java:755)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:156)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    Samsung Galaxy S7 Edge
    FATAL EXCEPTION: NatCam Camera Thread
    Process: xx.xx.xxxx, PID: 16184
    java.lang.Error: FATAL EXCEPTION [NatCam Camera Thread]
    Unity version : 2017.3.1p1
    Device model : samsung SM-G935F
    Device fingerprint: samsung/hero2ltexx/hero2lte:6.0.1/MMB29K/G935FXXU1APB6:user/release-keys
    Caused by: java.lang.RuntimeException: Camera is being used after Camera.release() was called
    at android.hardware.Camera.native_getParameters(Native Method)
    at android.hardware.Camera.getParameters(Camera.java:1996)
    at com.yusufolokoba.natcam.NatCamDevice.getParams(NatCamDevice.java:372)
    at com.yusufolokoba.natcam.NatCamDevice.play(NatCamDevice.java:90)
    at com.yusufolokoba.natcam.NatCam.onPictureTaken(NatCam.java:188)
    at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1165)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:158)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    LG G6
    FATAL EXCEPTION: NatCam Camera Thread
    Process: xx.xx.xxxx, PID: 10983
    java.lang.Error: FATAL EXCEPTION [NatCam Camera Thread]
    Unity version : 2017.3.1p1
    Device model : LGE LGUS997
    Device fingerprint: lge/lucye_nao_us/lucye:7.0/NRD90U/1708714109a03:user/release-keys
    Caused by: java.lang.NullPointerException: Attempt to invoke virtual method 'void com.yusufolokoba.natcam.NatCamDevice.play()' on a null object reference
    at com.yusufolokoba.natcam.NatCam$3.run(NatCam.java:135)
    at android.os.Handler.handleCallback(Handler.java:751)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:154)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    Samsung Galaxy J1 Ace
    FATAL EXCEPTION: NatCam Camera Thread
    Process: xx.xx.xxxx, PID: 14447
    java.lang.Error: FATAL EXCEPTION [NatCam Camera Thread]
    Unity version : 2017.3.1p1
    Device model : samsung SM-J111M
    Device fingerprint: samsung/j1acevelteub/j1acevelte:5.1.1/LMY47V/J111MUBU0AQC2:user/release-keys
    Caused by: java.lang.NullPointerException: Attempt to invoke virtual method 'void android.graphics.SurfaceTexture.release()' on a null object reference
    at com.yusufolokoba.natcam.rendering.SurfaceTextureRenderContext.releaseSurfaceTexture(SurfaceTextureRenderContext.java:63)
    at com.yusufolokoba.natcam.NatCam$2.run(NatCam.java:101)
    at android.os.Handler.handleCallback(Handler.java:739)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:145)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    Huawei P8 Lite
    FATAL EXCEPTION: NatCam Camera Thread
    Process: xx.xx.xxxx, PID: 17559
    java.lang.Error: FATAL EXCEPTION [NatCam Camera Thread]
    Unity version : 2017.3.1p1
    Device model : HUAWEI ALE-L23
    Device fingerprint: Huawei/ALE-L23/hwALE-H:5.0.1/HuaweiALE-L23/C605B150:user/release-keys
    Caused by: java.lang.NullPointerException: Attempt to invoke virtual method 'void com.yusufolokoba.natcam.NatCamDevice.play()' on a null object reference
    at com.yusufolokoba.natcam.NatCam$3.run(NatCam.java:135)
    at android.os.Handler.handleCallback(Handler.java:739)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:135)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    Have you witnessed these crashes before? And can you advise on how to fix them? All other Android devices show no problems with this.

    Further, I can also send to your email the logcat dumps as soon as I receive them.
     
  11. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    NatCorder does not support the HEVC codec just yet. Right now, it uses H.264 AVC across all platforms it supports (iOS, Android, Windows, macOS).
    Yes, you can use both. All you have to do is record the NatCam.Preview texture (if you want to record only the preview). Here's an example.
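    Very roughly, the idea looks like this. Note that the NatCorder calls below are from the 1.x API as I remember it and should be checked against the NatCorder documentation; the only essential part is blitting NatCam.Preview into each recorder frame:
    Code (CSharp):
    1. using NatCamU.Core;      // assumed namespaces; verify against your versions
    2. using NatCorderU.Core;
    3. using UnityEngine;
    4.  
    5. public class PreviewRecorder : MonoBehaviour {
    6.  
    7.     public void StartRecording () {
    8.         // Begin a recording session (configuration/callback may differ in your NatCorder version)
    9.         NatCorder.StartRecording(Configuration.Screen, OnRecording);
    10.         NatCam.OnFrame += OnFrame;
    11.     }
    12.  
    13.     void OnFrame () {
    14.         // Copy the current camera preview into a recorder frame on every preview update
    15.         var frame = NatCorder.AcquireFrame();
    16.         Graphics.Blit(NatCam.Preview, frame);
    17.         NatCorder.CommitFrame(frame);
    18.     }
    19.  
    20.     public void StopRecording () {
    21.         NatCam.OnFrame -= OnFrame;
    22.         NatCorder.StopRecording();
    23.     }
    24.  
    25.     void OnRecording (string path) {
    26.         Debug.Log("Saved recording to: " + path);
    27.     }
    28. }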
     
  12. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I saw your email. Let's discuss over email.
     
  13. Masegi

    Masegi

    Joined:
    May 24, 2014
    Posts:
    17
    Hi,

    I'm trying to use NatCam with OpenCvSharp (https://assetstore.unity.com/packages/tools/integration/opencv-for-unity-100374), but I just can't access the preview as a Texture2D.

    Can I convert the preview to a Texture2D? I can't use the example code because I'm using another OpenCV package that doesn't take a Mat object.

    Never mind, this is working:
    Code (CSharp):
    1. Texture2D tex = new Texture2D(NatCam.Camera.PreviewResolution.width, NatCam.Camera.PreviewResolution.height);
    2. NatCam.CaptureFrame(tex);
     
    Last edited: May 16, 2018
    Lanre likes this.
  14. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You can use the NatCam.CaptureFrame function to fill a Texture2D you provide with the preview data for the current frame. Or even better (for performance), use a byte[] with the CaptureFrame function and call Mat.put with the byte[].
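    A minimal sketch of that second path, assuming the OpenCV for Unity package and an RGBA32 preview (the exact CaptureFrame overload may differ between NatCam versions):
    Code (CSharp):
    1. using NatCamU.Core;
    2. using OpenCVForUnity;
    3.  
    4. byte[] buffer;   // reusable pixel buffer
    5. Mat matrix;      // reusable OpenCV matrix
    6.  
    7. void OnFrame () {
    8.     if (buffer == null) {
    9.         buffer = new byte[NatCam.Preview.width * NatCam.Preview.height * 4]; // RGBA32
    10.         matrix = new Mat(NatCam.Preview.height, NatCam.Preview.width, CvType.CV_8UC4);
    11.     }
    12.     NatCam.CaptureFrame(buffer);  // fill the byte[] with the current preview pixels
    13.     matrix.put(0, 0, buffer);     // copy the pixels into the OpenCV Mat
    14.     // ...run your OpenCV processing on 'matrix' here...
    15. }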
     
  15. Masegi

    Masegi

    Joined:
    May 24, 2014
    Posts:
    17
    In the Editor it's working fine with CaptureFrame, but it's not working on Android.
     
  16. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Can you provide more information? How is it not working? Make sure that the `PreviewData` flag is set to true in NatCam > Plugins > Managed > Platforms > Android > NatCamAndroid.cs.
     
  17. Masegi

    Masegi

    Joined:
    May 24, 2014
    Posts:
    17
    I checked it; it's set to true, but I just get an empty texture on Android.

    I get in the console:
    NatCam Error: Texture size must match that of NatCam.Preview

    This is what I'm using / trying now:
    Code (CSharp):
    1.  
    2.             Texture2D tex = new Texture2D(NatCam.Preview.width, NatCam.Preview.height,TextureFormat.RGBA32,false);
    3.             NatCam.CaptureFrame(tex);

    [edit]
    Sorry, my bad. I don't know why, but it's working now. Performance isn't actually that much better now, though :(
    [edit]
     
    Last edited: May 16, 2018
  18. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    I decided to respond here since all the info is here.
    I'm planning a change that should fix this issue (and every other instance of it). The way we handle the active camera is pretty weird, and it stems from how Android decides to notify the client (NatCam) of new camera frames. But since we've updated NatCam to use the new render pipeline, all the weird handling shouldn't be necessary.
    Are you taking a picture and immediately switching cameras or pausing or releasing NatCam?
    Check that you have set NatCam.Camera before calling NatCam.Play (or pass in the camera you want to use to NatCam.Play). Also, check that the user has granted camera permissions to the app (`NatCam.Implementation.HasPermissions`).
    I'll look into this. It looks like an actual bug.
    Same as the G6.
     
  19. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    What device are you running on?
     
  20. Masegi

    Masegi

    Joined:
    May 24, 2014
    Posts:
    17
    My test device is a Nexus Galaxy A3, but I guess the OpenCV part is the problem.
     
  21. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    OpenCV is very resource-heavy. You must optimize your CV pipeline.
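    Two common optimizations, sketched with the same OpenCV for Unity calls that appear elsewhere in this thread: run detection on a downscaled grayscale copy, and only every few frames ('cascade' and 'frameCounter' are assumed fields):
    Code (CSharp):
    1. using OpenCVForUnity;
    2.  
    3. OpenCVForUnity.Rect[] DetectFaces (Mat rgbaFrame) {
    4.     if (++frameCounter % 10 != 0) return null;   // throttle: detect on every 10th frame only
    5.     Mat small = new Mat();
    6.     Imgproc.resize(rgbaFrame, small, new Size(), 0.4, 0.4, Imgproc.INTER_LINEAR); // 40% size
    7.     Imgproc.cvtColor(small, small, Imgproc.COLOR_RGBA2GRAY);
    8.     MatOfRect faces = new MatOfRect();
    9.     cascade.detectMultiScale(small, faces);
    10.     // remember to scale the returned rects back up by 1 / 0.4 for the full-size image
    11.     return faces.toArray();
    12. }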
     
  22. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Thanks for the informative reply, @Lanre ! Do you know when a NatCam update with the fixes will be ready?
     
  23. angelsm85

    angelsm85

    Joined:
    Oct 27, 2015
    Posts:
    63
    Hi @Lanre ! I want to capture a frame from NatCam.Preview and resize the image (to make it smaller) before uploading it to a server. How can I resize it? This is my code:

    Code (CSharp):
    1. Texture2D frame = new Texture2D(NatCam.Preview.width, NatCam.Preview.height, TextureFormat.RGBA32, false);
    2. NatCam.CaptureFrame(frame);
    3.  
    4. byte[] bytes = frame.EncodeToPNG();
    5. imageShot.texture = frame as Texture;
    6.  
    7. var form = new WWWForm();
    8. form.AddField("somefield", "somedata");
    9. form.AddBinaryData("file", bytes, "snapshot.png", "image/png");
    10.  
    11. WWW w = new WWW(uploadUrl, form);
    12. yield return w;
    Thanks in advance!
     
    Last edited: May 16, 2018
  24. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Check back within two days.
     
  25. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You will have to manually downsample the image using some interpolation method (nearest neighbor, bilinear, bicubic). We have a new API called NatImage that supports image resizing, but Unity is yet to approve it.
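    Until then, a minimal sketch of one way to do it with plain Unity APIs: blit into a smaller RenderTexture (the GPU does the bilinear filtering) and read the result back into a new Texture2D before encoding and uploading:
    Code (CSharp):
    1. Texture2D Downscale (Texture2D source, int width, int height) {
    2.     RenderTexture rt = RenderTexture.GetTemporary(width, height, 0);
    3.     Graphics.Blit(source, rt);                 // GPU bilinear downsample
    4.     RenderTexture previous = RenderTexture.active;
    5.     RenderTexture.active = rt;
    6.     Texture2D result = new Texture2D(width, height, TextureFormat.RGBA32, false);
    7.     result.ReadPixels(new UnityEngine.Rect(0, 0, width, height), 0, 0);
    8.     result.Apply();
    9.     RenderTexture.active = previous;
    10.     RenderTexture.ReleaseTemporary(rt);
    11.     return result;
    12. }
    In your snippet you would call something like Downscale(frame, frame.width / 2, frame.height / 2) before EncodeToPNG.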
     
  26. supertwang

    supertwang

    Joined:
    Mar 23, 2018
    Posts:
    2
    Here you go Lanre. Am I doing something wrong?

    Code (CSharp):
    1. // Periodically the webcam can become unresponsive, which makes it difficult to work on this code base because it can
    2. // cause Unity to crash on launch.  This switch enables functionality without the webcam if needed.
    3. #define OPENCV_FACES
    4. #define ENABLE_WEBCAM
    5. //#undef ENABLE_WEBCAM
    6.  
    7. // This switch pulled from /Assets/NatCam/Core/Plugins/Managed/NatCam.cs (line 130, or thereabouts)
    8. // This lets us only use features that the platform supports.
    9. // The extra UNITY_EDITOR || UNITY_STANDALONE check at the beginning is needed separately from the final #else, even
    10. // though they invoke similar settings
    11. #if UNITY_EDITOR || UNITY_STANDALONE
    12. #undef FOCUS_SUPPORT
    13. #undef EXPOSURE_SUPPORT
    14. #undef ZOOM_SUPPORT
    15. #elif UNITY_IOS || UNITY_ANDROID
    16. #define FOCUS_SUPPORT
    17. #define EXPOSURE_SUPPORT
    18. #define ZOOM_SUPPORT
    19. #else
    20. #undef FOCUS_SUPPORT
    21. #undef EXPOSURE_SUPPORT
    22. #undef ZOOM_SUPPORT
    23. #endif
    24.  
    25. // DEBUGGING
    26.  
    27. using System.Collections;
    28. using System.Collections.Generic;
    29. using UnityEngine;
    30. using UnityEngine.EventSystems;
    31. using UnityEngine.UI;
    32. using CmrdCam;
    33. using OpenCVForUnity;
    34.  
    35. #if ENABLE_WEBCAM
    36. using NatCamU.Core;
    37. #endif
    38.  
    39.  
    40. namespace CmrdCam {
    41.     // Using NatCam Pro asset to enable camera settings adjustments
    42.     //
    43.     //   Docs:      http://docs.natcam.io/
    44.     //   Tutorial:  https://medium.com/@olokobayusuf/natcam-tutorial-series-1-starting-off-dc3990f5dab6
    45.     //   Forum:     http://forum.unity3d.com/threads/natcam-webcam-api.374690/
    46.     //
    47.     public class WebCam : NatCamBehaviour//MonoBehaviour
    48.     {
    49.  
    50.         public enum CamStates
    51.         {
    52.             Begin                  = 0
    53.             ,Started               = 1
    54.             ,NeedWebCamNativeSizes = 2
    55.             ,WebCamRefined         = 4    
    56.             ,WebCamRefinementRequested = 8
    57.             ,DoCapture = 16
    58.             ,RecognizeFaces = 32
    59.             ,DoFaceDetect = 64
    60.         };
    61.  
    62.  
    63.         public float  myExposure;         // exposure setting (if supported)
    64.         public float  myFocusX,myFocusY;  // focus position (if supported)
    65.         public bool   myAutoFocusFlag = true;
    66.         [Range(0.0f,1.0f)]
    67.         public float  myDetectScale = 0.4f;
    68.         public int    myDetectW = -1;
    69.         public int    myDetectH = -1;
    70.  
    71.         // which webcam index we want
    72.         public int  myCameraNumber   = 1;
    73.         public bool myFrontFacingFlag = true;
    74.      
    75.         // Camera sizes & aspects
    76.         //   Requested: Our desired (full-frame) output size
    77.         public int  myRequestedW = 720;
    78.         public int  myRequestedH = 480;
    79.  
    80.         public List<Preview> myClients;
    81.  
    82.         [System.NonSerialized]
    83.         public string myStatus;
    84.  
    85.         //   Native:    Native resolution of camera hardware
    86.         [System.NonSerialized]
    87.         protected int myNativeW=-1;
    88.         [System.NonSerialized]
    89.         protected int myNativeH=-1;
    90.         [System.NonSerialized]
    91.         protected float myNativeAspect=-1.0f;
    92.         //   Refined:   Size returned after we requested our preferred size
    93.         [System.NonSerialized]
    94.         protected int myRefinedW=-1; //
    95.         [System.NonSerialized]
    96.         protected int myRefinedH=-1;
    97.         [System.NonSerialized]
    98.         protected float myRefinedAspect=-1.0f;
    99.  
    100.         [System.NonSerialized]
    101.         protected float myLastAspect = -1.0f; // most refined aspect ratio
    102.  
    103.         [System.NonSerialized]
    104.         protected ulong myState;
    105.  
    106.         [System.NonSerialized]
    107.         protected Texture myTexture = null;
    108.  
    109.         [System.NonSerialized]
    110.         protected Texture2D myLastCapture = null;
    111.  
    112.         [System.NonSerialized]
    113.         protected Texture2D myLastPreview = null;
    114.         [System.NonSerialized]
    115.         int myLastPreviewW;
    116.         [System.NonSerialized]
    117.         int myLastPreviewH;
    118.  
    119.         public float myFaceDetectFps;
    120.  
    121.         [System.NonSerialized]
    122.         float myFaceDetectSecsLeft;
    123.         [System.NonSerialized]
    124.         OpenCVForUnity.Rect[] myFaceRects = null;
    125.         List<UnityEngine.Rect> myFaceRectsF = null;
    126.         int myFaceDetectCascadeW;
    127.         int myFaceDetectCascadeH;
    128.  
    129.      
    130.         #if OPENCV_FACES
    131.      
    132.         [System.NonSerialized]
    133.         CascadeClassifier myCascade = null;
    134.      
    135.         #if UNITY_WEBGL && !UNITY_EDITOR
    136.         Stack<IEnumerator> myCoroutines = null;
    137.         #endif
    138.         #endif
    139.  
    140.         void Awake()
    141.         {
    142.             myLastCapture = null;
    143.             myLastPreview = null;
    144.             myLastPreviewW = -1;
    145.             myLastPreviewH = -1;
    146.             myState       = 0;
    147.             myStatus      = "STARTING";
    148.             #if UNITY_WEBGL && !UNITY_EDITOR
    149.             myCoroutines = new Stack<IEnumerator> ();
    150.             #endif
    151.  
    152.             SetState( CamStates.RecognizeFaces );
    153.             myCascade     = null;
    154.             myFaceDetectFps = 2.0f;
    155.             myFaceDetectSecsLeft = 1.0f/myFaceDetectFps;
    156.             MatOfRect faces = new MatOfRect ();                  
    157.             myFaceRects = faces.toArray ();
    158.             myFaceRectsF = new List<UnityEngine.Rect>();
    159.         }
    160.  
    161.         // Use this for initialization
    162.         void Start ()
    163.         {
    164.             //Debug.Log ("Script has been started");
    165.             SetState( CamStates.Started );
    166.             //--------------------------------------------------------------------------------------- INIT NatCam
    167.  
    168.             #if ENABLE_WEBCAM
    169.             SetState( CamStates.NeedWebCamNativeSizes );
    170.  
    171.             // See which cameras are available
    172.             int fn = 0;
    173.             int rn = 0;
    174.             for (int i = 0, len = DeviceCamera.Cameras.Length; i < len; i++) {
    175.                 // Cache the camera
    176.                 DeviceCamera camera = DeviceCamera.Cameras[i];
    177.              
    178.                 if (camera.IsFrontFacing) {
    179.                     fn++;
    180.                     print("Camera[ " + i + " ] = Front[ " + fn + " ]");
    181.                 }
    182.                 else {
    183.                     rn++;
    184.                     print("Camera[ " + i + " ] = Rear[ " + fn + " ]");
    185.                 }
    186.  
    187.                 if ((myFrontFacingFlag) && (myCameraNumber == fn )) {
    188.                     if (!NatCam.Camera)
    189.                         NatCam.Camera = camera;
    190.                    // break;
    191.                 } else if ((!myFrontFacingFlag) && (myCameraNumber == rn )) {
    192.                     if (!NatCam.Camera)
    193.                         NatCam.Camera = camera;
    194.                  //   break;
    195.                 }
    196.             }
    197.             if ((!NatCam.Camera) && (DeviceCamera.Cameras.Length>0)) {
    198.                 if (myFrontFacingFlag && (fn>0))
    199.                     NatCam.Camera = DeviceCamera.FrontCamera;
    200.                 else if ((!myFrontFacingFlag) && (rn>0))
    201.                     NatCam.Camera = DeviceCamera.RearCamera;
    202.                 if (!NatCam.Camera)
    203.                     NatCam.Camera = DeviceCamera.Cameras[0];
    204.             }
    205.             if (NatCam.Camera) {
    206.                 NatCam.Camera.SetPreviewResolution( NatCamU.Core.Resolution.Highest );
    207.                 //NatCam.Camera.SetPreviewResolution( new NatCamU.Core.Resolution( 640, 480 ) );
    208.                 // NatCam.Camera.SetPhotoResolution( NatCamU.Core.Resolution.Highest);
    209.                 NatCam.Camera.SetFramerate(60);  // usually the highest you'll ever go. clamped at framerate of application.
    210.             }
    211.  
    212.             #if EXPOSURE_SUPPORT
    213.             if (NatCam.Camera) {
    214.                 NatCam.Camera.ExposureMode = ExposureMode.Locked;
    215.             }
    216.             #else
    217.             Debug.Log("WebCam exposure control not supported");
    218.             #endif
    219.             if (NatCam.Camera) {
    220.                 NatCam.OnStart += OnStart;  // register our callback
    221.                 NatCam.OnFrame += OnFrame;  // register our callback
    222.                 NatCam.Verbose = true;
    223.                 NatCam.Play();
    224.             }
    225.             #endif
    226.  
    227.             #if OPENCV_FACES
    228.             string cascadeFn;
    229.             //cascadeFn = "haarcascade_frontalface_alt.xml";
    230.             cascadeFn = "haarcascade_frontalface_alt2.xml";
    231.             //cascadeFn = "haarcascade_frontalface_default.xml";
    232.             cascadeFn = "lbpcascade_frontalface.xml";
    233.             if (cascadeFn == "haarcascade_frontalface_alt.xml") {
    234.                 myFaceDetectCascadeW = myFaceDetectCascadeH = 20;
    235.             } else if (cascadeFn == "haarcascade_frontalface_alt2.xml") {
    236.                 myFaceDetectCascadeW = myFaceDetectCascadeH = 20;
    237.             } else if  (cascadeFn == "haarcascade_frontalface_default.xml") {
    238.                 myFaceDetectCascadeW = myFaceDetectCascadeH = 24;
    239.             } else if  (cascadeFn == "lbpcascade_frontalface.xml") {
    240.                 myFaceDetectCascadeW = myFaceDetectCascadeH = 24;
    241.             }
    242.  
    243.             // Use this for initialization
    244.             #if UNITY_WEBGL && !UNITY_EDITOR
    245.          
    246.             IEnumerator getFilePath_Coroutine;
    247.             getFilePath_Coroutine = Utils.getFilePathAsync(cascadeFn,
    248.                                                            (result) => {
    249.                                                                myCoroutines.Clear ();
    250.                                                              
    251.                                                                myCascade = new CascadeClassifier ();
    252.                                                                myCascade.load(result);
    253.                                                                if (myCascade.empty ()) {
    254.                                                                    Debug.LogError ("cascade file is not loaded.Please copy from “OpenCVForUnity/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");
    255.                                                                }
    256.                                                              
    257.                                                                Run ();
    258.                                                            },
    259.                                                            (result, progress) => {
    260.                                                                Debug.Log ("getFilePathAsync() progress : " + result + " " + Mathf.CeilToInt (progress * 100) + "%");
    261.                                                            });
    262.             myCoroutines.Push (getFilePath_Coroutine);
    263.             StartCoroutine (getFilePath_Coroutine);
    264.             #else
    265.             //cascade = new CascadeClassifier (Utils.getFilePath ("lbpcascade_frontalface.xml"));
    266.             myCascade = new CascadeClassifier ();
    267.             myCascade.load (Utils.getFilePath (cascadeFn));
    268.             print("FILE PATH: " + Utils.getFilePath (cascadeFn));
    269.             #if !UNITY_WSA_10_0
    270.             if (myCascade.empty ()) {
    271.                 Debug.LogError ("cascade file '" + cascadeFn + "' is not loaded.  Please copy from “OpenCVForUnity/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");
    272.             }
    273.             #endif
    274.  
    275.             #endif          
    276.          
    277.             #endif
    278.         }
    279.  
    280.         public Vector2 GetFocus( out bool auto )
    281.         {
    282.             Vector2 f = new Vector2( myFocusX, myFocusY );
    283.             auto = myAutoFocusFlag;
    284.             return f;
    285.         }
    286.         public void SetFocus( Vector2 normPos, bool auto )
    287.         {
    288.             myFocusX = Mathf.Clamp( normPos.x, 0.0f, 1.0f );
    289.             myFocusY = Mathf.Clamp( normPos.y, 0.0f, 1.0f );
    290.  
    291.             myAutoFocusFlag = auto;
    292.  
    293.             //------------------------------------------------------------ focus
    294.             #if FOCUS_SUPPORT
    295.             Debug.Log("  Focus Support: YES");
    296.             if (NatCam.Camera) {
    297.                 //NatCam.Camera.SetFocus( normPos ); // focus on the middle of the screen
    298.                 NatCam.Camera.FocusMode = auto?FocusMode.Autofocus:FocusMode.TapToFocus;
    299.                 NatCam.Camera.FocusPoint = normPos; // focus on the middle of the screen
    300.             }
    301.             #else
    302.             Debug.Log("  Focus Support: NO");
    303.             #endif
    304.         }
    305.  
    306.         public void AddClient( Preview client )
    307.         {
    308.             myClients.Add(client);
    309.             SetStatus( myStatus );
    310.         }
    311.  
    312.         Texture GetWebCamTexture()
    313.         {
    314.             return myTexture;
    315.         }
    316.  
    317.         void SetStatus( string msg )
    318.         {
    319.             myStatus = msg;
    320.             for (int i=0; i<myClients.Count; i++) {
    321.                 myClients[i].WebCamStatusChanged(msg);
    322.             }
    323.         }
    324.  
    325.         // called once camera starts
    326.         void OnStart()
    327.         {
    328.             if ( StateIsSetTest( CamStates.NeedWebCamNativeSizes ) ) {
    329.                 ClearState( CamStates.NeedWebCamNativeSizes );
    330.  
    331.                 SetStatus( "INITIALIZING");
    332.  
    333.  
    334.                 bool isPlaying;
    335.                 Material previewMatl;
    336.                 Texture2D tex;
    337.                 string desc,color;
    338.  
    339.                 #if ENABLE_WEBCAM    
    340.                 desc  = "NAT CAM";
    341.                 color = "green";
    342.  
    343.                 myNativeW = NatCam.Preview.width;
    344.                 myNativeH = NatCam.Preview.height;
    345.                 isPlaying = NatCam.IsPlaying;
    346.  
    347.                 myTexture = NatCam.Preview;
    348.                 //for (int i=0; i<myClients.Count; i++) {
    349.                 //    myClients[i].SetTexture(myTexture);
    350.                 //}
    351.  
    352.                 #else
    353.                 desc  = "NAT CAM";
    354.                 color = "ref";
    355.                 myNativeW = 720;
    356.                 myNativeH = 480;
    357.                 isPlaying = false;
    358.  
    359.                 ClearState( CamStates.WebCamRefinementRequested );
    360.                 SetState( CamStates.WebCamRefined );
    361.  
    362.  
    363.                 myRefinedW = 720;
    364.                 myRefinedH = 480;
    365.  
    366.                 #endif
    367.                 myLastAspect = myNativeAspect = ((float) myNativeW ) / ((float) myNativeH);
    368.                 string size = myNativeW + " x " + myNativeH;
    369.                 Debug.Log("<color="+color+">Start "+desc+"</color> "+size+" @ aspect = " + myLastAspect);
    370.                 for (int i=0; i<myClients.Count; i++) {
    371.                     myClients[i].WebCamGeomChanged();
    372.                 }
    373.              
    374.  
    375.                 /*
    376.                   print("Request Resolution: " + myScreenW + " x " + myScreenH );
    377.                   NatCam.Pause();
    378.                   NatCam.Camera.SetPreviewResolution( new NatCamU.Core.Resolution( myScreenW, myScreenH ) );
    379.                   //NatCam.Camera.SetFramerate( 10.0f );
    380.  
    381.                   //------------------------------------------------------------ focus
    382.                   #if FOCUS_SUPPORT
    383.                   Debug.Log("  Focus Support: YES");
    384.                   NatCam.Camera.SetFocus( new Vector2( 0.5f, 0.5f ) ); // focus on the middle of the screen
    385.                   #else
    386.                   Debug.Log("  Focus Support: NO");
    387.                   #endif
    388.          
    389.                   //------------------------------------------------------------ exposure
    390.                   #if EXPOSURE_SUPPORT
    391.                   Debug.Log("  Exposure Support: YES");
    392.                   float min = NatCam.Camera.MinExposureBias;
    393.                   float max = NatCam.Camera.MaxExposureBias;
    394.                   NatCam.Camera.ExposureBias = min + (max-min)/2.0f;
    395.                   #else
    396.                   Debug.Log("  Exposure Support: NO");
    397.                   #endif
    398.  
    399.                   NatCam.Play();
    400.                 */
    401.                 SetState( CamStates.WebCamRefinementRequested );
    402.                 SetStatus( "REFINING" );
    403.  
    404.             }
    405.          
    406.         }
    407.         //------------------------------------------------------------------------------------------ STATE MGMT
    408.         public void SetState( CamStates s )           { myState = myState | ((ulong) s );          }
    409.         public void SetState( CamStates s, bool val ) { if (val) SetState(s); else ClearState(s);  }
    410.         public void ClearState( CamStates s )         { myState = myState & ( ~((ulong) s ));      }
    411.         public bool StateIsSetTest( CamStates s )     { return (( myState & ((ulong) s )) != 0);   }
    412.         public bool StateIsNotSetTest( CamStates s )  { return (( myState & ((ulong) s )) == 0);   }
    413.      
    414.         void OnGUI()
    415.         {
    416.             // Make a background box
    417.             //GUI.Box(new Rect(10,10,100,90), "CamStates");
    418.          
    419.             // Make the second button.
    420.             // if(GUI.Button(new Rect(20,70,80,20), "Level 2")) { /* do something */ }
    421.          
    422.             float y=30;
    423.             float x=25;
    424.             float w=200;
    425.             float h=20;
    426.          
    427.             GUI.Toggle(new UnityEngine.Rect(x,y,w,h), StateIsSetTest(CamStates.Started), "Started"); y+= h;
    428.             GUI.Toggle(new UnityEngine.Rect(x,y,w,h), StateIsSetTest(CamStates.NeedWebCamNativeSizes), "NeedWebCamNativeSizes"); y+= h;
    429.             GUI.Toggle(new UnityEngine.Rect(x,y,w,h), StateIsSetTest(CamStates.WebCamRefinementRequested), "WebCamRefinementRequested"); y+= h;
    430.             GUI.Toggle(new UnityEngine.Rect(x,y,w,h), StateIsSetTest(CamStates.WebCamRefined), "WebCamRefined"); y+= h;
    431.             GUI.Toggle(new UnityEngine.Rect(x,y,w,h), StateIsSetTest(CamStates.DoCapture), "DoCapture"); y+= h;
    432.         }
    433.  
    434.         // callback from NatCam when the web cam has started
    435.         void OnFrame()
    436.         {
    437.             //Debug.Log("<color=red>NatCam Frame!</color>");
    438.  
    439.             // Now that we know our maximum values, request less performance-intensive settings appropriate to the
    440.             // screen real-estate we're occupying under the current layout
    441.             if ( StateIsSetTest( CamStates.WebCamRefinementRequested ) )
    442.             {
    443.                 ClearState( CamStates.WebCamRefinementRequested );
    444.                 SetState( CamStates.WebCamRefined );
    445.  
    446.                 #if ENABLE_WEBCAM
    447.                 myRefinedW = NatCam.Preview.width;
    448.                 myRefinedH = NatCam.Preview.height;
    449.                 #else
    450.                 myRefinedW = 720;
    451.                 myRefinedH = 480;
    452.                 #endif
    453.                 myLastAspect = myRefinedAspect = ((float) myRefinedW ) / ((float) myRefinedH);
    454.                 for (int i=0; i<myClients.Count; i++) {
    455.                     myClients[i].WebCamGeomChanged();
    456.                 }
    457.                 SetStatus( "ACTIVE" );
    458.  
    459.                 float zoom = 1.0f;
    460.                 #if ZOOM_SUPPORT
    461.                 zoom = NatCam.Camera?NatCam.Camera.MaxZoomRatio:1.0f;
    462.                 print("Refined WebCam: "+myRefinedW + " x " + myRefinedH + " @ "+ myRefinedAspect + ", MaxZoom " + zoom);
    463.                 #else
    464.                 print("Refined WebCam: "+myRefinedW + " x " + myRefinedH + " @ "+ myRefinedAspect + ", Zoom: Not Supported");
    465.                 #endif
    466.             }
    467.  
    468.  
    469.             #if OPENCV_FACES
    470.             if ( StateIsSetTest( CamStates.WebCamRefined ) && StateIsSetTest( CamStates.RecognizeFaces ) ) {
    471.              
    472.                 Mat matrix = new Mat();
    473.  
    474.                 if (!NatCam.PreviewMatrix(ref matrix)) return;
    475.  
    476.  
    477.                 Mat grayMat = new Mat ();
    478.                 Mat showMat = null;
    479.  
    480.                 bool  doDetect    = false;
    481.                 float detectScale = myDetectScale;
    482.  
    483.                 if (detectScale != 1.0f) {
    484.                     Imgproc.resize( matrix, grayMat, new Size(), detectScale, detectScale, Imgproc.INTER_LINEAR);
    485.                     Imgproc.cvtColor( grayMat, grayMat, Imgproc.COLOR_RGBA2GRAY );
    486.                 } else {
    487.                     Imgproc.cvtColor( matrix, grayMat, Imgproc.COLOR_RGBA2GRAY );
    488.                 }
    489.                 Imgproc.equalizeHist (grayMat, grayMat);
    490.                 OpenCVForUnity.Core.flip(grayMat,grayMat,-1); // MAT image is flipped vertically, unflip
    491.  
    492.                 showMat = grayMat;
    493.              
    494.                 if (StateIsSetTest( CamStates.DoFaceDetect ) && (myCascade != null) ) {
    495.                     doDetect = true;
    496.                     ClearState( CamStates.DoFaceDetect );
    497.  
    498.                     float dw = (float) grayMat.width();
    499.                     float dh = (float) grayMat.height();
    500.                     myDetectW = grayMat.width();
    501.                     myDetectH = grayMat.height();
    502.                  
    503.                     MatOfRect faces = new MatOfRect ();                  
    504.                     Size cascadeSize = new Size( myFaceDetectCascadeW, myFaceDetectCascadeH );
    505.                     myCascade.detectMultiScale (grayMat, faces, 1.1, 2, 1, cascadeSize, new Size ());
    506.                     myFaceRects = faces.toArray ();
    507.                     Debug.Log( "Detected " + myFaceRects.Length + " faces..." );
    508.                     myFaceRectsF.Clear();
    509.                     // normalize the rects
    510.                     for (int i = 0; i < myFaceRects.Length; i++) {
    511.                         OpenCVForUnity.Rect rect = myFaceRects[i];
    512.                         Debug.Log ( "  Face["+i+"] " + rect+" on "+myLastPreviewW+" x "+myLastPreviewH+" image" );
    513.                         myFaceRectsF.Add( new UnityEngine.Rect(rect.x / dw,
    514.                                                                rect.y / dh,
    515.                                                                rect.width / dw,
    516.                                                                rect.height / dh) );
    517.                     }
    518.  
    519.                 }
    520.  
    521.                 if (myFaceRectsF.Count > 0) {
    522.                     Scalar color  = new Scalar(255,0,0,255);
    523.                     Point startPt = new Point();
    524.                     Point endPt   = new Point();
    525.                     int thickness = 2;
    526.                     print("Drawing "+myFaceRectsF.Count+" faces");
    527.                     for (int i = 0; i < myFaceRectsF.Count; i++) {
    528.                         UnityEngine.Rect rect = myFaceRectsF[i];
    529.  
    530.                         startPt.x = rect.x * showMat.cols();
    531.                         startPt.y = rect.y * showMat.rows();
    532.                         endPt.x   = (rect.x + rect.width)  * showMat.cols();
    533.                         endPt.y   = (rect.y + rect.height) * showMat.rows();
    534.                      
    535.                         //Debug.Log ( "Face "+rect+" on "+showMat.cols()+" x "+showMat.rows()+" image" );
    536.                         Imgproc.rectangle (showMat, startPt, endPt, color, thickness );
    537.                     }
    538.                 }
    539.              
    540.                 // called when capture complete by NatCam
    541.                 bool reset = false;
    542.                 if ( (matrix.cols() != myLastPreviewW ) || (matrix.rows() != myLastPreviewH) || (!myLastPreview)) {
    543.                     if (myLastPreview) {
    544.                         Texture2D.Destroy( myLastPreview ); // Note: NatCam photo textures must be manually destroyed
    545.                     }
    546.                     myLastPreviewW = showMat.cols();
    547.                     myLastPreviewH = showMat.rows();
    548.                     myLastPreview  = new Texture2D (myLastPreviewW, myLastPreviewH, TextureFormat.RGBA32, false);
    549.                     reset = true;
    550.  
    551.                 }
    552.  
    553. //                OpenCVForUnity.Core.flip(showMat,showMat,-1);
    554.                 Utils.matToTexture2D (showMat, myLastPreview);
    555.  
    556.                     for (int i=0; i<myClients.Count; i++) {
    557.                         if (myClients[i].GetImage() != myLastPreview)
    558.                             myClients[i].SetTexture(myLastPreview);
    559.                     }
    560.  
    561.                 myLastPreview.Apply();
    562.             }
    563.  
    564.             #endif
    565.         }
    566.  
    567.         public float GetAspect()
    568.         {
    569.             return myLastAspect;
    570.         }
    571.         public float GetRefinedW()
    572.         {
    573.             return myRefinedW;
    574.         }
    575.         public float GetRefinedH()
    576.         {
    577.             return myRefinedH;
    578.         }
    579.  
    580.         public void DoCapture()
    581.         {
    582.             SetState( CamStates.DoCapture );
    583.         }
    584.  
    585.  
    586.  
    587.         // called when capture complete by NatCam
    588.         void OnCapture( Texture2D photo )
    589.         {
    590.             if (myLastCapture)
    591.             {
    592.                 Texture2D.Destroy( myLastCapture ); // Note: NatCam photo textures must be manually destroyed
    593.             }
    594.             myLastCapture = photo;
    595.  
    596.          
    597.         }
    598.  
    599.         Texture2D GetCapture( UnityEngine.Rect normCaptureRect )
    600.         {
    601.             if (!myLastCapture)
    602.                 return null;
    603.  
    604.             int photoW = myLastCapture.width;
    605.             int photoH = myLastCapture.height;
    606.  
    607.             float normX = normCaptureRect.x;
    608.             float normY = normCaptureRect.y;
    609.             float normW = normCaptureRect.width;
    610.             float normH = normCaptureRect.height;
    611.  
    612.             int capturePixelW = (int) (photoW * normW);
    613.             int capturePixelH = (int) (photoH * normH);
    614.             int capturePixelX = (int) (photoW * normX);
    615.             int capturePixelY = (int) (photoH * normY);
    616.  
    617.             Texture2D tx = new Texture2D( capturePixelW, capturePixelH );
    618.             tx.SetPixels( myLastCapture.GetPixels( capturePixelX, capturePixelY, capturePixelW, capturePixelH ) );
    619.             tx.Apply();
    620.  
    621.             string size1 = ""+myLastCapture.width + " x " + myLastCapture.height;
    622.             string size2 = ""+capturePixelW +" x "+capturePixelH;
    623.             print("CAPTURE! "+ size1 + " ==> " + size2 + " @ ( "+capturePixelX+", "+capturePixelY+" )");
    624.  
    625.             return tx;
    626.         }
    627.  
    628.  
    629.         public void SetExposure( float value )
    630.         {
    631.             myExposure = value;
    632.             #if EXPOSURE_SUPPORT
    633.             Debug.Log("<color=blue>NatCam Exposure!</color>");
    634.             #if ENABLE_WEBCAM
    635.             float min = NatCam.Camera?NatCam.Camera.MinExposureBias:0.0f;
    636.             float max = NatCam.Camera?NatCam.Camera.MaxExposureBias:1.0f;
    637.             if (NatCam.Camera) {
    638.                 NatCam.Camera.ExposureBias = min + (value*(max-min));
    639.             }
    640.             #else
    641.             Debug.Log("<color=blue>No Exposure Support.</color>");
    642.             #endif
    643.             #endif
    644.  
    645.         }
    646.         public float GetExposure()
    647.         {
    648.             return myExposure;
    649.         }
    650.  
    651.  
    652.         // Update is called once per frame
    653.         void Update ()
    654.         {
    655.  
    656.             #if ENABLE_WEBCAM
    657.             // must must be playing to update
    658.             if (!NatCam.IsPlaying) {
    659.                 // TODO: Some kind of message on screen "Web Cam can not be activated"
    660.                 return;
    661.             }
    662.             #endif
    663.  
    664.             // Capture the native web-cam resolution if it hasn't been captured
    665.             if (StateIsSetTest( CamStates.NeedWebCamNativeSizes )) {
    666.                 //Debug.Log("Need native sizes");
    667.                 return;
    668.             }
    669.  
    670.             if ( StateIsSetTest( CamStates.DoCapture ) ) {
    671.                 #if ENABLE_WEBCAM
    672.                 print("Capture Photo");
    673.                 NatCam.CapturePhoto(OnCapture);
    674.                 #else
    675.                 print("DO CAPTURE!");
    676.                 Texture2D tex = new Texture2D(2,2);
    677.                 tex.SetPixel( 0, 0, Color.black );
    678.                 tex.SetPixel( 1, 0, Color.black );
    679.                 tex.SetPixel( 0, 1, Color.black );
    680.                 tex.SetPixel( 1, 1, Color.black );
    681.                 tex.Apply(); // send to GPU
    682.                 OnCapture(tex);
    683.                 #endif    
    684.                 ClearState( CamStates.DoCapture );
    685.             }
    686.          
    687.             if (StateIsSetTest( CamStates.RecognizeFaces ) ) {
    688.                 myFaceDetectSecsLeft -= Time.deltaTime;
    689.                 if (myFaceDetectSecsLeft <= 0.0f) {
    690.                     myFaceDetectSecsLeft = 1.0f/myFaceDetectFps;
    691.                     SetState( CamStates.DoFaceDetect );
    692.                 }
    693.             }
    694.         }
    695.     }
    696.  
    697.  
    698. }
    699.  
     
  27. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You seem to be using an older version of NatCam (I see you use NatCamBehaviour, which has been deprecated). Make sure to upgrade to 2.0f1. Also, you can significantly clean up your code: you don't need the FOCUS/ZOOM/EXPOSURE_SUPPORT definitions. Just code as if they were all supported, because if a feature isn't supported, NatCam will simply log that to the console (which you are currently doing manually).
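    For example, the exposure/focus handling in your Start and SetFocus methods can collapse to something like this, with no preprocessor switches (a sketch using only the properties your script already calls; on platforms that don't support a feature, NatCam logs it and ignores the call):
    Code (CSharp):
    1. if (NatCam.Camera) {
    2.     NatCam.Camera.ExposureMode = ExposureMode.Locked;
    3.     NatCam.Camera.FocusMode    = myAutoFocusFlag ? FocusMode.Autofocus : FocusMode.TapToFocus;
    4.     NatCam.Camera.FocusPoint   = new Vector2(myFocusX, myFocusY);
    5. }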
     
  28. Airship

    Airship

    Joined:
    Sep 10, 2011
    Posts:
    260
    Hi, I asked earlier about checking whether camera permission was granted, to prevent iOS from crashing. I wrapped my code in the check below, but now the permission is never requested on new devices. Is there a way to ask for camera permission if the check below returns false, without subsequently crashing if the user manually disables camera permission?

    Code (CSharp):
    1.  if (NatCam.Implementation.HasPermissions) {
    2.     ...
    3. }
     
  29. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    This is because when NatCam.Play is called, the iOS camera SDK implicitly requests the camera permission if the permission has never been requested before. To better handle this, explicitly request permissions using `Application.RequestUserAuthorization` and check for the permission using `NatCam.Implementation.HasPermissions`.
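    A minimal sketch of that flow (UserAuthorization.WebCam is Unity's camera permission token; NatCam.Implementation.HasPermissions is the same check from your snippet):
    Code (CSharp):
    1. using System.Collections;
    2. using UnityEngine;
    3. using NatCamU.Core;
    4.  
    5. IEnumerator StartCameraWithPermission () {
    6.     // Explicitly prompt for camera access (shows the system dialog on first run)
    7.     yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
    8.     if (NatCam.Implementation.HasPermissions) {
    9.         NatCam.Play();
    10.     } else {
    11.         Debug.LogWarning("Camera permission denied; not starting the preview.");
    12.         // e.g. show UI directing the user to the device Settings app
    13.     }
    14. }
    Kick it off with StartCoroutine(StartCameraWithPermission()) before making any other NatCam calls.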
     
  30. Airship

    Airship

    Joined:
    Sep 10, 2011
    Posts:
    260
    Thanks for your fast response. Does Application.RequestUserAuthorization actually work for non web-players? Aren't you requesting the authorization in the iOS plugin? Can that be done from Unity natively?
     
  31. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Yes, it works on other platforms, including iOS.
    Not at all. This happens automatically on iOS (on the first app run).
     
  32. Airship

    Airship

    Joined:
    Sep 10, 2011
    Posts:
    260
    Got it, thanks again!
     
    Lanre likes this.
  33. YHS

    YHS

    Joined:
    Aug 10, 2014
    Posts:
    31
    Hi, I built your "MiniCam" example on iOS and found that toggleFlash is not working with the rear camera. It always stays in "auto" mode. Please check it. Thanks.
     
  34. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hello Lanre, for the sake of the submission I fixed this by changing how the initialisation and release of NatCam is done in the scene, and by adding more loading indicators when starting the camera and capturing photos, to stop input mashing from breaking the app. Also, multitouch on UI buttons in the same frame was causing the initialisation or release to be triggered twice, which crashed the app.
     
    Lanre likes this.
  35. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hello @Lanre , in my recently released app, Crashlytics reported that NatCam crashed on a Sony Xperia X with Android 8, with the following log:
    Fatal Exception: java.lang.NullPointerException: Attempt to invoke virtual method 'android.graphics.SurfaceTexture com.yusufolokoba.natcam.rendering.SurfaceTextureRenderContext.surfaceTexture()' on a null object reference
    at com.yusufolokoba.natcam.NatCam$3.run(NatCam.java:134)
    at android.os.Handler.handleCallback(Handler.java:789)
    at android.os.Handler.dispatchMessage(Handler.java:98)
    at android.os.Looper.loop(Looper.java:251)
    at android.os.HandlerThread.run(HandlerThread.java:65)


    A full log with the state of the other threads on the device is also attached. Do you know what may cause this? Or are you aware of this issue?
     

    Attached Files:

  36. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hi @Lanre , I am receiving more crash reports due to NatCam. These crashes have been reported only once per device, not constantly. The crash reports, with the logs of the other threads, are attached in case they help.

    Huawei P9 Lite, Nokia 6, OnePlus 6:

    Fatal Exception: java.lang.RuntimeException: startPreview failed
    at android.hardware.Camera.startPreview(Camera.java)
    at com.yusufolokoba.natcam.NatCamDevice.play(NatCamDevice.java:97)
    at com.yusufolokoba.natcam.NatCam$3.run(NatCam.java:135)
    at android.os.Handler.handleCallback(Handler.java:761)
    at android.os.Handler.dispatchMessage(Handler.java:98)
    at android.os.Looper.loop(Looper.java:156)
    at android.os.HandlerThread.run(HandlerThread.java:61)


    OnePlus One:

    Fatal Exception: android.renderscript.RSRuntimeException: Loading of ScriptC script failed.
    at android.renderscript.ScriptC.<init>(ScriptC.java:63)
    at com.yusufolokoba.natcam.ScriptC_photo.<init>(ScriptC_photo.java:42)
    at com.yusufolokoba.natcam.ScriptC_photo.<init>(ScriptC_photo.java:34)
    at com.yusufolokoba.natcam.Photo.<init>(Photo.java:43)
    at com.yusufolokoba.natcam.NatCam.onPictureTaken(NatCam.java:186)
    at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1118)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:148)
    at android.os.HandlerThread.run(HandlerThread.java:61)

    Can you give us some insight on these crashes?
     

    Attached Files:

  37. YHS

    YHS

    Joined:
    Aug 10, 2014
    Posts:
    31
    @Lanre seems to have disappeared...
     
  38. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    What device did you test on?
     
  39. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    We have made significant changes that fix issues like this and issues relating to starting and releasing NatCam several times. I will push the update after final testing tomorrow.
     
  40. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Can you send me the full logs? This might be related to permissions.
    This is specific to the device. I'll need to see the full logs to figure out what exactly is going on.
     
  41. kennyallau

    kennyallau

    Joined:
    Aug 5, 2014
    Posts:
    8
    Hi @Lanre

    Quick question: How do I turn off the photo capture sound?
     
  42. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hi Lanre, unfortunately I am not able to retrieve the full logs, as these crashes were triggered by end users and I received the reports through Crashlytics. The best I have is the report's log and the state of the other threads at the time, as attached in my last post.

    It may be that the users are doing weird things, such as disabling the permissions while the Android app is in the background and then reopening it when it no longer has camera permission. The app actively checks for permissions when the user requests camera usage.
     
  43. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    You can't. This is controlled by the operating system, out of reach of the exposed API.
     
  44. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    The crash on the P9 Lite, Nokia 6, and OnePlus 6 looks like a device-specific issue. It is usually something out of reach of NatCam. The crash on the OnePlus One is definitely device-specific, because we have not reproduced it or gotten reports of it from any other device. Without the full logs, it's impossible to tell what exactly is going on. Can you get your hands on any of these devices? On the Xperia X, the crash will be fixed by the changes we have made.
     
    henriqueranj likes this.
  45. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hi @Lanre , thanks for the insights. What do you mean by issues out of reach of NatCam? Could you elaborate a bit more on each? With that information I could think of ways to avoid certain interactions/behaviours with NatCam so it does not crash or fail in the app.

    Unfortunately I won't get my hands on these devices. The app is currently released and I am trying to assess these crashes for future reference.
     
  46. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    NatCam uses the native Android camera API (android.hardware.camera) when running on Android. The API exposes a number of methods. The internal implementation of these methods depends on the device as it is implemented by the device vendor/OEM. When the internal implementation has a constraint that is not included in the Android spec, then the OEM implementation might throw an exception. Such a detail is out of reach of NatCam. We only see issues like this on less-mainstream manufacturers. Manufacturers like Samsung adhere pretty strictly to the Android spec, so these issues don't exist.
    You did mention that these exceptions were raised only once per device. I don't think this issue is fatal. I will see what I can find.
     
    henriqueranj likes this.
  47. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Thank you for the explanation.

    I am seeing some of these issues now with Samsungs too. However, now that we have more users, these crashes amount to 100 out of ~8000 Android users. I will send you more information via email later. As we won't fix these issues for our current released app, I believe these can help you mature and improve the stability of Natcam/Natcorder.
     
  48. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    What Samsung devices?
    Please do. I would like to do whatever I can to resolve this. Thankfully, it doesn't affect a large proportion.
     
    henriqueranj likes this.
  49. henriqueranj

    henriqueranj

    Joined:
    Feb 18, 2016
    Posts:
    177
    Hi @Lanre , I'll get back to you on this by Friday. Will compile the Crashlytics reports and send over to you.
     
    Lanre likes this.
  50. SniperED007

    SniperED007

    Joined:
    Sep 29, 2013
    Posts:
    345
    Just bought this and tried it on my Galaxy S8+, and I have two issues already:
    1) It doesn't seem to work at all with Linear lighting.
    2) (After changing back to Gamma) it runs really slowly, at what looks like about 15 FPS, using the MiniCam sample (the default WebCamTexture I was using before runs MUCH faster). On a Galaxy S7 it was more like 5 FPS.

    Are those expected results?