
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
There is currently no such function.
To detect multiple pattern images, I think the code would need to be changed significantly.
MarkerLessARExample is based on https://github.com/MasteringOpenCV/code/tree/master/Chapter3_MarkerlessAR. Please refer to MasteringOpenCV for the details of the algorithm.
     
  2. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
1. There is no such plan at the moment.
2. Since this package is a clone of OpenCV Java, you are able to use the same API as OpenCV Java 3.3.1. For that reason, "OpenCV for Unity" does not support the OpenCL module.
3. It is possible to combine YOLO with the VideoCapture class.
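For example, a rough sketch of that combination (the cfg/weights file names, the 416x416 input size, and the 1/255 scale are assumptions from the standard Darknet YOLO setup, not files shipped with this asset):
Code (CSharp):
// Hypothetical sketch: run YOLO (dnn module) on frames read by VideoCapture.
// "yolo.cfg" / "yolo.weights" and "sample.mp4" are assumed to be in StreamingAssets.
VideoCapture capture = new VideoCapture ();
capture.open (Utils.getFilePath ("sample.mp4"));
Net net = Dnn.readNetFromDarknet (Utils.getFilePath ("yolo.cfg"), Utils.getFilePath ("yolo.weights"));

Mat frame = new Mat ();
while (capture.read (frame)) {
    // Convert the frame to a 4D blob and run a forward pass.
    Mat blob = Dnn.blobFromImage (frame, 1 / 255.0, new Size (416, 416), new Scalar (0), false);
    net.setInput (blob);
    Mat detections = net.forward ();
    // ... each row of "detections" holds center x/y, w, h, confidence, and class scores ...
    blob.Dispose ();
}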
     
  3. kimot94

    kimot94

    Joined:
    Dec 26, 2017
    Posts:
    1
    Hi, first of all thanks a lot for this plugin.

I have a project on image processing. The function is to capture an image, threshold it, and count the number of pixels in the thresholded image. Does anybody know how to count the pixels (the number of black pixels, or white pixels, or maybe both)? Please help me!
     
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java. http://enoxsoftware.github.io/OpenCVForUnity/3.0.0/doc/html/index.html
If there is an implementation example using "OpenCV Java", I think it can also be implemented using "OpenCV for Unity".
http://answers.opencv.org/question/8354/android-counting-white-pixels-in-binary-mat/
1. Imgproc.threshold(); (see ThresholdExample)
2. Core.countNonZero();
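Putting those two together, a minimal sketch (grayMat is assumed to be a CV_8UC1 Mat; the threshold value 127 is just an example):
Code (CSharp):
// Binarize the image, then count white and black pixels.
Mat binaryMat = new Mat ();
Imgproc.threshold (grayMat, binaryMat, 127, 255, Imgproc.THRESH_BINARY);

int whitePixels = Core.countNonZero (binaryMat);          // non-zero pixels are white
int blackPixels = (int)binaryMat.total () - whitePixels;  // the remainder are black
Debug.Log ("white: " + whitePixels + " black: " + blackPixels);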
     
  5. JagmeetS

    JagmeetS

    Joined:
    Jan 2, 2018
    Posts:
    1
    Hi Enox,

    Season's Greetings!

I am facing a very strange crash issue. I am trying to detect an object: I convert the Mat to a gray Mat and then detect the contours. In my scene I need to do this on the feed from 2 or more cameras.

When I do this for a single camera it works very well. When I do it for more than one camera, Unity / the application crashes with an
    Access Violation (0xc0000005)
    in module opencvforunity.dll at 0033:

    The error.log message is:
    Unity Player [version: Unity 5.6.1f1_2860b30f0b54]

    opencvforunity.dll caused an Access Violation (0xc0000005)
    in module opencvforunity.dll at 0033:7a521f0e.

    Error occurred at 2017-12-11_153136.
    E:\EXE6\Exe.exe, run by FSS02.
    56% memory in use.
    8093 MB physical memory [3480 MB free].
    12341 MB paging file [5357 MB free].
    134217728 MB user address space [134217038 MB free].
    Read from location 00000000 caused an access violation.

My output_log.txt points to:
    0x00007FF949731F0E (opencvforunity) xphoto_SimpleWB_setP_10

    0x00007FF949A370F5 (opencvforunity) xphoto_SimpleWB_setP_10

    0x00007FF949582211 (opencvforunity) imgproc_Imgproc_cvtColor_11

    0x00000000060C6738 (Mono JIT Code) (wrapper managed-to-native) OpenCVForUnity.Imgproc:imgproc_Imgproc_cvtColor_11 (intptr,intptr,int)


    Please help,
     
  6. upressplay

    upressplay

    Joined:
    Aug 8, 2017
    Posts:
    24
We downloaded your ARHeadWithFaceMaskExample and have been having a hard time getting the desired performance and recording results.

Is there an easy way that you know of to combine both AR props and masks into one camera?

We've been circling this issue for a couple of days, trying a variety of mathematical setups to sync both experiences.

We are getting a variety of issues trying to record a combined RenderTexture, along with performance issues related to the internal buffer streams of the Unity Camera and WebCamTexture.

We think having one camera and one RenderTexture would eliminate the glitches and missed frames from the AR props camera.

    Thoughts?
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Thank you very much for reporting.
    Could you send me the code that caused the error?
    https://enoxsoftware.com/opencvforunity/contact/technical-inquiry/
     
  8. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
It is probably difficult to combine both AR props and masks into one camera.
     
  9. luigis

    luigis

    Joined:
    Oct 30, 2013
    Posts:
    25
Hi everyone,
Is there the possibility of using other video codecs instead of Motion JPEG? Can I use H.264, for instance? Thanks
     
  10. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
I succeeded in playing video files other than MJPEG format on Windows.
1) Download "OpenCV for Windows Version 3.3.1" (http://opencv.org/downloads.html).
2) Add the folder containing "opencv_ffmpeg331.dll" or "opencv_ffmpeg331_64.dll" to the PATH variable:
if 32-bit, "\path\to\opencv\build\x86\vc14\bin\";
if 64-bit, "\path\to\opencv\build\x64\vc14\bin\".
Or
2) Copy the dll to the project folder.


In my environment (Windows), these files worked correctly with VideoCaptureExample.
    http://www.gomplayer.jp/img/sample/mp4_h264_aac.mp4
    http://www.gomplayer.jp/img/sample/mp4_mpeg4_aac.mp4
    http://www.gomplayer.jp/img/sample/mov_h264_aac.mov
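Once the ffmpeg dll is in place, opening such a file works the same way as in VideoCaptureExample; a minimal check (the file name assumes you copied one of the samples above into "Assets/StreamingAssets"):
Code (CSharp):
// Minimal check that the H.264 file can be decoded.
VideoCapture capture = new VideoCapture ();
capture.open (Utils.getFilePath ("mp4_h264_aac.mp4"));
if (!capture.isOpened ())
    Debug.LogError ("Could not open the file. Is opencv_ffmpeg331_64.dll on the PATH?");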
     
  11. JasonWild

    JasonWild

    Joined:
    Jan 9, 2014
    Posts:
    14
First off, great product!!
I have noticed in the MarkerBasedARExample that the gameobject vibrates and shakes a lot on the detected marker. Is there anywhere I can change some settings or code to reduce the shaking gameobject?

    Thanks!!
     
  12. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
There are no such settings or code. I think the vibration can be alleviated by applying a filter that removes noise, for example a Kalman filter:
    http://opencvexamples.blogspot.com/2014/01/kalman-filter-implementation-tracking.html
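A simpler alternative to a full Kalman filter is to low-pass filter the estimated pose; a rough sketch (arGameObject, rawPosition, and rawRotation stand for your AR object and the pose computed from the detected marker each frame, and 0.5f is an arbitrary smoothing factor):
Code (CSharp):
// Hypothetical exponential smoothing of the AR object's pose.
// Lower smoothing values give a steadier but laggier result.
float smoothing = 0.5f;
arGameObject.transform.localPosition =
    Vector3.Lerp (arGameObject.transform.localPosition, rawPosition, smoothing);
arGameObject.transform.localRotation =
    Quaternion.Slerp (arGameObject.transform.localRotation, rawRotation, smoothing);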
     
  13. dienat

    dienat

    Joined:
    May 27, 2016
    Posts:
    417
I want a gameobject to be created at a marker's position once the marker is detected, and to stay there even when I am no longer seeing the marker; for instance, a big gameobject of which part can still be seen even though the marker is out of view. Can that be done with OpenCV?
     
  14. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
That functionality is not supported by the Aruco module or MarkerBasedARExample.
I think it would be necessary to use a gyro sensor and a depth sensor.
     
  15. kilroyone

    kilroyone

    Joined:
    Sep 18, 2014
    Posts:
    4
I get errors when building an ipa file, even with an empty project containing only the plugin.

    bitcode enabled
    211112.jpg

    bitcode disabled
    2222122.jpg

    Any ideas?
     
  16. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
Please see the Q&A section of ReadMe.pdf.

[iOS] Submit to App Store issues: Unsupported Architecture x86, i386. "Unsupported Architecture. Your executable contains unsupported architecture '[x86_64, i386]'."
"The problem is that the Buy framework contains a build for both the simulator (x86_64) and the actual devices (ARM).
Of course, you aren't allowed to submit to the App Store a binary for an unsupported architecture, so the solution is to "manually" remove the unneeded architectures from the final binary, before submitting it." http://ioscake.com/submit-to-app-store-issues-unsupported-architecture-x86.html

Please add the script from this page to Build Phases -> Run Script.
    http://ikennd.ac/blog/2015/02/stripping-unwanted-architectures-from-dynamic-libraries-in-xcode/

    remove_embedd_framework0.png
     
  17. kilroyone

    kilroyone

    Joined:
    Sep 18, 2014
    Posts:
    4
Thank you, but it did not help.

I always get different errors.

The only thing that helped me was downgrading the plugin version.
     
  18. Avani22

    Avani22

    Joined:
    Sep 2, 2017
    Posts:
    3
I wrote simple code for SVM classification using OpenCVForUnity based on OpenCV 3.0.0 beta 7. The SVM gets trained, but Unity crashes while saving the model.
     

    Attached Files:

  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Could you tell me about your test environment?
    OpenCV for Unity version :
    Unity version :
    Xcode version :
     
  20. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
The Algorithm.save() method seems to have a bug in OpenCV 3.0.0.
    https://github.com/Itseez/opencv/issues/5894

    The latest version(2.2.5) of OpenCVForUnity is based on OpenCV 3.3.1, so this bug has been fixed.
    Please use the latest version of OpenCVForUnity.
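For reference, a minimal train-and-save sketch against the current API (trainingData, a CV_32FC1 Mat with one sample per row, and labels, a CV_32SC1 Mat, are assumed to already exist; the output path is only an example):
Code (CSharp):
// Train a linear SVM and save the model to disk.
SVM svm = SVM.create ();
svm.setType (SVM.C_SVC);
svm.setKernel (SVM.LINEAR);
svm.train (trainingData, Ml.ROW_SAMPLE, labels);
svm.save (Application.persistentDataPath + "/svm_model.xml");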
     
  21. Avani22

    Avani22

    Joined:
    Sep 2, 2017
    Posts:
    3
Hello Sir, I am a college student working on a project in Unity. Our college is sponsoring this project to buy the OpenCVForUnity 2.2.5 plugin, so I just want to confirm whether this version works fine with Unity 2017.3 for image processing, CNNs, and SVMs. Also, what is the licensing period for OpenCVForUnity 2.2.5? Waiting for your quick reply.
     
  22. jojoh

    jojoh

    Joined:
    Jan 22, 2013
    Posts:
    15
Hi there, thanks for the plugin.

I'm trying to do continuous blob tracking on a depth-image picture.

When I run this code for about 5 minutes I get the error below; do you have an idea what might be going wrong?

    opencverror.png
Code (csharp):
inputTexture = test.GetComponent<RawImage>().texture;

if (inputTexture2D == null) inputTexture2D = new Texture2D(inputTexture.width, inputTexture.height);

Utils.textureToTexture2D(inputTexture, inputTexture2D);

Mat tempMat = new Mat(inputTexture.height, inputTexture.width, CvType.CV_8UC1);

Utils.texture2DToMat(inputTexture2D, tempMat);

tempMat = tempMat.submat(new OpenCVForUnity.Rect(dsm.mapLeft, (512 - dsm.mapTop), (dsm.mapRight - dsm.mapLeft), (dsm.mapTop - dsm.mapBot)));

Imgproc.blur(tempMat, tempMat, new Size(10, 10));
Imgproc.resize(tempMat, tempMat, new Size((dsm.mapRight - dsm.mapLeft) / 2, (dsm.mapTop - dsm.mapBot) / 2));
Imgproc.threshold(tempMat, tempMat, thresh, max, stylo);

//Mat outMat = new Mat(tempMat.rows(), tempMat.cols(), tempMat.type());

invertcolormatrix = new Mat(tempMat.rows(), tempMat.cols(), tempMat.type(), new Scalar(255, 255, 255));

Core.subtract(invertcolormatrix, tempMat, tempMat);

System.Collections.Generic.List<MatOfPoint> contours = new System.Collections.Generic.List<MatOfPoint>();

hierarchy = new Mat();

Imgproc.findContours(tempMat, contours, hierarchy, Imgproc.RETR_TREE, Imgproc.CHAIN_APPROX_NONE);

for (int i = 0; i < contours.Count; i++)
{
    if (Imgproc.contourArea(contours[i]) > 50)
    {
        OpenCVForUnity.Rect rect = Imgproc.boundingRect(contours[i]);

        if (rect.height > 80)
        {
            // Debug.Log(i + " has " + contours[i].toList().Count.ToString() + " points");

            Imgproc.drawContours(tempMat, contours, i, new Scalar(255, 0, 0), -1);
        }
    }
}

foreach (var item in contours)
{
    item.Dispose();
}
contours.Clear();
hierarchy.Dispose();

outputTexture = new Texture2D(tempMat.cols(), tempMat.rows(), TextureFormat.RGBA32, false);

Utils.matToTexture2D(tempMat, outputTexture);

gameObject.GetComponent<Renderer>().material.mainTexture = outputTexture;
     
    Last edited: Feb 5, 2018
  23. Deleted User

    Deleted User

    Guest

    Hello,

I hope you are having a good day. I'm really new to OpenCV and I'm trying to make a face detection application on HoloLens. I can detect the face using a picture, and I can even show the picture and the detected faces. My problem is that I would like to get the coordinates of the detected face rectangle, save them in a Vector3, and use that to place a game object. But I can't find the coordinates of the face rectangle, since it is in a matrix. I was going through the posts here and found a piece of code that you posted, but I can't get the coordinates. Either I'm being dumb or I'm missing something. The code that I found from you was the following:

Code (CSharp):
MatOfRect faces = new MatOfRect ();

if (cascade != null)
    cascade.detectMultiScale (grayMat, faces, 1.1, 2, 2,
                              new Size (20, 20), new Size ());
OpenCVForUnity.Rect[] rects = faces.toArray ();
for (int i = 0; i < rects.Length; i++) {
    Debug.Log ("detect faces " + rects [i]);
    Core.rectangle (imgMat, new Point (rects [i].x, rects [i].y), new Point (rects [i].x + rects [i].width, rects [i].y + rects [i].height), new Scalar (255, 0, 0, 255), 2);
}
     
  24. Deleted User

    Deleted User

    Guest

    I have already changed the Core.rectangle to Imgproc.rectangle.
    I forgot to mention that.
     
  25. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Does this error occur even if you exclude OpenCVforUnity?
    https://forum.unity.com/threads/d3d11-failed-to-create-2d-texture.392286/
     
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Please try HoloLensFaceDetectionOverlayExample.
    https://github.com/EnoxSoftware/HoloLensWithOpenCVForUnityExample
     
  27. Deleted User

    Deleted User

    Guest

I'm actually taking help from there, but I can't seem to find the coordinates that I want. It detects the face and draws the rectangle, but I can't find the coordinates of the rectangle.
     
  28. Deleted User

    Deleted User

    Guest

One more thing: I'm using the following example as my reference. But even in the example itself I can't get the coordinates.
    https://github.com/EnoxSoftware/HoloLensWithDlibFaceLandmarkDetectorExample
     
  29. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
HoloLensWithOpenCVForUnityExample detects a face in the image of the RGB camera, so in this example 3D coordinates cannot be acquired.
To acquire 3D coordinates, you probably need to use Spatial Mapping.
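The 2D pixel coordinates, at least, are available directly on each detected rect; a minimal sketch continuing from the detection snippet above (mapping these pixel values into Unity world space is an extra step that depends on how the camera image is displayed):
Code (CSharp):
// Each detected face rect exposes its pixel coordinates directly.
OpenCVForUnity.Rect[] rects = faces.toArray ();
foreach (OpenCVForUnity.Rect r in rects) {
    Vector3 faceCenterPixels = new Vector3 (r.x + r.width / 2f, r.y + r.height / 2f, 0f);
    Debug.Log ("face center (pixels): " + faceCenterPixels);
}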
     
  31. jojoh

    jojoh

    Joined:
    Jan 22, 2013
    Posts:
    15
The 2017.3 error; sorry for the poor quality.
     

    Attached Files:

  32. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Thank you very much for reporting.
    Could you email me the code you tested? store@enoxsoftware.com
     
  33. Joy0023

    Joy0023

    Joined:
    Oct 30, 2017
    Posts:
    4
Hello, I have a question: how can I attach the head in FaceTrackerARExample not just to one face but to all tracked faces?
I commented out line 483 of the script FaceTrackerARExample.cs,
// | Objdetect.CASCADE_FIND_BIGGEST_OBJECT
to find all faces, but I can't figure out how to duplicate the head for all of them.
    Any help? :D
     
  34. yumianhuli1

    yumianhuli1

    Joined:
    Mar 14, 2015
    Posts:
    92
1. Hello! Can I get the Mat address (as a numerical value) in Unity by using your plugin?
2. If another wrapper program wants to access the Mat (image) of the original program and imshow it, is there a way to do this?
     
    Last edited: Feb 7, 2018
  35. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3.  
    4. using System.Collections.Generic;
    5. using UnityEngine.UI;
    6.  
    7. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    8. using UnityEngine.SceneManagement;
    9. #endif
    10. using OpenCVForUnity;
    11. using OpenCVFaceTracker;
    12.  
    13. namespace FaceTrackerExample
    14. {
    15.     /// <summary>
    16.     /// Face tracker AR example.
17.     /// This example refers to http://www.morethantechnical.com/2012/10/17/head-pose-estimation-with-opencv-opengl-revisited-w-code/
18.     /// and uses the effect asset from http://ktk-kumamoto.hatenablog.com/entry/2014/09/14/092400
    19.     /// </summary>
    20.     [RequireComponent(typeof(WebCamTextureToMatHelper))]
    21.     public class FaceTrackerARExample : MonoBehaviour
    22.     {
    23.  
    24.         /// <summary>
    25.         /// The should draw face points.
    26.         /// </summary>
    27.         public bool isShowingFacePoints;
    28.  
    29.         /// <summary>
    30.         /// The is showing face points toggle.
    31.         /// </summary>
    32.         public Toggle isShowingFacePointsToggle;
    33.        
    34.         /// <summary>
    35.         /// The should draw axes.
    36.         /// </summary>
    37.         public bool isShowingAxes;
    38.  
    39.         /// <summary>
    40.         /// The is showing axes toggle.
    41.         /// </summary>
    42.         public Toggle isShowingAxesToggle;
    43.        
    44.         /// <summary>
    45.         /// The should draw head.
    46.         /// </summary>
    47.         public bool isShowingHead;
    48.  
    49.         /// <summary>
    50.         /// The is showing head toggle.
    51.         /// </summary>
    52.         public Toggle isShowingHeadToggle;
    53.        
    54.         /// <summary>
    55.         /// The should draw effects.
    56.         /// </summary>
    57.         public bool isShowingEffects;
    58.  
    59.         /// <summary>
    60.         /// The is showing effects toggle.
    61.         /// </summary>
    62.         public Toggle isShowingEffectsToggle;
    63.  
    64.         /// <summary>
65.         /// The auto reset mode. If true, the face is tracked only when a face is detected in each frame.
    66.         /// </summary>
    67.         public bool isAutoResetMode;
    68.  
    69.         /// <summary>
    70.         /// The auto reset mode toggle.
    71.         /// </summary>
    72.         public Toggle isAutoResetModeToggle;
    73.        
    74.         /// <summary>
    75.         /// The axes.
    76.         /// </summary>
    77.         public GameObject axes;
    78.        
    79.         /// <summary>
    80.         /// The head.
    81.         /// </summary>
    82.         public GameObject head;
    83.        
    84.         /// <summary>
    85.         /// The right eye.
    86.         /// </summary>
    87.         public GameObject rightEye;
    88.        
    89.         /// <summary>
    90.         /// The left eye.
    91.         /// </summary>
    92.         public GameObject leftEye;
    93.        
    94.         /// <summary>
    95.         /// The mouth.
    96.         /// </summary>
    97.         public GameObject mouth;
    98.        
    99.         /// <summary>
    100.         /// The rvec noise filter range.
    101.         /// </summary>
    102.         [Range(0, 50)]
    103.         public float
    104.             rvecNoiseFilterRange = 8;
    105.        
    106.         /// <summary>
    107.         /// The tvec noise filter range.
    108.         /// </summary>
    109.         [Range(0, 360)]
    110.         public float
    111.             tvecNoiseFilterRange = 90;
    112.        
    113.         /// <summary>
    114.         /// The gray mat.
    115.         /// </summary>
    116.         Mat grayMat;
    117.        
    118.         /// <summary>
    119.         /// The texture.
    120.         /// </summary>
    121.         Texture2D texture;
    122.        
    123.         /// <summary>
    124.         /// The cascade.
    125.         /// </summary>
    126.         CascadeClassifier cascade;
    127.        
    128.         /// <summary>
    129.         /// The face tracker.
    130.         /// </summary>
    131.         FaceTracker faceTracker;
    132.        
    133.         /// <summary>
    134.         /// The face tracker parameters.
    135.         /// </summary>
    136.         FaceTrackerParams faceTrackerParams;
    137.        
    138.         /// <summary>
    139.         /// The AR camera.
    140.         /// </summary>
    141.         public Camera ARCamera;
    142.        
    143.         /// <summary>
    144.         /// The cam matrix.
    145.         /// </summary>
    146.         Mat camMatrix;
    147.        
    148.         /// <summary>
    149.         /// The dist coeffs.
    150.         /// </summary>
    151.         MatOfDouble distCoeffs;
    152.        
    153.         /// <summary>
    154.         /// The invert Y.
    155.         /// </summary>
    156.         Matrix4x4 invertYM;
    157.        
    158.         /// <summary>
    159.         /// The transformation m.
    160.         /// </summary>
    161.         Matrix4x4 transformationM = new Matrix4x4();
    162.        
    163.         /// <summary>
    164.         /// The invert Z.
    165.         /// </summary>
    166.         Matrix4x4 invertZM;
    167.        
    168.         /// <summary>
    169.         /// The ar m.
    170.         /// </summary>
    171.         Matrix4x4 ARM;
    172.  
    173.         /// <summary>
    174.         /// The ar game object.
    175.         /// </summary>
    176.         public GameObject[] ARGameObject;
    177.  
    178.         /// <summary>
    179.         /// The should move AR camera.
    180.         /// </summary>
    181.         public bool shouldMoveARCamera;
    182.        
    183.         /// <summary>
    184.         /// The 3d face object points.
    185.         /// </summary>
    186.         MatOfPoint3f objectPoints;
    187.        
    188.         /// <summary>
    189.         /// The image points.
    190.         /// </summary>
    191.         MatOfPoint2f imagePoints;
    192.        
    193.         /// <summary>
    194.         /// The rvec.
    195.         /// </summary>
    196.         Mat rvec;
    197.        
    198.         /// <summary>
    199.         /// The tvec.
    200.         /// </summary>
    201.         Mat tvec;
    202.        
    203.         /// <summary>
    204.         /// The rot m.
    205.         /// </summary>
    206.         Mat rotM;
    207.        
    208.         /// <summary>
    209.         /// The old rvec.
    210.         /// </summary>
    211.         Mat[] oldRvec;
    212.        
    213.         /// <summary>
    214.         /// The old tvec.
    215.         /// </summary>
    216.         Mat[] oldTvec;
    217.  
    218.         /// <summary>
    219.         /// The web cam texture to mat helper.
    220.         /// </summary>
    221.         WebCamTextureToMatHelper webCamTextureToMatHelper;
    222.  
    223.         /// <summary>
    224.         /// The tracker_model_json_filepath.
    225.         /// </summary>
    226.         private string tracker_model_json_filepath;
    227.        
    228.         /// <summary>
    229.         /// The haarcascade_frontalface_alt_xml_filepath.
    230.         /// </summary>
    231.         private string haarcascade_frontalface_alt_xml_filepath;
    232.  
    233.  
    234.         // Use this for initialization
    235.         void Start()
    236.         {
    237.             webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper>();
    238.  
    239.  
    240.             isShowingFacePointsToggle.isOn = isShowingFacePoints;
    241.             isShowingAxesToggle.isOn = isShowingAxes;
    242.             isShowingHeadToggle.isOn = isShowingHead;
    243.             isShowingEffectsToggle.isOn = isShowingEffects;
    244.             isAutoResetModeToggle.isOn = isAutoResetMode;
    245.  
    246.             #if UNITY_WEBGL && !UNITY_EDITOR
    247.             StartCoroutine(getFilePathCoroutine());
    248.             #else
    249.             tracker_model_json_filepath = Utils.getFilePath("tracker_model.json");
    250.             haarcascade_frontalface_alt_xml_filepath = Utils.getFilePath("haarcascade_frontalface_alt.xml");
    251.             Run();
    252.             #endif
    253.            
    254.         }
    255.  
    256.         #if UNITY_WEBGL && !UNITY_EDITOR
    257.         private IEnumerator getFilePathCoroutine()
    258.         {
    259.             var getFilePathAsync_0_Coroutine = StartCoroutine(Utils.getFilePathAsync("tracker_model.json", (result) => {
    260.                 tracker_model_json_filepath = result;
    261.             }));
    262.             var getFilePathAsync_1_Coroutine = StartCoroutine(Utils.getFilePathAsync("haarcascade_frontalface_alt.xml", (result) => {
    263.                 haarcascade_frontalface_alt_xml_filepath = result;
    264.             }));
    265.            
    266.            
    267.             yield return getFilePathAsync_0_Coroutine;
    268.             yield return getFilePathAsync_1_Coroutine;
    269.            
    270.             Run();
    271.         }
    272.         #endif
    273.  
    274.         private void Run()
    275.         {
    276.             //set 3d face object points.
    277.             objectPoints = new MatOfPoint3f(new Point3(-31, 72, 86),//l eye
    278.                 new Point3(31, 72, 86),//r eye
    279.                 new Point3(0, 40, 114),//nose
280.                 new Point3(-20, 15, 90),//l mouth
281.                 new Point3(20, 15, 90)//r mouth
    282. //                                                                                                                                                            ,
    283. //                                                                                                                                                            new Point3 (-70, 60, -9),//l ear
    284. //                                                                                                                                                            new Point3 (70, 60, -9)//r ear
    285.             );
    286.             imagePoints = new MatOfPoint2f();
    287.             rvec = new Mat();
    288.             tvec = new Mat();
    289.  
    290.             oldRvec = new Mat[ARGameObject.Length];
    291.             oldTvec = new Mat[ARGameObject.Length];
    292.  
    293.             rotM = new Mat(3, 3, CvType.CV_64FC1);
    294.  
    295.             //initialize FaceTracker
    296.             faceTracker = new FaceTracker(tracker_model_json_filepath);
    297.             //initialize FaceTrackerParams
    298.             faceTrackerParams = new FaceTrackerParams();
    299.  
    300.             cascade = new CascadeClassifier();
    301.             cascade.load(haarcascade_frontalface_alt_xml_filepath);
    302. //            if (cascade.empty())
    303. //            {
    304. //                Debug.LogError("cascade file is not loaded.Please copy from “FaceTrackerExample/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");
    305. //            }
    306.  
    307.  
    308.  
    309.             webCamTextureToMatHelper.Initialize();
    310.  
    311.  
    312.         }
    313.  
    314.         /// <summary>
    315.         /// Raises the webcam texture to mat helper initialized event.
    316.         /// </summary>
    317.         public void OnWebCamTextureToMatHelperInitialized()
    318.         {
    319.             Debug.Log("OnWebCamTextureToMatHelperInitialized");
    320.            
    321.             Mat webCamTextureMat = webCamTextureToMatHelper.GetMat();
    322.  
    323.             texture = new Texture2D(webCamTextureMat.cols(), webCamTextureMat.rows(), TextureFormat.RGBA32, false);
    324.             gameObject.GetComponent<Renderer>().material.mainTexture = texture;
    325.  
    326.  
    327.             gameObject.transform.localScale = new Vector3(webCamTextureMat.cols(), webCamTextureMat.rows(), 1);
    328.            
    329.             Debug.Log("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    330.            
    331.             float width = webCamTextureMat.width();
    332.             float height = webCamTextureMat.height();
    333.            
    334.             float imageSizeScale = 1.0f;
    335.            
    336.             width = gameObject.transform.localScale.x;
    337.             height = gameObject.transform.localScale.y;
    338.  
    339.             float widthScale = (float)Screen.width / width;
    340.             float heightScale = (float)Screen.height / height;
    341.             if (widthScale < heightScale)
    342.             {
    343.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    344.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    345.             } else
    346.             {
    347.                 Camera.main.orthographicSize = height / 2;
    348.             }
    349.        
    350.                                    
    351.                                    
    352.             int max_d = (int)Mathf.Max(width, height);
    353.             double fx = max_d;
    354.             double fy = max_d;
    355.             double cx = width / 2.0f;
    356.             double cy = height / 2.0f;
    357.             camMatrix = new Mat(3, 3, CvType.CV_64FC1);
    358.             camMatrix.put(0, 0, fx);
    359.             camMatrix.put(0, 1, 0);
    360.             camMatrix.put(0, 2, cx);
    361.             camMatrix.put(1, 0, 0);
    362.             camMatrix.put(1, 1, fy);
    363.             camMatrix.put(1, 2, cy);
    364.             camMatrix.put(2, 0, 0);
    365.             camMatrix.put(2, 1, 0);
    366.             camMatrix.put(2, 2, 1.0f);
    367.             Debug.Log("camMatrix " + camMatrix.dump());
    368.  
    369.             distCoeffs = new MatOfDouble(0, 0, 0, 0);
    370.             Debug.Log("distCoeffs " + distCoeffs.dump());
    371.                                    
    372.             Size imageSize = new Size(width * imageSizeScale, height * imageSizeScale);
    373.             double apertureWidth = 0;
    374.             double apertureHeight = 0;
    375.             double[] fovx = new double[1];
    376.             double[] fovy = new double[1];
    377.             double[] focalLength = new double[1];
    378.             Point principalPoint = new Point(0, 0);
    379.             double[] aspectratio = new double[1];
    380.                                                          
    381.                                    
    382.             Calib3d.calibrationMatrixValues(camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    383.                                    
    384.             Debug.Log("imageSize " + imageSize.ToString());
    385.             Debug.Log("apertureWidth " + apertureWidth);
    386.             Debug.Log("apertureHeight " + apertureHeight);
    387.             Debug.Log("fovx " + fovx [0]);
    388.             Debug.Log("fovy " + fovy [0]);
    389.             Debug.Log("focalLength " + focalLength [0]);
    390.             Debug.Log("principalPoint " + principalPoint.ToString());
    391.             Debug.Log("aspectratio " + aspectratio [0]);
    392.                                    
    393.                                    
    394.             //To convert the difference of the FOV value of the OpenCV and Unity.
    395.             double fovXScale = (2.0 * Mathf.Atan((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2((float)cx, (float)fx) + Mathf.Atan2((float)(imageSize.width - cx), (float)fx));
    396.             double fovYScale = (2.0 * Mathf.Atan((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2((float)cy, (float)fy) + Mathf.Atan2((float)(imageSize.height - cy), (float)fy));
    397.            
    398.             Debug.Log("fovXScale " + fovXScale);
    399.             Debug.Log("fovYScale " + fovYScale);
    400.            
    401.            
    402.             //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    403.             if (widthScale < heightScale)
    404.             {
    405.                 ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    406.             } else
    407.             {
    408.                 ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    409.             }
    410.                                    
    411.                                    
    412.                                    
    413.             invertYM = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, -1, 1));
    414.             Debug.Log("invertYM " + invertYM.ToString());
    415.            
    416.             invertZM = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
    417.             Debug.Log("invertZM " + invertZM.ToString());
    418.  
    419.  
    420.  
    421.             grayMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC1);
    422.            
    423.  
    424.            
    425.            
    426. //            axes.SetActive(false);
    427. //            head.SetActive(false);
    428. //            rightEye.SetActive(false);
    429. //            leftEye.SetActive(false);
    430. //            mouth.SetActive(false);
    431.            
    432.         }
    433.  
    434.         /// <summary>
    435.         /// Raises the webcam texture to mat helper disposed event.
    436.         /// </summary>
    437.         public void OnWebCamTextureToMatHelperDisposed()
    438.         {
    439.             Debug.Log("OnWebCamTextureToMatHelperDisposed");
    440.                                    
    441.             faceTracker.reset();
    442.  
    443.             grayMat.Dispose();
    444.             camMatrix.Dispose();
    445.             distCoeffs.Dispose();
    446.         }
    447.  
    448.         /// <summary>
    449.         /// Raises the webcam texture to mat helper error occurred event.
    450.         /// </summary>
    451.         /// <param name="errorCode">Error code.</param>
    452.         public void OnWebCamTextureToMatHelperErrorOccurred(WebCamTextureToMatHelper.ErrorCode errorCode)
    453.         {
    454.             Debug.Log("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
    455.         }
    456.  
    457.         // Update is called once per frame
    458.         void Update()
    459.         {
    460.  
    461.             if (webCamTextureToMatHelper.IsPlaying() && webCamTextureToMatHelper.DidUpdateThisFrame())
    462.             {
    463.                
    464.                 Mat rgbaMat = webCamTextureToMatHelper.GetMat();
    465.  
    466.  
    467.                 //convert image to greyscale
    468.                 Imgproc.cvtColor(rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
    469.                                        
    470.                                        
    471.                 if (/*isAutoResetMode ||*/ faceTracker.getPoints().Count <= 0)
    472.                 {
    473. //                                      Debug.Log ("detectFace");
    474.                                            
    475.                     //convert image to greyscale
    476.                     using (Mat equalizeHistMat = new Mat())
    477.                     using (MatOfRect faces = new MatOfRect())
    478.                     {
    479.                                                
    480.                         Imgproc.equalizeHist(grayMat, equalizeHistMat);
    481.                                                
    482.                         cascade.detectMultiScale(equalizeHistMat, faces, 1.1f, 2, 0
    483. //                        | Objdetect.CASCADE_FIND_BIGGEST_OBJECT
    484.                         | Objdetect.CASCADE_SCALE_IMAGE, new OpenCVForUnity.Size(equalizeHistMat.cols() * 0.15, equalizeHistMat.cols() * 0.15), new Size());
    485.                                                
    486.                                                
    487.                                                
    488.                         if (faces.rows() > 0)
    489.                         {
    490. //                                              Debug.Log ("faces " + faces.dump ());
    491.  
    492.                             List<OpenCVForUnity.Rect> rectsList = faces.toList();
    493.                             List<Point[]> pointsList = faceTracker.getPoints();
    494.                        
    495. //                            if (isAutoResetMode)
    496. //                            {
    497. //                                //add initial face points from MatOfRect
    498. //                                if (pointsList.Count <= 0)
    499. //                                {
    500. //                                    faceTracker.addPoints(faces);                          
    501. ////                                                                      Debug.Log ("reset faces ");
    502. //                                } else
    503. //                                {
    504. //                          
    505. //                                    for (int i = 0; i < rectsList.Count; i++)
    506. //                                    {
    507. //                              
    508. //                                        OpenCVForUnity.Rect trackRect = new OpenCVForUnity.Rect(rectsList [i].x + rectsList [i].width / 3, rectsList [i].y + rectsList [i].height / 2, rectsList [i].width / 3, rectsList [i].height / 3);
    509. //                                        //It determines whether nose point has been included in trackRect.                                    
    510. //                                        if (i < pointsList.Count && !trackRect.contains(pointsList [i] [67]))
    511. //                                        {
    512. //                                            rectsList.RemoveAt(i);
    513. //                                            pointsList.RemoveAt(i);
    514. ////                                                                                      Debug.Log ("remove " + i);
    515. //                                        }
    516. //                                        Imgproc.rectangle(rgbaMat, new Point(trackRect.x, trackRect.y), new Point(trackRect.x + trackRect.width, trackRect.y + trackRect.height), new Scalar(0, 0, 255, 255), 2);
    517. //                                    }
    518. //                                }
    519. //                            } else
    520. //                            {
    521.                             faceTracker.addPoints(faces);
    522. //                            }
    523.  
    524.                             //draw face rect
    525.                             for (int i = 0; i < rectsList.Count; i++)
    526.                             {
    527.                                 #if OPENCV_2
528.                                 Core.rectangle (rgbaMat, new Point (rectsList [i].x, rectsList [i].y), new Point (rectsList [i].x + rectsList [i].width, rectsList [i].y + rectsList [i].height), new Scalar (255, 0, 0, 255), 2);
    529.                                 #else
    530.                                 Imgproc.rectangle(rgbaMat, new Point(rectsList [i].x, rectsList [i].y), new Point(rectsList [i].x + rectsList [i].width, rectsList [i].y + rectsList [i].height), new Scalar(255, 0, 0, 255), 2);
    531.                                 #endif
    532.                             }
    533.  
    534.                         } else
    535.                         {
    536. //                            if (isAutoResetMode)
    537. //                            {
    538. //                                faceTracker.reset();
    539. //                      
    540. //                                rightEye.SetActive(false);
    541. //                                leftEye.SetActive(false);
    542. //                                head.SetActive(false);
    543. //                                mouth.SetActive(false);
    544. //                                axes.SetActive(false);
    545. //                            }
    546.                         }
    547.                                                
    548.                     }
    549.                                            
    550.                 }
    551.                                        
    552.                                        
    553.                 //track face points.if face points <= 0, always return false.
    554.                 if (faceTracker.track(grayMat, faceTrackerParams))
    555.                 {
    556.                     if (isShowingFacePoints)
    557.                         faceTracker.draw(rgbaMat, new Scalar(255, 0, 0, 255), new Scalar(0, 255, 0, 255));
    558.                                            
    559.                     #if OPENCV_2
    560.                     Core.putText (rgbaMat, "'Tap' or 'Space Key' to Reset", new Point (5, rgbaMat.rows () - 5), Core.FONT_HERSHEY_SIMPLEX, 0.8, new Scalar (255, 255, 255, 255), 2, Core.LINE_AA, false);
    561.                     #else
    562.                     Imgproc.putText(rgbaMat, "'Tap' or 'Space Key' to Reset", new Point(5, rgbaMat.rows() - 5), Core.FONT_HERSHEY_SIMPLEX, 0.8, new Scalar(255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
    563.                     #endif
    564.                            
    565.  
    566.                     for (int i = 0; i < faceTracker.getPoints().Count; i++)
    567.                     {
    568.    
    569.                    
    570.                                            
    571.                         Point[] points = faceTracker.getPoints() [i];
    572.                                            
    573.                                            
    574.                         if (points.Length > 0)
    575.                         {
    576.                                                
    577. //                                              for (int i = 0; i < points.Length; i++) {
    578. //                                                      #if OPENCV_2
    579. //                          Core.putText (rgbaMat, "" + i, new Point (points [i].x, points [i].y), Core.FONT_HERSHEY_SIMPLEX, 0.3, new Scalar (0, 0, 255, 255), 2, Core.LINE_AA, false);
    580. //                                                      #else
    581. //                                                      Imgproc.putText (rgbaMat, "" + i, new Point (points [i].x, points [i].y), Core.FONT_HERSHEY_SIMPLEX, 0.3, new Scalar (0, 0, 255, 255), 2, Core.LINE_AA, false);
    582. //                                                      #endif
    583. //                                              }
    584.                                                
    585.                                                
    586.                             imagePoints.fromArray(
    587.                                 points [31],//l eye
    588.                                 points [36],//r eye
    589.                                 points [67],//nose
    590.                                 points [48],//l mouth
    591.                                 points [54] //r mouth
    592. //                                                                              ,
    593. //                                                                                              points [0],//l ear
    594. //                                                                                              points [14]//r ear
    595.                             );
    596.                                                
    597.                                                
    598.                             Calib3d.solvePnP(objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec);
    599.                                                
    600.                             bool isRefresh = false;
    601.                                                
    602.                             if (tvec.get(2, 0) [0] > 0 && tvec.get(2, 0) [0] < 1200 * ((float)rgbaMat.cols() / (float)webCamTextureToMatHelper.requestedWidth))
    603.                             {
    604.                                                    
    605.                                 isRefresh = true;
    606.                                                    
    607.                                 if (oldRvec [i] == null)
    608.                                 {
    609.                                     oldRvec [i] = new Mat();
    610.                                     rvec.copyTo(oldRvec [i]);
    611.                                 }
    612.                                 if (oldTvec [i] == null)
    613.                                 {
    614.                                     oldTvec [i] = new Mat();
    615.                                     tvec.copyTo(oldTvec [i]);
    616.                                 }
    617.                                                    
    618.                                                    
    619.                                 //filter Rvec Noise.
    620.                                 using (Mat absDiffRvec = new Mat())
    621.                                 {
    622.                                     Core.absdiff(rvec, oldRvec [i], absDiffRvec);
    623.                                                        
    624.                                     //              Debug.Log ("absDiffRvec " + absDiffRvec.dump());
    625.                                                        
    626.                                     using (Mat cmpRvec = new Mat())
    627.                                     {
    628.                                         Core.compare(absDiffRvec, new Scalar(rvecNoiseFilterRange), cmpRvec, Core.CMP_GT);
    629.                                                            
    630.                                         if (Core.countNonZero(cmpRvec) > 0)
    631.                                             isRefresh = false;
    632.                                     }
    633.                                 }
    634.                                                    
    635.                                                    
    636.                                                    
    637.                                 //filter Tvec Noise.
    638.                                 using (Mat absDiffTvec = new Mat())
    639.                                 {
    640.                                     Core.absdiff(tvec, oldTvec [i], absDiffTvec);
    641.                                                        
    642.                                     //              Debug.Log ("absDiffRvec " + absDiffRvec.dump());
    643.                                                        
    644.                                     using (Mat cmpTvec = new Mat())
    645.                                     {
    646.                                         Core.compare(absDiffTvec, new Scalar(tvecNoiseFilterRange), cmpTvec, Core.CMP_GT);
    647.                                                            
    648.                                         if (Core.countNonZero(cmpTvec) > 0)
    649.                                             isRefresh = false;
    650.                                     }
    651.                                 }
    652.                                                    
    653.                                                    
    654.                                                    
    655.                             }
    656.                                                
    657.                             if (isRefresh)
    658.                             {
    659.                                                    
    660. //                                if (isShowingEffects)
    661. //                                    rightEye.SetActive(true);
    662. //                                if (isShowingEffects)
    663. //                                    leftEye.SetActive(true);
    664. //                                if (isShowingHead)
    665. //                                    head.SetActive(true);
    666. //                                if (isShowingAxes)
    667. //                                    axes.SetActive(true);
    668.                                                    
    669. //                                                  
    670. //                                if ((Mathf.Abs((float)(points [48].x - points [56].x)) < Mathf.Abs((float)(points [31].x - points [36].x)) / 2.2
    671. //                                    && Mathf.Abs((float)(points [51].y - points [57].y)) > Mathf.Abs((float)(points [31].x - points [36].x)) / 2.9)
    672. //                                    || Mathf.Abs((float)(points [51].y - points [57].y)) > Mathf.Abs((float)(points [31].x - points [36].x)) / 2.7)
    673. //                                {
    674. //                                                      
    675. //                                    if (isShowingEffects)
    676. //                                        mouth.SetActive(true);
    677. //                                                      
    678. //                                } else
    679. //                                {
    680. //                                    if (isShowingEffects)
    681. //                                        mouth.SetActive(false);
    682. //                                }
    683.                                                    
    684.                                                    
    685.                                                    
    686.                                 rvec.copyTo(oldRvec [i]);
    687.                                 tvec.copyTo(oldTvec [i]);
    688.                                                    
    689.                                 Calib3d.Rodrigues(rvec, rotM);
    690.                                                    
    691.                                 transformationM.SetRow(0, new Vector4((float)rotM.get(0, 0) [0], (float)rotM.get(0, 1) [0], (float)rotM.get(0, 2) [0], (float)tvec.get(0, 0) [0]));
    692.                                 transformationM.SetRow(1, new Vector4((float)rotM.get(1, 0) [0], (float)rotM.get(1, 1) [0], (float)rotM.get(1, 2) [0], (float)tvec.get(1, 0) [0]));
    693.                                 transformationM.SetRow(2, new Vector4((float)rotM.get(2, 0) [0], (float)rotM.get(2, 1) [0], (float)rotM.get(2, 2) [0], (float)tvec.get(2, 0) [0]));
    694.                                 transformationM.SetRow(3, new Vector4(0, 0, 0, 1));
    695.                                                    
    696. //                                if (shouldMoveARCamera)
    697. //                                {
    698. //
    699. //                                    if (ARGameObject != null)
    700. //                                    {
    701. //                                        ARM = ARGameObject [i].transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;
    702. //                                        ARUtils.SetTransformFromMatrix(ARCamera.transform, ref ARM);
    703. //                                        ARGameObject [i].SetActive(true);
    704. //                                    }
    705. //                                } else
    706. //                                {
    707.                                 ARM = ARCamera.transform.localToWorldMatrix * invertYM * transformationM * invertZM;
    708.  
    709.                                 if (ARGameObject [i] != null)
    710.                                 {
    711.                                     ARUtils.SetTransformFromMatrix(ARGameObject [i].transform, ref ARM);
    712.                                     ARGameObject [i].SetActive(true);
    713.                                 }
    714. //                                }
    715.  
    716.                             }
    717.                         }
    718.                     }
    719.                 }
    720.                                        
    721. //                              Core.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Core.LINE_AA, false);
    722.                                        
    723.                 Utils.matToTexture2D(rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors());
    724.                                        
    725.             }
    726.                                    
    727.             if (Input.GetKeyUp(KeyCode.Space) || Input.touchCount > 0)
    728.             {
    729.                 faceTracker.reset();
    730.  
    731.                 for (int i = 0; i < oldRvec.Length; i++)
    732.                 {
    733.                     if (oldRvec [i] != null)
    734.                     {
    735.                         oldRvec [i].Dispose();
    736.                         oldRvec [i] = null;
    737.                     }
    738.                 }
    739.                 for (int i = 0; i < oldTvec.Length; i++)
    740.                 {
    741.                     if (oldTvec [i] != null)
    742.                     {
    743.                         oldTvec [i].Dispose();
    744.                         oldTvec [i] = null;
    745.                     }
    746.                 }
    747.  
    748.                 for (int i = 0; i < ARGameObject.Length; i++)
    749.                 {
    750.                     ARGameObject [i].SetActive(false);
    751.                 }
    752. //                                      
    753. //                rightEye.SetActive(false);
    754. //                leftEye.SetActive(false);
    755. //                head.SetActive(false);
    756. //                mouth.SetActive(false);
    757. //                axes.SetActive(false);
    758.             }
    759.                    
    760.         }
    761.  
    762.         /// <summary>
    763.         /// Raises the disable event.
    764.         /// </summary>
    765.         void OnDisable()
    766.         {
    767.             webCamTextureToMatHelper.Dispose();
    768.  
    769.             if (cascade != null)
    770.                 cascade.Dispose();
    771.         }
    772.  
    773.         /// <summary>
    774.         /// Raises the back button event.
    775.         /// </summary>
    776.         public void OnBackButton()
    777.         {
    778.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    779.             SceneManager.LoadScene("FaceTrackerExample");
    780.             #else
    781.             Application.LoadLevel("FaceTrackerExample");
    782.             #endif
    783.         }
    784.  
    785.         /// <summary>
    786.         /// Raises the play button event.
    787.         /// </summary>
    788.         public void OnPlayButton()
    789.         {
    790.             webCamTextureToMatHelper.Play();
    791.         }
    792.  
    793.         /// <summary>
    794.         /// Raises the pause button event.
    795.         /// </summary>
    796.         public void OnPauseButton()
    797.         {
    798.             webCamTextureToMatHelper.Pause();
    799.         }
    800.  
    801.         /// <summary>
    802.         /// Raises the stop button event.
    803.         /// </summary>
    804.         public void OnStopButton()
    805.         {
    806.             webCamTextureToMatHelper.Stop();
    807.         }
    808.  
    809.         /// <summary>
    810.         /// Raises the change camera button event.
    811.         /// </summary>
    812.         public void OnChangeCameraButton()
    813.         {
    814.             webCamTextureToMatHelper.Initialize(null, webCamTextureToMatHelper.requestedWidth, webCamTextureToMatHelper.requestedHeight, !webCamTextureToMatHelper.requestedIsFrontFacing);
    815.         }
    816.  
    817.         /// <summary>
    818.         /// Raises the is showing face points toggle event.
    819.         /// </summary>
    820.         public void OnIsShowingFacePointsToggle()
    821.         {
    822.             if (isShowingFacePointsToggle.isOn)
    823.             {
    824.                 isShowingFacePoints = true;
    825.             } else
    826.             {
    827.                 isShowingFacePoints = false;
    828.             }
    829.         }
    830.  
    831.         /// <summary>
    832.         /// Raises the is showing axes toggle event.
    833.         /// </summary>
    834.         public void OnIsShowingAxesToggle()
    835.         {
    836.             if (isShowingAxesToggle.isOn)
    837.             {
    838.                 isShowingAxes = true;
    839.             } else
    840.             {
    841.                 isShowingAxes = false;
    842.                 axes.SetActive(false);
    843.             }
    844.         }
    845.  
    846.         /// <summary>
    847.         /// Raises the is showing head toggle event.
    848.         /// </summary>
    849.         public void OnIsShowingHeadToggle()
    850.         {
    851.             if (isShowingHeadToggle.isOn)
    852.             {
    853.                 isShowingHead = true;
    854.             } else
    855.             {
    856.                 isShowingHead = false;
    857.                 head.SetActive(false);
    858.             }
    859.         }
    860.  
    861.         /// <summary>
    862.         /// Raises the is showing effects toggle event.
    863.         /// </summary>
    864.         public void OnIsShowingEffectsToggle()
    865.         {
    866.             if (isShowingEffectsToggle.isOn)
    867.             {
    868.                 isShowingEffects = true;
    869.             } else
    870.             {
    871.                 isShowingEffects = false;
    872.                 rightEye.SetActive(false);
    873.                 leftEye.SetActive(false);
    874.                 mouth.SetActive(false);
    875.             }
    876.         }
    877.  
    878.         /// <summary>
    879.         /// Raises the change auto reset mode toggle event.
    880.         /// </summary>
    881.         public void OnIsAutoResetModeToggle()
    882.         {
    883.             if (isAutoResetModeToggle.isOn)
    884.             {
    885.                 isAutoResetMode = true;
    886.             } else
    887.             {
    888.                 isAutoResetMode = false;
    889.             }
    890.         }
    891.  
    892.     }
    893. }
    [Attached image: ar_multi_face.PNG]

    Please try the following procedure.
    1. Replace the FaceTrackerARExample code with this code.
    2. Duplicate the ARObjects.
    3. Attach the duplicated ARObjects.
     
  36. david-arcus

    david-arcus

    Joined:
    Jul 17, 2017
    Posts:
    9
    Hello,

    I'm using this OpenCV plugin with your Unity dlib library and it's really excellent, thanks for your hard work.

    I have a C++ dlib app that uses OpenCV's optical flow to stabilise facial landmarks (it reduces the jitter). I'm trying to port it to C# but am confused about some of the classes used. Could you tell me if the following functions could be converted using your library?

    Code (C++):
    1. // courtesy of https://github.com/zhucebuliaolongchuan
    2.  
    3. // calculate the variance between two points
    4. double calDistanceDiff(std::vector<cv::Point2f> curPoints, std::vector<cv::Point2f> lastPoints) {
    5.     double variance = 0.0;
    6.     double sum = 0.0;
    7.     std::vector<double> diffs;
    8.     if (curPoints.size() == lastPoints.size()) {
    9.         for (int i = 0; i < curPoints.size(); i++) {
    10.             double diff = std::sqrt(std::pow(curPoints[i].x - lastPoints[i].x, 2.0) + std::pow(curPoints[i].y - lastPoints[i].y, 2.0));
    11.             sum += diff;
    12.             diffs.push_back(diff);
    13.         }
    14.         double mean = sum / diffs.size();
    15.         for (int i = 0; i < curPoints.size(); i++) {
    16.             variance += std::pow(diffs[i] - mean, 2);
    17.         }
    18.         return variance / diffs.size();
    19.     }
    20.     return variance;
    21. }
    Is there an equivalent to
    Code (C++):
    1. cv::Point2f
    ? If so, could you show me how you might achieve the following?

    Code (C++):
    1. cv::Mat prevgray, gray;
    2. std::vector<cv::Point2f> nextTrackPts;
    3. std::vector<cv::Point2f> prevTrackPts;    
    4.  
    5. for (int i = 0; i < 68; i++) {
    6.      prevTrackPts.push_back(cv::Point2f(0, 0));
    7. }
    8.  
    9. // ....
    10.  
    11. std::vector<uchar> status;
    12. std::vector<float> err;
    13.  
    14. calcOpticalFlowPyrLK(prevgray, gray, prevTrackPts, nextTrackPts, status, err);
    15.  
    16.  
    Thank you
     
    Last edited: Feb 7, 2018
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    1.
    pixel data address
    long dataAddr ()
    http://enoxsoftware.github.io/OpenC..._1_mat.html#ac576026f06bc7b3d5e9ef5e6e442061c

    object pointer
    IntPtr getNativeObjAddr ()
    http://enoxsoftware.github.io/OpenC..._1_mat.html#a8e8f7823193ae0636f28c065a7bb9bc9

    2.
    Unfortunately, I do not know whether it is feasible or not.
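
    For reference, the types in the snippet above all have direct equivalents in this asset: cv::Point2f maps to OpenCVForUnity's Point, while std::vector&lt;cv::Point2f&gt;, std::vector&lt;uchar&gt;, and std::vector&lt;float&gt; map to MatOfPoint2f, MatOfByte, and MatOfFloat. A minimal sketch of the calcOpticalFlowPyrLK call under those mappings, assuming prevgray and gray are 8-bit grayscale Mats of consecutive frames (a complete ported example appears later in this thread):
    Code (CSharp):
    // Minimal sketch: Lucas-Kanade optical flow with OpenCVForUnity types.
    // Assumes prevgray and gray are CV_8UC1 Mats of the previous and current frames.
    MatOfPoint2f prevPts = new MatOfPoint2f (new Point (0, 0)); // points to track
    MatOfPoint2f nextPts = new MatOfPoint2f ();                 // receives the tracked positions
    MatOfByte status = new MatOfByte ();                        // per-point success flags
    MatOfFloat err = new MatOfFloat ();                         // per-point tracking error
    Video.calcOpticalFlowPyrLK (prevgray, gray, prevPts, nextPts, status, err);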
     
  38. Joy0023

    Joy0023

    Joined:
    Oct 30, 2017
    Posts:
    4
    Thank you so much!
     
  40. yumianhuli1

    yumianhuli1

    Joined:
    Mar 14, 2015
    Posts:
    92
    I also have a question about camera calibration in Unity. How can I get Q (the output disparity-to-depth mapping matrix) from stereoRectify(...) with two virtual cameras in Unity? Is there an example/demo/PlayMaker camera-calibration workflow in Unity? Thank you!
     
    Last edited: Feb 11, 2018
  41. sjmtechs

    sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    While using the face detection sample, once we have found a face, how can I extract the face image and save it?
    I found a solution in a previous post, but Highgui.imwrite does not seem to be recognized.
    Any solution?


    Code (CSharp):
    1. if (cascade != null)
    2.                          cascade.detectMultiScale (grayMat, faces, 1.1, 2, 2,
    3.                                            new Size (20, 20), new Size ());
    4.               OpenCVForUnity.Rect[] rects = faces.toArray ();
    5.               for (int i = 0; i < rects.Length; i++) {
    6.                          Debug.Log ("detect faces " + rects [i]);
    7.                          Core.rectangle (imgMat, new Point (rects [i].x, rects [i].y), new Point (rects [i].x + rects [i].width, rects [i].y + rects [i].height), new Scalar (255, 0, 0, 255), 2);        
    8. ///////////////////////////////////////////////////////////////////////////////////////////////////////
    9.                           Mat faceMat = new Mat(imgMat, rects[i]);
    10.                           Highgui.imwrite("C:/savefolder/face_" + i + ".jpg", faceMat);
    11. ///////////////////////////////////////////////////////////////////////////////////////////////////////
    12.               }
     
  42. brohanjoe

    brohanjoe

    Joined:
    Jan 25, 2017
    Posts:
    1
    Hello,
    I'm currently using OpenCVForUnity 2.1.5 and I'm having trouble using CLAHE correctly.
    The main constructor asks for an IntPtr, and I don't have any clue how to pass such a thing correctly.
    I tried:
    IntPtr addr = Marshal.AllocHGlobal(sizeof(CLAHE));
    but sizeof can't recognize CLAHE.
    Then I tried:
    CLAHE clahe = new CLAHE(IntPtr.Zero);
    but every time I run it, Unity crashes instantly.
    Can you provide some help on how to initialize algorithm classes like this one, please? I also encountered this problem with background subtractors.
    Have a good day,
     
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Please refer to this code.
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4. using System;
    5. using System.Runtime.InteropServices;
    6.  
    7. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    8. using UnityEngine.SceneManagement;
    9. #endif
    10. using OpenCVForUnity;
    11. using DlibFaceLandmarkDetector;
    12.  
    13. namespace DlibFaceLandmarkDetectorExample
    14. {
    15.     /// <summary>
    16.     /// WebCamTextureToMatHelper Example
    17.     /// </summary>
    18.     [RequireComponent (typeof(WebCamTextureToMatHelper))]
    19.     public class OfKmWebCamTextureToMatHelperExample : MonoBehaviour
    20.     {
    21.         /// <summary>
    22.         /// The texture.
    23.         /// </summary>
    24.         Texture2D texture;
    25.  
    26.         /// <summary>
    27.         /// The webcam texture to mat helper.
    28.         /// </summary>
    29.         WebCamTextureToMatHelper webCamTextureToMatHelper;
    30.  
    31.         /// <summary>
    32.         /// The face landmark detector.
    33.         /// </summary>
    34.         FaceLandmarkDetector faceLandmarkDetector;
    35.  
    36.         /// <summary>
    37.         /// The sp_human_face_68_dat_filepath.
    38.         /// </summary>
    39.         string sp_human_face_68_dat_filepath;
    40.  
    41.         #if UNITY_WEBGL && !UNITY_EDITOR
    42.         Stack<IEnumerator> coroutines = new Stack<IEnumerator> ();
    43.         #endif
    44.  
    45.  
    46.         List<Point> kalman_points;
    47.         List<Point> predict_points;
    48.  
    49.         // Kalman Filter Setup (68 Points Test)
    50.         const int stateNum = 272;
    51.         const int measureNum = 136;
    52.  
    53.         KalmanFilter KF;
    54.         Mat state;
    55.         Mat processNoise;
    56.         Mat measurement;
    57.  
    58.         Mat prevgray, gray;
    59.         List<Point> prevTrackPts;
    60.         List<Point> nextTrackPts;
    61.  
    62.         MatOfPoint2f mOP2fPrevTrackPts;
    63.         MatOfPoint2f mOP2fNextTrackPts;
    64.         MatOfByte status;
    65.         MatOfFloat err;
    66.  
    67.         // Use this for initialization
    68.         void Start ()
    69.         {
    70.             webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
    71.  
    72.             #if UNITY_WEBGL && !UNITY_EDITOR
    73.             var getFilePath_Coroutine = DlibFaceLandmarkDetector.Utils.getFilePathAsync ("sp_human_face_68.dat", (result) => {
    74.                 coroutines.Clear ();
    75.  
    76.                 sp_human_face_68_dat_filepath = result;
    77.                 Run ();
    78.             });
    79.             coroutines.Push (getFilePath_Coroutine);
    80.             StartCoroutine (getFilePath_Coroutine);
    81.             #else
    82.             sp_human_face_68_dat_filepath = DlibFaceLandmarkDetector.Utils.getFilePath ("sp_human_face_68.dat");
    83.             Run ();
    84.             #endif
    85.         }
    86.  
    87.         private void Run ()
    88.         {
    89.             faceLandmarkDetector = new FaceLandmarkDetector (sp_human_face_68_dat_filepath);
    90.  
    91.             webCamTextureToMatHelper.Initialize ();
    92.  
    93.  
    94.  
    95.             // Initialize measurement points
    96.             kalman_points = new List<Point> ();
    97.             for (int i = 0; i < 68; i++) {
    98.                 kalman_points.Add (new Point (0.0, 0.0));
    99.             }
    100.  
    101.             // Initialize prediction points
    102.             predict_points = new List<Point> ();
    103.  
    104.             for (int i = 0; i < 68; i++) {
    105.                 predict_points.Add (new Point (0.0, 0.0));
    106.             }
    107.  
    108.  
    109.             KF = new KalmanFilter (stateNum, measureNum, 0, CvType.CV_32F);
    110.             state = new Mat (stateNum, 1, CvType.CV_32FC1);
    111.             processNoise = new Mat (stateNum, 1, CvType.CV_32F);
    112.             measurement = Mat.zeros (measureNum, 1, CvType.CV_32F);
    113.             //            Debug.Log ("measurement " + measurement.ToString ());
    114.  
    115.             // Generate a matrix randomly
    116.             Core.randn (state, 0, 0.0);
    117.  
    118.             // Generate the Measurement Matrix
    119.             KF.set_transitionMatrix (Mat.zeros (stateNum, stateNum, CvType.CV_32F));
    120.             for (int i = 0; i < stateNum; i++) {
    121.                 for (int j = 0; j < stateNum; j++) {
    122.                     if (i == j || (j - measureNum) == i) {
    123.                         KF.get_transitionMatrix ().put (i, j, new float[]{ 1.0f });
    124.                     } else {
    125.                         KF.get_transitionMatrix ().put (i, j, new float[]{ 0.0f });
    126.                     }  
    127.                 }
    128.             }
    129.  
    130.             //!< measurement matrix (H) (observation model)
    131.             Core.setIdentity (KF.get_measurementMatrix ());
    132.  
    133.             //!< process noise covariance matrix (Q)
    134.             Core.setIdentity (KF.get_processNoiseCov (), Scalar.all (1e-5));
    135.  
    136.             //!< measurement noise covariance matrix (R)
    137.             Core.setIdentity (KF.get_measurementNoiseCov (), Scalar.all (1e-1));
    138.  
    139.             //!< priori error estimate covariance matrix (P'(k)): P'(k)=A*P(k-1)*At + Q (A corresponds to F, the transitionMatrix)
    140.             Core.setIdentity (KF.get_errorCovPost (), Scalar.all (1));
    141.  
    142.             Core.randn (KF.get_statePost (), 0, 0.0);
    143.  
    144.  
    145.             // Initialize Optical Flow
    146.             prevgray = new Mat ();
    147.             gray = new Mat ();
    148.             prevTrackPts = new List<Point> ();
    149.             nextTrackPts = new List<Point> ();
    150.             for (int i = 0; i < 68; i++) {
    151.                 prevTrackPts.Add (new Point (0, 0));
    152.             }
    153.             //            for (int i = 0; i < 68; i++) {
    154.             //                nextTrackPts.Add (new Point (0, 0));
    155.             //            }
    156.  
    157.  
    158.             mOP2fPrevTrackPts = new MatOfPoint2f ();
    159.             mOP2fNextTrackPts = new MatOfPoint2f ();
    160.             status = new MatOfByte ();
    161.             err = new MatOfFloat ();
    162.         }
    163.  
    164.         /// <summary>
    165.         /// Raises the web cam texture to mat helper initialized event.
    166.         /// </summary>
    167.         public void OnWebCamTextureToMatHelperInitialized ()
    168.         {
    169.             Debug.Log ("OnWebCamTextureToMatHelperInitialized");
    170.  
    171.             Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();
    172.  
    173.             texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);
    174.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    175.  
    176.             gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
    177.             Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    178.                                    
    179.             float width = webCamTextureMat.width ();
    180.             float height = webCamTextureMat.height ();
    181.                                    
    182.             float widthScale = (float)Screen.width / width;
    183.             float heightScale = (float)Screen.height / height;
    184.             if (widthScale < heightScale) {
    185.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    186.             } else {
    187.                 Camera.main.orthographicSize = height / 2;
    188.             }
    189.         }
    190.  
    191.         /// <summary>
    192.         /// Raises the web cam texture to mat helper disposed event.
    193.         /// </summary>
    194.         public void OnWebCamTextureToMatHelperDisposed ()
    195.         {
    196.             Debug.Log ("OnWebCamTextureToMatHelperDisposed");
    197.         }
    198.  
    199.         /// <summary>
    200.         /// Raises the web cam texture to mat helper error occurred event.
    201.         /// </summary>
    202.         /// <param name="errorCode">Error code.</param>
    203.         public void OnWebCamTextureToMatHelperErrorOccurred (WebCamTextureToMatHelper.ErrorCode errorCode)
    204.         {
    205.             Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
    206.         }
    207.  
    208.         // Update is called once per frame
    209.         void Update ()
    210.         {
    211.             if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {
    212.  
    213.                 Mat rgbaMat = webCamTextureToMatHelper.GetMat ();
    214.  
    215.                 OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbaMat);
    216.  
    217.                 //detect face rects
    218.                 List<UnityEngine.Rect> detectResult = faceLandmarkDetector.Detect ();
    219.  
    220.  
    221.                 List<Vector2> points = null;
    222.                 if (detectResult.Count == 1) {
    223.                     //detect landmark points
    224.                     points = faceLandmarkDetector.DetectLandmark (detectResult [0]);
    225.                 }
    226.  
    227.                 if (prevgray.total () == 0) {
    228.                     Debug.Log ("prevgray:" + prevgray.ToString ());
    229.                     Imgproc.cvtColor (rgbaMat, prevgray, Imgproc.COLOR_RGBA2GRAY);
    230.  
    231.                     if (points != null) {
    232.                         for (int i = 0; i < points.Count; i++) {
    233.                             prevTrackPts [i].x = points [i].x;
    234.                             prevTrackPts [i].y = points [i].y;
    235.                         }
    236.                     }
    237.                 }
    238.  
    239.                 // Update Kalman Filter Points
    240.                 if (points != null) {
    241.  
    242.                     for (int i = 0; i < points.Count; i++) {
    243.                         kalman_points [i].x = points [i].x;
    244.                         kalman_points [i].y = points [i].y;
    245.                     }
    246.                 }
    247.  
    248.                 // Kalman Prediction
    249.                 Mat prediction = KF.predict ();
    250.                 // std::vector<cv::Point2f> predict_points;
    251.                 //                Debug.Log ("prediction " + prediction.ToString ());
    252.                 float[] tmpPrediction = new float[prediction.total ()];
    253.                 prediction.get (0, 0, tmpPrediction);
    254.                 for (int i = 0; i < 68; i++) {
    255.                     predict_points [i].x = tmpPrediction [i * 2];
    256.                     predict_points [i].y = tmpPrediction [i * 2 + 1];
    257.                 }
    258.                 //                for (int i = 0; i < 68; i++) {
    259.                 //                    predict_points [i].x = (float)prediction.get (i * 2, 0) [0];
    260.                 //                    predict_points [i].y = (float)prediction.get (i * 2 + 1, 0) [0];
    261.                 //                }
    262.                 prediction.Dispose ();
    263.  
    264.  
    265.                 if (points != null) {
    266.                     Imgproc.cvtColor (rgbaMat, gray, Imgproc.COLOR_RGBA2GRAY);
    267.                     if (prevgray.total () > 0) {
    268.  
    269.                         mOP2fPrevTrackPts.fromList (prevTrackPts);
    270.                         mOP2fNextTrackPts.fromList (nextTrackPts);
    271.  
    272.                         Video.calcOpticalFlowPyrLK (prevgray, gray, mOP2fPrevTrackPts, mOP2fNextTrackPts, status, err);
    273.  
    274.                         prevTrackPts = mOP2fPrevTrackPts.toList ();
    275.                         nextTrackPts = mOP2fNextTrackPts.toList ();
    276.  
    277.  
    278.                         // if the face is moving fast, use the dlib detection result directly
    279.                         double diff = calDistanceDiff (prevTrackPts, nextTrackPts);
    280.                         Debug.Log ("variance:" + diff);
    281.                         if (diff > 1.0) {
    282.                             Debug.Log ("DLIB");
    283.                             for (int i = 0; i < points.Count; i++) {
    284.                                 Imgproc.circle (rgbaMat, new Point (points [i].x, points [i].y), 2, new Scalar (255, 0, 0, 255), -1);
    285.                                 nextTrackPts [i].x = points [i].x;
    286.                                 nextTrackPts [i].y = points [i].y;
    287.                             }
    288.                         } else if (diff <= 1.0 && diff > 0.005) {
    289.                             // In this case, use Optical Flow
    290.                             Debug.Log ("Optical Flow");
    291.                             for (int i = 0; i < nextTrackPts.Count; i++) {
    292.                                 Imgproc.circle (rgbaMat, nextTrackPts [i], 2, new Scalar (0, 0, 255, 255), -1);
    293.                             }
    294.                         } else {
    295.                             // In this case, use Kalman Filter
    296.                             Debug.Log ("Kalman Filter");
    297.                             for (int i = 0; i < predict_points.Count; i++) {
    298.                                 Imgproc.circle (rgbaMat, predict_points [i], 2, new Scalar (0, 255, 0, 255), -1);
    299.                                 nextTrackPts [i].x = predict_points [i].x;
    300.                                 nextTrackPts [i].y = predict_points [i].y;
    301.                             }
    302.                         }
    303.                     }
    304.                     //                    std::swap(prevTrackPts, nextTrackPts);
    305.                     Swap (ref prevTrackPts, ref nextTrackPts);
    306.                     //                    std::swap(prevgray, gray);
    307.                     Swap (ref prevgray, ref gray);
    308.                 }
    309.  
    310.                 // Update Measurement
    311.                 float[] tmpMeasurement = new float[measurement.total ()];
    312.                 for (int i = 0; i < 136; i++) {
    313.                     if (i % 2 == 0) {
    314.                         tmpMeasurement [i] = (float)kalman_points [i / 2].x;
    315.                     } else {
    316.                         tmpMeasurement [i] = (float)kalman_points [(i - 1) / 2].y;
    317.                     }
    318.                 }
    319.                 measurement.put (0, 0, tmpMeasurement);
    320.                 //                for (int i = 0; i < 136; i++) {
    321.                 //                    if (i % 2 == 0) {
    322.                 ////                        Debug.Log ("measurement " + measurement.ToString ());
    323.                 //                        measurement.put (i, 0, new float[]{ (float)(kalman_points [i / 2].x) });
    324.                 //                    } else {
    325.                 //                        measurement.put (i, 0, new float[]{ (float)(kalman_points [(i - 1) / 2].y) });
    326.                 //                    }
    327.                 //                }
    328.  
    329.                 // Update the Measurement Matrix
    330.                 measurement += KF.get_measurementMatrix () * state;
    331.                 KF.correct (measurement);
    332.                
    333. //                foreach (var rect in detectResult) {
    334. //
    335. //                    //detect landmark points
    336. //                    List<Vector2> points = faceLandmarkDetector.DetectLandmark (rect);
    337. //
    338. //                    //draw landmark points
    339. //                    OpenCVForUnityUtils.DrawFaceLandmark (rgbaMat, points, new Scalar (0, 255, 0, 255), 2);
    340. //
    341. //                    //draw face rect
    342. //                    OpenCVForUnityUtils.DrawFaceRect (rgbaMat, rect, new Scalar (255, 0, 0, 255), 2);
    343. //                }
    344.  
    345.                 Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);
    346.  
    347.                 OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());
    348.             }
    349.         }
    350.  
    351.         /// <summary>
    352.         /// Raises the destroy event.
    353.         /// </summary>
    354.         void OnDestroy ()
    355.         {
    356.             if (webCamTextureToMatHelper != null)
    357.                 webCamTextureToMatHelper.Dispose ();
    358.  
    359.             if (faceLandmarkDetector != null)
    360.                 faceLandmarkDetector.Dispose ();
    361.  
    362.  
    363.             if (KF != null)
    364.                 KF.Dispose ();
    365.             if (state != null)
    366.                 state.Dispose ();
    367.             if (processNoise != null)
    368.                 processNoise.Dispose ();
    369.             if (measurement != null)
    370.                 measurement.Dispose ();
    371.  
    372.             if (prevgray != null)
    373.                 prevgray.Dispose ();
    374.             if (gray != null)
    375.                 gray.Dispose ();
    376.  
    377.             if (mOP2fPrevTrackPts != null)
    378.                 mOP2fPrevTrackPts.Dispose ();
    379.             if (mOP2fNextTrackPts != null)
    380.                 mOP2fNextTrackPts.Dispose ();
    381.             if (status != null)
    382.                 status.Dispose ();
    383.             if (err != null)
    384.                 err.Dispose ();
    385.            
    386.  
    387.  
    388.  
    389.             #if UNITY_WEBGL && !UNITY_EDITOR
    390.             foreach (var coroutine in coroutines) {
    391.                 StopCoroutine (coroutine);
    392.                 ((IDisposable)coroutine).Dispose ();
    393.             }
    394.             #endif
    395.         }
    396.  
    397.         /// <summary>
    398.         /// Raises the back button click event.
    399.         /// </summary>
    400.         public void OnBackButtonClick ()
    401.         {
    402.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    403.             SceneManager.LoadScene ("DlibFaceLandmarkDetectorExample");
    404.             #else
    405.             Application.LoadLevel ("DlibFaceLandmarkDetectorExample");
    406.             #endif
    407.         }
    408.  
    409.         /// <summary>
    410.         /// Raises the play button click event.
    411.         /// </summary>
    412.         public void OnPlayButtonClick ()
    413.         {
    414.             webCamTextureToMatHelper.Play ();
    415.         }
    416.  
    417.         /// <summary>
    418.         /// Raises the pause button click event.
    419.         /// </summary>
    420.         public void OnPauseButtonClick ()
    421.         {
    422.             webCamTextureToMatHelper.Pause ();
    423.         }
    424.  
    425.         /// <summary>
    426.         /// Raises the stop button click event.
    427.         /// </summary>
    428.         public void OnStopButtonClick ()
    429.         {
    430.             webCamTextureToMatHelper.Stop ();
    431.         }
    432.  
    433.         /// <summary>
    434.         /// Raises the change camera button click event.
    435.         /// </summary>
    436.         public void OnChangeCameraButtonClick ()
    437.         {
    438.             webCamTextureToMatHelper.Initialize (null, webCamTextureToMatHelper.requestedWidth, webCamTextureToMatHelper.requestedHeight, !webCamTextureToMatHelper.requestedIsFrontFacing);
    439.         }
    440.  
    441.  
    442.         // Calculates the variance of the point-to-point distances between two landmark sets
    443.         double calDistanceDiff (List<Point> curPoints, List<Point> lastPoints)
    444.         {
    445.             double variance = 0.0;
    446.             double sum = 0.0;
    447.             List<double> diffs = new List<double> ();
    448.             if (curPoints.Count == lastPoints.Count) {
    449.                 for (int i = 0; i < curPoints.Count; i++) {
    450.                     double diff = Math.Sqrt (Math.Pow (curPoints [i].x - lastPoints [i].x, 2.0) + Math.Pow (curPoints [i].y - lastPoints [i].y, 2.0));
    451.                     sum += diff;
    452.                     diffs.Add (diff);
    453.                 }
    454.                 double mean = sum / diffs.Count;
    455.                 for (int i = 0; i < curPoints.Count; i++) {
    456.                     variance += Math.Pow (diffs [i] - mean, 2);
    457.                 }
    458.                 return variance / diffs.Count;
    459.             }
    460.             return variance;
    461.         }
    462.  
    463.         static void Swap<T> (ref T a, ref T b)
    464.         {
    465.             var t = a;
    466.             a = b;
    467.             b = t;
    468.         }
    469.     }
    470. }
     
  44. zhengi

    zhengi

    Joined:
    Oct 19, 2017
    Posts:
    3
    Has anyone had issues with Aruco camera calibration? When I run Aruco.calibrateCameraCharuco and provide the correct parameters, the function never terminates and Unity freezes. I can provide a code sample if needed.
     
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Unfortunately, there is no example of camera calibration.
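
    For reference, the Q matrix itself can be computed with Calib3d.stereoRectify once both cameras are calibrated. A minimal sketch, assuming cameraMatrix1/2, distCoeffs1/2, imageSize, and the stereo extrinsics R and T have already been obtained (e.g. from Calib3d.stereoCalibrate):
    Code (CSharp):
    // Minimal sketch: obtaining the disparity-to-depth mapping matrix Q.
    // cameraMatrix1/2, distCoeffs1/2, imageSize, R and T are assumed inputs from a prior stereo calibration.
    Mat R1 = new Mat (); // rectification transform, first camera
    Mat R2 = new Mat (); // rectification transform, second camera
    Mat P1 = new Mat (); // rectified projection matrix, first camera
    Mat P2 = new Mat (); // rectified projection matrix, second camera
    Mat Q = new Mat ();  // 4x4 disparity-to-depth mapping matrix
    Calib3d.stereoRectify (cameraMatrix1, distCoeffs1, cameraMatrix2, distCoeffs2, imageSize, R, T, R1, R2, P1, P2, Q);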
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Please change Highgui.imwrite to Imgcodecs.imwrite.
    Code (CSharp):
    1. Imgcodecs.imwrite("C:/savefolder/face_" + i + ".jpg", faceMat);
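
    A slightly fuller sketch of saving each detected face follows; the destination folder is hypothetical, Application.persistentDataPath is safer than a fixed drive path on mobile platforms, and note that imwrite expects BGR channel order, so an RGBA Mat may need an Imgproc.cvtColor conversion first.
    Code (CSharp):
    // Minimal sketch: crop each detected face region and save it as a JPEG.
    // "faces" and "imgMat" are assumed to be the MatOfRect and source Mat from the detection code above.
    OpenCVForUnity.Rect[] rects = faces.toArray ();
    for (int i = 0; i < rects.Length; i++) {
        using (Mat faceMat = new Mat (imgMat, rects [i])) { // ROI view into imgMat
            string path = System.IO.Path.Combine (Application.persistentDataPath, "face_" + i + ".jpg");
            Imgcodecs.imwrite (path, faceMat);
        }
    }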
     
  47. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Please use Imgproc.createCLAHE ().
    Code (CSharp):
    1. CLAHE clahe = Imgproc.createCLAHE();
    https://stackoverflow.com/questions/35154686/opencv-3-java-binding-apply-clahe-to-image
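
    A minimal usage sketch (the clip limit and tile grid values are just illustrative; grayMat is assumed to be an 8-bit single-channel Mat):
    Code (CSharp):
    // Minimal sketch: applying CLAHE to a grayscale image.
    CLAHE clahe = Imgproc.createCLAHE ();
    clahe.setClipLimit (2.0);                  // contrast limiting threshold
    clahe.setTilesGridSize (new Size (8, 8));  // grid of local histogram regions
    Mat equalized = new Mat ();
    clahe.apply (grayMat, equalized);
    Background subtractors are created the same way, through factory methods rather than constructors, e.g. Video.createBackgroundSubtractorMOG2 ().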
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Could you provide your code?
     
  49. Sithdown

    Sithdown

    Joined:
    Aug 8, 2014
    Posts:
    10
    Hello;

    I've been trying to use the ARHeadWithFaceMaskExample you posted here, but with no luck.

    This is the error I get when I launch the Scene:

    and after that, this error is spammed every frame:

    The only change I made was in ARHeadWithFaceMaskExample.cs: I added "using OpenCVForUnityExample;" because lines 845 and 850 were using ARUtils but no ARUtils was loaded.

    Any ideas?

    Unity 2017.3p1


    EDIT: Fixed!

    The problem was with ARHeadWithFaceMaskExample.cs trying to load "shape_predictor_68_face_landmarks.dat". That file is not included, but can be downloaded from the dlib webpage:
    http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2

    That fixed the error :)
     
    Last edited: Feb 13, 2018
  50. SpiderJones

    SpiderJones

    Joined:
    Mar 29, 2014
    Posts:
    246
    Hi, at work we have purchased OpenCV for Unity and Dlib FaceLandmark Detector. We want to take a Texture2D and place a 3D model over the user's face, the same thing that is done in your ARHeadExample, but without the camera. I've tried to isolate the transform placement code in the ARHeadExample script, but it seems coupled to and dependent on a WebCamTextureToMatHelper object. Can you share an example showing specifically how to get the position, scale, and rotation from the landmark points in the context of a Texture2D? Thanks!
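
    For reference, a minimal sketch of running the detector on a static Texture2D, assuming the texture is readable and the landmark .dat file is set up as in the examples above (this is just an outline, not the asset's official sample):
    Code (CSharp):
    // Minimal sketch: landmark detection on a static Texture2D instead of a webcam frame.
    Mat imgMat = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    OpenCVForUnity.Utils.texture2DToMat (texture, imgMat);

    FaceLandmarkDetector faceLandmarkDetector = new FaceLandmarkDetector (DlibFaceLandmarkDetector.Utils.getFilePath ("sp_human_face_68.dat"));
    OpenCVForUnityUtils.SetImage (faceLandmarkDetector, imgMat);

    List<UnityEngine.Rect> detectResult = faceLandmarkDetector.Detect ();
    if (detectResult.Count > 0) {
        List<Vector2> points = faceLandmarkDetector.DetectLandmark (detectResult [0]);
        // From here, the pose (position/rotation) can be estimated from the 2D points with
        // Calib3d.solvePnP against a 3D face model, which is what the ARHeadExample does per frame.
    }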