
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    https://github.com/EnoxSoftware/Ope...xamples/WebCamTextureToMatHelper.cs#L266-L269
    Please change the code as follows.
    Code (CSharp):
    //#if !UNITY_EDITOR && !(UNITY_STANDALONE || UNITY_WEBGL)
    if (screenOrientation == ScreenOrientation.Portrait || screenOrientation == ScreenOrientation.PortraitUpsideDown) {
        rotatedRgbaMat = new Mat (webCamTexture.width, webCamTexture.height, CvType.CV_8UC4);
    }
    //#endif
     
  2. sjmtechs

    sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11

    Well, it seems face tracking stopped working.
    Face tracking starts working again when the camera is manually rotated 90 degrees.
    So is it possible to keep the webcam in its normal position and shoot & track in portrait mode?
     
  3. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    TextRecognitionExample is included in OpenCV for Unity v2.2.2. This example can detect single letters.
    https://github.com/EnoxSoftware/Ope...es/text/TextExample/TextRecognitionExample.cs
     
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    That is probably difficult.
     
  5. JoystickLab

    JoystickLab

    Joined:
    Mar 18, 2016
    Posts:
    11
    I have already seen that. It looks like the text recognition works on a still image. I need to detect text at runtime; I mean, the text should be detected while the webcam is capturing video. I'm not sure how to proceed with that.
     
  6. Great-Peter

    Great-Peter

    Joined:
    Oct 9, 2015
    Posts:
    14
    upload_2017-10-15_18-25-4.png

    Hi Enox!
    I can't detect a face in the FaceTrackerExample scene of the FaceTracker Example assets.
    OpenCV is v3.30.
    What is the problem?
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Do haarcascade_frontalface_alt.xml and tracker_model.json exist in the StreamingAssets folder?

    facetracker.PNG
     
    Great-Peter likes this.
  8. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Could you try this code?
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine.UI;
    using System;
    using System.Xml;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnityExample
    {
        /// <summary>
        /// WebCamTextureTextRecognitionExample Example
        /// </summary>
        [RequireComponent (typeof(WebCamTextureToMatHelper))]
        public class WebCamTextureTextRecognitionExample : MonoBehaviour
        {
            /// <summary>
            /// The flip vertical toggle.
            /// </summary>
            public Toggle flipVerticalToggle;

            /// <summary>
            /// The flip horizontal toggle.
            /// </summary>
            public Toggle flipHorizontalToggle;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The webcam texture to mat helper.
            /// </summary>
            WebCamTextureToMatHelper webCamTextureToMatHelper;

            /// <summary>
            /// The binary mat.
            /// </summary>
            Mat binaryMat;

            /// <summary>
            /// The mask mat.
            /// </summary>
            Mat maskMat;

            /// <summary>
            /// The rgb mat.
            /// </summary>
            Mat rgbMat;

            /// <summary>
            /// The er filter1.
            /// </summary>
            ERFilter er_filter1;

            /// <summary>
            /// The er filter2.
            /// </summary>
            ERFilter er_filter2;

            /// <summary>
            /// The decoder.
            /// </summary>
            OCRHMMDecoder decoder;

            string trained_classifierNM1_xml_filepath;
            string trained_classifierNM2_xml_filepath;
            string OCRHMM_transitions_table_xml_filepath;
            string OCRHMM_knn_model_data_xml_gz_filepath;

            #if UNITY_WEBGL && !UNITY_EDITOR
            Stack<IEnumerator> coroutines = new Stack<IEnumerator> ();
            #endif

            // Use this for initialization
            void Start ()
            {
                #if UNITY_WEBGL && !UNITY_EDITOR
                var getFilePath_Coroutine = GetFilePath ();
                coroutines.Push (getFilePath_Coroutine);
                StartCoroutine (getFilePath_Coroutine);
                #else
                trained_classifierNM1_xml_filepath = Utils.getFilePath ("text/trained_classifierNM1.xml");
                trained_classifierNM2_xml_filepath = Utils.getFilePath ("text/trained_classifierNM2.xml");
                OCRHMM_transitions_table_xml_filepath = Utils.getFilePath ("text/OCRHMM_transitions_table.xml");
                #if UNITY_ANDROID && !UNITY_EDITOR
                OCRHMM_knn_model_data_xml_gz_filepath = Utils.getFilePath ("text/OCRHMM_knn_model_data.xml");
                #else
                OCRHMM_knn_model_data_xml_gz_filepath = Utils.getFilePath ("text/OCRHMM_knn_model_data.xml.gz");
                #endif
                Run ();
                #endif
            }

            #if UNITY_WEBGL && !UNITY_EDITOR
            private IEnumerator GetFilePath ()
            {
                var getFilePathAsync_1_Coroutine = Utils.getFilePathAsync ("text/trained_classifierNM1.xml", (result) => {
                    trained_classifierNM1_xml_filepath = result;
                });
                coroutines.Push (getFilePathAsync_1_Coroutine);
                yield return StartCoroutine (getFilePathAsync_1_Coroutine);

                var getFilePathAsync_2_Coroutine = Utils.getFilePathAsync ("text/trained_classifierNM2.xml", (result) => {
                    trained_classifierNM2_xml_filepath = result;
                });
                coroutines.Push (getFilePathAsync_2_Coroutine);
                yield return StartCoroutine (getFilePathAsync_2_Coroutine);

                var getFilePathAsync_3_Coroutine = Utils.getFilePathAsync ("text/OCRHMM_transitions_table.xml", (result) => {
                    OCRHMM_transitions_table_xml_filepath = result;
                });
                coroutines.Push (getFilePathAsync_3_Coroutine);
                yield return StartCoroutine (getFilePathAsync_3_Coroutine);

                //Please strip ".gz" when using ".gz" file on WebGL platform.
                var getFilePathAsync_4_Coroutine = Utils.getFilePathAsync ("text/OCRHMM_knn_model_data.xml", (result) => {
                    OCRHMM_knn_model_data_xml_gz_filepath = result;
                });
                coroutines.Push (getFilePathAsync_4_Coroutine);
                yield return StartCoroutine (getFilePathAsync_4_Coroutine);

                coroutines.Clear ();

                Run ();
            }
            #endif

            private void Run ()
            {
                Utils.setDebugMode (true);

                er_filter1 = OpenCVForUnity.Text.createERFilterNM1 (trained_classifierNM1_xml_filepath, 8, 0.00015f, 0.13f, 0.2f, true, 0.1f);

                er_filter2 = OpenCVForUnity.Text.createERFilterNM2 (trained_classifierNM2_xml_filepath, 0.5f);

                Mat transition_p = new Mat (62, 62, CvType.CV_64FC1);
                //            string filename = "OCRHMM_transitions_table.xml";
                //            FileStorage fs(filename, FileStorage::READ);
                //            fs["transition_probabilities"] >> transition_p;
                //            fs.release();

                //Load TransitionProbabilitiesData.
                transition_p.put (0, 0, GetTransitionProbabilitiesData (OCRHMM_transitions_table_xml_filepath));

                Mat emission_p = Mat.eye (62, 62, CvType.CV_64FC1);
                string voc = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
                decoder = OCRHMMDecoder.create (OCRHMM_knn_model_data_xml_gz_filepath,
                    voc, transition_p, emission_p);

                webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
                webCamTextureToMatHelper.Initialize ();

                flipVerticalToggle.isOn = webCamTextureToMatHelper.flipVertical;
                flipHorizontalToggle.isOn = webCamTextureToMatHelper.flipHorizontal;
            }

            /// <summary>
            /// Raises the webcam texture to mat helper initialized event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInitialized ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInitialized");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();

                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;

                gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

                float width = webCamTextureMat.width ();
                float height = webCamTextureMat.height ();

                float widthScale = (float)Screen.width / width;
                float heightScale = (float)Screen.height / height;
                if (widthScale < heightScale) {
                    Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
                } else {
                    Camera.main.orthographicSize = height / 2;
                }

                binaryMat = new Mat ();
                maskMat = new Mat ();
                rgbMat = new Mat ();
            }

            /// <summary>
            /// Raises the webcam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                if (binaryMat != null)
                    binaryMat.Dispose ();
                if (maskMat != null)
                    maskMat.Dispose ();
                if (rgbMat != null)
                    rgbMat.Dispose ();
            }

            /// <summary>
            /// Raises the webcam texture to mat helper error occurred event.
            /// </summary>
            /// <param name="errorCode">Error code.</param>
            public void OnWebCamTextureToMatHelperErrorOccurred (WebCamTextureToMatHelper.ErrorCode errorCode)
            {
                Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
            }

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    Imgproc.cvtColor (rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB);

                    /*Text Detection*/
                    Imgproc.cvtColor (rgbMat, binaryMat, Imgproc.COLOR_RGB2GRAY);
                    Imgproc.threshold (binaryMat, binaryMat, 0, 255, Imgproc.THRESH_BINARY | Imgproc.THRESH_OTSU);
                    Core.absdiff (binaryMat, new Scalar (255), maskMat);

                    List<MatOfPoint> regions = new List<MatOfPoint> ();

                    OpenCVForUnity.Text.detectRegions (binaryMat, er_filter1, er_filter2, regions);

//                    Debug.Log ("regions.Count " + regions.Count);

                    MatOfRect groups_rects = new MatOfRect ();
                    List<OpenCVForUnity.Rect> rects = new List<OpenCVForUnity.Rect> ();
                    OpenCVForUnity.Text.erGrouping (rgbMat, binaryMat, regions, groups_rects);

                    for (int i = 0; i < regions.Count; i++) {
                        regions [i].Dispose ();
                    }
                    regions.Clear ();

                    rects.AddRange (groups_rects.toList ());

                    groups_rects.Dispose ();

                    /*Text Recognition (OCR)*/
                    List<Mat> detections = new List<Mat> ();

                    for (int i = 0; i < rects.Count; i++) {

                        Mat group_img = new Mat ();

                        maskMat.submat (rects [i]).copyTo (group_img);
                        Core.copyMakeBorder (group_img, group_img, 15, 15, 15, 15, Core.BORDER_CONSTANT, new Scalar (0));
                        detections.Add (group_img);
                    }

//                    Debug.Log ("detections.Count " + detections.Count);
//                    Debug.Log ("rects.Count " + rects.Count);

                    //#Visualization
                    for (int i = 0; i < rects.Count; i++) {

                        Imgproc.rectangle (rgbaMat, new Point (rects [i].x, rects [i].y), new Point (rects [i].x + rects [i].width, rects [i].y + rects [i].height), new Scalar (255, 0, 0, 255), 2);
                        Imgproc.rectangle (rgbaMat, new Point (rects [i].x, rects [i].y), new Point (rects [i].x + rects [i].width, rects [i].y + rects [i].height), new Scalar (255, 255, 255, 255), 1);

                        Imgproc.putText (rgbaMat, "" + i, new Point (rects [i].x, rects [i].y), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 0, 0, 255), 1, Imgproc.LINE_AA, false);
                    }

                    for (int i = 0; i < detections.Count; i++) {

                        string output = decoder.run (detections [i], 0);
                        Debug.Log ("output " + output);
                        if (string.IsNullOrEmpty (output)) {
                            Debug.LogError ("IsNullOrEmpty output " + output);
                        } else {
                            Imgproc.putText (rgbaMat, "  " + output, new Point (rects [i].x, rects [i].y), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 0, 0, 255), 1, Imgproc.LINE_AA, false);
                        }
                    }

                    for (int i = 0; i < detections.Count; i++) {
                        detections [i].Dispose ();
                    }
                    detections.Clear ();

                    Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);

                    Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());
                }
            }

            /// <summary>
            /// Gets the transition probabilities data.
            /// </summary>
            /// <returns>The transition probabilities data.</returns>
            /// <param name="filePath">File path.</param>
            double[] GetTransitionProbabilitiesData (string filePath)
            {
                XmlDocument xmlDoc = new XmlDocument ();
                xmlDoc.Load (filePath);

                XmlNode dataNode = xmlDoc.GetElementsByTagName ("data").Item (0);
//                Debug.Log ("dataNode.InnerText " + dataNode.InnerText);
                string[] dataString = dataNode.InnerText.Split (new string[] {
                    " ",
                    "\r\n", "\n"
                }, StringSplitOptions.RemoveEmptyEntries);
//                Debug.Log ("dataString.Length " + dataString.Length);

                double[] data = new double[dataString.Length];
                for (int i = 0; i < data.Length; i++) {
                    try {
                        data [i] = Convert.ToDouble (dataString [i]);
                    } catch (FormatException) {
                        Debug.Log ("Unable to convert '{" + dataString [i] + "}' to a Double.");
                    } catch (OverflowException) {
                        Debug.Log ("'{" + dataString [i] + "}' is outside the range of a Double.");
                    }
                }

                return data;
            }

            /// <summary>
            /// Raises the destroy event.
            /// </summary>
            void OnDestroy ()
            {
                webCamTextureToMatHelper.Dispose ();

                if (er_filter1 != null) er_filter1.Dispose ();
                if (er_filter2 != null) er_filter2.Dispose ();
                if (decoder != null) decoder.Dispose ();

                #if UNITY_WEBGL && !UNITY_EDITOR
                foreach (var coroutine in coroutines) {
                    StopCoroutine (coroutine);
                    ((IDisposable)coroutine).Dispose ();
                }
                #endif

                Utils.setDebugMode (false);
            }

            /// <summary>
            /// Raises the back button click event.
            /// </summary>
            public void OnBackButtonClick ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnityExample");
                #else
                Application.LoadLevel ("OpenCVForUnityExample");
                #endif
            }

            /// <summary>
            /// Raises the play button click event.
            /// </summary>
            public void OnPlayButtonClick ()
            {
                webCamTextureToMatHelper.Play ();
            }

            /// <summary>
            /// Raises the pause button click event.
            /// </summary>
            public void OnPauseButtonClick ()
            {
                webCamTextureToMatHelper.Pause ();
            }

            /// <summary>
            /// Raises the stop button click event.
            /// </summary>
            public void OnStopButtonClick ()
            {
                webCamTextureToMatHelper.Stop ();
            }

            /// <summary>
            /// Raises the change camera button click event.
            /// </summary>
            public void OnChangeCameraButtonClick ()
            {
                webCamTextureToMatHelper.Initialize (null, webCamTextureToMatHelper.requestedWidth, webCamTextureToMatHelper.requestedHeight, !webCamTextureToMatHelper.requestedIsFrontFacing);
            }

            /// <summary>
            /// Raises the flip vertical toggle value changed event.
            /// </summary>
            public void OnFlipVerticalToggleValueChanged ()
            {
                webCamTextureToMatHelper.flipVertical = flipVerticalToggle.isOn;
            }

            /// <summary>
            /// Raises the flip horizontal toggle value changed event.
            /// </summary>
            public void OnFlipHorizontalToggleValueChanged ()
            {
                webCamTextureToMatHelper.flipHorizontal = flipHorizontalToggle.isOn;
            }
        }
    }
     
    Great-Peter likes this.
  9. ikasapoglu

    ikasapoglu

    Joined:
    Jul 17, 2017
    Posts:
    13
    Hi Enox, I need to turn on the flashlight. I wrote a camera script/method, but it doesn't work because WebCamTextureToMatHelper is using the camera too. So how can I turn on the flashlight while using WebCamTextureToMatHelper (on a mobile device)?

    The exception is "Fail to connect to camera service".
     
  10. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    It seems difficult to use WebCamTexture and the flashlight at the same time.
    https://tutel.me/c/programming/questions/30657247/get+camera+feed+with+flashlight

    Perhaps you need to combine it with one of these assets:
    https://assetstore.unity.com/packages/tools/integration/natcam-core-webcam-api-75716
    https://assetstore.unity.com/packages/tools/camera/camera-capture-kit-56673
     
    ikasapoglu likes this.
  11. Gustavo-Quiroz

    Gustavo-Quiroz

    Joined:
    Jul 26, 2013
    Posts:
    36
    Hello,

    What is the proper way to convert a Unity Texture to an OpenCV Mat?

    I have tried the following code, but the final texture doesn't seem to be modified at all:

    Code (CSharp):
    Mat rgbaMat = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    OpenCVForUnity.Utils.textureToMat (texture, rgbaMat);

    Imgproc.line(rgbaMat, new Point(0, 0), new Point(rgbaMat.cols(), rgbaMat.rows()), new Scalar(255, 0, 0, 255), 4);

    OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture);
    renderer.material.mainTexture = texture;
    Note: it does work if I first convert the texture to a Texture2D:

    Code (CSharp):
    Mat rgbaMat = new Mat (preview.height, preview.width, CvType.CV_8UC4);
    Texture2D temp = new Texture2D (rgbaMat.cols (), rgbaMat.rows (), TextureFormat.RGBA32, false);

    OpenCVForUnity.Utils.textureToTexture2D(preview, temp);
    OpenCVForUnity.Utils.texture2DToMat (temp, rgbaMat);

    Imgproc.line(rgbaMat, new Point(0, 0), new Point(rgbaMat.cols(), rgbaMat.rows()), new Scalar(255, 0, 0, 255), 4);

    OpenCVForUnity.Utils.matToTexture2D (rgbaMat, temp);

    renderer.material.mainTexture = temp;
     
  12. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    The OpenCVForUnity.Utils.textureToMat () method does not work in some environments.
    Since the OpenCVForUnity.Utils.textureToTexture2D () method works on all platforms, I recommend using OpenCVForUnity.Utils.textureToTexture2D ().
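    [Editor's note] For readers following along, the recommended round trip could look roughly like this. This is a minimal sketch based on the posts above, not an official example; `sourceTexture` and `targetRenderer` are placeholder names, and it assumes the Utils methods named in this thread (textureToTexture2D, texture2DToMat, matToTexture2D).
    Code (CSharp):
    ```csharp
    // Placeholder inputs (not part of the asset's API):
    Texture sourceTexture = targetRenderer.material.mainTexture;

    // Intermediate Texture2D in a format texture2DToMat understands (RGBA32).
    Texture2D tmp = new Texture2D (sourceTexture.width, sourceTexture.height, TextureFormat.RGBA32, false);
    OpenCVForUnity.Utils.textureToTexture2D (sourceTexture, tmp);

    // Mat dimensions are (rows, cols) = (height, width).
    Mat rgbaMat = new Mat (tmp.height, tmp.width, CvType.CV_8UC4);
    OpenCVForUnity.Utils.texture2DToMat (tmp, rgbaMat);

    // ...process rgbaMat with Imgproc here...

    // Copy the processed Mat back and display it.
    OpenCVForUnity.Utils.matToTexture2D (rgbaMat, tmp);
    targetRenderer.material.mainTexture = tmp;
    ```
    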
     
  13. Great-Peter

    Great-Peter

    Joined:
    Oct 9, 2015
    Posts:
    14
    Hi Enox!
    In FaceTracker.cs in the FaceTrackerExample,
    do you know how to turn points[0][n] into a screen position?
    I want to put my heart image on an eye.
    Do you know how?
    I'm trying the line below, but it is not the correct position.

    heartimg.position = new Vector3(points[0][23].x, points[0][23].y, 0f);

    Help me!
     
    Last edited: Oct 23, 2017
  14. Great-Peter

    Great-Peter

    Joined:
    Oct 9, 2015
    Posts:
    14

    Fixed it!!!
    GREAT!
     
  15. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    It is possible to change the display object of FaceTracker ARExample.
    facetracker_ar_2.PNG
    facetracker_ar_1.PNG
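    [Editor's note] A hedged sketch of one common way to map an OpenCV image point onto the display quad in examples like these. It assumes the quad is scaled to the image size and centered at the world origin with a matching orthographic camera (as in the helper examples earlier in this thread); `imagePoint`, `imgWidth`, and `imgHeight` are placeholder names, not part of the FaceTracker API.
    Code (CSharp):
    ```csharp
    // OpenCV image coordinates: origin at top-left, y grows downward.
    // With the quad scaled to (imgWidth, imgHeight) and centered at the
    // origin, the mapping is an offset plus a y flip.
    Vector3 ImagePointToWorld (Point imagePoint, float imgWidth, float imgHeight)
    {
        float worldX = (float)imagePoint.x - imgWidth / 2f;
        float worldY = imgHeight / 2f - (float)imagePoint.y;
        return new Vector3 (worldX, worldY, 0f);
    }
    ```
    If the overlay is a UI element rather than a world-space object, the point would instead need to be scaled into screen pixels.
    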
     
  16. Gustavo-Quiroz

    Gustavo-Quiroz

    Joined:
    Jul 26, 2013
    Posts:
    36
    Hello,

    I'm trying to convert a texture (from a GameObject's renderer.material.mainTexture, which is being updated with the mobile camera feed) to a Texture2D, then the Texture2D to a Mat, in order to recognize faces in the original texture, and then convert the Mat back to a texture and display it somewhere.

    My problem is that after converting the Mat to a texture and displaying it, the resulting texture is transparent except for a text I print on the Mat. Here is my code:

    Code (CSharp):
    public UnityEngine.UI.RawImage debugImage;

    Mat rgbaMat;
    Renderer renderer;

    Color32[] colors;
    Texture2D texture;

    void Start(){
        renderer = GetComponent<Renderer> ();
        OpenCVForUnity.Utils.setDebugMode (true);

        int width = (int)renderer.material.mainTexture.width;
        int height = (int)renderer.material.mainTexture.height;

        colors = new Color32[width * height];
        rgbaMat = new Mat (height, width, CvType.CV_8UC4);
        texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
    }

    void Update(){
        OpenCVForUnity.Utils.textureToTexture2D (renderer.material.mainTexture, texture);
        OpenCVForUnity.Utils.texture2DToMat (texture, rgbaMat);

        //....Logic....

        Imgproc.putText (rgbaMat, "Faces " + detectResult.Count, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.5, new Scalar (255, 255, 255, 255), 5, Imgproc.LINE_AA, false);
        OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, colors);

        debugImage.texture = texture;
    }
    This code works great in the editor (Mac), but on Android the texture comes out full of transparency. (Actually, I cannot detect any face on Android, and I think it's because the Mat is somehow created with the dimensions of the texture but with no color information other than alpha.)

    Is there any suggestion for my case?

    Note* I know that OpenCV has a built-in way to get frames from the camera and convert them to a Mat, but I'm working on top of a system that has its own camera implementation and I can't replace it. I do have access to an IntPtr RGBA32GPUPtr that is updated with the camera feed, but even when using copyToMat (IntPtr intPtr, Mat mat) I get the same result: a texture full of transparency with a text added with OpenCV.
     
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    It seems that processing failed on these lines:
    OpenCVForUnity.Utils.textureToTexture2D (renderer.material.mainTexture, texture);
    OpenCVForUnity.Utils.texture2DToMat (texture, rgbaMat);

    Is it possible to get a pixel data array instead of RGBA32GPUPtr? If so, you can copy the data directly to a Mat.
    http://enoxsoftware.github.io/OpenC..._utils.html#a721b794c719915cb450c794fce7a8568
    Code (CSharp):
    Utils.copyToMat<byte>(byteArray, rgbaMat);
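    [Editor's note] A hedged usage sketch of that call: it assumes a `byte[]` in RGBA order whose length matches the destination Mat (width * height * 4 bytes for CV_8UC4); `byteArray`, `width`, and `height` are placeholder names.
    Code (CSharp):
    ```csharp
    // The destination Mat must already be allocated with a matching
    // size and type: CV_8UC4 holds 4 bytes per pixel, so byteArray.Length
    // should equal width * height * 4.
    Mat rgbaMat = new Mat (height, width, CvType.CV_8UC4);
    Utils.copyToMat<byte> (byteArray, rgbaMat);
    ```
    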
     
  18. Handsome-Wisely

    Handsome-Wisely

    Joined:
    Mar 20, 2013
    Posts:
    68
    Hi, author! I want to compute my HOGDescriptor, so I use it like this:
    But I cannot get fMat. Can you tell me why?
    Thanks
     
  19. Handsome-Wisely

    Handsome-Wisely

    Joined:
    Mar 20, 2013
    Posts:
    68
    Code (CSharp):
    imgMat = new Mat(tex.height, tex.width, CvType.CV_32F);
    Utils.texture2DToMat(tex, imgMat);
    HOGDescriptor hg = new HOGDescriptor(new Size(32, 64), new Size(16, 16), new Size(8, 8), new Size(8, 8), 9);
    fMat = new MatOfFloat();
    hg.compute(imgMat, fMat);
    float[] fa = fMat.toArray();
    My code is like this.
     
  20. ikasapoglu

    ikasapoglu

    Joined:
    Jul 17, 2017
    Posts:
    13
    Enox, can I do autofocus (it should work on iOS, Android, and the editor)? Did you try this? I'm asking because I cannot catch the marker or other objects.

    Thank you.
     
  21. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    In order to display the native OpenCV error code, please enclose the code between Utils.setDebugMode(true) and Utils.setDebugMode(false).
    Are any errors displayed in the console?

    Also, please refer to the following posts:
    https://forum.unity.com/threads/released-opencv-for-unity.277080/page-21#post-2968159
    https://stackoverflow.com/questions/38233753/android-opencv-why-hog-descriptors-are-always-zero
    https://stackoverflow.com/questions/24560626/hog-parameters-in-opencv-java-version/24562589
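    [Editor's note] A hedged sketch of the usual fix discussed in those links: HOGDescriptor.compute expects an 8-bit input image, while the snippet above allocates the Mat as CV_32F. This is not the asset's official example; `tex` is the questioner's texture, and the window/block sizes are copied from the snippet above.
    Code (CSharp):
    ```csharp
    // texture2DToMat writes 8-bit RGBA data, so allocate the Mat as
    // CV_8UC4, not CV_32F as in the original snippet.
    Mat imgMat = new Mat (tex.height, tex.width, CvType.CV_8UC4);
    Utils.texture2DToMat (tex, imgMat);

    // HOGDescriptor.compute expects an 8-bit image; convert to grayscale.
    Mat grayMat = new Mat ();
    Imgproc.cvtColor (imgMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

    HOGDescriptor hog = new HOGDescriptor (new Size (32, 64), new Size (16, 16), new Size (8, 8), new Size (8, 8), 9);
    MatOfFloat descriptors = new MatOfFloat ();
    hog.compute (grayMat, descriptors);
    float[] fa = descriptors.toArray ();
    ```
    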
     
  22. Handsome-Wisely

    Handsome-Wisely

    Joined:
    Mar 20, 2013
    Posts:
    68
    Thank you very much for your help. Now I can get my MatOfFloat and train my SVM model OK. Another question: when I use svm.save(path); the Unity editor crashes!
    My code is like this:
    Code (CSharp):
    int totalSampleCount = posTex.Length + negTex.Length;
    Mat sampleFeaturesMat = new Mat(totalSampleCount, mathSize, CvType.CV_32FC1);
    Mat sampleLabelMat = new Mat(totalSampleCount, 1, CvType.CV_32SC1);
    for (int i = 0; i < posTex.Length; i++)
    {
        MatOfFloat res = getHogRes(posTex[i]);
        sampleFeaturesMat.put(i, 0, res.toArray());
        sampleLabelMat.put(i, 0, new int[] { 1 });
    }
    int startPos = posTex.Length;
    for (int i = 0; i < negTex.Length; i++)
    {
        MatOfFloat res = getHogRes(negTex[i]);
        sampleFeaturesMat.put(startPos, 0, res.toArray());
        sampleLabelMat.put(startPos, 0, new int[] { -1 });
        startPos++;
    }
    SVM svm = SVM.create();
    svm.setType(SVM.C_SVC);
    svm.setKernel(SVM.LINEAR);
    svm.setTermCriteria(new TermCriteria(TermCriteria.MAX_ITER, 100, 1e-6));
    svm.train(sampleFeaturesMat, Ml.ROW_SAMPLE, sampleLabelMat);
    Debug.Log("train complete!");
    try
    {
        svmPath = Application.streamingAssetsPath + "/hog.xml";
        //this line makes the Unity editor crash
        svm.save(svmPath);
    }
    catch(Exception _ex)
    {
        Debug.LogError(_ex);
    }
    and the Unity Editor crash log looks like this:
     
  23. charlee-qq

    charlee-qq

    Joined:
    Oct 2, 2012
    Posts:
    4
    Hello, may I ask you something? In ArUco, I want to spawn a different object for each marker id, e.g. marker id 1 spawns a cube and marker id 2 spawns a sphere. How can I do that?
    thank you so much.
     
  24. ikasapoglu

    ikasapoglu

    Joined:
    Jul 17, 2017
    Posts:
    13
    Hello, first define your game objects, e.g. cube1, cube2, cube3, and define a Dictionary&lt;int, Mat&gt;, let's say named markerSet. Then detect the markers. After detection, loop over ids, which gives you each marker's own id, and add each entry to markerSet&lt;id, corners&gt;, like:
    Code (CSharp):
    1. for (int i = 0; i < ids.total (); i++) {
    2.     // read each detected id as an int and store its corner Mat
    3.     markerSet.Add ((int)ids.get (i, 0) [0], corners [i]);
    4. }
    Now you know the markers and their ids, and you can look up a marker's corners when instantiating a cube, for example:
    Code (CSharp):
    1. markerSet [1].get (0, 0); // 1 is the marker id; (0, 0) is the top-left corner point.
    Hopefully this helps you. Of course, Enox may have a better idea, so you can wait for him to reply.
     
    charlee-qq and EnoxSoftware like this.
  25. kwkw

    kwkw

    Joined:
    Mar 3, 2014
    Posts:
    9
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Does the problem occur even if you specify a different svmPath?
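    In case it helps (an assumption, not a confirmed fix): StreamingAssets is read-only at runtime on some platforms, so saving to Application.persistentDataPath, which is always writable, is worth trying:

    ```csharp
    // Sketch: save the trained SVM (the `svm` variable from the code above) to a writable folder.
    string svmPath = System.IO.Path.Combine (Application.persistentDataPath, "hog.xml");
    svm.save (svmPath);
    Debug.Log ("SVM saved to " + svmPath);
    ```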
     
  27. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    kwkw likes this.
  28. MathiasNorregaard

    MathiasNorregaard

    Joined:
    Mar 1, 2017
    Posts:
    5
    Hey there, have there been any reports of issues with iOS 11?
    I'm using Unity 5.6.4. It works fine on Android, but when building for iOS I get 212 "Apple Mach-O Linker ID" errors during linking in Xcode.
    They all look something like this:

    Code (CSharp):
    1. Undefined symbols for architecture arm64:
    2.   "_xphoto_Xphoto_dctDenoising_11", referenced from:
    3.       _Xphoto_dctDenoising_m1784276375 in Bulk_Assembly-CSharp_3.o
    4.       _Xphoto_xphoto_Xphoto_dctDenoising_11_m2173816032 in Bulk_Assembly-CSharp_3.o
    5.      (maybe you meant: _Xphoto_xphoto_Xphoto_dctDenoising_11_m2173816032)
    6.  
    So I'm not getting any errors in Unity, just in Xcode. Any ideas? I don't usually have any problems building for iOS.
     
  29. Gustavo-Quiroz

    Gustavo-Quiroz

    Joined:
    Jul 26, 2013
    Posts:
    36
    Unfortunately this didn't help either; what I had to do to solve this was wait for the end of the frame before converting the Texture to a Mat:

    Code (CSharp):
    1. IEnumerator _Tracking () {
    2.         OpenCVForUnity.Utils.setDebugMode (true);
    3.         int width = (int)renderer.material.mainTexture.width;
    4.         int height = (int)renderer.material.mainTexture.height;
    5.  
    6.         while (tracking) {
    7.             yield return new WaitForEndOfFrame ();
    8.  
    9.             rgbaMat = rgbaMat ?? new Mat (width, height, CvType.CV_8UC4);
    10.             texture = texture ?? new Texture2D (width, height, TextureFormat.RGBA32, false);
    11.  
    12.             Utils.textureToTexture2D (renderer.material.mainTexture, texture);
    13.             Utils.fastTexture2DToMat (texture, rgbaMat);
    14.  
    15.             //....Logic....
    16.  
    17.             Imgproc.putText (rgbaMat, "Faces " + detectResult.Count, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.5, new Scalar (255, 255, 255, 255), 5, Imgproc.LINE_AA, false);
    18.             OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, colors);
    19.         }
    20.         rgbaMat.Dispose ();
    21.         OpenCVForUnity.Utils.setDebugMode (false);
    22. }
     
    EnoxSoftware likes this.
  30. charlee-qq

    charlee-qq

    Joined:
    Oct 2, 2012
    Posts:
    4
    Um... I want to fix a game object to each marker id, e.g. marker id 1 should have the red cube only. In the attached photo each marker has its id number spawned on it (in green, so sorry for the small size). When spawning a red cube on a marker, I want the red cube fixed to marker id 1 only (sometimes the red cube and the blue cube swap positions between markers). Also, which function displays the marker id, like the green characters on the markers?
    Thank you for your help.
     

    Attached Files:

  31. ikasapoglu

    ikasapoglu

    Joined:
    Jul 17, 2017
    Posts:
    13
    I don't quite understand your problem exactly, and I would need to see your code, because it doesn't seem like a problem with OpenCV.

    The drawDetectedMarkers method does that.
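    For reference, the call looks like this (with `rgbMat`, `corners`, and `ids` being the outputs of Aruco.detectMarkers):

    ```csharp
    // Draws each detected marker's border and its id (the green text) onto the image.
    Aruco.drawDetectedMarkers (rgbMat, corners, ids, new Scalar (0, 255, 0));
    ```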
     
  32. ikasapoglu

    ikasapoglu

    Joined:
    Jul 17, 2017
    Posts:
    13
    I cannot catch the marker or other objects because of the blurry camera image.
    @EnoxSoftware
     
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Are the Import Settings set correctly?
    スクリーンショット 2015-07-25 22.30.11.png
     
  35. wbknox

    wbknox

    Joined:
    Aug 1, 2016
    Posts:
    11
    Thanks for providing access to this wonderful library within Unity.

    Are matrix operations like the ones listed here implemented in some way? E.g., "A*alpha" for scaling clearly doesn't work in C#; was an alternative made available within OpenCVForUnity?
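    For anyone wondering, operator overloads like `A * alpha` are not available here; the equivalents live on the Core class of the Java-style API that OpenCVForUnity wraps (a sketch, not from the original post):

    ```csharp
    // A * alpha (per-element scaling by a scalar)
    Mat A = Mat.eye (3, 3, CvType.CV_64FC1);
    Mat scaled = new Mat ();
    Core.multiply (A, new Scalar (2.0), scaled);

    // A + B (per-element addition)
    Mat sum = new Mat ();
    Core.add (A, scaled, sum);

    // A * B (true matrix multiplication, via generalized matrix multiply)
    Mat product = new Mat ();
    Core.gemm (A, scaled, 1, new Mat (), 0, product);
    ```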
     
  36. charlee-qq

    charlee-qq

    Joined:
    Oct 2, 2012
    Posts:
    4
    Thank you for the comment, and I'm sorry for the confusion. I understand the drawDetectedMarkers method draws the marker's frame; but how do I check which marker number the system found, e.g. if marker id 1 is found, Debug.Log("marker1");?
     
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
  38. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    You can change which game object is displayed depending on the id of the marker.
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4.  
    5. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    6. using UnityEngine.SceneManagement;
    7. #endif
    8. using OpenCVForUnity;
    9.  
    10. namespace OpenCVForUnityExample
    11. {
    12.     /// <summary>
    13.     /// ArUco Example
    14.     /// An example of marker-based AR view and camera pose estimation using the aruco (ArUco Marker Detection) module.
    15.     /// Referring to https://github.com/opencv/opencv_contrib/blob/master/modules/aruco/samples/detect_markers.cpp.
    16.     /// http://docs.opencv.org/3.1.0/d5/dae/tutorial_aruco_detection.html
    17.     /// </summary>
    18.     public class ArUcoExample : MonoBehaviour
    19.     {
    20.         /// <summary>
    21.         /// The image texture.
    22.         /// </summary>
    23.         public Texture2D imgTexture;
    24.        
    25.         /// <summary>
    26.         /// The dictionary identifier.
    27.         /// </summary>
    28.         public int dictionaryId = 10;
    29.        
    30.         /// <summary>
    31.         /// Determines if shows rejected markers.
    32.         /// </summary>
    33.         public bool showRejected = true;
    34.        
    35.         /// <summary>
    36.         /// Determines if applied the pose estimation.
    37.         /// </summary>
    38.         public bool applyEstimationPose = true;
    39.        
    40.         /// <summary>
    41.         /// The length of the marker.
    42.         /// </summary>
    43.         public float markerLength = 100;
    44.  
    45.         /// <summary>
    46.         /// The AR game object 1.
    47.         /// </summary>
    48.         public GameObject ARGameObject_1;
    49.  
    50.         /// <summary>
    51.         /// The AR game object 2.
    52.         /// </summary>
    53.         public GameObject ARGameObject_2;
    54.        
    55.         /// <summary>
    56.         /// The AR camera.
    57.         /// </summary>
    58.         public Camera ARCamera;
    59.        
    60.         /// <summary>
    61.         /// Determines if request the AR camera moving.
    62.         /// </summary>
    63.         public bool shouldMoveARCamera = false;
    64.        
    65.         // Use this for initialization
    66.         void Start ()
    67.         {
    68.             Mat rgbMat = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC3);
    69.            
    70.             Utils.texture2DToMat (imgTexture, rgbMat);
    71.             Debug.Log ("imgMat dst ToString " + rgbMat.ToString ());
    72.            
    73.            
    74.             gameObject.transform.localScale = new Vector3 (imgTexture.width, imgTexture.height, 1);
    75.             Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    76.            
    77.             float width = rgbMat.width ();
    78.             float height = rgbMat.height ();
    79.            
    80.             float imageSizeScale = 1.0f;
    81.             float widthScale = (float)Screen.width / width;
    82.             float heightScale = (float)Screen.height / height;
    83.             if (widthScale < heightScale) {
    84.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    85.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    86.             } else {
    87.                 Camera.main.orthographicSize = height / 2;
    88.             }
    89.            
    90.            
    91.             // set cameraparam.
    92.             int max_d = (int)Mathf.Max (width, height);
    93.             double fx = max_d;
    94.             double fy = max_d;
    95.             double cx = width / 2.0f;
    96.             double cy = height / 2.0f;
    97.             Mat camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    98.             camMatrix.put (0, 0, fx);
    99.             camMatrix.put (0, 1, 0);
    100.             camMatrix.put (0, 2, cx);
    101.             camMatrix.put (1, 0, 0);
    102.             camMatrix.put (1, 1, fy);
    103.             camMatrix.put (1, 2, cy);
    104.             camMatrix.put (2, 0, 0);
    105.             camMatrix.put (2, 1, 0);
    106.             camMatrix.put (2, 2, 1.0f);
    107.             Debug.Log ("camMatrix " + camMatrix.dump ());
    108.            
    109.            
    110.             MatOfDouble distCoeffs = new MatOfDouble (0, 0, 0, 0);
    111.             Debug.Log ("distCoeffs " + distCoeffs.dump ());
    112.            
    113.            
    114.             // calibration camera.
    115.             Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
    116.             double apertureWidth = 0;
    117.             double apertureHeight = 0;
    118.             double[] fovx = new double[1];
    119.             double[] fovy = new double[1];
    120.             double[] focalLength = new double[1];
    121.             Point principalPoint = new Point (0, 0);
    122.             double[] aspectratio = new double[1];
    123.            
    124.             Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    125.            
    126.             Debug.Log ("imageSize " + imageSize.ToString ());
    127.             Debug.Log ("apertureWidth " + apertureWidth);
    128.             Debug.Log ("apertureHeight " + apertureHeight);
    129.             Debug.Log ("fovx " + fovx [0]);
    130.             Debug.Log ("fovy " + fovy [0]);
    131.             Debug.Log ("focalLength " + focalLength [0]);
    132.             Debug.Log ("principalPoint " + principalPoint.ToString ());
    133.             Debug.Log ("aspectratio " + aspectratio [0]);
    134.            
    135.            
    136.             // To convert the difference of the FOV value of the OpenCV and Unity.
    137.             double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
    138.             double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));
    139.            
    140.             Debug.Log ("fovXScale " + fovXScale);
    141.             Debug.Log ("fovYScale " + fovYScale);
    142.            
    143.            
    144.             // Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    145.             if (widthScale < heightScale) {
    146.                 ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    147.             } else {
    148.                 ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    149.             }
    150.            
    151.            
    152.            
    153.             Mat ids = new Mat ();
    154.             List<Mat> corners = new List<Mat> ();
    155.             List<Mat> rejected = new List<Mat> ();
    156.             Mat rvecs = new Mat ();
    157.             Mat tvecs = new Mat ();
    158.             Mat rotMat = new Mat (3, 3, CvType.CV_64FC1);
    159.            
    160.             DetectorParameters detectorParams = DetectorParameters.create ();
    161.             Dictionary dictionary = Aruco.getPredefinedDictionary (dictionaryId);
    162.            
    163.            
    164.             // detect markers.
    165.             Aruco.detectMarkers (rgbMat, dictionary, corners, ids, detectorParams, rejected, camMatrix, distCoeffs);
    166.  
    167.             // estimate pose.
    168.             if (applyEstimationPose && ids.total () > 0)
    169.                 Aruco.estimatePoseSingleMarkers (corners, markerLength, camMatrix, distCoeffs, rvecs, tvecs);
    170.  
    171.             if (ids.total () > 0) {
    172.                 Aruco.drawDetectedMarkers (rgbMat, corners, ids, new Scalar (255, 0, 0));
    173.                
    174.                 if (applyEstimationPose) {
    175.                     for (int i = 0; i < ids.total (); i++) {
    176.                         Debug.Log ("ids.dump() " + ids.dump ());
    177.                        
    178.                         Aruco.drawAxis (rgbMat, camMatrix, distCoeffs, rvecs, tvecs, markerLength * 0.5f);
    179.                        
    180.                         // This example can display ARObject on only first detected marker.
    181. //                        if (i == 0) {
    182.  
    183.                         // position
    184.                         double[] tvec = tvecs.get (i, 0);
    185.  
    186.                         // rotation
    187.                         double[] rv = rvecs.get (i, 0);
    188.                         Mat rvec = new Mat (3, 1, CvType.CV_64FC1);
    189.                         rvec.put (0, 0, rv [0]);
    190.                         rvec.put (1, 0, rv [1]);
    191.                         rvec.put (2, 0, rv [2]);
    192.                         Calib3d.Rodrigues (rvec, rotMat);
    193.  
    194.                         Matrix4x4 transformationM = new Matrix4x4 (); // from OpenCV
    195.                         transformationM.SetRow (0, new Vector4 ((float)rotMat.get (0, 0) [0], (float)rotMat.get (0, 1) [0], (float)rotMat.get (0, 2) [0], (float)tvec [0]));
    196.                         transformationM.SetRow (1, new Vector4 ((float)rotMat.get (1, 0) [0], (float)rotMat.get (1, 1) [0], (float)rotMat.get (1, 2) [0], (float)tvec [1]));
    197.                         transformationM.SetRow (2, new Vector4 ((float)rotMat.get (2, 0) [0], (float)rotMat.get (2, 1) [0], (float)rotMat.get (2, 2) [0], (float)tvec [2]));
    198.                         transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));
    199.                         Debug.Log ("transformationM " + transformationM.ToString ());
    200.  
    201.                         Matrix4x4 invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, 1, -1));
    202.                         Debug.Log ("invertZM " + invertZM.ToString ());
    203.                            
    204.                         Matrix4x4 invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
    205.                         Debug.Log ("invertYM " + invertYM.ToString ());
    206.  
    207.                         // right-handed coordinates system (OpenCV) to left-handed one (Unity)
    208.                         Matrix4x4 ARM = invertYM * transformationM;
    209.                            
    210.                         // Apply Z axis inverted matrix.
    211.                         ARM = ARM * invertZM;
    212.  
    213.  
    214.  
    215.                         ARM = ARCamera.transform.localToWorldMatrix * ARM;
    216.  
    217.                         Debug.Log ("ARM " + ARM.ToString ());
    218.  
    219.  
    220.  
    221.                         //get marker's id
    222.                         int id = (int)ids.get (i, 0) [0];
    223.  
    224.                         if (id == 1) {
    225.  
    226.                             ARUtils.SetTransformFromMatrix (ARGameObject_1.transform, ref ARM);
    227.                         } else if (id == 2) {
    228.                            
    229.                             ARUtils.SetTransformFromMatrix (ARGameObject_2.transform, ref ARM);
    230.                         }
    231. //                        }
    232.                     }
    233.                 }
    234.             }
    235.            
    236.             if (showRejected && rejected.Count > 0)
    237.                 Aruco.drawDetectedMarkers (rgbMat, rejected, new Mat (), new Scalar (0, 0, 255));
    238.            
    239.            
    240.             Texture2D texture = new Texture2D (rgbMat.cols (), rgbMat.rows (), TextureFormat.RGBA32, false);
    241.            
    242.             Utils.matToTexture2D (rgbMat, texture);
    243.            
    244.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    245.         }
    246.        
    247.         // Update is called once per frame
    248.         void Update ()
    249.         {
    250.            
    251.         }
    252.  
    253.         /// <summary>
    254.         /// Raises the back button click event.
    255.         /// </summary>
    256.         public void OnBackButtonClick ()
    257.         {
    258.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    259.             SceneManager.LoadScene ("OpenCVForUnityExample");
    260.             #else
    261.             Application.LoadLevel ("OpenCVForUnityExample");
    262.             #endif
    263.         }
    264.     }
    265. }
    MultiAR.PNG aruco_sample2.png
     
    ikasapoglu likes this.
  39. kwkw

    kwkw

    Joined:
    Mar 3, 2014
    Posts:
    9
    Can you help me?
    The ARHeadExample runs at very low fps. Can you tell me how to combine the ARHeadExample with the frame-skipping optimization?
    Thanks.
     
    Last edited: Nov 8, 2017
  40. Story_holic

    Story_holic

    Joined:
    Aug 8, 2017
    Posts:
    4
    Hello, I am new to OpenCV in Unity, and I have a question.
    I can't solve this problem:
    cascade file is not loaded.Please copy from “OpenCVForUnity/StreamingAssets/” to “Assets/StreamingAssets/” folder.
    UnityEngine.Debug:LogError(Object)
    OpenCVForUnityExample.FaceDetectionExample:Start() (at Assets/OpenCVForUnity/Examples/MainModules/objdetect/FaceDetectionExample/FaceDetectionExample.cs:53)

    I already moved the StreamingAssets folder, but this error message still appears.
    How can I solve this problem?
     
  41. sticklezz

    sticklezz

    Joined:
    Oct 27, 2015
    Posts:
    33
    Is there a way to use the unaltered webcam source as one texture ("_SourceTex"), and the OpenCV-altered one as "_MainTex" in a shader?
     
  42. Story_holic

    Story_holic

    Joined:
    Aug 8, 2017
    Posts:
    4
    upload_2017-11-8_21-39-6.png
    Hello, I tried the tutorial video, but it still errors.
    I moved from Unity 2017.1 to 5.6.4, but the error message keeps appearing.
    Can you help me with this cascade problem?
     
  43. wbknox

    wbknox

    Joined:
    Aug 1, 2016
    Posts:
    11
  44. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Using OpenCV's face detection is effective for improving speed.
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4. using System;
    5. using UnityEngine.UI;
    6.  
    7. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    8. using UnityEngine.SceneManagement;
    9. #endif
    10. using OpenCVForUnity;
    11. using DlibFaceLandmarkDetector;
    12.  
    13. namespace DlibFaceLandmarkDetectorExample
    14. {
    15.     /// <summary>
    16.     /// AR Head Example
    17.     /// This example was referring to http://www.morethantechnical.com/2012/10/17/head-pose-estimation-with-opencv-opengl-revisited-w-code/
    18.     /// and use effect asset from http://ktk-kumamoto.hatenablog.com/entry/2014/09/14/092400.
    19.     /// </summary>
    20.     [RequireComponent (typeof(WebCamTextureToMatHelper))]
    21.     public class ARHeadExample : MonoBehaviour
    22.     {
    23.         /// <summary>
    24.         /// Determines if displays face points.
    25.         /// </summary>
    26.         public bool displayFacePoints;
    27.        
    28.         /// <summary>
    29.         /// The display face points toggle.
    30.         /// </summary>
    31.         public Toggle displayFacePointsToggle;
    32.        
    33.         /// <summary>
    34.         /// Determines if displays display axes
    35.         /// </summary>
    36.         public bool displayAxes;
    37.        
    38.         /// <summary>
    39.         /// The display axes toggle.
    40.         /// </summary>
    41.         public Toggle displayAxesToggle;
    42.        
    43.         /// <summary>
    44.         /// Determines if displays head.
    45.         /// </summary>
    46.         public bool displayHead;
    47.        
    48.         /// <summary>
    49.         /// The display head toggle.
    50.         /// </summary>
    51.         public Toggle displayHeadToggle;
    52.        
    53.         /// <summary>
    54.         /// Determines if displays effects.
    55.         /// </summary>
    56.         public bool displayEffects;
    57.        
    58.         /// <summary>
    59.         /// The display effects toggle.
    60.         /// </summary>
    61.         public Toggle displayEffectsToggle;
    62.        
    63.         /// <summary>
    64.         /// The axes.
    65.         /// </summary>
    66.         public GameObject axes;
    67.        
    68.         /// <summary>
    69.         /// The head.
    70.         /// </summary>
    71.         public GameObject head;
    72.        
    73.         /// <summary>
    74.         /// The right eye.
    75.         /// </summary>
    76.         public GameObject rightEye;
    77.        
    78.         /// <summary>
    79.         /// The left eye.
    80.         /// </summary>
    81.         public GameObject leftEye;
    82.        
    83.         /// <summary>
    84.         /// The mouth.
    85.         /// </summary>
    86.         public GameObject mouth;
    87.        
    88.         /// <summary>
    89.         /// The AR camera.
    90.         /// </summary>
    91.         public Camera ARCamera;
    92.        
    93.         /// <summary>
    94.         /// The AR game object.
    95.         /// </summary>
    96.         public GameObject ARGameObject;
    97.        
    98.         /// <summary>
    99.         /// Determines if request the AR camera moving.
    100.         /// </summary>
    101.         public bool shouldMoveARCamera;
    102.        
    103.         /// <summary>
    104.         /// The mouth particle system.
    105.         /// </summary>
    106.         ParticleSystem[] mouthParticleSystem;
    107.        
    108.         /// <summary>
    109.         /// The texture.
    110.         /// </summary>
    111.         Texture2D texture;
    112.        
    113.         /// <summary>
    114.         /// The face landmark detector.
    115.         /// </summary>
    116.         FaceLandmarkDetector faceLandmarkDetector;
    117.        
    118.         /// <summary>
    119.         /// The cameraparam matrix.
    120.         /// </summary>
    121.         Mat camMatrix;
    122.        
    123.         /// <summary>
    124.         /// The distortion coeffs.
    125.         /// </summary>
    126.         MatOfDouble distCoeffs;
    127.  
    128.         /// <summary>
    129.         /// The matrix that inverts the Y axis.
    130.         /// </summary>
    131.         Matrix4x4 invertYM;
    132.        
    133.         /// <summary>
    134.         /// The matrix that inverts the Z axis.
    135.         /// </summary>
    136.         Matrix4x4 invertZM;
    137.        
    138.         /// <summary>
    139.         /// The transformation matrix.
    140.         /// </summary>
    141.         Matrix4x4 transformationM = new Matrix4x4 ();
    142.  
    143.         /// <summary>
    144.         /// The transformation matrix for AR.
    145.         /// </summary>
    146.         Matrix4x4 ARM;
    147.        
    148.         /// <summary>
    149.         /// The 3d face object points.
    150.         /// </summary>
    151.         MatOfPoint3f objectPoints;
    152.        
    153.         /// <summary>
    154.         /// The image points.
    155.         /// </summary>
    156.         MatOfPoint2f imagePoints;
    157.        
    158.         /// <summary>
    159.         /// The rvec.
    160.         /// </summary>
    161.         Mat rvec;
    162.        
    163.         /// <summary>
    164.         /// The tvec.
    165.         /// </summary>
    166.         Mat tvec;
    167.        
    168.         /// <summary>
    169.         /// The rot mat.
    170.         /// </summary>
    171.         Mat rotMat;
    172.        
    173.         /// <summary>
    174.         /// The webcam texture to mat helper.
    175.         /// </summary>
    176.         WebCamTextureToMatHelper webCamTextureToMatHelper;
    177.        
    178.         /// <summary>
    179.         /// The sp_human_face_68_dat_filepath.
    180.         /// </summary>
    181.         string sp_human_face_68_dat_filepath;
    182.  
    183.  
    184.  
    185.  
    186.         /// <summary>
    187.         /// The gray mat.
    188.         /// </summary>
    189.         Mat grayMat;
    190.  
    191.         /// <summary>
    192.         /// The cascade.
    193.         /// </summary>
    194.         CascadeClassifier cascade;
    195.  
    196.         /// <summary>
    197.         /// The detection result.
    198.         /// </summary>
    199.         List<UnityEngine.Rect> detectResult;
    200.  
    201.         /// <summary>
    202.         /// The haarcascade_frontalface_alt_xml_filepath.
    203.         /// </summary>
    204.         string haarcascade_frontalface_alt_xml_filepath;
    205.  
    206.  
    207.  
    208.         #if UNITY_WEBGL && !UNITY_EDITOR
    209.         Stack<IEnumerator> coroutines = new Stack<IEnumerator> ();
    210.         #endif
    211.        
    212.         // Use this for initialization
    213.         void Start ()
    214.         {
    215.             displayFacePointsToggle.isOn = displayFacePoints;
    216.             displayAxesToggle.isOn = displayAxes;
    217.             displayHeadToggle.isOn = displayHead;
    218.             displayEffectsToggle.isOn = displayEffects;
    219.            
    220.             #if UNITY_WEBGL && !UNITY_EDITOR
    221.             var getFilePath_Coroutine = DlibFaceLandmarkDetector.Utils.getFilePathAsync ("sp_human_face_68.dat", (result) => {
    222.                 coroutines.Clear ();
    223.  
    224.                 sp_human_face_68_dat_filepath = result;
    225.                 Run ();
    226.             });
    227.             coroutines.Push (getFilePath_Coroutine);
    228.             StartCoroutine (getFilePath_Coroutine);
    229.             #else
    230.             haarcascade_frontalface_alt_xml_filepath = OpenCVForUnity.Utils.getFilePath ("haarcascade_frontalface_alt.xml");
    231.             sp_human_face_68_dat_filepath = DlibFaceLandmarkDetector.Utils.getFilePath ("sp_human_face_68.dat");
    232.             Run ();
    233.             #endif
    234.         }
    235.  
    236.         private void Run ()
    237.         {
    238.             cascade = new CascadeClassifier (haarcascade_frontalface_alt_xml_filepath);
    239.             //            if (cascade.empty ()) {
    240.             //                Debug.LogError ("cascade file is not loaded.Please copy from “FaceTrackerExample/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");
    241.             //            }
    242.  
    243.             //set 3d face object points.
    244.             objectPoints = new MatOfPoint3f (
    245.                 new Point3 (-34, 90, 83),//l eye (Interpupillary breadth)
    246.                 new Point3 (34, 90, 83),//r eye (Interpupillary breadth)
    247.                 new Point3 (0.0, 50, 120),//nose (Nose top)
    248.                 new Point3 (-26, 15, 83),//l mouse (Mouth breadth)
    249.                 new Point3 (26, 15, 83),//r mouse (Mouth breadth)
    250.                 new Point3 (-79, 90, 0.0),//l ear (Bitragion breadth)
    251.                 new Point3 (79, 90, 0.0)//r ear (Bitragion breadth)
    252.             );
    253.             imagePoints = new MatOfPoint2f ();
    254.             rotMat = new Mat (3, 3, CvType.CV_64FC1);
    255.            
    256.             faceLandmarkDetector = new FaceLandmarkDetector (sp_human_face_68_dat_filepath);
    257.            
    258.             webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
    259.             webCamTextureToMatHelper.Initialize ();
    260.         }
    261.  
    262.         /// <summary>
    263.         /// Raises the web cam texture to mat helper initialized event.
    264.         /// </summary>
    265.         public void OnWebCamTextureToMatHelperInitialized ()
    266.         {
    267.             Debug.Log ("OnWebCamTextureToMatHelperInitialized");
    268.            
    269.             Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();
    270.            
    271.             texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);
    272.            
    273.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    274.            
    275.             gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
    276.             Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    277.            
    278.            
    279.             float width = webCamTextureMat.width ();
    280.             float height = webCamTextureMat.height ();
    281.            
    282.             float imageSizeScale = 1.0f;
    283.             float widthScale = (float)Screen.width / width;
    284.             float heightScale = (float)Screen.height / height;
    285.             if (widthScale < heightScale) {
    286.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    287.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    288.             } else {
    289.                 Camera.main.orthographicSize = height / 2;
    290.             }
    291.            
    292.            
    293.             //set cameraparam
    294.             int max_d = (int)Mathf.Max (width, height);
    295.             double fx = max_d;
    296.             double fy = max_d;
    297.             double cx = width / 2.0f;
    298.             double cy = height / 2.0f;
    299.             camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    300.             camMatrix.put (0, 0, fx);
    301.             camMatrix.put (0, 1, 0);
    302.             camMatrix.put (0, 2, cx);
    303.             camMatrix.put (1, 0, 0);
    304.             camMatrix.put (1, 1, fy);
    305.             camMatrix.put (1, 2, cy);
    306.             camMatrix.put (2, 0, 0);
    307.             camMatrix.put (2, 1, 0);
    308.             camMatrix.put (2, 2, 1.0f);
    309.             Debug.Log ("camMatrix " + camMatrix.dump ());
    310.            
    311.            
    312.             distCoeffs = new MatOfDouble (0, 0, 0, 0);
    313.             Debug.Log ("distCoeffs " + distCoeffs.dump ());
    314.            
    315.            
    316.             //calibration camera
    317.             Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
    318.             double apertureWidth = 0;
    319.             double apertureHeight = 0;
    320.             double[] fovx = new double[1];
    321.             double[] fovy = new double[1];
    322.             double[] focalLength = new double[1];
    323.             Point principalPoint = new Point (0, 0);
    324.             double[] aspectratio = new double[1];
    325.            
    326.             Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    327.            
    328.             Debug.Log ("imageSize " + imageSize.ToString ());
    329.             Debug.Log ("apertureWidth " + apertureWidth);
    330.             Debug.Log ("apertureHeight " + apertureHeight);
    331.             Debug.Log ("fovx " + fovx [0]);
    332.             Debug.Log ("fovy " + fovy [0]);
    333.             Debug.Log ("focalLength " + focalLength [0]);
    334.             Debug.Log ("principalPoint " + principalPoint.ToString ());
    335.             Debug.Log ("aspectratio " + aspectratio [0]);
    336.            
    337.            
    338.             //To convert the difference of the FOV value of the OpenCV and Unity.
    339.             double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
    340.             double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));
    341.            
    342.             Debug.Log ("fovXScale " + fovXScale);
    343.             Debug.Log ("fovYScale " + fovYScale);
    344.            
    345.            
    346.             //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    347.             if (widthScale < heightScale) {
    348.                 ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    349.             } else {
    350.                 ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    351.             }
    352.            
    353.  
    354.             invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
    355.             Debug.Log ("invertYM " + invertYM.ToString ());
    356.  
    357.             invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, 1, -1));
    358.             Debug.Log ("invertZM " + invertZM.ToString ());
    359.            
    360.            
    361.             axes.SetActive (false);
    362.             head.SetActive (false);
    363.             rightEye.SetActive (false);
    364.             leftEye.SetActive (false);
    365.             mouth.SetActive (false);
    366.            
    367.             mouthParticleSystem = mouth.GetComponentsInChildren<ParticleSystem> (true);
    368.  
    369.  
    370.             grayMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC1);
    371.  
    372.             detectResult = new List<UnityEngine.Rect> ();
    373.         }
    374.  
    375.         /// <summary>
    376.         /// Raises the web cam texture to mat helper disposed event.
    377.         /// </summary>
    378.         public void OnWebCamTextureToMatHelperDisposed ()
    379.         {
    380.             Debug.Log ("OnWebCamTextureToMatHelperDisposed");
    381.            
    382.             camMatrix.Dispose ();
    383.             distCoeffs.Dispose ();
    384.  
    385.             grayMat.Dispose ();
    386.         }
    387.  
    388.         /// <summary>
    389.         /// Raises the web cam texture to mat helper error occurred event.
    390.         /// </summary>
    391.         /// <param name="errorCode">Error code.</param>
    392.         public void OnWebCamTextureToMatHelperErrorOccurred (WebCamTextureToMatHelper.ErrorCode errorCode)
    393.         {
    394.             Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
    395.         }
    396.        
    397.         // Update is called once per frame
    398.         void Update ()
    399.         {
    400.             if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {
    401.                
    402.                 Mat rgbaMat = webCamTextureToMatHelper.GetMat ();
    403.                
    404.                
    405.                 OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbaMat);
    406.                
    407.                 //detect face rects
    408. //                if (useOpenCVFaceDetector) {
    409.                 // convert image to greyscale.
    410.                 Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
    411.  
    412.                 using (Mat equalizeHistMat = new Mat ())
    413.                 using (MatOfRect faces = new MatOfRect ()) {
    414.                     Imgproc.equalizeHist (grayMat, equalizeHistMat);
    415.  
    416.                     cascade.detectMultiScale (equalizeHistMat, faces, 1.1f, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE, new OpenCVForUnity.Size (equalizeHistMat.cols () * 0.15, equalizeHistMat.cols () * 0.15), new Size ());
    417.  
    418.                     List<OpenCVForUnity.Rect> opencvDetectResult = faces.toList ();
    419.  
    420.                     // adjust to Dlib's result.
    421.                     detectResult.Clear ();
    422.                     foreach (var opencvRect in opencvDetectResult) {
    423.                         detectResult.Add (new UnityEngine.Rect ((float)opencvRect.x, (float)opencvRect.y + (float)(opencvRect.height * 0.1f), (float)opencvRect.width, (float)opencvRect.height));
    424.                     }
    425.                 }
    426.  
    427. //                } else {
    428. //
    429. //                    detectResult = faceLandmarkDetector.Detect ();
    430. //
    431. //                }
    432.  
    433.                
    434.                 if (detectResult.Count > 0) {
    435.                    
    436.                     //detect landmark points
    437.                     List<Vector2> points = faceLandmarkDetector.DetectLandmark (detectResult [0]);
    438.                    
    439.                     if (displayFacePoints)
    440.                         OpenCVForUnityUtils.DrawFaceLandmark (rgbaMat, points, new Scalar (0, 255, 0, 255), 2);
    441.                    
    442.                     imagePoints.fromArray (
    443.                         new Point ((points [38].x + points [41].x) / 2, (points [38].y + points [41].y) / 2),//l eye (Interpupillary breadth)
    444.                         new Point ((points [43].x + points [46].x) / 2, (points [43].y + points [46].y) / 2),//r eye (Interpupillary breadth)
    445.                         new Point (points [30].x, points [30].y),//nose (Nose top)
    446.                         new Point (points [48].x, points [48].y),//l mouth (Mouth breadth)
    447.                         new Point (points [54].x, points [54].y), //r mouth (Mouth breadth)
    448.                         new Point (points [0].x, points [0].y),//l ear (Bitragion breadth)
    449.                         new Point (points [16].x, points [16].y)//r ear (Bitragion breadth)
    450.                     );
    451.                    
    452.                     // Estimate head pose.
    453.                     if (rvec == null || tvec == null) {
    454.                         rvec = new Mat (3, 1, CvType.CV_64FC1);
    455.                         tvec = new Mat (3, 1, CvType.CV_64FC1);
    456.                         Calib3d.solvePnP (objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec);
    457.                     }
    458.                        
    459.                     double tvec_z = tvec.get (2, 0) [0];
    460.  
    461.                     if (double.IsNaN (tvec_z) || tvec_z < 0) { // if tvec is wrong data, do not use extrinsic guesses.
    462.                         Calib3d.solvePnP (objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec);
    463.                     } else {
    464.                         Calib3d.solvePnP (objectPoints, imagePoints, camMatrix, distCoeffs, rvec, tvec, true, Calib3d.SOLVEPNP_ITERATIVE);
    465.                     }
    466.  
    467. //                    Debug.Log (tvec.dump());
    468.                    
    469.                     if (!double.IsNaN (tvec_z)) {
    470.                        
    471.                         if (Mathf.Abs ((float)(points [43].y - points [46].y)) > Mathf.Abs ((float)(points [42].x - points [45].x)) / 6.0) {
    472.                             if (displayEffects)
    473.                                 rightEye.SetActive (true);
    474.                         }
    475.                        
    476.                         if (Mathf.Abs ((float)(points [38].y - points [41].y)) > Mathf.Abs ((float)(points [39].x - points [36].x)) / 6.0) {
    477.                             if (displayEffects)
    478.                                 leftEye.SetActive (true);
    479.                         }
    480.                         if (displayHead)
    481.                             head.SetActive (true);
    482.                         if (displayAxes)
    483.                             axes.SetActive (true);
    484.                        
    485.                        
    486.                         float noseDistance = Mathf.Abs ((float)(points [27].y - points [33].y));
    487.                         float mouseDistance = Mathf.Abs ((float)(points [62].y - points [66].y));
    488.                         if (mouseDistance > noseDistance / 5.0) {
    489.                             if (displayEffects) {
    490.                                 mouth.SetActive (true);
    491.                                 foreach (ParticleSystem ps in mouthParticleSystem) {
    492.                                     var em = ps.emission;
    493.                                     em.enabled = true;
    494.                                     ps.startSize = 40 * (mouseDistance / noseDistance);
    495.                                 }
    496.                             }
    497.                         } else {
    498.                             if (displayEffects) {
    499.                                 foreach (ParticleSystem ps in mouthParticleSystem) {
    500.                                     var em = ps.emission;
    501.                                     em.enabled = false;
    502.                                 }
    503.                             }
    504.                         }
    505.  
    506.                         Calib3d.Rodrigues (rvec, rotMat);
    507.                        
    508.                         transformationM.SetRow (0, new Vector4 ((float)rotMat.get (0, 0) [0], (float)rotMat.get (0, 1) [0], (float)rotMat.get (0, 2) [0], (float)tvec.get (0, 0) [0]));
    509.                         transformationM.SetRow (1, new Vector4 ((float)rotMat.get (1, 0) [0], (float)rotMat.get (1, 1) [0], (float)rotMat.get (1, 2) [0], (float)tvec.get (1, 0) [0]));
    510.                         transformationM.SetRow (2, new Vector4 ((float)rotMat.get (2, 0) [0], (float)rotMat.get (2, 1) [0], (float)rotMat.get (2, 2) [0], (float)tvec.get (2, 0) [0]));
    511.                         transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));
    512.                        
    513.                         // right-handed coordinates system (OpenCV) to left-handed one (Unity)
    514.                         ARM = invertYM * transformationM;
    515.                        
    516.                         // Apply Z axis inverted matrix.
    517.                         ARM = ARM * invertZM;
    518.                        
    519.                         if (shouldMoveARCamera) {
    520.  
    521.                             ARM = ARGameObject.transform.localToWorldMatrix * ARM.inverse;
    522.                            
    523.                             ARUtils.SetTransformFromMatrix (ARCamera.transform, ref ARM);
    524.                         } else {
    525.  
    526.                             ARM = ARCamera.transform.localToWorldMatrix * ARM;
    527.                            
    528.                             ARUtils.SetTransformFromMatrix (ARGameObject.transform, ref ARM);
    529.                         }
    530.                     }
    531.                 }
    532.                
    533.                 Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);
    534.                
    535.                 OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());
    536.             }
    537.         }
    538.  
    539.         /// <summary>
    540.         /// Raises the destroy event.
    541.         /// </summary>
    542.         void OnDestroy ()
    543.         {
    544.             if (webCamTextureToMatHelper != null)
    545.                 webCamTextureToMatHelper.Dispose ();
    546.            
    547.             if (faceLandmarkDetector != null)
    548.                 faceLandmarkDetector.Dispose ();
    549.  
    550.             if (cascade != null)
    551.                 cascade.Dispose ();
    552.  
    553.             #if UNITY_WEBGL && !UNITY_EDITOR
    554.             foreach (var coroutine in coroutines) {
    555.                 StopCoroutine (coroutine);
    556.                 ((IDisposable)coroutine).Dispose ();
    557.             }
    558.             #endif
    559.         }
    560.  
    561.         /// <summary>
    562.         /// Raises the back button click event.
    563.         /// </summary>
    564.         public void OnBackButtonClick ()
    565.         {
    566.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    567.             SceneManager.LoadScene ("DlibFaceLandmarkDetectorExample");
    568.             #else
    569.             Application.LoadLevel ("DlibFaceLandmarkDetectorExample");
    570.             #endif
    571.         }
    572.  
    573.         /// <summary>
    574.         /// Raises the play button click event.
    575.         /// </summary>
    576.         public void OnPlayButtonClick ()
    577.         {
    578.             webCamTextureToMatHelper.Play ();
    579.         }
    580.  
    581.         /// <summary>
    582.         /// Raises the pause button click event.
    583.         /// </summary>
    584.         public void OnPauseButtonClick ()
    585.         {
    586.             webCamTextureToMatHelper.Pause ();
    587.         }
    588.  
    589.         /// <summary>
    590.         /// Raises the stop button click event.
    591.         /// </summary>
    592.         public void OnStopButtonClick ()
    593.         {
    594.             webCamTextureToMatHelper.Stop ();
    595.         }
    596.  
    597.         /// <summary>
    598.         /// Raises the change camera button click event.
    599.         /// </summary>
    600.         public void OnChangeCameraButtonClick ()
    601.         {
    602.             webCamTextureToMatHelper.Initialize (null, webCamTextureToMatHelper.requestedWidth, webCamTextureToMatHelper.requestedHeight, !webCamTextureToMatHelper.requestedIsFrontFacing);
    603.         }
    604.  
    605.         /// <summary>
    606.         /// Raises the display face points toggle value changed event.
    607.         /// </summary>
    608.         public void OnDisplayFacePointsToggleValueChanged ()
    609.         {
    610.             if (displayFacePointsToggle.isOn) {
    611.                 displayFacePoints = true;
    612.             } else {
    613.                 displayFacePoints = false;
    614.             }
    615.         }
    616.  
    617.         /// <summary>
    618.         /// Raises the display axes toggle value changed event.
    619.         /// </summary>
    620.         public void OnDisplayAxesToggleValueChanged ()
    621.         {
    622.             if (displayAxesToggle.isOn) {
    623.                 displayAxes = true;
    624.             } else {
    625.                 displayAxes = false;
    626.                 axes.SetActive (false);
    627.             }
    628.         }
    629.  
    630.         /// <summary>
    631.         /// Raises the display head toggle value changed event.
    632.         /// </summary>
    633.         public void OnDisplayHeadToggleValueChanged ()
    634.         {
    635.             if (displayHeadToggle.isOn) {
    636.                 displayHead = true;
    637.             } else {
    638.                 displayHead = false;
    639.                 head.SetActive (false);
    640.             }
    641.         }
    642.  
    643.         /// <summary>
    644.         /// Raises the display effects toggle value changed event.
    645.         /// </summary>
    646.         public void OnDisplayEffectsToggleValueChanged ()
    647.         {
    648.             if (displayEffectsToggle.isOn) {
    649.                 displayEffects = true;
    650.             } else {
    651.                 displayEffects = false;
    652.                 rightEye.SetActive (false);
    653.                 leftEye.SetActive (false);
    654.                 mouth.SetActive (false);
    655.             }
    656.         }
    657.     }
    658. }
     
  45. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Does another example using the StreamingAssets folder work correctly?
    PCAExample
    VideoCaptureExample
     
  46. victorho

    Joined:
    Jul 18, 2015
    Posts:
    1
    I just wanted to confirm, before purchasing this asset, that you have compiled it with the ocl module for Android.
     
  47. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    965
    Since this package is a clone of OpenCV Java, you can use the same API as OpenCV Java 3.3.0.
    "OpenCV for Unity" does not support the OpenCL module.
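    As a minimal sketch of what "the same API as OpenCV Java" means in practice (the class name GrayscaleSketch is hypothetical; it only uses calls that already appear in the example code earlier in this thread):

    ```csharp
    using UnityEngine;
    using OpenCVForUnity;

    // Sketch: the C# calls mirror OpenCV Java's signatures one-to-one.
    public class GrayscaleSketch : MonoBehaviour
    {
        void Start ()
        {
            // 4-channel RGBA image, the same format WebCamTextureToMatHelper produces
            Mat rgbaMat = new Mat (480, 640, CvType.CV_8UC4, new Scalar (255, 0, 0, 255));
            Mat grayMat = new Mat (rgbaMat.rows (), rgbaMat.cols (), CvType.CV_8UC1);

            // Same signature as Imgproc.cvtColor in OpenCV Java
            Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

            Debug.Log ("gray size: " + grayMat.size ());

            // Mats wrap native memory, so dispose them explicitly
            rgbaMat.Dispose ();
            grayMat.Dispose ();
        }
    }
    ```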
     
  48. sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    Hello,

    While using FaceMaskExample, how can I control the opacity/alpha of the masks?
     
  49. sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    Never mind, I have found it.
    FaceMask Example > Material > FaceMaskShader
    I controlled the base.w value manually, changing

    base.w = base.w * mask.x * mask.x * mask.x * (1 - _Fade);
    to
    base.w = base.w * mask.x * mask.x * mask.x * (1 - 0.15);
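    If you would rather not hardcode 0.15 in the shader source, a sketch of an alternative (assuming the mask material exposes _Fade as a float property, as the original shader line suggests; the class name MaskFadeController is hypothetical):

    ```csharp
    using UnityEngine;

    // Hypothetical helper: drives the shader's _Fade property from a script
    // instead of editing the shader source. Assumes the face-mask material
    // exposes a "_Fade" float property.
    public class MaskFadeController : MonoBehaviour
    {
        [Range (0f, 1f)]
        public float fade = 0.15f;

        public Material maskMaterial; // assign the FaceMaskShader material in the Inspector

        void Update ()
        {
            if (maskMaterial != null)
                maskMaterial.SetFloat ("_Fade", fade);
        }
    }
    ```

    This keeps the shader unmodified and lets you tweak the opacity in the Inspector at runtime.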
     
  50. Story_holic

    Joined:
    Aug 8, 2017
    Posts:
    4
    I found what was wrong: the user name was in Korean, so it didn't work.
    I fixed it. Thanks.
     