
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. Avyon

    Avyon

    Joined:
    Jul 14, 2016
    Posts:
    1
    Hello!

    I was wondering if there is a package I am missing or something I could have done wrong while building the example scenes. I just purchased the package and tried going through all the examples, but some of them throw errors or simply look like nothing is happening.

    The tracking example, for instance, reports:
    Screen position out of view frustum (screen pos 0.779846, 851.676697) (Camera rect 0 0 600 900)
    UnityEngine.SendMouseEvents:DoSendMouseEvents(Int32)

    Thank you!
     
  2. TBruce

    TBruce

    Joined:
    Jan 18, 2015
    Posts:
    86
    Hi,

    I am trying to convert some Python code using OpenCV to OpenCV for Unity, and I am stuck on the following piece of code:

    invobjectblocks = 255 - objectblocks[1]

    where invobjectblocks and objectblocks are both Mats.

    Could you please tell me how to convert this?

    Thank you in advance!
     
  3. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Could you tell me your test environment?
    Unity version :
    OpenCV for Unity version :
    build platform :
    error message :
     
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Could you try this code?
    Code (CSharp):
    invobjectblocks = new Scalar (255) - objectblocks[1];
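    As a side note on the arithmetic itself: for 8-bit pixel data, subtracting each value from 255 is exactly a per-byte bitwise NOT, which is what Core.bitwise_not() computes per pixel. A plain-Python sketch of that equivalence (the function names are mine, for illustration only):

    ```python
    # For 8-bit pixels, "255 - v" and bitwise NOT give the same result.
    # This mirrors what the Python original "255 - objectblocks[1]" computes
    # on each uint8 element.
    def invert_8u(v):
        """Invert a single 8-bit channel value (0..255)."""
        return 255 - v

    def bitwise_not_8u(v):
        """Bitwise NOT restricted to 8 bits, as applied per pixel."""
        return ~v & 0xFF

    values = [0, 1, 127, 200, 255]
    assert all(invert_8u(v) == bitwise_not_8u(v) for v in values)
    print([invert_8u(v) for v in values])  # [255, 254, 128, 55, 0]
    ```
    
    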
     
  5. violetforest

    violetforest

    Joined:
    May 26, 2015
    Posts:
    10
    @EnoxSoftware Any way to find Extreme Points in OpenCVforunity?

    (c++ example)
    Code (CSharp):
    Point extLeft  = *min_element(pts.begin(), pts.end(),
                         [](const Point& lhs, const Point& rhs) {
                             return lhs.x < rhs.x;
                         });
    Point extRight = *max_element(pts.begin(), pts.end(),
                         [](const Point& lhs, const Point& rhs) {
                             return lhs.x < rhs.x;
                         });
    Point extTop   = *min_element(pts.begin(), pts.end(),
                         [](const Point& lhs, const Point& rhs) {
                             return lhs.y < rhs.y;
                         });
    Point extBot   = *max_element(pts.begin(), pts.end(),
                         [](const Point& lhs, const Point& rhs) {
                             return lhs.y < rhs.y;
                         });
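    For reference, the selection logic in the C++ snippet above is just "pick the point with the smallest/largest x (or y)", so in OpenCV for Unity you could loop over the points from e.g. MatOfPoint.toArray() with the same comparisons. A minimal plain-Python illustration with made-up sample points (note that in image coordinates "top" means the smallest y):

    ```python
    # Plain-Python equivalent of the C++ min_element/max_element calls above:
    # choose the leftmost/rightmost/topmost/bottommost point by comparing
    # x (or y) coordinates, exactly as the lambdas do.
    pts = [(3, 7), (0, 2), (9, 5), (4, 0), (6, 8)]  # made-up contour points

    ext_left  = min(pts, key=lambda p: p[0])  # smallest x
    ext_right = max(pts, key=lambda p: p[0])  # largest x
    ext_top   = min(pts, key=lambda p: p[1])  # smallest y (top of the image)
    ext_bot   = max(pts, key=lambda p: p[1])  # largest y (bottom of the image)

    print(ext_left, ext_right, ext_top, ext_bot)  # (0, 2) (9, 5) (4, 0) (6, 8)
    ```
    
    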
     
  6. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    I'm trying to combine the WebCamTextureARExample with WebCamTextureFaceMaskExample.

    I'm not sure why one uses UnityEngine.Rect instead of OpenCVForUnity.Rect, but I managed (I hope?) to convert everything to be interchangeable for the first detected face.

    It seems to run, but I am getting the errors shown in the screenshot below - strange ones that do not indicate a line number, such as:
    "Invalid AABB a" errors

    https://gyazo.com/5661ba1ebbda32889c22ae20bf74a4ec

    Please help :-\
     
  7. Semashi

    Semashi

    Joined:
    Dec 8, 2015
    Posts:
    4
    Hi,

    Now I'm trying to do stereo measurement with rectification (on Unity 5.5.1f1 and the latest OpenCV for Unity 2.1.6, for Windows and HoloLens builds).

    It does not seem to work correctly, although no errors are reported.
    Currently I'm testing with the code below.
    However, the output values from stereoRectify look unchanged (in the code below, the P1 and P2 output values are just identity matrices).
    The rectified images debug11.PNG and debug12.PNG are black because of wrong input params.

    Do you have a minimal working sample?

    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;

    using OpenCVForUnity;


    public class RectifyTest : MonoBehaviour {

        private Mat C1, D1, C2, D2;
        private Mat R, T, R1, R2, P1, P2, Q;
        private Mat[] rmap1, rmap2;

        // Use this for initialization
        void Start () {

            Size size = new Size(640, 480);

            C1 = Mat.eye(3, 3, CvType.CV_64F);
            C2 = Mat.eye(3, 3, CvType.CV_64F);

            D1 = Mat.zeros(8, 1, CvType.CV_64F);
            D2 = Mat.zeros(8, 1, CvType.CV_64F);

            R = Mat.eye(3, 3, CvType.CV_64F);
            T = Mat.zeros(3, 1, CvType.CV_64F);
            T.put(0, 0, 1.0);
            R1 = Mat.eye(3, 3, CvType.CV_64F);
            R2 = Mat.eye(3, 3, CvType.CV_64F);
            P1 = Mat.eye(3, 4, CvType.CV_64F);
            P2 = Mat.eye(3, 4, CvType.CV_64F);
            Q = Mat.eye(4, 4, CvType.CV_64F);

            rmap1 = new Mat[2];
            rmap2 = new Mat[2];

            rmap1[0] = new Mat();
            rmap1[1] = new Mat();

            rmap2[0] = new Mat();
            rmap2[1] = new Mat();

            double fx = 600.0;
            double fy = 600.0;
            double cx = 320.0;
            double cy = 240.0;

            // put camera params
            C1.put(0, 0, fx);
            C1.put(1, 0, 0.0);
            C1.put(2, 0, 0.0);
            C1.put(0, 1, 0.0);
            C1.put(1, 1, fy);
            C1.put(2, 1, 0.0);
            C1.put(0, 2, cx);
            C1.put(1, 2, cy);
            C1.put(2, 2, 1.0);

            C2 = C1.clone();
            D2 = D1.clone();

            Mat left_image = Imgcodecs.imread("left01.jpg");
            Mat right_image = Imgcodecs.imread("right01.jpg");
            Mat left_r_image = Mat.zeros(left_image.size(), left_image.type());
            Mat right_r_image = Mat.zeros(right_image.size(), right_image.type());

            Calib3d.stereoRectify(C1, D1, C2, D2,
                                 size,
                                 R, T, R1, R2, P1, P2, Q, Calib3d.CALIB_ZERO_DISPARITY);
            Calib3d.initUndistortRectifyMap(C1, D1,
                                            R1, P1, size, CvType.CV_32FC1,
                                            rmap1[0], rmap1[1]);
            Calib3d.initUndistortRectifyMap(C2, D2,
                                            R2, P2, size, CvType.CV_32FC1,
                                            rmap2[0], rmap2[1]);

            double[] debug_values = new double[12];

            P1.get(0, 0, debug_values);
            for (int i = 0; i < 12; i++)
                Debug.Log(debug_values[i]);

            P2.get(0, 0, debug_values);
            for (int i = 0; i < 12; i++)
                Debug.Log(debug_values[i]);

            Imgproc.remap(left_image, left_r_image,
                          rmap1[0], rmap1[1], Imgproc.INTER_LINEAR);
            Imgproc.remap(right_image, right_r_image,
                          rmap2[0], rmap2[1], Imgproc.INTER_LINEAR);

            Imgcodecs.imwrite("debug01.PNG", left_image);
            Imgcodecs.imwrite("debug02.PNG", right_image);

            Imgcodecs.imwrite("debug11.PNG", left_r_image);
            Imgcodecs.imwrite("debug12.PNG", right_r_image);

            Debug.Log("Processed");
        }
    }
     
    Last edited: May 7, 2017
  8. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    It is possible to display the native-side error message by enclosing the code with Utils.setDebugMode (). In this case it reports:
    calib3d::stereoRectify_13() : C:\Users\satoo\Desktop\opencv\modules\calib3d\src\fisheye.cpp:535: error: (-215) D.empty() || ((D.total() == 4) && (D.depth() == CV_32F || D.depth() == CV_64F)) in function cv::fisheye::estimateNewCameraMatrixForUndistortRectify

    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;

    using OpenCVForUnity;


    public class RectifyTest : MonoBehaviour
    {

        private Mat C1, D1, C2, D2;
        private Mat R, T, R1, R2, P1, P2, Q;
        private Mat[] rmap1, rmap2;

        // Use this for initialization
        void Start ()
        {

            Utils.setDebugMode (true);


            Size size = new Size (640, 480);

            C1 = Mat.eye (3, 3, CvType.CV_64F);
            C2 = Mat.eye (3, 3, CvType.CV_64F);

    //        D1 = Mat.zeros (8, 1, CvType.CV_64F);
    //        D2 = Mat.zeros (8, 1, CvType.CV_64F);
            D1 = Mat.zeros (4, 1, CvType.CV_64F);
            D2 = Mat.zeros (4, 1, CvType.CV_64F);

            R = Mat.eye (3, 3, CvType.CV_64F);
            T = Mat.zeros (3, 1, CvType.CV_64F);
            R1 = Mat.eye (3, 3, CvType.CV_64F);
            R2 = Mat.eye (3, 3, CvType.CV_64F);
            P1 = Mat.eye (3, 4, CvType.CV_64F);
            P2 = Mat.eye (3, 4, CvType.CV_64F);
            Q = Mat.eye (4, 4, CvType.CV_64F);

            rmap1 = new Mat[2];
            rmap2 = new Mat[2];

            rmap1 [0] = new Mat ();
            rmap1 [1] = new Mat ();

            rmap2 [0] = new Mat ();
            rmap2 [1] = new Mat ();

            double fx = 600.0;
            double fy = 600.0;
            double cx = 320.0;
            double cy = 240.0;

            // put camera params
            C1.put (0, 0, fx);
            C1.put (1, 0, 0.0);
            C1.put (2, 0, 0.0);
            C1.put (0, 1, 0.0);
            C1.put (1, 1, fy);
            C1.put (2, 1, 0.0);
            C1.put (0, 2, cx);
            C1.put (1, 2, cy);
            C1.put (2, 2, 1.0);

            C2 = C1.clone ();
            D2 = D1.clone ();

            Mat left_image = Imgcodecs.imread ("left01.jpg");
            Mat right_image = Imgcodecs.imread ("right01.jpg");

            Mat left_r_image = Mat.zeros (left_image.size (), left_image.type ());
            Mat right_r_image = Mat.zeros (right_image.size (), right_image.type ());

            Calib3d.stereoRectify (C1, D1, C2, D2,
                                  size,
                                  R, T, R1, R2, P1, P2, Q, Calib3d.CALIB_ZERO_DISPARITY);
            Calib3d.initUndistortRectifyMap (C1, D1,
                                            R1, P1, size, CvType.CV_32FC1,
                                            rmap1 [0], rmap1 [1]);
            Calib3d.initUndistortRectifyMap (C2, D2,
                                            R2, P2, size, CvType.CV_32FC1,
                                            rmap2 [0], rmap2 [1]);

            double[] debug_values = new double[12];

            P1.get (0, 0, debug_values);
            for (int i = 0; i < 12; i++)
                Debug.Log (debug_values [i]);

            P2.get (0, 0, debug_values);
            for (int i = 0; i < 12; i++)
                Debug.Log (debug_values [i]);

            Imgproc.remap (left_image, left_r_image,
                          rmap1 [0], rmap1 [1], Imgproc.INTER_LINEAR);
            Imgproc.remap (right_image, right_r_image,
                          rmap2 [0], rmap2 [1], Imgproc.INTER_LINEAR);

            Imgcodecs.imwrite ("debug01.png", left_image);
            Imgcodecs.imwrite ("debug02.png", right_image);

            Imgcodecs.imwrite ("debug11.png", left_r_image);
            Imgcodecs.imwrite ("debug12.png", right_r_image);

            Debug.Log ("Processed");


            Utils.setDebugMode (false);
        }
    }

    Since this package is a clone of OpenCV Java, OpenCV Java information will be helpful.
     
    Last edited: May 7, 2017
  9. Semashi

    Semashi

    Joined:
    Dec 8, 2015
    Posts:
    4
    Thanks!

    According to the assertion, stereoRectify seems to allow only a 4-parameter distortion model (that is what D.total() == 4 means).
    I modified my code accordingly and it works!
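    The assertion quoted in the previous post boils down to a simple predicate on the distortion vector D: it must be empty or hold exactly four coefficients (of float or double depth). A plain-Python restatement of the size check (the function name is mine, for illustration only):

    ```python
    # Restatement of the size part of the OpenCV assertion
    #   D.empty() || (D.total() == 4 && (depth is CV_32F or CV_64F))
    # that the fisheye code path enforces on the distortion vector D.
    def distortion_vector_accepted(coeffs):
        """True if a distortion coefficient list would pass the size check."""
        return len(coeffs) == 0 or len(coeffs) == 4

    assert distortion_vector_accepted([])                    # empty is fine
    assert distortion_vector_accepted([0.0, 0.0, 0.0, 0.0])  # 4x1 is fine
    assert not distortion_vector_accepted([0.0] * 8)         # 8x1 is rejected
    ```
    
    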

     
  10. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    I do not know of an easy method for finding the Extreme Points.
    Imgproc.minAreaRect() finds a rotated rectangle of the minimum area enclosing the input 2D point set.
    http://playwithopencv.blogspot.jp/2013/02/opencv-java.html
     
  11. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Any advice appreciated - not sure why there are "Invalid AABB a" errors at all - I have disabled all UI, etc.

     
  12. TBruce

    TBruce

    Joined:
    Jan 18, 2015
    Posts:
    86
    Hi @EnoxSoftware,

    Thank you for your reply, but unfortunately that did not work. I get the following error:

    error CS0021: Cannot apply indexing with [] to an expression of type `OpenCVForUnity.Mat'

    If I do this

    invobjectblocks = new Scalar (255) - objectblocks;

    I do not get an error, but it is the same as

    Core.bitwise_not (objectblocks, invobjectblocks);

    which is not the result I am looking for.

    The original Python code is as follows:

    img = cv2.imread("image.jpg")

    blurred = cv2.medianBlur(img,BLUR_FACTOR)

    blue = cv2.split(blurred)[0]
    green = cv2.split(blurred)[1]
    red = cv2.split(blurred)[2]

    objectblocks = cv2.threshold(red, RED_LOW_THRESHOLD, 255, 1) # 185 --> 235
    invobjectblocks = 255 - objectblocks[1]​

    which so far I have translated to this

    Texture2D texture = (Texture2D)Resources.Load("Textures/image"); break;

    Mat matImage = new Mat (texture.height, texture.width, CvType.CV_8UC4);

    Mat blurred = new Mat(texture.height, texture.width, CvType.CV_8UC4);
    Mat red = new Mat(texture.height, texture.width, CvType.CV_8UC4);
    Mat green = new Mat(texture.height, texture.width, CvType.CV_8UC4);
    Mat blue = new Mat(texture.height, texture.width, CvType.CV_8UC4);
    Mat objectblocks = new Mat(texture.height, texture.width, CvType.CV_8UC4);
    Mat invobjectblocks = new Mat(texture.height, texture.width, CvType.CV_8UC4);

    Utils.texture2DToMat (texture, matImage);

    Imgproc.medianBlur(matImage, blurred, blurFactor);

    List<Mat> colors = new List<Mat>();
    Core.split(blurred, colors);
    Debug.Log("colors.count = " + colors.Count);
    blue = colors[0];
    green = colors[1];
    red = colors[2];

    Imgproc.threshold(red, objectblocks, redLowThreshold, 255, Imgproc.THRESH_TOZERO_INV); // # 185 --> 235​

    I am not sure if what I converted to C# is correct, but any help with this is greatly appreciated.

    Thank you in advance!
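    One detail worth double-checking in the translation above: in the Python original, the fourth argument 1 to cv2.threshold is the enum value of THRESH_BINARY_INV, whereas the C# version passes Imgproc.THRESH_TOZERO_INV, which behaves differently. A plain-Python sketch of the two per-pixel rules (the function names are mine, for illustration only):

    ```python
    # Per-pixel rules for two OpenCV threshold types:
    #   THRESH_BINARY_INV (enum value 1): maxval if v <= thresh, else 0
    #   THRESH_TOZERO_INV (enum value 4): v      if v <= thresh, else 0
    def thresh_binary_inv(v, thresh, maxval=255):
        return 0 if v > thresh else maxval

    def thresh_tozero_inv(v, thresh):
        return 0 if v > thresh else v

    pixels = [50, 100, 150, 250]
    print([thresh_binary_inv(v, 185) for v in pixels])  # [255, 255, 255, 0]
    print([thresh_tozero_inv(v, 185) for v in pixels])  # [50, 100, 150, 0]
    ```
    
    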
     
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  14. ginoadriano

    ginoadriano

    Joined:
    Dec 15, 2014
    Posts:
    2
    I'm trying to figure out how to hide the WebCamTexture but keep the face recognition active.
    I'm working on a demo with two screens: in one I need the webcam output with the face tracking and recognition, and in the second I just need a black screen and the face-recognition rectangle. How can I do this? I've been breaking my head over it for two weeks already.
     
  15. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Please try this code.
    Code (CSharp):
    Texture2D texture = (Texture2D)Resources.Load ("lena");
    // break;

    Mat matImage = new Mat (texture.height, texture.width, CvType.CV_8UC4);

    Mat blurred = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    // Mat red = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    // Mat green = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    // Mat blue = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    // Mat objectblocks = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    // Mat invobjectblocks = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    Mat red = new Mat (texture.height, texture.width, CvType.CV_8UC1);
    Mat green = new Mat (texture.height, texture.width, CvType.CV_8UC1);
    Mat blue = new Mat (texture.height, texture.width, CvType.CV_8UC1);
    Mat objectblocks = new Mat (texture.height, texture.width, CvType.CV_8UC1);
    Mat invobjectblocks = new Mat (texture.height, texture.width, CvType.CV_8UC1);

    Utils.texture2DToMat (texture, matImage);

    // Imgproc.medianBlur (matImage, blurred, blurFactor);
    Imgproc.medianBlur (matImage, blurred, 5);

    List<Mat> colors = new List<Mat> ();
    Core.split (blurred, colors);
    Debug.Log ("colors.count = " + colors.Count);
    blue = colors [0];
    green = colors [1];
    red = colors [2];

    // Imgproc.threshold (red, objectblocks, redLowThreshold, 255, Imgproc.THRESH_TOZERO_INV); // # 185 --> 235
    Imgproc.threshold (red, objectblocks, 100, 255, Imgproc.THRESH_TOZERO_INV); // # 185 --> 235

    invobjectblocks = new Scalar (255) - objectblocks;


    Texture2D texture_out = new Texture2D (invobjectblocks.cols (), invobjectblocks.rows (), TextureFormat.RGBA32, false);

    Utils.matToTexture2D (invobjectblocks, texture_out);

    gameObject.GetComponent<Renderer> ().material.mainTexture = texture_out;
     
  16. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Please try this code.
    • Create a new Quad.
    • Attach the new Quad to "Quad 2" in the Inspector.
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    using System;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnityExample
    {
        /// <summary>
        /// WebCamTexture detect face example.
        /// </summary>
        public class WebCamTextureDetectFaceExample2 : MonoBehaviour
        {
            public GameObject quad2;

            /// <summary>
            /// The texture2.
            /// </summary>
            Texture2D texture2;


            /// <summary>
            /// The gray mat.
            /// </summary>
            Mat grayMat;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The cascade.
            /// </summary>
            CascadeClassifier cascade;

            /// <summary>
            /// The faces.
            /// </summary>
            MatOfRect faces;

            /// <summary>
            /// The web cam texture to mat helper.
            /// </summary>
            WebCamTextureToMatHelper webCamTextureToMatHelper;

            #if UNITY_WEBGL && !UNITY_EDITOR
            private Stack<IEnumerator> coroutineStack = new Stack<IEnumerator> ();
            #endif

            // Use this for initialization
            void Start ()
            {
                webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();

                #if UNITY_WEBGL && !UNITY_EDITOR
                var filepath_Coroutine = Utils.getFilePathAsync ("lbpcascade_frontalface.xml", (result) => {
                    coroutineStack.Clear ();

                    cascade = new CascadeClassifier ();
                    cascade.load (result);

                    webCamTextureToMatHelper.Init ();
                });
                coroutineStack.Push (filepath_Coroutine);
                StartCoroutine (filepath_Coroutine);
                #else
                cascade = new CascadeClassifier ();
                cascade.load (Utils.getFilePath ("lbpcascade_frontalface.xml"));
    //            cascade = new CascadeClassifier ();
    //            cascade.load (Utils.getFilePath ("haarcascade_frontalface_alt.xml"));
    //            if (cascade.empty ()) {
    //                Debug.LogError ("cascade file is not loaded. Please copy from “OpenCVForUnity/StreamingAssets/” to “Assets/StreamingAssets/” folder. ");
    //            }

                webCamTextureToMatHelper.Init ();
                #endif
            }

            /// <summary>
            /// Raises the web cam texture to mat helper inited event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInited ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInited");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();

                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;


                texture2 = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);

                quad2.GetComponent<Renderer> ().material.mainTexture = texture2;


    //            gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
    //            Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    //
    //
    //            float width = webCamTextureMat.width ();
    //            float height = webCamTextureMat.height ();
    //
    //            float widthScale = (float)Screen.width / width;
    //            float heightScale = (float)Screen.height / height;
    //            if (widthScale < heightScale) {
    //                Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    //            } else {
    //                Camera.main.orthographicSize = height / 2;
    //            }

                grayMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC1);

                faces = new MatOfRect ();
            }

            /// <summary>
            /// Raises the web cam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                if (grayMat != null)
                    grayMat.Dispose ();

                if (faces != null)
                    faces.Dispose ();
            }

            /// <summary>
            /// Raises the web cam texture to mat helper error occurred event.
            /// </summary>
            /// <param name="errorCode">Error code.</param>
            public void OnWebCamTextureToMatHelperErrorOccurred (WebCamTextureToMatHelper.ErrorCode errorCode)
            {
                Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
            }

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
                    Imgproc.equalizeHist (grayMat, grayMat);


                    if (cascade != null)
                        cascade.detectMultiScale (grayMat, faces, 1.1, 2, 2, // TODO: objdetect.CV_HAAR_SCALE_IMAGE
                            new Size (grayMat.cols () * 0.2, grayMat.rows () * 0.2), new Size ());

                    grayMat.setTo (new Scalar (0));

                    OpenCVForUnity.Rect[] rects = faces.toArray ();
                    for (int i = 0; i < rects.Length; i++) {
    //                    Debug.Log ("detect faces " + rects [i]);

                        Imgproc.rectangle (grayMat, new Point (rects [i].x, rects [i].y), new Point (rects [i].x + rects [i].width, rects [i].y + rects [i].height), new Scalar (255), 2);
                    }

    //                Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);

                    Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());

                    Utils.matToTexture2D (grayMat, texture2, webCamTextureToMatHelper.GetBufferColors ());
                }
            }

            /// <summary>
            /// Raises the disable event.
            /// </summary>
            void OnDisable ()
            {
                webCamTextureToMatHelper.Dispose ();

                if (cascade != null)
                    cascade.Dispose ();

                #if UNITY_WEBGL && !UNITY_EDITOR
                foreach (var coroutine in coroutineStack) {
                    StopCoroutine (coroutine);
                    ((IDisposable)coroutine).Dispose ();
                }
                #endif
            }

            /// <summary>
            /// Raises the back button event.
            /// </summary>
            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnityExample");
                #else
                Application.LoadLevel ("OpenCVForUnityExample");
                #endif
            }

            /// <summary>
            /// Raises the play button event.
            /// </summary>
            public void OnPlayButton ()
            {
                webCamTextureToMatHelper.Play ();
            }

            /// <summary>
            /// Raises the pause button event.
            /// </summary>
            public void OnPauseButton ()
            {
                webCamTextureToMatHelper.Pause ();
            }

            /// <summary>
            /// Raises the stop button event.
            /// </summary>
            public void OnStopButton ()
            {
                webCamTextureToMatHelper.Stop ();
            }

            /// <summary>
            /// Raises the change camera button event.
            /// </summary>
            public void OnChangeCameraButton ()
            {
                webCamTextureToMatHelper.Init (null, webCamTextureToMatHelper.requestWidth, webCamTextureToMatHelper.requestHeight, !webCamTextureToMatHelper.requestIsFrontFacing);
            }
        }
    }
    detectFaceMultiQuad.PNG
     
  17. TBruce

    TBruce

    Joined:
    Jan 18, 2015
    Posts:
    86
    Hello @EnoxSoftware,

    Thank you for your reply. I will test this out and let you know how it works.
     
  18. viktoria93

    viktoria93

    Joined:
    Mar 5, 2017
    Posts:
    5
    Hello @EnoxSoftware,
    I'm trying to find a good solution for an algorithm which can match (or compare) a 2D image to a 3D scene. The first step was to use the Canny algorithm on the image, and now I want these pictures to be compared to a Blender object (both are architectural buildings). The purpose is to find the location of the image in the built scene and then map it with the texture shown in the picture.
    Does OpenCV have any helpful algorithms for my issue?
    Thank you.
     
  19. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Is Tizen supported?
     
  20. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    This article is helpful.
    http://docs.opencv.org/trunk/d7/d53/tutorial_py_pose.html
    solvePnP() finds an object pose from 3D-2D point correspondences.
    http://docs.opencv.org/3.2.0/d9/d0c/group__calib3d.html#ga549c2075fac14829ff4a58bc931c033d
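    As background for solvePnP(): it inverts the pinhole projection model. With identity rotation, zero translation, and no lens distortion, projecting a camera-space 3D point reduces to the formulas below, sketched in plain Python (the intrinsics fx = 600, cx = 320, etc. are just the illustrative values used earlier in this thread):

    ```python
    # Pinhole projection of a camera-space 3D point (X, Y, Z):
    #   u = fx * X / Z + cx,  v = fy * Y / Z + cy
    # This is the forward model that solvePnP() inverts when it recovers
    # rvec/tvec from 3D-2D correspondences (rotation/translation taken as
    # identity/zero here, and distortion ignored).
    def project_point(X, Y, Z, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
        u = fx * X / Z + cx
        v = fy * Y / Z + cy
        return u, v

    # A point on the optical axis lands on the principal point.
    print(project_point(0.0, 0.0, 1.0))   # (320.0, 240.0)
    print(project_point(0.5, 0.25, 1.0))  # (620.0, 390.0)
    ```
    
    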
     
  21. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Tizen is not supported.
     
  22. estelle27

    estelle27

    Joined:
    May 19, 2017
    Posts:
    3
    Hello, I just ran the examples given by this plugin, but I cannot open my video files (even the video file the plugin provides for testing) when running the VideoCaptureExample. How can I deal with this? Here are some compile errors:
     

    Attached Files:

    • 1.jpg
  23. unitycis

    unitycis

    Joined:
    May 18, 2017
    Posts:
    5
    Hello
    Can we detect the rotation and sitting posture of a human body using this OpenCV for Unity plugin?
     
  24. superjayman

    superjayman

    Joined:
    May 31, 2013
    Posts:
    185
    With OpenCV 3.2.0, is GPU functionality supported? Does it work with the CUDA module?
     
  25. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Have you already moved the StreamingAssets folder?
    unnamed.png
     
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java.
    If there is an implementation example using "OpenCV Java", I think it can also be implemented using "OpenCV for Unity".
     
  27. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Since this package is a clone of OpenCV Java, you are able to use the same API as OpenCV Java 3.2.0.
    "OpenCV for Unity" does not support CUDA module.
     
  28. estelle27

    estelle27

    Joined:
    May 19, 2017
    Posts:
    3
    OK, I used another computer and ran the scene successfully. But I find that I can only open some video files: I can open the example video file and one of my own files whose resolution is 512x288, but I cannot open a video file whose resolution is 1920x1088. Is there a resolution limit for video files used with the VideoCapture class?
    Thank you!!
     
  29. chuyanchuyan

    chuyanchuyan

    Joined:
    May 23, 2017
    Posts:
    1
    Hi @EnoxSoftware,

    I tried to learn how to use projectPoints() as follows.
    First, 3D points are generated in Unity.
    Second, the camera GameObject's transform is converted to OpenCV rvec and tvec (※).
    Finally, the 3D points are projected to image points using projectPoints().

    I imitated the ArUco example, but I do not know the theory behind transforming coordinates between Unity and OpenCV.
    I got this result by trial and error.

    I want to know how to convert a Unity Transform to OpenCV rvec and tvec.
    Could you give me some advice?

    Thanks in advance.


    (※)
    Code (CSharp):
    1. void setExtrinsicParameter(Transform transform, Mat rvec, Mat tvec)
    2. {
    3.     Matrix4x4 matrix = new Matrix4x4();
    4.     Matrix4x4 transformationM = new Matrix4x4();
    5.     Matrix4x4 invertYM = new Matrix4x4();
    6.     Mat RotMat = new Mat(3, 3, CvType.CV_64FC1);
    7.  
    8.     matrix = transform.worldToLocalMatrix;
    9.     //Debug.Log("matrix:" + matrix.ToString());
    10.  
    11.     invertYM = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, -1, 1));
    12.  
    13.     transformationM = invertYM * matrix * invertYM;
    14.     //Debug.Log("tranfromationM " + transformationM.ToString());
    15.  
    16.     RotMat.put(0, 0, transformationM.m00);
    17.     RotMat.put(0, 1, transformationM.m01);
    18.     RotMat.put(0, 2, transformationM.m02);
    19.     RotMat.put(1, 0, transformationM.m10);
    20.     RotMat.put(1, 1, transformationM.m11);
    21.     RotMat.put(1, 2, transformationM.m12);
    22.     RotMat.put(2, 0, transformationM.m20);
    23.     RotMat.put(2, 1, transformationM.m21);
    24.     RotMat.put(2, 2, transformationM.m22);
    25.     //Debug.Log("RotMat:" + RotMat.dump());
    26.  
    27.     Calib3d.Rodrigues(RotMat, rvec);
    28.     Debug.Log("rvec:" + rvec.dump());
    29.  
    30.     tvec.put(0, 0, transformationM.m03);
    31.     tvec.put(1, 0, transformationM.m13);
    32.     tvec.put(2, 0, transformationM.m23);
    33.     Debug.Log("tvec:" + tvec.dump());
    34. }
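    The Calib3d.Rodrigues() call in the code above converts a 3x3 rotation matrix into a rotation vector (axis times angle). The conversion itself is a small piece of math; a numpy sketch of the matrix-to-vector direction, on a synthetic rotation about the z axis:

```python
import numpy as np

def rodrigues_from_matrix(R):
    """Rotation matrix -> rotation vector (axis * angle),
    the same conversion Calib3d.Rodrigues performs."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return np.zeros(3)
    # The rotation axis comes from the antisymmetric part of R.
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return axis * angle

theta = np.pi / 3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
print(rodrigues_from_matrix(Rz))   # ~ [0, 0, pi/3]
```

    (The formula above handles the generic case; angles of exactly pi need extra care, which OpenCV's implementation covers.)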
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    In my environment I can open my video (1920x1080).
    1920x1080.PNG
    Are 512x288 and 1920x1080 the same video format?
     
  31. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Unfortunately, I also do not fully understand the coordinate transform between Unity and OpenCV.
    Your code is probably fine.
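    The invertYM trick in the code posted above (S = diag(1, -1, 1), then S * M * S) is the standard way to convert a rotation between Unity's left-handed and OpenCV's right-handed coordinate conventions. A numpy check that this similarity transform still yields a proper rotation, with the handedness of the rotation reversed:

```python
import numpy as np

# S flips the Y axis; S @ M @ S converts a rotation between
# left-handed and right-handed coordinate conventions.
S = np.diag([1.0, -1.0, 1.0])

theta = 0.4
M = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

converted = S @ M @ S

# Still orthonormal with determinant +1 (a proper rotation),
# but for a rotation about z it comes out as the inverse rotation.
print(np.allclose(converted @ converted.T, np.eye(3)))  # True
print(np.isclose(np.linalg.det(converted), 1.0))        # True
print(np.allclose(converted, M.T))                      # True
```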
     
  32. gamelot

    gamelot

    Joined:
    May 6, 2017
    Posts:
    1
    Hi,
    Can you help us customize this part?
    I tried, but there's a difficult problem with imageSizeScale.

    Thanks

     
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Code (CSharp):
    1.             float imageSizeScale = 1.0f;
    2.             float widthScale = (float)Screen.width / width;
    3.             float heightScale = (float)Screen.height / height;
    4.             if (widthScale < heightScale) {
    5.                 //                Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    6.                 //                imageSizeScale = (float)Screen.height / (float)Screen.width;
    7.                 Camera.main.orthographicSize = height / 2;
    8.             } else {
    9.                 //                Camera.main.orthographicSize = height / 2;
    10.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    11.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    12.             }
    13.  
    14. ----------------------------------------------------------------------------------------------
    15.  
    16.             if (widthScale < heightScale)
    17.             {
    18. //                ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    19.                 ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    20.             } else
    21.             {
    22. //                ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    23.                 ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    24.             }
    25.  
     
    Last edited: May 30, 2017
    valdeezzee likes this.
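    The two branches above aspect-fit the image to the screen by choosing Camera.main.orthographicSize (and imageSizeScale) from whichever ratio is smaller. The same logic, extracted into a small standalone function for inspection (the function name is mine, not from the asset):

```python
def fit_ortho(screen_w, screen_h, img_w, img_h):
    """Mirror of the if/else above: choose the orthographic size and
    imageSizeScale that aspect-fit an img_w x img_h quad to the screen."""
    width_scale = screen_w / img_w
    height_scale = screen_h / img_h
    if width_scale < height_scale:
        ortho = img_h / 2.0                        # match the image height
        scale = 1.0
    else:
        ortho = (img_w * screen_h / screen_w) / 2.0  # match the image width
        scale = screen_h / screen_w
    return ortho, scale

print(fit_ortho(600, 900, 640, 480))    # portrait screen -> (240.0, 1.0)
print(fit_ortho(1920, 1080, 640, 480))  # landscape screen -> (180.0, 0.5625)
```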
  34. pranee

    pranee

    Joined:
    Feb 5, 2016
    Posts:
    3
    Hello,
    I recently bought OpenCV for Unity. It's wonderful; as a computer vision student, it is a great learning tool for me. I want to use ORB for keypoint detection and description and FLANN for keypoint matching, so I wrote the following in the Features2DExample code, but it results in 0 matches. It works with BRUTEFORCE_HAMMINGLUT etc., but I want to use FLANN. Could anyone please help me with this?
    Code (CSharp):
    1.  MatOfKeyPoint keypoints1 = new MatOfKeyPoint ();
    2.             Mat descriptors1 = new Mat ();
    3.             Mat mask = new Mat (0, 0,  CvType.CV_8UC3);
    4.             mask.setTo (new Scalar (255, 255, 255));
    5.             detector.detectAndCompute (img1Mat, mask, keypoints1, descriptors1);
    6.             print (keypoints1.rows ());
    7.             //extractor.compute (img1Mat, keypoints1, descriptors1);
    8.  
    9.             MatOfKeyPoint keypoints2 = new MatOfKeyPoint ();
    10.             Mat descriptors2 = new Mat ();
    11.        
    12.             detector.detectAndCompute (img2Mat, mask, keypoints2, descriptors2);
    13.             print (keypoints2.rows ());
    14.  
    15.             FlannBasedMatcher flann = FlannBasedMatcher.create ();
    16.             //DescriptorMatcher matcher = DescriptorMatcher.create (DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
    17.             MatOfDMatch matches = new MatOfDMatch ();
    18.             flann.match (descriptors1, descriptors2, matches);
    19.             print (matches.rows());
    20.             //matcher.match (descriptors1, descriptors2, matches);
    21.             print (Time.realtimeSinceStartup - start_time);
    22.  
    23.             Mat resultImg = new Mat ();
    24.  
    25.             Features2d.drawMatches (img1Mat, keypoints1, img2Mat, keypoints2, matches, resultImg);
    26.  
    27.  
    28.             Texture2D texture = new Texture2D (resultImg.cols (), resultImg.rows (), TextureFormat.RGBA32, false);
    29.        
    30.             Utils.matToTexture2D (resultImg, texture);
    31.  
    32.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
     
  35. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    http://whoopsidaisies.hatenablog.com/entry/2013/12/07/135810
    ORB's binary descriptors do not seem to work with FlannBasedMatcher's default KD-tree index, which expects float descriptors such as SIFT's.

    This code works fine.
    http://docs.opencv.org/3.0-beta/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.html
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4.  
    5. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    6. using UnityEngine.SceneManagement;
    7. #endif
    8. using OpenCVForUnity;
    9.  
    10. namespace OpenCVForUnityExample
    11. {
    12.     /// <summary>
    13.     /// Feature2D example.
    14.     /// </summary>
    15.     public class FlannBasedMatcherExample : MonoBehaviour
    16.     {
    17.         // Use this for initialization
    18.         void Start ()
    19.         {
    20.             Texture2D imgTexture = Resources.Load ("lena") as Texture2D;
    21.          
    22.             Mat img1Mat = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC3);
    23.             Utils.texture2DToMat (imgTexture, img1Mat);
    24.             Debug.Log ("img1Mat.ToString() " + img1Mat.ToString ());
    25.  
    26.             Mat img2Mat = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC3);
    27.             Utils.texture2DToMat (imgTexture, img2Mat);
    28.             Debug.Log ("img2Mat.ToString() " + img2Mat.ToString ());
    29.  
    30.  
    31.             float angle = UnityEngine.Random.Range (0, 360), scale = 1.0f;
    32.  
    33.             Point center = new Point (img2Mat.cols () * 0.5f, img2Mat.rows () * 0.5f);
    34.  
    35.             Mat affine_matrix = Imgproc.getRotationMatrix2D (center, angle, scale);
    36.  
    37.             Imgproc.warpAffine (img1Mat, img2Mat, affine_matrix, img2Mat.size ());
    38.  
    39.  
    40.  
    41.             // Initiate SIFT detector
    42.             SIFT sift = SIFT.create ();
    43.  
    44.             // find the keypoints and descriptors with SIFT
    45.             MatOfKeyPoint kp1 = new MatOfKeyPoint ();
    46.             Mat des1 = new Mat ();
    47.             Mat mask = new Mat ();
    48.             sift.detectAndCompute (img1Mat, mask, kp1, des1);
    49.  
    50.             MatOfKeyPoint kp2 = new MatOfKeyPoint ();
    51.             Mat des2 = new Mat ();
    52.             sift.detectAndCompute (img2Mat, mask, kp2, des2);
    53.  
    54.  
    55.             FlannBasedMatcher flann = new FlannBasedMatcher ();
    56.  
    57.             List<MatOfDMatch> matches = new List<MatOfDMatch> ();
    58.             flann.knnMatch (des1, des2, matches, 2);
    59.  
    60.  
    61.             Mat resultImg = new Mat ();
    62.             Features2d.drawMatchesKnn (img1Mat, kp1, img2Mat, kp2, matches, resultImg);
    63.  
    64.             Texture2D texture = new Texture2D (resultImg.cols (), resultImg.rows (), TextureFormat.RGBA32, false);
    65.      
    66.             Utils.matToTexture2D (resultImg, texture);
    67.  
    68.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    69.         }
    70.  
    71.         // Update is called once per frame
    72.         void Update ()
    73.         {
    74.  
    75.         }
    76.  
    77.         public void OnBackButton ()
    78.         {
    79.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    80.             SceneManager.LoadScene ("OpenCVForUnityExample");
    81.             #else
    82.             Application.LoadLevel ("OpenCVForUnityExample");
    83.             #endif
    84.         }
    85.     }
    86. }
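    The reason ORB needs BRUTEFORCE_HAMMING rather than FLANN's defaults is that ORB descriptors are bit strings, compared by Hamming distance (count of differing bits), not Euclidean distance. A tiny numpy sketch of that matching step, on two made-up one-byte descriptors per image:

```python
import numpy as np

def hamming_match(des1, des2):
    """Brute-force Hamming matching of binary descriptors (uint8 rows),
    i.e. what a BRUTEFORCE_HAMMING matcher does for ORB descriptors."""
    # XOR every pair of descriptors, then count the differing bits.
    xor = des1[:, None, :] ^ des2[None, :, :]
    dist = np.unpackbits(xor, axis=2).sum(axis=2)
    return dist.argmin(axis=1), dist.min(axis=1)

des1 = np.array([[0b00001111], [0b11110000]], dtype=np.uint8)
des2 = np.array([[0b11110000], [0b00001110]], dtype=np.uint8)
idx, dist = hamming_match(des1, des2)
print(idx, dist)   # each row of des1 matched to its closest row of des2
```

    (FLANN can also index binary descriptors with an LSH index, but not with the default parameters, which is consistent with the 0 matches reported above.)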
     
  36. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    What do people use for recording video?
     
  37. eating2016

    eating2016

    Joined:
    Aug 1, 2016
    Posts:
    5
    Hi,
    I'm considering buying this asset, and I would like to do Principal Component Analysis.

    I did find the pca_analysis function in OpenCV C++, but I didn't see a similar example in the example code. So I wonder: is there any function that will do PCA in this OpenCV for Unity asset?

    Thank you for your patience.
     
  38. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    I converted the code of this page using OpenCVForUnity.
    http://qiita.com/hitomatagi/items/92fc43226ca37c0f90f6
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4.  
    5. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    6. using UnityEngine.SceneManagement;
    7. #endif
    8. using OpenCVForUnity;
    9.  
    10. namespace OpenCVForUnityExample
    11. {
    12.     /// <summary>
    13.     /// PCA example.
    14.     /// </summary>
    15.     public class PCAExample : MonoBehaviour
    16.     {
    17.  
    18.         void drawAxis (Mat img, Point start_pt, Point vec, Scalar color, double length)
    19.         {
    20.          
    21.             int CV_AA = 16;
    22.              
    23.             Point end_pt = new Point (start_pt.x + length * vec.x, start_pt.y + length * vec.y);
    24.              
    25.             Imgproc.circle (img, start_pt, 5, color, 1);
    26.            
    27.             Imgproc.line (img, start_pt, end_pt, color, 1, CV_AA, 0);
    28.      
    29.  
    30.             double angle = System.Math.Atan2 (vec.y, vec.x);
    31.              
    32.             double qx0 = end_pt.x - 9 * System.Math.Cos (angle + System.Math.PI / 4);
    33.             double qy0 = end_pt.y - 9 * System.Math.Sin (angle + System.Math.PI / 4);
    34.             Imgproc.line (img, end_pt, new Point (qx0, qy0), color, 1, CV_AA, 0);
    35.  
    36.             double qx1 = end_pt.x - 9 * System.Math.Cos (angle - System.Math.PI / 4);
    37.             double qy1 = end_pt.y - 9 * System.Math.Sin (angle - System.Math.PI / 4);
    38.             Imgproc.line (img, end_pt, new Point (qx1, qy1), color, 1, CV_AA, 0);
    39.      
    40.         }
    41.  
    42.         // Use this for initialization
    43.         void Start ()
    44.         {
    45.  
    46.             Mat src = Imgcodecs.imread (Utils.getFilePath ("pca_test1.jpg"));
    47.  
    48.             Debug.Log ("src.ToString() " + src.ToString ());
    49.  
    50.             // Convert image to grayscale
    51.             Mat gray = new Mat ();
    52.             Imgproc.cvtColor (src, gray, Imgproc.COLOR_BGR2GRAY);
    53.             // Convert image to binary
    54.             Mat bw = new Mat ();
    55.             Imgproc.threshold (gray, bw, 50, 255, Imgproc.THRESH_BINARY | Imgproc.THRESH_OTSU);
    56.             // Find all the contours in the thresholded image
    57.  
    58.             Mat hierarchy = new Mat ();
    59.             List<MatOfPoint> contours = new List<MatOfPoint> ();
    60.             Imgproc.findContours (bw, contours, hierarchy, Imgproc.RETR_LIST, Imgproc.CHAIN_APPROX_NONE);
    61.          
    62.             for (int i = 0; i < contours.Count; ++i) {
    63.                 // Calculate the area of each contour
    64.                 double area = Imgproc.contourArea (contours [i]);
    65.                 // Ignore contours that are too small or too large
    66.                 if (area < 1e2 || 1e5 < area)
    67.                     continue;
    68.                 // Draw each contour only for visualisation purposes
    69.                 Imgproc.drawContours (src, contours, i, new Scalar (0, 0, 255), 2);
    70.  
    71.                 //Construct a buffer used by the pca analysis
    72.                 List<Point> pts = contours [i].toList ();
    73.                 int sz = pts.Count;
    74.                 Mat data_pts = new Mat (sz, 2, CvType.CV_64FC1);
    75.                 for (int p = 0; p < data_pts.rows(); ++p) {
    76.                     data_pts.put (p, 0, pts [p].x);
    77.                     data_pts.put (p, 1, pts [p].y);
    78.                 }
    79.  
    80.                 Mat mean = new Mat ();
    81.                 Mat eigenvectors = new Mat ();
    82.                 Core.PCACompute (data_pts, mean, eigenvectors, 1);
    83.                 Debug.Log ("mean.dump() " + mean.dump ());
    84.                 Debug.Log ("eigenvectors.dump() " + eigenvectors.dump ());
    85.  
    86.                 Point cntr = new Point (mean.get (0, 0) [0], mean.get (0, 1) [0]);
    87.                 Point vec = new Point (eigenvectors.get (0, 0) [0], eigenvectors.get (0, 1) [0]);
    88.              
    89.                 drawAxis (src, cntr, vec, new Scalar (255, 255, 0), 150);
    90.  
    91.                 data_pts.Dispose ();
    92.                 mean.Dispose ();
    93.                 eigenvectors.Dispose ();
    94.  
    95.             }
    96.  
    97.  
    98.             Imgproc.cvtColor (src, src, Imgproc.COLOR_BGR2RGB);
    99.  
    100.             Texture2D texture = new Texture2D (src.cols (), src.rows (), TextureFormat.RGBA32, false);
    101.  
    102.             Utils.matToTexture2D (src, texture);
    103.  
    104.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    105.  
    106.         }
    107.  
    108.         // Update is called once per frame
    109.         void Update ()
    110.         {
    111.  
    112.         }
    113.  
    114.         public void OnBackButton ()
    115.         {
    116.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    117.             SceneManager.LoadScene ("OpenCVForUnityExample");
    118.             #else
    119.             Application.LoadLevel ("OpenCVForUnityExample");
    120.             #endif
    121.         }
    122.     }
    123. }
    pca_test1.jpg
    pca_test1.jpg

    PCAExample_ScreenShot.PNG
     
    Last edited: Jun 7, 2017
    eating2016 likes this.
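    The Core.PCACompute call in the example above boils down to an eigen-decomposition of the data covariance: the mean of the points plus the eigenvector of the largest eigenvalue (the dominant axis). The same computation done by hand in numpy, on a synthetic elongated point cloud:

```python
import numpy as np

# PCA of 2D points, as Core.PCACompute(data_pts, mean, eigenvectors, 1)
# computes it. The point cloud is synthetic: stretched along x.
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 2)) * np.array([10.0, 1.0])

mean = pts.mean(axis=0)
cov = np.cov((pts - mean).T)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
principal = eigvecs[:, -1]               # axis of largest variance

print(mean, principal)   # principal ~ (+/-1, ~0): the x axis
```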
  39. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
  40. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  41. estelle27

    estelle27

    Joined:
    May 19, 2017
    Posts:
    3
    Hi, last time I asked why I could not open my video file, but then I found it was because I forgot to copy the opencv_ffmpeg310_64.dll file to my project :(:(. Anyway, thanks for your advice. This time I have some questions about background subtraction. I can now get the background texture, foreground texture, and video texture using the BackgroundSubtractor class. I am wondering how I can get a static video texture (a moving average of the video) using the BackgroundSubtractor class. Do you have any suggestions?
    Thank you in advance.
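    A moving-average background like the one described above is usually kept as an exponential running average of the frames (this is what OpenCV's accumulateWeighted does). A numpy sketch of the update rule, with a made-up alpha:

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential moving average of frames:
    bg <- (1 - alpha) * bg + alpha * frame."""
    return (1.0 - alpha) * bg + alpha * frame

# A brief moving blob fades back out of the background estimate.
bg = np.zeros((4, 4))
moving = np.zeros((4, 4))
moving[1, 1] = 255.0
bg = update_background(bg, moving)           # blob appears faintly
for _ in range(200):
    bg = update_background(bg, np.zeros((4, 4)))
print(bg.max())                              # ~0: blob averaged away
```

    Small alpha means the background adapts slowly (moving objects leave little trace); large alpha makes it follow the video more closely.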
     
  42. eating2016

    eating2016

    Joined:
    Aug 1, 2016
    Posts:
    5
    Hi,

    Sorry for bothering you again, but do you have a function that calculates ICP (iterative closest points), like OpenCV's ICP() or the one in this thread?

    Or a similar function that calculates correspondences between two arrays of 2D points?

    Thanks for your patience.
     
    Last edited: Jun 12, 2017
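    There is no ready-made 2D ICP in the OpenCV Java API this asset mirrors, but one ICP iteration is just nearest-neighbour correspondence plus a best-fit rigid transform via SVD (the Kabsch/Procrustes step). A hedged numpy sketch on synthetic points (function name and test data are mine):

```python
import numpy as np

def icp_step(src, dst):
    """One ICP iteration for 2D point sets: match each src point to its
    nearest dst point, then solve the best rigid transform by SVD."""
    # nearest-neighbour correspondences
    d = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
    matched = dst[d.argmin(axis=1)]
    # Kabsch: best rotation + translation aligning src onto matched
    mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_s
    return R, t

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([0.1, 0.05])
R, t = icp_step(src, dst)
print(np.round(R, 3), np.round(t, 3))
```

    With a small offset like this, the nearest-neighbour pairing is already correct and one step recovers the transform exactly; in general, the step is repeated until the alignment converges.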
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    I have only this example for now.
    https://github.com/EnoxSoftware/Ope...nity/Examples/BackgroundSubtractorMOG2Example
     
  44. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  45. pranee

    pranee

    Joined:
    Feb 5, 2016
    Posts:
    3
    Hello,
    I would like to track the camera, so to initialize the camera intrinsic parameter matrix I'm trying to create a Mat holding a 3x3 matrix.
    Code (CSharp):
    1.  
    2.             CameraMatrix = new Mat (3, 1, CvType.CV_32FC2);
    3.          
    4.             CameraMatrix.put (0, 0, new float[] {811.13273602f, 0,  322.47589929f});
    5.             CameraMatrix.put (1, 0, new float[]{ 0, 811.27489725f, 225.78685243f});
    6.             CameraMatrix.put (2, 0, new float[]{ 0, 0, 1 });
    7.  
    8.             w = new Mat (3, 1, CvType.CV_32FC2);
    9.             w.put (0, 0, new float[] {0, -1, 0});
    10.             w.put (1, 0, new float[] {1, 0, 0});
    11.             w.put (2, 0, new float[] {0, 0, 1});
    Then I calculated the optical flow using the sample code, and to estimate the camera movement I wrote this code:
    Code (CSharp):
    1.  
    2.                     Mat E = Calib3d.findEssentialMat (mMOP2fptsPrev, mMOP2fptsThis, CameraMatrix, Calib3d.FM_RANSAC, 0.999, 2);
    3.                     Core.SVDecomp (E, w, u, vt);
    4.                     Mat R = u * w * vt;
    5.                     Mat T = u.col (2);  
    6.                     print (R);
    7.                     print (T);
    But I'm getting these errors:
    CvException: Provided data element number (3) should be multiple of the Mat channels count (2) — at the first CameraMatrix.put statement
    CvException: Native object address is NULL — at the findEssentialMat call
    Could anyone please help me with this?

    Thank You
     
  46. estelle27

    estelle27

    Joined:
    May 19, 2017
    Posts:
    3
    Hello,
    I used the example OpenCV for Unity provides to get the foreground Mat. Now I would like to get the foreground bounding box (around only the detected moving object) using the BackgroundSubtractorMOG2 class. How can I get it? Furthermore, can I get the vertexes (maybe the bottom-left and top-right vertexes of the bounding box) to use for rendering in my shader?
    Here is the code used for getting the foreground Mat. 1.jpg
     
  47. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Please change from CvType.CV_32FC2 to CvType.CV_32FC3.
    Code (CSharp):
    1.             CameraMatrix = new Mat (3, 1, CvType.CV_32FC3);
    2.        
    3.             CameraMatrix.put (0, 0, new float[] {811.13273602f, 0,  322.47589929f});
    4.             CameraMatrix.put (1, 0, new float[]{ 0, 811.27489725f, 225.78685243f});
    5.             CameraMatrix.put (2, 0, new float[]{ 0, 0, 1 });
    6.             w = new Mat (3, 1, CvType.CV_32FC3);
    7.             w.put (0, 0, new float[] {0, -1, 0});
    8.             w.put (1, 0, new float[] {1, 0, 0});
    9.             w.put (2, 0, new float[] {0, 0, 1});
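    One caveat on the snippet above: a 3x1 three-channel Mat satisfies the put() calls, but findEssentialMat and SVDecomp conventionally take a single-channel 3x3 camera matrix (e.g. new Mat(3, 3, CvType.CV_32FC1), where each put(row, 0, float[3]) fills one row). The R = u * w * vt recovery itself can be checked in numpy on a synthetic essential matrix (E = [t]x R, with my own test values):

```python
import numpy as np

# Recovering rotation from an essential matrix E = [t]x @ R:
# R = U @ W @ Vt from the SVD of E (one of four candidate poses).
W = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

theta = 0.2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t = np.array([1.0, 0.0, 0.0])
tx = np.array([[ 0.0,  -t[2],  t[1]],
               [ t[2],   0.0, -t[0]],
               [-t[1],  t[0],   0.0]])   # cross-product matrix [t]x
E = tx @ R_true

U, _, Vt = np.linalg.svd(E)
R = U @ W @ Vt
if np.linalg.det(R) < 0:    # fix the SVD sign ambiguity
    R = -R

# R is a proper rotation; which of the four candidate poses it equals
# (W vs W.T, +/-t) depends on the SVD's sign choices.
print(np.allclose(R @ R.T, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```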
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using System.Collections.Generic;
    4. using System;
    5.  
    6. #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    7. using UnityEngine.SceneManagement;
    8. #endif
    9. using OpenCVForUnity;
    10.  
    11. namespace OpenCVForUnityExample
    12. {
    13.     /// <summary>
    14.     /// BackgroundSubstractorVideoCapture example. (Example of playing video files using the VideoCapture class)
    15.     /// </summary>
    16.     public class BackgroundSubstractorVideoCaptureExample : MonoBehaviour
    17.     {
    18.         /// <summary>
    19.         /// The videocapture.
    20.         /// </summary>
    21.         VideoCapture capture;
    22.  
    23.         /// <summary>
    24.         /// The rgb mat.
    25.         /// </summary>
    26.         Mat rgbMat;
    27.  
    28.         /// <summary>
    29.         /// The colors.
    30.         /// </summary>
    31.         Color32[] colors;
    32.  
    33.         /// <summary>
    34.         /// The texture.
    35.         /// </summary>
    36.         Texture2D texture;
    37.         BackgroundSubtractorMOG backgroundSubstractorMOG;
    38.         Mat fgmaskMat;
    39.  
    40.         #if UNITY_WEBGL && !UNITY_EDITOR
    41.         Stack<IEnumerator> coroutines = new Stack<IEnumerator> ();
    42.         #endif
    43.        
    44.         // Use this for initialization
    45.         void Start ()
    46.         {
    47.             capture = new VideoCapture ();
    48.  
    49.             #if UNITY_WEBGL && !UNITY_EDITOR
    50.             var getFilePath_Coroutine = Utils.getFilePathAsync("768x576_mjpeg.mjpeg", (result) => {
    51.                 coroutines.Clear ();
    52.  
    53.                 capture.open (result);
    54.                 Init();
    55.             });
    56.             coroutines.Push (getFilePath_Coroutine);
    57.             StartCoroutine (getFilePath_Coroutine);
    58.             #else
    59.             capture.open (Utils.getFilePath ("768x576_mjpeg.mjpeg"));
    60.             Init ();
    61.             #endif
    62.         }
    63.  
    64.         private void Init ()
    65.         {
    66.             rgbMat = new Mat ();
    67.  
    68.             if (capture.isOpened ()) {
    69.                 Debug.Log ("capture.isOpened() true");
    70.             } else {
    71.                 Debug.Log ("capture.isOpened() false");
    72.             }
    73.  
    74.  
    75.             Debug.Log ("CAP_PROP_FORMAT: " + capture.get (Videoio.CAP_PROP_FORMAT));
    76.             Debug.Log ("CV_CAP_PROP_PREVIEW_FORMAT: " + capture.get (Videoio.CV_CAP_PROP_PREVIEW_FORMAT));
    77.             Debug.Log ("CAP_PROP_POS_MSEC: " + capture.get (Videoio.CAP_PROP_POS_MSEC));
    78.             Debug.Log ("CAP_PROP_POS_FRAMES: " + capture.get (Videoio.CAP_PROP_POS_FRAMES));
    79.             Debug.Log ("CAP_PROP_POS_AVI_RATIO: " + capture.get (Videoio.CAP_PROP_POS_AVI_RATIO));
    80.             Debug.Log ("CAP_PROP_FRAME_COUNT: " + capture.get (Videoio.CAP_PROP_FRAME_COUNT));
    81.             Debug.Log ("CAP_PROP_FPS: " + capture.get (Videoio.CAP_PROP_FPS));
    82.             Debug.Log ("CAP_PROP_FRAME_WIDTH: " + capture.get (Videoio.CAP_PROP_FRAME_WIDTH));
    83.             Debug.Log ("CAP_PROP_FRAME_HEIGHT: " + capture.get (Videoio.CAP_PROP_FRAME_HEIGHT));
    84.  
    85.             capture.grab ();
    86.             capture.retrieve (rgbMat, 0);
    87.             int frameWidth = rgbMat.cols ();
    88.             int frameHeight = rgbMat.rows ();
    89.             colors = new Color32[frameWidth * frameHeight];
    90.             texture = new Texture2D (frameWidth, frameHeight, TextureFormat.RGBA32, false);
    91.             gameObject.transform.localScale = new Vector3 ((float)frameWidth, (float)frameHeight, 1);
    92.             float widthScale = (float)Screen.width / (float)frameWidth;
    93.             float heightScale = (float)Screen.height / (float)frameHeight;
    94.             if (widthScale < heightScale) {
    95.                 Camera.main.orthographicSize = ((float)frameWidth * (float)Screen.height / (float)Screen.width) / 2;
    96.             } else {
    97.                 Camera.main.orthographicSize = (float)frameHeight / 2;
    98.             }
    99.             capture.set (Videoio.CAP_PROP_POS_FRAMES, 0);
    100.  
    101.             gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    102.  
    103.  
    104.             backgroundSubstractorMOG = Bgsegm.createBackgroundSubtractorMOG ();
    105.             fgmaskMat = new Mat (rgbMat.rows (), rgbMat.cols (), CvType.CV_8UC1);
    106.         }
    107.  
    108.         // Update is called once per frame
    109.         void Update ()
    110.         {
    111.             //Loop play
    112.             if (capture.get (Videoio.CAP_PROP_POS_FRAMES) >= capture.get (Videoio.CAP_PROP_FRAME_COUNT))
    113.                 capture.set (Videoio.CAP_PROP_POS_FRAMES, 0);
    114.  
    115.             // The error "PlayerLoop called recursively!" can occur on iOS; WebCamTexture is recommended there.
    116.             if (capture.grab ()) {
    117.  
    118.                 capture.retrieve (rgbMat, 0);
    119.  
    120.                 Imgproc.cvtColor (rgbMat, rgbMat, Imgproc.COLOR_BGR2RGB);
    121.  
    122.                 backgroundSubstractorMOG.apply (rgbMat, fgmaskMat);
    123.                
    124.                 //Debug.Log ("Mat toString " + rgbMat.ToString ());
    125.  
    126.  
    127.                 List<MatOfPoint> contours = new List<MatOfPoint> ();
    128.                 Mat hierarchy = new Mat ();
    129.                
    130.                
    131.                 Imgproc.findContours (fgmaskMat, contours, hierarchy, Imgproc.RETR_CCOMP, Imgproc.CHAIN_APPROX_SIMPLE);
    132.                
    133.                
    134.                 for (int i = 0; i < contours.Count; i++) {
    135. //                    Imgproc.drawContours (rgbMat, contours, i, new Scalar (255, 0, 0), 3);
    136.                     if (Imgproc.contourArea (contours [i]) > 50) {
    137.                         OpenCVForUnity.Rect rect = Imgproc.boundingRect (contours [i]);
    138.                         if (rect.height > 28) {
    139.                             Imgproc.rectangle (rgbMat, new Point (rect.x, rect.y), new Point (rect.x + rect.width, rect.y + rect.height), new Scalar (0, 0, 255), 3);
    140.                         }
    141.                     }
    142.                 }
    143.  
    144.                 foreach (var item in contours) {
    145.                     item.Dispose ();
    146.                 }
    147.                 contours.Clear ();
    148.                 hierarchy.Dispose ();
    149.  
    150.                
    151.                 Utils.matToTexture2D (rgbMat, texture, colors);
    152.             }
    153.         }
    154.        
    155.         void OnDestroy ()
    156.         {
    157.             capture.release ();
    158.  
    159.             if (rgbMat != null)
    160.                 rgbMat.Dispose ();
    161.             if (fgmaskMat != null)
    162.                 fgmaskMat.Dispose ();
    163.  
    164.             #if UNITY_WEBGL && !UNITY_EDITOR
    165.             foreach (var coroutine in coroutines) {
    166.                 StopCoroutine (coroutine);
    167.                 ((IDisposable)coroutine).Dispose ();
    168.             }
    169.             #endif
    170.         }
    171.  
    172.         /// <summary>
    173.         /// Raises the back button click event.
    174.         /// </summary>
    175.         public void OnBackButtonClick ()
    176.         {
    177.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    178.             SceneManager.LoadScene ("OpenCVForUnityExample");
    179.             #else
    180.             Application.LoadLevel ("OpenCVForUnityExample");
    181.             #endif
    182.         }
    183.     }
    184. }
    BackgroundSubstractorVideoCaptureExample.PNG
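    The findContours/boundingRect loop in the Update() method above amounts to: take the foreground mask, group the foreground pixels, and box each group. For a single blob the box is just the min/max of the nonzero pixel coordinates; a numpy sketch (the helper name is mine):

```python
import numpy as np

def bounding_rect(mask):
    """Bounding box (x, y, w, h) of all nonzero pixels in a binary mask,
    like Imgproc.boundingRect applied to a single contour."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (xs.min(), ys.min(),
            xs.max() - xs.min() + 1,
            ys.max() - ys.min() + 1)

mask = np.zeros((10, 10), dtype=np.uint8)
mask[2:5, 3:8] = 255           # a 3-row by 5-column foreground blob
print(bounding_rect(mask))     # (3, 2, 5, 3)
```

    The rect.x/rect.y and rect.width/rect.height drawn with Imgproc.rectangle in the example correspond to exactly these values; they are also the numbers to pass to a shader as the box corners.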
     
  49. juuuuun

    juuuuun

    Joined:
    Feb 17, 2014
    Posts:
    23
    Hi,

    I'm using Unity 5.6f1 and I have trouble using the Ximgproc.createStructuredEdgeDetection function on iOS. I get the following error in Xcode when I try to use this function on an iOS device (10.3.2), and the app crashes. It works fine in the Unity Editor but doesn't seem to work on iOS.


    Code (csharp):
    1.  
    2. 2017-06-19 18:07:48.294526+0900 edgecontour[4933:2007957] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
    3.  
    4. 2017-06-19 18:07:48.296212+0900 edgecontour[4933:2007957] [MC] Reading from public effective user settings.
    5.  
    6. OpenCV Error: The function/feature is not implemented (There is no compressed file storage support in this configuration) in cvOpenFileStorage, file /Users/satoo/opencv/ios/opencv/modules/core/src/persistence.cpp, line 4235
    7.  
    8. 2017-06-19 18:07:49.125208+0900 edgecontour[4933:2007957] Ptr<cv::ximgproc::StructuredEdgeDetection> *ximgproc_Ximgproc_createStructuredEdgeDetection_11(char *) [Line 1659] ximgproc::createStructuredEdgeDetection_11() : /Users/satoo/opencv/ios/opencv/modules/core/src/persistence.cpp:4235: error: (-213) There is no compressed file storage support in this configuration in function cvOpenFileStorage
    9.  
    10. libc++abi.dylib: terminating with uncaught exception of type Il2CppExceptionWrapper
    11.  
    The code I'm using on Unity is as follows.

    Code (csharp):
    1.  
    2. using System.Collections;
    3. using System.Collections.Generic;
    4. using UnityEngine;
    5. using NatCamU.Core;
    6. using OpenCVForUnity;
    7.  
    8. public class StructuredEdgeScanner : NatCamBehaviour {
    9.     public static StructuredEdgeScanner instance;
    10.     Texture2D texture;
    11.     int width;
    12.     int height;
    13.     public float textureResolution = 3f;
    14.     StructuredEdgeDetection structuredEdgeDetection;
    15.  
    16.     public override void Start(){
    17.         instance = this;
    18.  
    19.         base.Start();
    20.  
    21.         structuredEdgeDetection = Ximgproc.createStructuredEdgeDetection(Application.streamingAssetsPath + "/model.yml.gz");
    22.  
    23.    
    24.     }
    25.  
    26.     public override void OnPreviewStart () {
    27.         width = (int)(NatCam.Preview.width);// / textureResolution);
    28.         height = (int)(NatCam.Preview.height);// / textureResolution);
    29.         texture = new Texture2D (width , height, TextureFormat.RGBA32, false, false);
    30.         // Set our UI Panel's RawImage texture to our destination texture
    31.         preview.texture = texture;
    32.     }
    33.  
    34.     public override void OnPreviewUpdate () {
    35.    
    36.         CaptureStructuredEdge(textureResolution);
    37.  
    38.     }
    39.  
    40.     public Texture2D CaptureStructuredEdge(float res, bool makeNewTexture = false){
    41.         var mat = NatCam.PreviewMatrix;
    42.         Mat bgMat = new Mat();
    43.         Imgproc.cvtColor(mat,bgMat,Imgproc.COLOR_RGBA2RGB);
    44.         Imgproc.resize(bgMat,bgMat,new Size(width/res, height/res));
    45.         Core.flip(bgMat,bgMat,0);
    46.         Core.flip(bgMat,bgMat,1);
    47.  
    48.         Mat fsrc = new Mat();
    49.         bgMat.convertTo(fsrc, CvType.CV_32F,1.0 / 255.0);
    50.  
    51.         Mat edgeMat = new Mat(fsrc.rows(),fsrc.cols(),fsrc.type());
    52.  
    53.         structuredEdgeDetection.detectEdges(fsrc,edgeMat);
    54.         edgeMat.convertTo(edgeMat,mat.type(),255.0);
    55.         Imgproc.resize(edgeMat,edgeMat,new Size(width, height));
    56.  
    57.         var newTexture = texture;
    58.         if(makeNewTexture){
    59.             newTexture = new Texture2D(width, height, TextureFormat.RGBA32,false,false);
    60.         }
    61.  
    62.         Utils.matToTexture2D (edgeMat, newTexture);
    63.    
    64.         fsrc.release();
    65.         edgeMat.release();
    66.         bgMat.release();
    67.         mat.release();
    68.  
    69.         return newTexture;
    70.     }
    71.  
    72. }
    73.  
    74.  
    I found the following page, which looks like the same problem, but I can't apply the same solution since the library for Unity is already compiled.

    https://github.com/opencv/opencv/issues/8106

    I'd appreciate it if anybody has an idea how to solve this issue.
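    In the meantime, one workaround I'm considering (untested on device, so treat it as a sketch): since the error comes from cvOpenFileStorage lacking compressed-file support, decompress the model ahead of time so OpenCV never has to open a .gz file. The file names below are just stand-ins for the model in my StreamingAssets folder.

    Code (shell):
    ```shell
    # Demo on a stand-in file; replace with the real model.yml.gz from StreamingAssets.
    printf 'dummy: model\n' > model.yml
    gzip -f model.yml          # produces model.yml.gz (stand-in for the shipped model)

    # The actual workaround: decompress before building the app.
    # -d = decompress, -k = keep the original .gz alongside the output
    gzip -dk model.yml.gz      # produces an uncompressed model.yml
    ```

    Then in Start() I would load the uncompressed file instead, e.g. Ximgproc.createStructuredEdgeDetection(Application.streamingAssetsPath + "/model.yml"), avoiding the gzip path in persistence.cpp entirely.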

    Thank you,
     
    Last edited: Jun 19, 2017
  50. ToolkitSpA

    ToolkitSpA

    Joined:
    Jun 7, 2017
    Posts:
    3
    Hello, your plugin is nice, but I have a problem with the WebCamARSample: when I move the quad a couple of pixels on the Y axis, the AR objects appear really weird.

    Here is a screenshot of how it looks:

    https://ibb.co/hsD1oQ