
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    I am going to try using FaceTrackerARSample with NatCam. Please wait a while.

    Please set the requestWidth and requestHeight as follows.
    requestWidth 1280
    requestHeight 720
    ios_1280x720_2.jpg
     
  2. JetNikus

    JetNikus

    Joined:
    Jun 2, 2016
    Posts:
    3
    Hello EnoxSoftware!
    I've found that the iOS (and OS X) version of the plugin is huge. Is it possible to reduce it? Maybe strip unused modules or something like that? Thanks
     
    Oshiyama likes this.
  3. Greg-Bassett

    Greg-Bassett

    Joined:
    Jul 28, 2009
    Posts:
    628
    OK, I have purchased OpenCV for Unity today! Quick question: with face detection, is it possible to get the rect values for the coloured square? I want to be able to move my device and detect when the face is roughly in the centre of the screen.

    Thanks in advance!
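    In the detection samples, `detectMultiScale` fills a `MatOfRect`, and each `OpenCVForUnity.Rect` exposes `x`, `y`, `width` and `height`, so a centre check can be built from those. A minimal sketch, assuming `faces` and `rgbaMat` come from WebCamTextureDetectFaceSample-style code (the 10% tolerance is an arbitrary choice, not part of the asset):

```csharp
// Sketch: check whether the first detected face is roughly centred in the frame.
// Assumes `faces` is the MatOfRect filled by detectMultiScale and `rgbaMat` is the camera frame.
OpenCVForUnity.Rect[] rects = faces.toArray ();
if (rects.Length > 0) {
    OpenCVForUnity.Rect r = rects [0];
    double faceCx = r.x + r.width / 2.0;
    double faceCy = r.y + r.height / 2.0;
    double frameCx = rgbaMat.cols () / 2.0;
    double frameCy = rgbaMat.rows () / 2.0;

    // "Roughly centred" = within 10% of the frame size of the exact centre (tune to taste).
    double tolX = rgbaMat.cols () * 0.1;
    double tolY = rgbaMat.rows () * 0.1;
    bool centred = System.Math.Abs (faceCx - frameCx) < tolX &&
                   System.Math.Abs (faceCy - frameCy) < tolY;
    Debug.Log ("face centred: " + centred);
}
```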
     
  4. oleray

    oleray

    Joined:
    Jan 30, 2016
    Posts:
    3
    Hi EnoxSoftware, hi everyone,
    I have purchased OpenCV for Unity with the aim of creating AR apps, particularly detecting and augmenting human faces.
    I want to obtain the kind of result shown in this video

    I've been testing FaceTrackerARSample and imported my own 3D models instead of the red head, etc., but the result is not that good yet: in particular, there is way too much noise and the 3D model "jumps"; the smoothness is far from that in the video.

    What are the ways I can improve the existing code ? Did I miss some parameters that I can adjust?
    Do I have to train my own cascade classifier? Would I obtain significantly better results?
    Would augmenting the number of features (currently left/right mouth, left/right eye, nose) reduce the noise?
    Do I need to implement some kind of "floor value" for the delta of the face features' positions, below which the position of my 3D object wouldn't be updated? Does anyone have experience with this?

    Thanks a lot in advance
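    One common remedy for the jitter described above is exactly the "floor value" idea: ignore sub-threshold deltas, and low-pass filter the rest. A minimal sketch, assuming the tracker delivers a target position each frame (the names and constants here are illustrative, not part of the asset):

```csharp
// Sketch: deadband + exponential smoothing for a tracked position.
// `targetPos` would come from the face tracker each frame; `smoothedPos` drives the 3D model.
Vector3 smoothedPos;
const float deadband = 2.0f;    // ignore movements smaller than this ("floor value")
const float smoothing = 0.15f;  // 0 = frozen, 1 = no smoothing; tune per frame rate

Vector3 Filter (Vector3 targetPos)
{
    if (Vector3.Distance (targetPos, smoothedPos) < deadband)
        return smoothedPos;  // below the floor value: don't move the model at all
    smoothedPos = Vector3.Lerp (smoothedPos, targetPos, smoothing);
    return smoothedPos;
}
```

A median over the last few frames can be added before the filter to reject single-frame outliers.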
     
  5. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
  6. Greg-Bassett

    Greg-Bassett

    Joined:
    Jul 28, 2009
    Posts:
    628
  7. Greg-Bassett

    Greg-Bassett

    Joined:
    Jul 28, 2009
    Posts:
    628
  8. dwisaras18

    dwisaras18

    Joined:
    Jun 4, 2016
    Posts:
    1
    Hi Enox, is there an example of markerless augmented reality using OpenCV for Unity? I need one and it is hard to find.
     
  9. zoe_nightshade

    zoe_nightshade

    Joined:
    May 12, 2016
    Posts:
    27
    Hello! So far I've been successful with stitching two images. However, when I try to stitch newly captured images from the camera, the output is somehow distorted. It seems I'm having problems finding the matching features of the images. Is there anything I'm missing in my code?

    Code (CSharp):
    Texture2D imgTemplate = Resources.Load ("2") as Texture2D;
    Texture2D imgTexture = Resources.Load ("1") as Texture2D;

    Mat matSrc = new Mat (imgTemplate.height, imgTemplate.width, CvType.CV_8UC3);
    Utils.texture2DToMat (imgTemplate, matSrc);

    Mat matScene = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC3);
    Utils.texture2DToMat (imgTexture, matScene);

    // Detect keypoints and compute ORB descriptors for both images.
    FeatureDetector detector = FeatureDetector.create (FeatureDetector.ORB);
    DescriptorExtractor extractor = DescriptorExtractor.create (DescriptorExtractor.ORB);

    MatOfKeyPoint keypointsSrc = new MatOfKeyPoint ();
    Mat descriptorsSrc = new Mat ();
    detector.detect (matSrc, keypointsSrc);
    extractor.compute (matSrc, keypointsSrc, descriptorsSrc);

    MatOfKeyPoint keypointsScene = new MatOfKeyPoint ();
    Mat descriptorsScene = new Mat ();
    detector.detect (matScene, keypointsScene);
    extractor.compute (matScene, keypointsScene, descriptorsScene);

    DescriptorMatcher matcher = DescriptorMatcher.create (DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
    MatOfDMatch matches = new MatOfDMatch ();
    matcher.match (descriptorsSrc, descriptorsScene, matches);

    //////////////////////////////////////////////////////////////////////////////////////////////////

    // Keep only matches whose distance is close to the best match.
    List<DMatch> matchesList = matches.toList ();

    double max_dist = 0;
    double min_dist = 100;
    for (int i = 0; i < descriptorsSrc.rows (); i++) {
        double dist = (double)matchesList [i].distance;
        if (dist < min_dist)
            min_dist = dist;
        if (dist > max_dist)
            max_dist = dist;
    }

    List<DMatch> good_matches = new List<DMatch> ();
    for (int i = 0; i < descriptorsSrc.rows (); i++) {
        if (matchesList [i].distance < 2 * min_dist) {
            good_matches.Add (matchesList [i]);
        }
    }
    MatOfDMatch gm = new MatOfDMatch ();
    gm.fromList (good_matches);

    List<Point> objList = new List<Point> ();
    List<Point> sceneList = new List<Point> ();

    List<KeyPoint> keypoints_objectList = keypointsSrc.toList ();
    List<KeyPoint> keypoints_sceneList = keypointsScene.toList ();

    for (int i = 0; i < good_matches.Count; i++) {
        objList.Add (keypoints_objectList [good_matches [i].queryIdx].pt);
        sceneList.Add (keypoints_sceneList [good_matches [i].trainIdx].pt);
    }

    MatOfPoint2f obj = new MatOfPoint2f ();
    MatOfPoint2f scene = new MatOfPoint2f ();
    obj.fromList (objList);
    scene.fromList (sceneList);

    Mat H = Calib3d.findHomography (obj, scene, Calib3d.RANSAC, 18);

    Mat srcRectMat = new Mat (4, 1, CvType.CV_32FC2);
    Mat dstRectMat = new Mat (4, 1, CvType.CV_32FC2);

    // Transform the corners of the template into the scene.
    Point[] obj_corners = new Point[4];
    obj_corners [0] = new Point (0, 0);
    obj_corners [1] = new Point ((double)matSrc.cols (), 0);
    obj_corners [2] = new Point (matSrc.cols (), matSrc.rows ());
    obj_corners [3] = new Point (0, matSrc.rows ());

    Point[] scene_corners = new Point[4];

    MatOfPoint2f srcPointMat = new MatOfPoint2f (obj_corners);
    MatOfPoint2f dstPointMat = new MatOfPoint2f ();

    Core.perspectiveTransform (srcPointMat, dstPointMat, H);
    scene_corners = dstPointMat.toArray ();

    //////////////////////////////////////////////////////////////////////////////////////////////////

    // Warp the template into the scene's frame, then paste the scene over the left part.
    Mat warpimg = matSrc.clone ();
    OpenCVForUnity.Size ims = new OpenCVForUnity.Size (matSrc.cols () + matScene.cols (), matSrc.rows ());
    Imgproc.warpPerspective (matSrc, warpimg, H, ims);
    OpenCVForUnity.Rect _rect = new OpenCVForUnity.Rect (0, 0, matScene.cols (), matScene.rows ());
    Mat half = new Mat (warpimg, _rect);
    matScene.copyTo (half);
    _Texture = new Texture2D (warpimg.cols (), warpimg.rows (), TextureFormat.RGBA32, false);
    Utils.matToTexture2D (warpimg, _Texture);
     
  10. bluemike

    bluemike

    Joined:
    Oct 26, 2015
    Posts:
    18
    Hi

    When do you plan to support WebGL ?

    Thanks

    Michael
     
  11. aespinosa

    aespinosa

    Joined:
    Feb 1, 2016
    Posts:
    33
    Hi enox!!

    I am trying to use corner detection (Harris) in my code. I am following the steps in the OpenCV documentation:

    void cornerHarris_demo( int, void* )
    {

    Mat dst, dst_norm, dst_norm_scaled;
    dst = Mat::zeros( src.size(), CV_32FC1 );

    /// Detector parameters
    int blockSize = 2;
    int apertureSize = 3;
    double k = 0.04;

    /// Detecting corners
    cornerHarris( src_gray, dst, blockSize, apertureSize, k, BORDER_DEFAULT );

    /// Normalizing
    normalize( dst, dst_norm, 0, 255, NORM_MINMAX, CV_32FC1, Mat() );
    convertScaleAbs( dst_norm, dst_norm_scaled );

    /// Drawing a circle around corners
    for( int j = 0; j < dst_norm.rows ; j++ )
    {
        for( int i = 0; i < dst_norm.cols; i++ )
        {
            if( (int) dst_norm.at<float>(j,i) > thresh )
            {
                circle( dst_norm_scaled, Point( i, j ), 5, Scalar(0), 2, 8, 0 );
            }
        }
    }
    /// Showing the result
    namedWindow( corners_window, CV_WINDOW_AUTOSIZE );
    imshow( corners_window, dst_norm_scaled );
    }

    However, I don't know how to implement that part of the for loop. The line "if( (int) dst_norm.at<float>(j,i) > thresh )" is the one I don't know how to translate. Is this necessary, or can I do it another way? I just need to show the corners in my output image.

    Thank you so much!!
     
  12. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Unfortunately, there is currently no MarkerLessARSample.
     
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Could you show me an example of the output image?
     
  14. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Unfortunately, it is difficult to support the WebGL platform due to technical problems.
     
  15. Deleted User

    Deleted User

    Guest

    Hello, I'm working on a project (thesis) and I'm facing some problems. It's an augmented reality project where I am using the solvePnP method from your asset. First, I am trying to apply the real camera's settings to my virtual camera (the camera object in the scene) by calculating the field of view, aspect ratio, and near and far clipping planes in order to create the perspective projection matrix. The real camera is calibrated using OpenCV calibration, and I know the intrinsic parameters and distortion coefficients. Then I use the solvePnP method from your asset to get the rotation matrix (converting the 3x1 Rodrigues vector to a 3x3 rotation matrix) and the translation vector. Now the problem: I want to place the camera in the scene with the correct position and orientation using those extrinsic parameters (the rotation and translation from solvePnP), but I can't get it working correctly. Thanks!
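    For reference, solvePnP returns the pose of the object in camera coordinates, so the camera's pose in the scene is the inverse of that transform; an axis flip is also needed because OpenCV is right-handed with y down while Unity is left-handed with y up. A hedged sketch, assuming `rotM` is the 3x3 Mat from Calib3d.Rodrigues and `tvec` is the 3x1 translation Mat (names are illustrative; the exact axis fix can vary with your setup):

```csharp
// Build [R|t] as a Unity Matrix4x4: row i, column j of rotM goes to element (i, j).
Matrix4x4 transformationM = new Matrix4x4 ();
transformationM.SetRow (0, new Vector4 ((float)rotM.get (0, 0) [0], (float)rotM.get (0, 1) [0], (float)rotM.get (0, 2) [0], (float)tvec.get (0, 0) [0]));
transformationM.SetRow (1, new Vector4 ((float)rotM.get (1, 0) [0], (float)rotM.get (1, 1) [0], (float)rotM.get (1, 2) [0], (float)tvec.get (1, 0) [0]));
transformationM.SetRow (2, new Vector4 ((float)rotM.get (2, 0) [0], (float)rotM.get (2, 1) [0], (float)rotM.get (2, 2) [0], (float)tvec.get (2, 0) [0]));
transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));

// Flip the y axis on both sides to move between OpenCV's y-down and Unity's y-up conventions,
// then invert to turn "world in camera coordinates" into "camera in world coordinates".
Matrix4x4 invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
Matrix4x4 cameraPoseM = (invertYM * transformationM * invertYM).inverse;

Camera.main.transform.position = cameraPoseM.GetColumn (3);
Camera.main.transform.rotation = Quaternion.LookRotation (cameraPoseM.GetColumn (2), cameraPoseM.GetColumn (1));
```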
     
  16. MoCoder

    MoCoder

    Joined:
    Dec 4, 2012
    Posts:
    7
    Hi EnoxSoftware, hi everyone,
    Help me,
    Thanks

    ========== OUTPUTING STACK TRACE ==================

    0x70BF4F16 (opencvforunity) core_Mat_n_1cols
    0x054BFB08 (Mono JIT Code) (wrapper managed-to-native) OpenCVForUnity.Mat:core_Mat_n_1cols (intptr)
    0x054BFACB (Mono JIT Code) OpenCVForUnity.Mat:cols ()
    0x0550D7E8 (Mono JIT Code) AutoPhotograph:TakingPhoto (OpenCVForUnity.Mat,OpenCVForUnity.Rect)
    0x054C71C9 (Mono JIT Code) AutoPhotograph:Update ()
    0x0549F541 (Mono JIT Code) (wrapper runtime-invoke) object:runtime_invoke_void__this__ (object,intptr,intptr,intptr)
    0x100F0672 (mono) mono_set_defaults
    0x1005D85E (mono) mono_runtime_invoke
    0x00F90C5E (QianTingPhoto) scripting_gchandle_get_target
    0x0103E6AA (QianTingPhoto) ScriptingArguments::AddString
    0x0103E4A5 (QianTingPhoto) ScriptingArguments::AddString
    0x00F807DA (QianTingPhoto) IIMGUI::IIMGUI
    0x00F82328 (QianTingPhoto) GetMonoBehaviourInConstructor
    0x01002E42 (QianTingPhoto) RectT<int>::Contains
    0x010A5ED2 (QianTingPhoto) PlayerMainWndProc
    0x010A7B65 (QianTingPhoto) PlayerWinMain
    0x0141D258 (QianTingPhoto) TransferBase::IsSerializingForGameRelease
    0x0144C1D0 (QianTingPhoto) TransferBase::IsSerializingForGameRelease
    0x772D33CA (kernel32) BaseThreadInitThunk
    0x77969ED2 (ntdll) RtlInitializeExceptionChain
    0x77969EA5 (ntdll) RtlInitializeExceptionChain

    ========== END OF STACKTRACE ===========

    on Windows platform
    unity 5.2
     
    Last edited: Jun 10, 2016
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    (int) dst_norm.at<float>(j,i) → (int)dst_norm.get (j, i)[0]
    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnitySample
    {
        /// <summary>
        /// Corner Harris sample.
        /// </summary>
        public class CornerHarrisSample : MonoBehaviour
        {
            int thresh = 200;

            // Use this for initialization
            void Start ()
            {
                Texture2D imgTexture = Resources.Load ("lena") as Texture2D;

                Mat src = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC4);
                Utils.texture2DToMat (imgTexture, src);
                Debug.Log ("src ToString " + src.ToString ());

                Mat gray = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC1);
                Imgproc.cvtColor (src, gray, Imgproc.COLOR_RGBA2GRAY);

                Mat dst = Mat.zeros (src.size (), CvType.CV_32FC1);
                Mat dst_norm = new Mat ();
                Mat dst_norm_scaled = new Mat ();

                // Detecting corners
                Imgproc.cornerHarris (gray, dst, 7, 5, 0.05, Core.BORDER_DEFAULT);

                // Normalizing
                Core.normalize (dst, dst_norm, 0, 255, Core.NORM_MINMAX, CvType.CV_32FC1, new Mat ());
                Core.convertScaleAbs (dst_norm, dst_norm_scaled);

                // Drawing a circle around corners
                for (int j = 0; j < dst_norm.rows (); j++) {
                    for (int i = 0; i < dst_norm.cols (); i++) {
                        if ((int)dst_norm.get (j, i) [0] > thresh) {
                            Imgproc.circle (dst_norm_scaled, new Point (i, j), 5, new Scalar (0), 2, 8, 0);
                        }
                    }
                }

                Texture2D texture = new Texture2D (dst_norm_scaled.cols (), dst_norm_scaled.rows (), TextureFormat.RGBA32, false);
                Utils.matToTexture2D (dst_norm_scaled, texture);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
            }

            // Update is called once per frame
            void Update ()
            {
            }

            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnitySample");
                #else
                Application.LoadLevel ("OpenCVForUnitySample");
                #endif
            }
        }
    }
     
  18. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    OpenCV for Unity
    Released Version 2.0.4


    Version changes
    2.0.4
    [Android]Added Support for Split Application Binary (.OBB)
    [Android]Removed opencvforunity.jar.
     
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Please refer to the following sample code for the conversion to Unity's camera position.
    https://github.com/EnoxSoftware/MarkerBasedARSample
    https://github.com/EnoxSoftware/FaceTrackerSample
     
    Deleted User likes this.
  20. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Could you show me the code that caused the error?
     
  21. Deleted User

    Deleted User

    Guest

    Hi! Thanks for the help. I have a question: why did you manually hardcode the camera parameters in MarkerBasedARSample?


    Code (CSharp):
    //set cameraparam
    int max_d = (int)Mathf.Max (imgMat.rows (), imgMat.cols ());
    Mat camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    camMatrix.put (0, 0, max_d);
    camMatrix.put (0, 1, 0);
    camMatrix.put (0, 2, imgMat.cols () / 2.0f);
    camMatrix.put (1, 0, 0);
    camMatrix.put (1, 1, max_d);
    camMatrix.put (1, 2, imgMat.rows () / 2.0f);
    camMatrix.put (2, 0, 0);
    camMatrix.put (2, 1, 0);
    camMatrix.put (2, 2, 1.0f);
    Debug.Log ("camMatrix " + camMatrix.dump ());

    MatOfDouble distCoeffs = new MatOfDouble (0, 0, 0, 0);
    Debug.Log ("distCoeffs " + distCoeffs.dump ());
    I got my camera's parameters and distortion coefficients using OpenCV's calibration method and stored them in an XML file. I then read that XML file and construct the camera parameter and distortion coefficient matrices. But when I put the virtual object in the scene, its position is not quite correct. I think something is going wrong when I try to set up the virtual camera.
     
  22. MoCoder

    MoCoder

    Joined:
    Dec 4, 2012
    Posts:
    7
    Thanks, EnoxSoftware

    Code (CSharp):
    void Update ()
    {
        Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

        Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

        //convert image to greyscale
        using (Mat equalizeHistMat = new Mat ())
        using (MatOfRect faces = new MatOfRect ()) {
            Imgproc.equalizeHist (grayMat, equalizeHistMat);

            cascade.detectMultiScale (equalizeHistMat, faces, 1.1f, 2, 0
                    | Objdetect.CASCADE_FIND_BIGGEST_OBJECT,
                    new OpenCVForUnity.Size (equalizeHistMat.rows () * 0.25f, equalizeHistMat.rows () * 0.25f), new Size ());

            if (faces.rows () > 0) {
                OpenCVForUnity.Rect[] rects = faces.toArray ();
                TakingPhoto (rgbaMat.clone (), rects [0]);
            }
        }
    }

    void TakingPhoto (Mat mat, OpenCVForUnity.Rect rect)
    {
        Mat m = new Mat (mat, rect);
        Mat t = new Mat (new Size (templateMat.width (), templateMat.height ()), templateMat.type ());

        templateMat.copyTo (t);
        Imgproc.resize (t, t, new Size (rect.width, rect.height));

        try {
            Color32[] c = new Color32[m.cols () * m.rows ()];
            Texture2D t2d = new Texture2D (m.cols (), m.rows (), TextureFormat.RGBA32, false);

            for (int col = 0; col < m.cols (); col++) {
                for (int row = 0; row < m.rows (); row++) {
                    double[] rgba = m.get (row, col);
                    rgba [3] = t.get (row, col) [3];
                    rgba [0] = rgba [0] * 1.25f + 10;
                    rgba [1] = rgba [1] * 1.25f + 10;
                    rgba [2] = rgba [2] * 1.25f + 10;
                    m.put (row, col, rgba);
                }
            }

            Utils.matToTexture2D (m, t2d, c);
        } catch (System.Exception e) {
            Debug.LogError (e.ToString ());
        }

        t.Dispose ();
        m.Dispose ();
        mat.Dispose ();
    }
     
  23. Altezio

    Altezio

    Joined:
    Jan 6, 2016
    Posts:
    6
    Hi EnoxSoftware, I am trying CardboardMarkerBasedARSample, but I have some warnings on the PreRender and PostRender components. It seems that I don't have the associated scripts for those two components, and I can't discover what is missing.
    I am trying to solve this because I can't make the AR objects appear in front of the marker.
     

    Attached Files:

  24. RobbGraySBL

    RobbGraySBL

    Joined:
    Feb 4, 2014
    Posts:
    40
    Seeing a crash doing compare. Here's the culprit.


    var rectangle = new OpenCVForUnity.Rect(image.width() / 4, image.height() / 4, 3 * image.width() / 4, 3 * image.height() / 4);
    Mat result = new Mat();
    Mat bgdModel = new Mat();
    Mat fgdModel = new Mat();
    Mat source = new Mat(1, 1, CvType.CV_8U, new Scalar(3));
    Imgproc.grabCut(image, result, rectangle, bgdModel, fgdModel, 8, Imgproc.GC_INIT_WITH_RECT);
    Core.compare(result, source, result, Core.CMP_EQ);
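    A likely cause: Core.compare requires both inputs to have the same size and type, but `source` here is a 1x1 Mat while grabCut's `result` mask has the frame's size. A hedged sketch of two possible fixes (whether the scalar overload is exposed depends on the wrapper version):

```csharp
// grabCut labels: GC_BGD = 0, GC_FGD = 1, GC_PR_BGD = 2, GC_PR_FGD = 3.
// Option 1: compare against a scalar, if this wrapper version exposes that overload:
Core.compare (result, new Scalar (3), result, Core.CMP_EQ);

// Option 2: build the comparison Mat with the same size and type as the mask:
Mat source = new Mat (result.size (), result.type (), new Scalar (3));
Core.compare (result, source, result, Core.CMP_EQ);
```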
     
  25. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Since I do not yet deeply understand how to convert OpenCV's camera parameters to Unity's virtual camera settings, I have manually hardcoded the camera parameters in MarkerBasedARSample for now.
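    For reference, one common conversion derives Unity's vertical field of view from the calibrated vertical focal length, fovY = 2·atan(imageHeight / (2·fy)). A hedged sketch, assuming `camMatrix` holds the calibrated intrinsics (an off-centre principal point would additionally need a custom projection matrix, which this ignores):

```csharp
// fy sits at (1, 1) in the 3x3 intrinsic matrix; imageHeight is the calibrated height in pixels.
double fy = camMatrix.get (1, 1) [0];
double imageHeight = imgMat.rows ();  // assumed: same image size as used during calibration
float fovY = 2.0f * Mathf.Atan ((float)(imageHeight / (2.0 * fy))) * Mathf.Rad2Deg;
Camera.main.fieldOfView = fovY;
```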
     
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    I tried this code, but the error did not occur in my environment.
    Unity 5.2.5f1
    Windows 10
    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnitySample
    {
        /// <summary>
        /// WebCamTexture detect face sample.
        /// </summary>
        public class WebCamTextureDetectFaceSample : MonoBehaviour
        {
            /// <summary>
            /// The colors.
            /// </summary>
            Color32[] colors;

            /// <summary>
            /// The gray mat.
            /// </summary>
            Mat grayMat;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The cascade.
            /// </summary>
            CascadeClassifier cascade;

            /// <summary>
            /// The faces.
            /// </summary>
            MatOfRect faces;

            /// <summary>
            /// The web cam texture to mat helper.
            /// </summary>
            WebCamTextureToMatHelper webCamTextureToMatHelper;

            // Template used by TakingPhoto (must be initialized, since TakingPhoto reads it).
            Mat templateMat;

            //public GameObject faceQuad;

            // Use this for initialization
            void Start ()
            {
                webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
                webCamTextureToMatHelper.Init ();

                templateMat = new Mat (100, 100, CvType.CV_8UC4);
                templateMat.setTo (new Scalar (255, 255, 255, 255));
            }

            /// <summary>
            /// Raises the web cam texture to mat helper inited event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInited ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInited");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();

                colors = new Color32[webCamTextureMat.cols () * webCamTextureMat.rows ()];
                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);

                grayMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC1);
                cascade = new CascadeClassifier (Utils.getFilePath ("lbpcascade_frontalface.xml"));
                //cascade = new CascadeClassifier (Utils.getFilePath ("haarcascade_frontalface_alt.xml"));
                if (cascade.empty ()) {
                    Debug.LogError ("cascade file is not loaded. Please copy from “OpenCVForUnity/StreamingAssets/” to “Assets/StreamingAssets/” folder.");
                }
                faces = new MatOfRect ();

                gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);

                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

                float width = gameObject.transform.localScale.x;
                float height = gameObject.transform.localScale.y;

                float widthScale = (float)Screen.width / width;
                float heightScale = (float)Screen.height / height;
                if (widthScale < heightScale) {
                    Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
                } else {
                    Camera.main.orthographicSize = height / 2;
                }

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
            }

            /// <summary>
            /// Raises the web cam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                if (grayMat != null)
                    grayMat.Dispose ();
                if (cascade != null)
                    cascade.Dispose ();
                if (faces != null)
                    faces.Dispose ();
            }

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.isPlaying () && webCamTextureToMatHelper.didUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

                    //convert image to greyscale
                    using (Mat equalizeHistMat = new Mat ())
                    using (MatOfRect faces = new MatOfRect ()) {
                        Imgproc.equalizeHist (grayMat, equalizeHistMat);

                        cascade.detectMultiScale (equalizeHistMat, faces, 1.1f, 2, 0
                            | Objdetect.CASCADE_FIND_BIGGEST_OBJECT,
                            new OpenCVForUnity.Size (equalizeHistMat.rows () * 0.25f, equalizeHistMat.rows () * 0.25f), new Size ());

                        if (faces.rows () > 0) {
                            OpenCVForUnity.Rect[] rects = faces.toArray ();
                            TakingPhoto (rgbaMat.clone (), rects [0]);

                            Imgproc.rectangle (rgbaMat, new Point (rects [0].x, rects [0].y), new Point (rects [0].x + rects [0].width, rects [0].y + rects [0].height), new Scalar (255, 0, 0, 255), 2);
                        }
                    }
                    Utils.matToTexture2D (rgbaMat, texture, colors);
                }
            }

            void TakingPhoto (Mat mat, OpenCVForUnity.Rect rect)
            {
                Mat m = new Mat (mat, rect);
                Mat t = new Mat (new Size (templateMat.width (), templateMat.height ()), templateMat.type ());

                templateMat.copyTo (t);
                Imgproc.resize (t, t, new Size (rect.width, rect.height));

                Debug.Log ("m " + m.ToString ());

                try {
                    Color32[] c = new Color32[m.cols () * m.rows ()];
                    Texture2D t2d = new Texture2D (m.cols (), m.rows (), TextureFormat.RGBA32, false);

                    for (int col = 0; col < m.cols (); col++) {
                        for (int row = 0; row < m.rows (); row++) {
                            double[] rgba = m.get (row, col);
                            rgba [3] = t.get (row, col) [3];
                            rgba [0] = rgba [0] * 1.25f + 10;
                            rgba [1] = rgba [1] * 1.25f + 10;
                            rgba [2] = rgba [2] * 1.25f + 10;
                            m.put (row, col, rgba);
                        }
                    }

                    Utils.matToTexture2D (m, t2d, c);

                    //faceQuad.GetComponent<Renderer> ().material.mainTexture = t2d;
                } catch (System.Exception e) {
                    Debug.LogError (e.ToString ());
                }

                t.Dispose ();
                m.Dispose ();
                mat.Dispose ();
            }

            /// <summary>
            /// Raises the disable event.
            /// </summary>
            void OnDisable ()
            {
                webCamTextureToMatHelper.Dispose ();
            }

            /// <summary>
            /// Raises the back button event.
            /// </summary>
            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnitySample");
    227.             #else
    228.             Application.LoadLevel ("OpenCVForUnitySample");
    229.             #endif
    230.         }
    231.  
    232.         /// <summary>
    233.         /// Raises the play button event.
    234.         /// </summary>
    235.         public void OnPlayButton ()
    236.         {
    237.             webCamTextureToMatHelper.Play ();
    238.         }
    239.  
    240.         /// <summary>
    241.         /// Raises the pause button event.
    242.         /// </summary>
    243.         public void OnPauseButton ()
    244.         {
    245.             webCamTextureToMatHelper.Pause ();
    246.         }
    247.  
    248.         /// <summary>
    249.         /// Raises the stop button event.
    250.         /// </summary>
    251.         public void OnStopButton ()
    252.         {
    253.             webCamTextureToMatHelper.Stop ();
    254.         }
    255.  
    256.         /// <summary>
    257.         /// Raises the change camera button event.
    258.         /// </summary>
    259.         public void OnChangeCameraButton ()
    260.         {
    261.             webCamTextureToMatHelper.Init (null, webCamTextureToMatHelper.requestWidth, webCamTextureToMatHelper.requestHeight, !webCamTextureToMatHelper.requestIsFrontFacing);
    262.         }
    263.     }
    264. }
     
  27. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Thank you very much for reporting.
    Please change the following line.
    -using MarkerBasedARSample;
    +using OpenCVMarkerBasedAR;
    https://github.com/EnoxSoftware/Car...mmit/6a41a4b3ff5921d155483930eb023ab370a5b7a8
     
  28. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    The img argument of Imgproc.grabCut() must be an 8-bit, 3-channel input image.
    http://docs.opencv.org/3.1.0/d7/d1b...#ga909c1dda50efcbeaa3ce126be862b37f&gsc.tab=0
    Code (CSharp):
    1.                        Texture2D imgTexture = Resources.Load ("lena") as Texture2D;
    2.  
    3.                         Mat image = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC3);
    4.  
    5.                         Utils.texture2DToMat (imgTexture, image);
    6.                         Debug.Log ("image ToString " + image.ToString ());
    7.  
    8.  
    9.                         var rectangle = new OpenCVForUnity.Rect (image.width () / 4, image.height () / 4, 3 * image.width () / 4, 3 * image.height () / 4);
    10.                         Mat result = new Mat ();
    11.                         Mat bgdModel = new Mat ();
    12.                         Mat fgdModel = new Mat ();
    13.                         Mat source = new Mat (1, 1, CvType.CV_8U, new Scalar (3));
    14.                         Imgproc.grabCut (image, result, rectangle, bgdModel, fgdModel, 8, Imgproc.GC_INIT_WITH_RECT);
    15.                         Core.compare (result, source, result, Core.CMP_EQ);
    16.  
    17.                         Debug.Log ("result ToString " + result.ToString ());
    18.  
    19.  
    20.                         Texture2D texture = new Texture2D (result.cols (), result.rows (), TextureFormat.RGBA32, false);
    21.                         Utils.matToTexture2D (result, texture);
    22.                         gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
     
  29. Altezio

    Altezio

    Joined:
    Jan 6, 2016
    Posts:
    6

    Attached Files:

  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
  31. Altezio

    Altezio

    Joined:
    Jan 6, 2016
    Posts:
    6
  32. chelnok

    chelnok

    Joined:
    Jul 2, 2012
    Posts:
    680
    I don't have OpenCV, but I recognised from the screenshot that the problem is part of the Google Cardboard SDK. You are missing CardboardPreRender.cs, which should be found in assets/cardboard/scripts, unless you have a new version of the (Google Cardboard) SDK. Perhaps OpenCV is using an older version of the (Cardboard) SDK, and you have downloaded the new one?
     
  33. MoCoder

    MoCoder

    Joined:
    Dec 4, 2012
    Posts:
    7
    Well, it happens once in a while; I'll check it again. Thanks.
     
  34. Deleted User

    Deleted User

    Guest

    Thanks it works awesome! In addition to this, how can I send a matrix from the C++ dll to C# (the opposite thing)? thanks
     
  35. prasetion

    prasetion

    Joined:
    Apr 3, 2014
    Posts:
    28
    Hello EnoxSoftware, I have read some posts in this forum but I still cannot fix the orientation in portrait mode. I have imported the new FaceTrackerARSample package from the Asset Store, but it is not working for me. I have changed the request width and height according to the portrait resolution. Can you give me a clue or a reference that solves the problem? Maybe I missed something in this forum. Thanks.

    FYI, I use FaceTrackerARSample 1.1.4, OpenCV for Unity 2.0.3 and Unity 5.3.1.
     
    Last edited: Jun 16, 2016
  36. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    When you set portrait mode, what kind of problem occurs?

    Please set the request width and height according to the landscape resolution in any orientation mode.

    facetracker_orientation.PNG facetracker_webcamhelper.PNG Screenshot_20160616-220717.png
     
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Code (CSharp):
    1. //C# side
    2. int width = 512;
    3. int height = 512;
    4. int bytesPerPixel = 3;
    5. byte[] dstArray = new byte[width * height * bytesPerPixel];
    6. GCHandle arrayHandle = GCHandle.Alloc (dstArray, GCHandleType.Pinned);
    7. Native_GetMat (arrayHandle.AddrOfPinnedObject (), width, height, bytesPerPixel);
    8. arrayHandle.Free ();
    9.  
    10. [DllImport ("dllname")]
    11. private static extern void Native_GetMat (IntPtr byteArray, int width, int height, int bytesPerPixel);
    12.  
    13.  
    14.  
    15. //Your C++ dll side
    16. extern "C" UNITY_INTERFACE_EXPORT void UNITY_INTERFACE_API Native_GetMat( unsigned char* byteArray, int width, int height, int bytesPerPixel) {
    17.    cv::Mat srcMat( height, width, CV_8UC3);
    18.  
    19.    memcpy( byteArray, srcMat.data, width*height*bytesPerPixel);
    20. }
     
  38. Lanre

    Lanre

    Joined:
    Dec 26, 2013
    Posts:
    3,973
    Hi @EnoxSoftware and @iPhoneCoder! It has been a while since we last spoke, but I am currently working on updating NatCam to v1.3, and then I will start creating YouTube videos and samples. I will be working closely with @EnoxSoftware to create tighter integration between NatCam and OpenCVForUnity. Keep up the great work!

    @iPhoneCoder I will be working on converting the FaceTracker sample to use NatCam. You can head over to the forum page to post there.
     
  39. prasetion

    prasetion

    Joined:
    Apr 3, 2014
    Posts:
    28
    Thanks for the reply. I meant: is it possible for the captured video image to be fullscreen in portrait mode? Changing the request width and height alone did not solve the problem, but after I updated the sample project from the Asset Store it became fullscreen once I changed the request width and height; it worked like a charm. Sorry if my question was not clear.

    thanks
     
    Last edited: Jun 21, 2016
    ina likes this.
  40. Altezio

    Altezio

    Joined:
    Jan 6, 2016
    Posts:
    6
    Sorry, I was busy with work and only now could come back to this project. Yes, I downloaded the new Cardboard assets. Is that the problem? Can you think of a solution?
     
  41. pepone3d

    pepone3d

    Joined:
    Jul 5, 2012
    Posts:
    27
    Heyho! I love your new 'Auto Reset Mode' (exactly what I was waiting for), but can I get rid of the red and blue boxes being always on now?
     
  42. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
  43. chelnok

    chelnok

    Joined:
    Jul 2, 2012
    Posts:
    680
    Well, if the problem is with the version, you can download and try an older version of the Cardboard SDK: https://github.com/googlevr/gvr-unity-sdk/releases

    But I guess @EnoxSoftware should confirm that this is the problem and tell you which version should be used. All I can tell is that Cardboard SDK version 0.6 works with Android 4.1 (Jelly Bean) and 0.7 with 4.4 (KitKat), and I guess 0.8 (new) is targeted mostly at Daydream-ready devices, as the name is no longer Cardboard SDK; the new name is GoogleVR.
     
  44. mfahadminhas

    mfahadminhas

    Joined:
    Jun 22, 2016
    Posts:
    1
    Hi,



    Can we make an AR application like the one shown in the link with the help of OpenCV for Unity?

    OpenCV would need to help with the following:
    - detect which region is coloured, and with what colour;
    - manipulate the texture in real time, i.e. put that colour in the same region of the texture so that it can be applied to the 3D model.

    Can we do these two tasks with the help of OpenCV for Unity?
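    For the first task, plain colour thresholding may already be enough. A rough NumPy sketch (outside Unity; the function name and tolerance are my own, just to illustrate the idea) of picking out pixels near a target colour:

```python
import numpy as np

def color_region_mask(img, color, tol=40):
    """Boolean mask of pixels within `tol` of `color` per RGB channel.
    img: HxWx3 uint8 array, color: (r, g, b) tuple."""
    diff = np.abs(img.astype(int) - np.array(color, dtype=int))
    return (diff <= tol).all(axis=-1)

# Toy 2x2 image: top row red-ish, bottom row blue-ish
img = np.array([[[250, 10, 5], [240, 20, 15]],
                [[10, 5, 250], [0, 12, 245]]], dtype=np.uint8)
mask = color_region_mask(img, (255, 0, 0))  # True only for the red-ish row
```

    In practice an HSV threshold (Imgproc.cvtColor + Core.inRange in the asset) is usually more robust to lighting than raw RGB distance.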
     
  45. Deleted User

    Deleted User

    Guest

    Does your solvePnP implementation differ from the C++ one? I am implementing my own algorithm inside a native C++ plugin and returning to Unity the rotation and translation matrices from solvePnP. As soon as I set the camera pose based on that rotation and translation, my virtual objects in the scene appear at the wrong position.
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
  47. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    I think it is probably feasible with the following procedure:
    1. Find the four corner points.
    2. Get a top-down view of the image: http://www.pyimagesearch.com/2014/08/25/4-point-opencv-getperspective-transform-example/
    3. Utils.matToTexture2D()
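    Step 2 boils down to solving for the 3x3 homography that maps the four detected corners onto a rectangle. A minimal NumPy sketch of what getPerspectiveTransform computes (the corner coordinates here are made up for illustration):

```python
import numpy as np

def get_perspective_transform(src, dst):
    """Solve the 8x8 linear system for the homography H (h33 = 1)
    mapping 4 src points onto 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply H to a 2D point in homogeneous coordinates."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Detected corners (top-left, top-right, bottom-right, bottom-left)
src = [(57, 30), (470, 66), (450, 400), (40, 370)]
# Target top-down rectangle
dst = [(0, 0), (400, 0), (400, 300), (0, 300)]
H = get_perspective_transform(src, dst)
```

    In the asset itself you would call Imgproc.getPerspectiveTransform and Imgproc.warpPerspective on the Mat instead; this only shows the math behind step 2.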
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,565
    Because the C++ solvePnP() method is called when the "OpenCV for Unity" solvePnP() method is called, the same result as in C++ is returned.
    Please refer to "MarkerBasedARSample" and "FaceTrackerARSample" for the conversion of solvePnP's results.
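    As a rough illustration of the kind of axis conversion involved (a sketch of one common convention only, assuming OpenCV's right-handed, y-down camera frame and Unity's left-handed, y-up frame; the asset's ARUtils class does the real extraction):

```python
import numpy as np

# Flip matrix: negating the y axis moves between OpenCV's right-handed,
# y-down camera frame and Unity's left-handed, y-up frame.
F = np.diag([1.0, -1.0, 1.0])

def opencv_pose_to_unity(R, t):
    """Convert a solvePnP rotation matrix R (3x3, after Rodrigues) and
    translation t (3,) into the equivalent Unity camera-space pose."""
    return F @ R @ F, F @ t
```

    Getting the projection matrix of the rendering camera to match the OpenCV camera intrinsics matters just as much as the pose conversion itself.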
     
  49. Deleted User

    Deleted User

    Guest

    Hello, I saw your projects that include solvePnP from your asset. The problem is that I am using the SURF algorithm for an AR application I am working on (my thesis), and you don't include SIFT/SURF. So I needed to create a native plugin (a C++ DLL) in order to use SURF. I also execute the solvePnP method from that plugin, and then, based on the produced rotation (Rodrigues to rotation matrix, etc.) and translation, I return them to Unity3D. The next step is to create the camera pose (a 4x4 matrix with R and t in homogeneous coordinates); then, using your ARUtils class, the rotation/translation/scale is extracted and placed on the camera transform. The results are bad :(... Is there any conversion of those values needed in order for them to work properly in Unity3D?
     
  50. Deleted User

    Deleted User

    Guest

    I just figured out that I had to use the appropriate projection matrix and set it on the AR camera. Cheers! :)