[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    That fixed the problem, but now I think warping is probably not what I want to achieve.

    My goal is to be able to deform/morph a point (x,y) and its surrounding area to another location, moving all the pixels with them. Imagine having points on a small circle and stretching it to match a different shape. How would you do that?

    EDIT: Is it possible to do something like this but within OpenCV directly? https://github.com/cxcxcxcx/imgwarp-opencv

    EDIT2: Would that be achievable using: https://forum.unity3d.com/threads/released-opencv-for-unity.277080/page-19#post-2869425 ?
     
    Last edited: Aug 1, 2017
  2. intelligent2610

    Joined:
    Nov 26, 2016
    Posts:
    2
    Can I use "OpenCV for Unity" to detect hands and Vuforia to detect an image marker at the same time?
     
  3. -Aymeric-

    Joined:
    Oct 21, 2014
    Posts:
    110
    Hello,

    It seems this plugin doesn't work in a thread, does it? How much work would it take to make it usable from a thread? Since OpenCV operations are rather slow, that would be a top-notch feature!
    Unity can't use threads with scripts derived from MonoBehaviour, but that shouldn't apply to 99% of the plugin!
    Have you considered it @EnoxSoftware?
     
  4. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Complex warping is possible using the ThinPlateSplineShapeTransformer class.
    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnitySample
    {
        /// <summary>
        /// ThinPlateSplineShapeTransformer example.
        /// </summary>
        public class ThinPlateSplineShapeTransformerExample : MonoBehaviour
        {
            // Use this for initialization
            void Start ()
            {
                Utils.setDebugMode (true);

                Texture2D imgTexture = Resources.Load ("input") as Texture2D;

                Mat img = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC4);

                Utils.texture2DToMat (imgTexture, img);
                Debug.Log ("imgMat.ToString() " + img.ToString ());

                OpenCVForUnity.ThinPlateSplineShapeTransformer tps = Shape.createThinPlateSplineShapeTransformer (0);

                // The first three points and the last one pin the image corners in place;
                // the three inner points define the deformation.
                MatOfPoint2f sourcePoints = new MatOfPoint2f (
                    new Point (0, 0),
                    new Point (799, 0),
                    new Point (0, 599),

                    new Point (200, 200),
                    new Point (400, 200),
                    new Point (300, 300),

                    new Point (799, 599)
                );
                MatOfPoint2f targetPoints = new MatOfPoint2f (
                    new Point (0, 0),
                    new Point (799, 0),
                    new Point (0, 599),

                    new Point (100, 200),
                    new Point (450, 100),
                    new Point (300, 500),

                    new Point (799, 599)
                );
                MatOfDMatch matches = new MatOfDMatch (
                    new DMatch (0, 0, 0),
                    new DMatch (1, 1, 0),
                    new DMatch (2, 2, 0),
                    new DMatch (3, 3, 0),
                    new DMatch (4, 4, 0),
                    new DMatch (5, 5, 0),
                    new DMatch (6, 6, 0)
                );

                //http://stackoverflow.com/questions/32207085/shape-transformers-and-interfaces-opencv3-0
                Core.transpose (sourcePoints, sourcePoints);
                Core.transpose (targetPoints, targetPoints);

                Debug.Log ("sourcePoints " + sourcePoints.ToString ());
                Debug.Log ("targetPoints " + targetPoints.ToString ());

                tps.estimateTransformation (targetPoints, sourcePoints, matches);

                MatOfPoint2f transPoints = new MatOfPoint2f ();
                tps.applyTransformation (sourcePoints, transPoints);

                Debug.Log ("sourcePoints " + sourcePoints.dump ());
                Debug.Log ("targetPoints " + targetPoints.dump ());
                Debug.Log ("transPoints " + transPoints.dump ());

                Mat res = new Mat ();

                tps.warpImage (img, res);

                //plot points
                Point[] sourcePointsArray = sourcePoints.toArray ();
                Point[] targetPointsArray = targetPoints.toArray ();
                for (int i = 0; i < sourcePointsArray.Length; i++) {
                    Imgproc.arrowedLine (res, sourcePointsArray [i], targetPointsArray [i], new Scalar (255, 255, 0, 255), 3, Imgproc.LINE_AA, 0, 0.2);

                    Imgproc.circle (res, sourcePointsArray [i], 10, new Scalar (255, 0, 0, 255), -1);
                    Imgproc.circle (res, targetPointsArray [i], 10, new Scalar (0, 0, 255, 255), -1);
                }

                Texture2D texture = new Texture2D (res.cols (), res.rows (), TextureFormat.RGBA32, false);

                Utils.matToTexture2D (res, texture);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;

                Utils.setDebugMode (false);
            }

            // Update is called once per frame
            void Update ()
            {
            }

            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnitySample");
                #else
                Application.LoadLevel ("OpenCVForUnitySample");
                #endif
            }
        }
    }
    ThinPlateSplineShapeTransformer.PNG
     
  5. neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    I tried that, but it doesn't seem to work well.

    Here is a simplified snippet:

    Code (CSharp):
    void Start () {
        //.... Other stuff
        InitFaceSpline ();
    }

    void InitFaceSpline ()
    {
        for (int i = 0; i < 68; i++) {
            psDest.Add (new Point (MaskPoints [i].x, MaskPoints [i].y));
            psMatches.Add (new DMatch (i, i, 0));
        }

        tps = Shape.createThinPlateSplineShapeTransformer (0);
        sourcePoints = new MatOfPoint2f (psSrc.ToArray ());
        matches = new OpenCVForUnity.MatOfDMatch (psMatches.ToArray ());

        targetPoints = new MatOfPoint2f (psDest.ToArray ());
    }

    void Update () {
        // Other dlib stuff
        if (landmarkPoints.Count > 0) {
            psSrc.Clear ();
            psSrc = OpenCVForUnityUtils.DrawFaceLandmarkPoints (rgbaMat, landmarkPoints [0], new Scalar (0, 255, 0, 255), 4);

            if (psSrc.Count > 0) {
                sourcePoints = new MatOfPoint2f (psSrc.ToArray ());
                //http://stackoverflow.com/questions/32207085/shape-transformers-and-interfaces-opencv3-0
                Core.transpose (sourcePoints, sourcePoints);
                Core.transpose (targetPoints, targetPoints);

                tps.estimateTransformation (targetPoints, sourcePoints, matches);
                tps.applyTransformation (sourcePoints, transPoints);
                tps.warpImage (rgbaMat, rgbaMat);
            }
        }

        OpenCVForUnity.Utils.matToTexture2D (rgbaMat, texture, colors);
    }
    When I enable this section, the program freezes for a while and then the texture turns white. My goal is to warp the 68 points detected by the dlib landmark detector to another 68 manually defined (static) points.
     
  6. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    OpenCV for Unity works in multiple threads.
    This example shows multi-threaded face detection:
    https://github.com/EnoxSoftware/Ope...ple/WebCamTextureAsyncFaceDetectionExample.cs
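
    For reference, here is a minimal sketch of the general pattern (the texture update is left as a comment because UnityEngine objects may only be touched on the main thread; the input/output Mats are placeholders for your own processing):
    Code (CSharp):
    using System.Threading;
    using UnityEngine;
    using OpenCVForUnity;

    // Minimal sketch: run OpenCV work off the main thread.
    // Only the OpenCV calls run in the worker; any UnityEngine calls
    // (textures, GameObjects) must stay on the main thread.
    public class ThreadedOpenCVSketch : MonoBehaviour
    {
        Mat inputMat;                    // assumed to be filled on the main thread
        Mat resultMat = new Mat ();      // written by the worker
        volatile bool working = false;
        volatile bool resultReady = false;

        void Update ()
        {
            if (resultReady) {
                resultReady = false;
                // e.g. Utils.matToTexture2D (resultMat, texture); // main thread only
            }

            if (!working && inputMat != null) {
                working = true;
                Mat copy = inputMat.clone (); // give the worker its own copy
                ThreadPool.QueueUserWorkItem (_ => {
                    Imgproc.cvtColor (copy, resultMat, Imgproc.COLOR_RGBA2GRAY);
                    copy.Dispose ();
                    resultReady = true;
                    working = false;
                });
            }
        }
    }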
     
  7. cb_emerge

    Joined:
    May 11, 2017
    Posts:
    3
    Hi there,

    I've been trying to run the OpenCV & Unity example on the HoloLens. I've followed the instructions here exactly: https://github.com/EnoxSoftware/HoloLensWithOpenCVForUnityExample but I can't get the application to run successfully on the HoloLens; I keep getting this error when I try to run it from Visual Studio (Release | x86 | Start Debugging).

    Exception thrown: 'System.DllNotFoundException' in Assembly-CSharp.dll
    DllNotFoundException: Unable to load DLL 'opencvforunity': The specified module could not be found. (Exception from HRESULT: 0x8007007E)
    at OpenCVForUnity.Mat.core_Mat_n_1Mat__III(Int32 rows, Int32 cols, Int32 type)
    at HoloLensWithOpenCVForUnityExample.WebCamTextureToMatHelper.<_Initialize>d__24.MoveNext()
    at UnityEngine.SetupCoroutine.InvokeMoveNext(IEnumerator enumerator, IntPtr returnValueAddress)
    at UnityEngine.SetupCoroutine.$Invoke1(Int64 instance, Int64* args)
    at UnityEngine.Internal.$MethodUtility.InvokeMethod(Int64 instance, Int64* args, IntPtr method)
    (Filename: <Unknown> Line: 0)


    I've also tried changing the import settings for the opencvforunity DLL (see screenshot attached), but no luck, still getting the same error.
    Any suggestions on how I can fix this?

    Thanks in advance for your help!
     

    Attached Files:

  8. farzad_fm

    Joined:
    Mar 21, 2014
    Posts:
    1
    Hi, everybody.
    I have a question: how can we use many markerless images, for example 100 of them, without causing issues for CPU processing? (I get my images from the server.) I added 10 markers to the scene with an array, and then it got way too slow.

    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using UnityEngine.UI;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace MarkerLessARExample
    {
        /// <summary>
        /// Marker Less AR example from WebCamTexture.
        /// https://github.com/MasteringOpenCV/code/tree/master/Chapter3_MarkerlessAR by using "OpenCV for Unity"
        /// </summary>
        [RequireComponent(typeof(WebCamTextureToMatHelper))]
        public class WebCamTextureMarkerLessARExample : MonoBehaviour
        {
            Texture2D[] patternTexture = new Texture2D[10];

            /// <summary>
            /// The pattern mat.
            /// </summary>
            Mat[] patternMat = new Mat[10];

            /// <summary>
            /// The pattern raw image.
            /// </summary>
            public RawImage[] patternRawImage;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The web cam texture to mat helper.
            /// </summary>
            WebCamTextureToMatHelper webCamTextureToMatHelper;

            /// <summary>
            /// The gray mat.
            /// </summary>
            Mat grayMat;

            /// <summary>
            /// The AR game object.
            /// </summary>
            public GameObject ARGameObject;

            /// <summary>
            /// The AR camera.
            /// </summary>
            public Camera ARCamera;

            /// <summary>
            /// The cam matrix.
            /// </summary>
            Mat camMatrix;

            /// <summary>
            /// The dist coeffs.
            /// </summary>
            MatOfDouble distCoeffs;

            /// <summary>
            /// The invert Y.
            /// </summary>
            Matrix4x4 invertYM;

            /// <summary>
            /// The transformation m.
            /// </summary>
            Matrix4x4 transformationM;

            /// <summary>
            /// The invert Z.
            /// </summary>
            Matrix4x4 invertZM;

            /// <summary>
            /// The ar m.
            /// </summary>
            Matrix4x4 ARM;

            /// <summary>
            /// The should move AR camera.
            /// </summary>
            public bool shouldMoveARCamera;

            /// <summary>
            /// The pattern.
            /// </summary>
            Pattern[] pattern = new Pattern[10];

            /// <summary>
            /// The pattern tracking info.
            /// </summary>
            PatternTrackingInfo[] patternTrackingInfo = new PatternTrackingInfo[10];

            /// <summary>
            /// The pattern detector.
            /// </summary>
            PatternDetector[] patternDetector = new PatternDetector[10];

            /// <summary>
            /// The is showing axes.
            /// </summary>
            public bool isShowingAxes = false;

            /// <summary>
            /// The is showing axes toggle.
            /// </summary>
            public Toggle isShowingAxesToggle;

            /// <summary>
            /// The axes.
            /// </summary>
            public GameObject axes;

            /// <summary>
            /// The is showing cube.
            /// </summary>
            public bool isShowingCube = false;

            /// <summary>
            /// The is showing cube toggle.
            /// </summary>
            public Toggle isShowingCubeToggle;

            /// <summary>
            /// The cube.
            /// </summary>
            public GameObject cube;

            /// <summary>
            /// The is showing video.
            /// </summary>
            public bool isShowingVideo = false;

            /// <summary>
            /// The is showing video toggle.
            /// </summary>
            public Toggle isShowingVideoToggle;

            /// <summary>
            /// The video.
            /// </summary>
            public GameObject video;

            bool[] patternFound = new bool[10];

            public static string path;

            // Use this for initialization
            void Start ()
            {
                print (Application.persistentDataPath);
                isShowingAxesToggle.isOn = isShowingAxes;
                axes.SetActive (isShowingAxes);
                isShowingCubeToggle.isOn = isShowingCube;
                cube.SetActive (isShowingCube);
                isShowingVideoToggle.isOn = isShowingVideo;
                video.SetActive (isShowingVideo);

                ARGameObject.gameObject.SetActive (false);

                webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
                for (int i = 0; i < PlayerPrefs.GetInt ("length"); i++) {

                    //patternMat[i] = Imgcodecs.imread (Application.persistentDataPath + "/patternImg" + i + ".jpg");
                    patternMat [i] = Imgcodecs.imread (PlayerPrefs.GetString ("path") + "/patternImg" + i + ".jpg");
                    if (patternMat [i].total () == 0) {

                        OnPatternCaptureButton ();
                    } else {

                        Imgproc.cvtColor (patternMat [i], patternMat [i], Imgproc.COLOR_BGR2RGB);

                        patternTexture [i] = new Texture2D (patternMat [i].width (), patternMat [i].height (), TextureFormat.RGBA32, false);
                        Utils.matToTexture2D (patternMat [i], patternTexture [i]);
                        Debug.Log ("patternMat dst ToString " + patternMat [i].ToString ());

                        patternRawImage [i].texture = patternTexture [i];
                        patternRawImage [i].rectTransform.localScale = new Vector3 (1.0f, (float)patternMat [i].height () / (float)patternMat [i].width (), 1.0f);

                        pattern [i] = new Pattern ();
                        patternTrackingInfo [i] = new PatternTrackingInfo ();

                        patternDetector [i] = new PatternDetector (null, null, null, true);

                        patternDetector [i].buildPatternFromImage (patternMat [i], pattern [i]);
                        patternDetector [i].train (pattern [i]);

                        webCamTextureToMatHelper.Init ();
                    }
                }
            }

            /// <summary>
            /// Raises the web cam texture to mat helper inited event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInited ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInited");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();

                texture = new Texture2D (webCamTextureMat.width (), webCamTextureMat.height (), TextureFormat.RGBA32, false);
                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;

                grayMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC1);

                gameObject.transform.localScale = new Vector3 (webCamTextureMat.width (), webCamTextureMat.height (), 1);

                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

                float width = webCamTextureMat.width ();
                float height = webCamTextureMat.height ();

                float imageSizeScale = 1.0f;
                float widthScale = (float)Screen.width / width;
                float heightScale = (float)Screen.height / height;
                if (widthScale < heightScale) {
                    Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
                    imageSizeScale = (float)Screen.height / (float)Screen.width;
                } else {
                    Camera.main.orthographicSize = height / 2;
                }

                //set cameraparam
                int max_d = (int)Mathf.Max (width, height);
                double fx = max_d;
                double fy = max_d;
                double cx = width / 2.0f;
                double cy = height / 2.0f;
                camMatrix = new Mat (3, 3, CvType.CV_64FC1);
                camMatrix.put (0, 0, fx);
                camMatrix.put (0, 1, 0);
                camMatrix.put (0, 2, cx);
                camMatrix.put (1, 0, 0);
                camMatrix.put (1, 1, fy);
                camMatrix.put (1, 2, cy);
                camMatrix.put (2, 0, 0);
                camMatrix.put (2, 1, 0);
                camMatrix.put (2, 2, 1.0f);
                Debug.Log ("camMatrix " + camMatrix.dump ());

                distCoeffs = new MatOfDouble (0, 0, 0, 0);
                Debug.Log ("distCoeffs " + distCoeffs.dump ());

                //calibration camera
                Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
                double apertureWidth = 0;
                double apertureHeight = 0;
                double[] fovx = new double[1];
                double[] fovy = new double[1];
                double[] focalLength = new double[1];
                Point principalPoint = new Point (0, 0);
                double[] aspectratio = new double[1];

                Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);

                Debug.Log ("imageSize " + imageSize.ToString ());
                Debug.Log ("apertureWidth " + apertureWidth);
                Debug.Log ("apertureHeight " + apertureHeight);
                Debug.Log ("fovx " + fovx [0]);
                Debug.Log ("fovy " + fovy [0]);
                Debug.Log ("focalLength " + focalLength [0]);
                Debug.Log ("principalPoint " + principalPoint.ToString ());
                Debug.Log ("aspectratio " + aspectratio [0]);

                //To convert the difference of the FOV value of the OpenCV and Unity.
                double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
                double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));

                Debug.Log ("fovXScale " + fovXScale);
                Debug.Log ("fovYScale " + fovYScale);

                //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
                if (widthScale < heightScale) {
                    ARCamera.fieldOfView = (float)(fovx [0] * fovXScale);
                } else {
                    ARCamera.fieldOfView = (float)(fovy [0] * fovYScale);
                }

                transformationM = new Matrix4x4 ();

                invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1));
                Debug.Log ("invertYM " + invertYM.ToString ());

                invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, 1, -1));
                Debug.Log ("invertZM " + invertZM.ToString ());

                //if WebCamera is frontFacing, flip Mat.
                if (webCamTextureToMatHelper.GetWebCamDevice ().isFrontFacing) {
                    webCamTextureToMatHelper.flipHorizontal = true;
                }
            }

            /// <summary>
            /// Raises the web cam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                if (grayMat != null)
                    grayMat.Dispose ();
            }

            /// <summary>
            /// Raises the web cam texture to mat helper error occurred event.
            /// </summary>
            /// <param name="errorCode">Error code.</param>
            public void OnWebCamTextureToMatHelperErrorOccurred (WebCamTextureToMatHelper.ErrorCode errorCode)
            {
                Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
            }

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

                    for (int i = 0; i < PlayerPrefs.GetInt ("length"); i++) {

                        patternFound [i] = patternDetector [i].findPattern (grayMat, patternTrackingInfo [i]);

                        if (patternFound [i]) {
                            //Debug.Log ("patternFound " + i);
                            patternTrackingInfo [i].computePose (pattern [i], camMatrix, distCoeffs);

                            //Marker to Camera Coordinate System Convert Matrix
                            transformationM = patternTrackingInfo [i].pose3d;
                            //Debug.Log ("transformationM " + transformationM.ToString ());

                            if (shouldMoveARCamera) {
                                ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;
                                //Debug.Log ("ARM " + ARM.ToString ());

                                ARUtils.SetTransformFromMatrix (ARCamera.transform, ref ARM);
                            } else {

                                ARM = ARCamera.transform.localToWorldMatrix * invertYM * transformationM * invertZM;
                                //Debug.Log ("ARM " + ARM.ToString ());

                                ARUtils.SetTransformFromMatrix (ARGameObject.transform, ref ARM);
                            }

                            ARGameObject.GetComponent<DelayableSetActive> ().SetActive (true);
                        } else {

                            ARGameObject.GetComponent<DelayableSetActive> ().SetActive (false, 0.5f);
                        }
                    }

                    Utils.matToTexture2D (rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors ());
                }
            }

            /// <summary>
            /// Raises the disable event.
            /// </summary>
            void OnDisable ()
            {
                webCamTextureToMatHelper.Dispose ();
                for (int i = 0; i < PlayerPrefs.GetInt ("length"); i++) {
                    if (patternMat [i] != null)
                        patternMat [i].Dispose ();
                }
            }

            /// <summary>
            /// Raises the back button event.
            /// </summary>
            public void OnBackButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("MarkerLessARExample");
                #else
                Application.LoadLevel ("MarkerLessARExample");
                #endif
            }

            /// <summary>
            /// Raises the play button event.
            /// </summary>
            public void OnPlayButton ()
            {
                webCamTextureToMatHelper.Play ();
            }

            /// <summary>
            /// Raises the pause button event.
            /// </summary>
            public void OnPauseButton ()
            {
                webCamTextureToMatHelper.Pause ();
            }

            /// <summary>
            /// Raises the stop button event.
            /// </summary>
            public void OnStopButton ()
            {
                webCamTextureToMatHelper.Stop ();
            }

            /// <summary>
            /// Raises the change camera button event.
            /// </summary>
            public void OnChangeCameraButton ()
            {
                webCamTextureToMatHelper.Init (null, webCamTextureToMatHelper.requestWidth, webCamTextureToMatHelper.requestHeight, !webCamTextureToMatHelper.requestIsFrontFacing);
            }

            /// <summary>
            /// Raises the is showing axes toggle event.
            /// </summary>
            public void OnIsShowingAxesToggle ()
            {
                if (isShowingAxesToggle.isOn) {
                    isShowingAxes = true;
                } else {
                    isShowingAxes = false;
                }
                axes.SetActive (isShowingAxes);
            }

            /// <summary>
            /// Raises the is showing cube toggle event.
            /// </summary>
            public void OnIsShowingCubeToggle ()
            {
                if (isShowingCubeToggle.isOn) {
                    isShowingCube = true;
                } else {
                    isShowingCube = false;
                }
                cube.SetActive (isShowingCube);
            }

            /// <summary>
            /// Raises the is showing video toggle event.
            /// </summary>
            public void OnIsShowingVideoToggle ()
            {
                if (isShowingVideoToggle.isOn) {
                    isShowingVideo = true;
                } else {
                    isShowingVideo = false;
                }
                video.SetActive (isShowingVideo);
            }

            /// <summary>
            /// Raises the pattern capture button event.
            /// </summary>
            public void OnPatternCaptureButton ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("CapturePattern");
                #else
                Application.LoadLevel ("CapturePattern");
                #endif
            }
        }
    }
     
    Last edited: Aug 5, 2017
  9. iddqd

    Joined:
    Apr 14, 2012
    Posts:
    501
    Hi, if I understood correctly, OpenCV also includes FFmpeg. Does your asset include FFmpeg as well?
    Thanks
     
  10. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    To display native OpenCV error codes, please enclose your code between Utils.setDebugMode (true) and Utils.setDebugMode (false).
    Are any errors displayed in the console?
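
    For example, a minimal sketch of the wrapping (the Mats here are placeholders for your own code):
    Code (CSharp):
    using UnityEngine;
    using OpenCVForUnity;

    public class DebugModeSketch : MonoBehaviour
    {
        void Start ()
        {
            Utils.setDebugMode (true); // native OpenCV errors are now reported to the Unity console

            using (Mat rgbaMat = new Mat (480, 640, CvType.CV_8UC4))
            using (Mat grayMat = new Mat ()) {
                Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY); // the calls you want checked
            }

            Utils.setDebugMode (false); // restore the default behavior
        }
    }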
     
  11. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    It seems the "import settings" are not correct.
    Please make sure they are set as follows.
    MenuItem.png
    uwp_setting.PNG
     
  12. neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    No error whatsoever, just a white texture and an unresponsive application.

    If I comment out the following lines:

    Code (CSharp):
    Core.transpose (sourcePoints, sourcePoints);
    Core.transpose (targetPoints, targetPoints);
    the whole application becomes much more responsive, but the texture is still white.

    So, I have 68 facial points, and I warp them to another 68 points (placed manually); the order and meaning are the same (I followed this diagram for dlib facial landmark points: https://sourceforge.net/p/dclib/dis.../a98e/attachment/points_S022_006_00000017.png ), but the final result is a white mat. I would expect some horribly messed-up image, not a completely white one :/

    Any clue?
     
  13. cb_emerge

    Joined:
    May 11, 2017
    Posts:
    3
    Hi there,

    Thanks very much for your reply. I fixed the import settings as you suggested, and now I can run the AR marker sample on the HoloLens. However, the cube never seems to be placed right on top of the marker the way it is in your demo video.

    Instead, either part of it appears below the marker or the whole cube is significantly above the marker, depending on how I move my head (up/down). Any suggestions on how to fix this?

    Thanks! Capture.JPG
     
  14. chliebel

    Joined:
    Aug 8, 2017
    Posts:
    1
    Hi, I’m trying to run the HoloLens sample and it works so far, but the performance is tremendously slow. In your demo video, there’s a framerate of ~20 fps. When I try it myself, I get a frame rate of barely 5 fps. What am I missing? Quality is already set to Fastest, also tried a Release build…

    In addition, have you built a new example since HoloLensCameraStream closed their issue? It seems to be related to performance as well.

    Thanks!
     
    Last edited: Aug 8, 2017
  15. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    You need to customize these values.
    https://github.com/EnoxSoftware/Hol...rUcoExample/HoloLensArUcoExample.cs#L162-L174
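
    For illustration, those lines set up the camera intrinsic parameters. A minimal sketch of the structure, with placeholder numbers rather than measured values (fx, fy, cx, cy must be tuned or calibrated for your own device):
    Code (CSharp):
    using OpenCVForUnity;

    public static class CameraParamSketch
    {
        // Sketch only: fx, fy, cx, cy below are placeholders, not HoloLens values.
        public static Mat CreateCamMatrix ()
        {
            double fx = 1035.0, fy = 1035.0; // focal length in pixels (placeholder)
            double cx = 640.0, cy = 360.0;   // principal point (placeholder)

            Mat camMatrix = new Mat (3, 3, CvType.CV_64FC1);
            camMatrix.put (0, 0, fx); camMatrix.put (0, 1, 0);  camMatrix.put (0, 2, cx);
            camMatrix.put (1, 0, 0);  camMatrix.put (1, 1, fy); camMatrix.put (1, 2, cy);
            camMatrix.put (2, 0, 0);  camMatrix.put (2, 1, 0);  camMatrix.put (2, 2, 1.0);
            return camMatrix;
        }
    }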
     
  16. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Have you built the example using the same version environment?
    Unity 5.6.1f1
    HoloToolKit v1.5.7.0

    Also, there is no new example using OpenCVforUnity with HoloLensCameraStream.
     
  17. josegal

    Joined:
    Jul 17, 2017
    Posts:
    1
    Greetings. Could someone tell me if it is possible on Android to load an mp4 video, process it with OpenCV, and save it as mp4 again? I tried, but I could only load an mp4 video in the Unity editor and was unable to save it as a new video file.
     
  18. neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
  19. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    This code works without a problem.
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    using System;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;
    using DlibFaceLandmarkDetector;

    namespace DlibFaceLandmarkDetectorExample
    {
        /// <summary>
        /// Texture2DToMat example.
        /// An example of using "Dlib FaceLandmark Detector" together with "OpenCV for Unity".
        /// </summary>
        public class Texture2DToMatExample : MonoBehaviour
        {
            /// <summary>
            /// The image texture.
            /// </summary>
            public Texture2D imgTexture;

            /// <summary>
            /// The shape_predictor_68_face_landmarks_dat_filepath.
            /// </summary>
            string shape_predictor_68_face_landmarks_dat_filepath;

            #if UNITY_WEBGL && !UNITY_EDITOR
            Stack<IEnumerator> coroutines = new Stack<IEnumerator> ();
            #endif

            // Use this for initialization
            void Start ()
            {
                #if UNITY_WEBGL && !UNITY_EDITOR
                var getFilePath_Coroutine = DlibFaceLandmarkDetector.Utils.getFilePathAsync ("shape_predictor_68_face_landmarks.dat", (result) => {
                    coroutines.Clear ();

                    shape_predictor_68_face_landmarks_dat_filepath = result;
                    Run ();
                });
                coroutines.Push (getFilePath_Coroutine);
                StartCoroutine (getFilePath_Coroutine);
                #else
                shape_predictor_68_face_landmarks_dat_filepath = DlibFaceLandmarkDetector.Utils.getFilePath ("shape_predictor_68_face_landmarks.dat");
                Run ();
                #endif
            }

            private void Run ()
            {
                OpenCVForUnity.Utils.setDebugMode (true);

                Mat imgMat = new Mat (imgTexture.height, imgTexture.width, CvType.CV_8UC4);

                // Convert Unity Texture2D to OpenCV Mat.
                OpenCVForUnity.Utils.texture2DToMat (imgTexture, imgMat);
                Debug.Log ("imgMat dst ToString " + imgMat.ToString ());

                gameObject.transform.localScale = new Vector3 (imgTexture.width, imgTexture.height, 1);
                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

                float width = imgMat.width ();
                float height = imgMat.height ();

                float widthScale = (float)Screen.width / width;
                float heightScale = (float)Screen.height / height;
                if (widthScale < heightScale) {
                    Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
                } else {
                    Camera.main.orthographicSize = height / 2;
                }

                FaceLandmarkDetector faceLandmarkDetector = new FaceLandmarkDetector (shape_predictor_68_face_landmarks_dat_filepath);

                OpenCVForUnityUtils.SetImage (faceLandmarkDetector, imgMat);

                Mat res = new Mat ();

                //detect face rect detections
                List<FaceLandmarkDetector.RectDetection> detectResult = faceLandmarkDetector.DetectRectDetection ();

                foreach (var result in detectResult) {
                    Debug.Log ("rect : " + result.rect);
                    Debug.Log ("detection_confidence : " + result.detection_confidence);
                    Debug.Log ("weight_index : " + result.weight_index);

                    //Imgproc.putText (imgMat, "" + result.detection_confidence, new Point (result.rect.xMin, result.rect.yMin - 20), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);
                    //Imgproc.putText (imgMat, "" + result.weight_index, new Point (result.rect.xMin, result.rect.yMin - 5), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);

                    //detect landmark points
                    List<Vector2> points = faceLandmarkDetector.DetectLandmark (result.rect);

                    //Debug.Log ("face points count : " + points.Count);
                    ////draw landmark points
                    //OpenCVForUnityUtils.DrawFaceLandmark (imgMat, points, new Scalar (0, 255, 0, 255), 2);
                    ////draw face rect
                    //OpenCVForUnityUtils.DrawFaceRect (imgMat, result.rect, new Scalar (255, 0, 0, 255), 2);

                    OpenCVForUnity.ThinPlateSplineShapeTransformer tps = Shape.createThinPlateSplineShapeTransformer (0);

                    // The four image corners are added to pin the border in place.
                    List<Point> listSourcePoints = new List<Point> ();
                    listSourcePoints.Add (new Point (0, 0));
                    listSourcePoints.Add (new Point (imgMat.width (), 0));
                    listSourcePoints.Add (new Point (0, imgMat.height ()));
                    listSourcePoints.Add (new Point (imgMat.width (), imgMat.height ()));
                    for (int i = 0; i < points.Count; i++) {
                        listSourcePoints.Add (new Point (points [i].x, points [i].y));
                    }
                    MatOfPoint2f sourcePoints = new MatOfPoint2f (listSourcePoints.ToArray ());

                    List<Point> listTargetPoints = new List<Point> ();
                    listTargetPoints.Add (new Point (0, 0));
                    listTargetPoints.Add (new Point (imgMat.width (), 0));
                    listTargetPoints.Add (new Point (0, imgMat.height ()));
                    listTargetPoints.Add (new Point (imgMat.width (), imgMat.height ()));
                    for (int i = 0; i < points.Count; i++) {
                        listTargetPoints.Add (new Point (points [i].x + UnityEngine.Random.Range (-5, 5), points [i].y + UnityEngine.Random.Range (-5, 5)));
                    }
                    MatOfPoint2f targetPoints = new MatOfPoint2f (listTargetPoints.ToArray ());

                    List<DMatch> listDMatchs = new List<DMatch> ();
                    for (int i = 0; i < points.Count + 4; i++) {
                        listDMatchs.Add (new DMatch (i, i, 0));
                    }
                    MatOfDMatch matches = new MatOfDMatch (listDMatchs.ToArray ());

                    //http://stackoverflow.com/questions/32207085/shape-transformers-and-interfaces-opencv3-0
                    Core.transpose (sourcePoints, sourcePoints);
                    Core.transpose (targetPoints, targetPoints);

                    Debug.Log ("sourcePoints " + sourcePoints.ToString ());
                    Debug.Log ("targetPoints " + targetPoints.ToString ());

                    tps.estimateTransformation (targetPoints, sourcePoints, matches);

                    MatOfPoint2f transPoints = new MatOfPoint2f ();
                    tps.applyTransformation (sourcePoints, transPoints);

                    Debug.Log ("sourcePoints " + sourcePoints.dump ());
                    Debug.Log ("targetPoints " + targetPoints.dump ());
                    Debug.Log ("transPoints " + transPoints.dump ());

                    tps.warpImage (imgMat, res);

                    //plot points
                    Point[] sourcePointsArray = sourcePoints.toArray ();
                    Point[] targetPointsArray = targetPoints.toArray ();
                    for (int i = 0; i < sourcePointsArray.Length; i++) {
                        Imgproc.arrowedLine (res, sourcePointsArray [i], targetPointsArray [i], new Scalar (255, 255, 0, 255), 2, Imgproc.LINE_AA, 0, 0.2);

                        Imgproc.circle (res, sourcePointsArray [i], 3, new Scalar (255, 0, 0, 255), -1);
                        Imgproc.circle (res, targetPointsArray [i], 3, new Scalar (0, 0, 255, 255), -1);
                    }
                }

                faceLandmarkDetector.Dispose ();

                Texture2D texture = new Texture2D (res.cols (), res.rows (), TextureFormat.RGBA32, false);

                // Convert OpenCV Mat to Unity Texture2D.
                OpenCVForUnity.Utils.matToTexture2D (res, texture);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;

                OpenCVForUnity.Utils.setDebugMode (false);
            }

            // Update is called once per frame
            void Update ()
            {
            }

            /// <summary>
            /// Raises the destroy event.
            /// </summary>
            void OnDestroy ()
            {
                #if UNITY_WEBGL && !UNITY_EDITOR
                foreach (var coroutine in coroutines) {
                    StopCoroutine (coroutine);
                    ((IDisposable)coroutine).Dispose ();
                }
                #endif
            }

            /// <summary>
            /// Raises the back button click event.
            /// </summary>
            public void OnBackButtonClick ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("DlibFaceLandmarkDetectorExample");
                #else
                Application.LoadLevel ("DlibFaceLandmarkDetectorExample");
                #endif
            }
        }
    }
    cap0.PNG cap1.PNG
     
  20. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    https://forum.unity3d.com/threads/released-opencv-for-unity.277080/page-18#post-2818261

    VideoWriter class has been implemented in "OpenCV for Unity".
    However, this class is not supported on Android.
    https://enoxsoftware.com/opencvforunity/documentation/support-modules/
    http://forum.unity3d.com/threads/released-opencv-for-unity.277080/page-16#post-2709724
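
    For reference, a minimal sketch of VideoWriter usage on a supported platform (the output path, codec, fps, and frame size here are placeholders):
    Code (CSharp):
    using UnityEngine;
    using OpenCVForUnity;

    public class VideoWriterSketch : MonoBehaviour
    {
        void Start ()
        {
            string outputPath = Application.persistentDataPath + "/out.avi"; // placeholder path
            Size frameSize = new Size (640, 480);

            VideoWriter writer = new VideoWriter (outputPath, VideoWriter.fourcc ('M', 'J', 'P', 'G'), 30, frameSize, true);
            if (!writer.isOpened ()) {
                Debug.LogError ("VideoWriter failed to open " + outputPath);
                return;
            }

            using (Mat frame = new Mat (frameSize, CvType.CV_8UC3, new Scalar (0, 255, 0))) {
                for (int i = 0; i < 30; i++)
                    writer.write (frame); // one second of solid green frames
            }
            writer.release ();
        }
    }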
     
  21. micsun-al

    Joined:
    Apr 19, 2016
    Posts:
    8
    I have a Kinect for depth sensor, and a high quality camera for a video feed, and need to calibrate them.

    I got most of the code working, but I don't know which function to feed the checkerboard patterns into to get the position/rotation/FOV/etc.

    I think OpenCV's MultiCameraCalibration is the solution, but I can't find how to access this in OpenCV for Unity.


    I have also purchased NextStage Pro and basically want the same functionality. The problem with NextStage Pro is that re-calibrating takes more time than we want to spend on our projects.

    Can I accomplish multiple camera calibration with another function? I have the Kinect depth data being output to an OpenCV Mat, and am producing a List<Mat> of both Kinect depth map, and high quality camera.

    Here's my attempt at calibrating the cameras, but I don't really know what it is expecting... Also, I have a feeling Calib3d.calibrateCamera isn't for multiple cameras.

    Code (CSharp):
    void CalibrateCamera () {
        print ("Calibrate Camera - START");
        depthFiles = System.IO.Directory.GetFiles (RequestDirectory (SaveDirectories.DEPTH));
        webcamFiles = System.IO.Directory.GetFiles (RequestDirectory (SaveDirectories.WEBCAM));
        Texture2D tex;

        foreach (string depthFile in depthFiles) {
            tex = new Texture2D ((int)depthDims.width, (int)depthDims.height, TextureFormat.ARGB32, false);
            tex.LoadImage (System.IO.File.ReadAllBytes (depthFile));
            mat = new Mat (tex.height, tex.width, CvType.CV_8UC1);
            Utils.texture2DToMat (tex, mat);
            depthMats.Add (mat);
        }

        foreach (string webcamFile in webcamFiles) {
            tex = new Texture2D ((int)webcamDims.width, (int)webcamDims.height, TextureFormat.ARGB32, false);
            tex.LoadImage (System.IO.File.ReadAllBytes (webcamFile));
            mat = new Mat (tex.height, tex.width, CvType.CV_8UC1);
            Utils.texture2DToMat (tex, mat);
            webcamMats.Add (mat);
        }

        // TODO: might need to resize depth to webcam size?
        cameraMatrix = new Mat ((int)depthDims.height, (int)depthDims.width, CvType.CV_8UC1);
        distCoeffs = new Mat ((int)depthDims.height, (int)depthDims.width, CvType.CV_8UC1);
        rvecs = new List<Mat> ();
        tvecs = new List<Mat> ();
        //Calib3d.calibrateCamera (depthMats, webcamMats, new Size (depthDims.width, depthDims.height), cameraMatrix, distCoeffs, rvecs, tvecs);
        //Multi
        //OpenCVForUnity.MultiTracker mt = new MultiTracker ();
        // !!! So confused what to do here !!!

        print ("Calibrate Camera - END");
    }
    Thank you!
     
  22. micsun-al

    Joined:
    Apr 19, 2016
    Posts:
    8
     
  23. micsun-al

    Joined:
    Apr 19, 2016
    Posts:
    8
  24. cb_emerge

    Joined:
    May 11, 2017
    Posts:
    3
    Hi there,

    Thanks for the quick reply.
    I've been researching this and can't figure out how to get these values for my HoloLens. When I try to get the focal length values, for example:
    Code (CSharp):
    #if !UNITY_EDITOR && UNITY_WSA
    using System.Collections.Generic;
    using Windows.Storage.Streams;
    using Windows.Networking;
    using Windows.Foundation;
    using Windows.Media.Devices.Core;
    #endif
    using UnityEngine;

    public class Communication : MonoBehaviour {

    #if !UNITY_EDITOR && UNITY_WSA
        CameraIntrinsics colorIntrinsics;
    #endif

        public void setup () {
    #if !UNITY_EDITOR && UNITY_WSA
            System.Numerics.Vector2 focalLength = colorIntrinsics.FocalLength;
    #endif
        }
    }
    I keep getting the error in the screenshot attached.

    Do you happen to have a sample you could share showing how to get the CameraIntrinsics values for the HoloLens?

    Thanks for your help!

     

    Attached Files:

  25. neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    @EnoxSoftware That looks like a great example. I think I understand what I forgot (the corners of the image).

    Anyway, now I have another problem: I just installed Unity 2017... That meant all those `#if UNITY_5` directives stopped working, and I had to go through them one by one to change them to UNITY_5_3_OR_NEWER.
    However I am still getting this error when running the webcam texture example:

    Code (CSharp):
    ArgumentException: The output Mat object has to be of the same size
    OpenCVForUnity.Utils.webCamTextureToMat (UnityEngine.WebCamTexture webCamTexture, OpenCVForUnity.Mat mat, UnityEngine.Color32[] bufferColors) (at Assets/OpenCVForUnity/org/opencv/unity/Utils.cs:683)
    FaceMaskExample.WebCamTextureToMatHelper.GetMat () (at Assets/FaceMaskExample/Scripts/Utils/WebCamTextureToMatHelper.cs:375)
    FaceMaskExample.WebCamTextureFaceMaskExample.OnWebCamTextureToMatHelperInited () (at Assets/FaceMaskExample/Scripts/WebCamTextureFaceMaskExample.cs:224)
    UnityEngine.Events.InvokableCall.Invoke (System.Object[] args) (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent.cs:154)
    UnityEngine.Events.InvokableCallList.Invoke (System.Object[] parameters) (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent.cs:637)
    UnityEngine.Events.UnityEventBase.Invoke (System.Object[] parameters) (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent.cs:773)
    UnityEngine.Events.UnityEvent.Invoke () (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent_0.cs:52)
    FaceMaskExample.WebCamTextureToMatHelper+<init>c__Iterator0.MoveNext () (at Assets/FaceMaskExample/Scripts/Utils/WebCamTextureToMatHelper.cs:256)
    UnityEngine.SetupCoroutine.InvokeMoveNext (IEnumerator enumerator, IntPtr returnValueAddress) (at /Users/builduser/buildslave/unity/build/Runtime/Export/Coroutines.cs:17)
    I checked and it is basically this line:

    Code (CSharp):
    if (mat.cols () != webCamTexture.width || mat.rows () != webCamTexture.height)
    `mat` seems to be a 0x0 matrix. I tried to add:

    Code (CSharp):
    if (mat.rows () == 0) {
        mat = new Mat (webCamTexture.height, webCamTexture.width, CvType.CV_8UC4);
    }
    But still nothing. Any idea?
     
  26. giveupgames

    Joined:
    Mar 30, 2014
    Posts:
    3
    Hello, I bought the asset, and I want to perform the simple marker detection described on this page: http://docs.opencv.org/trunk/d5/dae/tutorial_aruco_detection.html

    I found the corresponding methods in your library, but I get memory errors after running it a couple of times. I've attached the code I'm using to try to perform marker detection. My guess is that I'm using Mats improperly...
     

    Attached Files:

  27. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    MultiCameraCalibration is not implemented in OpenCVforUnity.
    Unfortunately, I am not familiar with camera calibration, so I cannot answer your question.
     
  28. giveupgames

    Joined:
    Mar 30, 2014
    Posts:
    3
    Here's updated, easier-to-read code, plus my crash files...
    Code (CSharp):
    1.         void Start ()
    2.         {
    3.  
    4.             webCam = new WebCamTexture();
    5.             webCam.Play();
    6.  
    7.             arucoDict = Aruco.getPredefinedDictionary(Aruco.DICT_6X6_250);
    8.         }
    9.  
    10.         void Update ()
    11.         {
    12.             frames++;
    13.  
    14.             if (frames > totalFrames)
    15.                 return;
    16.  
    17.             if (webCam.isPlaying == false)
    18.                 return;
    19.  
    20.  
    21.             using (Mat sourceImageMat = new Mat(webCam.height, webCam.width, CvType.CV_8UC4))
    22.             {
    23.                 Utils.webCamTextureToMat(webCam, sourceImageMat);
    24.  
    25.                 using (MatOfInt ids = new MatOfInt())
    26.                 {
    27.                     using (DetectorParameters parameters = DetectorParameters.create())
    28.                     {
    29.                         List<Mat> output = new List<Mat>();
    30.                         Aruco.detectMarkers(sourceImageMat, arucoDict, output, ids);
    31.  
    32.                         if (ids == null)
    33.                             return;
    34.  
    35.                         Debug.Log(string.Format("ch:{0}  col:{1}  row:{2}  t:{3}",
    36.                             ids.channels(),
    37.                             ids.cols(),
    38.                             ids.rows(),
    39.                             ids.type()));
    40.                     }
    41.                 }
    42.             }
    43.         }
    44.  
    45.  
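    Following up on my own guess about misusing Mats: the corner Mats that detectMarkers returns in "output" are never released, and since each Mat wraps native memory that the GC doesn't free, that alone could explain the growth. A minimal sketch of releasing them at the end of each Update (the rest of the loop unchanged):

    Code (CSharp):
    1. // Each detected marker's corner Mat holds native memory,
    2. // so it presumably has to be disposed by hand every frame.
    3. foreach (Mat corners in output)
    4.     corners.Dispose ();
    5. output.Clear ();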
     

    Attached Files:

  29. ddsinteractive

    ddsinteractive

    Joined:
    May 1, 2013
    Posts:
    28
    Is there a WebCamTexture-to-blob example using just the shapes of the people, with no background?
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Does ArUcoWebCamTextureExample work without a problem in your environment?
    https://github.com/EnoxSoftware/Ope...rib/ArUcoExample/ArUcoWebCamTextureExample.cs
     
  31. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Could you tell me about your test environment?
    OpenCV for Unity version :
    Unity version :
    Build Platform :
     
  32. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    HoloLensWithOpenCVForUnityExample has been updated.
    Could you try this code?
    https://github.com/EnoxSoftware/Hol...nsicsChecker/CameraIntrinsicsCheckerHelper.cs

    /*
    * If we are building a HoloLens application using Unity, trying to use System.Numerics.Vectors (.NET 4.6.1) will cause problems.
    * "Reference rewriter: Error: method System.Numerics.Vector2 Windows.Media.Devices.Core.CameraIntrinsics::get_FocalLength() doesn't exist in target framework "
    * I assume this is because the WinRT / UWP / WSA target in Unity does not use .NET 4.6.1 yet.
    * The workaround for now was to comment out the code in the script for the Unity build and then to uncomment it in Visual Studio when it is actually built for the HoloLens.
    * (you need to enable "Unity C# Projects" in the build settings in the Unity Editor)
    * See https://forums.hololens.com/discussion/7032/using-net-4-6-features-not-supported-by-unity-wsa-build.
    *
    * In order to make this script work, it is necessary to uncomment from line 66 to line 71.
    */
     
  33. neshius108

    neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    @EnoxSoftware I managed to solve that problem by simply going back to the previous version of Unity. For the time being, I just want to be able to solve my core issue.

    Anyway, thanks for your solution, but I am still having some problems, even after using your method. First of all, it doesn't seem to be able to run every Update (too heavy/slow?), and I get no result so far.

    Since our implementation is pretty similar (and now almost identical), here are some comments:

    Code (CSharp):
    1. Core.transpose (sourcePoints, sourcePoints);
    2. Core.transpose (targetPoints, targetPoints);
    These two operations are very expensive and seem to slow down the whole frame a lot. For the moment, I have commented them out, as they don't seem to change much.

    Code (CSharp):
    1. MatOfPoint2f transPoints = new MatOfPoint2f ();
    2. tps.applyTransformation (sourcePoints, transPoints);
    Here you are applying the transformation from the source points to an empty MatOfPoint2f; is that correct? I thought I had to pass in the target positions (I tried both, and still got no result).
     
    Last edited: Aug 22, 2017
  34. gil_m1

    gil_m1

    Joined:
    Jul 14, 2017
    Posts:
    1
    Hello everyone,
    I wonder if anyone could help me with a really simple issue:

    problem:
    I wanted to create an ROI on the dst image and copy a section of the src image inside the ROI that I created.
    However, it doesn't copy at all.

    1. initial state
    upload_2017-8-22_17-15-7.png

    2. expecting result
    upload_2017-8-22_16-48-44.png

    3. actual result
    upload_2017-8-22_17-15-34.png
    4. Code
    Code (csharp):
    1.  
    2. public class ROITEST : MonoBehaviour
    3. {
    4.  
    5.    public Texture2D srcImg, dstImg;
    6.    public RawImage srcOutImg, OutputImg;
    7.    void Start ()
    8.    {
    9.       var srcMat = new Mat(srcImg.height,srcImg.width,CvType.CV_8UC3); //create src mat
    10.       var dstMat = new Mat(dstImg.height,dstImg.width,CvType.CV_8UC3); // same size as src mat
    11.  
    12.       Utils.texture2DToMat(srcImg,srcMat); // convert src image to src mat
    13.       Utils.texture2DToMat(dstImg,dstMat); // convert dst image to dst mat
    14.      
    15.       var roi =  new OpenCVForUnity.Rect(150, 100, 150, 300);  
    16.       var mask = new Mat(dstMat,roi); //create mask with roi
    17.      
    18.       srcMat.copyTo(mask); //copy src mat to mask
    19.      
    20.       var result = new Texture2D(dstMat.cols(),dstMat.rows(),TextureFormat.RGBA32, false);
    21.       var srcDebug = new Texture2D(srcMat.cols(),srcMat.rows(),TextureFormat.RGBA32, false);
    22.      
    23.       Utils.matToTexture2D(dstMat,result);
    24.       Utils.matToTexture2D(srcMat,srcDebug);
    25.      
    26.       OutputImg.texture = result;
    27.       srcOutImg.texture = srcDebug;
    28.       srcMat.Dispose();
    29.       dstMat.Dispose();
    30.       mask.Dispose();
    31.    }
    32.    
    33. }
    34.  
    Any help would be appreciated. thank you!
     

    Attached Files:

  35. Snouto

    Snouto

    Joined:
    May 27, 2013
    Posts:
    9
    Hi @EnoxSoftware

    I'm attempting to convert some Kalman-based jitter-reduction code and introduce it into the WebCamTextureMarkerBasedARExample.cs script. I'm attempting to port the C++ code here, but there are several places that are causing me conversion issues (due to lack of knowledge):

    the variable declarations
    Code (CSharp):
    1. vector <Point2f> prev_corner, cur_corner;
    2.         vector <Point2f> prev_corner2, cur_corner2;
    3.         vector <uchar> status;
    4.         vector <float> err;
    in C++. I've researched, and it seems the "vector" declaration should become a List in C#. However, further in the code:
    Code (CSharp):
    1. goodFeaturesToTrack(prev_grey, prev_corner, 200, 0.01, 30);
    2.         calcOpticalFlowPyrLK(prev_grey, cur_grey, prev_corner, cur_corner, status, err);
    we see where these variables are used. In OpenCVForUnity these functions are somewhat different. Imgproc.goodFeaturesToTrack requires a MatOfPoint for the "corners" variable, whereas the C++ above uses vector<Point2f> (a MatOfPoint2f in OpenCVForUnity's definitions).
    Secondly, the C++ calcOpticalFlowPyrLK expects the vectors defined above for the "corner" variables; however, in OpenCVForUnity, Video.calcOpticalFlowPyrLK expects a single "MatOfPoint2f" for each.

    This means that when moving further down the code we hit issues:
    Code (CSharp):
    1. for(size_t i=0; i < status.size(); i++) {
    2.             if(status[i]) {
    3.                 prev_corner2.push_back(prev_corner[i]);
    4.                 cur_corner2.push_back(cur_corner[i]);
    5.             }
    6.         }
    Status, being a MatOfByte, returns a Size object from "status.size()", but this is not iterable in the way shown in C++. Regardless, we cannot reference "MatOfPoint2f" objects with array indexing ("prev_corner[ i ]") anyway. I did consider converting them to a List or Array; however, we end up with "Point" values, and these can't be used in the "push_back" method call.

    There are some further issues too:
    Code (CSharp):
    1. double dx = T.at<double>(0,2);
    2.         double dy = T.at<double>(1,2);
    3.         double da = atan2(T.at<double>(1,0), T.at<double>(0,0));
    There is no "at" method attached to the "Mat" object in OpenCVForUnity, but there is a "get" method. Unfortunately, this returns a "double[]" array, so it's not immediately usable, and therefore probably isn't the correct method.

    I haven't gone much further than this point due to the issues above.

    Could you shine a light on how to go about porting C++ code to C# and OpenCVForUnity?

    Incidentally, I noticed a Kalman class in the OpenCVForUnity package. Does it perform the same functionality as described in the link above, or is it unrelated (it's not used in the C++ example I referenced above)?

    Cheers
     
    Last edited: Aug 23, 2017
  36. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    It seems that srcMat and mask need to be the same size: when the sizes differ, copyTo reallocates the destination Mat, so the ROI submat is detached from dstMat instead of being written into.
    http://docs.opencv.org/3.2.0/d3/d63/classcv_1_1Mat.html#a33fd5d125b4c302b0c9aa86980791a77

    This code works correctly.
    Code (CSharp):
    1.  
    2. public class ROITEST : MonoBehaviour
    3. {
    4.  
    5.    public Texture2D srcImg, dstImg;
    6.    public RawImage srcOutImg, OutputImg;
    7.    void Start ()
    8.    {
    9.       var srcMat = new Mat(srcImg.height,srcImg.width,CvType.CV_8UC3); //create src mat
    10.       var dstMat = new Mat(dstImg.height,dstImg.width,CvType.CV_8UC3); // same size as src mat
    11.  
    12.       Utils.texture2DToMat(srcImg,srcMat); // convert src image to src mat
    13.       Utils.texture2DToMat(dstImg,dstMat); // convert dst image to dst mat
    14.    
    15.       var roi =  new OpenCVForUnity.Rect(150, 100, 150, 300);
    16.  
    17.       var srcMask = new Mat(srcMat,roi); //create mask with roi
    18.       var dstMask = new Mat(dstMat,roi); //create mask with roi
    19.    
    20.       srcMask.copyTo(dstMask); //copy srcMask to dstMask
    21.    
    22.       var result = new Texture2D(dstMat.cols(),dstMat.rows(),TextureFormat.RGBA32, false);
    23.       var srcDebug = new Texture2D(srcMat.cols(),srcMat.rows(),TextureFormat.RGBA32, false);
    24.    
    25.       Utils.matToTexture2D(dstMat,result);
    26.       Utils.matToTexture2D(srcMat,srcDebug);
    27.    
    28.       OutputImg.texture = result;
    29.       srcOutImg.texture = srcDebug;
    30.       srcMat.Dispose();
    31.       dstMat.Dispose();
    32.       srcMask.Dispose();
    33.       dstMask.Dispose();
    34.    }
    35.  
    36. }
    37.  
     
  37. neshius108

    neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
    @EnoxSoftware, do you have any tips about the spline issue? Still can't figure it out :/
     
  38. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java.
    If there is an implementation example using "OpenCV Java", I think it can also be implemented using "OpenCV for Unity".
    Please refer to these examples.
    https://github.com/EnoxSoftware/Ope...ples/OpticalFlowExample/OpticalFlowExample.cs
    Code (CSharp):
    1.         double dx = T.get(0,2)[0];
    2.         double dy = T.get(1,2)[0];
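    As for the status-filtering loop, here is a rough sketch of how it might be ported (the variable names follow your C++ code; toArray()/fromList() are the OpenCV Java conversion helpers that replace std::vector indexing and push_back):

    Code (CSharp):
    1. // goodFeaturesToTrack wants a MatOfPoint; convert it to the
    2. // MatOfPoint2f that calcOpticalFlowPyrLK expects via toArray().
    3. MatOfPoint cornersMat = new MatOfPoint ();
    4. Imgproc.goodFeaturesToTrack (prev_grey, cornersMat, 200, 0.01, 30);
    5. MatOfPoint2f prev_corner = new MatOfPoint2f (cornersMat.toArray ());
    6. MatOfPoint2f cur_corner = new MatOfPoint2f ();
    7. MatOfByte status = new MatOfByte ();
    8. MatOfFloat err = new MatOfFloat ();
    9. Video.calcOpticalFlowPyrLK (prev_grey, cur_grey, prev_corner, cur_corner, status, err);
    10.  
    11. // Filter the corners by status, replacing the C++ push_back loop.
    12. byte[] statusArray = status.toArray ();
    13. Point[] prevArray = prev_corner.toArray ();
    14. Point[] curArray = cur_corner.toArray ();
    15. List<Point> prevFiltered = new List<Point> ();
    16. List<Point> curFiltered = new List<Point> ();
    17. for (int i = 0; i < statusArray.Length; i++) {
    18.     if (statusArray [i] != 0) {
    19.         prevFiltered.Add (prevArray [i]);
    20.         curFiltered.Add (curArray [i]);
    21.     }
    22. }
    23. MatOfPoint2f prev_corner2 = new MatOfPoint2f ();
    24. MatOfPoint2f cur_corner2 = new MatOfPoint2f ();
    25. prev_corner2.fromList (prevFiltered);
    26. cur_corner2.fromList (curFiltered);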
     
  39. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    Unfortunately, ThinPlateSplineShapeTransformer seems to be buggy. For details, please see the OpenCV issues and docs below.
    https://github.com/opencv/opencv/issues/5440
    http://answers.opencv.org/question/69384/shape-transformers-and-interfaces/
    https://github.com/opencv/opencv/issues/7084
    http://docs.opencv.org/3.2.0/df/dfe/classcv_1_1ShapeTransformer.html
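    For reference, the intended call order of the ShapeTransformer interface is roughly as follows (a minimal sketch; note that the second argument of applyTransformation is an output Mat that receives the transformed points, so passing an empty MatOfPoint2f is correct):

    Code (CSharp):
    1. // The matches pair point i of one shape with point i of the other.
    2. int n = (int)sourcePoints.total ();
    3. DMatch[] matchArray = new DMatch[n];
    4. for (int i = 0; i < n; i++)
    5.     matchArray [i] = new DMatch (i, i, 0);
    6. MatOfDMatch matches = new MatOfDMatch (matchArray);
    7.  
    8. // estimateTransformation must run before apply/warp. Because of the
    9. // inverse-warp bug in the issues above, the (target, source) order
    10. // may need to be swapped, and some versions require the point mats
    11. // transposed to 1xN first.
    12. tps.estimateTransformation (targetPoints, sourcePoints, matches);
    13.  
    14. MatOfPoint2f transPoints = new MatOfPoint2f ();
    15. tps.applyTransformation (sourcePoints, transPoints); // transPoints is filled in
    16.  
    17. Mat warped = new Mat ();
    18. tps.warpImage (img, warped);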
     
    Last edited: Aug 29, 2017
  40. neshius108

    neshius108

    Joined:
    Nov 19, 2015
    Posts:
    110
  41. HNKMaster

    HNKMaster

    Joined:
    Aug 31, 2012
    Posts:
    19
    Hello.

    Is there a way to do multiple-object tracking with CamShift? I tried duplicating the code, but I couldn't get any results; at some point it stopped working altogether.

    Cheers.
     
  42. sjmtechs

    sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    Hello

    I recently bought the OpenCV for Unity and Dlib FaceLandmark Detector packages.
    While using the WebCamTextureFaceMaskExample, how can I mask a regular
    texture onto the detected face rather than a face from the resource images?
     

    Attached Files:

    • How.png
      How.png
      File size:
      200.8 KB
      Views:
      879
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    https://www.assetstore.unity3d.com/#!/content/79999
    In the FaceMaskExample, the face image is deformed by moving the vertices of a mesh generated from the face landmark points.
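    A minimal sketch of that idea (this is not the actual FaceMaskExample code; meshFilter and landmarkPoints are placeholders):

    Code (CSharp):
    1. // Push each detected landmark into the corresponding mesh vertex,
    2. // so the textured mesh stretches with the face.
    3. Vector3[] vertices = meshFilter.mesh.vertices;
    4. for (int i = 0; i < landmarkPoints.Count && i < vertices.Length; i++) {
    5.     vertices [i] = new Vector3 ((float)landmarkPoints [i].x, (float)landmarkPoints [i].y, 0f);
    6. }
    7. meshFilter.mesh.vertices = vertices;
    8. meshFilter.mesh.RecalculateBounds ();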
     
  44. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
    It is possible to add a non-human face mask image.
    To add a new mask image, you need to add the face information for it to "Assets\FaceMaskExample\Scripts\Data\ExampleDataSet.cs".
    If you add a new mask image that cannot be automatically detected by the face detector, like a panda or an animated face, you need to manually add the landmark point data. You also need to add the new mask image to the "Resources" folder.
     
  45. sjmtechs

    sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    Hello,

    While using "WebCamTextureFaceMaskExample",
    How can I remove the face mask image in the upper right corner?
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  47. HNKMaster

    HNKMaster

    Joined:
    Aug 31, 2012
    Posts:
    19
    Sorry to bump my question, but I need to know if this is viable.

    I must add that what I want is to track two objects, based on two colors, using CamShift.
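    Roughly, what I have in mind is two independent trackers, each with its own back-projection and track window (a sketch; backProj1/backProj2 and the windows would come from per-color histograms):

    Code (CSharp):
    1. // One CamShift call per color, each with its own probability image
    2. // and its own window carried over from the previous frame.
    3. TermCriteria termCrit = new TermCriteria (TermCriteria.EPS | TermCriteria.COUNT, 10, 1);
    4. RotatedRect r1 = Video.CamShift (backProj1, trackWindow1, termCrit);
    5. RotatedRect r2 = Video.CamShift (backProj2, trackWindow2, termCrit);
    6. // Refresh each window from the result for the next frame
    7. // (if the binding does not already update it in place).
    8. trackWindow1 = r1.boundingRect ();
    9. trackWindow2 = r2.boundingRect ();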
     
    Last edited: Sep 7, 2017
  48. zalo10

    zalo10

    Joined:
    Dec 5, 2012
    Posts:
    21
    I'm getting really strange behavior from the undistortion function; it's completely mangling my images and stretching them in a strange way.

    I would really like to be able to use my fisheye cameras (Leap Motion) with the modules that require rectified images (like ArUco).

    Here's the code I'm using (it should yield an identity transformation as-is); this (and variations of it) all seem to mangle the images in basically the same way, so I'm kind of stuck...
    Code (CSharp):
    1.       Mat K = Mat.eye(3, 3, CvType.CV_64F);
    2.       K.put(0, 0, 1.0f);
    3.       K.put(1, 1, 1.0f);
    4.       K.put(0, 2, image.Width / 2);
    5.       K.put(1, 2, image.Height / 2);
    6.  
    7.       Mat D = new Mat(4, 1, CvType.CV_64F);
    8.       D.put(0, 0, 1.0f);
    9.  
    10.       Calib3d.undistortImage(openCVMat, unwarpedMat, K, D);
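    (Writing this out, I notice two possible problems with my own snippet: new Mat(4, 1, CvType.CV_64F) is uninitialized, so three of the four distortion coefficients are garbage, and k1 = 1.0 with a 1-pixel focal length is nowhere near an identity. A no-op test would presumably look more like this:)

    Code (CSharp):
    1. // Zero distortion plus a plausible pinhole K should be a near no-op.
    2. Mat K = Mat.eye (3, 3, CvType.CV_64F);
    3. K.put (0, 0, image.Width);       // rough focal-length guess, in pixels
    4. K.put (1, 1, image.Width);
    5. K.put (0, 2, image.Width / 2.0);
    6. K.put (1, 2, image.Height / 2.0);
    7.  
    8. Mat D = Mat.zeros (4, 1, CvType.CV_64F); // k1..k4 = 0
    9.  
    10. // If the binding exposes a Knew parameter, passing K there as well
    11. // may be needed to keep the output scaled like the input.
    12. Calib3d.undistortImage (openCVMat, unwarpedMat, K, D);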
     
  49. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,564
  50. sjmtechs

    sjmtechs

    Joined:
    Jun 20, 2017
    Posts:
    11
    Hello,

    While using "WebCamTextureFaceMaskExample",
    If the camera detects two people, it places the same mask on both. How can I randomize the masks for multiple detections?