
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Is the version of OpenCVForUnity you tried 2.4.7? The opencvforunity.bundle for OpenCVForUnity 2.4.7 includes x86_64 and arm64 architectures.
     
  2. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Unfortunately, I don't own an iPhone 12 Pro, so I tried it on my iPhone SE 2, and the ARHeadExample worked fine. I have also tested it in the Unity Editor using the DeviceSimulator package, and it works fine.
    ARHead_iPhone12max.PNG
     
  3. Dragantium

    Dragantium

    Joined:
    Feb 14, 2015
    Posts:
    16
     
  4. Dragantium

    Dragantium

    Joined:
    Feb 14, 2015
    Posts:
    16
    I bought OpenCV for Unity and it was a great decision. I am now working on face identification, and although I understand the process, I need some help associating each detected face with a GameObject, or extracting information from it, so that each stored face has a name. Thanks for your time.
     
  5. unity_7024A8A4ADC4FA396E4B

    unity_7024A8A4ADC4FA396E4B

    Joined:
    Dec 17, 2021
    Posts:
    7
    Hi.
    Is this a bug?
    Is there a workaround?

    OpenCVForUnity:
    Mat m;
    m *= 0.5;
    has no effect. operator *() creates a temporary Mat, and the assignment only copies the reference.

    OpenCvSharp:
    public static implicit operator Mat(MatExpr self)
    public static MatExpr operator *(Mat a, Scalar s)
    Here the result is copied back on assignment.
     
  6. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469

    I have tried this test code and it seems to work fine.

    Code (CSharp):
    Mat m1 = new Mat(3, 3, CvType.CV_64FC1);
    m1.put(0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
    Debug.Log("m1=" + m1.dump());
    m1 *= 0.5;
    Debug.Log("m1*=0.5=" + m1.dump());
    Result:
    Code (CSharp):
    m1=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    m1*=0.5=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    Similar operator example code is included in the Examples/Basic/MatBasicProcessingExample scene, so you can see how it works there.


    In Mat.cs, the operator is implemented as follows.

    Code (CSharp):
    // Scaling.
    // A * alpha, alpha * A
    public static Mat operator *(Mat a, double s)
    {
        Mat m = new Mat();
        Core.multiply(a, Scalar.all(s), m);
        return m;
    }
    public static Mat operator *(double s, Mat a)
    {
        Mat m = new Mat();
        Core.multiply(a, Scalar.all(s), m);
        return m;
    }


    See this page for other implemented operators.
    https://enoxsoftware.com/opencvforunity/way-to-translation-of-mat-class-operators-defined-in-cpp/
     
  7. unity_7024A8A4ADC4FA396E4B

    unity_7024A8A4ADC4FA396E4B

    Joined:
    Dec 17, 2021
    Posts:
    7
    Sorry, my earlier code wasn't enough to show the problem.
    I have verified it in more depth.

    Code (CSharp):
    {
        Mat m1 = new Mat(3, 3, CvType.CV_64FC1);
        m1.put(0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);

        Mat copyref_m2 = m1;

        Debug.Log("------direct--------");
        Debug.Log("m1=" + m1.dump());

        OpenCVForUnity.CoreModule.Core.multiply(m1, Scalar.all(0.5f), m1);

        Debug.Log("m1*=0.5=" + m1.dump());
        Debug.Log("copyref_m2*=0.5=" + copyref_m2.dump());
    }
    {
        Mat m1 = new Mat(3, 3, CvType.CV_64FC1);
        m1.put(0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);

        Mat copyref_m2 = m1;

        Debug.Log("------self--------");
        Debug.Log("m1=" + m1.dump());

        m1 *= 0.5;

        Debug.Log("m1*=0.5=" + m1.dump());
        Debug.Log("copyref_m2*=0.5=" + copyref_m2.dump());
    }
    {
        Mat m1 = new Mat(3, 3, CvType.CV_64FC1);
        m1.put(0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
        Mat copyref_m2 = m1;

        void Func(Mat m)
        {
            m *= 0.5f;
        }

        Debug.Log("------argments--------");
        Debug.Log("m1=" + m1.dump());

        Func(m1);
        Debug.Log("m1*=0.5=" + m1.dump());
        Debug.Log("copyref_m2*=0.5=" + copyref_m2.dump());
    }
    {
        Mat m1 = new Mat(3, 3, CvType.CV_64FC1);
        m1.put(0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
        Mat submat_m2 = m1.submat(0, 3, 0, 3);

        m1 *= 0.5;

        Debug.Log("------submat--------");
        Debug.Log("m1=" + m1.dump());
        Debug.Log("m1*=0.5=" + m1.dump());
        Debug.Log("submat_m2*=0.5=" + submat_m2.dump());
    }
    64.  
    Result:
    Code (CSharp):
    ------direct--------
    m1=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    m1*=0.5=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    copyref_m2*=0.5=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    ------self--------
    m1=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    m1*=0.5=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    copyref_m2*=0.5=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    ------argments--------
    m1=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    m1*=0.5=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    copyref_m2*=0.5=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
    ------submat--------
    m1=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    m1*=0.5=[0.5, 1, 1.5;
     2, 2.5, 3;
     3.5, 4, 4.5]
    submat_m2*=0.5=[1, 2, 3;
     4, 5, 6;
     7, 8, 9]
     
  8. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Your validation results are correct.
    In C#, compound assignment operators such as "*=" cannot be explicitly overloaded. Instead, the binary operator overload is used implicitly: "m1 *= 0.5" is compiled as "m1 = m1 * 0.5".
    Therefore, whenever the operator is used, a new Mat is created and assigned to the variable; the data of the original Mat is never modified.
    Unfortunately, the only workaround I can think of is to use the Core.multiply function to do the processing without using operators.
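    To make the rebinding behavior concrete, here is a minimal self-contained C# sketch. RefBox is a hypothetical stand-in for Mat (it is not part of OpenCVForUnity); the same aliasing rules apply to any reference type with an overloaded binary operator.

    ```csharp
    using System;

    // Hypothetical stand-in for Mat: a reference type whose operator* returns a NEW object.
    class RefBox
    {
        public double Value;
        public RefBox(double v) { Value = v; }

        // C# forbids overloading "*=" directly; "b *= s" is compiled as "b = b * s".
        public static RefBox operator *(RefBox b, double s) => new RefBox(b.Value * s);

        // In-place alternative, analogous to Core.multiply(src, scalar, dst) with dst == src.
        public void MultiplyInPlace(double s) { Value *= s; }
    }

    class Program
    {
        static void Main()
        {
            var a = new RefBox(10);
            var alias = a;          // second reference to the same object, like copyref_m2

            a *= 0.5;               // a now points to a NEW RefBox; alias still sees 10
            Console.WriteLine($"a={a.Value}, alias={alias.Value}");   // a=5, alias=10

            var b = new RefBox(10);
            var alias2 = b;
            b.MultiplyInPlace(0.5); // mutates the shared object; both references see 5
            Console.WriteLine($"b={b.Value}, alias2={alias2.Value}"); // b=5, alias2=5
        }
    }
    ```

    This is exactly why the "self" and "argments" cases above leave copyref_m2 unchanged, while Core.multiply with dst == src changes it.
    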
     
  9. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    735
    Hi @EnoxSoftware, thanks for integrating the flags, much appreciated.

    I am testing them using the example chessboard provided by OpenCV here (both flags enabled), but they do not seem to have any effect: as soon as any of the inner corners exits the camera image, I lose tracking. I wonder if you can reproduce this? You already have an example up here; you just need to enable the flags.
     
  10. unity_7024A8A4ADC4FA396E4B

    unity_7024A8A4ADC4FA396E4B

    Joined:
    Dec 17, 2021
    Posts:
    7
    In OpenCvSharp, it is emulated with the cast operator:
    mat *= MatExpr(mat)
    It could probably be emulated the same way here,
    but using Core.multiply seems good enough this time.
    Thank you!
     
  11. matrix211v1

    matrix211v1

    Joined:
    Jan 20, 2009
    Posts:
    193
    Hello!

    I would like to use the WebCamTextureMarkerLessARExample and have multiple images that it can detect and based on the return ID, I can then toggle on the GameObject necessary. Can you please either point me to a forum post where that happens or give me some insight on how to use an array of images to pattern match?

    Thanks!
     
  12. ibompuis

    ibompuis

    Joined:
    Sep 13, 2012
    Posts:
    78
    Hi,
    After some testing with feature matching, I am trying to figure out how to get a TRUE/FALSE result for whether two images match.
    Currently the result draws lines between the images, but how can I know whether the images match without looking at the drawn lines?

    I would like to run the analysis server-side and send the result to the user.

    Thanks
     
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    The PatternDetector class can only detect one target image. Therefore, if you want to detect multiple target images, you must use multiple PatternDetector instances.
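    For example, one detector per target could look like this (an untested sketch based on the PatternDetector API from the MarkerLessARExample repository; the file names, the grayMat input, and the surrounding variables are placeholders):

    ```csharp
    // Untested sketch: build one PatternDetector per target image.
    // PatternDetector, Pattern, and PatternTrackingInfo come from MarkerLessARExample.
    List<PatternDetector> detectors = new List<PatternDetector>();

    foreach (string name in new string[] { "target0", "target1" }) // placeholder names
    {
        Mat patternMat = Imgcodecs.imread(Utils.getFilePath(name + ".jpg"));
        Imgproc.cvtColor(patternMat, patternMat, Imgproc.COLOR_BGR2RGB);

        Pattern pattern = new Pattern();
        PatternDetector detector = new PatternDetector(null, null, null, true);
        detector.buildPatternFromImage(patternMat, pattern);
        detector.train(pattern);
        detectors.Add(detector);
    }

    // Per camera frame: the index of the detector that fires tells you which
    // target image matched, so you can toggle the matching GameObject.
    for (int i = 0; i < detectors.Count; i++)
    {
        if (detectors[i].findPattern(grayMat, patternTrackingInfo))
        {
            Debug.Log("Detected pattern index: " + i);
            break;
        }
    }
    ```
    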
     
  14. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    MarkerLessARExample is a good reference for the process of evaluating the similarity between two images.
    https://github.com/EnoxSoftware/Mar...RExample/MarkerLessAR/PatternDetector.cs#L213
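    A server-side TRUE/FALSE decision can be reduced to counting good matches instead of drawing them. A rough sketch (this is not the PatternDetector code; it assumes the ORB and DescriptorMatcher classes from OpenCVForUnity's Features2dModule, and the 0.75 ratio and 20-match threshold are arbitrary values to tune):

    ```csharp
    // Rough sketch: boolean image match via Lowe's ratio test on ORB features.
    ORB orb = ORB.create();
    MatOfKeyPoint kp1 = new MatOfKeyPoint(), kp2 = new MatOfKeyPoint();
    Mat desc1 = new Mat(), desc2 = new Mat();
    orb.detectAndCompute(grayImage1, new Mat(), kp1, desc1);
    orb.detectAndCompute(grayImage2, new Mat(), kp2, desc2);

    DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    List<MatOfDMatch> knnMatches = new List<MatOfDMatch>();
    matcher.knnMatch(desc1, desc2, knnMatches, 2);

    // Keep only matches clearly better than their runner-up (ratio test).
    int goodMatches = 0;
    foreach (MatOfDMatch m in knnMatches)
    {
        DMatch[] pair = m.toArray();
        if (pair.Length == 2 && pair[0].distance < 0.75f * pair[1].distance)
            goodMatches++;
    }

    bool imagesMatch = goodMatches > 20; // threshold to tune for your images
    ```
    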
     
    ibompuis likes this.
  15. ibompuis

    ibompuis

    Joined:
    Sep 13, 2012
    Posts:
    78
  16. kukewilly

    kukewilly

    Joined:
    Jan 3, 2019
    Posts:
    44
    Hello,

    I'm having an issue with the distance accuracy of the ArUco tag readings between two devices (Android vs. PC). When I calibrate and run the app on my PC, it reads them accurately, but when I use the same camera on an Android device, it reads them as being further away. The problem is exaggerated at 720p, and 480p seems to be pretty accurate, but in both cases, the further I am from the tag, the more exaggerated the extra distance.

    FYI, I am using a plug-in to get the video feed into my build on Android, and I am not using the WebcamTextureToMatHelper script.

    These are the things I know:

    1. The Mat rgbamat in Update() in ArucoWebcamTextureExample has the same width and height on both PC and Android.

    PC:


    https://imgur.com/a/3a6sqMg

    ANDROID:

    https://imgur.com/a/Hgls4Vr


    2. The camera feed appears the same between PC and Android (it doesn't look smaller, distorted, or a different resolution).

    I suspect that the problem is in the Start() function of ArucoWebcamTextureExample, where it scales the screen (lines 75-95 below), but I'm not sure.

    Code (CSharp):
    1.  void Start()
    2.         {
    3.             fpsMonitor = GetComponent<FpsMonitor> ();
    4.  
    5.             Markers = FindObjectOfType<MarkerCoordinates>().gameObject;
    6.             USBCamera = FindObjectOfType<USBCamera>().gameObject;
    7.  
    8.             markerTypeDropdown.value = (int)markerType;
    9.             dictionaryIdDropdown.value = (int)dictionaryId;
    10.             showRejectedCornersToggle.isOn = showRejectedCorners;
    11.             refineMarkerDetectionToggle.isOn = refineMarkerDetection;
    12.             refineMarkerDetectionToggle.interactable = (markerType == MarkerType.GridBoard || markerType == MarkerType.ChArUcoBoard);
    13.             enableLowPassFilterToggle.isOn = enableLowPassFilter;
    14.  
    15.             webCamTextureToMatHelper = gameObject.GetComponent<WebCamTextureToMatHelper> ();
    16.  
    17.             #if UNITY_ANDROID && !UNITY_EDITOR
    18.             // Avoids the front camera low light issue that occurs in only some Android devices (e.g. Google Pixel, Pixel2).
    19.             webCamTextureToMatHelper.avoidAndroidFrontCameraLowLightIssue = true;
    20.             Debug.Log("Initialize OpenCV in Android");
    21.             OnWebCamTextureToMatHelperInitialized();
    22.             #endif
    23.  
    24.             #if UNITY_EDITOR
    25.             Debug.Log("Initialize OpenCV in Unity");
    26.             webCamTextureToMatHelper.Initialize ();
    27.             #endif
    28.  
    29.         }
    30.  
    31.         /// <summary>
    32.         /// Raises the webcam texture to mat helper initialized event.
    33.         /// </summary>
    34.         public void OnWebCamTextureToMatHelperInitialized ()
    35.         {
    36.             Debug.Log ("OnWebCamTextureToMatHelperInitialized");
    37.             Mat webCamTextureMat;
    38.  
    39. #if UNITY_EDITOR
    40.                 webCamTextureMat = webCamTextureToMatHelper.GetMat ();
    41.                 texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGB24, false);
    42.                 Utils.fastMatToTexture2D(webCamTextureMat, texture);
    43.                 gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    44. #endif
    45.  
    46.  
    47. #if UNITY_ANDROID && !UNITY_EDITOR
    48.                 USBCamTexture = USBCamera.GetComponent<USBCamera>().tempTexture2D;
    49.                 if (USBCamTexture == null)
    50.                 {
    51.                     Debug.Log("Texture2D from UVCCamera is null");
    52.                 }
    53.              
    54.                 RenderTexture tmp = RenderTexture.GetTemporary(USBCamTexture.width, USBCamTexture.height, 0, RenderTextureFormat.Default, RenderTextureReadWrite.Linear);
    55.                 Graphics.Blit(USBCamTexture, tmp);
    56.                 RenderTexture previous = RenderTexture.active;
    57.                 RenderTexture.active = tmp;
    58.                 readableTexture = new Texture2D (USBCamTexture.width, USBCamTexture.height, TextureFormat.RGBA32, false);
    59.                 readableTexture.ReadPixels(new UnityEngine.Rect(0, 0, tmp.width, tmp.height), 0, 0);
    60.                 readableTexture.Apply();
    61.                 RenderTexture.active = previous;
    62.                 RenderTexture.ReleaseTemporary(tmp);
    63.  
    64.                 USBCamMat = new Mat (USBCamTexture.height, USBCamTexture.width, CvType.CV_8UC4);
    65.  
    66.                 texture = new Texture2D (USBCamTexture.width, USBCamTexture.height, TextureFormat.RGB24, false);
    67.                 gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
    68.                 Utils.texture2DToMat (readableTexture, USBCamMat);
    69.                 Core.flip(USBCamMat, USBCamMat, -1);
    70.                 webCamTextureMat = USBCamMat;
    71.  
    72. #endif
    73.          
    74.             Debug.Log(webCamTextureMat);
    75.             gameObject.transform.localScale = new Vector3 (webCamTextureMat.cols (), webCamTextureMat.rows (), 1);
    76.             Debug.Log ("Screen.width " + Screen.width + " Screen.height w" + Screen.height + " Screen.orientation " + Screen.orientation);
    77.  
    78.             if (fpsMonitor != null) {
    79.                 fpsMonitor.Add ("width", webCamTextureMat.width ().ToString ());
    80.                 fpsMonitor.Add ("height", webCamTextureMat.height ().ToString ());
    81.                 fpsMonitor.Add ("orientation", Screen.orientation.ToString ());
    82.             }
    83.  
    84.          
    85.             float width = webCamTextureMat.width ();
    86.             float height = webCamTextureMat.height ();
    87.          
    88.             float imageSizeScale = 1.0f;
    89.             float widthScale = (float)Screen.width / width;
    90.             float heightScale = (float)Screen.height / height;
    91.             if (widthScale < heightScale) {
    92.                 Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
    93.                 imageSizeScale = (float)Screen.height / (float)Screen.width;
    94.             } else {
    95.                 Camera.main.orthographicSize = height / 2;
    96.             }
    97.          
    98.  
    99.             // set camera parameters.
    100.             double fx;
    101.             double fy;
    102.             double cx;
    103.             double cy;
    104.  
    105.             string loadDirectoryPath = Path.Combine (Application.persistentDataPath, "ArUcoCameraCalibrationExample");
    106.             string calibratonDirectoryName = "camera_parameters" + width + "x" + height;
    107.             string loadCalibratonFileDirectoryPath = Path.Combine (loadDirectoryPath, calibratonDirectoryName);
    108.             string loadPath = Path.Combine (loadCalibratonFileDirectoryPath, calibratonDirectoryName + ".xml");
    109.             if (useStoredCameraParameters && File.Exists (loadPath)) {
    110.                 CameraParameters param;
    111.                 XmlSerializer serializer = new XmlSerializer (typeof(CameraParameters));
    112.                 using (var stream = new FileStream (loadPath, FileMode.Open)) {
    113.                     param = (CameraParameters)serializer.Deserialize (stream);
    114.                 }
    115.  
    116.                 camMatrix = param.GetCameraMatrix ();
    117.                 distCoeffs = new MatOfDouble (param.GetDistortionCoefficients ());
    118.  
    119.                 fx = param.camera_matrix [0];
    120.                 fy = param.camera_matrix [4];
    121.                 cx = param.camera_matrix [2];
    122.                 cy = param.camera_matrix [5];
    123.  
    124.                 Debug.Log ("Loaded CameraParameters from a stored XML file.");
    125.                 Debug.Log ("loadPath: " + loadPath);
    126.  
    127.             } else {
    128.                 int max_d = (int)Mathf.Max (width, height);
    129.                 fx = max_d;
    130.                 fy = max_d;
    131.                 cx = width / 2.0f;
    132.                 cy = height / 2.0f;
    133.  
    134.                 camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    135.                 camMatrix.put (0, 0, fx);
    136.                 camMatrix.put (0, 1, 0);
    137.                 camMatrix.put (0, 2, cx);
    138.                 camMatrix.put (1, 0, 0);
    139.                 camMatrix.put (1, 1, fy);
    140.                 camMatrix.put (1, 2, cy);
    141.                 camMatrix.put (2, 0, 0);
    142.                 camMatrix.put (2, 1, 0);
    143.                 camMatrix.put (2, 2, 1.0f);
    144.  
    145.                 distCoeffs = new MatOfDouble (0, 0, 0, 0);
    146.  
    147.                 Debug.Log ("Created a dummy CameraParameters.");
    148.             }
    149.              
    150.             Debug.Log ("camMatrix " + camMatrix.dump ());
    151.             Debug.Log ("distCoeffs " + distCoeffs.dump ());
    152.  
    153.  
    154.             // calibration camera matrix values.
    155.             Size imageSize = new Size (width * imageSizeScale, height * imageSizeScale);
    156.             double apertureWidth = 0;
    157.             double apertureHeight = 0;
    158.             double[] fovx = new double[1];
    159.             double[] fovy = new double[1];
    160.             double[] focalLength = new double[1];
    161.             Point principalPoint = new Point (0, 0);
    162.             double[] aspectratio = new double[1];
    163.          
    164.             Calib3d.calibrationMatrixValues (camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    165.          
    166.             Debug.Log ("imageSize " + imageSize.ToString ());
    167.             Debug.Log ("apertureWidth " + apertureWidth);
    168.             Debug.Log ("apertureHeight " + apertureHeight);
    169.             Debug.Log ("fovx " + fovx [0]);
    170.             Debug.Log ("fovy " + fovy [0]);
    171.             Debug.Log ("focalLength " + focalLength [0]);
    172.             Debug.Log ("principalPoint " + principalPoint.ToString ());
    173.             Debug.Log ("aspectratio " + aspectratio [0]);
    174.          
    175.          
    176.             // To convert the difference of the FOV value of the OpenCV and Unity.
    177.             double fovXScale = (2.0 * Mathf.Atan ((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2 ((float)cx, (float)fx) + Mathf.Atan2 ((float)(imageSize.width - cx), (float)fx));
    178.             double fovYScale = (2.0 * Mathf.Atan ((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2 ((float)cy, (float)fy) + Mathf.Atan2 ((float)(imageSize.height - cy), (float)fy));
    179.  
    180.             Debug.Log ("fovXScale " + fovXScale);
    181.             Debug.Log ("fovYScale " + fovYScale);
    182.          
    183.          
    184.             // Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    185.             if (widthScale < heightScale) {
    186.                 arCamera.fieldOfView = (float)(fovx [0] * fovXScale);
    187.             } else {
    188.                 arCamera.fieldOfView = (float)(fovy [0] * fovYScale);
    189.             }
    190.             // Display objects near the camera.
    191.             arCamera.nearClipPlane = 0.01f;
    192.          
    193.          
    194.             rgbMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC3);
    195.             ids = new Mat ();
    196.             corners = new List<Mat> ();
    197.             rejectedCorners = new List<Mat> ();
    198.             rvecs = new Mat ();
    199.             tvecs = new Mat ();
    200.             rotMat = new Mat (3, 3, CvType.CV_64FC1);
    201.          
    202.          
    203.             detectorParams = DetectorParameters.create ();
    204.             dictionary = Aruco.getPredefinedDictionary ((int)dictionaryId);
    205.  
    206.             rvec = new Mat ();
    207.             tvec = new Mat ();
    208.             recoveredIdxs = new Mat ();
    209.  
    210.             gridBoard = GridBoard.create (gridBoradMarkersX, gridBoradMarkersY, gridBoradMarkerLength, gridBoradMarkerSeparation, dictionary, gridBoradMarkerFirstMarker);
    211.  
    212.             charucoCorners = new Mat ();
    213.             charucoIds = new Mat ();
    214.             charucoBoard = CharucoBoard.create (chArUcoBoradSquaresX, chArUcoBoradSquaresY, chArUcoBoradSquareLength, chArUcoBoradMarkerLength, dictionary);
    215.  
    216.             diamondCorners = new List<Mat> ();
    217.             diamondIds = new Mat (1, 1, CvType.CV_32SC4);
    218.             diamondIds.put (0, 0, new int[] { diamondId1, diamondId2, diamondId3, diamondId4 });
    219.  
    220.  
    221.             // if WebCamera is frontFaceing, flip Mat.
    222.             //webCamTextureToMatHelper.flipHorizontal = webCamTextureToMatHelper.GetWebCamDevice ().isFrontFacing;
    223.         }
    Any thoughts on what might be happening?
     
    Last edited: Jul 13, 2022
  17. matrix211v1

    matrix211v1

    Joined:
    Jan 20, 2009
    Posts:
    193
    Hello

    I'm using the WebCamTextureMarkerLessARExample. It works when CapturePattern saves the image on the Android device; the PatternDetector then works fine.

    However, I already have the images I need to detect, so my code looks like this:
    Code (CSharp):
    patternMat = Imgcodecs.imread(Application.streamingAssetsPath + "/" + patternName + ".jpg");

    Debug.Log("What is my path: " + Application.streamingAssetsPath + "/" + patternName + ".jpg");
    Imgproc.cvtColor(patternMat, patternMat, Imgproc.COLOR_BGR2RGB);

    Texture2D patternTexture = new Texture2D(patternMat.width(), patternMat.height(), TextureFormat.RGBA32, false);

    //To reuse mat, set the flipAfter flag to true.
    Utils.matToTexture2D(patternMat, patternTexture, true, 0, true);
    Debug.Log("patternMat dst ToString " + patternMat.ToString());
    I have my images stored in StreamingAssets. On PC it works fine, but on Android it seems it's either not finding the files or something else is wrong, because I keep getting this error:

    Code (CSharp):
    GetPixels32 called on a degenerate image (dimensions 0x0)
    OpenCVForUnity.UnityUtils.Utils:matToTexture2D(Mat, Texture2D, Color32[], Boolean, Int32, Boolean, Boolean, Boolean)
    OpenCVForUnity.UnityUtils.Utils:matToTexture2D(Mat, Texture2D, Boolean, Int32, Boolean, Boolean, Boolean)
    <CardReadDelay>d__10:MoveNext()
    UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)

    [./Runtime/Graphics/Texture2D.cpp line -617697764]
    UnityException: Texture '' is not configured correctly to allow GetPixels
    at OpenCVForUnity.UnityUtils.Utils.matToTexture2D (OpenCVForUnity.CoreModule.Mat mat, UnityEngine.Texture2D texture2D, UnityEngine.Color32[] bufferColors, System.Boolean flip, System.Int32 flipCode, System.Boolean flipAfter, System.Boolean updateMipmaps, System.Boolean makeNoLongerReadable) [0x00000] in <00000000000000000000000000000000>:0
    at OpenCVForUnity.UnityUtils.Utils.matToTexture2D (OpenCVForUnity.CoreModule.Mat mat, UnityEngine.Texture2D texture2D, System.Boolean flip, System.Int32 flipCode, System.Boolean flipAfter, System.Boolean updateMipmaps, System.Boolean makeNoLongerReadable) [0x00000] in <00000000000000000000000000000000>:0
    at ScanbugPattern+<CardReadDelay>d__10.MoveNext () [0x00000] in <00000000000000000000000000000000>:0
    at UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator enumerator, System.IntPtr returnValueAddress) [0x00000] in <00000000000000000000000000000000>:0
    So the patternMat seems to be null or returning an incorrect value, yet it works on PC. I must be overlooking something very obvious.

    Any suggestions?
     
  18. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Are the calibration parameters read from a file at runtime?
    Also, is there any possibility that the parameters calibrated on PC differ from those calibrated on Android?
     
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Probably the image file is failing to load.
    Could you try using the Utils.getFilePath() method?
    Code (CSharp):
    patternMat = Imgcodecs.imread(Utils.getFilePath(patternName + ".jpg"));
     
  20. matrix211v1

    matrix211v1

    Joined:
    Jan 20, 2009
    Posts:
    193
    Perfect! That fixed it. You are most awesome!
     
  21. kukewilly

    kukewilly

    Joined:
    Jan 3, 2019
    Posts:
    44
    I think you're right about not being able to access the calibration file on Android. The file path is outside my Unity project folder, so it obviously won't be included in the build. How do I change the calibration save path? It looks like I would have to alter Application.persistentDataPath, but it is read-only.
     
  22. matrix211v1

    matrix211v1

    Joined:
    Jan 20, 2009
    Posts:
    193
    I'm at the last part. I just need to speed up detection; the framerate is around 5 fps.

    This is how I load all 26 elements to run detection on:
    Code (CSharp):
    int index = 0;
    foreach (var item in patternNames)
    {
        patternMat[index] = Imgcodecs.imread(Utils.getFilePath(item + ".jpg"));

        Debug.Log("What is my path: " + Utils.getFilePath(item + ".jpg"));
        Imgproc.cvtColor(patternMat[index], patternMat[index], Imgproc.COLOR_BGR2RGB);

        Texture2D patternTexture = new Texture2D(patternMat[index].width(), patternMat[index].height(), TextureFormat.RGBA32, false);

        //To reuse mat, set the flipAfter flag to true.
        Utils.matToTexture2D(patternMat[index], patternTexture, true, 0, true);
        Debug.Log("patternMat dst ToString " + patternMat[index].ToString());

        pattern = new Pattern();
        patternTrackingInfo = new PatternTrackingInfo();

        patternDetector[index] = new PatternDetector(null, null, null, true);

        patternDetector[index].buildPatternFromImage(patternMat[index], pattern);
        patternDetector[index].train(pattern);

        index++;
    }
    And here is the update loop
    Code (CSharp):
    void Update()
    {
        if (!hasFoundPattern && readyToScan)
        {
            if (webCamTextureToMatHelper.IsPlaying() && webCamTextureToMatHelper.DidUpdateThisFrame())
            {
                int index = 0;
                foreach (var item in patternNames)
                {
                    bool patternFound = patternDetector[index].findPattern(grayMatTest, patternTrackingInfo);
                    if (patternFound)
                    {
                        hasFoundPattern = true;
                        modelOfCards[index].SetActive(true);
                    }
                    index++;
                }
            }
        }
    }
    Is there a better way to do this, or is there a setting to tune how many points it detects for a pattern?

    Thanks!
     
  23. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    The cause of the fps drop may be that each time the PatternDetector.findPattern() method is called, the feature point detection process runs again on the same video frame:
    https://github.com/EnoxSoftware/Mar...ple/MarkerLessAR/PatternDetector.cs#L215-L219
    If the code is modified to run that process only once per frame and reuse the detected feature points, fps may improve.
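    One possible shape for that refactor (hypothetical: it assumes you add a findPattern overload to PatternDetector that accepts precomputed keypoints and descriptors, which does not exist in the current code):

    ```csharp
    // Hypothetical refactor sketch: detect and describe the camera frame ONCE,
    // then match it against every trained pattern. The findPattern overload
    // taking (keypoints, descriptors, info) would have to be added to
    // PatternDetector by extracting the matching half of the current findPattern().
    MatOfKeyPoint frameKeypoints = new MatOfKeyPoint();
    Mat frameDescriptors = new Mat();
    ORB orb = ORB.create();
    orb.detectAndCompute(grayMatTest, new Mat(), frameKeypoints, frameDescriptors);

    for (int i = 0; i < patternDetector.Length; i++)
    {
        // Reuses the per-frame features instead of re-detecting them 26 times.
        if (patternDetector[i].findPattern(frameKeypoints, frameDescriptors, patternTrackingInfo))
        {
            hasFoundPattern = true;
            modelOfCards[i].SetActive(true);
            break;
        }
    }
    ```
    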
     
  24. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Could you copy the generated "ArUcoCameraCalibrationExample" folder to the StreamingAssets folder and change the code as follows?

    ArUcoCameraCalibrationExample_getFilePath.PNG

    Code (CSharp):
    //string loadDirectoryPath = Path.Combine(Application.persistentDataPath, "ArUcoCameraCalibrationExample");
    string loadDirectoryPath = "ArUcoCameraCalibrationExample";
    string calibratonDirectoryName = "camera_parameters" + width + "x" + height;
    string loadCalibratonFileDirectoryPath = Path.Combine(loadDirectoryPath, calibratonDirectoryName);
    //string loadPath = Path.Combine(loadCalibratonFileDirectoryPath, calibratonDirectoryName + ".xml");
    string loadPath = Utils.getFilePath(Path.Combine(loadCalibratonFileDirectoryPath, calibratonDirectoryName + ".xml"));
    if (useStoredCameraParameters && File.Exists(loadPath))
     
    kukewilly likes this.
  25. kukewilly

    kukewilly

    Joined:
    Jan 3, 2019
    Posts:
    44

    This worked like a charm thanks :D.
     
    EnoxSoftware likes this.
  26. awesleybr12

    awesleybr12

    Joined:
    Mar 19, 2022
    Posts:
    1
    Hi,

    I've been doing work with face recognition using FaceDetectorYN and FaceRecognizerSF. Both of these classes work fine on Windows; however, when I exported the project and ran it on my Mac, I got an error when the FaceDetectorYN was being created. I cross-checked this with the face recognition example provided in OpenCVForUnity, and the same C++ error occurs when the FaceDetectorYN is created. The line below causes the C++ error on Mac, but the same error does not occur on Windows.
    Code (CSharp):
    1.                 FaceDetectorYN faceDetector = FaceDetectorYN.create(fd_model_filepath, "", new Size(imageSizeW, imageSizeH), scoreThreshold, nmsThreshold, topK);
    2.  
    Thanks for any help you are able to give!
     
  27. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Thank you very much for reporting.
    Currently there seems to be a loading error with the onnx model, as reported here:
    https://github.com/opencv/opencv/issues/22139

    Could you download the modified onnx model file and add it to the StreamingAssets folder?
    https://github.com/opencv/opencv_zo...ction_yunet/face_detection_yunet_2022mar.onnx

    https://github.com/EnoxSoftware/Ope...ple/FaceDetectorYNWebCamTextureExample.cs#L82
    Code (CSharp):
    1. protected static readonly string MODEL_FILENAME = "objdetect/face_detection_yunet_2022mar.onnx";
     
  28. Anastasia-Devana

    Anastasia-Devana

    Joined:
    Aug 18, 2013
    Posts:
    20
    Hi there, just bought the plugin to use with Magic Leap. Unfortunately, your Magic Leap examples on GitHub currently don't work. The video gets captured fine, but nothing is being detected by the model (objects or faces). I'm seeing the same error for both scenes: "VideoCaptureExample failed to GetIntrinsicCalibrationParameters. Reason: MLResult_NotImplemented".

    DlibFaceLandmarkDetector has different errors and won't compile. It's looking for 'OpenCVForUnityUtils'.

    Any thoughts on how to get either of these to work?
     
    Last edited: Aug 3, 2022
  29. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Could you tell me the environment you tested?
    Lumin OS version:
    Lumin SDK version:
    Unity version:
    OpenCV for Unity version:


    Could you import the DlibFaceLandmarkDetectorWithOpenCVExample.unitypackage?
    Import_DlibFaceLandmarkDetectorWithOpenCVExample.PNG
     
  30. Anastasia-Devana

    Anastasia-Devana

    Joined:
    Aug 18, 2013
    Posts:
    20
    @EnoxSoftware I have an update. Got the examples to work.

    Importing the additional Unity package for DlibFaceLandmarkDetector fixed that issue (I missed that step in the instructions).

    And the issue with the other video example is that it's using a deprecated method call. So I changed the API calls, and it works now.

    Also, just FYI, the minimum size for face detection was set too large, so you would have to be extremely close to a person for the face to be detected. I reduced the minimum size and it's much more useful now.

    Another question I had is about the performance on Magic Leap. The basic face detection seems to run pretty fast. But yolov3-tiny is about 1 FPS. And text detection is much slower - around several seconds. Is that expected, or am I doing something wrong with my setup?

    Unity: 2020.3.29f1
    Lumin OS: 0.98.33
    Lumin SDK: 0.26.0
    OpenCV: 2.4.8
     
  31. kloogens

    kloogens

    Joined:
    Jul 1, 2015
    Posts:
    103
    Is there a color converter in OpenCV for Unity that converts a Unity Color (RGB) to an HSV color in OpenCV?

    I see Unity's Color has an RGBToHSV method, but Unity's HSV format is different from OpenCV's HSV format.

    Or am I just being thick?
     
    Last edited: Aug 3, 2022
  32. Karmahero

    Karmahero

    Joined:
    Feb 11, 2012
    Posts:
    6
    Is there a known bug for using RTSP in the latest build? Running in Windows x64 (Editor or Build) with "opencv_videoio_ffmpeg460_64.dll" added to Plugins, the feed works, but it updates at a very low frame rate (<0.01 fps) and the application is almost unresponsive.

    This is just using the "BackgroundSubtractorComparisonExample" with an RTSP stream, while a local video file runs at 130 fps on screen with the video at 30 fps (its native rate). The same RTSP stream runs in real time through Python or VLC.

    Unity Version: 2021.3.7f1 URP

    Here is a public RTSP test stream to verify against: rtsp://rtsp.stream/pattern
     
    Last edited: Aug 6, 2022
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    The inference process in the Dnn module takes a long time for Mats larger than 1000 x 1000 pixels. If the Mat is downsized before the inference process is performed, the inference time should be reduced.
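    For example (a minimal sketch; the target size and interpolation flag are assumptions — use whatever your model expects):

    ```csharp
    // Sketch: downscale a large frame before Dnn inference to cut inference time.
    // 'bigMat' stands in for the original large input Mat.
    Mat small = new Mat();
    Imgproc.resize(bigMat, small, new Size(640, 640), 0, 0, Imgproc.INTER_AREA);

    // Run Dnn.blobFromImage()/net.forward() on 'small' instead of 'bigMat',
    // then scale any detected coordinates back up to the original frame size.
    ```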
     
    Anastasia-Devana likes this.
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469

    Unfortunately, OpenCVForUnity doesn’t provide a method to convert Unity.Color RGB to OpenCV HSV.

    Unity.Color HSV range is: H: 0.0 to 1.0 / S: 0.0 to 1.0 / V: 0.0 to 1.0.
    OpenCV HSV range is: H: 0 to 179 / S: 0 to 255 / V: 0 to 255.
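    A small conversion along those lines might look like this (an untested sketch; it assumes OpenCVForUnity's Scalar and Unity's Color.RGBToHSV):

    ```csharp
    using UnityEngine;
    using OpenCVForUnity.CoreModule;

    public static class HsvConversion
    {
        // Convert a Unity Color (RGB) to an OpenCV-style 8-bit HSV Scalar.
        // Unity's Color.RGBToHSV returns H, S, V each in 0.0-1.0, while
        // OpenCV's 8-bit HSV uses H: 0-179 and S/V: 0-255.
        public static Scalar ToOpenCvHsv(Color color)
        {
            Color.RGBToHSV(color, out float h, out float s, out float v);
            return new Scalar(h * 179.0, s * 255.0, v * 255.0);
        }
    }
    ```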
     
  35. kloogens

    kloogens

    Joined:
    Jul 1, 2015
    Posts:
    103

    That is what I thought.

    So if I multiply the Unity H (0-1) by 179, I get the OpenCV H equivalent?

    For example, in Unity H = 0.25, so the OpenCV H equivalent value would be 0.25 * 179 = 44.75?

    That is what I'm doing, but it still isn't working the way I would expect.


    I'm picking an (RGB) color from a texture, then, using a threshold value, I want to create the min and max range (in HSV) of the selected color. However, I can't even seem to convert the selected color to its HSV counterpart.

    Like everyone else, I'm struggling to get the proper color ranges to use with Core.inRange.


    Can anyone offer any suggestions?
     
  36. Karmahero

    Karmahero

    Joined:
    Feb 11, 2012
    Posts:
    6
    A C# solution can be found here: https://www.geeksforgeeks.org/program-change-rgb-color-model-hsv-color-model/

    Just replaced double with float and changed the return type to Vector3:

    Code (CSharp):
    1.    static Vector3 rgb_to_hsv(float r, float g, float b)
    2.     {
    3.         // R, G, B values are divided by 255
    4.         // to change the range from 0..255 to 0..1
    5.         r = r / 255.0f;
    6.         g = g / 255.0f;
    7.         b = b / 255.0f;
    8.         // h, s, v = hue, saturation, value
    9.         float cmax = Math.Max(r, Math.Max(g, b)); // maximum of r, g, b
    10.         float cmin = Math.Min(r, Math.Min(g, b)); // minimum of r, g, b
    11.         float diff = cmax - cmin; // diff of cmax and cmin.
    12.         float h = -1, s = -1;
    13.        
    14.         // if cmax and cmin are equal then h = 0
    15.         if (cmax == cmin)
    16.             h = 0;
    17.         // if cmax equal r then compute h
    18.         else if (cmax == r)
    19.             h = (60 * ((g - b) / diff) + 360) % 360;
    20.         // if cmax equal g then compute h
    21.         else if (cmax == g)
    22.             h = (60 * ((b - r) / diff) + 120) % 360;
    23.         // if cmax equal b then compute h
    24.         else if (cmax == b)
    25.             h = (60 * ((r - g) / diff) + 240) % 360;
    26.         // if cmax equal zero
    27.         if (cmax == 0)
    28.             s = 0;
    29.         else
    30.             s = (diff / cmax) * 100;
    31.         // compute v
    32.         float v = cmax * 100;
    33.         return new Vector3(h, s, v);
    34.     }
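    One caveat if the result is fed to Core.inRange: the method above returns H in 0-360 and S/V in 0-100, while OpenCV's 8-bit HSV uses H: 0-179 and S/V: 0-255. A rescaling sketch (untested; Scalar is OpenCVForUnity's):

    ```csharp
    using UnityEngine;
    using OpenCVForUnity.CoreModule;

    public static class HsvRanges
    {
        // Rescale rgb_to_hsv() output (H: 0-360, S/V: 0-100) to OpenCV's
        // 8-bit HSV ranges (H: 0-179, S/V: 0-255) for use with Core.inRange.
        public static Scalar ToOpenCvScalar(Vector3 hsv)
        {
            return new Scalar(hsv.x / 2.0, hsv.y * 2.55, hsv.z * 2.55);
        }
    }
    ```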
     
    kloogens likes this.
  37. kloogens

    kloogens

    Joined:
    Jul 1, 2015
    Posts:
    103

    Thank you, much appreciated
     
  38. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    When I tried the following URL in my environment, it played without any problem ( 100fps ).
    OpenCVforUnity 2.4.8
    opencv_videoio_ffmpeg460_64.dll
    rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4

    However, when I tried the URL you showed, the frame rate was very low, as you reported.
    rtsp://rtsp.stream/pattern
    I have not yet investigated whether the same problem occurs with opencv from c++ or python.
     
  39. Karmahero

    Karmahero

    Joined:
    Feb 11, 2012
    Posts:
    6
    Using the same DLL with python (3.6) and package "opencv-python" (4.6.0.66), the playback for that test stream does appear to work in real time:

    Code (Python):
    1.  
    2. import cv2
    3.  
    4. def main():
    5.     cap = cv2.VideoCapture("rtsp://rtsp.stream/pattern")
    6.  
    7.     while(cap.isOpened()):
    8.         ret, frame = cap.read()
    9.         cv2.imshow('frame', frame)
    10.      
    11.         key = cv2.waitKey(10) & 0xFF
    12.  
    13.         if key == ord('q'):
    14.             break
    15.  
    16.     cap.release()
    17.     cv2.destroyAllWindows()
    18.  
    19. if __name__ == "__main__":
    20.     main()
    21.  
     
  40. sinhht1

    sinhht1

    Joined:
    Nov 29, 2018
    Posts:
    1
    Hello, I want to place a 2D object (Image) onto the webcam face-detection rectangle, centered in the rectangle. Please guide me on how to do it.
     
  41. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    I have found the code that causes this bug. If you comment out the following line, the video stream will be retrieved in real time.
    Code (CSharp):
    1.                 if (capture.get(Videoio.CAP_PROP_POS_FRAMES) >= capture.get(Videoio.CAP_PROP_FRAME_COUNT))
    2.                     capture.set(Videoio.CAP_PROP_POS_FRAMES, 0);
    https://github.com/EnoxSoftware/Ope...ptureExample/VideoCaptureExample.cs#L167-L168
     
  42. Karmahero

    Karmahero

    Joined:
    Feb 11, 2012
    Posts:
    6
    Works perfect. Thanks for the quick assistance.
     
  43. jalajshah

    jalajshah

    Joined:
    Mar 5, 2018
    Posts:
    60
    Hello,
    I am trying to run BackgroundSubtractorComparisonExample, but it always gives me the error "video file not exist".
    I checked that the file is in the StreamingAssets folder.

    How can I fix this?
     
  44. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Does "Assets/StreamingAssets/768x576_mjpeg.mjpeg" exist?
    768x576_mjpeg.PNG
     
  45. jalajshah

    jalajshah

    Joined:
    Mar 5, 2018
    Posts:
    60
    Yes, the file exists.
    The issue is that when we import the plugin, the StreamingAssets folder location is: Assets/OpenCVForUnity/StreamingAssets/768x576_mjpeg.mjpeg

    It works fine after moving it to:
    Assets/StreamingAssets/768x576_mjpeg.mjpeg
     
  46. VitekCapS

    VitekCapS

    Joined:
    Jul 26, 2017
    Posts:
    3
    Hello! I'm trying to establish interaction between two solutions, namely AR Foundation and OpenCV, but I've hit a big problem. If I use the WebCamTexture class, AR Foundation stops working completely.
    I decided to replace the WebCamTexture class with a custom adapter class, WebCamXRBridge.
    With AR Foundation Remote 2.0, all libraries and scripts work completely.
    My texture obtained from ARCameraFrameEventArgs is displayed correctly on the Unity material.
    BUT...
    That works fine in the Editor, but in an Android build it's not working. I can't render the texture on a RawImage or any Renderer... What am I doing wrong?

    Additional info:
    Unity version: 2021.3.7f1
    AR Foundation 4.2.3
    AR Foundation Remote 2.0.25
    OpenCV for Unity 2.4.8

    There is part of code
    Code (CSharp):
    1.  
    2. public class WebCamXRBridge : MonoBehaviour
    3. {
    4.     ARCameraManager cameraManager;
    5.  
    6.     [HideInInspector] public string deviceName;
    7.     [HideInInspector] public string name;
    8.     [HideInInspector] public float requestedFPS = 30;
    9.     [HideInInspector] public int height = 480;
    10.     [HideInInspector] public int width = 640;
    11.     [HideInInspector] public bool didUpdateThisFrame => lastFrame != null;
    12.  
    13.     [HideInInspector] public int videoRotationAngle = 90;
    14.     [HideInInspector] public bool videoVerticallyMirrored = false;
    15.  
    16.     [HideInInspector] public bool isPlaying;
    17.     bool isInitialized = false;
    18.  
    19.     [HideInInspector] public Texture2D lastFrame;
    20.     public WebCamXRBridge()
    21.     {
    22.     }
    23.  
    24.     public WebCamXRBridge(string deviceName, int requestedWidth, int requestedHeigth, float requestedFPS = 30)
    25.     {
    26.  
    27.     }
    28.  
    29.     private void Awake()
    30.     {
    31.         cameraManager = GetComponent<ARCameraManager>();
    32.     }
    33.     void OnEnable()
    34.     {
    35.         cameraManager.frameReceived += OnCameraFrameReceived;
    36.     }
    37.  
    38.     void OnDisable()
    39.     {
    40.         cameraManager.frameReceived -= OnCameraFrameReceived;
    41.     }
    42.  
    43.     bool isFirstUpdate = true;
    44.     public void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    45.     {
    46.         if (isFirstUpdate)
    47.         {
    48.             lastFrame = new Texture2D(eventArgs.textures[0].width, eventArgs.textures[0].height, TextureFormat.RGBA32, false);
    49.             this.width = lastFrame.width;
    50.             this.height = lastFrame.height;
    51.             Debug.Log("FirstUpdate WebCamXRBridge " + width + "x" + height);
    52.             isFirstUpdate = false;
    53.         }
    54.  
    55.         Utils.textureToTexture2D(eventArgs.textures[0], lastFrame);    
    56.     }
    57.  
    58.     public void OnXRSubsystemBridgeInitialized()
    59.     {
    60.         isInitialized = true;
    61.     }
    62.  
    63.     public void Play()
    64.     {
    65.         isPlaying = true;
    66.     }
    67.  
    68.     public void Stop()
    69.     {
    70.         isPlaying = false;
    71.     }
    72.  
    73.     public void Pause()
    74.     {
    75.         isPlaying = false;
    76.     }
    77. }
    78.  
    79. //method from another class to to get a Mat from last frame
    80. public virtual Mat GetMat()
    81.     {
    82.         if (!hasInitDone || !webCamXRBridge.isPlaying)
    83.         {
    84.             return (rotatedFrameMat != null) ? rotatedFrameMat : frameMat;
    85.         }
    86.  
    87.         Utils.setDebugMode(true);
    88.         if (baseColorFormat == outputColorFormat)
    89.         {
    90.             Utils.fastTexture2DToMat(webCamXRBridge.lastFrame, frameMat, false);
    91.         }
    92.         else
    93.         {
    94.             Utils.fastTexture2DToMat(webCamXRBridge.lastFrame, baseMat, false);
    95.             Debug.Log("cvtColor In WebCamToMatHelper.cs");
    96.             Imgproc.cvtColor(baseMat, frameMat, ColorConversionCodes(baseColorFormat, outputColorFormat));
    97.         }
    98.         Utils.setDebugMode(false);
    99. #if !UNITY_EDITOR && !(UNITY_STANDALONE || UNITY_WEBGL)
    100.             if (rotatedFrameMat != null)
    101.             {
    102.                 if (screenOrientation == ScreenOrientation.Portrait || screenOrientation == ScreenOrientation.PortraitUpsideDown)
    103.                 {
    104.                     // (Orientation is Portrait, rotate90Degree is false)
    105.                     if (webCamDevice.isFrontFacing)
    106.                     {
    107.                         FlipMat(frameMat, !flipHorizontal, !flipVertical);
    108.                     }
    109.                     else
    110.                     {
    111.                         FlipMat(frameMat, flipHorizontal, flipVertical);
    112.                     }
    113.                 }
    114.                 else
    115.                 {
    116.                     // (Orientation is Landscape, rotate90Degrees=true)
    117.                     FlipMat(frameMat, flipVertical, flipHorizontal);
    118.                 }
    119.                 Core.rotate(frameMat, rotatedFrameMat, Core.ROTATE_90_CLOCKWISE);
    120.                 return rotatedFrameMat;
    121.             }
    122.             else
    123.             {
    124.                 if (screenOrientation == ScreenOrientation.Portrait || screenOrientation == ScreenOrientation.PortraitUpsideDown)
    125.                 {
    126.                     // (Orientation is Portrait, rotate90Degree is ture)
    127.                     if (webCamDevice.isFrontFacing)
    128.                     {
    129.                         FlipMat(frameMat, flipHorizontal, flipVertical);
    130.                     }
    131.                     else
    132.                     {
    133.                         FlipMat(frameMat, !flipHorizontal, !flipVertical);
    134.                     }
    135.                 }
    136.                 else
    137.                 {
    138.                     // (Orientation is Landscape, rotate90Degree is false)
    139.                     FlipMat(frameMat, flipVertical, flipHorizontal);
    140.                 }
    141.                 return frameMat;
    142.             }
    143. #else
    144.         FlipMat(frameMat, flipVertical, flipHorizontal);
    145.         if (rotatedFrameMat != null)
    146.         {
    147.             Core.rotate(frameMat, rotatedFrameMat, Core.ROTATE_90_CLOCKWISE);
    148.             return rotatedFrameMat;
    149.         }
    150.         else
    151.         {
    152.             return frameMat;
    153.         }
    154. #endif
    155.     }
    156.  
    157. //======Second another class======
    158. public void OnWebCamTextureToMatHelperInitialized()
    159.     {
    160.         Debug.Log("OnWebCamTextureToMatHelperInitialized");
    161.         if (XRFrameToMatHelper == null)
    162.             XRFrameToMatHelper = gameObject.GetComponent<XRCameraFrameToMatHelper>();
    163.  
    164.         detector = ORB.create();
    165.         detector.setMaxFeatures(1000);
    166.         keypoints = new MatOfKeyPoint();
    167.  
    168.         Mat webCamTextureMat = XRFrameToMatHelper.GetMat();
    169.  
    170.         texture = new Texture2D(webCamTextureMat.width(), webCamTextureMat.height(), TextureFormat.RGBA32, false);
    171.  
    172.         rgbMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC3);
    173.         outputMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC3);
    174.         grayMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC1);
    175.  
    176.         //This is Quad transform
    177.         gameObject.transform.localScale = new Vector3(webCamTextureMat.width() / 200, webCamTextureMat.height() / 200, 1);
    178.         gameObject.GetComponent<Renderer>().material.mainTexture = texture;
    179.  
    180.         testRawImage.texture = texture;
    181.  
    182.         Debug.Log("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);
    183.  
    184.         float width = webCamTextureMat.width();
    185.         float height = webCamTextureMat.height();
    186.  
    187.         int patternWidth = (int)(Mathf.Min(webCamTextureMat.width(), webCamTextureMat.height()) * 0.8f);
    188.  
    189.         patternRect = new OpenCVForUnity.CoreModule.Rect(webCamTextureMat.width() / 2 - patternWidth / 2, webCamTextureMat.height() / 2 - patternWidth / 2, patternWidth, patternWidth);
    190.  
    191.         float imageSizeScale = 1.0f;
    192.         float widthScale = (float)Screen.width / width;
    193.         float heightScale = (float)Screen.height / height;
    194.    
    195.         //set cameraparam
    196.         int max_d = (int)Mathf.Max(width, height);
    197.         double fx = max_d;
    198.         double fy = max_d;
    199.         double cx = width / 2.0f;
    200.         double cy = height / 2.0f;
    201.         camMatrix = new Mat(3, 3, CvType.CV_64FC1);
    202.         camMatrix.put(0, 0, fx);
    203.         camMatrix.put(0, 1, 0);
    204.         camMatrix.put(0, 2, cx);
    205.         camMatrix.put(1, 0, 0);
    206.         camMatrix.put(1, 1, fy);
    207.         camMatrix.put(1, 2, cy);
    208.         camMatrix.put(2, 0, 0);
    209.         camMatrix.put(2, 1, 0);
    210.         camMatrix.put(2, 2, 1.0f);
    211.         Debug.Log("camMatrix " + camMatrix.dump());
    212.  
    213.  
    214.         distCoeffs = new MatOfDouble(0, 0, 0, 0);
    215.         Debug.Log("distCoeffs " + distCoeffs.dump());
    216.  
    217.  
    218.         //calibration camera
    219.         Size imageSize = new Size(width * imageSizeScale, height * imageSizeScale);
    220.         double apertureWidth = 0;
    221.         double apertureHeight = 0;
    222.         double[] fovx = new double[1];
    223.         double[] fovy = new double[1];
    224.         double[] focalLength = new double[1];
    225.         Point principalPoint = new Point(0, 0);
    226.         double[] aspectratio = new double[1];
    227.  
    228.         Calib3d.calibrationMatrixValues(camMatrix, imageSize, apertureWidth, apertureHeight, fovx, fovy, focalLength, principalPoint, aspectratio);
    229.  
    230.         Debug.Log("imageSize " + imageSize.ToString());
    231.         Debug.Log("apertureWidth " + apertureWidth);
    232.         Debug.Log("apertureHeight " + apertureHeight);
    233.         Debug.Log("fovx " + fovx[0]);
    234.         Debug.Log("fovy " + fovy[0]);
    235.         Debug.Log("focalLength " + focalLength[0]);
    236.         Debug.Log("principalPoint " + principalPoint.ToString());
    237.         Debug.Log("aspectratio " + aspectratio[0]);
    238.  
    239.  
    240.         //To convert the difference of the FOV value of the OpenCV and Unity.
    241.         double fovXScale = (2.0 * Mathf.Atan((float)(imageSize.width / (2.0 * fx)))) / (Mathf.Atan2((float)cx, (float)fx) + Mathf.Atan2((float)(imageSize.width - cx), (float)fx));
    242.         double fovYScale = (2.0 * Mathf.Atan((float)(imageSize.height / (2.0 * fy)))) / (Mathf.Atan2((float)cy, (float)fy) + Mathf.Atan2((float)(imageSize.height - cy), (float)fy));
    243.  
    244.         Debug.Log("fovXScale " + fovXScale);
    245.         Debug.Log("fovYScale " + fovYScale);
    246.  
    247.  
    248.         //Adjust Unity Camera FOV https://github.com/opencv/opencv/commit/8ed1945ccd52501f5ab22bdec6aa1f91f1e2cfd4
    249.         if (widthScale < heightScale)
    250.         {
    251.             ARCamera.fieldOfView = (float)(fovx[0] * fovXScale);
    252.         }
    253.         else
    254.         {
    255.             ARCamera.fieldOfView = (float)(fovy[0] * fovYScale);
    256.         }
    257.  
    258.  
    259.         invertYM = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, -1, 1));
    260.         Debug.Log("invertYM " + invertYM.ToString());
    261.  
    262.         invertZM = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
    263.         Debug.Log("invertZM " + invertZM.ToString());
    264.  
    265.  
    266.         //if WebCamera is frontFaceing,flip Mat.
    267.         XRFrameToMatHelper.flipHorizontal = XRFrameToMatHelper.GetWebCamDevice().isFrontFacing;
    268.         Debug.Log("OnInitialized XRSubsystemToOpenCV success");
    269.     }
    270.  
    271.     void OnEnable()
    272.     {
    273.         cameraManager.frameReceived += OnCameraFrameReceived;
    274.     }
    275.  
    276.     void OnDisable()
    277.     {
    278.         cameraManager.frameReceived -= OnCameraFrameReceived;
    279.     }
    280.  
    281.     public void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    282.     {
    283.         AfterReceiveUpdate();
    284.     }
    285.  
    286.     // Update is called once per frame
    287.     void AfterReceiveUpdate()
    288.     {
    289.         if (!XRFrameToMatHelper.IsPlaying() || !XRFrameToMatHelper.DidUpdateThisFrame())
    290.             return;
    291.  
    292.         Mat rgbaMat = XRFrameToMatHelper.GetMat();
    293.      
    294.         Imgproc.cvtColor(rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB);    
    295.         Imgproc.cvtColor(rgbaMat, outputMat, Imgproc.COLOR_RGBA2RGB);
    296.      
    297.         detector.detect(rgbMat, keypoints);
    298.  
    299.         Features2d.drawKeypoints(rgbMat, keypoints, rgbMat, Scalar.all(-1));      
    300.  
    301.         Imgproc.rectangle(rgbMat, patternRect.tl(), patternRect.br(), new Scalar(255, 0, 0, 255), 5);
    302.  
    303.         if (grayMat != null)
    304.             Imgproc.cvtColor(rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
    305.  
    306.         bool patternFound = patternDetector == null ? false : patternDetector.findPattern(rgbaMat, patternTrackingInfo);
    307.  
    308.         foundedText.SetActive(patternFound);
    309.  
    310.         if (patternFound)
    311.         {
    312.             Debug.Log("Pattern Founded!");
    313.             patternTrackingInfo.computePose(pattern, camMatrix, distCoeffs);
    314.  
    315.             //Marker to Camera Coordinate System Convert Matrix
    316.             Matrix4x4 transformationM = patternTrackingInfo.pose3d;
    317.  
    318.  
    319.             // right-handed coordinates system (OpenCV) to left-handed one (Unity)
    320.             // https://stackoverflow.com/questions/30234945/change-handedness-of-a-row-major-4x4-transformation-matrix
    321.             Matrix4x4 ARM = invertYM * transformationM * invertYM;
    322.  
    323.             // Apply Y-axis and Z-axis refletion matrix. (Adjust the posture of the AR object)
    324.             ARM = ARM * invertYM * invertZM;
    325.  
    326.             if (shouldMoveARCamera)
    327.             {
    328.                 ARM = ARGameObject.transform.localToWorldMatrix * ARM.inverse;
    329.  
    330.                 //Debug.Log("ARM " + ARM.ToString());
    331.  
    332.                 ARUtils.SetTransformFromMatrix(ARCamera.transform, ref ARM);
    333.             }
    334.             else
    335.             {
    336.                 ARM = ARCamera.transform.localToWorldMatrix * ARM;
    337.  
    338.                 //Debug.Log("ARM " + ARM.ToString());
    339.  
    340.                 ARUtils.SetTransformFromMatrix(ARGameObject.transform, ref ARM);
    341.             }
    342.  
    343.             ARGameObject.GetComponent<DelayableSetActive>().SetActive(true);
    344.         }
    345.         else
    346.         {
    347.             ARGameObject.GetComponent<DelayableSetActive>().SetActive(false, 0.5f);
    348.         }
    349.  
    350.         Utils.setDebugMode(true);
    351.         Utils.fastMatToTexture2D(rgbaMat, texture);
    352.         Utils.setDebugMode(false);
    353.     }
    354.  
     
    Last edited: Aug 18, 2022
  47. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Have you already tried the following examples?
    https://github.com/EnoxSoftware/ARFoundationWithOpenCVForUnityExample
     
  48. Philkrom

    Philkrom

    Joined:
    Dec 26, 2015
    Posts:
    87
    Hello
    I installed OpenCVForUnity 2.4.8, Playmaker, and PlayMaker Actions for OpenCVforUnity.
    It seems to work well in Editor mode, but if I build I get 777 errors in the console, of two kinds:
    The type or namespace name 'ObjectPropertyDrawer' could not be found
    The type or namespace name 'ObjectPropertyDrawerAttribute' could not be found

    Did I miss something?

    Best regards, Philippe
     
  49. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,469
    Could you tell me about the environment you tested?
    Unity version :
    OpenCVforUnity version :
    PlayMakerActions for OpenCVforUnity version :
    Build Platform :
     
  50. Philkrom

    Philkrom

    Joined:
    Dec 26, 2015
    Posts:
    87
    Thanks for your quick reply.

    Unity version : 2021.3.1f1
    OpenCVforUnity version : downloaded today (2.4.8)
    Playmaker : 1.9.5.f3 (last)
    PlayMakerActions for OpenCVforUnity version : downloaded today (1.1.8)
    Build Platform : Win x64 (but it seems that it doesn't depend on the platform)

    Hope this helps !
    Best regards, Phil