[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    It seems that a huge amount of garbage is generated by calling the Texture2D.GetRawTextureData () and Texture2D.GetColor32 () methods. It is difficult to avoid this.
     
  2. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    864
    Great, thanks.

    The settings are correct but the project refuses to export the DLL. I tried to create a new project, import OpenCVForUnity and export one of your examples – that worked. So there must be something funky going on in my project.
     
  3. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    iOS build error on the latest OpenCV: dyld: Library not loaded: opencv2, image not found
     
  4. Great-Peter

    Great-Peter

    Joined:
    Oct 9, 2015
    Posts:
    14
    Hi Enox!

    How can I reduce the APK size?

    Even with an empty scene (just importing OpenCV), the APK size is more than 130 MB.

    I only use the TensorFlow scene.
     
  5. NGC6543

    NGC6543

    Joined:
    Jun 3, 2015
    Posts:
    228
    I figured this out as follows:

    1. Open the Xcode project
    2. Go to General > Embedded Binaries
    3. Add `opencv2.framework` there

    Hope this helps.
     
  6. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Adding it to Embedded Binaries leads to an "unsealed contents" error:

    Frameworks/opencv2.framework: unsealed contents present in the root directory of an embedded framework

    Command /usr/bin/codesign failed with exit code 1
     
  7. NGC6543

    NGC6543

    Joined:
    Jun 3, 2015
    Posts:
    228
    Hmm, I'm not sure then. I got the exact same log, and adding it to Embedded Binaries solved the issue. My app built and ran fine.
     
  8. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Could you try with Unity 2018.1 or higher?
    With Unity 2018.1 or higher, opencv2.framework is automatically added to Embedded Binaries.
     
  9. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Some unnecessary files may have been left behind.
    Please delete the files you do not use from the "StreamingAssets" folder.

    Also, please refer to this post.
    https://forum.unity.com/threads/released-opencv-for-unity.277080/page-32#post-3425699
     
    Great-Peter likes this.
  10. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    I'm on Unity 2018.1.20f3.
    It looks like this is due to the Xcode 10 beta; switching the build settings back to Xcode 9.3.1 seems to solve the problem.
     
  11. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Archiving for the App Store leads to a lot of errors!
    ERROR ITMS-90087 Unsupported Architectures
    ITMS-90635 Invalid Mach-O Format
    ITMS-90209 Invalid Segment Alignment
    ITMS-90125 Binary is invalid
     
  12. Great-Peter

    Great-Peter

    Joined:
    Oct 9, 2015
    Posts:
    14

    Thank you!
    But your suggestion only reduces the APK size by about 4~5%.
    Is there any other way to reduce the APK size?
    For example, deleting all textures in the Resources folder?
     
    Last edited: Jun 30, 2018
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  14. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unity does not seem to support Xcode 10 yet.
     
  15. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Yes, deleting resources that are not related to the TensorFlow scene will not cause any problems.
     
  16. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    After some trial and error, I succeeded in automating it.
    Could you overwrite "Assets/OpenCVForUnity/Editor/iOS_BuildPostprocessor.cs" with this code and test it?
    This code will be included in the next version of OpenCVForUnity.

    Code (CSharp):
    #if (UNITY_5 || UNITY_5_3_OR_NEWER) && UNITY_IOS
    using UnityEngine;
    using UnityEditor;
    using UnityEditor.Callbacks;
    using UnityEditor.iOS.Xcode;

    using System.Diagnostics;

    #if UNITY_2017_2_OR_NEWER
    using UnityEditor.iOS.Xcode.Extensions;
    #endif
    using System;
    using System.Collections;
    using System.IO;

    namespace OpenCVForUnity
    {
        public class iOS_BuildPostprocessor : MonoBehaviour
        {

            [PostProcessBuild]
            public static void OnPostprocessBuild (BuildTarget buildTarget, string path)
            {
                if (buildTarget == BuildTarget.iOS) {
                    if (PlayerSettings.iOS.sdkVersion == iOSSdkVersion.DeviceSDK) {

                        // Strip the simulator (i386/x86_64) slices from the fat binaries for device builds.
                        RemoveSimulatorArchitectures (path + "/Frameworks/OpenCVForUnity/Plugins/iOS/", "opencv2.framework/opencv2");
                        RemoveSimulatorArchitectures (path + "/Libraries/OpenCVForUnity/Plugins/iOS/", "libopencvforunity.a");
                    }

    #if UNITY_5_0 || UNITY_5_1 || UNITY_5_2
                    string projPath = path + "/Unity-iPhone.xcodeproj/project.pbxproj";
    #else
                    string projPath = PBXProject.GetPBXProjectPath (path);
    #endif

                    PBXProject proj = new PBXProject ();
                    proj.ReadFromString (System.IO.File.ReadAllText (projPath));

    #if UNITY_5_0 || UNITY_5_1 || UNITY_5_2
                    string target = proj.TargetGuidByName ("Unity-iPhone");
    #else
                    string target = proj.TargetGuidByName (PBXProject.GetUnityTargetName ());
    #endif

    #if UNITY_2018_1_OR_NEWER
                    // Unity 2018.1 or higher embeds opencv2.framework automatically; nothing to do here.
    #elif UNITY_2017_2_OR_NEWER
                    // Add opencv2.framework to Embedded Binaries and fix the runpath search paths.
                    string frameworkPath = "Frameworks/OpenCVForUnity/Plugins/iOS/opencv2.framework";
                    string fileGuid = proj.FindFileGuidByProjectPath (frameworkPath);

                    proj.AddFileToBuild (target, fileGuid);
                    proj.AddFileToEmbedFrameworks (target, fileGuid);
                    foreach (var configName in proj.BuildConfigNames ()) {
                        var configGuid = proj.BuildConfigByName (target, configName);
                        proj.SetBuildPropertyForConfig (configGuid, "LD_RUNPATH_SEARCH_PATHS", "$(inherited) @executable_path/Frameworks");
                    }
    #else
                    UnityEngine.Debug.LogError ("If the version of Unity is less than 2017.2, you have to set opencv2.framework to Embedded Binaries manually.");
    #endif

                    File.WriteAllText (projPath, proj.WriteToString ());

    #if UNITY_5_5_OR_NEWER
                    if ((int)Convert.ToDecimal (PlayerSettings.iOS.targetOSVersionString) < 8) {
    #else
                    if ((int)PlayerSettings.iOS.targetOSVersion < (int)iOSTargetOSVersion.iOS_8_0) {
    #endif
                        UnityEngine.Debug.LogError ("Please set Target minimum iOS Version to 8.0 or higher.");
                    }

                }
            }

            /// <summary>
            /// Removes the simulator architectures.
            /// </summary>
            /// <param name="workingDirectory">Working directory.</param>
            /// <param name="filePath">File path.</param>
            private static void RemoveSimulatorArchitectures (string workingDirectory, string filePath)
            {
                Process process = new Process ();
                process.StartInfo.FileName = "/bin/bash";
                process.StartInfo.WorkingDirectory = workingDirectory;

                process.StartInfo.Arguments = "-c \" ";

                process.StartInfo.Arguments += "lipo -remove i386 " + filePath + " -o " + filePath + ";";
                process.StartInfo.Arguments += "lipo -remove x86_64 " + filePath + " -o " + filePath + ";";
                process.StartInfo.Arguments += "lipo -info " + filePath + ";";

                process.StartInfo.Arguments += " \"";

                process.StartInfo.UseShellExecute = false;
                process.StartInfo.RedirectStandardOutput = true;
                process.StartInfo.RedirectStandardError = true;

                process.Start ();

                string output = process.StandardOutput.ReadToEnd ();
                string error = process.StandardError.ReadToEnd ();

                process.WaitForExit ();
                process.Close ();

                if (string.IsNullOrEmpty (error)) {
                    UnityEngine.Debug.Log ("success : " + output);
                } else {
                    UnityEngine.Debug.LogWarning ("error : " + error);
                }
            }
        }
    }
    #endif
     
    ina likes this.
  18. sushanta1991

    sushanta1991

    Joined:
    Apr 3, 2011
    Posts:
    305
    Hi, I am interested in purchasing your plugin, and I just need one confirmation. I want to extract only the mouth/lips from a face; with your plugin I can detect a face, but can I extract the lips from the detected face?
     
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    https://github.com/EnoxSoftware/Ope...eDetectionExample/FaceDetectionExample.cs#L50
    By changing the cascade file in FaceDetectionExample to "haarcascade_mcs_mouth.xml", it is possible to detect the mouth (see the sketch below).
    https://github.com/opencv/opencv_contrib/tree/master/modules/face/data/cascades
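    For illustration, a minimal sketch of that cascade swap (this is not the exact FaceDetectionExample code; grayMat and rgbaMat are assumed to be the grayscale and RGBA frames from the example, and the xml file is assumed to have been copied into StreamingAssets):

    Code (CSharp):
    // Load the mouth cascade instead of the face cascade.
    CascadeClassifier mouthCascade = new CascadeClassifier (Utils.getFilePath ("haarcascade_mcs_mouth.xml"));

    // Run detection on the grayscale frame and draw the detected mouth regions.
    MatOfRect mouths = new MatOfRect ();
    mouthCascade.detectMultiScale (grayMat, mouths, 1.1, 2, 2, new Size (30, 30), new Size ());
    foreach (OpenCVForUnity.Rect rect in mouths.toArray ()) {
        Imgproc.rectangle (rgbaMat, rect.tl (), rect.br (), new Scalar (255, 0, 0, 255), 2);
    }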
     
  20. TomoyukiMukasa

    TomoyukiMukasa

    Joined:
    Jul 5, 2018
    Posts:
    4
    Hi EnoxSoftware,
    I'm trying to use SuperpixelSEEDS and found that the only constructor you provide is __fromPtr__(IntPtr addr), but I don't know how to use it (how should I define the IntPtr addr?).
    Could you explain it or show me an example?
    Thank you in advance.
     
  21. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    These sample codes from the OpenCV GitHub repository are helpful.
    https://github.com/opencv/opencv_contrib/blob/master/samples/python2/seeds.py
    https://github.com/opencv/opencv_contrib/blob/master/modules/ximgproc/samples/seeds.cpp
    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    namespace OpenCVForUnityExample
    {
        /// <summary>
        /// Superpixel SEEDS example.
        /// </summary>
        public class SuperpixelSEEDSExample : MonoBehaviour
        {
            // Use this for initialization
            void Start ()
            {
                int num_iterations = 4;
                int prior = 2;
                bool double_step = false;
                int num_superpixels = 400;
                int num_levels = 4;
                int num_histogram_bins = 5;

                Mat frame = Imgcodecs.imread (Utils.getFilePath ("lena.jpg"));

                Debug.Log ("frame ToString " + frame.ToString ());

                int width = frame.width ();
                int height = frame.height ();
                SuperpixelSEEDS seeds = Ximgproc.createSuperpixelSEEDS (width, height, frame.channels (), num_superpixels,
                                            num_levels, prior, num_histogram_bins, double_step);

                Mat converted = new Mat ();
                Imgproc.cvtColor (frame, converted, Imgproc.COLOR_BGR2HSV);

                seeds.iterate (converted, num_iterations);

                Debug.Log ("seeds.getNumberOfSuperpixels() " + seeds.getNumberOfSuperpixels ());

                /* retrieve the segmentation result */
                Mat labels = new Mat ();
                seeds.getLabels (labels);

                /* get the contours for displaying */
                Mat mask = new Mat ();
                seeds.getLabelContourMask (mask, false);
                frame.setTo (new Scalar (0, 0, 255), mask);

                Imgproc.cvtColor (frame, frame, Imgproc.COLOR_BGR2RGB);

                Texture2D texture = new Texture2D (frame.cols (), frame.rows (), TextureFormat.RGBA32, false);

                Utils.matToTexture2D (frame, texture);

                gameObject.GetComponent<Renderer> ().material.mainTexture = texture;
            }

            // Update is called once per frame
            void Update ()
            {

            }

            /// <summary>
            /// Raises the back button click event.
            /// </summary>
            public void OnBackButtonClick ()
            {
                #if UNITY_5_3 || UNITY_5_3_OR_NEWER
                SceneManager.LoadScene ("OpenCVForUnityExample");
                #else
                Application.LoadLevel ("OpenCVForUnityExample");
                #endif
            }
        }
    }
     
    TomoyukiMukasa likes this.
  22. TomoyukiMukasa

    TomoyukiMukasa

    Joined:
    Jul 5, 2018
    Posts:
    4
    Thank you so much for your quick reply!
    This really helps.
    I don't know why I couldn't find createSuperpixelSEEDS under Ximgproc...
     
  23. HeavyArmoredMan

    HeavyArmoredMan

    Joined:
    Apr 9, 2017
    Posts:
    8
    Hi Enox,

    I noticed that the new ArrayToTextureInRenderThread<byte>() feature introduced in OpenCV for Unity 2.2.9 may be a faster way to apply raw pixel data to a Texture than Unity's Apply() method.

    But this handy tool only accepts the RGBA32 or ARGB32 format. I have RGB24 data, and I know it could be converted to a Mat, then converted to RGBA32 format using cvtColor(), and finally converted to a Texture by calling MatToTextureInRender(). The only problem is that the conversion takes extra time and memory.

    Is there anything I can do to make ArrayToTextureInRenderThread<byte>() accept RGB24 data? Or will you add support for this format?

    Thank you in advance.
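    For reference, a minimal sketch of the conversion path described above (rgbData, width, height and texture are assumed names; this is not the render-thread API itself):

    Code (CSharp):
    // Copy the raw RGB24 bytes into a 3-channel Mat, expand to RGBA, then upload.
    Mat rgbMat = new Mat (height, width, CvType.CV_8UC3);
    rgbMat.put (0, 0, rgbData);
    Mat rgbaMat = new Mat (height, width, CvType.CV_8UC4);
    Imgproc.cvtColor (rgbMat, rgbaMat, Imgproc.COLOR_RGB2RGBA);
    Utils.matToTexture2D (rgbaMat, texture);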
     
  24. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Adding support for RGB24 format is technically difficult.
    For now, I do not plan to add support for RGB24 format.
     
  25. MwGfyzzo

    MwGfyzzo

    Joined:
    Jul 10, 2018
    Posts:
    7
    I am looking into the examples for HoloLens. They work more or less; at least I managed to run the HoloLensFaceDetectionOverlayExample and see red rectangles on the faces. However, I find it quite difficult to extract my own Haar cascade classifier workflow from it. In other OpenCV examples it looked much simpler. I get that I need to use threading, but it seems that cascade.detectMultiScale, as it is used in the DetectObject method, is not enough to get what I need. I saw that you also have a DetectInRegion method, but it's much more complicated, and I'm trying for a simpler approach. https://pastebin.com/M9E8u5jw https://i.imgur.com/h9E6s1T.png

    (Even if drawing the rectangles doesn't work, it should at least move the cubes around when it finds something. But it doesn't.)

    Do you have some advice for me to fix this?
     
  26. idurvesh

    idurvesh

    Joined:
    Jun 9, 2014
    Posts:
    495
    Is there any example of how to change hair colour?
     
  27. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    idurvesh likes this.
  28. aegis123321

    aegis123321

    Joined:
    Jul 5, 2015
    Posts:
    71
    Hi Enox,
    I use Dnn.readNetFromTensorflow() with my model, but it is not working, and the error is "error: (-2) Unknown layer type Mean".
    I tried to find what is causing the problem,
    and found that there is no "Mean" op in OpenCV 3.4.1,
    but it has been added in OpenCV 3.4.2.
    So, could you please tell me when you will upgrade the OpenCV version?
    I really need this upgrade.
    Thanks.
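    For context, this is roughly the call path that hits the error (a sketch; "frozen_graph.pb", inputMat and the blob parameters are assumed names and values, and the unsupported layer is reported when the frozen graph is imported):

    Code (CSharp):
    // Import the frozen TensorFlow graph and run a forward pass.
    Net net = Dnn.readNetFromTensorflow (Utils.getFilePath ("frozen_graph.pb"));
    Mat blob = Dnn.blobFromImage (inputMat, 1.0, new Size (224, 224), new Scalar (0, 0, 0), true, false);
    net.setInput (blob);
    Mat result = net.forward ();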
     
  29. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    I am planning to upgrade OpenCVForUnity to OpenCV 4.0 which will be released in early August.
    https://github.com/opencv/opencv/wiki/2018
     
  30. twfarro

    twfarro

    Joined:
    Nov 23, 2013
    Posts:
    23
    Hello

    I seem to be encountering an issue where creating a Mat using the nativeObject constructor causes either a memory leak or some other form of crash.

    I am attempting to make use of parallel processing with the new Unity Jobs system, and the only way to pass Mat data to a NativeArray is through the IntPtr data type, because it is blittable. So I was attempting to grab the native object address and then recreate the Mat in the worker thread from that address. This causes crashes.

    Even when doing that on the main Unity thread, and not using multithreading at all, this causes an editor crash after several seconds (not right away, which is why I think it is a memory leak).

    Any ideas why?

    I am on Unity 2018.1

    upload_2018-7-17_15-16-24.png
     
    Last edited: Jul 17, 2018
  31. BSummers

    BSummers

    Joined:
    Apr 13, 2018
    Posts:
    1
    Hello,

    I am new to OpenCV for Unity, and I am trying to use the MarkerLess example, but through a webcam with a printed symbol, exactly like you show in your YouTube video. However, your example code is set up to work only with the static image provided, with nothing related to webcams or video feeds. Is there any documentation you could point me towards that would help me get started with this?

    Thanks
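    For reference, grabbing live webcam frames into a Mat with this plugin looks roughly like the sketch below (class and variable names are assumed); each frame's rgbaMat could then be fed into the marker-less pipeline in place of the static image:

    Code (CSharp):
    using UnityEngine;
    using OpenCVForUnity;

    public class WebCamToMatSketch : MonoBehaviour
    {
        WebCamTexture webCamTexture;
        Mat rgbaMat;
        Color32[] colors;

        void Start ()
        {
            webCamTexture = new WebCamTexture ();
            webCamTexture.Play ();
        }

        void Update ()
        {
            if (!webCamTexture.didUpdateThisFrame)
                return;

            if (rgbaMat == null) {
                // Allocate once the camera has reported its real resolution.
                rgbaMat = new Mat (webCamTexture.height, webCamTexture.width, CvType.CV_8UC4);
                colors = new Color32[webCamTexture.width * webCamTexture.height];
            }

            // Copy the current camera frame into the Mat.
            Utils.webCamTextureToMat (webCamTexture, rgbaMat, colors);

            // ...feed rgbaMat into the marker-less detection/tracking code here...
        }
    }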
     
  32. QAQAQAQAQAQ

    QAQAQAQAQAQ

    Joined:
    Jun 6, 2018
    Posts:
    7
    Hi, I am trying to use OpenCV's Calib3d.solvePnP function to reconstruct a pose from 4 corresponding points. Every time I try to set the flag, it shows "Error CS0103: The name 'CV_P3P' does not exist in the current context". I have tried the other two flags (CV_EPNP, CV_ITERATIVE) and the same error occurs. I also tried using Calib3d.solveP3P and the flag still does not work.
    I am working in VS2017 with OpenCVforUnity 2.2.9 on Unity 2018.1.6f1.

    code:
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System;
    using System.Collections.Generic;
    using OpenCVForUnity;
    Code (CSharp):
    Point3 a = new Point3 (0, 0, 0);
    Point3 b = new Point3 (1, 0, 0);
    Point3 c = new Point3 (1, 1, 0);
    Point3 d = new Point3 (0, 1, 0);

    MatOfPoint3f objPoints = new MatOfPoint3f (a, b, c, d);
    MatOfPoint2f imgPoints = new MatOfPoint2f (
        cyanCenter,
        yellowCenter,
        redCenter,
        greenCenter);
    Mat camMatrix = new Mat (3, 3, CvType.CV_64FC1);
    MatOfDouble distCoeffs = new MatOfDouble (0, 0, 0, 0);
    CameraCalibration (rgbMat, ref camMatrix, ref distCoeffs);
    Mat rotation = new Mat ();
    Mat transformation = new Mat ();

    Calib3d.solvePnP (objPoints, imgPoints, camMatrix, distCoeffs, rotation, transformation, true, CV_P3P);
    I am quite new to this topic and I realize that this might be a really silly question, but can anyone point me in the right direction?
    Thanks
     
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unfortunately, I have not tried the combination of the new Unity Jobs system and OpenCVForUnity yet.


    Code (CSharp):
    void Start ()
    {
        Mat aaa = new Mat ();
        aaa.IsEnabledDispose = false;
        Debug.Log ("aaa.getNativeObjAddr () " + aaa.getNativeObjAddr ());

        Mat bbb = new Mat (aaa.getNativeObjAddr ());
    }
    It seems that a crash occurs when "aaa" is automatically disposed of when the method exits.
    Please set the IsEnabledDispose flag to false, or duplicate "aaa" with the clone () method to create "bbb".
     
    twfarro likes this.
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Could you tell me about your test environment?

    OpenCV for Unity version :
    MarkerLessARExample version :
    Unity version :
     
  35. twfarro

    twfarro

    Joined:
    Nov 23, 2013
    Posts:
    23
    To be clear, if I set the flag to false, I will need to manually dispose of the Mat myself later, correct?
     
  36. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Please add "Calib3d." to "CV_P3P".
    Code (CSharp):
    Calib3d.solvePnP (objPoints, imgPoints, camMatrix, distCoeffs, rotation, transformation, true, Calib3d.CV_P3P);
     
    QAQAQAQAQAQ likes this.
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Yes, you need to dispose of the Mat manually.
    Code (CSharp):
    aaa.IsEnabledDispose = true;
    aaa.Dispose ();
    I recommend cloning Mat with the clone () method.
    Code (CSharp):
    bbb = aaa.clone ();
     
  38. Dundur

    Dundur

    Joined:
    Feb 5, 2015
    Posts:
    2
    Hi EnoxSoftware,
    I'm trying to run the examples with an Intel RealSense depth camera D435, but I always get this error:

    Could not connect pins - RenderStream()
    UnityEngine.WebCamTexture: Play()

    Basically,
    webCamTexture.Play ();
    doesn't work.

    Have you tested with RealSense cameras?
    What could cause this error?
     
  39. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    I have never tried with RealSense cameras.

    Does this error occur only when OpenCVforUnity is used?
    Does such a simple example work without problems?
    https://docs.unity3d.com/ScriptReference/WebCamTexture.Play.html
    Code (CSharp):
    // Starts the default camera and assigns the texture to the current renderer
    using UnityEngine;
    using System.Collections;

    public class ExampleClass : MonoBehaviour
    {
        void Start ()
        {
            WebCamTexture webcamTexture = new WebCamTexture ();
            Renderer renderer = GetComponent<Renderer> ();
            renderer.material.mainTexture = webcamTexture;
            webcamTexture.Play ();
        }
    }
     
  40. QAQAQAQAQAQ

    QAQAQAQAQAQ

    Joined:
    Jun 6, 2018
    Posts:
    7
    Hi EnoxSoftware,
    Thanks for the last answer, it worked perfectly.

    I am currently using the solvePnP function on the 4 center points of the colored squares in a static picture. I measured the coordinates based on the picture, and it worked great on the original one. It does not work, however, if I rotate the picture in the plane or tilt it.

    Here is the code:
    Code (CSharp):
    //camMatrix was set where fx = fy = max(imgWidth, imgHeight), cx = imgWidth/2f, cy = imgHeight/2f
    //distCoeffs was set as MatOfDouble(0, 0, 0, 0)
    //All calibration steps were the same as in the MarkerBasedAR example
    Calib3d.solvePnP (objPoints, imgPoints, camMatrix, distCoeffs, rvec, tvec, true, Calib3d.CV_P3P);
    float eulerX = Convert.ToSingle (rvec.get (0, 0).GetValue (0)) * Mathf.Rad2Deg;
    float eulerY = Convert.ToSingle (rvec.get (1, 0).GetValue (0)) * Mathf.Rad2Deg;
    float eulerZ = Convert.ToSingle (rvec.get (2, 0).GetValue (0)) * Mathf.Rad2Deg;
    Quaternion rotation = Quaternion.Euler (eulerX, eulerY, eulerZ);
    Vector3 scale = new Vector3 (0.5f, 0.5f, 0.5f);
    Vector3 transform = new Vector3 (
        Convert.ToSingle (tvec.get (0, 0).GetValue (0)),
        Convert.ToSingle (tvec.get (1, 0).GetValue (0)),
        Convert.ToSingle (tvec.get (2, 0).GetValue (0)));
    OverlayModel (transform, rotation, scale);

    private void OverlayModel (Vector3 transform, Quaternion rotation, Vector3 scale)
    {
        Matrix4x4 transformationM = Matrix4x4.TRS (transform, rotation, scale);
        Matrix4x4 invertZM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1)); //flip on z, somehow just works
        Matrix4x4 invertYM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (1, -1, 1)); //flip on y

        if (ARGameObject != null)
        {
            Matrix4x4 ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;

            ARUtils.SetTransformFromMatrix (ARCamera.transform, ref ARM);
            ARGameObject.SetActive (true);
        }
    }

    I had Unity print out the rotation vector and the translation vector from the solvePnP function, as well as the rotation quaternion, in different situations.

    0.png 90 RHS.jpg 180.jpg 270.jpg tilted.jpg

    a) original (Working as intended)
    rotation quaternion: (0.7, 0.0, 0.0, 0.7)
    rotation matrix: (89.61823,-0.7631255,0.3581716)
    translate matrix: (0.02196548,0.007981017,5.764679)

    b) 90 degrees RHS (turning only halfway)
    rotation quaternion: (0.7, 0.1, 0.1, 0.7)
    rotation matrix: (69.64867,68.47253,69.79657)
    translate matrix: (-0.007981017,0.02196548,5.764679)

    c) 180 (flipped):
    rotation quaternion(0.8, -0.4, -0.4, 0.2)
    rotation matrix: (-1.07826,-126.6263,-127.4697)
    translate matrix: (-0.02196548,-0.007981017,5.764679)

    d) 270
    rotation quaternion: (0.6, -0.1, -0.1, 0.7)
    rotation matrix: (68.35384,-69.52794,-69.12452)
    translate matrix: (0.007981017,-0.02196548,5.764679)

    e) tilt
    rotation quaternion(0.5, 0.3, -0.2, 0.8)
    rotation matrix: (70.78563,25.46851,-17.48092)
    translate matrix: (-0.5683215,-0.17488,6.96864)

    Can you help me out with this? I have no idea why some orientations work and some do not. I have also tried the other methods besides P3P, and they don't seem to make much of a difference.

    Thanks
     
  41. Dundur

    Dundur

    Joined:
    Feb 5, 2015
    Posts:
    2
    It seems that RealSense does not work with WebCamTexture; a simple example returns the same error.
    What do you think: if I replace WebCamTexture with a Unity Texture, will your library work?
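    For what it's worth, if the RealSense SDK can hand you a readable Texture2D (or the raw color buffer), it can be converted to a Mat directly, without going through WebCamTexture. A minimal sketch (GetRealSenseColorTexture is a hypothetical helper standing in for however the SDK exposes the frame):

    Code (CSharp):
    // Copy a readable Texture2D into an RGBA Mat for processing with OpenCV.
    Texture2D colorTexture = GetRealSenseColorTexture ();   // hypothetical helper
    Mat rgbaMat = new Mat (colorTexture.height, colorTexture.width, CvType.CV_8UC4);
    Utils.texture2DToMat (colorTexture, rgbaMat);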
     
  42. MwGfyzzo

    MwGfyzzo

    Joined:
    Jul 10, 2018
    Posts:
    7
    Hi, I can't really find a good way of debugging OpenCV on the HoloLens.

    I want to check what the Mat I get from the HoloLens camera looks like, but it doesn't work like this:

    Code (CSharp):
    Texture2D tex = new Texture2D (m_grayMat.cols (), m_grayMat.rows ());
    Utils.fastMatToTexture2D (m_grayMat, tex);
    m_renderer.material.SetTexture ("_MainTexture", tex);
    The renderer is the Renderer of a quad in the scene (I also tried it with a cube), just so I can see the Mat. I'm still trying to build a simple object-detection setup on the HoloLens.
     
  43. twfarro

    twfarro

    Joined:
    Nov 23, 2013
    Posts:
    23
    So I decided not to try pointer manipulation with the job threads, too dangerous.

    However, I have discovered what I believe to be some sort of memory leak with the .clone() method by trying things in a different part of my project.

    Specifically, I wanted to take advantage of the MatToTextureInRenderThread() method for faster rendering of the camera to the screen. In order to do this, the input mat must be continuous, and all sources seem to say cloning a mat will always result in a continuous mat.

    At the same time, I was cropping my camera input so that time wasn't wasted processing pixels that would appear offscreen. In doing so, I was creating a submat, which is not continuous.

    Code (CSharp):
    public Mat GetAndCropMat (Size size) {

        Mat mat = GetMat ();
        /// calculations here ///
        mat = mat.submat (heightExcess, heightExcess + height, widthExcess, widthExcess + width);
        return mat;
    }
    This crop method works fine with the normal MatToTexture2D method.

    However, as soon as I make the following change, the app crashes after several minutes on Android, every single time it is launched:

    Code (CSharp):
    public Mat GetAndCropMat (Size size) {

        Mat mat = GetMat ();
        /// calculations here ///
        Mat clonedMat = new Mat ();
        clonedMat = mat.submat (heightExcess, heightExcess + height, widthExcess, widthExcess + width).clone ();
        return clonedMat;
    }
    In trying to debug the mysterious crashes, I rolled back all the way until the above modification was the only change to my project. I am sure this is the source of my issue.

    Is there something I misunderstand about how to use clone? Or is there another, better way to make a continuous mat?

    I found this bug after upgrading to 2.3.0, but I rolled back my project to a state that uses 2.2.4, and the problem persists after implementing the above change.

    Thanks.
     
    Last edited: Jul 24, 2018
  44. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unfortunately, I could not find a mistake in your code.
    Is the conversion process for rvec and tvec the same as in MarkerBasedARExample (see the sketch below)?
    https://github.com/EnoxSoftware/Mar...ple/MarkerBasedAR/MarkerDetector.cs#L507-L517
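    For comparison, a minimal sketch of that conversion (the same idea as the linked MarkerDetector code; rvec and tvec are assumed to come from solvePnP): rvec is an axis-angle vector, so it goes through Calib3d.Rodrigues to a 3x3 rotation matrix rather than being read as Euler angles.

    Code (CSharp):
    // Convert rvec/tvec from solvePnP into a Unity Matrix4x4.
    Mat rotMat = new Mat (3, 3, CvType.CV_64FC1);
    Calib3d.Rodrigues (rvec, rotMat);   // axis-angle vector -> 3x3 rotation matrix

    Matrix4x4 transformationM = new Matrix4x4 ();
    transformationM.SetRow (0, new Vector4 ((float)rotMat.get (0, 0)[0], (float)rotMat.get (0, 1)[0], (float)rotMat.get (0, 2)[0], (float)tvec.get (0, 0)[0]));
    transformationM.SetRow (1, new Vector4 ((float)rotMat.get (1, 0)[0], (float)rotMat.get (1, 1)[0], (float)rotMat.get (1, 2)[0], (float)tvec.get (1, 0)[0]));
    transformationM.SetRow (2, new Vector4 ((float)rotMat.get (2, 0)[0], (float)rotMat.get (2, 1)[0], (float)rotMat.get (2, 2)[0], (float)tvec.get (2, 0)[0]));
    transformationM.SetRow (3, new Vector4 (0, 0, 0, 1));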
     
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    The Utils.fastMatToTexture2D (Mat mat, Texture2D texture2D) method requires that the data length of the mat and the texture2D be the same.
    Please change Utils.fastMatToTexture2D () to Utils.matToTexture2D () and test it (see the sketch below).
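    Building on the code quoted above, the sketch below shows both options; converting the gray Mat to RGBA is an assumption about the easiest way to make the byte counts match. It may also be worth checking the shader property name: Unity's standard shader exposes its main texture as "_MainTex" (i.e. material.mainTexture), not "_MainTexture".

    Code (CSharp):
    // Option 1: make the Mat match the RGBA32 texture so fastMatToTexture2D can be used.
    Texture2D tex = new Texture2D (m_grayMat.cols (), m_grayMat.rows (), TextureFormat.RGBA32, false);
    Mat rgbaMat = new Mat ();
    Imgproc.cvtColor (m_grayMat, rgbaMat, Imgproc.COLOR_GRAY2RGBA);
    Utils.fastMatToTexture2D (rgbaMat, tex);

    // Option 2: let matToTexture2D handle the conversion instead.
    // Utils.matToTexture2D (m_grayMat, tex);

    m_renderer.material.mainTexture = tex;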
     
  47. QAQAQAQAQAQ

    QAQAQAQAQAQ

    Joined:
    Jun 6, 2018
    Posts:
    7
    Hi,
    It is working much better using the SetRow function and the rotation matrix. The rotation is basically correct, but the AR camera ends up located inside the render target.
    2.png
    I used to deal with this by scaling the transform matrix, but now I can't scale my hologram the same way as before.

    I have tried multiplying the transform matrix by a scaling 4x4 matrix.
    Code (CSharp):
    Matrix4x4 trs = Matrix4x4.TRS (new Vector3 (0, 0, 0), Quaternion.identity, new Vector3 (0.01f, 0.01f, 0.01f));
    transformationMatrix = transformationMatrix * trs;
    I have tried changing the ARM calculation from the first line to the second.
    Code (CSharp):
    Matrix4x4 ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;

    Matrix4x4 ARM = Matrix4x4.TRS (Vector3.zero, Quaternion.identity, new Vector3 (10, 10, 10)) * invertZM * transformationM.inverse * invertYM;
    All of the above changes result in translating the AR camera out in the x and z directions. How can I place it above the target?
    1.png


    Also, one thing I noticed in the marker-based AR example is that there are two situations (based on "shouldMoveARCamera") where the multiplication order of the ARM matrix is different.
    Code (CSharp):
    // shouldMoveARCamera == true
    Matrix4x4 ARM = ARGameObject.transform.localToWorldMatrix * invertZM * transformationM.inverse * invertYM;
    // else
    Matrix4x4 ARM = ARCamera.transform.localToWorldMatrix * invertYM * transformationM * invertZM;
    I just used the first one without really understanding what "shouldMoveARCamera" means. Does this have an effect on the project?
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Since your code calls the clone () method every frame, memory is allocated every frame. If you do not dispose of the Mats correctly, an OutOfMemoryError may occur.
    In the code below, the submat is copied into a pre-allocated cropMat using the copyTo () method.
    Code (CSharp):
    Mat cropMat;

    void Start () {
        // Pre-allocate the destination Mat once, with the size of the cropped region.
        cropMat = new Mat (height, width, CvType.CV_8UC4);
    }

    public Mat GetAndCropMat (Size size) {

        Mat mat = GetMat ();
        /// calculations here ///
        mat = mat.submat (heightExcess, heightExcess + height, widthExcess, widthExcess + width);

        mat.copyTo (cropMat);
        return cropMat;
    }
     
  49. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Please try this example of a very simple AR.
    https://www.dropbox.com/s/9zrgwk1w575wx4v/ARTest.unitypackage?dl=0
    ARTest.PNG

    If the shouldMoveARCamera flag is true, the position, rotation, and scale of the ARCamera are changed; if false, the position, rotation, and scale of the ARGameObject are changed.
     
  50. artpologabriel

    artpologabriel

    Joined:
    Aug 7, 2013
    Posts:
    6
    Hi there, I want to achieve something like this: detect an object such as a smiley, then use it as an AR anchor where my 3D model shows up and follows the detected smiley, and also detect the smiley's rotation, translation, and scale in 3D, like the marker-based AR. Please advise :) Many thanks for the help in advance.