[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. bugrahanbayat1

    bugrahanbayat1

    Joined:
    Apr 11, 2020
    Posts:
    6

    Attached Files:

  2. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    It seems that the "import settings" are not correct.
    MenuItem.png

    Please make sure it is set as follows.

    Set [PlayerSettings]-[Other Settings]-[Configuration]-[Camera Usage Description].
    unnamed1.png

    Set Target minimum iOS Version to 8.0 or higher.
    unnamed2.png

    If your Unity version is older than 2017.2, you have to set opencv2.framework as an Embedded Binary manually.
    unnamed3.png
    unnamed4.png
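
    If you prefer to automate the Camera Usage Description instead of setting it in the Player Settings Inspector, here is a minimal editor sketch using Unity's built-in UnityEditor.iOS.Xcode plist API (the class name and description string are hypothetical, not part of the asset):

    Code (CSharp):
    #if UNITY_EDITOR && UNITY_IOS
    using System.IO;
    using UnityEditor;
    using UnityEditor.Callbacks;
    using UnityEditor.iOS.Xcode;

    public static class CameraUsagePostProcessor
    {
        // Runs after the Xcode project is generated and writes the camera usage string.
        [PostProcessBuild]
        public static void OnPostprocessBuild(BuildTarget target, string pathToBuiltProject)
        {
            if (target != BuildTarget.iOS) return;

            string plistPath = Path.Combine(pathToBuiltProject, "Info.plist");
            PlistDocument plist = new PlistDocument();
            plist.ReadFromFile(plistPath);
            plist.root.SetString("NSCameraUsageDescription", "Required for webcam-based OpenCV examples.");
            plist.WriteToFile(plistPath);
        }
    }
    #endif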
     
  3. bugrahanbayat1

    bugrahanbayat1

    Joined:
    Apr 11, 2020
    Posts:
    6
    I think I solved the error; it works on my phone right now. However, I encountered a new error while trying to upload to the App Store.
     
  4. bugrahanbayat1

    bugrahanbayat1

    Joined:
    Apr 11, 2020
    Posts:
    6
    I'm getting this error right now.
     

    Attached Files:

  5. bugrahanbayat1

    bugrahanbayat1

    Joined:
    Apr 11, 2020
    Posts:
    6
    Hello, I got such an error during the build.
    upload_2020-4-14_13-2-18.png

    What should I do? thanks
     
  6. ClevertecXR

    ClevertecXR

    Joined:
    Apr 15, 2020
    Posts:
    1
    Hello!
    When building an application for a real iOS device, I get the following linker errors.
    I tried this in an empty project with:
    Unity 2020.1.05b, 2020.1.04b
    Xcode 11.4, 11.3.1

    Code (Text):
    Undefined symbols for architecture arm64:
      "_glTexSubImage2D", referenced from:
          RenderAPI_OpenGLCoreES::EndModifyTexture(void*, int, int, int, void*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glGetError", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glDisable", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glDepthMask", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glUseProgram", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glEnable", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glBufferSubData", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glEnableVertexAttribArray", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glVertexAttribPointer", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glCreateShader", referenced from:
          CreateShader(unsigned int, char const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glShaderSource", referenced from:
          CreateShader(unsigned int, char const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glLinkProgram", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glCreateProgram", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glCompileShader", referenced from:
          CreateShader(unsigned int, char const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glAttachShader", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glDrawArrays", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glBindBuffer", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glBufferData", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glGetProgramiv", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glGetUniformLocation", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glGenBuffers", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glDepthFunc", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glBindAttribLocation", referenced from:
          RenderAPI_OpenGLCoreES::CreateResources() in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glBindTexture", referenced from:
          RenderAPI_OpenGLCoreES::EndModifyTexture(void*, int, int, int, void*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
      "_glUniformMatrix4fv", referenced from:
          RenderAPI_OpenGLCoreES::DrawSimpleTriangles(float const*, int, void const*) in libopencvforunity.a(RenderAPI_OpenGLCoreES.o)
    ld: symbol(s) not found for architecture arm64

    Building for the simulator succeeds.
    What extra libraries or settings are needed for linking?
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Thank you very much for reporting.
    Please kindly wait for a moment. I am still checking.
     
  8. bugrahanbayat1

    bugrahanbayat1

    Joined:
    Apr 11, 2020
    Posts:
    6
    I built for iOS on macOS Catalina 10.15.3 to address the problem. The build for the architecture is successful, but I got a new error during the App Store upload.
     

    Attached Files:

  9. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Thank you very much for reporting.
    Unfortunately, OpenCVForUnity doesn't support Unity 2020.1 on the iOS platform at the moment. iOS platform builds will be supported in the next version of OpenCVForUnity.
     
  10. shi_no_gekai

    shi_no_gekai

    Joined:
    Mar 3, 2018
    Posts:
    14
    I'm using OpenPoseExample, but once I calculate the keypoints I experience a huge FPS drop and it takes a long time, as you can see in the profiler. Is there any way I can make Net.dnn_Net_forward execute on the GPU, or is there another multithreading solution I can use? The Net.dnn_Net_forward method is implemented in C++, so I can't change it.
     

    Attached Files:

  11. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    This article may be helpful for making Net.dnn_Net_forward execute on the GPU.
    https://qiita.com/utibenkei/items/3d9ce5c30ef666e14f44
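
    As a rough sketch (whether the target is actually used depends on the OpenCV build and device), switching the dnn backend and target before calling forward() looks like this:

    Code (CSharp):
    using OpenCVForUnity.DnnModule;

    // Request the OpenCL target; OpenCV falls back to CPU if it is unavailable.
    net.setPreferableBackend(Dnn.DNN_BACKEND_OPENCV);
    net.setPreferableTarget(Dnn.DNN_TARGET_OPENCL);
    Mat output = net.forward();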
     
  12. marsaal

    marsaal

    Joined:
    Jan 18, 2018
    Posts:
    9
    Hello, I'm trying to deactivate a marker in the WebCamTextureMarkerBasedARExample.
    There is the Marker Settings list; once a 3D object has been placed correctly, I want to disable that marker and ignore it from that time on.

    Is that possible?

    question.png

    Hope to hear from you.
    Kind regards,
    Marcel
     
  13. Cyprien_ETH

    Cyprien_ETH

    Joined:
    May 16, 2019
    Posts:
    1
    Hello guys,

    I would like to make tiny Yolo v3 run on Hololens 1.
    To start, I decided to test the example of yolov3 from OpenCVforUnity 2.3.8.
    But I could not make it run, even though I followed the YouTube tutorial to install the package and added the weights and files related to YOLOv3 to the dnn folder.
    I don't get any error message, only a warning coming from the object Quad.
    Apparently the issue comes from the script DnnObjectDetectionWebCamTextureExample.cs.
    It can't be loaded properly because OpenCVForUnity.DnnModule is not found.

    Then I had a look at the support page of the software and noticed that the dnn module is not supported on Windows 10 UWP. Is that the cause of my problem? And if so, is there a possible fix?

    Yours sincerely
    Cyprien
     
  14. shi_no_gekai

    shi_no_gekai

    Joined:
    Mar 3, 2018
    Posts:
    14
    Thank you for the link! I will look into it more.
    Is there any multi-threaded solution or option I can turn on for that method?
     
  15. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    At this line, WebCamTextureMarkerBasedARExample determines whether a marker registered in the MarkerSettings array has been detected.
    https://github.com/EnoxSoftware/Mar...ple/WebCamTextureMarkerBasedARExample.cs#L265
    After the markers are detected for the first time, remove the markers you want to ignore from the MarkerSettings array.
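
    As a hedged sketch (the variable names below are hypothetical; the example exposes a MarkerSettings array), you could copy the array into a List so entries can be removed once their object is placed:

    Code (CSharp):
    using System.Collections.Generic;

    // Hypothetical sketch: keep detectable markers in a List so entries can be removed.
    List<MarkerSettings> activeMarkers = new List<MarkerSettings>(markerSettingsArray);

    // ... after the 3D object for `settings` has been placed correctly:
    activeMarkers.Remove(settings);  // this marker is ignored from now on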
     
  16. marsaal

    marsaal

    Joined:
    Jan 18, 2018
    Posts:
    9
    Thanks, I thought so, but I can't get it to work.
    Should I use RemoveAt? At which point? It doesn't work on MarkerSettings, nor on markerSettings.

    Thanks!
     
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unfortunately, the OpenCV Dnn module is not supported by the UWP platform.
    https://github.com/opencv/opencv/issues/9177
    https://github.com/opencv/opencv/issues/14587
     
  18. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    Hello! Thanks for a great asset!
    Is the bgsegm contrib module supported? I might need it really soon for my project
     
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  20. ghasedak3411

    ghasedak3411

    Joined:
    Aug 25, 2015
    Posts:
    23
  21. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  22. ghasedak3411

    ghasedak3411

    Joined:
    Aug 25, 2015
    Posts:
    23

    This is a Caffe model, but I get an error. Can you run it with OpenCVForUnity? Please help me, thank you.
     
  23. chatsopon_acc

    chatsopon_acc

    Joined:
    Apr 16, 2019
    Posts:
    4
    How do I use at in this package? I can't find an at function for extracting the point at (x, y).

    C++ Code Example
    Point2f pt = flowImage.at<Point2f>(y, x);


    I'm not sure. Is the code below the equivalent way to get a Point2f?

    C# Code Example
    double[] d = flowImage.get(y,x);
    Point pt = new Point(d);
     
    Last edited: Apr 28, 2020
  24. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    name='LightWeight Human Pose Estimation (ONNX)', # https://github.com/Daniil-Osokin/lightweight-human-pose-estimation.pytorch
    url='https://drive.google.com/uc?export=dowload&id=1--Ij_gIzCeNA488u5TA4FqWMMdxBqOji',
    sha='5960f7aef233d75f8f4020be1fd911b2d93fbffc',
    filename='onnx/models/lightweight_pose_estimation_201912.onnx'),


    Hello, can you run this model in OpenCVForUnity?

    This is part of the list of models that you said OpenCVForUnity supports, from the following link:



    https://github.com/opencv/opencv_extra/blob/master/testdata/dnn/download_models.py
     
  25. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    I'd be also highly interested in the lightweight pose estimation. Please post any updates on that
     
  26. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    Hello! Unfortunately, the bgsegm module wasn't enough for my purposes (but it is very nice to have anyway). Now I'd like to know the best way to call functions of a 3rd-party OpenCV-based library from inside OpenCV for Unity. Should I just get the internal Mat native handle (IntPtr) and P/Invoke on that? The library I'm talking about is quite useful (https://github.com/andrewssobral/bgslibrary); do you think it's possible to wrap it inside OpenCV for Unity in the future? I've also found this: https://github.com/mono/CppSharp
    What is the best way in your opinion?
     
  27. JOKER_LD

    JOKER_LD

    Joined:
    Feb 7, 2018
    Posts:
    6
    Hi, I'm new to OpenCV for Unity. I have written some code that uses KalmanFilter.
    But I get an
    "OpenCVForUnity.CoreModule.CvException: Native object address is NULL" error when
    I call the updateData(double value) function.
    At this line,
    Code (CSharp):
    1. filter.predict()
    it throws the exception.
    I have already initialized the KalmanFilter.

    May you help me?

    My code is below.
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using OpenCVForUnity.CoreModule;
    4. using OpenCVForUnity.VideoModule;
    5. using System.Collections.Generic;
    6. namespace CVVTuber
    7. {
    8.     public class Stabilizer
    9.     {
    10.         public int state_num;
    11.         public int measure_num;
    12.         public KalmanFilter filter;
    13.         public Mat state;
    14.         public Mat prediction;
    15.         public Mat measurement;
    16.         public Stabilizer(int state_num = 4, int measure_num = 2, double cov_process = 0.0001, double cov_measure = 0.1)
    17.         {
    18.             Debug.Assert(state_num == 4 || state_num == 2);
    19.             //Store the parameters.
    20.             this.state_num = state_num;
    21.             this.measure_num = measure_num;
    22.             // The filter itself.
    23.             filter = new KalmanFilter(state_num, measure_num, 0);
    24.             // Store the state.
    25.             //self.state = np.zeros((state_num, 1), dtype=np.float32)
    26.             state = new Mat(state_num, 1, CvType.CV_64FC1);
    27.             // Store the measurement result.
    28.             measurement = new Mat(measure_num, 1, CvType.CV_64FC1);
    29.             // Store the prediction.
    30.             prediction = Mat.zeros(state_num, 1, CvType.CV_64FC1);
    31.             // Kalman parameters setup for scalar.
    32.             if (measure_num == 1)
    33.             {
    34.                 Mat transM = new Mat(2, 2, CvType.CV_64FC1);
    35.                 transM.put(0, 0, new double[] {1, 1, 0, 1});
    36.                 filter.set_transitionMatrix(transM);
    37.                 //= np.array([[1, 1], [0, 1]], np.double32)
    38.                 Mat measureM = new Mat(1, 2, CvType.CV_64FC1);
    39.                 measureM.put(0, 0, new double[] {1, 1});
    40.                 filter.set_measurementMatrix(measureM);
    41.                 //= np.array([[1, 1]], np.double32)
    42.                 Mat processNoiseCovM = new Mat(2, 2, CvType.CV_64FC1);
    43.                 processNoiseCovM.put(0, 0, new double[] { 1, 0, 0, 1 });
    44.                 filter.set_processNoiseCov(processNoiseCovM * cov_process);
    45.                 //= np.array([[1, 0], [0, 1]], np.double32) * cov_process
    46.                 Mat measurementNoiseCovM = new Mat(1, 1, CvType.CV_64FC1);
    47.                 measurementNoiseCovM.put(0, 0, new double[] { 1 });
    48.                 filter.set_measurementNoiseCov(measurementNoiseCovM * cov_measure);
    49.                 //np.array( [[1]], np.double32) *cov_measure
    50.             }
    51.             // Kalman parameters setup for point.
    52.             if (measure_num == 2)
    53.             {
    54.                 Mat transM = new Mat(4, 4, CvType.CV_64FC1);
    55.                 transM.put(0, 0, new double[] {1, 0, 1, 0,
    56.                                                0, 1, 0, 1,
    57.                                                0, 0, 1, 0,
    58.                                                0, 0, 0, 1});
    59.  
    60.                 filter.set_transitionMatrix(transM);
    61.                 //= np.array([[1, 1], [0, 1]], np.double32)
    62.                 Mat measureM = new Mat(2, 4, CvType.CV_64FC1);
    63.                 measureM.put(0, 0, new double[] {1, 0, 0, 0,
    64.                                                  0, 1, 0, 0});
    65.                 filter.set_measurementMatrix(measureM);
    66.                 //= np.array([[1, 1]], np.double32)
    67.                 Mat processNoiseCovM = new Mat(4, 4, CvType.CV_64FC1);
    68.                 processNoiseCovM.put(0, 0, new double[] {1, 0, 0, 0,
    69.                                                          0, 1, 0, 0,
    70.                                                          0, 0, 1, 0,
    71.                                                          0, 0, 0, 1});
    72.                 filter.set_processNoiseCov(processNoiseCovM * cov_process);
    73.                 //= np.array([[1, 0], [0, 1]], np.double32) * cov_process
    74.                 Mat measurementNoiseCovM = new Mat(2, 2, CvType.CV_64FC1);
    75.                 measurementNoiseCovM.put(0, 0, new double[] {1, 0,
    76.                                                              0, 1});
    77.                 filter.set_measurementNoiseCov(measurementNoiseCovM * cov_measure);
    78.             }
    79.         }
    80.         public void updateData(double value)
    81.         {
    82.             prediction = filter.predict();
    83.             measurement = new Mat(measure_num, 1, CvType.CV_64FC1);
    84.             if (measure_num == 1)
    85.             {
    86.                 double[] putData = new double[1];
    87.                 putData[0] = value;
    88.                 measurement.put(0, 0, putData);
    89.             }
    90.             filter.correct(measurement);
    91.             state = filter.get_statePost();
    92.         }
    93.         public void updateData(Mat _measurement)
    94.         {
    95.             prediction = filter.predict();
    96.             measurement = new Mat(measure_num, 1, CvType.CV_64FC1);
    97.             if (measure_num == 1)
    98.             {
    99.                 long size = _measurement.total() * _measurement.channels();
    100.                 float[] data = new float[size];
    101.                 _measurement.get(0, 0, data);
    102.                 float[] putData = new float[_measurement.cols() * _measurement.channels()];
    103.                 for(int i = 0; i < putData.Length; i++)
    104.                 {
    105.                     putData[i] = data[i];
    106.                 }
    107.                 measurement.put(0, 0, putData);
    108.             }
    109.             else
    110.             {
    111.                 long size = _measurement.total() * _measurement.channels();
    112.                 float[] data = new float[size];
    113.                 _measurement.get(0, 0, data);
    114.                 float[] putData = new float[_measurement.cols() * 2 * _measurement.channels()];
    115.                 for (int i = 0; i < putData.Length; i++)
    116.                 {
    117.                     putData[i] = data[i];
    118.                 }
    119.                 measurement.put(0, 0, putData);
    120.             }
    121.             filter.correct(measurement);
    122.             state = filter.get_statePost();
    123.         }
    124.     }
    125. }
     

    Attached Files:

  28. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    This code is correct.
    Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java.
    https://docs.opencv.org/4.2.0/javadoc/index.html
    https://enoxsoftware.github.io/OpenCVForUnity/3.0.0/doc/html/annotated.html
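
    For reference, the Java-style equivalent of the C++ flowImage.at<Point2f>(y, x) reads the per-channel values at (row, col), which is exactly what the code in the question does:

    Code (CSharp):
    // get() returns one double per channel at (row, col); for a CV_32FC2 optical-flow
    // Mat that is the (x, y) displacement, which Point's double[] constructor accepts.
    double[] d = flowImage.get(y, x);
    Point pt = new Point(d);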
     
  29. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    OpenCVForUnity 2.3.9 will include an example of using the Dnn module to get landmarks for faces. OpenCVForUnity 2.3.9 will be released in the next few days.
    opencvforunity2.3.9_feature.png
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    You can run LightWeight Human Pose Estimation (ONNX) by following the steps below.

    1. Import and setup OpenCVForUnity2.3.8.
    2. Download and Import LightweightHumanPoseEstimationExample238.unitypackage.
    3. Download lightweight_pose_estimation_201912.onnx ( https://drive.google.com/uc?export=dowload&id=1--Ij_gIzCeNA488u5TA4FqWMMdxBqOji ).
    4. Move lightweight_pose_estimation_201912.onnx to the Assets/StreamingAssets/dnn/ folder.
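
    After these steps, loading the model follows the usual Dnn pattern; a minimal sketch using OpenCVForUnity's Utils and Dnn APIs:

    Code (CSharp):
    using OpenCVForUnity.DnnModule;
    using OpenCVForUnity.UnityUtils;

    // Resolve the file copied into Assets/StreamingAssets/dnn/ and load the ONNX net.
    string modelPath = Utils.getFilePath("dnn/lightweight_pose_estimation_201912.onnx");
    Net net = Dnn.readNetFromONNX(modelPath);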
     

    Attached Files:

  31. mahna3411

    mahna3411

    Joined:
    Dec 11, 2018
    Posts:
    39
    Thanks.
    I want to make this run in real time, at about ten frames per second.
    Do you know of a lighter model for OpenPose?
     
  32. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Perhaps you can't share OpenCVforUnity's Mat pointer with another dll. I think you need to get the Mat data into a byte sequence once to use it in another dll.
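
    A minimal sketch of that copy step (assuming a continuous CV_8U Mat; the dll call itself is hypothetical):

    Code (CSharp):
    // Copy the Mat's pixels into managed memory, hand the byte[] to the other
    // dll (e.g. via P/Invoke), then write any modified result back into the Mat.
    byte[] buffer = new byte[mat.total() * mat.elemSize()];
    mat.get(0, 0, buffer);
    // ... call into the external dll with `buffer` ...
    mat.put(0, 0, buffer);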
     
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Have you seen our KalmanFilterExample?
    https://github.com/EnoxSoftware/Ope...eo/KalmanFilterExample/KalmanFilterExample.cs
    It seems that your code is trying to achieve almost the same functionality as this.

    Changing the type of each Mat in your code from CV_64F(double) to CV_32F(float) eliminates the error.
    Code (CSharp):
    1. using UnityEngine;
    2. using System.Collections;
    3. using OpenCVForUnity.CoreModule;
    4. using OpenCVForUnity.VideoModule;
    5. using System.Collections.Generic;
    6. namespace CVVTuber
    7. {
    8.     public class Stabilizer
    9.     {
    10.         public int state_num;
    11.         public int measure_num;
    12.         public KalmanFilter filter;
    13.         public Mat state;
    14.         public Mat prediction;
    15.         public Mat measurement;
    16.         public Stabilizer(int state_num = 4, int measure_num = 2, double cov_process = 0.0001, double cov_measure = 0.1)
    17.         {
    18.             Debug.Assert(state_num == 4 || state_num == 2);
    19.             //Store the parameters.
    20.             this.state_num = state_num;
    21.             this.measure_num = measure_num;
    22.             // The filter itself.
    23.             filter = new KalmanFilter(state_num, measure_num, 0);
    24.             // Store the state.
    25.             //self.state = np.zeros((state_num, 1), dtype=np.float32)
    26.             state = new Mat(state_num, 1, CvType.CV_32FC1);
    27.             // Store the measurement result.
    28.             measurement = new Mat(measure_num, 1, CvType.CV_32FC1);
    29.             // Store the prediction.
    30.             prediction = Mat.zeros(state_num, 1, CvType.CV_32FC1);
    31.             // Kalman parameters setup for scalar.
    32.             if (measure_num == 1)
    33.             {
    34.                 Mat transM = new Mat(2, 2, CvType.CV_32FC1);
    35.                 transM.put(0, 0, new double[] { 1, 1, 0, 1 });
    36.                 filter.set_transitionMatrix(transM);
    37.                 //= np.array([[1, 1], [0, 1]], np.double32)
    38.                 Mat measureM = new Mat(1, 2, CvType.CV_32FC1);
    39.                 measureM.put(0, 0, new double[] { 1, 1 });
    40.                 filter.set_measurementMatrix(measureM);
    41.                 //= np.array([[1, 1]], np.double32)
    42.                 Mat processNoiseCovM = new Mat(2, 2, CvType.CV_32FC1);
    43.                 processNoiseCovM.put(0, 0, new double[] { 1, 0, 0, 1 });
    44.                 filter.set_processNoiseCov(processNoiseCovM * cov_process);
    45.                 //= np.array([[1, 0], [0, 1]], np.double32) * cov_process
    46.                 Mat measurementNoiseCovM = new Mat(1, 1, CvType.CV_32FC1);
    47.                 measurementNoiseCovM.put(0, 0, new double[] { 1 });
    48.                 filter.set_measurementNoiseCov(measurementNoiseCovM * cov_measure);
    49.                 //np.array( [[1]], np.double32) *cov_measure
    50.             }
    51.             // Kalman parameters setup for point.
    52.             if (measure_num == 2)
    53.             {
    54.                 Mat transM = new Mat(4, 4, CvType.CV_32FC1);
    55.                 transM.put(0, 0, new double[] {1, 0, 1, 0,
    56.                                                0, 1, 0, 1,
    57.                                                0, 0, 1, 0,
    58.                                                0, 0, 0, 1});
    59.  
    60.                 filter.set_transitionMatrix(transM);
    61.                 //= np.array([[1, 1], [0, 1]], np.double32)
    62.                 Mat measureM = new Mat(2, 4, CvType.CV_32FC1);
    63.                 measureM.put(0, 0, new double[] {1, 0, 0, 0,
    64.                                                  0, 1, 0, 0});
    65.                 filter.set_measurementMatrix(measureM);
    66.                 //= np.array([[1, 1]], np.double32)
    67.                 Mat processNoiseCovM = new Mat(4, 4, CvType.CV_32FC1);
    68.                 processNoiseCovM.put(0, 0, new double[] {1, 0, 0, 0,
    69.                                                          0, 1, 0, 0,
    70.                                                          0, 0, 1, 0,
    71.                                                          0, 0, 0, 1});
    72.                 filter.set_processNoiseCov(processNoiseCovM * cov_process);
    73.                 //= np.array([[1, 0], [0, 1]], np.double32) * cov_process
    74.                 Mat measurementNoiseCovM = new Mat(2, 2, CvType.CV_32FC1);
    75.                 measurementNoiseCovM.put(0, 0, new double[] {1, 0,
    76.                                                              0, 1});
    77.                 filter.set_measurementNoiseCov(measurementNoiseCovM * cov_measure);
    78.             }
    79.         }
    80.         public void updateData(double value)
    81.         {
    82.             prediction = filter.predict();
    83.             measurement = new Mat(measure_num, 1, CvType.CV_32FC1);
    84.             if (measure_num == 1)
    85.             {
    86.                 double[] putData = new double[1];
    87.                 putData[0] = value;
    88.                 measurement.put(0, 0, putData);
    89.             }
    90.             filter.correct(measurement);
    91.             state = filter.get_statePost();
    92.         }
    93.         public void updateData(Mat _measurement)
    94.         {
    95.             prediction = filter.predict();
    96.             measurement = new Mat(measure_num, 1, CvType.CV_32FC1);
    97.             if (measure_num == 1)
    98.             {
    99.                 long size = _measurement.total() * _measurement.channels();
    100.                 float[] data = new float[size];
    101.                 _measurement.get(0, 0, data);
    102.                 float[] putData = new float[_measurement.cols() * _measurement.channels()];
    103.                 for(int i = 0; i < putData.Length; i++)
    104.                 {
    105.                     putData[i] = data[i];
    106.                 }
    107.                 measurement.put(0, 0, putData);
    108.             }
    109.             else
    110.             {
    111.                 long size = _measurement.total() * _measurement.channels();
    112.                 float[] data = new float[size];
    113.                 _measurement.get(0, 0, data);
    114.                 float[] putData = new float[_measurement.cols() * 2 * _measurement.channels()];
    115.                 for (int i = 0; i < putData.Length; i++)
    116.                 {
    117.                     putData[i] = data[i];
    118.                 }
    119.                 measurement.put(0, 0, putData);
    120.             }
    121.             filter.correct(measurement);
    122.             state = filter.get_statePost();
    123.         }
    124.     }
    125. }
    The following code is also a good example of using the Kalman filter.
    https://github.com/EnoxSoftware/Dli...seFilterExample/NoiseFilter/KFPointsFilter.cs
     
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  35. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    Thanks for the tip! But what makes you think that the Mat IntPtr can't be used in external OpenCV C++ code?
     
  36. Calvin2274

    Calvin2274

    Joined:
    Sep 11, 2014
    Posts:
    17
    Hi, I purchased OpenCV for Unity and would like to implement a virtual-background photo booth, but without using a green screen, so I think I need the background-subtraction features. I am trying to start with BackgroundSubtractorExample. However, the result is not what I expected and is kind of weird: when I am not moving, I am slowly masked out completely; when I move a bit, I can see a small noisy mask. I wonder what I am missing, and how I could make the background subtraction work even while I am static.
     
  37. wxxhrt

    wxxhrt

    Joined:
    Mar 18, 2014
    Posts:
    163
    Hi,

    I'm running into an issue when building for standalone Windows 10.

    My exe loads up then the webcam light turns on, then it turns off and nothing is detected through the webcam.

    The same project works as a Mac build.

    I've followed the readme guide to set everything up.

    I'm using Unity 2019.3.9 and the latest Asset Store versions of DLib and OpenCV.

    I've looked at the Windows 10 camera usage permission page in the control panel, and my .exe is shown there. Do Windows builds ask for permission in a dialog box I'm not seeing?

    Thanks

    EDIT:
    Running a development build shows that :

    "Could not connect pins - RenderStream()"
    The error is happening in WebCamTextureToMatHelper.cs at line 606.
     
    Last edited: May 4, 2020
  38. chatsopon_acc

    chatsopon_acc

    Joined:
    Apr 16, 2019
    Posts:
    4
    I have a problem with video on iOS

    My Work Description
    1. Record the WebCamTexture to a video file.
    2. Open the video file for processing.
    3. Save it as a new video file.

    It works perfectly on macOS but not on iOS.

    Problem on iOS #1
    When I write with the codec 'mjpg', I can't open the file via OpenCV:

    capture.isOpened() is false

    Solution: I use 'avc1' instead.

    Problem on iOS #2
    When I use the codec 'avc1' instead of 'mjpg', the file can be written and capture.isOpened() is true,
    but the problems are:

    - Checking the current frame:
    capture.get(Videoio.CAP_PROP_POS_FRAMES) always returns 1.
    Solution: count nowFrame++ after calling capture.read(mat).

    - Checking whether the video has finished:
    capture.get(Videoio.CAP_PROP_FRAME_COUNT) always returns 1.
    Workaround: I have to prepare the file so that I already know the number of frames in the video. (!!!!!!!!)

    - Checking other information:
    capture.get(Videoio.CAP_PROP_FRAME_WIDTH) always returns 1.
    capture.get(Videoio.CAP_PROP_FRAME_HEIGHT) always returns 1.
    capture.get(Videoio.CAP_PROP_FPS) always returns 1.

    This is not correct!

    I can't check by this code too

    while(capture.read(mat))


    this will always do infinity

    The same file I transfer it from iPhone to my Mac I open it on my Mac the APIs are correct
    capture.get(Videoio.CAP_PROP_POS_FRAMES) can grow after call capture.read(mat)
    capture.get(Videoio.CAP_PROP_FRAME_COUNT) tells the frameCap
    while(capture.read(mat)) can use correctly, don't do it infinity.
    capture.get(Videoio.CAP_PROP_FPS) tells the FPS

    There are problems on capture.get() on iOS I guess

    Lucky me this work It needs to record video before process it. So I can collect some important data such as frameCap. In the future, it may not have to do the first step (record the video) :( that is really big problem.

    so there are other way to fix the problem #2? or It needs to wait for next update?
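    The manual-counting workaround described above, tracking the frame position outside the capture because capture.get() is unreliable on iOS, could be sketched roughly like this (names such as knownFrameCount are assumptions; the frame count has to be recorded separately at write time):

    ```csharp
    using OpenCVForUnity.CoreModule;
    using OpenCVForUnity.VideoioModule;

    // Read a video whose Videoio.CAP_PROP_* values are unreliable (e.g. 'avc1'
    // files on iOS) by counting frames manually instead of querying the capture.
    public static void ProcessVideo(string videoPath, int knownFrameCount)
    {
        VideoCapture capture = new VideoCapture(videoPath);
        Mat frame = new Mat();
        int currentFrame = 0; // replaces capture.get(Videoio.CAP_PROP_POS_FRAMES)

        // Bound the loop by the externally recorded frame count, since
        // while (capture.read(frame)) alone may never terminate on iOS.
        while (currentFrame < knownFrameCount && capture.read(frame))
        {
            currentFrame++;
            // ... process 'frame' here ...
        }
        capture.release();
    }
    ```
    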
     
    Last edited: May 5, 2020
  39. IvanTesseract

    IvanTesseract

    Joined:
    Jul 1, 2013
    Posts:
    109
    Hi,

    I am using the trial to test the DNN hand pose detection. When I run it in the editor, it gives me empty strings for the paths to the model files: the method OpenCVForUnity.UnityUtils.Utils.getFilePath(string) returns an empty string.

    I moved the needed files to StreamingAssets/dnn, except for hand.jpg, which I can't find anywhere.

    I changed the code to construct the path as Application.streamingAssetsPath + "/" + CAFFEMODEL_FILENAME, but if I do that, Unity crashes.

    Please help.

    ---------------------------------------------------------------------------------------------------------------------------------

    NEVER MIND SOLVED!
     
    Last edited: May 5, 2020
  40. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Have you already tried GreenScreenExample?
    https://github.com/EnoxSoftware/Ope...nced/GreenScreenExample/GreenScreenExample.cs
     
  41. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    This problem seems to be a bug in the Unity Editor.
    https://stackoverflow.com/questions/38758702/could-not-connect-pins-renderstream-unity3d-webcamtexture-error
     
  42. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Thank you very much for reporting.
    Depending on the codec of the video file, it may not be possible to obtain the correct value.
    https://github.com/opencv/opencv/issues/12091

    https://github.com/opencv/opencv_extra/blob/master/testdata/cv/video/768x576.avi
    I converted this video file to an mjpeg-codec file using ffmpeg:
    ffmpeg -i 768x576.avi -r 10 -s 768x576 -vcodec mjpeg -an 768x576_mjpeg.mjpeg
     
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Could you update to OpenCVForUnity 2.3.9?

    HAND setup:
    - Download https://www.pexels.com/photo/person-s-right-hand-1257770/ and copy person-s-right-hand-1257770.jpg to the "Assets/StreamingAssets/dnn/" folder.
    - Download http://posefs1.perception.cs.cmu.edu/OpenPose/models/hand/pose_iter_102000.caffemodel and copy pose_iter_102000.caffemodel to the "Assets/StreamingAssets/dnn/" folder.
    - Download https://raw.githubusercontent.com/C...npose/master/models/hand/pose_deploy.prototxt and copy pose_deploy.prototxt to the "Assets/StreamingAssets/dnn/" folder.
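    Once the files are in StreamingAssets, they would typically be loaded along these lines (a sketch based on the asset's usual Utils.getFilePath/Dnn pattern, not the exact example code):

    ```csharp
    using OpenCVForUnity.DnnModule;
    using OpenCVForUnity.UnityUtils;

    // Resolve the StreamingAssets-relative paths and load the hand-pose network.
    // getFilePath returns an empty string when the file is missing from
    // Assets/StreamingAssets/, which is worth checking before calling readNet.
    string proto = Utils.getFilePath("dnn/pose_deploy.prototxt");
    string model = Utils.getFilePath("dnn/pose_iter_102000.caffemodel");

    if (string.IsNullOrEmpty(proto) || string.IsNullOrEmpty(model))
    {
        UnityEngine.Debug.LogError("Model files not found in StreamingAssets/dnn/.");
    }
    else
    {
        Net net = Dnn.readNet(model, proto);
    }
    ```
    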
     
  44. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    Hey. I've just realized there seems to be no debugging tool for OpenCV for Unity yet.
    Do you think it's possible to make the C++ Image Watch work somehow? Possibly with an approach like that (not 100% sure it would work with C#, though). For C# I've found the following two extensions, but they seem to be not as sophisticated as Image Watch (e.g. no pixel lookup, from what I can see): 1, 2. I think having an image debug viewer inside VS would speed up CV app development a lot.
     
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unfortunately, I'm not familiar with Image Watch, so I don't know if Image Watch can work with Visual Studio Tools for Unity. As you suggest, I think it would be very useful to be able to use Image Watch when debugging.
     
  46. unity_14663430

    unity_14663430

    Joined:
    May 14, 2020
    Posts:
    2
    Hi, I want to use OpenCV to do object detection (with a trained model like YOLOv3) and find the (x, y, z) position of the object. I know it's fine to find (x, y), but is it possible to get the z axis, and how?
    I want to display an AR object on top of the detected object, not just draw a 2D object over the camera feed (which may use a real-time AR camera). For example: there is a water bottle in front of me, and the application replaces the water bottle with a monster.
    Is there any example or guide I can follow?
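    One common approach (a hedged sketch, not specific to this asset): if the physical size of the detected object and the camera focal length in pixels are known, depth can be estimated from the pinhole camera model, z = f * realWidth / pixelWidth:

    ```csharp
    // Sketch: estimate distance to a detected object from its known physical
    // width, using the pinhole camera model. All names and values here are
    // illustrative assumptions, not part of OpenCV for Unity's API.
    public static class DepthFromSize
    {
        // focalLengthPx: focal length in pixels (e.g. from camera calibration);
        // realWidthM:    known physical width of the object in meters;
        // bboxWidthPx:   width of the detection bounding box in pixels.
        public static float EstimateZ(float focalLengthPx, float realWidthM, float bboxWidthPx)
        {
            return focalLengthPx * realWidthM / bboxWidthPx; // distance in meters
        }
    }

    // Example: a 0.07 m wide bottle seen as 100 px with f = 1000 px
    // gives an estimated distance of 0.7 m.
    ```

    This only works for objects of known size facing the camera; for arbitrary objects, an AR framework's depth/plane estimation would be needed instead.
    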
     
  47. leavittx

    leavittx

    Joined:
    Dec 27, 2013
    Posts:
    176
    From what I can see, using Image Watch from a C#-only project is not possible, but if you supply a DEBUG version of the native DLL along with the corresponding PDB file, it should do the trick; that is, you'll be able to step into the C++ code and see any OpenCV Mats you want inside the Image Watch extension, and all its features (like pixel lookup) would be available to OpenCV for Unity users.
     
  48. unity_14663430

    unity_14663430

    Joined:
    May 14, 2020
    Posts:
    2
    Or, is there a way to do object detection, find the (x, y) of the object, and then somehow work out its 3D coordinates? But how?
     
  49. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    I think it would be possible if there were an OpenCV-compatible model for detecting the (x, y, z) point of an object, but I don't have an example of that.
     
  50. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Thank you for the useful information. I am going to give it a try.