
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    It works! Thanks :)
     
  2. VIRNECT_unity

    Joined:
    Jul 20, 2015
    Posts:
    4
  3. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    It may be related to this issue.
    https://answers.unity.com/questions/1425938/android-native-camera-low-lightlow-iso.html
    Does this problem occur only when using OpenCVforUnity?
     
  4. PhosphorUnity

    Joined:
    Jan 22, 2014
    Posts:
    39
    Thank you for getting back to me.

    It may be related to that, but the fact that it works fine with "WebCamTextureToMatExample" but not with "WebCamTextureToMatHelper" makes me think it has to be something else. With WebCamTextureToMatExample the image is bright and normal, which is very strange.

    Are they handling anything differently?
     
  5. sticklezz

    Joined:
    Oct 27, 2015
    Posts:
    33
    Here is the difference I see on a Pixel phone between "WebCamTextureToMatExample" and "WebCamTextureToMatHelper" in a very bright scene near a light.

    As an aside, I don't notice the problem on an older Lenovo Android phone. How strange is that?
     

    Attached Files:

  6. Artishock

    Joined:
    Mar 2, 2017
    Posts:
    1
    Hi,
    I'm using the asset for tracking 3D objects to a user's face. The app will run on Windows with a webcam. I have a good setup for the tracker, but my 3D objects stop moving at a certain distance. The face tracker still delivers nice face points.
    To visualize the issue, please refer to this screen recording

    Can someone please help me with this issue? I obviously want the object moving until there is no solid track to follow...
     
  7. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    It seems that the current latest version (2.2.6) does not support that model.
    However, it worked well in the next version, based on OpenCV 3.4.1, which I am currently developing. Please wait a while for the release of the next version.
     
  8. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    I think that the difference between WebCamTextureToMatHelper.cs and WebCamTextureToMatExample.cs is only the following code part.
    https://github.com/EnoxSoftware/Ope...xamples/WebCamTextureToMatHelper.cs#L414-L426

    Could you change the GetMat () method of WebCamTextureToMatHelper.cs in this way and test it?
    Code (CSharp):
    public virtual Mat GetMat ()
    {
        if (!hasInitDone || !webCamTexture.isPlaying) {
            return rgbaMat;
        }

        Utils.webCamTextureToMat (webCamTexture, rgbaMat, colors);

        return rgbaMat;
    }
     
  9. MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    Hi
    I built the comic filter for HoloLens following this tutorial: https://github.com/EnoxSoftware/HoloLensWithOpenCVForUnityExample. It works! However, there are 2 problems - how can I solve these?
    1. It looks like the filter is masked by a vignette (it only works in an oval in the middle). Any idea what this could be and how to fix it?
      It looks from your video like it tracks all the way to the edges.
    2. The tracking is a bit slow - mine is similar to yours with Obama and Trump; it jumps around a lot. Is there any way to improve this? (I noticed the tracking in RoboRaid is really fast/good, so that makes me think it's possible.)
    thanks so much!
     
  10. daverosen5

    Joined:
    Mar 1, 2018
    Posts:
    6
    Any plan for hand recognition?
     
  11. sticklezz

    Joined:
    Oct 27, 2015
    Posts:
    33
    Last edited: Mar 11, 2018
  12. Angelk90

    Joined:
    Dec 10, 2012
    Posts:
    2
    @EnoxSoftware:
    Hi, I have some questions about it.
    I have to recognize the hand, in particular two types of gesture:

    1) recognize the open, frontal hand
    2) recognize the back of the hand, with the closed fist

    I would like to make sure that if I find myself in situation 1 I show a certain object on the hand, if I find myself in situation 2 I show another one.
    If I do not find myself in either one, I do not show anything.

    What I want to do is use markerless augmented reality on the hand.
    Although I do not know if in this case we can really talk about markerless.

    Using this release of OpenCV for Unity, would it be possible to do this?
    If yes, how can I do it?


    P.S.
    I can only find the package on the official Unity store, right?
     
  13. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    1. Please set the value of "vignette scale" to 0 in the inspector.
    2. The smaller the OpenCV Mat size is, the faster the processing time becomes. Processing speed may be improved by downscaling the Mat passed to the detectMultiScale() method.
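    As a rough sketch of that idea (this is not code from the asset; the `cascade` classifier, the `rgbaMat` input, and the 0.5 scale factor are assumptions for illustration), detection can be run on a downscaled copy and the resulting rectangles mapped back to full resolution:

    ```csharp
    // Downscale the input before detection, then rescale the hits.
    // Assumes an initialized CascadeClassifier `cascade` and an RGBA `rgbaMat`.
    Mat grayMat = new Mat ();
    Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);

    float scale = 0.5f; // detect on a half-size image
    Mat smallMat = new Mat ();
    Imgproc.resize (grayMat, smallMat, new Size (), scale, scale, Imgproc.INTER_LINEAR);

    MatOfRect faces = new MatOfRect ();
    cascade.detectMultiScale (smallMat, faces);

    // Map the detected rectangles back to the full-resolution image.
    foreach (OpenCVForUnity.Rect r in faces.toArray ()) {
        r.x = (int)(r.x / scale);
        r.y = (int)(r.y / scale);
        r.width = (int)(r.width / scale);
        r.height = (int)(r.height / scale);
        // draw or otherwise use r here
    }
    ```

    Halving each dimension quarters the pixel count, so detectMultiScale has far less work per frame; the trade-off is that very small faces may be missed.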
     
  14. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    HandPoseEstimationExample is included with OpenCVForUnity.
    https://github.com/EnoxSoftware/Ope...stimationExample/HandPoseEstimationExample.cs

    Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java: http://enoxsoftware.github.io/OpenCVForUnity/3.0.0/doc/html/index.html
    If there is an implementation example using "OpenCV Java", I think it can also be implemented using "OpenCV for Unity".


    Regards,
    EnoxSoftware
     
  15. asethicode5

    Joined:
    Feb 21, 2018
    Posts:
    7
    How can I detect an object using the camera feed and show a 3D object around it? Also, during this, the camera's rotation and position with respect to the tracked object are needed. In short, if I'm tracking an object using different techniques per frame, how do I do this? Reference:
     
  16. KevinW

    Joined:
    Mar 23, 2014
    Posts:
    5
    If I'm concerned about the impacts of this plugin on my total app size (stored and in memory), what advice could you provide to aid in minimizing OpenCV?

    Thanks.
     
  17. wbknox

    Joined:
    Aug 1, 2016
    Posts:
    11
    Adding onto KevinW's question, are there specific best practices for ensuring that an app using opencvforunity can be placed in the Apple iOS App Store and the Google Play Store for Android?

    For instance, I see one report that the Google Play Store limits APK sizes to 100 MB, making it seem that "expansion files" might be the way to go, though in my experience the only output when I've compiled an app with OpenCV for Unity has been a single APK.

    Thanks for humoring likely naive questions.
     
  18. sdf124

    Joined:
    Mar 4, 2016
    Posts:
    8
    For the ARHeadExample, it seems to track much better when the person is not wearing glasses. Is it possible to improve the point accuracy for people wearing glasses?
     
  19. MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    Hi

    Anyone able to point out where I'm going wrong and how to resolve it?

    Goal = modulate the alpha channel to achieve a flashing on and off of the comic filter effect (similar to how I do it manually in video).

    I have set up a coroutine but am struggling to get it to drive the OpenCV alpha variable. I've been looking at referencing and changing other scripts' values for a while but have yet to figure it out.

    Video below


    And current GitHub scripts/code: https://github.com/mrmaxmagna/opencv

    My questions:
    1. Why isn't my current code triggering the alpha on the Quad GameObject which holds HoloLensComicFilterExample.cs? How can I fix this?
    2. What's the best way to achieve the flashing/modulation of alpha between 0 and 1 over a second or so? I'm a noob, and it seemed a coroutine could work, but there may be other options.

    PS. thanks to opencv for answering my vignette question :)
     
    Last edited: Mar 15, 2018
  20. Westerby

    Joined:
    Jun 20, 2017
    Posts:
    8
    Hello @EnoxSoftware,

    Thank you for your last answer.

    I'm trying to use a TensorFlow-trained MobileNet. The pbtxt file was created from the frozen graph with the tf_text_graph_ssd.py script found here: https://github.com/opencv/opencv/blob/master/samples/dnn/tf_text_graph_ssd.py

    OpenCV for Unity version: 2.2.6

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using OpenCVForUnity;

    public class main : MonoBehaviour {

        Mat blob;
        Net net;
        Mat img;

        void Start () {

            img = OpenCVForUnity.Imgcodecs.imread (Utils.getFilePath ("dnn/image.jpg"));
            net = Dnn.readNetFromTensorflow (Utils.getFilePath ("dnn/frozen_inference_graph.pb"), Utils.getFilePath ("dnn/config.pbtxt"));

            Utils.setDebugMode (true);
            blob = Dnn.blobFromImage (img, 1, new Size (300, 300), new Scalar (104, 117, 123), false, false);
            Debug.Log ("after blob " + blob.width ());

            net.setInput (blob);
            Debug.Log ("after set input");

            Mat prob = net.forward ();
            Utils.setDebugMode (false);
            Debug.Log ("after forward");

        }

        // Update is called once per frame
        void Update () {

        }
    }
    And I'm getting the following errors:

    Code (CSharp):
    dnn::forward_11() : C:\Users\satoo\Desktop\opencv\modules\dnn\src\layers\prior_box_layer.cpp:205: error: (-215) !_aspectRatios.empty(), _minSize > 0 in function cv::dnn::PriorBoxLayerImpl::PriorBoxLayerImpl

    I tested some examples from your library and they worked, but here it seems that I cannot produce the blob object - the width() I'm printing is -1.

    What could be wrong here?
     
    Last edited: Mar 14, 2018
  21. VRxMedical

    Joined:
    Mar 8, 2018
    Posts:
    3
    Hi, can you please help me?
    I am trying to stretch and compress the quad in the face detection scene. When I compress the quad, it shows a blue border. So I need to change the size of the quad and also change the width and height of the WebCamTexture. Please help me; I have been stuck on this for 2 days.
     
  22. asethicode5

    Joined:
    Feb 21, 2018
    Posts:
    7
    I'm trying to apply real-time pose estimation and 3D reconstruction using Enox's OpenCV for Unity package, but facing a lot of hindrance. Can anyone help me out with this? Thanks!
     
  23. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    If you do not use the opencv_contrib module, you can reduce the build size by using the native plugin files that exclude the opencv_contrib module.
    1. Replace the OpenCVForUnity/Plugins/iOS folder with the OpenCVForUnity/Extra/exclude_contrib/iOS folder. Replace the OpenCVForUnity/Plugins/Android/libs folder with the OpenCVForUnity/Extra/exclude_contrib/Android/libs folder.
    2. Select the menu item [Tools/OpenCV for Unity/Set Plugin Import Settings].
    3. Delete the OpenCVForUnity/Assets/OpenCVForUnity/org/opencv_contrib folder and the OpenCVForUnity/Examples/ContribModules folder.
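    Steps 1 and 3 above can be sketched as shell commands run from the Unity project root (step 2 must still be done inside the Editor). This is illustrative only; the folder paths are taken from the instructions above, assume the default layout under Assets/, and may differ between asset versions, so back up your project first:

    ```shell
    # 1. Swap in the native plugin files built without opencv_contrib.
    rm -rf Assets/OpenCVForUnity/Plugins/iOS
    cp -r  Assets/OpenCVForUnity/Extra/exclude_contrib/iOS Assets/OpenCVForUnity/Plugins/iOS
    rm -rf Assets/OpenCVForUnity/Plugins/Android/libs
    cp -r  Assets/OpenCVForUnity/Extra/exclude_contrib/Android/libs Assets/OpenCVForUnity/Plugins/Android/libs

    # 2. In the Unity Editor: Tools > OpenCV for Unity > Set Plugin Import Settings.

    # 3. Delete the contrib wrapper classes and their example scenes.
    rm -rf Assets/OpenCVForUnity/org/opencv_contrib
    rm -rf Assets/OpenCVForUnity/Examples/ContribModules
    ```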
     
  24. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    If you want Unity to split the app output package into APK and OBB for you, open the Player Settings window (menu: Edit > Project Settings > Player), and in the Publishing Settings section, tick the Split Application Binary checkbox.
    https://docs.unity3d.com/2017.3/Documentation/Manual/android-OBBsupport.html
     
  25. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    In order to improve the accuracy of the points, it is probably necessary to use a model file specialized for people wearing glasses.
     
  26. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    Please replace AlphaMod.cs with the following code.
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using HoloLensWithOpenCVForUnityExample;

    public class AlphaMod : MonoBehaviour {

        public HoloLensComicFilterExample holoScript;
        public float alphaLevel;
        //public float alphaLevel;

        void Start()
        {
            holoScript = GetComponent<HoloLensComicFilterExample> ();

            StartCoroutine(BeatFlash(2.0f));
        }

        void Update()
        {
            //StartCoroutine(BeatFlash());
            //StartCoroutine(BeatFlash(0.0f,1.0f));
            //Debug.Log("test can change to 10");
            //alphaLevel = 10.0f;

        }

        IEnumerator BeatFlash(float alphaTime)
        {
            while (true) {
                for (float t = 0.0f; t < 1.0f; t += Time.deltaTime / alphaTime) {
                    holoScript.alpha = Mathf.Lerp (0.0f, 1.0f, t);
                    yield return null;
                }
            }
        }

    //    IEnumerator BeatFlash()
    //    {
    //        // try to get alpha channel to turn on and off
    //        Debug.Log("beatflash A - alpha = 0");
    //        alphaLevel = 0.0f;
    //        yield return new WaitForSeconds(2);
    //        Debug.Log("beatflash B - alpha = 1");
    //        alphaLevel = 1.0f;
    //
    //
    //    }
        //try to fade alpha channel in and out
        //IEnumerator BeatFlash(float alphaVal, float alphaTime)
        //{

        //float alpha = alphaLevel;
        //for (float t = 0.0f; t < 1.0f; t += Time.deltaTime / alphaTime)
        //{
        //Color newColor = new Color(1, 1, 1, Mathf.Lerp(alpha, alphaVal, t));
        //transform.renderer.material.color = newColor;
        //yield return null;
        //}
        //}
    }
     
  27. sticklezz

    Joined:
    Oct 27, 2015
    Posts:
    33
    >>Release Notes
    2.2.7
    Updated to WebCamTextureToMatHelper.cs v1.0.6.<<


    Cool. This didn't correct the front-facing camera issue on the Android Pixel, did it?
     
  28. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    This model worked well in the latest version (2.2.7), based on OpenCV 3.4.1. Please try with the latest version.
     
  29. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    What does "stretch and compress" mean?
    It is possible to set RequestWidth and RequestHeight on the inspector.
     
  30. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    Since this asset is a clone of OpenCV Java, you are able to use the same API as OpenCV Java.
    If there is implementation example using "OpenCV Java", I think that it can be implemented even using "OpenCV for Unity".
    I think that this repository will be helpful.
    https://github.com/mscarlett/sfm-java
     
  31. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    The issue has not been fixed yet.
    I am currently procuring a Google Pixel to investigate this issue.
     
  32. MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    Thanks for this

    Used your script and added the reference.
    It seems it's not referencing the script with the alpha attribute correctly - is there something different that needs to be done here?
    Your code returns this error:
    Code (csharp):
    NullReferenceException: Object reference not set to an instance of an object
    AlphaMod+<BeatFlash>c__Iterator0.MoveNext () (at Assets/Scripts/AlphaMod.cs:35)
    UnityEngine.SetupCoroutine.InvokeMoveNext (IEnumerator enumerator, IntPtr returnValueAddress) (at C:/buildslave/unity/build/Runtime/Export/Coroutines.cs:17)
    UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
    AlphaMod:Start() (at Assets/Scripts/AlphaMod.cs:17)
    Is there something special that's required to make this reference in particular work?

    Line 35 in Visual Studio is 'holoScript.alpha = Mathf.Lerp(0.0f, 1.0f, t);'
     
  33. EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    1. Replace HoloLensComicFilterExample.cs and AlphaMod.cs with the code below.
    2. Attach the script to "Quad".
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using UnityEngine.UI;

    #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    using UnityEngine.SceneManagement;
    #endif
    using OpenCVForUnity;

    using System.Linq;

    namespace HoloLensWithOpenCVForUnityExample
    {
        /// <summary>
        /// HoloLens Comic Filter Example
        /// An example of image processing (comic filter) using OpenCVForUnity on Hololens.
        /// Referring to http://dev.classmethod.jp/smartphone/opencv-manga-2/.
        /// </summary>
        [RequireComponent(typeof(HololensCameraStreamToMatHelper))]
        public class HoloLensComicFilterExample : MonoBehaviour
        {
            /// <summary>
            /// The gray mat.
            /// </summary>
            Mat grayMat;

            /// <summary>
            /// The line mat.
            /// </summary>
            Mat lineMat;

            /// <summary>
            /// The mask mat.
            /// </summary>
            Mat maskMat;

            /// <summary>
            /// The background mat.
            /// </summary>
            Mat bgMat;

            /// <summary>
            /// The dst mat.
            /// </summary>
            Mat dstMat;

            /// <summary>
            /// The gray pixels.
            /// </summary>
            byte[] grayPixels;

            /// <summary>
            /// The mask pixels.
            /// </summary>
            byte[] maskPixels;

            /// <summary>
            /// The texture.
            /// </summary>
            Texture2D texture;

            /// <summary>
            /// The quad renderer.
            /// </summary>
            Renderer quad_renderer;

            /// <summary>
            /// The web cam texture to mat helper.
            /// </summary>
            HololensCameraStreamToMatHelper webCamTextureToMatHelper;

            OpenCVForUnity.Rect processingAreaRect;
            public Vector2 outsideClippingRatio = new Vector2(0.17f, 0.19f);
            public Vector2 clippingOffset = new Vector2(0.043f, -0.041f);
            public float vignetteScale = 1.8f;

            //add alpha for modulation
            public float alpha = 0.0f;

            Mat dstMatClippingROI;

            // Use this for initialization
            void Start ()
            {
                webCamTextureToMatHelper = gameObject.GetComponent<HololensCameraStreamToMatHelper> ();
                #if NETFX_CORE
                webCamTextureToMatHelper.frameMatAcquired += OnFrameMatAcquired;
                #endif
                webCamTextureToMatHelper.Initialize ();
            }

            /// <summary>
            /// Raises the web cam texture to mat helper initialized event.
            /// </summary>
            public void OnWebCamTextureToMatHelperInitialized ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperInitialized");

                Mat webCamTextureMat = webCamTextureToMatHelper.GetMat ();


                #if NETFX_CORE
                // HololensCameraStream always returns image data in BGRA format.
                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.BGRA32, false);
                #else
                texture = new Texture2D (webCamTextureMat.cols (), webCamTextureMat.rows (), TextureFormat.RGBA32, false);
                #endif

                texture.wrapMode = TextureWrapMode.Clamp;

                Debug.Log ("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);


                processingAreaRect = new OpenCVForUnity.Rect ((int)(webCamTextureMat.cols ()*(outsideClippingRatio.x - clippingOffset.x)), (int)(webCamTextureMat.rows ()*(outsideClippingRatio.y + clippingOffset.y)),
                    (int)(webCamTextureMat.cols ()*(1f-outsideClippingRatio.x*2)), (int)(webCamTextureMat.rows ()*(1f-outsideClippingRatio.y*2)));
                processingAreaRect = processingAreaRect.intersect (new OpenCVForUnity.Rect(0,0,webCamTextureMat.cols (),webCamTextureMat.rows ()));


                dstMat = new Mat (webCamTextureMat.rows (), webCamTextureMat.cols (), CvType.CV_8UC1);
                dstMatClippingROI = new Mat (dstMat, processingAreaRect);

                // fill all black.
                //Imgproc.rectangle (dstMat, new Point (0, 0), new Point (dstMat.width (), dstMat.height ()), new Scalar (0, 0, 0, 0), -1);


                grayMat = new Mat (dstMatClippingROI.rows (), dstMatClippingROI.cols (), CvType.CV_8UC1);
                lineMat = new Mat (dstMatClippingROI.rows (), dstMatClippingROI.cols (), CvType.CV_8UC1);
                maskMat = new Mat (dstMatClippingROI.rows (), dstMatClippingROI.cols (), CvType.CV_8UC1);

                //create a striped background.
                bgMat = new Mat (dstMatClippingROI.rows (), dstMatClippingROI.cols (), CvType.CV_8UC1, new Scalar (255));
                for (int i = 0; i < bgMat.rows ()*2.5f; i=i+4) {
                    Imgproc.line (bgMat, new Point (0, 0 + i), new Point (bgMat.cols (), -bgMat.cols () + i), new Scalar (0), 1);
                }

                grayPixels = new byte[grayMat.cols () * grayMat.rows () * grayMat.channels ()];
                maskPixels = new byte[maskMat.cols () * maskMat.rows () * maskMat.channels ()];


                quad_renderer = gameObject.GetComponent<Renderer> () as Renderer;
                quad_renderer.sharedMaterial.SetTexture ("_MainTex", texture);
                quad_renderer.sharedMaterial.SetVector ("_VignetteOffset", new Vector4(clippingOffset.x, clippingOffset.y));

                Matrix4x4 projectionMatrix;
                #if NETFX_CORE
                projectionMatrix = webCamTextureToMatHelper.GetProjectionMatrix ();
                quad_renderer.sharedMaterial.SetMatrix ("_CameraProjectionMatrix", projectionMatrix);
                #else
                //This value is obtained from PhotoCapture's TryGetProjectionMatrix() method.I do not know whether this method is good.
                //Please see the discussion of this thread.Https://forums.hololens.com/discussion/782/live-stream-of-locatable-camera-webcam-in-unity
                projectionMatrix = Matrix4x4.identity;
                projectionMatrix.m00 = 2.31029f;
                projectionMatrix.m01 = 0.00000f;
                projectionMatrix.m02 = 0.09614f;
                projectionMatrix.m03 = 0.00000f;
                projectionMatrix.m10 = 0.00000f;
                projectionMatrix.m11 = 4.10427f;
                projectionMatrix.m12 = -0.06231f;
                projectionMatrix.m13 = 0.00000f;
                projectionMatrix.m20 = 0.00000f;
                projectionMatrix.m21 = 0.00000f;
                projectionMatrix.m22 = -1.00000f;
                projectionMatrix.m23 = 0.00000f;
                projectionMatrix.m30 = 0.00000f;
                projectionMatrix.m31 = 0.00000f;
                projectionMatrix.m32 = -1.00000f;
                projectionMatrix.m33 = 0.00000f;
                quad_renderer.sharedMaterial.SetMatrix ("_CameraProjectionMatrix", projectionMatrix);
                #endif

                quad_renderer.sharedMaterial.SetFloat ("_VignetteScale", vignetteScale);


                float halfOfVerticalFov = Mathf.Atan (1.0f / projectionMatrix.m11);
                float aspectRatio = (1.0f / Mathf.Tan (halfOfVerticalFov)) / projectionMatrix.m00;
                Debug.Log ("halfOfVerticalFov " + halfOfVerticalFov);
                Debug.Log ("aspectRatio " + aspectRatio);

                //
                //Imgproc.rectangle (dstMat, new Point (0, 0), new Point (webCamTextureMat.width (), webCamTextureMat.height ()), new Scalar (126, 126, 126, 255), -1);
                //
            }

            /// <summary>
            /// Raises the web cam texture to mat helper disposed event.
            /// </summary>
            public void OnWebCamTextureToMatHelperDisposed ()
            {
                Debug.Log ("OnWebCamTextureToMatHelperDisposed");

                grayMat.Dispose ();
                lineMat.Dispose ();
                maskMat.Dispose ();

                bgMat.Dispose ();
                dstMat.Dispose ();
                dstMatClippingROI.Dispose ();

                grayPixels = null;
                maskPixels = null;
            }

            /// <summary>
            /// Raises the web cam texture to mat helper error occurred event.
            /// </summary>
            /// <param name="errorCode">Error code.</param>
            public void OnWebCamTextureToMatHelperErrorOccurred(WebCamTextureToMatHelper.ErrorCode errorCode){
                Debug.Log ("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
            }

            #if NETFX_CORE
            public void OnFrameMatAcquired (Mat bgraMat, Matrix4x4 projectionMatrix, Matrix4x4 cameraToWorldMatrix)
            {
                Mat bgraMatClipROI = new Mat(bgraMat, processingAreaRect);

                Imgproc.cvtColor (bgraMatClipROI, grayMat, Imgproc.COLOR_BGRA2GRAY);

                bgMat.copyTo (dstMatClippingROI);

                Imgproc.GaussianBlur (grayMat, lineMat, new Size (3, 3), 0);


                grayMat.get (0, 0, grayPixels);

                for (int i = 0; i < grayPixels.Length; i++) {
                    maskPixels [i] = 0;

                    if (grayPixels [i] < 70) {
                        grayPixels [i] = 0;
                        maskPixels [i] = 1;
                    } else if (70 <= grayPixels [i] && grayPixels [i] < 120) {
                        grayPixels [i] = 100;

                    } else {
                        grayPixels [i] = 255;
                        maskPixels [i] = 1;
                    }
                }

                grayMat.put (0, 0, grayPixels);
                maskMat.put (0, 0, maskPixels);
                grayMat.copyTo (dstMatClippingROI, maskMat);


                Imgproc.Canny (lineMat, lineMat, 20, 120);

                lineMat.copyTo (maskMat);

                Core.bitwise_not (lineMat, lineMat);

                lineMat.copyTo (dstMatClippingROI, maskMat);


                //Imgproc.putText (dstMat, "W:" + dstMat.width () + " H:" + dstMat.height () + " SO:" + Screen.orientation, new Point (5, dstMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (0), 2, Imgproc.LINE_AA, false);

                Imgproc.cvtColor(dstMat, bgraMat, Imgproc.COLOR_GRAY2BGRA);

                //
                //Imgproc.rectangle (bgraMat, new Point (0, 0), new Point (bgraMat.width (), bgraMat.height ()), new Scalar (0, 0, 255, 255), 2);
                //Imgproc.rectangle (bgraMat, processingAreaRect.tl(), processingAreaRect.br(), new Scalar (0, 0, 255, 255), 2);
                //

                bgraMat = bgraMat * alpha;

                bgraMatClipROI.Dispose ();


                UnityEngine.WSA.Application.InvokeOnAppThread(() => {

                    if (!webCamTextureToMatHelper.IsPlaying ()) return;

                    Utils.fastMatToTexture2D(bgraMat, texture);
                    bgraMat.Dispose ();

                    Matrix4x4 worldToCameraMatrix = cameraToWorldMatrix.inverse;

                    quad_renderer.sharedMaterial.SetMatrix ("_WorldToCameraMatrix", worldToCameraMatrix);

                    // Position the canvas object slightly in front
                    // of the real world web camera.
                    Vector3 position = cameraToWorldMatrix.GetColumn (3) - cameraToWorldMatrix.GetColumn (2);

                    // Rotate the canvas object so that it faces the user.
                    Quaternion rotation = Quaternion.LookRotation (-cameraToWorldMatrix.GetColumn (2), cameraToWorldMatrix.GetColumn (1));

                    gameObject.transform.position = position;
                    gameObject.transform.rotation = rotation;

                }, false);
            }

            #else

            // Update is called once per frame
            void Update ()
            {
                if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {

                    Mat rgbaMat = webCamTextureToMatHelper.GetMat ();

                    Mat rgbaMatClipROI = new Mat(rgbaMat, processingAreaRect);

                    Imgproc.cvtColor (rgbaMatClipROI, grayMat, Imgproc.COLOR_RGBA2GRAY);

                    bgMat.copyTo (dstMatClippingROI);

                    Imgproc.GaussianBlur (grayMat, lineMat, new Size (3, 3), 0);


                    grayMat.get (0, 0, grayPixels);

                    for (int i = 0; i < grayPixels.Length; i++) {

                        maskPixels [i] = 0;

                        if (grayPixels [i] < 70) {
                            grayPixels [i] = 0;
                            maskPixels [i] = 1;
                        } else if (70 <= grayPixels [i] && grayPixels [i] < 120) {
                            grayPixels [i] = 100;

                        } else {
                            grayPixels [i] = 255;
                            maskPixels [i] = 1;
                        }
                    }

                    grayMat.put (0, 0, grayPixels);
                    maskMat.put (0, 0, maskPixels);
                    grayMat.copyTo (dstMatClippingROI, maskMat);


                    Imgproc.Canny (lineMat, lineMat, 20, 120);

                    lineMat.copyTo (maskMat);

                    Core.bitwise_not (lineMat, lineMat);

                    lineMat.copyTo (dstMatClippingROI, maskMat);


                    //Imgproc.putText (dstMat, "W:" + dstMat.width () + " H:" + dstMat.height () + " SO:" + Screen.orientation, new Point (5, dstMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (0), 2, Imgproc.LINE_AA, false);

                    Imgproc.cvtColor(dstMat, rgbaMat, Imgproc.COLOR_GRAY2RGBA);

                    //
                    //Imgproc.rectangle (rgbaMat, new Point (0, 0), new Point (rgbaMat.width (), rgbaMat.height ()), new Scalar (255, 0, 0, 255), 2);
                    //Imgproc.rectangle (rgbaMat, processingAreaRect.tl(), processingAreaRect.br(), new Scalar (255, 0, 0, 255), 2);
    349.                 //
    350.  
    351.                 rgbaMat = rgbaMat * alpha;
    352.  
    353.                 //
    354.                 Utils.fastMatToTexture2D(rgbaMat, texture);
    355.  
    356.                 rgbaMatClipROI.Dispose ();
    357.             }
    358.  
    359.             if (webCamTextureToMatHelper.IsPlaying ()) {
    360.  
    361.                 Matrix4x4 cameraToWorldMatrix = webCamTextureToMatHelper.GetCameraToWorldMatrix();
    362.                 Matrix4x4 worldToCameraMatrix = cameraToWorldMatrix.inverse;
    363.  
    364.                 quad_renderer.sharedMaterial.SetMatrix ("_WorldToCameraMatrix", worldToCameraMatrix);
    365.  
    366.                 // Position the canvas object slightly in front
    367.                 // of the real world web camera.
    368.                 Vector3 position = cameraToWorldMatrix.GetColumn (3) - cameraToWorldMatrix.GetColumn (2);
    369.  
    370.                 // Rotate the canvas object so that it faces the user.
    371.                 Quaternion rotation = Quaternion.LookRotation (-cameraToWorldMatrix.GetColumn (2), cameraToWorldMatrix.GetColumn (1));
    372.  
    373.                 gameObject.transform.position = position;
    374.                 gameObject.transform.rotation = rotation;
    375.             }
    376.         }
    377.         #endif
    378.  
    379.         /// <summary>
    380.         /// Raises the destroy event.
    381.         /// </summary>
    382.         void OnDestroy ()
    383.         {
    384.             #if NETFX_CORE
    385.             webCamTextureToMatHelper.frameMatAcquired -= OnFrameMatAcquired;
    386.             #endif
    387.             webCamTextureToMatHelper.Dispose ();
    388.         }
    389.  
    390.         /// <summary>
    391.         /// Raises the back button click event.
    392.         /// </summary>
    393.         public void OnBackButtonClick ()
    394.         {
    395.             #if UNITY_5_3 || UNITY_5_3_OR_NEWER
    396.             SceneManager.LoadScene ("HoloLensWithOpenCVForUnityExample");
    397.             #else
    398.             Application.LoadLevel ("HoloLensWithOpenCVForUnityExample");
    399.             #endif
    400.         }
    401.  
    402.         /// <summary>
    403.         /// Raises the play button click event.
    404.         /// </summary>
    405.         public void OnPlayButtonClick ()
    406.         {
    407.             webCamTextureToMatHelper.Play ();
    408.         }
    409.  
    410.         /// <summary>
    411.         /// Raises the pause button click event.
    412.         /// </summary>
    413.         public void OnPauseButtonClick ()
    414.         {
    415.             webCamTextureToMatHelper.Pause ();
    416.         }
    417.  
    418.         /// <summary>
    419.         /// Raises the stop button click event.
    420.         /// </summary>
    421.         public void OnStopButtonClick ()
    422.         {
    423.             webCamTextureToMatHelper.Stop ();
    424.         }
    425.  
    426.         /// <summary>
    427.         /// Raises the change camera button click event.
    428.         /// </summary>
    429.         public void OnChangeCameraButtonClick ()
    430.         {
    431.             webCamTextureToMatHelper.Initialize (null, webCamTextureToMatHelper.requestedWidth, webCamTextureToMatHelper.requestedHeight, !webCamTextureToMatHelper.requestedIsFrontFacing);
    432.         }
    433.     }
    434. }
    Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HoloLensWithOpenCVForUnityExample;

public class AlphaMod : MonoBehaviour {

    HoloLensComicFilterExample holoScript;
    //public float alphaLevel;

    void Start()
    {
        holoScript = GetComponent<HoloLensComicFilterExample> ();

        StartCoroutine(BeatFlash(2.0f));
    }

    void Update()
    {
        //StartCoroutine(BeatFlash());
        //StartCoroutine(BeatFlash(0.0f, 1.0f));
        //Debug.Log("test can change to 10");
        //alphaLevel = 10.0f;
    }

    // Repeatedly fades alpha from 0 to 1 over alphaTime seconds.
    IEnumerator BeatFlash(float alphaTime)
    {
        while (true) {
            for (float t = 0.0f; t < 1.0f; t += Time.deltaTime / alphaTime) {
                holoScript.alpha = Mathf.Lerp (0.0f, 1.0f, t);
                yield return null;
            }
        }
    }

    //IEnumerator BeatFlash()
    //{
    //    // try to get alpha channel to turn on and off
    //    Debug.Log("beatflash A - alpha = 0");
    //    alphaLevel = 0.0f;
    //    yield return new WaitForSeconds(2);
    //    Debug.Log("beatflash B - alpha = 1");
    //    alphaLevel = 1.0f;
    //}

    //try to fade alpha channel in and out
    //IEnumerator BeatFlash(float alphaVal, float alphaTime)
    //{
    //    float alpha = alphaLevel;
    //    for (float t = 0.0f; t < 1.0f; t += Time.deltaTime / alphaTime)
    //    {
    //        Color newColor = new Color(1, 1, 1, Mathf.Lerp(alpha, alphaVal, t));
    //        transform.renderer.material.color = newColor;
    //        yield return null;
    //    }
    //}
}
    HololensComicFilterExample_AlphaMod.png
     
  34. timhays

    timhays

    Joined:
    Jun 19, 2013
    Posts:
    14
    Regarding the runtime ArgumentException "The output Mat object has to be of the same size" and mat = null:

    I have searched this forum for an answer to this error, and I always see you asking the poster about their environment, but I haven't found a post from you with the solution.

    My test environment:
    OS : Windows 10
    Unity version : 2017.3.0f3
    OpenCV for Unity version : 227
    Unity Build Settings set to: PC, Mac, Linux Standalone
    Symptom: webcam feed not appearing, only a blank white texture in the scene

    I downloaded the latest OpenCV v227 on Windows 10 and got a bunch of compile errors after import. Following your instructions regarding Windows, I started over with an empty project, imported OpenCVForUnityUWP_Beta2.unitypackage, and then attempted to run any demo that uses webCamTextureToMat. I see the error that others have been reporting: "The output Mat object has to be of the same size"

    In the debugger, at Utils.cs line 660:
    if (mat.cols () != webCamTexture.width || mat.rows () != webCamTexture.height)
    my width = 640 and height = 480; however, mat = null.

    -Tim
     
  35. timhays

    timhays

    Joined:
    Jun 19, 2013
    Posts:
    14

    I'm sorry for the confusion. It turns out the Asset Store hadn't updated my unitypackage: I had an old version on my computer, no 'update' was offered, and so I was unknowingly using an outdated version. I just tried a different computer with Unity 2017.1.1f1 64-bit, and OpenCV now imports without errors. I also imported the MarkerAR project from the Asset Store and was able to run the webcam scene without any runtime errors.
    So please ignore the previous post (above), and thank you!
    -Tim
     
    EnoxSoftware likes this.
  36. MaxXR

    MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    Thanks! This works with the desktop cam in Unity, but not when deployed to HoloLens. Any idea how to solve it?
    Desktop Unity works - see


    Deployed to HoloLens, it fails to flash. See the view through the HoloLens (the alpha doesn't modulate):


    Strangely, I also get this build error in Visual Studio. The app still runs on the device (but didn't auto-start). Not sure if it's related?
    upload_2018-3-21_6-29-2.png

    Any ideas what's wrong and how I could fix it?

    Thanks so much :)
     
    Last edited: Mar 21, 2018
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    Have you changed lines 254 to 263 of HoloLensComicFilterExample.cs to the following code?
    Code (CSharp):
                //Imgproc.putText (dstMat, "W:" + dstMat.width () + " H:" + dstMat.height () + " SO:" + Screen.orientation, new Point (5, dstMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (0), 2, Imgproc.LINE_AA, false);
                Imgproc.cvtColor(dstMat, bgraMat, Imgproc.COLOR_GRAY2BGRA);
                //
                //Imgproc.rectangle (bgraMat, new Point (0, 0), new Point (bgraMat.width (), bgraMat.height ()), new Scalar (0, 0, 255, 255), 2);
                //Imgproc.rectangle (bgraMat, processingAreaRect.tl(), processingAreaRect.br(), new Scalar (0, 0, 255, 255), 2);
                //
                bgraMat = bgraMat * alpha;
     
  38. mobilizAR

    mobilizAR

    Joined:
    Jul 28, 2016
    Posts:
    13
    Peeps, someone please help! :(

    I want to find the number of holes, so I need to work with 'hierarchy' after finding contours. However, I'm unable to do that because there's no documentation.

    I have the same problem as - http://answers.opencv.org/question/78303/mat-hierarchy-in-java-vs-vectorcvvec4i-hierarchy-in-c/

    All the examples point to C/C++ or Python.

    for (int index = 0; index >= 0; index = hierarchy[index][0]) Error: in C++, hierarchy is a vector type, but in the Java-style API it is a Mat. I can't iterate through hierarchy and have no idea how to do it.

    Someone PLEASE help :(
     
  39. sticklezz

    sticklezz

    Joined:
    Oct 27, 2015
    Posts:
    33
    thank you!!!!
     
  40. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    I think that this code will be helpful.
    https://github.com/EnoxSoftware/Ope...bjectTrackingBasedOnColorExample.cs#L266-L309
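    As a sketch of the idea: in OpenCV for Unity (as in OpenCV's Java binding), hierarchy is a Mat with one entry per contour, and each entry is a 4-element vector [next, previous, firstChild, parent] that you can read with hierarchy.get(0, i). With RETR_CCOMP, holes are exactly the contours whose parent index is not -1. This is an untested sketch, not code from the asset's examples; the HoleCounter class name is mine.
    Code (CSharp):
```csharp
using System.Collections.Generic;
using OpenCVForUnity;

public static class HoleCounter
{
    // Counts holes in a binary image by inspecting the contour hierarchy.
    public static int CountHoles (Mat binaryMat)
    {
        List<MatOfPoint> contours = new List<MatOfPoint> ();
        Mat hierarchy = new Mat ();

        // RETR_CCOMP organizes contours into a two-level hierarchy:
        // outer boundaries at the top level, holes one level below.
        Imgproc.findContours (binaryMat, contours, hierarchy,
            Imgproc.RETR_CCOMP, Imgproc.CHAIN_APPROX_SIMPLE);

        int holes = 0;
        for (int i = 0; i < contours.Count; i++) {
            // Each hierarchy entry is [next, previous, firstChild, parent].
            double[] entry = hierarchy.get (0, i);
            if (entry [3] != -1) holes++; // has a parent => it is a hole
        }
        hierarchy.Dispose ();
        return holes;
    }
}
```
    The C++ loop `for (int index = 0; index >= 0; index = hierarchy[index][0])` translates the same way: read the next-sibling index with `(int)hierarchy.get (0, index)[0]`.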
     
    mobilizAR likes this.
  41. mobilizAR

    mobilizAR

    Joined:
    Jul 28, 2016
    Posts:
    13
  42. asethicode5

    asethicode5

    Joined:
    Feb 21, 2018
    Posts:
    7
    Thanks!
     
  43. gagagu

    gagagu

    Joined:
    Aug 28, 2015
    Posts:
    11
    Hi,
    thanks for your product!

    I'm using OpenCV with my Meta 2 glasses (www.metavision.com), and I want to use ArUco markers to place product objects on top of them. The customer can rotate and move the marker, and the object will follow. I used the ArUcoCameraCalibrationExample to calibrate the camera; it shows up as a normal webcam in the Device Manager. Then I loaded the ArUcoWebCamTextureExample, replaced the camera with the Meta2CameraRig, and connected the left-eye camera (I tested several combinations) to the quad "Ar Camera". When I start the project, it sometimes seems to work, but sometimes the object (cube) jitters, and sometimes it drifts around; then suddenly it works again, and so on.
    I suspect the problem is something like gimbal lock, or that something goes wrong with the camera matrix when I move my head. I'm a beginner in Unity and not very good at vector math (though not a beginner in C#), and I haven't been able to solve the problem.
    I guess you don't know the Meta 2, but maybe you have some suggestions for solving the problem, or for where I should look deeper in the code?

    Thanks in advance!
     
  44. dreamer-

    dreamer-

    Joined:
    Dec 9, 2016
    Posts:
    25
    Hi,

    Is there a way to detect ear, neck and fingers using this package?
     
    Last edited: Mar 28, 2018
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    Unfortunately, I do not have the Meta 2 glasses.
    The jittering may be caused by camera shake. Since ArUco is a very simple library, it is necessary to filter noise out of the obtained pose matrix to improve stability.
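    One common way to do that (not specific to this asset) is to low-pass filter the marker pose before applying it, e.g. by lerping the position and slerping the rotation toward each new measurement. A hypothetical sketch in Unity terms; the component name, method name, and the smoothing parameter are my assumptions:
    Code (CSharp):
```csharp
using UnityEngine;

// Hypothetical smoothing component: attach to the tracked object and feed
// it the raw pose computed from the ArUco detection each frame.
public class PoseSmoother : MonoBehaviour
{
    [Range (0f, 1f)]
    public float smoothing = 0.8f; // higher = steadier but laggier

    public void SetRawPose (Vector3 rawPosition, Quaternion rawRotation)
    {
        // Low-pass filter: keep most of the previous pose and blend in
        // a fraction of the new measurement to suppress frame-to-frame noise.
        transform.position = Vector3.Lerp (rawPosition, transform.position, smoothing);
        transform.rotation = Quaternion.Slerp (rawRotation, transform.rotation, smoothing);
    }
}
```
    Tuning `smoothing` trades latency against stability; a Kalman filter would be the more principled version of the same idea.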
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
  47. dreamer-

    dreamer-

    Joined:
    Dec 9, 2016
    Posts:
    25
    Thanks.

    I am not familiar with OpenCV and Dlib. What is the difference between the "Dlib FaceLandmark Detector" and "OpenCV for Unity" packages? I read that Dlib has better face detection. Can I use the Dlib package for detecting ears, neck and fingers? The Dlib package also costs less.
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,581
    "Dlib FaceLandmark Detector" is a wrapper for ObjectDetection and ShapePrediction which is part of the function of Dlib.
    A model file for detecting the face of a person is included in "Dlib FaceLandmark Detector", but if you use another model file, it is possible to detect another object. If you want to detect ears you need to train a model file to detect ears using dlib's script.
    http://dlib.net/train_object_detector.py.html
    http://www.hackevolve.com/create-your-own-object-detector/
    https://handmap.github.io/dlib-classifier-for-object-detection/
     
    artpologabriel likes this.
  49. dreamer-

    dreamer-

    Joined:
    Dec 9, 2016
    Posts:
    25
  50. tomihr2

    tomihr2

    Joined:
    Oct 25, 2010
    Posts:
    30
    Hello, is there any example of using OpenCV to detect gender and age?
    Thanks