
[RELEASED] OpenCV for Unity

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Oct 30, 2014.

  1. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    UnityEditor builds with the "SILICON" tag seem to have errors loading "opencvforunity.bundle". This is probably because "opencvforunity.bundle" is not code-signed. I am currently working on fixing this issue.
    For now, could you use a UnityEditor build with the "INTEL" tag?
    upload_2021-10-26_7-59-36.png
     
  2. arcade_jon

    arcade_jon

    Joined:
    Feb 9, 2018
    Posts:
    11
    ok - thanks for looking into it so quickly and I hope it's a quick fix!
     
  3. kt5881

    kt5881

    Joined:
    Jul 26, 2014
    Posts:
    14
    I am working with OpenPose. Instead of using an image.jpg as the input, I want to use a webcam to target a real-time image. It works well, but the real-time output is very slow. Can anyone help, please?
     
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    A bug-fixed version has been uploaded to the AssetStore.
     
  5. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    As you mentioned, the human pose estimation example in our assets uses an old OpenPose model, which is very slow in its inference speed.
    However, by replacing the model with "LightWeight Human Pose Estimation (ONNX)", the inference speed can be greatly improved. (The model file is available from the OpenCV Github: https://github.com/opencv/opencv_ex...ac93d08d/testdata/dnn/download_models.py#L890)
    In fact, measurements on my laptop show an 80% reduction in inference time.
    I have attached a small example of the model in action for you to try.

    Also, if you want to do real-time processing on webcam video, etc., you may want to consider using Barracuda for faster inference processing.
    The following repository is an easy-to-use package for human pose estimation with Unity Barracuda.
    https://github.com/keijiro/BodyPixBarracuda/
    The results of the estimation can be used for OpenCV processing with a little effort.
     

    Attached Files:

    kt5881 likes this.
  6. arcade_jon

    arcade_jon

    Joined:
    Feb 9, 2018
    Posts:
    11
    Great - thanks. Do you know when the bug-fixed version is expected to show up? Currently the Asset Store and Package Manager both still show 2.4.7 (from Jan 28th) as the latest.
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    The version number has not been changed, but the files have been updated. Could you delete the downloaded package from your cache and then re-download it?
    • Windows: C:\Users\[UserName]\AppData\Roaming\Unity\Asset Store-5.x\[PublisherName]
    • Mac: /Users/[UserName]/Library/Unity/Asset Store-5.x/[PublisherName]
    https://support.unity.com/hc/en-us/articles/210112873-How-do-I-download-an-asset-I-have-purchased-
     
  8. machine_man

    machine_man

    Joined:
    Jan 31, 2022
    Posts:
    2
    I understand the Aruco Marker Example can detect multiple markers at the same time, but it only displays one model. What do I need to modify in order to display different models for different markers at the same time?
     
  9. wmaass88

    wmaass88

    Joined:
    Dec 23, 2012
    Posts:
    45
    Anyone here familiar with FindContours? I am porting some Python code to Unity using this asset, and I am having an issue finding the area of a contour. Python has a simple method for this, as does EMGU. I tried using the width and height of the contour, but the width is always 1.

    Would be great if we could get area = cv.contourArea(cnt)

    EDIT: Looks like we can do the below, found in one of the examples:


    Moments moment = Imgproc.moments(contours[index]);
    double area = moment.get_m00();
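
    For what it's worth, the OpenCV Java-style API that this asset mirrors should also expose the contour area directly via Imgproc.contourArea, which is a closer match to the Python one-liner. A minimal hedged sketch (assuming binaryMat is a thresholded 8-bit single-channel Mat and the usual OpenCVForUnity namespaces are available):

    ```csharp
    using System.Collections.Generic;
    using OpenCVForUnity.CoreModule;
    using OpenCVForUnity.ImgprocModule;

    // Sketch: find contours and compute each contour's area directly,
    // mirroring Python's cv2.contourArea(cnt). `binaryMat` is an assumed
    // CV_8UC1 input Mat.
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    Imgproc.findContours(binaryMat, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

    foreach (MatOfPoint cnt in contours)
    {
        double area = Imgproc.contourArea(cnt);
        UnityEngine.Debug.Log("contour area: " + area);
    }
    ```

    For solid contours, Imgproc.contourArea(cnt) should agree with the moment.get_m00() approach above.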
     
    Last edited: Feb 8, 2022
  10. kt5881

    kt5881

    Joined:
    Jul 26, 2014
    Posts:
    14
    Thank you for the help. It worked well when I used the example you provided (LightWeightOpenpose), but when I built it on Android it is really slow. Is there a reason why it's so slow? Thank you in advance.
     

    Attached Files:

  11. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    You will need to change the GameObject that updates the matrix depending on the id of the marker that was retrieved.
    https://github.com/EnoxSoftware/Ope...rUcoExample/ArUcoWebCamTextureExample.cs#L540

    In the MarkerBasedARExample, there is an example that changes the game object to be displayed depending on the id of the marker.
    https://github.com/EnoxSoftware/Mar...ebCamTextureMarkerBasedARExample.cs#L275-L296
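
    The idea in those linked examples can be sketched roughly as follows: after detection, look up a per-id GameObject and update only that object's transform. This is a hedged illustration, not the asset's actual example code; arObjectsByMarkerId and armPerDetection are assumed names.

    ```csharp
    // Assumed mapping from marker id to the model to display,
    // e.g. populated in the Inspector or in Start().
    Dictionary<int, GameObject> arObjectsByMarkerId;

    void UpdateARObjects(Mat ids, Matrix4x4[] armPerDetection)
    {
        for (int i = 0; i < ids.total(); i++)
        {
            // Each row of `ids` holds the id of one detected marker.
            int markerId = (int)ids.get(i, 0)[0];

            GameObject arObject;
            if (arObjectsByMarkerId.TryGetValue(markerId, out arObject))
            {
                // armPerDetection[i] is assumed to be the AR matrix computed
                // from rvecs/tvecs for detection index i, as in the example.
                Matrix4x4 arm = armPerDetection[i];
                arObject.transform.localPosition = arm.GetColumn(3);
                arObject.transform.localRotation = Quaternion.LookRotation(arm.GetColumn(2), arm.GetColumn(1));
                arObject.SetActive(true);
            }
        }
    }
    ```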
     
  12. look001

    look001

    Joined:
    Mar 23, 2017
    Posts:
    97
    Hey there,
    your asset looks very interesting. Before buying, I want to know: is the full source code included, including the native Android, WebGL, iOS, etc. parts? And if so, how can I access it?
    Best regards
     
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Unfortunately, the asset includes the C# source code, but not the C++ native library source code.
     
  14. swapnil_unity613

    swapnil_unity613

    Joined:
    Dec 4, 2020
    Posts:
    2
    Hello,
    How do I get the exact ID number of the detected ArUco marker in the code? I just see the ID in the image, but I want to access the ID in code.
    It just returns Mats of all the detected markers.
    Please help, thanks!
    Code (CSharp):
    1. Aruco.detectMarkers(rgbMat, dictionary, corners, ids, detectorParams, rejectedCorners, camMatrix, distCoeffs);
     
  15. swapnil_unity613

    swapnil_unity613

    Joined:
    Dec 4, 2020
    Posts:
    2
    Found it!
    It's like this if anyone else is wondering the same:
    Code (CSharp):
    1. double[] id = ids.get(rowNum, 0);
    where every row is a detected marker, then the ID is simply id[0]
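
    Building on that, a small hedged sketch of collecting all detected ids (assuming `ids` is the Mat filled by the Aruco.detectMarkers call above):

    ```csharp
    // `ids` has one row per detected marker; element (row, 0) is the id.
    var detectedIds = new System.Collections.Generic.List<int>();
    for (int row = 0; row < ids.rows(); row++)
    {
        double[] id = ids.get(row, 0);
        detectedIds.Add((int)id[0]);
    }
    UnityEngine.Debug.Log("Detected marker ids: " + string.Join(", ", detectedIds));
    ```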
     
    EnoxSoftware likes this.
  16. Jieyang666

    Jieyang666

    Joined:
    Mar 3, 2022
    Posts:
    1
    Will the wechat_qrcode module be supported for HoloLens (UWP) in the future?
     
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Unfortunately, there are currently no such plans.
    The UWP platform does not support the dnn module.
    https://github.com/opencv/opencv/issues/9177
    So the UWP platform does not support the wechat_qrcode module, which depends on the dnn module.
     
  18. felipechavesbmw

    felipechavesbmw

    Joined:
    Feb 11, 2022
    Posts:
    4
    Hello, I went through every build step (the OpenCV build, etc.), but I got this at the end:

    Plugins: Failed to load 'Assets/OpenCVForUnity/Plugins/Windows/x86_64/opencvforunity.dll' because one or more of its dependencies could not be loaded.
    UnityEngine.GUIUtility:processEvent (int,intptr,bool&)

    upload_2022-3-15_11-22-12.png

    upload_2022-3-15_11-27-25.png

    Using OpenCV4.5
    Unity 2020.3.8f1
    OpenCV for Unity v2.4.4
     
    Last edited: Mar 15, 2022
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Is the path to GStreamer set in the system environment variables?
    Výstřižek.PNG
     
  20. felipechavesbmw

    felipechavesbmw

    Joined:
    Feb 11, 2022
    Posts:
    4
    There are 53 .dlls in C:\Users\felipe.chaves\Downloads\opencv-4.x\build\install\x64\vc16\bin too.
     
    Last edited: Mar 16, 2022
  21. felipechavesbmw

    felipechavesbmw

    Joined:
    Feb 11, 2022
    Posts:
    4
    I compiled the Android version and, to my surprise, I got the same error:

    Autoconnected Player DllNotFoundException: Unable to load DLL 'opencvforunity': The specified module could not be found.

    I'm targeting ARMv7; I also tested ARM64 and it didn't work.

    upload_2022-3-16_19-29-3.png

    upload_2022-3-16_19-29-55.png

    As it says in Assets\OpenCVForUnity\ReadMe.pdf:

    Android
    1. Build the Android SDK with "opencv/platforms/android/build_sdk.py". ( APP_STL := gnustl_static)
    python ../opencv/platforms/android/build_sdk.py ../build ../opencv --ndk_path=C://android-ndk --sdk_path=C://android-sdk --extra_modules_path=../opencv_contrib/modules --use_android_buildtools
    2. Copy the output file ( native\libs\arm64-v8a\libopencv_java4.so ) to "OpenCVForUnity\Plugins\Android\libs\arm64-v8a\". Copy the output files ( native\libs\arm64-v8a\libopencv_java4.so ) to "OpenCVForUnity\Plugins\Android\libs\armeabi-v7a\". Copy the output files ( native\libs\x86\libopencv_java4.so ) to "OpenCVForUnity\Plugins\Android\libs\x86\".
    3. Copy "OpenCVForUnity\Extra\dll_version\Android\libs\" to "OpenCVForUnity\Plugins\Android\libs\".

    And I think the documentation is wrong: it maps arm64-v8a to armeabi-v7a.

    upload_2022-3-16_21-23-46.png

    But I tried every setup with the v7 and v8 .so files, and I still keep getting Unable to load DLL 'opencvforunity'.


    QUESTION:

    If Unity compiles with NDK 19, should I set up build_sdk.py to target NDK 19 for the Android build? Or if I target NDK 18 (--config=ndk-18-api-level-21.config.py), will it be compatible with the NDK 19 that comes with Unity?
     
    Last edited: Mar 17, 2022
  22. felipechavesbmw

    felipechavesbmw

    Joined:
    Feb 11, 2022
    Posts:
    4
    It worked on the Windows standalone/editor with OpenCV 4.5.3!

    The downloaded OpenCV version 4.5.5 wasn't working.


    But it still doesn't work on Android; I get the same DllNotFoundException.

    I was able to compile with Android NDK version 19.0.5232133 (the exact one that comes with Unity).
    I copied the folder to C:\ndk and set up the environment variables.
    I also copied the SDK that comes with Unity and put it in C:\sdk; I just had to copy the cmake folder that is not present in the Unity version, so I also had C:\SDK\cmake\3.6.4111459.

    But I had to add this inside build_sdk.py:
    cmd.append("-DBUILD_ZLIB='ON'")

    The build command:
    python "C:\Users\felipe\Downloads\opencv4.5.3\opencv-4.53\platforms\android\build_sdk.py" "C:\Users\felipe\Downloads\opencv4.5.3\buildandroid" "C:\Users\felipe\Downloads\opencv4.5.3\opencv-4.53" --ndk_path=%ANDROID_NDK_ROOT% --sdk_path=%ANDROID_SDK_ROOT% --extra_modules_path="C:\Users\felipe\Downloads\opencv4.5.3\opencv_contrib-4.53\modules" --use_android_buildtools --config=ndk-19.config.py

    It was possible to compile and get libopencv_java4.so.

    BUT this doesn't work with Unity and Android; I still keep getting 'DllNotFoundException' after importing it as an ARMv7 lib.
     
    Last edited: Mar 18, 2022
  23. auxtern

    auxtern

    Joined:
    Jan 20, 2019
    Posts:
    6
    Is there a feature to detect the direction of the head, so that it can know which direction the head is moving: up, down, right, or left? It would be better if there were an example.
     
  24. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
  25. kukewilly

    kukewilly

    Joined:
    Jan 3, 2019
    Posts:
    33
    Hello,

    I'm trying to convert a Texture2D into a Mat from a camera connected to the UVC port of an Android device. I'm not using WebCamTextureToMatHelper.cs to get the texture because it doesn't produce a WebCamTexture. I want to process the Texture2D in Update() in ArUcoWebCamTextureExample.cs. I know the camera source correctly produces a Texture2D because I have it displaying on the Quad GameObject, but the whole app eventually crashes shortly after I run it. Here is what my Update() function looks like:


    Code (CSharp):
    1.  void Update ()
    2.         {            
    3.             if ((webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) || (USBCamera != null))
    4.             {
    5.                 Mat rgbaMat = new Mat();
    6. #if UNITY_EDITOR
    7.                 rgbaMat = webCamTextureToMatHelper.GetMat ();
    8. #endif            
    9.                 //Gets Texture 2D from USBCamera script instead of webcamTexture
    10. #if UNITY_ANDROID && !UNITY_EDITOR
    11.                 if(USBCamera.GetComponent<USBCamera>().tempTexture2D != null)
    12.                 {
    13.                     texture = USBCamera.GetComponent<USBCamera>().tempTexture2D;
    14.                     RenderTexture tmp = RenderTexture.GetTemporary(texture.width, texture.height, 0, RenderTextureFormat.Default, RenderTextureReadWrite.Linear);
    15.                     Graphics.Blit(texture, tmp);
    16.                     RenderTexture previous = RenderTexture.active;
    17.                     RenderTexture.active = tmp;
    18.  
    19.                     Texture2D newTexture = new Texture2D (texture.width, texture.height, TextureFormat.RGBA32, false);
    20.                     newTexture.ReadPixels(new UnityEngine.Rect(0, 0, tmp.width, tmp.height), 0, 0);
    21.                     newTexture.Apply();
    22.                     RenderTexture.active = previous;
    23.                     Mat imgMat = new Mat (texture.height, texture.width, CvType.CV_8UC4);
    24.                     Utils.texture2DToMat (newTexture, imgMat);
    25.                     rgbaMat = imgMat;
    26.                 }
    27. #endif
    28.                 Imgproc.cvtColor (rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB);
    29.  
    30.                 // detect markers.
    31.                 Aruco.detectMarkers (rgbMat, dictionary, corners, ids, detectorParams, rejectedCorners, camMatrix, distCoeffs);
    32.  
    33.                 // refine marker detection.
    34.                 if (refineMarkerDetection && (markerType == MarkerType.GridBoard || markerType == MarkerType.ChArUcoBoard)) {
    35.                     switch (markerType) {
    36.                     case MarkerType.GridBoard:
    37.                         Aruco.refineDetectedMarkers (rgbMat, gridBoard, corners, ids, rejectedCorners, camMatrix, distCoeffs, 10f, 3f, true, recoveredIdxs, detectorParams);
    38.                         break;
    39.                     case MarkerType.ChArUcoBoard:
    40.                         Aruco.refineDetectedMarkers (rgbMat, charucoBoard, corners, ids, rejectedCorners, camMatrix, distCoeffs, 10f, 3f, true, recoveredIdxs, detectorParams);
    41.                         break;
    42.                     }
    43.                 }
    44.  
    45.                 // if at least one marker detected
    46.                 if (ids.total () > 0 && Markers != null) {
    47.                    
    48.                     if (markerType != MarkerType.ChArUcoDiamondMarker) {
    49.  
    50.                         if (markerType == MarkerType.ChArUcoBoard) {
    51.                             Aruco.interpolateCornersCharuco (corners, ids, rgbMat, charucoBoard, charucoCorners, charucoIds, camMatrix, distCoeffs, charucoMinMarkers);
    52.  
    53.                             // draw markers.
    54.                             Aruco.drawDetectedMarkers (rgbMat, corners, ids, new Scalar (0, 255, 0));
    55.                             if (charucoIds.total () > 0) {
    56.                                 Aruco.drawDetectedCornersCharuco (rgbMat, charucoCorners, charucoIds, new Scalar (0, 0, 255));
    57.                             }
    58.                         } else {
    59.                             // draw markers.
    60.                             Aruco.drawDetectedMarkers (rgbMat, corners, ids, new Scalar (0, 255, 0));
    61.                         }
    62.                            
    63.                         // estimate pose.
    64.                         if (applyEstimationPose) {
    65.                             switch (markerType) {
    66.                             default:
    67.                             case MarkerType.CanonicalMarker:
    68.                                 EstimatePoseCanonicalMarker (rgbMat);
    69.                                 break;
    70.                             case MarkerType.GridBoard:
    71.                                 EstimatePoseGridBoard (rgbMat);
    72.                                 break;
    73.                             case MarkerType.ChArUcoBoard:
    74.                                 EstimatePoseChArUcoBoard (rgbMat);
    75.                                 break;
    76.                             }
    77.                         }
    78.                     } else {
    79.                         // detect diamond markers.
    80.                         Aruco.detectCharucoDiamond (rgbMat, corners, ids, diamondSquareLength / diamondMarkerLength, diamondCorners, diamondIds, camMatrix, distCoeffs);
    81.  
    82.                         // draw markers.
    83.                         Aruco.drawDetectedMarkers (rgbMat, corners, ids, new Scalar (0, 255, 0));
    84.                         // draw diamond markers.
    85.                         Aruco.drawDetectedDiamonds (rgbMat, diamondCorners, diamondIds, new Scalar (0, 0, 255));
    86.  
    87.                         // estimate pose.
    88.                         if (applyEstimationPose)
    89.                             EstimatePoseChArUcoDiamondMarker (rgbMat);
    90.                     }
    91.                 }
    92.  
    93.                 if (showRejectedCorners && rejectedCorners.Count > 0)
    94.                     Aruco.drawDetectedMarkers (rgbMat, rejectedCorners, new Mat (), new Scalar (255, 0, 0));
    95.                
    96.                
    97. //                Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
    98.  
    99.                 Utils.fastMatToTexture2D (rgbMat, texture);
    100.             }
    101.  
    102.         }
    When it crashes, this is the error message:

    It looks like it has something to do with the texture at the end of Update(), when Utils.fastMatToTexture2D(rgbMat, texture) is called. The source Texture2D from USBCamera is in RGBA32 format, and I am able to stream the texture onto the quad, but something about it won't make it through the rest of Update(). FYI, when I run the application in Unity with the camera plugged into the PC, all the Update() code works, so I'm pretty sure either how I convert the Texture2D into a Mat or the formatting of the Texture2D isn't right when I build for Android.

    Any ideas? I'm stumped...

    Luke
     
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447

    The most common cause of errors in the fastMatToTexture2D method is when the Mat and Texture2D data formats or sizes are different.
    I looked at a portion of your code and was curious about the following part.
    Code (CSharp):
    1.  
    2. texture = USBCamera.GetComponent<USBCamera>().tempTexture2D;
    3.  
    The texture must be a writable Texture2D, since it will eventually be passed to the fastMatToTexture2D method.
    Could you please create a simple test scene that uses the texture generated at the beginning of the code as in the original example and try it out?
     
  27. kukewilly

    kukewilly

    Joined:
    Jan 3, 2019
    Posts:
    33

    Sorry, but can you clarify your request? You want me to use the tempTexture2D generated in USBCamera inside the original/unmodified ArUcoWebCamTextureExample?

    Here is more information. This is the code where tempTexture2D is set from USB camera frame in USBCamera.cs

    Code (CSharp):
    1.     public IEnumerator InitCameraForAndroid()
    2.     {
    3.         if (!isUSBCamera)
    4.             Debug.Log("isUSBCamera = false");
    5.             StartCoroutine("InitCamera");
    6.         if (isUSBCamera)
    7.         {
    8.             supportedSizes.Clear();
    9.             //yield return new WaitForSeconds(1F);
    10.             yield return new WaitUntil(() => RequireSupportedSize());
    11.             yield return new WaitUntil(() => supportedSizes.Count != 0);
    12.             sizeSelector.value = supportedSizes.IndexOf(width + "-" + height);
    13.             if (enableFPSDisplay)
    14.             {
    15.                 StopCoroutine("FPSCounter");
    16.                 StartCoroutine("FPSCounter");
    17.             }
    18.             System.Diagnostics.Stopwatch stopwatch = new System.Diagnostics.Stopwatch();
    19.             float runTime = 0;
    20.             Debug.Log(stopping);
    21.             while (!stopping)
    22.             {
    23.                 Debug.Log("Camera Playing");
    24.                 frameID++;
    25.                 stopwatch.Stop();
    26.                 runTime = (float)stopwatch.Elapsed.TotalSeconds;
    27.                 stopwatch.Reset();
    28.                 if (runTime < ((float)1 / FPS))
    29.                     runTime = ((float)1 / FPS) - runTime;
    30.                 else
    31.                     runTime = 0;
    32.                 yield return new WaitForSeconds(runTime);
    33.                 stopwatch.Start();
    34.                 yield return new WaitUntil(() => playing);
    35.                 //Uncomment this line for debug output
    36.                 //yield return new WaitForSeconds(1f + UnityEngine.Random.value);
    37.                 try
    38.                 {
    39.                     tempTexture2D = plugin.GetFrame(deviceID);
    40.                     if (tempTexture2D != null)
    41.                     {
    42.                         Debug.Log("Got Frame");
    43.                         if (!onlyRenderOnUI)
    44.                             screenRender.material.mainTexture = tempTexture2D;
    45.                         screenImage.texture = tempTexture2D;
    46.                         screenImage.material.mainTexture = tempTexture2D;
    47.                     }
    48.                 }
    49.                 catch (Exception e)
    50.                 {
    51.                     CameraDebug.Log("Empty frame: " + e);
    52.                 }
    53.             }
    54.         }
    55.     }

    Here is plugin.GetFrame(deviceID) where plugin = AARplugin.cs

    Code (CSharp):
    1.         public Texture2D GetFrame(int deviceID)
    2.         {
    3.             if (androidJavaObject.Call<bool>("getCameraState", 0, deviceID))
    4.             {
    5.                 int textureId = 0;
    6.                 if (frameTransferMode == FrameTransferMode.GPU_Mode)
    7.                     textureId = androidJavaObject.Call<int>("getTextureID", deviceID);
    8.                 else
    9.                     textureId = androidJavaObject.Call<int>("getTextureIDByRS", deviceID);
    10.                 if (textureId != 0)
    11.                 {
    12.                     CameraDebug.Log("create external texture");
    13.                     if (rawTextures[deviceID] == null || rawTextures[deviceID].width != cameraScreens[deviceID].width ||
    14.                         rawTextures[deviceID].height != cameraScreens[deviceID].height)
    15.                     {
    16.                         rawTextures[deviceID] = null;
    17.                         rawTextures[deviceID] = Texture2D.CreateExternalTexture(cameraScreens[deviceID].width,
    18.                             cameraScreens[deviceID].height, TextureFormat.RGB565, false, false, (IntPtr)textureId);
    19.                         rawTextures[deviceID].wrapMode = TextureWrapMode.Clamp;
    20.                         rawTextures[deviceID].filterMode = FilterMode.Bilinear;
    21.                     }
    22.                     else
    23.                     {
    24.                         rawTextures[deviceID].UpdateExternalTexture((IntPtr)textureId);
    25.                     }
    26.                 }
    27.                 return rawTextures[deviceID];
    28.             }
    29.             else
    30.             {
    31.                 return null;
    32.             }
    33.         }
    It looks like here the Texture2D in AARplugin.cs is created as RGB565, but in USBCamera.cs tempTexture2D is initialized as RGBA32. This isn't something I wrote; it is the original form of the code.

    Code (CSharp):
    1.     public void InitScreen()
    2.     {
    3.         if (plugin.frameTransferMode == AARplugin.FrameTransferMode.GPU_Mode)
    4.             textureSize = orginalSize;
    5.         else
    6.             textureSize = flipSize;
    7.         FPS_text.enabled = enableFPSDisplay;
    8.         if (tempTexture2D == null)
    9.         {
    10.             tempTexture2D = new Texture2D(width, height, TextureFormat.RGBA32, false);
    11.         }
    12.         else
    13.         {
    14.             tempTexture2D.Resize(width, height, TextureFormat.RGBA32, false);
    15.             tempTexture2D.Apply();
    16.         }
    17.         if (autoFit)
    18.         {
    19.             if (!onlyRenderOnUI)
    20.                 AutoFitScreen(screen, 1.0f);
    21.             AutoFitScreen(screenUI, 5.0f);
    22.         }
    23. #if  UNITY_EDITOR || UNITY_STANDALONE_WIN
    24.         StopCoroutine("InitCamera");
    25.         StartCoroutine("InitCamera");
    26. #endif
    27. #if UNITY_ANDROID
    28.         StopCoroutine("InitCameraForAndroid");
    29.         Debug.Log("Starting Camera Initialization for Android");
    30.         StartCoroutine("InitCameraForAndroid");
    31. #endif
    32.     }
     
  28. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447

    In your code it looks like WebCamTexture and USBCamera are opening the camera at the same time.
    Can you create a simple test scene that eliminates extraneous elements to make it easier to identify the cause of the problem?
    For example, a simple code like this:
    Code (CSharp):
    1.     Texture2D newTexture;
    2.     Mat imgMat;
    3.  
    4.     void Update ()
    5.     {          
    6.         #if UNITY_ANDROID && !UNITY_EDITOR
    7.                 if(USBCamera.GetComponent<USBCamera>().tempTexture2D != null)
    8.                 {
    9.                     Texture2D texture = USBCamera.GetComponent<USBCamera>().tempTexture2D;
    10.                     RenderTexture tmp = RenderTexture.GetTemporary(texture.width, texture.height, 0, RenderTextureFormat.Default, RenderTextureReadWrite.Linear);
    11.                     Graphics.Blit(texture, tmp);
    12.                     RenderTexture previous = RenderTexture.active;
    13.                     RenderTexture.active = tmp;
    14.                     if (newTexture == null)
    15.                     {
    16.                         newTexture = new Texture2D (texture.width, texture.height, TextureFormat.RGBA32, false);
    17.                         gameObject.GetComponent<Renderer>().material.mainTexture = newTexture;
    18.                     }
    19.  
    20.                     newTexture.ReadPixels(new UnityEngine.Rect(0, 0, tmp.width, tmp.height), 0, 0);
    21.                     newTexture.Apply();
    22.                     RenderTexture.active = previous;
    23.  
    24.                     if (imgMat == null)
    25.                         imgMat = new Mat (newTexture.height, newTexture.width, CvType.CV_8UC4);
    26.  
    27.                     Utils.texture2DToMat (newTexture, imgMat);
    28.  
    29.                     Imgproc.putText (imgMat, "W:" + imgMat.width () + " H:" + imgMat.height () + " SO:" + Screen.orientation, new Point (5, imgMat.rows () - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
    30.  
    31.                     Utils.matToTexture2D (imgMat, newTexture);
    32.                 }
    33.         #endif
    34.     }
     
  29. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    98
    I have the latest version of this asset in a Unity 2020.3 project, running on Windows 10.
    When I try to build to iOS I get this error:



    The RemoveSimulatorArchitectures() method fails when outputting an Xcode project in UnityEditor on non-macOS.
    Before outputting the Xcode project, please execute the following command on macOS.
    //remove i386 architectures.
    lipo -remove i386 opencv2.framework/opencv2 -o opencv2.framework/opencv2
    //remove x86_64 architectures.
    lipo -remove x86_64 opencv2.framework/opencv2 -o opencv2.framework/opencv2
    //check the architectures.
    lipo -info opencv2.framework/opencv2
    //remove i386 architectures.
    lipo -remove i386 libopencvforunity.a -o libopencvforunity.a
    //remove x86_64 architectures.
    lipo -remove x86_64 libopencvforunity.a -o libopencvforunity.a
    //check the architectures.
    lipo -info libopencvforunity.a
    UnityEngine.Debug:LogError (object)
    OpenCVForUnity.OpenCVForUnityIOSBuildPostprocessor:OnPostprocessBuild (UnityEditor.BuildTarget,string) (at Assets/OpenCVForUnity/Editor/OpenCVForUnityIOSBuildPostprocessor.cs:44)
    UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&)



    Trying to build the Xcode project (on a Mac) fails as well. I thought this problem had been fixed by a post-processing script, but it doesn't seem to be working. Any ideas what to do now?
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Are Xcode's Command Line Tools already installed on macOS? Xcode's Command Line Tools must be installed to run the lipo command.
     
  31. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    98
    Xcode's Command Line Tools are installed on macOS; however, this error happens when building the Xcode project on Windows. This is my workflow: I develop under Windows, build the Xcode project under Windows, then move the Xcode project to macOS and build it in Xcode there.
     
  32. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    98
    When I try to build the Xcode project I get this error:

    Building for iOS, but the embedded framework 'opencv2.framework' was built for iOS + iOS Simulator.
     
  33. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    In order to execute the RemoveSimulatorArchitectures() method, the Unity project must be moved to macOS and the Xcode project must be output.
    Code (CSharp):
    1. The RemoveSimulatorArchitectures() method fails when outputting an Xcode project in UnityEditor on non-macOS.
    2.  
    3. Before outputting the Xcode project, please execute the following command on macOS.
     
  34. vice39

    vice39

    Joined:
    Nov 11, 2016
    Posts:
    98
    So basically this asset doesn't work for building to iOS from Windows. This limitation should be documented somewhere. This is a big deal for me.
     
  35. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Once you have moved "opencv2.framework" and "libopencvforunity.a" from the Unity project to macOS, run the following lipo commands and then move them back to the Unity project on Windows. Then you can build your Unity project in your workflow.

    Code (Bash):
    # remove the i386 architecture.
    lipo opencv2.framework/opencv2 -remove i386 -output opencv2.framework/opencv2

    # remove the x86_64 architecture.
    lipo opencv2.framework/opencv2 -remove x86_64 -output opencv2.framework/opencv2

    # check the architectures.
    lipo -info opencv2.framework/opencv2

    # remove the i386 architecture.
    lipo libopencvforunity.a -remove i386 -output libopencvforunity.a

    # remove the x86_64 architecture.
    lipo libopencvforunity.a -remove x86_64 -output libopencvforunity.a

    # check the architectures.
    lipo -info libopencvforunity.a
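    For convenience, the six invocations above can be batched in a small POSIX shell sketch. This is not part of the asset: the `strip_slices` helper and the `DRY_RUN` switch are my own names, and the paths assume the default file layout from the post above. `DRY_RUN` defaults to on, so running the script as-is only prints the commands; unset it on macOS to actually execute lipo.

    ```shell
    #!/bin/sh
    # Sketch: strip the simulator slices (i386, x86_64) from both OpenCV binaries.
    # DRY_RUN defaults to 1 (print commands only); set DRY_RUN= to execute lipo.
    : "${DRY_RUN:=1}"

    run() {
        # Print the command in dry-run mode, otherwise execute it.
        if [ -n "$DRY_RUN" ]; then echo "$*"; else "$@"; fi
    }

    strip_slices() {
        target="$1"
        for arch in i386 x86_64; do
            run lipo "$target" -remove "$arch" -output "$target"
        done
        # Verify what architectures remain afterwards.
        run lipo -info "$target"
    }

    strip_slices opencv2.framework/opencv2
    strip_slices libopencvforunity.a
    ```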
     
  36. Wailander

    Wailander

    Joined:
    May 2, 2017
    Posts:
    12
    Hey, is there any way to reduce the WebGL build size? I can't find the “OpenCVForUnity/Extra/exclude_contrib/WebGL/libs” folder. (I know I'd lose Aruco detection, but I wanted to see how much it'd save.)

    I think I'll only need image tracking and Aruco detection. Is there a way for me to build the libs for my use case?

    Thanks.
     
  37. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Sorry, the documentation was incorrect.
    Could you replace the “OpenCVForUnity/Plugins/WebGL/” folder with the “OpenCVForUnity/Extra/exclude_contrib/WebGL/” folder?

    And unfortunately, there is no way to build custom libraries for the WebGL platform.
     
  38. Wailander

    Wailander

    Joined:
    May 2, 2017
    Posts:
    12
    There is no Extra folder in the asset. Is it available somewhere else?
     
  39. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Could you select the menu item [Tools/OpenCV for Unity/Import Extra Package]?
    Import_Extra.png
     
    Wailander likes this.
  40. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
    Hello, I bought this plugin last week and I'm wondering: is it possible to use full screen in portrait mode? If so, how do you properly size the Quad to fill up the entire screen (on iOS/Android)?
     
  41. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
  42. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Thank you very much for reporting.
    Could you please try this code?

    float imageSizeScale = 1.0f;
    float widthScale = (float)Screen.width / width;
    float heightScale = (float)Screen.height / height;
    if (widthScale < heightScale)
    {
        // Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
        // imageSizeScale = (float)Screen.height / (float)Screen.width;
        Camera.main.orthographicSize = height / 2;
    }
    else
    {
        // Camera.main.orthographicSize = height / 2;
        Camera.main.orthographicSize = (width * (float)Screen.height / (float)Screen.width) / 2;
        imageSizeScale = (float)Screen.height / (float)Screen.width;
    }
    ----------------------------------------------------------------------------------------------
    if (widthScale < heightScale)
    {
        // ARCamera.fieldOfView = (float)(fovx[0] * fovXScale);
        ARCamera.fieldOfView = (float)(fovy[0] * fovYScale);
    }
    else
    {
        // ARCamera.fieldOfView = (float)(fovy[0] * fovYScale);
        ARCamera.fieldOfView = (float)(fovx[0] * fovXScale);
    }
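    For intuition, the first branch of that C# snippet is just an aspect-fit choice between the screen and the camera image. A quick shell/awk sketch (the `ortho_size` helper and the sample dimensions are illustrative, not from the asset) prints the orthographic size each screen/image combination would pick:

    ```shell
    # Mirrors the widthScale/heightScale branch above: given screen width/height
    # and camera image width/height, print the chosen orthographic size.
    ortho_size() {
        awk -v sw="$1" -v sh="$2" -v iw="$3" -v ih="$4" 'BEGIN {
            ws = sw / iw; hs = sh / ih;
            if (ws < hs)
                print ih / 2;              # screen is relatively tall: fit image height
            else
                print (iw * sh / sw) / 2;  # screen is relatively wide: scale by screen aspect
        }'
    }

    ortho_size 1080 1920 640 480   # portrait 1080x1920 screen, 640x480 camera image -> 240
    ortho_size 1920 1080 640 480   # landscape 1920x1080 screen -> 180
    ```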
     
  44. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
    Thank you so much, this code is working like a charm for me on Android, but on iOS devices the ARObject's position and scale are not perfect. Can you please also suggest something for improving the FPS count on mobile devices?
     
    Last edited: May 10, 2022
  45. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
    Is there any way to run the AR examples on mobile WebGL?
     
  46. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    https://docs.unity3d.com/2022.2/Documentation/Manual/webgl-browsercompatibility.html
    Unity WebGL doesn’t support mobile devices. It might work on high-end devices, but current devices are often not powerful and don’t have enough memory to support Unity WebGL content.

    There still seems to be a problem with WebCamTexture for mobile builds of WebGL.
    https://forum.unity.com/threads/ios-safari-webgl-webcamtexture-cant-access-back-camera.1238674/
     
  47. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
    Any thoughts on this?
     
  48. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,447
    Could you tell me about the environment you tested?
    Unity version :
    OpenCVforUnity version :
    DlibFaceLandmarkDetector version :
    iOS version :
    Device :
     
  49. Anshul_Goyal

    Anshul_Goyal

    Joined:
    Jul 22, 2021
    Posts:
    7
    Unity version : 2020.3.14f
    OpenCVforUnity version : 2.4.7
    DlibFaceLandmarkDetector version : 1.3.3
    iOS version : 15.4.1
    Device : iPhone 12 Pro
     
  50. olsung

    olsung

    Joined:
    Apr 22, 2021
    Posts:
    1
    @EnoxSoftware can you include Apple Silicon support for the OpenCV dylibs?

    Currently the ones included are x86_64 only:

    % file ./OpenCVForUnity/Plugins/macOS/opencvforunity.bundle/Contents/MacOS/libopencv_aruco.4.5.0.dylib
    ./OpenCVForUnity/Plugins/macOS/opencvforunity.bundle/Contents/MacOS/libopencv_aruco.4.5.0.dylib: Mach-O 64-bit dynamically linked shared library x86_64

    Or could you provide us with instructions to build universal Intel + Apple Silicon dylibs ourselves?
     