[RELEASED] Dlib FaceLandmark Detector

Discussion in 'Assets and Asset Store' started by EnoxSoftware, Jun 4, 2016.

  1. denisatglaza

    denisatglaza

    Joined:
    May 14, 2018
    Posts:
    18
    I think you might have misunderstood what coordinates I'm looking for. On your website I see the 68-point map with double-digit point indices, but I'm looking for (x, y, z) coordinates like Point3 (26, 15, 83). I was able to open the object in Visual Studio, but I can't see x, y, z coordinates when I click on the orange points. Blender doesn't want to open the object at all. Am I doing something wrong? I'm looking for something like this that I can view on the male model and then use to create objectPoints on my own:

    [Attached image: Screenshot (21).png]
     
  2. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    You can check the coordinates in the Unity Editor as well.
    [Attached images: ar_unity_editor1.png, ar_unity_editor2.png]
     
    denisatglaza likes this.
  3. NathanNegreiroMyPad3D

    NathanNegreiroMyPad3D

    Joined:
    May 10, 2018
    Posts:
    7
    Hello again! I managed to solve the issue I was having. It was a simple mistake I had made in setting up the face detection.

    However, now that I have face detection working, I am trying to crop the image closer to the detected face. I've noticed that the Rect returned from Detect() differs from the Rect drawn by DrawDetectResult(...). The Rects only ever differ on the y axis, and the DrawDetectResult(...) Rect usually gives a much better result.

    Is there a better way to go about cropping the image than basing it off of the Detect() Rect?

    Thank You.
     
  4. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    In which example can this problem be reproduced?
    Could you send simple code that reproduces the problem with Detect() and DrawDetectResult()?
    https://enoxsoftware.com/opencvforunity/contact/technical-inquiry/
     
  5. NathanNegreiroMyPad3D

    NathanNegreiroMyPad3D

    Joined:
    May 10, 2018
    Posts:
    7
    Hello! I have managed to resolve the issue. It turned out not to be a problem with Detect() or DrawDetectResult() at all; it was an issue with Texture2D.SetPixel() and other helper functions using a bottom-left coordinate system.
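    In case it helps anyone else, here is a minimal sketch of the conversion involved (illustrative only; the helper name is made up). It assumes the detection Rect uses a top-left image origin while Texture2D.GetPixels/SetPixels use a bottom-left origin:

    Code (CSharp):
    using UnityEngine;

    public static class RectCoordUtil
    {
        // Convert a detection rect (top-left origin) into the bottom-left origin
        // that Texture2D.GetPixels/SetPixels expect.
        public static RectInt ToBottomLeftOrigin (Rect detectRect, int textureHeight)
        {
            int x = Mathf.RoundToInt (detectRect.x);
            int w = Mathf.RoundToInt (detectRect.width);
            int h = Mathf.RoundToInt (detectRect.height);
            // Flip the y axis: the distance from the image top to the rect's top edge
            // becomes the distance from the image bottom to the rect's bottom edge.
            int y = textureHeight - (Mathf.RoundToInt (detectRect.y) + h);
            return new RectInt (x, y, w, h);
        }
    }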
     
    EnoxSoftware likes this.
  6. sasa_1021

    sasa_1021

    Joined:
    Nov 30, 2013
    Posts:
    1
    Hi,
    I would like to know whether the current version of Dlib FaceLandmark Detector uses the iBUG 300-W dataset or not.
    I can't find any information newer than the reply posted on Oct 17, 2017.
    Since I plan to use Dlib FaceLandmark Detector in a commercial application, I would like to confirm.
     
  7. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    sp_human_face_68.dat and sp_human_face_68_mobile.dat are trained using the dlib 5-point face landmark dataset. https://github.com/davisking/dlib-models
    The "iBUG 300-W dataset" has already been excluded from the dataset used to train sp_human_face_68.dat.
    So sp_human_face_68.dat and sp_human_face_68_mobile.dat are available for commercial use.
     
  8. denisatglaza

    denisatglaza

    Joined:
    May 14, 2018
    Posts:
    18
    Hello,

    I tried to create an additional face point in ARHeadWebCamTextureExample, but I couldn't add any beyond 7. Where in the script can I increase the number of face points, or how can I create more? Thank you!
     
  9. haeleeeeleah

    haeleeeeleah

    Joined:
    Jul 15, 2018
    Posts:
    1
    Hello, I want to load another video in VideoCaptureFaceSwapperExample, but when I load another video it looks strange. How can I resolve it?
     
  10. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Please change these arrays in ARHeadWebCamTextureExample; a sketch of the idea follows the links below.

    objectPoints68 (3D Points)
    https://github.com/EnoxSoftware/Dli...ample/ARHeadWebCamTextureExample.cs#L266-L272

    imagePoints (2D Points)
    https://github.com/EnoxSoftware/Dli...ample/ARHeadWebCamTextureExample.cs#L471-L477
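    As a rough sketch only (the concrete values and landmark indices below are illustrative, not taken from the linked file), the key point is that the two arrays must stay in sync: every new Point3 added to the 3D objectPoints68 needs a matching 2D entry built from the detected landmark list (here called points, as returned by DetectLandmark()):

    Code (CSharp):
    // Sketch: add one extra 3D/2D correspondence for solvePnP.
    // 3D points are in the head model's space; measure the new position on your model.
    MatOfPoint3f objectPoints = new MatOfPoint3f (
        new Point3 (-26, 15, 83),   // left mouth corner (model space)
        new Point3 (26, 15, 83),    // right mouth corner (model space)
        new Point3 (0, 0, 0)        // NEW point: fill in the model-space position you measured
    );

    // points = faceLandmarkDetector.DetectLandmark (rect);  // 68 landmark positions
    MatOfPoint2f imagePoints = new MatOfPoint2f (
        new Point (points [48].x, points [48].y),   // left mouth corner (landmark 48)
        new Point (points [54].x, points [54].y),   // right mouth corner (landmark 54)
        new Point (points [33].x, points [33].y)    // NEW point: the landmark matching your new 3D point
    );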
     
  11. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    FaceMaskExample adds a mask to the face by superimposing an OpenCV Mat and a Unity Mesh, so I think it is difficult to combine FaceMaskExample and ComicFilterExample. To combine them, you would need to make significant changes to the code.
     
  12. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
    Hi,
    I have an issue with the FaceMaskExample. In "WebCamTextureFaceMaskExample" there is an option in the inspector of the quad named "Requested Is Front Facing", which works completely fine on iOS, but on my Android device it still opens the back camera (I don't know whether other devices are affected too; I only have one Android device, a Huawei Mate 10 Pro). If I uncheck the option, the front camera of the Android device opens, but then the back camera opens on the iPhone.

    How can I fix this issue? Or is it a problem with my phone?
     
  13. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Does that issue also occur in OpenCVForUnity's WebCamTextureToMatHelperExample?
    https://github.com/EnoxSoftware/Ope...xamples/Basic/WebCamTextureToMatHelperExample
     
  14. Macode

    Macode

    Joined:
    Nov 28, 2013
    Posts:
    38
  15. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
  16. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  17. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Could you tell me the environment you tested in?
    FaceMaskExample version :
    DlibFaceLandmarkDetector version :
    OpenCVForUnity version :
    Unity version :
     
  18. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
    Also, for some reason, when I hit the "Change Cam" button, the mask also changes.

    FaceMaskExample version : 1.0.6
    DlibFaceLandmarkDetector version : 1.2.1
    OpenCVForUnity version : 2.2.9
    Unity version: 2017.4.1f1

    Behavior:
    iOS: opens the front camera
    Android: opens the back camera and changes the mask

    OnButtonChangeCam: changes the camera and the mask (Android only!)
     
    Last edited: Aug 14, 2018
  19. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    https://github.com/EnoxSoftware/Fac...ple/WebCamTextureFaceMaskExample.cs#L272-L283
    https://github.com/EnoxSoftware/Fac...ple/WebCamTextureFaceMaskExample.cs#L702-L709
    I think these lines are probably related to this problem. Could you comment them out and test?

    Also, if you do not want the FaceMask to change when switching cameras, please comment out this line:
    https://github.com/EnoxSoftware/Fac...kExample/WebCamTextureFaceMaskExample.cs#L340
     
  20. gavin_quander

    gavin_quander

    Joined:
    Sep 17, 2017
    Posts:
    1
    Does this work with Unity 2018?
     
  21. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
    That fixed it, thanks a lot!
     
  22. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Yes, DlibFaceLandmarkDetector works with Unity 2018 without problems.
     
  23. lifeisbetter

    lifeisbetter

    Joined:
    Nov 23, 2017
    Posts:
    3
    Hi,
    When I upload the app to the Mac App Store with Application Loader, it shows:
    ERROR ITMS-90240: "Unsupported Architectures. Your executable contained the following disallowed architectures: '[i386 (in com.face.faceface.pkg/Payload/faceface.app/Contents/Plugins/dlibfacelandmarkdetector.bundle/Contents/MacOS/dlibfacelandmarkdetector)]'. New apps submitted to the Mac App Store must support 64-bit starting January 2018, and Mac app updates and existing apps must support 64-bit starting June 2018."

    And the other files in "OpenCVForUnity\Plugins\macOS\opencvforunity.bundle\Contents\MacOS" are also i386.

    Dlib FaceLandmark Detector version: v1.2.2

    How can I solve this problem? Thank you!
     
  24. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    It seems necessary to remove the unsupported architecture (i386).
    https://forum.unity.com/threads/app-rejected-unsupported-architecture.541660/

    Please enter the following commands in the terminal to remove the unsupported architecture (i386).
    Code (Bash):
    cd YourProject/Assets/DlibFaceLandmarkDetector/Plugins/macOS
    ditto -v --arch x86_64 dlibfacelandmarkdetector.bundle new_dlibfacelandmarkdetector.bundle
    Then rename new_dlibfacelandmarkdetector.bundle to dlibfacelandmarkdetector.bundle.

    Code (Bash):
    cd YourProject/Assets/OpenCVForUnity/Plugins/macOS
    ditto -v --arch x86_64 opencvforunity.bundle new_opencvforunity.bundle
    Then rename new_opencvforunity.bundle to opencvforunity.bundle.
     
  25. lifeisbetter

    lifeisbetter

    Joined:
    Nov 23, 2017
    Posts:
    3

    Hi,
    Thanks for your reply; it solved the i386 problem.
    But after I uploaded successfully, Apple sent me an email with this issue:
    Invalid Signature - The executable at path faceface.app/Contents/Plugins/opencvforunity.bundle/Contents/MacOS/libopencv_aruco.3.4.1.dylib has following signing error(s): code object is not signed at all In architecture: x86_64 . Refer to the Code Signing and Application Sandboxing Guide at http://developer.apple.com/library/...ceptual/CodeSigningGuide/AboutCS/AboutCS.html and Technical Note 2206 at https://developer.apple.com/library/mac/technotes/tn2206/_index.html for more information.

    And the same issue occurs with the other dylib files in "faceface.app/Contents/Plugins/opencvforunity.bundle/Contents/MacOS/".


    In the end, it shows:
    Specifically, codesign generated the following error: faceface.app/Contents/Plugins/dlibfacelandmarkdetector.bundle/Contents/MacOS/dlibfacelandmarkdetector: unsealed contents present in the bundle root faceface.pkg/Payload/faceface.app/Contents/Plugins/opencvforunity.bundle/Contents/MacOS/opencvforunity: unsealed contents present in the bundle root
    faceface.pkg/Payload/faceface.app/Contents/Plugins/dlibfacelandmarkdetector.bundle: unsealed contents present in the bundle root
    faceface.pkg/Payload/faceface.app/Contents/Plugins/opencvforunity.bundle: unsealed contents present in the bundle root


    I use this command to code-sign the files:
    codesign -f -s '3rd Party Mac Developer Application: DEVELOPER NAME' --entitlements "GAME.entitlements" "GAMENAME.app" --deep

    How should I solve this problem? Thanks for your help!
     
  26. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    It seems that the files in the Plugins folder also need to be code-signed.
    https://forum.unity.com/threads/signing-mac-app-on-os-x-mavericks.206762/#post-1505996
    https://stackoverflow.com/questions/33037801/invalid-signature-when-submitting-on-mac-app-store
     
  27. dion04

    dion04

    Joined:
    May 16, 2018
    Posts:
    1
    Hello,


    I would like to detect faces in an image (from a Unity folder or any user folder), crop them, and then save them as separate images.

    Can I do this with Dlib alone, or is "OpenCV for Unity" required?
     
  28. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    I think it is possible to do the cropping and saving without using "OpenCV for Unity"; however, I do not have example code. A rough sketch of the approach is below.
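    For what it's worth, here is one way it could look with the detector plus plain Unity APIs only (no OpenCV). This is an illustration, not code from the asset: the paths and field names are placeholders, it assumes a Read/Write-enabled Texture2D and the SetImage(Texture2D) overload, and it flips the y axis because Detect() rects use a top-left origin while GetPixels() uses a bottom-left origin, as discussed earlier in this thread.

    Code (CSharp):
    using System.Collections.Generic;
    using System.IO;
    using UnityEngine;
    using DlibFaceLandmarkDetector;

    public class CropAndSaveFacesSketch : MonoBehaviour
    {
        public string shapePredictorPath;   // placeholder: path to sp_human_face_68.dat
        public Texture2D sourceTexture;     // must be Read/Write enabled in the import settings

        void Start ()
        {
            FaceLandmarkDetector detector = new FaceLandmarkDetector (shapePredictorPath);
            detector.SetImage (sourceTexture);

            List<Rect> faces = detector.Detect ();
            for (int i = 0; i < faces.Count; i++) {
                Rect r = faces [i];

                // Clamp the detected rect to the texture bounds.
                int x = Mathf.Clamp (Mathf.RoundToInt (r.x), 0, sourceTexture.width - 1);
                int yTop = Mathf.Clamp (Mathf.RoundToInt (r.y), 0, sourceTexture.height - 1);
                int w = Mathf.Clamp (Mathf.RoundToInt (r.width), 1, sourceTexture.width - x);
                int h = Mathf.Clamp (Mathf.RoundToInt (r.height), 1, sourceTexture.height - yTop);
                // Flip y: the detector's origin is top-left, GetPixels' origin is bottom-left.
                int y = sourceTexture.height - (yTop + h);

                // Copy the face region into a new texture and save it as a PNG.
                Texture2D crop = new Texture2D (w, h, TextureFormat.RGBA32, false);
                crop.SetPixels (sourceTexture.GetPixels (x, y, w, h));
                crop.Apply ();
                File.WriteAllBytes (Path.Combine (Application.persistentDataPath, "face_" + i + ".png"), crop.EncodeToPNG ());
            }
            detector.Dispose ();
        }
    }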
     
  29. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
    Hello,
    I am looking for a way to hide the small preview window for the masks in the FaceMaskExample ("WebCamTextureFaceMaskExample"). I searched for a long time but could not find a solution. Have I failed to spot the option to hide it?
     
  30. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    The example does not provide the option to hide the mask preview.
    Could you comment out lines 544 to 569 of the code (WebCamTextureFaceMaskExample.cs)?
    https://github.com/EnoxSoftware/Fac...ple/WebCamTextureFaceMaskExample.cs#L544-L569

    Code (CSharp):
    1.         // Update is called once per frame
    2.         void Update ()
    3.         {
    4.             if (webCamTextureToMatHelper.IsPlaying () && webCamTextureToMatHelper.DidUpdateThisFrame ()) {
    5.  
    6.                 Mat rgbaMat = webCamTextureToMatHelper.GetMat ();
    7.  
    8.                 // detect faces.
    9.                 List<OpenCVForUnity.Rect> detectResult = new List<OpenCVForUnity.Rect> ();
    10.                 if (useDlibFaceDetecter) {
    11.                     OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbaMat);
    12.                     List<UnityEngine.Rect> result = faceLandmarkDetector.Detect ();
    13.  
    14.                     foreach (var unityRect in result) {
    15.                         detectResult.Add (new OpenCVForUnity.Rect ((int)unityRect.x, (int)unityRect.y, (int)unityRect.width, (int)unityRect.height));
    16.                     }
    17.                 } else {
    18.                     // convert image to greyscale.
    19.                     Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
    20.  
    21.                     using (Mat equalizeHistMat = new Mat ())
    22.                     using (MatOfRect faces = new MatOfRect ()) {
    23.                         Imgproc.equalizeHist (grayMat, equalizeHistMat);
    24.  
    25.                         cascade.detectMultiScale (equalizeHistMat, faces, 1.1f, 2, 0 | Objdetect.CASCADE_SCALE_IMAGE, new OpenCVForUnity.Size (equalizeHistMat.cols () * 0.15, equalizeHistMat.cols () * 0.15), new Size ());
    26.  
    27.                         detectResult = faces.toList ();
    28.                     }
    29.  
    30.                     // correct the deviation of the detection result of the face rectangle of OpenCV and Dlib.
    31.                     foreach (OpenCVForUnity.Rect r in detectResult) {
    32.                         r.y += (int)(r.height * 0.1f);
    33.                     }
    34.                 }                  
    35.  
    36.                 // face tracking.
    37.                 rectangleTracker.UpdateTrackedObjects (detectResult);
    38.                 List<TrackedRect> trackedRects = new List<TrackedRect> ();
    39.                 rectangleTracker.GetObjects (trackedRects, true);
    40.  
    41.                 // create noise filter.
    42.                 foreach (var openCVRect in trackedRects) {
    43.                     if (openCVRect.state == TrackedState.NEW) {
    44.                         if (!lowPassFilterDict.ContainsKey(openCVRect.id))
    45.                             lowPassFilterDict.Add (openCVRect.id, new LowPassPointsFilter((int)faceLandmarkDetector.GetShapePredictorNumParts()));
    46.                         if (!opticalFlowFilterDict.ContainsKey(openCVRect.id))
    47.                             opticalFlowFilterDict.Add (openCVRect.id, new OFPointsFilter((int)faceLandmarkDetector.GetShapePredictorNumParts()));
    48.                     }else if (openCVRect.state == TrackedState.DELETED){
    49.                         if (lowPassFilterDict.ContainsKey (openCVRect.id)) {
    50.                             lowPassFilterDict [openCVRect.id].Dispose ();
    51.                             lowPassFilterDict.Remove (openCVRect.id);
    52.                         }
    53.                         if (opticalFlowFilterDict.ContainsKey (openCVRect.id)) {
    54.                             opticalFlowFilterDict [openCVRect.id].Dispose ();
    55.                             opticalFlowFilterDict.Remove (openCVRect.id);
    56.                         }
    57.                     }
    58.                 }
    59.  
    60.                 // create LUT texture.
    61.                 foreach (var openCVRect in trackedRects) {
    62.                     if (openCVRect.state == TrackedState.NEW) {
    63.                         faceMaskColorCorrector.CreateLUTTex (openCVRect.id);
    64.                     }else if (openCVRect.state == TrackedState.DELETED) {
    65.                         faceMaskColorCorrector.DeleteLUTTex (openCVRect.id);
    66.                     }
    67.                 }
    68.  
    69.                 // detect face landmark points.
    70.                 OpenCVForUnityUtils.SetImage (faceLandmarkDetector, rgbaMat);
    71.                 List<List<Vector2>> landmarkPoints = new List<List<Vector2>> ();
    72.                 for (int i = 0; i < trackedRects.Count; i++) {
    73.                     TrackedRect tr = trackedRects [i];
    74.                     UnityEngine.Rect rect = new UnityEngine.Rect (tr.x, tr.y, tr.width, tr.height);
    75.  
    76.                     List<Vector2> points = faceLandmarkDetector.DetectLandmark (rect);
    77.  
    78.                     // apply noise filter.
    79.                     if (enableNoiseFilter) {
    80.                         if (tr.state > TrackedState.NEW && tr.state < TrackedState.DELETED) {
    81.                             opticalFlowFilterDict [tr.id].Process (rgbaMat, points, points);
    82.                             lowPassFilterDict [tr.id].Process (rgbaMat, points, points);
    83.                         }
    84.                     }
    85.  
    86.                     if (extendForehead){
    87.                         AddForeheadPoints(points);
    88.                     }
    89.  
    90.                     landmarkPoints.Add (points);
    91.                 }
    92.  
    93.                 // face masking.
    94.                 if (faceMaskTexture != null && landmarkPoints.Count >= 1) { // Apply face masking between detected faces and a face mask image.
    95.  
    96.                     float maskImageWidth = faceMaskTexture.width;
    97.                     float maskImageHeight = faceMaskTexture.height;
    98.  
    99.                     TrackedRect tr;
    100.  
    101.                     for (int i = 0; i < trackedRects.Count; i++) {
    102.                         tr = trackedRects [i];
    103.  
    104.                         if (tr.state == TrackedState.NEW) {
    105.                             meshOverlay.CreateObject (tr.id, faceMaskTexture);
    106.                         }
    107.                         if (tr.state < TrackedState.DELETED) {
    108.                             MaskFace (meshOverlay, tr, landmarkPoints [i], faceLandmarkPointsInMask, maskImageWidth, maskImageHeight);
    109.  
    110.                             if (enableColorCorrection) {
    111.                                 CorrectFaceMaskColor (tr.id, faceMaskMat, rgbaMat, faceLandmarkPointsInMask, landmarkPoints [i]);
    112.                             }
    113.                         } else if (tr.state == TrackedState.DELETED) {
    114.                             meshOverlay.DeleteObject (tr.id);
    115.                         }
    116.                     }
    117.                 } else if (landmarkPoints.Count >= 1) { // Apply face masking between detected faces.
    118.  
    119.                     float maskImageWidth = texture.width;
    120.                     float maskImageHeight = texture.height;
    121.  
    122.                     TrackedRect tr;
    123.  
    124.                     for (int i = 0; i < trackedRects.Count; i++) {
    125.                         tr = trackedRects [i];
    126.                      
    127.                         if (tr.state == TrackedState.NEW) {
    128.                             meshOverlay.CreateObject (tr.id, texture);
    129.                         }
    130.                         if (tr.state < TrackedState.DELETED) {
    131.                             MaskFace (meshOverlay, tr, landmarkPoints [i], landmarkPoints [0], maskImageWidth, maskImageHeight);
    132.  
    133.                             if (enableColorCorrection) {
    134.                                 CorrectFaceMaskColor (tr.id, rgbaMat, rgbaMat, landmarkPoints [0], landmarkPoints [i]);
    135.                             }
    136.                         } else if (tr.state == TrackedState.DELETED) {
    137.                             meshOverlay.DeleteObject (tr.id);
    138.                         }
    139.                     }
    140.                 }
    141.  
    142.                 // draw face rects.
    143.                 if (displayFaceRects) {
    144.                     for (int i = 0; i < detectResult.Count; i++) {
    145.                         UnityEngine.Rect rect = new UnityEngine.Rect (detectResult [i].x, detectResult [i].y, detectResult [i].width, detectResult [i].height);
    146.                         OpenCVForUnityUtils.DrawFaceRect (rgbaMat, rect, new Scalar (255, 0, 0, 255), 2);
    147.                     }
    148.  
    149.                     for (int i = 0; i < trackedRects.Count; i++) {
    150.                         UnityEngine.Rect rect = new UnityEngine.Rect (trackedRects [i].x, trackedRects [i].y, trackedRects [i].width, trackedRects [i].height);
    151.                         OpenCVForUnityUtils.DrawFaceRect (rgbaMat, rect, new Scalar (255, 255, 0, 255), 2);
    152.                         //Imgproc.putText (rgbaMat, " " + frontalFaceChecker.GetFrontalFaceAngles (landmarkPoints [i]), new Point (rect.xMin, rect.yMin - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
    153.                         //Imgproc.putText (rgbaMat, " " + frontalFaceChecker.GetFrontalFaceRate (landmarkPoints [i]), new Point (rect.xMin, rect.yMin - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
    154.                     }
    155.                 }
    156.  
    157.                 // draw face points.
    158.                 if (displayDebugFacePoints) {
    159.                     for (int i = 0; i < landmarkPoints.Count; i++) {
    160.                         DrawFaceLandmark (rgbaMat, landmarkPoints [i], new Scalar (0, 255, 0, 255), 2);
    161.                     }
    162.                 }
    163.  
    164. /*
    165.                 // display face mask image.
    166.                 if (faceMaskTexture != null && faceMaskMat != null) {
    167.  
    168.                     if (displayFaceRects) {
    169.                         OpenCVForUnityUtils.DrawFaceRect (faceMaskMat, faceRectInMask, new Scalar (255, 0, 0, 255), 2);
    170.                     }
    171.                     if (displayDebugFacePoints) {
    172.                         DrawFaceLandmark (faceMaskMat, faceLandmarkPointsInMask, new Scalar (0, 255, 0, 255), 2);
    173.                     }
    174.  
    175.                     float scale = (rgbaMat.width () / 4f) / faceMaskMat.width ();
    176.                     float tx = rgbaMat.width () - faceMaskMat.width () * scale;
    177.                     float ty = 0.0f;
    178.                     Mat trans = new Mat (2, 3, CvType.CV_32F);//1.0, 0.0, tx, 0.0, 1.0, ty);
    179.                     trans.put (0, 0, scale);
    180.                     trans.put (0, 1, 0.0f);
    181.                     trans.put (0, 2, tx);
    182.                     trans.put (1, 0, 0.0f);
    183.                     trans.put (1, 1, scale);
    184.                     trans.put (1, 2, ty);
    185.                  
    186.                     Imgproc.warpAffine (faceMaskMat, rgbaMat, trans, rgbaMat.size (), Imgproc.INTER_LINEAR, Core.BORDER_TRANSPARENT, new Scalar (0));
    187.  
    188.                     if (displayFaceRects || displayDebugFacePointsToggle)
    189.                         OpenCVForUnity.Utils.texture2DToMat (faceMaskTexture, faceMaskMat);
    190.                 }
    191. */
    192.  
    193. //                Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 0.5, new Scalar (255, 255, 255, 255), 1, Imgproc.LINE_AA, false);
    194.  
    195.                 OpenCVForUnity.Utils.fastMatToTexture2D (rgbaMat, texture);
    196.             }
    197.         }
     
    bastianmeyerbm3 likes this.
  31. bastianmeyerbm3

    bastianmeyerbm3

    Joined:
    May 22, 2018
    Posts:
    15
    Thank you! I'll try that later
     
    EnoxSoftware likes this.
  32. olliebarbs

    olliebarbs

    Joined:
    Aug 16, 2018
    Posts:
    1
    I'm trying to use this with a full-HD webcam along with the frame optimisation tools and AR scripts. The WebCamTextureToMatHelper script appears to severely limit the webcam frame rate if the requested width and height are anything larger than 640 x 360. This is strange, as Unity's own frame rate runs at more than 100 fps. I have found other webcam scripts that are able to display webcam textures at full HD at 30 fps. The requested FPS seems to have no effect either.
     
  33. tjroger

    tjroger

    Joined:
    May 6, 2017
    Posts:
    2
    Have you resolved this problem? I am facing the same situation now. I guess it's because of Chinese characters contained in the path.
     
  34. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    The process of converting the WebCamTexture to a Mat and back might incur a non-negligible performance cost.
     
  35. Craig-Martin

    Craig-Martin

    Joined:
    Jun 14, 2015
    Posts:
    3
    Hi @EnoxSoftware,

    I bought your plugin from the Asset Store, and ARHeadWebCamTextureExample works great.
    But WebCamTextureToMatHelper uses Unity's WebCamTexture, so the image quality and FPS are not as good as with the ARKit/ARCore camera. I can get the Texture2D from the ARKit/ARCore camera, but I can't apply that texture in ARHeadWebCamTextureExample.cs to make it work.

    Can you make another script that I can feed a Texture2D into, so that it can track the face and display the AR head on that Texture2D?

    Thanks.
     
  36. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Hello Craig-Martin.
    What kind of error is displayed on the console?
     
  37. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
  38. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    A better way to do this is perhaps to use the face mask to determine where the mouth points are for an arbitrary person, instead of assuming that everyone has the same generic Caucasian male 3D model head.
     
    EnoxSoftware likes this.
  39. Craig-Martin

    Craig-Martin

    Joined:
    Jun 14, 2015
    Posts:
    3
    It just can't apply the texture from ARKit/ARCore to the lib, so the plane (Quad object) just displays a gray color.
     
  40. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
  41. Muhammad-Faisal-Aleem

    Muhammad-Faisal-Aleem

    Joined:
    Jun 22, 2015
    Posts:
    9
    Is this the optimized code? We are using this for our project and it's very important for us. The frame rate on a good Android phone is 10 FPS, which is very low. How can it be improved? Please guide us with the information and steps required to do it. We are using Unity.
     
  42. Muhammad-Faisal-Aleem

    Muhammad-Faisal-Aleem

    Joined:
    Jun 22, 2015
    Posts:
    9
    Great, were you able to run it successfully and detect without any issues? I am also going to try this and will let you know about the results. Thanks!
     
  43. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Have you tried FrameOptimizationExample? This example is found in the "Assets / DlibFaceLandmarkDetectorWithOpenCVExample / FrameOptimizationExample /" folder.
    https://github.com/EnoxSoftware/Dli...timizationExample/FrameOptimizationExample.cs
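    If you want to apply the same idea in your own code, the core trick in that example is to run the expensive face detection only every few frames (or on a downscaled image) and reuse the last result for the cheaper per-frame landmark fit. A rough sketch of that pattern, as an illustration only (the field names, the skip interval, and the way the frame texture is obtained are placeholders; it assumes the same Detect()/DetectLandmark() API used elsewhere in this thread):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using DlibFaceLandmarkDetector;

    public class FrameSkipSketch : MonoBehaviour
    {
        public string shapePredictorPath;   // placeholder: path to the .dat file
        public Texture2D currentFrame;      // placeholder: whatever texture you feed the detector each frame

        FaceLandmarkDetector detector;
        List<Rect> cachedRects = new List<Rect> ();
        int frameCount;
        const int detectInterval = 4;       // run full detection every 4th frame; tune for your device

        void Start ()
        {
            detector = new FaceLandmarkDetector (shapePredictorPath);
        }

        void Update ()
        {
            if (currentFrame == null)
                return;

            detector.SetImage (currentFrame);

            // Full face detection is the expensive step, so only run it every few frames.
            if (frameCount % detectInterval == 0)
                cachedRects = detector.Detect ();
            frameCount++;

            // The landmark fit is cheap by comparison and can run every frame on the cached rects.
            foreach (Rect rect in cachedRects) {
                List<Vector2> landmarks = detector.DetectLandmark (rect);
                // ... use the landmarks (drawing, solvePnP, etc.) ...
            }
        }

        void OnDestroy ()
        {
            if (detector != null)
                detector.Dispose ();
        }
    }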
     
  44. universityofgames

    universityofgames

    Joined:
    Nov 24, 2016
    Posts:
    34
    Could you merge FrameOptimizationExample with WebCamTextureFaceMaskExample?
     
  45. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Unfortunately, I don't have an example of merging FrameOptimizationExample with WebCamTextureFaceMaskExample.
     
  46. INGTONY

    INGTONY

    Joined:
    Oct 13, 2014
    Posts:
    24
    Hi, I have a problem in Unity 2018.2.13f. I'm getting this error with Dlib: when it detects the face, it stops and gives me this error frame by frame; if I leave the camera's field of view, the app keeps running.
    NullReferenceException: Object reference not set to an instance of an object
    OpenCVForUnity.Calib3d.solvePnP (OpenCVForUnity.MatOfPoint3f objectPoints, OpenCVForUnity.MatOfPoint2f imagePoints, OpenCVForUnity.Mat cameraMatrix, OpenCVForUnity.MatOfDouble distCoeffs, OpenCVForUnity.Mat rvec, OpenCVForUnity.Mat tvec, Boolean useExtrinsicGuess, Int32 flags) (at Assets/OpenCVForUnity/org/opencv/calib3d/Calib3d.cs:989)
    DlibFaceLandmarkDetectorExample.ARHeadWebCamTextureExample.Update () (at Assets/DlibFaceLandmarkDetectorWithOpenCVExample/ARHeadExample/ARHeadWebCamTextureExample.cs:525)
     
  47. INGTONY

    INGTONY

    Joined:
    Oct 13, 2014
    Posts:
    24
    Also, I went back to previous versions and all of them have a similar issue with old projects too. Could it be associated with drivers or upgrades on my PC?
     
  48. INGTONY

    INGTONY

    Joined:
    Oct 13, 2014
    Posts:
    24
    The Dlib AR examples don't work now in any version of Unity on my PC; all of them just stop and give me errors. I tested old projects in 5.6, 2017.2, and 2018.2, and all of them used to work without an issue.
     
  49. INGTONY

    INGTONY

    Joined:
    Oct 13, 2014
    Posts:
    24
    Testing in 2017.3, I get this one:

    NullReferenceException: Object reference not set to an instance of an object
    DlibFaceLandmarkDetectorExample.WebCamTextureARExample.Update () (at Assets/DlibFaceLandmarkDetectorWithOpenCVExample/WebCamTextureARExample/WebCamTextureARExample.cs:373)
     
  50. EnoxSoftware

    EnoxSoftware

    Joined:
    Oct 29, 2014
    Posts:
    1,566
    Could you tell me the environment you tested in?
    DlibFaceLandmarkDetector version :
    OpenCVForUnity version :
    Unity version :