Resolved AR Foundation with Windows webcam, XRFaceSubsystem.GetFaceMesh, error: can't convert void to bool

Discussion in 'AR' started by PlanetCreate_, Jan 29, 2023.

  1. PlanetCreate_

    PlanetCreate_

    Joined:
    Sep 5, 2021
    Posts:
    2
    What I am trying to achieve with this script: I want to use ARKit face tracking on Windows with a webcam.
    The error I am getting: Cannot implicitly convert type 'void' to 'bool'
    The line of code where the error occurs: if (m_faceManager.GetFaceMesh(faceId, Allocator.Temp, ref faceMesh))

    My questions are: how can I fix the error, and is this a good way to use a webcam with AR Foundation?
    I use:
    Unity version: 2021.3.9f1, Render Pipeline: Universal RP,
    AR Foundation: 4.2.7,
    ARKit XR Plugin: 4.2.7,
    ARKit Face Tracking: 4.2.7 and
    ARCore XR Plugin: 4.2.7 (not used in this script).
    OpenCvSharp: https://assetstore.unity.com/packages/tools/integration/opencv-plus-unity-85928

    documentation of XRFaceSubsystem.GetFaceMesh: https://docs.unity3d.com/Packages/c...ator_UnityEngine_XR_ARSubsystems_XRFaceMesh__

    Code (CSharp):
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Runtime.InteropServices;
    using UnityEngine;
    using Unity.Collections;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;
    using UnityEngine.XR.ARSubsystems;
    using OpenCvSharp;

    public class EyeTracking : MonoBehaviour
    {
        private ARCameraBackground arCamera;
        private Mat mat;
        private Texture2D texture;
        private ARKitFaceSubsystem m_faceSubsystem;
        private XRFaceSubsystem m_faceManager;
        WebCamTexture InputImage;

        void Start()
        {
            // Get the ARKitFaceSubsystem component
            m_faceSubsystem = GetComponent<ARKitFaceSubsystem>();
            if (m_faceSubsystem == null)
            {
                Debug.LogError("ARKitFaceSubsystem is not found in the scene.");
                return;
            }
            // Get the list of available webcams
            WebCamDevice[] devices = WebCamTexture.devices;
            // Initialize the webcam texture with the first available webcam
            InputImage = new WebCamTexture(devices[0].name);
            InputImage.Play();
        }

        void Update()
        {
            if (InputImage != null)
            {
                // Capture image using OpenCvSharp
                mat = OpenCvSharp.Unity.TextureToMat(InputImage);
                // Create the data array with the correct size
                int bytesPerPixel = 3; // assuming the image is in RGB format
                int dataSize = mat.Width * mat.Height * bytesPerPixel;
                byte[] data = new byte[dataSize];
                // Copy the data from the Mat object to the data array
                Marshal.Copy(mat.Data, data, 0, dataSize);
                // Convert Mat to Texture2D
                texture = new Texture2D(mat.Width, mat.Height, TextureFormat.RGB24, false);
                texture.LoadRawTextureData(data);
                texture.Apply();
                // Use the face subsystem to get the face mesh data
                if (m_faceSubsystem != null)
                {
                    using (var faceChanges = m_faceSubsystem.GetChanges(Allocator.Temp))
                    {
                        for (int i = 0; i < faceChanges.added.Length; i++)
                        {
                            var faceId = faceChanges.added[i].trackableId;
                            XRFaceMesh faceMesh = new XRFaceMesh();
                            if (m_faceManager.GetFaceMesh(faceId, Allocator.Temp, ref faceMesh)) // <-- Error
                            {
                                // Process the face mesh data here, for example, by getting the positions of the eyes
                                //Vector3 leftEyePosition = faceMesh.vertices[362];
                                //Vector3 rightEyePosition = faceMesh.vertices[984];
                                // Do something with the eye positions, such as rendering them on the screen
                            }
                        }
                    }
                }
            }
        }
    }
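
    For readers hitting the same compile error: the GetFaceMesh documentation linked above declares the method as returning void. It fills the XRFaceMesh passed by ref instead of returning a success flag, so the call cannot be used as an if condition. (Note also that m_faceManager is never assigned in the script above, so it would be null at runtime.) A sketch of the loop body rewritten to compile, where the vertex-count check is an assumption rather than a documented success test:

    Code (CSharp):
    var faceId = faceChanges.added[i].trackableId;
    XRFaceMesh faceMesh = new XRFaceMesh();
    // GetFaceMesh returns void; it populates the native arrays inside faceMesh.
    m_faceManager.GetFaceMesh(faceId, Allocator.Temp, ref faceMesh);
    // Check whether any mesh data was actually produced before using it.
    if (faceMesh.vertices.IsCreated && faceMesh.vertices.Length > 0)
    {
        Vector3 leftEyePosition = faceMesh.vertices[362];
        // ... use the mesh data ...
    }
    // XRFaceMesh owns native containers and should be disposed when done.
    faceMesh.Dispose();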

    thanks in advance.
     
  2. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    1,062
    The Apple ARKit XR Plug-in is not supported on Windows devices, nor does Unity support any AR Foundation provider plug-in for Windows. If you are implementing your own face tracking solution for Windows, follow the "Implement a Provider" steps in our docs, using AR Foundation and your provider package (and notably not ARKit): https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/implement-a-provider.html
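
    Under AR Foundation 4.x, implementing a provider roughly means subclassing XRFaceSubsystem, nesting a Provider implementation, and registering a descriptor at subsystem-registration time. A minimal sketch under those assumptions; all names here are hypothetical placeholders, and the exact FaceSubsystemParams fields should be checked against the installed package version:

    Code (CSharp):
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARSubsystems;

    // Hypothetical Windows webcam face provider; all names are placeholders.
    public class WebcamFaceSubsystem : XRFaceSubsystem
    {
        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.SubsystemRegistration)]
        static void RegisterDescriptor()
        {
            XRFaceSubsystemDescriptor.Create(new FaceSubsystemParams
            {
                id = "Webcam-Face",
                providerType = typeof(WebcamFaceProvider),
                subsystemTypeOverride = typeof(WebcamFaceSubsystem),
                supportsFacePose = true,
                supportsFaceMeshVerticesAndIndices = true
            });
        }

        class WebcamFaceProvider : Provider
        {
            public override void Start() { /* open the webcam, start the CV pipeline */ }
            public override void Stop() { /* stop the webcam */ }
            public override void Destroy() { }

            // Report faces detected this frame as added/updated/removed trackables.
            public override TrackableChanges<XRFace> GetChanges(XRFace defaultFace, Allocator allocator)
            {
                return default;
            }
        }
    }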

    Otherwise, you may wish to seek a third-party face tracking solution for Windows. AR Foundation is simply a cross-platform abstraction built on top of ARKit, ARCore, and OpenXR, and Unity does not implement any features such as face tracking that could be made available on Windows. Implementations come from provider platforms themselves, not the AR Foundation or ARKit packages.
     
  3. PlanetCreate_

    PlanetCreate_

    Joined:
    Sep 5, 2021
    Posts:
    2
    Thank you, I will try to make my own face tracking. For now my question is answered.
     
    andyb-unity likes this.