Can someone suggest how to place an object (mesh) on the tongue in ARKit face tracking? The manager in the example reads the blendshape as a "location", but there is no tongue placement/tracking position (it isn't included in the ARFaceMesh manager either). Any ideas? Thanks.
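For what it's worth, ARKit only exposes the tongue as the `tongueOut` blendshape coefficient (0..1); it never provides a tongue transform. So the closest workaround in AR Foundation is to read that coefficient via the ARKit face subsystem and offset your mesh from the face anchor yourself. A rough sketch, assuming the AR Foundation ARKit package is installed; the `tongueObject` field and the offset numbers are placeholders you'd have to tune:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Attach to the face prefab, alongside the ARFace component.
[RequireComponent(typeof(ARFace))]
public class TongueTracker : MonoBehaviour
{
    // Mesh to place on the tongue; assign in the Inspector (hypothetical field).
    public Transform tongueObject;

    ARFace m_Face;
    ARKitFaceSubsystem m_Subsystem;

    void Awake()
    {
        m_Face = GetComponent<ARFace>();
        var faceManager = FindObjectOfType<ARFaceManager>();
        m_Subsystem = faceManager != null
            ? faceManager.subsystem as ARKitFaceSubsystem
            : null;
    }

    void Update()
    {
        if (m_Subsystem == null || tongueObject == null)
            return;

        using (var coefficients = m_Subsystem.GetBlendShapeCoefficients(
                   m_Face.trackableId, Allocator.Temp))
        {
            foreach (var c in coefficients)
            {
                if (c.blendShapeLocation != ARKitBlendShapeLocation.TongueOut)
                    continue;

                // c.coefficient runs 0..1. Since ARKit gives no tongue
                // transform, estimate a position by pushing forward from
                // the mouth region of the face anchor; both offsets here
                // are guesses to tune by eye.
                tongueObject.localPosition =
                    new Vector3(0f, -0.03f, 0.04f * c.coefficient);
            }
        }
    }
}
```

This only tells you how far the tongue is out, not where it points, so the placement will always be an approximation.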
You used to be able to get data on the eyes, but that's not available in AR Foundation (or hasn't been added yet). Take a look at the older ARKit 2 Unity plugin on Bitbucket.