Hi! I'm building an app where you can record a video of a person and apply visual effects to them, say, emit particles from their hand when they wave it, or have something spinning around their head. Because I want all of these effects to happen in 3D, I use 3D body tracking from ARFoundation / ARKit: https://github.com/Unity-Technologi...lob/master/Assets/Scripts/HumanBodyTracker.cs

The thing is, the skeleton joints don't match the position of the actual body. Even when the pose itself is accurate, the virtual hands on the image are not in the same place as the real ones; they're off by about a palm's width.

I think there is no information about the scale of individual joints (hands, legs, and so on), so the skeleton will never match any particular person perfectly, but I would love a confirmation or clarification on this.

The only scale I found is for the whole body: https://developer.apple.com/documen...or/3255162-estimatedscalefactor?language=objc which says "ARKit sets this property to a value between 0.0 and 1.0." and "The default body is 1.8 meters tall." I am 1.9 m tall; does this mean ARKit won't even try to fit my body?
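For reference, this is roughly how I read that whole-body scale estimate on the Unity side (a minimal sketch, assuming ARFoundation's `ARHumanBodyManager` / `ARHumanBody.estimatedHeightScaleFactor`; check the exact property name against your ARFoundation version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class BodyScaleLogger : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager m_BodyManager;

    void OnEnable()  => m_BodyManager.humanBodiesChanged += OnBodiesChanged;
    void OnDisable() => m_BodyManager.humanBodiesChanged -= OnBodiesChanged;

    void OnBodiesChanged(ARHumanBodiesChangedEventArgs args)
    {
        foreach (var body in args.updated)
        {
            // The scale factor is relative to ARKit's 1.8 m reference skeleton.
            // It rescales the whole rig uniformly; as far as I can tell there is
            // no per-limb (hand/leg) scale, which would explain the mismatch.
            Debug.Log($"Estimated body scale: {body.estimatedHeightScaleFactor}");
        }
    }
}
```

Even with this applied to the skeleton root, individual limb lengths stay proportional to the reference body, so hands can still land a palm away from the real ones.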