Can anyone confirm whether the new Human Segmentation samples in github https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes also include depth information via the depth buffers in ARKit3?
The HumanSegmentationImages scene surfaces the stencil and depth buffers which are required for people occlusion, yes.
Great, thanks. Can't wait to see some demos, but on the other hand maybe I'll get a new iPhone and make my own
Hey @tdmowrer, in my case I'm trying to use humanBodyPoseEstimationEnabled and humanSegmentationStencilMode at the same time. I need the pose estimation to know the position of the user, and the stencil because I have a shader I pass the texture to so that the user can occlude objects. But in Xcode I get a warning and humanBodyPoseEstimationEnabled is disabled. Do you know of any workaround?
This is a limitation of the current ARKit 3 beta and is documented here. It may just be a temporary restriction of the iOS 13 beta; if Apple allows both features simultaneously, we will update our plugin.
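Since the beta only allows one of the two features at a time, one workaround is to switch between them at runtime depending on what the app currently needs. A minimal Unity sketch follows; it is not executable outside a Unity project, and it assumes the property names mentioned above (humanBodyPoseEstimationEnabled, humanSegmentationStencilMode) plus a hypothetical HumanSegmentationStencilMode enum with Disabled/Fastest values, which may differ across AR Foundation beta versions:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch only: toggles between body pose estimation and the segmentation
// stencil, since the ARKit 3 / iOS 13 beta does not allow both at once.
// Property and enum names are assumptions based on this thread; verify
// them against your installed AR Foundation package version.
public class HumanFeatureToggle : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager humanBodyManager;

    // Call when you need the user's position (e.g. to place content).
    public void UsePoseEstimation()
    {
        humanBodyManager.humanSegmentationStencilMode = HumanSegmentationStencilMode.Disabled;
        humanBodyManager.humanBodyPoseEstimationEnabled = true;
    }

    // Call when you need the stencil texture for the occlusion shader.
    public void UseOcclusionStencil()
    {
        humanBodyManager.humanBodyPoseEstimationEnabled = false;
        humanBodyManager.humanSegmentationStencilMode = HumanSegmentationStencilMode.Fastest;
    }
}
```

This doesn't give you both at once, but if the user's position is only needed intermittently (say, when placing an object), toggling may be an acceptable stopgap until Apple lifts the restriction.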