Discussion [AR Foundation] ARKit device/camera tracking, SLAM, or only VIO feature?

Discussion in 'AR' started by mikeyrafier98, Jul 19, 2022.

  1. mikeyrafier98


    May 19, 2022
    Hi, first of all, I apologize for my lack of knowledge.
    I'm trying to test device tracking with ARKit, as provided by AR Foundation in Unity.
    When I record the device position every second, the results never seem to show any SLAM behavior.
    From what I understand, one SLAM feature is loop closure, which adjusts the map when you return to the same place and see the same scene.
    I believe ARKit uses a visual-inertial (camera + IMU) system to track the device, along with point clouds and plane detection, but I cannot see any SLAM behavior.
    I didn't only record the ARCamera/XRCamera/device position relative to the world origin/XROrigin; I also placed an empty GameObject every second (or every 0.5 seconds) as a trail.
    If the map were being adjusted, the trail should be adjusted as well, even though the device position can change at any time.

    Please help me clarify: does ARKit, as used through AR Foundation in Unity, perform SLAM, or does it only use VIO?
    Does ARKit provided directly by Apple (Swift and Obj-C) perform SLAM, or is it the same?
    If there is no SLAM, how can we fuse the AR view from ARKit with an existing open SLAM project for device localization and mapping?

    Thank you in advance.
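    For reference, a minimal sketch of the trail experiment described above, assuming AR Foundation 4.x with an ARAnchorManager on the XR Origin. The class name `TrailRecorder` and the public fields are illustrative, not from the original post. The idea: plain GameObjects keep their world coordinates forever, while `ARAnchor` components are repositioned by ARKit when it adjusts its internal map, so any divergence between the two trails would make map corrections visible:

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: drops a pair of trail markers at the camera pose
    // every 0.5 s. Requires an ARAnchorManager in the scene for anchors to
    // be tracked by the ARKit session.
    public class TrailRecorder : MonoBehaviour
    {
        public Camera arCamera;     // the ARCamera under the XROrigin; assign in the Inspector
        public float interval = 0.5f;

        void Start()
        {
            InvokeRepeating(nameof(DropMarker), interval, interval);
        }

        void DropMarker()
        {
            Vector3 pos = arCamera.transform.position;
            Quaternion rot = arCamera.transform.rotation;

            // Fixed marker: never moves, records the raw reported pose.
            var raw = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            raw.transform.SetPositionAndRotation(pos, rot);
            raw.transform.localScale = Vector3.one * 0.02f;

            // Anchored marker: ARKit may reposition this when the map is adjusted.
            var anchored = new GameObject("TrailAnchor");
            anchored.transform.SetPositionAndRotation(pos, rot);
            anchored.AddComponent<ARAnchor>();
        }
    }
    ```

    Comparing the two trails after walking a loop would show whether the underlying map was refined, since only anchored markers follow ARKit's corrections.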
  2. andyb-unity


    Unity Technologies

    Feb 10, 2022
    AR Foundation does not implement any tracking algorithms. It uses the ARKit framework to provide tracking on iOS devices via the Apple ARKit XR Plug-in package.

    This question is better directed to Apple: