
Question Provider plugin for AR Foundation with Unity XR SDK

Discussion in 'AR/VR (XR) Discussion' started by nfynt-zap, Oct 4, 2021.

So I have been experimenting with, and reading up on, the Unity XR SDK and the ARSubsystems API. My specific requirement is to create an interface connecting my native plugin (mobile AR) to a Unity AR Foundation project. Currently my plugin integrates with Unity through the normal native-plugin workflow via C# APIs. Since I am not working with any hardware yet (tracker, display, etc.), I don't think I need to add the Unity XR SDK to my native plugin; I should be able to get by with a C# layer that bridges the ARSubsystems API to the C# interfaces my native plugin already exposes.

So far I have created a basic XRLoader with an XRSessionSubsystem and an XRCameraSubsystem, and provided native bindings for the subsystems' lifecycle (Construct, Start, Stop, Destroy) and other necessary methods. This at least lets me create and destroy those subsystems at runtime. However, I am still not completely clear on the ARSubsystems API, specifically on integrating the native graphics pipeline, and on the synchronous update loop, which currently runs off MonoBehaviour. For instance, some subsystem containers like XRTextureDescriptor, XRCameraFrame, XRCpuImage, etc. neither provide public C# constructors nor are referenced in the XR SDK, yet they are needed when implementing a provider method such as "TryGetCameraFrame".
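For reference, my loader is roughly like the sketch below. It builds on XRLoaderHelper from the XR Plug-in Management package; the subsystem ids ("My-Session", "My-Camera") are placeholders for whatever ids the provider subsystems actually register under.

```csharp
using System.Collections.Generic;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Management;

public class MyXRLoader : XRLoaderHelper
{
    static readonly List<XRSessionSubsystemDescriptor> s_SessionDescriptors =
        new List<XRSessionSubsystemDescriptor>();
    static readonly List<XRCameraSubsystemDescriptor> s_CameraDescriptors =
        new List<XRCameraSubsystemDescriptor>();

    public override bool Initialize()
    {
        // Construct: look up the descriptors the subsystems registered
        // at startup and instantiate the matching subsystem instances.
        CreateSubsystem<XRSessionSubsystemDescriptor, XRSessionSubsystem>(
            s_SessionDescriptors, "My-Session");   // hypothetical id
        CreateSubsystem<XRCameraSubsystemDescriptor, XRCameraSubsystem>(
            s_CameraDescriptors, "My-Camera");     // hypothetical id
        return GetLoadedSubsystem<XRSessionSubsystem>() != null;
    }

    public override bool Start()
    {
        StartSubsystem<XRSessionSubsystem>();
        StartSubsystem<XRCameraSubsystem>();
        return true;
    }

    public override bool Stop()
    {
        StopSubsystem<XRCameraSubsystem>();
        StopSubsystem<XRSessionSubsystem>();
        return true;
    }

    public override bool Deinitialize()
    {
        DestroySubsystem<XRCameraSubsystem>();
        DestroySubsystem<XRSessionSubsystem>();
        return true;
    }
}
```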

From my reading, the ARSubsystems API seems to be event-based, so the provider API would be connected asynchronously through events. It would be helpful to get more information on the communication flow between the ARSubsystems API and a native provider, especially for the graphics pipeline, and clarification on how to implement those container types.
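For what it's worth, from skimming the ARCore/ARKit provider packages, the flow looks more pull-based than event-based: ARCameraManager polls the provider each Update, and the container structs are blittable, so they are laid out in native memory and reinterpreted on the C# side rather than constructed. A rough sketch of what I'm attempting for the texture-descriptor path (the native entry point name is made up, and this assumes "Allow unsafe code" is enabled):

```csharp
using System;
using System.Runtime.InteropServices;
using Unity.Collections;
using UnityEngine.XR.ARSubsystems;

public sealed class MyCameraProvider : XRCameraSubsystem.Provider
{
    // Hypothetical native call: fills 'count'/'elementSize' and returns a
    // pointer to an array of structs laid out like XRTextureDescriptor.
    [DllImport("my_native_plugin")]
    static extern IntPtr MyPlugin_AcquireTextureDescriptors(
        out int count, out int elementSize);

    public override unsafe NativeArray<XRTextureDescriptor> GetTextureDescriptors(
        XRTextureDescriptor defaultDescriptor, Allocator allocator)
    {
        IntPtr ptr = MyPlugin_AcquireTextureDescriptors(
            out int count, out int elementSize);

        // XRTextureDescriptor has no public constructor, so the provider
        // pattern (as in the ARCore package) seems to be mirroring the
        // struct layout in native code and copying it across as raw memory.
        return NativeCopyUtility.PtrToNativeArrayWithDefault(
            defaultDescriptor, (void*)ptr, elementSize, count, allocator);
    }
}
```

NativeCopyUtility lives in UnityEngine.XR.ARSubsystems; if I've misread how the first-party providers do this, corrections welcome.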
