Expose WorldTrackingProvider in CloudXR Swift framework to track objects in real world

Hi there,

I would like to persist the locations of virtual objects that are streamed via CloudXR in the real world when using Apple Vision Pro. To do that I need to use WorldTrackingProvider and WorldAnchors from RealityKit/ARKit.

CloudXRSession in the Swift framework has an embedded ARKitSession that initializes its own WorldTrackingProvider, but that provider is not exposed to the developer in any way, and my understanding is that RealityKit/ARKit doesn’t allow multiple WorldTrackingProviders running at the same time.

You already have a queryDeviceAnchor() method that exposes deviceAnchor events; one option would be to add a similar method, e.g. queryWorldAnchors(), that exposes worldAnchor events.
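To make the request concrete, here is a rough sketch of what the surface could look like. Everything here is hypothetical: queryWorldAnchors() does not exist in CloudXRKit today, and the return type is just an assumption modeled on ARKit's own anchorUpdates sequence:

```swift
import ARKit

// Hypothetical sketch only — NOT part of the current CloudXRKit API.
// Mirrors the shape of the existing queryDeviceAnchor() entry point.
extension CloudXRSession {
    /// Would forward WorldAnchor updates from the framework's internal
    /// WorldTrackingProvider instead of keeping them private, e.g. by
    /// relaying its anchorUpdates async sequence.
    public func queryWorldAnchors() -> AsyncStream<AnchorUpdate<WorldAnchor>> {
        fatalError("illustrative sketch; no such API exists today")
    }
}
```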

I’ve raised this in the CloudXR framework repo, but I’m not sure anyone monitors issues there – Expose WorldTrackingProvider to track objects in the real world · Issue #1 · NVIDIA/cloudxr-framework · GitHub

Thanks,
Kostya

We initially also tried to piggyback on the existing WorldTrackingProvider used by CloudXRKit, until we realized it’s totally fine to have multiple WorldTrackingProviders instantiated, so we just built our world-anchor persistence layer with a separate WorldTrackingProvider. Works like a charm. Or is there any other reason why you want “the one” from CloudXR?
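For anyone hitting the same wall, a minimal sketch of the separate-provider approach described above, using the standard visionOS ARKit APIs (the AnchorStore type and method names are my own; only ARKitSession, WorldTrackingProvider, and WorldAnchor come from ARKit):

```swift
import ARKit

// Minimal world-anchor persistence layer running its own
// WorldTrackingProvider, independent of the one inside CloudXRKit.
@MainActor
final class AnchorStore {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    // Runs alongside CloudXRKit's internal provider; in practice
    // visionOS tolerates multiple WorldTrackingProvider instances.
    func start() async throws {
        try await session.run([worldTracking])
    }

    // Persist a pose as a WorldAnchor; the system re-localizes
    // persisted anchors across app launches.
    func persist(pose: simd_float4x4) async throws -> WorldAnchor {
        let anchor = WorldAnchor(originFromAnchorTransform: pose)
        try await worldTracking.addAnchor(anchor)
        return anchor
    }

    // Observe anchor updates to reattach streamed content after relaunch.
    func observeUpdates() async {
        for await update in worldTracking.anchorUpdates {
            switch update.event {
            case .added, .updated:
                // Reposition content at update.anchor.originFromAnchorTransform
                break
            case .removed:
                // Tear down content tied to this anchor
                break
            }
        }
    }
}
```

The key point is that this session never touches CloudXRSession, so the streamed content only needs its transform updated from the anchor callbacks.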