Hi there,
I would like to persist the locations of virtual objects that are streamed via CloudXR in the real world when using Apple Vision Pro. To do that, I need to use WorldTrackingProvider and WorldAnchors from RealityKit/ARKit.
CloudXRSession in the Swift framework has an embedded ARKitSession that initializes its own WorldTrackingProvider, but that provider is not exposed to the developer in any way, and ARKit doesn't allow multiple WorldTrackingProviders to run at the same time.
You already have a queryDeviceAnchor() method that exposes deviceAnchor events; one option would be to add a similar method, e.g. queryWorldAnchor(), that exposes worldAnchor events.
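For context, this is the standard visionOS pattern I'd use if my app owned the WorldTrackingProvider — which is exactly what the embedded ARKitSession in CloudXRSession prevents. Only stock ARKit APIs are used here; the `queryWorldAnchor()` name mentioned above is a hypothetical addition to the CloudXR framework, not an existing API:

```swift
import ARKit

// Sketch: persisting object poses with WorldAnchors when the app
// owns the session. CloudXRSession currently runs its own
// ARKitSession + WorldTrackingProvider internally, so this conflicts.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func persistAndTrack() async throws {
    try await session.run([worldTracking])

    // Pin a streamed object's current transform to the real world.
    let anchor = WorldAnchor(originFromAnchorTransform: matrix_identity_float4x4)
    try await worldTracking.addAnchor(anchor)

    // Consume anchor updates — the kind of stream a hypothetical
    // queryWorldAnchor() on CloudXRSession could surface to developers.
    for await update in worldTracking.anchorUpdates {
        switch update.event {
        case .added, .updated:
            // Re-position the streamed content at
            // update.anchor.originFromAnchorTransform.
            break
        case .removed:
            // Drop the associated virtual object.
            break
        }
    }
}
```

If CloudXRSession simply forwarded `anchorUpdates` (and addAnchor/removeAnchor) from its internal provider, the same persistence workflow would work without a second, conflicting WorldTrackingProvider.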
I've raised this in the CloudXR framework repo, but I'm not sure anyone monitors issues there: Expose WorldTrackingProvider to track objects in the real world · Issue #1 · NVIDIA/cloudxr-framework · GitHub
Thanks,
Kostya