MR using CloudXR on Oculus Quest Pro

I have been wondering whether there is any way to create an AR/MR experience for the Oculus Quest Pro using Unity or Unreal Engine. I have tried out AR on Android with the Sample/Android/GoogleAR/ar-sample.apk and the TestTools/ar_test.exe, but I have not been able to create a similar app in Unity/Unreal that outputs alpha information for every frame. I would appreciate any guidance on how to approach these issues.

To sum it up:

  • Possibility of creating an AR/MR experience for Quest Pro that can be streamed through CloudXR
  • Possibility of creating an AR experience for ARCore using Unity/Unreal that can be streamed through CloudXR

I have had similar thoughts. I’m not sure whether CloudXR natively supports sending alpha data. If it doesn’t, one way around this is to write the alpha information directly into the image itself. This would require writing not just your own VR app but also your own CloudXR client.

Option 1 would be to dedicate one specific RGB value and treat every pixel with that value as “background”.
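
As a rough sketch of what the client-side decoding could look like (the reserved key color, pixel layout, and function name here are my own assumptions, not part of the CloudXR SDK):

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical reserved "background" color; any pixel that still has exactly
// this value after rendering is treated as fully transparent on the client.
constexpr uint8_t kKeyR = 0, kKeyG = 255, kKeyB = 0;

// Client-side pass over a decoded RGB frame: rebuild an alpha channel by
// comparing each pixel against the reserved key color.
void RebuildAlphaFromKeyColor(const uint8_t* rgb, uint8_t* rgba,
                              size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; ++i) {
        const uint8_t r = rgb[i * 3 + 0];
        const uint8_t g = rgb[i * 3 + 1];
        const uint8_t b = rgb[i * 3 + 2];
        const bool isBackground = (r == kKeyR && g == kKeyG && b == kKeyB);

        rgba[i * 4 + 0] = r;
        rgba[i * 4 + 1] = g;
        rgba[i * 4 + 2] = b;
        rgba[i * 4 + 3] = isBackground ? 0 : 255;   // binary alpha only
    }
}
```

Keep in mind that video compression will bleed the key color around edges, so in practice you would probably match against a tolerance rather than an exact value, and you only get hard (binary) alpha this way.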

Option 2 would be to shift all RGB values a bit and use the space that was freed to store the alpha information.
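
If “a bit” is taken literally, one possible reading (again just a sketch, nothing CloudXR-specific) is to drop the least significant bit of each channel on the server and pack a coarse 3-bit alpha into the freed bits. Note that lossy video encoding will easily destroy these low bits, so this really only works with a (near-)lossless stream:

```cpp
#include <cstdint>

// Server side: drop the LSB of each channel and pack a 3-bit alpha (8 levels)
// into the freed bits.
void PackAlphaIntoRGB(uint8_t& r, uint8_t& g, uint8_t& b, uint8_t alpha /*0-255*/)
{
    const uint8_t a3 = alpha >> 5;            // quantize alpha to 3 bits
    r = (r & 0xFE) | ((a3 >> 2) & 1);
    g = (g & 0xFE) | ((a3 >> 1) & 1);
    b = (b & 0xFE) | (a3 & 1);
}

// Client side: recover the 3-bit alpha and expand it back to 8 bits.
uint8_t UnpackAlphaFromRGB(uint8_t r, uint8_t g, uint8_t b)
{
    const uint8_t a3 = ((r & 1) << 2) | ((g & 1) << 1) | (b & 1);
    return static_cast<uint8_t>(a3 * 255 / 7);
}
```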

Another thing to consider here is depth. With the alpha approach you can, for example, overlay your geometry on top of the real world, but you cannot properly mix the passthrough video and the rendered geometry (real objects occluding virtual ones and vice versa). For that you need depth information, which raises two separate problems: sending the depth info over the stream, and then mixing the passthrough video with the rendered geometry.
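
To make the mixing part concrete, the per-pixel logic is just a depth test. This sketch assumes you already have a depth value for both the passthrough image and the rendered frame at each pixel (which is exactly the hard part), and in practice it would live in a shader rather than in C++:

```cpp
#include <cstdint>

struct Pixel { uint8_t r, g, b; };

// Per-pixel depth test between the passthrough camera image and the rendered
// geometry: whichever surface is closer to the viewer wins. With alpha alone
// you can only layer one image fully on top of the other; with depth you get
// real occlusion (e.g. a real table hiding a virtual object behind it).
Pixel CompositeWithDepth(Pixel passthrough, float passthroughDepthMeters,
                         Pixel rendered,    float renderedDepthMeters)
{
    return (renderedDepthMeters < passthroughDepthMeters) ? rendered : passthrough;
}
```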

The Option 2 approach I presented above could also be used to send the depth texture.

Unfortunately, neither the Quest Pro nor the Vive XR Elite offers access to the depth texture of the passthrough video in Unity.

There are ways around this, for sure. For example, you could use the detected scene geometry to do the mixing, or perhaps use an AI approach that approximates a depth texture for the passthrough video.