Hi,
I know that multiple parties are in contact with Nvidia about this topic, but I thought it would be beneficial to also have the discussion here publicly :)
In general, the feature our users request most is pass-through on the Quest 3. My naive understanding is that this shouldn't actually be that hard: with CloudXR 4.0 I could simply request two RGBA streams and then activate pass-through on the client. Now I have heard that streaming alpha is not yet possible, but isn't that already done when streaming to an iPad with ARKit? My understanding was that for ARKit the server still sends two streams and the client just uses one of them. So I wonder what would need to be done on the server side, or whether that part is already ready to go.
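Just to illustrate what streaming alpha would buy us on the client side (purely illustrative, not CloudXR API; it simply assumes the decoded frame carries a per-pixel alpha channel), the client or compositor would do a standard "over" blend of the rendered frame on top of the camera image, which I assume is roughly what the iPad/ARKit path already does in some form:

```c
/* Standard "over" blend of the decoded (non-premultiplied) RGBA frame on top
 * of the camera image -- purely illustrative, not CloudXR code. */
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } RGBA8;

static RGBA8 blend_over(RGBA8 frame, RGBA8 camera)
{
    float a = frame.a / 255.0f;   /* alpha that would come from the stream */
    RGBA8 out;
    out.r = (uint8_t)(frame.r * a + camera.r * (1.0f - a) + 0.5f);
    out.g = (uint8_t)(frame.g * a + camera.g * (1.0f - a) + 0.5f);
    out.b = (uint8_t)(frame.b * a + camera.b * (1.0f - a) + 0.5f);
    out.a = 255;                  /* composited result is opaque on screen */
    return out;
}
```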
The other problem I see is that the client uses the deprecated OpenVR SDK, while enabling pass-through would require integrating the OpenXR SDK. I am not sure what problems would come up if both are used side by side, or how complex it would be to completely replace OpenVR with OpenXR. Other solutions I have heard about involve some kind of chroma keying on the client side, but even then pass-through would still need to be activated on the client, I think (see the sketch below).
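To make concrete what I mean by "activating pass-through on the client": as far as I understand it, on the Quest this boils down to the OpenXR XR_FB_passthrough extension. Below is a rough sketch of that OpenXR side only (this is just my assumption of how an OpenXR-based client would do it, not actual CloudXR client code); the projection layer would still need usable alpha, either streamed from the server or reconstructed via chroma keying. Error handling is omitted and "XR_FB_passthrough" must be enabled at xrCreateInstance time.

```c
#include <openxr/openxr.h>

static PFN_xrCreatePassthroughFB      pfnCreatePassthrough      = NULL;
static PFN_xrPassthroughStartFB       pfnPassthroughStart       = NULL;
static PFN_xrCreatePassthroughLayerFB pfnCreatePassthroughLayer = NULL;

static XrPassthroughFB      passthrough      = XR_NULL_HANDLE;
static XrPassthroughLayerFB passthroughLayer = XR_NULL_HANDLE;

void EnablePassthrough(XrInstance instance, XrSession session)
{
    /* Extension entry points are not exported by the loader, so fetch them. */
    xrGetInstanceProcAddr(instance, "xrCreatePassthroughFB",
                          (PFN_xrVoidFunction*)&pfnCreatePassthrough);
    xrGetInstanceProcAddr(instance, "xrPassthroughStartFB",
                          (PFN_xrVoidFunction*)&pfnPassthroughStart);
    xrGetInstanceProcAddr(instance, "xrCreatePassthroughLayerFB",
                          (PFN_xrVoidFunction*)&pfnCreatePassthroughLayer);

    /* Create and start the passthrough feature. */
    XrPassthroughCreateInfoFB createInfo = { XR_TYPE_PASSTHROUGH_CREATE_INFO_FB };
    pfnCreatePassthrough(session, &createInfo, &passthrough);
    pfnPassthroughStart(passthrough);

    /* Create a full-reconstruction passthrough layer that starts running. */
    XrPassthroughLayerCreateInfoFB layerInfo = { XR_TYPE_PASSTHROUGH_LAYER_CREATE_INFO_FB };
    layerInfo.passthrough = passthrough;
    layerInfo.flags       = XR_PASSTHROUGH_IS_RUNNING_AT_CREATION_BIT_FB;
    layerInfo.purpose     = XR_PASSTHROUGH_LAYER_PURPOSE_RECONSTRUCTION_FB;
    pfnCreatePassthroughLayer(session, &layerInfo, &passthroughLayer);
}

/* Per frame: submit the camera pass-through underneath the projection layer
 * and tell the compositor to respect the projection layer's alpha channel. */
void EndFrameWithPassthrough(XrSession session, XrTime displayTime,
                             XrCompositionLayerProjection* projLayer)
{
    XrCompositionLayerPassthroughFB ptLayer = { XR_TYPE_COMPOSITION_LAYER_PASSTHROUGH_FB };
    ptLayer.layerHandle = passthroughLayer;

    projLayer->layerFlags |= XR_COMPOSITION_LAYER_BLEND_TEXTURE_SOURCE_ALPHA_BIT |
                             XR_COMPOSITION_LAYER_UNPREMULTIPLIED_ALPHA_BIT;

    const XrCompositionLayerBaseHeader* layers[2];
    layers[0] = (const XrCompositionLayerBaseHeader*)&ptLayer;
    layers[1] = (const XrCompositionLayerBaseHeader*)projLayer;

    XrFrameEndInfo endInfo = { XR_TYPE_FRAME_END_INFO };
    endInfo.displayTime          = displayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
    endInfo.layerCount           = 2;
    endInfo.layers               = layers;
    xrEndFrame(session, &endInfo);
}
```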
Also, it would be nice to get a clearer feeling for what is coming on the CloudXR roadmap, so that we know whether a pass-through sample is basically around the corner or whether we should invest some time in it ourselves.