CloudXR.js + UE5.5 - cannot make it work

Hello,

I’m following the Unreal Engine Integration Guide, but cannot make it work, so I have multiple questions.

First, I just want to say I’ve successfully built the LOVR sample, and was able to connect to it from both my PC Chrome browser and my Quest 3 browser by running the simple server.

That being said, I’ve created a UE 5.5 VR Template and created a plugin as indicated in the documentation.

  • I managed to get the CloudXR Runtime module working, or so I think, as the process completes with no errors.
    By the way, the documentation uses this as an example of setting Service Properties:
    FString PropertyName = TEXT("server-port");
    FString PropertyValue = TEXT("7000");

But the log reports it’s not working:
Error: Failed to set property ‘server-port’; result = -9, meaning NV_CXR_ERROR_PROPERTY_VALUE_INVALID.

  • Regarding the CloudXR Opaque Data Channel, I cannot get Unreal to automatically call the CreateInputDevice function. If I call it manually, it seems to work.
    However, some functions are never called, and I’m not sure how or when they’re supposed to be called (OnCreateSession, for example).
    The part of the documentation about that module is not as clear as the part about the Runtime Module, and even that one was not very clear for someone who has never written a module to handle this kind of thing.

Anyway, when I try to connect with my Chrome browser or my Quest browser, I cannot connect and get the following error:
⚠ Stream stopped: Error: Stream start failed with error code: 0xC0F22202
Additional log from the Browser:

Running on HTTP protocol - using insecure WebSocket (WS)
utils.ts:155 Using default server IP from window location: …
utils.ts:160 Using user-provided port: 49100
main.ts:163 Connected to Server localhost:49100…
iwer.min.js:1 Requested frame rate is the same as the current nominal frame rate, no update made
cloudxr.js:2 POST https://events.gfe.nvidia.com/v1.1/events/json net::ERR_NAME_NOT_RESOLVED
_0x422910 @ cloudxr.js:2
cloudxr.js:2 WebSocket connection to 'ws://10.2.0.2:49100/sign_in?peer_id=peer-6737862040&version=2' failed:
TT @ cloudxr.js:2
main.ts:359 Stream stopped with error: Error: Stream start failed with error code: 0xC0F22202
at _0x2a9b6b.CloudXrNskStreamClientDelegate.onStreamStartFailed [as onStreamStartFailedCallback] (cloudxr.js:2:359842)
at _0x2a9b6b.CloudXrNskStreamClientDelegate.onStreamStartFailed (cloudxr.js:1:2352)
at _0x38e4eb.GE (cloudxr.js:2:285796)
at _0x548284.dE (cloudxr.js:2:188128)
at _0x35d01f (cloudxr.js:2:175332)
at _0x548284.hT (cloudxr.js:2:175598)
at _0x54ede6.onclose (cloudxr.js:2:142738)
onStreamStopped @ main.ts:359
onStreamStartFailed @ cloudxr.js:2
onStreamStartFailed @ cloudxr.js:1
GE @ cloudxr.js:2
dE @ cloudxr.js:2
_0x35d01f @ cloudxr.js:2
hT @ cloudxr.js:2
_0x54ede6.onclose @ cloudxr.js:2
main.ts:163 Stream stopped: Error: Stream start failed with error code: 0xC0F22202
showStatus @ main.ts:163
onStreamStopped @ main.ts:360
onStreamStartFailed @ cloudxr.js:2
onStreamStartFailed @ cloudxr.js:1
GE @ cloudxr.js:2
dE @ cloudxr.js:2
_0x35d01f @ cloudxr.js:2
hT @ cloudxr.js:2
_0x54ede6.onclose @ cloudxr.js:2

So, here is my list of questions:

  • Can the fact that I cannot connect be related to the CloudXR Opaque Data Channel module not being fully functional?
  • If not, what could cause this?
  • How can I fix the CloudXR Opaque Data Channel module so the CreateInputDevice function is correctly called by the Engine?
  • Would it be possible to have a working Unreal Engine sample?
  • Could you check and fix the documentation?

Thank you !

  • Can the fact that I cannot connect be related to the CloudXR Opaque Data Channel module not being fully functional?
    • No, the failure to connect is related to the server-port issue. That error code essentially means the client was not able to reach the signaling port (server port). If you use the default port, it should connect. Also, the property should be set as an int64 type:

         FString PropertyName = TEXT("server-port");
         FTCHARToUTF8 PropertyNameUTF8(*PropertyName);

         nv_cxr_result_t result = pfn_ServiceSetInt64Property(
             ServiceHandle,
             PropertyNameUTF8.Get(),
             PropertyNameUTF8.Length(),
             static_cast<int64_t>(7000));

  • How can I fix the CloudXR Opaque Data Channel module so the CreateInputDevice function is correctly called by the Engine?
    • I’ll forward this question to the team.
  • Would it be possible to have a working Unreal Engine sample?
    • I’ll forward this question to the team.
  • Could you check and fix the documentation?
    • Yes, thanks for spotting the issue and for the feedback.

Ok, thank you.
Can’t wait for an update from the team !

Hi Ice18-Studios,

Thanks for providing feedback.

The section on Configuring Service Properties was meant as a general example of setting a property rather than an actual requirement. When running CloudXR.js, you generally won’t need to set the server-port unless you have specific network limitations. I will make a note to clarify that point.

However, for CloudXR.js, you will need to set the “device-profile” property to either “quest3” or “auto-webrtc” when using a Quest 3. Since you are starting out, I recommend setting “device-profile” to “quest3” for simplicity. When that property is set, you can start the Service and then initialize the HMD. Afterwards, go to your Quest 3 browser and connect to your CloudXR.js web client. These are the same steps as for the LOVR sample, but now with Unreal as the application.
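For reference, setting that property from the plugin might look like the sketch below. The actual CloudXR string-property setter is not shown in this thread, so MockSetStringProperty is a hypothetical stand-in that mirrors the pfn_ServiceSetInt64Property pattern from the server-port example; substitute the real function pointer from the CloudXR headers.

```cpp
#include <cassert>
#include <cstring>
#include <string>

// Hypothetical stand-ins for the CloudXR types used in this thread's
// int64 example; the real names come from the CloudXR headers.
using nv_cxr_result_t = int;
using ServiceHandleT  = void*;
constexpr nv_cxr_result_t NV_CXR_SUCCESS = 0;

// Mock of a string-property setter analogous to pfn_ServiceSetInt64Property.
// The real CloudXR function pointer (name and signature) may differ.
std::string g_LastName;
std::string g_LastValue;
nv_cxr_result_t MockSetStringProperty(ServiceHandleT /*Svc*/,
                                      const char* Name, int NameLen,
                                      const char* Value, int ValueLen)
{
    g_LastName.assign(Name, NameLen);
    g_LastValue.assign(Value, ValueLen);
    return NV_CXR_SUCCESS;
}

// Sets "device-profile" (e.g. "quest3" or "auto-webrtc") on the service.
// In Unreal code this would use FString + FTCHARToUTF8 exactly as the
// server-port example does; plain char strings keep the sketch standalone.
nv_cxr_result_t SetDeviceProfile(ServiceHandleT Svc, const char* Profile)
{
    const char* Name = "device-profile";
    return MockSetStringProperty(Svc,
                                 Name, (int)std::strlen(Name),
                                 Profile, (int)std::strlen(Profile));
}
```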

If you use “auto-webrtc”, the code gets a bit trickier, as the runtime waits for a connection before xrGetSystem becomes valid. In Unreal, that means repeatedly calling

GEngine->StereoRenderingDevice->EnableStereo(true);

until it succeeds before continuing. If you are only supporting the Quest 3 for the time being, using “quest3” will be easier.
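The poll-until-success pattern could be sketched as follows. MockStereoDevice is a stand-in for GEngine->StereoRenderingDevice (in a real plugin the loop body would live in a ticker delegate that fires each frame, not a blocking loop); the connection timing is simulated, since the real readiness depends on when the web client connects.

```cpp
#include <cassert>

// Stand-in for GEngine->StereoRenderingDevice: EnableStereo(true) keeps
// failing until the (simulated) client connection makes xrGetSystem valid.
struct MockStereoDevice
{
    int TicksUntilReady = 3; // pretend the client connects after 3 polls

    bool EnableStereo(bool bEnable)
    {
        if (!bEnable) return false;
        if (TicksUntilReady > 0) { --TicksUntilReady; return false; }
        return true; // connection established; stereo is now on
    }
};

// Polls EnableStereo(true) once per "tick" until it succeeds.
// Returns the tick on which it succeeded, or -1 if MaxTicks elapsed.
int PollUntilStereoEnabled(MockStereoDevice& Device, int MaxTicks)
{
    for (int Tick = 1; Tick <= MaxTicks; ++Tick)
    {
        if (Device.EnableStereo(true))
        {
            return Tick; // stop polling and continue startup
        }
        // In Unreal: return true from the ticker delegate to be
        // called again next frame instead of looping here.
    }
    return -1; // gave up; keep polling or surface an error
}
```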

Regarding the CloudXR Opaque Data Channel: it is an optional CloudXR module for application-specific message passing and is not necessary for the connection. I recommend working through the connection issues before implementing the Opaque Data Channel.

You can take a look at Unreal’s FOpenXRViveTrackerModule class for an example of using CreateInputDevice. This is just one way to implement the Opaque Data Channel; you are free to implement the API as you see fit for your application.
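For context on why CreateInputDevice might never be called automatically: the engine only creates input devices for modules that have registered themselves as input-device modular features, which IInputDeviceModule's base StartupModule normally does. The simplified, self-contained mock below illustrates that discovery flow; none of these types are the real Unreal classes, and the real mechanism goes through IModularFeatures.

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Minimal stand-ins for Unreal's input-device interfaces.
struct IInputDevice { virtual ~IInputDevice() = default; };

struct IInputDeviceModule
{
    virtual ~IInputDeviceModule() = default;
    virtual std::unique_ptr<IInputDevice> CreateInputDevice() = 0;
};

// Stand-in for IModularFeatures: a registry the "engine" scans at startup.
std::vector<IInputDeviceModule*>& Registry()
{
    static std::vector<IInputDeviceModule*> Features;
    return Features;
}

struct FMyDataChannelModule : IInputDeviceModule
{
    bool bDeviceCreated = false;

    // Equivalent of StartupModule(): if this registration step is skipped
    // (e.g. an override that never calls the base StartupModule), the
    // engine never discovers the module and CreateInputDevice is never
    // invoked automatically.
    void Startup() { Registry().push_back(this); }

    std::unique_ptr<IInputDevice> CreateInputDevice() override
    {
        bDeviceCreated = true;
        return std::make_unique<IInputDevice>();
    }
};

// Stand-in for the engine pass that creates input devices at startup.
void EngineCreateInputDevices()
{
    for (IInputDeviceModule* Module : Registry())
    {
        Module->CreateInputDevice();
    }
}
```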

Again, thanks for your feedback and let us know if you run across other issues.

Hi Hougantc,

Thanks for the tip about the “device-profile”.
I set it to “quest3” in my VRTemplate project, and I can now connect successfully:

  • In the Chrome browser on PC, the “Connect” button turns grey with “Connect (Streaming)”, and below it there is a green message: “/!\ Streaming started!”.
  • In the Quest3 Web browser, it goes into full VR mode.

However, the screen in the headset is black and no input is transferred to the PC app.
On the PC, the camera stays on the floor. Moving my head does nothing on the PC side.
In the Chrome browser on PC, adjusting the Controller or the “Meta Quest 3” location/FoV has no impact on the application.

Is this expected?
Or should it be working, and something is amiss?

[Edit]
I’ve tried creating the XRSystem either as early as possible, when my module is created and the CloudXR library is loaded, or after BeginPlay in my VRPawn, but that led to the same results.

I believe you said you were able to connect with the LOVR sample. Did all the functionality work in LOVR, i.e. could you see the text, the controller models, and the text indicating button presses?

If that is the case, the issue may be on the Unreal side. Are you running AR in Unreal? If so, it may be how Unreal handles alpha.

  • Go to Project Settings in Unreal.
  • Search for Alpha Output and enable it. This setting may require saving and restarting Unreal.
  • Without this option, areas expected to show the real world environment will display as black.

Secondly, Unreal 5.5 outputs alpha inverted from what is expected. To rectify this, create a custom material that inverts alpha (1-x masked on Alpha into Opacity), set it as a Post Process Material, and assign it as the material on the scene’s PostProcessVolume.

With LOVR:

  • Everything is working fine with the PC Chrome Browser (head “tracking”, controller tracking, and buttons)
  • With the Web Browser on the Quest 3, the display is fine and head tracking is fine, but the controllers are not working. I attribute that to the fact that I’m using Meta Quest Pro controllers instead of Meta Quest 3 controllers. Maybe this is not related, but I didn’t switch back to Quest 3 controllers to check.

With UE5.5:

  • With the PC Chrome browser, nothing is working. Adjusting the head location, the controller locations, or the buttons does nothing.
  • On the Quest 3 Web Browser, nothing is working either, and the display is black.

I’m not running AR in Unreal. This is the default VR Template from UE 5.5.

Even if the black display comes from an Alpha Output issue, it is unrelated to the fact that head tracking is not working (neither the real tracking from the headset nor the simulated one from the PC Chrome browser), nor are the controllers (PC Chrome browser only, if we take for granted that my Quest Pro controllers are not supported).

So there is something else not working, but I have no logged error to point me in the right direction.

Once again, if Nvidia could share a working Unreal Engine project, that would help a lot in understanding how the technology is supposed to be used and in pinpointing the key differences between my flawed plugin and a working one.