- is there runtime access to configure the CloudXR stream encoding?
- is FEC (forward error correction) supported?
- add encryption to the stream
1: I would like the ability to dynamically change the H.264/MPEG-4/etc. encoding parameters (I-frame rate, encoded frame size, encoding quality, UDP packet size, etc.) to support lower-bandwidth endpoints and dynamically changing network conditions.
Current CloudXR (3.1.1) requires very high bandwidth (60+ Mbps) from the server side, with severely degraded image quality below that. I know other streaming technologies deliver higher-quality frames under 20 Mbps. Even Amazon’s “NICE DCV” remote desktop shows better frames at 30-60 fps (single frame) than CloudXR currently does while running simultaneously on the same Internet connection.
Also, why does CloudXR use up to 70%+ of the GPU for 72 fps frame encoding? Having benchmarked Intel’s QuickSync encoding HD (maybe not FHD) frames at several hundred frames per second, ONE 72 fps stream shouldn’t be so much effort, even on an RTX 3080 (laptop). It would be nice if CloudXR could use extra system resources such as QuickSync to assist in encoding more streams at a time.
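For context on why the bitrate floor matters so much, a back-of-envelope per-frame budget can be sketched. This is simple arithmetic with a hypothetical helper name, not anything from the CloudXR SDK; real encoders also vary frame size (I-frames vs. P-frames), so these are averages only:

```python
def kib_per_frame(mbps: float, fps: int) -> float:
    """Average encoded frame size (KiB) a given bitrate allows at a given
    frame rate. Hypothetical budgeting helper; actual encoders distribute
    bits unevenly across I- and P-frames."""
    bits = mbps * 1_000_000 / fps  # bits available per frame
    return bits / 8 / 1024         # bits -> bytes -> KiB

# At 60 Mbps and 72 fps, each frame gets roughly 102 KiB on average;
# at 20 Mbps, the per-frame budget drops to roughly 34 KiB.
```

This is why quality below 20 Mbps depends so heavily on how well the encoder is tuned: there is simply a third of the bit budget per frame.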
2: Does CloudXR implement/support FEC? It would help keep streams more resistant to disruption by being able to rebuild frames from extra parity bits.
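To illustrate the idea (this is not CloudXR’s actual implementation, which is not public): a minimal single-parity FEC sketch, where one XOR parity packet per group lets the receiver rebuild any one lost packet in that group. Function names are hypothetical, and packets are assumed equal-length (real schemes pad):

```python
from functools import reduce


def make_parity(packets: list) -> bytes:
    """XOR all packets together into a single parity packet.
    Assumes equal-length packets."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)


def recover(received: list, parity: bytes) -> list:
    """Rebuild at most one missing packet (marked None) by XORing the
    parity packet with the surviving packets."""
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) > 1:
        raise ValueError("single-parity FEC can repair only one loss per group")
    if missing:
        survivors = [p for p in received if p is not None]
        received[missing[0]] = make_parity(survivors + [parity])
    return received
```

Production schemes (e.g. Reed-Solomon over packet groups) tolerate multiple losses, but the principle is the same: spend a little extra bandwidth on parity to avoid waiting for retransmits.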
3: It would be nice to add encryption to the CloudXR stream. If QUIC/HTTP3 is the quickest path there, moving in that direction would be welcome.
Hi @NuShrike1 -
If encryption is required, you can try running within a VPN optimized for your location.
Regarding FEC, this is used as part of our QoS.
There is no access to the runtime; we are looking at bringing more tools soon to help set network conditions.
Thanks for reaching out!
Thank you for the reply. A per-client VPN is not a reasonable solution, as I understand it, given the complex setup issues.
I see that your protocol does suggest video/audio stream encryption could be supported: “Video stream encryption disabled for stream:0”
Currently, my opinion of the default streaming quality without tuning is that it’s low to poor most of the time, especially if the “Server” is connected via WiFi on a local WiFi 6 (AX) network.
Compared to Oculus AirLink, the difference in quality is even more stark: CloudXR needs more GPU encoding resources and still ends up not looking as good. It really could use some work.
Looking forward to getting more access to the internals in order to tune it more optimally.
Definitely try the 3.2 release, as it should have improved on some aspects.
Also, I recommend running with -f 50 (or 60 or 70), as foveated rendering allows much better central-focus quality at the same or lower bandwidth.
I don’t know enough about the capture and encode pieces, but I would be surprised if our GPU use were significantly different from other solutions, except when certain features like foveation are enabled, which add an extra GPU pass.
On the encryption topic, yes we have it in our backlog to enable encrypted video (and other) streams. I don’t have an ETA unfortunately.
And finally (in reverse order), no, the QoS tuning is all black box under the hood. Yes, FEC is used; you’ll actually see it noted in the log when it adjusts. We are looking at what specific network options we can expose, likely starting with high-level network profiles, like WiFi 5 vs. WiFi 6 vs. 5G (or something like that).