Unreal-SteamVR Alpha channel

Hi all,
I know that Android ARCore needs frames with an alpha channel, but in the NVIDIA CloudXR Android APK, used on many types of smartphones, it seems that every object from the SteamVR “default Home”, every VR scene exported from Unreal, and even the NVIDIA cubes from ar_test.exe included in the TestTools, gets blended with the phone camera, even though the project is set to render the framebuffer in RGBA.
Scenes exported from Unity seem a bit less transparent than Unreal ones, but their objects are somewhat transparent too.
Has anyone else encountered the same problem?

Best regards, Alex


We found that the issue (in this case with Unity projects) seems to depend on the Android/ARCore version.

Same scene, same Windows 10 Pro, same CloudXR server and clients, and same RTX 2080:
On a Huawei P3 and some Samsung phones (A50, Note 20, Galaxy 20) with Android 10 and 11, scene objects and SteamVR Home show a sort of transparency where objects should be opaque.
On a Redmi Note 7 and an LG G6 with Android 9, scene objects and SteamVR Home are perfectly opaque.

It seems the alpha channel is read mixed in with the RGB info, because lit areas are less transparent than shadows.
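Just to illustrate why that symptom points at alpha getting mixed with the RGB data: below is a minimal, hypothetical C++ sketch (not CloudXR code) of classic “over” compositing onto the camera feed. If the alpha the client ends up using tracks luminance (bright pixels get high alpha, dark pixels get low alpha), bright areas composite nearly opaque while shadows stay see-through, which is exactly what we observed. The pixel values are made up for the example.

```cpp
#include <cstdio>

// Hypothetical sketch: straight-alpha "over" compositing of a streamed pixel
// onto the phone camera background. If the received alpha is contaminated by
// luminance, lit surfaces look opaque and shadows look transparent.
struct Pixel { float r, g, b, a; };

Pixel over(const Pixel& src, const Pixel& cam) {
    // out = src * a + cam * (1 - a), straight (non-premultiplied) alpha
    return { src.r * src.a + cam.r * (1.0f - src.a),
             src.g * src.a + cam.g * (1.0f - src.a),
             src.b * src.a + cam.b * (1.0f - src.a),
             1.0f };
}

int main() {
    Pixel camera = { 0.2f, 0.6f, 0.2f, 1.0f };   // phone camera feed
    Pixel bright = { 0.9f, 0.9f, 0.9f, 0.95f };  // lit surface, alpha ~ luminance
    Pixel shadow = { 0.1f, 0.1f, 0.1f, 0.15f };  // shadowed surface, alpha ~ luminance

    Pixel b = over(bright, camera);
    Pixel s = over(shadow, camera);
    std::printf("bright -> %.2f %.2f %.2f (nearly opaque)\n", b.r, b.g, b.b);
    std::printf("shadow -> %.2f %.2f %.2f (mostly camera)\n", s.r, s.g, s.b);
    return 0;
}
```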


Update @GJones-NVIDIA-XR-Team:
It seems that old smartphones without ARCore “Supports Depth API” are OK.
All recent smartphones we tested that support the Depth API showed some mixed alpha, even with just Win10 Pro -> Steam -> CloudXR -> smartphone (before loading any scenes).


Same issue here.
Requesting a fix as well.


We’ve had this issue too (testing on a Galaxy S21+). For now we’ve worked around the problem by applying a 1.1 scale factor to the alpha values in the pixel shader (in the blitter), roughly as in the sketch below. But an official fix would be much nicer!
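For anyone wanting to try the same workaround: here is a minimal CPU-side sketch of the idea, written in C++ rather than the actual pixel shader, assuming an RGBA8 buffer. The 1.1 factor and the buffer layout are assumptions taken from the description above, not an official fix.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>

// Sketch of the described workaround: boost the decoded alpha by a small
// factor and clamp, so slightly-too-low alpha values become fully opaque again.
constexpr float kAlphaScale = 1.1f;  // assumption, per the post above

void boostAlpha(uint8_t* rgba, size_t pixelCount) {
    for (size_t i = 0; i < pixelCount; ++i) {
        float a = rgba[i * 4 + 3] / 255.0f;          // read alpha of pixel i
        a = std::min(a * kAlphaScale, 1.0f);         // scale and clamp
        rgba[i * 4 + 3] = static_cast<uint8_t>(a * 255.0f + 0.5f);
    }
}
```

In practice you would apply the same multiply-and-clamp in the blit shader itself, so it costs essentially nothing per frame.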

We believe we’ll have this improved for most modern devices in the next release.

Interesting that you think you have nailed it down to devices with depth support, as I’m unclear what difference that would make here.

I believe the upcoming improvements were tested on a Galaxy S10, which has depth support, so hopefully we’re good for now.

It’s not 100% clear, but we have tested 7 or 8 different smartphones (Samsung, LG, Xiaomi, Oppo).
It seems that the oldest Android installations (9) do not have this problem,
and the latest (10+) seem to have it only on phones with depth support.
Of course, these could simply be different ARCore implementations, but that seemed to be the common denominator on the updated Android versions.

The transparency issue finally seems to be fixed on our phone with the newest CloudXR version. However, from the get-go the main idea for us was to leverage RTX ray tracing in our solution, and in Unreal Engine things are fine (transparency issue resolved) until we turn on RTX, at which point no image is rendered when we stream through CloudXR. Are we doing something wrong? Is anyone else able to use RTX from Unreal Engine? We use version 4.27.

I don’t know offhand if we’ve played around with an RTX server sample app.

Quick guess would be that RTX is generating no alpha map, or it’s generating it into a buffer other than the one being sampled for alpha for the stream; either of those might result in a black stream visual.

If you use the stream capture command-line options (-csb and -ccb for server and client respectively, both set on the CLIENT), you can then replay the streams and see what they look like. Similarly, you can use the -d option on either client or server to capture frames every 300 frames or so, and then go back and review what is in the left vs right eye images.

That should be a good start to debugging whether the right data is even being populated. If alpha is blank, that will tell us the capture side isn’t being fed the alpha channel correctly.
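If it helps anyone doing this check, here is a small, hypothetical C++ utility along those lines: it loads one captured frame into an RGBA8 buffer and reports whether the alpha channel is constant (which would suggest the capture side isn’t being fed alpha at all). The use of stb_image and the assumption that the dumped frame is a loadable image file are mine, not part of the CloudXR tooling.

```cpp
#include <cstdio>
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"  // single-header image loader, assumed available

// Load a captured frame and report the min/max alpha values.
// A constant alpha plane (e.g. all 255) suggests alpha isn't being populated.
int main(int argc, char** argv) {
    if (argc < 2) { std::printf("usage: %s frame.png\n", argv[0]); return 1; }
    int w = 0, h = 0, n = 0;
    unsigned char* px = stbi_load(argv[1], &w, &h, &n, 4);  // force RGBA
    if (!px) { std::printf("failed to load %s\n", argv[1]); return 1; }

    unsigned char lo = 255, hi = 0;
    for (int i = 0; i < w * h; ++i) {
        unsigned char a = px[i * 4 + 3];
        if (a < lo) lo = a;
        if (a > hi) hi = a;
    }
    std::printf("alpha range: %u..%u %s\n", (unsigned)lo, (unsigned)hi,
                (lo == hi) ? "(constant -> alpha likely not populated)" : "");
    stbi_image_free(px);
    return 0;
}
```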


Thanks for the debug tip!