Eye tracking -> foveated rendering

As far as I can estimate, it seems impossible to use eye tracking for foveated rendering in CloudXR:

  • best-case latency for client → server render → client is ~100 ms
  • eye movement (a saccade) can reach up to 900°/s
    → up to 90° per render cycle, which is nearly the whole headset FOV (quick math below)
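
For reference, a back-of-the-envelope sketch of those numbers in Python. The 900°/s peak saccade velocity and ~100 ms round trip are the figures from the list above; the 5° tolerance and the function names are purely illustrative assumptions, not CloudXR measurements:

```python
def gaze_drift_deg(latency_ms: float, saccade_deg_per_s: float = 900.0) -> float:
    """Angular distance the gaze can travel while a frame is in flight."""
    return saccade_deg_per_s * latency_ms / 1000.0

def latency_budget_ms(max_drift_deg: float, saccade_deg_per_s: float = 900.0) -> float:
    """Round-trip latency that keeps gaze drift within max_drift_deg."""
    return max_drift_deg / saccade_deg_per_s * 1000.0

print(gaze_drift_deg(100.0))    # ~100 ms round trip -> 90.0 deg of gaze drift
print(latency_budget_ms(5.0))   # keeping drift under 5 deg -> ~5.6 ms budget
```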

Or am I wrong, and can all the latency problems be solved in CloudXR (see "Latency Requirements for Foveated Rendering in Virtual Reality")?

I would be surprised if you could. But realistically, the point of CloudXR is that you shouldn't really need foveated rendering in the first place…

Eye tracking would be fine for other purposes, I imagine (gameplay, blink detection, etc.), just not foveated rendering.

Hi,

Eye tracking + foveated rendering is definitely possible in CloudXR. However, the actual visual impact of having the foveated center delayed relative to eye movement is not clear yet. It is something that could be looked into in the future, though.

-Will

Any updates on this?

We can clearly see the foveation in the periphery, and even imperfect or laggy dynamic foveation driven by eye tracking would be preferable to what we currently see when peeking at the sides of the view in VR.

With more and more VR headsets supporting eye tracking (especially the upcoming Quest Pro / Cambria), it would be great to get an ETA on when / if passing eye coordinates to CloudXR will be added.
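
To make the ask concrete, here is a minimal sketch of the kind of per-frame value a client could derive from the eye tracker and hand over for dynamic foveation: a view-space gaze direction projected to a 2D foveation center. The function name, gaze convention, and symmetric FOV are assumptions for illustration only, not CloudXR API:

```python
import math

def foveation_center_ndc(gaze_dir, half_fov_deg=55.0):
    """Project a view-space gaze direction onto the image plane.

    gaze_dir: (x, y, z) unit vector in the eye's view space, -z forward
              (this convention is an assumption, not the CloudXR one).
    Returns (u, v) in [-1, 1], the point the high-detail foveated region
    would be centered on. Real headsets use asymmetric per-eye frusta,
    so treat this symmetric-FOV projection as illustrative only.
    """
    x, y, z = gaze_dir
    tan_half = math.tan(math.radians(half_fov_deg))
    u = (x / -z) / tan_half
    v = (y / -z) / tan_half
    return max(-1.0, min(1.0, u)), max(-1.0, min(1.0, v))

# Example: gaze 10 degrees to the right of straight ahead.
theta = math.radians(10.0)
print(foveation_center_ndc((math.sin(theta), 0.0, -math.cos(theta))))
```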

thanks.