I have started my journey with OptiX, targeting foveated ray tracing. My first question: how can eye tracking be enabled in OptiX? One solution could be OpenXR, right? But OpenXR only works for AR, VR, and MR. However, if we have an eye tracker, say the latest Tobii Eye Tracker 5, which works in a desktop environment, how can this eye tracker be added to OptiX? Is there any library for integrating an arbitrary eye tracker with OptiX?
And my second question: is foveated ray tracing (gaze-tracked rendering to lower the rendering load) possible in OptiX? A similar code repository or tutorial would be really helpful.
OptiX is not a renderer; it’s a general-purpose ray-casting SDK that can be used to implement pretty much anything you can solve with ray tracing.
As a ray-casting SDK it knows absolutely nothing about HMDs, VR, AR, eye tracking, or foveated rendering.
All of that lives outside of OptiX, and you would first need to implement a renderer which contains the necessary features.
That means the HMD tracking of the respective VR runtime would provide you with the current camera location and projection.
An eye tracker would provide you with a gaze direction inside that HMD coordinate system.
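For a symmetric pinhole projection, converting that gaze direction into a normalized screen position could look like the small sketch below (HMD projections are asymmetric in practice, so you would use the projection supplied by the runtime; the function name and parameters here are just for illustration):

```cpp
#include <cuda_runtime.h> // for float2/float3 and make_float2

// Hypothetical helper: project a camera-space gaze direction (z pointing
// forward) onto the image plane and remap from [-1, 1] to [0, 1].
float2 gazeToNdc(float3 gaze, float tanHalfFovX, float tanHalfFovY)
{
    const float px = (gaze.x / gaze.z) / tanHalfFovX;
    const float py = (gaze.y / gaze.z) / tanHalfFovY;
    return make_float2(0.5f * (px + 1.0f), 0.5f * (py + 1.0f));
}
```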
Both together could be used to control a ray generation program for that projection, with a ray distribution where the foveated area gets more rays.
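A minimal device-side sketch of that idea, assuming a hypothetical LaunchParams struct fed by the host each frame (the fovea radius, the hash, and the falloff are placeholders illustrating the ray-distribution idea, not a tuned design):

```cpp
#include <optix.h>
#include <cuda_runtime.h>

struct LaunchParams // hypothetical layout
{
    OptixTraversableHandle handle;
    float4*                accumBuffer;  // output color per pixel
    unsigned int           width, height;
    float2                 gazeNdc;      // gaze point in [0,1]^2 from the eye tracker
    float3                 camEye, camU, camV, camW; // camera basis from the VR runtime
    unsigned int           frameIndex;
};

extern "C" __constant__ LaunchParams params;

// Cheap per-pixel hash for a stochastic keep/skip decision in the periphery.
static __forceinline__ __device__ float hash2(unsigned int x, unsigned int y, unsigned int f)
{
    unsigned int h = x * 374761393u + y * 668265263u + f * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h ^ (h >> 16)) * (1.0f / 4294967295.0f);
}

extern "C" __global__ void __raygen__foveated()
{
    const uint3 idx = optixGetLaunchIndex();
    const uint3 dim = optixGetLaunchDimensions();

    // Normalized pixel position and distance to the gaze point.
    const float2 ndc  = make_float2((idx.x + 0.5f) / dim.x, (idx.y + 0.5f) / dim.y);
    const float  dx   = ndc.x - params.gazeNdc.x;
    const float  dy   = ndc.y - params.gazeNdc.y;
    const float  dist = sqrtf(dx * dx + dy * dy);

    // Sampling probability: 1.0 inside the fovea, falling off toward the edge.
    // The radius is a placeholder; tune it against the HMD's field of view.
    const float foveaRadius = 0.1f;
    const float keepProb    = fminf(1.0f, foveaRadius / fmaxf(dist, foveaRadius));

    // Skipped peripheral pixels keep alpha == 0 (the host clears accumBuffer
    // before the launch), so a later gather pass can identify and fill them.
    if (hash2(idx.x, idx.y, params.frameIndex) > keepProb)
        return;

    // Pinhole-style primary ray built from the camera basis of the VR runtime.
    const float2 d = make_float2(2.0f * ndc.x - 1.0f, 2.0f * ndc.y - 1.0f);
    float3 dir = make_float3(
        d.x * params.camU.x + d.y * params.camV.x + params.camW.x,
        d.x * params.camU.y + d.y * params.camV.y + params.camW.y,
        d.x * params.camU.z + d.y * params.camV.z + params.camW.z);
    const float len = sqrtf(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z);
    dir = make_float3(dir.x / len, dir.y / len, dir.z / len);

    unsigned int p0 = 0, p1 = 0, p2 = 0; // payload: packed radiance from the hit programs
    optixTrace(params.handle, params.camEye, dir,
               0.0f, 1e16f, 0.0f, OptixVisibilityMask(255),
               OPTIX_RAY_FLAG_NONE, 0, 1, 0, p0, p1, p2);

    params.accumBuffer[idx.y * dim.x + idx.x] =
        make_float4(__uint_as_float(p0), __uint_as_float(p1), __uint_as_float(p2), 1.0f);
}
```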
If the peripheral areas should get fewer rays, you need either a sophisticated scatter algorithm which fills more than one pixel for some rays, or a similarly sophisticated post-processing filter which gathers the sparse results and expands them into the final display texture for the HMD runtime. (When following the links at the bottom, look at the variable rate shading feature in the rasterizer APIs as well.)
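One possible shape of that gather filter as a plain CUDA kernel, assuming the ray generation sketch above marked skipped pixels with alpha == 0. This naive nearest-traced-neighbor fill is only to show the mechanism; a production version would reproject and filter the previous frame, which is where most of the image quality comes from:

```cpp
__global__ void gatherSparsePixels(const float4* traced, float4* out,
                                   unsigned int width, unsigned int height)
{
    const unsigned int x = blockIdx.x * blockDim.x + threadIdx.x;
    const unsigned int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    const float4 c = traced[y * width + x];
    if (c.w > 0.0f) { out[y * width + x] = c; return; } // pixel was traced this frame

    // Expand the search window until a traced sample is found.
    for (int r = 1; r < 8; ++r)
    {
        for (int dy = -r; dy <= r; ++dy)
        for (int dx = -r; dx <= r; ++dx)
        {
            const int nx = (int)x + dx, ny = (int)y + dy;
            if (nx < 0 || ny < 0 || nx >= (int)width || ny >= (int)height) continue;
            const float4 n = traced[ny * width + nx];
            if (n.w > 0.0f) { out[y * width + x] = n; return; }
        }
    }
    out[y * width + x] = make_float4(0.0f, 0.0f, 0.0f, 1.0f); // nothing traced nearby
}
```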
Then you would need to render the scene itself with some ray tracing algorithm.
Finally, the rendered image needs to be transferred to the actual HMD runtime, which involves some interoperability between CUDA and the 3D graphics API of the HMD runtime (normally Direct3D, maybe Vulkan).
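The host-side handoff could look roughly like this, assuming the HMD runtime consumes a Direct3D 11 texture whose format matches the float4 image (e.g. DXGI_FORMAT_R32G32B32A32_FLOAT); the function names and parameters are placeholders, and a Vulkan path would go through external memory instead:

```cpp
#include <cuda_runtime.h>
#include <cuda_d3d11_interop.h>
#include <d3d11.h>

// Register the compositor-facing texture with CUDA once at startup.
cudaGraphicsResource* registerTarget(ID3D11Texture2D* d3dTexture)
{
    cudaGraphicsResource* res = nullptr;
    cudaGraphicsD3D11RegisterResource(&res, d3dTexture,
                                      cudaGraphicsRegisterFlagsNone);
    return res;
}

// Per frame: map the texture, copy the rendered image into it, unmap,
// then hand the texture to the VR compositor via its own submit call.
void blitToHmdTexture(cudaGraphicsResource* res, const float4* deviceImage,
                      size_t width, size_t height, cudaStream_t stream)
{
    cudaGraphicsMapResources(1, &res, stream);
    cudaArray_t array = nullptr;
    cudaGraphicsSubResourceGetMappedArray(&array, res, 0, 0);
    cudaMemcpy2DToArrayAsync(array, 0, 0,
                             deviceImage, width * sizeof(float4),
                             width * sizeof(float4), height,
                             cudaMemcpyDeviceToDevice, stream);
    cudaGraphicsUnmapResources(1, &res, stream);
}
```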
That is all your responsibility as a developer.
Due to the high frame rates and resolutions required for VR, this will be a challenge to get fast and nice.
Progressive rendering algorithms are not really applicable because the camera or gaze direction changes every frame.
That means it would need some final-frame rendering which uses simple lighting, reflections, etc. That in turn will result in lots of aliasing.
Trust me, I tried implementing that some time ago. It’s probably better with current high-end Ampere boards today, and with that plural I really mean multi-GPU NVLink setups!
General OptiX 7 tutorials can be found inside the OptiX SDK and the links in the sticky posts at the top of this sub-forum.
Considering that multiple VR-specific features have been added to the rasterizer APIs, those will be able to solve this much more easily.
Please also have a look at the VRWorks API to see what it offers: https://developer.nvidia.com/vrworks
I’m not sure if HMDs with eye trackers already provide variable rate shading information. I assume you need to build that yourself.
There have been quite a number of GTC talks about VR and VRWorks over the years. Have a search through them here: https://www.nvidia.com/en-us/on-demand/
@droettger, you have given the best and most insightful suggestions. Now I have a better understanding of the OptiX API. Hopefully I can overcome the roadblocks; I recently bought an RTX 3090 GPU. On the other hand, the HTC Vive Pro Eye has already enabled variable rate shading. So wish me luck. Thanks for the details.