A pair of 3090s is a very beefy GPU setup that should be able to handle your VR application just fine, provided that you can push the rendered results into the Vive headset.
The main issue for you to solve is how to render something (using any means whatsoever) and send those rendered results to the headset display. We on the OptiX team are completely unfamiliar with the Vive SDKs for display, but assuming that Vive provides a path for you to build your own rendering engine, the place to look for such an example would be on their developer page and/or forums.
https://developer.vive.com/resources/
https://vr.tobii.com/sdk/develop/
From a little bit of searching, it appears that Vive headsets have the most support for Unity- and Unreal-based applications and content, but I can see a Wave Native SDK that you can use to build OpenGL applications. There are details and a tutorial here: https://hub.vive.com/storage/docs/en-us/RenderRuntime.html
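To give a feel for the shape of program that tutorial builds toward, here is a bare-bones sketch of a native OpenGL render loop. To be clear, viveInit / viveSubmitFrame / viveShutdown are placeholder names I made up for illustration, not real Wave SDK functions (their tutorial has the actual entry points), and GLFW is used only as a convenient way to get a window and GL context:

```cpp
#include <GLFW/glfw3.h>

// Placeholder stubs -- replace these with the real Vive/Wave SDK calls
// from the tutorial. These names are invented for illustration only.
static bool viveInit()                                 { return true; }
static void viveSubmitFrame(GLuint left, GLuint right) { (void)left; (void)right; }
static void viveShutdown()                             {}

int main()
{
    if (!glfwInit()) return 1;

    // A desktop window doubles as a mirror view and gives us a GL context.
    GLFWwindow* window = glfwCreateWindow(1280, 720, "Vive mirror", nullptr, nullptr);
    glfwMakeContextCurrent(window);

    if (!viveInit()) return 1;

    // ... create one GL texture / FBO per eye here ...
    GLuint eyeTex[2] = { 0, 0 };

    while (!glfwWindowShouldClose(window))
    {
        // Render something trivial per eye first (even just a clear color),
        // then hand both eye textures to the headset runtime.
        glClearColor(0.1f, 0.2f, 0.4f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        viveSubmitFrame(eyeTex[0], eyeTex[1]);

        glfwSwapBuffers(window);   // optional desktop mirror
        glfwPollEvents();
    }

    viveShutdown();
    glfwTerminate();
    return 0;
}
```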
I recommend going through their tutorial to build a very basic OpenGL based renderer for your display, without trying to use OptiX (and also don’t bother with eye tracking or head motion yet). Just get to the point where you can drive the headset display with your own code. Once that is up and running, you can then consider the problem of how to render using OptiX and then push the resulting image into an OpenGL buffer and use it in your Vive application. This part won’t be very difficult because the OptiX SDK samples already have built-in OpenGL interop, so for a path tracing sample you could use optixPathTracer, but pay attention to the OpenGL code we have in our sutil library, in particular GLDisplay.cpp. You’re going to need to create the GL context & display buffer in your own application, follow our example of how to render into the display buffer, and then pass it to the Vive SDK for display.
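To make the interop pattern concrete, here is a condensed sketch of what sutil’s CUDAOutputBuffer does in its GL-interop mode: register a pixel buffer object with CUDA, let the launch write into it, then unmap and hand the pixels back to GL. The Params struct is a trimmed-down stand-in for the sample’s launch parameters, and pipeline / sbt / d_params are assumed to come from your usual OptiX setup code; error checking is omitted:

```cpp
#include <glad/glad.h>       // or whatever GL loader you prefer
#include <cuda_gl_interop.h>
#include <cuda_runtime.h>
#include <optix.h>
#include <optix_stubs.h>     // plus optix_function_table_definition.h in exactly one .cpp

struct Params                // trimmed-down stand-in for the sample's launch params
{
    uchar4*      frame_buffer;
    unsigned int width, height;
};

static GLuint                pbo      = 0;
static cudaGraphicsResource* cuda_pbo = nullptr;

void createInteropBuffer(unsigned int width, unsigned int height)
{
    // A pixel buffer object that both OpenGL and CUDA can address.
    glGenBuffers(1, &pbo);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
    glBufferData(GL_PIXEL_UNPACK_BUFFER, width * height * sizeof(uchar4),
                 nullptr, GL_STREAM_DRAW);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
    cudaGraphicsGLRegisterBuffer(&cuda_pbo, pbo, cudaGraphicsMapFlagsWriteDiscard);
}

void launchIntoPBO(OptixPipeline pipeline, const OptixShaderBindingTable& sbt,
                   CUdeviceptr d_params, CUstream stream,
                   unsigned int width, unsigned int height)
{
    // Map the PBO so optixLaunch can write pixels directly into it.
    uchar4* d_pixels  = nullptr;
    size_t  num_bytes = 0;
    cudaGraphicsMapResources(1, &cuda_pbo, stream);
    cudaGraphicsResourceGetMappedPointer(reinterpret_cast<void**>(&d_pixels),
                                         &num_bytes, cuda_pbo);

    Params params = { d_pixels, width, height };
    cudaMemcpyAsync(reinterpret_cast<void*>(d_params), &params, sizeof(Params),
                    cudaMemcpyHostToDevice, stream);

    optixLaunch(pipeline, stream, d_params, sizeof(Params), &sbt, width, height, 1);

    cudaGraphicsUnmapResources(1, &cuda_pbo, stream);
    // The PBO now holds the frame: upload it to a GL texture with
    // glTexSubImage2D (see GLDisplay.cpp) and hand that texture to the Vive SDK.
}
```

GLDisplay.cpp then just uploads the PBO into a texture and draws a full-screen quad; in your case that texture (or the buffer contents) is what you’d hand to the Vive SDK instead of, or in addition to, a desktop window.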
There will be several other bits and pieces to integrate once you get the display rendering to work. At that point you can investigate your eye tracking API and figure out how to pass the camera parameters over to your OptiX based renderer. There will be plenty of work figuring out how to synchronize your OptiX launches with the Vive display refresh. I’m assuming you’ll probably use one GPU for each eye. Because Vive supports OpenGL, you will have the option to do some raster based rendering in OpenGL before or after your OptiX ray tracing phase, if you want; this is how you might display on-screen text or UI. If you use a pre-render phase, you can consider rasterizing your first bounce and ray tracing the secondary bounces (reflections, refractions, shadows, etc.). This is optional, and can be fairly complicated, so you could start by using pure OptiX and then investigate the raster integration later if it seems necessary.
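For the camera part, the per-frame flow is essentially: poll the latest pose (and gaze, if you’re using it), rebuild the camera basis, update the launch params, launch. Here’s a rough sketch; getEyePose and all the extern objects are placeholders for your own tracking query and OptiX setup, and the LaunchParams layout just mirrors the eye/U/V/W camera convention the SDK samples use:

```cpp
#include <cuda_runtime.h>
#include <optix.h>
#include <optix_stubs.h>

struct LaunchParams
{
    float3  cam_eye;
    float3  cam_u, cam_v, cam_w;   // camera basis, as in the SDK samples
    uchar4* frame_buffer;
};

// Hypothetical pose type and query -- substitute your tracking SDK's types.
struct EyePose { float3 position, u, v, w; };
EyePose getEyePose(int eye);       // placeholder: 0 = left, 1 = right

// One set of OptiX objects per eye. With two 3090s, you could bind each
// eye's context to its own device (cudaSetDevice) during setup.
extern OptixPipeline           pipelines[2];
extern OptixShaderBindingTable sbts[2];
extern CUdeviceptr             d_params[2];   // device-side LaunchParams
extern uchar4*                 d_frame[2];    // per-eye output buffers
extern CUstream                streams[2];

void launchEye(int eye, unsigned int width, unsigned int height)
{
    // Sample the pose as late as possible before the launch to cut latency.
    const EyePose pose = getEyePose(eye);

    LaunchParams params = {};
    params.cam_eye      = pose.position;
    params.cam_u        = pose.u;
    params.cam_v        = pose.v;
    params.cam_w        = pose.w;
    params.frame_buffer = d_frame[eye];

    cudaMemcpyAsync(reinterpret_cast<void*>(d_params[eye]), &params,
                    sizeof(LaunchParams), cudaMemcpyHostToDevice, streams[eye]);
    optixLaunch(pipelines[eye], streams[eye], d_params[eye], sizeof(LaunchParams),
                &sbts[eye], width, height, 1);
}
```

The VR runtime may also expose a pose-prediction call timed to the display refresh; if it does, that prediction is what you’d want to feed in here rather than a raw sensor sample.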
This path will take some time to explore, I’m sure, and we will be interested to hear how it’s going and see demos of OptiX on Vive Pro Eye once it’s up and running. Good luck!!
–
David.