Is out-of-core supported by OptiX for the GeForce series? If not, will it be supported in the future?
An older version of the documentation says out-of-core is supported, but not for GeForce.
A more recent version says out-of-core processing is supported and doesn't mention any GeForce limitation.
The latest version, which is also the one I am using, doesn't mention out-of-core at all.
I mean the typical out-of-core features useful for a GPU ray tracer:
- Extremely large textures
- Working sets that exceed the installed GPU memory
Out-of-core texture and geometry support (in the sense of working sets bigger than the installed GPU VRAM) existed in earlier OptiX versions. It was available for Quadro and Tesla boards, and later also for GeForce GTX.
But the implementation was limited, and the geometry part in particular was not very performant.
Because of that it was removed during the OptiX 4.0 core re-architecture to pave the way for future developments, some of which can be found in OptiX 5.0 already.
The re-architecture also made it easier to add NVLink support, which allows combining the VRAM of multiple NVLink-connected GPUs into a bigger pool. For example, in an NVIDIA DGX-1 system, four NVLink-connected GPUs with 16 GB VRAM each can be combined into a 64 GB pool.
Please keep requesting features and asking technical questions as you like, but we simply cannot comment on future developments or schedules.