Tesla & Quadro with OPTIMUS

Dear all CUDA experts

I have a system with a Quadro 6000 and a Tesla C2075 card inside.

When I scan the system for CUDA-capable devices, the Quadro 6000 appears as device 0 and the Tesla as device 1.

The other systems my application runs on have only one CUDA-capable device.

What should I do if I want to offload all CUDA calculations automatically to the Tesla while using the Quadro only for rendering?

I reconstruct a huge 3D volume and use VTK’s GPU ray-cast mapper to update the image on screen, which automatically uses the Quadro.

I’m looking for a solution where I don’t need to explicitly call cudaSetDevice(TeslaDevice). It is mentioned that NVIDIA’s OPTIMUS technology will handle this automatically (I’m not sure), but in my case, if I don’t explicitly change the device via cudaSetDevice, all my CUDA kernels, including VTK’s GPU-based ray-cast algorithms, execute on the Quadro while the Tesla sits idle.

How can I automatically offload CUDA calculations to the Tesla?


I’m using Windows 7 64-bit.

How does this flag affect CUDA device detection and kernel offload to the Tesla instead of the Quadro?

Thanks in advance.

If you have an executable in Windows with Optimus, right-clicking the executable should give you an option to run it with the NVIDIA GPU. If you have two cards, one gets id 0 and the other id 1. In the code you have to use the cudaSetDevice function to make it run on a specific device. If you do not set it, it will automatically run on device 0.
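Since the device ids can differ between machines, one common pattern is to pick the compute device by the name reported in its properties rather than hard-coding an id. Below is a minimal sketch; pickComputeDevice is a hypothetical helper, and in a real program the name strings would come from cudaGetDeviceCount / cudaGetDeviceProperties (prop.name), followed by a single cudaSetDevice call at startup:

```cpp
#include <string>
#include <vector>

// Hypothetical helper: given the names reported for each CUDA device,
// return the index of the first Tesla board, or 0 (the default device)
// if no Tesla is present.
int pickComputeDevice(const std::vector<std::string>& deviceNames) {
    for (std::size_t i = 0; i < deviceNames.size(); ++i) {
        if (deviceNames[i].find("Tesla") != std::string::npos)
            return static_cast<int>(i);
    }
    return 0; // fall back to the default device
}

// In the real application, the names would be gathered like this:
//
//   int n = 0;
//   cudaGetDeviceCount(&n);
//   std::vector<std::string> names;
//   for (int i = 0; i < n; ++i) {
//       cudaDeviceProp prop;
//       cudaGetDeviceProperties(&prop, i);
//       names.push_back(prop.name);
//   }
//   cudaSetDevice(pickComputeDevice(names));
```

Done once at startup, this routes all subsequent kernels in that host thread to the Tesla, while OpenGL rendering stays on whichever card drives the display.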

On my system Quadro is assigned device id 0 and Tesla device id 1.

I have CUDA modules that were developed some time back, when there was only one GPU in the system. These modules use the device with id 0 by default (a call to cudaSetDevice(0)). Do I now need to explicitly change the code to use the Tesla device?
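One way to avoid touching the code at all (assuming a CUDA toolkit recent enough to support it; it also works on Windows) is the CUDA_VISIBLE_DEVICES environment variable, which restricts which GPUs the runtime enumerates. The application name below is hypothetical:

```shell
:: Windows (cmd.exe): expose only the Tesla (system id 1) to the CUDA
:: runtime. Inside the application it then enumerates as device 0, so
:: the existing cudaSetDevice(0) calls pick the Tesla unchanged.
set CUDA_VISIBLE_DEVICES=1
MyReconstructionApp.exe
```

Note that this only affects CUDA enumeration; OpenGL rendering (e.g. VTK) still uses the card driving the display, so the Quadro keeps handling the screen.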

I read in some article that with OPTIMUS technology, all CUDA calculations are automatically offloaded to the Tesla irrespective of which device we set via cudaSetDevice(). Is that true?

Thanks in advance for your replies.