Lenovo ThinkPad W530 running Linux: use integrated (Intel) for X11 and discrete (NVIDIA) for CUDA dev


I have a Lenovo ThinkPad W530 running Linux, and it has a BIOS option for choosing which graphics card to use: integrated, discrete, or Optimus.

I was thinking of running X11 on the Intel card and CUDA programs on the NVIDIA card; that way I can debug CUDA while X11 is running (effectively using the NVIDIA card as a compute GPU rather than for actual display output).

I was wondering if anyone has tried this?


You’re looking for Bumblebee – http://bumblebee-project.org/. I’ve run it on Ubuntu without issues; it creates a virtual X screen in which it runs anything that uses graphics acceleration. If the “discrete” BIOS setting actually exposes the NVIDIA GPU natively, with the Intel graphics hidden entirely, you won’t need Bumblebee.

Bumblebee is the solution, if your card is supported. Unfortunately, on some Lenovo laptops Bumblebee does not work. I found a workaround on an Arch Linux wiki page, but I was not able to test it.

Hi all,

From what I understood, Bumblebee uses the NVIDIA card to create a virtual screen and, if you need it, redirects its output to the real screen. However, I don’t need the output at all; in fact it is a problem, because I cannot debug the GPU while any X server is using it.

The idea is to use the Intel card for X and the NVIDIA card just for its processing power.
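
If the BIOS lets you keep both GPUs enabled without going through Optimus, one common way to pin the X server to the Intel chip is an explicit Device section in xorg.conf. A minimal sketch, assuming the stock `intel` driver; the BusID shown is just the usual address for the integrated GPU, so confirm yours with `lspci | grep VGA`:

```
Section "Device"
    Identifier "Intel"
    Driver     "intel"
    BusID      "PCI:0:2:0"   # example value; confirm with lspci
EndSection
```

With X pinned to the Intel device, the NVIDIA card stays free for CUDA work.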


That is your call. I have an Acer 4830TG with Ubuntu 12.04; I have Bumblebee and I run CUDA programs with it. I do not know how Bumblebee works internally, but I know it works on my laptop.

In my BIOS I can only choose whether or not to disable the NVIDIA card, which is a little different from what you described. Someone on this page claims he can run CUDA with Bumblebee: http://ubuntuforums.org/showthread.php?t=2007485&page=2 .

Good luck.

Hi Pasoleatis,

Thanks for the update. What about debugging CUDA programs at the GPU level? I know running them works, but that is a bit different.

I will check the debugging today or tomorrow evening. I am pretty sure the X server runs on the Intel GPU on my laptop, so the debugging should work as well.

Cool, thanks. CPU and CUDA debugging work, but GPU-level debugging is what I’m missing.

I just checked: both nvprof (the NVIDIA command-line profiler) and nvvp (the NVIDIA Visual Profiler) work. The only trick is that any CUDA program or profiler must be run through optirun, e.g. optirun nvprof ./a.out or optirun nvvp.
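
As a minimal sketch of that pattern, every CUDA tool can be wrapped in optirun with a guard that fails loudly when Bumblebee is missing; `nvprof`, `nvvp`, and `./a.out` are just the examples from this thread:

```shell
#!/bin/sh
# Run CUDA tools through optirun so they execute on the discrete NVIDIA GPU.
# Fall back with a message if Bumblebee's optirun is not installed.
if command -v optirun >/dev/null 2>&1; then
    optirun nvprof ./a.out   # profile the program on the NVIDIA card
    optirun nvvp             # launch the visual profiler the same way
else
    echo "optirun not found: install Bumblebee first"
fi
```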

Have fun!

Yes, but those work regardless of whether X11 runs on the same GPU; it is specifically GPU-level debugging (cuda-gdb attached to the hardware) that I am after, not debugging the CPU or the CUDA code, which also works fine in a single-GPU setup. To debug the GPU, it must not be running X11; from what I understand, though, Bumblebee creates a virtual screen on the NVIDIA card, so GPU debugging should not work in that case.

Hello. I opened nvvp (the NVIDIA Visual Profiler), put the executable in, and it ran. I do not understand what your problem is. I am telling you I just did it and you do not believe me. Since you started this topic, you could have installed Bumblebee yourself and convinced yourself.
I tried cuda-gdb, nvprof and nvvp. The X server runs on the Intel card.

OK, thanks. The laptop is being shipped to me; I don’t have it physically yet, so I am trying to gather as much info as I can in advance, since I will be quite busy these next couple of months. I have also tested those tools on my current system, which only has an NVIDIA card (so no Optimus issue); they all worked, even the CUDA and CPU debugging (using both cuda-gdb and Nsight Eclipse Edition), but since I only have one GPU I cannot debug the GPU itself. Thanks for the info.

It is also that not everyone realizes there are three things you can debug: the CPU, the CUDA code, and the GPU itself. The first two work on Linux with one card; all three work on Linux if you have no X running, or if you have two cards (one for X and one for the dev work). On Windows, only the last two work (from what the NVIDIA site says).

So the reason I was prodding is that just running cuda-gdb and stepping through a main() doesn’t mean GPU debugging works. That is what I did initially (and thought it worked), until I realized, and then read on the NVIDIA site, that GPU debugging doesn’t work if X is running on the card.
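
One way to check that condition on the laptop, before trusting a cuda-gdb session, is to see whether any process (in particular Xorg) holds the NVIDIA device nodes open. A rough sketch under the assumption of a standard Linux setup with the proprietary driver; the `/dev/nvidia*` nodes and the `fuser` tool (from psmisc) are the assumptions here:

```shell
#!/bin/sh
# If Xorg holds /dev/nvidia* open, GPU-level debugging on that card will fail.
if ls /dev/nvidia* >/dev/null 2>&1; then
    if fuser -v /dev/nvidia* 2>&1 | grep -q Xorg; then
        echo "X is using the NVIDIA card - GPU debugging will not work"
    else
        echo "NVIDIA card is free of X - cuda-gdb can attach to the GPU"
    fi
else
    echo "no /dev/nvidia* nodes - NVIDIA driver not loaded"
fi
```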

So please do not take offense, Pasoleatis :-). It is not that I doubt that you tried it or your results; I am just trying to make sure we are talking about the same thing.

:) OK. Just tell me what you would like me to try and I will try it on my laptop.

Don’t worry about it. I should get the laptop on Monday; I will test it and post my results here. Thank you for your help. I’m not sure how I would explain what to test anyway :-).

Did you find the answer to your question?