This description shows how to avoid installing Nvidia’s X server when the GeForce card is used for calculations only.
I recently saw some strange results when running the Nvidia X server that is recommended for CUDA alongside the ATI driver:
[a] Some programs won’t run at all, reporting an error about X palettes(?).
[b] xscreensaver covers only one quarter of the first screen.
[c] OpenOffice’s presentation program displays only half of each slide in fullscreen mode.
My setup is the following: a GeForce 8800 GTS for calculations and an ATI Radeon 7000 for X, all under Ubuntu Feisty x86_64.
As I don’t want to run X on the GeForce, I figured out a way to avoid installing the Nvidia X server altogether.
(1) Remove everything from /etc/X11/xorg.conf that is needed only for the Nvidia card.
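For illustration, the parts to delete typically look like the fragment below. The identifier and BusID are hypothetical and will differ on your system; also remove any "Screen" section and ServerLayout reference that points at this device.

```
# Hypothetical xorg.conf fragment serving only the Nvidia card -- delete it,
# together with the matching Screen section and ServerLayout entry.
Section "Device"
    Identifier  "NVIDIA Card"
    Driver      "nvidia"
    BusID       "PCI:1:0:0"
EndSection
```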
(2) Tell the linux-restricted-modules helper lrm-video to load the nvidia kernel module even though, judging from xorg.conf, there seems to be no need for it: in /etc/modprobe.d/lrm-video, comment out the line starting with “install nvidia” by prepending a “#”.
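The edit can be done with sed. A sketch, demonstrated on a sample copy so the system file is not touched (the exact content of the “install nvidia” line on Feisty may differ; on the real system, edit /etc/modprobe.d/lrm-video as root):

```shell
# Create a sample file with a line like the one in /etc/modprobe.d/lrm-video
# (the exact content is an assumption for this demonstration):
printf 'install nvidia /sbin/lrm-video nvidia $CMDLINE_OPTS\n' > lrm-video.sample

# Prepend "#" to the "install nvidia" line, disabling the override:
sed -i 's/^install nvidia/#install nvidia/' lrm-video.sample

cat lrm-video.sample
```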
(3) Force loading of the kernel module at boot time (no idea why the kernel does not load the module automagically when /dev/nvidiactl is accessed): add a line containing just “nvidia” to /etc/modules.
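A sketch of the edit, again demonstrated on a sample copy (on the real system the file is /etc/modules and root privileges are needed):

```shell
# Sample /etc/modules with a couple of typical entries:
printf 'loop\nlp\n' > modules.sample

# Append "nvidia" only if it is not already listed:
grep -qx 'nvidia' modules.sample || echo 'nvidia' >> modules.sample

cat modules.sample
```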