Using the GPU just for computing

Dear all,
Is there a way to use the internal graphics card for graphics and the NVIDIA card just for CUDA? When I try to do so, the NVIDIA driver will not recognise the card. I have to set the NVIDIA card as the primary graphics adapter in the BIOS and connect the screen to it in order to be able to use the GPU for computing. Thanks for any help!
Michael

This should (and has) worked without any problems. You failed to state which driver you’re using, what the actual error is, or which GPUs are involved, so it’s not possible to comment on specifics. Additionally, please generate and attach an nvidia-bug-report.log.

Ok, good to hear! I have a GTX 280 and use the driver installed with NVIDIA-Linux-x86_64-177.67-pkg2.run, on SUSE 11.0. When I declare the NVIDIA GPU as the primary graphics card in the BIOS, everything works fine. When I change this to the internal graphics card, I get the following error when I try to run e.g. deviceQuery:

FATAL: Error inserting nvidia (/lib/modules/2.6.25.11-0.1-default/kernel/drivers/video/nvidia.ko): No such device

Attached is the bug report! Thanks!
nvidia_bug_report.txt (156 KB)

You should check whether your GTX 280 is listed in your xorg.conf; otherwise, run the configuration utility so that X includes the GTX 280.
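
If you do end up adding it by hand, the relevant piece of xorg.conf is just a Device section along these lines (the Identifier and BusID below are examples, not values from your machine; take the actual bus ID from lspci):

Section "Device"
    Identifier  "GTX280"        # example name
    Driver      "nvidia"
    BusID       "PCI:1:0:0"     # example; use the value reported by lspci
EndSection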

If you’re switching between an onboard Intel display and a GTX 280 in a PCIe slot, generally you can’t use both at the same time (or that has been my experience on every onboard graphics adapter going back many years).
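
A quick way to confirm that is to see whether the card still shows up on the PCI bus when the onboard adapter is set as primary; some BIOSes disable the PCIe graphics device entirely in that mode. Roughly (assuming the usual tool locations):

/sbin/lspci | grep -i nvidia     # is the GTX 280 enumerated at all?
dmesg | grep -i nvidia           # if it is, check why the module refused to load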

I too want to use a GTX 280 for computation only, on SUSE 11.

The card should arrive Monday and I am trying to figure out how to do it ahead of time. I want to continue using my integrated ATI 3300 motherboard graphics for the X server. I understand that this is possible but complicated, as the ATI and NVIDIA drivers will conflict unless it is done just right. Has anyone done this before? Currently I am using the latest ATI drivers for the integrated card.

I did find the following on the forum, but SUSE 11 does not have the described modprobe lrm-video file, which I think is an Ubuntu thing.

Thanks in advance for any help you can provide.

From the looks of it, just ignore step 2 in that process: linux-restricted-modules is an Ubuntu thing.

Any idea whether the integrated ATI or the GTX 280 should be set to primary in the BIOS?

Thanks

Presumably the integrated one, but I haven’t tried this and don’t know whether that means it will disable the PCIe slot completely.

I wonder if it is possible to go a step further, and actually turn the GPU off (so that it draws little or no power) when it’s not being actively used for computation?

I’m new to GPU computation, but will be doing some development & testing in the near future (the machine is in transit now). After optimizing my usual (laptop) machine to draw less than 20 W in “stare at the screen & think” mode, I’m a bit disturbed at the thought of running a machine which seems (from the 500-600 W power supply recommendation) to have a power draw rather greater than the rest of my house. I’ve looked around NVIDIA’s site a bit, but didn’t see anything useful on power conservation. Anyone have suggestions?

Thanks,
James

Idle power draw is relatively low with any modern GPU.

My Intel Q9300 / NVIDIA 780i / 8800 GTS (G92) machine idles around 150 W and peaks at 220 W. Only when adding a 9800 GX2 and running all 3 GPUs full bore does the power usage go up to ~430 W. (measurements made at the wall with a Kill-a-watt meter).

If you are curious, I could swap out cards and measure the power consumption of the GTX 280.

I suppose I should look on the bright side: it will make a nice space heater this winter :-(

Not worth the effort: I know it’s a lot, and would be more interested in learning ways to reduce the power used, rather than measuring just how bad it is.

Thanks,

James

That’s one way to look at it… The other way is to compare that 220 W to the many kilowatts (plus the air conditioning) that a 32-CPU-core cluster would be pulling, which would offer similar performance in my application.

Sure, but the clusters are out there, available for other researchers to use when I’m not running things on them, and presumably they are configured to sleep when not being used. For instance, I’ve been sitting here editing code for the last couple of hours, and the processor has spent 99% of its time in a low-power sleep state.

Which is really what I’m asking: is there a way to put the GPU into a similar sleep state when I’m not using it for computation?

It’s automatic. Idle power consumption for the GT200 series (as measured by Tech Report) is extremely low, even compared to G92 cards.

That looks better than I had expected, though I wish they’d put in a baseline for the system without an accelerated graphics card. Still, it was running Vista, so with a bit of kernel tweaking I ought to be able to get total power down to something reasonable. Thanks!

Reporting back that I got CUDA working on my GTX 280 for computation only, while continuing to run graphics on my integrated ATI card. The NVIDIA drivers installed without even modifying the xorg.conf file, as I skipped the part involving SaX2, which was nice. I did need to run the init script mentioned previously on the forums to create the device nodes in the /dev directory. Since I’m on openSUSE I had to comment out the Red Hat-specific functions, success and failure, from that script. So in the end I never even had to start X on the NVIDIA card.
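
For reference, the node-creation part is essentially the script from the CUDA release notes; a rough sketch (major number 195 is the NVIDIA character device; the exact script shipped with your toolkit may differ):

#!/bin/bash
# Load the kernel module without starting X on the card.
/sbin/modprobe nvidia
if [ "$?" -eq 0 ]; then
    # Count the NVIDIA controllers found on the PCI bus.
    N3D=`/sbin/lspci | grep -i NVIDIA | grep "3D controller" | wc -l`
    NVGA=`/sbin/lspci | grep -i NVIDIA | grep "VGA compatible controller" | wc -l`
    N=`expr $N3D + $NVGA - 1`
    # Create /dev/nvidia0 .. /dev/nvidiaN plus the control node.
    for i in `seq 0 $N`; do
        mknod -m 666 /dev/nvidia$i c 195 $i
    done
    mknod -m 666 /dev/nvidiactl c 195 255
else
    exit 1
fi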

I also used the patched SDK for openSUSE 11 from Andrew Cooke, whose work made my life much easier; so thanks if you’re out there!

I do have one last question. Ever since putting the card in my machine, my integrated graphics have been running very slowly. For example, scrolling on web pages used to be very fast and seamless; now, with the NVIDIA card in the PCIe slot, everything has a lag, which is very frustrating. Could there be some kind of resource conflict going on? This was happening even before I installed the NVIDIA drivers, and since I never even told xorg about the card I am surprised and clueless as to any potential conflicts.

Does the NVIDIA card take some RAM that would normally go to the integrated graphics, even when it is set up not to be used for graphics (compute only)?