M4A79T Deluxe with multi-GPU

Hi,
do you know whether the ASUS M4A79T Deluxe motherboard can handle multiple NVIDIA GPUs at all?
I would like to install multiple Quadro FX 5800 cards under Ubuntu Linux 9.04 for CUDA (no need for SLI).
The NVIDIA driver for CUDA 3.0 shows only one card.
Thanks,

It should do. I use a Gigabyte board (MA790FXT-UD5P) which is almost identical to it for CUDA with a pair of GT200 GPUs, without any problems, using Ubuntu 9.04. Are you sure you don’t have any other hardware problems, like power supply capacity, for example? Do you see both cards enumerated in the Linux device tree? Can you see the VGA BIOS POST from either card, or can you see the cards in the BIOS hardware setup?
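
For example, two quick checks (just a sketch, the exact output will differ on your setup):

lspci | grep -i nvidia

ls -l /dev/nvidia*

The first lists every NVIDIA controller the kernel can see on the PCI bus, whether or not the driver is loaded; the second shows which device nodes the driver has created so far, if any.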

The power supply is a Cooler Master Silent 1000 W, which should be enough, I guess.

I have never seen both cards appear as /dev/nvidia* devices, but “lshw” shows them.

Only when I use a single card do I see it as “/dev/nvidia0” or similar.

I don’t see any option in the BIOS that lists the cards.

The only PCIe-related setting there is an enable/disable/auto option for the GFX or GFX2 dual-slot configuration.

But changing it did not help.

I suspect that some ATI CrossFire protocol might prevent me from using multiple NVIDIA cards with this board.

Crossfire should have nothing to do with it. I have a pair of GT200s on a Crossfire 790FX motherboard and it works fine. With Ubuntu 9.04 and the 195.36.15 driver I get this when the driver enumerates the devices:

avidday@cuda:~$ uname -a

Linux cuda 2.6.28-18-generic #60-Ubuntu SMP Fri Mar 12 04:26:47 UTC 2010 x86_64 GNU/Linux

avidday@cuda:~$ dmesg | grep nvidia

[   11.973664] nvidia: module license 'NVIDIA' taints kernel.

[   12.228440] nvidia 0000:01:00.0: PCI INT A -> GSI 18 (level, low) -> IRQ 18

[   12.228445] nvidia 0000:01:00.0: setting latency timer to 64

[   12.228542] nvidia 0000:04:00.0: enabling device (0000 -> 0003)

[   12.228545] nvidia 0000:04:00.0: PCI INT A -> GSI 19 (level, low) -> IRQ 19

[   12.228550] nvidia 0000:04:00.0: setting latency timer to 64

I have a panel connected to one card, and the other card is not touched by the display manager. When X11 starts, I get this for the display card:

(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32

(==) NVIDIA(0): RGB weight 888

(==) NVIDIA(0): Default visual is TrueColor

(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)

(**) NVIDIA(0): Option "Coolbits" "1"

(**) May 07 19:16:12 NVIDIA(0): Enabling RENDER acceleration

(II) May 07 19:16:12 NVIDIA(0): Support for GLX with the Damage and Composite X extensions is

(II) May 07 19:16:12 NVIDIA(0):	 enabled.

(II) May 07 19:16:13 NVIDIA(0): NVIDIA GPU GeForce GTX 275 (GT200) at PCI:1:0:0 (GPU-0)

(--) May 07 19:16:13 NVIDIA(0): Memory: 917504 kBytes

(--) May 07 19:16:13 NVIDIA(0): VideoBIOS: 62.00.60.00.01

(II) May 07 19:16:13 NVIDIA(0): Detected PCI Express Link width: 16X

(--) May 07 19:16:13 NVIDIA(0): Interlaced video modes are supported on this GPU

(--) May 07 19:16:13 NVIDIA(0): Connected display device(s) on GeForce GTX 275 at PCI:1:0:0:

(--) May 07 19:16:13 NVIDIA(0):	 ViewSonic VG1930wm (DFP-1)

(--) May 07 19:16:13 NVIDIA(0): ViewSonic VG1930wm (DFP-1): 330.0 MHz maximum pixel clock

(--) May 07 19:16:13 NVIDIA(0): ViewSonic VG1930wm (DFP-1): Internal Dual Link TMDS

(II) May 07 19:16:13 NVIDIA(0): Assigned Display Device: DFP-1

(II) May 07 19:16:13 NVIDIA(0): Validated modes:

(II) May 07 19:16:13 NVIDIA(0):	 "1440x900"

and then this for the non-display card:

(II) May 07 19:16:14 NVIDIA(GPU-1): NVIDIA GPU GeForce GTX 275 (GT200) at PCI:4:0:0 (GPU-1)

(--) May 07 19:16:14 NVIDIA(GPU-1): Memory: 917504 kBytes

(--) May 07 19:16:14 NVIDIA(GPU-1): VideoBIOS: 62.00.60.00.01

(II) May 07 19:16:14 NVIDIA(GPU-1): Detected PCI Express Link width: 16X

(--) May 07 19:16:14 NVIDIA(GPU-1): Interlaced video modes are supported on this GPU

(--) May 07 19:16:14 NVIDIA(GPU-1): Connected display device(s) on GeForce GTX 275 at PCI:4:0:0:

(--) May 07 19:16:14 NVIDIA(GPU-1):	 none

It is during this process that the device entries are created in /dev. The alternative is to create them yourself at boot time with an init script. There is a sample script in the toolkit release notes that can be used as a starting point. It really should work…
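
For reference, a minimal sketch along the lines of that sample script (it assumes the kernel module is called nvidia and relies on the driver’s character device major number, 195; adjust as needed) looks something like this:

#!/bin/bash
# Load the NVIDIA kernel module
/sbin/modprobe nvidia
if [ "$?" -eq 0 ]; then
	# Count the NVIDIA 3D and VGA controllers lspci can see
	N3D=`/sbin/lspci | grep -i NVIDIA | grep "3D controller" | wc -l`
	NVGA=`/sbin/lspci | grep -i NVIDIA | grep "VGA compatible controller" | wc -l`
	N=`expr $N3D + $NVGA - 1`
	# Create one /dev/nvidiaN node per GPU found
	for i in `seq 0 $N`; do
		mknod -m 666 /dev/nvidia$i c 195 $i
	done
	# Control device used by the driver and the CUDA runtime
	mknod -m 666 /dev/nvidiactl c 195 255
else
	exit 1
fi

Run it as root (or from an init script) before starting anything that uses CUDA.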

Thank you very much! That has indeed helped.

After rewriting the script slightly and putting it in /etc/rc.local, the two cards now seem to work.

However, X11 still does not. So can the Ubuntu Xorg be blamed for this?

Another interesting thing: when I insert a third Quadro FX 5800, the CPU fan spins for just half a second and then the machine does not start, as if the 1 kW power supply were not enough for 3×189 W + CPU (<200 W)…

Regards

I don’t think so; I am using the same Xorg and it works for me. My xorg.conf file looks like this:

Section "Device"

	Identifier	 "Device0"

	Driver		 "nvidia"

	VendorName	 "NVIDIA Corporation"

	BoardName	  "GeForce GTX 275"

	BusID		  "PCI:1:0:0"

	Option		 "Coolbits" "1"

EndSection

Section "Screen"

	Identifier	 "Screen0"

	Device		 "Device0"

	Monitor		"Monitor0"

	DefaultDepth	24

	SubSection	 "Display"

		Depth	   24

		Modes	  "1440x900"

	EndSubSection

EndSection

So I identify the card I want to use for display by its PCI bus ID, and everything just works.
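
If you aren’t sure what to put in BusID, lspci will tell you:

lspci | grep -i nvidia

The only gotcha is that lspci prints the bus number in hexadecimal while xorg.conf expects decimal, so a card reported at 01:00.0 becomes “PCI:1:0:0”, but one at 0a:00.0 would become “PCI:10:0:0”.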

It isn’t the total wattage of the PSU that matters so much; it is the total current available on the 12 V rails that is important. It might well be that the PSU (or the way you are making the external power connections to the cards) has insufficient current available on the 12 V lines.
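
As a rough worked example: three FX 5800s at roughly 189 W each come to about 567 W, drawn almost entirely from 12 V, which is on the order of 47 A before the CPU and motherboard are counted. A 1000 W unit that splits its 12 V capacity across several rails can still come up short on the rail(s) feeding the PCIe power connectors.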