Multiple Monitor Issue in Vista and Win7: Can there be a driver enhancement?

I have not searched through the CUDA Zone archives on this issue, and some out there will be unhappy that I am bringing this up again. Sorry.

I participate in the Folding@Home project, running mostly with 9800GX2 hardware. I recently moved my main system from XP to Win7 x64. This system has dual 9800GX2s and only a single monitor on a CrassFire MB (sp intentional). Under XP everything ran well, but with the move to Win7 a card is not active until a physical monitor is attached to it. This has dropped me down to running with a single GPU. With some (but not all) of the projects, I can get the second GPU running on the primary card by enabling the on-card SLI and specifying in the F@H clients to force the use of the second GPU.
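(For anyone who wants to check this on their own box: a small enumeration program like the sketch below will show which GPUs the driver is actually exposing to the CUDA runtime. This is just my own quick diagnostic using the standard runtime API, nothing F@H-specific; consistent with the above, my Win7 system only lists the card that has the monitor.)

```cpp
// Quick diagnostic sketch: list the GPUs the CUDA runtime can see.
// Under Win7, a GPU with no monitor (real or dummy) attached may not
// appear here at all, which is exactly the problem described above.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("CUDA sees %d device(s)\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, i) == cudaSuccess) {
            printf("  device %d: %s (%d MB)\n", i, prop.name,
                   (int)(prop.totalGlobalMem / (1024 * 1024)));
        }
    }
    return 0;
}
```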

Because my MB is not an SLI board, there is no way I can get the second card running.

The solution everyone seems to be using is creating a “dummy” load for the DVI port that fools the card into believing that a non-PnP monitor is attached. I don’t really want to attach something homemade like that to my video cards.

Can a driver switch be added to the nVidia drivers to “lie” to Vista and Win7 and tell the OS that a monitor is present even if one is not?

By default the drivers should work as they currently do: if there is no monitor attached, the OS can turn off the card. This will meet the requirements of WHQL certification.

But,

If the option were enabled (something like “Enable CUDA functionality when monitor is not present”), then the driver would sense that no monitor is present and tell the OS that a non-PnP device is attached.

This would be an elegant solution that would return the functionality that we had under XP.

Thank you,

    • Steve Mizera - Silverado Canyon, CA

Wow, no response from either the CUDA team or the user base.

Normally I at least get someone telling me why that would be a bad idea, or questioning my genealogy.

    • Silveradocyn

seriously, it’s all about the dummy DVI plug, that does EXACTLY what you want to do. it’s one wire, do your 1 minute research into how to make it, take your time and nothing can go wrong. the reason it isn’t in the drivers is because it is a rare situation you are in, most people actually have monitors plugged into their graphics cards.

See, I knew someone would pipe in to tell me I was wrong to ask for an enhancement.

As I currently run two 9800GX2s in each of two systems, that is eight GPUs; I would need 8 monitors on my desktop if I were relying on attached monitors alone.

I don’t like using something like the dummy plug, because as soon as there is an issue with a card, someone will tell me that the use of a dummy plug has voided my warranty. Yes it is simple, yes I have done the research, but no, I would rather not use a hack like that to get around a software issue.

I don’t feel that I am all that special of a case in the CUDA world. It just seems that it would be an easy and elegant solution to add a checkbox to allow Vista and Win7 to work as XP has been doing for years.

    • Silveradocyn

well, that’s up to you. i can’t see anything changing; remember that vista introduced a whole new driver model, so graphics cards get treated differently than in xp.

it looks like it’s either use the dummy monitor or go back to xp.

There’s no real issue with the dummy plugs. All they do is add a little resistance into the port. Wired up correctly (and there are plenty of pictures around to show you how it’s done) using a DVI-VGA adapter, there is absolutely no way they can damage the card. In fact, although it’s a niche market, I suspect it won’t be long until you can buy them pre-made. Would you be prepared to plug one in if it came from a shop?

The limitation you are seeing is due to changes in the way the Operating System handles hardware and is not something the drivers can compensate for.

I would feel much better if it came from a shop with someone backing the product.

The Operating System does not handle hardware; that is all done by the driver. The OS just responds to what the driver tells it about the hardware.

Vista and Win7 only know that the monitor is not present if the driver tells the OS that a monitor is not present. To meet Microsoft WHQL specs, the driver is required to report that a monitor is not present, but my request is that there be an option in the driver to “trick” the OS. Of course this could be done in the driver. The bigger trick is the politics of not mucking up the MS WHQL certification.

    • Silveradocyn

This already exists, but it’s hidden behind horrible registry editing:

[url="http://forums.nvidia.com/index.php?s=&showtopic=91614&view=findpost&p=517428"]The Official NVIDIA Forums | NVIDIA[/url]

hi steve,

the combination of 64-bit vista and multiple GPUs is a tricky and somewhat unstable one, as multiple posts in these forums show – particularly if combined with CUDA. that isn’t an issue for you, but for F@H, the plugs do seem to be necessary (see [url="http://foldingforum.org/viewtopic.php?f=38&t=8503"]http://foldingforum.org/viewtopic.php?f=38&t=8503[/url] for a discussion). fortunately, they’re really not that hard to make. you don’t need a soldering iron; just buy DVI->VGA adapter plugs (about $7-9 each at Radio Shack), one for each DVI connector except the one driving your display (i assume you just have a single monitor), put three 68 ohm resistors (also from RS) on the back of each one (google “VGA dummy” for the pin pattern, or see the sketch below), and plug them into your cards at the back.
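in case you’d rather not google it, this is the usual “VGA dummy” pattern people describe (from memory - please double-check it against the foldingforum thread before you wire anything up): one 68 ohm resistor bridging each colour line to its return on the VGA side of the adapter.

```
DVI->VGA adapter, VGA (HD15) side:

  pin 1 (Red)   --[68 ohm]-- pin 6 (Red return)
  pin 2 (Green) --[68 ohm]-- pin 7 (Green return)
  pin 3 (Blue)  --[68 ohm]-- pin 8 (Blue return)
```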

HTH.

-drh1

TMurry,

Thank you; I will start experimenting with the Reg values to figure out if there is a rhyme or reason to the implementation.

(I know there has to be a way to fix the software issue in software! I would rather deal with a software kludge than a hardware kludge.)

    • Silveradocyn

cleaning up this thread - please don’t post nonsense in the CUDA forums.