Tesla question

Hi,
I’d appreciate your input on this. Suppose I get, for example, a 20x performance gain (over one CPU core) by using a GPU and want
to use the Tesla solution. I understood that I’d need two server machines so that each can connect to 2 of the cards over PCI-E.
Is that true? If so, the cost benefit is quite low, no? Is there a way to connect a Tesla to a single host? How can the performance boost
be utilized in a more reasonable way?

thanks
eyal

The S1070 has 2 PCI-E connectors and can be connected to 1 or 2 host PCs.

The S1075 has 1 PCI-E connector and can be connected to only 1 host (at the cost of bandwidth between host & device, of course).
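The bandwidth difference is easy to measure yourself. Here is a minimal sketch (standard CUDA runtime calls only, nothing Tesla-specific; the buffer size and iteration count are arbitrary) that times a pinned host-to-device copy and reports the effective PCI-E bandwidth seen by one GPU:

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    const size_t bytes = 64 << 20;   // 64 MB per transfer (arbitrary)
    const int    iters = 20;         // arbitrary repeat count

    void *h_buf = 0, *d_buf = 0;
    cudaMallocHost(&h_buf, bytes);   // pinned host memory for realistic numbers
    cudaMalloc(&d_buf, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    for (int i = 0; i < iters; ++i)
        cudaMemcpy(d_buf, h_buf, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    double gib = (double)bytes * iters / (1024.0 * 1024.0 * 1024.0);
    printf("host->device: %.2f GB/s\n", gib / (ms / 1000.0));

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_buf);
    cudaFreeHost(h_buf);
    return 0;
}

Running this against each GPU in turn should show how much headroom is lost when all four GPUs share a single PCI-E connector compared with the two-connector hookup.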

Hi,

How do I connect the S1070 to one host? Does the host need to be special, i.e. have two PCI-E slots? Do you have a recommendation for such a machine?

Anyway, what you’re saying is that, at the cost of bandwidth, I can connect 4 GPUs to one host?

Thanks for the assistance :)

eyal

Your computer needs to have 2 free PCI-E x8 or x16 slots to connect 1 S1070 to it. There is a list of validated computers for the Tesla series on the website.
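On the software side, once the S1070 is connected the host simply sees its GPUs as ordinary CUDA devices. A minimal sketch (standard CUDA runtime calls only) to list what the host can see; with both connectors attached to one machine, the unit's four Tesla GPUs should all show up:

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("%d CUDA device(s) visible to this host\n", count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("  device %d: %s, %.0f MB global memory\n",
               i, prop.name, prop.totalGlobalMem / (1024.0 * 1024.0));
    }

    // To use a particular GPU, a host thread selects it with cudaSetDevice(i)
    // before allocating memory or launching kernels on that device.
    return 0;
}

To keep all four GPUs busy you run one host thread (or process) per device, each calling cudaSetDevice() with a different index.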

There’s an S1075 now? I can’t find any mention of it on the Tesla site. (The option to multiplex 4 cards on one PCI-E slot is kind of handy.)

If you google it, you will find that it is mentioned on numerous NVIDIA sites apart from nvidia.com; there are even sites where you can pre-order them.

Hi!

Is it possible to install two C1060s in one “compatible platform” (see Systems for use with Tesla C1060)?

Thanks!

No, except where noted. (Well, there’s a way to make the T7400 work even with its lack of power connectors, but it’s such a hilariously bad hack that I can’t in good faith recommend it to anyone.)

So, there is no other brand-name solution for two C1060s except the HP xw8600?