Multi-GPU Peer-to-Peer problem on Tesla C2070 cards

Hi all,
I have a problem with a multi-GPU implementation using a peer-to-peer (P2P) strategy on 4 Tesla C2070 (Fermi) cards. With this configuration, P2P communication is only possible between cards 0<->1 and 2<->3 (can anybody confirm this?); it is not possible to enable communication between cards 1 and 2.
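
For context, a minimal way to query which pairs the driver reports as peer-capable might look like the following (a sketch using the standard cudaDeviceCanAccessPeer runtime call; error checking omitted):

#include <cstdio>
#include <cuda_runtime.h>

int main(void)
{
    int n = 0;
    cudaGetDeviceCount(&n);

    // Ask the driver which ordered device pairs are peer-capable.
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
            if (i == j) continue;
            int can = 0;
            cudaDeviceCanAccessPeer(&can, i, j);
            printf("GPU %d -> GPU %d : %s\n", i, j,
                   can ? "P2P capable" : "NOT capable");
        }
    }
    return 0;
}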

Can anybody help me with this problem? Is there some parameter I need to set to enable cards 1 and 2 to communicate?

Thanks in advance.

I can get 0<->1,2,3; 1<->0,2,3; 2<->0,1,3; 3<->0,1,2 communication using 4 GPUs… I’m wondering if the problem is that your motherboard has two different I/O hubs (IOHs): P2P only works between GPUs sitting under the same IOH / PCIe root complex, which would produce exactly the symptoms you described.

Try connecting 0<->2 and 0<->3; if those don’t work either, it’s probably what I said. A quick pairwise test like the sketch below should confirm it.
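
In case it’s useful, here is roughly what such a pairwise test could look like (a minimal sketch against the CUDA 4.0 runtime API; testPair, the chosen pairs, and the buffer size are just illustrative, and error checking is mostly omitted):

#include <cstdio>
#include <cuda_runtime.h>

// Try to enable peer access between two devices and run one peer copy.
int testPair(int devA, int devB)
{
    int can = 0;
    cudaDeviceCanAccessPeer(&can, devA, devB);
    if (!can) {
        printf("GPU %d cannot access GPU %d as a peer\n", devA, devB);
        return 0;
    }

    const size_t bytes = 1 << 20;
    void *bufA, *bufB;

    cudaSetDevice(devA);
    cudaDeviceEnablePeerAccess(devB, 0);   // flags must be 0
    cudaMalloc(&bufA, bytes);

    cudaSetDevice(devB);
    cudaDeviceEnablePeerAccess(devA, 0);
    cudaMalloc(&bufB, bytes);

    // Direct peer copy; with peer access enabled this goes over PCIe
    // without staging through host memory.
    cudaError_t err = cudaMemcpyPeer(bufB, devB, bufA, devA, bytes);
    printf("GPU %d <-> GPU %d : %s\n", devA, devB, cudaGetErrorString(err));

    cudaFree(bufB);
    cudaSetDevice(devA);
    cudaFree(bufA);
    return err == cudaSuccess;
}

int main(void)
{
    // The pairs suggested above; adjust to match your topology.
    testPair(0, 2);
    testPair(0, 3);
    return 0;
}

Note that cudaMemcpyPeer still works even when peer access cannot be enabled for a pair; the runtime just stages the copy through host memory, which is slower.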

Hi sBc-Random,

thanks for your reply… Do you have 4 Tesla C2070 cards too? I will certainly test the connections 0<->2 and 0<->3… If your guess turns out to be right, what do you suggest to solve this problem?

Regards.