About Mellanox adapters supporting the GPUDirect Async and GPUDirect Storage technologies

Dear Sir or Madam:

I am a newcomer to the GPUDirect Storage technology, and I have some questions for the experts:

(1) A Mellanox network adapter is a key piece of equipment for this technology, but I found that NVIDIA's website offers two kinds of Mellanox adapters: the Ethernet ConnectX SmartNIC and the InfiniBand VPI adapter, and both are listed as supporting GPUDirect. What is the difference between them? I plan to develop CUDA C code using GPUDirect Async functions, which I understand requires an InfiniBand VPI card. Since I only need to test my code, the cheaper adapter would be enough if it meets my requirements; I would like to save money.

(2) Can I use a Mellanox ConnectX-4 adapter to run the GPUDirect Storage (GDS) technique? I found on NVIDIA's website that GDS requires ConnectX-5 or later.

(3) If I have two computers, one of which has an NVIDIA Quadro GPU and an NVMe disk, each installed in a PCIe x16 slot, can I carry out the GDS technique?
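For context, here is a minimal sketch of the kind of local-read test I intend to run, based on my reading of the public cuFile API (cuFileDriverOpen, cuFileHandleRegister, cuFileRead). I have not been able to run it yet without the hardware, so the file path is hypothetical and the details should be treated as my assumptions:

```cuda
// Untested sketch: DMA a file's contents directly into GPU memory via GDS.
// Assumes libcufile (header cufile.h) and a GDS-capable NVMe filesystem.
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <cuda_runtime.h>
#include <cufile.h>

int main(void) {
    const size_t size = 1 << 20;              // read 1 MiB
    const char *path = "/mnt/nvme/test.dat";  // hypothetical NVMe-backed file

    CUfileError_t status = cuFileDriverOpen();
    if (status.err != CU_FILE_SUCCESS) {
        fprintf(stderr, "cuFileDriverOpen failed\n");
        return 1;
    }

    int fd = open(path, O_RDONLY | O_DIRECT);  // GDS requires O_DIRECT
    if (fd < 0) { perror("open"); return 1; }

    CUfileDescr_t descr = {0};
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
    CUfileHandle_t fh;
    if (cuFileHandleRegister(&fh, &descr).err != CU_FILE_SUCCESS) {
        fprintf(stderr, "cuFileHandleRegister failed\n");
        return 1;
    }

    void *devPtr = NULL;
    cudaMalloc(&devPtr, size);
    cuFileBufRegister(devPtr, size, 0);  // optional, pins the GPU buffer

    // Read straight from the NVMe disk into GPU memory, bypassing the CPU.
    ssize_t n = cuFileRead(fh, devPtr, size, 0 /*file off*/, 0 /*dev off*/);
    printf("cuFileRead returned %zd bytes\n", n);

    cuFileBufDeregister(devPtr);
    cudaFree(devPtr);
    cuFileHandleDeregister(fh);
    close(fd);
    cuFileDriverClose();
    return 0;
}
```

If this sketch is roughly correct, my question is whether it could run at all on the hardware described above.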

Thank you in advance; your answers will help me decide the next steps in my CUDA C coding.

Best Regards,

Li Jian

China University of Geosciences