Hi
I have installed a ConnectX-4 Lx 10Gb card on Windows 10, and an Intel PRO/1000 for comparison.
In Network Connections the Mellanox is described as a 10Gbps connection and the Intel as a 1Gbps connection.
I have used several benchmarking applications to test them with similar results, most recently ntttcp.
The test uses a crossover cable between one port and the other on the same card, the test commands I use are below.
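For anyone who wants the gist without opening the attachment, the commands are of this general form (the IP address, thread count and duration shown here are illustrative placeholders, not my exact values):
    # receiver, listening on the IP assigned to the port under test (placeholder address)
    ntttcp.exe -r -m 8,*,192.168.100.2 -t 60
    # sender on the other port, pointed at the receiver's IP
    ntttcp.exe -s -m 8,*,192.168.100.2 -t 60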
As you can see in the attached text file, the Intel achieves a transfer rate higher than its 1Gb spec at 12812.267 MB/s.
The Mellanox underperforms at 13412.951 MB/s, running at just over 1/10th of its 10Gb capacity.
I have tried:
- adding firewall rules, and disabling the firewall completely
- installing the latest drivers (see driver info below); before this, it actually ran slower than the Intel
- various adapter settings (see the example commands after this list)
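For completeness, the adapter settings I have been experimenting with are the ones exposed through PowerShell, along these lines (the adapter name "Ethernet 3" is a placeholder for the Mellanox port, and the exact display value strings depend on the driver):
    # list the advanced properties the driver exposes
    Get-NetAdapterAdvancedProperty -Name "Ethernet 3"
    # example: change one of them, e.g. Jumbo Packet
    Set-NetAdapterAdvancedProperty -Name "Ethernet 3" -DisplayName "Jumbo Packet" -DisplayValue "9014"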
The card is installed in a PCIe 8.0 GT/s x8 slot, so the slot should provide roughly 63Gb/s (8 GT/s x 8 lanes with 128b/130b encoding, about 7.9 GB/s), far more than the card needs.
Does anyone know what I need to do to get the speed of the Mellanox up to 10Gb/s?
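As a sanity check on the slot, the negotiated PCIe link speed and width can be read back per adapter on Windows 10 (adapter name is a placeholder):
    # shows bus/slot location and the negotiated PCIe link speed and width for the adapter
    Get-NetAdapterHardwareInfo -Name "Ethernet 3"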
See below for the full test results and details of my system and NICs.
Many thanks!
MellanoxTests.txt (7.33 KB)
Hi,
I suggest reviewing the guides below:
Getting started with ConnectX-4 100Gb/s Adapter for Windows
https://community.mellanox.com/s/article/getting-started-with-connectx-4-100gb-s-adapter-for-windows
Getting Started with ConnectX-5 100Gb/s Adapter for Windows
https://community.mellanox.com/s/article/getting-started-with-connectx-5-100gb-s-adapter-for-windows
Windows Performance Tuning
https://docs.mellanox.com/display/winof2v240/Performance+Tuning
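As a rough illustration (not a substitute for the guides), a few of the Windows-side settings they cover can be applied from PowerShell, for example (the adapter name and values here are placeholders; use the values the guides recommend for your setup):
    # jumbo frames on the ConnectX-4 Lx port
    Set-NetAdapterAdvancedProperty -Name "Ethernet 3" -DisplayName "Jumbo Packet" -DisplayValue "9014"
    # make sure RSS is enabled and spread across several queues
    Enable-NetAdapterRss -Name "Ethernet 3"
    Set-NetAdapterRss -Name "Ethernet 3" -NumberOfReceiveQueues 8
    # switch to the High performance power plan so cores do not throttle during the test
    powercfg /setactive SCHEME_MIN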
After applying the same configuration and tuning steps, please share the results.
Thanks,
Samer
Hello,
I have done everything I can do from that guide. The BIOS settings are completely different from my server's BIOS, and the mlxconfig command does not return LINK_TYPE_P1 or LINK_TYPE_P2.
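For reference, the query was of this form (the device name is whatever mst status reports on the machine; the one shown here is a placeholder):
    mst status
    mlxconfig -d mt4117_pciconf0 query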
I did change the Jumbo Packet setting, and doing so increased the speed of my tests to 14419 MB/s (the ntttcp settings recommended in the guide resulted in only 9334 MB/s). But this is still only about 1.4Gb/s, nowhere near 10Gb/s.
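One way to confirm the jumbo setting is actually in effect end to end is a do-not-fragment ping just under the 9000-byte MTU (the peer address here is a placeholder):
    # 8972 = 9000-byte MTU minus 20-byte IP header minus 8-byte ICMP header
    ping -f -l 8972 192.168.100.2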
I am testing the card on a development machine. I need to give the go-ahead for my hosting company to purchase eight of these for my servers, as per their recommendation, but I am concerned about doing so while they are underperforming this badly on my kit.
Please advise.
Hi Fergus,
Please open a support case for further investigation at
networking-support@nvidia.com
Thanks,
Samer
I contacted networking support as you suggested, but they will not deal with me without a contract. I object to paying for support when there is clearly an issue with the Mellanox card/drivers under Windows 10 that is causing this problem.
Please advise.
Thanks,
Fergus
I just tried moving the Mellanox to an x16 PCIe slot (it was in an x8 slot before) and removing the Intel dual-port card and another PCIe card I didn't need.
The results are still bad, at roughly 1.27Gb/s.
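For anyone else following along, the negotiated link speed on the ports can be double-checked from PowerShell with something like:
    # confirm both ports still negotiate at 10 Gbps
    Get-NetAdapter | Format-Table Name, InterfaceDescription, LinkSpeed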