Hi all, this is my first foray into datacenter switches. I have fairly basic skills when it comes to switches, enough to keep a single building running well, but nothing major.
We use Microsoft Storage Spaces Direct, and Microsoft have informed me I need to move to RDMA to fix my performance issues. This led me to buy some second-hand hardware to test it out before spending big bucks on new kit. I do plan to read into this as much as possible, so any good links would be appreciated.
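For anyone following along: I found you can check from PowerShell whether Windows actually sees the adapters as RDMA-capable, before and after swapping hardware. A rough checklist I'm using (cmdlet output will obviously depend on your adapters):

```powershell
# List network adapters and whether RDMA is enabled on each
Get-NetAdapterRdma

# Check which interfaces SMB (and therefore S2D) considers RDMA-capable
Get-SmbServerNetworkInterface

# After generating some storage traffic, live SMB connections show up here;
# RDMA-backed ones should be flagged as such
Get-SmbMultichannelConnection
```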
First off, I am trying to get 2x M4001T switches online. I have installed the switch modules into the C1/C2 slots in my chassis, inserted a blade, and installed the software. I then ran OpenSM for both ports, which shows the output below (identical for both GUIDs):
```
C:\Windows\system32>opensm -g 0xe41d2d0300e6bb92 -B
OpenSM 3.3.11 UMAD
Command Line Arguments:
Log File: %windir%\temp\osm.log
OpenSM 3.3.11 UMAD
Entering DISCOVERING state
Entering MASTER state
```
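One thing I learned while digging: "Entering MASTER state" only means this OpenSM instance won the election for its subnet; it doesn't guarantee the physical ports came up. If your Mellanox driver install ships the usual command-line diagnostics (tool availability may vary by WinOF version), you can check the local port state directly:

```
REM Show local HCA ports, their GUIDs and link state
REM (healthy ports report PORT_ACTIVE / LinkUp; PORT_DOWN means no link at all)
vstat

REM If the OFED-style tools are installed instead, this shows similar info
ibstat
```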
So I thought this would mean my switches would now be online and working? However, I have 2x NICs (CX324A) in my Dell R730s, connected to the M4001T switches with 40Gb cables, and they are still showing as "not connected". I have the software installed on every OS, but OpenSM is not running on the R730s, as I read the subnet manager should only run on one server.
Is there a way for me to get an IP address for my switch? Also, how can I bring these ports online so I can start using the 40Gb network?
Thanks a lot in advance for any help. It is appreciated!
Edit - after looking into it, I believe I have IB NICs for the blades but Ethernet-only NICs for the servers, which is why they aren't showing a connection?
I have ordered 2x Mellanox MCX354A-FCBT cards for my servers instead, and then I will try the configuration again.
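Worth noting for anyone with the same cards: the MCX354A-FCBT is a ConnectX-3 VPI card, so each port can run as either InfiniBand or Ethernet. If the new cards arrive set to the wrong mode, the port type can be changed with mlxconfig from Mellanox's MFT tools (the device name below is just an example; run `mst status` first to find yours, and on Windows it usually looks like `mt4099_pciconf0`):

```
REM Find the MST device name for the card
mst status

REM Force both ports to InfiniBand (1 = IB, 2 = ETH, 3 = VPI auto-sense)
mlxconfig -d mt4099_pciconf0 set LINK_TYPE_P1=1 LINK_TYPE_P2=1

REM Reboot (or restart the driver) for the new link type to take effect
```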
I am still having issues getting ports 1 and 2 to stay online across the two blades, though. Port 1 will come online and ping, but port 2 seems to stay offline.
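One theory I'm going to test for the port 2 issue: with two separate switches, each one is its own IB subnet, so each fabric needs its own subnet manager instance bound to the port GUID on that fabric. I also noticed both of my opensm instances default to the same log file, so I'll split them to confirm the second one actually reaches MASTER (the first GUID is my real port 1; the second is a placeholder for your own port 2 GUID):

```
REM One opensm instance per fabric, each bound to one local port GUID,
REM each with its own log so the two copies don't overwrite each other
opensm -g 0xe41d2d0300e6bb92 -B -f %windir%\temp\osm-port1.log
opensm -g <port-2-GUID> -B -f %windir%\temp\osm-port2.log

REM Then check each log for "Entering MASTER state"
```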