MT25418

3gghead
  6 years ago
  Other (nic chipset)
  Mint (older version)
  Works perfectly
What works:

InfiniBand (IB) Host Channel Adapter; also provides 10GigE and Fibre Channel over Ethernet (FCoE)

PCIe 2.0 x8
InfiniBand Mellanox Technologies MT25418 [ConnectX VPI PCIe 2.0 2.5GT/s - IB DDR / 10GigE] (rev a0)

Provides:
Remote Direct Memory Access (RDMA)
InfiniBand 20-Byte Address Management
IP over IB (connection to native IP services via IB adapter)
NFS, SCSI connections via InfiniBand
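The features above can be checked from userspace. A rough sketch, where the diagnostic tool names (from the usual infiniband-diags/rdma-core packages) and the ib0 interface name are assumptions about a typical install:

```shell
# Hedged sketch: confirm the IB stack is up. Each call falls back to a
# note if the tool is missing, so the script is safe to run anywhere.
for tool in ibstat ibv_devinfo; do
    if command -v "$tool" >/dev/null 2>&1; then
        "$tool"                     # port state, link rate, GUIDs
    else
        echo "$tool not installed"
    fi
done

# IPoIB presents the fabric as an ordinary network interface (ib0 is an
# assumption), so plain IP tooling works on it:
ip -brief link show 2>/dev/null | grep '^ib' || echo "no ib interfaces"
```

Once an IPoIB interface is up, NFS mounts and other IP services run over it unchanged.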

Kernel Modules:
fcoe FCoE
libfcoe FIP discovery protocol and FCoE transport for FCoE HBAs
libfc FC protocol library
scsi_transport_fc FC Transport Attributes
scsi_tgt SCSI target core
nfsd NFS server
nfs NFS client
lockd NFS file locking service version 0.5.
fscache FS Cache Manager
auth_rpcgss RPCSEC_GSS authentication for RPC
nfs_acl NFS ACL protocol support
scsi_dh SCSI device handler
microcode Microcode Update Driver
xfs SGI XFS with ACLs, security attributes, realtime, large block/inode numbers, no debug enabled
ib_ipoib IP-over-InfiniBand net driver
rdma_ucm RDMA Userspace Connection Manager Access
rdma_cm Generic RDMA CM Agent
ib_cm InfiniBand CM
iw_cm iWARP CM
ib_sa InfiniBand subnet administration query support
ib_addr IB Address Translation
ib_umad InfiniBand userspace MAD packet access
ib_uverbs InfiniBand userspace verbs access
mlx4_ib Mellanox ConnectX HCA InfiniBand driver
ib_mad kernel IB MAD API
ib_core core kernel InfiniBand API
mlx4_en Mellanox ConnectX HCA Ethernet driver
async_xor asynchronous xor/xor-zero-sum api
async_memcpy asynchronous memcpy api
async_tx Asynchronous Bulk Memory Transactions API
nbd Network Block Device
mlx4_core Mellanox ConnectX HCA low-level driver

What was done to make it work:

Surprisingly, the drivers and the firmware updater were already in my software repositories. Unfortunately, the package maintainer didn't list dependencies or related packages in the metadata, so it was a matter of filtering the package list to find them, plus some config tweaking to optimize IB packet transfer.
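The package hunt described above might look roughly like this on a Debian/Mint-style system; the search terms are guesses at what the Mellanox driver and firmware-updater packages would match, and the IPoIB tweak is an assumption about which config change was applied:

```shell
# Hedged sketch: filter the repository package list for the IB bits.
if command -v apt-cache >/dev/null 2>&1; then
    apt-cache search --names-only 'mlx4|infiniband|rdma|mstflint'
else
    echo "apt-cache not available on this system"
fi

# One common IPoIB transfer tweak (shown as comments; needs root and
# real hardware): connected mode allows a 65520-byte MTU instead of
# datagram mode's 2044, which helps bulk transfers considerably.
#   echo connected > /sys/class/net/ib0/mode
#   ip link set ib0 mtu 65520
```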

Additional notes:

The card actually has two DDR ports which can be bonded, but a single port already provides network I/O faster than local disk I/O. The cards, cables and switch were dirt cheap at auction. It's hard to imagine what their former owners upgraded to.
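For anyone who does want the second port, a sketch of what the bond could look like on a Debian/Mint-style system (interface names and the address are assumptions); note that IPoIB interfaces can only be bonded in active-backup mode, so this buys failover rather than doubled throughput:

```
# /etc/network/interfaces fragment (hypothetical names and address)
auto bond0
iface bond0 inet static
    address 10.0.0.2
    netmask 255.255.255.0
    bond-slaves ib0 ib1
    bond-mode active-backup
    bond-miimon 100
```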

"Mellanox Technologies" wasn't in the list of hardware vendors at the time of this post.