no data is being passed. The activity LED blinks when data is being passed. If the LEDs are not
active, either the physical or the logical connection (or both) has not been established.
Port 1 LEDs:
Physical Link - Green
Data Activity - Yellow
Port 2 LEDs:
Physical Link - Green
Data Activity - Yellow
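The link states that the LEDs report can also be cross-checked in software. The sketch below shows one way to do so under Linux, assuming the kernel InfiniBand stack exposes each port's `phys_state` and `state` files under `/sys/class/infiniband/<hca>/ports/<n>/`; the sample values here stand in for the contents of those files on a healthy port.

```shell
#!/bin/sh
# Sketch only: on a live system, read the two sysfs files for the port, e.g.
#   phys_state=$(cat /sys/class/infiniband/mthca0/ports/1/phys_state)
#   state=$(cat /sys/class/infiniband/mthca0/ports/1/state)
# The literal values below are illustrative stand-ins for those files.
phys_state="5: LinkUp"   # physical link, mirrors the green Physical Link LED
state="4: ACTIVE"        # logical link, requires a running subnet manager

phys_ok=0; logical_ok=0
case "$phys_state" in *LinkUp*) phys_ok=1 ;; esac
case "$state"      in *ACTIVE*) logical_ok=1 ;; esac

if [ "$phys_ok" -eq 1 ] && [ "$logical_ok" -eq 1 ]; then
  echo "link up and active"
elif [ "$phys_ok" -eq 0 ]; then
  echo "no physical link - check the cable and the far-end port"
else
  echo "physical link up but logical link down - check the subnet manager"
fi
```

A port can show a physical link (LinkUp) while the logical state stays down; that matches the manual's point that the two LEDs report independent conditions.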
8.7 Mellanox Memory Free PCI-Express HCA (SDR)
The Mellanox Memory Free PCI-Express HCA supports InfiniBand protocols. It is a single data
rate (SDR) card with one 4X InfiniBand 10 Gb/s port. Figure 8-7 shows the Mellanox Memory
Free PCI-Express HCA.
Figure 8-7 Mellanox Memory Free PCI–Express HCA (SDR)
Features of the Mellanox Memory Free PCI-Express HCA include:
• PCI-Express x8 version 1.0a compatible card
• Single-port 4X InfiniBand Version 1.2 compatible Host Channel Adapter
• InfiniBand-compatible Verbs API for both Linux and Windows operating systems
• 4X (10 Gb/s) InfiniBand port with standard copper connector
• Hardware support for up to 16 million QPs, EEs, and CQs
• Memory protection and translation tables fully implemented in hardware
• InfiniBand native layer 4 DMA hardware acceleration
• Multicast support
• Programmable MTU size from 256 bytes to 2 KB
• Four virtual lanes supported, plus a management lane
• Support for InfiniBand transport mechanisms (UC, UD, RC, RAW)
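As a quick sanity check that an installed card reports the port attributes listed above, the sketch below parses sample output in the style of the `ibstat` diagnostic utility (on a live system you would run `ibstat` directly). The device name `mthca0` and the field values in the heredoc are illustrative assumptions chosen to match the SDR card described here.

```shell
#!/bin/sh
# Sample output in ibstat style; a real run would capture: sample=$(ibstat)
sample=$(cat <<'EOF'
CA 'mthca0'
        Port 1:
                State: Active
                Physical state: LinkUp
                Rate: 10
                Link layer: InfiniBand
EOF
)

# Extract the port rate and logical state from the "Key: Value" lines.
rate=$(printf '%s\n' "$sample" | awk -F': ' '/Rate:/ {print $2}')
state=$(printf '%s\n' "$sample" | awk -F': ' '/State:/ {print $2}')

echo "port rate: ${rate} Gb/s, state: ${state}"
# prints: port rate: 10 Gb/s, state: Active
```

A rate of 10 Gb/s confirms the SDR 4X link described in this section; a DDR card in the same slot would instead report 20.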