- General Information
- Manufacturer
- NVIDIA Corporation
- Manufacturer Website Address
- http://www.nvidia.com
- Brand Name
- Mellanox
- Product Line
- ConnectX-6
- Product Type
- InfiniBand Host Bus Adapter
ConnectX-6 Virtual Protocol Interconnect® provides two ports of 200Gb/s InfiniBand and Ethernet connectivity, sub-600ns latency, and 200 million messages per second, delivering the highest-performance and most flexible solution for the most demanding data center applications.
ConnectX-6 is a groundbreaking addition to the Mellanox ConnectX series of industry-leading adapter cards. In addition to all the existing innovative features of past versions, ConnectX-6 offers a number of enhancements to further improve performance and scalability. ConnectX-6 VPI supports HDR, HDR100, EDR, FDR, QDR, DDR and SDR InfiniBand speeds as well as 200, 100, 50, 40, 25, and 10Gb/s Ethernet speeds.
HPC ENVIRONMENTS
Over the past decade, Mellanox has consistently driven HPC performance to new record heights. With the introduction of the ConnectX-6 adapter card, Mellanox continues to pave the way with new features and unprecedented performance for the HPC market.
ConnectX-6 VPI delivers the highest throughput and message rate in the industry. As the first adapter to deliver 200Gb/s HDR InfiniBand, 100Gb/s HDR100 InfiniBand and 200Gb/s Ethernet speeds, ConnectX-6 VPI is the perfect product to lead HPC data centers toward Exascale levels of performance and scalability.
ConnectX-6 supports the evolving Co-Design paradigm, in which the network becomes a distributed processor. With its In-Network Computing and In-Network Memory capabilities, ConnectX-6 offloads even more computation to the network, saving CPU cycles and increasing network efficiency.
ConnectX-6 VPI utilizes both IBTA RDMA (Remote Direct Memory Access) and RoCE (RDMA over Converged Ethernet) technologies, delivering low latency and high performance. ConnectX-6 enhances RDMA network capabilities even further by delivering end-to-end packet-level flow control.
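End-to-end packet-level flow control is, in general terms, a credit-based scheme: a receiver advertises how many packets it can buffer, and the sender stalls rather than dropping when credits run out. The sketch below is a minimal, hypothetical Python model of such a credit scheme; all class names and parameters are illustrative and do not represent ConnectX-6's actual mechanism.

```python
# Minimal sketch of credit-based, end-to-end flow control.
# Purely illustrative; not ConnectX-6 internals.

from collections import deque

class Receiver:
    def __init__(self, buffer_slots):
        self.buffer = deque()
        self.credits = buffer_slots   # one credit per free buffer slot

    def grant_credits(self):
        """Advertise how many packets the sender may still emit."""
        return self.credits

    def accept(self, packet):
        assert self.credits > 0, "sender violated flow control"
        self.buffer.append(packet)
        self.credits -= 1

    def drain(self, n):
        """Consume packets, freeing buffer slots (credits)."""
        for _ in range(min(n, len(self.buffer))):
            self.buffer.popleft()
            self.credits += 1

class Sender:
    def __init__(self, receiver):
        self.receiver = receiver

    def send(self, packets):
        """Send only as many packets as the receiver has credits for."""
        sent = 0
        for p in packets:
            if self.receiver.grant_credits() == 0:
                break                 # stall instead of dropping
            self.receiver.accept(p)
            sent += 1
        return sent

rx = Receiver(buffer_slots=4)
tx = Sender(rx)
print(tx.send(list(range(10))))   # 4: only 4 credits available
rx.drain(2)                       # receiver frees two slots
print(tx.send(list(range(10))))   # 2 more packets fit
```

The key property, mirrored by real credit-based schemes, is that packets are never dropped at the receiver: the sender backs off as soon as credits reach zero.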
MACHINE LEARNING AND BIG DATA ENVIRONMENTS
Data analytics has become an essential function within many enterprise data centers, clouds and Hyperscale platforms. Machine learning relies on especially high throughput and low latency to train deep neural networks and to improve recognition and classification accuracy. As the first adapter card to deliver 200Gb/s throughput, ConnectX-6 is the perfect solution to provide machine learning applications with the levels of performance and scalability that they require.
ConnectX-6 utilizes RDMA technology to deliver low latency and high performance, and enhances RDMA network capabilities even further with end-to-end packet-level flow control.
SECURITY
ConnectX-6 offers a crucial innovation in network security by providing block-level encryption. Data in transit undergoes encryption and decryption as it is stored or retrieved. Encryption and decryption, based on the IEEE XTS-AES standard, are offloaded to ConnectX-6 hardware, reducing latency and freeing CPU cycles. ConnectX-6 block-level encryption offload also enables protection between users sharing the same resources, as different encryption keys can be used.
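The XTS construction referenced above derives a per-sector tweak from the logical block address, so identical plaintext stored at different block addresses encrypts differently, and per-user keys isolate tenants sharing storage. The sketch below models only the XTS tweak schedule (IEEE 1619 style); the AES core is replaced by a toy key-derived byte substitution so the example is self-contained. It is an insecure illustration of the mode's structure, not the ConnectX-6 implementation.

```python
# Toy sketch of the XTS tweak schedule. The "cipher" is a key-derived
# byte permutation standing in for AES -- insecure, illustration only.

import hashlib
import random

def keyed_sbox(key: bytes):
    """Build a key-derived byte permutation and its inverse (toy AES stand-in)."""
    rng = random.Random(hashlib.sha256(key).digest())
    table = list(range(256))
    rng.shuffle(table)
    inv = [0] * 256
    for i, v in enumerate(table):
        inv[v] = i
    return table, inv

def toy_enc(key: bytes, block: bytes) -> bytes:
    return bytes(keyed_sbox(key)[0][b] for b in block)

def toy_dec(key: bytes, block: bytes) -> bytes:
    return bytes(keyed_sbox(key)[1][b] for b in block)

def mul_alpha(t: bytes) -> bytes:
    """Multiply the 128-bit tweak by alpha (x) in GF(2^128), little-endian,
    reduction polynomial x^128 + x^7 + x^2 + x + 1 (as in IEEE 1619)."""
    out = bytearray(16)
    carry = 0
    for i in range(16):
        out[i] = ((t[i] << 1) & 0xFF) | carry
        carry = t[i] >> 7
    if carry:
        out[0] ^= 0x87
    return bytes(out)

def xts_process(key1, key2, sector, data, decrypt=False):
    """XTS over 16-byte blocks: T0 = E_key2(sector), T_{j+1} = alpha * T_j,
    C_j = E_key1(P_j ^ T_j) ^ T_j.  Assumes len(data) % 16 == 0
    (real XTS handles partial blocks via ciphertext stealing)."""
    t = toy_enc(key2, sector.to_bytes(16, "little"))
    core = toy_dec if decrypt else toy_enc
    out = bytearray()
    for off in range(0, len(data), 16):
        block = data[off:off + 16]
        x = bytes(a ^ b for a, b in zip(block, t))
        out += bytes(a ^ b for a, b in zip(core(key1, x), t))
        t = mul_alpha(t)
    return bytes(out)

if __name__ == "__main__":
    key1, key2 = b"user1-data-key", b"user1-tweak-key"
    data = b"sixteen-byte-blk" * 2
    ct = xts_process(key1, key2, 42, data)
    print(xts_process(key1, key2, 42, ct, decrypt=True) == data)  # True
```

Note how the sector number enters only through the tweak: the same data written to a different block address yields different ciphertext, and a tenant with a different key pair produces unrelated ciphertext for the same blocks.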
By performing encryption in the adapter, ConnectX-6 also renders encryption unnecessary elsewhere in the network, such as in storage. Moreover, ConnectX-6 supports Federal Information Processing Standards (FIPS) compliance, alleviating the systemic need for self-encrypted disks. With this capability, customers are free to choose their preferred storage devices, including byte-addressable devices and NVDIMMs that would otherwise be used without encryption.
- Technical Information
- Total Number of InfiniBand Ports
- 1
- Host Interface
- PCI Express 4.0 x16
- Media & Performance
- Data Transfer Rate
- 200 Gbit/s
- I/O Expansions
- Expansion Slot Type
- QSFP
- Physical Characteristics
- Form Factor
- Plug-in Card
- Miscellaneous
- Environmentally Friendly
- Yes