NIC bonding

I wanted to boost the available throughput between my two identical servers. Until now, there has been only a single 1 Gbps link between them.

Getting some 10 Gbps network cards would have been awesome, but they’re rather expensive. Another way to do it is to combine (bond/team) multiple 1 Gbps links. I found some suitable NICs for sale on eBay.

These are Intel PRO/1000 PT Dual Port Server Adapters. They work well with Linux, and they also have good support for jumbo frames.

I installed two of these cards in each server.
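The post doesn’t show the actual configuration, but on a Debian-style system with the ifenslave package, bonding the four new ports might look something like this. The interface names, addresses, and the balance-rr mode are my assumptions, not taken from the post:

```
# /etc/network/interfaces -- a sketch; interface names and addresses assumed
auto bond0
iface bond0 inet static
    address 10.0.0.1
    netmask 255.255.255.0
    bond-slaves eth1 eth2 eth3 eth4
    bond-mode balance-rr     # round-robin: a single TCP stream can use all links
    bond-miimon 100          # check link state every 100 ms
    mtu 9000                 # jumbo frames, which these cards support
```

Worth noting: balance-rr is the only bonding mode that lets a single TCP connection exceed one link’s speed; 802.3ad (LACP) hashes each flow onto a single link instead.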

All those pretty LEDs… =)

And what about the throughput when the bonding is active?

3750 Mbps, or about 470 MB/s, of raw throughput using iperf. That should probably do.
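A measurement like that can be reproduced with iperf; the commands below are a sketch (the peer address and flags are assumptions). The awk line just converts iperf’s decimal megabits per second into MB/s:

```shell
# On one server, start the iperf listener:
#   iperf -s
# On the other, run a multi-stream test against it (address assumed):
#   iperf -c 10.0.0.2 -P 4 -t 30
#
# iperf reports decimal Mbits/sec, so dividing by 8 gives MB/s:
mbps=3750
awk -v m="$mbps" 'BEGIN { printf "%.0f MB/s\n", m / 8 }'   # prints "469 MB/s"
```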

Synchronizing the SSD drives using DRBD shows pretty nice results of around 320 MB/s. Note, however, that this doesn’t reflect the actual performance of the DRBD device itself; it only shows the speed of the initial synchronization.
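The initial sync speed depends heavily on the configured resync rate, which DRBD caps conservatively by default. A DRBD 8.4-style fragment that fixes the resync rate below the bond’s raw throughput might look like this; the 330M figure and the file path are assumptions for illustration:

```
# /etc/drbd.d/global_common.conf -- sketch using DRBD 8.4 syntax
common {
  disk {
    c-plan-ahead 0;       # disable the dynamic resync controller
    resync-rate 330M;     # fixed cap, safely under the bond's raw throughput
  }
  net {
    max-buffers 8000;     # larger buffer pool helps on fast links
  }
}
```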
