Why do servers perform more slowly when NICs are set to 100 Mbps than to 10 Mbps?
Posted on 2006-11-22
I recently experienced a very frustrating situation involving poor communication between several of my Windows 2003 servers. It drastically slowed down my application to the point where it barely worked. In the end it turned out to be some incompatibility between my new servers and my old HP 10/100 switch. I deployed a new Alcatel 10/100/1000 switch and everything is lightning fast now.
In attempting to resolve the problem, it became apparent that when I set the NICs on the servers to 10 Mbps, communication was better than when they were set to 100 Mbps, and this was true even on the new Alcatel switch. The best setup, and the one I'm using now, is with the NICs set to Auto Detect.
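In case it helps anyone reproduce what I was seeing: below is a rough sketch of the kind of check I was running while testing the different speed settings. It simply polls Windows' `netstat -e` and reports how much the Errors and Discards counters climb while traffic is flowing (the parsing assumes the standard English output format of `netstat -e`; this is just an illustrative sketch, not anything official).

```python
# Rough diagnostic sketch: poll "netstat -e" on Windows and report how much
# the Errors/Discards counters grow while the link is under load.
# Assumes the standard English output format of netstat -e.
import subprocess
import time

def read_counters():
    """Return {counter_name: (received, sent)} parsed from netstat -e."""
    output = subprocess.run(["netstat", "-e"],
                            capture_output=True, text=True).stdout
    counters = {}
    for line in output.splitlines():
        parts = line.split()
        # Lines of interest look like: "Errors    0    0"
        if len(parts) >= 3 and parts[0] in ("Errors", "Discards"):
            counters[parts[0]] = (int(parts[1]), int(parts[2]))
    return counters

if __name__ == "__main__":
    before = read_counters()
    time.sleep(30)  # let some traffic flow, e.g. copy a large file meanwhile
    after = read_counters()
    for name in before:
        delta_rx = after[name][0] - before[name][0]
        delta_tx = after[name][1] - before[name][1]
        print(f"{name}: +{delta_rx} received, +{delta_tx} sent over 30s")
```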
Anyway, I ended up calling Microsoft to help resolve the issue, and after it was "fixed" with the new switch, I asked why it would be slower to set the NICs to 100 Mbps than to 10 Mbps. I got a very vague answer that really didn't explain it.
So can anyone tell me why this would happen?