Difference between datacenter Gigabit Ethernet and Fast Ethernet
Posted on 2003-03-05
We are considering relocating our servers to a new datacenter.
Initially we will rent a 100Mbps Fast Ethernet port which will be more than enough for our needs right now.
However, when we approach the 100 Mbps limit of this port we will of course want to purchase more bandwidth.
This is where the figures don't conform to the normal economies of scale that you would expect.
For example, we can get 100 Mbps on a Fast Ethernet port for 6000 Euro per month. If we exceed this limit, we will need to either rent another full 100 Mbps Fast Ethernet port or move up to Gigabit Ethernet.
For 100 Mbps delivered on a Gigabit Ethernet port, the price is 8000 Euro per month, i.e. 80 Euro per Mbps. A 400 Mbps commit costs 70 Euro per Mbps, which is still 10 Euro per Mbps more than the 60 Euro per Mbps of the initial 100 Mbps Fast Ethernet connection.
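The per-Mbps figures above can be worked out explicitly. A quick sketch, using only the prices quoted in this post (the variable names are mine, purely for illustration):

```python
def cost_per_mbps(monthly_cost_eur, committed_mbps):
    """Monthly port price divided by the committed bandwidth."""
    return monthly_cost_eur / committed_mbps

# 100 Mbps on a Fast Ethernet port: 6000 Euro / month
fast_e = cost_per_mbps(6000, 100)   # 60 Euro per Mbps

# The same 100 Mbps commit on a Gigabit Ethernet port: 8000 Euro / month
gig_e = cost_per_mbps(8000, 100)    # 80 Euro per Mbps

# Relative premium for buying the same bandwidth on the bigger port
premium = (gig_e - fast_e) / fast_e  # about 0.33, i.e. the 33% mentioned below

# Even the 400 Mbps commit at 70 Euro / Mbps stays above the Fast Ethernet rate
gap_at_400 = 70 - fast_e             # 10 Euro per Mbps
```

So even at four times the committed bandwidth, the Gigabit pricing never drops below the per-Mbps rate of the plain Fast Ethernet port.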
Can anyone explain why this is? Why should 100 Mbps cost 33% more on a Gigabit circuit than it does on a Fast Ethernet connection? Bear in mind this is a charge for raw bandwidth only; rack rental, switches, etc. are billed separately.
I've asked the quoting company too, but I'd also like some outside feedback on this.
Isn't 100 Mbps still 100 Mbps, no matter what kind of equipment it goes through?