When designing a network (servers, routers, switches, and so on), how is the required bandwidth calculated?
- Is it calculated from the actual network design that is ready to be implemented?
My understanding is that bandwidth only really becomes an issue across a WAN/Internet link; for purely local connections within the same building, is it mostly down to the capacity of each device (server, router, switch, etc.)?
I'm not sure about links from one building to the next.
Is it something like: 2 + 2 = 4, plus 10% for headroom? A really bad example, I know!
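For what it's worth, the rough idea in the question can be sketched as "sum the peak demands of each application or site, then add a safety margin". This is only an illustrative sketch; the application names, rates, and the 10% headroom figure below are all made-up assumptions, not a standard formula.

```python
# Hypothetical sketch: sum per-application peak demands, add headroom.
# All names and Mbps figures are invented for illustration.

def required_bandwidth_mbps(demands_mbps, headroom=0.10):
    """Return the link capacity to provision: total demand plus a margin."""
    total = sum(demands_mbps.values())
    return total * (1 + headroom)

demands = {
    "email": 2.0,          # assumed peak demand in Mbps
    "file_transfer": 2.0,  # assumed peak demand in Mbps
}

link = required_bandwidth_mbps(demands)
print(f"Provision at least {link:.1f} Mbps")
```

In practice you would base the per-application numbers on measured or estimated peak usage rather than guesses, and real capacity planning also accounts for protocol overhead and growth.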