My employer has asked me to check the effective data rates available to users on an 802.3 network. The numbers below are made up; they have no meaning, they're just fillers.
Say that, on average, reading a single email message via web mail makes HTTP send 1 packet from client to server with 80 bytes of data and receive one packet back with 910 bytes. If, on average, one new TCP connection must be established for every email message read, what is the effective data rate of the 802.3 network when it is used only for web-based email?
How do I explain this in steps that a user will understand? Can anyone walk me through the math?
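Here is how I have been counting packets for a single message read so far, as a minimal Python sketch. The 3-packet TCP open, the 4-packet TCP close, and ignoring separate ACKs for the data packets are my assumptions, not something given in the problem:

```python
# Packet count for reading ONE email over a fresh TCP connection.
# My assumptions (not in the problem): 3-packet open (SYN, SYN-ACK, ACK),
# 4-packet close (FIN, ACK, FIN, ACK), no separate ACKs for the data.
tcp_open = 3       # SYN, SYN-ACK, ACK
http_request = 1   # carries 80 bytes of data, client -> server
http_response = 1  # carries 910 bytes of data, server -> client
tcp_close = 4      # FIN, ACK, FIN, ACK

print(tcp_open + http_request + http_response + tcp_close)  # 9 packets
print(80 + 910)                                             # 990 data bytes
```

The step questions I need to answer are: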
What is the total # of packets sent?
What is the total # of data bytes sent?
What is the overhead of Ethernet? 26 bytes
What is the overhead for TCP? 20 bytes
What is the overhead for IP? 20 bytes
What is the minimum frame size for Ethernet? 64 bytes
What is the total overhead?
What is the effective data rate?
I am not sure how to answer these in user terms with simple math; my rough attempt is below. Can anyone check it? Thanks in advance!
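Putting it all together, here is a sketch of the whole calculation in Python so a user can follow it line by line. The 10 Mbps link rate is a placeholder I picked, and the padding rule (short frames are padded so header + payload + FCS reaches the 64-byte minimum, with the 8-byte preamble inside the 26-byte figure not counting toward that minimum) is my reading of the spec, so please correct anything that's off:

```python
# A sketch of the whole calculation. Assumptions beyond the problem statement:
# 9 packets per message (3-packet open, 2 data packets, 4-packet close),
# a 10 Mbps link (placeholder -- substitute your real 802.3 speed), and
# the 26-byte Ethernet overhead read as 8 preamble + 14 header + 4 FCS,
# with the 64-byte minimum applying to header + payload + FCS (no preamble).

ETH_HEADER = 14     # bytes: destination, source, type/length
ETH_FCS = 4         # bytes: frame check sequence
ETH_PREAMBLE = 8    # bytes: preamble + start frame delimiter
IP_HEADER = 20      # bytes
TCP_HEADER = 20     # bytes
MIN_FRAME = 64      # bytes: header + payload + FCS, preamble excluded
LINK_RATE_BPS = 10_000_000  # placeholder 10 Mbps 802.3 link

def wire_bytes(app_data: int) -> int:
    """Bytes actually on the wire for one packet carrying app_data bytes."""
    frame = ETH_HEADER + IP_HEADER + TCP_HEADER + app_data + ETH_FCS
    pad = max(0, MIN_FRAME - frame)       # pad short (control) frames
    return ETH_PREAMBLE + frame + pad

# One email read: 7 control packets (open + close) and 2 data packets.
packets = [0] * 7 + [80, 910]             # app-data bytes in each packet
total_wire = sum(wire_bytes(d) for d in packets)
useful = 80 + 910

print(f"total packets:  {len(packets)}")          # 9
print(f"useful bytes:   {useful}")                # 990
print(f"overhead bytes: {total_wire - useful}")   # 636
print(f"bytes on wire:  {total_wire}")            # 1626
efficiency = useful / total_wire
print(f"efficiency:     {efficiency:.1%}")        # ~60.9%
print(f"effective rate: {efficiency * LINK_RATE_BPS / 1e6:.2f} Mbps")  # ~6.09
```

With these filler numbers it comes out to roughly 61% efficiency, i.e. about 6.1 Mbps effective on a 10 Mbps link, but what I care about is whether the method is right, not the numbers.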