Ethereal can show a graph of the throughput of a single TCP connection. How is the Y value of each dot calculated?
I know this is an application question, but it seemed more likely that the networking experts would be able to answer this one.
I googled and found similar questions, but never any answers to them.
I figured out the following already:
- every dot represents an incoming packet
- the Y value is NOT the packet size divided by the time elapsed since the last packet arrived.
- the first 20 Y values are "simply" the average so far: Y-value = ((packet number) * 524) / (X-value)
(you may assume that all packets are size 524)
So if the first packet arrived after 1 sec, the Y is 524 Bytes/sec. If the second then arrives at X = 1.44 sec, the Y-value is 2*524 / 1.44 = 727 Bytes / sec.
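To make that running-average formula concrete, here is a small Python sketch using the 524-byte packet size assumed above and the first two arrival times from my table (1.03 s and 1.44 s):

```python
# Running average "so far": Y = (packets seen so far * 524) / arrival time
PACKET_SIZE = 524  # bytes, per the assumption above

def running_average(arrival_times):
    """Return one (time, Y) point per packet, where Y is the average
    throughput from t=0 up to that packet's arrival time."""
    return [(t, (i + 1) * PACKET_SIZE / t)
            for i, t in enumerate(arrival_times)]

for t, y in running_average([1.03, 1.44]):
    print(f"t={t:.2f}s  Y={y:.0f} B/s")
# -> t=1.03s  Y=509 B/s
# -> t=1.44s  Y=728 B/s
```

That gives roughly 509 and 728 B/s, close to the 524 and 730 I read off the screen (the on-screen values are only approximate).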
But the formula results start to differ after about 20 packets. I thought about a sliding window that averages the last 20 packets and tried the following:
etime = arrival time of this packet
stime = arrival time of packet (this packet number - 20)
if packet number > 20:
    Y-value = 20 * 524 / (etime - stime)
Thereby averaging over the interval of the last 20 packets, but that formula also fails when applied to the following values that I see in my graph:
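Written out as a runnable Python sketch (packet size and the arrival times are the ones from my table), the sliding-window attempt gives:

```python
PACKET_SIZE = 524  # bytes, per the assumption above
WINDOW = 20        # packets in the sliding window

# Known arrival times from my table, keyed by 1-based packet number:
arrival = {1: 1.03, 2: 1.44, 3: 2.55, 4: 2.66, 5: 3.06,
           20: 7.29, 21: 7.53, 22: 7.74, 23: 8.11, 24: 8.33, 25: 8.53}

def window_rate(pkt):
    """Y for packet pkt: the last WINDOW packets' bytes divided by
    the time interval those packets spanned."""
    span = arrival[pkt] - arrival[pkt - WINDOW]
    return WINDOW * PACKET_SIZE / span

for pkt in (21, 22, 23, 24, 25):
    print(pkt, round(window_rate(pkt)))
# -> 1612, 1663, 1885, 1848, 1916
```

That predicts 1612, 1663, 1885, 1848, 1916 B/s for packets 21-25, while the graph shows 1462, 1669, 1627, 1674, 1905: some points are close, others are clearly off, which is why I say the formula fails.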
pkt number   arrival time (s)   graph value (B/s, read from screen)
 1           1.03                524
 2           1.44                730
 3           2.55                630
 4           2.66                790
 5           3.06                860
20           7.29               1437
21           7.53               1462
22           7.74               1669
23           8.11               1627
24           8.33               1674
25           8.53               1905
It feels like I'm close. Does anyone know what the actual formula is, or can anyone think of a formula that closely estimates the values I read from the graph?
I'm going to look at the source code now, but that might take me a while.