# How to calculate delay jitter given delay

Hi experts,

I understand that delay jitter is the variation of delay in network packet measurements.
So given the delay, how do I calculate delay jitter?
If delay jitter is the variance, can I use the usual statistical variance formula var(X) = E(X^2) - E(X)^2?
If my delay is given in milliseconds, what is the unit of delay jitter?

Commented:
Not really. The standard estimator is a simple example of a class of estimators called "recursive prediction error" or "stochastic gradient" algorithms; think of it as a recursive running average. The key point is that nothing is known about absolute transit times: the sender's and receiver's clocks need not be synchronized, so only differences in packet transit times are meaningful, independent of absolute clock values. If packets i and j are stamped with timestamps Si and Sj when they are sent and are received at times Ri and Rj, respectively, then

D(i,j) = (Rj - Sj) - (Ri - Si) = (Rj - Ri) - (Sj - Si)

is the difference in transit times, in timestamp units. Jitter is then a smoothed function of |D(i-1,i)|:

J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16 = 15/16 * J(i-1) + 1/16 * |D(i-1,i)|

(This is the interarrival-jitter estimator defined for RTP in RFC 3550.) Thus jitter is a weighted running average of all the |D(i-1,i)| from the beginning of the measurement up to the current packet, with recent packets weighted more heavily than older ones. The jitter curve needs about 100 packets to stabilize. Since D is computed in the same units as the timestamps, jitter has the same unit as delay — milliseconds in your case.
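To make the recursion concrete, here is a small Python sketch of the estimator above. The send/receive timestamps in the example are made up for illustration; only the update rule itself comes from the formula.

```python
def update_jitter(prev_jitter, d_abs):
    """One step of J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16."""
    return prev_jitter + (d_abs - prev_jitter) / 16.0

def interarrival_jitter(send_times, recv_times):
    """Run the estimator over a whole trace of timestamps (same units, e.g. ms)."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        # D(i-1,i) = (R_i - R_{i-1}) - (S_i - S_{i-1}):
        # difference of transit times, no synchronized clocks required.
        d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        jitter = update_jitter(jitter, abs(d))
    return jitter

# Hypothetical trace: packets sent every 20 ms, received with varying delay.
S = [0, 20, 40, 60, 80]
R = [10, 32, 49, 71, 90]   # per-packet delays: 10, 12, 9, 11, 10 ms
print(round(interarrival_jitter(S, R), 4))  # prints 0.4475 (ms)
```

Note the result is small because the 1/16 smoothing factor makes the estimate converge slowly; with only five packets it is still far below the steady-state value, which is why roughly 100 packets are needed before the curve stabilizes.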
