
How do I calculate delay jitter given delay measurements?

Hi experts,

I understand that delay jitter is the variation of delay in network packet measurements.
So, given the delay, how do I calculate delay jitter?
If delay jitter is the variance, can I use the usual statistical variance formula var(X) = E(X^2) - E(X)^2?
If my delay is given in milliseconds, what is the unit of delay jitter?
1 Solution
Not really. The actual function is a simple example of a class of estimators called "recursive prediction error" or "stochastic gradient" algorithms. Think of it as a recursive running average.

To calculate it, remember that nothing is known about absolute transit times: the differences in packet transit times are independent of absolute clock values. If packets i and j are stamped with timestamps S_i and S_j when they are sent, and are received at times R_i and R_j respectively, then

D(i,j) = (R_j - S_j) - (R_i - S_i) = (R_j - R_i) - (S_j - S_i)

is the difference in transit times, in timestamp units. Jitter is then a smoothed function of |D(i-1,i)|:

J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16 = (15/16) * J(i-1) + (1/16) * |D(i-1,i)|
Thus, jitter is a weighted running average of all the |D(i-1,i)| values from the beginning of the measurement up to the current packet, with recent packets weighted more heavily than older ones. The jitter curve needs about 100 packets to stabilize. And because jitter is an average of absolute delay differences, it has the same unit as the delay itself: if your delays are in milliseconds, the jitter is in milliseconds too.
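To make the recursion concrete, here is a minimal Python sketch of the estimator described above (this is the same interarrival-jitter formula RTP uses). The function name jitter_series and the toy timestamps are my own illustrative assumptions, not from any library; only the formula comes from the answer.

```python
def jitter_series(send_times, recv_times):
    """Return the running jitter estimate after each packet.

    send_times, recv_times: per-packet timestamps in the same unit
    (e.g. milliseconds). The result is in that same unit, because
    jitter is a smoothed average of absolute delay differences.
    """
    jitter = 0.0
    estimates = []
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s  # any constant clock offset cancels in the difference below
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # |D(i-1,i)|
            jitter += (d - jitter) / 16.0    # J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16
            estimates.append(jitter)
        prev_transit = transit
    return estimates


if __name__ == "__main__":
    # Toy data: packets sent every 20 ms, received with varying delay.
    send = [i * 20.0 for i in range(10)]
    recv = [s + d for s, d in zip(send, [5, 7, 6, 9, 5, 8, 6, 7, 10, 6])]
    for i, j in enumerate(jitter_series(send, recv), start=2):
        print(f"after packet {i}: jitter = {j:.3f} ms")
```

Note that the sender's and receiver's clocks need not be synchronized: a constant offset between them drops out of D(i,j), which is exactly why the formula works on timestamp differences rather than absolute delays.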