I have a VB.NET Timer control and set Timer1.Interval to 1000 ms (1 second), yet the Tick event actually fires every 1.46 seconds!
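For concreteness, here's a minimal sketch of the kind of setup I mean (assuming a standard Windows Forms project where Timer1 was dropped onto Form1 in the designer; the Stopwatch is only there to measure the real gap between Tick events):

```vb
Imports System.Diagnostics

Public Class Form1
    ' Stopwatch used only to measure the real time between Tick events.
    Private ReadOnly sw As New Stopwatch()

    Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
        Timer1.Interval = 1000 ' expected: one Tick per second
        sw.Start()
        Timer1.Start()
    End Sub

    Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
        ' Logs the elapsed ms since the previous Tick; I consistently see ~1460 here.
        Debug.WriteLine(sw.ElapsedMilliseconds & " ms since last Tick")
        sw.Restart()
    End Sub
End Class
```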
I've read that the timer control is derived from the 18.2-ticks-per-second PC clock and therefore can't time anything finer than about 55 ms (1 s / 18.2 ≈ 55 ms). However, my error (1.46 s instead of 1.0 s) is far larger than that. Can anyone explain this behavior?
By trial and error I've found that setting the interval to 450 ms produces an event roughly every 1 second. But how can I count on that workaround staying stable when it makes no sense in the first place?