Hello all, I'm having a C# timer issue; just one of those days today. How high can you set the interval on a C# timer? I am trying to specify my interval in minutes and hours with the code below:
iHourIntervalValue = (iHours * 60 * 1000);
iMinuteIntervalValue = (iMinutes * 60 * 60 * 1000);
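For reference, here is a small sanity check I put together (a minimal sketch, separate from my real code; using TimeSpan is just my way of double-checking hand-computed millisecond multipliers):

```csharp
using System;

class IntervalCheck
{
    static void Main()
    {
        int iMinutes = 15;
        int iHours = 2;

        // Let the framework do the unit conversion to milliseconds,
        // instead of multiplying 60s and 1000s by hand.
        double minuteMs = TimeSpan.FromMinutes(iMinutes).TotalMilliseconds;
        double hourMs = TimeSpan.FromHours(iHours).TotalMilliseconds;

        Console.WriteLine(minuteMs); // 15 minutes -> 900000 ms
        Console.WriteLine(hourMs);   // 2 hours   -> 7200000 ms
    }
}
```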
But I have noticed that when I specify the interval in minutes, let's say to fire every 15 minutes, my timer actually fires every 3.5 minutes or so, not every 15. I have not even tested the hour interval, for fear that whatever I am doing wrong with the minute timer applies there too. When my code starts, I create an event handler for my timer's Tick event; that is when this happens. To start, I am doing the following:
tmrData.Interval = iMinuteIntervalValue;
tmrData.Tick += new System.EventHandler(this.tmrData_Tick);
tmrData.Start(); tmrData.Enabled = true;
In tmrData_Tick, I call a subroutine that queries my database and returns XML data back. This routine is firing every 3.5 minutes instead of every 15 minutes. Please help.
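In case a repro helps, here is a stripped-down version of my setup (assuming System.Windows.Forms.Timer, since I handle a Tick event; the interval is hard-coded to what I believe 15 minutes in milliseconds should be, and the handler just logs the gap between ticks so the real firing rate is visible):

```csharp
using System;
using System.Windows.Forms;

class TimerRepro : Form
{
    private readonly Timer tmrData = new Timer();
    private DateTime lastTick = DateTime.Now;

    public TimerRepro()
    {
        // What I believe 15 minutes in milliseconds should be.
        tmrData.Interval = 15 * 60 * 1000;
        tmrData.Tick += new EventHandler(this.tmrData_Tick);
        tmrData.Start();
    }

    private void tmrData_Tick(object sender, EventArgs e)
    {
        // Log the elapsed time between ticks instead of hitting the database,
        // so I can see exactly how often the timer really fires.
        DateTime now = DateTime.Now;
        Console.WriteLine("Elapsed since last tick: " + (now - lastTick));
        lastTick = now;
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TimerRepro());
    }
}
```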