I'm developing an application in which I process various message strings sent from a server. One of these messages occurs every 10 seconds or so and I must process this message as follows:
The grading rules are based on roughly 50-second blocks (messages at 0, 10, 20, 30, 40, 50 seconds, or some other 50-second block):

- Grade A: I get 6 messages within the block, so the message frequency is acceptable.
- Grade B: there is an interruption in the message stream and I get between 1 and 5 strings in that time frame.
- Grade C: I receive no messages in that time span.
- Grade D: grade C persists for 10 minutes; after that, the grade must be rechecked every 10 minutes.

I'm actually using a 55-second time frame when checking the message frequency, in case the messages arrive slightly more than 10 seconds apart.
The problem I'm running into is that if there is a break in the communication, I know the grade will be less than A and will remain so until I get 6 consecutive strings within that 55-second time frame to earn a grade of A again. If I happen to have a grade of C or D and suddenly get 6 messages in a row, the first 5 still yield a grade of B (because 5 is still less than 6), and the grade only becomes A when the 6th message comes in. In addition, if the message stream stays consistent (that is, messages at 0, 10, 20, 30, 40, 50, then continuing at 10, 20, 30, 40, 50, 60, and so on), the grade remains A. Only when there is a break in the message stream does the grade fall below A and, as previously mentioned, there must be 6 consecutive messages within the window at any given time to get a grade of A again.
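One way to implement the rules above (a sketch in Python; the class name, the injected clock, and the choice to measure the initial silence from startup are my assumptions, not from the question) is to keep a deque of arrival timestamps, drop everything older than the 55-second window on each check, and derive the grade from the remaining count plus the time since the last message:

```python
import time
from collections import deque

WINDOW = 55.0          # seconds used to judge frequency, per the question
SILENCE_FOR_D = 600.0  # 10 minutes with no messages downgrades C to D


class MessageGrader:
    """Sketch of the grading rules described above."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.timestamps = deque()       # arrival times still inside the window
        self.last_seen = self.clock()   # assumption: silence is measured from startup

    def record(self):
        """Call this whenever one of the messages arrives."""
        now = self.clock()
        self.timestamps.append(now)
        self.last_seen = now

    def grade(self):
        """Recompute the grade from whatever is currently in the window."""
        now = self.clock()
        # Drop timestamps that have aged out of the 55-second window.
        while self.timestamps and now - self.timestamps[0] > WINDOW:
            self.timestamps.popleft()
        count = len(self.timestamps)
        if count >= 6:
            return "A"
        if count >= 1:
            return "B"
        # Nothing in the window: C, or D once the silence reaches 10 minutes.
        if now - self.last_seen >= SILENCE_FOR_D:
            return "D"
        return "C"
```

Because the grade is recomputed from the current window contents on every check rather than stored as a state machine, the recovery behavior described above falls out naturally: after an outage, the first 5 of 6 new messages grade as B, and the 6th pushes the window count to 6 and yields A.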
I know this probably sounds like a junior programmer issue, but I can't seem to get my head around it. I'll gladly double the points to the first person who can provide a workable solution (Scout's honor), and for the record, no this is NOT a homework assignment, even though I'm sure there are those out there who might think it is and choose not to respond because of that thought. Any assistance is genuinely appreciated. TIA