Strange time consumption in for loop

Hello Experts,
I have encountered a strange situation.
I have added the corresponding code and the TRACE results.
So basically, I have two or three nested for loops (three in the example, but the behavior is the same with only two), and certain iterations of the outer loop take a surprisingly long time.
Usually an iteration takes 0 milliseconds, but sometimes it takes 16 ms.
I am working with VC2010 SP1 with Windows 7 and under Debug mode.
Thanks in advance!

int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
	int nRetCode = 0;

	HMODULE hModule = ::GetModuleHandle(NULL);

	if (hModule != NULL)
	{
		// initialize MFC and print an error on failure
		if (!AfxWinInit(hModule, NULL, ::GetCommandLine(), 0))
		{
			// TODO: change error code to suit your needs
			_tprintf(_T("Fatal Error: MFC initialization failed\n"));
			nRetCode = 1;
		}
		else
		{
			for (unsigned int i = 0 ; i < 100 ; i++)
			{
				DWORD dwT1 = GetTickCount();

				for (unsigned int j = 0 ; j < 500 ; j++)
				{
					for (unsigned int k = 0 ; k < 500 ; k++)
					{
						int Position = 0;
						Position++;
					}
				}

				DWORD dwT2 = GetTickCount();
				TRACE(_T("Delai (%d) = %d\r\n"), i, dwT2 - dwT1);
			}
		}
	}
	else
	{
		// TODO: change error code to suit your needs
		_tprintf(_T("Fatal Error: GetModuleHandle failed\n"));
		nRetCode = 1;
	}

	return nRetCode;
}



Delai (0) = 0
...
Delai (5) = 0
Delai (6) = 15
Delai (7) = 0
...
Delai (14) = 0
Delai (15) = 16
Delai (16) = 0
...
Delai (28) = 0
Delai (29) = 15
Delai (30) = 0
...
Delai (63) = 0
Delai (64) = 16
Delai (65) = 0
...
Delai (98) = 0
Delai (99) = 16
DBTechnique asked:
Neil Russell (Technical Development Lead) commented:
Well, your loop is doing a total of 500 * 500 * 100 iterations. That's 25,000,000 passes through a loop!
DBTechnique (Author) commented:
But why doesn't it take the same time for all the iterations of the first loop?
Neil Russell (Technical Development Lead) commented:
Almost certainly because your computer is doing other things. Also, your innermost loop will almost certainly be optimized out by the compiler in a Release build: it does nothing, changes nothing, and none of its variables are used outside the loop itself. A prime candidate for optimization.

A loop is of no use at all for timing, as you can NOT predict what else Windows is doing.

Infinity08 commented:
The work you're doing is too short to measure accurately. It would be even worse if you enabled optimization (I hope you didn't).

The granularity of GetTickCount is around 15ms on many systems. And apparently also on yours.
And it's that granularity that you're seeing. Consecutive calls to GetTickCount will return values like 0, 0, ..., 0, 15, 15, ..., 15, 30, etc., and not 0, 1, 2, 3, etc.

To get a more accurate timing, run the code you're measuring a few thousand times (or more, so that the total takes at least several seconds to complete), measure the total time, and then divide by the number of times you ran the code.

sarabande commented:
to add to above comment:

optimization is off when compiling a visual studio project in debug mode. but it is not recommended to take time measurements in debug mode, because the debugger necessarily distorts the results.

GetTickCount returns the number of milliseconds elapsed since the (windows) system was started. the return type is DWORD, a 32-bit unsigned integer, so the value wraps to zero after about 49.7 days.

the resolution can be retrieved with GetSystemTimeAdjustment and could be changed with SetSystemTimeAdjustment, but i have never done it myself and never heard of anyone making such adjustments.

Sara
DBTechnique (Author) commented:
To Infinity08:
Thanks! That is really useful information.
To Sara:
I am too curious now not to try it ^^
Thank you.