In C++, how much extra memory do dynamically allocated multi-dimensional arrays use?

In looking at the available Physical Memory in Task Manager while my program is executing in the debugger, I've realized that the dynamically allocated multi-dimensional arrays in my VS2012 C++ application use A LOT MORE MEMORY than I would expect.  

Let's assume for this discussion that my computer needs 8 bytes to store a variable of data type "double". I would have expected that the amount of memory (i.e., heap) needed for a two-dimensional array of size [1000][50] (allocated with the "new" keyword) would simply be 1000 * 50 * 8 bytes = 400,000 bytes (i.e., roughly 400 KB). Well, I must be misunderstanding the algorithm, because more than twice that much memory is actually being taken from the supply of available memory.
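
For concreteness, here is a minimal sketch of the kind of allocation I'm describing (my real code isn't shown, so the variable name is just a placeholder):

    // A [1000][50] block of doubles allocated with new.
    // Element data alone: 1000 * 50 * 8 = 400,000 bytes.
    double (*table)[50] = new double[1000][50];

    table[999][49] = 3.14;   // used like an ordinary 2D array

    delete[] table;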

So, if someone can explain the algorithm for the use of dynamically allocated memory with regard to arrays, and in particular multi-dimensional arrays, I would greatly appreciate it.
cjm20 Asked:

TommySzalapski Commented:
Are you using "managed C++" or unmanaged in your project properties? The memory-management machinery that VC++ tacks on can be rather large.

The way operating systems allocate memory is not simple at all. Your program may ask for 400k and the OS might give it a 1MB block.

In Task Manager, go to the Performance tab, click Resource Monitor...
Then click on the Memory tab. See if the numbers there for your process make more sense.
aboo_s Commented:
I think it depends on the architecture of your machine's OS, 32-bit or 64-bit, which determines how large one word in your machine's memory is.
aboo_s Commented:
If a variable needs 8 bytes to be stored and a word on your machine is 32 bits long, then those 8 bytes are stored in whole 32-bit words.

TommySzalapski Commented:
A 64-bit system will generally use 64 bits for its default allocation "word".
But 64 bits is only 8 bytes, and arrays are typically packed anyway.
If you allocate 1 character on the heap, you'll see that it takes up 8 bytes.
If you allocate an array of 5 characters... still 8 bytes.
11 characters... 16 bytes, and so on. It will typically round up to the nearest 8 bytes.

But the memory that the OS allocates to your process is a different story. If you allocate a large array, the OS can give you twice that much memory in case you allocate more later, so it can be efficient and hand you memory near what you already have. Looking at the free physical memory isn't really a good indication of how much memory a C++ object is taking up.
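
If you want a number that tracks your process rather than the whole machine, here is a rough sketch (Windows-specific; it uses the PSAPI call GetProcessMemoryInfo, and the [1000][50] allocation is just a stand-in for your real arrays) that snapshots the process's commit charge before and after the allocation:

    #include <windows.h>
    #include <psapi.h>      // GetProcessMemoryInfo
    #include <iostream>
    #pragma comment(lib, "psapi.lib")

    static SIZE_T committedBytes()
    {
        PROCESS_MEMORY_COUNTERS pmc = {};
        pmc.cb = sizeof(pmc);
        GetProcessMemoryInfo(GetCurrentProcess(), &pmc, (DWORD)sizeof(pmc));
        return pmc.PagefileUsage;   // commit charge for this process, in bytes
    }

    int main()
    {
        SIZE_T before = committedBytes();

        double (*table)[50] = new double[1000][50];   // 400,000 bytes of payload
        table[0][0] = 1.0;

        SIZE_T after = committedBytes();
        std::cout << "commit charge grew by " << (after - before) << " bytes\n";

        delete[] table;
        return 0;
    }

The delta reported there should track the allocation much more closely than anything you can infer from the machine-wide free-memory figure, though the heap may still grow in larger chunks than you asked for.
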
sarabande Commented:
If you build the debug configuration, the debug heap reserves extra space around each allocation for bookkeeping and guard bytes, which inflates the numbers further.
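
One way to see, from inside the program, what the debug heap is tracking for these allocations (a rough sketch; the _Crt* calls only do anything in a debug build, and the statistics go to the debugger's Output window) is:

    #include <crtdbg.h>

    int main()
    {
        _CrtMemState before, after, diff;
        _CrtMemCheckpoint(&before);

        double** rows = new double*[1000];           // stand-in for your arrays
        for (int r = 0; r < 1000; ++r)
            rows[r] = new double[50];

        _CrtMemCheckpoint(&after);
        if (_CrtMemDifference(&diff, &before, &after))
            _CrtMemDumpStatistics(&diff);            // block counts and byte totals for the code in between

        for (int r = 0; r < 1000; ++r)
            delete[] rows[r];
        delete[] rows;
        return 0;
    }

Comparing the overall memory use of a debug build against a release build will also show you how much of the difference the debug heap is responsible for.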

Sara
cjm20 Author Commented:
Thanks Tom (and to everyone else who chimed in). The Resource Monitor in Task Manager is nice, but it still didn't help pinpoint exactly why the excess memory for my dynamic arrays is, well, so excessive. But I think you've hit on it when you mentioned the OS trying to be efficient by anticipating that the application will allocate MORE memory than it just asked for. This makes sense. For my big arrays, the OS is allocating about 2.5 times the amount I would think they need. I've also read elsewhere that for dynamically sized multi-dimensional arrays, the pointers to the arrays within arrays also need to reside in memory. Said pointers take 4 bytes each, I'm told, so with a very large array that is a lot more overhead.
TommySzalapski Commented:
Sara is right too. If you were running a debug build, then that would explain most of the extra memory.

In C++, a two-dimensional array like that does not store pointers to "arrays within arrays." If you had some type of two-dimensional array with rows of different lengths, that would apply, but double x[1000][50] is just going to give you one contiguous array. x[ i ][ j ] just translates to something like x[i*50+j].

On a 64-bit system, pointers take up 8 bytes.
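
To make the difference concrete, here is a rough sketch of the two layouts (the variable names are just for illustration):

    #include <cstddef>

    int main()
    {
        // One contiguous block: 1000 * 50 * sizeof(double) = 400,000 bytes
        // of element data and no per-row pointers.
        double (*flat)[50] = new double[1000][50];
        flat[3][7] = 1.0;               // the compiler computes the offset 3*50+7

        // Pointer-to-pointer ("jagged") layout: the same 400,000 bytes of
        // doubles, plus 1000 row pointers (8,000 bytes on a 64-bit build)
        // and the allocator's per-block overhead on each of the 1001
        // separate allocations.
        double** jagged = new double*[1000];
        for (std::size_t r = 0; r < 1000; ++r)
            jagged[r] = new double[50];
        jagged[3][7] = 1.0;

        for (std::size_t r = 0; r < 1000; ++r)
            delete[] jagged[r];
        delete[] jagged;
        delete[] flat;
        return 0;
    }

Even in the jagged version, the pointer table and per-block overhead add only a modest fraction of the 400,000-byte payload, so they alone don't explain a 2.5x difference; the debug heap and the way the OS provisions pages are more likely culprits.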