cjm20 asked:
In C++, how much more memory do dynamically allocated multi-dimensional arrays use?
In looking at the available Physical Memory in Task Manager while my program is executing in the debugger, I've realized that the dynamically allocated multi-dimensional arrays in my VS2012 C++ application use A LOT MORE MEMORY than I would expect.
Let's assume for this discussion that my computer needs 8 bytes to store a variable of data type "double". Well, I would've expected that the amount of memory (i.e., the heap) needed for a two-dimensional array of size [1000][50] (allocated with the "new" keyword) would simply be 1000 * 50 * 8 bytes = 400,000 bytes (i.e., 400 KB). Well, I must be misunderstanding the algorithm, because more than twice this much memory is actually being "taken" from the supply of available memory.
So, if someone can explain the algorithm for the use of dynamically allocated memory with regard to arrays, and in particular multi-dimensional arrays, I would greatly appreciate it.
I think it depends on the architecture of your machine and OS, 32-bit or 64-bit, which determines how large a word in your machine's memory is.
If a variable needs 8 bytes to be stored and the allocator rounds every allocation up to a word or alignment boundary, those 8 bytes can end up occupying a larger block than 8 bytes.
ASKER CERTIFIED SOLUTION
If you build the debug configuration, the extra space is reserved for the debugger.
Sara
ASKER
Thanks Tom (and to everyone else who chimed in). The Resource Monitor in Task Manager is nice, but still didn't help pinpoint exactly why the excess memory for my dynamic arrays is, well, so excessive. But I think you've hit on it when you mentioned the OS trying to be efficient by anticipating that the application will allocate MORE memory than it just asked for. This makes sense. The OS, for my big arrays, is allocating about 2.5 times the amount I would think my arrays would need. I've also read elsewhere that for multi-dimensional dynamically-sized arrays, the pointers to the arrays within arrays also need to reside in memory. Said pointers for an array of the "double" data type take 4 bytes each, I'm told, so with a very large array this is a lot more overhead.
Sara is right too. If you were running a debug build, then that would explain most of the extra memory.
In C++ a two dimensional array like that does not store pointers to "arrays within arrays." If you had some type of two dimensional array that had rows of different lengths that would apply, but double x[1000][50] is just going to give you one array. x[ i ][ j ] just translates to something like x[i*50+j].
On a 64 bit system, pointers take up 8 bytes.
The way operating systems allocate memory is not simple at all. Your program may ask for 400k and the OS might give it a 1MB block.
In Task Manager, go to the Performance tab, click Resource Monitor...
Then click on the Memory tab. See if the numbers there for your process make more sense.