I have an application that dumps data from a complicated dynamic array to disk and then reads it back in again at a later point to minimize memory usage.
The array (of Singles) has four dimensions and is created in two steps:

SetLength(SDVuser, MaxRegionTypes+1);
for i := 0 to MaxRegionTypes do
  SetLength(SDVuser[i], MaxSubRegions[i]+1, MaxSimDailyVars+1, MaxDays+1);
In other words, the first dimension is created first and the 2nd/3rd/4th dimensions are created afterwards, because the size of the second dimension varies.
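For reference, SDVuser itself is declared as a four-dimensional dynamic array of Single, along these lines (the type name here is illustrative; my actual declaration may differ slightly):

type
  TSDVuser = array of array of array of array of Single;
var
  SDVuser: TSDVuser;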
The writing to and reading from file F (a file of Single) is done using BlockWrite and BlockRead. The write side looks like this:

for rt := 0 to MaxRegionTypes do
begin
  RecordSize := (MaxSubRegions[rt]+1) * (MaxSimDailyVars+1) * (MaxDays+1);
  BlockWrite(F, SDVuser[rt,0,0,0], RecordSize);
end;
This assumes that the 2nd/3rd/4th dimensions of the SDVuser array are CONTIGUOUS in memory. The documentation and on-line comments on dynamic arrays that I can find suggest this is true.
But I sometimes get Range Check errors associated with the BlockRead, which I suspect might be caused by non-contiguity. I am also getting odd exceptions elsewhere, which again may be caused by memory being trampled because the array is not in fact contiguous.
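For reference, a cut-down version of the read side looks roughly like this (the file name is illustrative); the Range Check errors fire on the BlockRead line:

AssignFile(F, 'sdvuser.dat');
Reset(F);
for rt := 0 to MaxRegionTypes do
begin
  { SDVuser has already been re-created with the same SetLength calls as above }
  RecordSize := (MaxSubRegions[rt]+1) * (MaxSimDailyVars+1) * (MaxDays+1);
  BlockRead(F, SDVuser[rt,0,0,0], RecordSize);  { Range Check errors occur here }
end;
CloseFile(F);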
Question 1: Does the SetLength call inside the loop create a contiguous block of memory for the three dimensions it creates in one go?
Question 2: Do the secondary SetLength operations cause the entire four-dimensional structure to be recreated each time, to ensure that the WHOLE array is contiguous? My observation is that assuming this causes lots more problems elsewhere, so I have avoided making that assumption.
Hope someone can clarify the situation.