Reading files in MATLAB into arrays


I have MATLAB version 2009. When I try to read in a file that contains 63174528 floats (i.e. 252698112 bytes), I end up with an array holding fewer numbers than that. I understand that MATLAB stores numbers as doubles, i.e. 8 bytes each, but even so, I should have plenty of memory to hold 63174528 doubles (i.e. 505396224 bytes).

When I run the memory command in MATLAB to check my system resources, here is what I get:

Maximum possible array:              28069 MB (2.943e+010 bytes) *
Memory available for all arrays:     28069 MB (2.943e+010 bytes) *
Memory used by MATLAB:                 659 MB (6.915e+008 bytes)
Physical Memory (RAM):               16361 MB (1.716e+010 bytes)

*  Limited by System Memory (physical + swap file) available.

So as you can see, 28069 MB is plenty of space to accommodate my file of numbers in an array.

My MATLAB code is as follows:

Here, I get data as a <14598144x1 double> array, so I am missing roughly 48.5 million numbers.

Can anyone advise what can be done to accommodate all 63174528 numbers in an array, and why the array comes out so small when the memory command lists far more memory as available?

PS: the file was written out in the following way in C++:
FILE *fp = fopen("myfile.txt", "wb");
fwrite((void*)f, sizePerElement, numElements, fp); // dump the bytes pointed to by f into the binary file
fclose(fp); // without this, buffered bytes at the end of the file may never be flushed to disk
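For reference, a minimal MATLAB read matching this C++ write might look like the following. This is a sketch under two assumptions not confirmed in the post: that sizePerElement is sizeof(float), i.e. 4 bytes, and that the file is named myfile.txt.

```matlab
fid = fopen('myfile.txt', 'rb');      % 'b' forces binary mode (no newline translation)
data = fread(fid, Inf, 'float32');    % read 4-byte floats; fread returns a double array
fclose(fid);
numel(data)                           % expect 63174528 if every byte was read
```

Note that reading with precision 'double' instead of 'float32' would consume 8 bytes per element and halve the element count, so the precision string has to match what the C++ side actually wrote.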


yuk99 commented:
The problem shouldn't be connected with available memory. It is probably related to how the data are organized in the input file.

Can you check whether the data recorded into the file in C++ and what you get in MATLAB are the same (at least the first few elements)? Are you using the same OS to write and to read the file?
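One quick sanity check along these lines (a sketch, assuming the file is myfile.txt and holds 4-byte floats) is to compare the size on disk against the expected byte count, and to print the first few values for comparison with the C++ side:

```matlab
info = dir('myfile.txt');
fprintf('%d bytes on disk\n', info.bytes);  % expect 252698112 = 63174528 * 4

fid = fopen('myfile.txt', 'rb');            % explicit binary mode
firstVals = fread(fid, 10, 'float32');      % first 10 values, to compare with the C++ array
fclose(fid);
disp(firstVals);
```

If the byte count on disk is already short of 252698112, the writer never flushed everything (e.g. fclose was skipped); if all the bytes are there but MATLAB still reads fewer elements, the read precision or file mode is the culprit.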