Opening a large file in Perl -> "Out of Memory!" error
Posted on 2003-03-20
I am attempting to read a 24MB file with Perl. Here is the code I am using:
my $file = 'bigfile.temp';
open(FILE,"<$file") || die "Cannot open $file: $!";
my $sgml = join('',<FILE>); # The program dies on this line
print "Read ".length($sgml)." characters from $file\n";
The program runs for a while, starts hitting the disk heavily (presumably paging), and then halts. The only error message printed is:
Out of memory!
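In case the intermediate list built by join is the problem, one variant I have considered (but not yet tried) is slurping the whole file into a single scalar by locally undefining the input record separator; as I understand the docs, a scalar-context read then returns the entire file as one string:

my $file = 'bigfile.temp';
open(FILE, "<$file") || die "Cannot open $file: $!";
my $sgml = do {
    local $/;    # undef $/ so the read is not split into lines
    <FILE>;      # scalar context: the whole file as one string
};
close(FILE);
print "Read ".length($sgml)." characters from $file\n";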
I don't see why Perl would run out of memory on a file this size. I identified the failing line with diagnostic print statements, which I have since removed. I have also attempted a different approach using:
foreach my $line (<FILE>) ...
However, this alternative fails the same way: the loop body is never executed, not even once. Can anyone shed light on why Perl crashes like this?
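For completeness, here is a minimal self-contained version of that second attempt; the loop body is just a placeholder counter I added for this post:

my $file = 'bigfile.temp';
open(FILE, "<$file") || die "Cannot open $file: $!";
my $count = 0;
foreach my $line (<FILE>) {   # list context: <FILE> is expanded up front
    $count++;                 # never reached before the crash
}
close(FILE);
print "Counted $count lines in $file\n";

My guess is that foreach evaluates <FILE> in list context, so the entire file is read into a list of lines before the loop starts, which would explain why the body never runs.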
I am running Perl version 5.6.0 on Windows 2000 with 512MB of physical memory.
The content of the file is SGML.
The file was written by another Perl script, separate from the one above.
The file opens fine in a text editor (TextPad), so I assume it is not corrupted.
Windows Task Manager shows Perl's memory usage climbing to about 80MB, dropping to 15MB, then climbing back to 80MB, at which point Perl halts with the "Out of memory!" error.
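If it matters, the workaround I am planning to test next is a plain while loop, which (as I understand it) reads one line at a time in scalar context rather than slurping everything at once:

my $file = 'bigfile.temp';
open(FILE, "<$file") || die "Cannot open $file: $!";
my $sgml = '';
while (my $line = <FILE>) {   # scalar context: one line per read
    $sgml .= $line;           # append without building a list of lines
}
close(FILE);
print "Read ".length($sgml)." characters from $file\n";

I would still like to understand why a 24MB file exhausts memory on a 512MB machine, though.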