I'm prepping some log files for insertion into a SQL table, but the log files aren't delimited properly.
So I'm running a PowerShell script similar to this one to get the delimiters in the right place:
$lines = @(gc infile.log)
foreach ($line in $lines) {
    # Turn each log marker into a comma so the output is comma-delimited
    $line = $line.replace(" File:", ",")
    $line = $line.replace(" From:<", ",")
    $line = $line.replace("> To:<", ",")
    add-content outfile.txt $line
}
This works great on test files, but on a normal log file (50 MB) it uses a lot of RAM and CPU and takes forever. And some of my log files are up to 1 GB.
Is there any way to read 100 lines at a time into an array, operate on those lines, and then fetch the next 100 until the end of the file is reached, so that I'm not trying to load the entire file into memory all at once?
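To give an idea of what I'm imagining, here's a rough sketch. I've read that Get-Content has a -ReadCount parameter that emits lines in batches, but I haven't verified how it behaves on files this size, so treat this as an untested assumption rather than something I know works:

# Stream the file in 100-line chunks instead of loading it all at once.
# -ReadCount 100 makes gc emit an array of up to 100 lines per pipeline object.
gc infile.log -ReadCount 100 | foreach-object {
    # -replace works element-wise on the array of lines in $_
    $chunk = $_ -replace " File:", "," -replace " From:<", "," -replace "> To:<", ","
    add-content outfile.txt $chunk
}

If something like that is viable, does the chunk size matter much, or is there a better idiom entirely?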