I have written a Perl script to parse and process a large number of big gzipped text files line by line. There are about 700k .gz files, and their total compressed size is around 120 GB; the decompressed data should be tens of times larger.
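For concreteness, each file goes through something roughly equivalent to this shell sketch (the sample data and the per-line "processing" step here are placeholders, not my real code — my actual script does this in Perl):

```shell
# Make a tiny sample .gz file standing in for one of the real inputs.
printf 'foo\t1\nbar\t2\n' | gzip > sample.gz

# Stream it line by line without decompressing to disk first,
# the same way my Perl script reads each file.
zcat sample.gz | while IFS= read -r line; do
    : # parse/process "$line" here
done
```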
I found that processing just 800 .gz files takes about 8 hours, so at this rate it would take roughly a year to finish all of them.
I am wondering why Perl takes so long to process them. Would it be possible to improve the running speed by rewriting the code in C?