How to speed up reading data from a txt file in R?


I have a very large file (800 MB) to read into R from a txt file, and it takes several minutes to do so. I need it to be much quicker.

Are there any functions, packages, or tricks to read the data in much faster than what I am doing?

I have included my code and the dataset.
I use an HP laptop with an Intel i7 and Windows 8.

Thank you for your help.

### time the first way


time.data1  ## Time difference of 1.523134 secs

### time the second way


time.data2  ## Time difference of 1.497055 secs



Notepad takes about 8 seconds to read your file regardless of the drive type used, and Notepad is poor at this task; MS Excel is faster. The remaining limit comes from R itself.

It seems the only way forward is to parallelize your task.

I suppose you don't need all the detail data in one program instance. Simply split your file into several parts and process those parts separately. This will be limited by your disk speed, so I would recommend an SSD drive.
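The split-and-read idea can be sketched in base R without splitting the file on disk, by reading fixed-size chunks from an open connection. The file name, chunk size, and column layout below are illustrative assumptions, not the asker's actual data:

```r
# Read a large delimited file in fixed-size chunks instead of all at once.
read_in_chunks <- function(path, chunk_rows = 100000L) {
  con <- file(path, open = "r")
  on.exit(close(con))
  chunks <- list()
  repeat {
    # read.table on an open connection continues from the previous position;
    # at end of file it signals an error, which we treat as "done".
    chunk <- tryCatch(
      read.table(con, sep = ",", header = FALSE, nrows = chunk_rows),
      error = function(e) NULL
    )
    if (is.null(chunk) || nrow(chunk) == 0L) break
    chunks[[length(chunks) + 1L]] <- chunk
  }
  do.call(rbind, chunks)
}

# Tiny demo on a temporary file standing in for the real 800 MB data
tmp <- tempfile(fileext = ".txt")
write.table(data.frame(x = 1:10, y = letters[1:10]), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)
dat <- read_in_chunks(tmp, chunk_rows = 4L)
```

Each chunk could also be handed to a worker (e.g. via the parallel package) instead of being accumulated, if per-chunk processing is the expensive part.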

If you need to process all the data at once (and no way to speed R up exists), then you should look at faster processing in, e.g., C++, or use C++ to preprocess/reduce your data.

For a more advanced R solution you may look here:
Do you have 800 MB+ of free RAM (physical memory) when you begin reading?

The reason I used the plus sign is that R will need additional memory to compute the statistics.
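These conditions can be checked from within R itself. A quick sketch (note that memory.limit() is Windows-only and no longer supported in R 4.2+, hence the guard):

```r
# Report memory currently used by R objects (also triggers garbage collection)
gc()

# Total memory R may use, in MB (Windows-only; defunct in recent R versions)
if (exists("memory.limit")) memory.limit()

# 8 on a 64-bit build of R, 4 on a 32-bit build
bits <- .Machine$sizeof.pointer
```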
dhsindy (Retired, considering supplemental income) commented:
Are you using a 32- or 64-bit system? I am assuming 64-bit, since you have an i7 processor. Disk speed and memory come to mind. Verify the file is loaded entirely into memory and not virtually to disk. Do you have ample free space (30%) on your hard disk?

Also, you didn't mention where the file is stored: hard disk, cloud, etc. Other things to look for are processes that might be taking priority, such as a security program. I use Malwarebytes, and I noticed it periodically kicks in and uses up to 30% of my RAM. I frequently stop my security program when loading large files and restart it after the file is loaded.

I would start Task Manager (or another process manager) and watch memory as the file loads; maybe increase the priority of that process, and look for other processes hogging resources.
Gary Patterson (VP Technology / Senior Consultant) commented:
How about fread() instead of read.table()?
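A minimal side-by-side sketch of the two calls, assuming the data.table package is installed. The CSV here is a small generated stand-in for the asker's XYZ_EE.txt:

```r
library(data.table)

# Generate a stand-in comma-separated file
tmp <- tempfile(fileext = ".txt")
write.table(matrix(runif(1e4), ncol = 10), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)

# Time both readers on the same file
t_fread <- system.time(dt <- fread(tmp, sep = ",", header = FALSE))
t_base  <- system.time(df <- read.table(tmp, sep = ",", header = FALSE))

# fread memory-maps the file and detects column types from a sample,
# which is why it scales much better on files in the hundreds of MB.
```

Setting colClasses explicitly (in either function) avoids type detection and can speed things up further on very wide files.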

Here's a nice comparison of some basic techniques to load big data into memory:

pgmerLA (Author) commented:
Thank you all for your answers.

I have used Gary Patterson's advice on fread() and was able to reduce the time considerably:

> system.time(data1<-fread("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   0.59    0.00    0.61 
> system.time(data2<-read.table("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   1.64    0.03    1.67 


pgmerLA (Author) commented:
The system I use:

i7-4700MQ @ 2.4 GHz
64-bit Windows 8
1 TB hard drive

At least some speedup is good.
pgmerLA (Author) commented:
Hi pcelba,

How can I follow your advice, "I would recommend an SSD drive", in R?

That was just brainstorming... it is a hardware solution, not an R one.

Swapping any HDD for an SSD will speed your computer up. I am using notebooks with an SSD C: drive, and they are much faster than any older desktop.

The notebook uses just an i7-3610QM, but its Windows Experience Index is 7.
pgmerLA (Author) commented:
Are you using an external SSD C:drive?

How much faster do you think my program will get?

What do you mean by "The notebook uses just i7-3610QM but its experience index is 7"?
My SSD is internal. An external one would need a USB 3 connection, and it also would not speed the OS up.

Sorry, I cannot predict the speed improvement.

The i7-3610QM is slower than the 4700MQ, but thanks to the SSD the notebook's experience index is 7.
pgmerLA (Author) commented:
Thank you pcelba. I will keep that really good advice in mind!
You are welcome.