How to speed up reading data from a txt file in R?

Posted on 2014-11-30
Last Modified: 2014-12-10

I have a very large file (800 MB) to be read into R from a txt file, and it takes several minutes. I need it to be much quicker.

Are there any functions, packages, or tricks to read in data much faster than what I am doing?

I have included my code and the dataset.
I use an HP laptop with an Intel i7 and Windows 8.

Thank you for your help.

### time the first way


time.data1  ## Time difference of 1.523134 secs

### time the second way


time.data2  ## Time difference of 1.497055 secs

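The timing code itself did not survive in the post above, so as a hedged reconstruction: results printed as "Time difference of ... secs" are what a `Sys.time()` difference looks like in R. The sketch below shows two typical ways such timings are taken; the file name, column count, and the use of `colClasses` as the "second way" are assumptions, not the asker's original code.

```r
# Hedged sketch: the asker's original code is not shown. This reproduces
# the "Time difference of ... secs" output with Sys.time(), on a small
# self-contained sample file (the real file was 800 MB).
tmp <- tempfile(fileext = ".txt")
write.table(matrix(runif(1e5), ncol = 10), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)

## first way: plain read.table()
t0 <- Sys.time()
data1 <- read.table(tmp, sep = ",", header = FALSE)
time.data1 <- Sys.time() - t0
time.data1  # prints e.g. "Time difference of ... secs"

## second way: pre-declare column types with colClasses,
## which skips type guessing and is usually faster on large files
t0 <- Sys.time()
data2 <- read.table(tmp, sep = ",", header = FALSE,
                    colClasses = rep("numeric", 10))
time.data2 <- Sys.time() - t0
time.data2
```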

Question by:pgmerLA
LVL 42

Assisted Solution

pcelba earned 110 total points
ID: 40473446
Notepad takes about 8 seconds to read your file, regardless of the drive type used, and Notepad is very bad in this field; MS Excel is faster. R's reading speed is determined by R itself.

It seems the only way forward is to parallelize your task.

I suppose you don't need all the detailed data in one program instance. Simply split your file into several parts and process those parts separately. This will be limited by your disk speed, so I would recommend using an SSD drive.

If you need to process all the data at once (and no way to speed R up exists), then you should look at faster processing in e.g. C++, or use C++ to preprocess/reduce your data.

For a more advanced R solution you may look here:
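The split-and-read idea above can be sketched with the base `parallel` package. This is only an illustration: the chunk files are generated here so the example is self-contained, and a PSOCK cluster is used because fork-based `mclapply()` is unavailable on Windows (the asker's platform).

```r
# Hedged sketch of reading pre-split file chunks in parallel with the
# base 'parallel' package. Chunk names and layout are assumptions.
library(parallel)

# Make the example self-contained: write four small chunk files.
chunks <- file.path(tempdir(), sprintf("chunk%d.txt", 1:4))
for (f in chunks)
  write.table(matrix(runif(4000), ncol = 4), f,
              sep = ",", row.names = FALSE, col.names = FALSE)

cl <- makeCluster(2L)  # PSOCK cluster; works on Windows
parts <- parLapply(cl, chunks, function(f)
  read.table(f, sep = ",", header = FALSE,
             colClasses = rep("numeric", 4)))
stopCluster(cl)

data.all <- do.call(rbind, parts)  # recombine the chunks
```

Note that the workers still share one disk, which is why the advice above points at disk speed (and an SSD) as the real limit.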
LVL 45

Assisted Solution

aikimark earned 25 total points
ID: 40473472
Do you have 800MB+ of free RAM (physical memory) when you begin reading?

The reason I used the plus sign is that R will need additional memory to do the statistics.
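From inside R, the in-memory footprint of the parsed data can be checked directly; a small sketch (the data frame here is a stand-in for the real 800 MB dataset):

```r
# Hedged sketch: measure how much memory a parsed object actually uses.
# A text file typically needs at least its own size in RAM once parsed,
# plus working space for any statistics computed on it.
x <- data.frame(matrix(runif(1e5), ncol = 10))  # stand-in data
print(object.size(x), units = "MB")  # in-memory size of the object
gc()  # report current memory use and trigger a garbage collection
```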
LVL 16

Assisted Solution

dhsindy earned 65 total points
ID: 40473625
Are you using a 32- or 64-bit system? I am assuming 64-bit, given the i7 processor. Disk speed and memory come to mind. Verify that the file is loaded entirely into memory and not paged virtually to disk. Do you have ample free space (30%) on your hard disk?

Also, you didn't mention where the file is stored: hard disk, cloud, etc. Other things to look for are processes that might be taking priority, such as a security program. I use Malwarebytes, and I noticed it periodically kicks in and uses up to 30% of my RAM. I frequently stop my security program when loading large files and restart it after the file is loaded.

I would start Task Manager (or another process manager) and watch memory as the file loads, and maybe increase the priority of that process. Also look for other processes hogging resources.
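The 32- vs 64-bit question can also be answered from inside the R session itself; a minimal sketch:

```r
# Hedged sketch: confirm the running R session is a 64-bit build
# (an 8-byte pointer) and identify the platform.
is64 <- .Machine$sizeof.pointer == 8
is64                      # TRUE on a 64-bit R build
R.version$arch            # e.g. "x86_64" on 64-bit
Sys.info()[["sysname"]]   # the operating system name
```

A 32-bit R build caps the address space regardless of how much RAM the machine has, so this is worth confirming before anything else.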

LVL 35

Accepted Solution

Gary Patterson earned 300 total points
ID: 40473679
How about fread() instead of read.table()?

Here's a nice comparison of some basic techniques to load big data into memory:
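A minimal sketch of the `fread()` suggestion, assuming the `data.table` package is installed (`install.packages("data.table")`); `fread()` detects separators and column types in C, which is why it is typically much faster than `read.table()`:

```r
# Hedged sketch comparing data.table::fread() with read.table()
# on a small self-contained sample file.
library(data.table)

tmp <- tempfile(fileext = ".txt")
write.table(matrix(runif(1e5), ncol = 10), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)

system.time(dt <- fread(tmp, sep = ",", header = FALSE))      # fast C reader
system.time(df <- read.table(tmp, sep = ",", header = FALSE)) # base reader

identical(dim(as.data.frame(dt)), dim(df))  # same data either way
```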

Author Comment

ID: 40492825
Thank you all for your answers.

I used Gary Patterson's advice on fread() and was able to reduce the time considerably:

> system.time(data1<-fread("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   0.59    0.00    0.61 
> system.time(data2<-read.table("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   1.64    0.03    1.67 



Author Comment

ID: 40492829
The system I use:

i7 4700MQ @2.4 Ghz
64bit Windows8
1TB hard drive
LVL 42

Expert Comment

ID: 40492834
At least some speedup is good.

Author Comment

ID: 40492843
Hi pcelba,

How can I follow your advice to use an SSD drive in R?
LVL 42

Expert Comment

ID: 40492852
It was just brainstorming... a hardware solution.

Swapping any HDD for an SSD drive will speed your computer up. I am using notebooks with an SSD C: drive, and they are much faster than any older desktop.

The notebook uses just an i7-3610QM, but its experience index is 7.

Author Comment

ID: 40492873
Are you using an external SSD C: drive?

How much faster do you think my program will get?

What do you mean by "The notebook uses just i7-3610QM but its experience index is 7"?
LVL 42

Expert Comment

ID: 40492883
My SSD is internal. An external one would need a USB 3 connection, and it also would not speed the OS up.

Sorry, I cannot predict the speed improvement.

The i7-3610QM is slower than the 4700MQ, but thanks to the SSD the notebook's speed index is 7.

Author Comment

ID: 40492910
Thank you, pcelba. I will keep that really good advice in mind!
LVL 42

Expert Comment

ID: 40492920
You are welcome.

Question has a verified solution.