Welcome to Experts Exchange

How to speed up reading data from a txt file in R?

Posted on 2014-11-30
Last Modified: 2014-12-10

I have a very large file (800 MB) to read into R from a txt file, and it takes several minutes. I need it to be much quicker.

Are there any functions, packages, or tricks to read in data much quicker than what I am doing?

I have included my code and the dataset.
I use an HP laptop with an Intel i7 and Windows 8.

Thank you for your help.

### time the first way


time.data1  ## Time difference of 1.523134 secs

### time the second way


time.data2  ## Time difference of 1.497055 secs


Question by:pgmerLA
LVL 42

Assisted Solution

pcelba earned 110 total points
ID: 40473446
Notepad takes about 8 seconds to read your file, regardless of the drive type, and Notepad is poor in this area... MS Excel is faster. The speed here is determined by R itself.

It seems the only remaining option is to parallelize your task...

I suppose you don't need all the detail data in one program instance. Simply split your file into several parts and process those parts separately. This will be limited by your disk speed, so I would recommend an SSD drive.
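A rough sketch of that split-and-process idea in R (a small generated file stands in here for the real 800 MB one; on Windows, a PSOCK cluster is needed because `mclapply` does not fork there):

```r
library(parallel)

# Toy stand-in for the real data: a small comma-separated file
tmp <- tempfile(fileext = ".txt")
write.table(matrix(rnorm(4000), ncol = 4), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)

# Split the file into 4 parts (in practice, split the big file once, up front)
lines <- readLines(tmp)
groups <- split(lines, cut(seq_along(lines), 4, labels = FALSE))
part_files <- vapply(groups, function(g) {
  f <- tempfile(fileext = ".txt")
  writeLines(g, f)
  f
}, character(1))

# Read the parts in separate worker processes
cl <- makeCluster(4)
chunks <- parLapply(cl, part_files, function(f)
  read.table(f, sep = ",", header = FALSE))
stopCluster(cl)

# Recombine only if all rows are needed in one data frame
data1 <- do.call(rbind, chunks)
```

Note that reading is largely disk-bound, so the gain from parallel workers may be modest on a spinning disk; this is where the SSD helps.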

If you need to process all the data at once (and no way exists to speed R up further), then you should look at faster processing in e.g. C++, or use C++ to preprocess/reduce your data.

For more advanced R solutions, see the CRAN High-Performance Computing task view: http://cran.r-project.org/web/views/HighPerformanceComputing.html
LVL 45

Assisted Solution

aikimark earned 25 total points
ID: 40473472
Do you have 800MB+ free RAM (physical memory) when you begin reading?

The reason I used the plus sign is that R will need additional memory to do statistics.
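One way to check this from inside R (the file name is the one from the asker's code; `memory.limit()`/`memory.size()` are Windows-only, so they are shown commented out):

```r
# Size of the file on disk, in MB (NA if the file is not present)
file_mb <- file.info("XYZ_EE.txt")$size / 2^20

# On Windows R, compare against what R is allowed to use:
# memory.limit()  # MB R may use in total (Windows only)
# memory.size()   # MB R is using right now (Windows only)

# gc() reports current R memory use on any platform
gc()
```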
LVL 16

Assisted Solution

dhsindy earned 65 total points
ID: 40473625
Are you using a 32- or 64-bit system?  I am assuming 64-bit, since you have an i7 processor.  Disk speed and memory come to mind.  Verify that the file is loaded entirely into memory and not virtually to disk.  Do you have ample free space (30%) on your hard disk?

Also, you didn't mention where the file is stored: hard disk, cloud, etc.  Other things to look for are processes that might be taking priority, such as a security program.  I use Malwarebytes, and I have noticed it periodically kicks in and uses up to 30% of my RAM. I frequently stop my security program when loading large files and restart it after the file is loaded.

I would start Task Manager (or another process manager) and watch memory as the file loads, and maybe increase the priority of that process.  Also, look for other processes hogging resources.
LVL 35

Accepted Solution

Gary Patterson earned 300 total points
ID: 40473679
How about fread() from the data.table package instead of read.table()?


Here's a nice comparison of some basic techniques to load big data into memory:


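A minimal sketch of the switch, assuming a comma-separated file with no header as in the question (a small generated file stands in for the real one):

```r
library(data.table)

# Toy stand-in for the 800 MB file
f <- tempfile(fileext = ".txt")
write.table(matrix(runif(3000), ncol = 3), f,
            sep = ",", row.names = FALSE, col.names = FALSE)

# data.table::fread is typically much faster than read.table on large files
data1 <- fread(f, sep = ",", header = FALSE)

# If read.table must be used, declaring column types up front also helps,
# since it skips the type-guessing pass:
data2 <- read.table(f, sep = ",", header = FALSE,
                    colClasses = rep("numeric", 3), comment.char = "")

# fread returns a data.table; convert if a plain data.frame is needed
data1 <- as.data.frame(data1)
```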
Author Comment

ID: 40492825
Thank you all for your answers.

I have used Gary Patterson's advice on fread() and was able to reduce the time considerably.

> system.time(data1<-fread("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   0.59    0.00    0.61 
> system.time(data2<-read.table("XYZ_EE.txt",sep=",",header=F))
   user  system elapsed 
   1.64    0.03    1.67 



Author Comment

ID: 40492829
The system I use:

i7 4700MQ @2.4 Ghz
64bit Windows8
1TB hard drive
LVL 42

Expert Comment

ID: 40492834
At least some speed-up is good.

Author Comment

ID: 40492843
Hi pcelba,

How can I follow your advice in R?  "I would recommend to use SSD drive"
LVL 42

Expert Comment

ID: 40492852
It was just brainstorming... a hardware solution.

Swapping any HDD for an SSD will speed your computer up. I am using notebooks with an SSD C: drive, and they are much faster than any older desktop.

The notebook uses just an i7-3610QM, but its Windows Experience Index is 7.

Author Comment

ID: 40492873
Are you using an external SSD C:drive?

How much faster do you think my program will get?

What do you mean by "The notebook uses just i7-3610QM but its experience index is 7"?
LVL 42

Expert Comment

ID: 40492883
My SSD is internal. An external one would need a USB 3 connection, and it also would not speed the OS up.

Sorry, I cannot predict the speed improvement.

The i7-3610QM is slower than the 4700MQ: http://ark.intel.com/compare/75117,64899
but thanks to the SSD, the notebook's experience index is 7.

Author Comment

ID: 40492910
Thank you, pcelba. I will keep that really good advice in mind!
LVL 42

Expert Comment

ID: 40492920
You are welcome.

Question has a verified solution.
