Solved

Speed up linked tables in Access

Posted on 2009-07-09
733 Views
Last Modified: 2012-05-07
I have 7 linked tables that take data from CSV files. The trouble is that each file could potentially contain 100,000 records (or maybe more), and it takes Access a few minutes per file to append the data to a table. With 7 tables this can take 15-20 minutes. Normally this wouldn't be a problem; however, the appends need to be done once a week, and I can see this potentially becoming a major issue.
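For reference, each weekly load is just an ordinary append query run against the linked table, roughly along these lines (the table names below are placeholders):

    INSERT INTO tblOrders
    SELECT *
    FROM lnkOrdersCsv;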

Does anyone have any suggestions on how to improve performance when taking data from a linked table connected to a CSV file and appending it to a regular table in Access? Is there a better way to import CSV files?

Your help would be much appreciated.

Many Thanks,
James
Question by:nikez2k4
4 Comments
 
LVL 57

Accepted Solution

by:
Jim Dettman (Microsoft MVP/ EE MVE) earned 500 total points
ID: 24812176
You're going to want to read the CSV into a table using code, then work with it. Right now, working with it as a linked table is more or less what Access is doing anyway, but there is a lot of overhead associated with that.
There are two approaches to reading a file in code:
1. Import into a table with TransferText, then read the table and parse the records.
2. Open the file directly with the Open statement, then use Line Input # to read it line by line.

 I would suggest #2, as TransferText has some length limits.  The sample snippet below reads a file, parses each record into a couple of fields, then saves the record.  What you would need to do would be very similar.

JimD.

    ' Import the file
    ' (pb is a progress-bar object and OpenADORecordset is a small helper that
    '  opens an ADO recordset on the supplied SQL; both come from the original project.)
    Dim rsImport As ADODB.Recordset
    Dim intFileNum As Integer
    Dim strLine As String
    Dim strFileName As String   ' full path of the flat file to import

    pb.SetMessage "Importing Flat File"

    ' Get the import file into a table.
    ' Note: this was originally a TransferText import.  Had to rewrite it because
    ' the record length got too long.
    CurrentDb().Execute "Delete * From tbl_tmpEDIImportPO", dbFailOnError
    OpenADORecordset rsImport, "select * from tbl_tmpEDIImportPO WHERE 1 = 0"

    intFileNum = FreeFile
    Open strFileName For Input As #intFileNum

    With rsImport
        Do While Not EOF(intFileNum)
            ' Read one line and parse its fixed-width fields into the record
            Line Input #intFileNum, strLine
            .AddNew
            ![Tag] = Left(strLine, 3)
            ![EDITradingPartner] = Trim(Mid$(strLine, 4, 20))
            ![PONumber] = Trim(Mid$(strLine, 24, 22))
            ![InputLine] = Mid$(strLine, 46)
            .Update
        Loop
    End With

    rsImport.Close
    Close #intFileNum

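For comparison, approach #1 is basically a one-liner. A minimal sketch, with a placeholder table name and file path, and assuming the CSV has a header row:

    ' Minimal sketch of approach #1: pull the CSV straight into a local table.
    ' The second argument is an optional saved import specification; left blank,
    ' Access falls back to its defaults (or a schema.ini in the same folder, if present).
    DoCmd.TransferText acImportDelim, , "tblOrdersImport", "C:\Imports\orders.csv", True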

 
LVL 49

Expert Comment

by:Gustav Brock
ID: 24812475
If you use the tables directly in your append queries, it will be slow.

You could try to create one query per linked table, a simple "select all fields" query. Make it read-only by marking the query as Snapshot under its properties (the Recordset Type property). This forces Access to cache the data.

Now, use these queries as the sources in your append queries.
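Something along these lines, with placeholder query and table names (the Snapshot setting itself is made in the query's property sheet):

    ' qryCsvOrders is a saved query: SELECT * FROM lnkOrdersCsv;
    ' with its Recordset Type property set to Snapshot.
    ' The weekly append then reads from the query instead of the linked table:
    CurrentDb.Execute "INSERT INTO tblOrders SELECT * FROM qryCsvOrders", dbFailOnError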

/gustav
 
LVL 1

Author Comment

by:nikez2k4
ID: 24821650
Thanks for your help, guys. I am relatively new to VBA, so I wasn't aware of these commands. Just out of curiosity, I tried the TransferText function and was blown away by the speed of it. Creating a linked table and using an append query to import the data took around 5 minutes for 150,000 records, but TransferText did it in 12 seconds!

Jim, I know you mentioned that TransferText has some length limits; could you explain this in more detail? Is it a limit on the whole file, per row, per field, or something else?

Is there any advantage to using something similar to the code you posted rather than the TransferText function?

Thanks again for your input,
James
 
LVL 57
ID: 24822482
James,
  The most fundamental problem is that because it works directly with a table, you have table restrictions in place, i.e. you can't work with more than 255 fields in an import. You're also forced to deal with a single table; if your CSV file needs to be parsed into multiple tables, you need code to do that. Last, it's difficult to control the import process, and you have no opportunity to scrub the data (review and modify it) if need be. You can use a schema.ini file to control the import, but I find it simpler to use code; you have total control over the process that way.
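For reference, a schema.ini file sits in the same folder as the CSV, with one section per file; here is a small sketch using placeholder file and column names:

    [orders.csv]
    Format=CSVDelimited
    ColNameHeader=True
    Col1=OrderID Long
    Col2=TradingPartner Text Width 20
    Col3=PONumber Text Width 22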
JimD.
