Solved

Speed up linked tables in Access

Posted on 2009-07-09
743 Views
Last Modified: 2012-05-07
I have 7 linked tables that take data from CSV files. The trouble is that each file could potentially contain 100,000 records (or maybe more), and it takes Access a few minutes per file to append the data to a table. With 7 tables this can take 15-20 minutes. Normally this wouldn't be a problem; however, the appends need to be done once a week, and I can see this potentially becoming a major issue.

Does anyone have any suggestions on how to increase performance when taking data from a linked table connected to a CSV file and appending it to a regular table in Access? Is there a better way to import CSV files?
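[Editor's note: for reference, the slow approach described above presumably looks something like the sketch below; the table names are hypothetical.]

```vba
' Hypothetical sketch of the current (slow) approach: an append
' query reading from a table linked to a CSV file. Every row is
' pulled through the text ISAM driver on each run, which is where
' the overhead comes from.
Public Sub AppendFromLinkedCsv()
    ' tblCsvLinked is a table linked to the CSV file;
    ' tblData is the permanent Access table.
    CurrentDb.Execute _
        "INSERT INTO tblData ( Field1, Field2 ) " & _
        "SELECT Field1, Field2 FROM tblCsvLinked", dbFailOnError
End Sub
```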

Your help would be much appreciated.

Many Thanks,
James
Question by:nikez2k4
4 Comments
 
Accepted Solution
by: Jim Dettman (Microsoft MVP/ EE MVE) (LVL 57), earned 500 total points
ID: 24812176
You're going to want to read the CSV into a table using code, then work with it. Right now, working with it as a linked table is more or less what Access is doing anyway, but there is a lot of overhead associated with that.
There are two approaches to reading a file in code:
1. Import into a table with TransferText, then read the table and parse the records.
2. Open the file directly with the Open statement, then use Line Input # to read each record.

 I would suggest #2, as TransferText has some length limits.  The sample snippet below reads a file, parses each record into a couple of fields, then saves the record.  What you need to do would be very similar.

JimD.

          ' Import the file
          ' Note: assumes Dim intFileNum As Integer and Dim strLine As String;
          ' pb (a progress bar) and OpenADORecordset are the author's own helpers.
280       pb.SetMessage "Importing Flat File"

          ' Get import file into a table.
          ' Note: Was originally a Text transfer.  Had to rewrite because
          ' the record length got too long.
290       CurrentDb().Execute "Delete * From tbl_tmpEDIImportPO", dbFailOnError
300       OpenADORecordset rsImport, "select * from tbl_tmpEDIImportPO WHERE 1 = 0"
310       intFileNum = FreeFile
320       Open strFileName For Input As #intFileNum
330       With rsImport
340           Do While Not EOF(intFileNum)
350               Line Input #intFileNum, strLine
360               .AddNew
370               ![Tag] = Left(strLine, 3)
380               ![EDITradingPartner] = Trim(Mid$(strLine, 4, 20))
390               ![PONumber] = Trim(Mid$(strLine, 24, 22))
400               ![InputLine] = Mid$(strLine, 46)
410               .Update
420           Loop
430       End With
440       rsImport.Close
450       Close #intFileNum
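[Editor's note: for comparison, approach #1 above (TransferText) can be sketched as follows; the specification, table, and file names are hypothetical, and a saved import specification can be created via the Advanced button in the Import Text wizard.]

```vba
' Approach #1: let Access bulk-load the CSV via TransferText.
Public Sub ImportCsvViaTransferText()
    ' "csvImportSpec" is a hypothetical saved import specification.
    ' Final True argument = first row contains field names.
    DoCmd.TransferText acImportDelim, "csvImportSpec", _
        "tbl_tmpImport", "C:\Data\weekly.csv", True
End Sub
```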

 


 
Expert Comment
by: Gustav Brock (LVL 49)
ID: 24812475
If you use the tables directly in your append queries, it will be slow.

You could try creating one query per table, a simple "select all fields" query. Make the query read-only by setting its Recordset Type property to Snapshot. This will force the data to be cached by Access.

Now, use these queries as sources in your update queries.
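[Editor's note: a minimal sketch of this suggestion, assuming a hypothetical linked table tblCsvLinked, a saved snapshot query qrySnapCsv, and a target table tblData.]

```vba
' Append from a snapshot-type saved query over the linked CSV
' table rather than from the linked table directly.
Public Sub AppendViaSnapshotQuery()
    Dim db As DAO.Database
    Set db = CurrentDb()
    ' qrySnapCsv is a saved "SELECT * FROM tblCsvLinked" query
    ' whose Recordset Type property is set to Snapshot in query
    ' design view, so Access caches the data.
    db.Execute _
        "INSERT INTO tblData SELECT * FROM qrySnapCsv", dbFailOnError
End Sub
```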

/gustav
 
Author Comment
by: nikez2k4 (LVL 1)
ID: 24821650
Thanks for your help, guys. I am relatively new to VBA, so I wasn't aware of these commands. Just out of curiosity I tried the TransferText function and was blown away by its speed. Creating a linked table and using an append query to import the data took around 5 minutes for 150,000 records, but the TransferText function did it in 12 seconds!

Jim, I know you mentioned that TransferText has length limits, could you explain this in more detail? Is it a length limit on the actual file, per row, per field or something else?

Is there any advantage to using something similar to the code you posted rather than the TransferText function?

Thanks again for your input,
James
 
Expert Comment
by: Jim Dettman (Microsoft MVP/ EE MVE) (LVL 57)
ID: 24822482
James,
  The most fundamental problem is that because it is working directly with a table, you have table restrictions in place, i.e. you can't work with more than 255 fields in an import.  You're also forced to deal with a single table; if your CSV file needs to be parsed into multiple tables, you need code to do that.  Last, it's difficult to control the import process, and you have no opportunity to scrub the data (review and modify it) if need be.  You can use a schema.ini file to control the import, but I find it simpler to just use code, which gives you total control over the process.
JimD.
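[Editor's note: for reference, a schema.ini file is placed in the same folder as the CSV and controls how the text driver reads it. The file name and columns below are hypothetical.]

```ini
[weekly.csv]
Format=CSVDelimited
ColNameHeader=True
MaxScanRows=0
Col1=OrderID Long
Col2=CustomerName Text Width 50
Col3=OrderDate DateTime
```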



