Speed up linked tables in Access

I have 7 linked tables that take data from CSV files. The trouble is each file could potentially contain 100,000 records (or maybe more), and it takes Access a few minutes per file to append the data to a table. With 7 tables this can take 15-20 minutes. Normally this wouldn't be a problem, but the appends need to be done once a week, and I can see this potentially becoming a major issue.

Does anyone have any suggestions on how to improve performance when taking data from a linked table connected to a CSV file and appending it to a regular table in Access? Is there a better way to import CSV files?

Your help would be much appreciated.

Many Thanks,
James
nikez2k4 asked:

Jim Dettman (Microsoft MVP / EE MVE), President/Owner, commented:
You're going to want to read the CSV into a table using code, then work with it. Right now, working with it as a linked table means Access is more or less doing that anyway, but with a lot of overhead.
There are two approaches to reading a file in code:
1. Import it into a table with TransferText, then read the table and parse the records.
2. Open the file directly with the Open statement, then use Line Input # to read it.

 I would suggest #2, as TransferText has some length limits. The sample snippet below reads a file, parses each record into a couple of fields, then saves the record. What you would need to do would be very similar.

JimD.

          ' Import the file.
          ' pb.SetMessage and OpenADORecordset are app-specific helpers:
          ' a progress-bar message and a routine that opens an ADO recordset.
          Dim rsImport As ADODB.Recordset
          Dim intFileNum As Integer
          Dim strLine As String

          pb.SetMessage "Importing Flat File"

          ' Get the import file into a table.
          ' Note: this was originally a TransferText import; it had to be
          ' rewritten because the record length got too long.
          CurrentDb().Execute "DELETE * FROM tbl_tmpEDIImportPO", dbFailOnError
          OpenADORecordset rsImport, "SELECT * FROM tbl_tmpEDIImportPO WHERE 1 = 0"
          intFileNum = FreeFile
          Open strFileName For Input As #intFileNum
          With rsImport
              Do While Not EOF(intFileNum)
                  Line Input #intFileNum, strLine
                  .AddNew
                  ![Tag] = Left$(strLine, 3)
                  ![EDITradingPartner] = Trim$(Mid$(strLine, 4, 20))
                  ![PONumber] = Trim$(Mid$(strLine, 24, 22))
                  ![InputLine] = Mid$(strLine, 46)
                  .Update
              Loop
          End With
          rsImport.Close
          Close #intFileNum
 
Gustav Brock, CIO, commented:
If you use the tables directly in your append queries, it will be slow.

You could try creating one query per table, a simple "select all fields" query, and make it read-only by setting the query's Recordset Type property to Snapshot. This forces the data to be cached by Access.

Now, use these queries as sources in your append queries.
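A minimal sketch of that setup, with hypothetical table and query names (the Snapshot setting itself is made in the saved query's property sheet):

```vba
' Hypothetical names. qryCsvCache is a saved "SELECT * FROM tblLinkedCsv1;"
' query whose Recordset Type property is set to Snapshot, so Access reads
' the linked CSV once into a cache rather than row by row during the append.
CurrentDb.Execute "INSERT INTO tblLocal1 SELECT * FROM qryCsvCache;", dbFailOnError
```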

/gustav
nikez2k4 (Author) commented:
Thanks for your help, guys. I am relatively new to VBA, so I wasn't aware of these commands. Just out of curiosity I tried the TransferText function and was blown away by its speed. Creating a linked table and using an append query to import the data took around 5 minutes for 150,000 records, but the TransferText function did it in 12 seconds!

Jim, I know you mentioned that TransferText has length limits; could you explain this in more detail? Is it a limit on the whole file, per row, per field, or something else?

Is there any advantage to using something like the code you posted rather than the TransferText function?

Thanks again for your input,
James
Jim Dettman (Microsoft MVP / EE MVE), President/Owner, commented:
James,
  The most fundamental problem is that because TransferText works directly with a table, table restrictions apply, i.e., you can't import more than 255 fields. You're also forced to deal with a single table: if your CSV file needs to be parsed into multiple tables, you need code for that anyway. Last, it's difficult to control the import process, and you have no opportunity to scrub the data (review and modify it) if need be. You can use a schema.ini file to control the import, but I find it simpler just to use code, which gives you total control over the process.
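For reference, a minimal TransferText import might look like the sketch below (the table and file names are hypothetical). An optional schema.ini file placed in the same folder as the CSV can pin down the delimiter, column names, and data types for the import.

```vba
' Hypothetical table and file names. The final True tells Access that the
' first row of the CSV contains field names.
DoCmd.TransferText acImportDelim, , "tblWeeklyImport", "C:\Data\weekly.csv", True
```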
JimD.
Microsoft Access