MS Access Performance Question: How to save 1000 rows

MS Access (Office365) performance question

I have a process which requires saving up to 1000 records to an MS SQL Server back-end database.
Currently I collect the data for each record and then run an INSERT INTO statement, once for each record.
This process takes a very long time, up to 7 minutes, even on very fast Windows 10 computers on a Windows Server 2016 network; the SQL Server instance itself runs on a 3 GHz Windows 7 workstation with 32MB.

Is there a faster way to save 1000 rows than running INSERT INTO once for each row?


Dale Fye (Owner, Dev-Soln LLC) commented:
you could try something like:
INSERT INTO yourSQLTable (field1, field2, ..., fieldN)
SELECT TOP 1000 field1, field2, ..., fieldN
FROM yourAccessTable
ORDER BY yourAccessTable.CreatedDT DESC


but this will only work if you have a field similar to [CreatedDT] which defaults to Now().
I'm not certain it will be any faster, and keeping the loop lets you inform your user of the operation's progress and, if this is related to your "Stop" button issue, lets you cancel the operation before it completes.

Another way to do this might be to open the SQL Server table as a recordset and simply use rs.AddNew,..., rs.Update code inside your loop.
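Dale's recordset approach could look something like this minimal sketch; the table and field names are placeholders, the SQL Server table is assumed to be linked, and dbSeeChanges is required when the linked table has an identity column:

```vba
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset("LinkedSQLTable", dbOpenDynaset, dbSeeChanges)
Do While HaveMoreRows()      'hypothetical loop condition
    rs.AddNew
    rs!Field1 = strData1     'assign each collected value
    rs!Field2 = strData2
    rs.Update
Loop
rs.Close
```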
Gustav Brock (CIO) commented:
First, use DAO to open a Recordset, and use the AddNew and Update methods for each record to write the data to a local table.
This is superfast.

Then use a query like Dale's above to export the table to SQL Server.
John Tsioumpris (Software & Systems Engineer) commented:
You have 2 options, both using pass-through queries:
1st. You can insert multiple rows at once: essentially you prepare your VALUES clauses (probably store them in a collection), iterate over them in chunks, and create one big string. The only catch is that the string has a limit of around 64K characters, so you should check its length as you go (for example, you could insert 50, 100, or 200 rows at once, depending on their size). Tested and working in a similar case.
2nd. There is also the option of BULK INSERT, where you export your query to a CSV file and SQL Server reads it and performs the import all at once. I haven't tried it, but it should be the fastest possible insert method: just output your records to a .csv in a shared folder and let SQL Server do the hard work.
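A sketch of the second option, sent through an action pass-through query; the query name, table, and share path are placeholders, and the SQL Server service account must be able to read the share:

```vba
Dim strSQL As String
strSQL = "BULK INSERT dbo.yourSQLTable " & _
         "FROM '\\server\share\export.csv' " & _
         "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"
With CurrentDb.QueryDefs("qptAction")   'existing action pass-through query
    .SQL = strSQL
    .ReturnsRecords = False
    .Execute dbFailOnError
End With
```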
To expand upon John's comment, you can use a pass-through query and structure your T-SQL to insert multiple rows in a single operation.
INSERT INTO tablename (colname1, colname2, ..., colnameN)
VALUES
  (value1, value2, ..., valueN),
  (value1a, value2a, ..., valueNa),
  (value1b, value2b, ..., valueNb),
  (value1c, value2c, ..., valueNc),
  (value1d, value2d, ..., valueNd)


You still have a 64k character limit, but you should be able to prevent SQL Server from performing a commit for each row.
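On the Access side, a similar effect (one commit for the whole batch instead of one per row) can be had by wrapping the loop in a DAO workspace transaction; a minimal sketch, assuming the per-row inserts happen inside the commented section:

```vba
Dim ws As DAO.Workspace
Set ws = DBEngine.Workspaces(0)
ws.BeginTrans
'... run the per-row AddNew/Update or Execute calls here ...
ws.CommitTrans    'one commit for the entire batch
```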
pcalabria (Author) commented:
@John @Akimark  Thanks, I half understand your suggestion. I plan to try Dale's/Gustav's approach first because I understand it and it will be easy to implement... Hopefully you can help out if it doesn't work!

@Dale @Gustav  Let me ask a question to see if I understand... I think it's been suggested that I open an MS Access table on the local machine with the same structure, append records (up to 1000) one at a time to that temp table using DAO .AddNew, and then, after all the rows are created, append the entire Access table to the SQL table... Is that right?

INSERT INTO yourSQLTable (field1, field2, ..., fieldN)
SELECT TOP 1000 field1, field2, ..., FieldN
FROM yourAccessTable

I would then delete the local access temp table.

Is that the process you think would be much faster???

John Tsioumpris (Software & Systems Engineer) commented:
I reckon that the solution is quite easy to implement...
Just make a test and manually prepare your INSERT for, e.g., 20 records...
Make a simple iteration:
Dim rst As DAO.Recordset
Dim qdf As DAO.QueryDef
Dim strVALUES As String
Dim strSQL As String
Dim counter As Integer
Dim i As Integer

strSQL = "INSERT INTO yourSQLTable (field1, field2, ..., fieldN) VALUES "
'Remember that action pass-through queries have the property Returns Records = False
Set qdf = CurrentDb.QueryDefs("YourPassThroughQuery")
Set rst = CurrentDb.OpenRecordset("YourQuery")

With rst
    While Not .EOF
        strVALUES = strVALUES & "("
        For i = 0 To .Fields.Count - 1
            strVALUES = strVALUES & "'" & .Fields(i) & "',"
        Next i
        strVALUES = Left(strVALUES, Len(strVALUES) - 1) & "),"
        counter = counter + 1
        'Here you need some check so that you don't exceed 64K,
        'e.g. flush the batch every 100 rows: If counter = 100 Then ...
        .MoveNext
    Wend
End With

strSQL = strSQL & Left(strVALUES, Len(strVALUES) - 1)  'drop the trailing comma
qdf.SQL = strSQL
qdf.Execute dbFailOnError


Gustav Brock (CIO) commented:
Is that the process you think would be much faster???

Action queries are always faster than VBA loops.  Your process is especially slow because not only are you using a VBA code loop, you are creating and running a separate append query for each row, so you are running 1000 queries.  Access has to analyze the SQL and create an execution plan for each one before running it.  As you can see, this is quite expensive time-wise.

You could do as the others have suggested and stick with the code loop, but instead open a recordset and use the .AddNew method to add each new record.  This will be much faster.

But faster still, and with far less code, would be to simply run an append query that selects the records from whatever table they are currently in and appends them to the target table.  If the data is in a linked spreadsheet, that works fine, since Access treats local tables, linked tables, and even queries the same for this purpose.
pcalabria (Author) commented:
Experts... I'm working on this now but have some questions...

1. Step 1: The data is currently stored on our MS SQL Server. It's my understanding that I should create an MS Access table with the same structure on my local machine... this means I will have an MDB file on my local hard drive. (I'll use a TEMP directory.)
First issue I encountered... how do I create an MS Access table with the same structure as the MS SQL table? Do I have to manually recreate it? I don't think there is a downsizing wizard, is there?

2.  Next I would need to initialize stuff before the loop.
My code would look something like this. Is this correct?
Dim oRS As DAO.Recordset
Dim oDB As DAO.Database
Set oDB = OpenDatabase("C:\Temp\MyDB.MDB")
Set oRS = oDB.OpenRecordset("SELECT * FROM MyTable")

3.  Before I start to loop, I need to delete all the rows in the temp table.

oDB.Execute "DELETE * FROM MyTable"

4.  My code starts to loop, populating ten variables with data.  It will loop up to 1000 times.  Each time through the loop I will use the .AddNew method to add a record to the local table.  This means the code below executes up to 1000 times.

With oRS
        .AddNew
        !Field1 = strData1
        !Field2 = strData2
        '...
        .Update
End With

5.  After I finish looping, I now have up to 1000 rows in the temp table that need to be added to the table in the SQL database.
This is where it gets confusing.  I think I need to use the approach offered by @John and @...mark?
Either way, it seems like it's an extra operation????

Gustav Brock (CIO) commented:
1. You can run an import where you import the structure only, not the data.

2. - 4. OK

5. Run an append query with the temp table as source and the linked SQL Server table as target.
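Step 1 (structure-only import) can also be scripted; a sketch using DoCmd.TransferDatabase, whose final StructureOnly argument copies the table definition without the data (the path and table names are placeholders):

```vba
'Copy only the structure of MyTable into the temp database
DoCmd.TransferDatabase acExport, "Microsoft Access", _
    "C:\Temp\MyDB.MDB", acTable, "MyTable", "MyTable", True
```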
John Tsioumpris (Software & Systems Engineer) commented:
I will give an outline since I am not near my workstation.
You prepare a pass-through query that has the structure of the INSERT, with a dummy string for the VALUES.
Save it with an easy name and don't forget... Returns Records = False... it's an action pass-through.
Go to the table whose values you want to push to the SQL table.
Iterate and create the big string of VALUES... declare a qdf... set it to the action pass-through... take its existing SQL and replace the dummy string with the big VALUES string... set the qdf's SQL to the final SQL string... execute it... job done.
pcalabria (Author) commented:
@John  Thanks, but this is a lot of new territory for me... I've never used a pass-through query before... not sure what an action pass-through is... not sure what iterate means... or declare a qdf???? Wow... when I chat with someone like you I realize how little I know... I'll read up and give it a try... just wanted you to know I appreciate your response... and of course all the experts who are helping... I'll get back to you when I get a chance to tackle this again... it shouldn't be too long. Thanks
Gustav Brock (CIO) commented:
For 1000 records, I wouldn't bother.
It will take little time for you to expand your routine to write the data to a local table using DAO, and even less to create the append query to export the data.
pcalabria (Author) commented:
@Gustav  Thanks... This will be easy for me to try... :-)

Pat commented:
I still don't know where the 1000 rows are coming from.  Are users typing the data in using a form?  Are they importing a spreadsheet or CSV file?

If the data starts out in some machine-readable format, you can link to the source file.  You also link to the SQL Server database.  Then, using an append query, you select the data from the linked source file and append it to the table on the server.  Only one line of code is required to run the append query.

The clearer you are in your description, the better and simpler our answers will be.
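That one line could look like this sketch; LinkedSourceFile and LinkedSQLTable are placeholder names for the linked source and destination tables, and dbSeeChanges is needed when the destination has an identity column:

```vba
CurrentDb.Execute _
    "INSERT INTO LinkedSQLTable SELECT * FROM LinkedSourceFile", _
    dbFailOnError + dbSeeChanges
```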
pcalabria (Author) commented:
@Pat and all... My code copies the HTML code from a webpage at a specific URL.  The webpage has a list of 1000 rows of data.  Each row has information on a specific part number, such as a quantity, price, source, and so forth.  My code saves this webpage to a file.  Next I read the file into a string variable.  My routine then loops 1000 times, collecting the information from one row at a time.  For example, first time through it collects info on part number 12345: 1000 in stock, $1 each, yada yada.  When the code gets to the bottom of the loop, it saves this data as a row in my SQL table.  Then it starts over and collects row 2, saves, then gets row 3, etc.

@Gustav  I made changes to the routine as suggested.  I now use .AddNew to save each row to a table in a new database.  The new database is on the same hard drive as the program code.  It is linked as a BE.  My code starts looping, extracting the data it needs from the string variable.  At the end of each loop, I use .AddNew to append the row to the local table.  When all the rows are captured, I use RunSQL to append the contents of the local table to the SQL table I use to store everything.  Once appended, I empty the local table, perform a C&R on the empty database, then begin the process again to capture information for the next SKU.  There are 100,000 SKUs that I monitor.  With the changes you discussed implemented, it still takes 8 minutes to save 1000 records, which is still way too slow.  That would be 800,000 minutes (555 days) to complete one series of captures.  Fortunately, it doesn't take nearly this long, because most SKUs have fewer than 1000 rows... the average SKU has only 100 rows... but that is still way too slow.  Much of the time is spent extracting the data, i.e. string manipulation: InStr, Mid, Left, etc.  The append query takes about 30 seconds for 1000 records.

Am I paying a performance penalty for having the table in the BE?  The MDB file is 350MB, so I don't think it would be practical to include it in the FE.
Gustav Brock (CIO) commented:
Hmm, that's about half a second per row, which seems quite a lot, but without either data or code it is hard to tell how to optimise.
Do you really have to parse the complete page for each SKU? Most pages can be extracted to some kind of table, so you only have to loop it once.
If you can supply a URL for download, I would be happy to have a look.

A linked table is usually very fast, especially if it is local.

As a side note, the parsing and the whole process would probably be much, much faster in .Net - by a factor of ten or more.

John Tsioumpris (Software & Systems Engineer) commented:
Based on my initial posts, here is how someone handled pretty much the same case.
Inserting in chunks will get you up and running much, much faster... each individual INSERT presents a major bottleneck in the whole operation. You just need to take advantage of the fact that Access (along with most other languages) does string/array handling quite fast compared to communicating with a third-party engine and pushing data.
Picture it like this:
For a single-row insert, Access has to open a communication channel with SQL Server... issue the INSERT... return back.
With the "chunk" it does just the same, but instead of INSERTing a single row you insert 100 or 200... or whatever the 64K string block can take... in just a bit more time.
And I have to slightly disagree with Gustav: if it's done correctly (minus the multithreading capability of .NET), Access will also be quite fast.
pcalabria (Author) commented:
<<Most pages can be extracted to some kind of table, so you just have to loop it once.>>
I only loop through the code once.  Each time I extract a row, I write one row to the local table.  When I reach the end of the table, I trigger the append query that saves the local table to the SQL table.  That's the process that takes about 8 minutes for 1000 rows.

Most of the time is taken up extracting the information.  Saving each row to the local table is indeed fast.  Appending the local table to the SQL table may take 30-60 seconds, but that's not much considering the whole process takes 8 minutes.

@John  Thanks, but it looks like most of the time is being spent extracting the data from the page.  For a page with 1000 line items, it takes about 5 seconds to download the page to a file, 8 minutes to extract the data, and at this point about 30 seconds to append to the SQL table... I need to reduce the 8 minutes... once that's optimized, I will take a look at your comments again and work them in.

<<As a side note, the parsing and the whole process would probably be much, much faster in .Net - by a factor of ten or more.>>
Thanks, but I'm 64 years old... by the time I learn .net I'll probably be retired!  LOL (nice dream at least)
John Tsioumpris (Software & Systems Engineer) commented:
Now it's much clearer... it's not the pushing that causes the slowdown but the parsing...
For this we need some more solid data... as Gustav mentioned, taking a look at what you see...
Can you give some info on what kind of file you are saving the data in (HTML?) and how you parse the data...
It's true that Access and the Internet are not good partners, but .html in the end is a simple text file... maybe you need to read it a bit more cleverly.
pcalabria (Author) commented:
My problem was performance.  The solution was to use the HTML import feature of Access, which lets me import the entire table... instead of one field at a time...

This will take me some time to recode, but I appreciate everyone's help.  Gustav, you made my day, week, and month!  Thanks.
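That HTML import can be scripted as well; a minimal sketch, where the staging table name and file path are placeholders:

```vba
'Import the HTML table from the saved page into a local staging table
DoCmd.TransferText acImportHTML, , "tblRawRows", "C:\Temp\page.html", True
```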
Gustav Brock (CIO) commented:
You are welcome!
As I suggested, once you get the data into a table, it works a lot faster.  Keep in mind as you are recoding that a table IS an array, so use the table rather than an array in your loop; or, better still, try to update the SQL Server directly so you don't have to process the same set of data multiple times.