  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 233

MS SQL to VBA-Excel large data dump

I need to insert 250,000 records into an Excel sheet from a MS SQL database.
I use a VBA ADODB connection.
I do this frequently.
It now takes almost 4 minutes to run, which is way too long.

Here is my Sub; is there any way to speed it up?

'Query DB and populate data in specified worksheet starting row/Col in active workbook
Sub SQLSrvDB_LowLevel_ReadRecSet(strSQL As String, strWksh As String, intstartRow As Integer, intStartCol As Integer)

  Dim time1, time2, time3, time4    'Used for timing response time
  Dim destRng As Range
  
  'Instantiate Record Set
  Set privADORecSet = New ADODB.Recordset
  
  'Query DB
  On Error Resume Next
  time1 = Time
  privADORecSet.Open Source:=strSQL, ActiveConnection:=privADOConnDB, CursorType:=adOpenStatic, LockType:=adLockReadOnly
  If Err.Number <> 0 Then MsgBox Err.Description
  On Error GoTo 0    'Don't leave errors silently swallowed for the rest of the Sub
  time2 = Time
  
  'Populate RawData worksheet
  time3 = Time
  Set destRng = Range(Worksheets(strWksh).Cells(intstartRow, intStartCol), _
                      Worksheets(strWksh).Cells(intstartRow + privADORecSet.RecordCount - 1, _
                                                intStartCol + privADORecSet.Fields.Count - 1))
  destRng.CopyFromRecordset privADORecSet
  time4 = Time
  
  'Close the objects
  privADORecSet.Close
  
  'Recover memory
  Set privADORecSet = Nothing
  Set destRng = Nothing
  
End Sub


Asked by CABRLU63
2 Solutions
 
Simon commented:
I assume you've done the obvious and set Application.Calculation = xlCalculationManual before you start?
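For reference, a minimal wrapper along those lines (a sketch only: the Sub name, query text, and sheet name are illustrative, and it assumes your module-level connection is already open):

'Sketch: suspend recalculation and screen redraws around the dump,
'restoring them afterwards even if the dump fails mid-way.
Sub RunDataDump()
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    On Error GoTo CleanUp
    SQLSrvDB_LowLevel_ReadRecSet "SELECT * FROM dbo.SomeTable", "RawData", 1, 1
CleanUp:
    'Always restore settings, whether or not the dump succeeded
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub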

you might save a bit of time by just specifying the starting call for the .copyfromrecordset, so that it's not evaluating the whole recordset before it starts copying...

'Set destRng to a single starting cell; CopyFromRecordset sizes the output itself
Set destRng = Worksheets(strWksh).Cells(intstartRow, intStartCol)
destRng.CopyFromRecordset privADORecSet



Have you compared the length of time it takes to push the same set of data using the data import/export wizard?

Have you tried writing the data to a file on the local hard drive of the SQL server?

I write large recordsets (containing a lot of varchar columns) and it does take a while, especially to network shares, but 4 minutes for 250k records is a bit slow.

How large is the resulting file and how fast is your LAN?
 
Ray, Data Analyst, commented:
Have you connected to your SQL Server with a query tool like SQL Server Management Studio and run your query there? If not, do so and see how long it takes. If the times are within a few seconds of each other, the bottleneck is your SQL Server. If it is fast through the query tool, then the issue is likely within your VBA and Excel, as Simon mentioned.
 
SimonCommented:
Just spotted a typo in my earlier comment...
you might save a bit of time by just specifying the starting call for the .copyfromrecordset, so that it's not evaluating the whole recordset before it starts copying...
Should read
you might save a bit of time by just specifying the starting cell for the .copyfromrecordset, so that it's not evaluating the whole recordset before it starts copying...

 
CABRLU63 (Author) commented:
Yeah, the same dataset runs in 20 seconds in MS SQL Server.

I've added your code and it sped things up by 11%.

I'm on a 2 Mbps down / 4 Mbps up network.

I don't believe I can put that data on the server's drive.

The output file is 77 MB.

I'm actually inserting that data into the RawData tab, and later my code refreshes 10 pivots.
 
Ray, Data Analyst, commented:
When you ran it via MS SQL Server, did you do that from your local machine (where you're running Excel), or directly on the same box as the SQL Server? I ask because at 2 Mbps down it should take a while to receive 77 MB of data at your workstation: 77 MB is about 616 Mbit, and 616 Mbit / 2 Mbps is roughly 308 seconds, i.e. over 5 minutes just to move the bytes. The actual data transferred is presumably somewhat less than the file size, since there is overhead in the file format, hence why it only takes 4 minutes.

Sorry if I'm 'barking up the wrong tree' here.
 
Simon commented:
@Ray, I think we're on the same page on this one.

The query may complete in 20 seconds, but the significant comparison from SSMS is the time to export the results to a flat file on the workstation or a network share.

If the bottleneck is simply shifting the bytes, the only remedy (other than a faster network) is to look hard at whether all of those bytes are truly necessary: trim the number of rows or columns in the result set if possible, schedule the export, or split it so that you only move the volatile data. E.g. if it's a daily update, can you get away with joining live data for the last day with previously downloaded data for the other 29 days of the month?
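A sketch of that split idea, assuming the source table has a load-date column (the table, column, and worksheet names here are all hypothetical):

'Sketch: re-download only the volatile slice (the last day) and append it
'below the rows cached from earlier runs. Names are illustrative.
Sub RefreshVolatileSlice()
    Dim strSQL As String
    Dim lngNextRow As Long

    'Only rows loaded in the last day; older data stays in the workbook
    strSQL = "SELECT * FROM dbo.RawDataTable " & _
             "WHERE LoadDate >= DATEADD(day, -1, CAST(GETDATE() AS date))"

    'First empty row below the previously downloaded data
    With Worksheets("RawData")
        lngNextRow = .Cells(.Rows.Count, 1).End(xlUp).Row + 1
    End With

    SQLSrvDB_LowLevel_ReadRecSet strSQL, "RawData", lngNextRow, 1
End Sub

One caveat: the original Sub declares intstartRow As Integer, which overflows above row 32,767; declaring it As Long is safer if the appended block starts deep in the sheet.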
