Solved

Execute SQL Task Extremely Slow

Posted on 2014-12-04
12
199 Views
Last Modified: 2016-02-15
I have an Execute SQL Task in SSIS and it performs poorly.  At any given time it's inserting between 500 and 50K records, and each run takes over 2 minutes.  Is there a way to improve the performance?  I'm using an ADO.NET connection.  Any suggestions and help are greatly appreciated.
0
Comment
Question by:LeVette Alexander
12 Comments
 
LVL 65

Expert Comment

by:Jim Horn
ID: 40481619
I'll go out on a limb here and state that it would greatly help if you could copy-paste the T-SQL that it's executing into this question.  And if it's just a SP/function name, copy-paste that SP/function.
0
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 40481753
Be sure to do all the INSERTs as part of a single transaction rather than using a separate transaction for each INSERT; all separate trans would be massive overhead.
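A minimal sketch of that approach (table and column names are placeholders taken from the discussion; the source table name is an assumption):

```sql
-- Wrap the whole load in a single transaction instead of letting
-- each INSERT commit individually; one commit amortizes the log overhead.
BEGIN TRANSACTION;

INSERT INTO TableA (ColumnA, ColumnB, ColumnC)
SELECT ColumnA, ColumnB, ColumnC
FROM SourceTable;  -- hypothetical source

COMMIT TRANSACTION;
```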
0
 
LVL 12

Expert Comment

by:James Elliott
ID: 40482418
How many records in your table and is it indexed in any way?
0
 
LVL 46

Expert Comment

by:Vitor Montalvão
ID: 40482561
There's not enough information in this question.
Where is the data coming from?
0
 

Author Comment

by:LeVette Alexander
ID: 40482746
The code is simple; it's a stored procedure that resembles the below:

INSERT INTO TableA
 (ColumnA, ColumnB, ColumnC)

SELECT ColumnA, ColumnB, ColumnC FROM Table

The table has 500 million records and is indexed.  The data comes from other tables and is inserted into one large table.  There is one clustered index and 8 nonclustered indexes, with the fill factor set to 50 for each index.

My assumption was that I could quickly insert records into the table using the SSIS Execute SQL Task.
0
 
LVL 46

Expert Comment

by:Vitor Montalvão
ID: 40482807
Is TableA being recreated each time the package runs, or does it already exist and you are only appending records?
Were the indexes created on TableA or on the source table?
0
 

Author Comment

by:LeVette Alexander
ID: 40483009
The table (TableA) already exists (500 million records).  I'm appending records.  The indexes are on TableA.
0
 
LVL 46

Accepted Solution

by:
Vitor Montalvão earned 250 total points
ID: 40483027
OK, that explains the time it's taking.
Without an archiving solution or a purge process, the only option I can see for now is to partition TableA.
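As a rough sketch, partitioning TableA might look like the following. This assumes a date column (here called LoadDate, which is hypothetical) is a sensible partition key; the boundary values and filegroup layout would need to match the real data:

```sql
-- Hypothetical sketch: range-partition TableA by a LoadDate column.
CREATE PARTITION FUNCTION pfTableA (date)
AS RANGE RIGHT FOR VALUES ('2014-01-01', '2014-07-01', '2015-01-01');

CREATE PARTITION SCHEME psTableA
AS PARTITION pfTableA ALL TO ([PRIMARY]);

-- Rebuild the clustered index on the partition scheme to move the data.
CREATE CLUSTERED INDEX IX_TableA_LoadDate
    ON TableA (LoadDate)
    WITH (DROP_EXISTING = ON)
    ON psTableA (LoadDate);
```

With the table partitioned, new rows land in the most recent partition, so inserts only touch a fraction of the 500 million rows' index pages.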
0
 

Author Comment

by:LeVette Alexander
ID: 40483036
Thanks, Vitor.  I guess I'm just stuck until we partition the table.
0
 
LVL 65

Assisted Solution

by:Jim Horn
Jim Horn earned 250 total points
ID: 40483037
A couple of random thoughts, as 500 million rows is a lot, even with three columns and indexes.

Are there existing rows in the target table?  If so, you might be better served by partitioning that table and then inserting into a single partition, which will speed things up.

Is it an option to drop all indexes on the target table, do your INSERT, then re-create those indexes?
This would eliminate the overhead of also having to maintain the index(es) as you're inserting data.
You might want to throw this into a loop so that it inserts only 50K / 100K / whatever-K rows at a time, just in case there is a limiting buffer somewhere.  A side benefit is that if every loop iteration writes its results into a 'log' table, then when the job is executing on a schedule (i.e. not in the designer) you can view current progress by querying that table.
Is doing this in a data pump an option?  SSIS data pumps can easily 'stream' this data to avoid any buffers.
In your SSIS package, are you running this multi-threaded, and can you max out the number of threads on the processor?
Can whatever process populates the source table be rewritten to populate the target table directly, eliminating the 'intermediary hop' of the source table?
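The drop-indexes-then-reload idea above might be sketched like this (index, column, and source-table names are placeholders, and with 500 million existing rows, rebuilding the indexes afterward is itself a significant cost to weigh):

```sql
-- Drop the nonclustered indexes so the INSERT only maintains
-- the clustered index (repeat for each of the 8 indexes).
DROP INDEX IX_TableA_ColumnB ON TableA;

-- Do the bulk load with only the clustered index in place.
INSERT INTO TableA (ColumnA, ColumnB, ColumnC)
SELECT ColumnA, ColumnB, ColumnC
FROM SourceTable;  -- hypothetical source

-- Re-create the index afterward, keeping the original fill factor.
CREATE NONCLUSTERED INDEX IX_TableA_ColumnB
    ON TableA (ColumnB)
    WITH (FILLFACTOR = 50);
```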
0
 

Author Comment

by:LeVette Alexander
ID: 40483100
Jim, there is only one table.  I'm not using SSIS in the traditional manner; I'm simply inserting records into one table.  I think batching might be the answer, even without SSIS.

Thanks for your insight.
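A batched load could be sketched as below. This assumes the source has a monotonically increasing key (Id) and that a LoadLog progress table exists; both names are hypothetical:

```sql
-- Insert in 50K-row batches, logging progress after each batch.
DECLARE @LastId bigint = 0, @Rows int = 1;

WHILE @Rows > 0
BEGIN
    -- Pull the next batch in key order.
    SELECT TOP (50000) Id, ColumnA, ColumnB, ColumnC
    INTO #batch
    FROM SourceTable
    WHERE Id > @LastId
    ORDER BY Id;

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
    BEGIN
        INSERT INTO TableA (ColumnA, ColumnB, ColumnC)
        SELECT ColumnA, ColumnB, ColumnC FROM #batch;

        SELECT @LastId = MAX(Id) FROM #batch;

        -- Record progress so a scheduled run can be monitored.
        INSERT INTO LoadLog (BatchEndId, RowsInserted, LoadedAt)
        VALUES (@LastId, @Rows, SYSDATETIME());
    END;

    DROP TABLE #batch;
END;
```

Each batch commits on its own, which keeps individual transactions (and log growth) small while still avoiding per-row overhead.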
0
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 40486184
>> There is one clustered index and 8 non clustered indexes with the fill factor set to 50 for each index. <<

With only 3 or 4 columns in the table, that's an extremely high number of indexes.  Verify that you have the best clustering key and that a fill factor that low is really warranted, especially on the clustered index.  You're more than doubling the size of that table when (if) you rebuild it with a fill factor that low.
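A quick way to check what's actually configured, assuming the table is dbo.TableA:

```sql
-- List each index on TableA with its fill factor and on-disk size.
SELECT i.name          AS index_name,
       i.type_desc,
       i.fill_factor,
       SUM(ps.used_page_count) * 8 / 1024 AS size_mb  -- 8 KB pages
FROM sys.indexes AS i
JOIN sys.dm_db_partition_stats AS ps
  ON ps.object_id = i.object_id
 AND ps.index_id  = i.index_id
WHERE i.object_id = OBJECT_ID('dbo.TableA')
GROUP BY i.name, i.type_desc, i.fill_factor;
```

If the nonclustered indexes dominate the total size, dropping any that aren't used by queries would directly reduce the per-insert overhead.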
0
