Solved | Private

Execute SQL Task Extremely Slow

Posted on 2014-12-04
202 Views
Last Modified: 2016-02-15
I have an Execute SQL Task in SSIS and it performs poorly.  At any given time it's inserting between 500 and 50K records, and it's taking over 2 minutes each time.  Is there a way to improve the performance?  I'm using an ADO.NET connection.  Any suggestions and help are greatly appreciated.
Question by:LeVette Alexander
12 Comments
 
LVL 65

Expert Comment

by:Jim Horn
ID: 40481619
I'll go out on a limb here and state that it would greatly help if you could copy-paste the T-SQL that it's executing into this question.  And if it's just a SP/function name, copy-paste that SP/function.
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 40481753
Be sure to do all the INSERTs as part of a single transaction rather than using a separate transaction for each INSERT; separate transactions for each one would add massive overhead.
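For example, if the task issues many separate INSERT statements, wrapping them in one explicit transaction looks roughly like this (the statements and values are placeholders, not from the question):

BEGIN TRANSACTION;

    INSERT INTO TableA (ColumnA, ColumnB, ColumnC) VALUES (1, 2, 3);
    INSERT INTO TableA (ColumnA, ColumnB, ColumnC) VALUES (4, 5, 6);
    -- ... all remaining inserts ...

COMMIT TRANSACTION;   -- one commit for the whole batch instead of one per INSERT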
 
LVL 12

Expert Comment

by:James Elliott
ID: 40482418
How many records are in your table, and is it indexed in any way?
 
LVL 49

Expert Comment

by:Vitor Montalvão
ID: 40482561
There's not enough information in this question.
Where is the data coming from?
 

Author Comment

by:LeVette Alexander
ID: 40482746
The code is simple - it's a stored procedure that resembles the one below:

INSERT INTO TableA
 (ColumnA, ColumnB, ColumnC)

SELECT ColumnA, ColumnB, ColumnC From Table

The table has 500 million records and the table is indexed.  The data comes from other tables and is inserted into this large table.  There is one clustered index and 8 non-clustered indexes, with the fill factor set to 50 for each index.

My assumption was that I could quickly insert records into the table using an SSIS Execute SQL Task.
 
LVL 49

Expert Comment

by:Vitor Montalvão
ID: 40482807
Is TableA recreated each time the package runs, or does it already exist and you are only appending records?
Were the indexes created on TableA or on the source table?
 

Author Comment

by:LeVette Alexander
ID: 40483009
The table (TableA) already exists (500 million records).  I'm appending records.  The indexes are on TableA.
 
LVL 49

Accepted Solution

by:
Vitor Montalvão earned 250 total points
ID: 40483027
OK, that explains the time it's taking.
Without an archive solution or a purge process, the only option I can see for now is to partition TableA.
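A rough sketch of what that could look like (the partition column, boundary dates, and filegroup below are assumptions, since the real schema isn't in the question):

-- Assumes TableA has (or could get) a date column such as LoadDate to partition on
CREATE PARTITION FUNCTION pfTableA (DATE)
AS RANGE RIGHT FOR VALUES ('2014-01-01', '2014-07-01', '2015-01-01');

CREATE PARTITION SCHEME psTableA
AS PARTITION pfTableA ALL TO ([PRIMARY]);

-- The clustered index would then be created (or rebuilt) on the partition scheme, e.g.:
-- CREATE CLUSTERED INDEX CIX_TableA ON TableA (LoadDate) ON psTableA (LoadDate);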
 

Author Comment

by:LeVette Alexander
ID: 40483036
Thanks, Vitor. I guess I'm just stuck until we partition the table.
 
LVL 65

Assisted Solution

by:Jim Horn
Jim Horn earned 250 total points
ID: 40483037
A couple of random thoughts, as 500 million rows is a lot, even with only three columns and indexes.

Are there existing rows in the target table?  If so, you might be better served partitioning that table, then doing your insert into that partition, which will speed things up.

Is it an option to drop all indexes on the target table, do your INSERT, then re-create those indexes?
This would eliminate the overhead of also having to insert into the index(es) as you're inserting data.
You might want to throw this into a loop so that it inserts only 50k / 100k / whatever-k rows at a time, just in case there is a limiting buffer somewhere.  A side benefit is that if, on every loop, you write the results into a 'log' table, then when the job runs on a schedule (i.e. not from the designer) you can view current progress by querying that table (see the sketch at the end of this comment).
Is doing this in a data pump an option?  SSIS data pumps can easily 'stream' this data to avoid any buffers.
In your SSIS package are you running this multi-threaded, and can you max out the number of threads on the processor?
Can whatever process populates the source table be re-written to populate the target table directly, eliminating the 'intermediary hop' of the source table?
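Here is a minimal sketch of that drop-indexes-and-load-in-batches idea in T-SQL.  Everything named below that isn't in the question (SourceTable, SourceKey, LoadLog, the index name, the 50k batch size) is an assumption for illustration only:

-- Optionally disable the non-clustered indexes first (rebuild them at the end):
-- ALTER INDEX IX_TableA_ColumnB ON TableA DISABLE;

DECLARE @BatchSize INT    = 50000;
DECLARE @LastKey   BIGINT = 0;      -- assumes an ascending key on the source rows
DECLARE @NextKey   BIGINT;
DECLARE @Rows      INT;

WHILE 1 = 1
BEGIN
    -- find the key value that ends the next batch
    SELECT @NextKey = MAX(SourceKey)
    FROM (SELECT TOP (@BatchSize) SourceKey
          FROM SourceTable
          WHERE SourceKey > @LastKey
          ORDER BY SourceKey) AS batch;

    IF @NextKey IS NULL BREAK;      -- nothing left to copy

    INSERT INTO TableA (ColumnA, ColumnB, ColumnC)
    SELECT ColumnA, ColumnB, ColumnC
    FROM SourceTable
    WHERE SourceKey > @LastKey AND SourceKey <= @NextKey;

    SET @Rows = @@ROWCOUNT;

    -- write progress to a 'log' table so a scheduled run can be monitored
    INSERT INTO LoadLog (LoggedAt, LastKeyCopied, RowsInserted)
    VALUES (SYSDATETIME(), @NextKey, @Rows);

    SET @LastKey = @NextKey;
END;

-- Re-enable / rebuild any indexes that were disabled:
-- ALTER INDEX ALL ON TableA REBUILD;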
 

Author Comment

by:LeVette Alexander
ID: 40483100
Jim, there is only one table.  I'm not using SSIS in the traditional manner; I'm simply inserting records into one table.  I think batching might be the answer, without SSIS.

Thanks for your insight.
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 40486184
>> There is one clustered index and 8 non clustered indexes with the fill factor set to 50 for each index. <<

With only 3 or 4 columns in the table, that's an extremely high number of indexes.  Verify that you have the best clustering key and that a fill factor that low is really warranted, esp. on the clustered index.  You're more than doubling the size of that table when (if) you rebuild it with a fill factor that low.
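If the low fill factor turns out not to be warranted, raising it is a one-line rebuild per index; the index name and the value of 90 below are only examples:

-- example only: rebuild one index with a higher fill factor
ALTER INDEX IX_TableA_ColumnB ON TableA
REBUILD WITH (FILLFACTOR = 90);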
