Solved

How to optimize SSIS 2008 Package

Posted on 2011-03-15
10
709 Views
Last Modified: 2012-05-11
Hi Experts,

I am connected to server3 from my desktop and have created a SQL Server 2008 Integration Services package; the package itself resides on server3.

My package consists of the following tasks:

Dynamic connections Script Component (the C# code is already written),
Execute SQL Task (delete from dbo.Spn_SM_PF_Bst_Prce_Prcng),
Data Flow Task loading data from server1 (database1.Dbo.Sam_pf_Bst_prce_Prcng, via an OLE DB source) to server2 (database2.dbo.Spn_SM_PF_Bst_Prce_Prcng, via an OLE DB destination component). It's a straightforward load.

My source table has 19,221,289 rows, and when I execute the package it takes about 15 minutes.
My challenge now is to reduce the number of log reads/writes against the database and to decrease the execution time.

How can I optimize my Data Flow Task? Could you please suggest values for properties like

BLOBTempStoragePath =
BufferTempStoragePath =
DefaultBufferMaxRows =
DefaultBufferSize =

or any other properties that can improve performance?
Please help, this is urgent and I am working on it right now.
Question by:adstorm88
10 Comments
 
LVL 17

Assisted Solution

by:Barry Cunney
Barry Cunney earned 50 total points
ID: 35146364
What type of database are you transferring the data to? In my experience, for MS SQL Server the 'SQL Server Destination' component performs better than the 'OLE DB Destination'.

Also, you could try dropping the indexes on the destination table temporarily and re-creating them after the data load.
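For example, the disable/rebuild could be wrapped around the data flow in Execute SQL Tasks, along these lines (only a sketch: IX_BstPrce_Example is a hypothetical index name, the table name comes from the question, and the clustered index should be left alone or the table becomes unreadable):

    -- Before the data flow: disable (or drop) the nonclustered indexes on the destination table.
    -- IX_BstPrce_Example is a placeholder name; do NOT disable the clustered index.
    ALTER INDEX IX_BstPrce_Example ON dbo.Spn_SM_PF_Bst_Prce_Prcng DISABLE;

    -- After the data flow completes: rebuild everything (rebuilding also re-enables disabled indexes).
    ALTER INDEX ALL ON dbo.Spn_SM_PF_Bst_Prce_Prcng REBUILD;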
 
LVL 1

Author Comment

by:adstorm88
ID: 35149108
Thank you Bcunney for your reply.
Can the 'SQL Server Destination' only be used when the source and destination are on the same server? In my scenario, the source and destination are on different servers.


any thoughts?
 
LVL 37

Expert Comment

by:ValentinoV
ID: 35198613
The "SQL Server Destination" is indeed faster but can only be used if your package is running on the same server as the destination server.  Which does not seem to be the case here.

Have a look at the following articles for tips on how to improve SSIS performance:
http://sqlcat.com/top10lists/archive/2008/10/01/top-10-sql-server-integration-services-best-practices.aspx
http://www.simple-talk.com/sql/ssis/sql-server-2005-ssis-tuning-the-dataflow-task/
 
LVL 1

Author Comment

by:adstorm88
ID: 35201506
Thanks Valentino for your reply.
I have read those articles. I am looking for a table or graph of values for
DefaultBufferMaxRows =
DefaultBufferSize =
at which I can get better performance, because the data flow is the only part I can tune. Alternatively, how could I implement batch processing for this scenario?

Thanks
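For reference, here is a rough way to estimate how many rows fit in one pipeline buffer. This is only a sketch: the source table name is taken from the question, the 200-byte row width is an assumed example, and 10 MB / 10,000 rows are the SSIS 2008 defaults for DefaultBufferSize / DefaultBufferMaxRows.

    -- Run against the source database on server1 to approximate the row width in bytes.
    -- (max_length is -1 for (n)varchar(max)/varbinary(max) columns, so treat those separately.)
    SELECT SUM(c.max_length) AS approx_row_bytes
    FROM   database1.sys.columns AS c
    WHERE  c.[object_id] = OBJECT_ID('database1.Dbo.Sam_pf_Bst_prce_Prcng');

    -- If a row is roughly 200 bytes (assumed figure), then about
    -- 10,485,760 / 200 = ~52,000 rows would fit in the default 10 MB buffer,
    -- so DefaultBufferMaxRows = 10,000 is what actually limits the buffer.
    -- Raising DefaultBufferMaxRows (and/or DefaultBufferSize, up to 100 MB in SSIS 2008)
    -- lets each buffer carry more rows and usually reduces per-buffer overhead.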
 
LVL 1

Author Comment

by:adstorm88
ID: 35202150
I think I am making progress in improving performance.
In the OLE DB Destination I have unchecked 'Check constraints' and set:
Rows per batch = 2000000
Maximum insert commit size = 2147483647 (the largest value for a 4-byte integer)
The 19.2 million rows now load in 8 minutes; before it was 18 minutes.

Now I am looking for the best value for Maximum insert commit size, so that the data flow performs several commits, relieving the pressure on the transaction log and tempdb, which otherwise grow tremendously during high-volume data transfers.
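For a rough sense of scale (a back-of-the-envelope sketch using the 19,221,289-row count from the question):

    -- Commit size of 1,000,000 -> roughly 20 commits (19,221,289 / 1,000,000, rounded up)
    -- Commit size of   100,000 -> roughly 193 commits
    -- Smaller commit sizes mean shorter individual transactions and less sustained pressure
    -- on the transaction log, at the cost of some per-commit overhead.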

any thoughts ?
 
LVL 37

Accepted Solution

by:ValentinoV
ValentinoV earned 450 total points
ID: 35204796
There's no specific value that can be recommended because it all depends on the situation. You'll have to experiment to find out which setting gives the best result.

The recommendation from the SQLCAT article is: "For an indexed destination, I recommend testing between 100,000 and 1,000,000 as batch size."

Usually I don't need to customize that number and leave it at 2147483647, but I have also seen situations where 100,000 gave good results.

In any case, don't set it to zero. Here's what Books Online has to say about that: "A value of 0 might cause the running package to stop responding if the OLE DB destination and another data flow component are updating the same source table. To prevent the package from stopping, set the Maximum insert commit size option to 2147483647."

I recommend playing with that value while leaving "Rows per batch" empty.

There's some good info here as well: http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/86efc54e-45bd-4ccc-baf1-a7b34f80fe32/
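While testing different commit sizes, one way to watch the effect on the destination's transaction log (just a sketch; 'database2' is the destination database name from the question) is:

    -- Reports log size and the percentage in use for every database on the instance.
    DBCC SQLPERF(LOGSPACE);

    -- Shows what, if anything, is currently preventing log truncation for the destination database.
    SELECT name, log_reuse_wait_desc
    FROM   sys.databases
    WHERE  name = 'database2';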
 
LVL 1

Author Comment

by:adstorm88
ID: 35209861
Valentino V,
Thank you for your suggestions and comments. They helped me to some extent, though not completely, because as you know resources differ from scenario to scenario and environment to environment.
But I was able to get the package to run in 7:44 minutes.
Thank you
 
 
LVL 1

Author Comment

by:adstorm88
ID: 35209887
all is well
 
LVL 37

Expert Comment

by:ValentinoV
ID: 35214149
I'm glad you could speed up your package, and quite well: coming from 18 minutes and ending up at 7:44 is nice enough to have a splendid weekend! :-)
