Solved

How to optimize SSIS 2008 Package

Posted on 2011-03-15
10
711 Views
Last Modified: 2012-05-11
Hi Experts,

I am connected to server3 from my desktop and have created a SQL Server 2008 Integration Services package. The package exists on server3 only.

My package consists of the following tasks:

Dynamic connections Script Component (C# code),
Execute SQL Task (DELETE FROM dbo.Spn_SM_PF_Bst_Prce_Prcng),
Data Flow Task loading data from server1 (database1.Dbo.Sam_pf_Bst_prce_Prcng via an OLE DB source) to server2 (database2.dbo.Spn_SM_PF_Bst_Prce_Prcng via an OLE DB destination). It is a straightforward load.
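A side note on the Execute SQL Task above: if the table is emptied completely before every load and no foreign keys reference it, the DELETE could be replaced with a TRUNCATE, which is minimally logged and usually much faster. A sketch, assuming a full-table clear:

```sql
-- Clear the destination table before the load.
-- TRUNCATE TABLE is minimally logged, unlike DELETE, which writes a log
-- record per row; it requires that no foreign keys reference the table.
TRUNCATE TABLE dbo.Spn_SM_PF_Bst_Prce_Prcng;
```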

My source table has 19,221,289 rows, and when I execute the package it takes about 15 minutes.
My challenge now is to reduce the number of log reads/writes against the database and to decrease the execution time.

How can I optimize my data flow task? Could you please suggest values for properties like:

BLOBTempStoragePath =
BufferTempStoragePath =
DefaultBufferMaxRows =
DefaultBufferSize =

or any other properties that can improve performance.
Please help; this is urgent and I am working on it right now.
Question by:adstorm88
10 Comments
 
LVL 17

Assisted Solution

by:Barry Cunney
Barry Cunney earned 50 total points
ID: 35146364
What type of database are you transferring data to? In my experience with MS SQL Server, the 'SQL Server Destination' component performs better than the 'OLE DB Destination'.

Also, you could try dropping the indexes on the destination table temporarily and re-creating them after the data load.
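A sketch of that approach using ALTER INDEX rather than a full DROP/CREATE; the index name IX_BstPrce_Example is hypothetical, so substitute your own:

```sql
-- Before the load: disable the nonclustered index so the bulk insert
-- does not have to maintain it row by row.
-- The index name IX_BstPrce_Example is hypothetical.
ALTER INDEX IX_BstPrce_Example
    ON dbo.Spn_SM_PF_Bst_Prce_Prcng DISABLE;

-- ... run the SSIS data flow here ...

-- After the load: rebuild the index in a single pass over the loaded rows.
ALTER INDEX IX_BstPrce_Example
    ON dbo.Spn_SM_PF_Bst_Prce_Prcng REBUILD;
```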
 
LVL 1

Author Comment

by:adstorm88
ID: 35149108
Thank you, Bcunney, for your reply.
Can the 'SQL Server Destination' be used only when the source and destination are on the same server? In my scenario the source and destination are on different servers.


any thoughts?
 
LVL 37

Expert Comment

by:ValentinoV
ID: 35198613
The "SQL Server Destination" is indeed faster but can only be used if your package is running on the same server as the destination server.  Which does not seem to be the case here.

Have a look at the following great articles with tips on how to improve SSIS performance:
http://sqlcat.com/top10lists/archive/2008/10/01/top-10-sql-server-integration-services-best-practices.aspx
http://www.simple-talk.com/sql/ssis/sql-server-2005-ssis-tuning-the-dataflow-task/

 
LVL 1

Author Comment

by:adstorm88
ID: 35201506
Thanks, Valentino, for your reply.
I have read those articles. I am looking for a table or graph of values for
DefaultBufferMaxRows =
DefaultBufferSize =
at which I can reach better performance, because the data flow is the only place where I can do performance tuning. Alternatively, how can I implement batch processing for this scenario?

Thanks
 
LVL 1

Author Comment

by:adstorm88
ID: 35202150
I think I am making progress in improving performance.
In the OLE DB Destination I have unchecked 'Check constraints' and set:
Rows per batch = 2000000
Maximum insert commit size = 2147483647 (the largest value for a 4-byte signed integer)
The 19.2 million rows now load in 8 minutes; before, it took 18 minutes.

Now I am searching for the best value for Maximum insert commit size, so that the data flow performs several commits. That releases pressure on the transaction log and keeps tempdb from growing tremendously during high-volume data transfers.
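One way to check whether a given Maximum insert commit size actually relieves log pressure is to watch log usage while the load runs. This is a standard DBCC command; run it repeatedly during the data flow and watch the row for database2:

```sql
-- Reports, per database, the transaction log size and the percentage
-- of the log currently in use.
DBCC SQLPERF(LOGSPACE);
```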

any thoughts ?
 
LVL 37

Accepted Solution

by:ValentinoV
ValentinoV earned 450 total points
ID: 35204796
There's no specific value that can be recommended because it all depends on the situation.  You'll have to experiment to find out which setting gives the best result.

The recommendation from the CAT article is: "For an indexed destination, I recommend testing between 100,000 and 1,000,000 as batch size."

Usually I don't need to customize that number and leave it at 2147483647, but I have also seen situations where 100,000 gave good results.

In any case, don't set it to zero.  Here's what Books Online has to say about that: "A value of 0 might cause the running package to stop responding if the OLE DB destination and another data flow component are updating the same source table. To prevent the package from stopping, set the Maximum insert commit size option to 2147483647."

I recommend experimenting with that value while leaving "Rows per batch" empty.

There's some good info here as well: http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/86efc54e-45bd-4ccc-baf1-a7b34f80fe32/
 
LVL 1

Author Comment

by:adstorm88
ID: 35209861
Valentino V,
Thank you for your suggestions and comments. They helped me to some extent, though not completely, because resources differ from environment to environment.
Still, I was able to get the package to run in 7:44 minutes.
Thank you
 
 
LVL 1

Author Comment

by:adstorm88
ID: 35209887
all is well
 
LVL 37

Expert Comment

by:ValentinoV
ID: 35214149
I'm glad you could speed up your package, and quite nicely too: coming from 18 minutes down to 7:44 is good enough reason for a splendid weekend! :-)
