• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 729

How to optimize SSIS 2008 Package

Hi Experts,

I am connected to server3 from my desktop and have created a SQL Server 2008 Integration Services package there; the package exists on server3 only.

My package consists of the following tasks:

Dynamic connections Script Component (C# code is written),
Execute SQL Task (delete from dbo.Spn_SM_PF_Bst_Prce_Prcng),
Data Flow Task (loading data from server1 (database1.dbo.Sam_pf_Bst_prce_Prcng, using an OLE DB source) to server2 (database2.dbo.Spn_SM_PF_Bst_Prce_Prcng, using an OLE DB destination component)). It's a straightforward load.

My source table has 19,221,289 rows, and when I execute the package it takes about 15 minutes.
My challenge now is to reduce the amount of log reading/writing against the database and to decrease the execution time.

How can I optimize my Data Flow Task? Could you please suggest values for its properties, such as:

BLOBTempStoragePath =
BufferTempStoragePath =
DefaultBufferMaxRows =
DefaultBufferSize =

or any other properties that can improve performance.
Please help me, this is very urgent; I am working on it right now.
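
A note on the "reduce log writes" goal above: a full-table DELETE like the one in the Execute SQL Task is fully logged in a single transaction. One common pattern is to delete in chunks so each transaction stays small; a minimal T-SQL sketch, where the 500,000 batch size is an assumption to be tuned, not a tested value:

-- Delete in chunks so each transaction (and its log usage) stays small.
WHILE 1 = 1
BEGIN
    DELETE TOP (500000) FROM dbo.Spn_SM_PF_Bst_Prce_Prcng;  -- batch size is a guess; tune it
    IF @@ROWCOUNT = 0 BREAK;  -- stop once the table is empty
END

If the table is always fully cleared and no foreign keys reference it, TRUNCATE TABLE is minimally logged and faster still.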
Asked by: adstorm88
2 Solutions
 
Barry Cunney commented:
What type of database are you transferring the data to? In my experience with MS SQL Server, the 'SQL Server Destination' component performs better than the 'OLE DB Destination'.

You could also try dropping the indexes on the destination table temporarily and re-creating them after the data load.
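
A minimal T-SQL sketch of that drop/re-create pattern, typically run from Execute SQL Tasks before and after the Data Flow Task; the index name IX_BstPrce_Key and the key column Prce_Key are hypothetical placeholders for the real index definition:

-- Before the load: drop the index so inserts don't maintain it row by row.
DROP INDEX IX_BstPrce_Key ON dbo.Spn_SM_PF_Bst_Prce_Prcng;  -- hypothetical index name

-- ... run the Data Flow Task ...

-- After the load: rebuild the index in one pass.
CREATE INDEX IX_BstPrce_Key ON dbo.Spn_SM_PF_Bst_Prce_Prcng (Prce_Key);  -- hypothetical column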
 
adstorm88 (Author) commented:
Thank you BCunney for your reply.
Can the 'SQL Server Destination' only be used if the source and destination are on the same server? In my scenario the source and destination are on different servers.

Any thoughts?
 
ValentinoV (BI Consultant) commented:
The "SQL Server Destination" is indeed faster, but it can only be used if your package runs on the destination server itself, which does not seem to be the case here.

Have a look at the following great articles with tips on how to improve SSIS performance:
http://sqlcat.com/top10lists/archive/2008/10/01/top-10-sql-server-integration-services-best-practices.aspx
http://www.simple-talk.com/sql/ssis/sql-server-2005-ssis-tuning-the-dataflow-task/
 
adstorm88 (Author) commented:
Thanks Valentino for your reply.
I have read those articles. I am searching for a table or graph of DefaultBufferMaxRows and DefaultBufferSize values with which I can reach better performance, because the Data Flow Task is the only part I can tune. Or how can I implement batch processing for this scenario?

Thanks
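
For context on those two properties (a rule of thumb, not a table from the linked articles): the engine sizes each buffer as roughly the smaller of DefaultBufferMaxRows and DefaultBufferSize divided by the estimated row width, so raising DefaultBufferMaxRows only helps if DefaultBufferSize can actually hold that many rows. A quick way to get a rough upper bound on the source row width, run against database1 on server1 (note that varchar(max)/nvarchar(max) columns report max_length = -1 and would need separate handling):

-- Rough upper bound on the row width in bytes; variable-length columns
-- count at their declared maximum, so the true average is usually smaller.
SELECT SUM(max_length) AS max_row_bytes
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.Sam_pf_Bst_prce_Prcng');

For example, assuming a 500-byte row (an assumed width, not measured), the default DefaultBufferSize of 10,485,760 bytes holds about 20,000 rows, so a DefaultBufferMaxRows far above that has no effect unless DefaultBufferSize is raised as well.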
 
adstorm88 (Author) commented:
I think I am making progress in improving performance.
In the OLE DB Destination I have unchecked 'Check constraints' and set:
Rows per batch = 2000000
Maximum insert commit size = 2147483647 (the largest value for a 4-byte signed integer)
The 19.2 million rows now load in 8 minutes; before it took 18 minutes.

Now I am searching for the best value for Maximum insert commit size, so that the data flow will do several commits, relieving the pressure that makes the transaction log and tempdb grow tremendously during high-volume data transfers.

Any thoughts?
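
One way to verify that a given Maximum insert commit size is actually relieving the log is to watch log usage on the destination server between test runs; a small monitoring sketch using a standard command:

-- Shows the percentage of each database's transaction log in use;
-- watch database2's row while testing different commit sizes.
DBCC SQLPERF(LOGSPACE);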
 
ValentinoV (BI Consultant) commented:
There's no specific value that can be recommended because it all depends on the situation. You'll have to experiment to find out which setting gives the best result.

The recommendation from the CAT article is: "For an indexed destination, I recommend testing between 100,000 and 1,000,000 as batch size."

Usually I don't need to customize that number and leave it at 2147483647, but I have also seen situations where 100,000 gave good results.

In any case, don't set it to zero. Here's what Books Online has to say about that: "A value of 0 might cause the running package to stop responding if the OLE DB destination and another data flow component are updating the same source table. To prevent the package from stopping, set the Maximum insert commit size option to 2147483647."

I recommend playing with that value while leaving "Rows per batch" empty.

There's some good info here as well: http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/86efc54e-45bd-4ccc-baf1-a7b34f80fe32/
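
Since the CAT figure of 100,000 to 1,000,000 applies to indexed destinations, it may help to confirm what the destination table actually has; a small sketch to run against database2:

-- Lists the indexes on the destination table; a single 'HEAP' row
-- (with a NULL name) means the table has no indexes at all.
SELECT name, type_desc
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.Spn_SM_PF_Bst_Prce_Prcng');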
 
adstorm88 (Author) commented:
Valentino V,
Thank you for your suggestions and comments. They helped me to some extent, though not completely, because, as you know, environment resources differ from scenario to scenario.
But I was able to get the package to run in 7:44 minutes.
Thank you
 
 
adstorm88 (Author) commented:
All is well.
 
ValentinoV (BI Consultant) commented:
I'm glad you could speed up your package, and quite well: coming from 18 minutes down to 7:44 is nice enough for a splendid weekend! :-)
