• Status: Solved
  • Priority: Medium
  • Security: Public

Prevent parallel execution of SSIS package

We have an SSIS package that we need to prevent from being run more than once at any given time.  The package can be executed sequentially, but cannot be executed in parallel.  It doesn't matter how the package is called: if an instance of the package is already executing when the next execution is triggered, that execution needs to either wait (queue up the execution) or return an error that execution is already in progress.

We need some way of enforcing this.

The package can be called from multiple sources:
Triggered from a web application
Called directly from SQL Server
etc.
Asked by: mrichmon
1 Solution
 
jijeesh Commented:
You can track it in a database table. When you start processing the package, set a flag (say "IsRunning") in the database, and once you finish running, reset it. Also be careful to handle the flag on exceptions.
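A minimal T-SQL sketch of that idea, assuming a hypothetical control table dbo.PackageRunStatus and procedures dbo.ClaimPackageRun / dbo.ReleasePackageRun (all of these names are illustrative, not part of any existing schema). The UPDLOCK/HOLDLOCK hints make the check-and-set atomic, so two simultaneous starts cannot both see IsRunning = 0:

-- Hypothetical control table; one row per package must be seeded in advance.
CREATE TABLE dbo.PackageRunStatus
(
    PackageName sysname  NOT NULL PRIMARY KEY,
    IsRunning   bit      NOT NULL DEFAULT (0),
    LastStarted datetime NULL
);
GO

-- First step of the package: try to claim the flag atomically.
-- Raises an error if another execution already holds it.
CREATE PROCEDURE dbo.ClaimPackageRun
    @PackageName sysname
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRAN;

    IF EXISTS (SELECT 1
               FROM dbo.PackageRunStatus WITH (UPDLOCK, HOLDLOCK)
               WHERE PackageName = @PackageName
                 AND IsRunning = 1)
    BEGIN
        ROLLBACK TRAN;
        RAISERROR('Package %s is already running.', 16, 1, @PackageName);
        RETURN;
    END;

    UPDATE dbo.PackageRunStatus
    SET IsRunning = 1, LastStarted = GETDATE()
    WHERE PackageName = @PackageName;

    COMMIT TRAN;
END;
GO

-- Last step of the package (and its error handler): release the flag.
CREATE PROCEDURE dbo.ReleasePackageRun
    @PackageName sysname
AS
    UPDATE dbo.PackageRunStatus
    SET IsRunning = 0
    WHERE PackageName = @PackageName;
GO

Inside the package, the claim call could be an Execute SQL Task as the first step, with the release call in the last step and in the package's OnError event handler, which covers the point about handling the flag on exceptions.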
 
radcaesar Commented:
There is a property called "MaxConcurrentExecutables" for a package.

For more info

http://blogs.msdn.com/b/sqlperf/archive/2007/05/11/implement-parallel-execution-in-ssis.aspx
 
mrichmon (Author) Commented:
jijeesh,

Yes, that is the only way we have come up with so far too... Thanks for the suggestion.


radcaesar,
That sounded neat, but the article shows it limits concurrent tasks within a package - not concurrent executions of the entire package.
 
Eugene Z Commented:
Hmm.
You probably need to change the way you call the SSIS package.
You can try to wrap it as a .NET application
and try to get the GUID and/or use logging to check if the status is running, etc.

Check:
Execute Package Task
http://technet.microsoft.com/en-us/library/ms137609.aspx

Integration Services: Performance Tuning Techniques
http://technet.microsoft.com/en-us/library/cc966529.aspx

 
mrichmon (Author) Commented:
EugeneZ,

>> you probably need to change the way you call the SSIS package
We need a solution that is independent of how the package is called.

The links you posted are for tuning a package - the package is fine; it just can't be run more than once at a time because of the resources it needs to use.  We want the second execution to either a) queue up the request and run once the previous package run is complete, or b) report back an error such as "can't be called now, please try again".

For some reason, locking resources isn't sufficient - they get locked, and when the second execution tries to access them it hangs up the first execution instead of just the second one failing or timing out...

So we need to actually prevent the second, third, etc. calls until the prior one is complete.

The only thing we came up with is what jijeesh mentioned: storing the package run status in a table.  The first step of the package then reads that table to find out whether it is "okay" to run.  This prevents the multiple runs, but it would have to return the error - we aren't sure how to use that technique to queue up the request to run...

As a note, this package takes about 40 minutes to run in one configuration (the nightly run) and about 2 minutes to run during all other calls (on a much smaller population).  The concern is mainly about queuing the short runs, since those are user triggered; the nightly run is scheduled.
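Regarding the queuing part, one way the same flag table could support it is to have the first step poll the flag instead of failing immediately. A sketch, reusing the hypothetical dbo.PackageRunStatus table from the earlier comment (the procedure name, polling interval, and timeout are all illustrative assumptions):

-- Hypothetical "queue up" variant of the first-step check: poll the flag
-- until the prior run releases it, or give up after a timeout.
CREATE PROCEDURE dbo.WaitForPackageSlot
    @PackageName    sysname,
    @TimeoutMinutes int = 10
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @deadline datetime;
    SET @deadline = DATEADD(MINUTE, @TimeoutMinutes, GETDATE());

    WHILE GETDATE() < @deadline
    BEGIN
        BEGIN TRAN;

        IF NOT EXISTS (SELECT 1
                       FROM dbo.PackageRunStatus WITH (UPDLOCK, HOLDLOCK)
                       WHERE PackageName = @PackageName
                         AND IsRunning = 1)
        BEGIN
            -- Slot is free: claim it and let this execution proceed.
            UPDATE dbo.PackageRunStatus
            SET IsRunning = 1, LastStarted = GETDATE()
            WHERE PackageName = @PackageName;

            COMMIT TRAN;
            RETURN 0;
        END;

        COMMIT TRAN;

        -- Another execution is still running: wait and check again.
        WAITFOR DELAY '00:00:15';
    END;

    RAISERROR('Timed out waiting for package %s to finish.', 16, 1, @PackageName);
    RETURN 1;
END;
GO

With a 15-second poll like this, the short user-triggered runs (about 2 minutes) would queue behind each other, while the scheduled nightly run could keep the fail-fast behavior by calling the error-raising check instead.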

Question has a verified solution.
