Prevent parallel execution of SSIS package

Medium Priority
Last Modified: 2013-03-13
We have an SSIS package that we need to prevent from being run more than once at any given time.  The package can be executed sequentially, but cannot be executed in parallel.  It doesn't matter how the package is called: if an instance of the package is already executing when the next execution is triggered, it needs to either wait (queue up the execution) or return an error that execution is already in progress.

We need some way of enforcing this.

The package can be called from multiple sources:
- Triggered from a web application
- Called directly from SQL Server

There is a property called "MaxConcurrentExecutables" for a package.

Yes, that is the only way we came up with so far too.  Thanks for the suggestion.

That sounded neat, but the article says it limits concurrent tasks within a package, not concurrent executions of the entire package...
EugeneZ Zhitomirsky, SQL Server Expert:

You probably need to change the way you call the SSIS package.
You can try to wrap it as a .NET application,
and try to get a GUID, and/or use logging to check whether the status is running, etc.

Execute Package Task

Integration Services: Performance Tuning Techniques



>> You probably need to change the way you call the SSIS package
We need a solution that is independent of how the package is called.

The links you posted are for tuning a package.  The package itself is fine; it just can't be run more than once at a time because of the resources it uses.  We want the second execution to either a) queue up the request and run once the previous run is complete, or b) report back an error, as in "can't be called now, please try again".

For some reason, locking resources isn't sufficient: once they are locked, when the second execution tries to access them it hangs up the first execution, instead of just the second one failing or timing out...

So we need to actually prevent the second, third, etc. calls until the prior one is complete.

The only thing we came up with is, as jijeesh mentioned, keeping the package run status in a table.  The first step of the package then reads that table to find out whether it is "okay" to run.  This prevents the multiple runs, but it would have to return the error; we aren't sure how to use that technique to queue up the request to run...
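For the status-table idea to be safe, the check and the flag update have to happen as one atomic statement, so two near-simultaneous callers can't both see "okay" before either writes. A minimal sketch of that check-and-set pattern, using SQLite in Python purely as a stand-in for the real SQL Server status table (the table and column names here are illustrative, not from the thread):

```python
import sqlite3

# Stand-in for the SQL Server run-status table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE package_run (name TEXT PRIMARY KEY, running INTEGER NOT NULL)")
conn.execute("INSERT INTO package_run VALUES ('MyPackage', 0)")
conn.commit()

def try_acquire(conn, name):
    """Atomically flip running 0 -> 1; only the caller whose UPDATE hits a row wins."""
    cur = conn.execute(
        "UPDATE package_run SET running = 1 WHERE name = ? AND running = 0", (name,))
    conn.commit()
    return cur.rowcount == 1

def release(conn, name):
    """Last step of the package: clear the flag so the next caller may run."""
    conn.execute("UPDATE package_run SET running = 0 WHERE name = ?", (name,))
    conn.commit()

# First caller acquires; an overlapping second caller is refused.
print(try_acquire(conn, "MyPackage"))  # True
print(try_acquire(conn, "MyPackage"))  # False -> report "already running"
release(conn, "MyPackage")
print(try_acquire(conn, "MyPackage"))  # True again after release
```

The key design point is that the UPDATE's WHERE clause does the test and the write together; a plain SELECT-then-UPDATE would leave a race window between the read and the write.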

As a note, this package takes about 40 minutes to run in one configuration (the nightly run) and about 2 minutes for all other calls (on a much smaller population).  The concern is mainly queuing the short runs, since those are user triggered; the nightly run is scheduled...
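To get the queuing behavior (option a) out of the same status table, the caller can simply retry the atomic check on a timer until the flag frees up, with a cap so a user-triggered run doesn't wait forever behind the 40-minute nightly run. A hypothetical sketch, again using SQLite as a stand-in for the real table (the polling interval and timeout values are assumptions to tune):

```python
import sqlite3
import time

# Stand-in for the SQL Server run-status table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE package_run (name TEXT PRIMARY KEY, running INTEGER NOT NULL)")
conn.execute("INSERT INTO package_run VALUES ('MyPackage', 0)")
conn.commit()

def wait_and_acquire(conn, name, poll_seconds=5.0, timeout_seconds=300.0):
    """Block until the running flag can be claimed, or give up after the timeout."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        cur = conn.execute(
            "UPDATE package_run SET running = 1 WHERE name = ? AND running = 0", (name,))
        conn.commit()
        if cur.rowcount == 1:
            return True   # we own the run; start the package
        time.sleep(poll_seconds)
    return False          # still busy after timeout -> report the error instead
```

Each waiting caller effectively queues behind the running one; there is no strict first-come-first-served ordering among multiple waiters, which may be acceptable given that the short runs are only about 2 minutes each.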
