• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1423

Prevent parallel execution of an SSIS package

We have an SSIS package that we need to prevent from being run more than once at any given time. The package can be executed sequentially, but it cannot be executed in parallel. It doesn't matter how the package is called: if an instance of the package is already executing, the next triggered execution needs to either wait (queue up the execution) or return an error that execution is already in progress.

We need some way of enforcing this.

The package can be called from multiple sources:
  • Triggered from a web application
  • Called directly from SQL Server
1 Solution
You can track it in a database table. When you start processing the package, set a flag (say "IsRunning") in the database, and reset it once you finish running. Also be careful to handle the flag on exceptions, so a failed run doesn't leave it set.
There is a property called "MaxConcurrentExecutables" for a package.

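The flag-table idea above comes down to an atomic test-and-set: flip IsRunning from 0 to 1 in a single UPDATE, and only run if that UPDATE actually changed a row. Here is a minimal sketch in Python using SQLite as a stand-in for the SQL Server table; the table and column names (`PackageRun`, `IsRunning`) are illustrative assumptions, not anything from the original package.

```python
import sqlite3

def try_acquire(conn):
    """Atomically flip IsRunning from 0 to 1; True means we got the slot."""
    cur = conn.execute(
        "UPDATE PackageRun SET IsRunning = 1 WHERE Id = 1 AND IsRunning = 0")
    conn.commit()
    return cur.rowcount == 1  # 0 rows updated means another run holds the flag

def release(conn):
    """Reset the flag so the next caller can run."""
    conn.execute("UPDATE PackageRun SET IsRunning = 0 WHERE Id = 1")
    conn.commit()

def run_package(conn):
    """Run the package body only if no other instance is running."""
    if not try_acquire(conn):
        raise RuntimeError("Package is already running; try again later")
    try:
        pass  # ... the actual package work would go here ...
    finally:
        release(conn)  # reset the flag even when the work raises

# One-time setup of the hypothetical flag table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PackageRun (Id INTEGER PRIMARY KEY, IsRunning INTEGER)")
conn.execute("INSERT INTO PackageRun VALUES (1, 0)")
conn.commit()
run_package(conn)
```

The key point is that the check and the set happen in one statement, so two callers racing for the flag cannot both see "not running" and proceed; a plain SELECT followed by a separate UPDATE would leave that window open.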

mrichmonAuthor Commented:

Yes, that is the only way we have come up with so far too... Thanks for the suggestion.

That sounded neat, but reading the article shows it limits concurrent tasks within a package, not concurrent executions of the entire package....
Eugene ZCommented:
You probably need to change the way you call the SSIS package.
You could try wrapping it as a .NET application,
and then get the execution GUID and/or use logging to check whether the status is running, etc.

Execute Package Task

Integration Services: Performance Tuning Techniques

mrichmonAuthor Commented:

>>you probably need to change the way how you call the ssis pack
We need a solution that is independent of how the package is called

The links you posted are for tuning a package. The package is fine; it just can't be run more than once at a time because of the resources it needs to use. We want the second execution to either a) queue up the request and run once the previous package run is complete, or b) report back an error, as in "can't be called now, please try again".

For some reason, locking the resources isn't sufficient: they get locked, but when the second execution tries to access them it hangs up the first execution, instead of just the second one failing or timing out...

So we need to actually prevent the second, third, etc. calls until the prior one is complete.

The only thing we came up with is what jijeesh mentioned: keeping the package's run status in a table. The first step of the package then reads that table to find out whether it is "okay" to run. This prevents the multiple runs, but it would have to return the error; we aren't sure how to use that technique to queue up the request to run...
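One way to get queuing out of the same flag table is to retry the test-and-set in a loop with a deadline, instead of failing on the first miss: a second caller simply polls until the flag is free or it gives up. Again a sketch using SQLite in Python as a stand-in for the SQL Server table; the names (`PackageRun`, `IsRunning`) and the timeout values are assumptions for illustration.

```python
import sqlite3
import time

def try_acquire(conn):
    """Atomically flip IsRunning from 0 to 1; True means we got the slot."""
    cur = conn.execute(
        "UPDATE PackageRun SET IsRunning = 1 WHERE Id = 1 AND IsRunning = 0")
    conn.commit()
    return cur.rowcount == 1

def release(conn):
    conn.execute("UPDATE PackageRun SET IsRunning = 0 WHERE Id = 1")
    conn.commit()

def run_or_wait(conn, work, timeout=120.0, poll=0.1):
    """Queue behind a running instance: retry acquiring until the timeout."""
    deadline = time.monotonic() + timeout
    while not try_acquire(conn):
        if time.monotonic() >= deadline:
            raise TimeoutError("Package still running; gave up waiting")
        time.sleep(poll)  # back off before checking the flag again
    try:
        return work()  # the short user-triggered run goes here
    finally:
        release(conn)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PackageRun (Id INTEGER PRIMARY KEY, IsRunning INTEGER)")
conn.execute("INSERT INTO PackageRun VALUES (1, 0)")
conn.commit()
```

With a short poll interval this behaves like a first-come queue for the 2-minute user-triggered runs, while a caller that hits the 40-minute nightly run would exhaust the timeout and get the "try again later" error instead of hanging indefinitely.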

As a note, this package takes about 40 minutes to run in one configuration (the nightly run) and about 2 minutes during all other calls (on a much smaller population). The concern is mostly queuing the short runs, since those are user triggered; the nightly run is scheduled...

Question has a verified solution.
