I have a web-based application that uses Perl CGI to provide an interface for executing Unix commands remotely. Each set of commands has a name, and a given command set always runs on a given set of remote machines. I use ssh to connect to the remote machines.
A command set must not be executed by two people at the same time. So two users of the same instance of the web application cannot run the same command set simultaneously, and if I had two instances of the application running on two different servers, the two servers must not run a given command set simultaneously either.
So I am using a locking mechanism to prevent such simultaneous operation. I create a file called <command-set-name>.lck in a common location. Whenever a user asks to execute a command set, I first check whether this file exists before allowing the user to run it.
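To make the question concrete, here is a minimal sketch of the check-then-create flow described above (in Python rather than Perl, purely for illustration; the directory and function names are hypothetical). The comment marks the window where a second process can slip in between the existence check and the file creation:

```python
import os

def try_lock(lock_dir, command_set):
    """Check-then-create locking: returns True if the lock was 'acquired'."""
    path = os.path.join(lock_dir, command_set + ".lck")
    if os.path.exists(path):      # step 1: check
        return False
    open(path, "w").close()       # step 2: create -- another process can run step 1 in between
    return True

def unlock(lock_dir, command_set):
    """Release the lock by deleting the lock file."""
    os.remove(os.path.join(lock_dir, command_set + ".lck"))
```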
This mechanism has worked fine so far, but as you can see, the check-then-create sequence is not atomic, so I am worried the system will not scale. Moreover, we have only one server hosting the application so far, and I am afraid the method will break down once multiple servers host it.
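For reference, the check and the create can be collapsed into a single atomic system call by opening the lock file with O_CREAT|O_EXCL (again sketched in Python; Perl's sysopen with the same Fcntl flags is the equivalent). One caveat I am aware of: older NFS implementations (NFSv2) did not honour O_EXCL atomically, so if the "common location" is an NFS mount this needs checking:

```python
import os

def try_lock_atomic(lock_dir, command_set):
    """Create the lock file only if it does not already exist, in one syscall."""
    path = os.path.join(lock_dir, command_set + ".lck")
    try:
        # O_EXCL makes open() fail if the file already exists,
        # so check and create happen atomically in the kernel.
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False              # someone else holds the lock
    os.close(fd)
    return True
```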
I would be grateful for any advice on how to proceed.