I have a Perl CGI script which is of course run as user "nobody". I want this script to update a database. The information needed for the update comes from various sources and may take a while to gather. I therefore want to make the updating of the database a separate process, so that the CGI script can exit while the updating process does its work. The CGI script is meant only as a way of starting the process going, and MUST be independent of the updating process. I have tried the following:
1) Using system with an & appended. The script does not finish until ALL processes, even child processes, end (because they all run as "nobody"). This is no good.
2) Using exec to create a child process. This runs afoul of the same problem: the CGI script will not end until all processes owned by "nobody" are complete.
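For concreteness, the two attempts looked roughly like this. The real updating script's name isn't shown here, so `$^X -e 'sleep 1'` (i.e., perl itself sleeping briefly) stands in for it:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Attempt 1: shell-background the updater with '&'.  system() itself
# returns quickly, but the CGI request still does not finish until the
# backgrounded child ends.
system("$^X -e 'sleep 1' &") == 0 or die "system failed: $?";

# Attempt 2: fork, then exec the updater in the child.  Same problem:
# the request does not end until the child does.
my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    exec($^X, '-e', 'sleep 1') or die "exec failed: $!";
}
waitpid($pid, 0);
print "both attempts spawned a child\n";
```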
This led me to think of the following possible solutions:
1) Use Perl Cookbook recipe 16.11 to make a process look like a file (a named pipe), and have the CGI script write to a "file" which is really a long-running script that updates the database. Would this new process also be owned by "nobody"?
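A minimal sketch of what I have in mind for the named-pipe idea, with both ends compressed into one script for illustration (the FIFO path, the tab-separated "table/value" layout, and the fork are just stand-ins; in reality the reader would be a separate long-running process):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(mkfifo);
use File::Temp qw(tempdir);

my $dir  = tempdir(CLEANUP => 1);
my $fifo = "$dir/db_update_fifo";     # hypothetical FIFO location
mkfifo($fifo, 0700) or die "mkfifo failed: $!";

# "CGI" side: open the FIFO for writing, hand off the values, exit.
my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    open my $w, '>', $fifo or die "open for write: $!";
    print {$w} "widgets\t42\n";       # hypothetical table name and value
    close $w;
    exit 0;
}

# "Updater" side: block on the FIFO, read a request, act on it.
open my $r, '<', $fifo or die "open for read: $!";
my $line = <$r>;
close $r;
waitpid($pid, 0);

chomp $line;
my ($table, $value) = split /\t/, $line, 2;
print "would update $table with $value\n";
```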
2) Use IPC::Shareable to share a variable between the CGI script and another process which I keep running constantly under a different user (say, myself). That process sleeps and reacts only when the variable changes (I read about this in Perl Cookbook, recipe 16.12). This should allow an exchange of data between the two processes (only two variables need to go from the CGI to the process which updates the database). I am concerned whether IPC::Shareable will allow processes owned by different users to share variables...
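Something like the following sketch is what I am picturing for the IPC::Shareable route. IPC::Shareable is a CPAN module, not core Perl, and the glue key 'dbup' and option values here are assumptions taken from its documented tie interface; if its mode option works like a file mode, 0666 would be what lets processes owned by different users attach to the same segment:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Shareable;

# Tie a scalar to a shared-memory segment keyed by the glue 'dbup'.
tie my $request, 'IPC::Shareable', 'dbup',
    { create => 1, mode => 0666, destroy => 0 }
    or die "could not tie shared variable";

# CGI side: drop the two values into shared memory and exit at once.
$request = "widgets\t42";    # hypothetical table name and value

# Daemon side (separate process, run as another user): poll for a
# change, act on it, then clear the request.
# while (1) {
#     if (defined $request) {
#         my ($table, $value) = split /\t/, $request, 2;
#         # ... update the database here ...
#         $request = undef;
#     }
#     sleep 1;
# }
```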
I would appreciate any suggestions here. I know this is a complicated problem, so I am really looking for an idea and a direction to follow. Links to documentation or a book where I can find this information would be great. Obviously the more info you can give, the better, and sample code would be best!