I have a script that, ideally, I'd like to run 1000 times by submitting it to a queue on a Linux cluster using qsub.
The script is called pipeline.sh.
I've tried submitting it with qsub 10 times from the command line at once. All ten jobs fell over, so I suspect each pipeline instance is attempting to write to the same file somewhere.
I need to control where the output goes for each run (both TMP files and any output files). If the output location can't be changed (e.g. if everything just gets dumped to the current working directory), would it be possible to create a set of temporary directories, one per run, as part of a bash script?
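To illustrate the per-run directory idea, here is a minimal sketch. The directory names (`pipeline_runs`, `run_N`) are placeholders I've chosen for illustration, not anything from the original scripts:

```shell
#!/usr/bin/env bash
# Sketch: give each run its own directory so temp and output files can't collide.
set -euo pipefail

run_base="pipeline_runs"

for i in 1 2 3; do
    run_dir="$run_base/run_$i"
    mkdir -p "$run_dir"
    # Inside pipeline.sh, point scratch-file creation at the per-run directory,
    # e.g. export TMPDIR="$run_dir" before anything writes temp files.
done

ls "$run_base"
```

Note that many Grid Engine installations already export a per-job `$TMPDIR` on the compute node, so it's worth checking whether pipeline.sh can simply be made to honour `$TMPDIR` rather than hard-coding temp paths.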
Here's what I'm thinking, as pseudo-code for a control script that submits the jobs to the queue:
runDirs=(dir1 dir2 dir3 dir4)
for dir in "${runDirs[@]}"; do
    # cd into each run directory so -cwd puts that job's output there
    ( cd "$dir" && qsub -cwd -b y -V -q node.q -N "name_$dir" bash "$OLDPWD/pipeline.sh" )
done
I basically want to submit multiple jobs to a queue without the runs tripping up over one another.
I've attached the scripts which I'm currently working with.