

Bash Script Output

Posted on 2012-03-20
Medium Priority
Last Modified: 2012-03-21

I currently have a script, pipeline.sh, which I'd ideally like to run 1000 times by submitting it to a queue on a Linux cluster using qsub.

I've tried submitting it with qsub 10 times from the command line at the same time. All ten runs fell over, so I suspect each pipeline instance is trying to write to the same file somewhere.

I need to control where the output goes for each run (TMP files and any output files). If the output location can't be changed (e.g. if it just gets dumped to the current working directory), would it be possible to create a set of temporary directories as part of a bash script?

Here's what I'm thinking:

e.g. a control script that submits the jobs to the queue:

runDirs="dir1 dir2 dir3 dir4"
for dir in $runDirs; do
       cd "$dir"
       qsub -cwd -b y -V -q node.q -N name bash script.sh
       cd ..
done

I basically want to submit multiple jobs to a queue without the runs tripping up over one another.
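A runnable sketch of such a control script, using mktemp to create the per-run directories on the fly. The queue name (node.q) and the qsub flags are taken from the thread; N, the run.XXXXXX template, and the job names are illustrative assumptions:

```shell
#!/bin/bash
# Submit N copies of the pipeline, each from its own scratch directory,
# so concurrent runs never share a working directory.
N=10
for i in $(seq 1 "$N"); do
    dir=$(mktemp -d run.XXXXXX)    # unique directory per run
    cd "$dir" || exit 1
    qsub -cwd -b y -V -q node.q -N "pipeline_$i" bash ../pipeline.sh
    cd ..
done
```

With -cwd, each job runs inside its own run.XXXXXX directory, so anything the pipeline writes to the current directory is automatically kept separate per job.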

I've attached the scripts which I'm currently working with.


Question by:StephenMcGowan

Accepted Solution

ozo earned 1500 total points
ID: 37741425
You might try changing
mkdir $TMP
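The snippet above is truncated in the archived thread, but judging from the tmp.17906-style directories reported in the follow-up, the change was along these lines (a sketch; $$ is the PID of the running shell, so each concurrent job gets its own directory):

```shell
# Give each job its own scratch directory, keyed on the shell's PID,
# instead of a single shared $TMP that concurrent jobs collide in.
TMP="tmp.$$"       # $$ = PID of this job's shell, e.g. tmp.17906
mkdir "$TMP"
```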

Author Comment

ID: 37741790
Hi ozo,

Thanks for getting back to me.

mkdir $TMP

This created a temp folder for each job in my permanalysis folder ("tmp.17906" etc.).

I ran 10 jobs from the command line at the same time to test your modified code.
Each job failed, for different reasons.

cannot find FILE:

formatdb host and controls ...
Run fastacutter.pl ..

cannot find FILE at /fs/nas15/home/mqbpgsm4/permanalysis/bin/fastacutter.pl line 9.
chmod: cannot access `/fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/input_for_pipeline.sh': No such file or directory
get molecular mimicry candidates ...
pipeline.sh: line 215: /fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/input_for_pipeline.sh: No such file or directory
cat: /fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/B_burgdorferi*-in: No such file or directory
cat: /fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/B_burgdorferi*-out: No such file or directory
grep: /fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/B_burgdorferi*-peptides: No such file or directory
grep: /fs/nas15/home/mqbpgsm4/permanalysis/tmp.8975/B_burgdorferi*-peptidesincontrol: No such file or directory
clean up tmp ...
Job B_burgdorferi completed...
rm: cannot remove `/fs/nas15/home/mqbpgsm4/permanalysis/data/proteomes/B_burgdorferi.fasta': No such file or directory

Is this because the same script (/fs/nas15/home/mqbpgsm4/permanalysis/bin/fastacutter.pl) is currently being run by a different job? Would a temporary folder need to be set up for the /bin/ directory where the scripts are held, or am I understanding this wrong?

No string found in fs/nas15/home/mqbpgsm4/permanalysis/tmp.21749/B_burgdorferi1-finalids :

formatdb host and controls ...
Run fastacutter.pl ..
get molecular mimicry candidates ...
Working on B_burgdorferi1
Blast against control species
Separate fulllength conserved/nonconserved proteins
Ungapped Blast parasite 14mers against control species
Filter peptides control species
Ungapped Blast against host/vector proteome
Filter peptides host/vector

No string to search with was found in /fs/nas15/home/mqbpgsm4/permanalysis/tmp.21749/B_burgdorferi1-finalids. Nothing to do!
Calculation of Shannon entropy
clean up tmp ...
Job B_burgdorferi completed...

Thanks again,


Author Comment

ID: 37745402
I think for:

cannot find FILE at /fs/nas15/home/mqbpgsm4/permanalysis/bin/fastacutter.pl line 9.

It is because PROTEOMEDIR is currently still assigned as a global folder used by all jobs, but it will need to be job-specific:

if [ -z "$PROTEOMEDIR" ]

Shuffleseq needs to pick up Original.fasta from the PROTEOMEDIR shown above, BUT it needs to output (-outseq) the file into the job-specific folder (this can be the TMP folder created for the job, as above).
Phobius needs to pick up $PARASITE.fasta from the job's tmp folder (where it was put by Shuffleseq), and its output should go back into the job's tmp folder.

How the code looks at the moment:

echo Use shuffleseq to create shuffled proteome
$SHUFFLESEQ -sequence $PROTEOMEDIR/Original.fasta -outseq $PROTEOMEDIR/B_burgdorferi.fasta
echo Run phobius ...
$BINDIR/adaptphobius.pl $PARASITE

How I think it should look (apologies for any mistakes):

echo Use shuffleseq to create shuffled proteome
$SHUFFLESEQ -sequence $PROTEOMEDIR/Original.fasta -outseq $TMP/B_burgdorferi.fasta
echo Run phobius ...
$PHOBIUSBIN -short $TMP/$PARASITE.fasta > $TMP/$PARASITE.phobius
$BINDIR/adaptphobius.pl $PARASITE

Would this work? I'm trying to keep everything job-specific so that if the same script is run multiple times, there won't be any problem. As it stands I can see many B_burgdorferi files being created ($SHUFFLESEQ -sequence $PROTEOMEDIR/Original.fasta -outseq $PROTEOMEDIR/B_burgdorferi.fasta) in PROTEOMEDIR=~/permanalysis/data/proteomes
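A sketch of that "everything job-specific" layout, with cleanup wired in via trap. The variables ($SHUFFLESEQ, $PHOBIUSBIN, $PROTEOMEDIR, $PARASITE) are the thread's own and are assumed to be set earlier in pipeline.sh, so the tool invocations are left as commented illustrations:

```shell
#!/bin/bash
# Per-job scratch directory, removed automatically on exit, even on failure.
TMP="$HOME/permanalysis/tmp.$$"
mkdir -p "$TMP"
trap 'rm -rf "$TMP"' EXIT

# Shared, read-only inputs stay in the global proteome directory;
# every generated file goes into this job's $TMP, e.g.:
#   $SHUFFLESEQ -sequence "$PROTEOMEDIR/Original.fasta" -outseq "$TMP/B_burgdorferi.fasta"
#   $PHOBIUSBIN -short "$TMP/$PARASITE.fasta" > "$TMP/$PARASITE.phobius"
echo "scratch dir for this job: $TMP"
```

The trap means a failed job cannot leave stale B_burgdorferi files behind for the next run to trip over.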

Thanks again,


Author Closing Comment

ID: 37746568
This solved the immediate problem I had, but it in turn led to another problem.

I posted a second question in the same conversation but got no response, so I've opened a new question:



