Looping a shell script

Solved | Posted on 2012-04-02 | Last Modified: 2012-04-02 | 457 Views

Hi,

I have a shell script (pipeline.sh) located in the following directory:

/fs/nas15/home/mqbpgsm4/permanalysis/bin

I would like to run 1000 instances of this script on a Linux server by submitting it to a queue with qsub 1000 times.

i.e.:
qsub -b y (/directory/permanalysis/bin) -N perm1 sh pipeline.sh

I would like to loop it so that the job is submitted to the server 1000 times, i.e.:

qsub -b y (/directory/permanalysis/bin) -N perm1 sh pipeline.sh
qsub -b y (/directory/permanalysis/bin) -N perm2 sh pipeline.sh
qsub -b y (/directory/permanalysis/bin) -N perm3 sh pipeline.sh
up to
qsub -b y (/directory/permanalysis/bin) -N perm1000 sh pipeline.sh

Is this at all possible?

Thanks,

Stephen
Question by: StephenMcGowan

6 Comments
 
Expert Comment by: Xizz (LVL 1), ID: 37795308
Hi Stephen,

You could try something like this:

#!/bin/bash
# Submit pipeline.sh to the queue 1000 times, naming the jobs perm1 ... perm1000.
cd /directory/permanalysis/bin   # placeholder path from the question; use the real bin directory
for ((i=1; i<=1000; i++))
do
   qsub -b y -N perm$i sh pipeline.sh
done


 
Expert Comment by: Tintin (LVL 48), ID: 37795331
# Run from the directory that contains pipeline.sh (or cd there first)
for i in $(seq 1 1000)
do
   qsub -b y -N perm$i sh pipeline.sh
done


 
Accepted Solution by: Peter Kwan (LVL 16, earned 500 total points), ID: 37795365
Sure it is possible. The following is an example:

# Submit the job 1000 times; run this from the directory containing pipeline.sh
X=1
while [ $X -le 1000 ]
do
    qsub -b y -N perm$X sh pipeline.sh
    X=`expr $X + 1`
done
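
If the scheduler is Sun Grid Engine (the -b y flag suggests a Grid Engine variant), an array job is a common alternative that needs only a single qsub call; each of the 1000 tasks receives its index in the $SGE_TASK_ID environment variable. A minimal sketch, assuming the placeholder path above and that pipeline.sh does not need a distinct job name per run:

# One array job with 1000 tasks instead of 1000 separate jobs.
# Each task runs pipeline.sh; $SGE_TASK_ID (1..1000) identifies the task inside the script.
cd /directory/permanalysis/bin   # placeholder path, as above
qsub -b y -t 1-1000 -N perm sh pipeline.sh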



Author Comment by: StephenMcGowan, ID: 37795498
Hi Guys,

Thanks for getting back to me.

Would I need to define the bin folder for the location of pipeline.sh?

i.e.:

# define pipeline.sh location
directory=/fs/nas15/home/mqbpgsm4/permanalysis/bin

X=1
while [ $X -le 1000 ]
do
    qsub -b y -N perm$X sh "$directory/pipeline.sh"
    X=`expr $X + 1`
done



Thanks,

Stephen
 
Expert Comment by: Xizz (LVL 1), ID: 37795570
Yes, you could do that; otherwise, place the script in the same folder and run the loop from there.
 

Author Comment by: StephenMcGowan, ID: 37795585
Sorry,

Just to double-check: would the attached file work OK? I want to verify it before potentially submitting 1000 jobs to the server.

Thanks again,

Stephen
loopscript.sh
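
A low-risk way to sanity-check a loop like this before sending 1000 jobs to the queue is a dry run: print each qsub command instead of executing it, or cap the loop at a handful of iterations. A minimal sketch, assuming loopscript.sh follows the while-loop form discussed above:

#!/bin/bash
# Dry run: echo the qsub commands that would be issued instead of running them.
# Drop the "echo" (and raise the bound to 1000) once the output looks right.
directory=/fs/nas15/home/mqbpgsm4/permanalysis/bin

X=1
while [ $X -le 3 ]
do
    echo qsub -b y -N perm$X sh "$directory/pipeline.sh"
    X=`expr $X + 1`
done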
