Solved

cron job duplicates its process again and again

Posted on 2014-12-10
4
378 Views
Last Modified: 2014-12-10
Dear expert,

I have a question concerning a cron job. This is my code sample:

00 7-20 * * * /home/usr/./test.sh  # test.sh is an executable file

The code runs perfectly. I leave the script on for 4 to 5 hours to see whether it can run without any issues. When I check the processes a few hours later, I see five copies of the same process running at the same time.

I've read online that flock can "fix" this issue, so I applied this:

00 7-20 * * * flock -n /home/usr/test.sh -c /home/usr/./test.sh

Somehow I got this error: /bin/bash: bad interpreter: Text file busy.

Is there a way to cut down the duplicates so that only one process is running when I start my script through a cron job? Thanks
Question by:Kinderly Wade
4 Comments
 
LVL 23

Expert Comment

by:savone
ID: 40492276
I believe the problem is your schedule.

7-20 in the hour field means the job runs at minute 00 of every hour from 7:00 through 20:00. This would mean a new process is started every hour between those hours.
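As an aside, the "Text file busy" error above happens because flock was given the script itself as the lock file; flock opens the lock file for writing, and the kernel refuses to execute a script that is open for writing. A sketch using a dedicated lock file instead (the /tmp path is just an assumption, any writable file works):

```shell
# crontab entry: lock /tmp/test_sh.lock, not the script itself
00 7-20 * * * /usr/bin/flock -n /tmp/test_sh.lock /home/usr/test.sh
```

With -n, flock exits immediately instead of waiting when a previous run still holds the lock, so at most one copy runs at a time.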
 

Author Comment

by:Kinderly Wade
ID: 40492322
Is there a way for me to configure it so that the script starts at 7:00 am and stops at 8:00 pm? Thanks
 
LVL 4

Assisted Solution

by:Zsolt Pribusz
Zsolt Pribusz earned 250 total points
ID: 40492415
You can do it with two crontab entries.
First you need to start your script, and then you need to kill it.
You can do this in many ways.
Here are two short examples:

#Start:
0 7 * * * /home/usr/test.sh
#End:
0 20 * * * pkill -f test.sh  #This will kill any process with test.sh in its command line 



If you want to be more specific, modify your test.sh to store its PID in a file.
Add this line to the beginning of your script:
echo $$ > /tmp/test_sh.pid



Then you can kill the process based on this file:
0 20 * * * kill $(cat /tmp/test_sh.pid)


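One caveat with killing by a stored PID: by 20:00 the script may have exited and its PID may have been reused by an unrelated process. A sketch of a safer kill step (assuming the same /tmp/test_sh.pid file as above) that verifies the PID still belongs to test.sh before signalling it:

```shell
#!/bin/bash
# Kill the recorded PID only if it still belongs to a running test.sh,
# so a recycled PID owned by some other process is left alone.
PIDFILE=/tmp/test_sh.pid

if [ -r "$PIDFILE" ]; then
    pid=$(cat "$PIDFILE")
    # check the command line of that PID before killing
    if ps -p "$pid" -o args= 2>/dev/null | grep -q "test\.sh"; then
        kill "$pid"
    fi
    rm -f "$PIDFILE"
fi
```

You would put this in a small wrapper script and call it from the 20:00 crontab entry instead of the bare kill.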
 
LVL 38

Accepted Solution

by:Gerwin Jansen, EE MVE
Gerwin Jansen, EE MVE earned 250 total points
ID: 40492545
I would not kill the script; it may be busy doing something. I would recommend first checking whether the script is already running and, if not, creating a 'flag' file as suggested above. Then, when the script finishes, remove the flag file. You could add some logging in the check part.

FLAGFILE=/tmp/test_sh.flag

# check
if ! [ -r ${FLAGFILE} ]
then
  touch ${FLAGFILE}
else
  # add logging here (optional)
  exit
fi

# start
(your code here)

# finish
rm -f ${FLAGFILE}



Just make sure nothing cleans up the flag file for some reason (periodic /tmp cleanup, for example); you could store it elsewhere.
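One weakness of the plain flag-file scheme: if the script crashes or is killed before reaching the final rm, the stale flag file blocks every later run. A sketch (assuming the same FLAGFILE as above) that uses a trap so the flag file is removed however the guarded section exits:

```shell
#!/bin/bash
FLAGFILE=/tmp/test_sh.flag

run_guarded() (
    # subshell body: the EXIT trap fires when this subshell exits,
    # whether normally, via exit, or on an error
    if [ -r "$FLAGFILE" ]; then
        # another instance is already running; add logging here (optional)
        exit 0
    fi
    touch "$FLAGFILE"
    trap 'rm -f "$FLAGFILE"' EXIT

    # (your code here)
)

run_guarded
```

This still is not fully race-free (two starts could both pass the check before either touches the file); flock, used with a dedicated lock file rather than the script itself, closes that gap atomically.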
