
Solved

wget schedule sending mail if response time > 5 secs

Posted on 2011-04-24
9
Medium Priority
761 Views
Last Modified: 2013-11-15
I'm using a script called Responser which sends an email when any of my sites are down or slow. I want to verify that it only sends mail when the sites really are down or slow (when I accessed the sites as soon as I got the error mails, they were fine). Since wget shows the speed and response time, I'd like to schedule it and compare its results with my Responser script, to see whether both report an error at the same time.

I have Postfix installed and I can send mail with the mail command to any e-mail address from the terminal. I want the output of wget http://www.mysite.com to be mailed, on the condition that the response time is greater than 5 seconds. Then I can schedule it to run every 5 minutes. I hope someone can help me.
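For the scheduling, I'm thinking of a crontab entry along these lines (the script name and path are only placeholders):

```
# run the check script every 5 minutes
*/5 * * * * /usr/local/bin/checksite.sh
```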

Thanks!
Question by:Thyagaraj03
9 Comments
 
LVL 18

Expert Comment

by:TobiasHolm
ID: 35458623
Hi!

I had a little problem finding a site with more than 1 second ping response time, but try this script:
#!/bin/sh

# Two pings (-c 2), 30 s apart (-i 30), 29 s per-reply timeout (-W 29),
# 59 s overall deadline (-w 59), no reverse DNS lookups (-n)
pcmd="ping -c 2 -W 29 -i 30 -w 59 -n yoursite.com"

# Field 6 of ping's "rtt min/avg/max/mdev = ..." summary is the max RTT in ms
ptime=$($pcmd | grep 'rtt' | awk -F'/' '{ print $6 }')
echo "Ping MAX response time:" $ptime
psec=$(echo $ptime | awk -F'.' '{ print $1 }')
echo "Response time in whole ms:" $psec

if [ "$psec" -gt 5000 ]; then echo "Over 5 sec"
	#  Put your mail command line here
fi



It sends two pings (because the first ping is sometimes lost), waits 30 seconds between them, and makes no attempt to look up symbolic names for host addresses.
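For reference, the field that the awk command picks out can be checked against a sample rtt summary line (the numbers here are made up):

```shell
# Splitting ping's summary line on '/' makes field 6 the max RTT in ms
line='rtt min/avg/max/mdev = 10.123/12.456/14.789/1.234 ms'
echo "$line" | awk -F'/' '{ print $6 }'   # prints 14.789
```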

Regards, Tobias
 
LVL 6

Assisted Solution

by:de2Zotjes
de2Zotjes earned 200 total points
ID: 35458653
The script below tries to retrieve the wanted page. Add the script as a cronjob on a box and have fun:
#!/bin/bash

# make sure it ends within 5m (300s): a background watchdog sends us SIGALRM after 290s
trap 'exit_script 1' SIGALRM
( sleep 290 && kill -ALRM $$ 2>/dev/null ) &
watchdog=$!

recipient='admin@mysite.com' #the mail address to send notifications
page='http://www.mysite.com/' #the page to measure
threshold=5 #the number of seconds above which a mail is sent

content=$(mktemp content.XXXXXX)

function exit_script {
        kill $watchdog 2>/dev/null
        if [[ -f $content ]]; then
                rm -f $content
        fi
        if [[ $1 -ne 0 ]]; then
                echo | mail -s "Retrieving $page failed" $recipient
        fi
        exit $1
}

# GNU time prints the elapsed wall-clock seconds (%e) on stderr
duration=$(/usr/bin/time -f '%e' wget -q "$page" -O "$content" 2>&1)
# strip the fraction: [[ -gt ]] only compares integers
if [[ ${duration%.*} -gt $threshold ]]; then
        cat "$content" | mail -s "Retrieving $page took $duration s" $recipient
fi
exit_script 0



Error handling is not quite up to par, but that is left as an exercise for the reader :)
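One detail worth noting: bash's -gt only compares integers, so the fractional value that /usr/bin/time's %e format prints needs its decimal part stripped before the comparison, e.g.:

```shell
duration=7.42                        # example value, in the format %e prints
if [[ ${duration%.*} -gt 5 ]]; then  # ${duration%.*} drops everything after the dot
    echo "over threshold"            # prints, since 7 > 5
fi
```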
 
LVL 6

Expert Comment

by:de2Zotjes
ID: 35458662
Hmmm, I double-pasted the script. Here is the clean version:

#!/bin/bash

# make sure it ends within 5m (300s): a background watchdog sends us SIGALRM after 290s
trap 'exit_script 1' SIGALRM
( sleep 290 && kill -ALRM $$ 2>/dev/null ) &
watchdog=$!

recipient='admin@mysite.com' #the mail address to send notifications
page='http://www.mysite.com/' #the page to measure
threshold=5 #the number of seconds above which a mail is sent

content=$(mktemp content.XXXXXX)

function exit_script {
        kill $watchdog 2>/dev/null
        if [[ -f $content ]]; then
                rm -f $content
        fi
        if [[ $1 -ne 0 ]]; then
                echo | mail -s "Retrieving $page failed" $recipient
        fi
        exit $1
}

# GNU time prints the elapsed wall-clock seconds (%e) on stderr
duration=$(/usr/bin/time -f '%e' wget -q "$page" -O "$content" 2>&1)
# strip the fraction: [[ -gt ]] only compares integers
if [[ ${duration%.*} -gt $threshold ]]; then
        cat "$content" | mail -s "Retrieving $page took $duration s" $recipient
fi
exit_script 0



The script will send the data wget retrieved as the content of the message.
 
LVL 18

Accepted Solution

by:TobiasHolm
TobiasHolm earned 1600 total points
ID: 35458766
I was a little quick to respond. I now see that you wanted to measure the wget time, not the ping time. Try this script! The outfile variable must point to a directory where the user running the script has write permission.
#!/bin/bash

outfile=/tmp/response.txt

# Set time format to seconds
TIMEFORMAT=%R
# Time a process
ptime=$(time (wget http://xpro.se >$outfile 2>&1) 2>&1)
echo "Response time:" $ptime
psec=$(echo $ptime | awk -F'.' '{ print $1 }')
echo "Response time in whole seconds:" $psec

if [ "$psec" -gt 5 ]; then echo "Over 5 sec"
	#  Put your mail commandline here
	cat $outfile|mail -s 'Response time over 5 sec' 'your@mail.com'
fi



Regards, Tobias
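As a side note, the TIMEFORMAT trick can be tried on its own, with sleep standing in for wget:

```shell
#!/bin/bash

TIMEFORMAT=%R                    # the time keyword then prints only real (wall-clock) seconds
t=$( { time sleep 0.2; } 2>&1 )
echo "Elapsed: $t"               # something like: Elapsed: 0.20
```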
 

Author Comment

by:Thyagaraj03
ID: 35458820
Hi Tobias, hi de2Zotjes, the scripts look great!

@Tobias: I'll give it a try on my VM and get back here.

@de2Zotjes: I don't know why I was getting the following error when I tried to execute the script as originally posted (test1.sh is the script name):

./test1.sh: line 4: syntax error near unexpected token `('
./test1.sh: line 4: `trap exit_script(1) SIGALRM'
 

Author Comment

by:Thyagaraj03
ID: 35460386
@Tobias: Awesome, it works! I'm just wondering if it's possible to adapt the script for multiple websites.
 
LVL 18

Expert Comment

by:TobiasHolm
ID: 35460565
Sure! Instead of the hardcoded site name on line 8, use $1:

ptime=$(time (wget $1 >$outfile 2>&1) 2>&1)


Then call the script, passing the site as an argument:

./yourscript.sh http://xpro.se


Regards, Tobias
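To cover several sites, a small wrapper could then loop over a list. Here check_site is a stand-in for a call to the modified script (./yourscript.sh "$site"):

```shell
#!/bin/bash

# the list of sites to check (examples only)
sites='http://xpro.se http://www.mysite.com'

check_site() {
    echo "checking $1"
    # ./yourscript.sh "$1"    # the real call would go here
}

for site in $sites; do
    check_site "$site"
done
```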
 
LVL 8

Assisted Solution

by:shaunak
shaunak earned 200 total points
ID: 35464105
Using a continuous cron for wget will put load on your server. Instead, the logic should be:

Check the ping report (as @TobiasHolm suggested) every 10 mins.
If, at any particular run of the cron, the report is not normal, then start checking that site every two minutes, and mail the response. There are also many free tools available for this kind of monitoring:
http://sixrevisions.com/tools/10-free-server-network-monitoring-tools-that-kick-ass/

wget tests whether your site is working or not; if the web server is restarted, it will give you errors. Ping, on the other hand, only tests your network. Also, note that firewalls sometimes block IPs that continuously ping their server.
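The check-more-often-after-a-failure idea can be sketched as a small decision function (the 10 and 2 minute intervals are the ones suggested above). A cron job firing every two minutes would call it with the current minute and the last recorded state:

```shell
#!/bin/bash

# should_probe MINUTE STATE -> succeeds when this cron run should actually probe
# (every 10 min while the state is "ok", every 2 min once it is "fail")
should_probe() {
    local minute=$1 state=$2 interval
    if [[ $state == ok ]]; then interval=10; else interval=2; fi
    (( 10#$minute % interval == 0 ))   # 10# guards against minutes like 08
}

should_probe 20 ok   && echo "minute 20, healthy: probe"
should_probe 24 ok   || echo "minute 24, healthy: skip"
should_probe 24 fail && echo "minute 24, failing: probe"
```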
 

Author Comment

by:Thyagaraj03
ID: 35464497
It's my sole responsibility to assign points to the experts fairly. Tobias's answer matched perfectly, and I hope he won't mind that I didn't assign him full points, as he keeps gaining by sharing his knowledge. Thanks!