rgb192
asked:
run a batch script to visit a website every hour
just one url
once an hour
windows server 2008
Please explain a little more - what URL? Is it on your server or another? Which server has Windows?
Use curl to do this.
Download curl:
http://www.gknw.net/mirror/curl/win32/curl-7.19.7-ssl-sspi-zlib-static-bin-w32.zip
Now, create a batch file:
C:\Path_to_curl\curl.exe http://www.google.com/
Now use Task Scheduler to run the batch file every hour.
That's it.
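The two steps above can be sketched as a batch file plus a one-time Task Scheduler registration. The file names, paths, and task name below are illustrative assumptions, not from the original post:

```
@echo off
REM hourly_fetch.bat - request the page once and discard the output
"C:\Path_to_curl\curl.exe" -s "http://www.google.com/" > NUL

REM One-time registration from an elevated prompt (not part of the batch file):
REM   schtasks /Create /SC HOURLY /TN "HourlyFetch" /TR "C:\scripts\hourly_fetch.bat"
```

schtasks with /SC HOURLY runs the task every hour; the Task Scheduler GUI can set up the same trigger.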
ASKER
using Windows
the URL could be on my server or another server
Do you want to do this in PHP?
ASKER
php would make it easier because i know php
WGET is one of the most used commands on Linux to download pages and files.
http://gnuwin32.sourceforge.net/packages/wget.htm
In a batch file, you just need to put:
wget http://www.yoursiteaddress.com/thepage.php
OK, I know Linux, and not much about Windows, but I think you are on the right path with the task scheduler.
Here is a script that will do a GET request against a URL. The URL can have arguments, etc. The script will retrieve whatever output the site produces (HTML). If you run it with the Task Scheduler, it will be almost like typing the URL into your browser.
best regards, ~Ray
<?php // RAY_curl_example.php
error_reporting(E_ALL);
function my_curl($url)
{
$curl = curl_init();
// HEADERS FROM FIREFOX - APPEARS TO BE A BROWSER REFERRED BY GOOGLE
$header[] = "Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
$header[] = "Cache-Control: max-age=0";
$header[] = "Connection: keep-alive";
$header[] = "Keep-Alive: 300";
$header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
$header[] = "Accept-Language: en-us,en;q=0.5";
$header[] = "Pragma: "; // browsers keep this blank.
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.15) Gecko/20080623 Firefox/2.0.0.15');
curl_setopt($curl, CURLOPT_HTTPHEADER, $header);
curl_setopt($curl, CURLOPT_REFERER, 'http://www.google.com');
curl_setopt($curl, CURLOPT_ENCODING, 'gzip,deflate');
curl_setopt($curl, CURLOPT_AUTOREFERER, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_TIMEOUT, 3); // GIVE UP AFTER THREE SECONDS
if (!$html = curl_exec($curl))
{
curl_close($curl); // CLOSE THE HANDLE BEFORE SIGNALING FAILURE
return FALSE;
}
curl_close($curl);
return $html;
}
// USAGE EXAMPLE
$url = 'http://twitter.com';
$htm = my_curl($url);
if (!$htm) die("NO $url");
echo "<pre>";
echo htmlentities($htm);
echo "</pre>";
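The script above is written to be viewed in a browser, but it can also be driven by the Task Scheduler through the PHP command-line binary, so no browser needs to be open. Both paths below are assumptions about where PHP and the script live:

```
REM One-time registration; adjust both paths to your installation
schtasks /Create /SC HOURLY /TN "HourlyCurl" /TR "C:\php\php.exe C:\scripts\RAY_curl_example.php"
```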
ASKER
it only loops once
could I put a
while (no key pressed)
and a sleep 36000
and have this loop all day?
would PHP time out?
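On the timeout question: PHP run from the command line has no execution time limit by default (max_execution_time is 0 for the CLI), but the same loop inside a web request would normally be killed by the server. Also note that one hour is 3600 seconds, not 36000. One way to keep the loop out of PHP entirely is a small wrapper script; this is a POSIX shell sketch (on Windows, `timeout /t` would replace `sleep`), and the URL, count, and delay are illustrative assumptions:

```shell
#!/bin/sh
# fetch_loop COUNT URL DELAY: request URL COUNT times, sleeping DELAY
# seconds between requests. Failures are reported but do not stop the loop.
fetch_loop() {
    i=0
    while [ "$i" -lt "$1" ]; do
        curl -s "$2" > /dev/null 2>&1 || echo "fetch $i failed"
        i=$((i + 1))
        [ "$i" -lt "$1" ] && sleep "$3"
    done
    echo "done after $i fetches"
}
```

Something like `fetch_loop 24 "http://www.example.com/thepage.php" 3600` would poll hourly for a day; the Task Scheduler route suggested earlier survives reboots, which a loop like this does not.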
<?php // RAY_curl_example.php
error_reporting(E_ALL);
// USAGE EXAMPLE
$url = 'http://twitter.com';
$htm = my_curl($url);
//if (!$htm) die("NO $url");
echo "<pre>";
echo htmlentities($htm);
echo "</pre>";
function my_curl($url)
{
$curl = curl_init();
// HEADERS FROM FIREFOX - APPEARS TO BE A BROWSER REFERRED BY GOOGLE
$header[] = "Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
$header[] = "Cache-Control: max-age=0";
$header[] = "Connection: keep-alive";
$header[] = "Keep-Alive: 300";
$header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
$header[] = "Accept-Language: en-us,en;q=0.5";
$header[] = "Pragma: "; // browsers keep this blank.
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.15) Gecko/20080623 Firefox/2.0.0.15');
curl_setopt($curl, CURLOPT_HTTPHEADER, $header);
curl_setopt($curl, CURLOPT_REFERER, 'http://www.google.com');
curl_setopt($curl, CURLOPT_ENCODING, 'gzip,deflate');
curl_setopt($curl, CURLOPT_AUTOREFERER, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_TIMEOUT, 3); // GIVE UP AFTER THREE SECONDS
if (!$html = curl_exec($curl))
{
curl_close($curl); // CLOSE THE HANDLE BEFORE SIGNALING FAILURE
return FALSE;
}
curl_close($curl);
return $html;
}
?>
ASKER CERTIFIED SOLUTION
ASKER
works
Great! Thanks for the points, ~Ray
You may use wget for Windows: http://gnuwin32.sourceforge.net/packages/wget.htm
Regards.