Get URL content in batch mode.

I need to be able to get the content of URLs (typically Web pages) in batch mode. That is, I need to fetch Web page content automatically from a command line and then analyze it. I also need the tool to retrieve the content returned by an HTML FORM. I have used a webget.pl Perl script, but I cannot set a timeout value so that another program is fired if the URL is not available.

I would run this batch on my NT 4.0 server.

Thanks in advance.
joel011197 asked:
akis commented:
You may want to look at the libwww-perl (LWP) modules (version 5.10 is the latest, from CPAN). These let a Perl script actually visit a web site/page, send GET or POST requests, and receive the content. You can then parse that page content (received as text, of course) and do your stuff.

Example:
Assuming the array @URL holds all the URLs you want to visit:

##*******************************
use LWP::UserAgent;
use HTTP::Request;

my $ua = LWP::UserAgent->new;

foreach my $url (@URL) {
    my $request  = HTTP::Request->new('GET', $url);
    my $response = $ua->request($request);
    if ($response->is_success) {
        print $response->content();   # print the page's text
    } else {
        print "Failed: " . $response->as_string() . "\n";
    }
}
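
Since you also asked about timeouts and HTML forms, here is a minimal sketch along the same lines. The form URL and field names in it (http://example.com/search.cgi, fields q and lang) are made up for illustration; substitute your own. LWP::UserAgent's timeout() method sets how many seconds to wait before giving up on an unresponsive URL, and HTTP::Request::Common builds the kind of POST request a browser sends when submitting a form.

##*******************************
use LWP::UserAgent;
use HTTP::Request::Common qw(POST);

my $ua = LWP::UserAgent->new;
$ua->timeout(30);   # wait at most 30 seconds for each response

# Submit a form the way a browser submits <FORM METHOD="POST">.
# The URL and field names below are placeholders for this example.
my $request  = POST 'http://example.com/search.cgi',
               [ q => 'perl', lang => 'en' ];
my $response = $ua->request($request);

if ($response->is_success) {
    print $response->content();
} else {
    # A timed-out or unreachable URL ends up here, so this is where
    # you could launch your fallback program.
    print "Failed: " . $response->status_line() . "\n";
    # system('your_fallback_program');   # placeholder command
}

For a form that uses METHOD="GET", you can instead append the URL-encoded fields to the URL as a query string and fetch it like any other page.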
