  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 438

Fetch each new page in a new process

I am fetching a page in order to extract a part of that site.
I am using the following code:

#!/usr/local/bin/perl

# Read the currency-conversion page from www.oanda.com
use LWP::Simple;

my @test = ("EUR", "USD", "GBP", "NLG");

print "Content-type: text/html\n\n";
foreach my $currency (@test) {
     my $html = get("http://www.oanda.com/converter/classic?value=1&exch=NLG&expr=$currency");

     # Remove the \n chars from that variable.
     $html =~ s/\n//g;
     # Extract the relevant part.
     $html =~ /<!-- conversion result starts  -->(.*)<!-- conversion result ends  -->/;
     # Set $html_text
     my $html_text = $1;
     # Squeeze multiple white spaces to a single space
     $html_text =~ s/\s+/ /g;
     # Remove any HTML tags
     $html_text =~ s/<[^>]*>//g;
     # Get the currency value
     $html_text =~ /$currency\s+=\s+([\d,]+\.?\d+)/;
     my $result = $1;
     $result =~ s/,//g;

     print "<br>result: $result<br>";
}

The fetching is done by the get command, once for each value in @test.
Sometimes a get takes so long that I receive the following error:

CGI Timeout
The specified CGI application exceeded the allowed time for processing. The server has deleted the process.

Is it possible to start a new process for each get command?
How can I make sure the CGI timeout does not happen?
If it does happen, how can I restart the process (maybe up to a maximum of 3 times)?
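A retry wrapper along these lines could cap the number of attempts. This is only a sketch, not code from the thread: the fetch_with_retry name and the 5-second timeout are my own choices, and it uses LWP::UserAgent instead of LWP::Simple because that is where a per-request timeout can be set.

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Fetch a URL, retrying on failure; returns the body, or undef after $max_tries failures.
sub fetch_with_retry {
    my ($url, $max_tries) = @_;
    my $ua = LWP::UserAgent->new;
    $ua->timeout(5);    # assumed value: fail fast, well under the server's CGI limit
    for my $try (1 .. $max_tries) {
        my $response = $ua->get($url);
        return $response->content if $response->is_success;
    }
    return undef;       # every attempt failed or timed out
}

# e.g. my $html = fetch_with_retry(
#     "http://www.oanda.com/converter/classic?value=1&exch=NLG&expr=EUR", 3);
```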

weversbvAsked:
1 Solution
 
christopher sagayamCommented:
?
 
weversbvAuthor Commented:
Get the page with the expr EUR.
Write it to the screen, then start the script again with the next value (USD), and so on.
I think the foreach loop would then have to be removed?
 
marecsCommented:
This script forks and thus accesses the site concurrently for each currency. It is still possible that one of the accesses takes too long and the script may time out. Try using LWP::UserAgent and set the timeout value.

#!/usr/bin/perl -w

use strict;
use LWP::Simple;
use vars qw($cur $child $html_text $html);

my @test = ( "EUR","USD","GBP","NLG");
$|++; # Autoflush essential
print "Content-type: text/html\n\n";

foreach $cur (reverse @test) {
    if ($child = fork) {
        waitpid($child,0);
        print $html_text if defined $html_text;
        exit;
    }
    else {
        die "fork failed" unless defined $child;  # fork returns undef on failure
        $html = get("http://www.oanda.com/converter/classic?value=1&exch=NLG&expr=$cur");
        $html =~ /<!-- conversion result starts  -->(.*)<!-- conversion result ends  -->/s;
        $html_text = $1;
    }
}

print $html_text; # Last child with no child of its own

 
marecsCommented:
If the server killed you then that's it. You cannot start yourself up again. The user will have to initiate a new request from his browser. Why not store a cache of the exchange rates and fall back to these values when a timeout occurs?

 
weversbvAuthor Commented:
What do you mean, marecs?
Can you give me an example?

By the way, I tried your code, but it does not work on NT (Windows 2000).
The error is: "The Unsupported function fork function is unimplemented..."
 
weversbvAuthor Commented:
How do I use LWP::UserAgent and set the timeout value?
 
marecsCommented:
Then I am sorry I put in the effort; my solution is unfortunately UNIX-only.

If you use LWP::UserAgent rather than LWP::Simple, there is an option to specify the timeout of a request. What you probably want to prevent is that nasty "CGI Timeout" error. So if your request times out, it would be handy to have values you can fall back on. This can be achieved by writing the exchange rates to a file every time you are able to get them, and then reading those values back when a timeout occurs.

Sorry, no time for an example. Look at perldoc LWP::UserAgent.
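As a sketch of that fallback idea (the cache file name, the one-rate-per-line format, and the helper names are assumptions, not part of the thread): write the rates to a small file whenever a fetch succeeds, and read them back when a fetch times out.

```perl
use strict;
use warnings;

# Save a hash of currency => rate, one pair per line, to the cache file.
sub save_rates {
    my ($file, %rates) = @_;
    open my $fh, '>', $file or die "Cannot write $file: $!";
    print $fh "$_ $rates{$_}\n" for sort keys %rates;
    close $fh;
}

# Load previously cached rates; returns an empty hash if no cache exists yet.
sub load_rates {
    my ($file) = @_;
    my %rates;
    if (open my $fh, '<', $file) {
        while (my $line = <$fh>) {
            my ($cur, $rate) = split ' ', $line;
            $rates{$cur} = $rate;
        }
        close $fh;
    }
    return %rates;
}

# On a successful fetch: save_rates("rates.txt", EUR => "0.95", USD => "1.00");
# On a timeout:         my %fallback = load_rates("rates.txt");
```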
 
marecsCommented:
use LWP::UserAgent;
$ua = LWP::UserAgent->new;
$ua->timeout(5); # 5 seconds, set before sending the request
$request = HTTP::Request->new('GET', 'http://www.msn.com/');
$response = $ua->request($request);
$response->is_success or die "Timed out: " . $response->status_line;
 
weversbvAuthor Commented:
I will increase to 300 points for an example!
 
weversbvAuthor Commented:
I tried this:

#!/usr/local/bin/perl

$|++;

use LWP::UserAgent;
$ua = LWP::UserAgent->new;
$request = HTTP::Request->new('GET', 'http://www.msn.com/');
$ua->timeout(5); # 5 seconds
$response = $ua->request($request);
$response or die "Timed out";

print "Content-type: text/html\n\n";
print "response: $response<br>";
print "request: $request<br>";

But this is the result:

response: HTTP::Response=HASH(0x8647ac)
request: HTTP::Request=HASH(0x3a96c8)

What am I doing wrong?
 
marecsCommented:
print "response: ", $response->content, "<br>";

So you could use your original script with
$html = $response->content;
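Putting the thread's pieces together, the original loop could be rewritten along these lines. This is a sketch, not code from the thread: the extract_rate helper name and the 5-second timeout are my own, while the marker regex and cleanup steps come from the original script.

```perl
#!/usr/local/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Pull the conversion result out of a fetched page; returns undef if absent.
sub extract_rate {
    my ($html, $currency) = @_;
    $html =~ s/\n//g;
    return undef unless $html =~
        /<!-- conversion result starts  -->(.*)<!-- conversion result ends  -->/;
    my $text = $1;
    $text =~ s/\s+/ /g;     # squeeze whitespace
    $text =~ s/<[^>]*>//g;  # strip HTML tags
    return undef unless $text =~ /\Q$currency\E\s+=\s+([\d,]+\.?\d+)/;
    my $rate = $1;
    $rate =~ s/,//g;        # drop thousands separators
    return $rate;
}

my @test = ("EUR", "USD", "GBP", "NLG");
my $ua = LWP::UserAgent->new;
$ua->timeout(5);    # assumed value: fail well before the CGI limit

print "Content-type: text/html\n\n";
foreach my $cur (@test) {
    my $response = $ua->get(
        "http://www.oanda.com/converter/classic?value=1&exch=NLG&expr=$cur");
    unless ($response->is_success) {
        print "<br>$cur: request failed or timed out<br>";
        next;
    }
    my $rate = extract_rate($response->content, $cur);
    print defined $rate ? "<br>result: $rate<br>" : "<br>$cur: no result<br>";
}
```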