seekinG1
asked on
Function file_get_contents connection timeout
Hi,
I'm using file_get_contents() to display dynamic text on my site, but it doesn't work 100% of the time. Every once in a while it times out and gives me the error below.
Warning: file_get_contents(***) [function.file-get-contents]: failed to open stream: Connection timed out in *** on line 9
What are my solutions?
I'm a php newbie so detailed answers are appreciated.
Thank you!
my code looks like this:
$url = "http://feed.com"; // <--- gives me a .csv file
$csv = file_get_contents($url);
$output = explode(",", $csv);
Try this code:
// Create a stream context with a longer timeout
$opts = array(
    'http' => array(
        'timeout' => 60,
    )
);
$context = stream_context_create($opts);
// Open the file using the HTTP context options set above
$file = file_get_contents($url, false, $context);
You might find that some error handling is appropriate to this application. If the remote server hangs or fails to respond, you need to capture the condition and at least report it, or program your way around it.
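A minimal sketch of that kind of handling, wrapped in a hypothetical fetch_feed() helper (the function name and the 10-second default are my choices, not anything from this thread):

```php
<?php
// Hypothetical helper: fetch a URL (or local path) and signal failure
// cleanly instead of letting the timeout warning leak into the page.
function fetch_feed($url, $timeout = 10)
{
    $context = stream_context_create(array(
        'http' => array('timeout' => $timeout),
    ));
    // The @ suppresses PHP's warning; we report the failure ourselves.
    $data = @file_get_contents($url, false, $context);
    if ($data === false) {
        // Capture the condition and at least report it.
        error_log("fetch_feed: could not read $url");
        return false;
    }
    return $data;
}

// Usage idea (commented out so the sketch stays self-contained):
// $csv = fetch_feed("http://feed.com");
// if ($csv === false) { echo "Feed temporarily unavailable."; }
```

The caller then decides whether to show a fallback message or retry, rather than dumping a warning into the rendered page.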
Since we do not know exactly what your "feed.com" file looks like, we cannot guess how long the remote server would be expected to take to produce the CSV file, but if it is more than a second or two, I would be surprised. Most HTTP requests complete in sub-second time.
For remote services, I prefer CURL to file_get_contents() - it returns better error information and is more flexible in several ways. Here is an example of CURL packaged into a function. You can install it and run it to see the moving parts. HTH, ~Ray
<?php // RAY_curl_example.php
error_reporting(E_ALL);

function my_curl($url, $timeout=2, $error_report=FALSE)
{
    $curl = curl_init();

    // HEADERS FROM FIREFOX - APPEARS TO BE A BROWSER REFERRED BY GOOGLE
    $header = array();
    $header[] = "Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
    $header[] = "Cache-Control: max-age=0";
    $header[] = "Connection: keep-alive";
    $header[] = "Keep-Alive: 300";
    $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
    $header[] = "Accept-Language: en-us,en;q=0.5";
    $header[] = "Pragma: "; // browsers keep this blank.

    // SET THE CURL OPTIONS - SEE http://php.net/manual/en/function.curl-setopt.php
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6');
    curl_setopt($curl, CURLOPT_HTTPHEADER, $header);
    curl_setopt($curl, CURLOPT_REFERER, 'http://www.google.com');
    curl_setopt($curl, CURLOPT_ENCODING, 'gzip,deflate');
    curl_setopt($curl, CURLOPT_AUTOREFERER, TRUE);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($curl, CURLOPT_TIMEOUT, $timeout);

    // RUN THE CURL REQUEST AND GET THE RESULTS
    $htm = curl_exec($curl);
    $err = curl_errno($curl);
    $inf = curl_getinfo($curl);
    curl_close($curl);

    // ON FAILURE
    if (!$htm)
    {
        // PROCESS ERRORS HERE
        if ($error_report)
        {
            echo "CURL FAIL: $url TIMEOUT=$timeout, CURL_ERRNO=$err";
            var_dump($inf);
        }
        return FALSE;
    }

    // ON SUCCESS
    return $htm;
}

// USAGE EXAMPLE
$url = "http://twitter.com/Ray_Paseur";
$htm = my_curl($url);
if (!$htm) die("NO $url");

// SHOW WHAT WE GOT
echo "<pre>";
echo htmlentities($htm);
echo "</pre>";
ASKER
Thanks for the replies!
Gary, could you give me an example of how to do it?
This is the feed:
http://finance.yahoo.com/d/quotes.csv?s=lulu&f=snl1c1ohgvt1
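Side note on that feed: when the line comes back, str_getcsv() is safer than a bare explode(",") because Yahoo quotes the name field, which can itself contain a comma. A sketch with a made-up sample line (the values are invented for illustration; with f=snl1c1ohgvt1 the order should be symbol, name, last trade, change, open, high, low, volume, time, but verify against the feed):

```php
<?php
// Sketch: parse one line of the quotes CSV. str_getcsv() respects the
// quotes around text fields, which explode(",") would split incorrectly.
$line = '"LULU","Lululemon Athletica, Inc.",57.13,-0.42,57.50,57.80,56.90,1234567,"4:00pm"';
$fields = str_getcsv($line);
// Pull the first three requested fields into named variables.
list($symbol, $name, $last) = $fields;
echo "$symbol last traded at $last\n";
```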
ASKER
hernst42: it works with the code you recommended, but how is that going to help with the occasional timeout?
ASKER
Ray_Paseur: I'd like to have the error handling in the code, but this is what I get when I run it:
<HEAD><TITLE>Redirect</TITLE></HEAD>
<BODY BGCOLOR="white" FGCOLOR="black">
<FONT FACE="Helvetica,Arial"><B>
"<em>http://download.finance.yahoo.com/d/quotes.csv?s=lulu&f=snl1c1ohgvt1</em>".<p></B></FONT>
<!-- default "Redirect" response (301) -->
</BODY>
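That output is a 301 redirect page, which suggests the server is bouncing the request to download.finance.yahoo.com and my_curl() is returning the redirect notice instead of the data. One likely fix (my guess, not confirmed against this feed) is to tell cURL to follow redirects, by adding these options inside my_curl() before curl_exec(); they are shown on a fresh handle here so the snippet stands alone:

```php
<?php
// Sketch: extra options to add inside my_curl(), before curl_exec().
$curl = curl_init();
// Follow HTTP redirects automatically (handles the 301 above) ...
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, TRUE);
// ... but cap the number of hops so a redirect loop cannot hang us.
curl_setopt($curl, CURLOPT_MAXREDIRS, 5);
curl_close($curl);
```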
ASKER CERTIFIED SOLUTION
ASKER
Ray,
This is fantastic! Much faster than file_get_contents().
Many thanks!
Thanks for the points - it's a great question. ~Ray
Find the Timeout variable and give it a value of 300:
Timeout 300
Thanks
Gary