• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 3890

Using cURL to download an entire webpage (HTML, images, CSS, JS, etc.)

Hi,

I have been searching here and Google for the past few days but I haven't been able to find an answer.

I want a script that will download one page of a website with all of its content, i.e. images, CSS, JS, etc.

I have been able to save the HTML (text) like this:

function get_data($url)
{
	$ch = curl_init();
	$timeout = 5;
	curl_setopt($ch, CURLOPT_URL, $url);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
	$data = curl_exec($ch);
	curl_close($ch);
	return $data;
}

$returned_content = get_data('http://example.com/page.htm');

$my_file = 'file.htm';
$handle = fopen($my_file, 'w') or die('Cannot open file: ' . $my_file);
fwrite($handle, $returned_content);
fclose($handle);



This will save a file called 'file.htm' with all the HTML but no images, CSS, or JS.
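The saved HTML only references its assets, so a possible next step is to parse the file and collect those references. Here is a sketch, assuming the DOM extension is available; extract_asset_urls() is a hypothetical helper, not part of the code above:

```php
<?php
// Hypothetical helper: collect asset URLs (img src, stylesheet href,
// script src) from an HTML string so each can be downloaded separately.
function extract_asset_urls($html)
{
    $urls = array();
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // @: real-world HTML triggers parser warnings

    foreach ($doc->getElementsByTagName('img') as $img) {
        $urls[] = $img->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('script') as $script) {
        if ($script->getAttribute('src') !== '') {
            $urls[] = $script->getAttribute('src');
        }
    }
    foreach ($doc->getElementsByTagName('link') as $link) {
        if (strtolower($link->getAttribute('rel')) === 'stylesheet') {
            $urls[] = $link->getAttribute('href');
        }
    }
    return array_values(array_unique($urls));
}
```

Each returned URL could then be fetched with get_data() and written to disk the same way as file.htm.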

I have also been able to do this:

$img = array();
$img[] = 'http://example.com/image.jpg';

foreach ($img as $i) {
	save_image($i);
	if (getimagesize(basename($i))) {
		echo 'Image ' . basename($i) . ' downloaded OK';
	} else {
		echo 'Image ' . basename($i) . ' download failed';
	}
}

function save_image($img, $fullpath = 'basename')
{
	if ($fullpath == 'basename') {
		$fullpath = basename($img);
	}
	$ch = curl_init($img);
	curl_setopt($ch, CURLOPT_HEADER, 0);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
	$rawdata = curl_exec($ch);
	curl_close($ch);
	if (file_exists($fullpath)) {
		unlink($fullpath);
	}
	$fp = fopen($fullpath, 'w');
	fwrite($fp, $rawdata);
	fclose($fp);
}


This will save that specific image but I haven't found anything that will save the entire HTML with all the content behind it.
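One wrinkle worth noting: save_image() expects a full URL, but the src/href values inside a saved page are often relative. A rough sketch of turning them into absolute URLs; resolve_url() is a hypothetical helper that handles only the common cases, not full RFC 3986 resolution:

```php
<?php
// Hypothetical helper: resolve a (possibly relative) reference found in
// a page against the URL of the page it came from.
function resolve_url($base, $ref)
{
    if (preg_match('#^https?://#i', $ref)) {
        return $ref;                            // already absolute
    }
    $parts = parse_url($base);
    $origin = $parts['scheme'] . '://' . $parts['host'];
    if (substr($ref, 0, 2) === '//') {
        return $parts['scheme'] . ':' . $ref;   // protocol-relative
    }
    if (substr($ref, 0, 1) === '/') {
        return $origin . $ref;                  // root-relative
    }
    $path = isset($parts['path']) ? $parts['path'] : '/';
    $dir = substr($path, 0, strrpos($path, '/') + 1);
    return $origin . $dir . $ref;               // relative to page dir
}

// e.g. save_image(resolve_url('http://example.com/page.htm', 'image.jpg'));
```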


Thanks for your help in advance!
jamblaAsked:
1 Solution
roemelboemelCommented:
I would propose using a wget call from PHP.
Wget has the "-p" option to get a copy of the page including all images, CSS, etc.
If you further want the links to the images/CSS rewritten to point at the downloaded content, you have to specify "-k".
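When wiring this into PHP, it can help to build the command string safely so that URLs containing shell metacharacters can't break the exec() call. A small sketch; build_wget_cmd() is a hypothetical name:

```php
<?php
// Hypothetical helper: assemble the wget command line safely.
// -P = target directory prefix, -p = page requisites, -k = convert links.
function build_wget_cmd($dir, $url)
{
    return 'wget -P ' . escapeshellarg($dir)
         . ' -p -k ' . escapeshellarg($url);
}

// e.g. exec(build_wget_cmd('/tmp/', 'www.gnu.org'));
```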
 
jamblaAuthor Commented:
Hello roemelboemel,

Thanks for your response. In my hours of searching I have seen a lot of talk about wget; however, I'm not sure how to use it. I have verified that it is enabled on my server.

Using wget and php how would I go about saving a webpage to a folder with all the contents of the webpage?  Could you show me an example of code that would do this for me?

Thanks again.
 
roemelboemelCommented:
This snippet would save http://www.gnu.org in the directory /tmp/www.gnu.org/:
exec('wget -P /tmp/ -p -k www.gnu.org');


 
jamblaAuthor Commented:
Hi roemelboemel,

I uploaded this to my server:

<?php

exec('wget -P /tmp/ -p -k www.gnu.org');

?>



When I execute it I'm getting a "500 Internal Server Error". I contacted my hosting provider and they confirmed that wget is enabled.

Any suggestions?
 
roemelboemelCommented:
I've tested it on one of my servers and it's working. Please check the error logs of the Apache webserver for the specific error. Possible causes:

  • no permission to write to /tmp/
  • not allowed to call external commands using exec
  • SELinux preventing something
  • ...
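One way to narrow this down is to capture wget's output and exit code instead of discarding them. A sketch; run_command() is a hypothetical wrapper, and wget being on the webserver user's PATH is an assumption:

```php
<?php
// Hypothetical wrapper: run a shell command, merging stderr into the
// captured output so wget's error messages become visible.
function run_command($cmd)
{
    $output = array();
    $exitCode = 1;
    exec($cmd . ' 2>&1', $output, $exitCode);
    return array($output, $exitCode);
}

// e.g.:
// list($out, $rc) = run_command('wget -P /tmp/ -p -k www.gnu.org');
// if ($rc !== 0) {
//     echo "wget failed ($rc):\n" . implode("\n", $out);
// }
```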
 
jamblaAuthor Commented:
OK,

I was able to get rid of the 500 Internal Server Error and it seems to execute fine. However, I looked in the /tmp/ folder and I don't see any files/folders there that match the downloaded site.
 
roemelboemelCommented:
Should be in /tmp/www.gnu.org/ with all the images etc.
What about when running it on the command line?
 
jamblaAuthor Commented:
When I try to go to /tmp/www.gnu.org/ I'm getting "550 Can't change directory to /tmp/www.gnu.org: No such file or directory"

What about when running it on the command line?

I'm running Windows 7.  Can I run a Linux cmd from Windows?
 
roemelboemelCommented:
If you have shell access via SSH you could run it on the server. Furthermore, if you have a directory that the user the webserver runs as (not necessarily the same user you use to upload your PHP files) can write to, you could point wget there instead of /tmp.
You could make a directory on the webserver, e.g.

mkdir /home/<myhomedir>/foo

then give write access to the webserver user, or for testing just:

chmod 777 /home/<myhomedir>/foo

And then point the wget output there:

<?php
exec('wget -P /home/<myhomedir>/foo/ -p -k www.gnu.org');
?>
 
jamblaAuthor Commented:
Hi roemelboemel,

Thanks for the suggestion; however, I don't have shell access. I have tried to mess around with wget for a few days now and I always end up getting nowhere, which is why I started looking more at cURL.

I will try to create a dir with 777 and try to send the files there.
 
jamblaAuthor Commented:
Hi roemelboemel,

I have tried creating a folder, changing the permission to 777 and running the wget script but I'm still not getting the folder/files.
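Another thing worth ruling out is whether the PHP process (the webserver user) can actually write to that directory, since a chmod made via FTP doesn't always apply to the path the script sees. A quick sketch; dir_writable() is a hypothetical helper:

```php
<?php
// Hypothetical check: can this PHP process write to $dir?
function dir_writable($dir)
{
    return is_dir($dir) && is_writable($dir);
}

// e.g.:
// if (!dir_writable('/home/<myhomedir>/foo')) {
//     echo 'webserver user cannot write here';
// }
```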
 
jamblaAuthor Commented:
Hi roemelboemel,

I tried a few other things and I got it working!


Thanks so much for your help!
 
jamblaAuthor Commented:
Like always, the gurus at EE come through!
Question has a verified solution.
