jambla asked:

wget saving web page help

Hello,

I need a bit of help with wget.

When a user submits a URL, I want to use wget to create an archive/backup of that specific page, including all of the page's contents (CSS, images, JS, etc.).

I have the following code, and it does about 90% of what I need:

exec("wget -e robots=off --limit-rate=250k -F -P /home/USERNAME/public_html/results/". $rnd1 ."/". $rnd2 ."/"." -p -k -E ". $site_url ."");

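As a side note, since $site_url comes straight from user input, it is probably worth escaping it before handing it to the shell. A minimal sketch of the same call with escapeshellarg(), assuming the same $rnd1, $rnd2 and $site_url variables as above (the path and rate limit are just carried over from the question):

$dest = "/home/USERNAME/public_html/results/" . $rnd1 . "/" . $rnd2 . "/";
$cmd  = "wget -e robots=off --limit-rate=250k -F"
      . " -P " . escapeshellarg($dest)
      . " -p -k -E "
      . escapeshellarg($site_url);
exec($cmd, $output, $status);  // $status holds wget's exit code (0 = success)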


The problem with this code is that if a user submits a URL like this:

http://techcrunch.com/2011/03/22/digital-textbook-startup-inkling-nabs-multi-million-dollar-investment-from-mcgraw-hill-and-pearson/

The backup will be structured this way:

techcrunch.com/2011/03/22/digital-textbook-startup-inkling-nabs-multi-million-dollar-investment-from-mcgraw-hill-and-pearson/ (each path segment becomes its own nested folder)

The saved HTML, however, still loads all of its images from the live site (techcrunch.com) rather than from the local backup.
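In case the deeply nested techcrunch.com/2011/03/22/... tree shown above is part of the problem, wget also has options for flattening the directory layout. This is only a sketch of one possibility, reusing the same variables as above; -nd (--no-directories) drops the host/path tree entirely, while -nH plus --cut-dirs=N keeps partial structure:

$dest = "/home/USERNAME/public_html/results/" . $rnd1 . "/" . $rnd2 . "/";
// -nd saves every downloaded file directly into $dest with no
// techcrunch.com/2011/03/22/... subfolders; -k still rewrites the page's
// links so they point at the flattened local copies.
$cmd = "wget -e robots=off --limit-rate=250k"
     . " -P " . escapeshellarg($dest)
     . " -nd -p -k -E "
     . escapeshellarg($site_url);
exec($cmd);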

However, if the user submits a URL like this:

http://blog.joerogan.net/archives/2889

The backup will contain all the images, CSS, etc., as expected.



I hope this makes sense. If not, I will try to clarify.
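One guess as to why the two URLs behave differently (an assumption, not something confirmed in the thread): on the TechCrunch post the images and stylesheets are likely served from a different hostname (a CDN), and wget -p will not download page requisites from other hosts unless --span-hosts (-H) is given, so -k leaves those links pointing at the live site; on joerogan.net everything lives on the same host, so it all gets saved locally. A rough sketch with -H and a -D whitelist, where the domain list is purely illustrative and would have to match whatever hosts the page really uses:

$dest = "/home/USERNAME/public_html/results/" . $rnd1 . "/" . $rnd2 . "/";
// -H (--span-hosts) lets -p fetch requisites hosted on other domains,
// and -D (--domains) limits that spanning to a comma-separated whitelist.
$cmd = "wget -e robots=off --limit-rate=250k"
     . " -P " . escapeshellarg($dest)
     . " -p -k -E -H -D techcrunch.com,wordpress.com "   // example domains only
     . escapeshellarg($site_url);
exec($cmd);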
ASKER CERTIFIED SOLUTION
absx (United Kingdom)

[The accepted solution is only available to Experts Exchange members.]
jambla (ASKER)

Hello absx,

Thanks for the link; I will have a look to see if it can help me out.


Anyone else have any suggestions?