I need a bit of help with wget
When a user submits a URL, I want to use wget to create an archive/backup of that specific page, including all of the page's assets: CSS, images, JS, etc.
I have the following code, and it does about 90% of what I need:
exec("wget -e robots=off --limit-rate=250k -F -P /home/USERNAME/public_html/results/" . $rnd1 . "/" . $rnd2 . "/ -p -k -E " . escapeshellarg($site_url));
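For reference, here is the same command spelled out as a dry-run shell sketch (the `USERNAME`, `RND1`/`RND2`, and `SITE_URL` values are placeholders standing in for the PHP variables, not values from my code):

```shell
#!/bin/sh
# Placeholders standing in for the PHP variables above.
USERNAME="USERNAME"
RND1="abc123"
RND2="def456"
SITE_URL="http://techcrunch.com/"

# Destination folder, mirroring the PHP string concatenation.
DEST="/home/$USERNAME/public_html/results/$RND1/$RND2/"

# -p  download page requisites (images, CSS, JS)
# -k  convert links so the saved page works locally
# -E  add a .html extension where needed
# Printed instead of executed so this stays a dry run.
CMD="wget -e robots=off --limit-rate=250k -P '$DEST' -p -k -E '$SITE_URL'"
printf '%s\n' "$CMD"
```

Quoting the destination and the URL (or using `escapeshellarg()` on the PHP side) matters because the URL comes from user input.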
The problem with this code is that if a user submits a URL like this:
The backup will be structured this way:
[ techcrunch.com - Folder ] / [ 2011 - Folder ] / [ 03 - Folder ] / [ 22 - Folder ] / [ digital-textbook-startup-ion - Folder ]
and the saved HTML will still load all of its images from the live site (techcrunch.com) instead of from the local copy.
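One possible cause (an assumption on my part, not something I've confirmed): the page's images may actually be served from a different host such as a CDN, and by default wget does not span hosts even with `-p`. A hedged sketch of the command with host spanning enabled, again as a dry run (`static.example.com` is a hypothetical CDN host):

```shell
#!/bin/sh
# Sketch only: SITE_URL and the -D domain list are assumptions for illustration.
SITE_URL="http://techcrunch.com/2011/03/22/some-article/"
DEST="/home/USERNAME/public_html/results/abc/def/"

# -H  span hosts so requisites served from other domains are fetched too
# -D  restrict spanning to the listed domains (static.example.com is hypothetical)
CMD="wget -e robots=off --limit-rate=250k -P '$DEST' -p -k -E -H -D techcrunch.com,static.example.com '$SITE_URL'"
printf '%s\n' "$CMD"
```

Without `-D`, `-H` would follow requisites to any host, so limiting the domains keeps the backup from ballooning.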
However, if the user submits a URL like this:
the backup will contain all the images, CSS, etc., as expected.
I hope this makes sense. If not, I will try to clarify.