Download Website from Google Cache

I recently suffered a SQL injection attack on one of my websites, which meant a large proportion of my MS-SQL tables were overwritten with malware scripts.
I searched Google and found that roughly 480 pages are cached.
Is there a quick way of downloading all these cached pages in one go, using software?
I don't fancy having to click on each cached link and go File > Save in my browser.
Although Google are a very friendly company, I doubt they are going to act on this any time soon for you. If they do at all, they will say to just view and save from their cache (which you already know about).

If you know where the pages are in their cache, you can use the automated approach that I mentioned in the first comment. The longer you wait (especially for a reply and action from Google), the greater the chance your site will drop from their cache the next time they visit and be gone forever...

Alternatively, the Wayback Machine stores copies of people's websites; you may be in luck and they have a copy. You are more likely to get a reply and action from them than from the Googleplex.

On something like this, however, time is of the essence, so at least try the suggestions I made while waiting for feedback from Google, because as the old saying goes... once it's gone, it's gone!
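If scripting is an option, the Wayback Machine retrieval above can be automated: the Internet Archive exposes a public "availability" endpoint that returns the closest archived snapshot for a URL. A minimal Python sketch (the `availability_url` and `closest_snapshot` helper names are illustrative, not part of any library):

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def availability_url(page_url, timestamp=None):
    """Build a query URL for the Wayback Machine availability API."""
    params = {"url": page_url}
    if timestamp:
        # Optional YYYYMMDDhhmmss value; the closest snapshot to it wins.
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urllib.parse.urlencode(params)

def closest_snapshot(page_url, timestamp=None):
    """Return the closest archived snapshot URL, or None if not archived."""
    with urllib.request.urlopen(availability_url(page_url, timestamp)) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None
```

Feeding each page of the damaged site through `closest_snapshot` and saving whatever comes back would recover everything the archive holds, without clicking through pages by hand.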

Good luck
Bluesquirrel WebWhacker can be unleashed on a website to save it and its assets; try pointing it at the cache?

Alternatively, use it on the Wayback Machine here

It may be logged in there - my site's in there, so I can actually see what it looked like after 3 redesigns; cool stuff.
Bernard S. (CTO) commented:
Have you tried contacting Google?

fgict (Author) commented:
Hi fibo,

No, I have not tried contacting Google.
How would I go about that? What do you think they could provide me?
fgict (Author) commented:
Hi bluefezteam,

I have reviewed your answer re: downloading from Google using website downloader software, but the big problem I have is this:
When I point it to a cached page, e.g.
the links within this page are not cached links but the live website links, so it will spider all the dynamic pages I have on the site, e.g. news.asp, and this will return the corrupted content.

I don't know how else to download the cached pages without doing it manually, one by one.
Hmm, try the Wayback Machine; you may have some luck there.
Is there a way of listing all the pages that you need to save?

For example, do you still have the sitemap structure as a Google sitemap (XML doc)? If you do, then maybe it's possible to apply that structure to Google's cache and create a routine to strip all content from the page links defined in the sitemap.

You should be able to strip out the text from the pages automatically using some form of PHP/.Net routine applied to a recursive loop controlled by the sitemap.
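A minimal sketch of that sitemap-driven loop, in Python rather than PHP/.Net, assuming Google's historical cache-lookup pattern (`webcache.googleusercontent.com/search?q=cache:<page-url>`) still serves the cached copies; `sitemap_urls`, `cache_url`, and `download_cached` are hypothetical helper names:

```python
import time
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace used by Google sitemaps.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def cache_url(page_url):
    """Build the Google cache lookup URL for a page (assumed pattern)."""
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + urllib.parse.quote(page_url, safe=""))

def download_cached(xml_text, save):
    """Fetch the cached copy of each sitemap page; save(url, html) stores it."""
    for page in sitemap_urls(xml_text):
        req = urllib.request.Request(
            cache_url(page), headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            save(page, resp.read())
        time.sleep(2)  # throttle; aggressive requests get blocked quickly
```

Because the loop is driven by the sitemap rather than by the links inside each cached page, it never spiders the live (corrupted) dynamic pages such as news.asp.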

Bernard S. (CTO) commented:
What is the situation now?
Question has a verified solution.
