
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 456

Need to scrape images from website

I have a client with an existing website.  He is unhappy with his current designer and would like to use me.  I have access to his product database, but he was never given website access (FTP) credentials, so I cannot directly access the 10,000+ images he has for his site.
The database gives me the relative paths to all images.

Is there some script I can write that will:
1. parse through rows of a file
2. navigate to the image
3. save the image with the same relative path structure

...so that I can quickly get all the images I need for a site demo?
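
For reference, a minimal sketch of that kind of script in Python, assuming the relative paths sit one per line in a plain text file and the images are publicly reachable over HTTP. The site URL, input file, and output folder below are placeholders, not values from the question:

import os
import urllib.request

BASE_URL = "http://www.example.com"   # placeholder: the client's live site
PATH_FILE = "image_paths.txt"         # placeholder: one relative image path per line
OUTPUT_DIR = "site_images"            # local folder that will mirror the structure

with open(PATH_FILE) as f:
    for line in f:
        rel_path = line.strip().lstrip("/")
        if not rel_path:
            continue
        url = BASE_URL + "/" + rel_path
        dest = os.path.join(OUTPUT_DIR, rel_path)
        os.makedirs(os.path.dirname(dest), exist_ok=True)  # recreate the relative folders locally
        try:
            urllib.request.urlretrieve(url, dest)           # fetch and save the image
        except Exception as err:
            print("Could not fetch", url, "-", err)

Paths containing spaces or other special characters may need URL-encoding (urllib.parse.quote) before downloading.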

Thanks!
Asked by: dimmergeek
1 Solution
 
Scott Fell, EE MVE, Developer, commented:
Try using httrack. You can download the images in the same folder structure they currently are in.  http://www.httrack.com/
 
Ray Paseur commented:
+1 for httrack.  I've used it for exactly the kind of thing you need.  IIRC it is Windows-only.  Runs on your laptop.  Much easier than trying to do this with a server-to-server PHP script!
 
Dave Baldwin, Fixer of Problems, commented:
There is a Linux version of HTTrack.  http://www.httrack.com/   WinHTTrack is the Windows version and is available on the same page.
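
As an illustration of the command-line version, a command along these lines mirrors just the image files into a local folder while keeping the site's directory layout. The URL and output folder are placeholders, and the +patterns are HTTrack scan-rule filters (check the HTTrack documentation for the exact syntax on your version):

httrack "http://www.example.com/" -O "./site-mirror" "+*.jpg" "+*.jpeg" "+*.png" "+*.gif" -v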
