• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 460
  • Last Modified:

Downloading images from URL links

I have a database with images for each product.

The image field is populated with URLs for the images.

How can I download all of the images from the URLs? Ideally I would also be able to choose what to name each image.

Is there a program or script I can run to do this?
Asked by: gleverator
1 Solution
 
Jeffrey CoachmanCommented:
You may want to click the "Request Attention" link and ask that the appropriate web zones get added to this Q.
 
gdemariaCommented:

What language are you coding in? PHP, ColdFusion, ASP...?
 
puppydogbuddyCommented:
Try downloading IrfanView, a well-known free graphics viewer, editor, etc. See this link:
          http://www.irfanview.com/

 
Bill PrewCommented:
It is hard to be sure exactly what you are after: whether you want real-time download and display of the images within an application, or not. Since you mentioned renaming images, it sounds more like you want to pull them all at once to a local location for the application to use. If that is the case, take a look at the free WGET utility; it is great at pulling things from the web and storing local copies.

http://www.gnu.org/software/wget/
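For example (a minimal sketch, assuming you export the URL column from your database into a plain-text file, one URL per line, here called urls.txt, and have WGET available on your path), a single command like this should pull everything into a local folder:

wget --input-file=urls.txt --directory-prefix=C:\Images\Downloaded

WGET keeps the original file names from the URLs, so any renaming would need to happen afterwards, or by running WGET once per URL with its -O option to set the output file name.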

~bp
 
BillDLCommented:
Yes, there are some methods you could adopt using a couple of 3rd-party programs, but whether my proposed suggestion would be suitable for your needs depends on several aspects that perhaps you can explain a little more about:

1. Are you trying to run a script from WITHIN your database, e.g. using VBA scripting?

2. Are you trying to run Perl, VBScript, or JavaScript from within a web page?

3. If neither, and an EXTERNALLY run script is what you are looking for, then do you have a text-based list of URLs that can be used as a "list file" for a script that does the image downloading?

4. Do the URLs contain any special characters or spaces in the file names?  The characters may have to be anticipated in advance for a batch scripted method, such as finding and replacing %20 in the target file name with spaces when saving the file locally.

The following assumes that:
1. You have a text-based "list file" of URLs, each one on a new line,
2. That the server's folders are accessible to a browser, i.e. no 404.php or default.htm files placed there deliberately to stop image browsing,
3. That the suggested small, old, but still functional 3rd-party program runs on your operating system:

URL2File:
http://www.chami.com/free/url2file_wincon.html
Called from a batch file that parses a list file containing fully qualified URLs, you can download the images to a new or existing folder of choice and optionally rename them.  For example, the following batch file should function:

 
@echo off
SetLocal EnableDelayedExpansion

rem Paths to the list file of URLs, the download folder, and the URL2File program,
rem plus a suffix appended to each saved file name.
set LISTFILE=C:\Images\ImgList.txt
set SAVEFOLDER=C:\Images\Downloaded
set PROGEXE=C:\Program Files\URL2File\URL2FILE.EXE
set SUFFIX=_downloaded

rem Create the download folder if it does not already exist.
if not exist "%SAVEFOLDER%" md "%SAVEFOLDER%"

rem Process only those lines of the list file whose target is a .jpg file.
for /f "tokens=* delims=" %%A in ('type "%LISTFILE%" ^| find /i ".jpg"') do (
    echo.
    set URL=%%A
    set FILE=%%~nA
    set NAME=%%~nxA
    set EXT=%%~xA
    rem Build a flat local file name: strip the protocol, turn path separators
    rem into commas, and convert %%20 sequences back into spaces.
    set FILE=!FILE:http://=!
    set FILE=!FILE:/=,!
    set FILE=!FILE:%%20=?!
    set FILE=!FILE: =?!
    set FILE=!FILE:?= !
    echo Downloading File: !NAME!
    rem -o 15 gives URL2File a 15 second timeout before aborting.
    call "%PROGEXE%" -o 15 "!URL!" "%SAVEFOLDER%\!FILE!%SUFFIX%!EXT!"
    echo.
)
pause

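For illustration only, the list file (ImgList.txt in the batch file above) would simply contain one fully qualified image URL per line, something like the following (these addresses are hypothetical placeholders):

http://www.example.com/products/widget1.jpg
http://www.example.com/products/blue%20widget.jpg
http://www.example.com/images/gadget-27.jpg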

 
BillDLCommented:
Hi Bill.  I see we were thinking along the same lines as we tried to interpret gleverator's exact needs.  I went away for something to eat and came back, hence the delay in posting.  I hadn't read your suggestion.
 
Bill PrewCommented:
It's all good Bill.

~bp
 
BillDLCommented:
gleverator

In case you do test my suggested batch file, be aware of the following:

1. It is set to search the listfile for URLs that have their target as .JPG files.  That can probably be changed easily enough.

2. If browsing to the web server folder is prevented using a simple method like a 404.php or default.htm file in that folder, then URL2File will download whatever it sees in that folder, which could be the HTM or other file. The file is saved with whatever file extension the called URL had, even though it is not an image file.

3. To use it, just modify the paths in the set lines in the upper part of the batch file before the FOR loop. Don't modify any other contents unless it doesn't work and you know why and how.

4. If at first the URL2File commands fail, you can run the program with switches that allow debugging:
-d = Enable debug mode.  Display warnings and other miscellaneous information.  Must be specified before other parameters.
-h = Display web server's response headers.
-o n = Timeout value in seconds (n) before aborting.
There are switches for sending a user name and password, proxy details, and a few others to look at also if required.
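As a rough sketch of what a debugging run might look like from a Command Prompt using the switches above (the URL and output path below are just placeholders):

"C:\Program Files\URL2File\URL2FILE.EXE" -d -h -o 15 "http://www.example.com/products/img.jpg" "C:\Images\Downloaded\img_downloaded.jpg"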
