Can someone please make me a script I can easily run on a Linux OS (or even Windows, it doesn't matter)? I'm not a programmer, but I can run a shell script on Unix or a batch file in Windows.
I have a list of URLs in a text file.
The list is rather large (450 URLs/images) and downloading them by hand would take a while.
Can someone throw together a script that will read the list of URLs, download each image, and place them all in the same folder?
Example of the resulting output:
I'd need to keep the folder structure and image name the same as named in the URL. BUT if this complicates the script too much, I really only need the image name to remain the same; I can place them in folders after they are downloaded.
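A minimal sketch of what such a downloader could look like, in Python (which runs on both Linux and Windows). It assumes the list is a plain text file named `urls.txt` with one URL per line, and that the images go under a `downloads` folder mirroring each URL's path; both names are placeholders you can change.

```python
#!/usr/bin/env python3
# Sketch of a batch image downloader: reads urls.txt (one URL per line)
# and saves each file under downloads/, keeping the URL's folder
# structure and image name. File names here are assumptions.
import os
import urllib.request
from urllib.parse import urlparse

def local_path(url, base="downloads"):
    # Map e.g. http://example.com/pics/a/cat.jpg -> downloads/pics/a/cat.jpg
    path = urlparse(url).path.lstrip("/")
    return os.path.join(base, path)

def download_all(list_file="urls.txt"):
    with open(list_file) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue  # skip blank lines
            dest = local_path(url)
            # Create the folder structure before saving the image
            os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
            urllib.request.urlretrieve(url, dest)
            print("saved", dest)

if __name__ == "__main__" and os.path.exists("urls.txt"):
    download_all()
```

Save it as something like `download_images.py` and run `python3 download_images.py` from the folder containing `urls.txt`. If you'd rather stay with a shell one-liner on Linux, GNU wget can do roughly the same job with `wget -x -i urls.txt` (`-i` reads URLs from a file, `-x` recreates the directory structure locally).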