Create a CGI/HTML link on the web that allows downloading of ALL images in a directory. I have 240 images.

Let's say I have a group of images in a directory called /path, and I want to post a link that allows someone to download all the images in that directory. Is that possible, or would I have to create a separate link for every image to be downloaded? Perhaps I want to do it in groups instead: say, four links that each pull up the 60 images in a category. The files in category 1 would start with, say, so4big, category 2 with so2big, category 3 with so4domain, and category 4 with so2domain, and so on. The link would be on an index.cgi file.
libertyforall2 asked:
selvol commented:
A good, and faster, way to do this is:

Create a .zip or .rar file containing all the images for that folder,
and another one for each category.

Then just post that link:

<a href="allimages.rar">Download all these images</a>
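For example, those archives could be regenerated with a short script; a minimal sketch, assuming the filename prefixes from the question, a `.jpg` extension, and `tar` in place of zip/rar simply because it ships with every Linux distribution:

```shell
#!/bin/sh
# Build one archive per category, plus one with everything.
# Prefixes (so4big, so2big, so4domain, so2domain) are from the question;
# the directory argument and .jpg extension are assumptions.
make_category_archives() {
    dir="$1"
    for prefix in so4big so2big so4domain so2domain; do
        # cd in a subshell so the archive stores bare filenames, not paths
        ( cd "$dir" && tar -czf "${prefix}.tar.gz" "${prefix}"*.jpg )
    done
    ( cd "$dir" && tar -czf allimages.tar.gz so*.jpg )
}
```

Re-running the function after each image update keeps the archives current without any manual step.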

Selvol
 
Dave Baldwin (Fixer of Problems) commented:
A link points to a single URL or file. You would have to automate it somehow to download more than one at a time. There are programs (not web browsers) that do that.
 
libertyforall2 (author) commented:
The thing is, I want to scp files automatically, in batch mode, to a directory. If I did that, I would need to somehow set up the archiving in batch mode as well, and I'm not sure how to do that.
 
Dave Baldwin (Fixer of Problems) commented:
SCP doesn't have anything to do with "a cgi/html link on the web". If you're going to use an FTP/SFTP client, you can organize your picture files into directories.
 
libertyforall2 (author) commented:
I know scp doesn't have anything to do with the link on the web, except for the fact that I am using scp to send files, which are created twice daily in batch mode, to another server. The other server will then host the images that will be linked from the site for download. As a result, every time the images change, the zip or rar file would need to be recreated. This needs to be an automated process so I don't have to create the archive manually with software. I would have to install and compile the software on a Linux box, which adds unnecessary complications. It would be easier for me to create a static link for all the files and simply have the files updated for download. I just thought there might be a cleaner way to do it than having 240 links on the site.
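Since the images already arrive on a twice-daily schedule, the archive could be rebuilt by the same automation; a crontab sketch, where the run times, the /path directory, the archive name, and the so* prefix are all assumptions, and tar is used because it needs no extra install:

```shell
# m  h     dom mon dow  command (run shortly after each twice-daily batch lands)
10   6,18  *   *   *    cd /path && tar -czf allimages.tar.gz so*.jpg
```

With this in place, the download link can stay static while the archive behind it is refreshed automatically.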
 
Dave Baldwin (Fixer of Problems) commented:
There isn't any kind of link but the regular kind: a request for a single URL that returns a single object. If this is some kind of subscription that people will download every day, there are other tools that can fetch the whole set, like wget and HTTrack.
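For instance, a subscriber could mirror the whole directory with a single wget command instead of clicking 240 links; a sketch, assuming directory listing is enabled on the server and using a hypothetical URL:

```shell
# -r recurse, -np don't ascend to parent dirs, -nd no local directory tree,
# -A keep only files matching the pattern (prefix taken from the question)
wget -r -np -nd -A 'so*.jpg' http://example.com/path/
```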
 
libertyforall2 (author) commented:
I just created the page with numerous links.
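A page like that doesn't have to be written by hand, either: it can be generated from the directory listing, so the links stay correct as files change. A sketch, where the directory argument and so*.jpg pattern are assumptions based on the question:

```shell
#!/bin/sh
# Emit an HTML page with one download link per image in the directory.
make_index() {
    dir="$1"
    echo '<html><body>'
    for f in "$dir"/so*.jpg; do
        name=${f##*/}    # strip the directory part, keep the filename
        printf '<a href="%s">%s</a><br>\n' "$name" "$name"
    done
    echo '</body></html>'
}
```

The output could be redirected to index.html, or printed from index.cgi after a `Content-type: text/html` header.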
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

All Courses

From novice to tech pro — start learning today.