Hey All -
I've got a project I'm trying to script that I can't get working correctly. There's a remote location with a folder named "Complete" into which files are dropped at random times. Some files land directly in the root and others in subfolders under it. Each file's name is different, and each has one of about 5 different extensions.
Remote Example:
- Complete/UbuntuLinux1.iso
- Complete/Mint1/LinuxMint.zip
- Complete/Windows/Windows7.rar
Currently, I log in a few times a day via FTP, select the files I want, download them to one of 5 local folders, then delete the rest of the crap. What I'm trying to do is automate this process.
I tried using SmartFTP to create schedules, but couldn't find a way for it to monitor a remote FTP site for changes.
Here are the rules for what I'd need it to do:
- Monitor a folder on the remote FTP site (including its child folders) for new files ending in one of 5 extensions (.zip, .rar, .iso, .img, and .bin)
- Examine the filename: if "linux" appears anywhere in it, download to the local "Linux" folder; if "windows", download to the local "Windows" folder; and so on.
Local Example of same remote files:
- Linux\UbuntuLinux1.iso
- Linux\LinuxMint.zip
- Windows\Windows7.rar
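In case a plain script is an option for anyone answering: the rules above can be sketched with Python's standard-library ftplib. This is a minimal sketch, not a finished tool; HOST, USER, PASSWORD, and the folder names are all placeholder assumptions, and it assumes plain FTP (swap in ftplib.FTP_TLS for FTPS):

```python
import ftplib
import os

# All connection details below are placeholders -- fill in your own.
HOST = "ftp.example.com"
USER = "user"
PASSWORD = "password"
REMOTE_ROOT = "/Complete"
EXTENSIONS = (".zip", ".rar", ".iso", ".img", ".bin")
# Filename keyword -> local destination folder (matched case-insensitively).
RULES = {"linux": "Linux", "windows": "Windows"}

def classify(filename):
    """Return the local folder for a filename, or None if no keyword matches."""
    lower = filename.lower()
    for keyword, folder in RULES.items():
        if keyword in lower:
            return folder
    return None

def walk(ftp, path, found):
    """Recursively collect remote paths of files with a wanted extension."""
    for name in ftp.nlst(path):
        # Some servers return full paths from NLST, others bare names.
        full = name if name.startswith("/") else path.rstrip("/") + "/" + name
        try:
            ftp.cwd(full)            # succeeds -> it's a directory
            walk(ftp, full, found)
        except ftplib.error_perm:    # fails -> treat it as a file
            if full.lower().endswith(EXTENSIONS):
                found.append(full)
    return found

def sync_once():
    """One polling pass: list, classify, download, skip already-seen files."""
    ftp = ftplib.FTP(HOST)           # use ftplib.FTP_TLS(HOST) for FTPS
    ftp.login(USER, PASSWORD)
    for remote in walk(ftp, REMOTE_ROOT, []):
        folder = classify(os.path.basename(remote))
        if folder is None:
            continue                 # no keyword matched; leave it alone
        os.makedirs(folder, exist_ok=True)
        local = os.path.join(folder, os.path.basename(remote))
        if os.path.exists(local):
            continue                 # already downloaded on a previous pass
        with open(local, "wb") as f:
            ftp.retrbinary("RETR " + remote, f.write)
    ftp.quit()
```

Since FTP has no change-notification, the usual approach is polling: run sync_once() from cron or Windows Task Scheduler a few times a day.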
Any ideas for applications or scripts that may help with this? If it makes any difference, I can connect to the remote site using FTP, FTPS, HTTP, or HTTPS.
A last resort would be for it to monitor and download everything in a specific remote folder, then have a separate application sort it all out locally. Any ideas on this would be helpful, too.
Thanks!