I've written a WinSCP script that connects to a hosted FTP site and copies the files there into a directory on my network. It then closes the session, reopens it, and moves the files to an archive folder.
The problem is that new files can land in the remote directory between the time the script starts and the time it finishes. When that happens, those files are never copied to my network location, but they are still moved to the archive, which defeats the purpose of the script.
I then have to go on a treasure hunt through the archive to find the files that were skipped.
What is the best way to handle this? Is there a way to temporarily record which files have been copied to the network, and then move only those files when the second part of the script runs?
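Conceptually, what I'm after is something like the following. This is only a local-filesystem sketch in Python to illustrate the pattern: snapshot the listing once, copy exactly those names, then archive exactly those names, so anything that arrives mid-run is left for the next run. The `after_snapshot` hook and all paths are made up for illustration; in the real script these steps would be FTP operations.

```python
import shutil
from pathlib import Path

def sync_and_archive(remote: Path, local: Path, archive: Path,
                     after_snapshot=None) -> list[str]:
    """Copy *.txt from remote to local, then archive exactly the copied files.

    The directory listing is captured once up front; files appearing after
    the snapshot are neither copied nor archived, so the next run picks
    them up instead of silently archiving them.
    """
    # The "temporary list": every .txt present at the moment the run starts.
    snapshot = sorted(p.name for p in remote.glob("*.txt"))

    if after_snapshot is not None:
        after_snapshot()  # test hook: simulate files arriving mid-run

    # Step 1: copy only the snapshotted files to the network location.
    for name in snapshot:
        shutil.copy2(remote / name, local / name)

    # Step 2: archive only the snapshotted files, renaming .txt -> .bak
    # to mirror the mv *.txt /Archive/*.bak step in the script.
    for name in snapshot:
        (remote / name).rename((archive / name).with_suffix(".bak"))

    return snapshot
```

A file created between the snapshot and the archive step stays untouched in the remote directory, so the next scheduled run copies it normally instead of losing it to the archive.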
Here is the script:
# Make sure to have WinSCP downloaded first!
# Set login name to diicorp
# Automatically skips or ignores prompts
option batch continue
# Allows for overwrites
option confirm off
# Opens session for diicorp (host omitted here; <host> is a placeholder)
open ftp://diicorp@<host>/
# Sets remote directory as root
cd /
# Changes local working directory to X:\
lcd X:\
# Sets to ASCII
option transfer ascii
# Gets multiple .txt files and copies to X:\
mget *.txt X:\
# Disconnects session
close
# Reopens session (<host> is a placeholder)
open ftp://diicorp@<host>/
# Moves remote .txt files into /Archive, renaming them from .txt to .bak
mv *.txt /Archive/*.bak
# Ends the script
exit