I currently have a few servers located on isolated networks across the internet. To back them up, we currently VPN in, map a network drive and run an xcopy command to copy only the files that have been changed/modified. The VPN is limited to 1 Mbit/s and each file takes an average of 2.5 s to transfer. As you can imagine, if 30,000 files have changed during the course of the day it is impossible to back up everything.
I'm looking to write something in VB.NET which:
1) Looks recursively into a starting directory and writes its structure and file metadata (path, date last modified, etc.) to a text file
2) On subsequent runs, walks the same starting directory recursively, comparing the files and folders against that saved text file
3) If a file/folder is new and/or has changed, copies/creates it in a separate backup directory, preserving the folder structure
4) Appends the paths of deleted files to a text file
5) Rewrites the text file with the new file/folder structure
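The steps above amount to a manifest-and-compare sync: record path + last-modified time on each run, then diff against the previous record. A minimal sketch in Python (the same logic ports directly to VB.NET via `Directory.GetFiles` and `File.GetLastWriteTimeUtc`); the manifest/log file names here are just placeholders:

```python
import json
import os
import shutil


def build_manifest(root):
    # Map relative path -> last-modified timestamp for every file under root.
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            manifest[os.path.relpath(full, root)] = os.path.getmtime(full)
    return manifest


def incremental_backup(source, backup_dir, manifest_file, deleted_log):
    # Load the manifest from the previous run (empty on the first run).
    try:
        with open(manifest_file) as f:
            old = json.load(f)
    except FileNotFoundError:
        old = {}

    new = build_manifest(source)

    # Copy new or changed files into backup_dir, keeping folder structure.
    for rel, mtime in new.items():
        if rel not in old or old[rel] != mtime:
            dest = os.path.join(backup_dir, rel)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            shutil.copy2(os.path.join(source, rel), dest)

    # Append paths that existed last run but are gone now to the deleted log.
    deleted = [rel for rel in old if rel not in new]
    with open(deleted_log, "a") as f:
        for rel in deleted:
            f.write(rel + "\n")

    # Rewrite the manifest so the next run compares against current state.
    with open(manifest_file, "w") as f:
        json.dump(new, f)
    return deleted
```

Note that relying on timestamps alone can miss edge cases (clock skew, tools that preserve mtimes), so comparing file size as well, or a hash for critical files, would make the change detection more robust.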
We'll then be able to FTP in to download the changed files.
Could someone tell me whether this is an efficient way of doing it, or are there other options I should consider?