I have my data backed up to cloud storage, but I also run my own batch backup locally for critical files and directories. Currently I am using 'xcopy'. This command copies files and preserves sub-directories.
xcopy "i:\My Documents\Access_Databases\Delinq\*.*" "J:\Delinq" /s /y /f
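For reference, here is the same command restated with each switch explained in comments (these are the standard xcopy switch meanings):

```shell
REM /s  copies directories and sub-directories, except empty ones
REM /y  suppresses the prompt to confirm overwriting an existing destination file
REM /f  displays full source and destination file names while copying
xcopy "i:\My Documents\Access_Databases\Delinq\*.*" "J:\Delinq" /s /y /f
```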
One thing neither my cloud backup nor my batch backup does is retain files on the destination drive that I have erased from the origin drive.
For example, my I drive contains:
File1
File2
File3
I execute my 'xcopy' command, then File1, File2 and File3 are on the J: drive.
I erase 'File2' from the I: drive and execute the xcopy again.
Now only File1 and File3 are on the J: drive.
I would like to use the J: drive as a repository for any files ever copied from the I: drive, not just the files currently on the I: drive. So if I delete File2 from the I: drive, it will still be on the J: drive.
I created the xcopy command so long ago that I don't remember what the specific '/' options do.
Is there a command I can build into my batch backup that will copy all current files to the destination drive, retaining the directory structure and keep previously copied files that no longer exist on the origin drive?
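One possible approach, assuming a Windows version that includes robocopy (Vista and later ship with it), is to copy without any purge option. By default robocopy does not delete files that exist only on the destination, so files previously copied and later removed from the origin drive would remain on J:. A minimal sketch using the paths from the xcopy command above:

```shell
REM /E copies all sub-directories, including empty ones, so the
REM directory structure is retained.
REM Because neither /PURGE nor /MIR is used, files that exist only
REM on J: are left untouched.
robocopy "i:\My Documents\Access_Databases\Delinq" "J:\Delinq" /E
```

Note that robocopy takes directory paths rather than a wildcard pattern; a file filter can be given as an optional third argument if needed.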
Thanks