I have a situation where a document management server (3.8 million documents stored in 1,900 folders) has lost thousands of files. What I do have is a huge list, pulled from SQL in .xls format (which can be converted to text), of all the files that were unceremoniously deleted by the system. I also have a backup of all 1,900 folders and 2.8 million docs, including the thousands of files that were deleted. What I need is a script that will copy every file listed in the .xls or .txt file back to its correct folder. Here is an example of the text file with the file list:
\\cp1\CP_HISTORY\CPWin\History\041208_0024\This document is gone.doc
\\cp1\CP_HISTORY\CPWin\History\041245_0001\Our DM server sucks.doc...etc
These are actually the destination paths on the production server. Notice that the server and the first three directory levels never change, while the last directory and the filename change with each file.
Here's the pathing I am looking at with the backup data:
\\backuppc\e$\Backup 2015-02-08\CP1\History\041208_0024\This document is gone.doc
\\backuppc\e$\Backup 2015-02-08\CP1\History\041245_0001\Our DM server sucks.doc...etc
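Since only the leading prefix differs between the two trees, the source path can be derived from the destination path by swapping prefixes. A minimal Python sketch of that mapping, assuming the two prefixes taken from the example paths above are the only parts that change (the function name `to_backup_path` is just illustrative):

```python
# Prefixes taken from the example paths in this post.
PROD_PREFIX = r"\\cp1\CP_HISTORY\CPWin\History"
BACKUP_PREFIX = r"\\backuppc\e$\Backup 2015-02-08\CP1\History"

def to_backup_path(prod_path: str) -> str:
    """Rewrite a production (destination) path into the matching backup (source) path."""
    if not prod_path.startswith(PROD_PREFIX):
        raise ValueError(f"unexpected path: {prod_path}")
    # Keep everything after the fixed prefix (last directory + filename) unchanged.
    return BACKUP_PREFIX + prod_path[len(PROD_PREFIX):]

print(to_backup_path(r"\\cp1\CP_HISTORY\CPWin\History\041208_0024\This document is gone.doc"))
# prints \\backuppc\e$\Backup 2015-02-08\CP1\History\041208_0024\This document is gone.doc
```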
What I can't figure out is how to get a script to read the txt file line by line (I'm thinking a batch file with a FOR loop, but beyond that I have no idea), treat each path and filename as the destination, derive the matching backup path as the source, and copy the file from the backup path to the production path.
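The loop described above can be sketched in Python, which handles the prefix surgery more safely than batch string substitution. This is a sketch under a few assumptions: the prefixes are the ones from the example paths, `restore_list.txt` is a placeholder name for the exported text file, and files already present or missing from the backup are simply reported rather than treated as fatal:

```python
import os
import shutil

# Prefixes taken from the example paths; adjust if your layout differs.
PROD_PREFIX = r"\\cp1\CP_HISTORY\CPWin\History"
BACKUP_PREFIX = r"\\backuppc\e$\Backup 2015-02-08\CP1\History"

def restore(list_file, prod_prefix=PROD_PREFIX, backup_prefix=BACKUP_PREFIX):
    """Read destination paths from list_file and copy each one back from the backup.

    Returns (restored, missing): paths copied, and backup paths that were absent.
    """
    restored, missing = [], []
    with open(list_file, encoding="utf-8") as fh:
        for line in fh:
            dest = line.strip()
            if not dest.startswith(prod_prefix):
                continue  # skip blank lines and anything with an unexpected prefix
            # Swap the fixed prefix; the last directory and filename stay the same.
            src = backup_prefix + dest[len(prod_prefix):]
            if not os.path.exists(src):
                missing.append(src)  # file is not in the backup either
                continue
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves the original timestamps
            restored.append(dest)
    return restored, missing

# Example invocation (the list filename is an assumption):
# restored, missing = restore("restore_list.txt")
```

`shutil.copy2` is used instead of `shutil.copy` so the restored files keep their original modification times, which matters for a document management system.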
Any help would be extremely appreciated. Thanks!