Need help copying foldered files to their respective folders based on a txt or xls list

I have a situation where I have a document management server (3.8 million documents stored in 1900 folders) that has lost thousands of files. What I do have is a huge list, pulled from SQL in xls format (which can be converted to text), of all the files that were unceremoniously deleted by the system. What I also have is a backup of all 1900 folders and 2.8 million docs, including the thousands of files that were deleted. What I need is a script that will copy all the files listed in the xls or txt file back to their correct folders. Here is an example of the text file with the file list:

\\cp1\CP_HISTORY\CPWin\History\041208_0024\This document is gone.doc
\\cp1\CP_HISTORY\CPWin\History\041208_0045\abc456.doc
\\cp1\CP_HISTORY\CPWin\History\041245_0001\Our DM server sucks.doc...etc

This is actually the destination path on the production server. Notice that the server and the first three directory levels do not change, but the last directory and the filename change with each file.

Here's the pathing I am looking at with the backup data:

\\backuppc\e$\Backup 2015-02-08\CP1\History\041208_0024\This document is gone.doc
\\backuppc\e$\Backup 2015-02-08\CP1\History\041208_0045\abc456.doc
\\backuppc\e$\Backup 2015-02-08\CP1\History\041245_0001\Our DM server sucks.doc...etc

What I can't figure out is how to get a script to read the txt file (I'm thinking a batch file with a FOR loop, but beyond that I have no idea), take each path and file name in it as the destination, derive the matching backup path, and copy the file from the backup location to the production location.

Any help would be greatly appreciated. Thanks
Mark Lewis asked:
 
oBdA commented (accepted solution):
Sorry, that should have been "-LiteralPath" in Copy-Item, not "-Path":
$FileList = "C:\Temp\test.txt"
$TargetRoot = "D:\Temp" # "\\cp1\CP_HISTORY\CPWin"
$BackupRoot = "C:\Temp" # "\\backuppc\e$\Backup 2015-02-08\CP1"
$TargetRoot = $TargetRoot.ToLower()
$BackupRoot = $BackupRoot.ToLower()
Get-Content -Path $FileList | % {
	"Processing '$($_)' ..." | Write-Host
	Try {
		Copy-Item -LiteralPath $_.Trim('"').ToLower().Replace($TargetRoot, $BackupRoot) -Destination $_.Trim('"') -Force -ErrorAction Stop -WHATIF
	} Catch {
		$_.Exception.Message | Write-Host -Fore Red
	}
}
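
For background on the change: with -Path, PowerShell treats square brackets as wildcard range characters, which is why names like "[1634] WCC Form 20.doc" from the question were skipped, while -LiteralPath takes the string exactly as typed. A quick way to see the difference (sketch only; adjust the folder to one that exists on your side):

# "[1634] WCC Form 20.doc" is the example file name from the question.
$file = 'C:\Temp\[1634] WCC Form 20.doc'
# With -Path, [1634] is read as a wildcard character set, so the literal file
# is not found even if it exists:
Test-Path -Path $file
# With -LiteralPath, the name is taken exactly as written:
Test-Path -LiteralPath $file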

 
oBdA commented:
Well, basically, a batch script could look like this. It's in test mode: it will only display the copy commands it would normally run; remove the uppercase ECHO in front of the copy command to run it for real. Whether this works depends mainly on the number of files in the list, because "for /f" isn't really fast when it comes to large input files.
Make sure that the input file is saved as ANSI, NOT Unicode!
 
@echo off
setlocal enabledelayedexpansion
set FileList=C:\Temp\test.txt
set TargetRoot=\\cp1\CP_HISTORY\CPWin
set BackupRoot=\\backuppc\e$\Backup 2015-02-08\CP1
for /f "usebackq delims=" %%a in ("%FileList%") do (
	set Target=%%~a
	set Source=!Target:%TargetRoot%=%BackupRoot%!
	echo Processing '%%~a' ...
	ECHO copy "!Source!" "!Target!"
)


So just in case, here's a PowerShell version; it's in test mode as well, so remove the uppercase "-WHATIF" at the end of the Copy-Item line to run it for real:
$FileList = "C:\Temp\test.txt"
$TargetRoot = "\\cp1\CP_HISTORY\CPWin"
$BackupRoot = "\\backuppc\e$\Backup 2015-02-08\CP1"
$TargetRoot = $TargetRoot.ToLower()
$BackupRoot = $BackupRoot.ToLower()
Get-Content -Path $FileList | % {
	"Processing '$($_)' ..." | Write-Host
	Try {
		Copy-Item -Path $_.ToLower().Replace($TargetRoot, $BackupRoot) -Destination $_ -Force -ErrorAction Stop -WHATIF
	} Catch {
		$_.Exception.Message | Write-Host -Fore Red
	}
}
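
Once the -WHATIF output looks right, you may also want a record of what actually gets copied. One option (sketch only; the log path and script name below are just placeholders) is to run the saved script inside a transcript:

# Placeholder paths: adjust the log file and the script location to your setup.
Start-Transcript -Path "C:\Temp\restore-log.txt"
& "C:\Scripts\Restore-Files.ps1"
Stop-Transcript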

 
Mark Lewis (author) commented:
I tested it and it works, but it's having trouble copying files with spaces in the filename. At least, that seems to be the common thread of the problem. Thanks for what you have done so far, though; it works great for filenames without a space.
 
Mark Lewis (author) commented:
Oh, I jumped directly to the PowerShell script. I didn't test the batch file.
 
oBdA commented:
Neither the PS script nor the batch script should have issues with spaces in the file names; I tested with these kinds of file names as well.
Could it be that the file names in the list have leading or trailing spaces?
Could it be that the "spaces" in the list aren't actually Chr(32) / 0x20? (A quick way to check the character codes is sketched after the script below.)
Are the file names in the list surrounded with quotes, and if so, with "pretty" ones, or the proper double quotes (as in Chr(34) / 0x22)?
If the lines in the file are enclosed in double quotes, use this for PS:
$FileList = "C:\Temp\test.txt"
$TargetRoot = "\\cp1\CP_HISTORY\CPWin"
$BackupRoot = "\\backuppc\e$\Backup 2015-02-08\CP1"
$TargetRoot = $TargetRoot.ToLower()
$BackupRoot = $BackupRoot.ToLower()
Get-Content -Path $FileList | % {
	"Processing '$($_)' ..." | Write-Host
	Try {
		Copy-Item -Path $_.Trim('"').ToLower().Replace($TargetRoot, $BackupRoot) -Destination $_.Trim('"') -Force -ErrorAction Stop -WHATIF
	} Catch {
		$_.Exception.Message | Write-Host -Fore Red
	}
}
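
And if it's not obvious what characters are actually in the list, a quick sketch like this (using the same example list path as above) should show the character codes of the first line:

# Dump each character of the first line together with its numeric code to spot
# non-breaking spaces, "pretty" quotes, or stray leading/trailing characters.
$line = Get-Content -Path "C:\Temp\test.txt" -TotalCount 1
$line.ToCharArray() | ForEach-Object { "{0}`t{1}" -f $_, [int]$_ }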

 
Mark Lewis (author) commented:
I did more testing and you're right, it's not the spaces. The problem is that all my doc names have a doc number inside brackets [ ]. An example doc name: [1634] WCC Form 20.doc

If I remove the brackets, it works fine. Any workaround for the [ ]? Thanks again.
 
Mark Lewis (author) commented:
Works perfectly. My company should write you a check, as you have saved hundreds of man-hours. Thanks