Solved

Copy File and Log

Posted on 2013-01-16
11
Medium Priority
294 Views
Last Modified: 2013-02-05
I want to create a script that will copy one file from one fixed location on my computer to a fixed location on a share drive. The file will exist on the share drive, so I will overwrite it.

Once the copy has finished, I want to write the status to the event log. The copy could fail if the source file is locked by another application/user, so I need to detect failure and record either success or failure in the event log.

I do not need any kind of recovery if it fails.
Question by:dbbishop
11 Comments
 
LVL 15

Author Comment

by:dbbishop
ID: 38783137
A little change: I need to copy all of the files in a specific folder to a destination and log the result. If any one file cannot be copied, the action is considered a failure. I do not need to log the specific file that couldn't be copied, but if that can be easily obtained, it would be nice.
 
LVL 7

Expert Comment

by:karunamoorthy
ID: 38783164
You will need some programming effort for this, e.g. VB.
 
LVL 15

Author Comment

by:dbbishop
ID: 38783774
I want to do it in PowerShell.
 
LVL 15

Author Comment

by:dbbishop
ID: 38783966
I've been playing around a bit and I've come up with this code for doing the copy. However, it appears the full directory structure needs to be in place for files to copy properly. Although I can create the structure beforehand, I am concerned that if a folder gets added in the source path, I need to make sure it and its contents get copied to the destination path. Any ideas?

$sourceDir = 'E:\sourcepath'
$targetDir = 'g:\destinationpath'

Get-ChildItem -Path $sourceDir -Recurse |
    ForEach-Object { Copy-Item $_.FullName -Destination $targetDir -Recurse -Force }
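One way to handle the folder-structure concern (a sketch along the same lines, using the same $sourceDir/$targetDir; untested here) is to rebuild each item's path relative to the source root under the destination before copying, creating missing folders as they are encountered:

```powershell
$sourceDir = 'E:\sourcepath'
$targetDir = 'g:\destinationpath'

Get-ChildItem -Path $sourceDir -Recurse | ForEach-Object {
    # Rebuild the item's path under the destination root
    $destPath = Join-Path $targetDir $_.FullName.Substring($sourceDir.Length)
    if ($_.PSIsContainer) {
        # Create any destination folder that does not exist yet
        if (!(Test-Path $destPath)) { New-Item $destPath -Type Directory | Out-Null }
    }
    else {
        Copy-Item $_.FullName -Destination $destPath -Force
    }
}
```

Because the folders are recreated from the source listing on every run, a subfolder added to the source later should be recreated at the destination automatically.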
 
LVL 40

Expert Comment

by:Subsun
ID: 38784028
Try this script. It will copy C:\Dumps\shell.txt to the share \\Server\Users\Test and log Event ID 20399 with the status.
 
$Src = "C:\Dumps\shell.txt"
$Dest = "\\Server\Users\Test"
$eventLogID = 20399
$logFilePrefix = "FileCopyStat"
	
function Write-Event {
    $msg     = $args[0]
    $source  = $args[1]
    $type    = $args[2]
    $eventid = $args[3]
    # Register the event source in the Application log on first use
    if (![System.Diagnostics.EventLog]::SourceExists($source)) {
        [System.Diagnostics.EventLog]::CreateEventSource($source, 'Application')
    }
    $log = New-Object System.Diagnostics.EventLog
    $log.set_log("Application")
    $log.set_source($source)
    $log.WriteEntry($msg, $type, $eventid)
}

Try {
    Copy-Item $Src $Dest -ErrorAction Stop
    Write-Event "The file $Src was successfully copied to $Dest" $logFilePrefix "Information" $eventLogID
}
Catch [System.Exception] {
    Write-Event $_.Exception.Message $logFilePrefix "Error" $eventLogID
}

 
LVL 40

Expert Comment

by:Subsun
ID: 38784719
I am concerned if a folder gets added in the source path, how to make sure it, and its contents gets copied to the destination path. Any ideas?

If you are copying a directory, you can use the -Recurse parameter with Copy-Item, which creates the destination folder structure. When the source is a file, the destination directory must already exist. There are workarounds.

You can add an If condition to test whether the destination folder exists:
$Dest= "C:\test\Folder"
if (!(Test-Path -path $Dest)) {New-Item $Dest -Type Directory}
Copy-Item $Src $Dest


Or try..
$Dest = "C:\test\Folder\file.txt"
New-Item -ItemType File -Path $Dest -Force
Copy-Item $Src $Dest -Force

 
LVL 15

Author Comment

by:dbbishop
ID: 38785135
subsun: I'd like to do this in a foreach loop, as I would not necessarily know, at the time of execution, the folder name(s), and there could be more than one. In other words, a 'parent' subfolder could have been added with a complete hierarchy of subfolders under it. I need to make sure all of the folders in the source path exist in the destination or are created.

I am playing around with this some on my own, but work is rather overwhelming right now and my time available for research is limited. Normally, I'd likely go out and figure out how to do this all on my own.
 
LVL 40

Expert Comment

by:Subsun
ID: 38785168
Do you want an event for each file & folder which is copied?
 
LVL 15

Author Comment

by:dbbishop
ID: 38794238
No, just for the full process. If it fails, then I know at least one file did not copy. Actually, if it's not too much trouble: if all files copy, write a single success event; otherwise write a fail event for each file that did not copy, along with the filename. But if that's too much trouble, then a single pass/fail event.
 
LVL 40

Accepted Solution

by:
Subsun earned 2000 total points
ID: 38794737
Try this..
$Src = "C:\Documents"
$Dest = "\\Server\Users\Test"
$eventLogID = 20399
$logFilePrefix = "FileCopyStat"
function Write-Event {
    $msg     = $args[0]
    $source  = $args[1]
    $type    = $args[2]
    $eventid = $args[3]
    # Register the event source in the Application log on first use
    if (![System.Diagnostics.EventLog]::SourceExists($source)) {
        [System.Diagnostics.EventLog]::CreateEventSource($source, 'Application')
    }
    $log = New-Object System.Diagnostics.EventLog
    $log.set_log("Application")
    $log.set_source($source)
    $log.WriteEntry($msg, $type, $eventid)
}
$error.Clear()
Copy-Item $Src $Dest -Recurse -Force -ErrorAction SilentlyContinue
if ($error.Count -eq 0) {
    Write-Event "The files at $Src were successfully copied to $Dest" $logFilePrefix "Information" $eventLogID
}
else {
    $error | ForEach-Object { Write-Event $_ $logFilePrefix "Error" $eventLogID }
}
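If you want one error event per failed file (with the filename) rather than whatever accumulates in $Error, a per-file variant of the copy loop could look like this. It reuses the Write-Event helper, $logFilePrefix, and $eventLogID from the script above; treat it as an untested sketch with the same placeholder paths:

```powershell
$Src  = "C:\Documents"
$Dest = "\\Server\Users\Test"
$failedCount = 0

Get-ChildItem -Path $Src -Recurse | ForEach-Object {
    # Rebuild the item's path relative to the source root under the destination
    $destPath = Join-Path $Dest $_.FullName.Substring($Src.Length)
    try {
        if ($_.PSIsContainer) {
            # Recreate the folder hierarchy as we go
            if (!(Test-Path $destPath)) { New-Item $destPath -Type Directory -ErrorAction Stop | Out-Null }
        }
        else {
            Copy-Item $_.FullName -Destination $destPath -Force -ErrorAction Stop
        }
    }
    catch {
        $failedCount++
        # One Error event per file that could not be copied
        Write-Event "Copy failed for $($_.TargetObject): $($_.Exception.Message)" $logFilePrefix "Error" $eventLogID
    }
}
if ($failedCount -eq 0) {
    Write-Event "All files under $Src were successfully copied to $Dest" $logFilePrefix "Information" $eventLogID
}
```

The -ErrorAction Stop on each copy turns non-terminating errors into catchable exceptions, so every failure is logged individually and a single success event is written only when nothing failed.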

 
LVL 15

Author Closing Comment

by:dbbishop
ID: 38857161
Sorry it has taken so long to respond. This was very low on the priority list of tasks to accomplish, and I've been working 12-hour days on high priority stuff. I have not tested this, but in looking it over, I believe it will work. If not, it should only require minor tweaks. Thank you.

Question has a verified solution.
