Solved

Moving files older than a certain date, but keeping folder structure

Posted on 2012-12-26
1,702 Views
Last Modified: 2013-01-03
Hi team

I need a solution to move files older than 3 months off the file server (Server 2008, not R2) onto an external hard drive or network share. I'm by no means skilled at writing scripts, and from what I've read, most solutions would remove the folders from the source when moving to the destination, which isn't what I want.

I'm hoping that when the files are moved, the folder structure will look identical in both locations, making it easier to track down these files if required.

Is this something that can be done, or am I expecting too much?

Cheers
Question by:DTS-Tech
6 Comments
 
LVL 59

Accepted Solution

by:
Darius Ghassem earned 250 total points
ID: 38722490
You can use something like Robocopy or RichCopy to copy all the data off to another drive. Both of these have a GUI, so you can do this without scripting.

http://blogs.technet.com/b/ken/archive/2009/06/10/build-4-0-216-has-been-posted-to-the-microsoft-download-center.aspx
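For reference, a command-line sketch of the Robocopy approach (the paths are placeholders, and you should check `robocopy /?` to confirm the switches before running this against real data):

```batch
:: Move files not written to in the last 90 days, keeping the folder structure.
:: /E          copy subdirectories, including empty ones
:: /MINAGE:90  exclude files newer than 90 days
:: /MOV        move files (delete from source after a successful copy)
:: /LOG:       keep a record of what was moved
robocopy "D:\Shares" "\\backupserver\archive" /E /MINAGE:90 /MOV /LOG:C:\Temp\archive.log
```

Because /MOV removes files but leaves the directories behind, the source keeps its folder tree while the archive mirrors it, which is what was asked for.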
 
LVL 29

Expert Comment

by:becraig
ID: 38722506
This might help:

param ([string] $SourceFolder, [string] $DestFolder, [string] $DDate)

write-host $SourceFolder
write-host $DestFolder
write-host $DDate

if (!(Test-Path $SourceFolder))
    { write-host "Source $SourceFolder does not exist!" }
if (!(Test-Path $DestFolder))
    { write-host "Destination $DestFolder does not exist!" }

$FileList = Get-ChildItem $SourceFolder -Recurse
ForEach ($FileObj in $FileList) {
    if ($FileObj.LastWriteTime -lt (Get-Date).AddDays(-$DDate))
    {
        write-host "FileName = $($FileObj.Name)"
        write-host "FullPath = $($FileObj.FullName)"

        $FileName = $FileObj.Name
        $FullPath = $FileObj.FullName
        $copypath = [regex]::Replace("$FullPath", "$SourceFolder", '')
        $destpath = $DestFolder + $copypath

        Copy-Item "$FullPath" "$destpath"
    }
}
  


 
LVL 29

Expert Comment

by:becraig
ID: 38722513
I forgot to add usage:

Save the script as a .ps1 file (whatever name you like), then call it with the source folder, destination folder, and age in days:
script.ps1 -sourcefolder c:\sourcefolder -destfolder c:\destfolder -ddate 40

That should do it.
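One general PowerShell note (not specific to this script): on a default Server 2008 install, script execution may be disabled by policy, in which case you can bypass it for a single invocation:

```batch
powershell -ExecutionPolicy Bypass -File .\script.ps1 -sourcefolder c:\sourcefolder -destfolder c:\destfolder -ddate 40
```

This only affects that one run; the machine-wide execution policy is left unchanged.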
 
LVL 29

Assisted Solution

by:becraig
becraig earned 250 total points
ID: 38722742
It seems I had an error in the script I posted; updated version below:

param ([string] $SourceFolder, [string] $DestFolder, [string] $DDate)

if ($SourceFolder -eq "" -or $DestFolder -eq "" -or $DDate -eq "")
{
write-host "
`t You failed to specify correct syntax
`n
`t Usage: script.ps1 -sourcefolder c:\folder -destfolder c:\folder2 -ddate 40
" -fore red
}
else
{
    if (!(Test-Path $SourceFolder))
        { write-host "Source $SourceFolder does not exist!" }
    if (!(Test-Path $DestFolder))
        { write-host "Destination $DestFolder does not exist!" }

    $FileList = Get-ChildItem $SourceFolder -Recurse
    ForEach ($FileObj in $FileList) {
        if ($FileObj.LastWriteTime -lt (Get-Date).AddDays(-$DDate))
        {
            $FileName = $FileObj.Name
            $FullPath = $FileObj.FullName
            # Strip the source root so the relative path can be re-rooted under the destination
            $copypath = ($FullPath -replace [regex]::Escape($SourceFolder), '')
            $destpath = $DestFolder + $copypath

            Copy-Item $FullPath $destpath
        }
    }
}



Give it a go ...
If you need any extra functionality, like not copying files already present, let me know.
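One thing to watch with the script above: Copy-Item won't create missing destination subfolders on its own. Here's a sketch of the inner loop with that handled, plus a skip for files already present (variable names match the script above, but this variant hasn't been tested against your data, so try it on a small folder first):

```powershell
# Replaces the plain Copy-Item call inside the ForEach loop
$copypath = ($FullPath -replace [regex]::Escape($SourceFolder), '')
$destpath = $DestFolder + $copypath

if (!(Test-Path $destpath)) {
    # Make sure the destination subfolder exists before copying
    $destdir = Split-Path $destpath -Parent
    if (!(Test-Path $destdir)) {
        New-Item -ItemType Directory -Path $destdir -Force | Out-Null
    }
    Copy-Item $FullPath $destpath
}
else {
    write-host "Skipping $FullPath - already present at destination"
}
```

New-Item with -Force creates all missing intermediate folders in one call, so the destination tree ends up mirroring the source.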
0
 
LVL 29

Expert Comment

by:becraig
ID: 38740585
Were you able to test this, and do you have any additional questions?
 

Author Comment

by:DTS-Tech
ID: 38742328
Sorry for the delay in the update. I haven't moved the files yet, but at this stage I think I'll end up using the RichCopy application because of the GUI. I do appreciate the script you've written, though.
