Solved

We would like to do some file archiving on a server

Posted on 2013-01-02
8
289 Views
Last Modified: 2013-01-02
OK, here is what we are thinking of doing.

We want to archive a folder based on the age of the files to a different location for online backup purposes.  For Example:

D:\Share\Data\files.txt -> D:\Archive\Share\Data\files.txt based on age.  

The archive location is read-only and would be moved to tape.  These files would be files that have not been touched in 6 months or more.  This will greatly reduce the size of the "Live" files in the write location.  The problem is that we could do this once with a lot of work, but we want to maintain it by running it about once a month.  That way only actual live files are being backed up to the online backup.  Is there software that does this?
0
Comment
Question by:XenoSaber
8 Comments
 
LVL 4

Assisted Solution

by:Jernej Navotnik
Jernej Navotnik earned 250 total points
ID: 38736661
Hey!

Don't know what kind of OS you're using, but have you considered using PowerShell or robocopy (as a batch file) and then running it once a month?
Also, do you want the files copied or moved?

Try these:
http://technet.microsoft.com/en-us/library/ee176988.aspx#EBAA

http://blog.integrii.net/?p=27
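The robocopy/PowerShell idea above boils down to: walk the tree, test each file's last-access time against a cutoff, and move matches while preserving the relative folder layout. As a rough, language-neutral sketch of that logic (Python here purely for illustration; the function name, paths, and 180-day cutoff are made up, not anything from the thread):

```python
import os
import shutil
import time

def archive_old_files(src_root, dst_root, max_age_days):
    """Move files not accessed in max_age_days from src_root to dst_root,
    keeping the relative folder structure intact. Returns the moved paths."""
    cutoff = time.time() - max_age_days * 86400   # cutoff as a Unix timestamp
    moved = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getatime(src) <= cutoff:        # last accessed before cutoff
                rel = os.path.relpath(src, src_root)   # e.g. Share\Data\files.txt
                dst = os.path.join(dst_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)                  # move, not copy
                moved.append(rel)
    return moved
```

On Windows the same effect can be had with the tools linked above; the sketch just shows the shape of the monthly job.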
0
 

Author Comment

by:XenoSaber
ID: 38736677
We are using Windows 2003 & 2008.

We could do that; I was hoping to get it done without all that legwork.

We want to move the data, so it is out of the "Live" share.
0
 
LVL 28

Accepted Solution

by:Bill Bach
Bill Bach earned 250 total points
ID: 38736742
"All that leg work" is sometimes critical to getting the job done correctly.

If you want a simpler solution to the file-copy side of things, use RichCopy:
    http://technet.microsoft.com/en-us/magazine/2009.04.utilityspotlight.aspx

Note, though, that this solution will handle copying files from the "live" area to the "archive" area for you.  The next phase is to wipe out the old files.  For that, you can use a tool like AutoDelete:
    http://www.whoismadhur.com/2008/12/15/delete-old-files-with-auto-delete/

However, this is a fairly basic tool, and doesn't let you delete based on Last Accessed Date.  A more powerful solution for you will likely be XXCOPY:
    http://www.xxcopy.com/

This tool can easily handle what you want, and it can easily be set up as a scheduled task in Windows to work automatically.  In fact, you can easily wipe out all files that have not been accessed in the last 6 months with a line like this:
    xxcopy \src\ /rs/db#180/fa

Of course, you should be VERY careful with anything as powerful as this, and we are back to doing a lot of legwork once again to get the switches "just so".  However, the time spent is worth it.
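To make the "be VERY careful" warning concrete, the xxcopy line above effectively means "remove source files whose last-access date is 180+ days old." Here's a hedged sketch of that operation (Python for illustration only; the function name and the dry-run idea are my additions, not part of xxcopy), defaulting to a rehearsal mode so nothing is deleted until you've reviewed the list:

```python
import os
import time

def delete_unaccessed(root, max_age_days, dry_run=True):
    """List (and, when dry_run is False, delete) files under root whose
    last-access time is at least max_age_days old."""
    cutoff = time.time() - max_age_days * 86400
    victims = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getatime(path) <= cutoff:
                victims.append(path)
                if not dry_run:
                    os.remove(path)      # destructive: only after a dry run
    return victims
```

Running it once with `dry_run=True` and eyeballing the output is the scripted equivalent of getting the xxcopy switches "just so" before the real pass.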
0
 

Author Comment

by:XenoSaber
ID: 38736777
I know, and I don't mind doing the work; it's just that I have to write this up so a monkey can read it and understand it, as they are letting me go in June in favor of outsourcing.  This is going to be a *itch.  Better brush up on my PowerShell, that's going to be the best way.

Thanks All.
0
 
LVL 35

Expert Comment

by:Joseph Daly
ID: 38736787
I would take a look at ViceVersa to do this. It is not free software, but it is pretty cheap overall. I actually used this at a previous company I worked for to do something very similar to what you describe.

This is basically like robocopy on steroids with many more options, logging, etc. It also has a nice scheduler built in so that you can set it and forget it.

http://www.tgrmn.com/
0
 
LVL 28

Expert Comment

by:Bill Bach
ID: 38736902
Well, I've seen many companies go through outsourcing over the years.  Incidentally, if you've got the savings buffer and health care benefits (e.g. COBRA or a spouse's plan) to not need to find another job immediately, you can use this as an opportunity to provide consulting services back to the company -- at a vastly higher rate (I'd start at a minimum of $150/hour, higher if you have years of experience in their exact needs).  This offers several benefits: you get to keep working, you get to work less (say, 20 hours a week or so), and they get to take your salary off of the permanent payroll costs.  I know of some "consultants" who were outsourced and continued to work for YEARS after the transition (one of them had been there over 15 years when I last spoke to him), and they ended up working about half the time they worked previously for the same pay.  Because they weren't "full-time employees", it didn't count as an FTE cost to the manager of the department.  Sounds strange, but it can be a great benefit at the same time.
0
 

Author Comment

by:XenoSaber
ID: 38736914
I can hope.
0
 
LVL 4

Expert Comment

by:Jernej Navotnik
ID: 38737186
Hey!

Actually, you gave me a great idea for one of my clients, so here's the script I stuck together from some of my own scripts and a bit of Google and TechNet.
It works for me (not thoroughly tested, so no guarantees :) ).

It copies the folder structure and moves the files older (by last access) than however many days you like.
It also writes all moved files to a .csv.
It's probably not going to make "the top 100 PS scripts", but it's a start if you're going to try this on your own.

Set-StrictMode -Version Latest
$From = "C:\from"
$To = "C:\to"
$DaysAgo = 10
$LastAccess = (Get-Date).AddDays(-$DaysAgo)
$outputfilename = "c:\" + $DaysAgo + "dayReport.csv"

Get-ChildItem $From -Recurse | Where-Object {
	# Files only, last accessed on or before the cutoff date
	!$_.PsIsContainer -and $_.LastAccessTime -le $LastAccess
} | ForEach-Object {
	# Recreate the source folder structure under the archive root
	$newpath = Join-Path $To $_.DirectoryName.Substring($From.Length)
	New-Item $newpath -ItemType Directory -Force -ErrorAction SilentlyContinue | Out-Null
	Move-Item $_.FullName -Destination $newpath
	Write-Host "File was last accessed" $_.LastAccessTime
	# Log each moved file's full path to the report
	$_ | Format-Table -Property FullName | Out-File $outputfilename -Append
}



Best regards to all, Jernej
0
