Need a solution to keep a version history of Excel files

Hey guys,

I have a requirement where the Finance dept wants people to save each revision of an Excel workbook to a folder, but they shouldn't be allowed to delete or modify any existing files. They should still be able to read them.

I have done a few hours of testing NTFS permissions to try and make this work, but it seems to be a bit of a catch-22 situation to me. They can write new files but can't write to existing files? There are no NTFS permissions granular enough for that from what I can tell, but I don't claim to be an expert by any stretch.

Is there a way to allow new files to be created but make them read only once they are created?

Do you guys have any recommendations on how to solve this problem? I am starting to look at things like TortoiseSVN and Git because I am beginning to doubt I can solve this with NTFS permissions alone. However, I am very reluctant to proceed down this path, as it appears it would require a lot of training for already change-resistant users before they could use such a solution correctly.

Is there an idiot proof solution to my problem?

Thanks in advance
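[Editor's note: one way to approximate "writable once, read-only afterwards" is a small watcher script that flips the read-only attribute on each newly created file. This is only a sketch under assumptions not stated in the thread: the drop-folder path `C:\FinanceRevisions` is hypothetical, and the one-second sleep is a crude way to let Excel finish writing.]

```powershell
# Sketch only: mark each new file read-only as soon as it appears.
function Protect-File {
    param([string]$Path)
    # Flip the read-only attribute so the file can't be modified in place
    Set-ItemProperty -Path $Path -Name IsReadOnly -Value $true
}

$watched = "C:\FinanceRevisions"   # hypothetical drop folder
if (Test-Path $watched) {
    # Watch the drop folder and protect every file created in it
    $watcher = New-Object System.IO.FileSystemWatcher $watched
    $watcher.EnableRaisingEvents = $true
    Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
        Start-Sleep -Seconds 1     # give Excel a moment to finish writing
        Protect-File -Path $Event.SourceEventArgs.FullPath
    } | Out-Null
}
```

Note that the read-only attribute alone can be cleared by anyone with write access; for real enforcement you would also need to adjust the NTFS ACLs on each file.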
Steven Harris (President) commented:
I don't have the exact script handy, but this should help to get you going.

What the following code does is grab any files that have been written in the last day (/maxage:1) and copy them to the destination folder, under a newly created folder named with today's date:

    $date = Get-Date -Format "yyyy-MM-dd"
    $source = "\\Source\Folder"
    $dest = "\\Destination\Folder"
    $destin = Join-Path $dest $date
    mkdir $destin
    robocopy $source $destin /s /r:1 /w:10 /maxage:1


So, let's say there have been 3 edited files along with 4 new files in the last 24 hours. These seven files are found in \\Source\Folder and copied over to \\Destination\Folder\2013-11-11.

This keeps a daily revision of documents.  If something is deleted, you can retrieve it from previous day(s).  If a file is changed, you have a daily backup that provides revision levels.

To purge old documents, use something like the following, which will remove any folder older than 90 days ($Days = 90):

$Now = Get-Date
$Days = 90
$Target = "\\Destination\Folder"
$LastWrite = $Now.AddDays(-$Days)

$Folders = Get-ChildItem -Path $Target |
    Where-Object { $_.PSIsContainer -eq $true } |
    Where-Object { $_.LastWriteTime -le $LastWrite }

foreach ($Folder in $Folders) {
    Write-Host "Deleting $Folder" -ForegroundColor "Red"
    Remove-Item $Folder.FullName -Recurse -Confirm:$false
}


Even better: combine the two scripts and run them once daily!

Let me know if you need/want to make any changes.
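[Editor's note: the two snippets above can indeed be combined into one daily task. The sketch below uses Copy-Item with a LastWriteTime filter in place of robocopy so the whole flow is visible in pure PowerShell; the function name, paths, and the 24-hour/90-day windows are illustrative assumptions, not something from the thread.]

```powershell
function Invoke-DailyRevision {
    param(
        [string]$Source,
        [string]$Dest,
        [int]$KeepDays = 90
    )
    # 1. Copy anything modified in the last 24 hours into a dated subfolder
    $destin = Join-Path $Dest (Get-Date -Format "yyyy-MM-dd")
    New-Item -ItemType Directory -Path $destin -Force | Out-Null
    Get-ChildItem -Path $Source -File -Recurse |
        Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
        ForEach-Object { Copy-Item $_.FullName -Destination $destin }

    # 2. Prune dated folders older than $KeepDays
    Get-ChildItem -Path $Dest -Directory |
        Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-$KeepDays) } |
        Remove-Item -Recurse -Confirm:$false
}
```

Run once daily (e.g. from Task Scheduler) against the share, and you get both the dated snapshots and the rolling 90-day cleanup in one pass.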
Bill Bach (President) commented:
Using Excel's built-in tools, there are likely no solutions other than careful naming of the files.

Your idea of using a version control system (VCS) is a great one, but it does require additional training.

Another possibility is to use a continuous data protection system that never deletes files and keeps a running history, but such a system will only retain data for so long, depending on the storage space available. Depending on your hardware and server environment, this last option may take the form of a software component, or you could leverage a NAS solution like the Buffalo TeraStation. Remember, though: once you start filling up the storage system, it will start overwriting older copies, so this may not be exactly what you want, either.
Steven Harris (President) commented:
We have a small company that uses a modified Contribute script that syncs two folders.

In essence, when any file is added or changed in the main folder, the new or changed file is synced to the 'archive' folder. No overwrites, no deletions. Users have read-only access to the archive folder. After a set period of time the folder is pruned, deleting any item older than X days.
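[Editor's note: the sync-to-archive idea described above can be sketched in a few lines of PowerShell. This is not the poster's actual script; the function name and the timestamp-suffix naming scheme are assumptions chosen so that an existing archive copy is never overwritten.]

```powershell
function Sync-ToArchive {
    param(
        [string]$Source,
        [string]$Archive
    )
    # Copy every file into the archive under a timestamped name, so existing
    # archive copies are never overwritten and nothing is deleted here.
    Get-ChildItem -Path $Source -File | ForEach-Object {
        $stamp  = $_.LastWriteTime.ToString("yyyyMMdd-HHmmss")
        $name   = "{0}_{1}{2}" -f $_.BaseName, $stamp, $_.Extension
        $target = Join-Path $Archive $name
        if (-not (Test-Path $target)) {
            Copy-Item $_.FullName -Destination $target
        }
    }
}
```

Because the archive name embeds the file's last-write time, an unchanged file produces no new copy on later runs, while every edit yields one new archive entry, which gives the revision levels the Finance dept is after.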

defecta (Author) commented:
@ThinkSpaceSolutions: This is the sort of solution I have just been looking into since posting: a script that moves, renames, or modifies the attributes of new files so that unprivileged users can't change or delete them.

I have used Apple Automator (on Mac OS X, of course) with great success to complete this sort of task quickly, easily and reliably, but there is no native Windows equivalent that doesn't require competent scripting ability, so I have been looking at alternatives. So far I have found some Java software called "Action(s)" that I am looking at currently.

Your proposal sounds like the best solution to this problem. I am not familiar with Contribute scripts and my googling hasn't turned up any clues as to how you are achieving this. Would you mind elaborating on it?
defecta (Author) commented:
Can anyone comment on whether an auditing policy in concert with NTFS permissions could achieve my intended outcome? I have zero experience with auditing policies and how they work, except for the 5 minutes I have just spent looking at them now.

EDIT: Looking a little closer, auditing will only track who has done what, not prevent it. Forget I asked.
defecta (Author) commented:
@ThinkSpaceSolutions: Thanks, I think I have found a way to improve your script by using a PowerShell script that watches for a file-creation event with the help of WMI.

But my PowerShell scripting sucks, and I have been trying and failing to write a loop so it will continue to check after a file has been created. Then I need to pass the newly detected files to a copy command like you have suggested.

I'm sure I am biting off more than I can chew but I am sure someone else could benefit from the effort put into such a script too so I figured what the hey. =)

Here is the code I have borrowed from 'Hey Scripting Guy'.  
# ----------------------------------------------------------------------------- 
# Script: TemporaryWMIEventToMonitorFolderForNewFiles.ps1 
# Author: ed wilson, msft 
# Date: 07/13/2012 16:05:50 
# Keywords: WMI, Events and Monitoring 
# comments: creates temporary event consumer to watch files. Also has a function 
# to cleanup events and subscribers after running the script. 
# hsg-7-17-2012 
# ----------------------------------------------------------------------------- 
$query = @"
 Select * from __InstanceCreationEvent within 10
 where targetInstance isa 'Cim_DirectoryContainsFile'
 and targetInstance.GroupComponent = 'Win32_Directory.Name="c:\\\\test"'
"@
Register-WmiEvent -Query $query -SourceIdentifier "MonitorFiles"
$fileEvent = Wait-Event -SourceIdentifier "MonitorFiles"
Function Remove-WMIEventAndSubscriber {
 Get-EventSubscriber | Unregister-Event
 Get-Event | Remove-Event
} #end function Remove-WmiEventAndSubscriber


I was trying to add a 'do until' loop but I can't figure out the syntax so that it keeps looping after the file-detection event fires. Any suggestions?
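[Editor's note: one way to keep a script like the one above watching after the first hit is to wrap Wait-Event in a while loop and remove each event after handling it. The sketch below demonstrates only the loop shape; the function name and parameters are hypothetical, and with the real WMI subscription you would keep the Register-WmiEvent line and the "MonitorFiles" identifier unchanged.]

```powershell
function Watch-NewFiles {
    param(
        [string]$SourceIdentifier = "MonitorFiles",
        [int]$Max = [int]::MaxValue   # leave at the default for an endless watcher
    )
    $seen = 0
    while ($seen -lt $Max) {
        # Blocks until the next matching event (or returns a queued one)
        $evt = Wait-Event -SourceIdentifier $SourceIdentifier
        # ...copy or protect the newly created file here; for the WMI version
        # the file details live under $evt.SourceEventArgs.NewEvent.TargetInstance...
        Remove-Event -EventIdentifier $evt.EventIdentifier  # avoid re-handling it
        $seen++
    }
    return $seen
}
```

Without the Remove-Event call, Wait-Event would keep returning the same queued event forever, which is the usual cause of a watcher loop that appears to fire only once.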

EDIT: I think I will open another question for the scripting advice. I will accept ThinkSpaceSolutions' answer for this.
Steven Harris (President) commented:
Hey defecta.  I look forward to your next post!  Sounds like you are having fun with this little project at any rate!
defecta (Author) commented:
Until I post a new question, this is what I am working on now. I think my answer is in the second-to-last post, by Robert.
Question has a verified solution.