Bert2005 asked:
Delete files from folder based on number

Hi experts,

I have searched via Google for two days now, and I have looked at quite a few programs. I am looking for a software application which will monitor a folder and delete a file based either on a schedule or, preferably, when one is added. I really don't like the ones based on file dates.

I have no problem paying for the application. If someone has a script which isn't way over my head, I could look at that also.

Thanks.
ToddBeaulieu:

I'm not sure what you mean by "schedule".

You can easily write a PowerShell script to look at the CreationTime on files in a folder.

I just copied an existing file to a test folder. The LastWriteTime was preserved from the original file, but the CreationTime property showed the date/time I copied the file.
$file = Get-Item "c:\test\Package - Copy.dtsx"
$file.CreationTime

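To make that concrete, here's a minimal sketch of the comparison I'm describing (the path is just the test file from my example; substitute any file you've copied):

# Compare the two timestamps on a copied file
$file = Get-Item "c:\test\Package - Copy.dtsx"
"CreationTime : $($file.CreationTime)"   # stamped when the copy landed in this folder
"LastWriteTime: $($file.LastWriteTime)"  # carried over from the original file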

Bert2005 (Asker):
Hi Todd,

Thanks. Sorry, by schedule I meant date. I have had some problems with deletion based on creation or modified date within the last eight days, etc.

What I would prefer (and your script looks like it may work) would be for the script to notice that a new file had been placed in the folder and then delete the oldest file.

And, just to make it easier for me, the path would be:

D:\Amazing Charts\Backup   or   \\Server\Amazing Charts\Backup
ASKER CERTIFIED SOLUTION (ToddBeaulieu)
Thanks Todd and Satsumo,

So, if I copy the script into Notepad and change it to a .bat file, will I be all set? What do I need to change in it besides the path? I take it the file names have to be changed as well, but they have different dates.

Sorry, I am not very good with scripts.
I did download the PowerGUI. Not sure what to do with it. Just looking around. The train is cool, though.
>>"the file names have changed, but they have different dates"

I don't understand what you mean.

I created and tested this script. It should be productionized, but it's a great start.

It's not a batch file. Like I mentioned, it's a PowerShell script, which is a free download from MS.
Oh, I did download it. At least I downloaded PowerGUI. Do I need to download PowerShell separately? I used your URL.

I just thought where it said "target file" I may have to put the names of the files in the folder. Please tell me a bit more step by step if you have time. Also, comment on the difference between PowerGUI and PowerShell. I will look also.
Ah, ok!

PowerGUI is just a free editor that runs whatever you enter into it against PowerShell.

If you change the top lines as shown below, you'll see I introduced a source folder setting. You change THAT, and set maxFiles to the # of files to preserve.

Here's the download link for the current production version:

http://www.microsoft.com/windowsserver2003/technologies/management/powershell/download.mspx
FROM....
 
$maxFiles = 2;
$files = Get-ChildItem -Path "c:\test" | where{!($_.PSISContainer)} | Sort-Object CreationTime;
 
TO...
 
$maxFiles = 2;
$sourceFolder = "c:\test";
$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer)} | Sort-Object CreationTime;


So, I installed and opened PowerShell. It seems to be just a command prompt.
I'm trying. I think you experts sometimes think I am smarter than I am.
That's what PS is. But the editor lets you enter scripts interactively and run them. And if you save the script to a file with an extension of .ps1, you can execute it.

You also have to enable script execution on your computer.

It's tough to explain everything! There are TONS of PS tutorials and write-ups out there.
Well, I am able to run .cmd and .bat files on SBS2003. So, it would seem that script execution is on. I have started reading the PS tutorials and listened to a few.

If script execution is on, what would I need to do with what you wrote for a script to make it an executable file?
Rename it to a ps1. It's unrelated to batch and cmd files.

If you google "powershell enable execution" you'll see a lot of discussion about enabling it. It's off by default for security reasons. It's a one-liner to enable it.

Once you enable it and learn a bit more about PS, you'll discover a wealth of capabilities that PS offers you. Lots of scripts out there.
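As a hedged sketch of that one-liner (RemoteSigned is one common choice, not the only policy; run it from a PowerShell prompt opened as an administrator):

# Allow locally written scripts to run; downloaded scripts must be signed
Set-ExecutionPolicy RemoteSigned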
Ok, so I have it unrestricted now. So, at least I got that far. And, I have run a bunch of Get and Set and Format commands.

The part I am confused on, and you are going to kill me, is that I am not sure what to do with your script. I have copied and pasted it into a text file and named it Backup.ps1, but it still looks like a text file.

1. What do I do with your script? Copy and paste it somewhere?
2. Do I then use commands from the PS?

Member_2_5069294:
Just to clarify though, what if you rename one of the files? Is that a 'new' file in this case? Should it then delete the oldest file, or is it just maintaining a specific number of files? If you added 3 files, should it then delete the 3 oldest files?
Once you install PS, it will associate PS1 files with PS and run them when you double-click them. Or, you can open the script with power gui and run it inside the editor. You can even step through it one line at a time. You could also modify the script to copy the oldest files into a sub folder instead of deleting them, until you're comfortable that it's working as expected.
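For example, here's a rough sketch of that "move instead of delete" idea, reusing the variable names from the script ($sourceFolder, $targetFiles); the "Archive" subfolder name is purely my invention:

# Move the oldest files into an Archive subfolder instead of deleting them
$archive = Join-Path $sourceFolder "Archive"
if (!(Test-Path $archive)) { New-Item -ItemType Directory -Path $archive | Out-Null }
ForEach ($file in $targetFiles)
{
	"Moving $($file.Name)"
	Move-Item -Path $file.FullName -Destination $archive
}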
>> If you added 3 files should it then delete the 3 oldest files?<<

Exactly. Just to give a little background. The author of the Electronic Medical Record made a backup program. The backup program allows you (as most do) to back up where you wish. These are SQL databases, which are compressed, then encrypted. It's a basic backup program for the application, nothing like Acronis or anything. It's a bit different, because it allows you to back up to three places as well as off site. I think that is a bit of overkill, but you don't have to use all three paths or the off-site backup.

The stupid part of the backup (in my opinion) is that, by default, it automatically places a backup in the C:\Program Files\Amazing Charts folder in a subdirectory called Backup. Many users aren't even aware it is there, and some of these backups are three or four GB. So, they are going along and all of a sudden they get a notice that there is no space left on their C: drive and have no idea why. Besides, if you lose a file or have a corrupt database, the last place I would want a backup is not only on the same drive but in the same folder.

Now, of course, I could go in and delete files manually, and mine are way less than GBs. But that is a pain. When I back up my server using Acronis or Backup Assist, it has the capability of deleting old backups based on space left, date, etc.
Thanks everyone for bearing with me.

The following is what I have done so far. I put the contents of the script into Notepad; you can see if I have done anything wrong with it. Should I not have the From and To fields?

Also, when I save it as a .PS1 file with PS installed, it doesn't change into anything that looks like a file which would run. Just for fun, I dragged the Test.ps1 file into the PowerShell command window, and it does show the path. But double-clicking on the PS1 file doesn't seem to do anything.


FROM
D:\Amazing Charts\Backup 
$maxFiles = 5;
$files = Get-ChildItem -Path "c:\test" | where{!($_.PSISContainer)} | Sort-Object CreationTime;
 
TO
D\Amazing Charts\Backup2 
$maxFiles = 1;
$sourceFolder = "c:\test";
$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer)} | Sort-Object CreationTime; 


So I guess you only want to delete files with the extension of the database backup file?  I can see it's unlikely anyone will put anything else in that directory or rename a backup file.  People have a habit of doing unlikely things.
Oh dear, the "from" and "to" were my notes to have you change the header.

So you'd want to have the first few lines look like this:

$maxFiles = {ENTER YOUR MAX # FILES HERE};
$sourceFolder = "D\Amazing Charts\Backup2";
$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer)} | Sort-Object CreationTime;

Have you tried loading the file into power gui and stepping through it? That's where I'd start. In fact, that's how I developed the script for you.

Maybe I'm mistaken about the automatic association of ps1 scripts with ps. Hmmm... I have my association of them with PowerGUI. Here's a discussion on firing up a script: http://www.microsoft.com/technet/scriptcenter/resources/qanda/sept06/hey0926.mspx

You can also run a script from the ps prompt by typing the complete path of the script into the prompt. If you CD into the same folder, you can enter just the script name, without the path, but you have to do something a little strange, which is including a "." first. For example: "  . myscript.ps1".
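To make that concrete, here are a couple of ways you might launch it, using the Backup.ps1 name you mentioned (the C:\scripts folder is just an assumption; adjust the path to wherever you saved the file):

# From cmd.exe, a shortcut, or a scheduled task:
powershell.exe C:\scripts\Backup.ps1

# From inside the PowerShell prompt, after CD-ing into C:\scripts:
.\Backup.ps1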
You can tweak this script till the cows come home to polish it off.

If you change the line below:

$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer)} | Sort-Object CreationTime;

To ...

$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer) -and $_.Extension -eq ".dat" } | Sort-Object CreationTime;

It will only process "*.DAT" files, *for example*.
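A hedged variant of the same idea, letting Get-ChildItem do the pattern matching itself (the *.dat pattern is just the example extension again; swap in whatever yours is):

# Filter by extension at the Get-ChildItem stage instead
$files = Get-ChildItem -Path $sourceFolder -Filter "*.dat" | where{!($_.PSISContainer)} | Sort-Object CreationTime;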
@satsumo

RIVERVIEWPEDIATRICS_1313_081209_123552.enc  Yes, it is only .enc files in this folder. And, since these are backed up in about nine other places (well pretty close), I wouldn't really care much if someone did delete them, although no one has that permission besides me.

@Todd

I will take a look a bit later after I finish work. It looks like I may need a little time.


Todd, by typing powershell.exe -noexit and the path, I can try to run the script from PowerShell.

I have changed the script. It would help if I understood what the last line of the script is for and what I need to exchange. I think I understand "$files = Get-ChildItem -Path $sourcefolder |" but I don't know what the rest is. Well, the CreationTime makes sense, but it's mostly the PSISContainer. What do I replace?

Also, I don't really wish to move a file to another folder, but rather just delete the oldest.
The last line of the script? It's a closing brace. Are you forgetting the rest of the script I originally posted?
$maxFiles = 2;
$sourceFolder = "D\Amazing Charts\Backup2";
$files = Get-ChildItem -Path $sourceFolder | where{!($_.PSISContainer)} | Sort-Object CreationTime;
$files = @($files)
 
if ($files.Count -ge $maxFiles) # array of System.IO.FileInfo objects
{
	$targetFileCount = ($files.Count - $maxFiles);
	
	if ($targetFileCount -gt 0)
	{
		$targetFileCount-- # 0-based
		$targetFiles = $files[0..$targetFileCount];
		
		ForEach($file in $targetFiles)
		{
			"Deleting $($file.Name)"
			[System.IO.File]::Delete($file.FullName);
		}
	}
}

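On the PSISContainer question: that property is True for folders and False for files, so the where clause simply skips folders. It isn't something you replace. A tiny sketch of it on its own (c:\test is just my earlier test folder):

# Keep only the files, dropping any sub-folders
Get-ChildItem -Path "c:\test" | where{ !($_.PSIsContainer) }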

Sorry again, I am so bad at scripts. Feel free to quit any time.

Do I need to include all of the } things?

Is it possible to put everything I need to customize with my info in purple?
Everything in the code block I just posted is your code and is used in its entirety. There are only two customizable bits. The first is the maximum number of files to keep in the folder. The second is the folder name itself.
Hi Todd,

Well, you aren't going to believe this, but I copied your code into a text file and named it with a ps1 extension. And it worked. Weird huh? :-) The only thing is I had to change D:\....... to \\Server\Amazing Charts\Backup

OK, so it works from the powershell prompt using:

powershell.exe C:\scripts\delete.ps1

I tried another script by changing the path to a network folder. There were no errors when I ran it in PowerShell, but it's been over five minutes now with nothing. Now, the first one was about 100 MB. This is more like 15 GB over a network. Is it powerful enough to do something like that?
The size of the file should have little effect, since it's just deleting it. It's not copying the file. I have ps scripts actually copying files larger than that. Again, you can step through it with power gui to see exactly what's happening... no guesswork.
Wow. 29 replies on this thread so far .. I think we're on track for a record!!!
https://www.experts-exchange.com/questions/24546969/Root-Kit.html

Got a ways to go. This one was 104 posts. I still have three larger. But then, I have over 375 questions. I love E-E.

I think the problem I am having is twofold. One, I think it prefers \\server and not anything over the network, and I have a NAS with the same issue. Also, some things are folders, and most of these delete programs are for files only.
"some things are folders"? we specifically ignore folders with this script. If you want to include folders, we can change the script.

Simply remove the "| where{!($_.PSISContainer)}" bit.

Not sure what you mean about UNC and "not anything over the network".
Well, as you know, when I run the script via the PowerShell command prompt, if it doesn't work for whatever reason, it will give an error message. So when I try to delete a file on the Buffalo NAS, where the UNC is \\Buffalo\Medware, it gives a red error message which says it can't find "Buffalo\Medware" and that I must use this path: \\server\etc. So, it is basically telling me it will only find files and folders directly on the server.
That's funny, because while I was waiting, I tried changing all the "files" to "folders" in the script.
When I run a script to remove folders and change the path to the correct one as below:
$maxFiles = 1;
$sourceFolder = "\\Server\FAP Backups";
$files = Get-ChildItem -Path $sourceFolder  Sort-Object CreationTime;
$files = @($files)
 
if ($files.Count -ge $maxFiles) # array of System.IO.FileInfo objects
{
	$targetFileCount = ($files.Count - $maxFiles);
	
	if ($targetFileCount -gt 0)
	{
		$targetFileCount-- # 0-based
		$targetFiles = $files[0..$targetFileCount];
		
		ForEach($file in $targetFiles)
		{
			"Deleting $($file.Name)"
			[System.IO.File]::Delete($file.FullName);
		}
	}
}


Sorry, the post submitted by itself or something. But when I take that part out exactly as you showed me and just change the path, I get the following error message:

Windows PowerShell
Copyright (C) 2006 Microsoft Corporation. All rights reserved.

PS C:\Documents and Settings\Administrator> powershell.exe C:\scripts\FAP.ps1
Get-ChildItem : A parameter cannot be found that matches parameter name 'CreationTime'.
At C:\scripts\FAP.ps1:3 char:23
+ $files = Get-ChildItem  <<<< -Path $sourceFolder  Sort-Object CreationTime;

PS is not very happy with it.

We may end up passing that other post anyway, if you stay long enough. Thanks.
1. You didn't take the filter out exactly as I showed. You took out the pipe symbol following it as well. Now you've got the Get-ChildItem command running into the Sort-Object command.

2. I don't know why you can't get UNC to work. I use it every day with PS. At this point I'm guessing it was a typo. UNC format is \\server\share\folder

3. "Well, as you know, when I run the script via the command prompt of Powershell, if it doesn't work for whatever reason, it will give the error message." Give  *what* error message? It *will* work from the command prompt. Guaranteed. You must not be entering the execution line correctly, as I indicated, using the leading dot and space. ". myscript.ps1"
SOLUTION
@satsumo,

Yes, thank you for picking up on that. "So when I try to delete a file on the Buffalo NAS where the UNC is \\Buffalo\Medware" Obviously, I knew to write \\ as in the sentence I wrote in my post above. But, I forgot to write it that way in the script. Thanks! It works great now.

As to the folders vs files, they are different in different backups. When I was deleting backup files of the AC EMR which were .enc files, I used the script for files. But, I also do backups of say "Medware" which is a billing software. When it backs up, it backs up the entire folder daily. So, I end up with many folders. I don't want to delete subfolders. I want to delete the entire folder, hence the need for changing the script for folders. :-)
PS C:\Documents and Settings\Administrator> powershell.exe C:\scripts\folders.ps1
Deleting Test2
Exception calling "Delete" with "1" argument(s): "Access to the path '\\Server\notfiles\Test2' is denied."
At C:\scripts\folders.ps1:18 char:28
+             [System.IO.File]::Delete( <<<< $file.FullName);
PS C:\Documents and Settings\Administrator>

I am trying a test to delete folders where there are four and the max is three. The folder is completely shared with the correct permissions. I don't know why it says Access is denied.
$maxFiles = 3;
$sourceFolder = "\\Server\notfiles";
$files = Get-ChildItem -Path $sourceFolder | Sort-Object CreationTime;
$files = @($files)
 
if ($files.Count -ge $maxFiles) # array of System.IO.FileInfo objects
{
	$targetFileCount = ($files.Count - $maxFiles);
	
	if ($targetFileCount -gt 0)
	{
		$targetFileCount-- # 0-based
		$targetFiles = $files[0..$targetFileCount];
		
		ForEach($file in $targetFiles)
		{
			"Deleting $($file.Name)"
			[System.IO.File]::Delete($file.FullName);
		}
	}
}


The max # of files/folders to PRESERVE has nothing to do with the error you're getting. If the folder has contents, you can't just delete it. You didn't mention originally that you wanted to delete folders, so it was a last-minute throw-in. I'd handle folders separately, but at this point I'm not sure I can bring this question to completion.
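If it helps in the meantime, here's a rough sketch of how the ForEach body might handle folders separately. This is an illustration, not the tested script above, and the recursive Directory.Delete is my assumption about what you'd want:

ForEach($file in $targetFiles)
{
	"Deleting $($file.Name)"
	if ($file.PSIsContainer)
	{
		# A folder needs a directory delete; the $true makes it recursive
		[System.IO.Directory]::Delete($file.FullName, $true);
	}
	else
	{
		[System.IO.File]::Delete($file.FullName);
	}
}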
Thanks. Now it's my turn to say I don't know what you mean by "not being able to bring this question to completion." If you are saying you would have to write an entirely different script which is beyond the scope of this question, then I would be glad to close this question, award points and open a new question on folders. :-)

PS The folders are empty. It was just a test.
Even more confused -

'I don't want to delete subfolders, I want to delete the entire folder'

I don't see how you could delete the entire folder without deleting its sub-folders. And I'm not even sure which folder you are referring to in this case. Do you have a directory with many Backup folders in it (and you want to keep a specific number of them), or a directory where you want to delete a single Backup folder with many Backup sub-folders?

'The folders are empty. It was just a test'

The Backup folders are empty?  Not much use as backup in that case?  What was a test?  If they are empty as a test, I guess they aren't empty in practice.
OK, I will explain. I think everyone is reading too much into this, lol. Part of the confusion is that experts in scripts tend to think that we newbies know too much. That's not a bad thing. It's that scripts look very scary to us, and we never know when to insert our custom stuff vs when not to. I am a doctor, and I make the same mistake. To me the word hypercalciuria seems straightforward. Hyper = high and calciuria = calcium in urine. But keep using those words and you will not have many patients. They just see the big picture and have no idea. Now to answer your question.

Because I am a doctor, my files are mission critical. Yes, I back up my server nightly, but a lot of my programs, e.g. my electronic medical record, my fax import program and my billing program, all have their native backups. Since it is easy to click on this and make a quick backup, I do that as well. I do that for two reasons. 1. I really think that "Set it and forget it" (referring to Acronis or ntBackup) is also "Set it and regret it." There have been times that after a month I went in to test it and noticed that the D: drive wasn't checked. That's my data! So, for my money, I like the idea of an automatic backup in case I forget, plus a manual backup. 2. The small backups can be restored much more quickly, and I don't have to search for the files or restore an entire partition or server.

Sorry this is long, but I really want to explain this well. Because of having four backups, there are backups that have folders and there are backups that have files. So, let's just look at two. The electronic medical record backups are files. It compresses and encrypts the files INSIDE the Amazing Charts folder and thus makes a file. The files grow and Todd's program works great to delete the files based on how many and the date. OK, so there is the backup of files.

It would be great if all my different backups were files, but they are not. My practice management software, eMedware, backs the FOLDER up. It does NOT back up the files inside into a nice tidy file. Therefore there are folders. Given that the script Todd made doesn't handle folders, I asked about that.

Now the subfolders: I have NO backups that back up as folders which have subfolders. So, yes, my subfolder comment may be stupid, but as you will see, it is not.

The reason I asked this question is that I looked at over 15 commercial programs that should be able to do this, but none of them performed well and, frankly, Todd's was better. Now, when I used one of the commercial programs to try to delete the folders, it couldn't do it. BUT, and this is why it seems weird, if you set it to delete subfolders, it sort of followed that, because it would delete all of the files inside the folder that you wanted deleted, leaving an empty folder. Somewhat useful, but obviously not a long-term solution.

Now for the test folders. Todd tried to change his script to delete folders. He later went on to say that he really would have to start from scratch, if I understood him. Now, when I use the scripts for the first time, I am sort of testing them. So, I don't want to run them on a folder with real backups and end up deleting all of them by mistake. So, I do it on test folders. I made a folder and put five folders in it called Test1, Test2, etc. Then I made the script go to that folder to see if it would delete the folders. I hope this explains it.

No, empty backups are not useful. I can assure you that my backups average at least about 40 GB of data for the server backups.
I don't know where my post went on this, but as to the error, I'd guess it's telling you the path 'Buffalo\Medware' is missing its initial '\\'. Windows won't recognise it as a UNC without it.

Yes, I am aware that I need to use \\. I just forgot. Thanks for pointing it out.
Todd,

If you want to try your hand at making one for folders, I would be happy to do another question. If you are tired of my ineptitude, I understand. :)

As you can see, your script, and your helping me to use it, did bring this question to completion.

As an FYI: I now have 375 questions, of which ~ 350 have accepted answers with a grade of A.