PowerShell Get-ChildItem limitations

Hello,

I put together a script to delete files older than "x" for each directory in a root directory. The script works as planned until it gets to a directory that contains 1.9 million files. I know that's a whole lot of files, and this script will clean them up, but are there limitations to Get-ChildItem, or to PowerShell's object count?

If I run the command manually on the directory, it sits for hours and then returns to the prompt without displaying any paths. If I wait 10 minutes after initiating the command and then press the Enter key, the file list starts displaying / scrolling. I don't have, or want, the ability to do this every time, as we will be running this script on a nightly schedule.

Thanks for any info!
omnipower321 asked:
 
Chris Dent (PowerShell Developer) commented:

You should get a boost in speed by making it run in the pipeline rather than saving the result set as a variable.

At least that way you don't need to wait for it to completely read all the information from all the files before it gets on with its work.

Chris
Get-ChildItem -recurse | ?{$_.LastWriteTime -le (Get-Date).AddDays(-$intDeleteFilesOlderThan) } | %{
  Remove-Item $_.FullName
  If ($?) { 
    "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Deleted: $($_.FullName)" >> ".\Deletion History.log"
  } Else {
    "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Failed to delete: $($_.FullName)" >> ".\Deletion History.log"
  }
}

Chris Dent (PowerShell Developer) commented:

> but are there limitations to Get-ChildItem, or PowerShell's object count?

Yes... your system RAM, but it depends on how you use the commands.
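
For example, something like this has to build the entire result set in memory before anything else can happen:

$files = Get-ChildItem -Recurse

whereas streaming it down the pipeline only holds on to (roughly) the object currently being processed:

Get-ChildItem -Recurse | %{ $_.Name }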

> hours then returns to the prompt without displaying any paths

Returns to which prompt? This one?

PS C:\>

Or this one?

>>

If it's the latter, it's an unfinished script block; it sits there because you must finish it off by pressing return again.
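
For example, type this and press return once, and you'll be left sitting at the continuation prompt until you press return again on an empty line:

PS C:\> Get-ChildItem | %{
>> $_.Name
>> }
>>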

Everything else depends on how you've written the script.

Chris
 
omnipower321 (Author) commented:
Hi Chris,

Returns to the PS C:\> prompt. Sadly, the system does not run out of RAM. Here is how the script is set up (simplified, as it is part of a larger script, but I ran it like this with the same results):

I also get the same results if I skip the date check and just run Get-ChildItem | %{Write-Host $_.Name}.
$OutboundFiles = Get-ChildItem -recurse | Where {$_.LastWriteTime -le ((Get-Date).AddDays(- $intDeleteFilesOlderThan))}
ForEach($OutboundFile in $OutboundFiles){
	Remove-Item $OutboundFile.FullName
	If($?){Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Deleted: " + $OutboundFile.FullName) | Out-File -filepath ".\Deletion History.log" -append}
	Else{Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Failed to delete: " + $OutboundFile.FullName) | Out-File -filepath ".\Deletion History.log" -append}
}

omnipower321 (Author) commented:
Replace $intDeleteFilesOlderThan with your favorite number.
 
omnipower321 (Author) commented:
Nice, I like those shortcuts! Learn something new every day. Same deal though: it just blinks at me until I hit Enter again.
 
Chris Dent (PowerShell Developer) commented:

I'm a little confused about that. I can see it sitting and blinking with ">>" as the prompt; that's the usual script-block closing behaviour (double return to terminate the block).

But if it drops back to the prompt then the command is done.

If it's actually running the command then I can't claim I'd be surprised; 1.9 million is a lot of files, as you rightly point out. You could Tee-Object the data you're writing so you can see progress as well as writing it to the file.

Chris
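
Something along these lines, for example. It's only a rough sketch based on the snippet I posted above (same filter, same log path), and as far as I recall Tee-Object overwrites the log rather than appending to it, so adjust if you need the history kept between runs:

Get-ChildItem -Recurse |
  ?{ $_.LastWriteTime -le (Get-Date).AddDays(-$intDeleteFilesOlderThan) } |
  %{
    Remove-Item $_.FullName
    If ($?) { "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Deleted: $($_.FullName)" }
    Else { "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Failed to delete: $($_.FullName)" }
  } | Tee-Object -FilePath ".\Deletion History.log"

That way each line goes to the file and carries on down the pipeline, so it also shows up on screen as the deletions happen.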
 
omnipower321 (Author) commented:
I was not aware of Tee-Object; very cool though. In the main script I was just using Write-Host in the loop with the current object name. But even if I do just this: Get-ChildItem D:\folder\bignastyfolder

I just sit at a screen that looks like this:

PS C:\> Get-ChildItem D:\folder\bignastyfolder
_

For over an hour, and then it just returns to:
PS C:\>

If I run Get-ChildItem D:\folder\bignastyfolder, wait about 10 minutes, and then hit Enter, the files start flowing. Same with the script you wrote above.
 
Chris Dent (PowerShell Developer) commented:

Okay, hang on... making 1.9 million files.

Chris
 
omnipower321 (Author) commented:
Dir in cmd.exe works immediately. If need be, I will resort to redirecting the output of cmd.exe's dir to a text file, splitting up the text file, and iterating through the chunks. Once I get the directory down to a reasonable size, the script should work against it. The script is working fine against other directories with 250,000 files in them, so that should be OK.
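
Roughly what I have in mind, as a sketch (the paths and chunk size are placeholders, and dir /b only returns bare names, so they would need to be joined back onto the folder before deleting):

# dump the listing with cmd.exe, which copes with the folder
cmd /c "dir /b D:\folder\bignastyfolder > C:\temp\filelist.txt"

# read the list back and work through it a chunk at a time
$names = Get-Content "C:\temp\filelist.txt"
$chunkSize = 50000
for ($i = 0; $i -lt $names.Count; $i += $chunkSize) {
  $end = [Math]::Min($i + $chunkSize, $names.Count) - 1
  $names[$i..$end] | %{
    # the age check and Remove-Item against "D:\folder\bignastyfolder\$_" would go here
    Write-Host $_
  }
}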
 
Chris Dent (PowerShell Developer) commented:

I expect it will take a little while to create the files, so I'll refrain from wondering too much about the advantages dir might have ;)

Out of interest, PowerShell 1 or 2?

Chris
 
omnipower321 (Author) commented:
Ha, I appreciate the effort! Using v2, on Server 2003. I am now testing the simplified script against the directory through the admin share from my workstation, to rule out any strangeness with PowerShell on the server.
 
Chris Dent (PowerShell Developer) commented:

Cool, good to know.

Just waiting for this to complete...

1..1900000 | %{ $_ > "BobsFile$_.txt" }

Hopefully it'll be close enough to reproduce the problem you're suffering from.

Chris
 
omnipower321 (Author) commented:
Explorer starts to implode somewhere around 300k files, in case you have issues :)
 
Chris Dent (PowerShell Developer) commented:

Yeah, I was avoiding opening Explorer completely :)

Chris
 
omnipower321 (Author) commented:
So this is fun: the script is working from a remote system. Very slowly, but working. Powershell.exe is slowly claiming all of my laptop's RAM. I may end up running this first round remotely from a system with a little more punch, but it's strange that it did not work locally.
 
Chris Dent (PowerShell Developer) commented:

I finally got enough files, but unfortunately I can't recreate the problem. It's slow, but I rather expected that.

Chris
 
omnipower321 (Author) commented:
Thanks again, Chris. It appears to have been an issue with the server I was running it from. I ran the modified script remotely and it completed in about 9 hours. Now that the directory is a manageable size, I can include the path in the daily task. It is still unknown why it did not work locally on the server, but I will be reinstalling PowerShell to see if that helps. Thanks!