PowerShell Get-ChildItem limitations

omnipower321 asked:

Hello,

I put together a script to delete files older than "x" days from each directory in a root directory. The script works as planned until it gets to a directory that contains 1.9 million files. I know that's a whole lot of files, and this script will clean them up, but are there limitations to Get-ChildItem, or to PowerShell's object count?

If I run the command manually on that directory, it sits for hours and then returns to the prompt without displaying any paths. If I wait 10 minutes after initiating the command and then press the Enter key, the file list starts displaying / scrolling. I can't / don't want to do this every time, as we will be running this script on a nightly schedule.

Thanks for any info!
Chris Dent:


> but are there limitations to Get-ChildItem, or PowerShell's object count?

Yes... your system RAM, but it depends on how you use the commands.
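For example (simplified commands rather than anything from your script), the difference between these two matters once the file count gets large:

# Collects every FileInfo object into an array before anything else can run,
# so memory grows with the number of files found:
$files = Get-ChildItem -Recurse

# Streams each item down the pipeline as it is found, so memory stays roughly flat:
Get-ChildItem -Recurse | ForEach-Object { $_.Name }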

> hours then returns to the prompt without displaying any paths

Returns to which prompt? This one?

PS C:\>

Or this one?

>>

If it's the latter, it's an unfinished script block; it sits there because you must finish it off by pressing Return again.
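For example, an unclosed script block like this one just sits at the continuation prompt; nothing runs until the block is closed and Return is pressed again on the empty >> line:

PS C:\> Get-ChildItem | ForEach-Object {
>>     $_.Name
>> }
>>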

Everything else depends on how you've written the script.

Chris
omnipower321 (Asker):
Hi Chris,

It returns to the PS C:\> prompt. The system does not run out of RAM, sadly. Here is how the script is set up; I simplified it, as it is part of a larger script, but I ran it with the same results.

I get the same results if I don't do the date check and just run Get-ChildItem | %{Write-Host $_.Name}, too.
# Collect every file older than the cutoff, then delete each one and log the result.
$OutboundFiles = Get-ChildItem -Recurse | Where-Object {$_.LastWriteTime -le ((Get-Date).AddDays(-$intDeleteFilesOlderThan))}
ForEach ($OutboundFile in $OutboundFiles) {
	Remove-Item $OutboundFile.FullName
	If ($?) {Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Deleted: " + $OutboundFile.FullName) | Out-File -FilePath ".\Deletion History.log" -Append}
	Else {Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Failed to delete: " + $OutboundFile.FullName) | Out-File -FilePath ".\Deletion History.log" -Append}
}


Replace $intDeleteFilesOlderThan with your favorite number.
ASKER CERTIFIED SOLUTION

Chris Dent:

(This solution is only available to Experts Exchange members.)
Nice, I like those shortcuts! Learn something new every day. Same deal though: it just blinks at me until I hit Enter again.

I'm a little confused about that. I can see it sitting and blinking with ">>" as the prompt; that's the usual script-block closing behaviour (double Return to terminate the block).

But if it drops back to the prompt then the command is done.

If it's actually running the command, then I can't claim I'd be surprised; 1.9 million is a lot of files, as you rightly point out. You could Tee-Object the data you're writing so you can see progress as well as writing it to the file.
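A rough sketch of what I mean, assuming the same variable name and log format as your simplified script; it streams the files down the pipeline and lets Tee-Object write each line to the log while also passing it through to the console (note it overwrites the log file rather than appending):

Get-ChildItem -Recurse |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-$intDeleteFilesOlderThan) } |
    ForEach-Object {
        # Check $? straight after Remove-Item so it reflects the deletion, not a later command.
        Remove-Item $_.FullName
        If ($?) { (Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Deleted: " + $_.FullName }
        Else { (Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Failed to delete: " + $_.FullName }
    } |
    Tee-Object -FilePath ".\Deletion History.log"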

Chris
I was not aware of Tee-Object; very cool though. In the main script I was just using Write-Host in the loop with the current object name. But even if I do this: Get-ChildItem D:\folder\bignastyfolder

I just sit at a screen that looks like this:

PS C:\> Get-ChildItem D:\folder\bignastyfolder
_

For over an hour, then it will just return to:
PS C:\>

If I run Get-ChildItem D:\folder\bignastyfolder and wait about 10 minutes, then hit Enter, the files start flowing. Same with the script you wrote above.

Okay, hang on... making 1.9 million files.

Chris
Dir in cmd.exe works immediately. If need be, I will resort to redirecting the output of cmd.exe's dir to a text file, splitting up the text file, and iterating through the chunks. Once I get the directory down to a reasonable size, the script should work against it. The script is working fine against other directories with 250,000 files in them, so that should be OK.
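A rough sketch of that fallback, with the scratch-file path (C:\temp\filelist.txt) and the 100,000-line chunk size made up purely for illustration:

# Let cmd.exe produce a bare, recursive listing and capture it to a text file.
cmd /c "dir /b /s D:\folder\bignastyfolder > C:\temp\filelist.txt"

# Read the listing back one line at a time and work through it in chunks,
# so the full 1.9-million-line list is never held in memory at once.
$chunk = New-Object System.Collections.ArrayList
Get-Content C:\temp\filelist.txt | ForEach-Object {
    [void]$chunk.Add($_)
    If ($chunk.Count -ge 100000) {
        # date-check, Remove-Item and log each path in $chunk here
        $chunk.Clear()
    }
}
# ...then deal with whatever is left in $chunk after the loop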

I expect it will take a little while to create the files, so I'll refrain from wondering too much about the advantages dir might have ;)

Out of interest, PowerShell 1 or 2?

Chris
Ha, I appreciate the effort! Using v2, on Server 2003. I am testing the simplified script against the directory through the admin share from my workstation now, to rule out any strangeness with PowerShell on the server.

Cool, good to know.

Just waiting for this to complete...

1..1900000 | %{ $_ > "BobsFile$_.txt" }

Hopefully it'll be close enough to reproduce the problem you're suffering from.

Chris
Explorer starts to implode somewhere around 300k, in case you have issues :)

Yeah, I was avoiding opening Explorer completely :)

Chris
So this is fun: the script is working from a remote system. Very slowly, but working. Powershell.exe is slowly claiming all of my laptop's RAM. I may end up running this first round remotely from a system with a little more punch, but it's strange that it did not work locally.

I finally got enough files, but unfortunately I can't recreate the problem. It's slow, but I rather expected that.

Chris
Thanks again, Chris. It appears to have been an issue with the server I was running it from. I ran the modified script remotely and it completed in about 9 hours. Now that the directory is a manageable size, I can include the path in the daily task. It's still unknown why it did not work locally from the server, but I will be reinstalling PowerShell to see if that helps. Thanks!