PowerShell Get-ChildItem limitations

Hello,

I put together a script to delete files older than "x" for each directory in a root directory. The script works as planned until it gets to a directory that contains 1.9 million files. I know that's a whole lot of files, and this script will clean them up, but are there limitations to Get-ChildItem, or to PowerShell's object count?

If I run the command manually on the directory, it sits for hours and then returns to the prompt without displaying any paths. If I wait 10 minutes after initiating the command and then press the Enter key, the file list starts displaying / scrolling. I don't have / want the ability to do this every time, as we will be running this script on a nightly schedule.

Thanks for any info!
omnipower321 Asked:
Chris Dent (PowerShell Developer) Commented:

> but are there limitations to Get-ChildItem, or PowerShell's object count?

Yes... your system RAM, but it depends on how you use the commands.

> hours then returns to the prompt without displaying any paths

Returns to which prompt? This one?

PS C:\>

Or  this one?

>>

If it's the latter, it's an unfinished script block; it sits there because you must finish it off by pressing Return again.
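For illustration (a hypothetical command), an unclosed script block leaves you at the `>>` continuation prompt rather than back at `PS C:\>`:

```powershell
PS C:\> Get-ChildItem | ForEach-Object {
>>   $_.Name
>> }
>>
```

Pressing Return on the empty `>>` line closes the input and runs the pipeline.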

Everything else depends on how you've written the script.

Chris
omnipower321 (Author) Commented:
Hi Chris,

Returns to the PS C:\> prompt. The system does not run out of RAM, sadly. Here is how the script is set up:
I simplified it, as it is part of a larger script, but I ran it with the same results.

I get the same results if I skip the date check and just run Get-ChildItem | %{Write-Host $_.Name} too.
$OutboundFiles = Get-ChildItem -recurse | Where {$_.LastWriteTime -le ((Get-Date).AddDays(- $intDeleteFilesOlderThan))}
ForEach($OutboundFile in $OutboundFiles){
	Remove-Item $OutboundFile.FullName
	If($?){Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Deleted: " + $OutboundFile.FullName) | Out-File -filepath ".\Deletion History.log" -append}
	Else{Write-Output ((Get-Date -f "yyyy-MM-dd HH:mm:ss") + " Failed to delete: " + $OutboundFile.FullName) | Out-File -filepath ".\Deletion History.log" -append}
}

omnipower321 (Author) Commented:
Replace $intDeleteFilesOlderThan with your favorite number.

Chris Dent (PowerShell Developer) Commented:

You should get a boost in speed by making it run in the pipeline rather than saving the result set as a variable.

At least that way you don't need to wait for it to completely read all the information from all the files before it gets on with its work.

Chris
Get-ChildItem -recurse | ?{$_.LastWriteTime -le (Get-Date).AddDays(-$intDeleteFilesOlderThan) } | %{
  Remove-Item $_.FullName
  If ($?) { 
    "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Deleted: $($_.FullName)" >> ".\Deletion History.log"
  } Else {
    "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Failed to delete: $($_.FullName)" >> ".\Deletion History.log"
  }
}
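If the machine has .NET 4 or later available, a streaming enumeration avoids building FileInfo objects entirely. This is only a sketch (PowerShell v2 on Server 2003 may well be running on .NET 3.5, where EnumerateFiles does not exist, and the directory path is hypothetical):

```powershell
# Sketch: stream file paths one at a time instead of materialising FileInfo objects.
# Requires .NET 4+ for [System.IO.Directory]::EnumerateFiles.
$cutoff = (Get-Date).AddDays(-$intDeleteFilesOlderThan)
foreach ($path in [System.IO.Directory]::EnumerateFiles('D:\folder\bignastyfolder', '*', 'AllDirectories')) {
    if ([System.IO.File]::GetLastWriteTime($path) -le $cutoff) {
        [System.IO.File]::Delete($path)
        "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Deleted: $path" >> '.\Deletion History.log'
    }
}
```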

omnipower321 (Author) Commented:
Nice, I like those shortcuts! Learn something new every day. Same deal, though: it just blinks at me until I hit Enter again.
Chris Dent (PowerShell Developer) Commented:

I'm a little confused by that. I can see it sitting and blinking with ">>" as the prompt; that's the usual script block closing behaviour (double Return to terminate the block).

But if it drops back to the prompt then the command is done.

If it's actually running the command then I can't claim I'd be surprised; 1.9 million is a lot of files, as you rightly point out. You could Tee-Object the data you're writing, so you can see progress as well as writing it to the file.
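For example, a sketch based on the earlier one-liner (note that Tee-Object's -Append parameter only arrived in v3; in v2 the log file is overwritten each run):

```powershell
Get-ChildItem -Recurse |
  Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-$intDeleteFilesOlderThan) } |
  ForEach-Object {
    Remove-Item $_.FullName
    "$(Get-Date -f 'yyyy-MM-dd HH:mm:ss') Deleted: $($_.FullName)"
  } |
  Tee-Object -FilePath '.\Deletion History.log'
```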

Chris
omnipower321 (Author) Commented:
I was not aware of Tee-Object, very cool; in the main script I was just using Write-Host in the loop with the current object name. But even if I do this: Get-ChildItem D:\folder\bignastyfolder

I just sit at a screen that looks like this:

PS C:\> Get-ChildItem D:\folder\bignastyfolder
_

For over an hour, then it will just return to:
PS C:\>

If I run Get-ChildItem D:\folder\bignastyfolder and wait about 10 minutes, then hit Enter, the files start flowing. Same with the script you wrote above.
Chris Dent (PowerShell Developer) Commented:

Okay, hang on... making 1.9 million files.

Chris
omnipower321 (Author) Commented:
dir in cmd.exe works immediately. If need be, I will resort to redirecting the output of cmd.exe's dir to a text file, splitting up the text file, and iterating through the chunks. Once I get the directory down to a reasonable size, the script should work against it. The script is working fine against other directories with 250,000 files in them, so that should be ok.
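That fallback might look something like this (a sketch; the paths and chunk size are hypothetical, and any date filtering would still need to be added):

```powershell
# Sketch: let cmd.exe's dir produce a bare file list, then process it in chunks.
cmd /c 'dir /b /a-d "D:\folder\bignastyfolder" > C:\temp\filelist.txt'
Get-Content 'C:\temp\filelist.txt' -ReadCount 1000 | ForEach-Object {
    # With -ReadCount, each $_ is an array of up to 1000 file names.
    foreach ($name in $_) {
        Remove-Item (Join-Path 'D:\folder\bignastyfolder' $name)
    }
}
```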
Chris Dent (PowerShell Developer) Commented:

I expect it will take a little while to create the files, so I'll refrain from wondering too much about the advantages dir might have ;)

Out of interest, PowerShell 1 or 2?

Chris
omnipower321 (Author) Commented:
Ha, I appreciate the effort! Using v2, on Server 2003. I am testing the simplified script against the directory through the admin share from my workstation now, to rule out any strangeness with PowerShell on the server.
Chris Dent (PowerShell Developer) Commented:

Cool, good to know.

Just waiting for this to complete...

1..1900000 | %{ $_ > "BobsFile$_.txt" }

Hopefully it'll be close enough to reproduce the problem you're suffering from.

Chris
omnipower321 (Author) Commented:
Explorer starts to implode somewhere around 300k, in case you have issues :)
Chris Dent (PowerShell Developer) Commented:

Yeah, I was avoiding opening Explorer completely :)

Chris
omnipower321 (Author) Commented:
So this is fun: the script is working from a remote system. Very slowly, but working. Powershell.exe is slowly claiming all of my laptop's RAM. I may end up running this first round remotely from a system with a little more punch, but strange that it did not work locally.
Chris Dent (PowerShell Developer) Commented:

I finally got enough files, but I can't recreate the problem unfortunately. It's slow, but I rather expected that.

Chris
omnipower321 (Author) Commented:
Thanks again, Chris. It appears to have been an issue with the server I was running it from. I ran the modified script remotely and it completed in about 9 hours. Now that the directory is a manageable size, I can include the path in the daily task. It is still unknown why it did not work locally on the server, but I will be reinstalling PowerShell to see if that helps. Thanks!