
PowerShell stream/one-liner to delete 3 million files

I have a directory with 3 million subdirectories, and I want to delete all of them.
I would like a PowerShell one-liner along the lines of:
  dir | remove-item | tee output to log?
If there were not so many files, I would just use Windows Explorer to highlight and delete them, but because there are so many subdirectories the folder won't even open in Explorer. Right now I am using "remove-item *", but it does not show any progress, so I don't know whether it is hanging. I also tried using dir to pipe the names of the subdirectories to a file, reading the file into a PowerShell variable with Get-Content, and then looping through the names, but it is taking too long; at the current rate it will probably take more than a day to complete. Any suggestions on the best way to delete these quickly and monitor the progress? I am thinking a one-liner would be the most efficient approach, since there would be no intermediate variables to load.
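For reference, one way to get close to that pseudocode (a sketch only, not from the thread; it assumes PowerShell 3.0 or later for the -Directory switch, and C:\temp\deleted.log is just a placeholder log path) would be:

  # Delete each top-level subdirectory and tee its name to the console and a log as it goes
  Get-ChildItem -Directory |
    ForEach-Object { Remove-Item -LiteralPath $_.FullName -Recurse -Force; $_.FullName } |
    Tee-Object -FilePath C:\temp\deleted.log

Each name is emitted only after its Remove-Item call returns, so the log doubles as a rough progress indicator.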
 
ee3 (Author) commented:
For example, I was using this in a script to do the deleting, but it was running too slowly:

# Read the list of subdirectory names (one per line) into memory
$con = Get-Content $inputfile
# Delete each listed subdirectory on the remote share, one at a time
foreach ( $myfile in $con ) {
  Remove-Item "\\${mybox}\$directory\$myfile"
}

In the above, $inputfile is the result of a "dir" in a cmd window.
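One likely reason the loop above is slow is that every name is loaded into $con first and each deletion is a separate call against the UNC path. A streaming variant (a sketch only, reusing the same $inputfile, $mybox and $directory values; the per-item network round trips will still dominate the runtime) avoids the intermediate variable:

  # Stream names straight from the listing file into Remove-Item;
  # -Verbose prints a line per deletion so progress is visible
  Get-Content $inputfile |
    ForEach-Object { "\\$mybox\$directory\$_" } |
    Remove-Item -Recurse -Force -Verbose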
 
Steve Knight (IT Consultancy) commented:
Any reason why it has to be PowerShell? Anything wrong with just:

rd \\server\share\somedir /s/q

If you are doing it over a network connection, as in the example above, it will be MUCH, MUCH slower than doing it natively on the box, of course. If you can run it on the box itself, through an RDP session or whatever, I can't imagine an rd taking more than minutes, or hours at most.

Interested to see how PowerShell handles this in the other replies; I haven't used it much myself yet.

Steve
 
ee3 (Author) commented:
Hi Steve - thank you for that suggestion. I am trying "rd . /s /q". It seems to be working, but I don't have a way to measure the progress. That is why I wanted to use PS with a tee, to output the progress to a log. Without a progress monitor it is hard to say whether it is better, worse, or the same.
 
senad commented:
Fire up a command prompt (I hope you know how to use one... otherwise there's no hope for you) and get to the dir with the files you want to delete. Then a good old-fashioned "deltree /y *.*".
:-)
 
Leon Fester commented:
You can use forfiles to manage the deletion of files. It's a really nice utility from the Windows resource kit.

But in this scenario I'd like to suggest another tool... Robocopy.
Why? Because of its excellent logging abilities.
The easiest way to do this is to mirror a blank folder onto your folder with 3 million files.
I know it sounds strange, but it does work, and I've used it in a similar scenario as well:
robocopy c:\blank\ c:\full /mir /Log+:c:\deleted_files_output.txt
The log file can be opened while the command is running, so depending on what you want to see, you can monitor the progress from there.
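Since the asker is already working in PowerShell, a small wrapper around that idea (a sketch only; C:\blank, C:\full and the log path are placeholders for your own paths) could look like:

  # Create an empty folder, then mirror it onto the full one, which makes
  # robocopy delete everything that does not exist in the empty source
  $blank = 'C:\blank'
  $full  = 'C:\full'
  $log   = 'C:\deleted_files_output.txt'
  New-Item -ItemType Directory -Path $blank -Force | Out-Null
  robocopy $blank $full /MIR "/LOG+:$log" /NP

Running Get-Content $log -Wait in a second console gives a rolling view of the log as robocopy appends to it.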
 
Steve Knight (IT Consultancy) commented:
That's a nice way of doing it... I can't see how any of these would necessarily be faster than simply doing a plain rd, though. Frankly, you can just run this periodically in another cmd.exe window:

dir /b /ad /s | find /v /c "#####"

which returns the subdirectory count (find /v /c "#####" simply counts every line of the dir output, since no directory name will contain "#####"). This takes a while the first time, but caching makes it run MUCH faster on subsequent runs.

Steve
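A PowerShell equivalent of that progress check (a sketch; it uses .NET directory enumeration, which needs .NET 4 or later, and C:\full stands in for the folder being emptied) might be:

  # Count the remaining subdirectories without holding them all in memory
  $opt = [System.IO.SearchOption]::AllDirectories
  $count = 0
  foreach ($d in [System.IO.Directory]::EnumerateDirectories('C:\full', '*', $opt)) { $count++ }
  $count

(Get-ChildItem -Directory -Recurse | Measure-Object).Count does the same job, but it creates an object per directory, which adds up at this scale.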
 
ee3 (Author) commented:
Thank you Steve and dvt_localboy - I ran the "rd . /s /q" command last night and it deleted all of the files; I am not sure how long it took. This morning I am trying robocopy as you suggested. It is working well to log the deletions, and it runs fine from PowerShell. Unfortunately it has taken 75 minutes to delete 284k dirs, so at this rate it will take until tomorrow to finish. In any case, I am very glad to see the logging. I am trying to learn PowerShell, so I was hoping for a native PS cmdlet stream to do the work - especially to see how it would handle the tee part.
 
ee3 (Author) commented:
Overall A grade. It would be nice if there were also a native PowerShell stream one-liner that used a tee, but the logging capability in robocopy is good too.
 
Steve Knight (IT Consultancy) commented:
Sorry I couldn't help with the PS side. I know you can do it, but to my knowledge it effectively involves a loop. Depending upon your structure, i.e. how many top-level dirs there are, you could always go halfway: call an rd for each first-level dir, and echo %time% to a log file and the console window, e.g.

@echo off
setlocal enabledelayedexpansion
rem !time! is used inside the loop so each pass logs the current time
set startdir=C:\topleveldir
cd /d "%startdir%"
(
  echo Started at %date% %time%
  for /d %%a in (.) do (
    echo Starting on %%a at !time!
    title Working on %%a at !time!
    rd /s /q "%startdir%\%%a"
  )
) > logfile.txt

 
Steve Knight (IT Consultancy) commented:
Sorry, the . should be a * in the for command.
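For the native PowerShell version the asker was after, a rough equivalent of that per-first-level-directory loop (a sketch only; C:\topleveldir and C:\rd_progress.log are placeholders, and it assumes PowerShell 3.0 or later) might be:

  # Remove each first-level subdirectory and tee a timestamped line per directory,
  # mirroring the echo %time% idea from the batch version
  Get-ChildItem 'C:\topleveldir' -Directory |
    ForEach-Object {
      Remove-Item -LiteralPath $_.FullName -Recurse -Force
      '{0:HH:mm:ss}  removed {1}' -f (Get-Date), $_.FullName
    } |
    Tee-Object -FilePath 'C:\rd_progress.log'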
