Get the last file changed in a set of known folders (in a text file)

SquigglyMonkey asked:
I have a previous question here (link).
I am after the same functionality, but reading a list of folders from a text file instead.
This is the script from the other question:
# For each subfolder of D:\Shares, find the single most recently written file there.
Get-ChildItem D:\Shares | Where-Object {$_.PSIsContainer} | ForEach-Object {
	Write-Host "Processing '$($_.FullName)'"
	Get-ChildItem $_.FullName -Recurse | Where-Object {-not $_.PSIsContainer} |
		Sort-Object LastWriteTime -Descending |
		Select-Object FullName, Name, DirectoryName, LastWriteTime -First 1
} | Export-Csv -NoTypeInformation -Path C:\Temp\LatestFiles.csv


oBdA

Commented:
In the other question, you wanted the latest file per subfolder under the initial search root.
Do you now want ...
... to replace the search root with the file list (that is, do you want the one single latest file from all folders in the list)
or
... multiple latest files as before, that is, the latest one from inside each folder in the list?
And recursive, or just the immediate children of the folders listed?

Author

Commented:
"multiple latest files as before, that is, the latest one from inside each folder in the list?
And recursive or just immediate children of the folders listed?"
This sounds right.
I have the same situation: a single folder at the root of a drive that contains 1,000+ folders. I was able to take care of 960 of them; there are 40 left to deal with. I have the list of those folder names in a text file.

Author

Commented:
A little "LOL"... the oldest file so far is from 1994.

Michael B. Smith, Managing Consultant

Commented:
As I understood the request.

The only real issue I see with this is that if you have multiple files with the same write time (as is common when software is installed), only the first one gets reported.

# Read the folder list, then track the single newest file seen across all of them.
$folderList = Get-Content <filename>
$newest = $null
foreach( $folder in $folderList )
{
	$files = Get-ChildItem $folder -Recurse | Where-Object { $_.PSIsContainer -eq $false }
	$files | ForEach-Object { if( $_.LastWriteTime -gt $newest.LastWriteTime ) { $newest = $_ } }
}

$newest
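For illustration (a sketch, not part of the original answer): if you also want the ties mentioned above, you can keep every file that shares the newest timestamp instead of only the first one encountered. The <filename> placeholder is the same as in the script above.

$folderList = Get-Content <filename>
$newest = @()
foreach( $folder in $folderList )
{
	Get-ChildItem $folder -Recurse | Where-Object { -not $_.PSIsContainer } | ForEach-Object {
		if( -not $newest -or $_.LastWriteTime -gt $newest[0].LastWriteTime )
		{
			$newest = @($_)      # strictly newer: start over with just this file
		}
		elseif( $_.LastWriteTime -eq $newest[0].LastWriteTime )
		{
			$newest += $_        # same timestamp: keep the tie as well
		}
	}
}
$newest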


Author

Commented:
Thanks Michael, but that doesn't work for me. It just goes through all the folders, which takes about 6 hours.
I am having trouble putting what I need into words, so I'll try this.
This company has a file server; on one drive (D:) is a folder (Shares), which is shared to all authenticated users.
Under this folder, over the past 20 years, business entities would request a folder be created and groups or users granted access to it.
There are over 1,000 of these folders, inside of which can be files, or more folders with files, and so on.
I am trying to ascertain the last time anything was written to any file in each of the 1,000+ root folders.
It does not matter when that last date is, or whether it is the same for multiple files. I just need the date of the last (latest) change.
With the previous script, I was able to ascertain that for over 900 of the folders.
Running the script against the whole "share" takes 6+ hours, since there is about 3/4 of a petabyte of data.
I now have a list of the specific root folder names (d:\share\foldername) which the script could not run against, for reasons I have since cleared up.
That's why I just want the previous script to run against a set of known folder names.
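As a rough sanity check before a full run (a sketch; the folder path is hypothetical), Measure-Command can time the scan of a single listed folder to estimate the per-folder cost:

Measure-Command {
	Get-ChildItem 'D:\Shares\FolderName' -Recurse |     # hypothetical folder from the list
		Where-Object { -not $_.PSIsContainer }
} | Select-Object -ExpandProperty TotalSeconds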
oBdA
Commented:
If you'd mentioned the amount of data earlier, I wouldn't have used Sort-Object; it's easier to understand, but harder on system resources.
This should do the trick:
# Stream through each listed folder's files and keep only the newest one (no sorting needed).
Get-Content -Path C:\Temp\FolderList.txt | ForEach-Object {
	Write-Host "Processing '$($_)'"
	Get-ChildItem -Path $_ -Recurse |
	Where-Object {-not $_.PSIsContainer} |
	ForEach-Object `
		-Begin {$latest = $null} `
		-Process {If ($_.LastWriteTime -gt $latest.LastWriteTime) {$latest = $_}} `
		-End {$latest} |
	Select-Object FullName, Name, DirectoryName, LastWriteTime
} | Export-Csv -NoTypeInformation -Path C:\Temp\LatestFiles.csv


Michael B. Smith, Managing Consultant

Commented:
@Squiggly - mine only runs against the folders listed in <filename>.

@oBdA - That does exactly what mine does, except for the Export-Csv. (And it uses pipes instead of variables, but whatever.) It'll have the same performance profile.

Author

Commented:
Thank you, this works well now.
oBdA

Commented:
@Michael - nope, it does not do the same. Yours will retrieve one single file (that is, the overall latest) from all folders in the list.
SquigglyMonkey is looking for the latest file per folder listed.
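For illustration, the difference comes down to where the tracking variable is reset; a minimal sketch (reusing the <filename> placeholder) of the earlier loop reworked to emit one result per listed folder:

$folderList = Get-Content <filename>
foreach( $folder in $folderList )
{
	$newest = $null      # reset per folder, so each listed folder reports its own latest file
	Get-ChildItem $folder -Recurse | Where-Object { -not $_.PSIsContainer } |
		ForEach-Object { if( $_.LastWriteTime -gt $newest.LastWriteTime ) { $newest = $_ } }
	$newest              # emit this folder's result before moving on
}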

Author

Commented:
Michael, I had a list of folder names including the path, and when I ran your code, it just started processing all the folders. I could see it doing so because some specific files are encrypted, so I don't have access to them and they throw errors; those files' folders are not in the list.
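For what it's worth, errors like those from unreadable (e.g. encrypted) files can be silenced and collected for later review with the standard -ErrorAction and -ErrorVariable parameters; a sketch, using the same list path as the accepted script:

$scanErrors = @()
Get-Content -Path C:\Temp\FolderList.txt | ForEach-Object {
	# SilentlyContinue hides per-item errors; +scanErrors appends them to the variable.
	Get-ChildItem -Path $_ -Recurse -ErrorAction SilentlyContinue -ErrorVariable +scanErrors |
		Where-Object { -not $_.PSIsContainer } | Out-Null
}
# After the run, list everything that could not be read:
$scanErrors | ForEach-Object { $_.TargetObject }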
Michael B. Smith, Managing Consultant

Commented:
@oBdA - got it, ok.
