Solved

Batch file to get full path, last access and last modify dates.

Posted on 2012-03-28
43 Comments · 3,263 Views
Last Modified: 2012-04-09
I need to get a list of every file in a directory; 500,000 files is the biggest directory.

I found the batch file I can run, listed below. How would I modify it to show the last access date or the modify date? I don't mind running it twice; I'm just not sure how to change the %%~tI part to do what I need.

%%~tI %%~dpnxI

Also, is it possible to run something like this so that it lists the file name separate from the full path? Like c:\test\temp\               file.txt

I really need to run it both ways: file name separate, and full path all in one.


Thanks,
Question by:REIUSA
43 Comments
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37777073
You can't access those alternate date/time fields with the basic loop variable modifiers.  You would have to use the DIR command with the options that let you select the field to report; see DIR /? for full details.

If you need a different format than the basic DIR output, you could process it with a FOR /F statement, but parsing the DIR output can be a little tricky.

There are also a number of utilities that support this type of thing, but you may not want to leverage a utility, which I can understand.

~bp
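As an editor's cross-check of which timestamp is which, here is a minimal Python sketch (not part of the batch discussion; the `file_times` helper is made up for illustration):

```python
import datetime
import os

def file_times(path):
    """Return (created, accessed, modified) datetimes for one file.

    Caveat: st_ctime is creation time on Windows but metadata-change
    time on Unix, mirroring the DIR /T:C behavior difference.
    """
    st = os.stat(path)
    to_dt = datetime.datetime.fromtimestamp
    return to_dt(st.st_ctime), to_dt(st.st_atime), to_dt(st.st_mtime)
```

On Windows, these three fields correspond to DIR's /T:C, /T:A, and /T:W selections respectively.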
 
LVL 6

Expert Comment

by:HAVARD7979
ID: 37777338
Do you want to list them on screen, print them, or copy them to a file? I'm not sure of the exact output you are looking for, so here are some general listing options:

Display a list of files and subfolders

Syntax
      DIR [pathname(s)] [display_format] [file_attributes] [sorted] [time] [options]
Key
   [pathname] The drive, folder, and/or files to display,
              this can include wildcards:
                 *   Match any characters
                 ?   Match any ONE character

   [display_format]
   /P   Pause after each screen of data.
   /W   Wide List format, sorted horizontally.
   /D   Wide List format, sorted by vertical column.

   [file_attributes] /A:

   /A:D  Folder         /A:-D  NOT Folder
   /A:R  Read-only      /A:-R  NOT Read-only
   /A:H  Hidden         /A:-H  NOT Hidden
   /A:A  Archive        /A:-A  NOT Archive
   /A    Show all files
   Several attributes may be combined e.g. /A:HD-R

   [sorted]   Sorted by /O:

   /O:N   Name                  /O:-N   Name
   /O:S   file Size             /O:-S   file Size
   /O:E   file Extension        /O:-E   file Extension
   /O:D   Date & time           /O:-D   Date & time
   /O:G   Group folders first   /O:-G   Group folders last
   several attributes may be combined e.g. /O:GEN

   [time] /T:  the time field to display & use for sorting

   /T:C   Creation
   /T:A   Last Access
   /T:W   Last Written (default)

   [options]
   /S     include all subfolders.
   /R     Display alternate data streams. (Vista and above)
   /B     Bare format (no heading, file sizes or summary).
   /L     use Lowercase.
   /Q     Display the owner of the file.

   /N     long list format where filenames are on the far right.
   /X     As for /N but with the short filenames included.

   /C     Include thousand separator in file sizes.
   /-C    don't include thousand separator in file sizes.

   /4     Display four-digit years
The switches above may be preset by adding them to an environment variable called DIRCMD.
For example: SET DIRCMD=/O:N /S

Override any preset DIRCMD switches by prefixing the switch with -
For example: DIR *.* /-S

Upper and lower case filenames:
Filenames longer than 8 characters will always display with mixed case, as entered.
Filenames shorter than 8 characters may display in upper or lower case; this can vary from one client to another (registry setting).

To obtain a bare DIR format (no heading or footer info) but retain all the details, pipe the output of DIR into FIND. This assumes that your date separator is /

DIR c:\temp\*.* | FIND "/"

FOR /f "tokens=*" %%G IN ('dir c:\temp\*.* ^| find "/"') DO echo %%G
All file sizes are shown in bytes.

Normally DIR /b will return just the filename; however, when displaying subfolders with DIR /b /s the command will return a full pathname.
 

Author Comment

by:REIUSA
ID: 37778169
I just need to export it to a file like CSV. Ideally it would work best to have the path, the file name, and then either the last access date or the modify date.

From what I have tested, you can't get all of that in one job from the DIR command.

Something like
Path                           File Name          Last access date
 c:\path1\path2\          filename.vob      1/1/2010

I tried using one tool that is made for getting reports on file types and deleting data, but it took a loooooong time to run. A simple DIR command finishes in an hour or so, but I am just trying to refine the output as much as I can.
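The three-column layout sketched above (path, file name, last access date) is straightforward to emit as real CSV. A hedged Python illustration of that shape (the `write_listing` helper is hypothetical, not from this thread; single folder, non-recursive):

```python
import csv
import datetime
import os

def write_listing(folder, out):
    """Emit Path / File Name / Last access date rows as CSV
    for the files directly inside one folder (non-recursive)."""
    writer = csv.writer(out)
    writer.writerow(["Path", "File Name", "Last access date"])
    for entry in sorted(os.scandir(folder), key=lambda e: e.name):
        if entry.is_file():
            accessed = datetime.datetime.fromtimestamp(entry.stat().st_atime)
            writer.writerow([folder, entry.name, accessed.strftime("%m/%d/%Y")])
```

Using the csv module keeps paths containing commas safe, which hand-rolled string joins would not.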
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37778236
How about VBS? That would allow us to get exactly the output you want very easily. I can work something up if you think you could use it.

~bp
 
LVL 6

Expert Comment

by:HAVARD7979
ID: 37778294
VB is going to be the better way to go. Try this in Excel, and then you can save as CSV or XLS.

Paste the below code into an Excel VBA module (ALT+F11, right-click and insert a new module, and paste the code there). Then go back to your main Excel window and hit ALT+F8. This should let you run the macro GetFileList.


Option Explicit

Sub GetFileList()

    Dim strFolder As String
    Dim varFileList As Variant
    Dim FSO As Object, myFile As Object
    Dim myResults As Variant
    Dim l As Long

    ' Get the directory from the user
    With Application.FileDialog(msoFileDialogFolderPicker)
        .Show
        If .SelectedItems.Count = 0 Then Exit Sub 'user cancelled
        strFolder = .SelectedItems(1)
    End With

    ' Get a list of all the files in this directory.
    ' Note that this isn't recursive... although it could be...
    varFileList = fcnGetFileList(strFolder)

    If Not IsArray(varFileList) Then
        MsgBox "No files found.", vbInformation
        Exit Sub
    End If

    ' Now let's get all the details for these files
    ' and place them into an array so it's quick to dump to XL.
    ReDim myResults(0 To UBound(varFileList) + 1, 0 To 5)

    ' Place some headers in the array
    myResults(0, 0) = "Filename"
    myResults(0, 1) = "Size"
    myResults(0, 2) = "Created"
    myResults(0, 3) = "Modified"
    myResults(0, 4) = "Accessed"
    myResults(0, 5) = "Full path"

    Set FSO = CreateObject("Scripting.FileSystemObject")

    ' Loop through our files (Dir$ returned bare names, so prepend
    ' the folder before handing them to GetFile)
    For l = 0 To UBound(varFileList)
        Set myFile = FSO.GetFile(strFolder & "\" & CStr(varFileList(l)))
        myResults(l + 1, 0) = CStr(varFileList(l))
        myResults(l + 1, 1) = myFile.Size
        myResults(l + 1, 2) = myFile.DateCreated
        myResults(l + 1, 3) = myFile.DateLastModified
        myResults(l + 1, 4) = myFile.DateLastAccessed
        myResults(l + 1, 5) = myFile.Path
    Next l

    ' Dump these to a worksheet
    fcnDumpToWorksheet myResults

    ' Tidy up
    Set myFile = Nothing
    Set FSO = Nothing

End Sub

Private Function fcnGetFileList(ByVal strPath As String, Optional strFilter As String) As Variant
    ' Returns a one-dimensional array of filenames,
    ' otherwise returns False

    Dim f As String
    Dim i As Integer
    Dim FileList() As String

    If strFilter = "" Then strFilter = "*.*"

    Select Case Right$(strPath, 1)
        Case "\", "/"
            strPath = Left$(strPath, Len(strPath) - 1)
    End Select

    ReDim Preserve FileList(0)

    f = Dir$(strPath & "\" & strFilter)
    Do While Len(f) > 0
        ReDim Preserve FileList(i) As String
        FileList(i) = f
        i = i + 1
        f = Dir$()
    Loop

    If FileList(0) <> Empty Then
        fcnGetFileList = FileList
    Else
        fcnGetFileList = False
    End If
End Function
Private Sub fcnDumpToWorksheet(varData As Variant, Optional mySh As Worksheet)

    Dim iSheetsInNew As Integer
    Dim sh As Worksheet, wb As Workbook
    Dim l As Long, NoOfRows As Long

    If mySh Is Nothing Then

        ' Make a workbook if we didn't get a worksheet
        iSheetsInNew = Application.SheetsInNewWorkbook
        Application.SheetsInNewWorkbook = 1
        Set wb = Application.Workbooks.Add
        Application.SheetsInNewWorkbook = iSheetsInNew
        Set sh = wb.Sheets(1)

    Else

        ' Write to the worksheet we were given
        Set sh = mySh

    End If

    With sh
        ' Qualify Range with the target sheet so this works even
        ' when sh is not the active sheet
        .Range(.Cells(1, 1), .Cells(UBound(varData, 1) + 1, UBound(varData, 2) + 1)) = varData
        .UsedRange.Columns.AutoFit
    End With

    Set sh = Nothing
    Set wb = Nothing

End Sub
 
LVL 6

Expert Comment

by:HAVARD7979
ID: 37778619
http://www.softpedia.com/get/System/Hard-Disk-Utils/STG-FolderPrint-Plus.shtml

You might want to look at this. It has a 30-day trial and costs only $22 to buy. It does the listing you want and will export to CSV or XLS.
 

Author Comment

by:REIUSA
ID: 37778879
Hello,
@billprew
VB would be great. If you could help me get something going that I could use, it would be greatly appreciated. I can Frankenstein scripting together, but not so much from scratch.

I'll take a look at the STG app, but I probably won't be able to get it installed on a server on short notice.

I like the Excel idea, but the servers I have access to on the same network don't have Excel, and it's like pulling teeth in a lion's mouth getting it installed on anything.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37779075
Sure, away from my computer (on mobile) right now but will provide a VBS later today.

~bp
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37780184
Here's a first pass at a VBS script that should do what you want. Save it as a .vbs file and run it like this:

cscript //nologo EE27652087.vbs c:\temp >listing.csv
' Set up filesystem object for usage
Set objFSO = CreateObject("Scripting.FileSystemObject")

' Get folder name to list off the command line, make sure it's valid
If (WScript.Arguments.Count > 0) Then
    strFolder = Wscript.Arguments(0)
    If Right(strFolder, 1) = "\" Then strFolder = Left(strFolder, Len(strFolder)-1)
    If Not objFSO.FolderExists(strFolder & "\") Then
        WScript.Echo "Specified folder does not exist."
        WScript.Quit
    End If
Else
    WScript.Echo "No folder name specified to list."
    WScript.Quit
End If

' Get access to the folder we want to list files in
Set objFolder = objFSO.GetFolder(strFolder)

' Header line
Wscript.Echo Quote("Path") & "," & Quote("File Name") & "," & Quote("Last Accessed")

' List files
For Each objFile In objFolder.Files
    Wscript.Echo Quote(objFile.ParentFolder) & "," & Quote(objFile.Name) & "," & Quote(objFile.DateLastAccessed)
    ' DateCreated, DateLastAccessed, DateLastModified
Next

' Add surrounding double quotes to a string
Function Quote(s)
   Quote = Chr(34) & s & Chr(34)
End Function


~bp
 

Author Comment

by:REIUSA
ID: 37784340
billprew,
Thank you so much. I have a few questions though.

The format looks great, but is it possible to add the modify date as well as the last access date? Would I just add some more code for objFile.DateModified? I'm not sure if that is correct or not.

When I run the command on a test folder it only outputs the files in the folder but nothing in the subfolders. The folder in question is a mapped drive E:, so I used the command...

cscript //nologo EE27652087.vbs E:\ >listing.csv

I tried adding *.* and * after the E:\ but it just outputs an error that the folder can't be found. If I just run it with E:\ it outputs the files, about 100 for what I am testing it on, but there are several more folders and subfolders.

Thanks,
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37785558
Okay, this will drill into all subfolders, and add the other date you mentioned.
' Set up filesystem object for usage
Set objFSO = CreateObject("Scripting.FileSystemObject")

' Get folder name to list off the command line, make sure it's valid
If (WScript.Arguments.Count > 0) Then
    strFolder = Wscript.Arguments(0)
    If Right(strFolder, 1) = "\" Then strFolder = Left(strFolder, Len(strFolder)-1)
    If Not objFSO.FolderExists(strFolder & "\") Then
        WScript.Echo "Specified folder does not exist."
        WScript.Quit
    End If
Else
    WScript.Echo "No folder name specified to list."
    WScript.Quit
End If

' Get access to the folder we want to list files in
Set objFolder = objFSO.GetFolder(strFolder)

' Header line (write it before recursing so it lands first in the CSV)
Wscript.Echo Quote("Path") & "," & Quote("File Name") & "," & Quote("Last Accessed") & "," & Quote("Last Modified")

' List files, recursing into subfolders
FindFiles objFolder

Sub FindFiles(objFolder)
   ' List files
   For Each objFile In objFolder.Files
       Wscript.Echo Quote(objFile.ParentFolder) & "," & Quote(objFile.Name) & "," & Quote(objFile.DateLastAccessed) & "," & Quote(objFile.DateLastModified)
       ' DateCreated, DateLastAccessed, DateLastModified
   Next

   ' Recursively drill down into subfolder
   For Each objSubFolder In objFolder.SubFolders
       FindFiles objSubFolder
   Next
End Sub

' Add surrounding double quotes to a string
Function Quote(s)
   Quote = Chr(34) & s & Chr(34)
End Function


~bp
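For comparison, the recursive FindFiles pattern in the script above is exactly what Python's standard-library `os.walk` provides in a single call; a hedged sketch (the `find_files` name is illustrative, not from the thread):

```python
import datetime
import os

def find_files(base):
    """Yield (parent_folder, file_name, last_accessed, last_modified)
    for every file under base, recursing like the FindFiles sub."""
    for root, _dirs, names in os.walk(base):
        for name in names:
            st = os.stat(os.path.join(root, name))
            yield (root, name,
                   datetime.datetime.fromtimestamp(st.st_atime),
                   datetime.datetime.fromtimestamp(st.st_mtime))
```

The generator shape means nothing is held in memory per folder, which matters at the 500,000-file scale discussed here.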
 
LVL 11

Expert Comment

by:paultomasi
ID: 37786253
I think a batch file approach would be too slow on this occasion, so the ball is very much in bill's...

Good luck!
 

Author Comment

by:REIUSA
ID: 37786666
Awesome, thanks. It's running now.

Would I be better off running one instance on one folder at a time, or will it run fine with several instances on about four different folders at the same time?
 

Author Comment

by:REIUSA
ID: 37788135
It's running fine, but I noticed on a few of the folders I am getting an error that the path isn't found, though it still returns data on a lot of the files in the directory. Any idea what causes that? I have full access to the folders.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37791844
Is there anything interesting about the files that are skipped and reported as errors in the folders?

~bp
 

Author Comment

by:REIUSA
ID: 37791856
Not that I can tell; the only difference with the ones that are not completing is that their directories have 300,000 and 500,000 files. The output shows between 50,000 and 160,000 and then nothing else.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37791865
Just for a test, try adding the following as the first line of the script.

On Error Resume Next

~bp
 

Author Comment

by:REIUSA
ID: 37791876
It's running now with that line of code, I'll check it later and see how it goes. Thanks.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37791880
Okay.

~bp
 

Author Comment

by:REIUSA
ID: 37798988
I had two scripts running and one of them finished without showing an error, but it only shows results for about 160,000 of 500,000 files.

Is there a way to exclude files before a certain date? I could cut down on the number of files like that. Or any other ideas why it would only return a portion of the files?
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37799086
Can you run this very small test VBS, changing the folder name to the same one that has the 500,000 files in it? It will run very fast, and I just want to see whether the count of files in the folder matches the 500,000 range or the 160,000 range.

I don't have a folder with 500,000 files in it, so unfortunately no way for me to test here.
strFolder = "C:\Temp\"
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(strFolder)
Wscript.Echo objFolder.Files.Count


~bp
 

Author Comment

by:REIUSA
ID: 37799261
Will do, thanks for your help. I'll be back on the network Wed and will try it out then.
 

Author Comment

by:REIUSA
ID: 37805933
I ran that but it only returned 1. It looks like it isn't looking into the subdirectories. The main folder I am looking at has several subdirectories down the line that contain all the files. If I highlight everything in the directory and look at the properties, after it calculates it says it has 580,149 files.

Something else I need to add is file size; is it possible to have it include file size along with the other data?
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37807520
Okay, can you try running this against the folder with the 500,000 files in it and let's see what it says.
On Error Resume Next
strFolder = "C:\temp\"
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(strFolder)
iCount = 0
iFiles = 0
CountAll objFolder
Wscript.Echo "iCount=" & iCount & ", iFiles=" & iFiles

Sub CountAll(objFolder)
   iCount = iCount + objFolder.Files.Count
   For Each objFile In objFolder.Files
      iFiles = iFiles + 1
   Next
   For Each objSubFolder In objFolder.SubFolders
      CountAll objSubFolder
   Next
End Sub


~bp
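The point of CountAll above is that a folder object's own file count covers only that folder; the recursion is what adds the subfolders. The same two numbers can be sketched in Python as a cross-check (the `count_files` name is illustrative):

```python
import os

def count_files(base):
    """Return (files directly in base, files in the whole tree),
    the distinction between Files.Count and the recursive CountAll."""
    direct = sum(1 for entry in os.scandir(base) if entry.is_file())
    total = sum(len(names) for _root, _dirs, names in os.walk(base))
    return direct, total
```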
 

Author Comment

by:REIUSA
ID: 37808488
The results say "iCount=148198, iFiles=148158". I highlighted all the folders in the directory and looked at the properties again and it says 584,000 files and 89,000 folders.

Directly under the first level folder there are 19 sub folders so I might be able to run it on each one individually with a date restriction and files sizes listed.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37809177
Hmmm, that doesn't make sense to me.  I think there are actually only 148,158 files in the folders.

Could it be ZIP files? Windows will automatically expose all the files inside those and count them separately, but the VBS will only see the single ZIP file.

Run the following command from a command prompt on the folder involved and see if any files show up:

dir /b /a-d /s c:\temp\*.zip

~bp
 

Author Comment

by:REIUSA
ID: 37810776
Ran that command and it is showing several .zip files.

I ran a test on the local server with a zip file and some folders, and when I right-click it shows just one file for the zip file. But the files I am actually working with are on a NetApp share, so it might be different.

What's weird is I ran a DIR command before and output it to text, and I think it showed all the files, since there were over a million lines in the text file, but the format of the output didn't work very well.

Is it possible there is something related to VB timing out or running out of memory so it can't continue?
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37810815
I don't think it's a VBS problem.

Here's an easy way to get the file count using the DIR command. At a DOS prompt run this (changing it to the name of your base directory) and see what it says.

dir /b /a-d /s c:\temp|find /c /v ""

Quite a little puzzle we have here...

~bp
 

Author Comment

by:REIUSA
ID: 37811406
Running it now, thanks again for all your help.
 

Author Comment

by:REIUSA
ID: 37812057
It showed several paths, I saw several "path is too long" messages, and then at the very bottom it says 584830.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37813099
Interesting, so it appears that the DIR command saw significantly more files than the VBS script.  I haven't seen this before, but then I also haven't messed with this many files before, or your network config.

Clearly there are some files that are not being seen by the VBS, but I'm not sure how you want to proceed at this point.  There are a number of reasons this could be occurring: permissions, long paths, etc.  One approach to figuring out which files are being skipped, and maybe the cause, would be to take a list of the files from a DOS DIR command similar to the one above and compare it to the VBS listing of the same files.  Since so many are missing, I suspect whole directories of files are being skipped.  We could then sort and compare the two output listings to try to get a clue about what was skipped.

Another approach would be to rework the VBS to try and add more error trapping, in the hopes that an error condition is actually happening, and report on it.

Another option is to use some other tool to produce the listing you need, perhaps either Powershell, or a third party utility.

Thoughts?

~bp
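The first idea, sorting and comparing the two listings, amounts to a set difference. A hedged Python sketch, assuming each listing file holds one full path per line (the `missing_paths` helper and file layout are assumptions, not from the thread):

```python
def missing_paths(dir_listing, vbs_listing):
    """Return paths present in the DIR output file but absent from the
    VBS output file; a whole skipped directory shows up as a run of
    paths sharing one prefix."""
    with open(dir_listing, encoding="utf-8") as f:
        from_dir = {line.strip().lower() for line in f if line.strip()}
    with open(vbs_listing, encoding="utf-8") as f:
        from_vbs = {line.strip().lower() for line in f if line.strip()}
    return sorted(from_dir - from_vbs)
```

Lower-casing both sides avoids false mismatches from Windows' case-insensitive paths.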
 

Author Comment

by:REIUSA
ID: 37813176
Hello,
The only thing I can think of is a resource issue on the box I am running it on, or something on the network or NetApp that is stopping the data flow, like it's generating too much traffic in a short time.

Would it be possible to make the script add the file size and limit the access date? I might be able to just run it for everything accessed from 12/31/2010 until now. That might limit it enough to run right, or I can run it on the subdirectories one at a time if needed. Thanks.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37813219
Yes, I'll work that up.

~bp
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37814120
Okay, I think I'm making some headway on this, discovered something new, will work a bit more on it tonight...

~bp
 
LVL 51

Accepted Solution

by:
Bill Prew earned 500 total points
ID: 37814567
Okay, here's a new VBS script with some additional error checking and logging, as well as the file size added, and logic to show only files accessed after the date specified near the top.  Also, please run it with the slightly different command line below, and if there is anything in the errors.txt file, please upload it here.

cscript //nologo EE27652087.vbs c:\temp >listing.csv 2>errors.txt
' Define the cutoff date; only files accessed after this are listed.
' DateSerial is used because VBScript has no #date# literals.
datCutoff = DateSerial(2010, 12, 31)

' Set up filesystem object for usage
Set objFSO = CreateObject("Scripting.FileSystemObject")

' Get folder name to list off the command line, make sure it's valid
If (WScript.Arguments.Count > 0) Then
    strFolder = Wscript.Arguments(0)
    If Right(strFolder, 1) = "\" Then strFolder = Left(strFolder, Len(strFolder)-1)
    If Not objFSO.FolderExists(strFolder & "\") Then
        WScript.StdErr.WriteLine "Specified folder does not exist."
        WScript.Quit
    End If
Else
    WScript.StdErr.WriteLine "No folder name specified to list."
    WScript.Quit
End If

' Get access to the folder we want to list files in
Set objFolder = objFSO.GetFolder(strFolder)

' Header line
WScript.StdOut.WriteLine Quote("Path") & "," & Quote("File Name") & "," & Quote("File Size") & "," & Quote("Last Accessed")  & "," & Quote("Last Modified")

' Look for files
FindFiles objFolder


Sub FindFiles(objFolder)
   On Error Resume Next

   ' List files
   For Each objFile In objFolder.Files
       On Error Resume Next
       If Err.Number <> 0 Then ShowError "FindFiles:01", objFolder.Path
       If objFile.DateLastAccessed > datCutoff Then
           On Error Resume Next
           WScript.StdOut.WriteLine Quote(objFile.ParentFolder) & "," & Quote(objFile.Name) & "," & objFile.Size & "," & Quote(objFile.DateLastAccessed) & "," & Quote(objFile.DateLastModified)
           If Err.Number <> 0 Then ShowError "FindFiles:02", objFile.Path
       End If
   Next

   If Err.Number = 0 Then
       ' Recursively drill down into subfolder
       For Each objSubFolder In objFolder.SubFolders
           On Error Resume Next
           If Err.Number <> 0 Then ShowError "FindFiles:04", objFolder.Path
           FindFiles objSubFolder
           If Err.Number <> 0 Then ShowError "FindFiles:05", objSubFolder.Path
       Next
   Else
       ShowError "FindFiles:03", objFolder.Path
   End If
End Sub

' Add surrounding double quotes to a string
Function Quote(s)
   Quote = Chr(34) & s & Chr(34)
End Function

Sub ShowError(strLocation, strMessage)
   WScript.StdErr.WriteLine "==> ERROR at [" & strLocation & "]"
   WScript.StdErr.WriteLine "    Number:[" & Err.Number & "], Source:[" & Err.Source & "], Desc:[" &  Err.Description & "]"
   WScript.StdErr.WriteLine "    " & strMessage
   Err.Clear
End Sub


~bp
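The two additions in the accepted script, a cutoff date and per-file error logging to stderr, can be mirrored in a short Python sketch for comparison (the cutoff value is the one from the script; the `list_recent` name is illustrative):

```python
import csv
import datetime
import os
import sys

CUTOFF = datetime.datetime(2010, 12, 31)  # same cutoff as the VBS script

def list_recent(base, out=sys.stdout, err=sys.stderr):
    """Write CSV rows for files accessed after CUTOFF; per-file errors
    (permissions, over-long paths) go to err instead of killing the run."""
    writer = csv.writer(out)
    writer.writerow(["Path", "File Name", "File Size",
                     "Last Accessed", "Last Modified"])
    for root, _dirs, names in os.walk(base,
                                      onerror=lambda e: print(e, file=err)):
        for name in names:
            try:
                st = os.stat(os.path.join(root, name))
            except OSError as exc:
                print("==> ERROR:", exc, file=err)
                continue
            accessed = datetime.datetime.fromtimestamp(st.st_atime)
            if accessed > CUTOFF:
                writer.writerow([root, name, st.st_size, accessed,
                                 datetime.datetime.fromtimestamp(st.st_mtime)])
```

Redirecting out and err separately mirrors the `>listing.csv 2>errors.txt` command line above.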
 
LVL 11

Expert Comment

by:paultomasi
ID: 37815519
REIUSA

I have been watching this question from the start. billprew is doing a great job; however, I have a few questions of my own.
 
1) Just out of curiosity, why do you have 500,000 or so files in a single folder?

2) Would it not be better to group the files into separate subfolders?

Accessing so many files (especially in one folder) is really slow.

3) Rather than struggle with so many files, is it possible to group the files into separate folders, limiting the number of files in each folder?

Whatever information you require can still be obtained no matter how your files are organised.

More logically organised files will speed up processing immensely.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37815710
@REIUSA,

Not all of the files are in a single folder; I thought that at first as well.  But a few comments along the way, like "The main folder I am looking at has several sub directories down the line that contain all the files.", have helped me understand that they are actually located in a tree under a single base folder, not all in one folder.

~bp
 

Author Comment

by:REIUSA
ID: 37815849
Right, something like 40k folders :)

Thanks I'll run it now.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37815871
Whoops, my last post (37815710) should have been addressed to Paul, not REIUSA, my mistake...

~bp
 
LVL 11

Expert Comment

by:paultomasi
ID: 37815973
bill
He states:
...500,000 is the biggest directory...
The main folder... has several sub directories... it says it has 580,149 files
...iCount=148198, iFiles=148158...
...584,000 files and 89,000 folders...
...under the first level folder there are 19 sub folders...
...then at the very bottom it says 584830...
It may be prudent to remind ourselves that 'DOS' is an acronym for 'Disk Operating System', which is therefore more suited to dealing with this issue than VB is; hence he states:
...a DIR command... output it to text... it showed all the files...
It seems his biggest gripe may be:
...text file... format of the output...
As far as I can see, he requires CSV-formatted output to a file, something we are very used to doing and have seen a million times over. DOS is more than capable of providing a solution here. If it's just a question of speed, then that's something he may have to live with as a consequence of processing a large volume of files (unfortunately, not even VB can alter the laws of physics!).

There is nothing to be gained here by not using DOS. I say keep it simple for now and add any bells and whistles once it's up and running.

Oddly enough, even after reading through the entire thread, I can't see anywhere where he states the drive-letter his files are on. Nor has it been stated what type of files they are - for all we know, it could be a complete system drive taken from another machine (unless of course I overlooked this information somewhere).

I say, either use a single-pass approach using FOR /R with DIR /TA and DIR /TW, or a two-pass approach using FOR /R for the first pass, followed by FOR /F with DIR /TA and DIR /TW for the second pass.

Let the first pass include:

    filename.extension
    filesize
    filepath
    date and time created

Let the second pass include:

    date and time last modified
    date and time last accessed

The first pass could be something as simple as:

    (@for /r E:\ %a in (*) do @echo %~nxa, %~dpa, %~za, %~ta)>output.txt

where 'E:\' is the drive-letter and path of his files.

BTW, it would be helpful if he confirmed his system's DATE and TIME formats as well.
 

Author Closing Comment

by:REIUSA
ID: 37823657
I ran it again and it looks like it completed fine, excluding the older files. The error log showed similar errors about path size, but those paths showed up in the CSV file.

I'll try to remove the date restriction part of the code and run it again and see if it completes with all the files.

@Paul
One reason was so I didn't have to run it twice on all the folders, but thanks for posting that info. I would like to try that out too; it may help for future projects where file info is needed.
 
LVL 11

Expert Comment

by:paultomasi
ID: 37825138
The following command failed in DOS (giving the error message directly beneath it).

    for /r d:\ %%a in (*) do echo %%a>>file
    Not enough storage is available to process this command.
    Out of memory.

However, I was able to redirect 'DIR /A-D /B /S' to a file. This was tested on a secondary drive containing nearly 770,000 files (one folder alone contained over 400,000 files). Taking just a tad over 4 minutes to complete, it resulted in a 44,032,503-byte (approx. 42 MB) text file.

From here on, I was unable to process the file further. Here are some attempts:
 
    for /f "tokens=* usebackq" %%a in ("file") do echo %%a
    Not enough storage is available to process this command.
    Out of memory.

    for /f %%a in ('type file') do echo %%a
    (DOS hangs)

    for /f %%a in ('more^<file') do echo %%a
    (DOS hangs)

    for /f "tokens=*" %%a in (file) do echo %%a
    Not enough storage is available to process this command.
    Out of memory.

    for /f "tokens=*" %%a in ('find /v "" file') do echo %%a
    Not enough storage is available to process this command.
    Out of memory.

I have 4GB of physical memory available under 32-bit XP.

I must admit I have not previously attempted to process a text file anywhere near this size and I have not encountered this error before.

Further testing in DOS is warranted.

In view of the above findings, VB seems to be the better choice here!

Well done, billprew.
 
LVL 51

Expert Comment

by:Bill Prew
ID: 37825960
Thank you Paul, for the research and kind words.

~bp
