
Solved

Windows batch script to count top 5 recurrent values for a column in csv file

Posted on 2014-08-17
31
Medium Priority
576 Views
Last Modified: 2014-09-16
Pls unzip the attached using the PIN 1234554321: it's a
sanitized file, but just in case, I've zipped it with a PIN so
that its content is not easily searchable.

I'll need scripts (ideally a Windows batch script, or, if it would
take more than 15 minutes to run as a batch script on my 32-bit
Win XP that runs on an SSD, I can accept PowerShell or even a
Linux shell script, though that brings the hassle of uploading files
to the servers instead of doing it on my laptop) that will process
the unzipped csv file (the number of lines could run from
200,000 to 800,000 & I do this weekly):

a) sort by the "Source IP" (ie column N) as primary key & the
    Reason (ie column D) as secondary key, then count the
    occurrences of the top 5 "Source IP" values: I need to know how
    many times each of the top 5 Source IPs recurs.  The output
    should have 5 lines of IPs with the count for each IP

b) then count the occurrences of the top 25 events for
    each of the top 5 "Source IP", ie for each of the top 5
    recurring Source IPs, what are the top 25 events (or, if there
    are fewer, stop at however many events are available).
    The output: the top 5 IPs, with each IP followed by the
    counts of each of its top 25 events (or whatever is available)

c) sort by the "Destination IP" (ie column Q) as primary key & the
    Reason (ie column D) as secondary key, then count the
    occurrences of the top 5 "Destination IP" values: I need to know how
    many times each of the top 5 Destination IPs recurs.  The output
    should have 5 lines of IPs with the count for each IP

d) then count the occurrences of the top 25 events for
    each of the top 5 "Destination IP", ie for each of the top 5
    recurring Destination IPs, what are the top 25 events (or, if
    there are fewer, stop at however many events are available).
    The output: the top 5 IPs, with each IP followed by the
    counts of each of its top 25 events (or whatever is available)

Leave the sorted output file in the same folder, named
original_filename_sorted.csv, as I would like to browse
through it (see the illustrative sample below).
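Purely for illustration (these IPs and counts are hypothetical, not taken from the attached sample), the output for a) might look something like:

Source IP,Count
10.0.0.1,5321
10.0.0.2,4102
10.0.0.3,2877
10.0.0.4,1964
10.0.0.5,1330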
Question by:sunhux
31 Comments
 

Author Comment

by:sunhux
ID: 40265985
Sorry, there's no PIN: EE doesn't allow a PIN-protected zip to be uploaded
Samples.zip
 

Author Comment

by:sunhux
ID: 40265993
For requests a & c, also include a percentage, derived by dividing the
count of each of the top 5 by the total number of lines
(excluding the header line), times 100% (to 1 decimal place; eg: 88.1%)
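A minimal PowerShell sketch of that percentage calculation (the variable names and values here are placeholders, not taken from any script in this thread):

$count      = 881                                        # count for one of the top 5 IPs (example value)
$totalLines = 1000                                       # data rows, header line excluded (example value)
'{0}%' -f [math]::Round($count / $totalLines * 100, 1)   # gives 88.1%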
 
LVL 35

Expert Comment

by:Dan Craciun
ID: 40266037
Can you please post the code you've tried, to see where you're stuck?

Thank you.
 

Author Comment

by:sunhux
ID: 40266043
I don't have any code or script yet, but I'm exploring the freeware cmsort: http://www.chmaas.handshake.de/delphi/freeware/cmsort/cmsort.htm#download

I can't get the syntax right for /V (the option for CSV files): with /V alone
the separator is a semicolon, but I need a comma.  It's supposed to run on Win XP
 

Author Comment

by:sunhux
ID: 40266044
Also, my csv file doesn't have double quotes enclosing each field,
and I can't get the right syntax to sort my csv without the
double quotes
 

Author Comment

by:sunhux
ID: 40266046
The help output & a couple of csv examples for cmsort :


CMsort version 2.01 - Sort a DOS, WINDOWS, UNIX, MAC, or mixed text file
(c) 2014 Christian Maas // chmaas@handshake.de // www.chmaas.handshake.de
Usage: CMsort [sort key][-] ... [option] ... <input file> <output file>
Sort keys:
/S=F,L string from position F with length L (case sensitive)
/C=F,L string from position F with length L (case insensitive)
/N=F,L interpreted numeric from position F with length L
Notes:
1. The complete line is used as sort key if no sort keys are indicated.
2. L=0 allowed for last non-numeric key to define a key until EOR.
3. An appending minus sign indicates descending sort order.
4. CMsort uses the quicksort algorithm (not a stable sort).
Sort keys used for CSV files (see option /V below)
/SV=I,F,L I-th CSV field string from position F with length L (case sensitive)
/CV=I,F,L I-th CSV field string from position F with length L (case insensitive)
/NV=I,F,L I-th CSV field numeric from position F with length L

Press <Return> to continue or Ctrl-C to break.

Options:
/F=n read/write files with fixed-length records (n byte without CR/LF)
/B ignore blank or empty records
/D ignore records with duplicate keys (according to given sort keys)
/D=<file> ignore records with duplicate keys, write them to <file>
/Q quiet mode (no progress output)
/H=n don't sort n header lines (default: n=0)
/W=n n-way-merge of temporary files (2<=n<=5, default: n=5)
/T=<path> for temporary files (/T=TMP for Windows temporary file path)
/M=n use n KB memory (n >= 1,024 KB = 1 MB; default: 65,536 KB = 64 MB)
/X=n,c exclude lines with char c (as char, $<hex>, #<dec>) in column n
/V or /V=S,Q for sorting CSV files
  If only /V is specified, semicolon is used as separator and the double
  quotation mark as quotation sign. Use /V=S,Q to specify a different
  separator S and/or quotation sign Q. S, Q must be indicated as
  $<hex> or #<decimal> (e.g. default /V=$3B,$22 for semicolon/double
  quotation mark. Use $00 for Q if no quotation sign is used.
  Indicate sort keys as
  /SV=I,F,L or /CV=I,F,L or /NV=I,F,L with:
    I for I-th CSV field, F for "from position", L for length
    L=0 allowed for all keys (auto-aligning of partial keys)
Use CMsort /E to show examples.


================= 2 examples =====================

Example 3: sort a CSV file
We have the following CSV input file:

"Customer No";CustName;OrderDate;Return
1004711;Miller & Co.;1999-12-06;1,207.23
"1004713";"Topsoft";"2000-01-04";"2,521.95"
1004747;MCP & Co.;2000-01-04;7,356.88
1004799;Eftpos;1999-12-06;23,122.56

This is the command line to sort by order date ascending and by return descending without the header line:

cmsort /SV=3,1,0 /NV=4,1,0- /V /H=1 customercsv.txt customercsv.sor

This is the resulting file:

"Customer No";CustName;OrderDate;Return
1004799;Eftpos;1999-12-06;23,122.56
1004711;Miller & Co.;1999-12-06;1,207.23
1004747;MCP & Co.;2000-01-04;7,356.88
"1004713";"Topsoft";"2000-01-04";"2,521.95"

 
 

Example 4: CSV file with empty fields
CSV input file:

Euclid;;Greek
Thales;;Greek
Banach;Stefan;Polish
de Fermat;Pierre;French
Cantor;Georg;German

Command line to sort by first name ascending:

cmsort /V /SV=2,1,0 csvempty.in csvempty.sor

Resulting file:

Thales;;Greek
Euclid;;Greek
Cantor;Georg;German
de Fermat;Pierre;French
Banach;Stefan;Polish
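For reference, if the input were comma-separated with no quotation character (as with my file), the /V option would presumably need to be given explicitly as /V=$2C,$00 ($2C is the hex code for a comma, $00 means no quotation sign). Recasting Example 3 for such a file, as a sketch only (assuming customercsv.txt were saved comma-separated and unquoted):

cmsort /SV=3,1,0 /V=$2C,$00 /H=1 customercsv.txt customercsv.sor

i.e. sort by the 3rd CSV field (the order date) ascending, skipping one header line.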
 

Author Comment

by:sunhux
ID: 40266051
I also tried the example below but got an error:
http://www.commandlinefu.com/commands/view/8070/sort-a-csv-file-according-to-a-particular-n-th-field-numerically-quicker-than-excel

c:\temp\sort -t"," -n -k23 samplem.csv |more
sort -t"," -n -k23 samplem.csv |more
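For reference, if the GNU/gnuwin32 sort route were pursued anyway, the key would normally be restricted to a single field rather than running to the end of the line, and -n (numeric) wouldn't apply to IP strings. A sketch, assuming the Source IP is field 14:

sort -t"," -k14,14 samplem.csv > samplem_sorted.csv

GNU sort has no header-skip option, so the header line would end up sorted in with the data.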
 
LVL 71

Expert Comment

by:Qlemo
ID: 40266104
Doing that in cmd.exe is tedious, but feasible. PowerShell is much better suited. Using a database would be most appropriate.
 

Author Comment

by:sunhux
ID: 40266121
I'll look for a Windows 7 laptop then as it comes with PowerShell.
So what's the command / script?

http://www.commandlinefu.com/commands/view/8070/sort-a-csv-file-according-to-a-particular-n-th-field-numerically-quicker-than-excel
I've just figured out that the link above is for the Linux sort & I downloaded
a gnuwin32 sort but somehow the sorted output is exactly the
same as the original file.
 
LVL 71

Assisted Solution

by:Qlemo
Qlemo earned 2000 total points
ID: 40266131
Sort for a):
  cmsort /V=$2C,$00 /H=1 /CV=14,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_SrcIP.csv
Sort for c):
  cmsort /V=$2C,$00 /H=1 /CV=17,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_DstIP.csv

Don't mess with the *nix sort, not worth it unless you are on Linux anyway.
 
LVL 71

Expert Comment

by:Qlemo
ID: 40266132
 

Author Comment

by:sunhux
ID: 40266163
For a,
  cmsort /V=$2C,$00 /H=1 /CV=14,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_SrcIP.csv
was sorted with the bottom 5 SrcIPs at the top.  Can you change
it so that the top 5 are at the top of the output?  I guess, to
get a count, I'll issue
   find /c "Src_IP" sortedfile.csv

For c,
  cmsort /V=$2C,$00 /H=1 /CV=17,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_DstIP.csv
likewise, I'll need the top 5 Destination IPs at the top, not
at the bottom of the output.  Which of the cmsort
options need to be changed?
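For reference, the find /c idea would look something like this for one specific source IP (the address is only an example):

find /c "192.168.3.6" SampleM_Sorted_SrcIP.csv

Note that find /c counts every line containing the string anywhere, so a count obtained this way could also include rows where that address appears in another column (e.g. as the Destination IP).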
 

Author Comment

by:sunhux
ID: 40266164
I'd still prefer that you provide the counts for b & d, rather than my
resorting to "find /c ..." to count them
 
LVL 71

Assisted Solution

by:Qlemo
Qlemo earned 2000 total points
ID: 40266179
I'll abandon the CMD approach, because getting the counts there is really, really tedious. The sort order isn't an issue; we could reverse it with e.g.
 cmsort /V=$2C,$00 /H=1 /CV=14,1,0- /CV=4,1,0 SampleM.csv SampleM_Sorted_SrcIP.csv
That is, we append a minus sign to the sort key whose order should be reversed (here the Source IP field).
 
LVL 71

Expert Comment

by:Qlemo
ID: 40266214
In PowerShell, a) and c) look like this:
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_DstIP.csv"

$top5srcIPs = @($srcIPs | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count


The memory consumption will be around 3 GB, because we need to keep all the data in memory for creating the sorted files. It could be lowered by not keeping the sorted result in memory (instead reading the sorted data back from disk when processing), but the disk I/O would slow down processing significantly.
Please try the above with a full-size source file to see if performance and memory footprint are ok.
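A minimal sketch of that lower-memory alternative (write the sorted file first, then re-read it for the counting step, keeping only the columns needed). It follows the naming of the script above, but it is not part of the posted solution, and Sort-Object still has to buffer all rows while sorting, so peak memory during the sort itself is unchanged:

$filename = 'SampleM'

# Write the sorted source-IP file, without keeping the sorted rows in memory afterwards
Import-Csv "$filename.csv" |
  Sort-Object 'Source IP', Reason |
  ConvertTo-Csv -NoType | % { $_ -Replace '"' } |
  Out-File "${filename}_sorted_SrcIP.csv"

# Re-read only the two columns needed for the counting step
$top5srcIPs = Import-Csv "${filename}_sorted_SrcIP.csv" |
  Select-Object 'Source IP', Reason |
  Group-Object 'Source IP' | Sort-Object Count -Descending |
  Select-Object -First 5

The Destination IP side would follow the same pattern.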

To implement b) and d), what do you define as "Event"? Is it Application Type, Reason, Action or Note?
 

Author Comment

by:sunhux
ID: 40266241
Ok, my mistake for not clarifying; "Event" is actually the Reason column,
i.e. column D, the secondary sort key
0
 
LVL 71

Assisted Solution

by:Qlemo
Qlemo earned 2000 total points
ID: 40266307
Complete code for dumping the results to the console as text.
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_DstIP.csv"

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

 

Author Comment

by:sunhux
ID: 40266380
Thanks very much, appreciate it.

I'm a newbie with PowerShell, so I ran into the difficulties below.
What do I need to do to run it correctly?  I've changed it to
$filename = 'hz'


    Directory: D:\julsortd\h


Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/17/2014  11:15 PM 1742326095 hz.csv
-a---         8/18/2014   1:31 AM       1481 psort.ps1


PS D:\julsortd\h> psort.ps1
The term 'psort.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:10
+ psort.ps1 <<<<
    + CategoryInfo          : ObjectNotFound: (psort.ps1:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
 

Author Comment

by:sunhux
ID: 40266383
Just to be sure, this is the content of the psort.ps1 file, which
I've placed in the d:\julsortd\h folder (i.e. the same folder as the csv file):

d:\> more d:\julsortd\h\psort.ps1

$filename = 'hz'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_DstIP.csv"

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'
     | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination
 IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}}
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}}
} | Out-Host
 

Author Comment

by:sunhux
ID: 40266389
Some of the lines above appear to be truncated, but they are actually
on a single line when opened with Notepad.  It just so happened that I
copied them from the Windows command prompt:

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_DstIP.csv"
 

Author Comment

by:sunhux
ID: 40266398
I copied both the ps1 and the csv input file to c:\windows\system32
and ran it again, but got a different error:

PS C:\Windows\system32> dir hz.csv

    Directory: C:\Windows\system32

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/17/2014  11:15 PM 1742326095 hz.csv


PS C:\Windows\system32> dir psort.ps1

    Directory: C:\Windows\system32

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/18/2014   1:31 AM       1481 psort.ps1


PS C:\Windows\system32> psort
File C:\Windows\system32\psort.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.
At line:1 char:6
+ psort <<<<
    + CategoryInfo          : NotSpecified: (:) [], PSSecurityException
    + FullyQualifiedErrorId : RuntimeException
 

Author Comment

by:sunhux
ID: 40266413
My laptop has 8GB RAM & it's on Windows 7 x64 Professional.

I addressed the above by running as Administrator and issuing:
PS D:\julsortd\h> Set-ExecutionPolicy RemoteSigned

Execution Policy Change
The execution policy helps protect you from scripts that you do not trust. Changing the
execution policy might expose you to the security risks described in the
about_Execution_Policies help topic. Do you want to change the execution policy?
[Y] Yes  [N] No  [S] Suspend  [?] Help (default is "Y"): Y
PS D:\julsortd\h>
PS D:\julsortd\h> psort
. . . runs for more than a minute . . .
PS D:\julsortd\h> psort
Import-Csv : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\Windows\system32\psort.ps1:3 char:21
+ $SrcIPs = Import-Csv <<<<  "$filename.csv" | Sort-Object 'Source IP', Reason
    + CategoryInfo          : NotSpecified: (:) [Import-Csv], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ImportCsvCommand

ConvertTo-Csv : Cannot bind argument to parameter 'InputObject' because it is null.
At C:\Windows\system32\psort.ps1:6 char:24
+ $SrcIPs | ConvertTo-Csv <<<<  -NoType | % { $_ -Replace """"} | Out-File "$filename_sorted_SrcIP.csv"
    + CategoryInfo          : InvalidData: (:) [ConvertTo-Csv], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.Con


I'll try running with a smaller csv input file as this one is 1.7GB
with about 1.1 million lines
 

Author Comment

by:sunhux
ID: 40266423
It works fine with a much smaller csv input file.
The earlier file is about 25 times the size of this
one that works.

Is it possible to tweak the PowerShell script or any
parameter to enable it to handle the bigger csv
input files?
 

Author Comment

by:sunhux
ID: 40266430
There is a slight issue with the output; refer to the
lines indicated by  <==  below :

Top 5 Source IPs:

Name                              Count
----                              -----
192.168.3.6                      143107
192.168.3.2                        2093
165.21.42.84                        270
160.96.97.251                       194
103.3.201.122                       187


Top 25 reasons per source IP:

Source IP                            Count Reason
---------                            ----- ------
192.168.3.6                          42851 1003598 - Multiple HTTP Server Low B...
192.168.3.6                            256 Region Too Big  <==
192.168.3.2                           2061 1003598 - Multiple HTTP Server Low B...
192.168.3.2                             32 Region Too Big   <==
165.21.42.84                           270 1000552 - Generic Cross Site Scripti...
160.96.97.251                          170 1000552 - Generic Cross Site Scripti...
160.96.97.251                           24 1003598 - Multiple HTTP Server Low B...
103.3.201.122                          173 Invalid Traversal
103.3.201.122                           14 Illegal Character in URI
 
LVL 71

Expert Comment

by:Qlemo
ID: 40266462
First of all, for security reasons a script needs to be called with the path it is in. So calling a script in the current folder is usually done with
  .\scriptname.ps1

In regard to the 1.7TB file - as long as we have to retain the original fields, I see no way to make that work. If it were only about getting the top 5 / top 25, preserving just the needed columns (IP addresses and Reason) would certainly help, but you want the files sorted completely.

What is the issue with those "Region Too Big" lines? Looks pretty much like they are from the file.
 

Author Comment

by:sunhux
ID: 40267035
Thanks, got the security thing addressed.

Can you change the script so that it doesn't do the sorting (i.e. so it won't
bomb out due to insufficient memory) and just reports the counts
& percentages?  I still need to run it on that big 1.7GB file (not 1.7TB)
 
LVL 71

Accepted Solution

by:Qlemo
Qlemo earned 2000 total points
ID: 40267093
Ok, 1.7TB was somewhat exaggerated ;-).
If we can skip creating the sorted files, we can reduce the in-memory properties and hence the amount of data to process. But be aware that with the simple methods used here, memory will always be an issue - reporting of this kind is usually a more sophisticated process, involving databases and a custom application able to handle the data differently and with a smaller memory footprint.
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | select 'Source IP', 'Destination IP', Reason | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

 

Author Comment

by:sunhux
ID: 40267349
With the medium-sized file which failed earlier, this new script now works.
However, with the 1.7GB csv file, it still gives an error:

The '=' operator failed: Exception of type 'System.OutOfMemoryException' was thrown..
At D:\julsortd\h\h.ps1:6 char:14
+ $top5srcIPs = <<<<  @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending)
| select -First 5
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : OperatorFailed


I'm sure I have 5GB of RAM free on my 8GB Win 7.
 

Author Comment

by:sunhux
ID: 40267379
Will there be any issue if the header line (i.e. the column labels that
usually appear in the 1st row of the csv) appears more than once
in the csv file?
 
LVL 71

Expert Comment

by:Qlemo
ID: 40267393
Yes. Import-CSV errors out in that case with duplicate column headers.
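If repeated header lines do turn up mid-file, one hedged workaround (not something posted in this thread) would be to drop every copy of the header except the first before importing:

$lines  = Get-Content "$filename.csv"      # note: loads the whole file into memory
$header = $lines[0]
@($header) + @($lines | Select-Object -Skip 1 | Where-Object { $_ -ne $header }) |
    Set-Content "${filename}_clean.csv"

For a file as large as 1.7GB this simple version would itself be memory-hungry; it is only meant to show the idea.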
 

Author Comment

by:sunhux
ID: 40270072
Any chance you can enhance it further?

With the medium-sized file which failed earlier, this new script now works.
However, with the 1.7GB csv file, it still gives an error:

Exception of type 'System.OutOfMemoryException'

Or will it help if I run it on a Win 7 machine with 16GB RAM?  Still hunting for
one, but I think one of my DBA colleagues has one
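For the 1.7GB case, one lower-memory possibility (not posted or tested in this thread) would be to stream the file line by line and tally the counts in hashtables instead of loading everything with Import-Csv. A rough sketch, assuming a plain comma-separated file with no commas inside fields, and Source IP / Reason in columns N and D (zero-based indexes 13 and 3):

$filename  = 'hz'                    # file name used earlier in the thread
$srcCounts = @{}                     # Source IP -> total count
$srcReason = @{}                     # Source IP -> (Reason -> count)
$total     = 0

Get-Content "$filename.csv" | Select-Object -Skip 1 | ForEach-Object {
    $f = $_ -split ','               # naive split; assumes no embedded commas
    if ($f.Count -lt 14) { return }  # skip blank or short lines
    $src = $f[13]; $reason = $f[3]   # columns N and D - assumed positions
    $total++
    $srcCounts[$src] = 1 + $srcCounts[$src]
    if (-not $srcReason.ContainsKey($src)) { $srcReason[$src] = @{} }
    $srcReason[$src][$reason] = 1 + $srcReason[$src][$reason]
}

$top5 = $srcCounts.GetEnumerator() | Sort-Object Value -Descending | Select-Object -First 5

"Top 5 Source IPs (with percentage of all data rows):"
foreach ($e in $top5) { '{0,-16} {1,8}  {2,5:N1}%' -f $e.Key, $e.Value, ($e.Value / $total * 100) }

"Top 25 reasons per source IP:"
foreach ($e in $top5) {
    $srcReason[$e.Key].GetEnumerator() | Sort-Object Value -Descending | Select-Object -First 25 |
        ForEach-Object { '{0,-16} {1,8}  {2}' -f $e.Key, $_.Value, $_.Key }
}
# The Destination IP side (column Q, zero-based index 16) would follow the same pattern.

This trades speed for memory (reading line by line with Get-Content is slow on a file this size), and it does not produce the sorted CSV files.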