Windows batch script to count the top 5 recurring values for a column in a CSV file

Please unzip the attached file using the PIN 1234554321:
it's a sanitized file, but just in case, I've zipped it with a PIN so
that its content is not easily searchable.

I'll need scripts (ideally a Windows batch script; or, if it's going
to run more than 15 minutes as Windows Batch on my 32-bit
Win XP box that runs on an SSD, then I can accept PowerShell or even
a Linux shell script, though there's the hassle of uploading files to
the servers instead of doing it on my laptop) that will process
the unzipped csv file (the number of lines could run from
200,000 to 800,000, and I do this weekly):

a) sort by the "Source IP" (i.e. column N) as primary key and the
    Reason (i.e. column D) as secondary key, then count the
    occurrences of the top 5 "Source IP" values: I need to know how
    many times each of the top 5 Source IPs recurs. The output
    should have 5 lines of IPs with the count for each IP.

b) then count the occurrences of the top 25 events for each of the
    top 5 "Source IP" values, i.e. for each of the top 5 recurring
    Source IPs, what are its top 25 events (or, if there are fewer,
    stop at whatever number of events is available). The output:
    the top 5 IPs, each with the counts of its top 25 events (or
    whatever is available).

c) sort by the "Destination IP" (i.e. column Q) as primary key and
    the Reason (i.e. column D) as secondary key, then count the
    occurrences of the top 5 "Destination IP" values: I need to know
    how many times each of the top 5 Destination IPs recurs. The
    output should have 5 lines of IPs with the count for each IP.

d) then count the occurrences of the top 25 events for each of the
    top 5 "Destination IP" values, i.e. for each of the top 5
    recurring Destination IPs, what are its top 25 events (or, if
    there are fewer, stop at whatever number of events is
    available). The output: the top 5 IPs, each with the counts of
    its top 25 events (or whatever is available).

Leave the sorted output file in the same folder, named
original_filename_sorted.csv, as I would like to browse
through it.
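(For what it's worth, requests a) and b) can be sketched with standard Unix text tools, which also exist for Windows via GnuWin32 or Cygwin. This is only a sketch under the assumption that the CSV is plain comma-separated with one header line and no commas inside fields; the sample data, file name, and IP addresses below are all hypothetical, with only Reason (column D = field 4) and Source IP (column N = field 14) populated.)

```shell
# Sketch of request a) with standard Unix tools (GnuWin32/Cygwin on Windows).
# Assumes a plain comma-separated file, one header line, no commas inside
# fields. Everything below is hypothetical sample data.
row() { printf ',,,%s,,,,,,,,,,%s,,,\n' "$2" "$1"; }   # fills fields 4 and 14
{ printf 'c1,c2,c3,Reason,c5,c6,c7,c8,c9,c10,c11,c12,c13,Source IP,c15,c16,Destination IP\n'
  row 10.0.0.1 scan; row 10.0.0.1 scan; row 10.0.0.1 flood
  row 10.0.0.2 scan; row 10.0.0.3 flood
} > sample.csv

# a) count occurrences of each Source IP (header skipped), biggest first:
tail -n +2 sample.csv | cut -d, -f14 | sort | uniq -c | sort -rn | head -5

# b) for one of those IPs, its top 25 Reasons (field 4):
awk -F, -v ip=10.0.0.1 'NR>1 && $14==ip {print $4}' sample.csv |
  sort | uniq -c | sort -rn | head -25
```

Note that Windows ships its own incompatible sort.exe, so on Windows the GNU sort would have to be called with an explicit path.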
sunhuxAsked:

sunhuxAuthor Commented:
Sorry, there's no PIN, as EE doesn't allow such a file to be uploaded.
Samples.zip
sunhuxAuthor Commented:
For requests a & c, also include a percentage, derived by dividing each
of the top 5 counts by the total number of lines
(excluding the header) x 100% (to 1 decimal place; e.g. 88.1%)
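A sketch of that arithmetic with awk; the count and total below are invented numbers chosen to match the 88.1% example:

```shell
# Percentage sketch: one top-5 count divided by the total number of data
# lines (header excluded), formatted to 1 decimal place.
# count and total are hypothetical.
awk 'BEGIN { count = 143107; total = 162431
             printf "%.1f%%\n", count / total * 100 }'   # prints 88.1%
```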
Dan CraciunIT ConsultantCommented:
Can you please post the code you've tried, to see where you're stuck?

Thank you.

sunhuxAuthor Commented:
I don't have code or a script yet, but I'm exploring a freeware tool, cmsort: http://www.chmaas.handshake.de/delphi/freeware/cmsort/cmsort.htm#download

I can't get the syntax right for /V (the option for CSV files; with /V the
separator defaults to semicolon, but I need comma). It's supposed to run on Win XP.
sunhuxAuthor Commented:
Also, my csv file doesn't have double quotes enclosing each field,
and I can't get the right syntax to sort my csv without the
double quotes.
sunhuxAuthor Commented:
The help output & a couple of csv examples for cmsort :


CMsort version 2.01 - Sort a DOS, WINDOWS, UNIX, MAC, or mixed text file
(c) 2014 Christian Maas // chmaas@handshake.de // www.chmaas.handshake.de
Usage: CMsort [sort key][-] ... [option] ... <input file> <output file>
Sort keys:
/S=F,L string from position F with length L (case sensitive)
/C=F,L string from position F with length L (case insensitive)
/N=F,L interpreted numeric from position F with length L
Notes:
1. The complete line is used as sort key if no sort keys are indicated.
2. L=0 allowed for last non-numeric key to define a key until EOR.
3. An appending minus sign indicates descending sort order.
4. CMsort uses the quicksort algorithm (not a stable sort).
Sort keys used for CSV files (see option /V below)
/SV=I,F,L I-th CSV field string from position F with length L (case sensitive)
/CV=I,F,L I-th CSV field string from position F with length L (case insensitive)
/NV=I,F,L I-th CSV field numeric from position F with length L


Options:
/F=n read/write files with fixed-length records (n byte without CR/LF)
/B ignore blank or empty records
/D ignore records with duplicate keys (according to given sort keys)
/D=<file> ignore records with duplicate keys, write them to <file>
/Q quiet mode (no progress output)
/H=n don't sort n header lines (default: n=0)
/W=n n-way-merge of temporary files (2<=n<=5, default: n=5)
/T=<path> for temporary files (/T=TMP for Windows temporary file path)
/M=n use n KB memory (n >= 1,024 KB = 1 MB; default: 65,536 KB = 64 MB)
/X=n,c exclude lines with char c (as char, $<hex>, #<dec>) in column n
/V or /V=S,Q for sorting CSV files
  If only /V is specified, semicolon is used as separator and the double
  quotation mark as quotation sign. Use /V=S,Q to specify a different
  separator S and/or quotation sign Q. S, Q must be indicated as
  $<hex> or #<decimal> (e.g. default /V=$3B,$22 for semicolon/double
  quotation mark. Use $00 for Q if no quotation sign is used.
  Indicate sort keys as
  /SV=I,F,L or /CV=I,F,L or /NV=I,F,L with:
    I for I-th CSV field, F for "from position", L for length
    L=0 allowed for all keys (auto-aligning of partial keys)
Use CMsort /E to show examples.


================= 2 examples =====================

Example 3: sort a CSV file
We have the following CSV input file:

"Customer No";CustName;OrderDate;Return
1004711;Miller & Co.;1999-12-06;1,207.23
"1004713";"Topsoft";"2000-01-04";"2,521.95"
1004747;MCP & Co.;2000-01-04;7,356.88
1004799;Eftpos;1999-12-06;23,122.56

This is the command line to sort by order date ascending and by return descending without the header line:

cmsort /SV=3,1,0 /NV=4,1,0- /V /H=1 customercsv.txt customercsv.sor

This is the resulting file:

"Customer No";CustName;OrderDate;Return
1004799;Eftpos;1999-12-06;23,122.56
1004711;Miller & Co.;1999-12-06;1,207.23
1004747;MCP & Co.;2000-01-04;7,356.88
"1004713";"Topsoft";"2000-01-04";"2,521.95"


Example 4: CSV file with empty fields
CSV input file:

Euclid;;Greek
Thales;;Greek
Banach;Stefan;Polish
de Fermat;Pierre;French
Cantor;Georg;German

Command line to sort by first name ascending:

cmsort /V /SV=2,1,0 csvempty.in csvempty.sor

Resulting file:

Thales;;Greek
Euclid;;Greek
Cantor;Georg;German
de Fermat;Pierre;French
Banach;Stefan;Polish
sunhuxAuthor Commented:
I also tried the example below but got an error:
http://www.commandlinefu.com/commands/view/8070/sort-a-csv-file-according-to-a-particular-n-th-field-numerically-quicker-than-excel

c:\temp\sort -t"," -n -k23 samplem.csv |more
sort -t"," -n -k23 samplem.csv |more
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
Doing that in cmd.exe is tedious, but feasible. PowerShell is much better suited. Using a database would be most appropriate.
sunhuxAuthor Commented:
I'll look for a Windows 7 laptop then, as it comes with PowerShell.
So what's the command / script?

http://www.commandlinefu.com/commands/view/8070/sort-a-csv-file-according-to-a-particular-n-th-field-numerically-quicker-than-excel
I've just figured out that the link above is for the Linux sort. I downloaded
a GnuWin32 sort, but somehow the sorted output is exactly the
same as the original file.
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
Sort for a):
  cmsort /V=$2C,$00 /H=1 /CV=14,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_SrcIP.csv
Sort for c):
  cmsort /V=$2C,$00 /H=1 /CV=17,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_DstIP.csv

Don't mess with the *nix sort, not worth it unless you are on Linux anyway.
sunhuxAuthor Commented:
For a,
  cmsort /V=$2C,$00 /H=1 /CV=14,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_SrcIP.csv
was sorted with the bottom 5 Source IPs on top. Can you change
it so that the top 5 come first in the output? I guess that to
get a count, I'll issue
   find /c "Src_IP" sortedfile.csv

For c,
  cmsort /V=$2C,$00 /H=1 /CV=17,1,0 /CV=4,1,0 SampleM.csv SampleM_Sorted_DstIP.csv
likewise, I'll need the top 5 Destination IPs at the top of the
output, not at the bottom. Which cmsort options need to be
adjusted?
sunhuxAuthor Commented:
I'd still prefer that you provide the counts for b & d rather than
my resorting to "find /c ..." to count them.
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
I will abandon the CMD approach, because getting the count is really, really tedious. The sort order isn't an issue; we could reverse it with e.g.
 cmsort /V=$2C,$00 /H=1 /CV=14,1,0 /CV=4,1,0- SampleM.csv SampleM_Sorted_SrcIP.csv
That is, we append a minus sign to the sort field specification.
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
In PowerShell, a) and c) look like this:
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_DstIP.csv"

$top5srcIPs = @($srcIPs | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count


The memory consumption will be around 3 GB, because we need to keep all the data in memory for creating the sorted files. It could be lowered by not keeping the sorted result in memory (instead reading the sorted data back when processing), but the disk I/O would slow down processing significantly.
Please try the above with a full-size source file to see if performance and memory footprint are OK.

To implement b) and d), what do you define as "Event"? Is it Application Type, Reason, Action or Note?
sunhuxAuthor Commented:
Ok, my mistake for not clarifying: the Event is actually the Reason
column, i.e. column D, the secondary sort key.
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
Complete code for dumping the results to the console as text.
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "${filename}_sorted_DstIP.csv"

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

sunhuxAuthor Commented:
Thanks very much, appreciate it.

I'm a newbie with PowerShell, so I ran into the difficulties below.
What do I need to do to run it correctly? I've changed
$filename = 'hz'


    Directory: D:\julsortd\h


Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/17/2014  11:15 PM 1742326095 hz.csv
-a---         8/18/2014   1:31 AM       1481 psort.ps1


PS D:\julsortd\h> psort.ps1
The term 'psort.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:10
+ psort.ps1 <<<<
    + CategoryInfo          : ObjectNotFound: (psort.ps1:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
sunhuxAuthor Commented:
Just to be sure, this is the content of the psort.ps1 file, which
I've placed in the d:\julsortd\h folder (i.e. the same folder as the csv file):

d:\> more d:\julsortd\h\psort.ps1

$filename = 'hz'

$SrcIPs = Import-Csv "$filename.csv" | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_DstIP.csv"

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'
     | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination
 IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}}
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}}
} | Out-Host
sunhuxAuthor Commented:
Some of the lines above appear to be truncated, but they are actually
on single lines when opened with Notepad. It just so happened that I
copied them from the Windows command prompt:

$SrcIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_SrcIP.csv"
$dstIPs | ConvertTo-Csv -NoType | % { $_ -Replace """"} | Out-File "$filename_so
rted_DstIP.csv"
sunhuxAuthor Commented:
I copied both the ps1 and the csv input file to c:\windows\system32
and ran it again, but got a different error:

PS C:\Windows\system32> dir hz.csv

    Directory: C:\Windows\system32

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/17/2014  11:15 PM 1742326095 hz.csv


PS C:\Windows\system32> dir psort.ps1

    Directory: C:\Windows\system32

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         8/18/2014   1:31 AM       1481 psort.ps1


PS C:\Windows\system32> psort
File C:\Windows\system32\psort.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.
At line:1 char:6
+ psort <<<<
    + CategoryInfo          : NotSpecified: (:) [], PSSecurityException
    + FullyQualifiedErrorId : RuntimeException
sunhuxAuthor Commented:
My laptop has 8GB RAM & it's on Windows 7 x64 Professional.

I addressed the above by running as Administrator and issuing:
PS D:\julsortd\h> Set-ExecutionPolicy RemoteSigned

Execution Policy Change
The execution policy helps protect you from scripts that you do not trust. Changing the
execution policy might expose you to the security risks described in the
about_Execution_Policies help topic. Do you want to change the execution policy?
[Y] Yes  [N] No  [S] Suspend  [?] Help (default is "Y"): Y
PS D:\julsortd\h>
PS D:\julsortd\h> psort
. . . runs for more than a minute . . .
PS D:\julsortd\h> psort
Import-Csv : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\Windows\system32\psort.ps1:3 char:21
+ $SrcIPs = Import-Csv <<<<  "$filename.csv" | Sort-Object 'Source IP', Reason
    + CategoryInfo          : NotSpecified: (:) [Import-Csv], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ImportCsvCommand

ConvertTo-Csv : Cannot bind argument to parameter 'InputObject' because it is null.
At C:\Windows\system32\psort.ps1:6 char:24
+ $SrcIPs | ConvertTo-Csv <<<<  -NoType | % { $_ -Replace """"} | Out-File "$filename_sorted_SrcIP.csv"
    + CategoryInfo          : InvalidData: (:) [ConvertTo-Csv], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.Con


I'll try running with a smaller csv input file as this one is 1.7GB
with about 1.1 million lines
sunhuxAuthor Commented:
It works fine with a much smaller csv input file.
The earlier file is about 25 times the size of the
one that works.

Is it possible to tweak the PowerShell script or any
parameter to enable it to handle the bigger csv
input files?
sunhuxAuthor Commented:
There is a slight issue with the output; refer to the
lines indicated by  <==  below :

Top 5 Source IPs:

Name                              Count
----                              -----
192.168.3.6                      143107
192.168.3.2                        2093
165.21.42.84                        270
160.96.97.251                       194
103.3.201.122                       187


Top 25 reasons per source IP:

Source IP                            Count Reason
---------                            ----- ------
192.168.3.6                          42851 1003598 - Multiple HTTP Server Low B...
192.168.3.6                            256 Region Too Big  <==
192.168.3.2                           2061 1003598 - Multiple HTTP Server Low B...
192.168.3.2                             32 Region Too Big   <==
165.21.42.84                           270 1000552 - Generic Cross Site Scripti...
160.96.97.251                          170 1000552 - Generic Cross Site Scripti...
160.96.97.251                           24 1003598 - Multiple HTTP Server Low B...
103.3.201.122                          173 Invalid Traversal
103.3.201.122                           14 Illegal Character in URI
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
First of all, for security reasons a script needs to be called with the path it is in. So calling a script in the current folder is usually done with
  .\scriptname.ps1

In regard to the 1.7TB file: as long as we have to retain the original fields, I see no way to make that work. If it were only about getting the top 5 / top 25, preserving just the needed columns (IP addresses and Reason) would certainly help, but you want the files sorted completely.

What is the issue with those "Region Too Big" lines? Looks pretty much like they are from the file.
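(For reference, a lower-memory route exists outside PowerShell: GNU sort is an external merge sort that spills to temporary files, so a pipe-based count runs in bounded RAM even on a multi-GB file. A sketch, again assuming plain comma-separated data with no quoted commas and Source IP in field 14; the tiny generated file is only a hypothetical stand-in for the real one.)

```shell
# Bounded-memory sketch: GNU sort spills to temporary files (external
# merge sort), so this pipeline can count top Source IPs in a multi-GB
# CSV without loading it all into RAM the way Import-Csv does.
# Tiny hypothetical stand-in for the real file (13 commas => field 14):
{ echo header
  for ip in 1.1.1.1 1.1.1.1 2.2.2.2; do
    printf ',,,,,,,,,,,,,%s,,,\n' "$ip"
  done
} > big.csv

# -S caps sort's in-memory buffer; -T says where temp files should go:
tail -n +2 big.csv | cut -d, -f14 | sort -S 64M -T . | uniq -c | sort -rn | head -5
```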
sunhuxAuthor Commented:
Thanks, got the security thingy addressed.

Can you change the script such that it doesn't do the sorting (i.e. so it won't
bomb out due to insufficient memory) and just reports the counts
& percentages? I still need to run it on that big 1.7GB file (not 1.7TB).
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
Ok, 1.7TB was somewhat exaggerated ;-).
If we skip creating the sorted files, we can reduce the in-memory properties and hence the amount of data to process. But be aware that the simple methods available here will always struggle with files this size: reporting of this kind is usually a more sophisticated process, involving databases and a custom application able to handle the data differently and with a smaller memory footprint.
$filename = 'SampleM'

$SrcIPs = Import-Csv "$filename.csv" | select 'Source IP', 'Destination IP', Reason | Sort-Object 'Source IP', Reason
$dstIPs = $SrcIPs | Sort-Object 'Destination IP', Reason

$top5srcIPs = @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending) | select -First 5
Remove-Variable srcIPs # to free up memory as early as possible
$top5dstIPs = @($dstIPs | Select 'Destination IP', 'Reason' | group 'Destination IP' | sort count -Descending) | select -First 5
Remove-Variable dstIPs

Write-Host "Top 5 Source IPs:"
$top5srcIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per source IP:"
$top5srcIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Source IP'; e={$_.Group[0].'Source IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host

Write-Host "Top 5 Destination IPs:"
$top5dstIPs | select name, count | Out-Host

Write-Host "Top 25 reasons per destination IP:"
$top5dstIPs | % {
  $_.Group |
  group Reason |
  sort count -descending |
  select -first 25 |
  select @{l='Destination IP'; e={$_.Group[0].'Destination IP'}},
          @{l='Count'    ; e={$_.Count}},
          @{l='Reason'   ; e={$_.Name}} 
} | Out-Host


sunhuxAuthor Commented:
With the medium-sized file which failed earlier, this new script now works.
However, with the 1.7GB csv file, it still gives an error:

The '=' operator failed: Exception of type 'System.OutOfMemoryException' was thrown..
At D:\julsortd\h\h.ps1:6 char:14
+ $top5srcIPs = <<<<  @($srcIPs | Select 'Source IP'     , 'Reason' | group 'Source IP'      | sort count -Descending)
| select -First 5
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : OperatorFailed


I'm sure I have 5GB of RAM free on my 8GB Win 7.
sunhuxAuthor Commented:
Will there be any issue if the header line (i.e. the column labels that
usually appear in the 1st row of the csv) appears more than once
in the csv file?
QlemoBatchelor, Developer and EE Topic AdvisorCommented:
Yes. Import-Csv errors out in that case, complaining about duplicate column headers.
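If the export does contain repeated header lines, one workaround sketch is a pre-pass that keeps only the first copy before handing the file to Import-Csv. File names and contents here are hypothetical:

```shell
# Drop repeated copies of the header line, keeping only the first one,
# so Import-Csv no longer sees duplicate column headers.
# Hypothetical input with the header repeated mid-file:
{ echo 'Source IP,Reason'; echo '1.1.1.1,scan'
  echo 'Source IP,Reason'; echo '2.2.2.2,flood'; } > dup.csv

# Remember line 1 as the header; print every later line that differs from it:
awk 'NR==1 { hdr = $0; print; next } $0 != hdr' dup.csv > dedup.csv
```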
sunhuxAuthor Commented:
Any chance you can enhance it further?

With the medium-sized file which failed earlier, this new script now works.
However, with the 1.7GB csv file, it still gives an error:

Exception of type 'System.OutOfMemoryException'

Or will it help if I run it on a Win7 machine with 16GB RAM? Still hunting
for one, but I think one of my DBA colleagues has one.
Question has a verified solution.
