Solved

Script to verify MD5 of files on Windows 2008 R2 & RHEL/SUSE Linux

Posted on 2014-07-28
969 Views
Last Modified: 2014-08-08
Our customer's governance team has requested scripts to scan
files on servers; I think the intent is to ensure files are sane, i.e. not
corrupted and not tampered with.

To quote what they requested:

"The purpose is to see if there is any match from scanned files to MD5 hashes.

1) Management cluster systems (including OS like Windows 2008 R2, RHEL 5.x/6.x, Solaris x86)

2) Management consoles"

Can anyone write or provide the scripts?  Ideally they should not require
installing any additional binaries (e.g. a compiler or interpreter) and should
just run on the native OS (e.g. Windows batch, PowerShell or VBScript; Linux
shell or Perl).
Comment
Question by:sunhux
6 Comments
 
LVL 28

Assisted Solution

by:Jan Springer
Jan Springer earned 275 total points
ID: 40226903
On the Linux side, you would use a bash script:

#!/bin/bash
# write the file's MD5 hash and name to FILENAME.md5
/usr/bin/md5sum FILENAME > FILENAME.md5

When an equivalent MD5 command is run on the Windows side and the two FILENAME.md5 files are compared, the hashes should be identical.
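
A minimal Windows-side sketch (my assumption, not part of the answer above): it uses only the .NET MD5 class available in the stock PowerShell 2.0 on Windows 2008 R2, so nothing extra needs to be installed. The script name and path are placeholders, and the output mimics md5sum's "hash  filename" layout so the two .md5 files can be compared directly.

# compute-md5.ps1 -- hypothetical helper name
param([string]$Path = "C:\path\to\FILENAME")   # placeholder path

$md5    = [System.Security.Cryptography.MD5]::Create()
$stream = [System.IO.File]::OpenRead($Path)
try {
    # hash the file contents and format the bytes as a lowercase hex string
    $hash = ($md5.ComputeHash($stream) | ForEach-Object { $_.ToString("x2") }) -join ""
} finally {
    $stream.Close()
}

# write "hash  filename" (md5sum layout) next to the file
"$hash  $(Split-Path $Path -Leaf)" | Out-File "$Path.md5" -Encoding ASCII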
 
LVL 38

Assisted Solution

by:Gerwin Jansen, EE MVE
Gerwin Jansen, EE MVE earned 225 total points
ID: 40230530
>> I think it's to ensure files are sane / not corrupted /not tampered with.
That is what MD5 hashes can be used for, yes.

Without installing anything, you could use PsFCIV (a PowerShell version of the legacy FCIV.exe).

You can use PsFCIV to validate a set of files whose hashes are stored in an XML database.

Ref: http://gallery.technet.microsoft.com/PowerShell-File-Checksum-e57dcd67
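
A short usage sketch based on the examples from that gallery page (assumptions on my part: the module has been downloaded and saved as PsFCIV.psm1 in the current folder; C:\tmp and DB.XML are placeholders):

# load the downloaded module (no binaries to install)
Import-Module .\PsFCIV.psm1

# first run: builds the XML hash database for the files under C:\tmp
Start-PsFCIV -Path C:\tmp -XML DB.XML

# subsequent runs with the same parameters re-hash the files and report
# any that no longer match the stored values
Start-PsFCIV -Path C:\tmp -XML DB.XML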
 

Author Comment

by:sunhux
ID: 40241801
Thanks Jan for the Linux solution & Gerwin for Windows.

If a file (say a logfile or a Windows .evtx event logfile) is
constantly being updated, would both of the checksum methods
above report the file as corrupted, or as intact?

Another example: when a password file or database is
updated in a non-malicious or authorized manner
(say by the OS or an app), the checksum should not report
it as being 'tampered with', right?



From the link Gerwin provided, I have 2 queries:

> Start-PsFCIV -Path C:\tmp -XML DB.XML
Does the above create an initial xml database
storing checksums of all files?

> Checks all files in C:\tmp folder by using SHA1 hash algorithm.
> Start-PsFCIV -Path C:\tmp -XML DB.XML -HashAlgorithm SHA1, SHA256, SHA512 -Recurse
Why does the above use 3 algorithms, i.e. SHA1, SHA256 & SHA512?
Isn't it enough to just specify 1 algorithm rather than 3?  Or is using 3
more reliable in detecting whether file(s) have been tampered with?

 
LVL 28

Accepted Solution

by:Jan Springer
Jan Springer earned 275 total points
ID: 40241821
An MD5 checksum validates (within reason) the contents of a file: when a later copy that should be the same is compared, it confirms that it is.

It doesn't identify the status (good or corrupt) of a file on its own; it just tells you that the file is the same as the one originally hashed.

If you read up on MD5, you will see that changes can be made to a file that still produce the same hash.

But for what you're doing, an MD5 comparison to confirm that the file is the same should be okay.

The only reason I can see for using multiple algorithms is to reduce the chance of a false positive (files reported as the same when they are not); it's more of an audit point.  You just won't necessarily have every algorithm available on every operating system, which is why I end up with MD5 between Linux and Windows.
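
A small sketch of that compare step on the Linux side (assuming the baseline FILENAME.md5 was created as in the earlier comment; FILENAME is a placeholder):

#!/bin/bash
# baseline, run once while the file is known-good
/usr/bin/md5sum FILENAME > FILENAME.md5

# later check: prints "FILENAME: OK" if the hash still matches,
# "FILENAME: FAILED" if the contents have changed
/usr/bin/md5sum -c FILENAME.md5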
 
LVL 38

Assisted Solution

by:Gerwin Jansen, EE MVE
Gerwin Jansen, EE MVE earned 225 total points
ID: 40243246
> Start-PsFCIV -Path C:\tmp -XML DB.XML
Does the above create an initial xml database
storing checksums of all files?


-> When you run the command for the first time, it creates the XML database file for the specified folder or folders.
 
LVL 38

Expert Comment

by:Gerwin Jansen, EE MVE
ID: 40243247
>> Why does the above use 3 algorithms, i.e. SHA1, SHA256 & SHA512?
Why do you want 3 algorithms? Just choose 1, I would say.
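
For example, reusing the syntax from earlier in the thread with a single algorithm (SHA256 here, purely as an illustration; C:\tmp and DB.XML remain placeholders):

# check all files under C:\tmp against DB.XML using just one algorithm
Start-PsFCIV -Path C:\tmp -XML DB.XML -HashAlgorithm SHA256 -Recurse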
