Script for comparing two files to trace disk usage.

Hello Experts,

I need your help. I have a requirement to create a bash script for analyzing disk usage. Let me give the details. We have a NetApp filer that hosts many users' home folders. The problem we face is that the volume is always filling up. As the filer contains so many folders, it's very difficult for us to trace the disk usage correctly. The procedure we follow now is to take a folder-wise snapshot into a file and compare it to the previous snapshot, by which we can find the difference in folder sizes and trace the folder whose size increased recently. By snapshot I mean that, from the root directory /, we run this command: # du -sk * > filename.<date>. E.g.: # du -sk * > diskusage.04Oct06

This file will contain the disk usage details of all the folders in the root directory. To list all the folders sorted by size, we use the following command:

# cat diskusage.04Oct06 | sort -n

So when we want to find which folder's size has increased recently, we manually compare the current file against the previous week's file. But this process is tedious, as we have to check several folders. That's why I'd like to write a script for this (but I don't know much about scripting).
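For what it's worth, the snapshot step itself could be wrapped in a small script so the date stamp gets added automatically. This is just a sketch; the /var/tmp/du-snapshots output directory is an example, not something we use today:

#!/bin/sh
# take_snapshot.sh -- save a dated "du -sk" snapshot of the top-level folders.
# Sketch only; point OUTDIR wherever the snapshot files should live.
OUTDIR=/var/tmp/du-snapshots
mkdir -p "$OUTDIR"
cd / || exit 1
du -sk * > "$OUTDIR/diskusage.`date +%d%b%y`"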

Let me tell you in simple terms about my requirements:

We have two files (say diskusage.04Oct06 and diskusage.28Sep06) which contain the folder-size details of a directory (say /). Now we need a script that compares these two files and gives us an output of the difference in disk usage for each folder.

The logic can work like this: first compare the strings (folder names) in both files; if they match, calculate the difference in the size values and display the result. If a string appears in only one of the files, it should be discarded. I hope that makes sense. Can anyone work on this and get me the script? (A rough sketch of this logic follows the sample file below.)

 
Sample disk usage snapshot file:


bash-2.05# cat dus.04-26-06|sort -n

8       AR
2196    blfs
77160   hris_project
360032  hr
502520  speedy
3280756 purch_rep
3372096 micropower
3434676 rpq_data
3530012 document
4082684 Legal_Dept
5077992 netadmin
7461988 Oracle_11i
8872748 gollum
11508516        Treasury_NT
16117436        terminated
26566756        home1unix
28623272        FA
35648704        marcom
44091864        misadmin
56374524        unix
bash-2.05#
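
To make the idea concrete, something along these lines is what I am imagining. It's an untested sketch I pieced together; the file names are the examples from above, and it assumes folder names contain no spaces:

#!/bin/sh
# compare_du.sh -- report per-folder growth between two "du -sk" snapshots.
# Usage: compare_du.sh [oldfile] [newfile]; untested sketch.
OLD=${1:-diskusage.28Sep06}
NEW=${2:-diskusage.04Oct06}

# Read the old snapshot into an array keyed by folder name, then for every
# folder present in BOTH snapshots print how much it grew (in KB), biggest first.
awk 'NR == FNR { old[$2] = $1; next }
     $2 in old { printf "%-20s %12d\n", $2, $1 - old[$2] }' "$OLD" "$NEW" | sort -rn -k2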

Thanks,
Ashok

PS: If you have any other alternative script, that's also fine with me. All I want is to trace the disk usage.
chris_calabreseCommented:
Sounds like you really want quotas. See http://www.netapp.com/library/tr/3425.pdf
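
For example, on the filer a default per-user quota is set in the /etc/quotas file and then enabled per volume. Treat the following as a sketch from memory (the /vol/home volume name and the 4G limit are assumptions; check the paper above for the exact syntax on your ONTAP version):

#Quota Target    type              disk   files
*                user@/vol/home    4G     -

quota on home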
TintinCommented:
Quotas certainly sound like the way to go.

If you want to identify which recent files are filling up the home dirs, you can do

find /home -mtime -1 -type f -ls

or if you want to see only files greater than a certain size

find /home -mtime -1 -type f -size +100000c -ls

The above will find all files modified within the last day that are larger than 100000 bytes (the c suffix tells find to measure in bytes rather than 512-byte blocks).
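
If you also want a rough total of how much those recent files add up to, you can pipe the listing through awk. This assumes the file size is field 7 of the -ls output, which is true on most systems but worth verifying on yours:

find /home -mtime -1 -type f -size +100000c -ls |
    awk '{ sum += $7 } END { printf "total: %.1f MB\n", sum / 1024 / 1024 }'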
