YZlat (United States of America) asked:

Need help modifying my script

I need to write a script that loops through all the directories whose paths contain $var, finds every file with a .txt extension, and calculates the total size.


for example if $var="sample" and directories below contain

/u01/test/samples

test1.txt 1GB
test2.txt 1GB

/u04/data/sample

sample2.txt 2GB

then the script should add up the file sizes in those directories and assign the total to a variable.

So far I came up with something like this:

dir=/u01/test/samples/*.txt
total_size=`du -g $dir | awk '{c+=$1};END { print c }'`

which works great but only returns the total file size for one specified directory. Can someone help fix my code so that it returns the total size for all directories containing a specific variable in the path?
Tags: Shell Scripting, Unix OS
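A minimal sketch of what the finished script should do, demonstrated on a throwaway tree standing in for the real /u01 ... /u04 mounts (the paths, sizes, and the `-name` pattern are illustrative; on AIX, `du -g` could replace `du -k`):

```shell
var="sample"
base=$(mktemp -d)                  # stand-in for the real /u01 ... /u04 mounts
mkdir -p "$base/u01/test/samples" "$base/u04/data/sample"
dd if=/dev/zero of="$base/u01/test/samples/test1.txt" bs=1024 count=4 2>/dev/null
dd if=/dev/zero of="$base/u04/data/sample/sample2.txt" bs=1024 count=8 2>/dev/null

total_size=0
for d in $(find "$base" -type d -name "${var}*"); do
    # du -k: size in KB; awk sums column 1 (c+0 yields 0 if no .txt files here)
    sz=$(du -k "$d"/*.txt 2>/dev/null | awk '{c+=$1} END {print c+0}')
    total_size=$((total_size + sz))
done
echo "total: ${total_size} KB"
rm -rf "$base"
```

The `for d in $(find ...)` loop word-splits on whitespace, so this sketch assumes the matching directory paths contain no spaces.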

ozo

wc -c `find . -path '*/sample/*.txt'`
YZlat (Asker)

ozo, but it will not loop through all filesystems, will it?

The thing is, I have a number of different filesystems: /u01, /u02, /u03, /u04 and many more. I need to loop through them because almost every one of them has the word "sample" in it.
YZlat (Asker)

i get an error

find: 0652-017 -path is not a valid option.

Also, the wc command counts characters, not file size. I know it is basically the same thing, but I feel like it is not the right way to go.

The bottom line is that this code does not work either way.
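The `0652-017` error means this find (AIX's) does not support `-path`, but the same filtering can be done by piping find's output through grep instead. A self-contained sketch on a throwaway tree (the real command would name the actual mount points):

```shell
base=$(mktemp -d)
mkdir -p "$base/u01/test/samples" "$base/u04/data/sample"
touch "$base/u01/test/samples/test1.txt" \
      "$base/u04/data/sample/sample2.txt" \
      "$base/u04/data/sample/notes.log"

# -name works where -path does not; grep then restricts to sample*/ paths
matches=$(find "$base" -type f -name '*.txt' | grep -c "/sample")
echo "$matches"
rm -rf "$base"
```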
simon3270

A command to generate the result could be:
find /u0? -type f | grep "/${var}/.*\.txt$" | xargs ls -s | awk '{c+=$1};END { print c }'


To get this into a variable, use
totsize=$(find /u0? -type f | grep "/${var}/.*\.txt$" | xargs ls -s | awk '{c+=$1};END { print c }')



(edited to add the "-type f", in case a directory name ended with ".txt")
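One caveat with the `xargs ls -s` pipeline above: xargs word-splits on spaces in filenames. Assuming a find that supports the POSIX `-exec ... +` form, `du -k` can take over both jobs. A sketch against a throwaway tree (the real command would point at the /u0? mounts):

```shell
var="sample"
base=$(mktemp -d)
mkdir -p "$base/u01/$var dir"                 # directory name with a space
printf 'hello\n' > "$base/u01/$var dir/a b.txt"

# -exec du -k {} + passes each path as one argument, so spaces are safe
totsize=$(find "$base" -type f -name '*.txt' -exec du -k {} + |
    grep "/${var}" | awk '{c+=$1} END {print c+0}')
echo "$totsize"
rm -rf "$base"
```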
skullnobrains

du -ch `find / -type d -name sample`

you should experiment with the find command to make it faster. maybe something like this

du -ch `find /u* -maxdepth 4 -type d -name sample`

i guess you can figure that part out easily if you know where to expect the files

if locate is available and reasonably up to date, you may want to replace find with "locate sample"
YZlat (Asker)

simon3270, your code produces an error:

find: 0652-019 The status on /u0? is not valid.
simon3270

That's /u zero questionmark (or whatever matches the base directories you want to search)
YZlat (Asker)

yes, I tried /u zero question mark and that's what gave me the error. I also tried /u? but still same error
simon3270

I think you get that error either if there is a problem with a disk (are they NFS mounts with a dead server? Or actual disk failures?) or simply if the path specified doesn't exist. Don't just guess at patterns until one works - use a pattern which matches the actual disks (assuming that /u01 or /uo1, for example, is the mount point for a mounted disk). I can't see your system, so unless you copy and paste the output of, for example, "ls -p /" or "mount", I can't see what your directories are called.

If find can't get past the lowest level directory, it won't be able to check for "sample" directories - you need to fix that problem first.

I only used /u0? because you implied that you needed to check all such directories (and I didn't expect them to be /uo?).  if you only need to check some of them, first check that the ones you want work with "find" (test them one at a time with, for example, "find /u01 | head"), then just list them on the command line, like:

    find /u01 /disk2 /cc0703 /mdisk04 -type f |

(and then the rest of the line is the same as it was).
skullnobrains

here is an example on my machine

$ du -ch `find /v* -maxdepth 2 -type d -name \*sa\*` | tail
4,0K	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20/sdb1
4,0K	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20/sda5
1,1M	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20/sda
4,0K	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20/sda1
4,0K	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20/sda2
1,2M	/var/log/boot-sav/log/2013-06-21__14h13boot-repair20
7,1M	/var/log/boot-sav/log
7,1M	/var/log/boot-sav
4,0K	/var/log/samba
7,2M	total



try something like this

du -ch `find /u[0-9] -maxdepth 4 -type d -name sample` | cut -d \  -f 1


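One caveat with backtick-substituting find into du: if find matches nothing, du runs with no arguments and reports the current directory instead. A guarded sketch on a throwaway tree (assuming a find with `-maxdepth`, as in the command above):

```shell
base=$(mktemp -d)                 # stand-in for the real /u[0-9] mounts
mkdir -p "$base/u01/data/sample"
printf 'x\n' > "$base/u01/data/sample/f.txt"

dirs=$(find "$base" -maxdepth 4 -type d -name sample)
if [ -n "$dirs" ]; then
    # du -c appends a grand-total line; awk keeps its size field
    total=$(du -ck $dirs | awk 'END {print $1}')
else
    total=0
fi
echo "$total"
rm -rf "$base"
```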
YZlat (Asker)

simon, directories are called

/u01
/u02
/u10
/u40
/u80
YZlat (Asker)

skullnobrains, what does this part do?

cut -d \  -f 1
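For what it's worth, `cut -d \  -f 1` splits each line on a space (the backslash escapes the space so it reaches cut as the delimiter) and keeps the first field, i.e. du's size column. A one-line illustration; note that GNU du separates size and path with a tab, in which case cut's default tab delimiter would be the one to use:

```shell
out=$(echo '12 total' | cut -d ' ' -f 1)
echo "$out"   # 12
```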
simon3270

The command becomes:
find /u01 /u02 /u10 /u40 /u80 -type f | grep "/${var}/.*\.txt$" | xargs ls -s | awk '{c+=$1};END { print c }'


If that returns the same "find" error, check that each disk can be read by running:
    ls /u01 /u02 /u10 /u40 /u80 >/dev/null
and letting us know if there are any errors reported
ASKER CERTIFIED SOLUTION
skullnobrains

simon3270

I thought you only wanted to add up the size of the *.txt files, not of the entire contents of the directories called "sample".
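Putting the two approaches together - simon3270's restriction to *.txt files and skullnobrains's per-directory walk - a space-tolerant sketch that sums only the .txt files under directories matching $var, demonstrated on a throwaway tree (the outer find would point at the real /u01 /u02 /u10 /u40 /u80 mounts):

```shell
var="sample"
base=$(mktemp -d)                       # stand-in for the real mounts
mkdir -p "$base/u01/test/samples" "$base/u04/data/sample"
printf 'a\n'   > "$base/u01/test/samples/test1.txt"
printf 'bb\n'  > "$base/u04/data/sample/sample2.txt"
printf 'ccc\n' > "$base/u04/data/sample/other.log"   # ignored: not .txt

total=0
for d in $(find "$base" -type d -name "${var}*"); do
    # sum the .txt files (in KB) under this one matching directory
    sz=$(find "$d" -type f -name '*.txt' -exec du -k {} + 2>/dev/null |
        awk '{c+=$1} END {print c+0}')
    total=$((total + sz))
done
echo "$total KB of .txt files"
rm -rf "$base"
```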