gavcol
asked on
HP-UX Large Files
I need to search through a directory and all its subdirectories and print out the top files by size.
What is the best way to print out, say, the 10 largest files in a directory and its subdirectories, or at least all files over a certain size... say 500MB?
This is on an HP-UX box, which is an annoyingly different version of Unix.
Thanks
Gav
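For reference, a sketch of one portable way to do this with only POSIX tools (no GNU `find -printf`, which HP-UX lacks); the directory argument and the count of 10 are placeholders:

```shell
#!/bin/sh
# Sketch: list the 10 largest files under a directory tree using only
# POSIX find/ls/sort. The 5th column of `ls -l` is the size in bytes;
# sort numerically on it, descending, and keep the top 10.
DIR=${1:-.}
find "$DIR" -type f -exec ls -l {} \; | sort -nr -k 5 | head -10
```

This forks one `ls` per file, so it can be slow on a big tree, but it avoids the word-splitting problems of `xargs` with filenames that contain spaces.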
ASKER
Thanks Yuzh,
It is good that I get the path to each file, but it doesn't give me the size of each file, so I am unable to determine which ones are the largest.
Cheers
Gav
find . -type f | xargs wc -c | sort -nr -t " " -k1 | grep -v "total$" | head -1
To see the file size, owner, and permissions, sorted by file size, you can do:
> 100MB:
find . -type f -size +200000 -ls | sort -k 7
>500MB:
find . -type f -size +1000000 -ls | sort -k 7
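For anyone puzzled by those numbers: in POSIX `find`, a bare number after `-size` counts 512-byte blocks, and a trailing `c` means bytes. A quick check of what the thresholds above work out to:

```shell
# Sketch: -size +200000 means 200000 512-byte blocks, i.e. roughly
# 100MB; -size +1000000 is roughly 500MB.
echo $((200000 * 512))     # 102400000 bytes
echo $((1000000 * 512))    # 512000000 bytes
# Equivalent byte-based form (the 'c' suffix is POSIX):
find . -type f -size +102400000c -print
```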
ASKER
Sorry guys,
amit_g... your example just gave me a list of files with "wc: cannot open file" at the start of every line and no file sizes.
yuzh... yours returned "find: bad option -ls".
Results don't have to be sorted. I'll > file.txt and sort later
Thanks again for your help. I hope we can sort this
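A likely cause of the "wc: cannot open file" errors, for the record: `xargs` splits its input on whitespace, so filenames containing spaces get broken into pieces. This is only an educated guess at the failure, but running `wc` per file via `-exec` passes each name intact:

```shell
# Sketch: avoid xargs word-splitting by letting find quote each
# filename itself; wc -c prints "size filename" per file.
find . -type f -exec wc -c {} \; | sort -nr | head -10
```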
ASKER
Thanks amit_g
That worked a treat... though it took ages to run as it listed ALL files, so I modified it to:
find . -type f -size +200000 -exec ls -l {} \; | sort -nr -k 5 > filesize.txt
Thanks for your help
Much appreciated
Gav
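A note on the slowness: `-exec ls -l {} \;` forks one `ls` per matching file. If the find on that box supports the POSIX `-exec ... {} +` form (very old HP-UX releases may not, in which case `xargs` is the fallback), batching many files per `ls` invocation is much faster:

```shell
# Sketch: same filter and sort as above, but "{} +" batches many
# filenames into each ls invocation instead of forking per file.
find . -type f -size +200000 -exec ls -l {} + | sort -nr -k 5 > filesize.txt
```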
ASKER
Bugger,
Sorry amit_g. After I replied to your response I accepted the last comment above my post thinking it was yours, but yuzh had just jumped in, so yuzh got the points.
Really sorry. Can this be fixed or modified so I can split the points?
Gav
Thanks gavcol and yuzh :)
E.g., for files > 100MB under the current dir:
find . -type f -size +200000 -print
For 500MB:
find . -type f -size +1000000 -print
man find
to learn more details