Sort command help

Hello,

I have a file with around 200K records. I'm trying to get the number of occurrences of each word with the following command:

tr ' ' '\n' |sort |uniq -c

This command is not sorting correctly...
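
For reference, a pipeline like this normally names its input explicitly and sorts the counts numerically at the end. A minimal sketch, assuming the input is the same file used in the second command below and that words are separated by spaces:

tr -s ' ' '\n' < file | sort | uniq -c | sort -rn

Here -s squeezes the runs of newlines produced by repeated spaces so blank lines are not counted, and the final sort -rn lists words from most to least frequent.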

I'm also using the following command to delete words that occur fewer than 100 times from the main file, but it is also not producing correct results.

perl -ne 's/(\S+)/$s{$1}/g,print,next if !@ARGV; ++$s{$_} for split; if( eof ){ $s{$_}=$s{$_}>=100&&$_ for keys %s}' file file >newfile
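
For comparison, here is a more explicit two-pass sketch of the same idea (not a drop-in replacement); it assumes words are whitespace-separated and uses the same file/newfile names as above. The first pass counts the words; the second pass prints each line keeping only words seen at least 100 times, which also collapses runs of spaces:

perl -ane 'if (!$second) { $count{$_}++ for @F; $second = eof }          # pass 1: count every word, set flag at end of first file
           else { print join(" ", grep { $count{$_} >= 100 } @F), "\n" } # pass 2: keep only frequent words
          ' file file > newfile

Note that the posted one-liner appears to replace low-count words with empty strings in place, which leaves the surrounding spaces behind; the sketch above rebuilds each line instead.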

Please help with usage.

Thanks
faithless1 asked:
tel2 commented:
Hi FL1,

tr ' ' '\n' |sort |uniq -c
This command is not sorting correctly...

1. Where are you specifying your input filename?  Please show us how you include the input filename in the (above) command.
2. How do you know it's not sorting correctly?  Please give us a sample of the output and tell us what's wrong with it.

Thanks.
TRS
 
ozo commented:
In what way does the result differ from the correct result?
 
tel2 commented:
Having just collected 500 points for asking a few questions, I think I must be getting good at this game.
A pleasure doing business, FL1