awk command

Asked by bt707 (United States of America)

I have a very large log file and need to find all lines containing a certain statement. Many of them repeat, and I need this to work both with and without unique lines.

Here is an example of one of the lines I'm looking for; the key that appears in each line is ( Connection refused )


30-Nov-2004 08:20:05.76 tcp_intranet              Q 8 uzqgvch@hamptonroads.com rfc822;511@ntserver1ktc.hampton.com 511@ntserver1ktc.hampton.com /opt/iplanet/ms5/msg-hampton/queue/tcp_intranet/005/Z60I7W00.00 <AQWIXDZTPIPXK@hongkong.com> mailsrv  TCP active open: Failed connect()    Error: Connection refused


I can find these lines with grep, but I want to find them and print only columns 5 and 6 plus the Error: Connection refused text.


Thanks,
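One way to sketch the extraction step is with awk's pattern matching. The sample file below is an invented, abbreviated stand-in for the real log line; the exact column positions depend on the actual field layout, so adjust $5 and $6 as needed:

```shell
#!/bin/sh
# Hypothetical, abbreviated sample of the log format from the question.
cat > /tmp/mta_sample.log <<'EOF'
30-Nov-2004 08:20:05.76 tcp_intranet Q 8 uzqgvch@hamptonroads.com mailsrv Error: Connection refused
EOF

# Match lines containing "Connection refused" and print columns 5 and 6
# plus the error text (no grep needed; awk can do the matching itself).
awk '/Connection refused/ {print $5, $6, "Error: Connection refused"}' /tmp/mta_sample.log
# -> 8 uzqgvch@hamptonroads.com Error: Connection refused
```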
bt707 (ASKER):

Also, is there a way to use uniq to print only one copy of each duplicate line, along with a count of how many times that line appeared in the file?


Thanks
bt707 (ASKER):


I have it mostly working using this:


awk '$18 ~ /refused/ {print $6, $7, $16, $17, $18}' log_files | uniq


However, I'm still looking for a way to print one copy of each duplicate line while also getting a count of how many times that line appeared in the file.

Any suggestions?
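The accepted answer below isn't visible here, but a standard approach for this (an assumption on my part, not necessarily what was posted) is to sort the awk output so duplicates become adjacent, then count them with uniq -c. A minimal sketch with invented sample data:

```shell
#!/bin/sh
# Invented sample lines standing in for the awk output.
cat > /tmp/refused_sample.log <<'EOF'
a@x.com host1 Error: Connection refused
a@x.com host1 Error: Connection refused
b@y.com host2 Error: Connection refused
EOF

# uniq only collapses *adjacent* duplicates, so sort first; uniq -c
# prefixes each distinct line with its count, and sort -rn puts the
# most frequent lines at the top.
awk '/refused/ {print $1, $2}' /tmp/refused_sample.log | sort | uniq -c | sort -rn
```

This prints each distinct line once, prefixed by its count (here, 2 for the repeated line and 1 for the other). Applied to the command from the question, that would be `awk '$18 ~ /refused/ {print $6, $7, $16, $17, $18}' log_files | sort | uniq -c`.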

Thanks,
ASKER CERTIFIED SOLUTION
tfewster (United Kingdom of Great Britain and Northern Ireland)

This solution is only available to Experts Exchange members.
bt707 (ASKER):

Thanks, tfewster, it worked great.

Thanks again.