bt707
asked on
awk command
I have a very large log file and need to find all lines containing a certain statement. Many of them repeat, so I need this to work both with and without unique lines.
Here is an example of one of the lines I'm looking for; the key phrase in each line is ( Connection refused ):
30-Nov-2004 08:20:05.76 tcp_intranet Q 8 uzqgvch@hamptonroads.com rfc822;511@ntserver1ktc.hampton.com 511@ntserver1ktc.hampton.com /opt/iplanet/ms5/msg-hampton/queue/tcp_intranet/005/Z60I7W00.00 <AQWIXDZTPIPXK@hongkong.com> mailsrv TCP active open: Failed connect() Error: Connection refused
I can find these lines with grep, but I want to print only columns 5 and 6 plus the "Error: Connection refused" text.
Thanks,
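This maps directly onto awk's field selection. A minimal sketch; the field numbers (5 and 6) and the sample line below are assumptions based on the example above, so adjust them to your actual log layout:

```shell
#!/bin/sh
# Build a tiny sample log so the sketch runs standalone; in practice,
# point awk at the real log file instead.
log=$(mktemp)
printf '%s\n' \
  'unrelated line with no error' \
  '30-Nov-2004 08:20:05.76 tcp_intranet Q 8 uzqgvch@hamptonroads.com Error: Connection refused' \
  > "$log"

# Match lines containing "Connection refused" and print fields 5 and 6
# plus the fixed error text. Fields are whitespace-separated by default.
awk '/Connection refused/ {print $5, $6, "Error: Connection refused"}' "$log"
# prints: 8 uzqgvch@hamptonroads.com Error: Connection refused

rm -f "$log"
```

Because awk already does the pattern match (`/Connection refused/`), no separate grep pass is needed.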
ASKER
I almost have it working using this:
awk '$18 ~ /refused/ {print $6, $7, $16, $17,$18}' log_files | uniq
However, I'm still looking for a way to print one copy of each duplicate line along with a count of how many times it appeared in the file.
Any suggestions?
Thanks,
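The accepted solution is not visible below, but the standard fix for "one copy plus a count" is to sort the awk output and pipe it through `uniq -c`: `uniq` only collapses *adjacent* duplicates, which is why the plain `| uniq` above misses non-adjacent repeats. A sketch under that assumption (the sample lines and the single-field selection are illustrative stand-ins for the asker's `log_files` and field list):

```shell
#!/bin/sh
# Generate a small sample so the sketch runs standalone.
log=$(mktemp)
printf '%s\n' \
  'host1 Error: Connection refused' \
  'host1 Error: Connection refused' \
  'host2 Error: Connection refused' \
  > "$log"

# Variant 1: sort the selected fields, then let uniq -c prepend each
# unique line with its occurrence count.
awk '/Connection refused/ {print $1}' "$log" | sort | uniq -c

# Variant 2: count inside awk with an associative array; no sort needed,
# but the END-loop output order is unspecified.
awk '/Connection refused/ {c[$1]++} END {for (l in c) print c[l], l}' "$log"

rm -f "$log"
```

Applied to the command above, variant 1 would read `awk '$18 ~ /refused/ {print $6, $7, $16, $17, $18}' log_files | sort | uniq -c`.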
ASKER CERTIFIED SOLUTION
ASKER
Thanks tfewster, that worked great.
Thanks again,
ASKER
Thanks