• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 279

awk command

I have a very large log file and need to find all lines containing a certain statement. Many of those lines repeat, so I need this to work both with and without unique lines.

Here is an example of one of the lines I'm looking for; the key string that appears in each line is ( Connection refused ):


30-Nov-2004 08:20:05.76 tcp_intranet              Q 8 uzqgvch@hamptonroads.com rfc822;511@ntserver1ktc.hampton.com 511@ntserver1ktc.hampton.com /opt/iplanet/ms5/msg-hampton/queue/tcp_intranet/005/Z60I7W00.00 <AQWIXDZTPIPXK@hongkong.com> mailsrv  TCP active open: Failed connect()    Error: Connection refused


I can find these lines with grep, but I want to print only columns 5 and 6 along with the "Error: Connection refused" text.
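A minimal sketch of the kind of command being described. The field numbers ($6, $7) are guesses based on the sample line above, and the file name mail.log is hypothetical; adjust both for the real log layout:

```shell
# Build a tiny sample log (one matching line, one non-matching line)
# purely for illustration; "mail.log" is a hypothetical filename.
cat > mail.log <<'EOF'
30-Nov-2004 08:20:05.76 tcp_intranet Q 8 sender@a.com rfc822;rcpt@b.com rcpt@b.com /q/file <id@c.com> mailsrv TCP active open: Failed connect() Error: Connection refused
01-Dec-2004 09:00:00.00 tcp_intranet Q 8 ok@a.com rfc822;ok@b.com ok@b.com /q/file2 <id2@c.com> mailsrv TCP delivered
EOF

# From lines mentioning "Connection refused", print only the assumed
# sender field ($6), recipient field ($7), and the error text.
awk '/Connection refused/ {print $6, $7, "Error: Connection refused"}' mail.log
```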


Thanks,
bt707
1 Solution
 
bt707 (Author) commented:
Also, is there a way to use uniq to print only one line for each set of duplicates, along with a count of how many times that duplicate line appeared in the file?


Thanks
 
bt707 (Author) commented:

I have it mostly working using this:


awk '$18 ~ /refused/ {print $6, $7, $16, $17,$18}' log_files | uniq


However, I'm still looking for a way to print one copy of each duplicate line along with a count of how many times that line appeared in the file.

Any suggestions?

Thanks,
 
tfewster commented:
Does this do what you want?
awk '$18 ~ /refused/ {print $6, $7, $16, $17,$18}' log_files |sort |uniq -c
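For anyone reading later: `uniq -c` only merges *adjacent* duplicates, which is why the `sort` step before it matters. A tiny standalone illustration of that behavior:

```shell
# uniq -c prefixes each line with its occurrence count, but it only
# merges adjacent duplicates -- sorting first groups identical lines
# together so every duplicate is counted in one entry.
printf '%s\n' refused ok refused refused | sort | uniq -c
```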
 
bt707 (Author) commented:
Thanks tfewster, that worked great.


Thanks again,