Solved

Shell - Command to count lines per hour (Solaris 10)

Posted on 2014-03-10
516 Views
Last Modified: 2014-03-17
Hi,

I have a log file which has the following line repeated several times during a day:

2014-03-10 00:00:35.970 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=5567
.
.
2014-03-10 00:00:36.075 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=55494
.
.


So what I need is an output of how many times this line appears within each one-hour period...

Please note that the lines share a common beginning; the rest differs. So what I need is a sort of grep "Incoming request url" | wc -l, with output like this:

00:00  10000  <---- (number of times that the "Incoming request url"  appeared from 00:01 to 00:59)
.
04:00   20000
.
.
15:00   2000
16:00   4000

...and so on until 23:00. Is this possible?

I'm on Solaris 10.

Thanks,
Joao
Question by:joaotelles
8 Comments
 
LVL 40

Expert Comment

by:omarfarid
ID: 39917893
try this

awk '{ print $2 }' filename | awk -F":" '{ print $1 }' | sort | uniq -c
 

Author Comment

by:joaotelles
ID: 39918058
I got the wrong output...

awk '{ print $2 }' filename | awk -F":" '{ print $1 }' | sort | uniq -c
   2 EventId=2696257,
   2 EventId=2696259,
   2 EventId=2696265,
   2 EventId=2696268,
   2 EventId=2696271,
   2 EventId=2696274,

I grepped for "EventId=2696257" and saw that the command matched the wrong line:

2014-03-10 01:12:05.483 [344] Flow       HttpNotificationClient         Sending notification RequestHolder [Notification name=TerminalSwitch_3G_To_2G, URL=http://mdm2a:7920/air-integration/not,  Method=POST, ContentType=text/xml, Url parameters=?ST.MSISDN=556193350627,
Body=<methodCall><methodName>UpdateSubscriberSegmentation</methodName><params><param><value><struct><member><name>originNodeType</name><value>EXT</value></member><member><name>originHostName</name><value>MDM</value></member><member><name>originTransactionID</name><value>58748988436</value></member><member><name>originTimeStamp</name><value><dateTime.iso8601>20140310T01:11:56-0300</dateTime.iso8601></value></member><member><name>subscriberNumber</name><value>556193350627</value></member><member><name>originOperatorID</name><value>MDM</value></member><member><name>serviceOfferings</name><value><array><data><value><struct><member><name>serviceOfferingID</name><value><int>26</int></value></member><member><name>serviceOfferingActiveFlag</name><value><boolean>1</boolean></value></member></struct></value><value><struct><member><name>serviceOfferingID</name><value><int>27</int></value></member><member><name>serviceOfferingActiveFlag</name><value><boolean>0</boolean></value></member></struct></value></data></array></value></member></struct></value></param></params></methodCall>
, EventId=2696257, Time=20140310T01:11:56-0300]

========================
 
LVL 68

Accepted Solution

by:
woolmilkporc earned 500 total points
ID: 39918078
I think you should take the date into account, in case the log file spans several days.

Try this:
nawk -F: '/Incoming request url/ {A[$1]+=1} END {for (n in A) print n, "->", A[n]}' log.txt | sort


If you really want to ignore the date:
nawk -F":| " '/Incoming request url/ {A[$2]+=1} END {for (n in A) print n, "->", A[n]}' log.txt | sort


Please note that using "nawk" instead of "awk" for this latter version is important under Solaris!
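For illustration (a minimal sketch, not part of the original answer; on Linux, plain awk/gawk stands in for Solaris nawk), running the date-aware version over the two sample lines from the question yields one bucket per date+hour:

```shell
# Two sample lines taken from the question's log excerpt
cat > log.txt <<'EOF'
2014-03-10 00:00:35.970 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=5567
2014-03-10 00:00:36.075 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=55494
EOF

# With FS=":", $1 is everything before the first colon, i.e. "date hour".
# (awk here; use nawk on Solaris 10)
awk -F: '/Incoming request url/ {A[$1]+=1} END {for (n in A) print n, "->", A[n]}' log.txt | sort
# -> 2014-03-10 00 -> 2
```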
 
LVL 29

Expert Comment

by:MikeOM_DBA
ID: 39918768
Perhaps this will help:
awk -F'[ :]' '/Incoming request url/ {
  dt = $1 " " $2
  if (k == dt) { c += 1 }
  else {
    if (c > 0) { print k " -> " c }
    c = 1; k = dt
  }
} END { print k " -> " c }' log.txt
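As a quick check (a sketch of my own, with plain awk standing in for Solaris nawk): this one-pass version prints a count each time the date+hour bucket changes, so it assumes the log is already in time order. Feeding it the two sample lines from the question:

```shell
# Pipe in the two sample lines from the question (already time-ordered)
printf '%s\n' \
  '2014-03-10 00:00:35.970 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=5567' \
  '2014-03-10 00:00:36.075 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=55494' |
awk -F'[ :]' '/Incoming request url/ {
  dt = $1 " " $2                      # date + hour bucket
  if (k == dt) { c += 1 }             # same bucket: keep counting
  else {
    if (c > 0) { print k " -> " c }   # bucket changed: flush previous count
    c = 1; k = dt
  }
} END { print k " -> " c }'           # flush the final bucket
# -> 2014-03-10 00 -> 2
```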

 
LVL 2

Expert Comment

by:c_kedar
ID: 39920302
I guess this simple one-liner will do, unless I am missing some point:
cut -d : -f 1,2 filename | sort | uniq -c


You can omit 'sort' if it is safe to assume that timestamps in the log file are always ascending.
 
LVL 29

Expert Comment

by:MikeOM_DBA
ID: 39920529
@c_kedar: Unfortunately, the one-liner does not count the 'number of times that the "Incoming request url" appeared'.
:p
 
LVL 2

Expert Comment

by:c_kedar
ID: 39921407
I missed that requirement. It is simple to add, though:
grep "Incoming request url"  filename | cut -d : -f 1,2 | sort | uniq -c
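One caveat worth noting (my observation, not part of the original answer): with `-d :`, fields 1 and 2 together are "date hour:minute", so this buckets per minute. Cutting only field 1 keeps "date hour" and gives the per-hour counts the question asked for. A minimal sketch on the two sample lines:

```shell
# Two sample lines taken from the question's log excerpt
cat > log.txt <<'EOF'
2014-03-10 00:00:35.970 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=5567
2014-03-10 00:00:36.075 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=55494
EOF

# Filter, keep only "date hour" (everything before the first colon), count
grep "Incoming request url" log.txt | cut -d : -f 1 | sort | uniq -c
# prints one line: the count 2, then the bucket "2014-03-10 00"
```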

 

Author Closing Comment

by:joaotelles
ID: 39933994
Thanks.

Question has a verified solution.
