
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 644

Shell - Command to count lines per hour (Solaris 10)

Hi,

I have a log file which has the following line repeated several times during a day:

2014-03-10 00:00:35.970 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=5567
.
.
2014-03-10 00:00:36.075 [80] Debug      HttpReceiverServlet            Incoming request url : http://10.129.65.23:7920/httpeventadapter/SP_ADD/httpAdapter/ST.TERMINAL_SWITCH query string ST.MSISDN=55494
.
.


So what I need is an output of how many times this line appears during each hour of the day...

Please note that the lines only share a common beginning; the rest differs. So what I need is something along the lines of grep "Incoming request url" | wc -l, but with the output broken down like this:

00:00  10000  <---- (number of times that "Incoming request url" appeared from 00:00 to 00:59)
.
04:00   20000
.
.
15:00   2000
16:00   4000

Until 23:00... is this possible to be done?

I'm on Solaris 10.

Tks,
Joao
Asked by: joaotelles
1 Solution
 
omarfaridCommented:
Try this:

awk '{ print $2 }' filename | awk -F":" '{ print $1 }' | sort | uniq -c
 
joaotellesAuthor Commented:
I got the wrong output...

awk '{ print $2 }' filename | awk -F":" '{ print $1 }' | sort | uniq -c
   2 EventId=2696257,
   2 EventId=2696259,
   2 EventId=2696265,
   2 EventId=2696268,
   2 EventId=2696271,
   2 EventId=2696274,

I grepped "EventId=2696257" and saw that the command picked up the wrong line:

2014-03-10 01:12:05.483 [344] Flow       HttpNotificationClient         Sending notification RequestHolder [Notification name=TerminalSwitch_3G_To_2G, URL=http://mdm2a:7920/air-integration/not,  Method=POST, ContentType=text/xml, Url parameters=?ST.MSISDN=556193350627,
Body=<methodCall><methodName>UpdateSubscriberSegmentation</methodName><params><param><value><struct><member><name>originNodeType</name><value>EXT</value></member><member><name>originHostName</name><value>MDM</value></member><member><name>originTransactionID</name><value>58748988436</value></member><member><name>originTimeStamp</name><value><dateTime.iso8601>20140310T01:11:56-0300</dateTime.iso8601></value></member><member><name>subscriberNumber</name><value>556193350627</value></member><member><name>originOperatorID</name><value>MDM</value></member><member><name>serviceOfferings</name><value><array><data><value><struct><member><name>serviceOfferingID</name><value><int>26</int></value></member><member><name>serviceOfferingActiveFlag</name><value><boolean>1</boolean></value></member></struct></value><value><struct><member><name>serviceOfferingID</name><value><int>27</int></value></member><member><name>serviceOfferingActiveFlag</name><value><boolean>0</boolean></value></member></struct></value></data></array></value></member></struct></value></param></params></methodCall>
, EventId=2696257, Time=20140310T01:11:56-0300]

========================
 
woolmilkporcCommented:
I think you should take the date into account, in case the log file spans several days.

Try this:
nawk -F: '/Incoming request url/ {A[$1]+=1} END {for (n in A) print n, "->", A[n]}' log.txt | sort


If you really want to ignore the date:
nawk -F":| " '/Incoming request url/ {A[$2]+=1} END {for (n in A) print n, "->", A[n]}' log.txt | sort


Please note that using "nawk" instead of "awk" for this latter version is important under Solaris!
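As a quick sanity check, here is the date-aware version run against a few hypothetical sample lines (URLs shortened for readability; plain `awk` is used here since it behaves the same as Solaris `nawk` for this program):

```shell
# Hypothetical shortened stand-ins for the real log lines
cat > /tmp/ee_sample.log <<'EOF'
2014-03-10 00:00:35.970 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=5567
2014-03-10 00:00:36.075 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=55494
2014-03-10 01:12:05.483 [344] Flow HttpNotificationClient Sending notification RequestHolder
2014-03-10 04:10:00.000 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=1
EOF

# With -F: the first field is "YYYY-MM-DD HH", i.e. date plus hour,
# so the array A accumulates one counter per date+hour
awk -F: '/Incoming request url/ {A[$1]+=1} END {for (n in A) print n, "->", A[n]}' /tmp/ee_sample.log | sort
# -> 2014-03-10 00 -> 2
# -> 2014-03-10 04 -> 1
```

Note that the "Sending notification" line is skipped by the `/Incoming request url/` filter, which is exactly what went wrong with the earlier unfiltered attempt.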
 
MikeOM_DBACommented:
Perhaps this will help:
awk -F'[ :]' '/Incoming request url/{
  dt=$1" "$2; 
  if(k==dt){c+=1}
  else { if(c>0){print k" -> "c;}
  c=1;k=dt}
} END {print k" -> "c}' log.txt
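This variant makes a single pass and prints a subtotal each time the date+hour key changes, so it needs no `sort` or `uniq` -- but it does assume the log is in chronological order. A quick check with hypothetical sample lines (URLs shortened):

```shell
# Hypothetical shortened sample: two requests at hour 00, one at hour 04
cat > /tmp/ee_sample2.log <<'EOF'
2014-03-10 00:00:35.970 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=5567
2014-03-10 00:00:36.075 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=55494
2014-03-10 04:10:00.000 [80] Debug HttpReceiverServlet Incoming request url : http://host/a query string ST.MSISDN=1
EOF

# -F'[ :]' splits on both spaces and colons, so $1=date and $2=hour;
# a subtotal is flushed whenever the date+hour key k changes
awk -F'[ :]' '/Incoming request url/{
  dt=$1" "$2
  if(k==dt){c+=1}
  else { if(c>0){print k" -> "c}
  c=1;k=dt}
} END {print k" -> "c}' /tmp/ee_sample2.log
# -> 2014-03-10 00 -> 2
# -> 2014-03-10 04 -> 1
```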


 
c_kedarCommented:
I guess this simple one-liner will do, unless I am missing some point (field 1 of a colon-split line is "date hour", which matches the per-hour requirement):
cut -d : -f 1 filename | sort | uniq -c


You can omit 'sort' if it is safe to assume that the timestamps in the log file are always ascending.
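The reason `sort` is normally needed: `uniq -c` only collapses adjacent duplicate lines, so an hour that reappeared later in an unsorted stream would be counted twice. A minimal illustration:

```shell
# Without sort, the two non-adjacent "a" lines stay separate
printf '%s\n' a b a | uniq -c          # three output lines: 1 a, 1 b, 1 a
# With sort, duplicates become adjacent and are counted together
printf '%s\n' a b a | sort | uniq -c   # two output lines: 2 a, 1 b
```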
 
MikeOM_DBACommented:
@c_kedar: Unfortunately, the one-liner does not count the 'number of times that the "Incoming request url" appeared'.
:p
 
c_kedarCommented:
I missed that requirement. It is simple to add, though:
grep "Incoming request url" filename | cut -d : -f 1 | sort | uniq -c
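If the output should look exactly like the `HH:MM count` format in the question, one more awk stage can reshape the `uniq -c` output (a sketch; after `uniq -c` the fields are count, date, hour, and `filename` stands for the log file from the question):

```shell
# Count "Incoming request url" lines per hour, then print "HH:00 count"
grep "Incoming request url" filename | cut -d : -f 1 | sort | uniq -c |
  awk '{printf "%s:00 %s\n", $3, $1}'
# prints lines like "00:00 10000"
```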


 
joaotellesAuthor Commented:
Tks.