Solved

shell script to aggregate counts

Posted on 2011-03-06
Last Modified: 2012-05-11
I have a space-separated file that contains three fields: 1) the hit count for that date and hit type, 2) the date, and 3) the hit type.

So it looks something like this:

     83 2011-03-04 0
      9 2011-03-04 -1
    630 2011-03-04 10
      6 2011-03-04 4
    250 2011-03-04 10

As you can see, some date and type combinations (in this case, 2011-03-04 10) appear on more than one line.

In that case, I want the hit counts for that date and type to be added.

So the resulting file would say:

     83 2011-03-04 0
      9 2011-03-04 -1
    880 2011-03-04 10
      6 2011-03-04 4

How can I do this in a shell script?  (Using perl or awk is great).
Question by:aturetsky
 

Assisted Solution

by:sjklein42 (earned 300 total points)
This should do it.  It does not preserve the same order as the input.  If that is a requirement it can be done differently.

# Accumulate counts in a hash keyed by "date type".
while ( <> )
{
	s/[\r\n]//g;			# strip CR/LF line endings
	s/^\s+//;			# trim leading whitespace
	($n, $d, $t) = split(/\s+/);	# count, date, type
	$n{"$d $t"} += $n;		# sum per date/type key
}

# Print the totals, sorted by the "date type" key.
foreach $k (sort(keys(%n)))
{
	printf("%7d %s\n", $n{$k}, $k);
}

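For example, if the script is saved as aggregate.pl (a file name picked here only for illustration) and run over the sample data above, the result comes out sorted by the "date type" key rather than in input order:

$ perl aggregate.pl hits.txt
      9 2011-03-04 -1
     83 2011-03-04 0
    880 2011-03-04 10
      6 2011-03-04 4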
 

Author Comment

by:aturetsky
I put your script into aggregate.sh and ran it like this, but for some reason I get an error:

$ cat hits.txt | ./aggregate.sh
./aggregate.sh: line 1: syntax error near unexpected token `)'
./aggregate.sh: line 1: `while ( <> )'

What am I doing wrong?
 

Expert Comment

by:sjklein42
With no interpreter ("#!") line in the script, you need to invoke perl explicitly. This should work:

cat hits.txt | perl aggregate.sh



If you add an interpreter line to the perl script, you can run it your way. Something like this as the first line of the script should do it:

#!/usr/bin/perl 

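For the "./aggregate.sh" form from the earlier comment to work, the script also needs to be executable. A small sketch combining the two steps, reusing the file names already mentioned in this thread:

chmod +x aggregate.sh           # make the script executable
cat hits.txt | ./aggregate.sh   # the kernel now runs it via /usr/bin/perl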
 

Accepted Solution

by:Murugesan Nagarajan (earned 200 total points)
Using awk
#!/bin/ksh
awk 'BEGIN {
                RowCnt=0;
        }
        {
                # Store each row: field 1 is the count, field 2 the date, field 3 the type.
                first[RowCnt] = $1;
                second[RowCnt] = $2;
                third[RowCnt] = $3;
                RowCnt++;
        }
        END {
                # For every pair of rows with matching date and type,
                # add the later count into the earlier row and blank the later row.
                for( i=0; i<RowCnt; i++)
                {
                        for( j=0; j<RowCnt; j++)
                        {
                                if ( i != j )
                                {
                                        if ( ( second[i] == second[j] ) && ( third[i] == third[j] ) )
                                        {
                                                first[i] += first[j] ;
                                                second[j] = "" ;
                                                third[j] = "" ;
                                                first[j] = "" ;
                                        }
                                }
                        }
                }
                # Print only the rows that were not blanked out.
                for( i=0; i<RowCnt; i++)
                {
                        if ( ( "" != first[i] ) && ( "" != second[i] ) && ( "" != third[i] ) )
                        {
                                printf("%-8s%11s%13s\n", first[i], second[i], third[i]);
                        }
                }
        }' input.txt

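For comparison only (not part of the accepted answer): the same aggregation can be done in a single awk pass with an array keyed on the date and type. Note that the for-in loop prints the groups in no particular order:

awk '{ sum[$2 " " $3] += $1 }
     END { for (k in sum) printf("%7d %s\n", sum[k], k) }' input.txt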


As proposed by sjklein42, with error handling added:
#!/usr/bin/perl
sub aggregateCounts
{
        # Require a file name argument.
        if ( "$#ARGV" eq "-1" )
        {
                print "Usage:\n" ;
                print "\t$0\tExistingFileName\n";
                exit 1;
        }
        else
        {
                if ( -e $ARGV[0] )
                {
                        # Accumulate counts in a hash keyed by "date type".
                        while ( <> )
                        {
                                s/[\r\n]//g;
                                s/^\s+//;
                                ($n, $d, $t) = split(/\s+/);
                                $n{"$d $t"} += $n;
                        }
                        # Print the totals, sorted by key.
                        foreach $k (sort(keys(%n)))
                        {
                                print sprintf("%7d", $n{$k}) . " " . $k . "\n";
                        }
                }
                else
                {
                        print "File Not Found:  $ARGV[0]\n" ;
                }
        }
        exit 0;
}
aggregateCounts();

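A quick check of the added argument handling, assuming the script is saved as aggregate.pl (a hypothetical name) and that no-such-file.txt does not exist:

$ perl aggregate.pl
Usage:
	aggregate.pl	ExistingFileName

$ perl aggregate.pl no-such-file.txt
File Not Found:  no-such-file.txt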


Both example programs (one in awk, one in perl) sum the first column across rows whose second and third columns match.
 

Author Closing Comment

by:aturetsky
Thanks to both of you!!!