• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 208

Reading log file in Perl

Hi,
Can anyone please help me with a code snippet for the issue below?

I need to read the attached "Test_log" file on a daily basis, filter it by date, and write the output shown below to a log file. Next, log in to the database, run a SELECT statement, and write the query results to a second log file.
Finally, send an email with both log files attached and the total count of results in the log files in the subject line.

[b]Output for the attached log file should be:[/b]
HEFFILE0004/02720220R
HEFFILE0004/60530610R
HEFFILE0004/60530700R
HEFFILE0004/60531070R


Attachment: Test-log.txt
jeromeeCommented:
You would need to be more specific, in particular about the rules governing what to generate in the log file, and more importantly about "the login to Db and run a select query statement and generate a log file with the query results". This is so vague that it makes my head spin.
 
new_perl_userAuthor Commented:
I am sorry.

1) Below is the file I need to read daily. I need to grab yesterday's data from that log file based on the date, and then write the output to daily.txt.

Test_log sample data:

##################### SYNC  STARTED: Mon Sep 26 03:10:05 EDT 2011 #####################
receiving file list ... done
.d..t.... HEFFILE0004/02720220R/METADATA/
.d..t.... HEFFILE0004/60530610R/METADATA/
.d..t.... HEFFILE0004/60530700R/METADATA/
.d..t.... HEFFILE0004/60531070R/METADATA/
.d..t.... HEFFILE0004/60910750R/METADATA/
##################### SYNC FINISHED: Mon Sep 26 05:15:39 EDT 2011  #####################



Output format for daily.txt:
HEFFILE0004/02720220R
HEFFILE0004/60530610R
HEFFILE0004/60530700R
HEFFILE0004/60531070R


2) After that, log in to the Oracle database, run the query below, and write the output to Db.txt:
Select a,b from table1 where date =to_date(SYSDATE, 'DD.MM.YY')-1;


3) Finally, send an email with both logs attached (daily.txt, Db.txt). If possible, can we get the count of records in each log file as the subject line of the email, e.g. (daily.txt = 10 files, db.txt = 20 files)?

If I am still not clear, please let me know. Thank you for the help.
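For #2, here is a minimal sketch of one way to do it with Perl's DBI module and DBD::Oracle, assuming both are installed. The table and column names (table1, a, b, date) come from the question; the DSN, username, and password are placeholders. Note that SYSDATE is already a DATE, so to_date(SYSDATE, ...) forces an implicit round-trip through a string; TRUNC(SYSDATE)-1 is the usual way to say "yesterday at midnight".

```perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- replace with your own DSN/credentials.
my $dbh = DBI->connect( 'dbi:Oracle:MYDB', 'username', 'password',
                        { RaiseError => 1, AutoCommit => 0 } );

# "date" is the column name from the question; it is a reserved word in
# Oracle, so it may need quoting or renaming in a real schema.
my $sth = $dbh->prepare(
    q{SELECT a, b FROM table1 WHERE trunc("date") = trunc(SYSDATE) - 1} );
$sth->execute();

open my $out, '>', 'Db.txt' or die "Cannot open Db.txt: $!";
my $count = 0;
while ( my ( $col_a, $col_b ) = $sth->fetchrow_array ) {
    print {$out} "$col_a\t$col_b\n";    # one tab-separated row per line
    $count++;
}
close $out;
$dbh->disconnect;

print "Db.txt = $count rows\n";
```

This also keeps a running row count, which is handy later for the email subject line in #3.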
 
new_perl_userAuthor Commented:
Can anyone please help me with this?
 
new_perl_userAuthor Commented:
Any suggestions would be very helpful.

Thanks,
 
new_perl_userAuthor Commented:
Hi Jeromee,
If possible, can you please help me with #1? I am able to do #2 and will try out #3.

Thanks,
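For #3, one common approach is the MIME::Lite module (assuming it is available on your system); the From/To addresses below are placeholders. This sketch counts the lines in each log (one record per line) and puts both counts in the subject, in the format the question asked for:

```perl
use strict;
use warnings;
use MIME::Lite;

# Count the lines in a file (one record per line).
sub line_count {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $n = 0;
    $n++ while <$fh>;
    close $fh;
    return $n;
}

my $daily = line_count('daily.txt');
my $db    = line_count('Db.txt');

my $msg = MIME::Lite->new(
    From    => 'script@example.com',     # placeholder
    To      => 'you@example.com',        # placeholder
    Subject => "daily.txt = $daily files, Db.txt = $db files",
    Type    => 'multipart/mixed',
);
$msg->attach( Type => 'text/plain', Path => 'daily.txt' );
$msg->attach( Type => 'text/plain', Path => 'Db.txt' );
$msg->send;    # uses the local sendmail by default
```

By default MIME::Lite sends via the local sendmail; pass send('smtp', 'your.mailhost') after new() if you need an SMTP relay instead.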
 
jeromeeCommented:
Here's #1 only...

Run it as:
   perl thisscript.pl Test.log.txt

Please note that I made some assumptions about what the filename must look like (HEFFILEnnnn/yyyyyy/):
use strict;
use warnings;

my $yesterday = scalar localtime(time()-24*60*60);   # e.g. "Thu Sep 29 14:59:20 2011"
substr($yesterday, 11, 9) = "";                      # remove the time -> "Thu Sep 29 2011"
my $date = '';
while( <> ) {
   if( /SYNC  STARTED: (.*) \d\d:\d\d:\d\d .* (\d{4})/ ) {   # extract the date of each section
      $date = "$1 $2";
   }

   if( $yesterday eq $date && m{(HEFFILE\d+/[^/]+)} ) {      # same date as yesterday and the right filename signature
      print "$1\n";
   }
}


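One caveat on the "yesterday" computation in the script above: subtracting 24*60*60 seconds from time() can land on the wrong calendar day for about an hour around DST transitions, so it is worth double-checking if the job runs close to midnight. The string surgery itself looks like this in isolation (the exact date depends on when you run it, so only the "Dow Mon DD YYYY" shape is fixed):

```perl
use strict;
use warnings;

# localtime in scalar context returns a ctime-style string, e.g.
# "Thu Sep 29 14:59:20 2011"; splicing 9 chars out at offset 11
# drops the "HH:MM:SS " part, leaving "Thu Sep 29 2011".
my $yesterday = scalar localtime( time() - 24*60*60 );
substr( $yesterday, 11, 9 ) = "";
print "$yesterday\n";
```

Note that ctime space-pads single-digit days ("Sep  8"), which is why the offsets are fixed.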
 
new_perl_userAuthor Commented:
Hi,

Thank you so much for the response.
 
I tried the above code, but it is not printing anything. Can you please take a look? Below is the sample Test.log.txt I am trying to read.

##################### SYNC  STARTED: Thu Sep 29 03:10:05 EDT 2011 #####################
receiving file list ... done
.d..t.... HEFFILE0004/02720220R/METADATA/
.d..t.... HEFFILE0004/60530610R/METADATA/
.d..t.... HEFFILE0004/60530700R/METADATA/
.d..t.... HEFFILE0004/60531070R/METADATA/
.d..t.... HEFFILE0004/60910750R/METADATA/
##################### SYNC FINISHED: Thu Sep 29 05:15:39 EDT 2011 #####################


 
jeromeeCommented:
It works for me:

HEFFILE0004/02720220R
HEFFILE0004/60530610R
HEFFILE0004/60530700R
HEFFILE0004/60531070R
HEFFILE0004/60910750R

How do you run the script?
 
new_perl_userAuthor Commented:
My bad, it is working now; I was executing it the wrong way. One small change, if possible: right now it prints all the files for the date, even when there are repeated filenames like

HEFFILE0004/02720220R
HEFFILE0004/02720220R
HEFFILE0004/60530610R
HEFFILE0004/60530700R

So is it possible to get just one entry for repeated data and write the output to a log file?

Thanks
 
jeromeeCommented:

use strict;
use warnings;

my $yesterday = scalar localtime(time()-24*60*60);   # e.g. "Thu Sep 29 14:59:20 2011"
substr($yesterday, 11, 9) = "";                      # remove the time -> "Thu Sep 29 2011"
my $date = '';
my %output;                                          # collect unique matches
while( <> ) {
   if( /SYNC  STARTED: (.*) \d\d:\d\d:\d\d .* (\d{4})/ ) {   # extract the date of each section
      $date = "$1 $2";
   }

   if( $yesterday eq $date && m{(HEFFILE\d+/[^/]+)} ) {      # same date as yesterday and the right filename signature
      $output{"$1\n"}++;                             # hash keys deduplicate repeats
   }
}
print join("", sort keys %output);


 
new_perl_userAuthor Commented:
Can we write the output to a log file?
 
jeromeeCommented:
perl thisscript.pl  Test.log.txt > output.txt
 
new_perl_userAuthor Commented:
Thanks a ton for the help.
0
 
jeromeeCommented:
No problem.
Happy Perling!
 
new_perl_userAuthor Commented:
Hi Jeromee,

I was running the script again today to grab data from yesterday (Oct 2), but it is not working. Any ideas or suggestions on why?
 
new_perl_userAuthor Commented:
Never mind, it is working; there were extra spaces near the date, so the match failed. I removed those and it is fine.

Thanks,
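For anyone hitting the same failure later: the match in the script is sensitive to the exact spacing around "SYNC  STARTED". A whitespace-tolerant variant using \s+ instead of literal spaces accepts either spacing; the sample lines below are for illustration only:

```perl
use strict;
use warnings;

# Whitespace-tolerant version of the SYNC STARTED date match.
my $re = qr/SYNC\s+STARTED:\s+(.*?)\s+\d\d:\d\d:\d\d\s+.*?(\d{4})/;

for my $line (
    '##################### SYNC  STARTED: Thu Sep 29 03:10:05 EDT 2011 #####################',
    '#####################  SYNC STARTED:  Thu Sep 29  03:10:05 EDT 2011  #####################',
) {
    if ( $line =~ $re ) {
        print "$1 $2\n";    # prints "Thu Sep 29 2011" for both spacings
    }
}
```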