Analyse a file without starting from the beginning each time

I need a command (tail, or a shell script) to analyse a large log file (100 MB).

I check this file every 5 minutes, and I don't want to re-read it from the beginning each time, but to start at the last line analysed before.

Can anyone help?
eeolivierAsked:

eeolivierAuthor Commented:
Let me add that I can only read the log file, and that its content grows by about 50-100 lines per second!
blklineCommented:
One option is to use the tail command like this:

tail -f /your/log/file

From the man page:
      -f, --follow[={name|descriptor}]
              output appended data as the file grows; -f, --follow, and
              --follow=descriptor are equivalent
eeolivierAuthor Commented:
No, that wasn't my question.

I need a command that returns, each time I call it, the lines added since my last check.

It's for use in a log analyser script.

TintinCommented:
Any chance of changing the method your log analyser script uses?  Running tail -f is the easiest way if you are flexible in how you input/analyse the data.
blklineCommented:
Well, you could do something like this:

tail -f /your/log/file | \
while read -r xl
do
   # code to determine outfile name here.
   echo "$xl" >> "$outfile"
done

Add some code in the while loop that alters the name of the outfile every so often, say after so many lines or so much time.
That way, your log file is being split up into separate files that you can analyze at your leisure.
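A minimal, runnable sketch of this rotation idea (the `chunk.N` file names and the chunk size are made up for illustration, and sample input stands in for `tail -f /your/log/file`):

```shell
#!/bin/sh
# Sketch of blkline's suggestion: split incoming lines into
# numbered files so each chunk can be analysed separately.
# The printf below stands in for `tail -f /your/log/file`.
CHUNK=3                 # hypothetical: rotate every 3 lines
count=0
part=0
outfile="chunk.$part"
printf 'line %d\n' 1 2 3 4 5 6 7 |
while read -r xl
do
    echo "$xl" >> "$outfile"
    count=$((count + 1))
    if [ "$count" -ge "$CHUNK" ]; then
        count=0
        part=$((part + 1))
        outfile="chunk.$part"   # start a new output file
    fi
done
```

With `tail -f` as the real input the loop never ends, so the analysis of the finished `chunk.*` files would have to happen in a separate process.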
sunnycoderCommented:
Assume that tempfile contains the last line number in the logfile that you read. You will have to write the first entry yourself.

first=`cat tempfile`
second=`wc -l logfile`

d=`expr $second - $first`

tail -n $d logfile

echo $second > tempfile

I have not tried the script but it should be pretty close to what you are looking for.
eeolivierAuthor Commented:
It doesn't work, sunnycoder!

I get this in my tempfile: 3 /usr/logger/stat/SCVRU.all.041207.csv

(instead of only 3), and so an error:

expr: syntax error
tail: /usr/logger/stat/SCVRU.all.041207.csv: invalid number of lines

It's the wc -l that returns: 3 /usr/logger/stat/SCVRU.all.041207.csv
TintinCommented:
*If* you wanted to use sunnycoder's script, you'd need to correct:

second=`wc -l <logfile`


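Putting sunnycoder's script and Tintin's fix together, here is a sketch of the whole incremental check (the demo file names are made up for illustration; `$(( ))` arithmetic replaces `expr` so any whitespace `wc -l` prints is harmless):

```shell
#!/bin/sh
# Incremental log reading: tempfile stores how many lines of the
# log were already seen; each run emits only the new lines.
logfile=demo.log          # hypothetical paths for illustration
tempfile=demo.count

printf 'a\nb\nc\n' > "$logfile"   # stand-in for the real log
echo 0 > "$tempfile"              # first entry, written by hand

first=$(cat "$tempfile")
second=$(wc -l < "$logfile")      # Tintin's fix: no filename in the output

d=$((second - first))             # lines added since the last check

tail -n "$d" "$logfile" > new.lines

echo "$second" > "$tempfile"      # remember where we got to
```

Note that this assumes the log only grows; if the file is rotated or truncated, the stored count has to be reset by hand.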
moduloCommented:
PAQed with no points refunded (of 350)

modulo
Community Support Moderator
