bndit
asked on
Effective way of reading txt file (only new content since last read) using Powershell and TaskScheduler
Hello,
I'm trying to set up a scheduled task that will read the content of a text file, look for a pattern (error messages), and then send an alert. The scheduled task will run every x minutes and will read the same file. However, the script has to "remember" where it left off on the last read, in order to 1) avoid false positives (errors already reported) and 2) save time by reading only new content.
Hoping that someone out there has already overcome this challenge and can point me in the right direction.
I found this piece of code (http://www.archivum.info/microsoft.public.windows.powershell/2008-04/00134/RE-Read-a-huge-text-file-from-bottom-up.html) that uses the file size to avoid processing the entire file on every iteration, but I'm not sure whether size is a good option here.
Your input is appreciated. Thanks.
# Read only the last $BytesToRead bytes of the file instead of the whole thing.
$TextFilePath = "C:\myfile.log"
$BytesToRead  = 1024

$fs = [System.IO.File]::OpenRead($TextFilePath)
# Guard against files smaller than $BytesToRead: a negative Position would throw.
$fs.Position = [Math]::Max(0, $fs.Length - $BytesToRead)
$sr = New-Object System.IO.StreamReader($fs)
$text = $sr.ReadToEnd()
$sr.Close()
$fs.Close()
$text
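For the "remember where it left off" requirement, one common approach (a sketch, not from the snippet above; the function name and state-file convention are illustrative) is to persist the last-read byte offset in a small state file between scheduled runs, so each run reads only what was appended since the previous one:

```powershell
# Sketch: read only content appended since the last run by persisting
# the last-read byte offset in a state file. Paths are placeholders.
$tmp        = [System.IO.Path]::GetTempPath()
$LogPath    = Join-Path $tmp "myfile.log"
$OffsetPath = Join-Path $tmp "myfile.offset"

function Read-NewContent {
    param([string]$Log, [string]$State)

    # Load the offset saved by the previous run (0 on the first run).
    $offset = 0
    if (Test-Path $State) { $offset = [long](Get-Content $State) }

    $fs = [System.IO.File]::OpenRead($Log)
    try {
        # If the log was rotated or truncated, start from the beginning.
        if ($offset -gt $fs.Length) { $offset = 0 }
        $fs.Position = $offset
        $sr   = New-Object System.IO.StreamReader($fs)
        $text = $sr.ReadToEnd()
        # Remember where we stopped for the next scheduled run.
        Set-Content $State $fs.Position
        return $text
    }
    finally { $fs.Close() }
}
```

Each scheduled run would call `Read-NewContent` and pattern-match only the returned text (e.g. `Read-NewContent -Log $LogPath -State $OffsetPath | Select-String "ERROR"`), so previously reported errors are never re-scanned.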
ASKER CERTIFIED SOLUTION
One approach would be to log the processing of each file and record its size at that time. Processing that log as a text file would be complex, but if you create the log entries as objects (a set of functions that I use for this is here: http://powershell.com/cs/media/p/10676.aspx ), you can retrieve the log, filter by a log-entry attribute such as the source, and select the latest entry. If the message was the file size, you could then use this value.
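A minimal sketch of that idea, using `Export-Clixml`/`Import-Clixml` in place of the logging functions linked above (the paths and sample content are placeholders):

```powershell
# Sketch: record an object with the file's size each time it is processed,
# then use that size as the resume position on the next run. Export-Clixml
# stands in for the linked logging functions; paths are placeholders.
$tmp       = [System.IO.Path]::GetTempPath()
$LogPath   = Join-Path $tmp "app.log"
$StatePath = Join-Path $tmp "app.log.state.xml"
Set-Content $LogPath "sample log content"   # stand-in for the real log

# After processing the file, save a log-entry object with source and size.
[PSCustomObject]@{
    Source    = $LogPath
    Size      = (Get-Item $LogPath).Length
    Processed = Get-Date
} | Export-Clixml $StatePath

# On the next run, retrieve the latest entry and use its Size as the
# position to resume reading from.
$last  = Import-Clixml $StatePath
$start = $last.Size
```

With a real log store you would filter the retrieved entries by `Source` and pick the most recent `Processed` timestamp; here there is only one entry, so it is read back directly.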
ASKER
Thanks for the link and the help.
ASKER
This answer pointed in the right direction to accomplish my goal.
ASKER
# logs.txt contains file names (log1.txt, log2.txt, etc.)
$logfiles = Get-Content c:\logs.txt
$logfiles | ForEach-Object {
    # Question here is: how do I grab the corresponding bytecount text file
    # for log1? I thought about a loop, but I don't want to process all of
    # these bytecount text files; I only need the corresponding file.
}