rutgermons asked:

PowerShell: searching multiple log files

Folks,

The command below extracts all errors from one log file:

Get-Content TEST.log -Wait | Where { $_ -match "ERROR" } | Out-File logger.txt -Append

How can I get it to extract ERRORs from multiple log files on multiple servers?

Any help is appreciated.
Tags: PowerShell, Windows Batch

Qlemo

This will only work with jobs, as you need to run several commands in parallel and combine their output. Sadly, Get-Content -Wait ignores multiple file names when they are provided.
You can decide whether each file's matches are appended individually by its own job, or collected centrally and then appended.
$jobs = @()
$jobs += Start-Job { Get-Content c:\temp\ee\test1.log -Wait | Select-String "Error" | Select -Expand Line }
$jobs += Start-Job { Get-Content c:\temp\ee\test2.log -Wait | Select-String "Error" | Select -Expand Line }
# and so on

# Poll each job's output buffer and append whatever has arrived since the last pass
while ($true)
{
  Receive-Job $jobs | Out-File -Append logger.txt
  Start-Sleep -Milliseconds 100
}


But that is inefficient, as we now need a polling loop. Letting each job do the writing itself seems better, but you run the risk of multiple processes writing to the same file at the same time, resulting in garbled content; a sketch of one way to serialize those writes follows the snippet below. (The following code also uses a loop to go through the paths.)
foreach ($name in "test1", "test2")
{
  # Pass the file name into the job; the job's scriptblock cannot see the loop variable directly
  Start-Job -ArgumentList $name { Get-Content "c:\temp\ee\$($args[0]).log" -Wait | Select-String "Error" | Select -Expand Line | Out-File -Append C:\temp\ee\logger.txt }
}
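A minimal sketch of one way to avoid the garbling risk, assuming all jobs run on the same machine: serialize the appends with a named mutex, which is visible across processes (the mutex name Global\EELoggerMutex is arbitrary, chosen here for illustration):
foreach ($name in "test1", "test2")
{
  Start-Job -ArgumentList $name {
    # A named mutex is shared between processes on the same machine,
    # so only one job can hold it (and write to the log) at a time
    $mutex = New-Object System.Threading.Mutex($false, "Global\EELoggerMutex")
    Get-Content "c:\temp\ee\$($args[0]).log" -Wait | Select-String "Error" | Select -Expand Line | ForEach-Object {
      [void]$mutex.WaitOne()   # block until no other job is writing
      try     { $_ | Out-File -Append C:\temp\ee\logger.txt }
      finally { $mutex.ReleaseMutex() }
    }
  }
}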


RobSampson

Hi guys, I was writing this up the way I normally would have done it, until I realised there was a flaw in what I had, and why Qlemo said it needs to be a job.

I had this:
# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log file
$LogFile = "C:\Temp\Scripts\Logger.txt"

$Servers | ForEach {
    If (Test-Connection -ComputerName $_ -Count 1 -Quiet) {
        $UNCBase = "\\$_\"
        $SearchFiles | ForEach {
            $RemoteFilePath = $UNCBase + $_.Replace(':', '$')
            If (Test-Path -Path $RemoteFilePath) {
                Get-Content $RemoteFilePath | Where { $_ -match "ERROR" } | Out-File $LogFile -Append
            }
            Else {
                Write-Output "$RemoteFilePath cannot be found"
            }
        }
    }
    Else {
        Write-Output "$_ is offline"
    }
}
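As an aside, the Replace(':', '$') step converts a local drive path into an administrative-share UNC path; a quick illustration with a hypothetical server name and path:
$Server = "SERVER01"            # hypothetical server name
$LocalPath = "D:\Logs\app.log"  # hypothetical local path on that server
# D:\Logs\app.log  ->  \\SERVER01\D$\Logs\app.log
$UNCPath = "\\$Server\" + $LocalPath.Replace(':', '$')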



But the -Wait parameter appears to make the script stop and wait for new content to be added to the file, which obviously makes it a synchronous script that never gets to the next file.

From this, I take it your original goal was to monitor the files constantly for new content that matches ERROR? Since that's the case, go with the jobs, but maybe, for the sake of avoiding a mess of output, it would be better to write each host/file combination to a separate log file, such as C:\Logs\<Computer>_<Filename>_Errors.txt (see the sketch below).
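For illustration, a minimal sketch of that naming scheme with hypothetical values (the braces keep the underscores from being parsed as part of the variable names):
$Computer = "SERVER01"                                                      # hypothetical
$Filename = [System.IO.Path]::GetFileNameWithoutExtension("C:\App\TEST.log") # -> TEST
$ErrorLog = "C:\Logs\${Computer}_${Filename}_Errors.txt"                    # C:\Logs\SERVER01_TEST_Errors.txt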

Rob.
rutgermons

ASKER
Rob,

I was thinking the same as you suggest at the end; that would work.
RobSampson

Qlemo, maybe you can help me out here. I have set up this script that executes the remote job and is able to write output to the correct file. The problem is with testing the -Wait parameter. I ran the script and verified the jobs were running, then I added a line of text, including "Error", to the files being monitored with the -Wait parameter, but the change didn't get logged. Am I misunderstanding what it is supposed to do?

Rob.

# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log share that the remote jobs will have access to
$LogShare = "\\SERVER\PublicShare\Logs"

$RemoteFunction = {
    Function CheckFile {
        Param (
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$FilePath,
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$LogPath
        )
        #Write-Output "Reading $FilePath and logging to $LogPath" | Out-File -Append $LogPath
        If (Test-Path -Path $FilePath) {
            Get-Content $FilePath -Wait | Select-String "Error" | Select -Expand Line | Out-File -Append $LogPath
        }
    }
}

ForEach ($Server In $Servers) {
    ForEach ($FileSpec In $SearchFiles) {
        # Convert the local path into an administrative-share UNC path, as in the earlier script
        $RemoteFilePath = "\\$Server\" + $FileSpec.Replace(':', '$')
        $LogFile = $LogShare + "\" + $Server + "_" + ($FileSpec.Split("\") | Select -Last 1) + "_ERRORS.txt"
        Write-Output $RemoteFilePath
        Write-Output $LogFile
        # -InitializationScript defines CheckFile inside the job's session
        Start-Job -ScriptBlock {CheckFile $args[0] $args[1]} -ArgumentList @($RemoteFilePath, $LogFile) -InitializationScript $RemoteFunction
    }
}
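For a quick look at whether the jobs are producing anything at all, their output buffers can be inspected without consuming them (a generic debugging step, assuming it is run from the same session that started the jobs):
Get-Job                      # each job's State should be Running; HasMoreData shows if output is waiting
Get-Job | Receive-Job -Keep  # display buffered output without removing it from the buffer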


Qlemo

I cannot see any flaw, but it seems overly complicated for such a simple purpose.
The prerequisite for Get-Content -Wait to work is that the monitored file is closed after each change; otherwise you won't see any changes until the file is closed.
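A quick way to verify that behaviour (hypothetical paths): Add-Content opens, appends, and closes the file on every call, so lines added this way should appear in the -Wait pipeline within about a second.
# Console 1: monitor the file
Get-Content C:\temp\ee\test1.log -Wait | Select-String "Error"

# Console 2: append a matching line; Add-Content closes the file after writing
Add-Content C:\temp\ee\test1.log "2022-08-22 10:00:00 ERROR something failed"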
ASKER CERTIFIED SOLUTION
RobSampson
