PowerShell: searching multiple logs

Folks,

The command below extracts all errors from one log file:

Get-Content TEST.log -Wait | Where-Object { $_ -match "ERROR" } | Out-File logger.txt -Append

How can I get it to extract ERRORs from multiple log files across multiple servers?

Any help is appreciated.
rutgermons asked:
Qlemo ("Batchelor", Developer and EE Topic Advisor) commented:
This will only work with jobs, since you need to run several commands in parallel and combine their output. Sadly, Get-Content -Wait ignores multiple file names.
You can decide whether each job appends to the log file individually, or whether the output is collected centrally and then appended.
$jobs = @()
$jobs += Start-Job { Get-Content C:\temp\ee\test1.log -Wait | Select-String "Error" | Select-Object -ExpandProperty Line }
$jobs += Start-Job { Get-Content C:\temp\ee\test2.log -Wait | Select-String "Error" | Select-Object -ExpandProperty Line }
# and so on

while ($true)
{
  Receive-Job $jobs | Out-File -Append logger.txt
  Start-Sleep -Milliseconds 100
}


But that is inefficient, because we now need a polling loop. Letting each job do the writing itself seems better, but you run the risk of multiple processes writing to the same file at the same time, resulting in garbled content. (The following code also uses a loop to go through the paths.)
foreach ($name in "test1", "test2")
{
  # The job's script block does not see local variables, so pass the name
  # in via -ArgumentList and pick it up with param().
  Start-Job -ArgumentList $name {
    param($Name)
    Get-Content "C:\temp\ee\$Name.log" -Wait | Select-String "Error" | Select-Object -ExpandProperty Line | Out-File -Append C:\temp\ee\logger.txt
  }
}

RobSampson commented:
Hi guys, I was writing this up the way I normally would have, until I realised there was a flaw in what I had, and why Qlemo said it needs to be a job.

I had this:
# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log file
$LogFile = "C:\Temp\Scripts\Logger.txt"

$Servers | ForEach {
    If (Test-Connection -ComputerName $_ -Count 1 -Quiet) {
        $UNCBase = "\\$_\"
        $SearchFiles | ForEach {
            $RemoteFilePath = $UNCBase + $_.Replace(':', '$')
            If (Test-Path -Path $RemoteFilePath) {
                Get-Content $RemoteFilePath | Where-Object { $_ -match "ERROR" } | Out-File $LogFile -Append
            }
            Else {
                Write-Output "$RemoteFilePath cannot be found"
            }
        }
    }
    Else {
        Write-Output "$_ is offline"
    }
}



But the -Wait parameter makes the script stop and wait for new content to be added to the file, which obviously makes it a synchronous script that never gets to the next file.

From this, I take it your original goal was to monitor the files constantly for new content that matches ERROR? Since that's the case, go with the jobs, but for the sake of avoiding a mess of interleaved output, would it be better to write each host/file combination to a separate log file, like C:\Logs\$Computer_$Filename_Errors.txt?

Rob.
rutgermons (Author) commented:
Rob,

I was thinking the same as you suggest at the end; that would work.

RobSampson commented:
Qlemo, maybe you can help me out here. I have set up this script to execute the remote job, and it is able to write output to the correct file. The problem is with testing the -Wait parameter: I ran the script, verified the jobs were running, then added a line of text containing Error to the files being monitored, but the change didn't get logged. Am I misunderstanding what -Wait is supposed to do?

Rob.

# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log share that the remote jobs will have access to
$LogShare = "\\SERVER\PublicShare\Logs"

$RemoteFunction = {
    Function CheckFile {
        Param (
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$FilePath,
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$LogPath
        )
        #Write-Output "Reading $FilePath and logging to $LogPath" | Out-File -Append $LogPath
        If (Test-Path -Path $FilePath) {
            Get-Content $FilePath -Wait | Select-String "Error" | Select -Expand Line | Out-File -Append $LogPath
        }
    }
}

ForEach ($Server In $Servers) {
    ForEach ($FileSpec In $SearchFiles) {
        $LogFile = $LogShare + "\" + $Server + "_" + ($FileSpec.Split("\") | Select -Last 1) + "_ERRORS.txt"
        Write-Output $FileSpec
        Write-Output $LogFile
        Start-Job -ScriptBlock {CheckFile $args[0] $args[1]} -ArgumentList @($FileSpec, $LogFile) -InitializationScript $RemoteFunction
    }
}

Qlemo ("Batchelor", Developer and EE Topic Advisor) commented:
I cannot see any flaw, but it seems overly complicated for such a simple purpose.
The prerequisite for Get-Content -Wait to work is that the monitored file gets closed after each change. Otherwise you won't see any change until it is closed.
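This prerequisite can be demonstrated with a minimal local sketch (the file path under $env:TEMP is illustrative): Add-Content opens, writes, and closes the file on every call, so a Get-Content -Wait tail running in a background job picks up each appended line within about a second.

```powershell
# Demonstrates the close-after-write prerequisite for Get-Content -Wait.
$log = Join-Path $env:TEMP 'wait-demo.log'
Set-Content -Path $log -Value 'first line'

# Tail the file in a background job, keeping only lines containing ERROR
$job = Start-Job -ArgumentList $log {
    param($Path)
    Get-Content $Path -Wait | Select-String 'ERROR' | Select-Object -ExpandProperty Line
}

Start-Sleep -Seconds 2                                 # let the tail start up
Add-Content -Path $log -Value 'ERROR something broke'  # file is closed again here
Start-Sleep -Seconds 3                                 # give the tail time to poll

$found = Receive-Job $job                              # drain the job's output so far
Stop-Job $job
Remove-Job $job
$found
```

Because Add-Content releases the file after every write, the tail sees the change; a writer that keeps the file open and buffered would not trigger it.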
RobSampson commented:
>> but it seems to be overly complicated for that simple purpose

I did it this way because it was the only way I could find to pass parameters to the remote script block with Start-Job; since it builds a dynamic log path and file path, I couldn't see how to pass them into the script block directly.

Actually, I just tested it again, and it did work as expected. Not sure why it didn't the first time.

Rob.
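As an aside, -ArgumentList combined with a param() block also passes values straight into the script block, with no -InitializationScript needed. A minimal self-contained sketch (the pattern and text here are made-up values, just to show the plumbing):

```powershell
# Values given to -ArgumentList arrive as the parameters declared in the
# script block's param() (or as $args if no param() is declared).
$job = Start-Job -ArgumentList 'ERROR', 'an ERROR occurred' -ScriptBlock {
    param($Pattern, $Text)
    if ($Text -match $Pattern) { $Text }   # emit the text only if it matches
}
$result = $job | Wait-Job | Receive-Job
Remove-Job $job
$result

# The same shape carries the file and log paths into the tailing job:
#   Start-Job -ArgumentList $FileSpec, $LogFile -ScriptBlock {
#       param($FilePath, $LogPath)
#       Get-Content $FilePath -Wait | Select-String 'ERROR' |
#           Select-Object -ExpandProperty Line | Out-File -Append $LogPath
#   }
```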