• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 74

PowerShell: searching multiple logs

Folks,

The below extracts all errors from one log file:

Get-Content TEST.log -Wait | Where { $_ -match "ERROR" } | Out-File logger.txt -Append

How can I get it to extract ERRORs from multiple log files on multiple servers?

All help is appreciated.
Asked by: rutgermons
1 Solution
 
Qlemo (Batchelor, Developer and EE Topic Advisor) commented:
This will only work with jobs, as you need to run several commands in parallel and combine their output. Sadly, Get-Content -Wait ignores multiple file names if provided.
You can decide whether each job appends to the log file individually, or whether the output is collected and then appended in one place.
$jobs = @()
$jobs += Start-Job { Get-Content C:\temp\ee\test1.log -Wait | Select-String "Error" | Select -Expand Line }
$jobs += Start-Job { Get-Content C:\temp\ee\test2.log -Wait | Select-String "Error" | Select -Expand Line }
# and so on

# Collect whatever the jobs have produced so far, then poll again
while ($true)
{
  Receive-Job $jobs | Out-File -Append logger.txt
  Start-Sleep -Milliseconds 100
}


But that is inefficient, as we now need a polling loop. Letting each job do its own writing seems better, but you risk having multiple processes write to the same file at the same time, resulting in garbled content. (The following code also uses a loop to go through the paths.)
foreach ($name in "test1", "test2")
{
  # The loop variable must be passed into the job; each job runs in a separate process
  Start-Job -ArgumentList $name {
    param($name)
    Get-Content "C:\temp\ee\$name.log" -Wait | Select-String "Error" | Select -Expand Line | Out-File -Append C:\temp\ee\logger.txt
  }
}

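One way to reduce the garbled-output risk mentioned above is to serialize the writes with a named mutex. This is only a rough sketch; the mutex name "EELoggerMutex" and the paths are illustrative:

foreach ($name in "test1", "test2")
{
  Start-Job -ArgumentList $name {
    param($name)
    # A named mutex is visible across processes, so all jobs share one lock
    $mutex = New-Object System.Threading.Mutex($false, "EELoggerMutex")
    Get-Content "C:\temp\ee\$name.log" -Wait | Select-String "Error" |
      Select-Object -ExpandProperty Line | ForEach-Object {
        [void]$mutex.WaitOne()        # block until no other job is writing
        $_ | Out-File -Append C:\temp\ee\logger.txt
        $mutex.ReleaseMutex()         # let the next writer in
      }
  }
}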

 
RobSampson commented:
Hi guys, I was writing this up the way I normally would have, until I realised there was a flaw in what I had, and why Qlemo said it needs to be a job.

I had this:
# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log file
$LogFile = "C:\Temp\Scripts\Logger.txt"

$Servers | ForEach {
    If (Test-Connection -ComputerName $_ -Count 1 -Quiet) {
        $UNCBase = "\\$_\"
        $SearchFiles | ForEach {
            $RemoteFilePath = $UNCBase + $_.Replace(':', '$')
            If (Test-Path -Path $RemoteFilePath) {
                Get-Content $RemoteFilePath | Where { $_ -match "ERROR" } | Out-File $LogFile -Append
            }
            Else {
                Write-Output "$RemoteFilePath cannot be found"
            }
        }
    }
    Else {
        Write-Output "$_ is offline"
    }
}



But the -Wait parameter appears to make the script stop and wait for new content to be added to the file, which obviously makes it a synchronous script that won't get to the next file.
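
(As an aside, the blocking could also be avoided in a single process by polling each file's length and reading only the new bytes. A rough sketch, assuming local paths in $SearchFiles and the $LogFile from the script above:)

$offsets = @{}
foreach ($f in $SearchFiles) { $offsets[$f] = (Get-Item $f).Length }
while ($true) {
    foreach ($f in $SearchFiles) {
        $len = (Get-Item $f).Length
        if ($len -gt $offsets[$f]) {
            # Open with ReadWrite sharing so the writing application is not blocked
            $stream = [System.IO.File]::Open($f, 'Open', 'Read', 'ReadWrite')
            [void]$stream.Seek($offsets[$f], 'Begin')
            $reader = New-Object System.IO.StreamReader($stream)
            while (-not $reader.EndOfStream) {
                $line = $reader.ReadLine()
                if ($line -match "ERROR") { $line | Out-File -Append $LogFile }
            }
            $reader.Close()   # also closes the underlying file stream
            $offsets[$f] = $len
        }
    }
    Start-Sleep -Milliseconds 500
}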

From this, I take it your original goal was to monitor the files constantly for new content that matches ERROR? Since that's the case, go with the jobs, but maybe, for the sake of avoiding a mess of output, would it be better to write each host / file pair to a separate log file like C:\Logs\${Computer}_${Filename}_Errors.txt?

Rob.
 
rutgermons (Author) commented:
Rob,

I was thinking the same as you suggest in the end; that would work.
 
RobSampson commented:
Qlemo, maybe you can help me out here. I have set up this script that executes the remote job and is able to write output to the correct file. The problem is with testing the -Wait parameter. I ran the script and verified the jobs were running, then I added a line of text including "Error" to the files being monitored with the -Wait parameter, but the change didn't get logged. Am I misunderstanding what it is supposed to be doing?

Rob.

# Servers file with one server per line
$Servers = Get-Content -Path "C:\Temp\Scripts\Servers.txt"
# Files to search with one file PATH per line
$SearchFiles = Get-Content -Path "C:\Temp\Scripts\SearchFiles.txt"
# Log share that the remote jobs will have access to
$LogShare = "\\SERVER\PublicShare\Logs"

$RemoteFunction = {
    Function CheckFile {
        Param (
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$FilePath,
            [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
            [String]$LogPath
        )
        #Write-Output "Reading $FilePath and logging to $LogPath" | Out-File -Append $LogPath
        If (Test-Path -Path $FilePath) {
            Get-Content $FilePath -Wait | Select-String "Error" | Select -Expand Line | Out-File -Append $LogPath
        }
    }
}

ForEach ($Server In $Servers) {
    ForEach ($FileSpec In $SearchFiles) {
        $LogFile = $LogShare + "\" + $Server + "_" + ($FileSpec.Split("\") | Select -Last 1) + "_ERRORS.txt"
        Write-Output $FileSpec
        Write-Output $LogFile
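        # CheckFile comes from the -InitializationScript; $FileSpec and $LogFile bind via $args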
        Start-Job -ScriptBlock {CheckFile $args[0] $args[1]} -ArgumentList @($FileSpec, $LogFile) -InitializationScript $RemoteFunction
    }
}


 
Qlemo (Batchelor, Developer and EE Topic Advisor) commented:
I cannot see any flaw, but it seems overly complicated for such a simple purpose.
The prerequisite for Get-Content -Wait to work is that the monitored file gets closed after each change; otherwise you won't see any changes until the file is closed.
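
A quick way to verify that caveat (a sketch; the path and message are illustrative). Add-Content opens, writes, and closes the file on each call, so the watcher should pick the line up, whereas an application holding the handle open would not trigger it:

$watch = Start-Job { Get-Content C:\temp\ee\test1.log -Wait | Select-String "Error" }
Add-Content C:\temp\ee\test1.log "ERROR test entry"   # file is closed after this write
Start-Sleep -Seconds 3                                # Get-Content -Wait polls roughly once per second
Receive-Job $watch                                    # the matched line should appear here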
 
RobSampson commented:
>> but it seems to be overly complicated for that simple purpose

I did it this way because it was the only way I could find to pass parameters to the remote scriptblock with Start-Job.

Since it builds a dynamic log path and file path, I couldn't figure out how to pass the parameters directly to the scriptblock.
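
(Aside: a param() block inside the scriptblock is one way to pass them directly — a minimal sketch using the same $FileSpec and $LogFile as above:)

Start-Job -ScriptBlock {
    param($FilePath, $LogPath)
    # -ArgumentList values bind positionally to the param() block
    Get-Content $FilePath -Wait | Select-String "Error" |
        Select-Object -ExpandProperty Line | Out-File -Append $LogPath
} -ArgumentList $FileSpec, $LogFile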

Actually, I just tested it, and it did work as expected. Not sure why it didn't work last time.

Rob.