Script causing an error after running for a long time

bfuchs asked:
Hi Experts,

I have the script below to download all new files from my FTP server to my local PC.
However, after running it for a while, I get the attached error.

# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "sftp.MySite.com"
    UserName = "MyUserName"
    Password = "MyPWD"
    SshHostKeyFingerprint = "1234567890="
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Transfer files
    $sourcePath = "/Outbox/" # don't add *, will be added where necessary
    $destPath = "H:\FTP\"
    $destPathNew = "H:\FTP\Caspio\"

    $transferOptions = New-Object WinSCP.TransferOptions

    while($True)
    {
        try {
            # Get list of matching files in the directory
            $remoteFiles = $session.EnumerateRemoteFiles($sourcePath, "*.*", [WinSCP.EnumerationOptions]::None)

            # Any file matched?
            if ($remoteFiles.Count -gt 0) {
                foreach ($fileInfo in $remoteFiles) {
                    try {
                        Write-Host "Checking rules for the file " $fileInfo.Name
                        # check the filename for matching the mask
                        if ($fileInfo.Name -like "*PAT*.*" -or $fileInfo.Name -like "*Sch*.*" -or $fileInfo.Name -like "*Full*.*") {
                            # check if the file exists in the local download folder
                            if (![System.IO.File]::Exists($destPath + $fileInfo.Name)) {
                                # download the file
                                $transferResult = $session.GetFiles($sourcePath + $fileInfo.Name, $destPath, $False, $transferOptions)
                                $transferResult.Check()
                                
                                $destFile = $destPathNew + $fileInfo.Name   # local path where to copy files
                                $sourceFile = $destPath + $fileInfo.Name    # local path where was downloaded new files

                                if ( ![System.IO.File]::Exists($destFile)) {    # check if the file exists in the destination folder
                                    # copy the file in the destination folder
                                    Copy-Item -Path $sourceFile -Destination $destFile
                                    Write-Host "Copy of file $($fileInfo.Name) succeeded"
                                }
                            }
                        }
                    }
                    catch {
                        Write-Host "Exception caught on file $($fileInfo.Name)"
                        Write-Host $_.Exception.Message
                    }
                }
            }
            else {
                Write-Host "No files matching $wildcard found"
            }
        }
        catch {
            Write-Host "Exception caught while downloading new files"
            Write-Host $_.Exception.Message
        }
    }
    Write-Host "Waiting..."
    Start-Sleep -Seconds 5
}
finally
{
    $session.Dispose()
}



Can someone help me fix this problem?

See the following for reference:
https://www.experts-exchange.com/questions/29112257/How-to-copy-only-new-files-from-FTP-site.html?anchor=a42647509&notificationFollowed=210720838#a42647509

Thanks in advance.
Untitled.png
David Favor (Fractional CTO, Distinguished Expert 2018) commented:
Well... You're transferring files via FTP, so the script's runtime can be very long.

There is likely some way in PowerShell to set the runtime limit to infinite, which would fix your problem.

Note: Keep in mind this will likely affect all scripts, so no runaway script will ever exit... This may or may not be appropriate...
Hi,
"so the script's runtime can be very long"
So far this has not been the case here: a scheduler creates small files on the FTP server every 15 minutes, and downloading them takes just a minute.

In my opinion, it looks like writing messages to the log is the source of the failure, as the error message states.

Perhaps we can remove the success/failure messages from the script?

Thanks,
Ben
Qlemo ("Batchelor", Developer and EE Topic Advisor, Top Expert 2015) commented:
This script runs endlessly without any wait (the sleep is outside of the loop), so I suppose you are just overstressing the FTP connection.
If files are generated every 15 minutes, I recommend that you:
  • increase the sleep to 5 minutes
  • move the sleep into the while loop
  • open and close the FTP connection inside the loop (move $session.Open($sessionOptions) and $session.Dispose() into the while loop)
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
  Protocol = [WinSCP.Protocol]::Sftp
  HostName = "sftp.MySite.com"
  UserName = "MyUserName"
  Password = "MyPWD"
  SshHostKeyFingerprint = "1234567890="
}

$session = New-Object WinSCP.Session

$sourcePath  = "/Outbox/" # don't add *, will be added where necessary
$destPath    = "H:\FTP\"
$destPathNew = "H:\FTP\Caspio\"

$transferOptions = New-Object WinSCP.TransferOptions

while($True)
{
  try {
    # Connect
    $session.Open($sessionOptions)

    try {
      # Get list of matching files in the directory
      $remoteFiles = $session.EnumerateRemoteFiles($sourcePath, "*.*", [WinSCP.EnumerationOptions]::None)

      # Any file matched?
      if ($remoteFiles.Count) {
        foreach ($fileInfo in $remoteFiles) {
          try {
            Write-Host "Checking rules for the file " $fileInfo.Name
            # check the filename for matching the mask
            if ($fileInfo.Name -like "*PAT*.*" -or $fileInfo.Name -like "*Sch*.*" -or $fileInfo.Name -like "*Full*.*") {
              # check if the file exists in the local download folder
              if (![System.IO.File]::Exists($destPath + $fileInfo.Name)) {
                # download the file
                $transferResult = $session.GetFiles($sourcePath + $fileInfo.Name, $destPath, $False, $transferOptions)
                $transferResult.Check()
                
                $destFile = $destPathNew + $fileInfo.Name   # local path where to copy files
                $sourceFile = $destPath + $fileInfo.Name    # local path where was downloaded new files

                if ( ![System.IO.File]::Exists($destFile)) {    # check if the file exists in the destination folder
                  # copy the file in the destination folder
                  Copy-Item -Path $sourceFile -Destination $destFile
                  Write-Host "Copy of file $($fileInfo.Name) succeeded"
                }
              }
            }
          }
          catch { Write-Host "Exception caught on file $($fileInfo.Name)`n$($_.Exception.Message)" }
        }
      }
      else {
        Write-Host "No files matching $wildcard found"
      }
    }
    catch { Write-Host "Exception caught while downloading new files`n$($_.Exception.Message)" }
  }
  catch { Write-Host "Exception caught while opening FTP session`n$($_.Exception.Message)" }

  $session.Dispose()

  Write-Host "Waiting..."
  Start-Sleep -Seconds 300
}


Hi,
It only worked initially: after it had copied all files, I deleted the copied files and let it continue running, but nothing more got copied over.
See attached results.
FYI, we're planning to let this run constantly, and another program will also constantly check for files created under $destPathNew = "H:\FTP\Caspio\", process those files, and delete them; therefore I had to test it the same way.
Thanks,
Ben
Qlemo ("Batchelor", Developer and EE Topic Advisor, Top Expert 2015) commented:
Looks like you have to move line 13 ($session = New-Object WinSCP.Session) to just before line 25 ($session.Open), as we need to recreate the session object each time.
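A sketch of that change (the session options, paths, and download logic are unchanged from the snippet above):

```powershell
while ($True)
{
  # Create a fresh session object each iteration; a disposed
  # WinSCP.Session object cannot be opened again.
  $session = New-Object WinSCP.Session
  try {
    $session.Open($sessionOptions)
    # ... enumerate, download, and copy files as in the snippet above ...
  }
  catch { Write-Host "Exception caught while opening FTP session`n$($_.Exception.Message)" }
  finally {
    $session.Dispose()
  }

  Write-Host "Waiting..."
  Start-Sleep -Seconds 300
}
```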
David FavorFractional CTO
Distinguished Expert 2018

Commented:
Also, you may prefer using plain sftp (as I recall, Windows has it installed by default).

If you use sftp, your code becomes a single line...

printf 'commands\nexit\n' | sftp -i ~/path-to-empty-passphrase-key-file user@host


Qlemo ("Batchelor", Developer and EE Topic Advisor, Top Expert 2015) commented:
David, though the script is not perfect, it has an important check: files are only copied over if they do not yet exist in the destination folder. You can't do that (well) with FTP directly.
Hi,
"though the script is not perfect"
I'm in the middle of testing, but do you foresee any issues? Or are you just referring to possible improvements? What improvements would you suggest?
Thanks,
Ben
Qlemo ("Batchelor", Developer and EE Topic Advisor, Top Expert 2015) commented:
Minor improvement: line 47 in my code snippet above is superfluous, since the existence of the destination file has already been checked (negatively) in line 37.
I would also set $destFile prior to line 37 and use this variable in the if condition.
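Roughly, the reworked inner block would look like this (an illustrative excerpt; the variables are as defined earlier in the snippet):

```powershell
$destFile   = $destPathNew + $fileInfo.Name  # final destination for the copy
$sourceFile = $destPath + $fileInfo.Name     # local download location

# Single existence check on the final destination; the later
# duplicate check on $destFile is no longer needed.
if (![System.IO.File]::Exists($destFile)) {
  $transferResult = $session.GetFiles($sourcePath + $fileInfo.Name, $destPath, $False, $transferOptions)
  $transferResult.Check()
  Copy-Item -Path $sourceFile -Destination $destFile
  Write-Host "Copy of file $($fileInfo.Name) succeeded"
}
```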
Thank you Qlemo!
BTW, when you have a chance, please look at the following, which is my next step after this script:
https://www.experts-exchange.com/questions/29115337/Cannot-read-file-permission-denied.html?anchor=a42664113&notificationFollowed=212034968#a42664113
Thanks,
Ben
Hi,
See the attached error message; perhaps this happens while a file is in the middle of being saved on the FTP server.
How should this be handled?
Thanks,
Ben
Untitled.png
Qlemo ("Batchelor", Developer and EE Topic Advisor, Top Expert 2015) commented:
You have to live with such temporary failures on a live system. Only if you get an error repeatedly do you need to investigate.
