Avatar of bfuchs
bfuchs

asked on

How to copy only new files from FTP site?

Hi Experts,

I have the following script that:

1- Downloads all new files from an FTP site to a local folder.
2- Copies all downloaded files to another local folder.

However, the 2nd step is not working properly: it downloads/copies all the files to the 2nd folder again, not only the new ones.
Can someone help me fix that?

# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "sftp.MySite.com"
    UserName = "MyUserName"
    Password = "MyPWD"
    SshHostKeyFingerprint = "1234567890="
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Transfer files
    $remotePath = "H:\FTP\*"
    
    $sourcePath = "/Outbox/*"
    $destPath = "H:\FTP\"
    $destPathNew = "H:\FTP\Caspio\"

    $transferOptions = New-Object WinSCP.TransferOptions

    $transferOptions.FileMask = "*PAT*.*;*Sch*.*|*Full*.*"
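    # Masks before "|" are includes (*PAT*, *Sch*); the mask after "|" excludes *Full* files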

    while($True)
    {
        try
        {
            $transferResult = $session.GetFiles($sourcePath, $destPath, $False, $transferOptions)
            $transferResult.Check()
        }
        finally
        {
            foreach ($transfer in $transferResult.Transfers)
            {
                $session.GetFiles($transfer.FileName, $destPathNew, $False, $transferOptions)
                Write-Host "Download of $($transfer.FileName) succeeded"
            }
        }
    }
    Write-Host "Waiting..."
    Start-Sleep -Seconds 5
}
finally
{
    $session.Dispose()
}



Also wondering why the script keeps scrolling down, listing file names as succeeded, while at the same time a batch file I keep running that gives me the count of files always shows the same number?!
See attached.

Thanks in advance.
Untitled.png
Avatar of David Johnson, CD

You should be using synchronize instead:
https://winscp.net/eng/docs/faq_script_modified_files

 $session.GetFiles($transfer.FileName, $destPathNew, $False,

Note that the $False argument to GetFiles is the remove parameter (whether to delete the remote file after download), not a don't-overwrite/skip flag, so every matching file is downloaded again on each pass and your loop reports each of them as succeeded.
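
Roughly, the FAQ's synchronize-based approach would look like this (untested sketch, reusing $session and the paths/mask from your script):

$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*PAT*.*;*Sch*.*|*Full*.*"

# Let WinSCP work out which remote files are new or changed
$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Local,   # remote -> local
    "H:\FTP\", "/Outbox", $False,          # $False = do not delete local files
    $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
$result.Check()

# Only the files that were actually downloaded need to go to the second folder
foreach ($download in $result.Downloads) {
    Copy-Item -Path $download.Destination -Destination "H:\FTP\Caspio\" -Force
}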
Avatar of Member_2_6404472

Hi,
the following script may be the answer to your question, but be careful because it's not tested.

This script synchronizes the local files with the remote folder as the source.
Afterwards it copies the synchronized files to the second location; if the files already exist they will be overwritten.

# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "sftp.MySite.com"
    UserName              = "MyUserName"
    Password              = "MyPWD"
    SshHostKeyFingerprint = "1234567890="
}

$session = New-Object WinSCP.Session

try {
    # Connect
    $session.Open($sessionOptions)

    # Transfer files
    $sourcePath = "/Outbox/*"
    $destPath = "H:\FTP\"
    $destPathNew = "H:\FTP\Caspio\"

    $transferOptions = New-Object WinSCP.TransferOptions

    $transferOptions.FileMask = "*PAT*.*;*Sch*.*|*Full*.*"

    while ($True) {
        try {
            # Pass mirror and criteria explicitly so that $transferOptions binds
            # to the options parameter of SynchronizeDirectories
            $synchronizationResult = $session.SynchronizeDirectories(
                [WinSCP.SynchronizationMode]::Local, $destPath, $sourcePath, $False,
                $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
            $synchronizationResult.Check()
            Write-Host "Synchronization on folder $($destPath) succeeded!"
        }
        catch {
            Write-Host "Exception caught while synchronizing the folder $($destPath)"
        }
        finally {
            if ($synchronizationResult) {
                # Downloads lists only the files that were actually transferred;
                # Destination is the local path each one was saved to
                foreach ($transfer in $synchronizationResult.Downloads) {
                    try {
                        Copy-Item -Path $transfer.Destination -Destination $destPathNew -Force
                        Write-Host "Copy of file $($transfer.FileName) succeeded"
                    }
                    catch {
                        Write-Host "Exception caught while copying the file $($transfer.FileName)"
                    }
                }
            }
        }

        Write-Host "Waiting..."
        Start-Sleep -Seconds 5
    }
}
finally {
    $session.Dispose()
}



Bregs
Rossano Praderi
Avatar of bfuchs

ASKER

Hi Experts,

@David,
To be honest, I'm not familiar with the PowerShell language whatsoever, and would therefore need exact steps on how to implement your suggestion.

As a matter of fact, I was not the one who wrote it, and I'm currently looking to convert it to VBA, where I have at least some knowledge...

https://www.experts-exchange.com/questions/29112291/Looking-to-convert-script-into-VBA-code.html?anchor=a42644251&notificationFollowed=210606316&anchorAnswerId=42644251#a42644251

@Rossano,
This script synchronizes the local files with the remote folder as the source.
Afterwards it copies the synchronized files to the second location; if the files already exist they will be overwritten.
Just to clarify...
What we need from this script is to download from the FTP site all newer files created since the script last ran, meaning it should check whether the file already exists in the local folder and, if not, perform the download.

Then the second step we want is that, as each file gets downloaded, it should also be copied to another folder.

Let me know if this is what your revised script is supposed to do.

Thanks,
Ben
Avatar of bfuchs

ASKER

Hi Rossano,

I tested your script and it's running forever...

Perhaps the best way to do this is what the guy suggested here:
https://www.experts-exchange.com/questions/29112291/Looking-to-convert-script-into-VBA-code.html#questionAdd
The way to do that would be to download a file listing from the FTP site, compare it to the local file listing and download only the files that are different.
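
For reference, that compare-and-download approach would look roughly like this with the WinSCP .NET assembly (untested sketch; the session, paths and name filter are the ones from the original script):

$remoteFiles = $session.EnumerateRemoteFiles("/Outbox", "*.*", [WinSCP.EnumerationOptions]::None)

foreach ($fileInfo in $remoteFiles)
{
    # Same include/exclude rules as the file mask: PAT or Sch, but not Full
    if (($fileInfo.Name -like "*PAT*.*" -or $fileInfo.Name -like "*Sch*.*") -and
        $fileInfo.Name -notlike "*Full*.*")
    {
        $localFile = Join-Path "H:\FTP\" $fileInfo.Name
        if (-not (Test-Path $localFile))
        {
            # Download only files we don't have yet, then copy them to the second folder
            $session.GetFiles($fileInfo.FullName, $localFile).Check()
            Copy-Item -Path $localFile -Destination "H:\FTP\Caspio\" -Force
        }
    }
}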

Thanks,
Ben
Hi Ben,
@Ben
To be honest, I'm not familiar with the PowerShell language ...

then...

Your original PowerShell script was written to exit from the loop "while($True) { ... }" only if it throws an unhandled exception.
I thought you wanted to keep it running; I'm sorry for misunderstanding that.

There are many ways to do what you are looking for; it can be done with almost every programming/scripting language.
Why not simply ask for a VBA macro instead?

I personally suggest you use a simple piece of software like FreeFileSync; five minutes and your problem is solved.

But if you would like to do this with VBA, then you will receive all the appropriate answers in that other thread.


Have a nice day.

Bregs
Rossano Praderi
Avatar of bfuchs

ASKER

Hi Rossano,

I thought you wanted to keep it running; I'm sorry for misunderstanding that.
Actually I'm okay with it continuing to run, as long as it does the two things intended:

1- Downloads the new files.
2- Copies the new files to Caspio folder.

Let me know if that's the way it operates.

Thanks,
Ben
Avatar of bfuchs

ASKER

@Rossano,

I'm testing your script now and I only see the attached.
Looking at the local folders, no files are getting downloaded.

Thanks,
Ben
Untitled.png
ASKER CERTIFIED SOLUTION
Avatar of Member_2_6404472

This solution is only available to members of Experts Exchange.
Avatar of bfuchs

ASKER

Hi Rossano,
I tested your script as follows.
First I deleted all files from the Caspio folder, then ran the script.
It did create many files there.
Then (while the script was still running) I deleted all files from the Caspio folder, in order to see if only new files would get created.
However, it stopped creating files, and I see the script is returning an error.
See attached.
Thanks,
Ben
Untitled.png
Avatar of bfuchs

ASKER

Just to clarify...
I'm going to have a program that will process each file under the Caspio folder and then delete it.
Therefore it's important to have this tested with the same scenario we are planning to use in production.
Thanks,
Ben
Avatar of bfuchs

ASKER

btw,
just checking if the below is a mistake?
if ($fileInfo.Name -like "*PAT*.*" -or $fileInfo.Name -like "*Sch*.*" -or $fileInfo.Name -like "*Full*.*")


looks like this will include all "Full" files as well, no?
Thanks,
Ben
Avatar of bfuchs

ASKER

I let the script run for a while, and now I'm getting a different error.
See attached.
Thanks,
Ben
Untitled.png
Hi Ben,
the script selects only the files included in your original "filemask" ("PAT", "Sch", "Full").

This script doesn't give me any enumeration error or download problem.
To avoid the "enumeration" issue, try the following modification.

Substitute this line
$remoteFiles = $session.EnumerateRemoteFiles($sourcePath, "*.*", [WinSCP.EnumerationOptions]::None)



With the following
$remoteFiles = $session.ListDirectory($sourcePath)

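
Note that ListDirectory returns a RemoteDirectoryInfo rather than an enumerable of files, so (assuming the rest of the loop stays the same) it would need a small adjustment, roughly:

$directoryInfo = $session.ListDirectory($sourcePath)
foreach ($fileInfo in $directoryInfo.Files)
{
    # The Files collection also contains the "." and ".." entries, so skip directories
    if ($fileInfo.IsDirectory) { continue }
    # ... same name checks and download/copy logic as before ...
}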


Keep me updated


Bregs
Rossano Praderi
Avatar of bfuchs

ASKER

Hi,
the script selects only the files included in your original "filemask" ("PAT", "Sch", "Full").
No, look at the other post (you happen to be the one who provided the solution :-)); it's for excluding "Full".
https://www.experts-exchange.com/questions/29111991/Change-script-to-include-only-particular-files.html
This script doesn't give me any enumeration error or download problem.
It only happens after it has been running for a long time, like a few hours; I just happened to see it again.
To avoid the "enumeration" issue, try the following modification.
Will try that & let you know.

Thanks,
Ben
Avatar of bfuchs

ASKER

To avoid the "enumeration" issue try the follow modification.
Just want make sure this will solve both enumeration issues posted, especially the last one with log file being closed?
Thanks,
Ben
Avatar of bfuchs

ASKER

So far it's just showing "Checking rules for file", but it's not copying anything; I will wait and see during the night...
Thanks,
Ben
I'm sorry for the mistake about the file filters.

change this
if ($fileInfo.Name -like "*PAT*.*" -or $fileInfo.Name -like "*Sch*.*" -or $fileInfo.Name -like "*Full*.*")



with this
if (($fileInfo.Name -Like "*PAT*.*" -or $fileInfo.Name -Like "*Sch*.*") -and $fileInfo.Name -NotLike "*Full*.*")

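
For example, against a few hypothetical sample names the corrected condition keeps only the PAT/Sch files:

# Quick check of the corrected filter against some sample file names
"PAT_20180715.csv", "Sch_20180715.csv", "Full_Export.csv", "Notes.txt" |
    Where-Object { ($_ -Like "*PAT*.*" -or $_ -Like "*Sch*.*") -and $_ -NotLike "*Full*.*" }
# -> PAT_20180715.csv, Sch_20180715.csv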


In my last message I changed the method used to get the file list, as an alternative, to see if that can solve the issue.

Maybe the enumeration error is caused by a session timeout or by something related to the "enumeration" session state.
This can be resolved in many ways, for example by closing and recreating the session on every loop, or with a counter/timer which closes/creates a new session conditionally, as in the sketch below.
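
A rough, untested example of the reconnect-per-cycle idea (reusing $sessionOptions from the script above):

# Untested sketch: open a fresh session on every cycle so a long run
# cannot hit a server-side idle timeout
while ($True)
{
    $session = New-Object WinSCP.Session
    try
    {
        $session.Open($sessionOptions)
        # ... enumerate, download and copy the files here ...
    }
    finally
    {
        $session.Dispose()   # always close the connection before sleeping
    }
    Start-Sleep -Seconds 5
}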

Your original script runs indefinitely, with a pause of 5 seconds between cycles.
The message "Checking rules for file" is printed at the point where each filename is checked for download and copy.
If you want, you can comment out that line so you will not see the message anymore.

The script will copy only files not present in the destination folder.

Keep me updated.

Bregs
Rossano Praderi
Avatar of bfuchs

ASKER

Hi Rossano,

First thanks for the filter update.

With the new update of the script regarding the enumeration issue, I end up having two problems:
1- It does not download anything.
2- After running for some time I keep getting the "log file closed" error (see attached).

So I guess I will put back your original code and search for a solution to this latest issue.

However, as the original question of copying only new files seems to be resolved, I will close this question and open a new one for the other issue.

You're welcome to pick up from there :-)

Thanks,
Ben
Untitled.png
Avatar of bfuchs

ASKER

Thank you very much Rossano!!