IM&T SRFT asked:

Copying original files from one location to many and then archiving the originals

Hi All

We have a folder that receives between one and three files each day, and a system is scheduled to process these files at a given time. Once a file is processed it is discarded.

We now have a second system that requires a copy of the original files, so what we would like to happen is:

Original file(s) arrive in FOLDER1 (on server1); copies of these new files go to FOLDER2 (on server1) and FOLDER3 (on server2).

The original file(s) then need to be archived to FOLDER4 (on server1).

Each file is named from its date and time of creation, so the name is always different. This means the script needs to look for new files in FOLDER1 rather than for a specific name.

I keep finding lots of articles showing a copy-file function and remote copy, but none that define a copy rule based on the age/date of the file. If anyone can help I would really appreciate it.
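For reference, a copy rule based on file age can be written with Where-Object against LastWriteTime. This is only a sketch: the folder paths and the one-day cutoff are illustrative assumptions, not part of the setup described above.

```powershell
# Sketch: copy only files written within the last day.
# "C:\Folder1", "C:\Folder2" and the 1-day window are assumptions.
$cutoff = (Get-Date).AddDays(-1)

Get-ChildItem "C:\Folder1" -File |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    ForEach-Object { Copy-Item $_.FullName -Destination "C:\Folder2" }
```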
ASKER CERTIFIED SOLUTION
Bill Prew
IM&T SRFT (ASKER):
Ah yes, that is true: if a file exists it can be copied to the other folders and then deleted. Thanks for making me think differently about this. Here is the solution to what I originally asked, to help others out.

I am having issues with it not running when scheduled in Task Scheduler, but I will post that as a new question.
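Not an answer to the scheduling problem (that is posted separately), but for anyone hitting the same thing: a scheduled task usually needs to invoke powershell.exe with the script path given explicitly. The script path below is a hypothetical example.

```powershell
# Hypothetical script location; point this at wherever the copy script is saved.
# This is the program + arguments line a Task Scheduler action would use.
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\CopyAndClear.ps1"
```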

# Copy everything under C:\Folder1 to C:\Folder2 and C:\Folder3
Get-ChildItem "C:\Folder1" | ForEach-Object {
    Copy-Item $_.FullName -Destination "C:\Folder2\$($_.Name)" -Recurse
    Copy-Item $_.FullName -Destination "C:\Folder3\$($_.Name)" -Recurse
}

# Remove everything under C:\Folder1 (the originals have been copied above)
Get-ChildItem "C:\Folder1" -Recurse | ForEach-Object {
    Remove-Item $_.FullName -Recurse
}
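The question as originally asked had the originals archived to FOLDER4 rather than deleted. A minimal variant along those lines, assuming C:\Folder4 exists and that \\server2\Folder3 is the share path for FOLDER3 on server2 (both paths are assumptions):

```powershell
# Sketch: copy each new file to both consumers, then archive the original.
# The UNC path \\server2\Folder3 and C:\Folder4 are assumptions.
Get-ChildItem "C:\Folder1" -File | ForEach-Object {
    Copy-Item $_.FullName -Destination "C:\Folder2"
    Copy-Item $_.FullName -Destination "\\server2\Folder3"
    Move-Item $_.FullName -Destination "C:\Folder4"    # archive instead of delete
}
```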