fieldj
asked on
PowerShell to fetch folder names and add to a variable
Hi, I am new to PowerShell, so I would like some help with my script, running on Server 2012 R2 with PowerShell 4.0.
My ps1 script is in a folder, D:/data. I need it to check all sub-folders and upload a file to an AWS S3 bucket, but I want it to fetch the sub-folder name and add that as a location in the S3 bucket (e.g. D:/data/sub1, D:/data/sub2, D:/data/sub3). I don't want the full path, just the sub-folder (sub1, sub2, etc.).
I have tried Split-Path with -Leaf and stored the result in a variable, but have been unsuccessful; this may simply have been the wrong approach... Sample script below.
#upload of multiple files from folder
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt"
foreach ($path in $results) {
    Write-Host "Transferring: " $path
    $filename = Get-Item $path
    Write-S3Object -BucketName myS3bucket-name -Key "$($filename.Name)" -File $filename.FullName
}
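For reference, the Split-Path -Leaf approach the asker mentions can work for pulling out just the sub-folder name; a minimal sketch (the file path here is illustrative, not from the original script):

```powershell
# Given a file D:\data\sub1\report.txt, extract just the
# immediate sub-folder name ("sub1") from its parent path.
$file = Get-Item 'D:\data\sub1\report.txt'
$subFolder = Split-Path $file.DirectoryName -Leaf

# The sub-folder name can then be used as an S3 key prefix:
# Write-S3Object -BucketName my-bucket -Key "$subFolder/$($file.Name)" -File $file.FullName
```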
I'd also like to zip the raw files before the upload; they are in .txt format. I had planned to do this via a separate script with 7-Zip, but wonder if there is a better way, such as incorporating it into the attached ps1?
And finally, I'd like some confirmation the upload has completed successfully, or better still a check on the bucket: if the file is not there, an email to let me know the upload failed.
I intend to run the script via Task Scheduler. Please provide full details / an example of each step...
ASKER
Hi Chris, I have amended the script in terms of relative path and it's halfway there:
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt"
foreach ($file in $results) {
    $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
    Write-Host "Transferring: " $relativePath
    Write-S3Object -BucketName my-bucket -Key "$relativePath/$($file.Name)" -File $file.FullName
}
The only thing which isn't quite right is that the uploaded file's path is duplicated, e.g. the file D:\DataIn\Test\eeTest.txt ends up with a path in S3 of "Test/eeTest.txt/eeTest.txt".
Regarding compression, yes, I'd like to zip each file individually (keeping the same file name).
Relative path includes the file name, so you don't need to write it again in your key.
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt"
foreach ($file in $results) {
    $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
    Write-Host "Transferring: " $relativePath
    Write-S3Object -BucketName my-bucket -Key $relativePath -File $file.FullName
}
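One caveat worth flagging: $relativePath is taken from a Windows path, so any intermediate folders will contain backslashes, while S3 keys conventionally use forward slashes as the folder delimiter. An optional normalisation, sketched with an illustrative path:

```powershell
# S3 treats "/" as the folder separator; "\" is just a literal
# character in a key name, so converting avoids odd-looking keys.
$relativePath = 'Test\eeTest.txt'
$key = $relativePath -replace '\\', '/'   # "Test/eeTest.txt"
```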
This version adds compression.
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt"
foreach ($file in $results) {
    $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
    Compress-Archive $file.FullName -DestinationPath "$($file.FullName).zip" -Force
    Write-Host "Transferring: " $relativePath
    Write-S3Object -BucketName my-bucket -Key $relativePath -File "$($file.FullName).zip"
}
Compress-Archive won't let the file extension be anything other than .zip. -Force is used to overwrite any existing zip file of the same name. If the zipped file must be passed up with a .txt extension, it will have to be renamed post-compression. Should it have the .zip extension?
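If a non-.zip extension really were required on the uploaded archive, a post-compression rename along these lines should work (a sketch with illustrative paths; note the file content remains a zip archive regardless of what the extension says):

```powershell
# Compress-Archive insists on writing a .zip extension, so
# compress first, then rename the result.
Compress-Archive 'D:\DataIn\Test\eeTest.txt' -DestinationPath 'D:\DataIn\Test\eeTest.zip' -Force

# Renaming straight to .txt would collide with the source file,
# so a compound extension is used here for illustration.
Rename-Item 'D:\DataIn\Test\eeTest.zip' -NewName 'eeTest.txt.zipped'
```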
ASKER
Hi Chris
File name issue is sorted, but I get an error when trying compression:
Compress-Archive : The term 'Compress-Archive' is not recognized as the name of a cmdlet, function, script file
My setup is:
PS C:\> $PSVersionTable
Name Value
---- -----
PSVersion 4.0
WSManStackVersion 3.0
SerializationVersion 1.1.0.1
CLRVersion 4.0.30319.42000
BuildVersion 6.3.9600.17400
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0}
PSRemotingProtocolVersion 2.2
Compress-Archive requires PowerShell 5.0 or higher. I do have a function which will do the same thing, but it has a .NET Framework requirement as well.
Upgrading to PS 5 (or 5.1) would be neatest, but is that something you'd find acceptable?
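For what it's worth, on PowerShell 4.0 the same result can usually be had directly from .NET, assuming .NET Framework 4.5 or later is installed; a hedged sketch with an illustrative path:

```powershell
# The compression assembly is not auto-loaded on PS 4.0.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$source = 'D:\DataIn\Test\eeTest.txt'
$zipPath = "$source.zip"

# Create the archive and add the single file to it.
$zip = [System.IO.Compression.ZipFile]::Open($zipPath, 'Create')
try {
    [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile(
        $zip, $source, (Split-Path $source -Leaf)) | Out-Null
} finally {
    # Dispose flushes and closes the archive file.
    $zip.Dispose()
}
```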
ASKER
Hi Chris
The box isn't in production yet, so upgrading wasn't a problem; I'm now on v5.1. Hitting a new issue: The term 'file.FullName' is not recognized...
my script:
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt" -Exclude "trigger.txt"
foreach ($file in $results) {
    $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
    Compress-Archive $file.FullName -DestinationPath "$($file.FullName).zip" -Force
    Write-Host "Transferring: " $relativePath
    Write-S3Object -BucketName mybucket -Key "$relativePath" -File "$(file.FullName).zip"
}
Also, I'd like to move the source file (.txt) once the zip has been created (to D:/Source/sub-folder), and when the uploads have finished, to move the zip files to D:/Archive/sub-folder.
cheers Richard
There should be a $ symbol there.
"$($file.FullName).zip"
I'll add the last two requests in a little while. On a mobile, need a proper keyboard. They won't be a problem.
ASKER
No errors now, but the uploaded files are not the compressed versions; they are .txt (the process is creating the .zip).
thanks again
Good morning,
This update "should" fix the uploaded file type. It moves both the .txt and .zip files as requested. The comments within the script describe the current limitations of this.
I've added error handling, the goal is to avoid moving the zip file if any of the previous steps (for that file) fail.
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt" -Exclude "trigger.txt"
$ErrorActionPreference = 'Stop'
foreach ($file in $results) {
    try {
        $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
        Compress-Archive $file.FullName -DestinationPath "$($file.FullName).zip" -Force
        # This will be just the file, not the intermediate path. Does it need to include the intermediate path?
        Move-Item $file.FullName -DestinationPath 'D:\Source\sub-folder'
        Write-Host "Transferring: " $relativePath
        Write-S3Object -BucketName mybucket -Key "$relativePath.zip" -File "$($file.FullName).zip"
        # As above, no sub-folders will be included.
        Move-Item "$($file.FullName).zip" -DestinationPath 'D:\Archive\sub-folder'
    } catch {
        Write-Warning "An error occurred uploading $($file.FullName) ($($_.Exception.Message))"
    }
}
ASKER
Good morning Chris
Yes, I'd like the intermediate path to be retained, i.e. D:/DataIn/xyz/myfile.txt will be moved to D:/Source/xyz/myfile.txt and D:/Archive/xyz/myfile.zip.
As a final piece, after completion it should ideally delete the trigger.txt file (this is added by another job on a daily basis).
cheers Richard
We already have the relative path, so this should work to preserve the intermediate structure.
$results = Get-ChildItem "D:\DataIn\" -Recurse -Include "*.txt" -Exclude "trigger.txt"
$ErrorActionPreference = 'Stop'
foreach ($file in $results) {
    try {
        $relativePath = $file.FullName.Substring("D:\DataIn\".Length)
        Compress-Archive $file.FullName -DestinationPath "$($file.FullName).zip" -Force
        Move-Item $file.FullName -DestinationPath (Join-Path 'D:\Source\sub-folder' $relativePath) -Force
        Write-Host "Transferring: " $relativePath
        Write-S3Object -BucketName mybucket -Key "$relativePath.zip" -File "$($file.FullName).zip"
        Move-Item "$($file.FullName).zip" -DestinationPath (Join-Path 'D:\Archive\sub-folder' "$relativePath.zip") -Force
    } catch {
        Write-Warning "An error occurred uploading $($file.FullName) ($($_.Exception.Message))"
    }
}
# Path needs to be defined here.
Remove-Item trigger.txt
ASKER
I got an error running it ('DestinationPath'), so I ran it bit by bit in ISE, then hit the same error:
Move-Item $file.FullName -DestinationPath (Join-Path 'D:\Original\' $relativePath) -Force
Move-Item : A parameter cannot be found that matches parameter name 'DestinationPath'.
With regards to the Trigger.txt file, that is removed at the end of the process, how can I get it to check if the file exists, if it does - carry on, if not exit?
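For the record, Move-Item's parameter is -Destination, not -DestinationPath (Compress-Archive is the cmdlet that uses -DestinationPath, an easy mix-up). A trigger-file guard and a corrected move might look like this sketch (paths are illustrative, not from the original script):

```powershell
# Exit early if the trigger file is missing; the daily job
# that creates it evidently has not run yet.
if (-not (Test-Path 'D:\DataIn\trigger.txt')) {
    Write-Warning 'trigger.txt not found; nothing to do.'
    return
}

# Corrected parameter name: -Destination, not -DestinationPath.
# Move-Item will not create missing folders, so make sure the
# destination directory exists first.
$destination = Join-Path 'D:\Source' 'xyz\myfile.txt'
$destDir = Split-Path $destination -Parent
if (-not (Test-Path $destDir)) {
    New-Item $destDir -ItemType Directory | Out-Null
}
Move-Item 'D:\DataIn\xyz\myfile.txt' -Destination $destination -Force
```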
ASKER CERTIFIED SOLUTION
ASKER
Extremely helpful, with clear comments, so I was able to understand the code used, and fast responses. Many thanks.
ASKER
Hi Chris
First off, I should say I'm happy to open a new question, but commented here as I wanted to follow-up with you, if you can, and don't mind.
I had the script working fine for a good week and then have run into some issues (it seems to abort part way through). Therefore, I'd like to add some kind of logging, which I hope will shed light on why/where it's falling over.
cheers Richard
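One low-effort way to get the requested logging, given the script runs under Task Scheduler, is Start-Transcript; a sketch (the log path is an assumption):

```powershell
# Capture all console output, including errors and the
# Write-Host progress lines, to a dated log file.
$logPath = "D:\Logs\upload-$(Get-Date -Format 'yyyyMMdd-HHmmss').log"
Start-Transcript -Path $logPath -Append

try {
    # ... existing upload loop goes here ...
} finally {
    # Runs even if the loop aborts part way through,
    # so the log is always closed cleanly.
    Stop-Transcript
}
```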
For compression, you can use the Compress-Archive command if your version of PowerShell is reasonably modern. Do you want to zip each file individually?
For verification, there's a Get-S3Object command. That would be my first thing to check, but I haven't a clue whether it'll return what you're looking for or not. Are you able to run it and see exactly what it gets you?
Actual failure testing is likely better done by wrapping Write-S3Object in a bit of error handling.
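Putting those two suggestions together, a hedged sketch of post-upload verification with an email on failure (the key, bucket name, SMTP server, and addresses are all placeholders for the real environment):

```powershell
# After uploading, confirm the key exists in the bucket;
# Get-S3Object returns nothing if no matching key is found.
$key = 'sub1/myfile.txt.zip'
$uploaded = Get-S3Object -BucketName my-bucket -Key $key

if (-not $uploaded) {
    # Send-MailMessage ships with Windows PowerShell.
    Send-MailMessage -SmtpServer 'smtp.example.com' `
        -From 'uploads@example.com' -To 'admin@example.com' `
        -Subject "S3 upload failed: $key" `
        -Body "The object $key was not found in the bucket after upload."
}
```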