I am trying to use a PowerShell plugin for Amazon S3 to schedule synchronizations of backup files from a local folder to S3.
I can make it work if I make a script for each folder I want to sync, but what I'd like to do is pass an argument from the command line to select the local and remote folders that the job should sync.
I'm sure this is pretty basic, but my scripting abilities are minimal.
Here is what I have, but it does not work:
*********************************************************
$server=$args[0]
Set-Logging -LogPath S:\ps-log -LogLevel info
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$s3 = Get-CloudS3Connection -Key XXXXXXXX -Secret XXXXXXXXX
$dest = $s3 | Select-CloudFolder -Path "phxbackup02/VMbackups/" + $server + "@10.99.0.25"
$src = Get-CloudFilesystemConnection | Select-CloudFolder "S:\Local_Backup\" + $server + "@10.99.0.25\"
Set-CloudOption -UseChunks 1 -ChunkSizeKB 1024000
$src | Copy-CloudSyncFolders $dest -IncludeSubfolders -DeleteOnTarget
*********************************************************************
My problems are on lines 5-6 of the script (the $dest and $src assignments).
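(From what I can tell, the usual fix for those two lines is this: PowerShell does not evaluate a bare + expression in an argument position, so -Path "..." + $server + "..." passes the pieces as separate arguments instead of one string. Wrapping the expression in parentheses, or interpolating the variable inside double quotes, makes it a single argument:)

*********************************************************
# Parenthesize so the concatenation is evaluated first:
$dest = $s3 | Select-CloudFolder -Path ("phxbackup02/VMbackups/" + $server + "@10.99.0.25")

# Or interpolate inside double quotes; $(...) keeps the variable
# name unambiguous before the "@":
$dest = $s3 | Select-CloudFolder -Path "phxbackup02/VMbackups/$($server)@10.99.0.25"
*********************************************************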
I just want to be able to change one piece of both paths via the $server variable, for example by running the command .\script.ps1 "DC01".
Do I need to create 2 more variables for the source and destination paths?
Something like $s3path = "phxbackup02/VMbackups/" + $server + "@10.99.0.25" ?
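(Roughly what I have in mind is the sketch below; it assumes the CloudBerry snap-in cmdlets behave as in my script above, and the $s3path/$localpath names are just my own choice. A param block is a more explicit alternative to $args[0].)

*********************************************************
param([string]$server)

Set-Logging -LogPath S:\ps-log -LogLevel info
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn

# Build both paths up front; double-quoted strings expand $(...)
# inline, so one argument drives both the local and remote side.
$s3path    = "phxbackup02/VMbackups/$($server)@10.99.0.25"
$localpath = "S:\Local_Backup\$($server)@10.99.0.25\"

$s3   = Get-CloudS3Connection -Key XXXXXXXX -Secret XXXXXXXXX
$dest = $s3 | Select-CloudFolder -Path $s3path
$src  = Get-CloudFilesystemConnection | Select-CloudFolder -Path $localpath

Set-CloudOption -UseChunks 1 -ChunkSizeKB 1024000
$src | Copy-CloudSyncFolders $dest -IncludeSubfolders -DeleteOnTarget
*********************************************************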
.\script.ps1 "DC01" will back up "S:\Local_Backup\DC01@10.9