Bitley
asked on
Force PowerShell to wait for cmdlet execution to end
Before I forget, credit where credit is due for as far as I've gotten: the following code is adapted from http://mow001.blogspot.com/2006/01/msh-out-zip-function.html.
Okay, here's my challenge: I would like to create a function that will accept a series of filenames (full paths) as arguments, and zip them ONE AT A TIME to the same folder and filename, but with ".zip" appended.
So [PS> zipfiles "g:\test\file1.txt" "g:\test\file2.txt"] will generate [g:\test\file1.txt.zip] and [g:\test\file2.txt.zip].
Here's something that works:
function zipFiles {
    for( $i = 0; $i -lt $args.length; $i++ ){
        # Define the source and destination file paths.
        $FileName = $args[$i]
        $ZipName = $FileName + ".zip"
        # Create code to zip the file.
        $cmds = ' $' + "ZipFile = (new-object -com shell.application).NameSpace('$ZipName');"
        $cmds += ' $' + "ZipFile.CopyHere('$FileName');"
        # Execute the code in a separate process and wait for it to exit, so we won't overload the machine.
        write-host "Zipping '$FileName'..."
        set-content $ZipName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        [diagnostics.process]::start("powershell", "-noexit -command & {$cmds}").waitforexit()
        write-host "...done."
    }
}
...BUT the "-noexit" is highly undesirable, since there are a lot of files to be zipped and I would like to be able to start it and walk away, rather than sit there and close the secondary window for each file.
The problem is, if I remove the "-noexit" it doesn't work. The secondary window doesn't wait for CopyHere() to finish; it just opens and closes in a flash without doing anything.
The ideal solution would be something that tells PowerShell to wait for a given cmdlet to finish executing before going on to the next such that the secondary window wouldn't even be necessary.
Hopefully the solution is obvious to the Shellistas here, but I'm new and confused.
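One workaround that keeps the separate-process structure is to make the child script itself wait for CopyHere() to finish before exiting: CopyHere() returns immediately, so without -noexit the child process ends before the copy has happened. This is a sketch of my own, not something from the thread, and the Items().Count polling loop is an assumption about how to detect completion with Shell.Application; it replaces the $cmds / process-start lines inside the loop above:

```powershell
# Build a child script that polls the archive until the file has been
# copied in, then exits on its own (so -noexit is no longer needed).
$cmds  = "`$zip = (new-object -com shell.application).NameSpace('$ZipName');"
$cmds += "`$zip.CopyHere('$FileName');"
# CopyHere() is asynchronous: spin until the archive contains the entry.
$cmds += "while (`$zip.Items().Count -lt 1) { start-sleep -milliseconds 250 }"
# The child now exits only once the copy is complete, so this wait works.
[diagnostics.process]::start("powershell", "-command & {$cmds}").waitforexit()
```

The backtick before each `$zip` keeps that variable unexpanded so it is evaluated in the child process, while $ZipName and $FileName are expanded in the parent as before.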
ASKER
Thanks Chris, good question - quite possibly the separate process is giving me nothing.
All I'm trying to accomplish is to single-thread (I believe that's the right term here) the execution so that one file at a time is compressed. Without the separate process and the .waitforexit, as many zip processes as there were files would be spawned simultaneously, doing Bad Things to the machine since some of the files are multi-GB.
In case there's any question about it, the important part is the end result: zipping multiple files one at a time. Feel free to ignore the code listing - if there's a completely different and better way to accomplish the goal, I'm all ears!
ASKER CERTIFIED SOLUTION
ASKER
Thanks Peter, even though that was the kind of answer I was afraid of!
I'll leave this thread here for another day or two to see if there are more comments.
Anybody else? Anybody?
SOLUTION
ASKER
Thanks guys, even though the answer wasn't what I was hoping for! Have a great weekend.
If it's doing it one at a time why not just do each inline rather than spawning a separate PowerShell process? What does the separate process give you here?
Chris
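Chris's inline suggestion could be sketched like this: do everything in the current process and poll Shell.Application until each archive actually contains its entry, which serializes the work with no secondary window at all. This is a sketch under my own assumptions (the Items().Count polling loop in particular), not the thread's accepted solution:

```powershell
function zipFiles {
    $shell = new-object -com shell.application
    foreach ($FileName in $args) {
        $ZipName = $FileName + ".zip"
        write-host "Zipping '$FileName'..."
        # Seed an empty zip: the "PK" end-of-central-directory signature
        # followed by 18 zero bytes.
        set-content $ZipName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        $ZipFile = $shell.NameSpace($ZipName)
        $ZipFile.CopyHere($FileName)
        # CopyHere() returns immediately; wait until the entry shows up
        # so only one file is being compressed at a time.
        while ($ZipFile.Items().Count -lt 1) { start-sleep -milliseconds 250 }
        write-host "...done."
    }
}
```

Called exactly as in the question, e.g. `zipFiles "g:\test\file1.txt" "g:\test\file2.txt"`, this compresses the files strictly one after another. Note that NameSpace() needs full paths, which the question already supplies.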