I wrote this long ago and am trying to maintain it now. While this seems rather obvious, I'm sure that it isn't:
# Start_Script_08.ps1
# Has been run for this script: Set-ExecutionPolicy Unrestricted
# Copies WMI_08.ps1 into Env:
Copy "\\fileserver\Scanner\0 AA Scanner\Scripts\WMI_08.ps1" "$($Env:Temp)"
pause
Import-Module "$($Env:Temp)\WMI_08.ps1"
pause
So, why would those pauses be there? If I comment them out, the process never seems to run.

The objective here is to run a script (WMI_08.ps1) on a workstation, where the script resides on a file server. The process flow is:
- Run a .bat file, residing on the file server, as administrator: WMI_08.bat. This runs a PowerShell -ExecutionPolicy Bypass command for Start_Script_08.ps1 (a sketch of that command follows below).
- The .bat file launches Start_Script_08.ps1, which also resides on the file server.
- Start_Script_08.ps1 copies the main script WMI_08.ps1 onto the workstation into $Env:Temp and runs it.

The script can be manually copied onto a workstation and run, but the idea is to avoid manual operations by executing the preliminary scripts as above.
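For reference, the .bat file's contents aren't shown in the question, but the command it runs is presumably a one-liner along these lines (paths reused from the script above; the exact switches are an assumption):

powershell.exe -ExecutionPolicy Bypass -File "\\fileserver\Scanner\0 AA Scanner\Scripts\Start_Script_08.ps1"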
For now, I'd be happy to know what those pauses do and don't do. For example: should one or both be commented out? That doesn't seem to work.
Powershell, Windows Batch, Script Task
Seth Simmons
So, why would those pauses be there?
Could be to see the output of the previous command before it continues; comes in handy for debugging.
It will sit there and say 'press any key to continue . . .'
Aside from that, it does nothing.
Should one or both be commented out?
To be fully automated, remove both; otherwise it will sit there forever when it hits the first one.
Dustin Saunders
I'd imagine the file is small enough to fit in the cache and your copy is not blocking the process until the file exists. Copy-Item should block, but if the file is tiny you might have an issue.
You could do a Start-Sleep to block for a short period before moving on. If you have scripts that have file dependencies you could put them in a waiting pattern with Test-Path to make sure the file exists before moving on. Something like:
while (!(Test-Path -Path "\\some.path")) { ... } will continue to wait in a loop until the file exists in the target location. Likewise, after Import-Module you can do a similar wait using Get-Module to make sure it's loaded before moving on.
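A minimal sketch of that waiting pattern, assuming the file names from the question (the one-second interval and the Get-Module check are illustrative, not something the original answer specified):

$target = "$($Env:Temp)\WMI_08.ps1"

# Wait until the copied file is actually visible on the workstation
while (!(Test-Path -Path $target)) {
    Start-Sleep -Seconds 1
}

Import-Module $target

# Optionally confirm the module really loaded before continuing
if (-not (Get-Module -Name WMI_08)) {
    Write-Warning "WMI_08.ps1 did not load"
}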
Copy-Item works as any cmdlet and is non-blocking. Thus its per-file output makes your script continue.
Sorry, but I don't agree with that statement.
Standing as a single command, the copy has to be performed completely before script execution continues (that is, all files are copied). Only if the cmdlet is part of a pipeline is execution of the next step in the pipeline performed on a per-object basis. But even that is not asynchronous; each file has to be copied successfully before it is passed down the pipeline.
A different behaviour would cause havoc, as you would not be able to reliably tell whether the prior cmdlet has done its work yet.
Probably adding "| Out-Null" above adds a slight delay, allowing a cache or something similar to get in sync. A simple Start-Sleep after the copy should achieve much the same.
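To make the standalone-versus-pipeline distinction concrete, here is a small illustration (the folder path is reused from the question; the wildcard and the -PassThru / ForEach-Object plumbing are purely illustrative):

# Standalone: the next statement only runs once every matching file has been copied
Copy-Item "\\fileserver\Scanner\0 AA Scanner\Scripts\*.ps1" $Env:Temp
Write-Host "All files copied"

# Pipeline: each file is handed to the next stage as soon as that file's copy is done,
# but every individual copy still completes before the object is passed along
Get-ChildItem "\\fileserver\Scanner\0 AA Scanner\Scripts\*.ps1" |
    Copy-Item -Destination $Env:Temp -PassThru |
    ForEach-Object { Write-Host "Copied $($_.Name)" }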
Being an engineer by background, I'd be looking for a conceptual timing diagram.
My old and current implementation does this with no knowledge or regard for delays and asynchronicity:
1) Batch file run as administrator sets ExecutionPolicy Bypass on a particular .ps1 file.
2) That particular .ps1 file copies the "main" .ps1 file to the local workstation using Copy.
3) Import-Module operates on that .ps1 file... and, as I understand it, combines it into the current execution.
So, as I see it now, to be fully cautious one might:
1) Batch file run as administrator sets ExecutionPolicy Bypass on a particular .ps1 file.
1a) Best: wait for the action to be taken; or, workable: give it enough time to complete.
2) That particular .ps1 file copies the "main" .ps1 file to the local workstation using Copy.
2a) Best: wait for the Copy to complete (see the sketch after this list); or, workable: give it enough time to complete. I imagine this is the most likely culprit if the script is going on to the next command asynchronously.
3) Import-Module operates on that .ps1 file... and, as I understand it, combines it into the current execution.
3a) Probably no need to wait here? And how would one wait when using Import-Module?
Is that it?
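One hedged way to make 2a concrete is to compare the source and destination file sizes after the copy, which is essentially the length-polling idea raised in the next comment (names reused from the question; the ten-second cap is arbitrary):

$Source      = "\\fileserver\Scanner\0 AA Scanner\Scripts\WMI_08.ps1"
$Destination = Join-Path $Env:Temp "WMI_08.ps1"

Copy-Item $Source $Destination

# 2a: poll until the destination exists and matches the source length, up to ~10 seconds
$expected = (Get-Item $Source).Length
for ($i = 0; $i -lt 10; $i++) {
    if ((Test-Path $Destination) -and ((Get-Item $Destination).Length -eq $expected)) { break }
    Start-Sleep -Seconds 1
}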
Qlemo
Step 1 does not need any additional care; setting the policy is immediate.
Step 2: There does seem to be an issue with not waiting here (for unknown reasons), but there is no way to know whether the copy is really complete, short of polling the length/size of the file and checking whether it has changed, so I would add a short sleep as recommended earlier.
# Set-ExecutionPolicy Bypass has been run for this script with a .bat file run as administrator.
# Copies WMI_08.ps1 into Env:
$Source = "\\fileserver\Scanner\0 AA Scanner\Scripts\WMI_08.ps1"
$Destination = "$($Env:Temp)"
Copy-Item $Source $Destination | Out-Null
Start-Sleep -Seconds 5
Import-Module "$($Env:Temp)\WMI_08.ps1"
pause
But, so far it's done no good. The script WMI_08.ps1 doesn't seem to run unless I download it, create a new local file, and run as admin. In reading more about this, I see that Import-Module is about running a *Module* that needs to be saved in a particularly-named folder, etc. Might that be part of the problem? I don't know why all this worked before...
Qlemo
I guess there is a lot of confusion here, and that is the real issue. Yes, Import-Module just imports the module definition; it does not (usually) execute anything. If you just want to run a script, use "dot source":
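The dot-source call being suggested would look something like this (path reused from the revised script above; the leading dot followed by a space is the operator):

# Dot-sourcing runs the script in the current scope instead of importing it as a module
. "$($Env:Temp)\WMI_08.ps1"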