I'm provisioning a Windows VM that needs to run some PowerShell code when it boots. It also needs to run some different code when it shuts down.
To do the former, I can use New-JobTrigger and Register-ScheduledJob in my initial provisioning script like so:
$StartupTrigger = New-JobTrigger -AtStartup
Register-ScheduledJob -Name "Startup Job" -Trigger $StartupTrigger -Credential $DesiredCredentials -ScriptBlock {
    Do-InterestingThings $using:ExternalResource
}
It doesn't even have to be a separate script file; it can just be a script block. Any variables from an outer scope will be serialized and used when the job runs. Pretty neat.
The real problem I'm solving involves creating an external resource whose lifetime is tied to the VM's uptime. When the VM is created, this resource will be created. When the VM is shut down, this resource needs to be cleaned up. How can I use PowerShell to run some code just before the VM shuts down (regardless of how the shutdown was initiated)? It doesn't need to be a script block; it can be a separate script file.
There are two reasonable ways to do this:
Local Group Policy:
This can be done in the Local Group Policy Editor (gpedit.msc). Navigate to Computer Configuration/Windows Settings/Scripts (Startup/Shutdown)/Shutdown. You can add 'Scripts' and/or 'PowerShell Scripts' here, which get executed before other shutdown processes.
Event-Based Scheduled Task:
From this answer:
Create a scheduled task as follows:
Type: On Event (Basic)
Log: System
Source: User32
Event ID: 1074
When a user or command initiates a shutdown or restart as a logged-on user or on a user's behalf, event ID 1074 is logged. By creating a task that uses this event to trigger a script, Task Scheduler will start the script and allow it to finish.
Note that this does not delay the shutdown (so the script has to be quick) and can sometimes fail to trigger before the Task Scheduler service itself stops.
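If you'd rather not click through the Task Scheduler GUI during provisioning, here is a minimal sketch of registering that event-triggered task from PowerShell. The task name and script path are placeholders; the trigger is built through the MSFT_TaskEventTrigger CIM class because New-ScheduledTaskTrigger has no "on an event" option.
$triggerClass = Get-CimClass -Namespace Root/Microsoft/Windows/TaskScheduler -ClassName MSFT_TaskEventTrigger
$trigger = New-CimInstance -CimClass $triggerClass -ClientOnly
$trigger.Enabled = $true
# Fire on System log, source User32, event ID 1074 (shutdown/restart initiated)
$trigger.Subscription = @"
<QueryList><Query Id="0" Path="System">
  <Select Path="System">*[System[Provider[@Name='User32'] and EventID=1074]]</Select>
</Query></QueryList>
"@
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Cleanup-ExternalResource.ps1'
Register-ScheduledTask -TaskName 'Cleanup on shutdown' -Trigger $trigger -Action $action -User 'SYSTEM' -RunLevel Highest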
Final Note:
Always make sure that your code can handle a dirty shutdown. After all, the fastest way to call a reboot is the power button...
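One hedged way to cope with that: since the startup job already runs at every boot, it can clean up anything left behind by a dirty shutdown before creating the resource again. Test-ExternalResource, Remove-ExternalResource, and New-ExternalResource below are hypothetical placeholders for whatever actually manages the external resource.
$StartupTrigger = New-JobTrigger -AtStartup
Register-ScheduledJob -Name "Startup Job" -Trigger $StartupTrigger -Credential $DesiredCredentials -ScriptBlock {
    # If the shutdown script never ran (power cut, hard reset), an orphaned resource may still exist
    if (Test-ExternalResource $using:ExternalResource) {
        Remove-ExternalResource $using:ExternalResource
    }
    New-ExternalResource $using:ExternalResource
}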
Related
I want to know if it is possible to configure a service to call a batch/PowerShell script when I stop it from services.msc.
While Linux init.d services are fully programmable and even systemd services can have additional procedures, I've yet to find a way to accomplish this on Windows.
Thanks in advance
You can configure services to run a program on failure, but if you are stopping the service via services.msc then that likely wouldn't count as a failure.
The only other option I can think of would be to set up a PowerShell script running as a scheduled task that either periodically checks the service's running status or (for a more foolproof option) looks at the event log for events indicating that the service has stopped since the last time the script checked, and then performs whatever actions you require.
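A minimal sketch of the polling variant, assuming the service is called 'MyService' and Invoke-OnServiceStopped stands in for whatever action you need; a marker file keeps the action from firing again on every poll while the service stays down.
$serviceName = 'MyService'
$marker      = "$env:ProgramData\$serviceName.stopped.marker"
$status = (Get-Service -Name $serviceName).Status
if ($status -eq 'Stopped' -and -not (Test-Path $marker)) {
    Invoke-OnServiceStopped -ServiceName $serviceName          # hypothetical cleanup/notification
    New-Item -ItemType File -Path $marker -Force | Out-Null    # remember we already reacted
}
elseif ($status -eq 'Running' -and (Test-Path $marker)) {
    Remove-Item $marker                                        # reset once the service is back up
}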
Per the comment from montonero, you wouldn't need to run the scheduled task periodically as it could be configured to run when the event itself occurs. This is described here: https://blogs.technet.microsoft.com/wincat/2011/08/25/trigger-a-powershell-script-from-a-windows-event/
Use the Event Viewer "Attach Task to This Event…" feature to create the task: launch Event Viewer and find the event, then right-click the event and select "Attach Task to This Event…".
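For reference, here is a hedged command-line sketch of the same event-triggered idea, using schtasks.exe /SC ONEVENT against System-log event 7036 ("The ... service entered the stopped state"). The task name and script path are placeholders, and because 7036 fires for every service, the triggered script should check which service actually stopped (e.g. with Get-Service) before acting.
schtasks.exe /Create /TN "OnServiceStopped" /RU SYSTEM /SC ONEVENT /EC System `
    /MO "*[System[Provider[@Name='Service Control Manager'] and EventID=7036]]" `
    /TR "powershell.exe -NoProfile -File C:\Scripts\On-ServiceStop.ps1"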
I'm trying to use PowerShell to get the NextRunTime for some scheduled tasks. I'm retrieving the values, but they don't match what I'm seeing in the Task Scheduler management console.
For example, in the Task Scheduler console my "TestTask" has a Next Run Time value of "1/9/2018 12:52:30 PM", but when I do the following call in PowerShell it shows "12:52:52 PM" for the NextRunTime.
Get-ScheduledTask -TaskName "TestTask" | Get-ScheduledTaskInfo
From what I've seen, the seconds value returned by the PowerShell Get-ScheduledTaskInfo cmdlet is always the same as the minutes value. I'm wondering if there's a time-formatting error (hh:mm:mm instead of hh:mm:ss) in that cmdlet, but I have no idea how to check for that.
The task runs at the exact time shown in the console, so that makes me think the issue is with the PowerShell call.
Has anyone seen this issue before and know how to get the correct NextRunTime value in PowerShell? I'm also seeing the same issue with the LastRunTime value.
I've tried this on Windows Server 2016 and Windows 10 and get the same results on both operating systems.
I can confirm that I see the same issue on Server 2012 R2 as well. You can get the correct information by using the Task Scheduler COM object: get the root folder (or whichever folder your task is stored in, but most likely it's the root folder), and then get the task info from that. Here's how you'd do it:
$Scheduler = New-Object -ComObject Schedule.Service
$Scheduler.Connect()
$RootFolder = $Scheduler.GetFolder("\")
$Task = $RootFolder.GetTask("TestTask")
$Task.NextRunTime
It's probably also worth noting that you can use the Connect() method to connect to the Task Scheduler on other computers (if you have rights to access their Task Scheduler) and get information about their tasks, stop or start them, create new tasks, and so on. There is a lot of good stuff here if you don't mind not using the *-ScheduledTask* cmdlets.
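For example, a hedged sketch of querying a remote machine ('FILESERVER01' is a placeholder; you need rights on its Task Scheduler):
$Scheduler = New-Object -ComObject Schedule.Service
$Scheduler.Connect('FILESERVER01')      # also accepts ('server', 'user', 'domain', 'password')
$RemoteTask = $Scheduler.GetFolder("\").GetTask("TestTask")
$RemoteTask.NextRunTime
$RemoteTask.LastRunTime
$RemoteTask.Run($null)                  # start the task remotely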
We have a legacy VB6 application that automatically emails reports. It runs from a scheduled task on a server. Occasionally a user will run the exe - it's in a folder that we can't lock them out of, and it has to remain in that folder for reasons too complicated to go into here. Is there a way to prevent users from running the exe while still letting it run from the scheduled task? I can modify the source code for the exe, so that's an option if someone can help me figure out how.
I'm going to call your existing app AppChild and a new VB6 (or other program language) program AppParent.
Modify AppChild to test for a command line parameter at either Sub Main() or at the first form loaded in the Form_Load() event. If the command line parameter isn't there, AppChild terminates.
AppParent would be in a location not accessible to the other users. The Scheduled task runs AppParent which runs AppChild and passes the required command line parameter. This could be secured somewhat by passing a calculated hash and decoding it in AppChild if needed.
Or, if the users don't have access to the Scheduled Tasks, you could just run AppChild, passing the required parameter from the Scheduled Task. If the users do have access to the Scheduled Task, this won't work because they could then see the passed parameter and create a shortcut which passes the required parameter.
You didn't state which OS the server is running, but you may have problems using network resources if you try to run the Scheduled Task without a logged-in user. The Task Scheduler got a major update to handle security issues and prevent hackers from running tasks without a logged-in user. Essentially, network resources, e.g. email, are not available unless a user is logged in.
https://technet.microsoft.com/en-us/library/cc722152(v=ws.11).aspx
The only way I found around that problem is to run a machine with a user with the correct permissions logged in all the time.
Are you sure you cannot lock the user out?
You could restrict access to the folder so that the user cannot access it and set up the scheduled task to use an account with access to the folder.
Although the users can't be locked out of the folder (perhaps the reports end up in there?), in Windows you can set permissions on a per-file basis. Make a new user that has full rights (the same as your users). Schedule the VB6 app to run as that user. Remove the regular users' rights to see the app. You do this by changing the permissions on just the VB6 app.
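As a hedged sketch of those permission changes (the path and the 'ReportRunner' account are placeholders), something like this from an elevated PowerShell prompt would do it:
# Stop the exe inheriting permissions from the folder, then explicitly allow only
# administrators (full control) and the dedicated task account (read & execute)
icacls "C:\Apps\Reports\LegacyReport.exe" /inheritance:r /grant "Administrators:(F)" "ReportRunner:(RX)"
Then point the scheduled task at the ReportRunner account; regular users will get "Access is denied" if they try to launch the exe directly.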
Until we can implement our new HEAT SM system, I need to create some workflows to ease our currently manual user-administration processes.
I intend to use PowerShell to execute the actual tasks, but I need to use VBS to send an argument to PS from an app.
My main question on this project is, Can an argument be sent to an already running Powershell process?
Example:
We have a PS menu app that we will launch in the AM and leave running all day.
I would love for there to be a way to allow PS to listen for commands/args and take action on them as they come in.
The reason I want to do it this way is that one of the tasks needs to disable Exchange features, and the script will need to establish a connection to a remote PSSession, which in our environment can take between 10 and 45 seconds. If I were to invoke the command directly from HEAT (call-logging software), it would lock up, preventing the tech from moving on to another case until the script terminates.
I have searched all over for similar functionality, but I fear that this is not possible with PS.
Any suggestions?
I had already set up a script to follow this recommendation, but I was curious to see if there was a more seamless approach.
As suggested in the comments by @Tony Hinkle:
I would have the PS script watch for a file, and then have the VBScript create that file with the arguments. You would either need to start the watcher on another thread (since the menu is waiting for user input), or just use a separate script that in turn starts another instance of the existing PS script with a param used to specify the needed action.
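A minimal sketch of that file-watcher idea, assuming a drop folder of C:\HeatWorkflows\Inbox and a hypothetical Disable-UserExchangeFeatures helper; the VBScript side only has to write a one-line text file into the folder.
$dropFolder = 'C:\HeatWorkflows\Inbox'
$null = New-Item -ItemType Directory -Path $dropFolder -Force

while ($true) {
    Get-ChildItem -Path $dropFolder -Filter '*.txt' -ErrorAction SilentlyContinue | ForEach-Object {
        # Each file holds one request, e.g. "DisableExchange jdoe"
        $action, $arguments = (Get-Content -Path $_.FullName -Raw).Trim() -split '\s+', 2
        switch ($action) {
            'DisableExchange' { Disable-UserExchangeFeatures -Identity $arguments }   # hypothetical helper
            default           { Write-Warning "Unknown action '$action'" }
        }
        Remove-Item -Path $_.FullName    # consume the request
    }
    Start-Sleep -Seconds 5
}
Because the slow remote session lives in this watcher (or in the script it launches), HEAT only has to drop a file and the tech can move on immediately.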
I have created a Windows task that runs under an admin account with highest privileges and executes a batch file every minute.
This batch file runs a PHP script to retrieve a webpage, after which it checks whether no page or the wrong content was returned.
If the result is negative, the batch routine kills the httpd process and its children using taskkill (I am currently dealing with a PHP hang that causes the Apache httpd process to hang as well).
This entire process works perfectly when executed while logged onto the machine as admin. However, when it runs as a task (and despite the admin privileges), the process does NOT get killed. There is no event or debug entry.
So my question is: why is taskkill unable to kill the process, how can I get more information, and what alternatives exist?