I'm writing an Ansible automation and I need to know whether it is possible to check if my account (a service account) is able to stop/start a Windows service (e.g., Schedule).
Sometimes the service account is not able to do this and the automation crashes. I would like to prevent the crash by checking whether I have the rights to start/stop the service on all machines.
Is there a PowerShell command, for example, to check this?
Riccardo
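A hedged sketch of one way to check this in PowerShell: read the service's security descriptor with sc.exe sdshow and test whether any of the account's SIDs is granted the SERVICE_START and SERVICE_STOP rights. 'Schedule' is just the example service from the question, and deny ACEs are not handled here:

# Read the service's SDDL and parse it into a security descriptor
$serviceName = 'Schedule'
$sddl = ((sc.exe sdshow $serviceName) -join '').Trim()
$sd = New-Object System.Security.AccessControl.RawSecurityDescriptor($sddl)

# SIDs for the current account and all of its groups
$identity = [System.Security.Principal.WindowsIdentity]::GetCurrent()
$mySids = @($identity.User) + @($identity.Groups)

# Service-specific access rights: 0x0010 = SERVICE_START, 0x0020 = SERVICE_STOP
$startStop = 0x0010 -bor 0x0020

$canControl = $false
foreach ($ace in $sd.DiscretionaryAcl) {
    if ($ace.AceType -eq 'AccessAllowed' -and
        $mySids -contains $ace.SecurityIdentifier -and
        ($ace.AccessMask -band $startStop) -eq $startStop) {
        $canControl = $true
    }
}
$canControl   # $true if the account appears to have start/stop rights

Run this under the service account on each machine (e.g., via win_shell in Ansible) before the real start/stop step, and skip or fail gracefully when it returns $false.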
I have a script on my domain stored on the Active Directory server. Every machine on the domain has a scheduled task that, when fired, calls this script.
Running the task from the AD server works. Running the task from another machine doesn't work; however, running the triggered command manually from cmd on the remote computer does work.
Could anyone shed some light on this? The task is set up so that the trigger looks like this:
Action: PowerShell.exe
Arguments: -noprofile -ExecutionPolicy Bypass -File "\\<NameOfADServer>\C$\Tasks\script.ps1" "Argument 1" "Argument 2"
Running as SYSTEM is probably your issue - it won't have any access outside of the PC it's running on.
When you run it manually, you'll have the access.
There are several problems here.
You're running the task as the local SYSTEM account. SYSTEM generally does not have access to any network resources.
You're using the administrative share (\\<servername>\C$) to share the script. Only users with Administrator access to the server can access the administrative shares. Administrative shares are heavily restricted by design, and you cannot modify the access on them.
My guess is that the script works when you run it manually because it uses the current user's credentials for network access, but don't quote me on that.
The simplest solution with the least amount of change is to do this (a PowerShell sketch follows the steps):
Create a group in Active Directory. Add the computer accounts (or, preferably, groups containing the computer accounts) that you want to be able to run the script to this new group. If you really want the SYSTEM account of every computer in the domain to be able to run the script, you can add the "Domain Computers" group to the group.
Create a folder on the server and put the script in it. Don't put anything in this folder that you don't want your users to read. Grant the group created above the "Read" NTFS permission on the folder.
Share the folder out. Grant the group you just created the "Full Control" share access. If you want, you can make it a hidden share by adding a dollar sign to the end of the name.
Update your scheduled tasks to use \\<servername>\<sharename>\script.ps1.
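A hedged PowerShell sketch of the four steps above, assuming the ActiveDirectory (RSAT) and SmbShare (Server 2012 or later) modules are available on the server; the group name ScriptRunners, the folder C:\Tasks, and the share name Tasks$ are illustrative placeholders:

# 1. Create the group and add the computer accounts (here: all domain computers)
New-ADGroup -Name 'ScriptRunners' -GroupScope Global -GroupCategory Security
Add-ADGroupMember -Identity 'ScriptRunners' -Members 'Domain Computers'

# 2. Create the folder and grant the group Read on it (NTFS)
New-Item -Path 'C:\Tasks' -ItemType Directory | Out-Null
$acl = Get-Acl 'C:\Tasks'
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    'DOMAIN\ScriptRunners', 'ReadAndExecute', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl 'C:\Tasks' $acl

# 3. Share the folder; the trailing $ makes the share hidden
New-SmbShare -Name 'Tasks$' -Path 'C:\Tasks' -FullAccess 'DOMAIN\ScriptRunners'

# 4. The scheduled tasks then point at \\<servername>\Tasks$\script.ps1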
This is almost certainly not the best method to accomplish what you're actually trying to do, but this is probably the best way to use scheduled tasks running scripts on a network share with the SYSTEM account.
I've been running into the same problem again and again for ages, so I decided to ask my question here:
I added a service account "ZYX" to the Administrators group of my Windows Server 2008 machine.
Whenever I try to run a scheduled task (running as "ZYX") that modifies a file located under a folder where the Administrators group has full control, my PowerShell script always gets "Access to the path xxxxxxx is denied".
When I check the effective permissions of my service account on this folder, it shows that every single permission is granted.
I found two ways to overcome the situation, but I find them really ugly:
Running the scheduled task with highest privileges
Adding the service account "ZYX" with full control in the folder's Security tab.
I'm starting to believe my service account only gets the rights inherited from the Administrators group when the shell runs in elevated mode.
Can someone explain to me why Windows manages rights like this?
Do you have any better solution for this?
Thanks
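This looks like UAC token filtering: when a member of Administrators runs a process without elevation, Windows gives it a filtered token in which the Administrators membership is disabled, so only permissions granted to the account directly apply. That would explain both workarounds above, since "highest privileges" requests the full token and the direct ACL entry doesn't depend on group membership. A quick hedged check of whether the current shell's token actually carries an active Administrators membership:

$identity = [System.Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object System.Security.Principal.WindowsPrincipal($identity)
# Returns $false in a non-elevated shell even if the account is in Administrators
$principal.IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator)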
I am having trouble when I have one Windows service try to install another Windows service.
Specifically, I have a TeamCity agent running tests for me on a Windows 2008 AWS instance. The tests are written in Java and shell out to a .bat script to install a service (let's call it Service A), giving it a unique name each time.
The offending line is in the .bat script: sc create "%serviceName%" binPath= %binPath% DisplayName= "%serviceDisplayName:"=%" start= %serviceStartType%. I believe as long as the service name is unique that should work.
And indeed it does work if I run the tests manually on the command line, using an administrator account. Service A is installed, the test completes and Service A is uninstalled at the end.
I have tried running the TeamCity agent as LocalSystem, as Administrator, and as another user that is member of the administrators group. I have also tried disabling UAC completely.
Presumably the problem is access denied type errors, although that is not clear at this point. There are a few avenues to explore still, but it is a simple question really: are processes running as services forbidden from installing other services? Are there special things I have to do to configure the machine/ account to allow it to do this?
The point of the test is to install and use Service A, so workarounds are not relevant; Service A must be operated as a black box.
Thanks!
There are no restrictions on creating services with regard to how the creating process executes, as long as the process has the appropriate permissions. That is to say, a process running as a service can create another service; the only consideration here is the appropriate permission level.
The problem that often occurs with running batch scripts from within processes (as opposed to directly through user input on the command line) is that the environment expected isn't always the environment that is loaded. In this case, it appears that the env variables referred to in the batch script weren't properly set when running as a service, which of course then caused the service install failure. Correcting the environment loaded when the batch script is shelled out is the correct solution here.
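For instance, a hedged sketch of the kind of sanity check that makes this failure obvious, here in PowerShell (the equivalent if "%serviceName%"=="" exit /b 1 lines could live in the .bat itself); the variable names are the ones from the question:

# Fail fast if the environment the install script expects is missing,
# rather than letting sc.exe create a service with empty parameters
foreach ($name in 'serviceName','binPath','serviceDisplayName','serviceStartType') {
    if (-not [Environment]::GetEnvironmentVariable($name)) {
        throw "Expected environment variable '$name' is not set"
    }
}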
I'm trying to create a scheduled task in a Group Policy that runs a script that lives on the domain periodically.
I understand that storing the password in GP is a no-no, so I'm avoiding that. However, it seems there is no way to deploy a scheduled task that runs with access to the network.
I tried the "System" account; that failed with access denied. I also tried using the "Do not store password" setting with a named account, which likewise prevents network access.
The scripts live in \\domain\NETLOGON land and have full read access granted to authenticated users.
Is there any way to accomplish this without having to manually install the task on every server and provide a named service account?
This is a Windows 2012 server domain with about 20 servers.
I ended up getting the "System" account to work correctly; when a task runs as SYSTEM, network access is authenticated as the computer account, which is itself an authenticated user and can therefore read NETLOGON.
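For reference, a hedged sketch of registering such a task with PowerShell on one of the servers (ScheduledTasks module, Server 2012 and later; the task name, schedule, and script path are illustrative):

# Task action: run the NETLOGON script with PowerShell
$action = New-ScheduledTaskAction -Execute 'PowerShell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File \\domain\NETLOGON\script.ps1'
# Example schedule: daily at 3 AM
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
# Run as SYSTEM; network access then uses the computer account
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'RunDomainScript' -Action $action -Trigger $trigger -Principal $principal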
I have 2 separate servers (Windows Server 2008 R2) from which I am running VBS scripts through the Microsoft Task Scheduler (My Computer > Manage > Schedule). When I run the VBS scripts locally they work fine, but when they are run through the scheduler, one of the servers gets stuck while the other works fine. I have also noticed in Task Manager that the working server runs WMIPRVSE.exe under the Network Service user, while the other one shows SERVICES as the user.
How can I make sure that WMIPRVSE.exe always runs under Network Service? Thanks
Edit:
I tried to change the log-on user in Services, but then the service failed to start.
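One hedged way to compare the two servers directly, instead of relying on Task Manager, is to list each WmiPrvSE.exe instance and the account it runs under (this works on the PowerShell 2.0 that ships with Server 2008 R2):

# List WMI provider host processes with their owning accounts
Get-WmiObject Win32_Process -Filter "Name='WmiPrvSE.exe'" |
    Select-Object ProcessId, @{n='Owner'; e={ $o = $_.GetOwner(); '{0}\{1}' -f $o.Domain, $o.User }}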
There are a few things I tried, but I don't know which one helped.
I granted all permissions on the wscript file, which is located somewhere in system32, and after some time it became Network Service. Again, I'm not really sure whether it was that change or something else.