Giving the System user access to create files in a network location - Windows

I have a batch file which I want to run as SYSTEM, because the program the batch file launches refuses to run under an administrator account through Scheduled Tasks. As a last resort, I want to run the scheduled task as the local SYSTEM account. The batch file needs to save files to a network path, which is specified in the batch file. The problem is that the SYSTEM user doesn't seem to have the access needed to write to the network location. How can I give the local SYSTEM account access to write to the network path?
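One likely angle, if the machine is domain-joined: SYSTEM authenticates to other machines as the computer account (DOMAIN\COMPUTERNAME$), so that account needs permission on the target. A minimal PowerShell sketch, run on the file server; the share name, path, domain, and computer name below are invented for the example:

    # Run on the file server hosting the target share (all names are examples).
    # SYSTEM reaches the network as the machine account, so grant DOMAIN\PC$ access
    # at both the share level and the NTFS level.
    Grant-SmbShareAccess -Name "Drop" -AccountName "CONTOSO\WORKSTATION01$" -AccessRight Change -Force
    icacls "D:\Shares\Drop" /grant "CONTOSO\WORKSTATION01$:(OI)(CI)M"

If the machines aren't in a domain, there is no machine account to grant; the usual fallbacks are granting Everyone on the share, or mapping the path with explicit credentials (net use) inside the batch file itself.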

Related

How to run a VB6 app from a scheduled task without users being able to run it

We have a legacy VB6 application that automatically emails reports. It runs from a scheduled task on a server. Occasionally a user will run the exe - it's in a folder that we can't lock them out of, and it has to remain in that folder for reasons too complicated to go into here. Is there a way to prevent users from running the exe while still letting it run from the scheduled task? I can modify the source code for the exe, so that's an option if someone can help me figure out how.
I'm going to call your existing app AppChild and a new VB6 (or other programming language) program AppParent.
Modify AppChild to test for a command line parameter at either Sub Main() or at the first form loaded in the Form_Load() event. If the command line parameter isn't there, AppChild terminates.
AppParent would be in a location not accessible to the other users. The Scheduled task runs AppParent which runs AppChild and passes the required command line parameter. This could be secured somewhat by passing a calculated hash and decoding it in AppChild if needed.
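As a rough sketch of that launcher idea in a language other than VB6 (the answer allows either), a hypothetical AppParent could look like this in PowerShell; the secret, hash scheme, and paths are invented:

    # Hypothetical AppParent: compute a token that AppChild knows how to verify.
    $secret = "value-known-only-to-AppParent-and-AppChild"
    $sha256 = [System.Security.Cryptography.SHA256]::Create()
    $token  = [Convert]::ToBase64String(
        $sha256.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($secret)))
    # Launch AppChild with the token; AppChild exits if it is missing or wrong.
    Start-Process -FilePath "C:\Restricted\AppChild.exe" -ArgumentList $token -Wait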
Or, if the users don't have access to Scheduled Tasks, you could just run AppChild directly, passing the required parameter from the scheduled task. If the users do have access to the scheduled task, this won't work, because they could then see the passed parameter and create a shortcut that passes it.
You didn't state which OS the server is running, but you may have problems using network resources if you try to run the scheduled task without a logged-in user. Task Scheduler got a major update to handle security issues and prevent hackers from running tasks without a logged-in user. Essentially, network resources, e.g. email, are not available unless a user is logged in.
https://technet.microsoft.com/en-us/library/cc722152(v=ws.11).aspx
The only way I found around that problem was to keep a machine running with a user that has the correct permissions logged in all the time.
Are you sure you cannot lock the user out?
You could restrict access to the folder so that the user cannot access it and set up the scheduled task to use an account with access to the folder.
Although the users can't be locked out of the folder (perhaps the reports end up in there?), in Windows you can set permissions on a per-file basis. Make a new user that has full rights (the same as your users). Schedule the VB6 app to run as that user. Then remove the regular users' rights to see the app, by changing the permissions on just the VB6 exe.
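A hedged example of that per-file permission change (the path and account names are placeholders): strip the inherited permissions from just the exe, then grant access only to administrators and the account the task runs as:

    # All names are examples. Remove inherited permissions from the exe only,
    # then grant access to the task account and to administrators.
    icacls "C:\Reports\ReportMailer.exe" /inheritance:r
    icacls "C:\Reports\ReportMailer.exe" /grant "CONTOSO\svc-reports:RX" "BUILTIN\Administrators:F"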

Jenkins Job (In Windows environment) not able to access Shared locations

I am trying to schedule a batch job in Jenkins (Windows environment) for a Windows EXE program (implemented in .NET).
This program refers to a shared network location (viz. \\shared network.net\sample path) for reading from and writing to files.
When I run this program independently, outside of Jenkins, it works fine, since it runs under my login - a user who actually has access to the shared path.
However, when I run it through Jenkins, there is an access problem. From my program's logs I found that it runs as the 'NT AUTHORITY\SYSTEM' user.
I need to make the Jenkins job run under a particular user's credentials, one that has the relevant access to the shared path.
Please advise.
The Authorize Project Plugin allows you to run a job as a specific user.
Or, if you are executing from a bat script, you should be able to change the user in your script before running your program.
Several options:
Use "net use" to map the network location under the job's session using your credentials.
On your Windows slave you can go to Services -> Jenkins Slave -> Properties; there, under the "Log On" section, you can specify the user you want the service to run under.
I would definitely go with the first option, as it is much more manageable: if tomorrow you replace your slave, you would have to configure the service all over again, whereas the job carries its own mapping with it.
Good Luck!
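A minimal sketch of the first option as a build step (the share, service account, and password-in-an-environment-variable arrangement are assumptions for the example):

    # Hypothetical Jenkins build step (PowerShell): map the share inside the
    # job's own session, run the program, then remove the mapping.
    net use "\\fileserver.example.net\sample" $env:SHARE_PASS /user:CONTOSO\jenkins-svc
    & "C:\Jobs\MyProgram.exe"
    net use "\\fileserver.example.net\sample" /delete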

Powershell script on remote computer not running as a scheduled task

I have a script for my domain stored on the Active Directory server. Every machine on the domain has a task that, when fired, calls this script.
Running the task from the AD server works. Running the task from another machine doesn't. However, manually running the command that the task triggers from cmd on the remote computer works.
Could anyone shed some light on this? The task's action is set up like this:
Action: PowerShell.exe
Arguments: -NoProfile -ExecutionPolicy Bypass -File "\\<NameOfADServer>\C$\Tasks\script.ps1" "Argument 1" "Argument 2"
Running as SYSTEM is probably your issue - it won't have any access outside of the PC it's running on.
When you run it manually, you'll have the access.
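If the script may run under a real account instead, one hedged fix (task name, account, and schedule are invented) is to re-register the task with explicit credentials:

    # Hypothetical: register the task under a domain account instead of SYSTEM.
    # /RP * makes schtasks prompt for the password rather than take it inline.
    schtasks /Create /TN "RunDomainScript" /SC DAILY /ST 06:00 /RU "CONTOSO\svc-tasks" /RP * /TR "PowerShell.exe -NoProfile -ExecutionPolicy Bypass -File \\server\share\script.ps1"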
There are several problems here.
You're running the task as the local SYSTEM account. SYSTEM generally does not have access to any network resources.
You're using the administrative share (\\<servername>\C$) to share the script. Only users that have Administrator access to the server can access the administrative shares. Administrative shares are heavily restricted by design and you cannot modify the access on them.
My guess is that the script works when you run it manually because it then uses the current user's credentials for network access, but don't quote me on that.
The simplest solution with the least amount of change is to do this (a sketch of the server-side steps follows the list):
Create a group in Active Directory. Add to it the computer accounts (or, preferably, groups containing the computer accounts) that you want to be able to run the script. If you really want the SYSTEM account on every computer in the domain to be able to run the script, you can add the "Domain Computers" group.
Create a folder on the server. Put the script in the folder. Don't put anything in this folder you don't want your users to read. Assign the "Read" NTFS permission to the group created above to the folder.
Share the folder out. Grant the group you just created the "Full Control" share access. If you want, you can make it a hidden share by adding a dollar sign to the end of the name.
Update your scheduled tasks to use \\<servername>\<sharename>\script.ps1.
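A hedged PowerShell sketch of steps 2 and 3, run on the server; the folder, share, and group names are invented for the example:

    # All names are examples; run on the server that will host the share.
    New-Item -ItemType Directory -Path "D:\TaskScripts" -Force | Out-Null
    Copy-Item "C:\Tasks\script.ps1" -Destination "D:\TaskScripts"
    # NTFS: Read/Execute for the AD group that holds the computer accounts.
    icacls "D:\TaskScripts" /grant "CONTOSO\TaskScriptComputers:(OI)(CI)RX"
    # Share: the trailing $ hides it from casual browsing.
    New-SmbShare -Name "TaskScripts$" -Path "D:\TaskScripts" -FullAccess "CONTOSO\TaskScriptComputers"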
This is almost certainly not the best method to accomplish what you're actually trying to do, but this is probably the best way to use scheduled tasks running scripts on a network share with the SYSTEM account.

Preserving File Ownership on a Win7 Share

I am trying to set up a "dropbox" on a Win7 workstation that we will use to process simulation jobs. My plan was to pull the ownership from the file (with a simple dir /q "filename") so I can use the owner information during the simulation (for example, to send them an email when it's done).
The problem I have is that when a user drops a simulation file on the share I set up, the ownership is set to BUILTIN\Administrators. I have tried tweaking the share settings, but so far nothing seems to work.
I do have a workaround where users can embed their email address in the simulation file and I can pull that. But I'm trying to make it easier, as I know some users will forget to do that... Any ideas how to preserve the ownership information?
Quite possibly, you could embed the owner's email as an alternate data stream into the file. Read this link here.
And with a few powershell scripts, you could write the job owner into the file at submit time and extract it out on the remote machine at run time.
I believe the alternate data stream survives the command line copy command as long as NTFS is used everywhere as the filesystem.
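A small sketch of that idea (paths, stream name, and addresses are invented); PowerShell can read and write alternate data streams directly with the -Stream parameter:

    # Submit side: stamp the owner's email into a named stream on the job file.
    Set-Content -Path "\\winbox\dropbox\job42.sim" -Stream "owner.email" -Value "alice@example.com"
    # Processing side: read it back when the simulation finishes.
    $owner = Get-Content -Path "C:\dropbox\job42.sim" -Stream "owner.email"
    Send-MailMessage -To $owner -From "sim@example.com" -Subject "Job 42 complete" -SmtpServer "smtp.example.com"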

service doesn't behave the same as command line

I am running on a Windows Server 2003. This is my problem:
I wrote a Perl script to automate the copy of some files from my Server machine to some network drives. I am using xcopy to copy the files. My problem is the permissions.
If I run the script from the command line, it works: all the copies are successful.
If I try to run the script from a service, all the copies fail. The service is a program I wrote that takes the script and runs it; in the background, all it does is call the C function 'system' to run the same program I can run from the command line.
I have tried many variations of this to figure out what is wrong, but I can't see why the service would not behave the same way as when I run the script from the command line.
I set up the service to run as the same user I am using from the command line.
I also tried mapping the network drives as the user that has write permission, but the result is the same. Manually, the script works; from the service, it doesn't.
Any suggestion is appreciated.
Thanks
Tony
The service may be running as the system account and not have access to the network drives. In the service's settings, change it to run under your account (or an account with the relevant permissions/mappings).
When the service runs, it uses whatever credentials you specify in the Services manager of Windows. The default, LOCAL SERVICE, probably does not have permission to access the resources to be copied.
Create a new user account with the minimum set of permissions needed to perform the copy and configure your service to run under that account.
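For example (service name, account, and password are hypothetical), the logon account can be set from an elevated prompt; the account also needs the "Log on as a service" right:

    # Hypothetical names. Note: sc.exe requires the space after "obj=" and "password=".
    sc.exe config MyCopyService obj= "CONTOSO\svc-copy" password= "P@ssw0rd"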
I did figure out the issue (I think), and it matches what I later found in another post:
https://serverfault.com/questions/4623/windows-can-i-map-a-network-drive-for-a-service-account
"Persistent drive mappings are only restored during an interactive login, which the service does not use. I believe the only way to get a service to use a network drive is for that service to map the drive itself, or alternatively for it to use a UNC path instead of a mapped drive."
What I did was map the drive from within the service, and that seems to work. It turns out that if I map the drive and save the credentials, I can later access the drive without having to map it again. I don't know why this approach works, though.
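A hedged sketch of both workarounds from the service's own context (server, share, account, and paths are invented): map the drive inside the service session before copying, or skip drive letters entirely and xcopy straight to the UNC path:

    # Option 1: map inside the service session, then use the letter.
    net use Z: \\fileserver\backup P@ssw0rd /user:CONTOSO\copyuser
    xcopy "C:\data\*" "Z:\data\" /E /Y
    net use Z: /delete
    # Option 2: no mapping at all; copy straight to the UNC path.
    xcopy "C:\data\*" "\\fileserver\backup\data\" /E /Y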
-Thanks everybody for your help.
Tony
