How can I make a remote logon PowerShell script run as the locally logged-on user?

I'm trying to use the Microsoft User State Migration Tool with PowerShell; the way the program loadstate.exe works, it needs to be called locally so that it can load a remotely saved state of the user's profile and restore it to the local computer.
I am seeking to automate this with PowerShell; to do this, I have written a script that elevates the session so PowerShell runs with administrative rights, and then executes the following command against the USMT program that is installed on all of our computers:
c:\usmt\loadstate.exe /i:$configfile $storepath
Here $configfile is the name of the configuration file and $storepath is the location where the files are stored. This all works, except that when run remotely it seems to execute in the SYSTEM context: the desktop files are restored with ACL permissions for CREATOR OWNER, SYSTEM, and Administrators, rather than being granted to the actually logged-on user.
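For illustration, the call inside the script might look like this hedged sketch; the config file name and store path are assumptions, not from the original post:

# Hypothetical values for the two variables used above
$configfile = 'C:\usmt\MigUser.xml'
$storepath  = '\\fileserver\usmtstore\GuineaPig'

# Invoke loadstate.exe and surface a non-zero exit code
& 'C:\usmt\loadstate.exe' "/i:$configfile" $storepath
if ($LASTEXITCODE -ne 0) {
    Write-Error "loadstate.exe failed with exit code $LASTEXITCODE"
}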
So if I set it as a logon script and have it execute when the user GuineaPig logs in, it will restore the files - but all desktop items will be invisible to GuineaPig because she has no rights to see them.
If I just have the script locally, say on the user's C:\ drive, and then right-click and "Run as PowerShell Script", it works fine.
How do I execute this remote PowerShell script (located on our domain controller) so that it actually runs in the context of the local user?
Alternatively, how do I tell the Group Policy logon script to run something from the user's computer? I can have the script copied to the C:\ of every local computer; I just need the logon policy to actually run it from the C:\ of each computer as the locally logged-on user (a hedged sketch of that approach follows below).
Thanks in advance.
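One possible shape for the "copy locally, then run" approach, as a hedged sketch; the share path and script name are assumptions. A per-user GPO logon script runs in the logged-on user's context, so a thin wrapper like this keeps the real work on the local disk:

# Hedged sketch: copy the real script from the DC to the local machine,
# then run it there in the logged-on user's context.
$source = '\\dc01\NETLOGON\Restore-Profile.ps1'   # hypothetical path on the DC
$local  = 'C:\usmt\Restore-Profile.ps1'

Copy-Item -Path $source -Destination $local -Force
powershell.exe -ExecutionPolicy Bypass -File $local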

Related

GoCD configuration issue

I've been having an issue trying to add GitHub materials from a private repo on a Windows server.
I've seen lots of people suggesting how and where to add the SSH keys, but on Unix-based systems. I haven't found anything related to Windows servers.
I'm using the latest Go release and have installed the Go Server & Agent on Windows Server 2008 with Git installed.
I can connect to the private repo using Git Bash.
Whenever I try to add the materials, it keeps saying "Checking Connection" and seems to stay there forever.
If I use basic auth it works, but I would like to make it work without exposing my password in the URL.
Is there a way to do that?
If you run Go under the default Local System account, you can follow the suggestions from http://opensourcetester.co.uk/2013/06/28/jenkins-windows-ssh/ to set up the SSH keys for the Local System account.
If you run the Go Server under a domain account (and not the default Local System account), check whether you have uploaded your SSH keys to the %USERPROFILE%/.ssh/ folder on the server machine, %USERPROFILE% being the HOME folder for the domain user. Once you set that up, the Go server will be able to pick up the required keys. The same holds for the agent machines. Just so you know, Go does not invoke Git Bash internally to run Git commands, so any setup done in Bash will not take effect when Git runs from within Go.
If you are using a Windows machine to host the GoCD server and agents, it does not run under a normal user account; it runs under the "Local System Account".
So even though you can access your Git repo from Git Bash (logged in as the current user), GoCD cannot access it.
So you need to add the SSH keys for the Local System Account from your current user.
1. First find the home directory for the Local System Account (it will not reside under C:\Users).
2. Use any remote administration tool to find the home directory. If you go with http://download.sysinternals.com/files/PSTools.zip:
a) Unzip it and run a command line as administrator.
b) Run PsExec.exe -i -s cmd.exe to start a command prompt as the Local System account.
c) Run echo %USERPROFILE% to get the home directory (e.g. C:\Windows\system32\config\systemprofile).
3. Now you can either copy the SSH key files from the current user or create new ones using the ssh-keygen command.
Try "Checking Connection" again after creating/copying the SSH keys; it should show "Connection OK"!
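Putting steps 1-3 together, a hedged sketch; the PsTools path and key file names are assumptions:

# Run from an elevated prompt. This opens a command prompt as Local System:
& 'C:\PSTools\PsExec.exe' -i -s cmd.exe

# Inside that SYSTEM prompt, echo %USERPROFILE% prints the home directory,
# e.g. C:\Windows\system32\config\systemprofile. Back in a normal session,
# copy the current user's keys into that profile's .ssh folder:
$systemSsh = 'C:\Windows\system32\config\systemprofile\.ssh'
New-Item -ItemType Directory -Path $systemSsh -Force | Out-Null
Copy-Item "$env:USERPROFILE\.ssh\id_rsa*"     -Destination $systemSsh
Copy-Item "$env:USERPROFILE\.ssh\known_hosts" -Destination $systemSsh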

File ownership and access

I have an established workflow, but a change has caused some complications. An upstream Windows server delivers a file to my Solaris server where the file is accessed by my Windows 2003 server.
The problem is that either the ownership or permissions on a file delivered daily to the Solaris server has changed, and now the service running on my Windows server cannot copy and delete the file.
My Windows server has a parent directory on the Solaris server mapped and authenticated by User1.
The failing file comes in with an ownership of User2 and permissions of 664.
The failing file can be copied and deleted directly through Windows Explorer without additional authentication. A scheduled task batch file also can perform the copy and delete without authentication. It is only the running service which is unable to perform these tasks.
For comparison, there is a collection of files following the same workflow. These have an ownership of User1 and permissions of 755.
User1 is a member of User1.
User2 is a member of staff.
The Solaris directory holding the files has permissions of 755 and ownership of User1.
What change can I make to give my Windows services ongoing access to files with both ownerships?
UPDATE:
Had to use a persistent shell script to change the file ownership.

Permissions and SVN Updates on Windows Server 2008: same folder & SVN account, different Active Directory users

We're experiencing strange permission issues with SVN after switching from Windows Server 2003 to Server 2008.
On our standard build box there is a folder (C:\SVN_Code_Folder) which AD_User_A associates with an SVN repository using SVN_User and TortoiseSVN 1.7.6.
Under Windows 2003, when AD_User_B logs into the box and tries to Update, Switch, or Merge the SVN_Code_Folder with SVN_User, the command is executed.
Under Windows 2008, it fails with the message:
Command: Update
Error: Working copy 'C:\jboss-4.2.3.GA\server\New folder' locked
Error: sqlite: attempt to write a readonly database
Error: sqlite: attempt to write a readonly database
Completed!
Attempting to unlock the file, which was never locked, via the context menu is met with the following message:
There's nothing to unlock. No file has a lock in this working copy.
I've played with the permissions of the folder and discovered that giving "Domain Users" control over the folder fixes the issue, but I would prefer not to grant such broad permissions. I've tried granting the same permissions to individual users and to an SVN group, but these did not work either.
What am I missing?
Is this an improper use of SVN?
Can 2 different Domain users update a folder using SVN without removing the .SVN file?
For future reference...
I had this same problem with some working copies that I copied over to my new laptop's hard drive from a file share on my old machine.
It turned out that the problem was solved by giving myself (as opposed to all domain users, or any other group) full control over the folder.
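If granting a single account works in your environment too, a hedged sketch using icacls; the folder path and account name are assumptions:

# Grant one AD user full control of the working copy, recursively,
# instead of opening it up to all Domain Users.
icacls 'C:\SVN_Code_Folder' /grant 'DOMAIN\AD_User_B:(OI)(CI)F' /T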
Did you check the SVN service user on the Windows 2008 machine?
Does that user have local administrator privileges, and does it also have permission to these folders on C:\?
After changing anything, restart the service.
For me, changing folder permissions didn't help, but I run updates for several directories in a batch script, so I solved this by:
cmd -> Run as administrator -> start the update script

Jenkins calling batch file on mapped drive

I have a Jenkins job that calls a batch file on a ClearCase drive (V:).
My Jenkins slave agent is running as a service using a local admin account.
The Jenkins job does the following:
cleartool startview MY_VIEW
cd /d "V:\MY_VIEW\Build"
call PrepareBuild.bat
When I run the Jenkins job, I keep getting "Access is denied." in the Console Output when it tries to call the batch file. However, if I manually run the above in a command prompt, it completes successfully.
I did not have this problem under Windows XP. Does anybody know why this is happening on Windows 7 (32-bit)?
Thanks.
The V:\ is a virtual drive obtained with the Windows command subst.
It is a shortcut between the root directory of your dynamic view (M:\yourView) and the virtual drive.
(I.e., V:\ is not particularly linked to ClearCase. It is just a drive letter the user wishes to associate with a certain ClearCase view root directory.)
However, ClearCase registers that association in the registry under HKCU/software/atria/....
Which means the ClearCase session running under the local admin account for Jenkins won't know about said association and the need to restore that virtual drive.
A workaround would be to make that drive permanent, using psubst.
That registers the drive path in [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices], and HKLM is accessible from all accounts.
See " How to make SUBST mapping persistent across reboots? "
I had the same problem, and had a simpler solution.
Jenkins doesn't have access to folders that only the user has access to (even though it's run by the user). So for the folder that is getting "Access is denied", you need to set the folder permissions to Everyone, not just the user.

Move files to remote file share after build

I want to create a post build script that moves files from the build directory to a remote (UNC) file share.
This line:
xcopy "C:\TeamCityBuild\project\WebSite\*" "\\192.168.1.1\WebSite\" /C /R /Y /E
Works fine when run in a DOS window, but when TeamCity's build runner sln2008 tries to run it, it fails with the message "Invalid drive specification".
I have shared the folder with full rights for 'Everyone' on the remote server.
Any ideas?
Just a guess; not quite sure if it solves your problem. We had a similar problem using CruiseControl and deploying our application to a remote JBoss server.
We've added
net use \\192.168.1.1\Website ...
before each copy, so that it 'mounts' the remote share before trying to access it. Note: you probably need to specify the username and password for the command (consult the command-line help for details).
The 'net use' seems to be needed even if you run the automated job as the same user you log on as manually. These two kinds of sessions seem not to share information about remote shares.
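Put together, the post-build step might look like this hedged sketch; the account and password are placeholders, not values from the original post:

# Map the share with explicit credentials, copy, then drop the mapping.
net use \\192.168.1.1\WebSite P@ssw0rd /user:BUILDDOMAIN\builduser
xcopy "C:\TeamCityBuild\project\WebSite\*" "\\192.168.1.1\WebSite\" /C /R /Y /E
net use \\192.168.1.1\WebSite /delete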
I've never used TeamCity Buildrunner sln2008, but if it runs as a service, then it is probably running under the "Local System" account, which doesn't have network access. Change the service properties (under the "Log On" tab) so that the service logs on as a user with permissions to that network share.
I don't believe it works because the agent is running as a system service, so it has limited network access (I believe).
Instead of trying to use a post-build step to copy the output, I think you should look into using TeamCity's Build Artifacts. That's what we use at my work, although we are new to TeamCity as well. What I don't know is whether the Build Artifact system will do exactly what you want.
You could try NAnt's copy task:
http://nant.sourceforge.net/release/latest/help/tasks/copy.html
