Move files to remote file share after build - visual-studio

I want to create a post build script that moves files from the build directory to a remote (UNC) file share.
This line:
xcopy "C:\TeamCityBuild\project\WebSite\*" "\\192.168.1.1\WebSite\" /C /R /Y /E
It works fine when it is run in a DOS window, but when TeamCity's sln2008 build runner tries to run it, it fails with the message "Invalid drive specification".
I have shared the folder with full rights for 'Everyone' on the remote server.
Any ideas?

Just a guess; not quite sure if it solves your problem. We had a similar problem using CruiseControl and deploying our application to a remote JBoss server.
We've added
net use \\192.168.1.1\Website ...
before each copy, so that it 'mounts' the remote share before trying to access it. Note: you probably need to specify the username and password for the command (run net use /? for details).
The 'net use' seems to be needed even if you run the automated job as the same user you log on with manually. These two kinds of sessions do not seem to share information about remote shares.
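As a rough sketch (the account name and password below are placeholders, not values from the question), the post-build step could look like:
rem Authenticate against the share first (placeholder credentials).
net use \\192.168.1.1\WebSite P@ssw0rd /USER:buildserver\builduser
rem Copy the build output, then drop the connection again.
xcopy "C:\TeamCityBuild\project\WebSite\*" "\\192.168.1.1\WebSite\" /C /R /Y /E
net use \\192.168.1.1\WebSite /delete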

I've never used TeamCity Buildrunner sln2008, but if it runs as a service, then it is probably running under the "Local System" account, which doesn't have network access. Change the service properties (under the "Log On" tab) so that the service logs on as a user with permissions to that network share.
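If you'd rather script that change than use the Services GUI, something along these lines should work (the service name and credentials are placeholders; check the real name with sc query):
rem Re-point the build agent service at a real user account (placeholders), then restart it.
sc config "TCBuildAgent" obj= ".\builduser" password= "P@ssw0rd"
net stop "TCBuildAgent"
net start "TCBuildAgent"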

I don't believe it works because the agent is running as a system service, so it has limited network access (I believe).
Instead of trying to use a post-build step to copy the output, I think you should look into using TeamCity's Build Artifacts. That's what we use at my work, although we are new to TeamCity as well. What I don't know is whether the Build Artifacts system will do exactly what you want.

You could try NAnt:
http://nant.sourceforge.net/release/latest/help/tasks/copy.html

Related

Connect to shared folder with username and password and copy files to that folder

I want to connect to a shared folder with a username and a password, and after the connection is successful, copy files to the destination folder.
I know that using the "net use" command a shared folder can be mapped to a drive and used after that, but I do not want to do that.
I am doing this to create a deployment script for a .NET web application.
Any kind of help is highly appreciated.
Note: PowerShell is not available on that server.
Thanks
Use net use z: \\<network-location>\<some-share>\ password /USER:username first, and then net use z: /delete to clean up, if your primary concern is not to leave unnecessary mappings behind. Unfortunately, you cannot include your credentials in any other standard CLI command.
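If you only want an authenticated session and no drive letter at all, a sketch like the following also works, since net use can authenticate against the bare UNC path and later copies reuse that session (server, share, credentials and paths are placeholders):
rem Open an authenticated session to the share without mapping a drive letter.
net use \\fileserver\deploy P@ssw0rd /USER:MYDOMAIN\deployuser
rem Copy the web application, then drop the session again.
xcopy "C:\Build\WebApp\*" "\\fileserver\deploy\WebApp\" /E /Y /C
net use \\fileserver\deploy /delete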

Global Subversion SSH config in Windows / Checking out Subversion project as SYSTEM on Windows

I'm trying to set up a scheduled Subversion commit from a Windows Server 2003 machine over SVN+SSH as a task. I'd like the commit script to be executed as the SYSTEM user. So I'm guessing, for that to work I need to check-out the repository as SYSTEM, too, but am unable to achieve it so far.
I'm already able to achieve the above with my own user over SSH. I've done the following:
I added a [tunnels] entity in my local subversion configuration:
ssh = plink.exe -i "C:/Keys/my_key.ppk"
Added the key to the authorized_keys file on the server running Subversion
I checked out the repository with a script as below:
svn co svn+ssh://user@server/path/to/repo/ C:\Local\Project\Path
I'd now like to reproduce the above steps for SYSTEM user, to be able to run a scheduled commit later. The problem I'm facing is I don't know how to check out the repository as SYSTEM, because:
I don't know the syntax to use to check out a repository as SYSTEM
I don't know where the global (or SYSTEM's) Subversion config is stored on a Windows Server 2003. I've already tried: C:\Documents and Settings\Default User\Application Data\Subversion and C:\Documents and Settings\Administrator\Application Data\Subversion, but without success.
I also read somewhere that I could possibly use svn switch for what I want, but I wouldn't know how to run svn switch as SYSTEM either. I also considered writing scripts for the svn checkout or switch and running them as SYSTEM, but then I still need a global SVN config in which to add my_key.ppk.
I hope the above description is clear enough. I've been struggling with it for a long time now and am having problems summarizing it myself. Any hints appreciated.
As an aside, this doesn't seem to be totally off-topic: https://serverfault.com/q/9325/122307
This is not a real answer to your question, yet it might solve your problem: Why not use svn <command> --config-dir ARG or svn <command> --config-option ARG?
You could specify the config file/option like this, thus being able to set [tunnels].
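As a rough sketch (the directory, script and task names below are made up), you could keep a dedicated config directory containing the [tunnels] entry and reference it explicitly, both for the checkout and for the scheduled commit:
rem One-off checkout, using the dedicated config dir that holds the [tunnels] entry.
svn checkout --config-dir C:\SvnSystemConfig svn+ssh://user@server/path/to/repo/ C:\Local\Project\Path
rem C:\Scripts\svn-commit.cmd would run something like:
rem   svn commit -m "scheduled commit" --config-dir C:\SvnSystemConfig C:\Local\Project\Path
rem Then schedule that script to run as SYSTEM:
schtasks /Create /TN "NightlySvnCommit" /TR "C:\Scripts\svn-commit.cmd" /SC DAILY /ST 02:00 /RU SYSTEM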
@cxxl really answered the question by mentioning --config-dir. I'll just try to shed some light on the problem.
I'm guessing, for that to work I need to check-out the repository as SYSTEM
That is a wrong guess, because the locally stored per-user auth data is not used for SSH auth; with ssh, authentication is performed on the remote side. The per-user auth dir
\%AppData%\Subversion\auth>dir /W
...
[svn.simple] [svn.ssl.client-passphrase]
[svn.ssl.server] [svn.username]
...
contains stored credentials only for http|https|svn and cert-based client authentication, and nothing for ssh-related repositories.
That is, your script executed under the LSA (Local System account) must be able to:
* read the Working Copy files (checked out under any other real local user), and maybe write to them (I can't recall the requirements for .svn dir permissions)
* read and, thus, use a predefined and fine-tuned Subversion config file (with its [tunnels] section), which can be the config of any other user
PS: svn switch changes the repository URL a Working Copy is linked to and has nothing to do with users.

Jenkins calling batch file on mapped drive

I have a Jenkins job that calls a batch file on a ClearCase drive (V:).
My Jenkins slave agent is running as a service using a local admin account.
The Jenkins job does the following:
cleartool startview MY_VIEW
cd /d "V:\MY_VIEW\Build"
call PrepareBuild.bat
When I run the Jenkins job, I keep getting "Access is denied." in the Console Output when it tries to call the batch file. However, if I manually run the above in a command prompt, it completes successfully.
I did not have this problem under Windows XP. Does anybody know why this is happening on Windows 7 (32-bit)?
Thanks.
V:\ is a virtual drive obtained with the Windows command subst.
It is a shortcut between the root directory of your dynamic view (M:\yourView) and the virtual drive.
(I.e., V:\ is not specifically linked to ClearCase; it is just a drive letter the user wishes to associate with a certain ClearCase view root directory.)
However, ClearCase registers that association in the registry under HKCU\Software\Atria\...
Which means the ClearCase session run under the local admin account for Jenkins won't know about said association and the need to restore that virtual drive.
A workaround would be to make that drive permanent, using psubst.
That registers the drive path in [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices], and HKLM is accessible from all accounts.
See "How to make SUBST mapping persistent across reboots?"
I had the same problem, but found a simpler solution.
Jenkins doesn't have access to folders that only the user has access to (even though it is run by that user). So for the folder that is getting "Access is denied", you need to set the folder permissions to Everyone, not just the user.
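For a folder on a regular NTFS path, a hypothetical way to do that from the command line (the path and the rights granted are placeholders to adjust):
rem Grant Everyone read/execute on the offending folder and its children (placeholder path).
icacls "C:\path\to\the\denied\folder" /grant "Everyone:(OI)(CI)RX"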

Problem cloning / fetching repository using Git plugin for Hudson on Windows

Before anybody shoots me down for this - I have already checked every appropriate thread and still not found a solution to my problem.
I have Hudson with the Git plugin installed on a Windows server (not my choice), and Hudson runs as a service. Git/bin is on the path. However, I cannot clone the repository. Here is a shortened display of the console output:
Started by user anonymous
Checkout:workspace / C:\.hudson\jobs\sdf\workspace - hudson.remoting.LocalChannel#65394b
Last Built Revision: Revision 74200b32314231a5efdadd87bf36b42ec145c720 (origin/master)
Checkout:workspace / C:\.hudson\jobs\sdf\workspace - hudson.remoting.LocalChannel#65394b
Fetching changes from the remote Git repository
Fetching upstream changes from ssh://git.mccannlondon.co.uk/git/mccann_admin
[workspace] $ "C:\Program Files\Git\bin\git.exe" fetch -t ssh://git.mccannlondon.co.uk/git/mccann_admin +refs/heads/*:refs/remotes/origin/*
The server's host key is not cached in the registry. You
have no guarantee that the server is the computer you
think it is.
The server's rsa2 key fingerprint is:
ssh-rsa 2048 f1:48:2a:0a:d9:18:cf:2e:f2:8c:b3:25:7f:34:d5:34
Connection abandoned.
fatal: The remote end hung up unexpectedly
ERROR: Problem fetching from origin / origin - could be unavailable. Continuing anyway
So it seems I need to authenticate the host. However, I'm not sure why Hudson is starting the job as user anonymous when I have set Administrator as the owner of the Hudson service.
Does anyone know:
a) how to change Hudson's run user? or
b) how to connect to the remote computer with the same user account as Hudson uses, so as to allow Hudson to fetch?
If this has been posted before, apologies, but I spent a good few hours searching around and couldn't find anything.
Thanks
Lewis
This may be related to the question Git, Can’t clone repo on windows
The problem is that MSysGit starts PLink in the background, i.e. the terminal is not actually connected to the input of PLink. That means that you simply can't type anything into PLink.
You simply have to connect to the server once using PLink or PuTTY, answer Yes and from then on, you won't be asked again.
The tutorial from cletus can be helpful as well.
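As a sketch, that one-off connection might look like this (the username is a placeholder; the host comes from the question's fetch URL), and you only need to get far enough to answer "y" to the host-key prompt:
rem Connect once so PuTTY caches the server's host key; answer "y" when prompted (placeholder username).
plink -ssh someuser@git.mccannlondon.co.uk
Keep in mind that PuTTY stores the cached key per user under HKCU, so this has to happen for the account the service runs as, or the registry entry has to be copied over, which is what the next answer describes.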
A good description of the issue can be found in this blog entry:
The problem is that Hudson is a service and runs under the "Local Service" account.
My next step was to add the host key to the cache manually (a file), because I knew that can be done on Linux.
Wrong again: Windows does not keep this cache in a file but uses the registry.
Searching the registry, I found an entry under my own user (who had previously accessed the repository, so its host key was already cached) where the key was stored, and copied it to HKEY_USERS so that other users can access it.
The entry goes like this:
Key Name: HKEY_USERS\.DEFAULT\Software\SimonTatham\PuTTY\SshHostKeys
Class Name: NO CLASS
Last Write Time: 23.01.2009 - 18:35
Value 0
Name: dss@22:bla_bla.com
Type: REG_SZ
Data: 0xb477b...
From the command line, you can easily add the key as follows:
reg add HKEY_USERS\.DEFAULT\Software\SimonTatham\PuTTY\SshHostKeys /v dss@22:bla_bla.com /d 0xb477b...
Now Hudson, running as a service, connects happily and smoothly to the repository where the code lives.
The 'started by anonymous' is telling you which Hudson user started the job. If you haven't created any Hudson users, then everything is started by 'anonymous'.
It is not related to which OS user is executing the process.
You asked how to change the user that Hudson runs as: you need to edit the service (Control Panel->Administrative Tools->Services, double click the Hudson service and change the "Log On" to "This account").
Once the account is set up, I make sure that the git server is in MINGW's (Git Bash's) ~hudson/.ssh/known_hosts, and that there's a ~hudson/.ssh/identity file in place. The only warning I have is that if you have Cygwin installed on the box, you need to make sure that %CYGWIN% is empty, otherwise you'll see key permission errors in the Hudson logs. My recommendation is that you simply set the CYGWIN environment variable to empty in Hudson.
I met the same problem, and after checking the Git plugin documentation, http://wiki.hudson-ci.org/display/HUDSON/Git+Plugin
it states that you should change the git path to /usr/bin/git in the global settings and configure the git username.
That fixed my problem.
Hope it works for you as well. By the way, I use Ubuntu, but it should be the same.

What's causing xcopy to tell me Access Denied?

The post-build task for one of our solutions uses xcopy to move files into a common directory for build artifacts. For some reason, on my computer (and on a VM I tested), the xcopy fails with "Access Denied". Here's what I've done to try to isolate the problem:
I tried a normal copy; this works.
I double-checked that none of the files in question were read-only.
I checked the permissions on both the source and destination folder; I have full control of both.
I tried calling the xcopy from the command line in case the VS build process had locked the file.
I used Unlocker and Process Explorer to determine there were no locks on the source file.
What have I missed, other than paranoid conspiracy theories involving computers out to get me? This happens on my dev machine and a clean VM, but doesn't happen for anyone else on the project.
/r = Use this option to overwrite read-only files in destination. If
you don't use this option when you want to overwrite a read-only file
in destination, you'll be prompted with an "Access denied" message and
the xcopy command will stop running.
That was my resolution to this error.
Source
Problem solved; there are two pieces to the puzzle.
The /O switch requires elevation on Vista. Also, I noticed that xcopy is deprecated in Vista in favor of robocopy. Now I'm talking with our build engineers about this.
You need to run XCOPY as Administrator, there is no way around this.
If you don't want to run your copy as Administrator, then you must use ROBOCOPY instead.
Note, however, that with ROBOCOPY it is very tempting to use the /COPYALL switch, which copies auditing info as well and requires "Manage Auditing user right", which again invites you to run as Administrator as a quick solution. If you don't want to run your copy as Administrator, then don't use the /COPYALL (or /Copy:DATSOU) switch. Instead use /Copy:DATSO, as the U stands for aUditing.
Also note that if you are copying from NTFS to a FAT files system, there is no way you can "Copy NTFS Security to Destination Directory/File".
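A minimal sketch of the equivalent post-build copy with ROBOCOPY (source and destination paths are placeholders):
rem Mirror the xcopy step without auditing info: data, attributes, timestamps, security, owner.
robocopy "C:\Build\Output" "D:\Artifacts\Output" /E /COPY:DATSO
One thing to keep in mind if you wire this into a post-build step: ROBOCOPY returns non-zero exit codes even on success (1 simply means files were copied; values of 8 and above indicate failures), so the build system may need to be told which codes count as errors.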
Usually this happens because there's another process locking the file. I bet your machine has a different number of cores/different speed than the others. Try inserting some sleeps to see if it solves the problem.
If you can delete the file in Windows Explorer, try using an elevated command prompt. I'm not sure why Windows Explorer does not ask for permission here, for a delete operation that needs admin rights via cmd.
If you are copying files to an IIS folder, then you need to run the batch file as an administrator.
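One hypothetical way to do that from a script (the account and script path are placeholders) is runas, which prompts for the password interactively; note that with UAC enabled you may still need an elevated prompt:
rem Run the deployment batch file under an admin account (placeholders); runas will ask for the password.
runas /user:MYMACHINE\Administrator "cmd /c C:\Deploy\copy-to-iis.bat"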
