How do I unzip remotely during the automated deployment to our QA environment?

I'm trying to figure out a way to automate the deployment to our QA environment. The problem is that our release is quite big, so it needs to be zipped, FTP'd, and then unzipped on the QA server. I'm not sure how best to do the unzipping remotely.
I can think of a few options, but none of them sound right:
Use PsExec to execute a remote command-line call on the QA server to unzip the release (see the sketch below).
Host a web service on the QA server that unzips the release and copies it to the right place. This service can be called by our release process once it has finished uploading the files.
Host a Windows service on the QA server that monitors a file location and does the unzipping.
None of these are pretty, though. I wonder how others have solved this problem?
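For the PsExec option, the call might look something like this (server name, credentials, and paths are all hypothetical, and it assumes 7-Zip is installed on the QA box):
psexec \\qa-server -u DOMAIN\deployuser -p secret "C:\Program Files\7-Zip\7z.exe" x C:\drops\release.zip -oC:\inetpub\wwwroot\app -y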
PS: we use CruiseControl.NET to execute a NAnt script that does the building, zipping and FTP.

Instead of compressing and uncompressing yourself, you could use a tool like rsync, which can transparently compress data during file transfer. The -z option tells rsync to use compression.
But I assume you are in a Windows environment, in which case you could use cwRsync (which is "rsync for Windows").
Depending on your access to the QA box this might not be a viable solution. You'll need to:
install the cwRsync server on the remote machine and
allow the traffic through any firewalls.
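Assuming SSH (or an rsync daemon) is reachable on the QA box, a one-way sync could look like this (user, host, and paths are hypothetical; -a preserves timestamps and permissions, -z compresses on the wire, --delete removes files that no longer exist in the source):
rsync -avz --delete ./release/ deploy@qa-server:/deploy/app/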

At the last place I worked, we had a guy write a Windows service on the CI box to do the unzipping. TFS Team Server finished the build and notified a service to zip the completed build and copy it to the CI box. The CI box picked up on the new file and unzipped it. It may have been a bit heavyweight, but it worked well - and he was careful to log all actions to the event log, so it was easy to diagnose when a server had been reset and the service hadn't started.
Update: One thing we would have liked to add to that process was having the service on the CI box check for zip files and uncompressed builds older than x months, for purging purposes. We routinely ran out of disk space (it was a VM that we rarely looked at) and had to purge old builds manually when it happened.
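For the purging part, a scheduled one-liner would probably have been enough - a hypothetical sketch with a made-up path and retention window (/d -90 matches files older than 90 days):
forfiles /p C:\builds /s /m *.zip /d -90 /c "cmd /c del @path"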

ClickOnce Error "different computed hash than specified in manifest" when transferring published files

I am in an interesting situation where I maintain the code for a program that is used and distributed primarily by our sister company. We are ready to distribute the program to all of the third-party users, and since it is technically our sister company's program, we want to host it on their website. (In the interest of anonymity, I'll use 'program' everywhere instead of the actual application name, and 'www.SisterCompany.com' instead of their actual URL.)
So I get everything ready to go: I set up the Publish settings to check for updates at program start, set the minimum required version, and set the Installation Folder URL and Update Location to "http://www.SisterCompany.com/apps/program/", with the actual Publishing Folder Location as "C:\LocalProjects\Program\Publish\". Everything else is pretty standard.
After publishing, I confirm that everything installs and works correctly when run directly from the publish location on my C: drive. So I put everything on our FTP server, and the guy at our sister company pulls it down and places everything in the '/apps/program/' directory on their web server.
This is where it goes bad. When I try to install it from their site, I get the "File, Program.exe.config, has a different computed hash than specified in manifest" error. I tested it a bit, and I even get that error when trying to install from any network location on our network other than my local C: drive.
After doing the initial publish in Visual Studio, I have changed no files (changed files being the usual cause I've found when searching about this error).
What could be causing this? Is it because I set the Installation Folder URL to a location other than the one it was initially published to?
Let me know if any additional info is needed.
Thanks.
After bashing my head against this all weekend, I have finally found the answer. After unsigning the project and removing the hash on the offending file (an XML file), I got the program to install, but it was giving me 'Windows Side by Side' errors. I drilled down into the app cache where the file was, and instead of a config .xml file, it was one of the HTML files from the website the ClickOnce installer was hosted on. It turns out the web server didn't like serving up .xml (or, as it also turned out, .mdb) files.
This MSDN article ended up giving me the final solution:
I had to make sure that the 'Use ".deploy" file extension' option was selected so that the web server wouldn't mangle files with extensions it didn't like.
I couldn't figure out why that one file's hash would be different. Turns out it wasn't even the same file at all.
Is it possible that one of the FTP transfers is happening in text mode rather than binary?
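If the upload is scripted with the stock Windows ftp client, binary mode can be forced explicitly. A hypothetical script file (run as ftp -s:upload.ftp www.SisterCompany.com; the first two lines answer the login prompts, and the user name, password, and file names are made up):
deployuser
secret
binary
cd /apps/program
put Program.exe.config
bye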
For me, the problem was that .config transformations were done after generating the manifest.
To anyone else who's still having trouble, five years later:
The first problem was configuring the MIME type, which on nginx (/etc/nginx/mime.types) should look like this:
application/x-ms-application application;
application/x-ms-manifest manifest;
See Click Once Server and Client Configuration.
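To verify what the server actually sends, a quick header check works (the URL is hypothetical):
curl -I http://www.example.com/apps/program/Program.application
Look for Content-Type: application/x-ms-application in the response.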
The weirder problem to me was that I was using git to handle the push to the server, i.e.
git remote add live ssh://user@mybox/path/to/publish
git commit -am "committing..."; git push live master
Works great for most things, but it was probably being registered as a "change," which prevented the app from installing locally. Once I started using scp instead:
scp -r * user@mybox:/path/to/dir/
It worked without a hitch.
It is unfortunate that there is not a lot of helpful information out there about this.

Script to execute on CVS check-in, without access to the server?

Is it possible to write a script that executes certain instructions, and is triggered by any check-in to a CVS repository?
The script would scan the list of files in the change-set and do a copy operation on certain files in a certain sub-directory.
I would hopefully be able to execute various console applications, including ones written in .NET.
Problem is, I need this done quickly, and I don't have access to the CVS server due to corporate IT red tape, etc.
Is there a way to set this up on one of the client workstations instead?
Can it be done without interfering with my working folder?
Can you get commit notifications by email, as this blog shows? If so, you may be able to use maildrop (or good old procmail, etc.) to run arbitrary commands and scripts on your workstation when the commit notification mails arrive.
I found a .NET library that seems up to the task - SharpCVSLib.
http://csharpopensource.com/sharpcvslib.aspx
(Hopefully it will work on a developer workstation and not need to be hosted on the CVS server.)

Changing an IIS6 website directory remotely

First, the prior situation: we have this project with a one-click build script. It's cobbled together with TFS Deployer + PowerShell + VBScript. TFS Deployer sits on the production machine, copies the new website files into a brand new directory, and then calls a VBScript that points the IIS website at the new directory.
Now, I'm moving the team away from the horror that is TFS/MSBuild. I have a TeamCity build agent on a dedicated build server. A simple NAnt script deploys the build artifacts from the build server to the production server through a shared folder. Simple, quick, and effective.
However, I haven't found a way either to a) run the VBScript remotely or b) update the IIS site remotely through some other mechanism (programmatically, within the one-click build). Windows Server 2003/IIS6. Any ideas?
Update: I solved this by creating another .vbs that remotely called my old .vbs through WMI. Thanks everyone!
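For anyone who wants to skip the wrapper script, the same remote call can be sketched with the built-in WMIC tool (host, credentials, and script path are hypothetical):
wmic /node:"prodserver" /user:"DOMAIN\deploy" /password:"secret" process call create "cscript.exe //B C:\scripts\switch-site.vbs"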
If I were to head in any direction, I would consider setting up a WMI script to do the work and then configuring it on the server in question. I would have to think about how to easily include this in your automated build. I personally have not worked with TeamCity yet, although I have attended sessions on how it works.
WMI might be able to run the script as well, and act as a sort of service front end, so you may be able to reuse what you have already spent effort on.
Could you change the VBScript file into an ASP file in a different website on the same server? That would allow you to call it remotely.
We've used NAntContrib's mkiisdir task to create/modify a virtual directory on remote machines.
<mkiisdir iisserver="Staging" dirpath="c:\temp" vdirname="Temp" />
This should either create (if the vdir doesn't exist) or change the location (if the vdir already exists).
Generally, it seems the cleanest way to do this is to first delete the vdir with the deliisdir task, followed by a create.
<deliisdir vdirname="Temp" failonerror="false" />
<mkiisdir dirpath="c:\temp" vdirname="Temp" accessread="true" accesswrite="false" accessscript="true" enabledirbrowsing="false" authntlm="true" authbasic="false" authanonymous="false" appcreate="Pooled" />
Happy coding!

How can I publish a subversion repository to a local IIS?

At work, we have a Windows Server 2003 machine with IIS and Subversion installed. We use it to publish and test our ASP.NET websites locally. Every programmer has Tortoise installed on his PC and can update/commit content to the server. Hosting the repositories is working fine.
But the files kept in those repositories then need to be copied to our local IIS (virtual directories).
What is an easy way to publish those subversion repositories to our local IIS?
Edit:
Thanks to puetzk, I added a simple .bat file that gets executed every time a commit occurs (see the Subversion documentation about hooks). My .bat file contains only:
@echo off
setlocal
:: Change to the working copy that IIS points at
pushd E:\wwwroot\yourapp\trunk
:: Bring the working copy up to date
svn update
popd
endlocal
exit
Just keep the web server's file area as a working copy, and perform an svn up in it whenever you want to "publish". Configure it to hide the contents of the .svn folders if they seem untidy to you (I don't specifically know how to do this, but I assume it can be done). They will already have the filesystem hidden bit, which may take care of this.
If you want it really automatic (updates as soon as someone commits), use a post-commit hook script on the SVN server to kick off the first process.
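A minimal Windows-flavoured hook might look like this (paths are hypothetical; Subversion invokes hooks with an empty environment, so spell out full paths):
:: hooks\post-commit.bat - Subversion passes the repository path as %1 and the revision as %2
@echo off
pushd E:\wwwroot\yourapp\trunk
"C:\Program Files\Subversion\bin\svn.exe" update
popd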
Others in the comments have suggested using export instead of checkout. That can work too, and avoids the .svn clutter, but it has two drawbacks. One, it has to re-download the entire contents every time, not just the modified files (since it doesn't keep the .svn dir to remember what it has); if you have a lot of files, this will be much slower. Two, update replaces each file atomically (it writes the new version in .svn/tmp, then moves it into place), while export writes the file gradually into its destination as it downloads. That means export could deliver an incomplete file to someone who browsed it at just the wrong time.
SVN doesn't support IIS as a front end; you can, however, run the standalone svnserve server as a Windows service.
There's an SVN FAQ entry about it, and this post on the Vertigo Software blog may be helpful too.
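If you go the svnserve route, registering it as a service can be done with the built-in sc tool - a sketch with hypothetical install and repository paths (note that sc requires a space after each =, and --service needs Subversion 1.4 or later):
sc create svnserve binpath= "\"C:\Program Files\Subversion\bin\svnserve.exe\" --service -r C:\repositories" displayname= "Subversion Server" depend= Tcpip start= auto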
UPDATE:
After your clarification, I see that what you are looking for is a way to automatically update the code on the server after it's checked in. Look into CruiseControl.NET; judging by the Subversion integration tutorial, it should do what you want.
UPDATE 2: This tutorial describes integrating Subversion, CruiseControl.NET, and NAnt.
Maybe SVNIsapi (http://www.svnisapi.com) can solve the problem, since it needs only an IIS installation; you don't need an Apache server or an svnserve service. Secondly, it should be possible to chain the ASP.NET ISAPI handler onto SVNIsapi's processing, so that an ASP.NET (.aspx) page is interpreted after being read from the repository.
Cheers
Paolo
You can use the free VisualSVN Server to quickly install Subversion with an Apache front end. It also has a nice MMC snap-in for managing the server and repositories.
You will then be able to access Subversion over HTTP or HTTPS, but the port number must be different from the one your local IIS uses (the default port for VisualSVN Server is 8080).
If you really need to access the repositories through your local IIS on port 80, you can try SVN-IIS, which acts as a bridge between your IIS and Apache. I haven't tried this one myself, though.

How to speed up the Eclipse project 'refresh'

I have a fairly large PHP codebase (10k files) that I work with using Eclipse 3.4/PDT 2 on a Windows machine, while the files are hosted on a Debian file server. I connect via a mapped drive on Windows.
Despite having a 1 Gbit Ethernet connection, doing an Eclipse project refresh is quite slow - up to 5 minutes - and I am blocked from working while it happens.
This normally wouldn't be such a problem, since Eclipse theoretically shouldn't have to do a full refresh very often. However, I also use the Subclipse plugin, which triggers a full refresh each time it completes a switch/update.
My hunch is that the slowest part of the process is Eclipse checking the 10k files one by one for changes over Samba.
There are a large number of files in the codebase that I never need to access from Eclipse, so I don't need it to check them at all. However, I can't figure out how to prevent it from doing so. I have tried marking them 'derived'; this prevents them from being included in the build process etc., but it doesn't seem to speed up the refresh process at all. Eclipse still seems to check their changed status.
I've also removed the unneeded folders from PDT's 'build path'. This does speed up the 'building workspace' process, but again it doesn't speed up the actual refresh that precedes building (which is what takes the most time).
Thanks all for your suggestions. Basically, JW was on the right track: work locally.
To that end, I discovered a plugin called FileSync:
http://andrei.gmxhome.de/filesync/
This automatically copies the changed files to the network share. It works fantastically: I can now do a complete update/switch/refresh from within Eclipse in a couple of seconds.
Do you have to store the files on a share? Maybe you can set up some sort of automatic mirroring, so you work with the files locally, and they get automatically copied to the share. I'm in a similar situation, and I'd hate to give up the speed of editing files on my own machine.
Given it's subversioned, why not keep the files locally and use a post-commit hook to update to the latest version on the dev server after every commit? (Or include a specific string in the commit log (e.g. '##DEPLOY##') when you want to update dev, and only run the update when the post-commit hook sees this string - see the sketch below.)
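A hypothetical hook along those lines (the repository and working-copy paths are made up, and it assumes the repository and the dev server's working copy live on the same machine):
:: hooks\post-commit.bat - %1 = repository path, %2 = revision
@echo off
svnlook log -r %2 %1 | findstr /c:"##DEPLOY##" >nul || exit /b 0
pushd D:\wwwroot\dev
svn update
popd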
Apart from refresh speed-ups, the advantage of this technique is that you can have broken files that you are working on in Eclipse, and the dev server is still OK (albeit with an older version of the code).
The disadvantage is that you have to do a commit to push your saved files onto the dev server.
I solved this problem by changing the file transfer buffer size at:
Window -> Preferences -> Remote Systems -> Files
Change the 'File transfer buffer size' Download (KB) and Upload (KB) values to something high; I set them to 1000 KB (the default is 40 KB).
Use the offline folders feature in Windows: right-click the share and select "Make Available Offline".
It could save a lot of time and round-trip delay in the file-sharing protocol.
Using svn:externals with the revision flag for the non-changing stuff might prevent Subclipse from refreshing those files on update. Then again, it might not. Since you'd have to make some changes to the structure of your Subversion repository to get it working, I would suggest you do some simple testing before doing it for real.
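For reference, pinning a directory to a fixed revision via the externals property looks roughly like this (the repository URL and revision are made up):
svn propset svn:externals "vendor -r1234 http://svn.example.com/repo/vendor" .
svn commit -m "Pin vendor tree to r1234"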
