Code, Build and Run on Separate Machines. Possible? - windows

I know that I can code on one machine and have it build on a different machine (i.e. a build server). I have also heard that you can have Visual Studio run a build on a virtual machine (I think it requires Virtual PC). My question is whether anyone has been able to code on machine A, have it compile on machine B, and run a debugging session on machine C?

This is pretty common in enterprise development and just about the de facto standard way of doing things.
Typically, a dev works locally. Once they're happy with their changes, they check them into a source control system.
From that point there are a couple of options ranging from automated building to having someone push the button to cause the remote build.
Once the build is complete there are a host of options available for deploying the app to one or more other servers. And yet other options for kicking off automated test suites.
Concerning remote debugging, you can do that independently of whether you are using a build/deployment/automated-testing setup. It's just a matter of getting the right stuff installed and configured (see ho1's answer for a link).
All of that said, I highly recommend you never enable remote debugging on a production server. Some people might disagree with me but I personally think it's dangerous for security reasons and can certainly lead to site outages.
Finally, the only reason you would need a virtual machine is if the servers aren't available or if you just want to sandbox everything.

You can do remote debugging, so if you had an automated process to copy the compiled code from B to C, I suppose you could do what you're asking.
See this MSDN article for more details: How to: Set Up Remote Debugging
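As a rough sketch of how the pieces fit together (the path and machine name below are examples; the exact location of msvsmon.exe depends on your Visual Studio version and edition):

rem On machine C (where the code will run), start the remote debugging monitor that ships with Visual Studio.
rem For VS 2008 it typically lives here:
"C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\Remote Debugger\x86\msvsmon.exe"

rem Back on machine A, either use Debug > Attach to Process and enter MachineC as the Qualifier,
rem or point the project's Debugging properties at the remote machine (e.g. "Use Remote Machine"
rem for managed projects, or the Remote Windows Debugger settings for native ones) so that F5
rem launches and debugs on machine C.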

Related

Automated integration testing of a client/server Windows desktop application

My team is developing a desktop application (mixed C++/Tcl) that is used in a client-server setup. Currently it is Windows-only, but soon we will need to port it to Linux. CruiseControl.NET builds it every night from the source code in SVN and packages it into an NSIS installer, but we have no automated tests to run.
It is nearly impossible to add any unit tests, but integration testing of the application is easy, because it is already heavily script-based.
The main task is to install the app onto 3 PCs, configure it (that involves copying some files around), run it, monitor for a possible crash, wait till integration testing is done, collect a summary, and send emails. It could be done with a bunch of custom PowerShell scripts, but:
- In future we will want to add more features and more testing, and what used to be a simple script soon blows up (as usual), so I want to minimize custom scripting; if I need to script something, I prefer bash/cygwin (I am not familiar with Python or Ruby).
- I want a web dashboard that will report current progress and, if something failed, show the logs.
- I need some supervisor that will monitor the app under test and report if it hangs or crashes.
- We will also need to test it on Linux.
- Ideally I would like to orchestrate some test steps between the PCs (e.g. run test X on PC1 and test Y on PC2 in parallel, wait till they both finish, then run test Z on PC1, while monitoring that nothing crashes on PC2, etc.).
So, I am looking for a COTS tool or set of tools that will help me do this and doesn't have a steep learning curve. Ideally it would be free, but if it is really good and has fair pricing, my company may purchase a license.
The process should be triggered from CruiseControl.NET when the NSIS installer is ready, and then perform everything described above. Basically, it should allow at least remote installation of software, running custom scripts and have a web dashboard.
Apparently, configuration management tools like Chef could be used, but so far they don't support a Windows server, only Windows nodes. I would like to avoid setting up a Linux VM just for that, although I can do it if I have no other choice. Also, Chef seems to be a bit of an overkill - good for 10k machines, but I have only 3... maybe 5 in the future. And I am particularly curious about the chances of orchestrating a distributed test.
Most of the similar questions here on StackOverflow and elsewhere on the internet are about web apps, Java containers, Maven, etc., and there are just so many tools and plugins for those tools to evaluate.
Thanks in advance.
Install ccnet on your test machines. Have those ccnet projects listen to a file that gets edited when a new installer is ready. Have the test machines install that new installer and run tests. There you go. ccnet sends emails so there's your basic reporting.
Have the test results reported into a database via web services using gSOAP (that's what we did). For Linux you can run the Java CruiseControl if you must. Write a gSOAP-enabled test controller program to report the test results from the test machines; a little C++ app will do. Then write a website (we use ASP.NET) to query the database (PostgreSQL) and show results. Have the test machines auto-update themselves via SVN to get the latest changes to the configuration. Use NAnt: it is far superior to just using ccnet to run tasks, and it works through ccnet. Use XML, XSL and CSS with ccnet to make the test emails contain the information you want (new passes, new failures, SVN differences to code bases, etc.).
Our latest development is putting a big TV in the kitchen with a summary of test results so people can know more readily what they broke!
The first thing I'd get working is a test machine listening for the new installer, installing it, running some basic tests and emailing the results back. Put the ccnet and nant configuration in version control and get that auto updating on the test machine so you don't have to log into every test machine and do an update every time you make a change.
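To make the "listen to a file" part concrete, a minimal ccnet project on a test machine could look roughly like this (server names, paths and the interval are invented; double-check the element names against the CCNet docs for your version):

<cruisecontrol>
  <project name="InstallAndTest">
    <!-- Treat the installer drop folder as a "filesystem" repository:
         a build is triggered whenever a newer file appears there. -->
    <sourcecontrol type="filesystem">
      <repositoryRoot>\\buildserver\drops\installer</repositoryRoot>
    </sourcecontrol>
    <triggers>
      <intervalTrigger seconds="120" />
    </triggers>
    <tasks>
      <!-- Hand the real work (silent install, run the scripted tests,
           collect logs) to a NAnt build file kept under version control. -->
      <nant>
        <executable>C:\tools\nant\bin\nant.exe</executable>
        <baseDirectory>C:\testrunner</baseDirectory>
        <buildFile>install-and-test.build</buildFile>
      </nant>
    </tasks>
    <!-- Add the usual email publisher here for the basic reporting. -->
  </project>
</cruisecontrol>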
This is hugely broad and pretty close to opinion-based. Chef can handle steps like deploying the application to the test machines, but it isn't a GUI test framework, so you would need something else to handle that. Jenkins supports distributing tests to Windows hosts, so that seems like a good choice on that side of things, but it isn't that great at multi-node tests or orchestration between them. I suspect you'll need to write most of this yourself, given the requirements.

Is it possible to run TeamCity on Linux and use Windows as a Build Agent?

I would like to run TeamCity (with a build agent) in a Linux VM to handle our non-.NET projects. But in the same breath I'd like to have a build agent set up on a Windows server to handle all of the .NET projects.
I can't think of any reason why this wouldn't work, but does anyone have any experience, or ideas about problems I might encounter, before I spend too much real time on this?
Ta
It's fully supported. TeamCity also knows which agents to route builds to.
This is a very normal scenario, and many projects I know of do this without any problems. Just make sure that in each build's Agent Requirements you direct the appropriate job to the appropriate agent. One criterion can be that agent.os.name should contain Windows or Linux, etc.
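If you want something more explicit than the OS name, you can also tag an agent with a custom property and require it; for example (the property name here is made up):

# conf/buildAgent.properties on the Windows agent
system.agent.purpose=dotnet

# Then, in each .NET build configuration, add an Agent Requirement such as
#   system.agent.purpose equals dotnet
# or simply require that teamcity.agent.jvm.os.name contains "Windows".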

Building with Eclipse and Remote System Explorer

First with the background...
We have a Linux server that supports multiple projects.
The Clearcase server and repository are installed on this Linux server.
Different projects require different cross-compilers and libraries, and all of them are installed on the server.
User can choose different tool sets by running different scripts, which export different environment variable values such as include paths and compilers.
User needs to run cleartool to mount the repository.
Developers develop in Eclipse and have two options:
1. SSH into the server and run Eclipse there with X11 tunneling.
2. Install Eclipse locally on their Windows machine and invoke builds from the SSH terminal.
Now:
Problem with #1 is that Eclipse operations (typing, content assist, etc) are extremely laggy.
Problem with #2 is that the developers need to go through extra hoops to build their code.
This is what I have tried:
Set up Remote System Explorer, which allows remote editing of files and remote running of the compiler:
How to build a c++ project on a remote computer in Eclipse?
This approach works perfectly for files that do not need special environment variable values or mounting of the ClearCase repository, but I could not figure out how to get all of these things to integrate.
It would be great if someone could tell me how to direct RSE to run a script (possibly a different one per project) to set the environment variables and to run the cleartool commands to mount the repository, so that it can locate the files.
The cleartool command arguments would also differ per user, for setting up a particular view.
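Conceptually, the per-project wrapper I would want RSE to invoke looks something like this (the paths, view tag and env script are made up):

#!/bin/sh
# Hypothetical wrapper for project A, run on the Linux server instead of plain make.
# 1. Pull in the project's cross-compiler, include paths, etc.
. /opt/toolchains/projectA/env.sh
# 2. Make sure the ClearCase view is started and the VOBs are mounted for this user.
cleartool startview my_projectA_view
cleartool mount -all
# 3. Build from inside the dynamic view.
cd /view/my_projectA_view/vobs/projectA
make "$@"

But I don't know where to hook such a script into RSE so that its environment applies to the remote build.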
Some extra info that may help:
I have root access to the development server
The Clearcase filesystem is mapped to a drive on the Windows machine
Thanks in advance for saving me hours of frustration dealing with a slow network!
==================
Additional detail per comments:
- The VOB storage is located locally on the Linux server. We would SSH to the server and start Eclipse there, so the delay should not be due to dynamic vs. snapshot views; GUI performance seems to be the real problem.
- We also mount the same view on Windows by using Region Synchronizer. When running the local copy of Eclipse installed on Windows, there are no performance problems.
So this question can probably be solved by answering either question:
1. How to improve X11 performance such that development on Linux will suffice?
2. How to set up Windows Eclipse to perform all the steps mentioned above when building projects?
I came here with a similar question to your part two, but alas, no one has answered it. However, I have an answer to your part one: https://www.nomachine.com/. It speeds up X11 forwarding considerably.

Is it possible to easily compile and run code in another machine in Visual Studio/Eclipse?

Let's say I am developing a program that needs a bit more power than a netbook can provide and I have a good computer at home connected to the internet.
Is there any easy way to code in the netbook while I'm not at home and then when building, making it go and run on the computer at home?
I know running programs on other computers isn't a problem, but I'd like to know if it is possible to have an easy experience (it's still possible to debug, etc).
Thanks
It is definitely possible for Java code and Eclipse. But there are issues as well.
It helps a great deal if you have a fixed IP. You need to open up ports on your firewall to be able to:
- copy your code onto your PC
- remote debug and upload your application
The last step will create issues with security that you need to address. I use ssh and public/private key to secure my connections.
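For example (host name, port and key paths are made up; the JVM flags are the standard JDWP remote-debug options):

# On the home PC, run the program with the JVM debug agent listening on port 5005:
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 -jar myapp.jar

# From the netbook, copy the latest build over and tunnel the debug port through ssh:
scp -i ~/.ssh/home_key target/myapp.jar me@my.home.example.org:apps/
ssh -i ~/.ssh/home_key -L 5005:localhost:5005 me@my.home.example.org

# Then attach an Eclipse "Remote Java Application" debug configuration to localhost:5005.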
In general, what you are asking is not much different from releasing code to a server and debugging it there. And normally servers sit somewhere on the internet or in the cloud.

Do you have performance problems when you work on Visual Studio projects via a network share?

We have tremendous problems with Visual Studio (2008, if that matters) locking up and slowing down when accessing projects over a network drive. It can take several minutes to open a large Web site project through a mapped drive, and saving even a single file can take a minute or more.
I fired up Wireshark and watched the traffic. VS, it seems, requests massive amounts of files from the network -- there's an enormous amount of SMB traffic. I've done some research, and this traffic seems to stem from two situations.
VS has to have everything in its own process to provide Intellisense.
VS needs to have all the source in order to compile the project.
All the advice I've read seems to boil down to the same thing: work locally, not on a remote machine, then push your code to an integration server via source control.
This would sure solve our problems (VS is quite fast working locally), but what if you can't work locally? What if the project and the infrastructure required to run it is too large and complicated to be replicated on everyone's individual machines?
We've gone 'round this problem a couple times, and the only way we can figure to work on these projects is direct access via a mapped drive. However, the VS slowness and lockups are really becoming a problem.
One solution: we installed VS on the server and work on the projects directly on the servers via RDP. Seriously.
So, I ask:
What does everyone else do? Do you work via the network, or do you replicate projects locally? If remotely, do you suffer from VS performance issues?
We work locally and use SVN to keep all our code on the server.
I find VS 2008 quite slow working locally sometimes so I wouldn't fancy working on a network share.
Trying to compile over a network share is horribly slow using Visual Studio. Your start times will be bad as the IntelliSense database is regenerated. Each compilation has to go over the network multiple times. Linking takes forever.
If you need the output of your compilation on the network, I'd recommend doing your compile locally and defining a post-build command to copy the results to your share.
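For example, something like this as the project's post-build event (the share path is yours to fill in):

rem Post-build event: build locally, then push the outputs to the network share.
xcopy "$(TargetDir)*.*" "\\server\share\MyApp\$(ConfigurationName)\" /Y /E /I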
If, as you say, you cannot pull everything locally, then I'd suggest your project is too big and needs to be broken up into more manageable chunks. For a multi-tier application, break it up by tier and invest in some form of continuous integration (e.g. CruiseControl) to automatically build the individual pieces. That way you can work locally on a particular piece and pull the pre-built portions from CI for the other pieces of the application.
I'm not terribly surprised that using VS to load projects over a network share has performance issues. VS (in any language) is constantly getting information from files in the project. Once you start loading this over a network you're at the mercy of the underlying network connection. All lags and access issues will directly translate into VS having an issue loading file contents.
I would advise copying the solution locally and using some form of source code control to sync the project on the share.
If the code is too complicated to install on everyone's machine, then don't put it on everyone's machine. Does everyone need to have everything in order to do productive work?
I have 79 projects in my solution that I work with. Several hundred thousand lines of code. I pull my source down every day from TFS and build it; it's a lot of code, but it's a far better solution than trying to work over a network share.
A more legitimate situation for having the source code on a share is when one has a non-Windows host on which a number of virtual Windows machines are running.
I have this exact situation where my desktop machine (the host) is running Debian and I use VMware to run various virtual Windows machines (the guests), including one that has Visual Studio installed so that I can target Windows OS's. Having the source code on a Samba share on the host machine has the following pro's:
The source is not duplicated, so there is no way to confuse different copies while working on several virtual machines at the same time.
I have full control over the source from my preferred OS.
I can turn on and off any of the virtual machines, or roll back to a snapshot, without the risk of losing changes.
I can build (etc.) from the same source on several machines without having to commit changes before the source is fully tested (reason: I have to use Subversion <1.5).
The only problem with this setup is that Visual Studio (6,7,8,9) is painfully slow.
I have mounted the partition (on which the share lies) with "relatime", and this helps insofar as the disk activity on the share is moderate, but Visual Studio keeps the (virtual) network card occupied all the time.
Any solutions to this would be very appreciated.
I encountered similar problems every time I worked (where "work" means anything other than just copying/pasting files) over a network drive. The problem occurred with Zend Studio and Eclipse.
Why not use any kind of source control?
When working on Windows based projects I've always worked locally.
Once, at a Unix shop (AIX, IIRC), developers would work via an NFS mount and check in/check out via RCS...
I'm using VS2005 across a network share and not having any performance issues. However, it is a new server (Windows Server 2008). I don't have any other data points for VS, since using it at work is relatively new for me.
However, some data points from using NetBeans for previous projects on a network share... Local build time for my project was 2 minutes on Vista, on a fast dual-core AMD 64-bit machine. For a network share project, on a Server 2003 box, it was 20 minutes. Building that same project from an ancient Tablet PC (1 GHz, single core) running XP locally took around 5 minutes. Interestingly enough, the Tablet PC could build on the Server 2003 box in the same 5 minutes.
For those asking "why" on the network share. The network share is automatically backed up, archived, etc. Also, that way I can very easily look at the same projects from multiple machines without having to worry about pushing back into the repository, etc. Once you've gone to having your dev stuff on a device where you can get to it from anywhere/anything, you'll never want to do local storage again!
I have performance problems with anything over a network; networks just aren't good enough yet.
I thought it was common knowledge that disk-speed is one of the major "slowness" factors when it comes to using VS in Windows. Most dev machines I've built have had projects located on 10k RPM RAID0 drives, or at least a single 10k RPM drive. And even then it seems slow sometimes. Just the way it is, I suppose, until VS2009/VS2010 fixes it? :)
From my experience, this lag when working on a network share is 99% due to Intellisense. Disable it and you'll see.
Disabling IntelliSense indeed speeds up saving and opening files through a UNC share dramatically:
http://blogs.msdn.com/saraford/archive/2007/12/03/did-you-know-how-to-turn-off-intellisense-by-default.aspx
but then again, as stated in other comments, you might as well use a good text editor
I've also experienced the problems with performance mentioned above. It seems to vary from project to project, but I did find one way of speeding up performance significantly for some project types.
Following the advice in this article made a previously unusable project on a network location (it would take minutes to open one file) perform almost like a local project. The basic gist is that you need to grant FULL TRUST to the network location:
To grant permission to all your projects in your Visual Studio Projects folder located on the network, follow these 8 steps:
1. Open Microsoft .NET Framework 1.1 (or 2.0) Configuration, which you'll find under Administrative Tools in the Control Panel.
2. Expand Runtime Security Policy | Machine | Code Groups | All_Code | LocalIntranet_Zone.
3. In the right-hand pane, click Add a Child Code Group.
4. In the dialog that follows, choose Create a new code group and fill in a Name like Visual Studio Projects.
5. Optionally, provide a Description for the Code Group. (You'll see the description when you click a Code Group in the left tree, helping you identify the various Code Groups you may have.)
6. In the Condition Type drop down, choose URL.
7. For the URL field, type something like this: file://YourServer/My Documents/Visual Studio Projects/*
8. Under Use existing permission set, choose FullTrust (that is, if you trust your own applications; if you don't, choose a different permission set or create a new one).
Not sure why this works, but it made a previously unusable .NET 2.0 project perform significantly better.
Original article: http://imar.spaanjaars.com/364/how-do-i-allow-my-visual-studio-net-projects-to-run-from-a-network-location
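If you'd rather not click through the Configuration tool (or it isn't installed), caspol.exe can do roughly the same thing from the command line; something along these lines, though check the group label first, since 1.2 is only the usual label for LocalIntranet_Zone:

rem List the machine-level code groups to find the label of LocalIntranet_Zone:
caspol -machine -listgroups

rem Add a child code group under it (label 1.2 assumed here) granting FullTrust to the share:
caspol -machine -addgroup 1.2 -url "file://YourServer/My Documents/Visual Studio Projects/*" FullTrust -name "Visual Studio Projects"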
I was having the same problem. I have a local copy of our build system, which expects certain drive letters, and was also experiencing slowness.
I have solved the problem by adding the following registry keys:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices]
"R:"="\DosDevices\D:\devel\build
"S:"="\DosDevices\D:\devel\src"
Note that the double '\'s above are part of the .reg file format. When using regedit use single '\' throughout.
My build times were divided by 3. :)
I found the info in the Wikipedia article on the SUBST command.
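If you just want to try the mapping before making it permanent in the registry, the same thing can be done interactively (it lasts until reboot) with SUBST:

subst R: D:\devel\build
subst S: D:\devel\src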
