Should a Windows-based build server get automatic updates installed?

I'm wondering whether it is common practice to have automatic updates activated on a build server running Windows. The build server uses Jenkins, Visual Studio, and Java to drive the build. On the one hand, I want a system where it is clearly defined which software is installed. On the other, I want a server that has up-to-date patches installed.
What is a common practice?

In my previous company, we used Windows to host the Jenkins master and all the slaves, and we built our code with Visual Studio 2010. We tried automatic updates, and they broke our configuration twice in three years. So if you want to control your server's configuration, I recommend applying the Microsoft patches manually (you can test the patches in a staging environment before applying them in production).
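If you do go the manual route, a minimal sketch of switching a standalone build server to "notify only" (assuming it is not already managed by WSUS or Group Policy) could look like this, using the documented Windows Update policy registry values:

    # Switch Automatic Updates to "notify before download" so patches are applied manually.
    # Adjust or skip this if the server is managed by WSUS/Group Policy.
    $au = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU'
    New-Item -Path $au -Force | Out-Null
    Set-ItemProperty -Path $au -Name AUOptions -Value 2 -Type DWord   # 2 = notify before download
    Restart-Service wuauserv

Patches can then be installed during a planned maintenance window, after they have been verified on the staging environment.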

Related

Better Version Control than TFS

I have been using TFS for version control for about a year. It worked well with Visual Studio, but a few days ago my PC crashed and the code on my local machine that was not checked in was destroyed. I had to redo that work, which took a long time and delayed my project. Now I want to move from TFS to a better version control system, one that handles code locally as well as on the server.
Please advise if anyone knows a good version control tool for Visual Studio. Thanks in advance.
As Edward commented, TFS is not just a version control system. It is a product that provides source code management (with either Team Foundation Version Control or Git), reporting, requirements management, project management, automated builds, lab management, testing, and release management capabilities. It covers the entire application lifecycle and enables DevOps capabilities.
TFS supports two types of version control: Git and Team Foundation Version Control (TFVC), one distributed and one centralized. As for which version control system you should use, you could take a look at this thread: Choosing the right version control for your project
For your case, you should check in frequently or set up some sort of backup system to avoid losses like this, or set reminders to prompt you to commit code to version control. If the files are gone from your file system, they're gone.
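If you move to the Git side of TFS/VSTS, a minimal sketch of the "check in frequently" habit under that assumption (the remote URL and project name below are placeholders) is:

    # Commit early and often, and push so the work also survives a local disk failure.
    git init
    git add .
    git commit -m "WIP: save local progress"
    # Example TFS Git remote URL format; replace server, collection and project with your own.
    git remote add origin http://tfs:8080/tfs/DefaultCollection/_git/MyProject
    git push -u origin master

Small, frequent commits cost almost nothing, and pushing them means the server always has a recent copy of your work.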

Why is latest Xamarin needed on local TFS server for CI?

The requirements on the following page state that you need to install Visual Studio with Xamarin on your local TFS server to set up Xamarin CI builds:
https://developer.xamarin.com/guides/cross-platform/ci/intro_to_ci/
[Diagram from that page: topology of the CI setup]
This is a real pain. We have lots of developers that rely on our local TFS server, most of whom don't do any Xamarin development. As such, any changes are heavily scrutinized. This often leads to us not installing the latest VS/Xamarin releases, as it's considered too risky for this vital bit of infrastructure.
We could have a Windows build machine with VS and Xamarin installed, that is connected to a Mac build machine. We'd be free to update the Windows and Mac build machines regularly, without the fear of compromising the TFS server. Is this possible? If not, why not?
Thanks in advance.
That diagram can't be right. There is no reason why you'd need VS or Xamarin installed on your TFS app tier.
I think it's showing a simplified configuration where the Windows build agent is installed alongside the app tier. That is a supported setup, but it is not recommended, for exactly the reasons you describe.
The diagram is simplified. You don't need to install anything on your TFS server. What you do instead is to install a Build Agent on a separate machine or virtual machine.
The installation details for the TFS 2017 / VSTS build agent v2 can be found in the official Visual Studio documentation.
The procedure is similar for both TFS and VSTS: you generate a personal access token in TFS/VSTS, then enter the URL of the TFS/VSTS instance, along with the token, when running the build agent configuration script.
There are build agents for Windows, Linux, and macOS, so it is up to you how iOS builds are made.
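As a rough sketch of that procedure, an unattended configuration of the v2 agent on a separate Windows build machine might look like the following; the collection URL, token, pool, and agent name are placeholders for your own values:

    # Run from the unpacked agent folder on the build machine (not on the TFS app tier).
    $pat = 'ReplaceWithPersonalAccessToken'
    .\config.cmd --unattended `
        --url https://tfs.example.com/tfs/DefaultCollection `
        --auth pat --token $pat `
        --pool Default --agent XamarinBuild01 `
        --runAsService

The Mac build machine gets its own agent configured the same way, pointed at the same pool or a dedicated one.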

Source Server support on VMs with VMWare Lab Manager and TFS?

My company is interested in better integrating our investment in VMWare with our TFS deployment. Currently the company is running TFS2005 SP1, VS2010, and we have a sizeable SAN that we would like to use in environment reproduction similar to what is offered in TFS2010 Lab Management.
Of the features offered by TFS2005, we are currently leveraging only TF Version Control--work items and build automation are handled by separate systems. However, we would like to use the TFS-integrated Symbol/Source server in order to accurately debug the different versions of our product, and that's where we're running into difficulty.
The VMs deployed in VMware are not joined to the corporate domain, and this means that we run into difficulty when attempting to grab source code information via Source Server and the "tf.exe view" command.
If devenv is run on the VM, it can't authenticate a domain account, and tf.exe view fails when grabbing source info.
If devenv is run on the developer desktop and debugging is done with the remote debugger, the VM's local user account fails to access the share exposed by the Symbol Server and can't load symbols to begin with, much less retrieve source.
Has anyone done this before?
Yes, you can still do this. If you are using Windows 7 (and I believe Windows Vista), you can add the domain credentials to Credential Manager in the Control Panel. This lets the machine authenticate against the TFS URL whenever it needs to talk to TFS.
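The same credentials can be stored from the command line with cmdkey; a small sketch, where the server names, domain, user, and password are placeholders:

    # Store domain credentials on the non-domain-joined VM for the TFS server and the symbol share.
    cmdkey /add:tfsserver.corp.example.com /user:CORP\builduser /pass:ReplaceWithPassword
    cmdkey /add:symbolserver.corp.example.com /user:CORP\builduser /pass:ReplaceWithPassword

With entries for both the TFS URL and the symbol server share, both tf.exe view and symbol loading should pick up the stored credentials.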
BTW, I have a blog post discussing the Symbol Server and Source Server features of TFS 2010 available here: http://bit.ly/SymbolServerTFS

How is your Development Environment Set up?

Curious to know how people set up their personal and/or work development environment, in terms of:
Do you just have all of your developer tools (for example Visual Studio, SSMS, etc.) installed on your main operating system;
Do you use Virtual Machines to have a separate "clean" dev environment that consists only of the OS and one compiler you're working with;
Do you have multiple OS's in a multi-boot system;
Do you remote connect to a separate machine with your developer tools installed on there
It all depends on the type of job, I guess. Here is my setup:
The main PC. The one on my desk. Has everything on it.
The secondary machine. Runs Vista.
A bunch of "Clean" VMs for testing. Typically 2 machines of each OS we support.
A build machine. VM with no installed product. Just source code and some compilers.
A dedicated "Server" to host the server app and the DB. [Our product is a client-server thingy]
[On top of that, my primary and secondary machines have the server and DB running too.]
EDIT: By "clean" I mean that they only have a freshly installed OS on them, nothing else. These are non-persistent and go back to a clean state on shutdown.
I am running what I think is a fairly standard Agile C# development environment. Vista SP1, Visual Studio 2008 with Resharper 4.1, SQL Express 2008, Subversion server, command line svn client and Cruise Enterprise (unbelievable product) with 1 server and 1 agent for continuous integration.
I am running on a Dell XPS core 2 duo 2.4Ghz laptop with 4GB of RAM and 1 external 22" widescreen monitor.
I have tried and tried and persisted with VMware Workstation (mostly, but also Virtual PC), but again and again I end up going back after tiring of the performance and the annoying delays in Visual Studio. And I have tried every performance trick and tweak in the book available to me. It apparently just needs either more hardware than I have or far more patience.
I have also tried running 64-bit Ubuntu with VMware Workstation running Vista (vLite'd) and also Windows XP (lite), but I found it just as annoying.
If you have similar specs to what I described then I can simply recommend not going down the VM path, unless it is ABSOLUTELY necessary.
I have a VMware network replication of the main servers in my environment, including SQL servers, web servers, a copy of my dev box, and AD servers. I also use VS on my dev box for simple things that don't need as much testing.
We use Virtual PCs for our development, as well as a Virtual PC for our build environment. The reason for this is so that we can switch between different projects (for support) without losing time.
At our current client, we have an ESX server with virtual machines running on it. We access the virtual machines through Remote Desktop.
For my style in VS 2008, I use VibrantInk by Rob Conery.
We have Reflector and all Sysinternals tools available on all virtual machines.
I'm planning to have ReSharper on every machine also.
Firefox/Firebug combo is installed on every machine.
Web Developer for IE7 is also installed on every machine.
Cheers!
I really enjoyed using a single VM for each IDE I worked with, but that requires a beefy machine. However, my company has recently taken to the idea that developers can do "just fine" with sub-$500 machines. Thus, my current setup is everything on my only machine.
All of my tools are on my local machine. I generally work within the MVC mindset.
VMWare is set up on my machine, but it's only used on rare occasion for things beyond the control of my machine.
My work is primarily done on a windows machine, with Visual Studio.
I have Visual Studio 2005 and 2008 running on my main machine (Vista :p), and everything I can develop here without cluttering the machine, I do. Feels so much more responsive than in a VM. I have a VM for Linux-based development and several VMs for testing purposes. I never tested VMWare's debugging feature (run the debugger on the host and the debuggee on the guest), though I can imagine that that would be a good reason to have Visual Studio on the host, even if you don't care about responsiveness.
I have a number of IDEs and server products running on my main workstation. I also have a remote access laptop that has all the same critical software on it so I can develop locally (and not depend on Citrix and Remote Desktop to work on code fixes outside the office).
My main work system
Linux x64 dual core
Dual monitor
Redhat based OS
Vim, KDevelop, Eclipse (with EPIC and Subclipse).
My system is similar (architecture and OS) to our servers, which is what I implement code for. Since I work for a small company and wear many hats, I tend to have an SSH'd MySQL connection open in one window, with a Vim session open on the other side. Throughout the day I use SSH, Vim, SVN, Firefox, and e-mail.
I put all toolchains and other apps needed to build my code into revision control, and write makefiles for all projects so that the version of the tools from the repository is used, not whatever happens to be in $PATH. So when I label a release, it includes everything needed to do the build and depends on build machine setup as little as possible. All I need to do is sync from revision control and type 'make'. Unfortunately this does require having Cygwin installed on Windows, but personally I consider a Windows machine just about unusable for development without Cygwin, regardless of the prerequisites of the build system.
I have simple makefiles to build projects that include platform-specific .mk files. I don't manually create IDE project files. In a couple of cases (Rowley CrossWorks for embedded ARM development, Visual Studio for self-hosted Windows PC development), I auto-generate project files based on my makefiles as part of the "make debug" target, and then launch the IDE with the generated project. This makes debugging convenient without requiring parallel maintenance of an IDE-specific project file in addition to my makefile.
I am about to set up a new development environment for a new department.
The build environment (supporting both Java and .NET development) will be on two separate VMware machines running on the same physical computer. Both images will use Windows Server 2008.
Developer machines will be desktop computers, most likely with 6 GB of RAM, big hard drives, one or two dual- or quad-core CPUs, two 24" screens, etc., and with Windows Server 2008 installed. This is to ensure that developer code is compiled on the same OS. Desktops, because I want the developers to be able to use VMware for testing, etc., without spending too much time complaining about lack of performance when two VMware instances are running at the same time :)
I am trying to figure out the build environment now. Considering TeamCity, among others. It is difficult to find the right one when you want to support a multi-platform environment without too much fuss :)
Every developer setup includes a 17" MacBook Pro with a 22" LCD screen.
Eclipse is our IDE, and we use VMware to host our development database (Oracle) under Windows XP.
Obviously a lot of your answers are going to depend heavily on what kind of development each person does. Maybe we should be categorizing these? :)
Web Development
I use a VM to run a Linux guest with a development webserver. I use Notepad++ on my host for editing (recent convert from jEdit), and with drive mapping in the VM software (Sun's VirtualBox), my dev webserver guest machine has no problem serving up the ever-changing source files. I also use the Windows XP IE6 VPC image in another VM to test the page in IE6. I use this setup even if I'm not developing a complicated web-app and am simply working on a static HTML page; there are still some quirky differences in behavior between a locally opened file and a served webpage in a number of browsers that make this worthwhile.

Installing Team Foundation Server

What are the best practices in setting up a new instance of TFS 2008 Workgroup edition?
Specifically, the constraints are as follows:
Must install on an existing Windows Server 2008 64 bit
TFS application layer is 32 bit only
Should I install SQL Server 2008, SharePoint, and the app layer in a virtual instance of Windows Server 2008 or 2003 (I am already running Hyper-V), or split the layers with the database on the host OS and the app layer in a virtual machine?
Edit: Apparently, splitting the layers is not recommended
This is my recipe for installing TFS 2008 SP1.
There is no domain controller in this scenario; we are only a couple of users. If I were to do it again, I would consider changing our environment to use an Active Directory domain.
Host Server running Windows Server 2008 with 8GB RAM and quad processor
Fresh install of Windows Server 2008 32bit in a VM under Hyper-V
Install the Application Server role with IIS (see the sketch after this list)
Install SQL Server 2008 Standard edition
Use a user account for Reporting Services and Analysis Services
Create a slipstreamed image of TFS 2008 with SP1 and install TFS
Install VSTS 2008
Install Team System Explorer
Install VSTS 2008 SP1
Install TFS Web Access Power tool
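For the Application Server / IIS step above, a rough sketch of adding the roles from the command line on the Server 2008 VM; the role IDs are assumptions to verify with "ServerManagerCmd -query" and against the TFS installation guide:

    # Add the Application Server and Web Server (IIS) roles on the Windows Server 2008 VM.
    # Check the TFS 2008 installation guide for the exact IIS components required.
    ServerManagerCmd.exe -install Application-Server
    ServerManagerCmd.exe -install Web-Server -allSubFeatures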
After installing everything, reports were not generated. Found this forum post that helped resolve the problem.
Open http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx
Run the web service (see the above link for details); it will take a little while, and the TfsWarehouse will be rebuilt
It is very important to do things in order: download the installation guide and follow it to the letter. I forgot to install Team System Explorer until after installing SP1 and ran into all sorts of problems. Installing SP1 once more fixed that.
One critical thing you have to keep in mind about TFS is that it likes to have the machine all to itself. So if you have to create a separate instance on Hyper-V, do it using the proven Windows Server 2003 platform with SQL Server 2005.
I am sure Microsoft has done a great job getting it to work under Windows Server 2008 and SQL Server 2008, however you don't get any additional features with this newer install and it is currently unproven in the wild.
So my recommendation is to stick with what is known until the next release of TFS comes out.
Also, splitting the layers is definitely not recommended, especially in the Workgroup edition, where you are only allowed 5 licensed users. Those 5 users will never exceed what a single server can handle. My recommendation is also not to update SharePoint if you don't need to. In my environment we don't really use SharePoint all that much, so I left it alone. SharePoint is usually, in my experience, where most of the problems with TFS come from.
I just upgraded our team to TFS 2008, from TFS 2005. The hardest part was upgrading SharePoint 2.0 to 3.0, so I would make sure to do that first, if you have not already installed TFS 2008. We had a couple of other difficulties, but they were all either related to the SharePoint upgrade, or to the fact that we were using an aftermarket Policy package - Scrum for TeamSystem. We are on SQL Server 2005, so I cannot address SQL Server 2008. As for splitting the layers, we did not do this either, as we are running on Windows Server 2003 and everything ran under the host OS.
Splitting the layers is only needed for more than 450 users.
I would also recommend having the build server on a completely separate machine. Building is very file-system intensive. SQL Server performs best when it has complete control of a file system, so having the build and TFS on the same machine may create performance issues while builds are executing.
Perhaps this can be alleviated with proper tuning and separate physical drives, but I'd think that in the long run it would be a lot simpler to either use some old hardware or spin up a small virtual machine on a separate host for your builds.
