Is it better to create a reference image and clone it to all of the machines, or is it better to use the Microsoft Deployment Toolkit to do an automated install on every machine? In a lab setting with 25 identical machines, why is one method better than the other? What are the pros and cons of each method? Is speed the only factor?
Microsoft says time is the only factor. It takes about 20 minutes to image a machine and 3 hours or so to do a fresh automated install. In my experience, most of the time an automated install takes is eaten up by Windows updates.
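For reference, the imaging path boils down to capturing one prepared machine and applying it to the rest. A rough sketch with DISM (drive letters, file names and the image name are just placeholders, and the reference machine would normally be generalized with sysprep before capture):

    rem Capture the prepared reference machine into a WIM file (run from Windows PE)
    dism /Capture-Image /ImageFile:D:\images\lab-reference.wim /CaptureDir:C:\ /Name:"Lab reference"

    rem Apply that image to each of the 25 target machines
    dism /Apply-Image /ImageFile:D:\images\lab-reference.wim /Index:1 /ApplyDir:C:\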
We are a mid-sized company and build a game engine in C++ for all major operating systems (macOS, Windows and Linux). One issue we are facing is the number of different configurations on developer machines. Some people use different versions of Visual Studio, Xcode or other development tools. We have guidelines and a list of version numbers we officially use, but at the end of the day we end up with different configurations on everyone's machine because everyone installs what they want.
Does anyone have experience with this problem, and how did you solve it? We often run into "but it works on my machine" scenarios.
I am eyeing (Docker) containers to ensure everybody uses the same development environment, but I have my doubts that it's the right way to go for a C++ real-time project. Any hints?
Disclaimer: I was wondering whether SO is the right place to ask this question; according to MetaExchange, it is.
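For the container idea mentioned above, a minimal sketch of what a pinned toolchain could look like, assuming a Linux target, a make-based build, and the official gcc image from Docker Hub (the tag and the make target are placeholders):

    # Every developer builds with exactly the same pinned compiler image,
    # so toolchain-version differences between machines go away.
    docker run --rm -v "$PWD":/src -w /src gcc:9 make release

Note that this only pins the Linux toolchain; the Visual Studio and Xcode versions used for the Windows and macOS builds still have to be pinned some other way, which is part of why containers alone may not be the whole answer for a cross-platform C++ engine.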
My team is currently using VS2005 with the following development PCs that are a few years old: XP, Pentium D 2.8GHz, 2GB RAM.
My gut tells me that this is going to be poor hardware for VS2010 development. I am not running the VS2010 beta, but I am running the Blend 3 beta and its performance is bad.
Can you point me to anything that I can show my boss to convince him to buy 6 new machines for my team?
Edit below after initial answer from Jon:
I should have added that my boss wants to upgrade the current machines with new hard drives, so I am trying to use this opportunity to look forward and see if an HD upgrade is really worth it. This HD upgrade would not just be a simple installation of a second drive; it would replace the current drive and would involve backup/restore or reinstallation headaches. There would be the added benefit of 64-bit development too, something that we have been talking about.
Betas typically are bad in terms of performance. I know that MS is working hard to improve the performance of VS2010.
However, I have the beta running on my Samsung NC-10 netbook, so it does work on low spec machines.
Do you already find yourself frequently waiting for your machine to catch up? If so, that's the reason to give your manager: you'll be more efficient now with a new machine. If not, wait until VS2010 is out and you actually have it installed (will you even upgrade immediately?) - then if it's too slow, you can show that to your manager at that point.
Speculatively requesting an upgrade doesn't sound like a good idea to me.
Given that those Pentium-Ds are dual core processors, I'd suggest:
maxing out the RAM
Windows 7 x64
separate faster HDDs
Those CPUs, while not at the top of the list, are decently powerful. They should handle today's workload of VS2005/2008 without much trouble.
The bottleneck is more likely the RAM and probably the HDD speed. I know you didn't mention HDDs at all, but consider two drives (OS and data) or an SSD instead.
I realize this isn't a direct answer to your question of how to convince your boss to kick out the money, but if that doesn't work, perhaps this will help in terms of getting more performance.
Even though I'm a little late, I need to add that MS is really working hard to improve the performance of VS2010, but they haven't been successful so far. Using WPF slowed VS2010 down a lot.
My experience shows that VS2010 has only one drawback: it's slow. Therefore, the developer machine should be based on at least a Nehalem-core processor and have 4 GB of memory.
In any case, VS2005 is not the best choice; VS2008 is much more mature.
I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?".
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers etc., specifically for a Windows environment?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
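For the scripting part, what I have in mind is something like a batch file of silent installer invocations run from a share; a rough sketch (the share path and installer names are placeholders, and /qn is the standard silent switch for MSI packages):

    rem Unattended setup sketch for a new dev machine
    msiexec /i \\fileserver\installers\SourceControlClient.msi /qn /norestart
    msiexec /i \\fileserver\installers\DiffTool.msi /qn /norestart
    rem Non-MSI installers usually have their own silent switches (e.g. /S or /quiet); check each vendor's docs.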
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend the better part of a day fixing it and reloading the software.
So we went out, bought iMacs for everyone and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed up, we just loaded a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, doesn't matter, just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" of your current image and revert if you really mess something up.
Multiple builds on the same machine, since you can run multiple OSes.
Surprisingly the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I keep digital copies of VS.NET, which makes it much easier to install.
When it comes to other tools, it's generally just better to go to the websites and install the latest versions.
Also, it's a good idea to install tools whenever you need them instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it, a new, faster machine will make anyone more productive.
You can have zero downtime by having both machines available, though you will not be as productive during the transition.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. Having the developers list the applications they need before the move can help you optimize your deployment strategy. Both machines should be available for a fixed period of time, so developers can work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share would also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move, it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The part I tend to spend the most time on is re-configuring all of the users' custom settings (very common with developers, I find). This is where it is very valuable to keep the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they had things set up.
Depending on how your team works, I would highly recommend that every user receiving a new computer get the latest source tree from your source control repository rather than copying entire directories. I would also recommend doing that before actually sending the old workstation elsewhere or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.
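A quick way to spot such an unofficial "personal branch" before the old machine is wiped is a read-only status check in each working copy; for example (working-copy paths are whatever you already have):

    # Subversion: list locally modified (M) and unversioned (?) files without changing anything
    svn status ~/work/project-trunk

    # CVS: dry-run update (-n) that reports modified and unknown files without touching them
    cvs -nq update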
Is a CI server required for continuous integration?
In order to facilitate continuous integration you need to automate the build, distribution, and deploy processes. Each of these steps is possible without any specialized CI server. Coordinating these activities can be done through file notifications and other low-level mechanisms; however, a database-driven backend (a CI server) coordinating these steps greatly enhances the reliability, scalability, and maintainability of your systems.
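As an illustration of the "without a specialized CI server" case, the coordination can be as crude as a shell loop that polls the repository and runs the build, test, and deploy steps whenever something changed. A sketch, assuming Subversion and a make-based build (the URL, commands and interval are placeholders):

    #!/bin/sh
    # Poor man's CI: poll the repository and run the pipeline when the head revision changes.
    REPO=https://svn.example.com/project/trunk
    LAST=""
    while true; do
        HEAD=$(svn info "$REPO" | awk '/^Last Changed Rev/ {print $NF}')
        if [ -n "$HEAD" ] && [ "$HEAD" != "$LAST" ]; then
            svn checkout -q "$REPO" workdir &&
                (cd workdir && make && make test && make deploy) &&
                echo "revision $HEAD: OK" || echo "revision $HEAD: FAILED"
            LAST="$HEAD"
        fi
        sleep 300   # poll every five minutes
    done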
You don't need a dedicated server, but a build machine of some kind is invaluable; otherwise there is no single central place where the code is always being built and tested. Although you can mimic this effect using a developer machine, there's the risk of overlap with the code that is being changed on that machine.
BTW, I use Hudson, which is pretty lightweight - it doesn't need much to get going.
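For scale: Hudson runs standalone from its WAR file with an embedded servlet container, so assuming a Java runtime is already installed, getting it going is essentially one command, and jobs are then configured through the web UI:

    # Start Hudson standalone; by default it listens on port 8080 (use --httpPort to change it)
    java -jar hudson.war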
It's important to use a dedicated machine so that you get independent verification, without corruption.
For small projects, it can be a pretty basic machine, so don't let hardware costs get you down. You probably have an old machine in a closet that is good enough.
You can also avoid dedicated hardware by using a virtual machine. Best bet is to find a server that is doing something else but is underloaded, and put the VM on it.
Before I ever heard the term "continuous integration" (this was back in 2002 or 2003), I wrote a nightly build script that connected to CVS, grabbed a clean copy of the main project and the five smaller sub-projects, built all the jars via Ant, then built and redeployed a WAR file via a second Ant script that used the Tomcat Ant tasks.
It ran via cron at 7pm and sent an email with a bunch of attached output files. We used it for the entire 7 months of the project, and it stayed in use for the next 20 months of maintenance and improvements.
It worked fine, but I would prefer Hudson over bash scripts, cron and Ant.
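For the curious, the whole setup amounted to something like the following (module names, Ant targets and the mail step are simplified placeholders; the real script attached the output files rather than mailing a single log):

    # crontab entry: kick off the nightly build at 7pm
    0 19 * * * /home/build/nightly-build.sh

    #!/bin/sh
    # nightly-build.sh (sketch)
    cd /home/build/nightly || exit 1
    {
        cvs -q checkout main-project sub1 sub2 sub3 sub4 sub5   # clean copies from CVS
        ant -f main-project/build.xml jars                      # build all the jars
        ant -f main-project/deploy.xml deploy-war               # Tomcat Ant tasks rebuild and redeploy the WAR
    } > build.log 2>&1
    mail -s "Nightly build" team@example.com < build.log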
A separate machine is really necessary if you have more than one developer on the project.
If you're using the .NET technology stack here's some pointers:
CruiseControl.Net is fairly lightweight. That's what we use. You could probably run it on your development machine without too much trouble.
You don't need to install or run Visual Studio unless you have Visual Studio Setup Projects. Instead, you can use MSBuild, a free command-line build tool (see the sketch below).
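As an illustration, a build-server build is a one-liner; the solution name and configuration are placeholders, and MSBuild ships with the .NET Framework under %WINDIR%\Microsoft.NET\Framework (the version folder depends on the framework you target):

    rem Build the solution in Release configuration without Visual Studio installed
    %WINDIR%\Microsoft.NET\Framework\v3.5\MSBuild.exe MySolution.sln /t:Rebuild /p:Configuration=Release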
Microsoft already has a Windows 7 Beta Customer Preview Program on their MSDN site where they encourage us to: "Evaluate and jump-start your development efforts on Windows 7 Beta".
Do you feel it is worthwhile to spend time on Windows 7 now, or should I wait a few releases, or even until after Windows 7 is released?
What are the advantages and disadvantages to starting this early?
As Paul said, there's absolutely no reason not to start now. What you fix now is something less to fix later - and you also get the benefit of having an application that is deployable on an OS that over 2.5m people are expected to download and install over the next few weeks.
Of course, you can expect to have to make minor adjustments to your program as bug fixes are implemented or as new features are rolled out, but what you do now will still save you time - even if it just saves you having to become familiar with any platform-specific constraints further down the line, when pressure from potential clients, customers, etc. will be significantly higher.
I've installed Windows 7 on two computers. So far, there has only been one small issue (the software did not find a USB device). I ran the compatibility wizard and now it works fine.
They have made it easy enough for an end user to take care of.
It's basically Vista 6.1: lots of good improvements, but not a new operating system. So there's no rush to test.
I've downloaded the Windows 7 beta and will be installing it into a VM shortly.
There's really no reason not to check your stuff on it now. It's way better for you to find and fix any problems before your users do.