I am working to automate the install of some software.
It relies on a number of prerequisites, such as the Java JDK, and lots of other things that currently involve manual steps for installing and copying files around.
I would like to be able to test whether the various packages are installed and, if not, install or update them.
How likely is it that I can get MSBuild to do this sort of work? If unlikely then where can I look?
Thanks
The answer is yes. MSBuild can execute any command -- as long as that command does not expect a user to be in front of the computer. I know a silent JDK install is possible, so you can just execute that command in your MSBuild target.
However, a more interesting question is: should you do this? I think that performing machine-wide configuration steps as part of the build is bad practice. For certain things, like deploying your newly built product for the CI cycle, it is OK, but as part of the build itself it will be very inflexible.
What I would recommend in the case of the JDK: since the JDK is big and mostly backwards-compatible, have your build script check whether the correct version of the JDK exists on the machine. If it does not, fail the build and print instructions in the log on how to configure the machine. For smaller dependencies, see this SO question.
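A minimal sketch of such a check as an MSBuild target (the JavaHome property and the installer hint in the message are assumptions for illustration, not a fixed convention):

<Target Name="CheckJdk" BeforeTargets="Build">
  <!-- Fail fast with instructions if the expected JDK is not present -->
  <Error Condition="!Exists('$(JavaHome)\bin\javac.exe')"
         Text="JDK not found at $(JavaHome). Run the JDK installer (it supports a silent mode) and set the JavaHome property." />
</Target>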
I know this may be overkill for a single developer solution (personal project and not yet enterprise software), but I was wondering how to better respond to my needs.
I would be needing to accomplish the following:
Run integration tests (non-UI for the moment) at least daily in order to see if any of my commits breaks the build.
Build the entire solution daily to see if any of my commits are incomplete and would cause problems when checked out into another folder.
Be run on my personal computer at least once a day (using another computer to automate the build process is not an option for the moment)
I know that automated build software such as Jenkins is easily capable of doing the above (even on the same machine the commits are made from?), but I was wondering if lighter solutions are available, e.g. post-commit hooks on the repository, scripts, scheduled tasks, etc.
Edit
Forgot to mention that I am using a Windows machine with a C# project running NUnit tests. I use Visual Studio 2012 to compile the solution and run the tests with NUnit. I use TortoiseSVN, with AnkhSVN as a repository browser.
You might make a crontab(5) entry to periodically (e.g. daily) run your build or tests.
I have a crontab entry invoking a shell script that fetches the source tree with svn or git into a fresh directory and builds it, daily.
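For example, a minimal entry (the script path is an assumption; the script would hold your checkout, build and test commands) could look like this:

# rebuild and test every night at 02:30, appending output to a log
30 2 * * * $HOME/bin/nightly-build.sh >> $HOME/logs/nightly-build.log 2>&1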
You could consider using inotify(7) facilities, perhaps through incron, to have a test run as soon as you modify some file (e.g. an executable).
Look also at D.Moreno's garlic project (which I never used).
You could also simply have some Makefile targets for tests, and run them from emacs. I have
(load-library "compile")
(global-set-key [f8] 'recompile)
in my ~/.emacs so I just compile things by pressing the F8 key in my emacs editor.
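A Makefile test target of the kind mentioned above can be very small; a sketch (the test runner script is an assumption):

# build everything, then run the test suite
test: all
	./run-tests.sh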
Use Jenkins - there's no reason not to, considering it's reasonably lightweight itself (despite being a Java app). It's very self-contained too: backup involves stopping the Jenkins service and copying the installation directory, so it's not going to pollute your OS.
Anything else you come up with is going to be too complex (in terms of maintaining a bundle of scripts, scheduled tasks and so on) or just as 'heavyweight'. You might as well save your time and use the tool that fits from the start.
We have a number of small projects within our system running on Linux (Slackware 7-11, slowly migrating to RHEL 6.0). Around 50-100 applications and 15-20 libraries. Almost all our applications use one or more of our libraries. Our source tree looks something like this:
/app1
/app2
/app3
/include
/foo/app4
/foo/app5
/foo/app6
/foo/lib1
/foo/lib2
/lib/lib3
/lib/lib4
/lib/include
Now, I've done some work creating some CMakeLists.txt files and built most of the libs and some of the apps. I'm fairly comfortable with using cmake to build. I did this with v2.6, and I recently (an hour ago) upgraded to 2.8. Each of the above projects has its own CMakeLists.txt file, specific to that project, to handle building and installation (no packaging, yet).
I have a requirement to make use of and enforce continuous integration. I've installed and played around with Jenkins, and from what I've seen I'm very impressed. I'm also evaluating JIRA to do our issue tracking.
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do? Would it be better to tell cmake to look for the lib's source directory, use the export interface or the recently introduced ExternalProject_Add?
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Thank you in advance
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do?
No, it is not a bad thing to do, but your build should ideally reproduce its resources from scratch. Things like portability and fixing build bugs will become an issue if things need to be pre-installed on the system outside of the build process. If you are able to do it one of the other ways you mentioned, I would suggest that, but if it's going to make your build that much longer, it's something you need to feel out. My ideology is that everything should be movable to a new Jenkins machine with a fresh install at the drop of a hat; this isn't always achievable, but it is something to strive for.
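As a rough sketch of the ExternalProject_Add route mentioned in the question (the directory layout and install prefix below are assumptions):

include(ExternalProject)
# build lib3 from its source tree and install it into this build's own prefix
ExternalProject_Add(lib3_external
  SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../lib/lib3
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps)
# the app then resolves headers and libraries from that prefix
include_directories(${CMAKE_BINARY_DIR}/deps/include)
link_directories(${CMAKE_BINARY_DIR}/deps/lib)

This keeps each Jenkins job self-contained instead of depending on whatever happens to be in /usr/local.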
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Well, one of the things I do with interdependent jobs is have the successful build of one job trigger the jobs that depend on it. So, for example, if B depends on A and A fails, B will never be run, and whoever created the issue in build A is responsible for it, and so on. This prevents a cascading effect of broken builds that were all caused by one broken dependency. I would also suggest that you keep the files produced by a particular build in its job folder and point the dependent job at the location of the required files. Again, keep your builds separate and clean.
I'm also evaluating JIRA to do our issue tracking.
I highly recommend JIRA as an issue tracking system for a company; you might want to look at this Jenkins plugin for integration. If you're using Git, and you don't mind hosting your code off-site, I would give GitHub Issues a shot as well.
Good luck, you seem to be on the right track.
Two parts to this question.
1) As part of our Continuous Integration build process I would like to install everything as if it were a virgin machine. Martin Fowler's paper: http://martinfowler.com/articles/continuousIntegration.html
Does he mean that for each (integration) build we take a clean machine and install ALL the necessary software to make the build work? I'm guessing this is what he meant by a "Single Command" build.
2) Which leads me nicely on to the next question. Is it possible to install programs using PowerShell/DOS entirely through the command line? For example, how would I install WinRAR and possibly MySQL? (WinRAR being an easy example, MySQL complex.)
Anyways, I am interested to hear from real-world practitioners of CI and how they approach their build processes.
In the latest CI environment I built, I installed and configured the toolchains and SDKs under a single directory tree and then created an ImageX WIM image of the tree. Each clean build would then mount the image, check out sources from version control, build them, run tests, etc. When unmounting, just remember not to commit the changes back to the image so that the image file stays clean.
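A rough sketch of the mount/unmount steps with DISM (the file names and mount directory are assumptions; ImageX offers equivalent commands):

rem mount the prepared toolchain image for the build
dism /Mount-Wim /WimFile:C:\images\toolchain.wim /Index:1 /MountDir:C:\build\tools
rem ... check out sources, build and run tests against C:\build\tools ...
rem discard any changes so the image file stays clean
dism /Unmount-Wim /MountDir:C:\build\tools /Discard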
For each of our builds with Zed we ensure a completely clean working environment, but assume that the entire tool-chain and utility applications are already installed on the machine.
If you really want to go to the level of a virgin machine, then I would agree with laalto and look into VMs. Set up your VM library to represent the different build environments/configurations that you will need for your product set, and load/start them on demand as you require builds for different products.
I think it is very important to always build from a clean working directory, but I'd question the real value of always trying to start with a bare OS and install everything from scratch for every build.
I am trying to set up a Continuous Integration process. For my various build tasks (compiling, testing, documentation, etc.) I need tools that perform these tasks (csc, NUnit, NDoc, etc.). My question is: should these tools also go into my source control repository?
The reason I think they should is that I read in an online article that the developer environment should be as similar as possible to the build server environment. To fulfill this requirement, the article suggested that you put everything required for your build in the repository, so that when you check out the code (or the build server checks out the code) you are ready to build the project right away without first installing any other tools. But on the other hand, if I put these tools with my source code in the repository, then the build server will have to install them whenever a build is run.
Is it OK to install these tools on every build? Won't it increase the time for each build unnecessarily?
It's often more trouble than it's worth to try to check tools in to source control. Rather, write a list of software requirements that must be installed before the source can be checked out and built (one thing that would need to be on this list in any case is the source control system itself). Even if you rely on software being in source control, some tools might still need to be installed to certain paths or be otherwise configured (registry entries come to mind).
I would certainly not check in the compiler itself to source control, and I probably wouldn't check in NUnit or NDoc either. Just install these beforehand, as they are not likely to change too much over the lifetime of your project. Your build script might want to check that the expected version(s) of the required software packages are installed before the build may proceed.
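A small sketch of such a pre-build check in PowerShell (the NUnit path and expected version are assumptions for illustration):

# fail early if the expected NUnit console runner is missing or the wrong version
$nunit = 'C:\Tools\NUnit\bin\nunit-console.exe'   # assumed install location
if (-not (Test-Path $nunit)) { throw "NUnit not found at $nunit; install it before building." }
$version = (Get-Item $nunit).VersionInfo.ProductVersion
if ($version -notlike '2.6.*') { throw "Expected NUnit 2.6.x, found $version." }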
Unless you're customizing the tools there's probably no reason to put their source code in your repository. However there are excellent reasons for putting your config files in the repository.
Re-installing the tools for every single build is overkill and will slow you down.
However, it's far better to have a server dedicated to continuous integration so that you know its state; you are sure nobody has installed anything that may have an impact on the outcome of the build.
If you want to be able to re-generate today's build next year, you need to be able to re-create your environment first. Make sure you'll be able to re-install your tools (exact same version), either by keeping them on your server (installing the newer versions in different directories), or storing the whole package in your configuration management tool.
Think about how you would create another continuous integration server, either to have two of them, or for a second site, or to recover after a disaster. Document how the continuous integration server was set up.
What really needs to be version controlled is the build scripts, which access the right versions of the tools, especially if you opt for installing several versions of them.
In using our TeamCity Continuous Integration server, we have uncovered some issues that we are unsure how best to handle; namely, how to reference external applications that our application requires on the CI server.
This was initially uncovered with a dependency on Crystal Reports, so we went and installed Crystal Reports on the server, fixing the immediate problem. However, as we move more applications over to the CI server, we are finding more dependencies.
What is the best strategy here? Is it to continue installing the required applications on the server?
Thanks
Where possible make the external dependencies part of your build system.
For instance, check the installer into your version control system and have a step that checks it out and runs it in silent mode (many installers support a mode with no user interaction, often via a command-line switch such as /s).
This way if you need to set up another build machine for a branch or just for new hardware everything is repeatable.
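A rough sketch of such a silent-install step (the installer file names and switches below are assumptions; check each package's documentation for its actual unattended switch):

rem run checked-in installers unattended before the build proper
msiexec /i tools\installers\CRRuntime.msi /qn /norestart
tools\installers\some-dependency-setup.exe /S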
If your builds require the actual application to complete the build, then you should probably continue to install the application on your build server.
If you just need references to DLLs or assemblies from the application, then what we've done at my company is to create installable 'SDKs' of the references required for a particular application and install them on our development and build machines in well-known library directories that our solutions reference.
On the build machine, our pre-build steps install the correct version of the dependencies and then clean them up when we are finished.
Recently, we've moved to using virtual machines, activated by our build process, as our build machines. These VMs get the SDKs installed on them as a pre-build step, and are then restored to their snapshot state after the build. We had some dependencies that were almost impossible to uninstall, so this made for a clean starting point each time.
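The answer doesn't name the VM product; with VirtualBox, for instance, the restore-to-clean-state step could be as simple as this (the VM and snapshot names are assumptions):

# roll the build VM back to its known-clean snapshot once the build has finished
VBoxManage snapshot "build-vm" restore "clean-sdks"
VBoxManage startvm "build-vm" --type headless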
If you use Maven to build, you can define your dependencies in the pom.xml file. They will then be automatically downloaded if necessary.
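For example, a dependency declared in pom.xml (the artifact below is just an illustration) is downloaded from the repository automatically at build time:

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13.2</version>
    <scope>test</scope>
  </dependency>
</dependencies>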
I am not sure if I followed correctly...
I am assuming your application depends on this external app while building? In that case it should be on the machine doing CI...