I would like to use some build parameters from Project 1 in Project 2. I know that I can make Project 1 a dependency of Project 2 and then access its build parameters as described in Dependencies Properties, but I do not want Project 1 to be built in response to a build of Project 2. For example, suppose I want Project 2 to be built nightly, while I only want Project 1 built monthly.
Is there any way Project 2 can access Project 1's build parameters under these conditions?
I would use a build configuration template that is shared between the two projects.
This means you can share properties between the projects, but also override certain ones in each individual project.
We use this for hourly builds that are not tagged and nightly ones that are tagged.
Then use a different build trigger to set one off nightly and the other monthly.
EDIT
I'll just expand slightly as a result of your comment.
In TeamCity we have two build configurations for the same project. One builds on every check-in to give developers quick feedback on their contribution (the build completes within 15 minutes). It does the following:
Builds the project in Debug
Runs all unit tests
Checks results of build into Subversion
The other configuration runs every night at midnight; it builds everything and as a result takes a long time (around 45 minutes). It does the following:
Builds the project in Debug and Release
Runs all unit tests
Builds the Sandcastle documentation
Checks results of build into Subversion
Grabs the Sandcastle output as an artefact so developers can easily download it.
As you pointed out, this isn't as straightforward as one would like; however, you can use the following to achieve it (see the sketch after this list):
We use the Autoincrementer to share build numbers between the two configurations (they both increment the same build number when built).
We have a property on the template that defines what artefacts to collect, and it is referenced from the artefact paths field. The property is overridden in the second build configuration to also grab the Sandcastle output.
Sharing VCS roots is mentioned in the documentation. Both our builds get the source from the same place and tag the results to the same place, so one VCS root is all we need.
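For illustration, here is a rough sketch of that arrangement in the modern TeamCity Kotlin DSL (at the time of this answer it was all configured through the web UI); every ID, parameter name, script and path below is hypothetical:

```kotlin
// Hypothetical sketch of a shared template with two configurations attached to it.
import jetbrains.buildServer.configs.kotlin.v2019_2.*
import jetbrains.buildServer.configs.kotlin.v2019_2.buildSteps.script
import jetbrains.buildServer.configs.kotlin.v2019_2.triggers.schedule
import jetbrains.buildServer.configs.kotlin.v2019_2.triggers.vcs

version = "2019.2"

project {
    template(SharedBuild)
    buildType(CheckinBuild)
    buildType(NightlyBuild)
}

// Common settings live in the template; the artefact paths field just
// references a parameter that attached configurations may override.
object SharedBuild : Template({
    name = "Shared build template"
    artifactRules = "%artefact.rules%"
    params {
        param("artefact.rules", "output/** => binaries.zip")
    }
    steps {
        script { scriptContent = "build.cmd" }   // placeholder build script
    }
})

// Per-check-in configuration: inherits everything, fires on VCS changes.
object CheckinBuild : BuildType({
    name = "Check-in build (Debug + unit tests)"
    templates(SharedBuild)
    triggers {
        vcs { }
    }
})

// Nightly configuration: same template, but overrides the artefact parameter
// to also collect the Sandcastle output, and runs on a schedule instead.
object NightlyBuild : BuildType({
    name = "Nightly build (Debug + Release + docs)"
    templates(SharedBuild)
    params {
        param("artefact.rules", "output/** => binaries.zip\nHelp/** => sandcastle.zip")
    }
    triggers {
        schedule {
            schedulingPolicy = daily { hour = 0 }
        }
    }
})
```

Everything common lives in the template; each configuration adds its own trigger and overrides only what differs.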
Bit of a long edit, but I think it goes exactly along the lines of what you're trying to achieve. I appreciate I should have included this in the original answer.
HTH
Dependency is different from build triggering in TeamCity. If you make one project dependent on another (an artifact dependency), it does not mean that the latter will trigger the former.
Even when one project has been defined as dependent on another (and also when it has not), you have to explicitly specify a build trigger (in this case a Finish Build trigger) for the dependent project to be triggered.
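As a rough sketch in the modern Kotlin DSL (all IDs and paths invented), the artifact dependency and the Finish Build trigger are two separate pieces of configuration, and only the trigger makes the dependent configuration queue automatically:

```kotlin
// Sketch: Project2 consumes Project1's artifacts; without the explicit
// finishBuildTrigger it would never be queued automatically.
import jetbrains.buildServer.configs.kotlin.v2019_2.*
import jetbrains.buildServer.configs.kotlin.v2019_2.triggers.finishBuildTrigger

version = "2019.2"

project {
    buildType(Project1)
    buildType(Project2)
}

object Project1 : BuildType({
    name = "Project 1"
    artifactRules = "output/**"   // whatever Project 1 publishes
})

object Project2 : BuildType({
    name = "Project 2"
    dependencies {
        // The artifact dependency only says where to get files from;
        // it does not trigger anything by itself.
        artifacts(Project1) {
            artifactRules = "output/** => project1"
        }
    }
    triggers {
        // Only this trigger causes Project 2 to run when Project 1 finishes.
        finishBuildTrigger {
            buildType = "${Project1.id}"
            successfulOnly = true
        }
    }
})
```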
I have been working with TeamCity, Jenkins and Bamboo for the last 8 years. For the last 2 years I have been heavily involved in setting up and maintaining the continuous integration infrastructure for my team, with very good results, which has given me a lot of habits about how to deal with builds, artifacts and pipelines.
Now I'm at a new company, new team, new CI system: TFS 2015, a first for me.
Just one month before I arrived, this team was on TFS 2012 with XAML builds, so I took on the migration to vNext builds.
At first look, vNext builds gave me the classic build definition style, i.e. adding steps as individual tasks to the build instead of one monolithic XAML file.
But over time, as I tried to create more complex builds, like TeamCity build chains, I found this is not possible; strike one...
Then I tried to deal with multiple branches: one continuous build per branch (we are on TFVC), creating packages from each branch. I found I was duplicating my builds just to change repository paths and a few details, so I moved the builds to templates to reuse build definitions, introducing variables to generate paths (for repos and branches) and versions, expecting to change the build in only one place and have the changes reflected in all builds derived from the templates... but that was not the case:
Variables are not accepted everywhere, for example in repository paths.
You can't change templates after creating them, only replace them, and builds created from a template are not affected when the template is changed.
Strike two?
I'm wondering if I'm doing things wrong with TFS; maybe it is simply a different system and I can't do things the way I do in other CI servers.
Any advice on how to approach TFS to get a good, dynamic and reusable set of builds?
There isn't a feature whereby changes to a build definition or template affect other existing build definitions.
If the build steps are the same for each branch, you just need one build definition with a filter for each branch (Triggers > Continuous integration (CI)). After that, the CI build uses the corresponding source; for example, a change on the develop branch automatically triggers a build with the develop branch source.
On the other hand, you can change the branch and source version when queueing a build manually or through the REST API.
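For example, queueing a build of an existing definition against a different TFVC branch through the TFS 2015 Build REST API (api-version 2.0) might look like the sketch below; the server URL, project, definition id, branch and credentials are all placeholders, and it assumes basic authentication (or alternate credentials) is enabled on the server:

```kotlin
// Hypothetical sketch: queue a vNext build for a specific branch via the REST API.
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.util.Base64

fun main() {
    val collection = "http://tfs:8080/tfs/DefaultCollection"   // placeholder collection URL
    val project = "MyProject"                                   // placeholder team project

    // "sourceBranch" points the existing definition at another TFVC branch;
    // a "sourceVersion" (e.g. a changeset such as "C1234") could be added as well.
    val body = "{ \"definition\": { \"id\": 42 }, \"sourceBranch\": \"\$/MyProject/Feature1\" }"

    val auth = Base64.getEncoder().encodeToString("user:password".toByteArray())
    val request = HttpRequest.newBuilder()
        .uri(URI.create("$collection/$project/_apis/build/builds?api-version=2.0"))
        .header("Content-Type", "application/json")
        .header("Authorization", "Basic $auth")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println("Queue request returned HTTP ${response.statusCode()}")
}
```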
If the build steps are different for each branch, you need to maintain the definitions independently to meet the detailed requirements of each.
I'm fairly new to TeamCity (but not to CI systems) and I've been trying to figure out how to work this configuration out:
I have the latest TeamCity Professional, version 9.1.3 (3 build agents, 20 configs), installed
Here's my TeamCity Project layout:
Project A
-- Build Product X (WIN)
-- Build Product Y (WIN)
-- Build Product Z (Linux)
I have dedicated 3 agents to build the above configurations accordingly - 2 on Windows and 1 on Linux
WIN products are built using a mix of batch, PowerShell and MSBuild scripts
The Linux product is built using a shell script
Triggering these 3 builds (under Project A) manually works just fine. However, this is not feasible, as we have many feature branches and all of them would have similar build configurations - 3 clicks for each build plus setting build parameters on each build configuration is expensive.
So, Here are my questions:
Is there a way to trigger the entire project to build with a single click? Doing so should run these builds in parallel.
If 1 is possible, then how do I set the same build number (build params) across these 3 build configurations?
Is it possible to set up a VCS trigger that would poll for changes on any of the repos that build these and trigger the entire project (provided 1 is possible)?
Please note I tried configuring both snapshot and artifact dependencies to make this work, but creating dependencies only makes the other build configurations wait until the dependency is complete, and that's not feasible for us - they need to run in parallel (our builds take approximately 45 minutes to complete - yes, we have a huge product to package).
I'll be grateful for any pointers in the right direction
Thanks
Yes, one-click triggering of parallel runs is achievable: create a 4th (or 0th) build configuration that does nothing (no build steps). Let's call this config "Zero". Your three build configs will each have a "Finished build" trigger on Zero (trigger when Zero finishes) as well as a snapshot dependency on Zero.
The best bit: you only need to define the common/shared parameters in Zero and the other three configs can reuse these. For example, if you define %foo% in Zero, the other three can all use %dep.MyProject_Zero.foo%. This also means you can get at Zero's build number: %dep.MyProject_Zero.build.number%. In each of your three build configs, switch to the "General Settings" and set your "Build number format" to the above.
For VCS triggering, just set Zero to span all three VCS areas. You're suggesting they are each in a different repo... I have no experience with that, but I assume that Zero can span all three repos.
Last, if you use feature branches, and your VCS is Git, Mercurial, or Perforce Streams, make sure you've read about TeamCity's feature branches support; it can save you a lot of time!
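Putting those pieces together, here's a rough sketch in the modern Kotlin DSL (in 9.1.3 you would configure the equivalent through the UI); the IDs and parameter names are made up, and only one of the three product configurations is shown:

```kotlin
// Sketch: an empty "Zero" configuration that the real builds hang off.
import jetbrains.buildServer.configs.kotlin.v2019_2.*
import jetbrains.buildServer.configs.kotlin.v2019_2.triggers.finishBuildTrigger
import jetbrains.buildServer.configs.kotlin.v2019_2.triggers.vcs

version = "2019.2"

project {
    buildType(Zero)
    buildType(BuildProductX)
}

object Zero : BuildType({
    id("MyProject_Zero")              // referenced below as dep.MyProject_Zero.*
    name = "Zero"
    params {
        param("foo", "shared-value")  // common parameter, hypothetical name
    }
    triggers {
        vcs { }                       // the only VCS trigger lives on Zero
    }
})

// One of the three product configurations; the other two look the same,
// so all of them start in parallel whenever Zero finishes.
object BuildProductX : BuildType({
    name = "Build Product X (WIN)"
    // Reuse Zero's counter so every configuration shares one build number.
    buildNumberPattern = "%dep.MyProject_Zero.build.number%"
    dependencies {
        snapshot(Zero) { }            // ties this run to a specific Zero run
    }
    triggers {
        finishBuildTrigger {
            buildType = "${Zero.id}"  // start as soon as Zero finishes
        }
    }
})
```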
We're trying to improve our Jenkins setup. So far we have two directories: /plugins and /tests.
Our project is a multi-module project of Eclipse Plugins. The test plugins in the /tests folder are fragment projects dependent on their corresponding productive code plugins in /plugins.
Until now, we had just one Jenkins job which checked out both /plugins and /tests, built all of them and produced the Surefire results etc.
We're now thinking about splitting the project into smaller jobs corresponding to features we provide. It seems that the way we tried to do it is suboptimal.
We tried the following:
We created a job for the core feature. This job checks out the whole /plugins and /tests directories and only builds the plugins the feature is comprised of. This job has a separate pom.xml which defines the core artifact and tells about the modules contained in the feature.
We created a separate job for the tests that should be run on the feature plugins. This job uses the cloned workspace from the core job. This job is to be run after the core feature is built.
I somehow think this is less than optimal.
For instance, only the core job can update the checked out files. If only the tests are updated, the core feature does not need to be built again, but it will be.
As soon as I have a feature which is dependent on the core feature, this feature would either need to use a clone of the core feature workspace or check out its own copy of /plugins and /tests, which would lead to bloat.
Using a cloned workspace, I can't update my sources. So when I have a feature depending on another feature, I can only do the job if the core feature is updated and built.
I think I'm missing some basic stuff here. Can someone help? There definitely is an easier way for this.
EDIT: I'll try to formulate what I think would ideally happen if everything works:
check if the feature components have changed (i.e. an update on them is possible)
if changed, build the feature
Build the dependent features, if necessary (i.e. check the corresponding job)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
Finally, the project job should
do a nightly build
check out all sources from /plugins and /tests
build all, test all, send results to Sonar
Additionally, it would be neat if the nightly build was unnecessary because the builds and test results of the projects' features would be combined in the project job results.
Is something like this possible?
Starting from the end of the question: I would keep a separate nightly job that does a clean check-out (gets rid of any generated stuff before checking out), builds everything from scratch, and runs all tests. If you aren't doing a clean build, you can't guarantee that what is checked into your repository really builds.
check if the feature components have changed (i.e. an update on them is possible)
if changed, build the feature
Build the dependent features, if necessary (i.e. check the corresponding job)
Build the feature itself
if build successful, start feature test job
let me see the results of the test job in the feature job
[I am assuming that by "dependent features" in 1 you mean the things needed by the "feature" in 2.]
To do this, I would say that you have multiple jobs.
a job for every individual feature and every dependent feature that simply builds that feature. The jobs should be started by SCM changes for the (dependent) feature.
I wouldn't keep test jobs separate from compile jobs. That allows the possibility that successfully compiled code is never tested. Instead, I would rely on the fact that when a build step fails in Jenkins, it normally aborts further build steps.
The trick is going to be in how you thread all of these together.
Let's say we have a feature and its build job, called F1, that is built on 2 dependent features, DF1.1 and DF1.2, each with its own build job.
Both DF1.1 and DF1.2 should be configured to trigger the build of F1.
F1 should be configured to get the artifacts it needs from the latest successful DF1.1 and DF1.2 builds. Unfortunately, the very nice "Clone SCM" plugin is not going to be of much help here as it only pulls from one previous job. Perhaps one of the artifact publisher plugins might be useful, or you may need to add some custom build steps to put/get artifacts.
My situation is the following:
I have 2 big projects running separately on two machines (1 for each project), sitting of course in different locations and running different build.xml files.
I want to be able to run them both with 1 click and have the product in 1 library at the end,
meaning I can build my entire product in 1 click.
What is the best way to do so?
Thanks for the help.
I would set up a build dependency between the two projects. This can be set up so that both projects always build back to back automatically.
When the first project is finished building, the artifacts you are interested in can be pulled over to the second project with the build dependency. Then, in that second project, you can configure the artifacts to be a collection of both projects' results.
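As a rough sketch in the modern Kotlin DSL (names and paths invented), the second project pulls the first project's output through the artifact dependency and then publishes both outputs together; a Finish Build trigger or snapshot dependency between the two then makes them build back to back as described above:

```kotlin
// Sketch: Project 2 pulls in Project 1's output and republishes the combined result.
import jetbrains.buildServer.configs.kotlin.v2019_2.*

version = "2019.2"

project {
    buildType(ProjectOne)
    buildType(ProjectTwo)
}

object ProjectOne : BuildType({
    name = "Project 1"
    artifactRules = "out/**"   // whatever Project 1's build.xml produces
})

object ProjectTwo : BuildType({
    name = "Project 2"
    // Publish Project 2's own output together with what was pulled from Project 1,
    // so one build carries the complete product.
    artifactRules = "out/** => product.zip\nproject1/** => product.zip"
    dependencies {
        artifacts(ProjectOne) {
            artifactRules = "out/** => project1"   // copy Project 1's output into ./project1
        }
    }
})
```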
You can read more about build dependencies here:
http://confluence.jetbrains.net/display/TCD6/Dependent+Build
If anything I said was too confusing, just let me know and I will come back to clarify.
Good luck!
I am tasked with improving quality and implementing TeamCity for continuous integration. My experience with TeamCity is very limited - I mostly use TFS myself and have some experience with CC.NET.
A lot should happen within a build process... actually the build is already split into three different configurations that run one after the other.
My main problem is that in each of those I actually would need to start multiple runners. For example, the first build step shall consist of:
The generation of new AssemblyInfo.cs files for consistent assembly numbering
The actual compilation
A partial unit test run (all tests that run fast and check core functionality)
An FxCop run
A StyleCop run
The current version of TeamCity only allows one runner to be configured... which leaves me stuck on a lot of things.
How would you approach this? My current idea is to use the MSBuild runner for everything and basically start my own MSBuild-based script which then does all the work, pretty much the way TFS handles it (and the same way I did things back in the CC.NET days with my own NAnt build script).
A further problem is how to present statistical information, for example from unit tests running in different stages (build configurations). We have some tests further down the chain that take a while to run, and we want those to run in a 2nd or 3rd step (the last, for example, tests database generation code which, including loading base data, takes 15+ minutes to run). On the other hand, we would really like the test results to be consolidated somehow.
Anyone any ideas?
Thanks.
TeamCity 6.0 allows multiple build steps for a single build configuration. Isn't it what you're looking for?
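For example, a single configuration with several steps might look like this rough sketch in the modern Kotlin DSL (in 6.0 the equivalent steps are added through the web UI; the script names are placeholders):

```kotlin
// Sketch: one build configuration running several build steps in sequence.
import jetbrains.buildServer.configs.kotlin.v2019_2.*
import jetbrains.buildServer.configs.kotlin.v2019_2.buildSteps.script

version = "2019.2"

project {
    buildType(MainBuild)
}

object MainBuild : BuildType({
    name = "Main build"
    steps {
        script {
            name = "Generate AssemblyInfo.cs"
            scriptContent = "powershell -File generate-assemblyinfo.ps1 %build.number%"
        }
        script {
            name = "Compile"
            scriptContent = "msbuild Solution.sln /p:Configuration=Release"
        }
        script {
            name = "Fast unit tests"
            scriptContent = "run-fast-tests.cmd"
        }
        script {
            name = "FxCop and StyleCop"
            scriptContent = "run-analysis.cmd"
        }
    }
})
```

Later TeamCity versions also provide dedicated MSBuild, NUnit and FxCop runners and an AssemblyInfo Patcher build feature, which can replace some of the script steps sketched above.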
You'll need to script this out, at least parts of it. TeamCity provides some nice UI based config for some of your needs, but not all. Here's my suggestion:
Create an msbuild script to handle your first two bullet points, AssemblyInfo generation and compilation. Configure the msbuild runner to run your script, and to run your tests. Collect your assemblies as artifacts.
Create a second build configuration for FxCop. Trigger it from the first build. Give it an 'artifact dependency' on the first build, which is how it gets a hold of your dlls.
For StyleCop, TC doesn't support it out of the box like it does FxCop. Add it to your MSBuild script manually, and have it produce an HTML report (which TeamCity can then display).
You need to take a look at the Dependencies functionality in TeamCity. This feature allows you to create a sequence of build configurations. In other words, you need to create a build configuration for each step and then link them all as dependencies.
For consolidating test results, please take a look at Artifact Dependencies. It might help.