I want to create a one-click release build. I am fine with creating multiple build configurations and linking them together, or with building everything in one build configuration.
The module dependencies are as follows:
Module C depends upon modules A and B. Module D depends upon module C.
A > C > D
B > C
There could be instances where nothing has changed in module A or B. If nothing has changed in a module, I do not want to build and release it, since it was already released in the past. This would have been simple if I needed to build all the modules every time, which is not the case.
Consider a scenario where there is a change in module B. In this case, I only want to build modules B, C and D (but not A).
Is there a way this could be achieved with build steps, parameters, or any other means in TeamCity?
In the Snapshot Dependencies settings, enable the checkboxes:
Do not run new build if there is a suitable one
Only use successful builds from suitable ones
I recommend using Artifact Dependencies with Snapshot Dependencies.
Enable the following settings:
Build from the same chain
Do not run new build if there is a suitable one
Only use successful builds from suitable ones
TeamCity will then skip the parts of the build chain that have no changes. In your example, a change in module B would re-run B, C and D, while the previous successful build of A would be reused.
We have a few full Gradle projects A, B, C, D. These are microservices that are going to start sharing the protobuf-generated Java files. We are thinking of a structure like this:
A
  build.gradle (this is A's full Gradle build)
B
  build.gradle (this is B's full Gradle build)
common
  build.gradle (builds the protobuf code that is used by A and B)
Now, the question is: how do we make sure that when a developer builds A, it also builds common in case it changed on his git pull? The same goes for B. The settings.gradle file didn't seem to have a ../../:project or something like that.
I do remember Gradle came out with a way to build multiple Gradle projects as well.
Ideally, when someone changes common, multiple Jenkins builds would be kicked off as well, verifying that changing core code didn't break any of the services that use it. I am not quite sure how to:
1. document the things that depend on common
2. use the document to kick off builds of all things depending on common
Then if this were to grow, and you have D depending on C depending on common, each build needs to be kicked off, feeding the binary upstream from common to C, and then C's jar and common's jar to D. I know 'pants' is used at Twitter to do this. Google is using Bazel. Perhaps I should look into those instead of Gradle? Or can we intermingle them?
Simply declaring a dependency on common should be sufficient:
// Project A's build.gradle
dependencies {
implementation(project(":common"))
}
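This assumes all three builds live in one multi-project build, wired together by a settings.gradle at the repository root; a minimal sketch (the root project name and project directories here follow the example paths below and may differ in your layout):
// settings.gradle (repository root)
rootProject.name = 'example-multi-project'
include 'common', 'project-a', 'project-b'
With that file in place, project paths such as :common resolve without any ../../ style references.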
In order to build a, the build of common would need to succeed. If the build of common failed for whatever reason, then the build of a would also fail. Example:
$ ./gradlew project-a:build
> Task :common:compileJava FAILED
/Users/cisco/code/example-multi-project/common/src/main/java/common/ExampleCommon.java:6: error: incompatible types: int cannot be converted to String
return 1;
^
1 error
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':common:compileJava'.
> Compilation failed; see the compiler error output for details.
You can see in the above that when I tried to build project a (project-a:build), common's compile task was invoked (:common:compileJava).
Projects a, b, etc. should have thorough tests (unit, integration, smoke, etc.) to make sure that any incompatible changes are detected early and often.
You can read more about multi-project builds in the official guide: https://docs.gradle.org/current/userguide/multi_project_builds.html
I have a build chain with two projects: A is the root project, B depends on it. B has two dependencies configured: an artifact and a snapshot dependency. One build configuration for B has an environment variable (parameter) set. However, I also need this parameter set for the root project A.
Is there any way in TeamCity 9 to pass a build configuration parameter from a project to its dependency (in the same build chain)?
Since TeamCity 9.0 it has been possible to override a dependency's parameters by redefining them in the dependent build:
reverse.dep.<btID>.<property name>
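For example, if A's build configuration ID were ProjectA_Build (an illustrative ID) and the parameter an environment variable, B could define a parameter named reverse.dep.ProjectA_Build.env.MY_ENV_VAR with the desired value; when the chain runs, TeamCity applies that value to A's env.MY_ENV_VAR before A starts.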
For TeamCity 8 and below, unfortunately, the only way parameters can be passed on is in the direction of the build chain - the reverse of what you want. These properties are called Dependencies Properties:
Dependencies Properties
These are properties provided by the builds the current build depends
on (via a snapshot or an artifact dependency).
Dependencies properties have the following format:
dep.<btID>.<property name>
Dependencies properties flow from the root of the tree to its leaves (in the direction of the build chain flow), but not the other way round, so the properties of A can be accessed in B.
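For example, if A's build configuration ID were ProjectA_Build (an illustrative ID), a build step in B could reference one of A's parameters as %dep.ProjectA_Build.env.SOME_VAR% (SOME_VAR being a hypothetical parameter name).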
This is clarified in the docs here:
Parameters in dependent builds
TeamCity provides the ability to use properties provided by the builds
the current build depends on (via a snapshot or artifact dependency).
When build A depends on build B, you can pass properties from build B
to build A, i.e. properties can be passed only in the direction of the
build chain flow and not vice versa. For the details on how to use
parameters of the previous build in chain, refer to the Dependencies
Properties page.
I've had a similar use case for the reverse flow before as well, and the workaround was not pretty. Instead of triggering the build chain directly, we would trigger an independent build (let's call it X) that was only there to hold the build parameters. We then modified the build chain so that the root build (A in your case) depended on the last successful build of X, and had the build chain trigger on a successful build of X. This should accomplish what you want.
For TeamCity 9 see @Alina's answer (which should be the accepted answer).
Very new to Maven here. Can someone please explain the difference between using Maven modules versus just adding a dependency from your Maven project on another Maven project in your workspace? When would you use one over the other?
A dependency is a pre-built entity. You get the artifact for that dependency from Maven Central (or Nexus or the like.) It is common to use dependencies for code that belongs to other teams or projects. For example, suppose you need a CSV library in Android. You'd pull it as a dependency.
A Maven module gets built just like your project does. It is common to use Maven modules for components that the project owns. For example, maybe your project creates three jar files.
A dependency can be thought of as a lib/jar (aka Artifact in Maven parlance) that you need to use for building and/or running your code.
This artifact can either be built by one of the modules of your multi-module project or be a third-party pre-built library (for example, log4j).
One of the concepts of Maven is that each module outputs a single artifact (say, a jar). So in the case of a complex project, it is a good idea to split your project into multiple modules, and these modules can depend on each other via declared dependencies.
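As a sketch (the coordinates are illustrative), one module consumes a sibling module the same way it would consume any external artifact, by declaring it in its pom.xml; Maven resolves it from the reactor when the modules are built together, or from the local/remote repository otherwise:
<dependency>
  <groupId>com.example</groupId>
  <artifactId>common</artifactId>
  <version>${project.version}</version>
</dependency>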
See http://books.sonatype.com/mvnex-book/reference/multimodule-sect-intro.html for example of how a web app is split to parent and child modules and how they are linked.
One of the most confusing aspects of Maven is the fact that the parent pom can act as both a parent and as an aggregator.
99% of the functionality you think about in Maven is the parent pom aspect, where you inherit things like repositories, plugins, and most importantly, dependencies.
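A child module opts into that inheritance by declaring the parent in its own pom; a minimal sketch (coordinates are illustrative, reusing the lasagne example below):
<parent>
  <groupId>com.example</groupId>
  <artifactId>lasagne-parent</artifactId>
  <version>1.0.0</version>
</parent>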
Dependencies are hard, tangible relationships between your libs that are evaluated during each build. If you think of your software as a meal, it's basically saying A requires ingredient B.
So let's say you're preparing lasagne. Then your dependency chain would look something like this:
lasagne
  <- meatSauce
      <- groundBeef
      <- tomatoPaste
  <- cheese
  <- noodles
The key thing is, each of the above items (meatSauce, groundBeef, cheese, etc.) is an individual build with its own set of dependencies.
By contrast, the only section of your pom that pertains to aggregation is the modules section:
<modules>
<module>meatSauce</module>
<module>groundBeef</module>
<module>tomatoPaste</module>
<module>cheese</module>
<module>noodles</module>
</modules>
Aggregation simply tells your build engine that it should run these 5 builds in rapid succession:
groundBeef -> tomatoPaste -> cheese -> noodles -> meatSauce
The main benefit of aggregation is the convenience (just click build once) and ensuring the builds are in the correct order (e.g. you wouldn't want to build meatSauce before tomatoPaste).
Here's the thing though: even if you organize the libs as standalone projects without module aggregation, your build will still come out the same provided you build in the correct order.
Moreover, both Jenkins and Eclipse have mechanisms for triggering builds if a dependent project has changed (e.g. changing groundBeef will automatically trigger meatSauce).
Therefore, if you're building out of Jenkins or Eclipse, there is no need for aggregation.
We have two build configurations, A and B; B depends on A. A committer makes a change in project A that causes a build failure in the downstream project B.
Is there a way in TeamCity to notify the committer of project A that B has failed because of their change?
Jenkins/Hudson supports that using upstream-individuals:A as an email address in this particular situation.
I tried to set it up through snapshot dependencies as Danere pointed out, and it is working. Since the TeamCity way is different from what you probably tried with Jenkins, here is my setup:
I added another project to the chain, named C, which contains two VCS roots (projectA and projectB) and a dummy build step.
Project C is configured to be triggered by any VCS change
Project C has a snapshot dependency on project B
Project B has a snapshot dependency on project A
Neither project A nor project B has any triggers.
Project C could probably be eliminated, but my configuration is more complex: the last step performs system tests, and I didn't want it to monitor all the VCS roots of all the upstream projects.
I'm having a problem reconciling building a project for use within an application server and for use as a stand-alone application.
To give an overall simplified context, say I have three Projects A, B, C.
Project A depends on Project B which depends on Project C.
Project C has a dependency X which is marked as provided, since it was expected that it would be available as a JEE library within, say, an application server (e.g. jms.jar).
So if I perform an assembly build of Project A, I get all the transitive dependencies save for those marked as provided as expected.
Now I have a new deployment scenario where Project A needs to be used in a standalone environment i.e. outside an application server.
So now I need the jms jar to be a compile dependency. Does this mean that I should explicitly add a compile dependency for X in Project A? Doesn't this violate the Law of Demeter (i.e. don't talk to strangers), in the sense that Project A shouldn't explicitly know about Project C but only about Project B?
This is a simple example, but in reality I have multiple dependencies which have been marked as provided but now need to be compile or runtime dependencies, so that they end up in the artifact produced by the Maven assembly plugin.
Is this a fundamental problem with Maven or am I not using the tools correctly?
Thanks in advance for any guidance.
If you need your build to have variations in it for different scenarios, you need to use profiles and keep certain things (such as some of the dependencies) in the various profiles.
http://maven.apache.org/pom.html#Profiles
Different dependencies for different build profiles in maven answers a similar question - just substitute "Project A" and "Project C" for the "release" and "debug" profiles used there.
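A minimal sketch of that approach (the profile id and the JMS coordinates are illustrative, not taken from the question): a standalone profile adds, at compile scope, what the application server would otherwise provide.
<profiles>
  <profile>
    <id>standalone</id>
    <dependencies>
      <dependency>
        <groupId>javax.jms</groupId>
        <artifactId>javax.jms-api</artifactId>
        <version>2.0.1</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
Building with mvn -Pstandalone package would then include the JMS jar in the assembly, while the default build leaves it out, as before.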
Provided dependencies are a difficult subject. First of all: Provided dependencies are not transitive in the following sense: If your project C has a provided dependency on X, then A will not get the dependency. It is silently ignored. This fits with the following meaning of "provided" which I propose:
Only the artifacts that are actually deployed should mark dependencies as "provided". Libraries or other jars that are not individually deployed to a specific server should not have provided dependencies. Instead, they should declare their dependencies as compile dependencies. In your example: Project C should have a compile dependency on X. If project A knows that X is provided, it sets X to provided in "dependencyManagement". As project A should know the environment in which it runs, it should decide what is provided and what is not. And "dependencyManagement" is the right place to declare this.
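A minimal sketch of that dependencyManagement entry in project A (the JMS coordinates are illustrative; C itself would declare the same groupId/artifactId at compile scope):
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>javax.jms</groupId>
      <artifactId>javax.jms-api</artifactId>
      <version>2.0.1</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
The entry does not add the dependency to A; it only adjusts the scope (and version) that the transitive dependency coming in through B and C gets in A's build.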
If your project A should be able to run both inside and outside a given server, you probably need to make a lot of adjustments, even changing the packaging type from ear to jar. So you either use build profiles for this, which then have different dependencyManagement entries, or you split A into two projects which depend on some other project that contains the common elements.
If some given project C already has a provided dependency on X and you cannot change that, this is effectively the same as a missing dependency in C. This has to be repaired at some point, and this could be project A itself.