OK, this may be a silly question (maybe my English, or "just" my comprehension level, is fooling me), but what is the difference between snapshot dependencies and dependency triggers?
I guess the first means that when you build a project, TC makes sure the dependency is up to date; if it isn't, it is rebuilt first and then the original project is built, plus the two builds won't run in parallel. And the latter means that if a new build of the dependency is produced, it triggers a build of the project.
Also, if this is the case, I guess any "recursion" problem is already handled. For example: you force a build of a project with both features enabled, it checks the dependency, sees it needs to be rebuilt, and when it does so the trigger isn't fired on top of that.
Are my assumptions right?
For the first part of your question, you answered it yourself.
Snapshot dependencies force the Build Configuration you depend on to be built first (if it's not up to date) before the current Build Configuration is built.
Dependency triggers make the current Build Configuration build after a successful build of the Build Configuration it depends on.
On the second part of your question, I think you are asking whether Build Configuration A will run twice when it has both a snapshot dependency and a dependency trigger on Build Configuration B and you run Build Configuration A.
I tried this myself with TeamCity 5.1.2 and saw that it only ran once.
I had a NoClassDefFoundError problem with some tests launched from IntelliJ. To repair the situation, I had to make several changes in many POMs of the project: adding new packages and excluding some old ones to escape the overlap between them. I also straightened out the conflicting versions. But the situation did not improve: again, some package declared in a POM was not found where it should have been.
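(For illustration, the kind of exclusion I mean looked roughly like this; the group and artifact names here are made up:)

<dependency>
    <groupId>com.example</groupId>
    <artifactId>some-library</artifactId>
    <version>2.0</version>
    <exclusions>
        <!-- keep the old, overlapping jar out; a newer version is declared directly -->
        <exclusion>
            <groupId>com.example</groupId>
            <artifactId>old-overlapping-lib</artifactId>
        </exclusion>
    </exclusions>
</dependency>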
I refreshed the Maven repository by
mvn -e clean install -U
as advised in https://stackoverflow.com/a/9697970/715269 - an answer so old and so upvoted that it surely looks like Santa.
The problem remained unchanged.
I printed the Maven dependency tree. It was correct and contained everything needed.
I looked at the list of External Libraries of the project. It was the old, uncorrected list of overlapping jars with the same names and different versions, and it lacked the good packages I had just added, which were clearly visible in the Maven tree output!
Already losing hope, I reimported the packages in IntelliJ by:
Ctrl+Shift+A, Reimport All Maven Projects.
Ho! The list of libraries got repaired, and the problem mentioned in the subject disappeared.
The question is: how can it happen that the same project, with the very same POMs for everything, gets its packages differently when launched by Maven and by IntelliJ?
I know about the "delegate IDE build to Maven" feature, and I keep it turned off. But I am NOT talking about different software doing the building. Whether the builders are different or not, they should both follow the actual POMs. And whereas Maven, when it is not doing the building automatically, won't know about changes in the POMs, IntelliJ KNOWS about them. Keeping the jars in line with the POM, or in line with Maven, would both make sense, but instead it simply keeps some old rubbish. Was there some deep thought behind this design?
Every time you manually change the pom.xml file, including the dependencies, you need to load these changes into the IDE. The IDE does this on the Reload from Maven action. See also Import Maven dependencies.
IntelliJ doesn't use Maven to build and run a project unless you are delegating the build and run actions to Maven:
Since IDEA doesn't really use Maven to run and build, it uses the pom.xml to import the project structure and "tries" to build the project the same way Maven does.
Actually, there are quite a few differences between these two build processes.
Generating sources and filtering resources (I don't know if this is still an issue) are not done when building the project with IntelliJ IDEA.
If you are using code generation, you have to build the project via Maven first; then, once all the resources are filtered and the additional sources are generated, you are able to run, debug and so on with IntelliJ IDEA.
That's an important thing to be aware of, and it is the reason why the Maven and IntelliJ IDEA project structures might get out of sync.
You can enable the "Reload project after changes in build scripts" feature and select the Any changes checkbox to keep your project structure updated:
Why you might want to disable this feature anyway
If you are working on a build file (whether it is Gradle or Maven does not matter), reloading the structure on any change can be very annoying. It is CPU intensive, dependencies are fetched, and so on.
Therefore, I prefer to reload the project structure only on an external change. This happens, for example, when pulling an updated version of the build file.
I have a number of builds that create a package which is published to a package manager (for example npm, NuGet or Maven).
I have subsequent builds that trigger on the completion of this build; they get the artifact from this repository. The problem is that they show a warning:
I'm considering adding a snapshot dependency; however, TeamCity's UI describes snapshot dependencies as builds using the same sources:
There is no source dependency between these projects and in fact, they may have completely different VCS roots.
What is the appropriate way to link these projects? Reading the documentation on Snapshot Dependencies, it sounds like things might not work as expected if I add a dependency without shared sources.
I don't think there is any requirement to link these projects, other than with the trigger you already have.
If the triggered build always gets the latest version from the package manager, then you'll get the behaviour that you want.
A snapshot dependency simply ensures that a build which depends on another build gets the same version of the source code when it builds, and doesn't end up being built from changes that someone checked in between the first build starting and the second build starting. This doesn't look like it's going to be an issue in your situation (and indeed the builds may use completely different repositories), so I think your finish build trigger is an appropriate solution.
I want to write a mojo whose responsibility is to validate the POM against a set of rules (in addition to those specified by Maven). As an example, it will make sure there is an organization element and that it is populated with the company's details.
This plugin would have the validate phase as its default phase, and it would go in the company's parent POM, so that it is executed on every build.
The issue is that different sets of rules should be applied for snapshot builds and for release builds.
Now, the Maven release plugin executes clean verify after it has done its manipulations (i.e., as part of release:prepare, after the POMs have been rewritten). And so my plugin will be executed at that point. Cool.
The thing is, since it is then in a release context, I need to apply the release set of rules. But I cannot tell that it is in a release context; I have no idea who/what triggered it: a "regular" build, or a "sub-process" of release:prepare. There is no indication whatsoever...
One solution could be to populate preparationGoals with, for example, clean verify -DisRelease=true, where isRelease is a parameter the plugin queries in order to apply the correct set of rules.
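Concretely, I mean something like this in the parent POM (isRelease is just a name I made up for the example):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <configuration>
        <!-- the extra system property would tell my validation mojo that it runs in a release context -->
        <preparationGoals>clean verify -DisRelease=true</preparationGoals>
    </configuration>
</plugin>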
But I do not like this idea; it feels like a hack, and too fragile...
So the question is:
Is there a way to tell what the context of the plugin's execution is, so that I can tell a snapshot build from a release build? Is there a way to bind a configuration to a goal rather than to a phase (since with a release it is not a phase, but the release:prepare goal)? Is there any other way I can achieve this?
Thanks in advance,
Ohad
------------------ EDIT --------------------
Due to some misunderstandings, here is a possible example:
Let's say the organization's idea of a version is the format {number}.{number}.P-{number}, for whatever reason.
So, in a "regular" build there will be a check that the version has the format {number}.{number}.P-{number}-SNAPSHOT, and in case of a release, a check that it has the format {number}.{number}.P-{number}.
The POM starts out with the -SNAPSHOT version while in CI, and it passes validation. Then the developer wants to release it, so release:prepare is executed, chopping off the -SNAPSHOT. Then, since the default preparationGoals is clean verify, the validation rules are executed against the manipulated POM (where the version is without the -SNAPSHOT). The build will fail, as the plugin (wrongly) expects the version to end with -SNAPSHOT.
So how can I signal to it that the "release" validations should be executed?
Thanks again
I was trying to structure my Maven POMs in something like the following hierarchical form:
root
+-- A-POM
    +-- B-POM
    +-- C-POM
        +-- D-POM
I was hoping that this could take care of my changed-module problem. That is, if C is changed, then A must be rebuilt, etc.
But I ran into the issue that the packaging at root is "pom", and below that I can't have A with packaging "war" and still drill in and have A include B and C as its modules. It seems to me that any POM which does not have "pom" as its packaging can't have child modules. Is my understanding correct? Is there a way to do what I wanted to do?
In addition, I don't seem to be able to achieve this "changed" chain in Maven (most likely due to my lack of knowledge). I would like Maven to detect that a dependent project has changed and rebuild all the affected projects.
Thanks so much!
The reactor project (the root of the multi-module project) must have pom packaging. So your nested structure is invalid, since A is not of type pom, and I'm pretty sure you won't get it to work this way.
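To make that concrete, a POM that aggregates modules has to look roughly like this minimal sketch (the coordinates are placeholders and the module list is only illustrative); a war module such as A cannot take the place of such a file:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>root</artifactId>
    <version>1.0-SNAPSHOT</version>
    <!-- aggregator projects require pom packaging; only then is a <modules> section allowed -->
    <packaging>pom</packaging>
    <modules>
        <module>A-POM</module>
        <module>B-POM</module>
        <module>C-POM</module>
    </modules>
</project>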
The second point is that Maven is a modularized build system and uses repository mechanisms to locate pre-built artifacts, instead of checking out all modules from version control and building them in a monolithic way like in the old days ;) This means that Maven cannot know what to rebuild when you change something in one module, since it simply does not have all the other modules available at that time.
I think this is more a CI task than something that should be handled by the build system itself. You can achieve such behavior with an appropriate build/CI server like Jenkins, which supports upstream and downstream projects. This means it is able to detect dependencies between projects and trigger other builds as soon as a dependency has been built. This comes close to the behavior you are trying to achieve.
Btw. rebuilding other projects is only required for SNAPSHOT dependencies. Jenkins with the Maven plugin supports this behavior, but depending on the number of SNAPSHOT dependencies in your project, this can cause long chains of project builds on the server. Some folks are of the opinion that SNAPSHOT versions in general are hell for CI, since these artifacts can change over time and are not reproducible. You could think about omitting SNAPSHOT versions completely and building final versions each time. This would also remove your requirement to rebuild other modules as soon as a module changes: there are simply no changes until you upgrade the dependency versions.
I know how to avoid deploying a module during the release process. What is not clear is how to avoid building the module in the first place. For example, my project contains an "examples" module. This takes a very long time to build and is never deployed to Maven Central, so I'd like to avoid building it altogether (but only during a deploy). Any ideas?
If you're using the Maven Release Plugin, you will need to take care of both the prepare and the perform steps, since the project is usually built in both steps.
For the prepare step, I would try the preparationGoals parameter, which defaults to clean verify and therefore includes a build of the project. Maybe you can try setting it to just clean to avoid the build.
http://maven.apache.org/plugins/maven-release-plugin/prepare-mojo.html
For the perform step, take a look at the goals parameter, which defaults to deploy. You can override this by specifying just clean.
http://maven.apache.org/plugins/maven-release-plugin/perform-mojo.html
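Put together, the two overrides would be a release plugin configuration roughly like this (an untested sketch of what is described above):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <configuration>
        <!-- prepare step: the default is "clean verify"; "clean" alone avoids the build -->
        <preparationGoals>clean</preparationGoals>
        <!-- perform step: the default is "deploy"; "clean" alone skips the build and the deployment -->
        <goals>clean</goals>
    </configuration>
</plugin>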
I haven't tried this exact combination, so I don't know whether this will have any side-effects on the rest of the build.
The best thing in this case is to put the examples into a separate module and control it with a profile. That prevents it from being built during the release, but it can still be built during usual development etc. by activating the profile; see the sketch below.
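A rough sketch of that setup in the aggregator POM (the module and profile names here are made up):

<modules>
    <!-- modules that are always built -->
    <module>core</module>
</modules>

<profiles>
    <profile>
        <id>examples</id>
        <modules>
            <!-- the examples module only joins the reactor when this profile is active -->
            <module>examples</module>
        </modules>
    </profile>
</profiles>

During development you build with mvn install -Pexamples; the release runs without the profile, so the examples module is not built or deployed there.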