We've configured Bamboo to use automatic dependencies based on Maven POM files. This appears to be working, insofar as the Bamboo interface correctly shows the Maven artifacts required by and created by each plan, which upstream plans provide the dependencies, and which downstream plans depend on the artifacts.
However, it also appears not to work, insofar as building an upstream artifact doesn't trigger a downstream build. Is it necessary to manually create child plans to trigger downstream builds, an error-prone duplication of the information in the POM files? If so, the automatic dependency management isn't much use, and it's hard to understand what the feature is for.
You can try adding a step in your parent build, using a script task to trigger whatever child build you'd like.
This way, you're not bound by the Bamboo dependency/trigger mechanism.
https://answers.atlassian.com/questions/65517/trigger-bamboo-plan-via-rest-call
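For example, a script task along these lines (a sketch; the server URL, credentials and the PROJ-PLAN key are placeholders) queues a run of the downstream plan:

# queue a build of the downstream plan via Bamboo's REST API
curl -X POST -u "bamboo_user:bamboo_password" \
  "https://bamboo.example.com/rest/api/latest/queue/PROJ-PLAN"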
Hope this helps.
I have 6 microservices in my project and I have separated them into 6 projects in GitLab. When I build these microservices all together, or build the parent POM first and then the child POMs separately outside GitLab, it works; but with gitlab-ci I am not able to build them, as they fail with a non-resolvable parent POM. Can someone please let me know how I can build these microservices independently (building the parent POM and keeping the artifact available for all other projects)?
I tried caching and artifacts in GitLab, but they are strictly bound to a single project.
If you always want to build those six microservices together, put them into one multi-module project. Then you have one project on GitLab and everything will be much easier.
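For example, a minimal aggregator POM (all names are placeholders) at the root of that single project:

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>microservices-parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- one module per microservice -->
    <module>service-a</module>
    <module>service-b</module>
    <!-- ... -->
  </modules>
</project>

Then a single mvn install at the root builds the parent and all children in the correct order.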
If you need to separate, then you need a Maven package manager. You can use the one that is included in GitLab, or you can use an external one like Artifactory.
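With GitLab's built-in Maven repository, the rough idea is to deploy the parent POM from its own project and let the child projects resolve it from the registry. A sketch, with the host and PROJECT_ID as placeholders (see the GitLab docs for the exact endpoint and token setup):

<!-- parent project's pom.xml: where mvn deploy publishes -->
<distributionManagement>
  <repository>
    <id>gitlab-maven</id>
    <url>https://gitlab.example.com/api/v4/projects/PROJECT_ID/packages/maven</url>
  </repository>
</distributionManagement>

<!-- each child project then declares the same URL as a <repository> (with
     matching credentials in settings.xml) so the parent POM resolves in CI -->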
I have a Maven multi-module project. Something like this:
- ParentProject
  - ChildA
  - ChildB
  - ChildC
The child projects inherit from a Parent POM (ParentProject) solely to share things like <build>, <scm> and <properties>, so as not to repeat them in all the child modules. Thus, the objective of the parent-child relationship is not related to dependencies in any way. It plays a role at build-time, not at runtime, so to speak.
The child projects' artifacts are intended for consumption by a wider audience, hence they'll be published into a centralized repo.
How do I "break" the relationship between from the child up to the parent seen from a perspective of a consumer of a child?
Let's say another project, ProjectX, adds a dependency on ChildA. When doing this, the Maven client will not only download the POM and artifact of ChildA itself but will also try to download the POM for ParentProject. However, there's absolutely no need for that POM from a consumer's point of view. It doesn't contain information that the consumer needs to know.
How can I break this relationship from consumer's perspective? Forcing the POM for ParentProject to be published into a repo seems pointless as nobody has any need for it there.
Perhaps there's another way that Maven will let me share things like build instructions and properties between projects without mandating that a Parent POM exists in a centralized repo?
Or perhaps there's some way I can manipulate the POM for the Child projects which gets put into the centralized repo (removing the <parent> element as it is irrelevant).
Perhaps it's only me, but I feel that Maven is conflating two unrelated concepts here (build-time vs consume-time) and forcing unnecessary roundtrips and unnecessary artifacts in the repo. I haven't dabbled with Gradle yet, but I wonder if it does it any better?
Usually, the Maven POM is both build POM and consumer POM. This is not ideal, and will probably change in future versions of Maven.
At the moment, your best option seems to be the flatten Maven plugin, which allows you to remove "unnecessary" parts of the POM before uploading it.
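A minimal sketch of that setup (the plugin version is an assumption; check for the latest). In its default mode, the flattened POM that gets installed and deployed no longer contains the <parent> element:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>flatten-maven-plugin</artifactId>
  <version>1.5.0</version>
  <executions>
    <!-- replace the POM to be installed/deployed with a flattened one -->
    <execution>
      <id>flatten</id>
      <phase>process-resources</phase>
      <goals>
        <goal>flatten</goal>
      </goals>
    </execution>
    <!-- delete the generated .flattened-pom.xml on mvn clean -->
    <execution>
      <id>flatten.clean</id>
      <phase>clean</phase>
      <goals>
        <goal>clean</goal>
      </goals>
    </execution>
  </executions>
</plugin>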
Is there a way in Gradle to explicitly define where certain artifacts should be coming from?
We have a legacy project which is being on-boarded to use a proper artifact repository manager instead of a network share. However, we have multiple repositories from which artifacts are being downloaded. We'd like fine-grained control over where certain artifacts come from, until we can fully on-board to the artifact repository manager in question.
Is something like this possible?
Yes, that is possible as of Gradle 5.1:
https://docs.gradle.org/5.1/release-notes.html#repository-to-dependency-matching
Repository to dependency matching
It is now possible to match repositories to dependencies, so that Gradle doesn't search for a dependency in a repository if it's never going to be found there.
See the docs for more details: https://docs.gradle.org/5.6.2/userguide/declaring_repositories.html#sec::matching_repositories_to_dependencies
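In practice it looks something like this (a sketch; the URL and group names are placeholders):

// build.gradle
repositories {
    // the legacy repository is only consulted for the legacy artifacts
    maven {
        url 'https://repo.example.com/legacy-releases'
        content {
            includeGroup 'com.example.legacy'
        }
    }
    // everything else is resolved from Maven Central
    mavenCentral()
}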
We have a basic Maven parent POM for all our projects, which is tested with integration tests. However, a big part of the customization is for the Maven release plug-in:
<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <tagBase>https://my-url</tagBase>
    <preparationGoals>clean verify org.acme:my-plugin:my-goal</preparationGoals>
    <completionGoals>org.acme:my-other-plugin:other-goal</completionGoals>
    <resume>false</resume>
  </configuration>
</plugin>
I tried testing it via "release:prepare" and got "Can't release project due to non released dependencies" for the parent POM, an error which can't even be suppressed via -DallowTimestampedSnapshots=true.
I could test via "release:prepare -DdryRun=true", but that doesn't even test the preparation goals. So the only other way I could think of was to release the POM and then try to release an arbitrary project. So now I'm at version 1.0.14 and have reverted about 50 times, and I don't think that's the right way anymore.
Is there any way to mock a Maven release? Maybe tell it to tag to a local path and have it commit changes there? It shouldn't deploy to our Nexus either, but I'm at the point where I'm not picky anymore.
I also had a need to do this, and like you I was not interested in actually doing SVN commits or deploys to a remote repo - in my mind that verification was part of other integration tests. I figured that the maven-release-plugin developers would also have a similar need, and indeed they did. They wrote mock SCM and wagon providers.
You can see the mocks used in the release plugin POM profile with id run-its. Note the config uses setupIncludes to be sure the mocks are built and installed in the local repo prior to running any actual tests.
The projects themselves need to use the mocks. Look at one of the integration tests to see how to define the scm element and add the dependency on the Wagon mock.
I used a log verification technique to verify that the appropriate executions were run during the tests.
Note: There are 3 mocks in the setup directory I linked. I found I only needed to use two of them, the ones with suffix "-dummy."
Modularize your process with profiles. Have a profile that triggers your 'prepare' actions, and a profile that triggers your 'perform' actions, and test those instead of or before running the release plugin. Configure the release plugin to do these things by activating the profile.
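As a sketch (plugin and profile names are placeholders), the prepare actions could live in a profile that you can test standalone with mvn clean verify -Prelease-prep, and the release plugin then activates the same profile in the builds it forks:

<profiles>
  <profile>
    <id>release-prep</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.acme</groupId>
          <artifactId>my-plugin</artifactId>
          <executions>
            <execution>
              <phase>verify</phase>
              <goals>
                <goal>my-goal</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

<plugin>
  <artifactId>maven-release-plugin</artifactId>
  <configuration>
    <!-- pass the profile on to the builds forked by release:prepare/perform -->
    <arguments>-Prelease-prep</arguments>
  </configuration>
</plugin>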
I see several options for where to declare the repository:
- directly in pom.xml
- in a company super-POM
- in settings.xml (global or user)
- in a profile or directly (in settings.xml or pom.xml)
We want our Jenkins to push artifacts to an internal repository, and developers to pull missing artifacts from there.
If I put the repository URL in pom.xml, and later the internal repository is moved to a different address, the released versions will all have a broken link.
A super-POM saves some repetition, but in a clean setup you need to somehow know where the repository is in order to find the parent POM that tells you where the repository is.
Having the URL in settings allows one to change it without modifying the artifacts, but there are two problems:
- the build will fail due to unresolved dependencies if the Maven settings have no reference to the internal repo
- developers have to update their settings.xml files manually
I'm also unsure about the merits of putting repository configuration in profiles. I know it lets you easily switch repositories on and off, but shouldn't the -o option and snapshot resolution settings be enough for most uses?
What about using a different repository (e.g. with instrumented classes) for integration tests?
Configure a single repository in the user's ${HOME}/.m2/settings.xml and configure the other needed repositories in an appropriate repository manager, be it Nexus, Artifactory, or Archiva. In Jenkins, there is the Config File Provider plugin, which handles exactly such situations in a very convenient way.
If you want to have repeatable builds and good control over your organization internally, use a repository manager and use a mirrorOf entry in everyone’s settings.xml to point at that url.
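A minimal sketch of such a settings.xml (the id and URL are placeholders):

<!-- ~/.m2/settings.xml -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-repo</id>
      <!-- route all repository requests through the repository manager -->
      <mirrorOf>*</mirrorOf>
      <url>https://repo.example.com/maven-public</url>
    </mirror>
  </mirrors>
</settings>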
If you are exposing your source and want to make it easy for others to build, then consider adding a repository entry to your POM, but don't pick a URL lightly, think long-term, and use a URL that will always be under your control.
http://blog.sonatype.com/2009/02/why-putting-repositories-in-your-poms-is-a-bad-idea/