Well, this is kind of embarrassing. I am in the process of mavenizing our build processes and just don't know how to access the result of a build. I build, let's say, a jar file and mvn deploy it. So it ends up as some blah-0.1.2.jar in our company Maven repository, which is just a WebDAV share. Now how would you pass that on to someone else to use? Just prying it from target/blah-0.1.2.jar can't be the answer. I found several suggestions to use variants of mvn dependency:get, but they were all just close and didn't feel right. There must be a way to use all those nice versions of blah-*.jar that end up in the repository for purposes other than as a Maven dependency. Preferably from the command line, and maybe even without Maven. Hm, a WebDAV client doesn't look too bad, except for snapshots. What would you suggest?
Creating a script that makes a dependency:get call is probably going to be closest to your desired outcome. You can specify the destination of your downloaded jar this way.
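As a rough sketch (the group/artifact coordinates and the repository URL are placeholders for your own), you can first resolve the artifact into the local repository and then copy it wherever you need it:

# resolve blah-0.1.2.jar from the company repository into the local repo
mvn dependency:get \
    -DremoteRepositories=https://repo.example.com/releases \
    -Dartifact=com.example:blah:0.1.2
# copy it out of the local repo to the current directory
mvn dependency:copy \
    -Dartifact=com.example:blah:0.1.2 \
    -DoutputDirectory=.

This also works for SNAPSHOT versions, since dependency:get resolves the latest snapshot the same way a build would.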
If you are looking for an easy way to share builds between people in/outside of your company, you can look into setting up automated build software like Bamboo or something similar. A new build gets triggered any time a commit is made to the location in your version control system where your project resides. An artifact is then produced for each successful build and is available via Bamboo's web interface. Bamboo can be configured to run with your Maven POMs.
While they can be a bit of a pain to set up, going the automated build route will take a lot of the sting out of sharing your builds in the future.
Regardless of build tool, I have seen people run a clean task/phase every time before they package/compile or the like. Is it really necessary?
Do build tools reuse artifacts of previous builds?
Most of the time you see clean install as the default command, but I would encourage everybody to use verify instead.
When executing clean, the target folder is removed, which makes incremental builds impossible. Plugins have enough information to detect whether they need to do their action. For instance, the maven-compiler-plugin compares the Java source files with the compiled class files (among other things) to see which files need to be (re)compiled. If you think that a plugin is not working correctly with incremental builds, please file an issue for that plugin.
The install was often required with multi-module projects in Maven 2, but Maven 3 is capable of resolving these inter-module dependency references itself. The only thing install does is copy artifacts to the local repository (= I/O = expensive). And it will make your local repo look different from your coworkers', which might give different results during builds. Better to let a build server push those artifacts to the shared remote repositories and let everyone pull those SNAPSHOTs from there. Only in rare cases is calling install valid (experienced Maven users know when :) ), so instead please use verify.
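To make the difference concrete, a minimal comparison (nothing project-specific assumed here):

# incremental: keeps target/, runs all checks up to and including integration tests
mvn verify
# full rebuild: wipes target/, then also copies every artifact into the local repository
mvn clean install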
I have a Jenkins job that builds a simple Maven project. If all I do is build, it works just fine. The problem arises when I try and do a release, dry run or regular. It consistently fails with the Cannot prepare the release because you have local modifications error. I have wiped out the workspace, but the problem persists. Is there any way I can get Maven to tell me which file it thinks has been modified? I would assume that by wiping out the local workspace and immediately running the dry run release that there wouldn't be any opportunity for anything to get modified.
Please note, I do not have access to the Jenkins server or the slave that is running the actual release build, so I can't use any tools there (like SVN) to determine what is supposedly modified.
You can use the Maven SCM plugin to do a diff.
https://maven.apache.org/scm/maven-scm-plugin/diff-mojo.html
Basically, integrate the Maven plugin upstream of the failure and see if anything has been changed. I imagine you might be able to see the output in the log, but if you cannot, you might be able to move your "real" Maven pom.xml aside and replace it with one that generates a diff file and, with the help of the maven build helper plugin, attaches that file as an additional artifact (to a pom target).
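An untested sketch of the idea: as long as the pom (real or stand-in) carries an SCM connection, invoking the diff mojo is a one-liner. The SVN URL below is a placeholder:

<!-- in pom.xml; the repository URL is hypothetical -->
<scm>
  <connection>scm:svn:https://svn.example.com/repo/trunk/myproject</connection>
</scm>

# diff the working copy against the configured SCM URL
mvn scm:diff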
It turned out the solution to my problem was to not use the "Local to the workspace" strategy for my private Maven repository in the Jenkins job configuration. By changing that to the "Local to the executor" strategy the problem went away. I'm still not sure why it was having the problem in the workspace, but this solution resolved it for me, and might work for others.
We currently use SilkCentral Test Manager (SCTM) integrated with our source control system via SCTM source control profiles. However, we would like to explore integrating with build artifacts checked into Maven's remote Nexus repository instead.
The idea being that the application-under-test is built and checked into Nexus along with the automated tests only if the build and the tests pass. Therefore, when QA is ready to run tests from SCTM (manual or automated), there is a well-defined combination of application build artifacts and test build artifacts in Nexus that present a more reliable target for SCTM as compared to getting the latest code from the source control system.
All of this is most relevant during active development, when the code and the tests are changing daily and the builds are snapshot builds rather than formal builds with tags in the source control system that SCTM could use.
SCTM apparently has support for both universal naming convention (UNC) paths and the Apache virtual file system (VFS), and it should be possible to use either of these to point the SCTM source control profiles at Nexus artifacts rather than raw source code. However, I wanted to check with the community to see if there's a simpler approach. (For example, I noted the existence of a Hudson SCTM plugin.) Also, I welcome alternative thoughts and ideas.
There are probably many solutions for solving this, I'd try the following:
Manage the build/first test/publishing steps in Hudson/Jenkins.
For example, by modelling it with dependent jobs, the publish job is only triggered if the tests pass. There are also more advanced gatekeeper plugins available (for example the Downstream-Ext plugin) which might solve this even more comfortably.
Once the publishing is done, use the Hudson/Jenkins Silk Central plugin to trigger the executions on Silk Central. There, instead of using UNC or VFS, I'd rather use a setup script which pulls the artifacts from the repository and prepares everything for the tests. This would allow you to use something Maven/Nexus-aware to pull the correct artifacts from the repository, instead of somehow trying to make them accessible via UNC or VFS.
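A rough sketch of such a setup script (all coordinates are placeholders, and it assumes the execution server's settings.xml points at your Nexus):

# fetch the application-under-test and the matching test artifact from Nexus
mvn dependency:copy \
    -Dartifact=com.example:app-under-test:1.3-SNAPSHOT \
    -DoutputDirectory=./aut
mvn dependency:copy \
    -Dartifact=com.example:app-tests:1.3-SNAPSHOT \
    -DoutputDirectory=./tests
# ...then unpack/arrange as needed before Silk Central starts the run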
I have recently been charged with building out our "software infrastructure" and so I am putting together a continuous integration server.
After a build completes, would it be considered bad form for the CI system to check some of the artifacts it creates into a tag so that they can be fetched easily later (or, if the build breaks, so you can more easily recreate the problem)?
For the record we use SVN and BuildMaster (free edition) here.
This is more of a best practices question rather than a how-to question. (It is pretty easy to do with BuildMaster)
Seth
If you believe this approach would be beneficial to you, go ahead and do it. As long as you maintain a clear trace of what source code was used to build each artifact, you'll be fine.
You should keep this artifact repository separated from the source code repository.
It is however a little odd to use a source code repository for this - these are typically used for things that will change, something your artifacts most definitely should not.
Source code repositories are also often used in a context where you want to check out "everything", for example the entire trunk. With artifacts you are typically looking for a specific version, and checking out all of them would only be done if exporting them to some other medium.
There are several artifact repositories specialized for this, for example Artifactory or Apache Archiva, but a properly backed-up file server with thought-through access settings might be a simple and good-enough solution.
I would say it's a smell to check in binaries as a tag. Your build artifacts should be associated with a particular build version in your build system, and that build should be associated with a particular checkin. You should be able to recreate the exact source code from that information. If what you're looking for is a one-stop-function to open the precise source-code revision that generated the broken build, I'd suggest that you invest some time into building a Powershell module that will do that for you.
Something with a signature like:
OpenBuild -projectName "some project name" -buildNumber "some build number"
I want to deploy a generated Maven AppAssembler assembler/ directory to somewhere in a file system, SSH, or whatnot. Can Cargo do that for me, or is there an equivalent deployment tool that will let me glob a bunch of files (in this case the target/appassembler/ directory) and deploy them to a destination?
I have a couple of command-line applications that run as scheduled tasks (via cron or Windows Scheduler), and I want to deploy them out to these remote locations (in one case via SSH, and in another a network share \\servername\C$\whatever\). I don't know how I can accomplish that, since all of the deployment plugins I have been looking at cater to web applications and app containers, or to remote repositories like Nexus.
Try the maven copy plugin; it has excellent networking support (SCP, FTP, HTTP).
You might also find the maven sshexec plugin useful.
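If you only need the SSH case and don't mind stepping outside Maven, even a plain scp of the assembled directory does the job (host and paths below are placeholders):

# copy the whole AppAssembler output to the remote machine
scp -r target/appassembler/ user@servername:/opt/scheduled-tasks/myapp/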
I know this question is quite old, but since someone else might also be interested in this:
I don't have a complete/concrete example for this, since I have never tried it, but maybe the maven assembly plugin could be used for this, with the dir assembly format?
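For what it's worth, a minimal, untested descriptor along those lines might look like this (the descriptor path and ids are made up):

<!-- src/main/assembly/appassembler-dir.xml -->
<assembly>
  <id>dist</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <!-- pick up everything AppAssembler generated -->
      <directory>${project.build.directory}/appassembler</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

wired into the build with the plugin's descriptors configuration:

<!-- in pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <descriptor>src/main/assembly/appassembler-dir.xml</descriptor>
    </descriptors>
  </configuration>
</plugin>

That would reproduce the appassembler/ tree under target/, from where one of the copy/sshexec approaches above could push it to the remote machines.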