I have a Bamboo plan that builds with Maven on Windows. The default path to the build directory under the bamboo user is long, and some files end up over the 255-character Windows path limit. I wanted to solve the problem (for this plan only) by changing the location where Maven runs to a short directory, C:\build. I can check out the files, then run a script step to copy them from the build directory to C:\build. The Maven Bamboo task is configured to override the project file, using C:\build\pom.xml instead. That all works fine. However, when it gets to the 'check in the updated pom' part of release:prepare, it somehow decides that the original build directory with the long path is the right one, and dies with an error.
Anybody know how to specify that the updated pom is also supposed to come from C:\build? I tried overriding the 'Working Sub Directory' entry, but that won't let me specify a full path, so C:\build is out.
Did you try to override the localRepoDirectory parameter?
The command-line local repository directory in use for this build (if specified).
Default value is: ${maven.repo.local}.
You may set this parameter using a property in the POM:
<properties>
  <propertyName>C:\build</propertyName>
</properties>
...
<localRepoDirectory>${propertyName}</localRepoDirectory>
It can be overridden in the Bamboo Maven command:
mvn -DpropertyName="D:\build" clean package
(Bamboo variables can also be used to set the propertyName)
You may define a single property with the desired path and use it in several places in the pom.xml.
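For context, since the question is about release:prepare, this would presumably be the maven-release-plugin's localRepoDirectory parameter; a rough sketch of how the property could be wired in (the property name shortRepoDir and the path are just illustrations):
<properties>
  <shortRepoDir>C:\build\repo</shortRepoDir>
</properties>
...
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-release-plugin</artifactId>
      <configuration>
        <!-- assumption: point the release plugin's local repository at the short path -->
        <localRepoDirectory>${shortRepoDir}</localRepoDirectory>
      </configuration>
    </plugin>
  </plugins>
</build>
A command-line -DshortRepoDir=... would then override the POM value per build, as described above.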
Turns out there were several different things going on with the Maven 3.x task:
By setting the Override Project File to C:\build\pom.xml, I was able to get the task to try to build in C:\build.
Part of my copying of files from the normal root directory to C:\build was wrong. I'd used xcopy but forgot to add a /H to copy the .svn data as well, so the 'check-in updated pom' step failed because it couldn't find the .svn files.
Once release:prepare and release:perform were finished, a Bamboo 'Artifact Copy' step defined earlier was supposed to copy several generated artifacts back to the Maven repository. It turns out that while this step is somewhat configurable about which files to copy and where they are to be found, it does not support an absolute path as the directory to copy from, unlike the Override Project File setting of the Maven tasks. So I had to introduce yet another step, a script to copy the generated artifacts back from C:\build... to under the build root.
All in all, I wasn't able to mess with the build root as I wanted to, but by using the Override Project File and two scripts to copy the source files to C:\build and the artifacts back from C:\build, I got done what I needed to do.
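For reference, a rough sketch of what those two copy script steps might look like (the Bamboo variable, paths, and artifact pattern are assumptions; /E takes subdirectories and /H picks up the hidden .svn data that tripped up the check-in):
rem Script step 1: copy the checked-out sources (including .svn) to the short path
xcopy "%bamboo_build_working_directory%" C:\build /E /H /K /Y /I
rem Script step 2 (after release:prepare / release:perform): copy artifacts back under the build root
xcopy C:\build\target\*.jar "%bamboo_build_working_directory%\target" /Y /I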
I'm very new to Maven, and I'm confused about the purpose and correct usage of the basedir and workingDirectory parameters of the Maven SCM Plugin. Let's say I have the following file/folder structure (in Windows):
C:\fruits\
├───.git\
├───apples\
│   └───pom.xml
└───oranges\
    └───pom.xml
Before executing Maven (by invoking mvn.cmd from the command line), I change the current directory in shell to "C:\fruits\apples". Thus, in Maven's terms, "C:\fruits\apples" becomes the working directory with the appropriate pom.xml file to use for the apples project. On the other hand, in Git's terms, the working directory is "C:\fruits", because that's where the whole monorepo branch is checked out.
At some point during the build process, I use the Maven SCM Plugin to push some modifications. Currently, when specifying the scm:checkin goal, I provide --define workingDirectory="C:\fruits", but I'm not sure if this is the correct path here. Also, the plugin seems to work even though I currently don't define the basedir parameter at all, which is surprising, because the documentation lists basedir as the only required parameter. Unfortunately, the documentation doesn't explain the purpose of the basedir parameter (they just say it's "the base directory", duh), and doesn't explain which "working directory" (i.e., whether in Git's sense or in Maven's sense) the workingDirectory parameter is supposed to point at.
Could you please explain exactly which paths the basedir and workingDirectory parameters of the SCM plugin should point at (while Maven is working on my apples project), and why?
The workingDirectory parameter of the SCM plugin is in the context of Git, i.e., the location of the .git folder.
basedir is the directory containing the pom.xml. Maven has many default properties, and basedir is one of them.
To find out more about how it is implemented, you can also check the plugin's code from Maven Central or in your local Maven repository.
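Applied to the layout above, a minimal sketch might look like this (the commit message is just a placeholder; basedir is left at its default, which resolves to the directory of the POM being built, C:\fruits\apples here):
cd C:\fruits\apples
mvn scm:checkin --define message="update poms" --define workingDirectory=C:\fruits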
I want to store some additional files in the JAR that gets created. Those files are in a directory that is a subdirectory of a repository which is pulled in via a git submodule.
I want to copy that submodule to my src resources directory before compiling, but I also want to make sure that any old files at that location are removed first.
How can that be achieved best with Maven plugins? I did not find any option to remove any destination files with the copy-resources goal of the maven-resources-plugin and I could not get the maven-clean-plugin to run right before the copy-resources either. So how does one accomplish such a trivial task with Maven?
UPDATE: as mentioned above, the reason why I want to do this is because what is copied should become part of what gets added to the resulting jar (and could potentially be part of what gets compiled). So I need to copy these files into the src directory and NOT the target directory. What should get copied before each build is the input to the build, not an additional output.
There is one flaw in your approach, and it probably explains most of the obstructions you encountered.
During a build, the only directory in which you may write is target. Copying files to src or changing them is strictly discouraged.
The target folder is erased by clean, so no need to tidy up yourself or to manage old files.
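Applying that here, the copy-resources goal you mentioned can just as well target a folder under target/; a minimal sketch, assuming the submodule is checked out at tools/extra-data and its files should end up inside the jar:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-submodule-files</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- writing into target/classes keeps src/ untouched; clean removes any stale copies -->
        <outputDirectory>${project.build.outputDirectory}/extra-data</outputDirectory>
        <resources>
          <resource>
            <directory>tools/extra-data</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
Since everything lands under target/classes, it is packaged into the jar by the default jar plugin, and mvn clean is what removes old files.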
I have a situation where I am running 2 Maven commands back to back to run 2 different sets of tests.
However, I need the final target folder to contain the results of both Maven commands. The problem is that the second Maven command cleans the target folder. Is it possible to ask mvn not to clean a specific folder in target?
We have a Maven project for which we have set up Jenkins builds. The repository has a large tools folder which I didn't want Jenkins to download.
I just want Jenkins to download the src folder and the pom.xml file.
I added two repository locations in Jenkins, only to learn that single-file checkouts are not possible.
This forced me to use the shell script option provided by Jenkins to check out pom.xml. The script outline is below:
svn checkout $pomUrl . --depth empty
svn update pom.xml
I did not find an option in my Jenkins SCM plugin to do an empty checkout (see: Checkout one file from Subversion).
But Poll SCM in Jenkins only polls the src folder, and builds are not triggered if I make changes to pom.xml. Is there a way to ensure polling of my pom.xml as well?
No. Jenkins will poll what it knows.
In your scenario:
Jenkins doesn't know about your pom.xml.
Jenkins doesn't do single-file checkouts anyway.
You will have to rearrange your structure, either move the tools folder outside of the main checkout (if it's so large that it's prohibitive, why do you have it in the root location?), or move the pom.xml into the src folder.
Edit:
Here is an idea. I haven't tried it, so I don't know if it will work.
Keep your manual checkout and update of that pom like you currently do.
Set up another SVN module via Add module....
Enter the SVN root location where your pom is, and give it a non-conflicting local folder name.
Configure Repository depth for that module as Empty (if you don't see this option, you may need to upgrade your SVN plugin and/or Jenkins).
Click Advanced... section.
Configure Included Regions with the path to your src folder, and the pom only.
Something like:
/trunk/myapp/src/.*
/trunk/myapp/pom.xml
I have a Maven project which performs a number of time-consuming tests as part of Maven's integration-test phase. I'm using Jenkins as the CI server.
During the integration test a number of files are produced in the target folder. For example, an "actual" BMP file is produced and compared to an "expected" BMP file. If the test fails, I need to look at the files in the target folder to determine how to deal with the error. Maybe the actual BMP looks fine and so it should be promoted to the new expected BMP. On the other hand, it may reveal a problem that requires a code fix.
The thing is I don't have any way to get access to these files, other than to ssh into the CI server and manually scp the files over to my own machine for closer inspection. It would be extremely helpful if I could access these files from the Jenkins web interface.
I tried using the build-helper-maven-plugin to attach the relevant files as Maven artifacts, but the problem is that there is no suitable phase in Maven that executes after an integration-test, if any test fails.
What can I do? Can I use the "Copy Artifact" plugin for this?
1) The files in the target folder can be accessed using a link such as /ws/projectname/target/filename...
2) Rather than typing the url each time, the SideBar plugin can be used to add a link to the file to Jenkins' left menu, making it easily accessible.
You need to copy your files into your workspace in a build step and archive them from there - Jenkins lets you specify artifacts only relative to the workspace.
I usually create a directory keyed by the BUILD_ID in the workspace, so that artifacts from different builds do not get mixed up in case I don't clean the workspace, and archive from there (specifying ${BUILD_ID}/**/* in the archiving step).
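As an illustration, a sketch of such a build step (the *.bmp pattern and paths are assumptions):
# Jenkins "Execute shell" build step: collect the comparison files per build
mkdir -p "$WORKSPACE/$BUILD_ID"
cp -r target/*.bmp "$WORKSPACE/$BUILD_ID/" || true
# then archive ${BUILD_ID}/**/* in the "Archive the artifacts" post-build step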
In case your build fails before it reaches the copy step and therefore never does the copy, take a look at this question.