I have a situation where I am running two Maven commands back to back to run two different sets of tests.
However, I need the final target folder to contain the results of both Maven commands. The problem is that the second Maven command cleans the target folder. Is it possible to ask mvn not to clean a specific folder in target?
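One way this is commonly handled is by configuring the maven-clean-plugin to spare part of target. A minimal sketch, assuming the results to keep live under target/surefire-reports (the exclude pattern is an assumption; adjust it to your layout):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-clean-plugin</artifactId>
    <configuration>
        <!-- don't wipe target wholesale; clean via the fileset below instead -->
        <excludeDefaultDirectories>true</excludeDefaultDirectories>
        <filesets>
            <fileset>
                <directory>${project.build.directory}</directory>
                <includes>
                    <include>**</include>
                </includes>
                <excludes>
                    <!-- keep the first run's reports across the second clean -->
                    <exclude>surefire-reports/**</exclude>
                </excludes>
            </fileset>
        </filesets>
    </configuration>
</plugin>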
For example, project A generates two artifacts, processor.exe and t.txt. Then in project B, can I add a build step that executes processor.exe t.txt?
I know there are two runner types (.NET Process Runner and Command Line) that can execute programs. But how do I get the paths of these artifacts?
Yes, I think it should be possible to run an *.exe file generated by another build with the Command Line runner.
You just have to make sure that the build agent which runs Project A outputs these artifacts to a place that the build agent which runs Project B can access (if you have a single agent, this is obviously not a concern). Placing these artifacts in the agent's working directory is probably not the best idea, because it can be cleared by a clean checkout from VCS. Just choose some generic directory on the server, specify it as the artifact output location in Project A, and then point the Command Line runner in Project B at it, as in the sketch below.
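For example, a Command Line (custom script) step in Project B might look like this, assuming Project A was configured to drop its artifacts into C:\shared\artifacts (the path is purely illustrative):

rem paths below are assumptions; use whatever shared directory you configured
C:\shared\artifacts\processor.exe C:\shared\artifacts\t.txt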
I have a Bamboo plan that involves building with Maven on Windows. The default path to the build directory under the bamboo user is long, and some files end up over the 255-character Windows limit. I wanted to solve the problem by changing (for this plan only) the location where Maven runs to a short directory, C:\build. I can check out the files, then run a script step to copy them from the build dir to C:\build. The Maven Bamboo task is configured to override the project file, using C:\build\pom.xml instead. That all works fine. However, when it gets to the 'check in the updated pom' part of release:prepare, it somehow decides that the original build directory with the long path is the right one, and dies with an error.
Does anybody know how to specify that the updated pom is also supposed to come from C:\build? I tried overriding the 'Working Sub Directory' entry, but that won't let me specify a full path, so C:\build is out.
Did you try to override the localRepoDirectory parameter of the maven-release-plugin? Its documentation says:
The command-line local repository directory in use for this build (if specified).
Default value is: ${maven.repo.local}.
You may set this parameter using a property in the POM:
<properties>
    <propertyName>C:\build</propertyName>
</properties>
...
<localRepoDirectory>${propertyName}</localRepoDirectory>
It can be overridden in the Bamboo Maven command:
mvn -DpropertyName="D:\build" clean package
(Bamboo variables can also be used to set the propertyName)
You may define a single property with the desired path and use it in several places in the pom.xml.
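For context, localRepoDirectory is a parameter of the maven-release-plugin, so the snippet above would sit inside that plugin's configuration. A minimal sketch:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <configuration>
        <!-- points release:prepare at the overridden local repo path -->
        <localRepoDirectory>${propertyName}</localRepoDirectory>
    </configuration>
</plugin>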
Turns out there were several different things going on with the Maven 3.x task:
By setting the Override Project File to C:\build\pom.xml, I was able to get the task to try to build in C:\build.
Part of my copying of files from the normal root directory to C:\build was wrong. I'd used xcopy but forgot to add /H to copy the hidden .svn data as well, so the 'check in the updated pom' step failed because it couldn't find the .svn files.
Once release:prepare and release:perform were finished, a Bamboo 'Artifact Copy' step defined earlier was supposed to copy several generated artifacts back to the Maven repository. It turns out that while this step is somewhat configurable about which files to copy and where they are found, it does not support an absolute path as the directory to copy from, unlike the Override Project File setting of the Maven tasks. So I had to introduce yet another step, a script to copy the generated artifacts back from C:\build... to under the build root.
All in all, I wasn't able to change the build root the way I wanted, but by using the Override Project File setting and two scripts, one copying the source files to C:\build and one copying the artifacts back, I got done what I needed to do.
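For reference, a sketch of the corrected copy step, assuming Bamboo exposes the checkout directory as %bamboo_build_working_directory% (the variable name may differ in your Bamboo version):

rem /E copies all subdirectories (including empty ones), /H includes hidden
rem and system files such as the .svn metadata, /I treats the target as a
rem directory, /Y suppresses overwrite prompts
xcopy "%bamboo_build_working_directory%" C:\build /E /H /I /Y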
Is it possible to make TeamCity clean up only certain files when fetching files from my git repo? I modify one file as a build step, and thus always need a clean version of that file. However, it's really unnecessary to fetch the whole repo every time, because usually only a few files are modified (so I'd rather not use the 'Clean all files before build' option).
Thanks!
To clarify, let's say I have the following structure:
- index.html
- js/script.js
- js/plugins.js
I always want index.html to be checked out (regardless of whether any change has happened). The files in the js folder I only want replaced when they have actually been updated.
If you are using TeamCity 6.5 or above, you can use the Build Files Cleaner (Swabra) build feature. Once you have added it to your build configuration and run a clean build, it will clean any new unversioned files generated during the build, either before the new build starts or at the end of the current build.
I personally prefer to run it before the new build starts, as that lets you look at the build output when trying to work out why something went wrong.
Basically, it makes sure that there is nothing in the build agent's work folder that was not pulled from the repository before each build.
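If all you really need is a pristine copy of the one file you modify, a simpler alternative (plain git in a command-line build step, not a TeamCity feature) is to revert it before your modifying step runs:

# discard local modifications to the single file; the path is assumed from the example above
git checkout -- index.html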
I have a Maven project which performs a number of time-consuming tests as part of the integration-test Maven phase. I'm using Jenkins as the CI server.
During the integration test a number of files are produced in the target folder. For example, an "actual" BMP file is produced and compared to an "expected" BMP file. If the test fails, I need to look at the files in the target folder to determine how to deal with the error. Maybe the actual BMP looks fine and so it should be promoted to the new expected BMP. On the other hand, it may reveal a problem that requires a code fix.
The thing is, I don't have any way to access these files other than to ssh into the CI server and manually scp the files over to my own machine for closer inspection. It would be extremely helpful if I could access these files from the Jenkins web interface.
I tried using the build-helper-maven-plugin to attach the relevant files as Maven artifacts, but the problem is that there is no suitable phase in Maven that executes after an integration-test, if any test fails.
What can I do? Can I use the "Copy Artifact" plugin for this?
1) The files in the target folder can be accessed through the job's workspace link, e.g. JENKINS_URL/job/projectname/ws/target/filename...
2) Rather than typing the URL each time, the Sidebar Link plugin can be used to add a link to the file to Jenkins' left-hand menu, making it easily accessible.
You need to copy your files into your workspace in a build step and archive them from there - Jenkins lets you specify artifacts only relative to the workspace.
I usually create a directory keyed by the BUILD_ID in the workspace, so that artifacts from different builds do not get mixed up in case I do not clean the workspace, and archive from there (specifying ${BUILD_ID}/**/* in the archiving step).
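As a sketch, an 'Execute shell' build step along these lines (the file pattern is illustrative, matching the BMP example from the question):

# copy the integration-test outputs into a per-build folder inside the workspace
mkdir -p "${BUILD_ID}"
cp -r target/*.bmp "${BUILD_ID}/"

with ${BUILD_ID}/**/* then given as the artifact archiving pattern.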
In case your build fails before it can run the copying step and therefore does not do the copy, take a look at this question.
I'm new to the world of continuous integration and software development.
I wanted to try Hudson, so I installed it on my Ubuntu machine and created a new job. I pointed it at an open-source project's SVN (KeePassX) just to try it out.
Hudson downloaded everything from the repository and marked the build blue for success.
Aren't I supposed to be able to execute the software now somehow? I thought that once it was built I could run it, but I can't find any executable in the project's directory under the Hudson user's home dir.
Thanks.
A Hudson/Jenkins build breaks down into three steps:
update source code in workspace
run build
publish build artifacts
It sounds like you've got step 1 covered.
If the project you linked to has instructions for building (ant, maven, etc.), you can enter these as build steps into the "Build" section of the project configuration.
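For instance, an 'Execute shell' build step might look like this, assuming the project builds with CMake (an assumption; check the project's own build instructions for the real commands):

# out-of-source build; the binary ends up somewhere under build/
mkdir -p build && cd build
cmake ..
make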
You can then take the resulting files ("artifacts"--jar, exe, so, bin, whatever) and publish these using the "Post-build Actions", or if necessary you can grab them directly from the workspace filesystem.
Assuming the build artifact was an executable, you could then run it after downloading it from Hudson, or make a build step or post-build action which moved it into the appropriate location and ran it.
It helps to run the build locally before trying to get Hudson to handle it--then you know what the build steps are, and what the final build artifacts are.
How would Jenkins/Hudson know how to 'execute' some arbitrary package that you told it to download and build? It's up to you to write a program or script that runs what you want to run, and then make a downstream job (for example) to do so.