maven :: install multiple third-party artifacts to local repository at once from filesystem

We're using non-public artifacts from third-party companies in our project. We don't have a Maven proxy installed (and there are no plans to do so, because we found it complicates things rather than solves problems, especially when no internet connection or VPN is available).
So I created a set of maven-install-plugin install-file executions, like this:
<plugin>
<artifactId>maven-install-plugin</artifactId>
<version>2.3.1</version>
<inherited>false</inherited>
<executions>
<execution>
<id>install-artifacts.1</id>
<goals>
<goal>install-file</goal>
</goals>
<phase>initialize</phase>
<configuration>
<pomFile>thirdparty/gwt-0.99.1.pom</pomFile>
<file>thirdparty/gwt-0.99.1.jar</file>
</configuration>
</execution>
<execution>
<id>install-artifacts.2</id>
<goals>
<goal>install-file</goal>
</goals>
<phase>initialize</phase>
<configuration>
<pomFile>thirdparty/morphia-0.99.1.pom</pomFile>
<file>thirdparty/morphia-0.99.1.jar</file>
</configuration>
</execution>
<execution>
<id>install-artifacts.3</id>
<goals>
<goal>install-file</goal>
</goals>
<phase>initialize</phase>
<configuration>
<pomFile>thirdparty/gwt-oauth2-0.2-alpha.pom</pomFile>
<file>thirdparty/gwt-oauth2-0.2-alpha.jar</file>
</configuration>
</execution>
</executions>
</plugin>
It works great and does exactly what we need. However, every time a new artifact is added, another big XML section has to be added as well.
Is there any way to avoid this, e.g. 'yet another plugin' that scans a folder and installs everything it finds there?

The best solution for this kind of thing is to install a repository manager.
You've written that you won't install a proxy, but that's the wrong way to go. The only real solution to this class of problem is to install a repository manager.
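To give a sense of what that buys you: once the third-party jars have been uploaded to the repository manager (e.g. Nexus or Artifactory), all of the install-file executions disappear and the project only has to declare the hosted repository. A minimal sketch, where the id and URL are placeholders for whatever your repository manager exposes:
<repositories>
  <repository>
    <!-- hypothetical id/URL; point this at the manager's hosted third-party repository -->
    <id>company-thirdparty</id>
    <url>http://repo.example.com/repository/thirdparty</url>
  </repository>
</repositories>
In practice you would usually configure this once in settings.xml (or as a mirror) rather than per project, so developer machines and CI all resolve artifacts from the same place.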

Related

Why does the maven-release-plugin upload build information? And can it be removed?

When using the maven-release-plugin to release an artifact onto a repository, the entire POM is copied. This includes the build and reporting sections.
I can understand that deployment information is propagated, since dependencies of a project by the same creators are likely to be deployed on the same servers, but for a non-pom artifact I don't understand the point of having the build information.
Is it possible to create a release stripped of this information?
Use the flatten-maven-plugin
https://www.mojohaus.org/flatten-maven-plugin/
I copied the relevant plugin configuration from the website above.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>flatten-maven-plugin</artifactId>
<!--<version>1.1.0</version>-->
<configuration>
</configuration>
<executions>
<!-- enable flattening -->
<execution>
<id>flatten</id>
<phase>process-resources</phase>
<goals>
<goal>flatten</goal>
</goals>
</execution>
<!-- ensure proper cleanup -->
<execution>
<id>flatten.clean</id>
<phase>clean</phase>
<goals>
<goal>clean</goal>
</goals>
</execution>
</executions>
</plugin>
It strips the POM of all unnecessary information.
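If the default behaviour keeps more (or less) than you want, the plugin's flattenMode parameter controls which POM elements survive the flattening. A minimal sketch, assuming the oss mode suits your artifacts (it roughly keeps the descriptive metadata expected when publishing to public repositories, such as name, description, url and licenses):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>flatten-maven-plugin</artifactId>
  <configuration>
    <!-- assumption: 'oss' keeps the descriptive metadata while still dropping build-time sections -->
    <flattenMode>oss</flattenMode>
  </configuration>
</plugin>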

Source feature generated by Tycho is not being included in p2 repository

I'm trying to create and include a source feature of my plugins in the generated p2 repository. Currently, the source jars for each plugin get created, as does the source feature for each normal feature. However, those source features then don't get included in the final product, an Eclipse update site.
In my parent POM, I have
<plugin>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-source-plugin</artifactId>
<executions>
<execution>
<id>plugin-source</id>
<goals>
<goal>plugin-source</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.eclipse.tycho.extras</groupId>
<artifactId>tycho-source-feature-plugin</artifactId>
<version>${tycho.version}</version>
<executions>
<execution>
<id>source-feature</id>
<phase>package</phase>
<goals>
<goal>source-feature</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.eclipse.tycho</groupId>
<artifactId>tycho-p2-plugin</artifactId>
<version>${tycho.version}</version>
<executions>
<execution>
<id>attach-p2-metadata</id>
<phase>package</phase>
<goals>
<goal>p2-metadata</goal>
</goals>
</execution>
</executions>
</plugin>
Do I need to add something to the POM of the feature? Of the eclipse-repository? I'm out of ideas.
Gonna answer this myself. I found a solution thanks to this article.
I had to add the generated source feature to the category.xml that describes my update site.
I had tried that before but it didn't work because I made the mistake of writing *.source.feature instead of *.feature.source.
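For reference, the resulting category.xml ends up looking roughly like this (the feature ids here are made up; the point is that the generated source feature is the normal feature id with a .source suffix):
<site>
  <feature id="com.example.myfeature" version="0.0.0">
    <category name="main"/>
  </feature>
  <!-- the feature generated by tycho-source-feature-plugin: note the trailing .source -->
  <feature id="com.example.myfeature.source" version="0.0.0">
    <category name="main"/>
  </feature>
  <category-def name="main" label="Example Features"/>
</site>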

Maven stop build if svn is out of date

After reading through a lot of SO questions as well as other sites I still haven't been able to exactly address this problem.
We have a long build cycle (10-20 mins) because there are a lot of dependencies. It sometimes happens that you start a build with everything up to date, but while it's running, someone commits new changes to the remote svn.
I would like Maven to check, in the validate and verify phases, whether svn is still up to date on all dependent projects.
I've tried using the Enforcer plugin and the Build Number plugin with no success yet. The Enforcer seems like it could do the trick, but I haven't figured out which rules to set.
The Build Number plugin, on the other hand, checks that there are no local modifications, but I don't think it checks for remote changes.
I don't think the POM is very relevant to the question, but if anyone needs it, or some parts please let me know and I'll update with it.
I would try a combination of the maven-scm-plugin's diff goal and the Enforcer.
scm:diff can be configured to write its output to a file. Run it once when there are no changes and see how big the file is, or whether it gets generated at all. Then use the Enforcer plugin's requireFilesDontExist and/or requireFilesSize rules to make sure the scm:diff output file matches the "no changes" size you determined. If it's larger than that, changes were committed during this build.
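A rough sketch of the second half of that idea; the output path is hypothetical and has to match wherever your scm:diff execution actually writes its file:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>fail-if-scm-diff-found-changes</id>
      <phase>verify</phase>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- assumes an earlier scm:diff execution writes target/scm-changes.diff only when there are changes -->
          <requireFilesDontExist>
            <files>
              <file>${project.build.directory}/scm-changes.diff</file>
            </files>
          </requireFilesDontExist>
        </rules>
        <fail>true</fail>
      </configuration>
    </execution>
  </executions>
</plugin>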
After a lot of testing I found another solution. This solution is for people who work with SVN, only want to commit changes once the build succeeds, and need to use the latest revision for a build.
What this does is retrieve the latest revision number from SVN and update the working copy. At the end of the build it checks the revision number again, to ensure that nobody has committed any changes in the meantime.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>get-svn-local-revision-before</id>
<phase>validate</phase>
<goals>
<goal>create</goal>
</goals>
<configuration>
<doCheck>false</doCheck>
<doUpdate>true</doUpdate>
<buildNumberPropertyName>buildNumberLocal</buildNumberPropertyName>
<useLastCommittedRevision>true</useLastCommittedRevision>
</configuration>
</execution>
<execution>
<id>get-svn-remote-revision-before</id>
<phase>validate</phase>
<goals>
<goal>create</goal>
</goals>
<configuration>
<doCheck>false</doCheck>
<doUpdate>false</doUpdate>
<buildNumberPropertyName>buildNumberRemote</buildNumberPropertyName>
<useLastCommittedRevision>false</useLastCommittedRevision>
</configuration>
</execution>
<!-- Repeat after everything is done -->
<execution>
<id>get-svn-remote-revision-after</id>
<phase>verify</phase>
<goals>
<goal>create</goal>
</goals>
<configuration>
<doCheck>false</doCheck>
<doUpdate>false</doUpdate>
<buildNumberPropertyName>buildNumberRemote</buildNumberPropertyName>
<useLastCommittedRevision>false</useLastCommittedRevision>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>1.3.1</version>
<executions>
<execution>
<id>check-svn-revisions-before</id>
<phase>process-test-resources</phase>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<evaluateBeanshell>
<condition>${buildNumberLocal} == ${buildNumberRemote}</condition>
<message>[ERROR] Local build (${buildNumberLocal}) doesn't match remote build (${buildNumberRemote})</message>
</evaluateBeanshell>
</rules>
<fail>true</fail>
</configuration>
</execution>
<!-- Repeat after everything is done -->
<execution>
<id>check-svn-revisions-after</id>
<phase>verify</phase>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<evaluateBeanshell>
<condition>${buildNumberLocal} == ${buildNumberRemote}</condition>
<message>[ERROR] Local build (${buildNumberLocal}) doesn't match remote build (${buildNumberRemote})</message>
</evaluateBeanshell>
</rules>
<fail>true</fail>
</configuration>
</execution>
</executions>
</plugin>

Maven dependency plugin - How can I ensure that an artifact is present when using dependency-unpack

I'm wondering if there is a way to enforce the existence of a dependency to unpack when using the unpack-dependencies goal of the maven-dependency-plugin. I'm using the configuration below, and the problem is that if there is no dependency specified for "${properties.artifactId}" in the dependencies section of the POM, the build goes ahead even though nothing has been unpacked. It invariably fails later at the test stage, but it would be so much easier if the build could fail when no dependency is present. So does anyone know of a way this can be enforced?
Thanks
Piers
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>unpack-properties</id>
<phase>generate-resources</phase>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<configuration>
<includeArtifactIds>${properties.artifactId}</includeArtifactIds>
<outputDirectory>${project.build.directory}</outputDirectory>
<includes>${properties.file.name}</includes>
</configuration>
</execution>
</executions>
</plugin>
A couple of executions of the maven-enforcer-plugin should do it. You need one to run before the dependency plugin, to make sure ${properties.artifactId} has a value, and another that runs after the dependency plugin to make sure there are files in the target location. Here's the idea; modify it for your requirements.
You may also write your own rules if the available ones don't quite fit.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>fillInTheVersion</version>
<executions>
<execution>
<id>enforce-config-properties</id>
<phase>validate</phase>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<requireProperty>
<property>properties.artifactId</property>
<message><![CDATA[### Missing property 'properties.artifactId': the artifact that ....]]></message>
</requireProperty>
</rules>
</configuration>
</execution>
<execution>
<id>enforce-files-exist</id>
<phase>process-resources</phase>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<requireFilesExist>
<files>
<file>${project.build.directory}/${properties.artifactId}</file>
</files>
<message><![CDATA[### Did not find unpacked artifact ...]]></message>
</requireFilesExist>
</rules>
</configuration>
</execution>
</executions>
</plugin>
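One note on the second rule: depending on how the archive is laid out, the unpacked content will not necessarily sit in a folder named after the artifact. Since the unpack-dependencies execution above filters on ${properties.file.name} and unpacks into ${project.build.directory}, a variant like this may be closer to what actually lands on disk (still an assumption about the archive layout):
<requireFilesExist>
  <files>
    <!-- assumes the included file is unpacked straight into the build directory, keeping its name -->
    <file>${project.build.directory}/${properties.file.name}</file>
  </files>
  <message><![CDATA[### Did not find the unpacked properties file in the build directory]]></message>
</requireFilesExist>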

Running selenium hub in maven

I'm trying to run the Selenium server in the hub role from Maven using the selenium-maven-plugin, in order to use the PhantomJS driver from remote control tests. So far my plugin configuration is very straightforward:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>selenium-maven-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<id>start-selenium</id>
<phase>pre-integration-test</phase>
<goals>
<goal>start-server</goal>
</goals>
<configuration>
<background>true</background>
</configuration>
</execution>
<execution>
<id>stop-selenium</id>
<phase>post-integration-test</phase>
<goals>
<goal>stop-server</goal>
</goals>
</execution>
</executions>
</plugin>
Then I hook in PhantomJS using the exec-maven-plugin:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<phase>pre-integration-test</phase>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
<configuration>
<executable>phantomjs</executable>
<arguments>
<argument>--webdriver=8080</argument>
<argument>--webdriver-selenium-grid-hub=http://localhost:4444</argument>
</arguments>
</configuration>
</plugin>
With this configuration the output is HTTP ERROR: 403 Forbidden for Proxy, and I can't get any further. Has anyone configured this successfully?
It wouldn't be too much of a stretch to create a script that uses YAJSW (Yet Another Java Service Wrapper) to register the grid hub as a service. Maven can then call the script and start it as its own separate process, and call a stop script to stop it. I think it would be elegant.
Here is my almost-working attempt. I'll need to solicit help from a Selenium expert to get it working; I'm getting an unexpected error when registering the service. Most of the work is done though. Once I get this working, it should be good to go for you.
Now, while you could run the Grid hub as a service, you wouldn't want to do the same with the node, because it needs access to the desktop (and services can only access their own invisible private desktop). So perhaps that brings us back to the same problem you are trying to solve.
