Run a Maven plugin when the build fails

I am using a plugin to send a Slack message through Maven. I am wondering if it's possible to run a plugin when the build fails, so that I get automatically notified about the failed build?

You could do that within Maven itself, through the EventSpy mechanism, built into Maven since 3.0.2. At each step of the build, several events are raised, by Maven itself or by custom code, and it is possible to listen to those events to perform some actions. The execution events raised by Maven are represented by the ExecutionEvent class. Each event has a type that describes what kind of event it represents: project failure, Mojo failure, project skipped, etc. In this case, the project failure event is what you're looking for.
A custom spy on events is just a Java class that implements the EventSpy interface. Preferably, it should inherit from the AbstractEventSpy helper class. As an example, create a new project (let's call it my-spy), and add the following Java class under a package:
import org.apache.maven.eventspy.AbstractEventSpy;
import org.apache.maven.eventspy.EventSpy;
import org.apache.maven.execution.ExecutionEvent;
import org.codehaus.plexus.component.annotations.Component;
import org.codehaus.plexus.component.annotations.Requirement;
import org.codehaus.plexus.logging.Logger;

@Component(role = EventSpy.class)
public class BuildFailureEventSpy extends AbstractEventSpy {

    @Requirement
    private Logger logger;

    @Override
    public void onEvent(Object event) throws Exception {
        if (event instanceof ExecutionEvent) {
            ExecutionEvent executionEvent = (ExecutionEvent) event;
            if (executionEvent.getType() == ExecutionEvent.Type.ProjectFailed) {
                logger.info("My spy detected a build failure, do the necessary here!");
            }
        }
    }
}
This code simply registers the spy through Plexus' @Component annotation, and logs a message when a project fails to build. To compile that class, you just need to add to the my-spy project a dependency on Maven Core and an execution of the plexus-component-metadata plugin to create the right Plexus metadata for the component.
<dependencies>
  <dependency>
    <groupId>org.apache.maven</groupId>
    <artifactId>maven-core</artifactId>
    <version>3.0.2</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-component-metadata</artifactId>
      <version>1.6</version>
      <executions>
        <execution>
          <goals>
            <goal>generate-metadata</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Once this project is compiled and installed into your local repository (through mvn clean install), you can add it to the build of another project through the core extensions mechanism.
Before Maven 3.3.1, you had to drop the my-spy JAR into your ${MAVEN_HOME}/lib/ext folder, so that Maven could find it. As of 3.3.1, you don't need to fiddle with your Maven installation, and can create a file .mvn/extensions.xml in your project base directory (${maven.multiModuleProjectDirectory}/.mvn/extensions.xml). Its content would be
<?xml version="1.0" encoding="UTF-8"?>
<extensions>
  <extension>
    <groupId>my.spy</groupId>
    <artifactId>my-spy</artifactId>
    <version>0.0.1</version>
  </extension>
</extensions>
which just declares an extension pointing to the Maven coordinates of the spy project. Maven (≥ 3.3.1) will by default look for that file, and, as such, your spy will be correctly registered and invoked throughout the build.
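For orientation, the resulting project layout is simply (with my.spy:my-spy:0.0.1 matching the coordinates of the spy project installed earlier):

```
my-project/
├── .mvn/
│   └── extensions.xml
└── pom.xml
```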
The only remaining thing to do is to code what the spy should do. In your case, it should invoke a Maven plugin, so take a look at the Mojo Executor library, which makes that very easy to do.
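As a rough sketch of how that could look with Mojo Executor (assuming a dependency on org.twdata.maven:mojo-executor; the Slack plugin coordinates, goal, and configuration below are hypothetical placeholders, and mavenProject, mavenSession and pluginManager would be injected into the spy):

```java
import static org.twdata.maven.mojoexecutor.MojoExecutor.*;

// Inside the spy, once a ProjectFailed event is detected, invoke a plugin.
// NOTE: the plugin coordinates, goal and configuration are placeholders.
executeMojo(
    plugin(
        groupId("org.example"),           // placeholder groupId
        artifactId("slack-maven-plugin"), // placeholder artifactId
        version("1.0.0")),
    goal("send-message"),                 // placeholder goal
    configuration(
        element(name("message"), "Build failed!")),
    executionEnvironment(
        mavenProject, mavenSession, pluginManager));
```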

Related

Add custom codegen implementation for openapi-generator gradle plugin

I implemented my custom code generation for https://github.com/OpenAPITools/openapi-generator,
but I have no idea how to add it to the Gradle plugin. I need to add it to the classpath while Gradle performs the openapi tasks.
For Maven, I can easily add my custom implementation com.my.generator:customgenerator:1.0-SNAPSHOT in the plugin dependency block,
<plugin>
  <groupId>org.openapitools</groupId>
  <artifactId>openapi-generator-maven-plugin</artifactId>
  <version>${openapi-generator-maven-plugin-version}</version>
  <executions>
    <execution>
      <goals>
        <goal>generate</goal>
      </goals>
      <configuration>
        <templateDirectory>myTemplateDir</templateDirectory>
        <apiPackage>${default.package}.handler</apiPackage>
        <modelPackage>${default.package}.model</modelPackage>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>com.my.generator</groupId>
      <artifactId>customgenerator</artifactId>
      <version>1.0-SNAPSHOT</version>
    </dependency>
  </dependencies>
</plugin>
but in Gradle I have no idea how to do it.
The solution is simple once you know how Gradle plugins work. Here are the steps:
You need to add your custom generator class to the plugin's classpath. However, you cannot use any module of the Gradle project in which you want to use the generator plugin, because Gradle plugins are applied before the project is compiled and before its dependencies are resolved. So you must use an already compiled jar file. For example, create a new Gradle project containing the custom generator code and publish it to the local Maven repository (How to publish source into local maven repository with Gradle?). Then you can add it to the plugin classpath like this:
buildscript {
    repositories {
        mavenLocal()
        mavenCentral()
    }
    dependencies {
        classpath "org.openapitools:openapi-generator:4.3.0"
        classpath "some.custom.openapi:generator:0.0.1"
    }
}
The OpenAPI generator uses the Java ServiceLoader to load generators (https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html). So, in your custom generator project, create a file named org.openapitools.codegen.CodegenConfig with the content
some.custom.openapi.CustomJavaCodegen
(the fully qualified name of your custom generator class) and place it in the folder src/main/resources/META-INF/services/.
In your custom generator class, override the getName method to return your generator name, which you will then use in the openApiGenerator configuration in the Gradle file.
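As a sketch (class, package, and generator names here are placeholders; it assumes openapi-generator is on the compile classpath), the custom generator could look like:

```java
package some.custom.openapi;

import org.openapitools.codegen.languages.JavaClientCodegen;

// Hypothetical custom generator; the fully qualified class name must match
// the entry in META-INF/services/org.openapitools.codegen.CodegenConfig.
public class CustomJavaCodegen extends JavaClientCodegen {

    @Override
    public String getName() {
        // the name referenced by generatorName in the openApiGenerator config
        return "custom-java";
    }
}
```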
I got this working with these steps. If I forgot to write something here, comment and I will try to fill in the missing information.

Glue code is not loaded when running with cucumber-spring back-end from jar file

I have been trying to get spring-based cucumber tests to run using a combination of Junit(4.12), Cucumber-Java(4.1.1), Cucumber-Spring(4.1.1) and Cucumber-Junit(4.1.1).
I have no issues loading glue code when running the tests from inside the IDE (IntelliJ 2018.3.4), but it seems that for some reason, when I try running from a compiled jar file (which is a requirement in this case), cucumber doesn't find the step definitions.
I've already tried multiple glue code formats such as:
"classpath:com.a.b.c.stepdefs"
"com.a.b.c.stepdefs"
"classpath:com/a/b/c/stepdefs"
I've also tried providing relative paths from the runner class up to the step definitions class (nested just one level below)
"stepdefs"
I also tried running with both JUnit and cucumber.cli.Main, and attempted to use different styles of step definitions (both cucumber expressions - which the missing step snippets point me to - and regexes)
I am using the spring-boot-maven-plugin so I am aware that that generally changes the jar structure
All of the above variations fully work when running from the IDE, but not from the jar file
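For context, this is roughly the layout of a Spring Boot 2.x repackaged jar, which explains why classpath resources such as feature files end up under BOOT-INF/classes:

```
app.jar
├── META-INF/MANIFEST.MF
├── org/springframework/boot/loader/   (Boot launcher classes)
└── BOOT-INF/
    ├── classes/   <- your own classes, features, and resources
    └── lib/       <- dependency jars
```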
Main Class:
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
@ComponentScan(basePackages = {"com.a.b.test.core.data",
        "com.a.b.c",
        "com.a.b.c.stepdefs"}
)
public class CucumberApplication {

    public static void main(String[] args) throws IOException, InterruptedException {
        SpringApplication.run(CucumberApplication.class, args);
        Result result = JUnitCore.runClasses(RunnerCentral.class);
        System.exit(result.wasSuccessful() ? 0 : 1);
    }
}
Runner Class:
package com.a.b.c;

@RunWith(Cucumber.class)
@CucumberOptions(features = "classpath:BOOT-INF/classes/features",
        glue = "classpath:com/a/b/c/stepdefs",
        plugin = "json:target/cucumber-html-reports/cucumber.json")
public class RunnerCentral {
}
POM config of spring-boot-maven-plugin:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <version>2.1.0.RELEASE</version>
  <configuration>
    <fork>true</fork>
    <mainClass>${start-class}</mainClass>
    <requiresUnpack>
      <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-java</artifactId>
      </dependency>
      <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-spring</artifactId>
      </dependency>
      <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-junit</artifactId>
      </dependency>
    </requiresUnpack>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I am expecting the behavior to be consistent between running from IDE and running from a packaged source although I may be missing something
Another thing I want to mention is that when swapping the backend with cucumber-picocontainer everything seems to work (spring is a requirement so a swap isn't possible)
This is the kind of issue that can have you launching your hot coffee at the nearest colleague.
Have you seen this post about using a custom ResourceLoader: https://github.com/cucumber/cucumber-jvm/issues/1320?
I think you'd have to copy and paste the Cucumber.java class, provide the resource loader to the runtime from the ApplicationContext, and change your RunnerCentral class to RunWith the new class.
FWIW in my case, I placed the raw project in a docker container, that on startup ran ./mvnw test which is the Maven Wrapper supplied in Spring Boot projects. You can do ./mvnw test -s /path/to/maven/settings.xml if using a corporate repository, and if your container host can't access the corporate repository, run the image first on the Jenkins box (or wherever the image is being built) which will cause the dependency jars to be downloaded inside, then commit the docker image, and push that image out.
That way, the container can run the cucumber test phase using the local .m2 directory inside it, with the dependencies it needs already there.

Find classes that implement interfaces or being subclasses/superclasses in maven CLASSPATH?

VisualVM OQL queries can't query for interfaces because the current heap dump format doesn't preserve this info.
To work around this issue, it is possible to find the classes that implement an interface and then perform the heap dump analysis.
I have an application managed by Maven. During the build, Maven knows the full application CLASSPATH.
Is it possible to query via the mvn command which classes in which packages implement a selected interface?
Or even more - to find the classes and packages in the application build CLASSPATH which are subclasses or superclasses of a selected class?
Does a plug-in exist that suits my needs?
UPDATE An interesting suggestion is to use an IDE to get the list of known implementations.
I work with Emacs and NetBeans. NetBeans has a limited ability (the Find Usages dialog, Alt+F7) to find known implementations, but its scope is limited to open projects. For example, when I look for org.hibernate.cfg.NamingStrategy implementations, NetBeans doesn't help in my case.
Because I need that list for further scripting, GUI tools are not relevant unless they provide clean text export.
If you really need to achieve this via Maven or scripting, here is how I got it working.
Based on the approach suggested by another answer on Stack Overflow, I implemented the following simple class:
package com.sample;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Scanner;

import org.clapper.util.classutil.ClassFilter;
import org.clapper.util.classutil.ClassFinder;
import org.clapper.util.classutil.ClassInfo;

public class MainScan {

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.out.println("Missing options");
            System.exit(-1);
        }
        System.out.println("Filtering by: " + args[1]);

        ClassFinder finder = new ClassFinder();
        finder.addClassPath();
        loadClasspath(finder, args[0]);

        ClassFilter filter = new ImplementInterfaceFilter(args[1]);
        // you could also use as a filter: new SubclassClassFilter(AbstractFileFilter.class);
        // or make a concatenation of filters using an AndClassFilter

        Collection<ClassInfo> foundClasses = new ArrayList<ClassInfo>();
        finder.findClasses(foundClasses, filter);

        if (foundClasses.size() > 0) {
            for (ClassInfo classInfo : foundClasses) {
                System.out.println("- " + classInfo.getClassName());
                // consider also using classInfo.getClassLocation() to get the jar file providing it
            }
        } else {
            System.out.println("No matches found.");
        }
    }

    static void loadClasspath(ClassFinder finder, String file) throws IOException {
        Scanner s = new Scanner(new File(file));
        s.useDelimiter(File.pathSeparator);
        try {
            while (s.hasNext()) {
                finder.add(new File(s.next()));
            }
        } finally {
            s.close();
        }
    }

    static class ImplementInterfaceFilter implements ClassFilter {

        private String interfaceName;

        public ImplementInterfaceFilter(String name) {
            this.interfaceName = name;
        }

        public boolean accept(ClassInfo info, ClassFinder finder) {
            for (String i : info.getInterfaces()) {
                if (i.endsWith(this.interfaceName)) {
                    return true;
                }
            }
            return false;
        }
    }
}
Note: the class is located in the com.sample package, but it can obviously be moved to another package. The main method expects two arguments, a classpath file and an interface name; it adds the classpath to the ClassFinder and scans it looking for classes implementing the provided interface (via the custom filter also shown above). Both arguments will be provided at runtime by Maven as follows:
I used this library for the classpath scanning, hence, as suggested on its official page, we need to add a custom repository to our POM:
<repositories>
  <repository>
    <releases>
      <enabled>true</enabled>
      <updatePolicy>always</updatePolicy>
      <checksumPolicy>warn</checksumPolicy>
    </releases>
    <id>clapper-org-maven-repo</id>
    <name>org.clapper Maven Repo</name>
    <url>http://maven.clapper.org/</url>
    <layout>default</layout>
  </repository>
</repositories>
And the required dependency:
<dependencies>
  ...
  <dependency>
    <groupId>org.clapper</groupId>
    <artifactId>javautil</artifactId>
    <version>3.1.2</version>
  </dependency>
  ...
</dependencies>
Then we just need to configure the following in our Maven build:
<build>
  <plugins>
    ...
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>2.1</version>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals>
            <goal>build-classpath</goal>
          </goals>
          <configuration>
            <outputFile>${project.build.directory}/classpath.txt</outputFile>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.1</version>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals>
            <goal>java</goal>
          </goals>
          <configuration>
            <mainClass>com.sample.MainScan</mainClass>
            <arguments>
              <argument>${project.build.directory}/classpath.txt</argument>
              <argument>${interfaceName}</argument>
            </arguments>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ...
  </plugins>
</build>
We are basically configuring the Maven Dependency Plugin to write the full Maven build classpath to a file, then using the Exec Maven Plugin to execute our custom Java main, passing it the classpath file and a parameter, ${interfaceName}. Both plugin executions are bound to the validate phase: we don't need to execute the full Maven build; we will just invoke one of its first phases for this task.
As such, we can invoke the Maven build as follows:
mvn validate -DinterfaceName=Serializable -q
And have an output like the following:
Filtering by: Serializable
- org.apache.commons.io.ByteOrderMark
- org.apache.commons.io.comparator.CompositeFileComparator
- org.apache.commons.io.comparator.DefaultFileComparator
...
The Maven command directly invokes the concerned phase, validate, using the -q (quiet) option to skip the Maven build logs and get only the output interesting to us. Moreover, we can dynamically pass the interface we want via the -DinterfaceName=<value_here> option. The value is passed to the Exec Maven Plugin and, as such, to the Java main above.
Depending on further needs (scripting, output, format, etc.), the Java main can easily be adapted. Moreover, the plugins, dependency, and repository configuration could be moved to a Maven profile to keep things cleaner and better organized.
Last note: if you change the package of the Java main above, do not forget to change the Exec Maven Plugin configuration accordingly (the mainClass element).
So, back to your questions:
Is it possible to query via the mvn command which classes in which packages implement a selected interface? Yes, applying the approach above.
Or even more - to find the classes and packages in the application build CLASSPATH which are subclasses or superclasses of a selected class? Yes, look at the SubclassClassFilter from the same library, change the main above accordingly, and you will get there.
Does a plug-in exist that suits these needs? I couldn't find any, but the code above could easily be converted into a new Maven plugin. Otherwise, the approach described here is a mix of Java code and existing Maven plugin usage, which may suit your need anyway.

Invoking maven plugin as a part of build lifecycle

I’m new to Maven. I’m trying to integrate a plugin into my build so that it executes automatically as part of phase execution.
Say I want to plug into the clean lifecycle phase.
The mojo I’m using was annotated to specify that it should be bound to the clean phase:
/**
 *
 * @goal clean
 * @phase clean
 * @requiresProject
 */
public class CleanMojo extends AbstractSCAMojo {
This mojo was installed following instructions in Using Plugin Tools Java5 Annotations.
I added plugin to my pom.xml:
<build>
  <plugins>
    <plugin>
      <groupId>myclean.plugin</groupId>
      <artifactId>myclean-maven-plugin</artifactId>
      <version>1.0</version>
      <configuration>
        <logfile>C:/temp/clean.log</logfile>
      </configuration>
    </plugin>
  </plugins>
</build>
In my understanding, having the lifecycle binding in the Mojo Java code eliminates the need to provide executions in build > plugins > plugin. Is that correct?
I was expecting that after invoking mvn clean configured as above, myclean.plugin:myclean-maven-plugin would be executed as part of the clean goal, but nothing happens besides the regular Maven clean procedure.
When the pom is changed to specify executions, myclean.plugin:myclean-maven-plugin is invoked, so I’m certain the mojo code doesn’t contain blocking errors – this is just a question of configuration.
There is probably something more I need to specify to make the plugin execute automatically (i.e. without specifying executions), but what?
As per the documentation, you should add the following annotation before the class definition:
@Mojo(name = "clean", defaultPhase = LifecyclePhase.CLEAN)
@goal and @phase are Javadoc tags from the older plugin-tools annotation style. Note that defaultPhase only supplies the phase used when an execution declares the goal without an explicit phase; the consuming POM still needs an executions section listing the goal for it to run automatically.
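A minimal sketch of the modern annotation style (assuming a dependency on org.apache.maven.plugin-tools:maven-plugin-annotations; the log message is a placeholder):

```java
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;

// Sketch only: declares a "clean" goal whose default binding is the clean phase.
@Mojo(name = "clean", defaultPhase = LifecyclePhase.CLEAN)
public class CleanMojo extends AbstractMojo {

    @Override
    public void execute() {
        getLog().info("custom clean goal running"); // placeholder behavior
    }
}
```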

Configure Maven to use CXF wsdl2java with Basic Authentication

I have an application that needs to integrate with one of SharePoint's web services. This web service cannot be accessed freely and needs authentication.
As such, the standard wsdl2java Maven plugin in my application gives an HTTP 401 error when the generate-sources phase is executed.
Is there a way to setup Maven/POM so that I can provide a user/password that will generate the stubs?
I have come across some answers saying this is not possible, but all of them are more than a year old, and I haven't found whether Maven has addressed this since. One option is to save a local copy of the WSDL (as suggested here), but I would like to avoid having local copies.
Because you mentioned CXF, I suppose you meant cxf-codegen-plugin. It's a bit of a hack, but it works.
HTTP authentication credentials can be provided using java.net.Authenticator. You just need to define your own Authenticator class which overrides the getPasswordAuthentication(..) method. Then it has to be set as the default Authenticator. As far as I know, this can't be done declaratively (for instance using environment properties), only programmatically using Authenticator.setDefault(..).
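As a self-contained illustration of the Authenticator mechanism itself (outside CXF; the credentials here are dummies):

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class AuthenticatorDemo {
    public static void main(String[] args) {
        // Install a default Authenticator returning fixed dummy credentials.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("test", "test123".toCharArray());
            }
        });

        // Any JDK HTTP code that asks for credentials now receives them:
        PasswordAuthentication pa = Authenticator.requestPasswordAuthentication(
                null, 80, "http", "please authenticate", "basic");
        System.out.println(pa.getUserName()); // prints: test
    }
}
```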
In order to call Authenticator.setDefault(..), I would use the CXF extension mechanism. Create a separate Maven project with a similar class:
public class AuthenticatorReplacer {

    public AuthenticatorReplacer(Bus bus) {
        java.net.Authenticator.setDefault(new java.net.Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("test", "test123".toCharArray());
            }
        });
    }
}
and a file src\main\resources\META-INF\cxf\bus-extensions.txt with the contents:
org.example.AuthenticatorReplacer::false
Then add the newly created project as a dependency of cxf-codegen-plugin:
<plugin>
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <version>${project.version}</version>
  <dependencies>
    <dependency>
      <groupId>org.example</groupId>
      <artifactId>cxf-authenticator-replacer</artifactId>
      <version>0.0.1-SNAPSHOT</version>
    </dependency>
  </dependencies>
  ...
</plugin>
This way, AuthenticatorReplacer is initialized by the CXF extension mechanism and replaces the default Authenticator with ours.
A clean alternative to @Dawid Pytel's solution would be to run this class during the lifecycle of the wsdl class auto-generation:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.4.0</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>java</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <mainClass>path.to.AuthenticatorReplacer</mainClass>
  </configuration>
</plugin>
Important: your AuthenticatorReplacer has to have a main(String[] args) method that runs the code inside.
I verified that Dawid's solution works. Alternatively, you can use SoapUI to pull down and cache the WSDL, and then use SoapUI's code generation support to have CXF generate the code.
http://java.dzone.com/tips/generating-client-java-code
Dawid's solution works for me too. It is a little tricky though: in Eclipse, the pom.xml keeps complaining that "wsdl2java failed: Could not load extension class AuthenticatorReplacer". You have to ignore this error message and use the command line:
mvn generate-sources
The Java classes will then be generated successfully.