MavenProject: Get the available classes for use in my plugin

I'm loading a Maven project as described here. I'm trying to figure out how to retrieve the source roots so I can determine which Java classes I have, so my Mojo can use them.
I tried a couple of the methods in there, like getResources or getScriptSources, without luck. Any ideas?
Thanks in advance!
Edit:
I was asked to elaborate a little bit in what I'm attempting to do, so here it is:
The plugin I'm developing will take the sources in the project and create test cases from them. Unless configured otherwise, I want to generate tests for all the classes, and for that I need to figure out where my sources are so I can configure the plugin properly.
Hope that helps.
Here's the repository. I planned on publishing it later but I provided source as requested.

Have you read the plugin developers documentation?
That page links to the Plugins Cookbook, which links to the Mojo Developer Cookbook, which has "The maven project, or the effective pom" and gives you access to the org.apache.maven.project.MavenProject object via
/** @parameter default-value="${project}" */
private org.apache.maven.project.MavenProject mavenProject;
Alternatively, via Java 5 annotations:
@Component
MavenProject project;
You can call getCompileSourceRoots() to get a list of the directories that will be used for compilation.
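For example, a minimal sketch of a Mojo that just lists those directories (the goal name and logging are illustrative, not from the original answer):

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "generate-tests", defaultPhase = LifecyclePhase.GENERATE_TEST_SOURCES)
public class GenerateTestsMojo extends AbstractMojo {

    // Inject the current Maven project so its source roots can be queried.
    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    @Override
    public void execute() throws MojoExecutionException {
        // Each entry is an absolute path such as .../src/main/java
        for (String sourceRoot : project.getCompileSourceRoots()) {
            getLog().info("Source root: " + sourceRoot);
        }
    }
}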
You will also need to do more reading about how to set up inclusions/exclusions. You can use other plugins as examples of how to do this, e.g. maven-compiler-plugin.
If you want to use annotations, it is very important to make sure your pom is configured as per Using Annotations, and that you use annotations at the class level as well. Mixing in javadoc annotations might not work.

I think the simplest solution would be to define a mojo parameter:
/**
 * @parameter default-value="${project.build.sourceDirectory}"
 * @required
 */
private File sourceDirectory;
or with the new annotation-based definition:
@Parameter(required = true, defaultValue = "${project.build.sourceDirectory}")
private File sourceDirectory;
which should give you the result you want.
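With that parameter injected, here is a rough sketch of how the Mojo's execute() method could collect the .java files to generate tests for (the traversal code is illustrative, not part of the original answer):

// Inside execute(): collect every .java file under the configured source directory.
// Needs java.io.IOException, java.nio.file.Files, java.nio.file.Path, java.util.ArrayList,
// java.util.List and java.util.stream.Stream imports.
List<Path> javaSources = new ArrayList<>();
try (Stream<Path> paths = Files.walk(sourceDirectory.toPath())) {
    paths.filter(Files::isRegularFile)
         .filter(path -> path.toString().endsWith(".java"))
         .forEach(javaSources::add);
} catch (IOException e) {
    throw new MojoExecutionException("Failed to scan " + sourceDirectory, e);
}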

Related

How to automatically register all available interface implementations in Quarkus?

I'm trying to adapt a library to be usable in Quarkus native mode. Since it's reflection-heavy, I need to manually register all implementations of certain interfaces.
What I've done so far and which seems to work fine for user code:
private static void registerAllImplementations(CombinedIndexBuildItem combinedIndexBuildItem,
        BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyClass,
        Class<?>... classNames) {
    for (Class<?> klass : classNames) {
        combinedIndexBuildItem.getIndex().getAllKnownImplementors(DotName.createSimple(klass.getName())).stream()
                .map(ci -> new ReflectiveHierarchyBuildItem(Type.create(ci.name(), Type.Kind.CLASS)))
                .forEach(reflectiveHierarchyClass::produce);
    }
}
However, the below line doesn't pick up implementors that come from external jars:
combinedIndexBuildItem.getIndex().getAllKnownImplementors(...)
It's not a tragedy, but it would be much more future-proof if I did not need to pay attention to the internals of an external jar and manually make sure that all relevant implementations get registered.
Do you have any clues?
Behind the scenes, Quarkus uses Jandex to index your sources. It is Jandex that provides the CombinedIndexBuildItem, so you need all the external jars to be indexed by Jandex as well.
For this you can add the Jandex Maven plugin to those external JARs, or add configuration options for each jar:
quarkus.index-dependency.<name>.group-id=
quarkus.index-dependency.<name>.artifact-id=
More information here: https://quarkus.io/guides/cdi-reference#how-to-generate-a-jandex-index
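If you are writing a Quarkus extension (as the build step in the question suggests), another option is to have the extension itself request that the dependency be indexed. A hedged sketch, assuming the IndexDependencyBuildItem build item and placeholder coordinates (com.example:some-library), placed in the extension's deployment processor class:

import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.builditem.IndexDependencyBuildItem;

// Ask Quarkus to index an external dependency so its classes show up in the
// CombinedIndexBuildItem. The Maven coordinates below are placeholders.
@BuildStep
IndexDependencyBuildItem indexExternalLibrary() {
    return new IndexDependencyBuildItem("com.example", "some-library");
}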

How to statically weave JPA entities using EclipseLink when there is no persistence.xml as the entities are managed by Spring

I've got a project that is Spring based, so the entity manager is set up programmatically, with no need for persistence.xml files to list all the entities.
I'm currently using load-time weaving, but am trying to get static weaving working using EclipseLink and Gradle. I want to replicate what is performed by the eclipselink maven plugin:
https://github.com/ethlo/eclipselink-maven-plugin
I have the following gradle set up (note that it's Kotlin DSL not groovy):
task<JavaExec>("performJPAWeaving") {
    val compileJava: JavaCompile = tasks.getByName("compileJava") as JavaCompile
    dependsOn(compileJava)

    val destinationDir = compileJava.destinationDir
    println("Statically weaving classes in $destinationDir")

    inputs.dir(destinationDir)
    outputs.dir(destinationDir)

    main = "org.eclipse.persistence.tools.weaving.jpa.StaticWeave"
    args = listOf("-persistenceinfo", "src/main/resources", destinationDir.getAbsolutePath(), destinationDir.getAbsolutePath())
    classpath = configurations.getByName("compile")
}
When I try to run the task, the weaving task fails as it's looking for a non-existent persistence.xml.
Is there any way to statically weave JPA entities in a Spring-based JPA project?
Exception Description: An exception was thrown while processing persistence.xml from URL: file:/home/blabla/trunk/my-module/src/main/resources/
Internal Exception: java.net.MalformedURLException
at org.eclipse.persistence.exceptions.PersistenceUnitLoadingException.exceptionProcessingPersistenceXML(PersistenceUnitLoadingException.java:117)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:579)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceArchive(PersistenceUnitProcessor.java:536)
... 6 more
Caused by: java.net.MalformedURLException
at java.net.URL.<init>(URL.java:627)
at java.net.URL.<init>(URL.java:490)
at java.net.URL.<init>(URL.java:439)
at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:620)
at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:148)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:806)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:577)
... 7 more
Caused by: java.lang.NullPointerException
at java.net.URL.<init>(URL.java:532)
... 17 more
According to the org.eclipse.persistence.tools.weaving.jpa.StaticWeave documentation, it requires persistence.xml to be in place in order to perform the static weaving.
Usage:
  StaticWeave [options] source target
Options:
  -classpath
    Set the user class path; use ";" as the delimiter on Windows and ":" on Unix.
  -log
    The path of the log file; standard output is the default.
  -loglevel
    Specify a literal value for the EclipseLink log level (OFF, SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST). The default value is OFF.
  -persistenceinfo
    The path containing META-INF/persistence.xml. This is ONLY required when the source does not include it. The classpath must contain all the classes necessary in order to perform weaving.
I ran a Maven build using the eclipselink maven plugin, and it works without persistence.xml, as you mentioned, because it generates the persistence.xml before invoking StaticWeave when one is not located on the classpath, using this method:
private void processPersistenceXml(ClassLoader classLoader, Set<String> entityClasses)
{
    final File targetFile = new File(this.persistenceInfoLocation + "/META-INF/persistence.xml");
    getLog().info("persistence.xml location: " + targetFile);
    final String name = project.getArtifactId();
    final Document doc = targetFile.exists() ? PersistenceXmlHelper.parseXml(targetFile) : PersistenceXmlHelper.createXml(name);
    checkExisting(targetFile, classLoader, doc, entityClasses);
    PersistenceXmlHelper.appendClasses(doc, entityClasses);
    PersistenceXmlHelper.outputXml(doc, targetFile);
}
The complete source code is here
I believe you could follow the same approach in your gradle build.
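As a rough illustration of that approach outside the plugin (a sketch using only the JDK; the entity class names and output directory are placeholders), you could generate a minimal persistence.xml yourself before invoking StaticWeave:

// Sketch: write a minimal persistence.xml listing the entity classes, so StaticWeave finds
// one under <persistenceinfo>/META-INF. Class names and the output path are placeholders.
// Needs java.nio.file.*, java.util.* and java.nio.charset.StandardCharsets imports; throws IOException.
Path metaInf = Paths.get("build/persistence-info/META-INF");
Files.createDirectories(metaInf);
List<String> entityClasses = Arrays.asList("com.example.Customer", "com.example.Order");
StringBuilder xml = new StringBuilder()
        .append("<persistence xmlns=\"http://xmlns.jcp.org/xml/ns/persistence\" version=\"2.1\">\n")
        .append("  <persistence-unit name=\"static-weaving\">\n");
for (String entityClass : entityClasses) {
    xml.append("    <class>").append(entityClass).append("</class>\n");
}
xml.append("  </persistence-unit>\n</persistence>\n");
Files.write(metaInf.resolve("persistence.xml"), xml.toString().getBytes(StandardCharsets.UTF_8));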
Kind of late to the party, but this is definitely possible with Gradle.
There are three steps needed to make this work:
Copy the persistence.xml file into the source folder next to the classes
Do the weaving
Remove the persistence.xml file from the classes source folder to avoid duplicate persistence.xml conflicts on the classpath
Also, it's very important to hook the weaving process into the compileJava task's last step in order to not break Gradle's up-to-date check, otherwise Gradle will just recompile everything all the time which can be quite inconvenient when developing.
For a more detailed explanation, check out my article on it: EclipseLink static weaving with Gradle.
I admit I do not completely understand what you mean by weaving. My answer might help if you need to dynamically create PersistenceUnits which provide JPA EntityManagers, and if these units should be able to create a DB schema (for example in H2) and manage entities based dynamically on the classes you provide at runtime.
The code example I mention later does not work with JPA in Spring but in Weld. I think the answer to your question is related to how EntityManagers are created and what classes the PersistenceUnit that creates the EntityManager manages. There is no difference between the two. Instead of using the EntityManagerFactory as a CDI producer, you might autowire it or register it using an old-fashioned application context. Therefore I think the answer to your question lies in the following official sources:
PersistenceProviderResolverHolder and
PersistenceProvider#createEntityManagerFactory(getPersistenceUnitName(), properties)
properties is the replacement for persistence.xml; an SEPersistenceUnitInfo object can be registered in it.
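A rough sketch of that mechanism (hedged: the provider lookup and the property name are illustrative, and you still have to supply your own PersistenceUnitInfo implementation whose getManagedClassNames() returns the entity classes you discover at runtime):

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.spi.PersistenceProvider;
import javax.persistence.spi.PersistenceProviderResolverHolder;
import javax.persistence.spi.PersistenceUnitInfo;

// Look up whatever JPA provider is on the classpath (EclipseLink, Hibernate, ...).
List<PersistenceProvider> providers = PersistenceProviderResolverHolder
        .getPersistenceProviderResolver()
        .getPersistenceProviders();
PersistenceProvider provider = providers.get(0);

// createPersistenceUnitInfo() stands for your own PersistenceUnitInfo implementation (not shown here).
PersistenceUnitInfo persistenceUnitInfo = createPersistenceUnitInfo();

Map<String, Object> properties = new HashMap<>();
// EclipseLink-specific example property; adjust or omit for other providers.
properties.put("eclipselink.ddl-generation", "create-tables");

// Builds the factory without any persistence.xml on the classpath.
EntityManagerFactory emf = provider.createContainerEntityManagerFactory(persistenceUnitInfo, properties);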
To start, look at: PersistenceProviderResolverHolder
Later: PersistenceProvider
or you can try to understand how my code (see below) is doing that. But I have to admit, I am not very proud of this part of that software, sorry.
I use those classes and objects to create a module that simulates a server-deployed JPA WAR file.
To do that, it scans some classes and identifies entities.
Later, in the test code, a so-called PersistenceFactory creates EntityManagers and Datasources. If EclipseLink is used, this factory weaves those classes together; you need no persistence.xml. The way it works there might help answer your question.
If you look at:
ioc-unit-ejb:TestPersistencefactory
search for the creation of SEPersistenceUnitInfo. That interface is fed a list of classes, which it returns as
@Override
public List<String> getManagedClassNames() {
    return TestPersistenceFactory.this.getManagedClassNames();
}
This object is used to create a persistence factory with the help of a PersistenceProvider, which can be discovered as soon as EclipseLink is available on the classpath.
The code is not easy to understand because it allows either Hibernate or EclipseLink to be used for JPA, depending on which jars are available on the classpath.

maven mojo for reading app classes and generating java

I want to write a maven plugin which will explore the classpath of my application at build time, search for classes with a certain annotation, and generate some java code adding utilities for these classes, which should get compiled in the JAR of the application.
So I wrote a mojo, inheriting from AbstractMojo, and getting the project through:
@Parameter(defaultValue = "${project}", readonly = true, required = true)
private MavenProject project;
I have most of the code, and my mojo does get executed, but I'm having trouble inserting my mojo at the right build phase.
If I plug it in like this:
@Mojo(name = "generate", defaultPhase = LifecyclePhase.GENERATE_SOURCES,
        requiresDependencyResolution = ResolutionScope.COMPILE)
then the Java code which I generate is compiled into the JAR file.
Note that I use project.addCompileSourceRoot to register the output folder.
But that isn't enough for me because it's too early in the build: I cannot read the classpath and find the classes from my project. I think they're not compiled yet.
I search for classes like so:
final List<URL> urls = List.ofAll(project.getCompileClasspathElements())
        .map(element -> Try.of(() -> new File(element).toURI().toURL()).get());
final URLClassLoader classLoader = new URLClassLoader(urls.toJavaList().toArray(new URL[0]),
        Thread.currentThread().getContextClassLoader());
final Set<Class<?>> entities = HashSet.ofAll(new Reflections(classLoader).getTypesAnnotatedWith(MyAnnotation.class));
(I'm using vavr but you get the gist in any case)
So, by plugging my code in at the GENERATE_SOURCES phase, this code doesn't work and I don't find any classes.
However, if I plug my mojo in at the PROCESS_CLASSES phase:
@Mojo(name = "generate", defaultPhase = LifecyclePhase.PROCESS_CLASSES,
        requiresDependencyResolution = ResolutionScope.COMPILE)
then my classes are found and I can access the rest of the code from the application, but the code that I generate is not taken into account in the build, despite using addCompileSourceRoot.
How do I get both features working at the same time: ability to explore code from the rest of the application and ability to generate code which will be compiled with the rest of the JAR?
I guess a possible answer would be "you can't", but as far as I can tell, Querydsl and Immutables are doing it (I tried reading their source but couldn't find the relevant code).
So @khmarbaise was right: what I wanted was not a Maven mojo, but rather a Java annotation processor.
I found that this walkthrough was very helpful in creating one, and also this stackoverflow answer came in handy.
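For reference, a minimal sketch of such an annotation processor (the annotation com.example.MyAnnotation and the generated class names are placeholders; real processors like the ones in Querydsl or Immutables are considerably more involved):

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

// Runs inside javac, so annotated classes are visible and the generated sources are
// compiled into the same JAR automatically.
@SupportedAnnotationTypes("com.example.MyAnnotation")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class UtilityGeneratingProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                String name = element.getSimpleName() + "Utilities"; // generated name is illustrative
                try (Writer writer = processingEnv.getFiler()
                        .createSourceFile("com.example.generated." + name)
                        .openWriter()) {
                    writer.write("package com.example.generated;\n\npublic class " + name + " {\n}\n");
                } catch (IOException e) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.getMessage(), element);
                }
            }
        }
        return true;
    }
}

Note that the processor still needs to be registered with the compiler, e.g. via a META-INF/services/javax.annotation.processing.Processor file, so that it is picked up during compilation.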

sonarqube + lombok = false positives

import lombok.Data;

@Data
public class Filter {
    private Operator operator;
    private Object value;
    private String property;
    private PropertyType propertyType;
}
For the code above there are 4 squid:S1068 reports about unused private fields (even though they are used by the Lombok-generated getters). I've seen that some fixes related to support of the lombok.Data annotation have been pushed, but I'm still getting these annoying false positives.
Versions:
SonarQube 6.4.0.25310
SonarJava 4.13.0.11627
SonarQube scanner for Jenkins (2.6.1)
This case should be perfectly handled by SonarJava. Lombok annotations are taken into account at least since version 3.14 (SONARJAVA-1642). The issues you are getting result from a misconfiguration of your Java project. There is no need to write any custom rules to handle this; it is natively supported by the analyzer.
SonarJava reads bytecode to know which annotations are used. Consequently, if you are not providing the bytecode from your dependencies, in addition to the bytecode from your own code, the analyzer will behave erratically.
In particular, setting the property sonar.java.libraries should solve your issue. Note that this property is normally set automatically when using the SonarQube Maven or Gradle scanners.
Please have a look at the documentation in order to correctly configure your project: https://docs.sonarqube.org/display/PLUG/Java+Plugin+and+Bytecode
I added the following property to the Sonar analysis properties file, and it works for me.
sonar.java.libraries=${env.HOME}/.m2/repository/org/projectlombok/lombok/**/*.jar
Lombok v1.16.20 is the Lombok version in my project.
I'm using sonar-maven-plugin 3.4.0.905, lombok 1.16.18, with SonarQube CE Server v8.3.1.
I resolved the issue by adding
<sonar.java.libraries>target/classes</sonar.java.libraries> to the POM properties.
The answers suggested by Wohops and Barış Özdemir worked for me. I'm posting this answer because in my scenario it took some time to figure out how to implement it: my CI builds run in Travis, and we don't know the path where the lombok-x.x.x.jar file will be downloaded, because there is not much control over the Travis environment where the build runs.
I used my build tool (Gradle) to implement it. The following configuration in build.gradle ensures that, as part of building the project, all the jar dependencies get copied to ${buildDir}/output/libs:
task copyToLib(type: Copy) {
    into "${buildDir}/output/libs"
    from configurations.runtime
}

build.dependsOn(copyToLib)
Then, as mentioned in the previous answers, I pointed the property in the sonar-project.properties file at this libs directory.
sonar.java.libraries=/home/travis/build/xxxxxx/build/output/libs/lombok-1.16.20.jar
Hope this helps.
Cheers.
You can configure the ignore issue rules:
sonar.issue.ignore.multicriteria=e1
sonar.issue.ignore.multicriteria.e1.ruleKey=java:S1068

Is it possible to access Maven project information from a custom plugin?

I'm writing a custom plugin in Maven and would like to access information about the project. As a simple example, within my Java code, I'd like to get the project's build directory. I know that I can get it using a parameter annotation like so:
@Mojo(name = "myplugin")
public class MyPluginMojo extends AbstractMojo {

    // DOES work. The project.build.directory prop is resolved.
    @Parameter(property = "myplugin.buildDir", defaultValue = "${project.build.directory}", required = true)
    private File buildDir;

    public void execute() throws MojoExecutionException
    {
        System.out.println(buildDir.getPath());

        // DOES NOT work, prints the literal string.
        System.out.println("${project.build.directory}");
    }
}
That feels like a hack. To start with, I have no need to expose this parameter to the pom.xml. I'm only doing it this way because within the annotation, the property gets resolved.
I'd also like access to other properties, namely the project's dependencies.
I've been googling for hours without luck. The closest thing I've found is the MavenProject plugin, but I can't get it to work either, and it hasn't been updated since 2009 from the looks of it.
Gradle provides the "project" variable for this when writing plugins. Does Maven simply not allow this?
--- Update ---
Thanks to Robert's link to the documentation, I got this working. One thing that surprised me was that project.build.directory is not available through the injected project; according to the docs, you inject that separately. Here's what I added to my class to get both the project object and the build directory:
@Parameter(defaultValue = "${project}", readonly = true, required = true)
MavenProject project;

@Parameter(defaultValue = "${project.build.directory}", readonly = true, required = true)
private File target;
And a dependency in my pom:
<dependency>
    <groupId>org.apache.maven</groupId>
    <artifactId>maven-project</artifactId>
    <version>3.0-alpha-2</version>
</dependency>
You're on the right path; it's not a hack. But if you don't want to expose it as a parameter, then you should also add readonly=true. Maven also has the project variable; see http://maven.apache.org/plugin-tools/maven-plugin-tools-annotations/ for all the common objects you can use within your plugin.
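As a hedged illustration of what the injected project object exposes (inside a Mojo like the ones above; the logging is just for demonstration), the build directory is also reachable through project.getBuild().getDirectory(), and the declared dependencies through project.getDependencies():

@Parameter(defaultValue = "${project}", readonly = true, required = true)
private MavenProject project;

public void execute() throws MojoExecutionException {
    // Same value as ${project.build.directory}, read through the project object.
    getLog().info("Build directory: " + project.getBuild().getDirectory());

    // Dependencies declared in the POM (org.apache.maven.model.Dependency).
    for (Dependency dependency : project.getDependencies()) {
        getLog().info("Dependency: " + dependency.getGroupId() + ":"
                + dependency.getArtifactId() + ":" + dependency.getVersion());
    }
}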
The only way to get the current Maven project inside a Maven plugin is to inject it. See this question for more info.
See also the Mojo Cookbook, which describes how to inject the current Maven project.
