How to automatically register all available interface implementations in Quarkus?

I'm trying to adapt a library to be usable in Quarkus native mode. Since it's reflection-heavy, I need to manually register all implementations of certain interfaces.
Here is what I've done so far, which seems to work fine for user code:
private static void registerAllImplementations(CombinedIndexBuildItem combinedIndexBuildItem,
        BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchyClass,
        Class<?>... classNames) {
    for (Class<?> klass : classNames) {
        // Look up every known implementor of the given interface in the Jandex
        // index and register its whole hierarchy for reflection.
        combinedIndexBuildItem.getIndex().getAllKnownImplementors(DotName.createSimple(klass.getName())).stream()
                .map(ci -> new ReflectiveHierarchyBuildItem(Type.create(ci.name(), Type.Kind.CLASS)))
                .forEach(reflectiveHierarchyClass::produce);
    }
}
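For context, here is a hedged sketch of how such a helper might be invoked from an extension build step; MyService stands in for one of the library's interfaces and is hypothetical:

@BuildStep
void registerImplementations(CombinedIndexBuildItem combinedIndex,
        BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchy) {
    registerAllImplementations(combinedIndex, reflectiveHierarchy, MyService.class);
}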
However, the following line doesn't pick up implementors that come from external jars:
combinedIndexBuildItem.getIndex().getAllKnownImplementors(...)
It's not a tragedy, but it would be much more future-proof if one did not need to pay attention to the internals of every external jar and make sure that all relevant implementations get registered manually.
Do you have any clues?

Behind the scenes, Quarkus uses Jandex to index your sources. It is Jandex that provides the CombinedIndexBuildItem, so you need all the external jars to be indexed by Jandex.
For this, you can add the Jandex Maven plugin to those external JARs, or add a configuration entry for each JAR:
quarkus.index-dependency.<name>.group-id=
quarkus.index-dependency.<name>.artifact-id=
More information here: https://quarkus.io/guides/cdi-reference#how-to-generate-a-jandex-index
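If you are writing your own extension, there is also a programmatic alternative to the configuration above: a build step can produce an IndexDependencyBuildItem so that Quarkus indexes the external dependency itself. A minimal sketch, with placeholder group and artifact IDs:

import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.builditem.IndexDependencyBuildItem;

public class MyExtensionProcessor {

    // Ask Quarkus to run Jandex over org.acme:acme-impls so that its classes
    // show up in the CombinedIndexBuildItem used above.
    @BuildStep
    IndexDependencyBuildItem indexExternalJar() {
        return new IndexDependencyBuildItem("org.acme", "acme-impls");
    }
}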

Related

How to statically weave JPA entities using EclipseLink when there is no persistence.xml as the entities are managed by Spring

I've got a project that is Spring based, so the entity manager is set up programmatically, with no need for persistence.xml files to list all the entities.
I'm currently using load-time weaving, but am trying to get static weaving working using EclipseLink and Gradle. I want to replicate what is performed by the eclipselink Maven plugin:
https://github.com/ethlo/eclipselink-maven-plugin
I have the following Gradle setup (note that it's Kotlin DSL, not Groovy):
task<JavaExec>("performJPAWeaving") {
    val compileJava: JavaCompile = tasks.getByName("compileJava") as JavaCompile
    dependsOn(compileJava)
    val destinationDir = compileJava.destinationDir
    println("Statically weaving classes in $destinationDir")
    inputs.dir(destinationDir)
    outputs.dir(destinationDir)
    main = "org.eclipse.persistence.tools.weaving.jpa.StaticWeave"
    args = listOf("-persistenceinfo", "src/main/resources", destinationDir.getAbsolutePath(), destinationDir.getAbsolutePath())
    classpath = configurations.getByName("compile")
}
When I try to run the task, the weaving task fails because it's looking for a non-existent persistence.xml.
Is there any way to statically weave JPA entities in a Spring-based JPA project?
Exception Description: An exception was thrown while processing persistence.xml from URL: file:/home/blabla/trunk/my-module/src/main/resources/
Internal Exception: java.net.MalformedURLException
at org.eclipse.persistence.exceptions.PersistenceUnitLoadingException.exceptionProcessingPersistenceXML(PersistenceUnitLoadingException.java:117)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:579)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceArchive(PersistenceUnitProcessor.java:536)
... 6 more
Caused by: java.net.MalformedURLException
at java.net.URL.<init>(URL.java:627)
at java.net.URL.<init>(URL.java:490)
at java.net.URL.<init>(URL.java:439)
at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:620)
at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:148)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:806)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:577)
... 7 more
Caused by: java.lang.NullPointerException
at java.net.URL.<init>(URL.java:532)
... 17 more
According to the org.eclipse.persistence.tools.weaving.jpa.StaticWeave documentation, it requires a persistence.xml to be in place in order to perform the static weaving:
Usage:
  StaticWeave [options] source target
Options:
  -classpath
    Set the user class path; use ";" as the delimiter on Windows and ":" on Unix.
  -log
    The path of the log file; standard output is the default.
  -loglevel
    Specify a literal value for the EclipseLink log level (OFF, SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST). The default value is OFF.
  -persistenceinfo
    The path containing META-INF/persistence.xml. This is ONLY required when the source does not include it. The classpath must contain all the classes necessary in order to perform weaving.
I ran a Maven build using the eclipselink Maven plugin, and it works without a persistence.xml as you mentioned, because when persistence.xml is not located on the classpath, the plugin generates it before invoking StaticWeave, using this method:
private void processPersistenceXml(ClassLoader classLoader, Set<String> entityClasses)
{
    final File targetFile = new File(this.persistenceInfoLocation + "/META-INF/persistence.xml");
    getLog().info("persistence.xml location: " + targetFile);

    final String name = project.getArtifactId();
    final Document doc = targetFile.exists() ? PersistenceXmlHelper.parseXml(targetFile) : PersistenceXmlHelper.createXml(name);

    checkExisting(targetFile, classLoader, doc, entityClasses);

    PersistenceXmlHelper.appendClasses(doc, entityClasses);
    PersistenceXmlHelper.outputXml(doc, targetFile);
}
The complete source code is here
I believe you could follow the same approach in your Gradle build.
Kinda late to the party but this is definitely possible with Gradle.
There are three steps needed to make this work:
1. Copy the persistence.xml file into the folder containing the compiled classes.
2. Do the weaving.
3. Remove the persistence.xml file from the classes folder again, to avoid duplicate persistence.xml conflicts on the classpath.
Also, it's very important to hook the weaving process into the compileJava task as its last step, so as not to break Gradle's up-to-date check; otherwise Gradle will just recompile everything all the time, which can be quite inconvenient when developing.
For a more detailed explanation, check out my article on it: EclipseLink static weaving with Gradle.
I admit I do not completely understand what you mean by weaving. My answer might help if you need to dynamically create persistence units which provide JPA EntityManagers, and if these units should be able to create a DB schema (for example, in H2) and manage entities based on the classes you provide at runtime.
The code example I mention later does not work with JPA in Spring but in Weld. I think the answer to your question is related to how EntityManagers are created and which classes the persistence unit that creates the EntityManager manages. There is no difference between the two. Instead of using the EntityManagerFactory as a CDI producer, you might autowire it or register it using an old-fashioned application context. Therefore, I think the answer to your question lies in the following official sources:
PersistenceProviderResolverHolder and
PersistenceProvider#createEntityManagerFactory(getPersistenceUnitName(), properties)
The properties map is the replacement for persistence.xml; an SEPersistenceUnitInfo object can be registered in it.
To start look at: PersistenceProviderResolverHolder
Later: PersistenceProvider
or you can try to understand how my code (see below) does that. But I have to admit, I am not very proud of this part of that software, sorry.
I use those classes and objects to create a module that enables the simulation of a server-deployed JPA WAR file.
To do that, it scans some classes and identifies entities.
Later, in the test code, a so-called PersistenceFactory creates EntityManagers and datasources. If EclipseLink is used, this factory weaves those classes together. You need no persistence.xml. The way the weaving works there might help to answer your question.
If you look at:
ioc-unit-ejb:TestPersistencefactory
search for the creation of SEPersistenceUnitInfo. That interface is fed a list of classes, which it returns as:
@Override
public List<String> getManagedClassNames() {
    return TestPersistenceFactory.this.getManagedClassNames();
}
This object is used to create an EntityManagerFactory with the help of a PersistenceProvider, which can be discovered as soon as EclipseLink is available on the classpath.
The code is not easy to understand because it allows either Hibernate or EclipseLink to be used for JPA, depending on the availability of the jars on the classpath.
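To make the approach above concrete, here is a minimal, self-contained sketch of bootstrapping an EntityManagerFactory without a persistence.xml by implementing PersistenceUnitInfo by hand. The unit name, the H2 settings, and the EclipseLink provider class are assumptions for illustration:

import java.net.URL;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import javax.persistence.EntityManagerFactory;
import javax.persistence.SharedCacheMode;
import javax.persistence.ValidationMode;
import javax.persistence.spi.ClassTransformer;
import javax.persistence.spi.PersistenceProvider;
import javax.persistence.spi.PersistenceProviderResolverHolder;
import javax.persistence.spi.PersistenceUnitInfo;
import javax.persistence.spi.PersistenceUnitTransactionType;
import javax.sql.DataSource;

public class ProgrammaticPersistenceUnit {

    public static EntityManagerFactory createEmf(List<String> entityClassNames) {
        PersistenceUnitInfo unitInfo = new PersistenceUnitInfo() {
            @Override public String getPersistenceUnitName() { return "dynamic-unit"; }
            @Override public String getPersistenceProviderClassName() {
                return "org.eclipse.persistence.jpa.PersistenceProvider";
            }
            @Override public PersistenceUnitTransactionType getTransactionType() {
                return PersistenceUnitTransactionType.RESOURCE_LOCAL;
            }
            @Override public DataSource getJtaDataSource() { return null; }
            @Override public DataSource getNonJtaDataSource() { return null; }
            @Override public List<String> getMappingFileNames() { return Collections.emptyList(); }
            @Override public List<URL> getJarFileUrls() { return Collections.emptyList(); }
            @Override public URL getPersistenceUnitRootUrl() { return null; }
            // The heart of the approach: the entity classes are supplied at runtime.
            @Override public List<String> getManagedClassNames() { return entityClassNames; }
            @Override public boolean excludeUnlistedClasses() { return true; }
            @Override public SharedCacheMode getSharedCacheMode() { return SharedCacheMode.UNSPECIFIED; }
            @Override public ValidationMode getValidationMode() { return ValidationMode.NONE; }
            @Override public Properties getProperties() { return new Properties(); }
            @Override public String getPersistenceXMLSchemaVersion() { return "2.1"; }
            @Override public ClassLoader getClassLoader() {
                return Thread.currentThread().getContextClassLoader();
            }
            @Override public void addTransformer(ClassTransformer transformer) { /* not needed for this sketch */ }
            @Override public ClassLoader getNewTempClassLoader() { return getClassLoader(); }
        };

        // These properties replace persistence.xml; the H2 settings are assumptions.
        Map<String, Object> props = new HashMap<>();
        props.put("javax.persistence.jdbc.driver", "org.h2.Driver");
        props.put("javax.persistence.jdbc.url", "jdbc:h2:mem:test");
        props.put("javax.persistence.schema-generation.database.action", "create");

        // Pick the first provider found on the classpath (e.g. EclipseLink).
        PersistenceProvider provider = PersistenceProviderResolverHolder
                .getPersistenceProviderResolver().getPersistenceProviders().get(0);
        return provider.createContainerEntityManagerFactory(unitInfo, props);
    }
}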

Deploying BEAN in OSGi plugin

I am currently deploying my custom controls as OSGi plugins, and I wanted to do the same thing with my beans. I have tried putting them into the OSGi plugin and it works fine; the only problem I have is the faces-config.
It seems it has to be called faces-config in the OSGi plugin to work, but that means I can't use beans in the NSF anymore, because the local faces-config seems to be ignored.
Is there a way to change the name of the faces-config in the OSGi plugin?
Something like FEATURE-faces-config.xml?
In the class in your plugin that extends AbstractXspLibrary, you can override "getFacesConfigFiles", which should return an array of strings representing paths within the plugin to additional files of any name to load as faces-config additions. For example:
@Override
public String[] getFacesConfigFiles() {
    return new String[] {
        "com/example/config/beans.xml"
    };
}
Then you can put the config file in that path within your Java source folder (or another folder that is included in build.properties) and it will be loaded in addition to your app's normal faces-config, beans and all.
The NSFs are running as separate, distinct Java applications. The OSGi plugin is running in the OSGi layer, above all those distinct Java applications, as a single code base. Consequently, the faces-config is only at that level.
It's possible to load them dynamically, by using an ImplicitObjectFactory, loaded from an XspContributor. That's what is done in OpenNTF Domino API for e.g. userScope (which is a bean stored in applicationScope of an NSF). See org.openntf.domino.xsp.helpers.OpenntfDominoImplicitObjectFactory, which is referenced in OpenntfDominoXspContributor, loaded via the extension point of type "com.ibm.xsp.library.Contributor".
A few caveats:
You have no control over what happens if you try to register your bean with a name the developer also uses for a different variable in that scope.
Unless you add code to check if the library is enabled, as we do, you'll be adding the bean to every database on the server.
You still need to add the library to the NSF. Unless you also provide a component that those databases will all use, there's no way you can programmatically add it, as far as I know.
It might be easier to skip the bean approach and just add an instance of the Java class in beforePageLoad, page controller class, or however you're managing the backing to the relevant XPage (if viewScope) or application (if sessionScope / applicationScope).

MavenProject: Get the available classes for use on my plugin

I'm loading a Maven project as described here. I'm trying to figure out how I can retrieve the source roots, so I can work out which Java classes I have and my Mojo can use them.
I tried a couple of the methods in there, like getResources or getScriptSources, without luck. Any ideas?
Thanks in advance!
Edit:
I was asked to elaborate a little bit in what I'm attempting to do, so here it is:
The plugin I'm developing will take the sources in the project and create test cases from them. Unless configured otherwise, I want to generate tests for all the classes, and for that I need to somehow figure out where my sources are so I can set things up properly.
Hope that helps.
Here's the repository. I planned on publishing it later but I provided source as requested.
Have you read the plugin developer documentation?
That page links to the Plugins Cookbook, which links to the Mojo Developer Cookbook, which has "The maven project, or the effective pom" and gives you access to the org.apache.maven.project.MavenProject object via:
/** @parameter default-value="${project}" */
private org.apache.maven.project.MavenProject mavenProject;
Alternatively via Java 5 annotations
@Component
MavenProject project;
You can call getCompileSourceRoots() to get a list of the directories that will be used for compilation.
You will also need to do more reading about how to setup inclusion/exclusions. You can use other plugins as examples of how to do this, e.g. maven-compiler-plugin
If you want to use annotations, it is very important to make sure your pom is configured as per using annotations and that you use annotations at the class level as well. Mixing javadoc annotations might not work.
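Putting this together, here is a minimal sketch of such a Mojo that walks the compile source roots to find candidate classes (the goal name, class name, and logging are illustrative; assumes maven-plugin-api and maven-plugin-annotations as dependencies):

import java.io.File;
import java.util.List;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "generate-tests")
public class GenerateTestsMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    @Override
    public void execute() {
        // Each entry is a directory such as src/main/java.
        List<String> roots = project.getCompileSourceRoots();
        for (String root : roots) {
            collectSources(new File(root));
        }
    }

    private void collectSources(File dir) {
        File[] children = dir.listFiles();
        if (children == null) {
            return;
        }
        for (File child : children) {
            if (child.isDirectory()) {
                collectSources(child);
            } else if (child.getName().endsWith(".java")) {
                getLog().info("Found source: " + child);
            }
        }
    }
}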
I think the simplest solution would be to define a mojo parameter:
/**
 * @parameter default-value="${project.build.sourceDirectory}"
 * @required
 */
private File sourceDirectory;
or with the new annotation-based definition:
@Parameter(required = true, defaultValue = "${project.build.sourceDirectory}")
private File sourceDirectory;
which should give you the result you want.

Spring namespaces and placeholder in springs.schemas

Spring namespaces allow you to define your own structure for how Spring beans can be configured. Very cool.
I have to use a 3rd-party product (Assentis Docbase) which defines the following in its spring.schemas (example simplified):
http\://com.apress.prospring2/ch07/custom.xsd=custDir:/custom.xsd
Meaning: if a user references the schema location "http://com.apress.prospring2/ch07/custom.xsd" in their Spring XML, Spring will validate that file against custom.xsd.
custDir is a directory OUTSIDE the provided jar. Does anyone have an idea how I can make custDir point to a valid path during a JUnit test? I already tried -DcustDir=/pathToXsd/ but it did not work.
If I remove custDir then everything works as expected, but I cannot remove it from the provided spring.schemas, since it is 3rd-party software.
Maybe this is an issue with how property files are handled in Java, but I have no idea.
You may be able to "override" this entry by providing your own custom spring.schemas file with the same entry, but with a location pointing to your custom xsd file. The catch is that this is highly dependent on the order in which the spring.schemas files are loaded, but it could be worth a try.
Since custDir is not a placeholder, you cannot replace it the way you are doing it. I am surprised that the third-party schema location is outside of the classpath.
The spring.schemas syntax I showed in my question is a proprietary definition of the 3rd-party software. They implemented their own EntityResolver, which manually reacts to "custDir:" and starts some magic algorithm. So I came to the following workaround.
You have to create your own my_spring.schemas, which must live in META-INF/. Then you have to make sure that Spring loads my_spring.schemas and NOT spring.schemas.
I achieved this by implementing my own TestingContext, which is a subclass of ClassPathXmlApplicationContext. In TestingContext I overrode the method protected void loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException and filled it with the implementation from org.springframework.context.support.AbstractXmlApplicationContext. The only change I made was to replace the line beanDefinitionReader.setEntityResolver(new ResourceEntityResolver(this)) with beanDefinitionReader.setEntityResolver(new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas")). And voilà: if I use TestingContext, my own my_spring.schemas is loaded.
The drawback of this solution is that you have to provide all XSDs in your jar, because the default name where Spring looks up the definitions has been changed.
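For illustration, here is a sketch of such a TestingContext, assuming Spring 3.1+ and that swapping the EntityResolver, as described above, is the only change needed:

import java.io.IOException;

import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.PluggableSchemaResolver;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class TestingContext extends ClassPathXmlApplicationContext {

    public TestingContext(String... configLocations) {
        super(configLocations);
    }

    @Override
    protected void loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException {
        // Mirrors AbstractXmlApplicationContext#loadBeanDefinitions, except that
        // the entity resolver reads META-INF/my_spring.schemas instead of the
        // default META-INF/spring.schemas.
        XmlBeanDefinitionReader beanDefinitionReader = new XmlBeanDefinitionReader(beanFactory);
        beanDefinitionReader.setEnvironment(getEnvironment());
        beanDefinitionReader.setResourceLoader(this);
        beanDefinitionReader.setEntityResolver(
                new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas"));
        loadBeanDefinitions(beanDefinitionReader);
    }
}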

maven: Running the same tests for different configurations

In my Spring + Maven app, I have created some tests for the data access layer that I would now like to run against multiple datasources. I have something like:
@ContextConfiguration(locations={"file:src/test/resources/testAppConfigMysql.xml"})
public class TestFooDao extends AbstractTransactionalJUnit38SpringContextTests {

    public void testFoo(){
        ...
    }
}
Currently it has the config location hardcoded, so it can only be used against one datasource.
What is the best way to invoke the test twice, passing two different configs (say testAppConfigMysql.xml and testMyConfigHsqlDb.xml)?
I've seen suggestions to do this via system properties. How can I tell Maven to invoke the tests twice, with different values of a system property?
I don't know whether there is a fancy solution for this that is simple as well. I would just put all the testing stuff in a base class and then inherit it in two classes with different annotation-based configurations, like this:
@ContextConfiguration(locations={"firstDs.xml"})
public class TestFooDaoUsingFirstDs extends TestFooDao {
}

@ContextConfiguration(locations={"secondDs.xml"})
public class TestFooDaoUsingSecondDs extends TestFooDao {
}
Unless you have to handle a really high number of different datasources this way, that is fine for me.
Rather than file:..., you can use classpath:... (remove the src/test/resources; it's implicit if you use classpath). Then you can have a single master context with the line:
<import resource="dao-${datasource}.xml" />
If you run the Maven build with the option -Ddatasource=foo, it will replace the ${datasource} in the master context with whatever you specify. So you can have dao-foo.xml, dao-bar.xml, etc. for your different configurations.
(You need to enable Maven resource filtering in the POM for this to work).
Alternatively, check out the new stuff in Spring 3.1: http://www.baeldung.com/2012/03/12/project-configuration-with-spring/
Edit: a third option would be to have all the test classes extend some superclass and use JUnit's @Parameterized, where the parameters are the different Spring contexts. You couldn't use @ContextConfiguration in that case, but you can always create the Spring context manually and then autowire the test class using org.springframework.beans.factory.config.AutowireCapableBeanFactory.autowireBean().
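For illustration, a sketch of that parameterised approach (JUnit 4; the second context file name is an assumption, and the test autowires a plain DataSource rather than a specific DAO):

import java.util.Arrays;
import java.util.Collection;

import javax.sql.DataSource;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.support.ClassPathXmlApplicationContext;

@RunWith(Parameterized.class)
public class TestFooDao {

    // One run of the whole class per context file.
    @Parameters
    public static Collection<Object[]> contexts() {
        return Arrays.asList(new Object[][] {
                { "testAppConfigMysql.xml" },
                { "testAppConfigHsqlDb.xml" }
        });
    }

    private final String contextLocation;

    @Autowired
    private DataSource dataSource; // wired from whichever context is active

    public TestFooDao(String contextLocation) {
        this.contextLocation = contextLocation;
    }

    @Before
    public void wireFromContext() {
        // @ContextConfiguration cannot vary per parameter, so build the
        // context by hand and autowire this test instance from it.
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext(contextLocation);
        ctx.getAutowireCapableBeanFactory().autowireBean(this);
    }

    @Test
    public void testFoo() {
        // ... exercise the DAO against the datasource of the current context
    }
}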
Check out the Maven Invoker plugin. It also supports profiles.
