I am writing a Maven plugin that is supposed to generate JPA entities from a DSL. The entities are generated directly as bytecode, i.e. without intermediate Java source files (using ByteBuddy). The generation works fine, but the problem is that my entity isn't found by Hibernate ORM. The error occurs in org.hibernate.boot.registry.classloading.internal.ClassLoaderServiceImpl.classForName(ClassLoaderServiceImpl.java:242), and the exception is a ClassNotFoundException. Does the Hibernate instrumentation require the Java source of the entity? Or how can I let Hibernate know about my entity?
Further background: I use spring-data-jpa and configured the EntityScan with the correct basePackage. My Maven plugin runs in the compile phase (immediately after compilation of the source code).
Works as designed. The only things to take care of are that the package of the generated class matches the folder hierarchy under the output directory, and that these packages are added as basePackages to the EntityScan.
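For illustration, the Spring side could look like this (package names are placeholders; the generated classes must end up under matching folders in target/classes):

```java
// Sketch: com.example.generated is an assumed name for the package the
// Maven plugin writes the ByteBuddy-generated entities into.
@SpringBootApplication
@EntityScan(basePackages = {"com.example.model", "com.example.generated"})
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
```

With this in place, Hibernate finds the generated classes the same way it finds compiled source entities, because both are just .class files on the classpath.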
I have separated my project into multiple modules using Maven. One of the modules is used as a 'common' module which shares entities and other code between modules. I have imported this module into the other modules. A problem arises when launching a module which uses the 'common' module: it seems to scan for all entities in the common package and tries to validate the schema. This module does not have the SQL permissions to access some of those tables, which results in a validation error.
Is there a way to disable this feature and only validate the schema based on the actually used entities in the code (based on imports)?
The explanation is somewhat vague; I would like to see the actual error. But the first thing that comes to mind: exclude the JPA auto-configuration. For your application class:
@SpringBootApplication(exclude = {JpaRepositoriesAutoConfiguration.class})
Possibly you will also need to exclude DataSourceAutoConfiguration.
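Putting both exclusions together on the application class of the dependent module (the class name here is hypothetical):

```java
// Sketch: disable JPA repository and DataSource auto-configuration in the
// module that must not scan/validate the restricted common entities.
@SpringBootApplication(exclude = {
        JpaRepositoriesAutoConfiguration.class,
        DataSourceAutoConfiguration.class
})
public class ReportingModuleApplication {
    public static void main(String[] args) {
        SpringApplication.run(ReportingModuleApplication.class, args);
    }
}
```

Note that this switches off the auto-configured JPA setup entirely for that module; if the module still needs JPA for its own entities, you would instead configure the entity manager manually with a restricted packages-to-scan list.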
I have a Spring Boot application and want to use Liquibase to generate the changelogs for my JPA entities. However, I encounter different issues, depending on my approach.
My first approach is to use the diff goal of the Maven plugin. The url is my H2 development database with the H2 driver, and the reference URL is something like "hibernate:spring:myBasePackage.myEntityPackage?dialect=org.hibernate.dialect.H2Dialect" with the driver "liquibase.ext.hibernate.database.connection.HibernateDriver". In this case Liquibase seems to recognize my entities, but it only prints the differences to the console, and they do not have the form of a changelog file.
My second approach would be to use the generateChangeLog goal of the Maven plugin. In this case my url is "hibernate:spring:myBasePackage.myEntityPackage?dialect=org.hibernate.dialect.H2Dialect" with the driver "liquibase.ext.hibernate.database.connection.HibernateDriver". Here I get the error "Unable to resolve persistence unit root URL: class path resource [] cannot be resolved to URL because it does not exist". This error can be found in both the Spring and the Liquibase issue trackers, but it seems it has supposedly already been fixed.
My third approach is basically like the second, but here I use a "hibernate:classic" url with an implementation of CustomClassicConfigurationFactory, which registers my annotated classes explicitly. This does work. However, I have to do this in my application jar, and I have to add that jar as a dependency of the Maven plugin. Thus I have to build the application jar (and install it to the local Maven repository) before I can generate the changelogs. This seems cumbersome.
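For reference, the plugin wiring I am describing looks roughly like this (URLs, paths, and file names are placeholders; versions match the ones listed below). As far as I can tell from the plugin documentation, the diff goal only writes a changelog file when diffChangeLogFile is configured, otherwise it prints to the console:

```xml
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.5.3</version>
    <configuration>
        <url>jdbc:h2:file:./target/devdb</url>
        <driver>org.h2.Driver</driver>
        <referenceUrl>hibernate:spring:myBasePackage.myEntityPackage?dialect=org.hibernate.dialect.H2Dialect</referenceUrl>
        <referenceDriver>liquibase.ext.hibernate.database.connection.HibernateDriver</referenceDriver>
        <diffChangeLogFile>src/main/resources/db/changelog/diff-changelog.xml</diffChangeLogFile>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.liquibase.ext</groupId>
            <artifactId>liquibase-hibernate4</artifactId>
            <version>3.6</version>
        </dependency>
    </dependencies>
</plugin>
```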
My questions are:
Is there an easier way to generate the changelogs for JPA entities in a Spring boot based application?
Why are the first two approaches not working?
Is there a way to simplify the third approach?
I am using:
Spring Boot 1.5.4.RELEASE
Liquibase-Hibernate4 3.6
Liquibase 3.5.3
Many thanks in advance.
In the first approach, using liquibase:diff, the change sets for the entities (create-table change sets) will not be generated, since Liquibase does not treat a new JPA entity as a change.
In the second approach, generateChangeLog generates the changelog from the given database; it does not look at your JPA entities.
In order to generate the DDL scripts for your JPA entities, just add the following to your JPA properties:
<property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.scripts.create-target" value="./ddl/create.sql"/>
<property name="javax.persistence.schema-generation.scripts.drop-target" value="./ddl/drop.sql"/>
The above will generate the scripts in the ddl folder under the project root.
You can check the other properties here https://thoughts-on-java.org/standardized-schema-generation-data-loading-jpa-2-1/
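In a Spring Boot application, the same settings can be passed through application.properties by prefixing them with spring.jpa.properties (a sketch; the target paths are assumptions):

```properties
# Pass JPA 2.1 schema-generation properties through to the JPA provider
spring.jpa.properties.javax.persistence.schema-generation.scripts.action=drop-and-create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=./ddl/create.sql
spring.jpa.properties.javax.persistence.schema-generation.scripts.drop-target=./ddl/drop.sql
```

The scripts are then written on application startup, which avoids the separate Maven-plugin invocation entirely.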
I am using ActiveJDBC 1.4.9. I created a jar (using Maven) which has two ActiveJDBC model classes, and added the jar to the application. The application has three more model classes. When I compile and run the application (Gradle based), ActiveJDBC instruments only the 3 classes in the application, but not the classes in the jar. When I try to write data into the two models from the jar, it throws this exception:
org.javalite.activejdbc.DBException: Failed to retrieve metadata from DB. Are you sure table exists in DB ?
Now I have certain doubts. Please help me to resolve and understand a few things.
How does instrumentation happen?
When we create a jar, will it include instrumented classes?
Why is it throwing this error?
It throws this error when classes have not been instrumented. This means that before placing your model classes into a jar file, you need to instrument them; it does not matter which build tool you use. http://javalite.io/instrumentation explains what instrumentation is and how to do it. Instrumentation does not create jars, it merely adds some byte code to your classes. In all scenarios you need to:
Write code :)
Compile
Instrument
After this, you can do any of the following:
run app using class files in the file system
package class files into jar file and use that on your classpath
package jar file into a larger app (WAR, EAR, RAR, etc.) and deploy your app
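For the Maven-built jar, the compile-then-instrument sequence can be wired directly into the build with the ActiveJDBC instrumentation plugin, so the jar always contains instrumented model classes (version chosen to match the one in the question):

```xml
<plugin>
    <groupId>org.javalite</groupId>
    <artifactId>activejdbc-instrumentation</artifactId>
    <version>1.4.9</version>
    <executions>
        <execution>
            <!-- runs right after compilation, before the jar is packaged -->
            <phase>process-classes</phase>
            <goals>
                <goal>instrument</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```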
making sense?
I am trying to fix an issue in our old Spring application. Unfortunately the source was lost, and I am having difficulty decompiling a JPA entity. The class decompiles successfully, but only basic annotations such as @Entity and @Table are present. The relationship and column annotations between the entity classes (@OneToMany, @Column, etc.) are completely missing. I have tried the decompilers cfr, d4j, and Procyon, but without success.
Does anybody know whether this is possible and which decompiler can do it?
One I use is http://jd.benow.ca/
Maybe not all of the metadata is specified in annotations? If it is in XML mappings, then a decompiler clearly won't show it.
If decompilation gives you only partial results, it could be that some annotations do not have runtime retention. This should, however, not be the case for any Hibernate annotations.
One possibility would be to create a new Spring test app, include your jar, and launch a component scan for @Entity, @MappedSuperclass, and @Embeddable. You would then have to process the classes yourself using reflection, but at least you should be able to get everything out of them.
It would be some work, though. Make an open-source project out of it afterwards ;)
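The reflection part of that idea can be sketched without Spring at all. Here a hypothetical stand-in annotation plays the role of javax.persistence.Entity, and the candidate classes are supplied explicitly instead of being found by a classpath scan, so the sketch runs with only the JDK:

```java
import java.lang.annotation.Annotation;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for javax.persistence.Entity, kept local so the
// sketch needs no JPA jars on the classpath.
@Retention(RetentionPolicy.RUNTIME)
@interface Entity {}

@Entity
class Customer {}

class Order {} // not annotated, should be filtered out

class EntityFinder {

    // Filters the candidates down to those carrying the marker annotation,
    // mimicking what a component scan on @Entity would hand you.
    static List<Class<?>> findAnnotated(List<Class<?>> candidates,
                                        Class<? extends Annotation> marker) {
        List<Class<?>> result = new ArrayList<>();
        for (Class<?> c : candidates) {
            if (c.isAnnotationPresent(marker)) {
                result.add(c);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Class<?>> entities = findAnnotated(
                Arrays.asList(Customer.class, Order.class), Entity.class);
        // Only Customer carries the annotation
        System.out.println(entities);
    }
}
```

In a real version you would swap the hand-written candidate list for a classpath scanner and then walk each class's fields with getDeclaredFields() to recover the mapping details.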
We have a very comfortable setup using JPA through Spring/Hibernate: we attach a PersistenceUnitPostProcessor to our entity manager factory, and this post-processor takes a list of project names, scans the classpath for jars containing those names, and adds those jar files to the persistence unit for entity scanning. This is much more convenient than specifying them in persistence.xml, since it can take partial names, and we added facilities for detecting the different classpath configurations when we are running in a WAR, a unit test, an EAR, etc.
Now we are trying to replace Spring with Seam, and I can't find a facility to accomplish the same hooking mechanism. One solution is to hook Seam through Spring, but that has other shortcomings in our environment. So my question is: can someone point me to such a facility in Seam if it exists, or at least to where in the code I should be looking if I plan to patch Seam?
Thanks.
If you're running in a Java EE container like JBoss 6 (and I really recommend doing so), all you need is to package your beans into a jar, place a META-INF/persistence.xml inside it, and place the jar into your WAR or EAR package. All @Entity-annotated beans inside the jar will be processed.
For unit testing, you could point the <jar-file> element at the generated .class output directory and Hibernate will also pick up the entities. Or even configure it at runtime using Ejb3Configuration.addAnnotatedClass.
See http://docs.jboss.org/hibernate/entitymanager/3.6/reference/en/html/configuration.html
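A minimal persistence.xml for such an entity jar might look like this (the unit name and the jar-file path are assumptions; the jar-file trick pointing at a classes directory is what the unit-testing suggestion above refers to):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="myUnit">
        <!-- In the deployed EAR/WAR this element is usually unnecessary,
             because entities in the jar holding this persistence.xml are
             picked up automatically. For unit tests it can point at the
             compiled classes directory instead of a packaged jar. -->
        <jar-file>file:./target/classes</jar-file>
    </persistence-unit>
</persistence>
```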