Testing Hibernate Mappings - Spring

I'm using Hibernate annotations (XML mapping files are so 2003) to map objects to a legacy schema that contains some ginormous tables. Since these classes are so large, I occasionally make a typo, which Hibernate doesn't bother to tell me about until I try to run it.
Here's what I've tried:
One: Setting hbm2ddl.auto to "validate":
This causes the String fields of the class to be validated against varchar(255). Since many of the column types in the database are CHAR(n), this blows up. I would have to add columnDefinition="CHAR(n)" to several hundred mappings.
Two: Using Unitils.
Pulling these in via Maven drags in transitive dependency libraries that blow up other sections of code. Example: I'm using Hibernate 4.1, but Unitils pulled in Hibernate 3.2.5 and blew up a UserType.
So, is there another way to do this? I looked at the Unitils code to see if I could simply yank the sections I needed (I do that with apache-commons fairly often when I just need a single method), but that's not a simple task.
Hibernate is configured via a Spring application context.
Any ideas out there?

I would write tests against an in-memory database (HSQLDB, H2) using the Spring testing framework. You'll quickly see any mapping errors when you attempt to run queries against the tables.
The test class would look something like this:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = MyTestConfig.class)
@TransactionConfiguration(transactionManager = "txMgr", defaultRollback = true)
public class MyTest {

    @Autowired
    private SessionFactory sessionFactory;

    // class body...
}
I would configure Hibernate to auto-deploy the tables as part of the tests.
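If it helps, here is a minimal sketch of what MyTestConfig might look like for Hibernate 4 with an in-memory H2 database (the entity package, JDBC URL and bean names are placeholders; adjust to your project):

import java.util.Properties;

import javax.sql.DataSource;

import org.hibernate.SessionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.hibernate4.HibernateTransactionManager;
import org.springframework.orm.hibernate4.LocalSessionFactoryBean;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
public class MyTestConfig {

    @Bean
    public DataSource dataSource() {
        // In-memory database that lives for the duration of the test JVM
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.h2.Driver");
        ds.setUrl("jdbc:h2:mem:mappingtest;DB_CLOSE_DELAY=-1");
        ds.setUsername("sa");
        ds.setPassword("");
        return ds;
    }

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        // Scans the entity package and creates the schema from the mappings,
        // so mapping errors fail fast when the SessionFactory is built or queried
        LocalSessionFactoryBean factory = new LocalSessionFactoryBean();
        factory.setDataSource(dataSource());
        factory.setPackagesToScan("com.example.domain"); // your entity package
        Properties props = new Properties();
        props.put("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
        props.put("hibernate.hbm2ddl.auto", "create-drop"); // auto-deploy the tables
        factory.setHibernateProperties(props);
        return factory;
    }

    @Bean(name = "txMgr")
    public HibernateTransactionManager txMgr(SessionFactory sessionFactory) {
        return new HibernateTransactionManager(sessionFactory);
    }
}

Since the schema is generated from the mappings themselves, this catches typos in the annotations rather than mismatches against the legacy DDL, which avoids the CHAR(n) versus varchar(255) validation problem from option one.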

Related

Spring boot with multiple database connections

I made a simple Spring Boot REST application for testing purposes where the tables are in 2 databases, one MySQL and one PostgreSQL. To configure the 2 connections I used the instructions from here, at point 6 - "Multiple Databases in Spring Boot", and everything seemed fine: the 2 connections were initiated, but only the primary connection works.
So if the MySQL connection is annotated as @Primary, only the MySQL REST services work; on PostgreSQL the error for all tables is "org.hibernate.hql.internal.ast.QuerySyntaxException: <Entity_Name> is not mapped". But if I make a single change and set @Primary on the PostgreSQL connection, then all Postgres tables work and all MySQL tables give the same error (table not mapped).
So somehow I think the right connection is not auto-selected based on the package.
UPDATE: I found another tutorial here using different database types. I followed the instructions but the result is the same: all tables in the secondary database give the error "org.hibernate.hql.internal.ast.QuerySyntaxException: <Entity_Name> is not mapped". I think the secondary connection is not used; somehow the primary one defaults to the wrong tables, but I don't know why.
I uploaded this small Github project with my work.
https://github.com/victorqedu/MultipleSpringBootDS
UPDATE: In the DAO class I have autowired the constructor, and @Autowired is injecting the wrong EntityManager (I think this is the source of the problem). Could I manually specify the right EntityManager?
@Autowired
public AntibiogramaAntibioticeDAOHibernateImpl(EntityManager theEntityManager) {
    entityManager = theEntityManager;
}
I also tried the annotation @PersistenceContext on the EntityManager but the result is the same.
@PersistenceContext
private EntityManager entityManager;
I'm not sure whether the problem is the EntityManager or the Session that I obtain from EntityManager.unwrap; there seems to be little documentation about this...
This can be solved with @Qualifier. In short: if you have multiple beans of the same type (like EntityManager), you should use a qualifier to wire the right one.
Therefore, in your code you should have:
public AntibiogramaAntibioticeDAOHibernateImpl(
        @Qualifier("primaryEntityManagerFactory") EntityManager theEntityManager) {
    entityManager = theEntityManager;
}

How to statically weave JPA entities using EclipseLink when there is no persistence.xml as the entities are managed by Spring

I've got a project that is Spring based, so the entity manager is set up programmatically, with no need for persistence.xml files to list all the entities.
I'm currently using load-time weaving but am trying to get static weaving working using EclipseLink and Gradle. I want to replicate what is performed by the eclipselink maven plugin:
https://github.com/ethlo/eclipselink-maven-plugin
I have the following Gradle setup (note that it's Kotlin DSL, not Groovy):
task<JavaExec>("performJPAWeaving") {
    val compileJava: JavaCompile = tasks.getByName("compileJava") as JavaCompile
    dependsOn(compileJava)

    val destinationDir = compileJava.destinationDir
    println("Statically weaving classes in $destinationDir")

    inputs.dir(destinationDir)
    outputs.dir(destinationDir)

    main = "org.eclipse.persistence.tools.weaving.jpa.StaticWeave"
    args = listOf("-persistenceinfo", "src/main/resources", destinationDir.getAbsolutePath(), destinationDir.getAbsolutePath())
    classpath = configurations.getByName("compile")
}
When I try to run the task, the weaving task fails because it's looking for a non-existent persistence.xml.
Is there any way to statically weave JPA entities in a Spring-based JPA project?
Exception Description: An exception was thrown while processing persistence.xml from URL: file:/home/blabla/trunk/my-module/src/main/resources/
Internal Exception: java.net.MalformedURLException
at org.eclipse.persistence.exceptions.PersistenceUnitLoadingException.exceptionProcessingPersistenceXML(PersistenceUnitLoadingException.java:117)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:579)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceArchive(PersistenceUnitProcessor.java:536)
... 6 more
Caused by: java.net.MalformedURLException
at java.net.URL.<init>(URL.java:627)
at java.net.URL.<init>(URL.java:490)
at java.net.URL.<init>(URL.java:439)
at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:620)
at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:148)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:806)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.processPersistenceXML(PersistenceUnitProcessor.java:577)
... 7 more
Caused by: java.lang.NullPointerException
at java.net.URL.<init>(URL.java:532)
... 17 more
According to the org.eclipse.persistence.tools.weaving.jpa.StaticWeave documentation, it requires a persistence.xml to be in place in order to perform the static weaving.
Usage:
    StaticWeave [options] source target
Options:
    -classpath
        Set the user class path, use ";" as the delimiter in Window system and ":" in Unix system.
    -log
        The path of log file, the standard output will be the default.
    -loglevel
        Specify a literal value for eclipselink log level (OFF, SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST). The default value is OFF.
    -persistenceinfo
        The path contains META-INF/persistence.xml. This is ONLY required when the source does not include it. The classpath must contain all the classes necessary in order to perform weaving.
I ran a Maven build using the eclipselink maven plugin and it works without the persistence.xml, as you mentioned, because the plugin generates the persistence.xml before invoking StaticWeave when it is not located on the CLASSPATH, using this method:
private void processPersistenceXml(ClassLoader classLoader, Set<String> entityClasses)
{
    final File targetFile = new File(this.persistenceInfoLocation + "/META-INF/persistence.xml");
    getLog().info("persistence.xml location: " + targetFile);

    final String name = project.getArtifactId();
    final Document doc = targetFile.exists() ? PersistenceXmlHelper.parseXml(targetFile) : PersistenceXmlHelper.createXml(name);

    checkExisting(targetFile, classLoader, doc, entityClasses);

    PersistenceXmlHelper.appendClasses(doc, entityClasses);
    PersistenceXmlHelper.outputXml(doc, targetFile);
}
The complete source code is here
I believe you could follow the same approach in your Gradle build.
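If it helps, a minimal sketch of the same idea in plain Java (a throwaway generator you could call before the weaving task; class names, paths and entity names below are hypothetical, and the real plugin discovers entities by scanning the compiled classes for @Entity):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

// Writes a minimal META-INF/persistence.xml listing the given entity classes so that
// StaticWeave has something to read; remove it again after weaving to avoid shipping
// a duplicate persistence.xml on the runtime classpath.
public class PersistenceXmlGenerator {

    public static void writePersistenceXml(Path resourcesDir, List<String> entityClasses) throws IOException {
        StringBuilder xml = new StringBuilder();
        xml.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        xml.append("<persistence xmlns=\"http://xmlns.jcp.org/xml/ns/persistence\" version=\"2.1\">\n");
        xml.append("  <persistence-unit name=\"weaving-pu\" transaction-type=\"RESOURCE_LOCAL\">\n");
        for (String entity : entityClasses) {
            xml.append("    <class>").append(entity).append("</class>\n");
        }
        xml.append("  </persistence-unit>\n");
        xml.append("</persistence>\n");

        Path target = resourcesDir.resolve("META-INF").resolve("persistence.xml");
        Files.createDirectories(target.getParent());
        Files.write(target, xml.toString().getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical entity list; the Maven plugin builds this by scanning for @Entity classes.
        writePersistenceXml(Paths.get("src/main/resources"),
                Arrays.asList("com.example.domain.Customer", "com.example.domain.Order"));
    }
}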
Kinda late to the party but this is definitely possible with Gradle.
There are 3 steps to do in order to make this work:
Copy the persistence.xml file into the source folder next to the classes
Do the weaving
Remove the persistence.xml file from the classes source folder to avoid duplicate persistence.xml conflicts on the classpath
Also, it's very important to hook the weaving process into the compileJava task's last step so that it doesn't break Gradle's up-to-date check; otherwise Gradle will just recompile everything all the time, which can be quite inconvenient when developing.
For a more detailed explanation, check out my article on it: EclipseLink static weaving with Gradle.
I admit, I do not completely understand what you mean by weaving. My answer might help if you need to dynamically create PersistenceUnits which provide JPA EntityManagers, and if these units should be able to create a DB schema (for example in H2) and manage entities based on the classes you provide at runtime.
The code example I mention later does not work with JPA in Spring but in Weld. I think the answer to your question is related to how EntityManagers are created and which classes the PersistenceUnit that creates the EntityManager manages. There is no difference between the two. Instead of using the EntityManagerFactory as a CDI producer, you might autowire it or register it using an old-fashioned application context. Therefore I think the answer to your question lies in the following official sources:
PersistenceProviderResolverHolder and
PersistenceProvider#createEntityManagerFactory(getPersistenceUnitName(), properties)
properties is the replacement for the persistence.xml; an SEPersistenceUnitInfo object can be registered in it.
To start look at: PersistenceProviderResolverHolder
Later: PersistenceProvider
or you can try to understand how my code (see below) does that. But I have to admit, I am not very proud of this part of that software, sorry.
I use those classes and objects to create a module that enables the simulation of a server-deployed JPA WAR file.
To do that, it scans some classes and identifies entities.
Later, in the test code, a so-called PersistenceFactory creates EntityManagers and DataSources. If EclipseLink is used, this factory weaves those classes together. You need no persistence.xml. The way it works there might help to answer your question.
If you look at:
ioc-unit-ejb:TestPersistencefactory
search for the creation of SEPersistenceUnitInfo. That interface gets fed a list of classes, which it returns as:
@Override
public List<String> getManagedClassNames() {
    return TestPersistenceFactory.this.getManagedClassNames();
}
This object is used to create a PersistenceFactory with the help of a PersistenceProvider, which can be discovered as soon as EclipseLink is available on the classpath.
The code is not easy to understand because it allows either Hibernate or EclipseLink to be used for JPA, depending on the availability of the jars on the classpath.
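For what it's worth, a minimal sketch of that bootstrap using the standard javax.persistence.spi API, assuming you bring your own PersistenceUnitInfo implementation (the ioc-unit code uses EclipseLink's SEPersistenceUnitInfo for that role):

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.persistence.EntityManagerFactory;
import javax.persistence.spi.PersistenceProvider;
import javax.persistence.spi.PersistenceProviderResolverHolder;
import javax.persistence.spi.PersistenceUnitInfo;

public class ProgrammaticEmfBootstrap {

    // unitInfo is your own PersistenceUnitInfo implementation: it supplies the unit name,
    // the managed class names (getManagedClassNames()) and the DataSource, replacing persistence.xml.
    public static EntityManagerFactory create(PersistenceUnitInfo unitInfo) {
        // Discovers whatever JPA provider (EclipseLink, Hibernate, ...) is on the classpath
        List<PersistenceProvider> providers = PersistenceProviderResolverHolder
                .getPersistenceProviderResolver()
                .getPersistenceProviders();
        if (providers.isEmpty()) {
            throw new IllegalStateException("No JPA provider found on the classpath");
        }

        Map<String, Object> properties = new HashMap<>();
        // e.g. properties.put("javax.persistence.schema-generation.database.action", "create");

        // The "container" variant takes the PersistenceUnitInfo directly instead of reading persistence.xml
        return providers.get(0).createContainerEntityManagerFactory(unitInfo, properties);
    }
}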

Spring batch - hibernate integration

I'm working on a batch project that uses the spring-batch-core library.
The library uses JdbcTemplate to persist job metadata,
and I'm trying to use Hibernate in order to read the data about a specific job:
package com.ben.batch.repository;

import org.springframework.batch.core.JobInstance;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;

public interface JobInstanceRepository extends JpaRepository<JobInstance, Long> {

    @Query("select count(j) from JobInstance j where j.jobName in :jobName ") // Can't resolve symbol 'JobInstance'
    Long countBuJobName(String jobName);
}
In an ordinary Spring Boot project this works, but now it shows this error:
Can't resolve symbol 'JobInstance'
even though I imported the class correctly.
Any idea would be much appreciated.
For similar purposes there is JobRepository:
http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/repository/JobRepository.html
It allows you to fetch any information regarding your jobs.
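For example, a minimal sketch of reading job metadata through it (the service name is a placeholder):

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class JobInfoService {

    private final JobRepository jobRepository;

    @Autowired
    public JobInfoService(JobRepository jobRepository) {
        this.jobRepository = jobRepository;
    }

    // Returns the most recent execution for the given job name and parameters,
    // or null if the job has never been run with those parameters.
    public JobExecution lastExecution(String jobName, JobParameters params) {
        return jobRepository.getLastJobExecution(jobName, params);
    }
}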
The Spring Batch infrastructure is not yet available as a Spring Data repository, see this JIRA ticket: BATCH-2203
JobInstance is not a Hibernate entity (source code for reference). You'll need to implement your own Hibernate persistence layer if you'd like to query the tables using Hibernate. The primary reason for this is that the framework allows you to define any table prefix you like, so your tables could end up as BATCH_JOB_EXECUTION, NIGHTLY_JOB_EXECUTION, ABCD_JOB_EXECUTION, etc. Because of that, the Hibernate model wouldn't know which table names to point to.
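If you do want to query the tables with Hibernate, a minimal sketch of a hand-rolled read-only entity for the job instance table could look like this (assuming the default BATCH_ prefix; point @Table at whatever prefix you configured):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

// Read-only mapping of Spring Batch's job instance table, for querying/reporting only.
// Spring Batch itself keeps writing its metadata through JdbcTemplate.
@Entity
@Table(name = "BATCH_JOB_INSTANCE")
public class JobInstanceEntity {

    @Id
    @Column(name = "JOB_INSTANCE_ID")
    private Long jobInstanceId;

    @Column(name = "VERSION")
    private Long version;

    @Column(name = "JOB_NAME")
    private String jobName;

    @Column(name = "JOB_KEY")
    private String jobKey;

    public Long getJobInstanceId() { return jobInstanceId; }
    public String getJobName()     { return jobName; }
    public String getJobKey()      { return jobKey; }
}

A JpaRepository<JobInstanceEntity, Long> over this entity would then let a query like the one in the question resolve.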

NullPointerException com.liferay.portal.spring.util.SpringFactoryImpl.setBeanDefinitions

I'm using the Spring Framework and I'm trying to call setBeanDefinitions; the problem is that this method needs a Map of bean definitions as a parameter... could you please tell me how I could instantiate this parameter?
NullPointerException at com.liferay.portal.spring.util.SpringFactoryImpl.setBeanDefinitions(SpringFactoryImpl.java:56)
additional information:
I'm trying to deploy a Liferay project without using the Liferay configuration files (only Spring Framework libraries); I created my own sessionFactory, my own dataSource... etc.
When I run the program, I'm able to create the database schema based on the portlet-hbm.xml information. Now I'm trying to instantiate the beans for portal-spring.xml (which are xxxxpersistance.java). Those beans tell me that they require the type 'com.liferay.portal.kernel.dao.orm.SessionFactory' and that a property value of type 'org.hibernate.impl.SessionFactoryImpl' cannot be converted. So I tried to use the Liferay libraries only for those beans and to instantiate them manually, but I wasn't able to call setBeanDefinitions because I need a Map of bean definitions as a parameter... I don't know if there is a way to get them using the sessionFactory or not.
Thanks again
You only mention junit in the tags to your question. I'd recommend writing unit tests without relying on the whole Liferay infrastructure. That will tremendously lower your required setup effort and simplify your life a lot.

Grails 2.1: Setting sessionFactory and dataSource from custom Spring configuration

I've got a complex, custom-configured Hibernate setup in Spring (including JPA entities, session factory and data source definitions) that I want to use in Grails 2.1.0. Because of that, I want to give Grails a reference to the sessionFactory and dataSource that I already have. So, I do not want (and in fact, can't) use the hibernate.cfg.xml that's placed in conf/ - nor do I want to use DataSource.groovy, as all the complex configuration is already handled by tested and working code we already have, and it is all Spring-based.
So, I have managed to get my custom Spring configuration to load on grails run-app (through importBeans() in resources.groovy). In the logs, I can see the db connection, Spring config and Hibernate starting up just fine, so at runtime the sessionFactory and dataSource beans are created. Now, how do I configure Grails to use those and not try to create its own?
Ideally something like dataSource = ref('myDataSource') somewhere would be great - and the same with sessionFactory = ref('sessionFactory') or similar. I've seen some people putting that in resources.groovy, but it just doesn't work.
I've seen this too:
eventDao(com.JavaClassRequiringDataSource) { dataSource = ref('dataSource') }
but it does not work either (not sure if it ever did.)
Any help would be enormously appreciated … I've spent the last 10 hours trying to get this to work, to no avail. I don't mind if I lose some Grails features, as long as it works. The immediate objective is to get GORM to see the (~200) entities we already have and do some scaffolding :)
I also know the entities are not seen by Grails because I've added the following to BootStrap.groovy:
// ...
def grailsApplication

def init = { servletContext ->
    println grailsApplication.domainClasses
}
// ...
And it prints [].
If a patch is required, just give me a general idea of where to start and I'll take a look... I just want to get this working.
Thanks!
Update 1:
I've tried several incantations of the resources.groovy file, and currently it looks like this:
beans = {
    importBeans('main-spring-file-for-the-rest.xml')
    dataSource = ref('dataSource')
}
But when trying to scaffold I still get:
Error 2012-09-06 00:02:00,768 [Thread-9] ERROR plugins.DefaultGrailsPlugin - Cannot generate controller logic for scaffolded class x.y.z.Class. It is not a domain class!
(Log line edited: replaced the actual name of the class with x.y.z.Class.) As I've shown before, the list of entities is empty, and I can see no way of setting up the Hibernate sessionFactory - for example
sessionFactory = ref('sessionFactory')
Doesn't work.
Update 2:
With the beans and entities loading from Spring but not being used or seen by GORM, I was able to force the conversion of the entities using a utility built into Grails and a new bean, configured from resources.groovy thusly:
public class TestFix implements ApplicationContextAware {
    SessionFactory sessionFactory
    ApplicationContext applicationContext
    GrailsApplication grailsApplication

    def init() {
        GrailsHibernateUtil.configureHibernateDomainClasses(sessionFactory, "sessionFactory", grailsApplication)
    }
}

beans = {
    importBeans('main-spring-file-for-the-rest.xml')
    myBean(TestFix) { bean ->
        sessionFactory = ref('sessionFactory')
        grailsApplication = ref(GrailsApplication.APPLICATION_ID)
        bean.initMethod = 'init'
    }
}
Now the entities are seen by Grails, but scaffolding doesn't work because the augmented domain objects seem to lack the GORM methods (.list() and such). You would expect GrailsHibernateUtil.configureHibernateDomainClasses() to add those methods when it creates all the GrailsHibernateDomainClass instances, but either it's failing silently or I'm missing something (perhaps it's not running early enough? not sure). Any help very much appreciated.
Have you tried going the other way around, using the db-reverse-engineer plugin? We had great success migrating a fairly complex Spring application to Grails (approx. 90 entities).
