I am trying to use the following Hibernate property to create the database schema:
<property name="hibernate.hbm2ddl.auto" value="create"/>
This is a Maven Java EE (Hibernate/Spring) project whose Maven unit tests validate the database schema at the beginning. That means the database schema is validated (and deployment thus fails) before it is created.
How can I overcome this issue?
I'm not sure I understand what you want to do, but if your DDL is generated before the test phase (and it should be, if you bind this to the process-classes phase), it should exist by the time your tests run.
<execution>
<phase>process-classes</phase>
<goals>
<goal>hbm2ddl</goal>
</goals>
</execution>
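For context, that execution would live inside a plugin binding such as the hibernate3-maven-plugin; a minimal sketch (the plugin version and surrounding configuration are assumptions, adjust to your project):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>hibernate3-maven-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals>
        <goal>hbm2ddl</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Because process-classes runs before the test phase, the schema is generated before your schema-validating tests execute.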
We are setting up a Helidon MP application that connects to a SQL database and exposes some endpoints for CRUD operations. I am facing issues when implementing the integration tests. Our objective is to have the application use the SQL database, but use an in-memory database when running the tests.
I've used this type of implementation in other frameworks and programming languages. The initial approach was to access the dependency injection container and change the ORM configuration (Hibernate in this case) to use an in-memory database. Unfortunately I did not manage to do this.
The second approach was to configure another persistence.xml file in the test folder that would override the one from the main folder. By using a different jta-data-source in each of them, I would be able to configure separate connection credentials. I found out that this causes an ambiguous dependency and fails.
content of src/main/resources/META-INF/persistence.xml
<persistence>
<persistence-unit name="dservice" transaction-type="JTA">
<jta-data-source>dsource</jta-data-source>
<class>.....</class>
.
.
.
<class>.....</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.OracleDialect"/>
</properties>
</persistence-unit>
</persistence>
content of src/test/resources/META-INF/persistence.xml
<persistence>
<persistence-unit name="dservice" transaction-type="JTA">
<jta-data-source>dsource_test</jta-data-source>
<class>.....</class>
.
.
.
<class>.....</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
<property name="jakarta.persistence.sql-load-script-source" value="META-INF/init_script.sql"/>
<property name="jakarta.persistence.schema-generation.database.action" value="drop-and-create"/>
</properties>
</persistence-unit>
</persistence>
content of src/main/resources/META-INF/microprofile-config.properties
# used for build
oracle.ucp.jdbc.PoolDataSource.dsource.URL=jdbc:oracle:something
oracle.ucp.jdbc.PoolDataSource.dsource.connectionFactoryClassName=oracle.jdbc.pool.OracleDataSource
oracle.ucp.jdbc.PoolDataSource.dsource.user=some_user
oracle.ucp.jdbc.PoolDataSource.dsource.password=some_password
# used for in-memory testing
oracle.ucp.jdbc.PoolDataSource.dsource_test.URL=jdbc:h2:mem:depServerDb;DB_CLOSE_DELAY=-1
oracle.ucp.jdbc.PoolDataSource.dsource_test.connectionFactoryClassName=org.h2.jdbcx.JdbcDataSource
oracle.ucp.jdbc.PoolDataSource.dsource_test.user=db_user
oracle.ucp.jdbc.PoolDataSource.dsource_test.password=user_password
I added another persistence unit with a different name in src/main/resources/META-INF/persistence.xml, tried to create the entity manager manually using Persistence.createEntityManagerFactory(), and used a provider class to access the entity manager. Unfortunately this attempt also failed.
content of src/main/resources/META-INF/persistence.xml
<persistence>
<persistence-unit name="dservice" transaction-type="JTA">
<jta-data-source>dsource</jta-data-source>
<class>.....</class>
.
.
.
<class>.....</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.OracleDialect"/>
</properties>
</persistence-unit>
<persistence-unit name="dservice_test" transaction-type="JTA">
<jta-data-source>dsource_test</jta-data-source>
<class>.....</class>
.
.
.
<class>.....</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
<property name="jakarta.persistence.sql-load-script-source" value="META-INF/init_script.sql"/>
<property name="jakarta.persistence.schema-generation.database.action" value="drop-and-create"/>
</properties>
</persistence-unit>
</persistence>
I ended up with a solution that I am not satisfied with. I kept the persistence.xml from scenario 3 and added an entity manager provider class in which I inject two entity managers, one for each persistence unit. In the test class I added @AddConfig(key = "app.testing", value = "true"). This makes my entity manager provider deliver the entity manager for "dservice" when the application is running and the one for "dservice_test" when I run the "mvn test" command.
content of provider class
// jakarta.* imports assumed (Jakarta EE 9+ / recent Helidon); use javax.* equivalents on older versions
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.persistence.EntityManager;
import jakarta.persistence.PersistenceContext;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@ApplicationScoped
public class PersistenceUnitProvider {

    @PersistenceContext(unitName = "dservice")
    private EntityManager em_application;

    @PersistenceContext(unitName = "dservice_test")
    private EntityManager em_test;

    private String testing = "false";

    @Inject
    public PersistenceUnitProvider(@ConfigProperty(name = "app.testing", defaultValue = "false") String testing) {
        // defaultValue added so injection does not fail when app.testing is undefined
        this.testing = testing;
    }

    public EntityManager getPersistenceUnit() {
        // compare with equals(), not ==; the injected value is not an interned literal
        if ("true".equals(this.testing)) {
            return em_test;
        }
        return em_application;
    }
}
content of the JUnit test class that sets the app.testing property to true and uses the in-memory database
@HelidonTest
@AddConfig(key = "app.testing", value = "true")
class MainTest {
.
..
...
....
}
The issue is that this makes the application establish two connections when it's run or tested. Is there a better way to achieve this?
UPDATE:
Following the solution Laird mentioned in the accepted answer, we now use the Maven build process, which has phase-bound executions to copy in whatever config files are needed to change the behavior of the Helidon application.
We created a folder to store all the config files used for development, test and production, with the following structure:
_config
development
(files that use a local SQL database)
tests
(files that use an in-memory database)
production
(files that use a development SQL database)
In the pom.xml file we switch the config files as needed for the different phases:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<executions>
<execution>
<id>copy-resources-dev</id>
<phase>compile</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
<resources>
<resource>
<directory>_config/development</directory>
<filtering>true</filtering>
</resource>
</resources>
</configuration>
</execution>
<execution>
<id>copy-resources-test</id>
<phase>test-compile</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
<resources>
<resource>
<directory>_config/tests</directory>
<filtering>true</filtering>
</resource>
</resources>
</configuration>
</execution>
<execution>
<id>copy-resources-packaging</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/classes/META-INF</outputDirectory>
<resources>
<resource>
<directory>_config/production</directory>
<filtering>true</filtering>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
This enables us to use a localhost database when we run "helidon dev", an in-memory database when we execute the tests, and, when we package the application to run it on a server somewhere, whatever database we need.
There are many, many, many things going on here. I'll try to keep it short; a full JPA tutorial is beyond the scope of this question and this website.
The short answer is: (1) in JPA, a persistence.xml is definitionally environment-specific, and (2) persistence.xmls don't "override" each other. When seen this way, the problem reduces to: I want two environments in the same project and can't figure out how to turn them on and off selectively.
There are a variety of (non-Helidon-specific) ways you can do this sort of thing:
Use the maven-resources-plugin to defer copying src/main/resources/META-INF/persistence.xml into target/classes/META-INF/ until after unit tests have run (exclude persistence.xml from <resources> in your pom.xml, then bind the maven-resources-plugin:copy-resources goal to the prepare-package phase). Now src/test/resources/META-INF/persistence.xml will be in effect at unit test time, and your (untested) src/main/resources/META-INF/persistence.xml will be the one you deploy with.
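A minimal sketch of that first option (the execution id and paths are illustrative):
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <excludes>
        <!-- keep the production persistence.xml out of target/classes during tests -->
        <exclude>META-INF/persistence.xml</exclude>
      </excludes>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-resources-plugin</artifactId>
      <executions>
        <execution>
          <id>copy-persistence-xml</id>
          <phase>prepare-package</phase>
          <goals>
            <goal>copy-resources</goal>
          </goals>
          <configuration>
            <outputDirectory>${project.build.outputDirectory}/META-INF</outputDirectory>
            <resources>
              <resource>
                <directory>src/main/resources/META-INF</directory>
                <includes>
                  <include>persistence.xml</include>
                </includes>
              </resource>
            </resources>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>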
Do amazing things with MicroProfile Config configuration profiles if the only thing you need to change is data source information, which is already external to a container-mode-JPA persistence.xml file; from your example, though, it seems you need to change <property> elements as well.
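For illustration, with a runtime that implements MicroProfile Config 2.0+ profiles (an assumption about your Helidon version), the data source switch could live in a single microprofile-config.properties, activated with -Dmp.config.profile=test:
# default (production) data source
oracle.ucp.jdbc.PoolDataSource.dsource.URL=jdbc:oracle:something
# override active only when mp.config.profile=test
%test.oracle.ucp.jdbc.PoolDataSource.dsource.URL=jdbc:h2:mem:depServerDb;DB_CLOSE_DELAY=-1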
Recognize that since persistence.xmls are inherently environment-specific, just don't include a src/main/resources/META-INF/persistence.xml at all in your library project, since a library project, by definition, is supposed to be used in a variety of environments. Instead, place a persistence.xml in its own "thin" project and combine that project with your library project to form an application. You can of course test these combinations in all sorts of other ways.
I would strongly recommend against adding test code to production bits, as you are doing with em_application and em_test. This may look simple at the start but can quickly get out of hand before you realize it.
Instead, try to retain a single code path for testing and production. Have a single JPA persistence unit (em_application) in your PersistenceUnitProvider, persistence.xml and microprofile-config.properties. Stick to OracleDialect in tests as well. Make a second copy of your META-INF/microprofile-config.properties under your JUnit test directory and keep the H2 configuration in the test copy. You will additionally have to shift your init_script.sql to the H2 connection URL to orchestrate the setup, since persistence.xml can no longer carry the H2 configuration.
The @HelidonTest application should pick up the test copy of the MP config properties file, and the persistence unit will then automatically work with H2, as in the sketch below.
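A sketch of what that test copy might look like (src/test/resources/META-INF/microprofile-config.properties; the H2 INIT clause is one way to run the load script now that persistence.xml no longer carries H2 configuration):
# same single data source name as production ("dsource")
oracle.ucp.jdbc.PoolDataSource.dsource.connectionFactoryClassName=org.h2.jdbcx.JdbcDataSource
# H2 runs the init script on connection via the INIT clause
oracle.ucp.jdbc.PoolDataSource.dsource.URL=jdbc:h2:mem:depServerDb;DB_CLOSE_DELAY=-1;INIT=RUNSCRIPT FROM 'classpath:META-INF/init_script.sql'
oracle.ucp.jdbc.PoolDataSource.dsource.user=db_user
oracle.ucp.jdbc.PoolDataSource.dsource.password=user_password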
My Spring Boot application builds into a WAR file (using Jenkins). I want to automate remote deployment to WebSphere 9.
I have read around and it seems there is no Maven plugin for deployment to WebSphere 9, but Ant support is pretty good. So I'm using the Maven AntRun plugin to help run those Ant tasks. I started with an attempt to list the installed applications, just to see if it works. However, I'm running into an exception related to localization:
[ERROR] C:\DEV\ant-was-deploy.xml:81:
java.util.MissingResourceException: Can't find bundle for base name
com.ibm.ws.profile.resourcebundle.WSProfileResourceBundle, locale
en_US
My ant-was-deploy.xml is referenced from pom.xml:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>id123</id>
<phase>clean</phase>
<configuration>
<locales>es</locales>
<target>
<ant antfile="${basedir}/ant-was-deploy.xml">
<target name="listApps"/>
</ant>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
ant-was-deploy.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project name="websphere" default="listApps" basedir="." >
<target name="listApps" >
<taskdef name="wsListApps" classname="com.ibm.websphere.ant.tasks.ListApplications" classpath="${wasHome.dir}/plugins/com.ibm.ws.runtime.jar" />
<wsListApps
profileName="AppServ01"
wasHome="C:\\opt\\IBM\\WebSphere\\AppServer"
/>
</target>
</project>
I think the error comes from com.ibm.ws.runtime.jar. Inside, it has WSProfileResourceBundle.class and WSProfileResourceBundle_en.class, but not WSProfileResourceBundle_en_US.class (that name is just an assumption; I copied the bundle under this name into the jar, but it didn't work).
I also tried to set the locale for the entire plugin, but it seems localization for this plugin is not implemented properly (it had no impact in my case: I set the locale to 'es' but still got the error for en_US).
I also tried to pass system parameters on the Maven command line: mvn clean -Duser.language=fr -Duser.country=FR
It didn't work either.
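One detail that may matter here (an assumption, not verified against WebSphere): -D arguments on the mvn command line are applied after the JVM is already running, so the default locale may have been initialized before they take effect, whereas MAVEN_OPTS passes them to the JVM at startup:
REM Windows
set MAVEN_OPTS=-Duser.language=en -Duser.country=US
mvn clean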
So, my question is whether there is a way to change the locale before the Ant script runs. If I can set it to 'en', it will probably find the right resource bundle.
I'm fairly new to WebSphere; if there is another solution to automate remote deployment to WebSphere 9, I would be happy to hear it. I would rather not use scripts on the target server or a Jenkins plugin, but if there is no other way ...
I just had the same issue. In my case, I was using an AppServer name (AppSrv1 instead of AppSrv01) in my Maven settings.xml that did not exist anymore.
The right server name solved the issue.
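If the profile name needs to vary per machine, one option is to keep it as a property in settings.xml and reference it from the Ant file (the property name was.profileName is hypothetical):
<!-- ~/.m2/settings.xml -->
<profiles>
  <profile>
    <id>was</id>
    <properties>
      <was.profileName>AppSrv01</was.profileName>
    </properties>
  </profile>
</profiles>
<activeProfiles>
  <activeProfile>was</activeProfile>
</activeProfiles>
and in ant-was-deploy.xml use profileName="${was.profileName}"; the maven-antrun-plugin passes Maven properties through to Ant.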
I'm facing problems with a JDBC dynamic properties configurer. Let me explain what exactly the problem is.
When I do mvn clean install and deploy the applications to my server (WebLogic 10.3.3) right afterwards, everything is correct and all the applications work fine. But every morning, when I try to redeploy the same applications, an error message like this is shown:
Error creating bean with name 'path.to.my.bean.JDBCPropertiesFactoryBean#6015a10' defined in class path resource [spring/configuration/placeholder-jdbcproperties.xml]: Invocation of init method failed; nested exception is org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad SQL grammar [
SELECT
A.COLUMN1 || '.' || P.COLUMN2,
COLUMN3
FROM
T_TABLE_WITH_PROPERTIES${application.version} P,
T_TABLE_WITH_PROPERTIES_2 G
WHERE G.ID = P.ID
]; nested exception is java.sql.SQLSyntaxErrorException: ORA-00911: invalid character
This application.version comes from the Maven pom.xml:
<properties>
...
<application.version>MyVersion</application.version>
...
</properties>
The bean is:
<bean id="jdbcPlaceholderConfig"
class="path.to.my.bean.DefaultPropertyPlaceholderConfigurer"> <!-- Class to extend PropertyPlaceholderConfigurer -->
<property name="ignoreUnresolvablePlaceholders" value="true"/>
<property name="properties">
<bean class="path.to.my.bean.JDBCPropertiesFactoryBean"> <!-- Class to extend PropertiesFactoryBean -->
<property name="query">
<value>
SELECT
A.COLUMN1 || '.' || P.COLUMN2,
COLUMN3
FROM
T_TABLE_WITH_PROPERTIES${application.version} P,
T_TABLE_WITH_PROPERTIES_2 G
WHERE G.ID = P.ID
</value>
</property>
<property name="dataSource" ref="ref.to.datasource.bean"/>
</bean>
</property>
</bean>
So every morning I have to rebuild with Maven, and the loop starts again.
Additional information: I use JRebel too; I'm not sure where the problem can be, but maybe this is relevant.
Thanks in advance.
UPDATE:
This is how I generate the rebel.xml:
<build>
...
<plugins>
<plugin>
<groupId>org.zeroturnaround</groupId>
<artifactId>jrebel-maven-plugin</artifactId>
<version>1.1.5</version>
<configuration>
<relativePath>../../</relativePath>
<rootPath>PATH\TO\MY\SIS_VOB</rootPath>
<addResourcesDirToRebelXml>true</addResourcesDirToRebelXml>
<alwaysGenerate>true</alwaysGenerate>
</configuration>
<executions>
<execution>
<id>generate-rebel-xml</id>
<phase>process-resources</phase>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
I've just realized that with <executions>...<goal>generate</goal>...</executions>, when I do mvn clean install without jrebel:generate, the rebel.xml files are always generated. So maybe I have to delete the executions tag, generate the rebel.xml files once with jrebel:generate, and then edit the rebel.xml files and run mvn clean install again.
Would that be correct?
Thanks.
UPDATE WITH THE SOLUTION:
This is the final version of the JRebel Maven plugin in the pom.xml:
<build>
...
<plugins>
<plugin>
<groupId>org.zeroturnaround</groupId>
<artifactId>jrebel-maven-plugin</artifactId>
<version>1.1.5</version>
<configuration>
<relativePath>../../</relativePath>
<rootPath>PATH\TO\MY\SIS_VOB</rootPath>
<addResourcesDirToRebelXml>true</addResourcesDirToRebelXml>
<alwaysGenerate>true</alwaysGenerate>
</configuration>
<!-- executions tag out! to not regenerate files always -->
</plugin>
</plugins>
</build>
To create the rebel.xml:
mvn jrebel:generate
Then we can modify the rebel.xml files to exclude some files, such as *.properties, as in Henri's answer.
And that's it!
This can happen if you're using resource filtering with JRebel, as the application looks up the bean's XML in its unfiltered form from the project working directory (as per rebel.xml).
To resolve this, you'll need to update rebel.xml for that module, adding an exclude for that particular XML file.
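For example, a rebel.xml with such an exclude might look like this (the path and the excluded file are illustrative, based on the error above):
<?xml version="1.0" encoding="UTF-8"?>
<application xmlns="http://www.zeroturnaround.com">
  <classpath>
    <dir name="/path/to/module/target/classes">
      <!-- let the deployed (filtered) copy of this file win -->
      <exclude name="spring/configuration/placeholder-jdbcproperties.xml"/>
    </dir>
  </classpath>
</application>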
In my ServletFilter, I want to use a specific Jetty API exposed in the HttpServletRequest implementation.
I call it like this:
final Request jettyRequest = Request.getBaseRequest(request);
To avoid a ClassNotFoundException, I must add the jetty-server artifact to my Maven dependencies. But if I do that, getBaseRequest returns null, because 'request instanceof Request' evaluates to false instead of true.
This is certainly due to a conflict between the Jetty and application classloaders, because both of them have loaded the 'org.eclipse.jetty.server.Request' class. I tried several configurations, but I was not able to expose the Request class to my webapp without adding the dependency in WEB-INF/lib, which causes the classpath issue.
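One thing worth noting (whether it helps in the run-forked case depends on your setup): a provided-scope dependency lets the webapp compile against jetty-server without bundling it into WEB-INF/lib, so only the container's copy of the class is on the runtime classpath:
<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-server</artifactId>
  <version>${jetty-version}</version>
  <!-- compile against the API, but don't package it into the WAR -->
  <scope>provided</scope>
</dependency>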
My application is launched with "mvn jetty:run-forked" and configured like this:
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
<version>${jetty-version}</version>
<configuration>
<webAppSourceDirectory>${project.build.directory}/${project.name}</webAppSourceDirectory>
<systemProperties>
<force>true</force>
</systemProperties>
<scanIntervalSeconds>10</scanIntervalSeconds>
<webAppConfig>
<contextPath>/</contextPath>
</webAppConfig>
<jettyXml>../jetty.xml,../jetty-ssl.xml,../jetty-https.xml</jettyXml>
<jvmArgs>-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 -Xbootclasspath/p:${settings.localRepository}/org/mortbay/jetty/alpn/alpn-boot/${alpn-version}/alpn-boot-${alpn-version}.jar</jvmArgs>
</configuration>
</plugin>
Any help will be appreciated!
I fixed the issue by adding a WebAppContext configuration file that contains:
<Configure class="org.eclipse.jetty.webapp.WebAppContext">
<Set name="parentLoaderPriority">true</Set>
</Configure>
The file must be referenced in the jetty-maven-plugin like this:
<contextXml>../jetty-context.xml</contextXml>
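That is, roughly (a sketch; only the contextXml line is new relative to the plugin configuration shown in the question):
<plugin>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <version>${jetty-version}</version>
  <configuration>
    <!-- existing configuration as above, plus: -->
    <contextXml>../jetty-context.xml</contextXml>
  </configuration>
</plugin>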
I have a multimodule Maven project with the following setup of relevant modules:
root
commons-app
backend
frontend
Module frontend is built into a WAR and deployed on Tomcat. Module backend is a standard Java application packaged as a JAR. All I am trying to accomplish is to make the following aspect work (in both frontend and backend):
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

@Aspect
public class VirtuosoSequenceSanitizerAspect {

    @Around("execution(* cz.cuni.mff.xrg.odcs.commons.app.facade.*Facade.save(..))")
    public Object sanitizeSequenceOnSave(ProceedingJoinPoint pjp) throws Throwable {
        // ... some code
    }

    @Before("execution(* org.eclipse.persistence.internal.descriptors.ObjectBuilder.assignSequenceNumber(java.lang.Object, org.eclipse.persistence.internal.sessions.AbstractSession))")
    public void rememberAssignSequence(JoinPoint jp) {
        // .. some code
    }
}
This aspect is set up as a Spring bean in commons-app-context.xml like so:
<!-- enable aspects -->
<aop:aspectj-autoproxy />
<!-- Aspect for fixing corrupted database sequences. -->
<bean id="sequenceAspect" class="cz.cuni.mff.xrg.odcs.commons.app.dao.VirtuosoSequenceSanitizerAspect" />
With this setup the around advice works properly; however, the before advice is not triggered. From what I found, I concluded that I need to use the aspectj-maven-plugin to weave into 3rd-party libs. So I added the plugin to the pom.xml of the commons-app module like so:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.5</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
<complianceLevel>1.7</complianceLevel>
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<!-- Weave EclipseLink dependency -->
<weaveDependencies>
<weaveDependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>eclipselink</artifactId>
</weaveDependency>
</weaveDependencies>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
<dependencies>
</dependencies>
</plugin>
With this plugin the before advice works, but the around advice stops working. I have been struggling to set this up correctly so that both advices work as expected, but to no avail. When building the commons-app module, the log says both advices are woven:
--- aspectj-maven-plugin:1.5:compile (default) @ commons-app ---
Join point 'method-execution(void cz.cuni.mff.xrg.odcs.commons.app.facade.ScheduleFacade.save(cz.cuni.mff.xrg.odcs.commons.app.scheduling.Schedule))' in Type 'cz.cuni.mff.xrg.odcs.commons.app.facade.ScheduleFacade' (ScheduleFacade.java:127) advised by around advice from 'cz.cuni.mff.xrg.odcs.commons.app.dao.VirtuosoSequenceSanitizerAspect' (VirtuosoSequenceSanitizerAspect.java:90)
Join point 'method-execution(void cz.cuni.mff.xrg.odcs.commons.app.facade.DPUFacade.save(cz.cuni.mff.xrg.odcs.commons.app.dpu.DPUTemplateRecord))' in Type 'cz.cuni.mff.xrg.odcs.commons.app.facade.DPUFacade' (DPUFacade.java:123) advised by around advice from 'cz.cuni.mff.xrg.odcs.commons.app.dao.VirtuosoSequenceSanitizerAspect' (VirtuosoSequenceSanitizerAspect.java:90)
Join point 'method-execution(void cz.cuni.mff.xrg.odcs.commons.app.facade.DPUFacade.save(cz.cuni.mff.xrg.odcs.commons.app.dpu.DPUInstanceRecord))' in Type 'cz.cuni.mff.xrg.odcs.commons.app.facade.DPUFacade' (DPUFacade.java:185) advised by around advice from 'cz.cuni.mff.xrg.odcs.commons.app.dao.VirtuosoSequenceSanitizerAspect' (VirtuosoSequenceSanitizerAspect.java:90)
Join point 'method-execution(void cz.cuni.mff.xrg.odcs.commons.app.facade.PipelineFacade.save(cz.cuni.mff.xrg.odcs.commons.app.pipeline.Pipeline))' in Type 'cz.cuni.mff.xrg.odcs.commons.app.facade.PipelineFacade' (PipelineFacade.java:134) advised by around advice from 'cz.cuni.mff.xrg.odcs.commons.app.dao.VirtuosoSequenceSanitizerAspect' (VirtuosoSequenceSanitizerAspect.java:90)
...
However, when I deploy frontend to Tomcat, only the before advice is triggered. How can I configure Maven to always weave both advices?
My mistake: I actually found out that the around advice is being triggered. I did not see this because the code did not do what I expected, and I thought it was not triggered because a debugger breakpoint was not hit. From a brief googling I found the reason:
If around advice is inlined, the debugger can't figure out what to do (we still have some JSR 45 related work to do in this area, and possibly so does the Eclipse debugger). To debug around advice, you also need to go to the project properties and turn off the "inline around advice" AspectJ compiler option. Debugging should then hopefully work as expected...
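For a Maven build, the aspectj-maven-plugin appears to expose the corresponding compiler flag as the Xnoinline parameter (name taken from the plugin's documentation; verify against your plugin version). A sketch, added to the plugin configuration shown earlier:
<configuration>
  <!-- disable inlining of around advice so breakpoints inside it are hit -->
  <Xnoinline>true</Xnoinline>
  ...
</configuration>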