Oracle Advanced Queuing and Jakarta namespace - spring-boot

I am using Oracle AQ in my Java Spring Boot application, with the Oracle JMS implementation (AQAPI) as a dependency.
Recently I tried to update the application to Spring Boot 3.x, which is built on the Jakarta namespace rather than Javax. However, my code is no longer compatible with Oracle AQ, since it uses the Javax namespace, i.e. javax.jms.Connection.
So the question is: how do I solve this problem? It seems Oracle has not yet produced a new version of AQAPI compatible with Jakarta JMS.

I had the same problem when upgrading to Spring Boot 3, so I wrote an adapter that wraps the javax.jms based AQAPI as jakarta.jms:
<dependency>
    <groupId>net.sf.gavgav</groupId>
    <artifactId>jakarta-javax-jms-adapter</artifactId>
    <version>1.0.0</version>
</dependency>
This is just a collection of jakarta.jms interfaces delegating calls to the corresponding javax.jms implementation:
https://sourceforge.net/p/jakarta-javax-jms-adapter/code/ci/master/tree/src/main/java/net/sf/gavgav/jakartajavax/jms/
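Conceptually, each wrapper follows the same delegation pattern; below is a simplified sketch (not the library's actual source) of a jakarta.jms.Queue delegating to a wrapped javax.jms.Queue:
import jakarta.jms.JMSException;
import jakarta.jms.Queue;

// Simplified illustration of the delegation pattern: implement the jakarta.jms
// interface, forward every call to the wrapped javax.jms object, and translate
// the checked javax exception into its jakarta counterpart.
public class DelegatingJakartaQueue implements Queue {

    private final javax.jms.Queue delegate;

    public DelegatingJakartaQueue(javax.jms.Queue delegate) {
        this.delegate = delegate;
    }

    @Override
    public String getQueueName() throws JMSException {
        try {
            return delegate.getQueueName();
        } catch (javax.jms.JMSException e) {
            JMSException wrapped = new JMSException(e.getMessage(), e.getErrorCode());
            wrapped.setLinkedException(e);
            throw wrapped;
        }
    }
}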
For example:
Wrapping AQjmsFactory (javax.jms.ConnectionFactory) as a jakarta.jms.ConnectionFactory in Spring Boot 3:
import java.sql.SQLException;
import javax.sql.DataSource;
import jakarta.jms.ConnectionFactory;
import jakarta.jms.JMSException;
import net.sf.gavgav.jakartajavax.jms.JakartaJmsConnectionFactory;
import net.sf.gavgav.jakartajavax.jms.JmsException;
import oracle.jms.AQjmsFactory;
...
@Bean
public ConnectionFactory connectionFactory(DataSource ds) throws JMSException, SQLException {
    try {
        return new JakartaJmsConnectionFactory(AQjmsFactory.getQueueConnectionFactory(ds));
    } catch (javax.jms.JMSException e) {
        throw JmsException.wrap(e);
    }
}
Implementing Spring's DestinationResolver for JmsTemplate or DefaultJmsListenerContainerFactory:
import net.sf.gavgav.jakartajavax.jms.JakartaJmsSession;
import net.sf.gavgav.jakartajavax.jms.JakartaJmsQueue;
import net.sf.gavgav.jakartajavax.jms.JmsException;
import jakarta.jms.Destination;
import jakarta.jms.JMSException;
import jakarta.jms.Session;
import oracle.jms.AQjmsSession;
import org.springframework.jms.support.destination.DestinationResolver;
public class AqDestinationResolver implements DestinationResolver {

    private final String schema;

    public AqDestinationResolver(String schema) {
        this.schema = schema;
    }

    @Override
    public Destination resolveDestinationName(Session session, String destinationName, boolean pubSubDomain) throws JMSException {
        JakartaJmsSession jakartaSession = (JakartaJmsSession) session;
        try {
            AQjmsSession aqjmsSession = (AQjmsSession) jakartaSession.getSession();
            javax.jms.Queue aqjmsQueue = aqjmsSession.getQueue(schema, destinationName);
            return new JakartaJmsQueue(aqjmsQueue);
        } catch (javax.jms.JMSException e) {
            throw JmsException.wrap(e);
        }
    }
}
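With both beans in place, the resolver can be plugged into a listener container factory in the usual way. A minimal sketch, where "MY_SCHEMA" is a placeholder for your own AQ schema:
import jakarta.jms.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;

@Configuration
public class AqJmsConfig {

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(ConnectionFactory connectionFactory) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        // resolve plain queue names against the AQ schema through the adapter-aware resolver
        factory.setDestinationResolver(new AqDestinationResolver("MY_SCHEMA"));
        return factory;
    }
}
A @JmsListener method that names this containerFactory will then have its destination resolved through AQjmsSession.getQueue for the given schema.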

Related

How to enable connection pooling in spring boot embedded tomcat

I have a Spring Boot application which is not a web application. In this application I have configured embedded Tomcat with the help of the following bean.
@Bean
public TomcatServletWebServerFactory tomcatFactory() {
    return new TomcatServletWebServerFactory() {

        @Override
        protected TomcatWebServer getTomcatWebServer(Tomcat tomcat) {
            tomcat.enableNaming();
            return super.getTomcatWebServer(tomcat);
        }

        @Override
        protected void postProcessContext(Context context) {
            ContextResource contextResource = new ContextResource();
            contextResource.setName("jdbc/BPMDB");
            contextResource.setType(DataSource.class.getName());
            contextResource.setProperty("driverClassName", env.getProperty("bpm.db.driverClassName"));
            contextResource.setProperty("url", env.getProperty("bpm.db.url"));
            contextResource.setProperty("username", env.getProperty("bpm.db.username"));
            contextResource.setProperty("password", env.getProperty("bpm.db.password"));
            context.getNamingResources().addResource(contextResource);
        }
    };
}
How do I do connection pooling for this embedded Tomcat? I am using Spring Boot 2.x, which says HikariCP is the default connection pool, but how do I set it up for this embedded Tomcat?
Does this require setting properties like spring.datasource.hikari.initial-size=15 and spring.datasource.hikari.max-wait=20000? But again, how will Boot know about them, and how will I know that these properties are actually used?
Thanks.
I have found the answer to my problem.
It's simple: we just have to declare a DataSource reference, autowire it, and specify the database-related properties along with the Hikari-related properties.
The code is below.
@Autowired
public DataSource dataSource;
Add the above to your @Configuration class and add the following properties to the application.properties file.
spring.datasource.driver-class-name=...
spring.datasource.url=jdbc:oracle:thin:....
spring.datasource.username=..
spring.datasource.password=..
spring.datasource.hikari.minimum-idle=8
spring.datasource.hikari.maximum-pool-size=50
spring.datasource.hikari.connection-timeout=20000
Also, I have written a test case to check for the Hikari connection pool. Below is the code.
import static org.junit.Assert.assertEquals;

import javax.sql.DataSource;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(
        properties = "spring.datasource.type=com.zaxxer.hikari.HikariDataSource",
        classes = {ApplicationConfiguration.class, PersistenceJpaContext.class}
)
public class HikariConnectionPoolTest {

    @Autowired
    private DataSource dataSource;

    @Test
    public void hikariConnectionPoolIsConfigured() {
        assertEquals("com.zaxxer.hikari.HikariDataSource", dataSource.getClass().getName());
    }
}
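To double-check that the Hikari settings were actually applied and not silently ignored, the pool can also be unwrapped and its configuration inspected. A sketch of an additional test method; the expected values are assumptions matching the properties shown above:
import static org.junit.Assert.assertEquals;

import com.zaxxer.hikari.HikariDataSource;
import java.sql.SQLException;

// additional test method inside HikariConnectionPoolTest
@Test
public void hikariPoolSettingsAreApplied() throws SQLException {
    // unwrap the Spring-managed DataSource to the concrete Hikari pool
    HikariDataSource hikari = dataSource.unwrap(HikariDataSource.class);
    assertEquals(50, hikari.getMaximumPoolSize());
    assertEquals(8, hikari.getMinimumIdle());
    assertEquals(20000, hikari.getConnectionTimeout());
}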

How to run DelegatingSecurityContextRunnable every time when tomcat creates new Thread

I have a Spring app which is using Tomcat with websockets. I would like DelegatingSecurityContextRunnable to be executed every time Tomcat creates a new thread, i.e. to wrap the Tomcat thread. Does anyone know how this is done? The reason for the question can be found here.
Maybe this can be done using AOP and some advice?
In Spring Boot you can configure a wrapper by hooking into the Tomcat connector. See this as an example:
@Bean
public EmbeddedServletContainerFactory servletContainerFactory() {
    TomcatEmbeddedServletContainerFactory factory = new TomcatEmbeddedServletContainerFactory();
    factory.addConnectorCustomizers(new TomcatConnectorCustomizer() {
        @Override
        public void customize(Connector connector) {
            AbstractProtocol protocolHandler = (AbstractProtocol) connector.getProtocolHandler();
            TaskQueue taskqueue = new TaskQueue() {
                @Override
                public boolean offer(Runnable e, long timeout, TimeUnit unit) throws InterruptedException {
                    return super.offer(new MyRunnable(e), timeout, unit);
                }

                @Override
                public boolean offer(Runnable o) {
                    return super.offer(new MyRunnable(o));
                }
            };
            TaskThreadFactory tf = new TaskThreadFactory("artur-" + "-exec-", false, 0);
            ThreadPoolExecutor e = new ThreadPoolExecutor(10, 10, 1000, TimeUnit.SECONDS, taskqueue, tf);
            taskqueue.setParent(e);
            protocolHandler.setExecutor(e);
        }
    });
    return factory;
}
And here is my custom Runnable (this can be any wrapper; I did not bother implementing exactly yours):
static class MyRunnable implements Runnable {

    private final Runnable r;

    public MyRunnable(Runnable r) {
        this.r = r;
    }

    @Override
    public void run() {
        System.out.println("Custom runable");
        runInner();
    }

    void runInner() {
        r.run();
    }
}
And here are my imports:
import java.util.concurrent.TimeUnit;
import org.apache.catalina.connector.Connector;
import org.apache.coyote.AbstractProtocol;
import org.apache.tomcat.util.threads.TaskQueue;
import org.apache.tomcat.util.threads.TaskThreadFactory;
import org.apache.tomcat.util.threads.ThreadPoolExecutor;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.context.embedded.EmbeddedServletContainerFactory;
import org.springframework.boot.context.embedded.tomcat.TomcatConnectorCustomizer;
import org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainerFactory;
import org.springframework.boot.web.support.SpringBootServletInitializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.FilterType;
import org.springframework.context.annotation.PropertySource;
What this does:
The Tomcat connector initialises itself. You can set the executor to use, in which case Tomcat stops creating its own executor and uses yours instead.
By overriding the offer methods in the queue, you get the chance to wrap every Runnable in any custom Runnable. In my case, for testing, I simply added a sysout to see that everything is working correctly.
The thread pool implementation I used is an exact copy of the Tomcat default (minus the properties). This way, behaviour stays the same, except that every Runnable is now your delegating wrapper.
When I test that, my console prints:
Custom runable
I hope this is what you were looking for.
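If you want exactly the behaviour asked about in the question, the wrapper body could delegate to Spring Security's DelegatingSecurityContextRunnable instead of printing. A sketch, assuming the SecurityContext you want to propagate is the one present on the thread that offers the task to the queue:
import org.springframework.security.concurrent.DelegatingSecurityContextRunnable;

static class SecurityContextPropagatingRunnable implements Runnable {

    private final Runnable delegate;

    SecurityContextPropagatingRunnable(Runnable original) {
        // captures the SecurityContext of the current thread at wrapping time
        this.delegate = new DelegatingSecurityContextRunnable(original);
    }

    @Override
    public void run() {
        delegate.run();
    }
}
In the offer(...) overrides above you would then wrap with new SecurityContextPropagatingRunnable(e) instead of new MyRunnable(e).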
I use Spring Boot, but this is essentially a Tomcat issue, not a Spring issue. You can adapt the solution to your specific scenario.
-- Artur

NoUniqueBeanDefinitionException with @EnableExperimentalNeo4jRepositories annotation and Spring Boot 1.4.2

I'm having an issue with Spring Boot 1.4.2.M1 and @EnableExperimentalNeo4jRepositories.
It seems to be a conflict between two beans: one from Spring Boot, one from spring-data-neo4j.
Here is a stack trace excerpt:
18:12:15.891 [main] DEBUG o.s.b.d.LoggingFailureAnalysisReporter - Application failed to start due to an exception
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'org.neo4j.ogm.session.Session' available: expected single matching bean but found 2: getSession,org.springframework.data.neo4j.transaction.SharedSessionCreator#0
And another...
Parameter 0 of method setSession in org.springframework.data.neo4j.repository.support.Neo4jRepositoryFactoryBean required a single bean, but 2 were found:
- getSession: defined in BeanDefinition defined in class path resource [org/springframework/boot/autoconfigure/data/neo4j/Neo4jDataAutoConfiguration$SpringBootNeo4jConfiguration.class]
- org.springframework.data.neo4j.transaction.SharedSessionCreator#0: defined by method 'createSharedSession' in null
Does anybody have any idea how to solve this?
Below is my Neo4j configuration:
package com.domain.core.context;

import javax.annotation.PostConstruct;
import org.neo4j.ogm.session.Session;
import org.neo4j.ogm.session.SessionFactory;
import org.neo4j.ogm.session.event.Event;
import org.neo4j.ogm.session.event.EventListenerAdapter;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jDataAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.neo4j.repository.config.EnableExperimentalNeo4jRepositories;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import lombok.extern.slf4j.Slf4j;

@Slf4j
@Configuration
@ComponentScan("com.domain")
@EnableExperimentalNeo4jRepositories(basePackages = "com.domain.core.repository")
@EnableTransactionManagement
@SpringBootApplication(exclude = Neo4jDataAutoConfiguration.class)
public class TestPersistenceContext {

    @PostConstruct
    public void init() {
        log.info("TheScene.Co: Initializing Test Neo4jConfig ...");
    }

    @Bean
    public Neo4jTransactionManager transactionManager() throws Exception {
        return new Neo4jTransactionManager(sessionFactory());
    }

    @Bean
    public SessionFactory sessionFactory() {
        return new SessionFactory(getConfiguration(), "com.domain") {
            @Override
            public Session openSession() {
                Session session = super.openSession();
                session.register(new EventListenerAdapter() {
                    @Override
                    public void onPreSave(Event event) {
                        // do something - like set an id on an object
                        log.debug("***** Saving domain object ********");
                    }
                });
                return session;
            }
        };
    }

    @Bean
    public org.neo4j.ogm.config.Configuration getConfiguration() {
        org.neo4j.ogm.config.Configuration config = new org.neo4j.ogm.config.Configuration();
        config.driverConfiguration()
              .setCredentials("neo4j", "password")
              .setDriverClassName("org.neo4j.ogm.drivers.http.driver.HttpDriver");
        return config;
    }
}
You must be using Spring Data Neo4j (SDN) version 4.2.0.M1. This milestone release was put out to get feedback on several big changes from 4.1.x.
SDN 4.2.0.RC1 should be out later this week, but for now 4.2.0.BUILD-SNAPSHOT is actually quite stable in the lead-up to the Ingalls release train for Spring Data in December.
I have written a guide for users coming from SDN 4.0/4.1 which goes over how to upgrade to the snapshot build.
In this guide there is a link to an example project branch which shows how to get this version to work with Spring Boot 1.4.x with a few minor workarounds.
With the upcoming release of Spring Boot 1.5, we have updated all the autoconfiguration to work straight out of the box with SDN 4.2. We will update the documentation for Spring Boot closer to release.

Spring Boot Apache Artemis Embedded JMS Queue Example

I am trying to set up a simple Spring Boot application that uses an embedded JMS queue. I was successful with HornetQ, but when I try to convert to Artemis I get a failure on the ArtemisConnectionFactory. Here is the code that I use for HornetQ. Any help would be appreciated.
package com.comporium.log.server;
import javax.jms.ConnectionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.PropertySource;
import org.springframework.jms.listener.DefaultMessageListenerContainer;
import com.comporium.log.server.services.LogListener;
@SpringBootApplication
public class Application {

    @Autowired
    private ConnectionFactory connectionFactory;

    @Autowired
    LogListener logListener;

    @Bean
    public DefaultMessageListenerContainer messageListener() {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(this.connectionFactory);
        container.setDestinationName("loggerQueue");
        container.setMessageListener(logListener);
        return container;
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }
}
For me your code worked. To test the application I have added a CommandLineRunner which produces a message.
@Bean
CommandLineRunner sendMessage(JmsTemplate jmsTemplate) {
    return args -> {
        jmsTemplate.convertAndSend("loggerQueue", "Message to Artemis");
    };
}
The consumer will consume the message sent to this queue. It is not necessary to declare any properties, but I have defined the following compile-time dependencies in my project:
compile('org.springframework.boot:spring-boot-starter-artemis')
compile('org.apache.activemq:artemis-jms-server')
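The LogListener class is not shown in the question; for completeness, here is a minimal sketch of what such a listener could look like. The package and class name are taken from the question's imports, but the body is an assumption:
package com.comporium.log.server.services;

import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;
import org.springframework.stereotype.Component;

@Component
public class LogListener implements MessageListener {

    @Override
    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                // hypothetical handling: just print the received log entry
                System.out.println("Received: " + ((TextMessage) message).getText());
            }
        } catch (javax.jms.JMSException e) {
            throw new RuntimeException(e);
        }
    }
}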

Spring beans are not injected in flyway java based migration

I'm trying to inject a configuration properties component into my Flyway Java-based migration code, but it is always null.
I'm using Spring Boot with Flyway.
@Component
@ConfigurationProperties(prefix = "code")
public class CodesProp {

    private String codePath;

    public String getCodePath() { return codePath; }

    public void setCodePath(String codePath) { this.codePath = codePath; }
}
Then, inside the Flyway migration code, I try to autowire this component as follows:
public class V1_4__Migrate_codes_metadata implements SpringJdbcMigration {

    @Autowired
    private CodesProp codesProp;

    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        codesProp.getCodePath();
    }
}
Here, codesProp is always null.
Is there any way to inject Spring beans into Flyway migrations, or to make this bean initialized before the Flyway bean?
Thank you.
Flyway doesn't support dependency injection into SpringJdbcMigration implementations. It simply looks for classes on the classpath that implement SpringJdbcMigration and creates a new instance using the default constructor. This is performed in SpringJdbcMigrationResolver. When the migration is executed, SpringJdbcMigrationExecutor creates a new JdbcTemplate and then calls your migration implementation's migrate method.
If you really need dependencies to be injected into your Java-based migrations, I think you'll have to implement your own MigrationResolver that retrieves beans of a particular type from the application context and creates and returns a ResolvedMigration instance for each.
If, like me, you don't want to wait for Flyway 4.1, you can use Flyway 4.0 and add the following to your Spring Boot application:
1) Create an ApplicationContextAwareSpringJdbcMigrationResolver class in your project:
import org.flywaydb.core.api.FlywayException;
import org.flywaydb.core.api.MigrationType;
import org.flywaydb.core.api.MigrationVersion;
import org.flywaydb.core.api.configuration.FlywayConfiguration;
import org.flywaydb.core.api.migration.MigrationChecksumProvider;
import org.flywaydb.core.api.migration.MigrationInfoProvider;
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.flywaydb.core.api.resolver.ResolvedMigration;
import org.flywaydb.core.internal.resolver.MigrationInfoHelper;
import org.flywaydb.core.internal.resolver.ResolvedMigrationComparator;
import org.flywaydb.core.internal.resolver.ResolvedMigrationImpl;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationExecutor;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationResolver;
import org.flywaydb.core.internal.util.ClassUtils;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.Pair;
import org.flywaydb.core.internal.util.StringUtils;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.context.ApplicationContext;

import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;

/**
 * Migration resolver for {@link SpringJdbcMigration}s which are registered in the given {@link ApplicationContext}.
 * This resolver provides the ability to use other beans registered in the {@link ApplicationContext} and reference
 * them via Spring's dependency injection facility inside the {@link SpringJdbcMigration}s.
 */
public class ApplicationContextAwareSpringJdbcMigrationResolver extends SpringJdbcMigrationResolver {

    private final ApplicationContext applicationContext;

    public ApplicationContextAwareSpringJdbcMigrationResolver(Scanner scanner, Location location, FlywayConfiguration configuration, ApplicationContext applicationContext) {
        super(scanner, location, configuration);
        this.applicationContext = applicationContext;
    }

    @SuppressWarnings("unchecked")
    @Override
    public Collection<ResolvedMigration> resolveMigrations() {
        // get all beans of type SpringJdbcMigration from the application context
        Map<String, SpringJdbcMigration> springJdbcMigrationBeans =
                (Map<String, SpringJdbcMigration>) this.applicationContext.getBeansOfType(SpringJdbcMigration.class);
        ArrayList<ResolvedMigration> resolvedMigrations = new ArrayList<ResolvedMigration>();
        // resolve the migration and populate it with the migration info
        for (SpringJdbcMigration springJdbcMigrationBean : springJdbcMigrationBeans.values()) {
            ResolvedMigrationImpl resolvedMigration = extractMigrationInfo(springJdbcMigrationBean);
            resolvedMigration.setPhysicalLocation(ClassUtils.getLocationOnDisk(springJdbcMigrationBean.getClass()));
            resolvedMigration.setExecutor(new SpringJdbcMigrationExecutor(springJdbcMigrationBean));
            resolvedMigrations.add(resolvedMigration);
        }
        Collections.sort(resolvedMigrations, new ResolvedMigrationComparator());
        return resolvedMigrations;
    }

    ResolvedMigrationImpl extractMigrationInfo(SpringJdbcMigration springJdbcMigration) {
        Integer checksum = null;
        if (springJdbcMigration instanceof MigrationChecksumProvider) {
            MigrationChecksumProvider checksumProvider = (MigrationChecksumProvider) springJdbcMigration;
            checksum = checksumProvider.getChecksum();
        }
        String description;
        MigrationVersion version;
        if (springJdbcMigration instanceof MigrationInfoProvider) {
            MigrationInfoProvider infoProvider = (MigrationInfoProvider) springJdbcMigration;
            version = infoProvider.getVersion();
            description = infoProvider.getDescription();
            if (!StringUtils.hasText(description)) {
                throw new FlywayException("Missing description for migration " + version);
            }
        } else {
            String shortName = ClassUtils.getShortName(springJdbcMigration.getClass());
            if (!shortName.startsWith("V") && !shortName.startsWith("R")) {
                throw new FlywayException("Invalid Jdbc migration class name: " + springJdbcMigration.getClass().getName()
                        + " => ensure it starts with V or R, or implement org.flywaydb.core.api.migration.MigrationInfoProvider for non-default naming");
            }
            String prefix = shortName.substring(0, 1);
            Pair info = MigrationInfoHelper.extractVersionAndDescription(shortName, prefix, "__", "");
            version = (MigrationVersion) info.getLeft();
            description = (String) info.getRight();
        }
        ResolvedMigrationImpl resolvedMigration = new ResolvedMigrationImpl();
        resolvedMigration.setVersion(version);
        resolvedMigration.setDescription(description);
        resolvedMigration.setScript(springJdbcMigration.getClass().getName());
        resolvedMigration.setChecksum(checksum);
        resolvedMigration.setType(MigrationType.SPRING_JDBC);
        return resolvedMigration;
    }
}
2) Add a new configuration class to post process the Spring Boot generated Flyway instance:
import org.flywaydb.core.Flyway;
import org.flywaydb.core.internal.dbsupport.DbSupport;
import org.flywaydb.core.internal.dbsupport.h2.H2DbSupport;
import org.flywaydb.core.internal.dbsupport.mysql.MySQLDbSupport;
import com.pegusapps.zebra.infrastructure.repository.flyway.ApplicationContextAwareSpringJdbcMigrationResolver;
import org.flywaydb.core.internal.resolver.sql.SqlMigrationResolver;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.PlaceholderReplacer;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

import javax.sql.DataSource;
import java.sql.SQLException;

@Configuration
@ComponentScan("db.migration")
public class FlywayConfiguration {

    @Bean
    public BeanPostProcessor postProcessFlyway(ApplicationContext context) {
        return new BeanPostProcessor() {
            @Override
            public Object postProcessBeforeInitialization(Object o, String s) throws BeansException {
                return o;
            }

            @Override
            public Object postProcessAfterInitialization(Object o, String s) throws BeansException {
                if (o instanceof Flyway) {
                    Flyway flyway = (Flyway) o;
                    flyway.setSkipDefaultResolvers(true);
                    ApplicationContextAwareSpringJdbcMigrationResolver resolver = new ApplicationContextAwareSpringJdbcMigrationResolver(
                            new Scanner(Thread.currentThread().getContextClassLoader()),
                            new Location("classpath:db/migration"),
                            context.getBean(org.flywaydb.core.api.configuration.FlywayConfiguration.class),
                            context);
                    SqlMigrationResolver sqlMigrationResolver = null;
                    try {
                        sqlMigrationResolver = new SqlMigrationResolver(
                                getDbSupport(),
                                new Scanner(Thread.currentThread().getContextClassLoader()),
                                new Location("classpath:db/migration"),
                                PlaceholderReplacer.NO_PLACEHOLDERS,
                                "UTF-8",
                                "V",
                                "R",
                                "__",
                                ".sql");
                    } catch (SQLException e) {
                        e.printStackTrace();
                    }
                    flyway.setResolvers(sqlMigrationResolver, resolver);
                }
                return o;
            }

            private DbSupport getDbSupport() throws SQLException {
                DataSource dataSource = context.getBean(DataSource.class);
                if (((org.apache.tomcat.jdbc.pool.DataSource) dataSource).getDriverClassName().equals("org.h2.Driver")) {
                    return new H2DbSupport(dataSource.getConnection());
                } else {
                    return new MySQLDbSupport(dataSource.getConnection());
                }
            }
        };
    }
}
Note that I have some hardcoded dependencies on the Tomcat JDBC pool, H2 and MySQL. If you are using something else, you will need to change the code there (if anybody knows how to avoid this, please comment!).
Also note that the @ComponentScan package needs to match where you will put the Java migration classes.
Also note that I had to add the SqlMigrationResolver back in, since I want to support both the SQL and the Java flavour of migrations.
3) Create a Java class in the db.migration package that does the actual migration:
@Component
public class V2__add_default_surveys implements SpringJdbcMigration {

    private final SurveyRepository surveyRepository;

    @Autowired
    public V2__add_default_surveys(SurveyRepository surveyRepository) {
        this.surveyRepository = surveyRepository;
    }

    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        surveyRepository.save(...);
    }
}
Note that you need to make the class a @Component and it needs to implement SpringJdbcMigration. In this class you can use Spring constructor injection for any Spring bean from your context that you might need to do the migration(s).
Note: be sure to disable Hibernate's DDL validation, because the validation seems to run before Flyway does:
spring.jpa.hibernate.ddl-auto=none
In short: do not autowire beans in your DB migrations, or even reference classes from your application!
If you refactor, delete or change classes referenced in a migration, it may no longer compile or, worse, it may corrupt your migrations.
The overhead of using the plain JdbcTemplate for migrations is not worth the risk.
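To illustrate this recommendation, a migration written only against the JdbcTemplate might look like the sketch below; the version number, table and column names are hypothetical:
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;

public class V3__insert_default_data implements SpringJdbcMigration {

    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        // plain SQL only: no application beans or entity classes are referenced,
        // so later refactorings cannot break or silently change this migration
        jdbcTemplate.update("INSERT INTO survey (name) VALUES (?)", "Default survey");
    }
}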
If you are using DeltaSpike you can use BeanProvider to get a reference to your class. Here is a DAO example, but it should work fine with your class too.
Change your DAO code:
public static UserDao getInstance() {
    return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
Then in your migration method:
UserDao userdao = UserDao.getInstance();
And there you've got your reference.
(referenced from: Flyway Migration with java)
