Unable to run Quartz JDBCJobStore with AbstractRoutingDataSource - spring-boot

I have implemented the application using Spring's RoutingDataSource:
Spring -> DS1, DS2
Based on the logged-in URL I am changing the data source, and that works fine.
Coming to Quartz, however, I am unable to change the data source dynamically: jobs are always getting scheduled on the default data source.
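For reference, the routing key in a setup like this comes from a small context holder; the asker's DataSourceContextHolder class is not shown, so the following is only a minimal sketch of the assumed shape (getBranchContext() is referenced below; the setter and clear method names are hypothetical):
import org.springframework.stereotype.Component;

@Component
public class DataSourceContextHolder {

    private static final ThreadLocal<DataSourceEnum> CONTEXT = new ThreadLocal<>();

    // Hypothetical setter, called e.g. from a filter or interceptor once the tenant/URL is known.
    public void setBranchContext(DataSourceEnum dataSource) {
        CONTEXT.set(dataSource);
    }

    // Referenced by DataSourceRouting.determineCurrentLookupKey() below.
    public DataSourceEnum getBranchContext() {
        return CONTEXT.get();
    }

    // Hypothetical cleanup, to avoid leaking the key across pooled request threads.
    public void clearBranchContext() {
        CONTEXT.remove();
    }
}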
@Configuration
public class SchedulerConfig {
@Autowired
private DataSource dataSource;
@Autowired
private QuartzProperties quartzProperties;
@Bean
public JobFactory jobFactory(ApplicationContext applicationContext) {
AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
jobFactory.setApplicationContext(applicationContext);
return jobFactory;
}
@Bean
public SchedulerFactoryBean schedulerFactoryBean(JobFactory jobFactory) {
Properties properties = new Properties();
properties.putAll(quartzProperties.getProperties());
SchedulerFactoryBean factory = new SchedulerFactoryBean();
factory.setJobFactory(jobFactory);
factory.setDataSource(dataSource);
factory.setGlobalJobListeners(jobListener());
factory.setQuartzProperties(properties);
return factory;
}
@Bean
public JobListenerSupport jobListener() {
return new JobListener();
}
}
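The AutowiringSpringBeanJobFactory used in jobFactory() is not shown in the question; it is usually implemented along these lines (a sketch of the common pattern, not necessarily the asker's exact class):
import org.quartz.spi.TriggerFiredBundle;
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

public class AutowiringSpringBeanJobFactory extends SpringBeanJobFactory implements ApplicationContextAware {

    private transient AutowireCapableBeanFactory beanFactory;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        this.beanFactory = applicationContext.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(TriggerFiredBundle bundle) throws Exception {
        // Let Quartz create the job instance, then autowire its Spring dependencies.
        Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}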
Data Source Routing Configuration:
@Component
public class DataSourceRouting extends AbstractRoutingDataSource {
private DataSourceOneConfig dataSourceOneConfig;
private DataSourceTwoConfig dataSourceTwoConfig;
private DataSourceContextHolder dataSourceContextHolder;
public DataSourceRouting(DataSourceContextHolder dataSourceContextHolder, DataSourceOneConfig dataSourceOneConfig,
DataSourceTwoConfig dataSourceTwoConfig) {
this.dataSourceOneConfig = dataSourceOneConfig;
this.dataSourceTwoConfig = dataSourceTwoConfig;
this.dataSourceContextHolder = dataSourceContextHolder;
Map<Object, Object> dataSourceMap = new HashMap<>();
dataSourceMap.put(DataSourceEnum.tenant1, dataSourceOneDataSource());
dataSourceMap.put(DataSourceEnum.tenant2, dataSourceTwoDataSource());
this.setTargetDataSources(dataSourceMap);
this.setDefaultTargetDataSource(dataSourceTwoDataSource());
}
@Override
protected Object determineCurrentLookupKey() {
return dataSourceContextHolder.getBranchContext();
}
public DataSource dataSourceOneDataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setUrl(dataSourceOneConfig.getUrl());
dataSource.setUsername(dataSourceOneConfig.getUsername());
dataSource.setPassword(dataSourceOneConfig.getPassword());
return dataSource;
}
public DataSource dataSourceTwoDataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setUrl(dataSourceTwoConfig.getUrl());
dataSource.setUsername(dataSourceTwoConfig.getUsername());
dataSource.setPassword(dataSourceTwoConfig.getPassword());
return dataSource;
}
}
Data Source Config:
@RequiredArgsConstructor
@DependsOn("dataSourceRouting")
public class DataSourceConfig {
private final DataSourceRouting dataSourceRouting;
@Bean
@Primary
public DataSource dataSource() {
return dataSourceRouting;
}
#Bean(name = "entityManager")
public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean(EntityManagerFactoryBuilder builder) {
return builder.dataSource(dataSource()).packages("com.model.entity").build();
}
#Bean(name = "transcationManager")
public JpaTransactionManager transactionManager(
#Autowired #Qualifier("entityManager") LocalContainerEntityManagerFactoryBean entityManagerFactoryBean) {
return new JpaTransactionManager(entityManagerFactoryBean.getObject());
}
}

Related

Spring 5 open session in view

I'm trying to implement the OpenSessionInView pattern (antipattern) with Spring 5. I configured OpenSessionInView via Java config (not XML) in my WebMvcConfigurer, and watching the logs it seems to be running already. But when I try to load a lazy collection I get the well-known "Could not write JSON: failed to lazily initialize a collection of role:..."
The crazy thing is that I see this in my logs:
DEBUG [http-nio-8080-exec-7] (OpenSessionInViewInterceptor.java:128) - Opening Hibernate Session in OpenSessionInViewInterceptor
DEBUG [http-nio-8080-exec-7] (AbstractPlatformTransactionManager.java:370) - Creating new transaction with name [org.springframework.data.jpa.repository.support.SimpleJpaRepository.findById]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT,readOnly
DEBUG [http-nio-8080-exec-7] (JpaTransactionManager.java:393) - Opened new EntityManager [SessionImpl(1867707287<open>)] for JPA transaction
DEBUG [http-nio-8080-exec-7] (DriverManagerDataSource.java:144) - Creating new JDBC DriverManager Connection to [jdbc:mysql://MYIP:3306/DB_NAME?useUnicode=true&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=Europe/Madrid]
DEBUG [http-nio-8080-exec-7] (DataSourceUtils.java:186) - Setting JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl#2e6ff9ac] read-only
DEBUG [http-nio-8080-exec-7] (TransactionImpl.java:56) - On TransactionImpl creation, JpaCompliance#isJpaTransactionComplianceEnabled == false
...
...
...
WARN [http-nio-8080-exec-7] (AbstractHandlerExceptionResolver.java:199) - Resolved [org.springframework.http.converter.HttpMessageNotWritableException: Could not write JSON: failed to lazily initialize a collection of role: com....Route.styles, could not initialize proxy - no Session; nested exception is com.fasterxml.jackson.databind.JsonMappingException: failed to lazily initialize a collection of role...
DEBUG [http-nio-8080-exec-7] (OpenSessionInViewInterceptor.java:153) - Closing Hibernate Session in OpenSessionInViewInterceptor
So it seems that the session is opened as it should be, and it is not closed until after the exception is thrown. But my EntityManager is not using that session... Am I correct? How can I achieve that?
Thanks!
EDIT:
My WebConfig.java:
@Configuration
@ComponentScan
@EnableWebMvc
public class WebConfig implements WebMvcConfigurer {
#Value( "${resources.files.location}" )
private String fileLocation;
@Autowired
DataSource datasource;
@Autowired
private Environment env;
@Bean
public ResourceBundleMessageSource messageSource() {
ResourceBundleMessageSource messageSource = new ResourceBundleMessageSource();
messageSource.setBasename("Messages");
messageSource.setCacheMillis(10);
return messageSource;
}
@Bean
public LocaleResolver localeResolver() {
SessionLocaleResolver localeResolver = new SessionLocaleResolver();
localeResolver.setDefaultLocale(Locale.getDefault());
return localeResolver;
}
@Bean
public LocaleChangeInterceptor localeChangeInterceptor() {
LocaleChangeInterceptor localeInterceptor = new LocaleChangeInterceptor();
localeInterceptor.setIgnoreInvalidLocale(true);
localeInterceptor.setParamName("idioma");
return localeInterceptor;
}
@Bean
public PlatformTransactionManager hibernateTransactionManager() {
HibernateTransactionManager transactionManager
= new HibernateTransactionManager();
transactionManager.setSessionFactory(sessionFactoryBean().getObject());
return transactionManager;
}
@Bean
public ViewResolver internalResourceViewResolver() {
InternalResourceViewResolver internalResourceViewResolver = new InternalResourceViewResolver();
internalResourceViewResolver.setPrefix("/WEB-INF/views/");
internalResourceViewResolver.setSuffix(".jsp");
return internalResourceViewResolver;
}
public void addInterceptors(InterceptorRegistry registry) {
OpenSessionInViewInterceptor openSessionInViewInterceptor = new OpenSessionInViewInterceptor();
openSessionInViewInterceptor.setSessionFactory(sessionFactoryBean().getObject());
registry.addWebRequestInterceptor(openSessionInViewInterceptor).addPathPatterns("/**");
registry.addInterceptor(new miLoggerInterceptor());
registry.addInterceptor(localeChangeInterceptor());
}
@Bean
public FilterRegistrationBean openEntityManagerInViewFilter() {
FilterRegistrationBean reg = new FilterRegistrationBean();
reg.setName("OpenEntityManagerInViewFilter");
reg.setFilter(new OpenEntityManagerInViewFilter());
return reg;
}
@Bean
public LocalSessionFactoryBean sessionFactoryBean() {
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(datasource);
sessionFactory.setPackagesToScan("my.package");
sessionFactory.setHibernateProperties(hibernateProperties());
return sessionFactory;
}
private Properties hibernateProperties() {
Properties jpaProperties = new Properties();
jpaProperties.put("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
jpaProperties.put("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
jpaProperties.put("hibernate.format_sql", env.getProperty("hibernate.format_sql"));
return jpaProperties;
}
public void addResourceHandlers(final ResourceHandlerRegistry registry) {
registry.addResourceHandler("/js/**").addResourceLocations("/js/");
registry.addResourceHandler("/css/**").addResourceLocations("/css/");
registry.addResourceHandler("/img/**").addResourceLocations("/img/");
registry.addResourceHandler("/.well-known/acme-challenge/**").addResourceLocations("/.well-known/acme-challenge/");
registry.addResourceHandler("/webfonts/**").addResourceLocations("/webfonts/");
registry.addResourceHandler("/multimedia/").addResourceLocations("file:"+fileLocation);
registry.addResourceHandler("swagger-ui.html").addResourceLocations("classpath:/META-INF/resources/");
registry.addResourceHandler("/webjars/**").addResourceLocations("classpath:/META-INF/resources/webjars/");
registry.addResourceHandler("/files/**").addResourceLocations("file:///C:/tmp/images/");
registry.addResourceHandler("/favico.ico").addResourceLocations("/favico.ico");
}
}
BusinessConfig.java:
@Configuration
@ComponentScan
@PropertySources({
@PropertySource("classpath:application.properties"),
})
@EnableJpaRepositories("com.muskers.web.business.repositories")
public class BusinessConfig {
@Autowired
private Environment env;
@Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(env.getProperty("db.driver"));
dataSource.setUrl(env.getProperty("db.url"));
dataSource.setUsername(env.getProperty("db.username"));
dataSource.setPassword(env.getProperty("db.password"));
return dataSource;
}
@Bean
public PasswordEncoder delegatingPasswordEncoder() {
PasswordEncoder defaultEncoder = PasswordEncoderFactories.createDelegatingPasswordEncoder();
Map<String, PasswordEncoder> encoders = new HashMap<>();
encoders.put("bcrypt", new BCryptPasswordEncoder());
encoders.put("scrypt", new SCryptPasswordEncoder());
encoders.put("pbkdf2", new Pbkdf2PasswordEncoder());
DelegatingPasswordEncoder passwordEncoder = new DelegatingPasswordEncoder(
"bcrypt", encoders);
passwordEncoder.setDefaultPasswordEncoderForMatches(defaultEncoder);
return passwordEncoder;
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
emf.setDataSource(dataSource());
emf.setPackagesToScan(User.class.getPackage().getName());
HibernateJpaVendorAdapter hibernateJpa = new HibernateJpaVendorAdapter();
hibernateJpa.setDatabase(Database.MYSQL);
hibernateJpa.setDatabasePlatform(env.getProperty("hibernate.dialect"));
hibernateJpa.setGenerateDdl(env.getProperty("hibernate.generateDdl", Boolean.class));
hibernateJpa.setShowSql(env.getProperty("hibernate.show_sql", Boolean.class));
emf.setJpaVendorAdapter(hibernateJpa);
Properties jpaProperties = new Properties();
jpaProperties.put("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
jpaProperties.put("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
jpaProperties.put("hibernate.format_sql", env.getProperty("hibernate.format_sql"));
emf.setJpaProperties(jpaProperties);
return emf;
}
@Bean
public JpaTransactionManager transactionManager() {
JpaTransactionManager txnMgr = new JpaTransactionManager();
txnMgr.setEntityManagerFactory(entityManagerFactory().getObject());
return txnMgr;
}
@PostConstruct
public void setTimeZone() {
TimeZone.setDefault(TimeZone.getTimeZone("Europe/Madrid"));
}
public class Roles {
public final static String ROLE_USER = "USER";
public final static String ROLE_ADMIN = "ADMIN";
}
public class Authorities {
public final static String MANAGE_GYMS = "MANAGE_GYMS";
public final static String MANAGE_USERS = "MANAGE_USERS";
public final static String READ_GYMS = "READ_GYMS";
public final static String CREATE_GYMS = "CREATE_GYMS";
public final static String CREATE_ROUTES = "CREATE_ROUTES";
public final static String READ_ROUTES = "READ_ROUTES";
public final static String MANAGE_ROUTES = "MANAGE_ROUTES";
}
}
App.java:
public class App extends AbstractAnnotationConfigDispatcherServletInitializer
{
@Override
protected Class<?>[] getRootConfigClasses() {
return new Class<?>[] { BusinessConfig.class, WebSecurityConfig.class };
}
@Override
protected Class<?>[] getServletConfigClasses() {
return new Class<?>[] { WebConfig.class };
}
@Override
protected String[] getServletMappings() {
return new String[] { "/" };
}
@Override
protected Filter[] getServletFilters() {
CharacterEncodingFilter characterEncodingFilter = new CharacterEncodingFilter();
characterEncodingFilter.setEncoding("UTF-8");
return new Filter[] { characterEncodingFilter};
}
}

Data Migration from One Db to another using Spring batch and Spring boot

I'm new to the Spring framework and I want to migrate data from one DB to another using Spring Boot and Spring Batch.
I'm trying to read from a MySQL DB with an item reader and write to an Oracle DB using an item writer in the same job.
I'm able to read the data from the MySQL DB but unable to write it to the Oracle DB, as the writer keeps writing to the MySQL DB itself.
I'm not sure why the connection is not switching to the Oracle DB.
Please help me figure out what I'm doing wrong.
Application properties
server.port=8082
spring.batch.job.enabled= false
spring.datasource.url = jdbc:mysql://localhost:3306/projectdb1
spring.datasource.username = root
spring.datasource.password = Mech_2015
spring.datasource.driverClassName = com.mysql.jdbc.Driver
#second db2 ...
db2.datasource.url = jdbc:oracle:thin:@localhost:1521:xe
db2.datasource.username = system
db2.datasource.password = Mech_2015
db2.datasource.driverClassName = oracle.jdbc.driver.OracleDriver
Entity Managers
package com.techprimers.springbatchexample1.entity.manger;
@Configuration
@EnableJpaRepositories(entityManagerFactoryRef = "radiusEntityManager", transactionManagerRef = "radiusTransactionManager", basePackages = "com.techprimers.springbatchexample1.repository.userrepo")
public class RadiusConfig {
private final PersistenceUnitManager persistenceUnitManager;
public RadiusConfig(ObjectProvider<PersistenceUnitManager> persistenceUnitManager) {
this.persistenceUnitManager = persistenceUnitManager.getIfAvailable();
}
@Bean
@ConfigurationProperties("spring.jpa")
public JpaProperties radiusJpaProperties() {
return new JpaProperties();
}
@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public DataSourceProperties radiusDataSourceProperties() {
return new DataSourceProperties();
}
@Bean
@ConfigurationProperties(prefix = "spring.datasource.properties")
public DataSource radiusDataSource() {
return (DataSource) radiusDataSourceProperties().initializeDataSourceBuilder().type(DataSource.class).build();
}
#Bean(name = "radiusEntityManager")
public LocalContainerEntityManagerFactoryBean radiusEntityManager(JpaProperties radiusJpaProperties) {
EntityManagerFactoryBuilder builder = createEntityManagerFactoryBuilder(radiusJpaProperties);
return builder.dataSource(radiusDataSource()).packages(User.class).persistenceUnit("userDs").build();
}
@Bean
public JpaTransactionManager radiusTransactionManager(EntityManagerFactory radiusEntityManager) {
return new JpaTransactionManager(radiusEntityManager);
}
private EntityManagerFactoryBuilder createEntityManagerFactoryBuilder(JpaProperties radiusJpaProperties) {
JpaVendorAdapter jpaVendorAdapter = createJpaVendorAdapter(radiusJpaProperties);
return new EntityManagerFactoryBuilder(jpaVendorAdapter, radiusJpaProperties.getProperties(),
this.persistenceUnitManager);
}
private JpaVendorAdapter createJpaVendorAdapter(JpaProperties jpaProperties) {
AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
adapter.setShowSql(jpaProperties.isShowSql());
adapter.setDatabase(jpaProperties.getDatabase());
adapter.setDatabasePlatform(jpaProperties.getDatabasePlatform());
adapter.setGenerateDdl(jpaProperties.isGenerateDdl());
return adapter;
}
}
package com.techprimers.springbatchexample1.entity.manger;
@Configuration
@EnableJpaRepositories(entityManagerFactoryRef = "esbEntityManager", transactionManagerRef = "esbDetailsTransactionManager", basePackages = "com.techprimers.springbatchexample1.repository.userdetails")
public class EsbConfig {
private final PersistenceUnitManager persistenceUnitManager;
public EsbConfig(ObjectProvider<PersistenceUnitManager> persistenceUnitManager) {
this.persistenceUnitManager = persistenceUnitManager.getIfAvailable();
}
@Bean
@ConfigurationProperties("db2.jpa")
public JpaProperties esbJpaProperties() {
return new JpaProperties();
}
@Bean
@ConfigurationProperties("db2.datasource")
public DataSourceProperties esbDataSourceProperties() {
return new DataSourceProperties();
}
@Bean
@ConfigurationProperties(prefix = "db2.datasource.properties")
public DataSource esbDataSource() {
return (DataSource) esbDataSourceProperties().initializeDataSourceBuilder().type(DataSource.class).build();
}
#Bean(name = "esbEntityManager")
public LocalContainerEntityManagerFactoryBean esbEntityManager(JpaProperties esbJpaProperties) {
EntityManagerFactoryBuilder builder = createEntityManagerFactoryBuilder(esbJpaProperties);
return builder.dataSource(esbDataSource()).packages(UserDetails.class).persistenceUnit("userDetailDs").build();
}
@Bean
public JpaTransactionManager esbTransactionManager(EntityManagerFactory esbEntityManager) {
return new JpaTransactionManager(esbEntityManager);
}
private EntityManagerFactoryBuilder createEntityManagerFactoryBuilder(JpaProperties esbJpaProperties) {
JpaVendorAdapter jpaVendorAdapter = createJpaVendorAdapter(esbJpaProperties);
return new EntityManagerFactoryBuilder(jpaVendorAdapter, esbJpaProperties.getProperties(),
this.persistenceUnitManager);
}
private JpaVendorAdapter createJpaVendorAdapter(JpaProperties jpaProperties) {
AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
adapter.setShowSql(jpaProperties.isShowSql());
adapter.setDatabase(jpaProperties.getDatabase());
adapter.setDatabasePlatform(jpaProperties.getDatabasePlatform());
adapter.setGenerateDdl(jpaProperties.isGenerateDdl());
return adapter;
}
}
Batch Config
package com.techprimers.springbatchexample1.config;
@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {
private static final Logger LOGGER = LoggerFactory.getLogger(SpringBatchConfig.class);
@Bean
@Primary
@Qualifier("radiusDatasource")
@ConfigurationProperties(prefix = "spring.datasource")
DataSource mysqlDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "db2.datasource")
@Qualifier("esbDatasource")
DataSource oracleDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
public InspectionProcessor processor() {
return new InspectionProcessor();
}
@Bean
public JdbcCursorItemReader<User> reader() {
JdbcCursorItemReader<User> cursorItemReader = new JdbcCursorItemReader<>();
cursorItemReader.setDataSource(mysqlDataSource());
cursorItemReader.setSql("SELECT ID,CNAME,SID,CREATEDDATE,COMPLETEDDATE FROM INSPECTION");
cursorItemReader.setRowMapper(new InspectionDetailRowmapper());
return cursorItemReader;
}
private static final String QUERY_INSERT_STUDENT = "INSERT "
+ "INTO inspect(id,cname,completeddate,createddate,lastupdateddate,sid) " + "VALUES (?,?,?,?,?,?)";
@Bean
ItemWriter<UserDetails> databaseItemWriter(DataSource dataSource, NamedParameterJdbcTemplate jdbcTemplate) {
LOGGER.info("Starting writer");
JdbcBatchItemWriter<UserDetails> databaseItemWriter = new JdbcBatchItemWriter<>();
databaseItemWriter.setDataSource(oracleDataSource());
databaseItemWriter.setJdbcTemplate(jdbcTemplate);
LOGGER.info(" writer");
databaseItemWriter.setSql(QUERY_INSERT_STUDENT);
ItemPreparedStatementSetter<UserDetails> valueSetter = new UserDetailsPreparedStatementSetter();
databaseItemWriter.setItemPreparedStatementSetter(valueSetter);
return databaseItemWriter;
}
@Bean
Step dataMigrationStep(ItemReader<User> reader, ItemProcessor<User, UserDetails> processor,
ItemWriter<UserDetails> databaseItemWriter, StepBuilderFactory stepBuilderFactory) {
return stepBuilderFactory.get("dataMigrationStep").<User, UserDetails>chunk(5).reader(reader)
.processor(processor).writer(databaseItemWriter).build();
}
@Bean
Job dataMigrationJob(JobBuilderFactory jobBuilderFactory, @Qualifier("dataMigrationStep") Step dataMigrationStep) {
return jobBuilderFactory.get("csvFileToDatabaseJob").incrementer(new RunIdIncrementer()).flow(dataMigrationStep)
.end().build();
}
}
Your writer does not properly inject the data source it should write to.
You have this:
@Bean
ItemWriter<UserDetails> databaseItemWriter(DataSource dataSource, NamedParameterJdbcTemplate jdbcTemplate)
instead of:
@Bean
ItemWriter<UserDetails> databaseItemWriter(@Qualifier("esbDatasource") DataSource dataSource, NamedParameterJdbcTemplate jdbcTemplate)
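For completeness, a minimal sketch of the corrected writer bean, reusing the question's names. It relies only on the qualified Oracle data source and drops the injected NamedParameterJdbcTemplate, which Spring Boot auto-configures against the primary (MySQL) data source and which can otherwise end up being the template actually used:
@Bean
ItemWriter<UserDetails> databaseItemWriter(@Qualifier("esbDatasource") DataSource dataSource) {
    JdbcBatchItemWriter<UserDetails> writer = new JdbcBatchItemWriter<>();
    // Build the writer's internal template from the Oracle data source only.
    writer.setDataSource(dataSource);
    writer.setSql(QUERY_INSERT_STUDENT);
    writer.setItemPreparedStatementSetter(new UserDetailsPreparedStatementSetter());
    return writer;
}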

HikariCP configuration for multiple datasources

I have a multi-database application. Users can select the database on the login page.
The selected database is then routed to, thanks to Spring's AbstractRoutingDataSource.
I want to use HikariCP, but it needs a data source URL, and my data source URL changes dynamically. How can I configure HikariCP for multiple databases?
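(For context: a pool's JDBC URL is fixed for the lifetime of the pool, so the usual approach is not to change HikariCP's URL at runtime but to create one fixed pool per database and let AbstractRoutingDataSource pick between them. A minimal sketch for the first database, using the property names from the file below and HikariCP's own config class; the bean name and pool name are illustrative:)
@Bean
public DataSource database1DataSource(Environment env) {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(env.getProperty("app.database1.connection.url"));
    config.setUsername(env.getProperty("app.database1.connection.username"));
    config.setPassword(env.getProperty("app.database1.connection.password"));
    config.setPoolName("database1-pool"); // illustrative
    // The routing decision is made above the pools, so each pool keeps a fixed URL.
    return new HikariDataSource(config);
}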
File application.properties:
#database1 properties
app.database1.connection.url = url1
app.database1.connection.username = sameusername
app.database1.connection.password = samepassword
#database2 properties
app.database2.connection.url = url2
app.database2.connection.username = sameusername
app.database2.connection.password = samepassword
My Datasource configuration class example:
public class DataSourceConfiguration {
@Autowired(required = false)
private PersistenceUnitManager persistenceUnitManager;
@Bean
@ConfigurationProperties(prefix = "app.database1.connection")
public DataSource database1DataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "app.database2.connection")
public DataSource database2DataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@Primary
public DataSource appDataSource() {
DataSourceRouter router = new DataSourceRouter();
final HashMap<Object, Object> map = new HashMap<>(3);
map.put(DatabaseEnvironment.DATABASE1, database1DataSource());
map.put(DatabaseEnvironment.DATABASE2, database2DataSource());
router.setTargetDataSources(map);
return router;
}
@Bean
@Primary
@ConfigurationProperties("app.connection.jpa")
public JpaProperties appJpaProperties() {
return new JpaProperties();
}
private JpaVendorAdapter createJpaVendorAdapter(JpaProperties jpaProperties) {
AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
adapter.setShowSql(jpaProperties.isShowSql());
adapter.setDatabase(jpaProperties.getDatabase());
adapter.setDatabasePlatform(jpaProperties.getDatabasePlatform());
adapter.setGenerateDdl(jpaProperties.isGenerateDdl());
return adapter;
}
}
My session-scoped class used instead of a context holder:
@Component
@Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class PreferredDatabaseSession implements Serializable {
/**
*
*/
private static final long serialVersionUID = 1L;
private DatabaseEnvironment preferredDb;
public DatabaseEnvironment getPreferredDb() {
return preferredDb;
}
public void setPreferredDb(DatabaseEnvironment preferredDb) {
this.preferredDb = preferredDb;
}
}
If I understand your requirement correctly, you intend to define two data sources and for a given request you want to route your queries to a particular data source based on some condition.
The solution is:
File application.properties
#database1 properties
app.database1.connection.url = url1
app.database1.connection.username = username1
app.database1.connection.password = password1
#database2 properties
app.database2.connection.url = url2
app.database2.connection.username = username2
app.database2.connection.password = password2
#default
default.datasource.key=dataSource1
File CommonRoutingDataSource.java
public class CommonRoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
return DataSourceContextHolder.getDataSourceName();
}
public void initDataSources(final DataSource dataSource1, final DataSource dataSource2,
final String defaultDataSourceKey) {
final Map<Object, Object> dataSourceMap = new HashMap<Object, Object>();
dataSourceMap.put("dataSource1", dataSource1);
dataSourceMap.put("dataSource2", dataSource2);
this.setDefaultTargetDataSource(dataSourceMap.get(defaultDataSourceKey));
this.setTargetDataSources(dataSourceMap);
}
}
File DataSourceContextHolder.java
public class DataSourceContextHolder {
private static final ThreadLocal<String> contextHolder = new ThreadLocal<>();
private DataSourceContextHolder() {
// Private no-op constructor
}
public static final void setDataSourceName(final String dataSourceName) {
Assert.notNull(dataSourceName, "dataSourceName cannot be null");
contextHolder.set(dataSourceName);
}
public static final String getDataSourceName() {
return contextHolder.get();
}
public static final void clearDataSourceName() {
contextHolder.remove();
}
}
File DataSourceConfig.java
public class DataSourceConfig {
@Autowired
private Environment env;
@Autowired
@Bean(name = "dataSource")
public DataSource getDataSource(final DataSource dataSource1, final DataSource dataSource2) {
final CommonRoutingDataSource dataSource = new CommonRoutingDataSource();
dataSource.initDataSources(dataSource1, dataSource2, env.getProperty("default.datasource.key"));
return dataSource;
}
#Bean(name = "dataSource1")
public DataSource getDataSource1() throws SQLException {
// The exact DataSource class imported shall be as per your requirement - HikariCP, or Tomcat etc.
final DataSource dataSource = new DataSource();
dataSource.setDriverClassName(env.getProperty("app.database1.connection.driverClassName")); // assuming a driverClassName property defined alongside the url
dataSource.setUrl(env.getProperty("app.database1.connection.url"));
// Set all data source attributes from the application.properties file
return dataSource;
}
#Bean(name = "dataSource2")
public DataSource getDataSource2() throws SQLException {
// The exact DataSource class imported shall be as per your requirement - HikariCP, or Tomcat etc.
final DataSource dataSource = new DataSource();
dataSource.setDriverClassName(env.getProperty("app.database2.connection.driverClassName")); // assuming a driverClassName property defined alongside the url
dataSource.setUrl(env.getProperty("app.database2.connection.url"));
// set all data source attributes from the application.properties file
return dataSource;
}
}
Now, somewhere in your code (either an Aspect or Controller), you need to dynamically set the data source conditionally:
DataSourceContextHolder.setDataSourceName("dataSource1");
Note: It's better to declare the data source names as enums rather than strings "dataSource1", "dataSource2", etc.
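For instance, the key can be set in a Spring MVC interceptor before the controller runs and cleared afterwards, so the ThreadLocal never leaks across pooled request threads (a sketch; the interceptor class and the condition used to pick the key are illustrative):
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.web.servlet.HandlerInterceptor;

public class DataSourceRoutingInterceptor implements HandlerInterceptor {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        // Choose the key from whatever condition applies (logged-in user, tenant, URL, ...).
        DataSourceContextHolder.setDataSourceName("dataSource1");
        return true;
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) {
        // Always clear the ThreadLocal so the next request on this thread starts clean.
        DataSourceContextHolder.clearDataSourceName();
    }
}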
The below snippet works for me
first.datasource.jdbc-url=jdbc-url
first.datasource.username=username
first.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
@Primary
@Bean(name = "firstDataSource")
@ConfigurationProperties(prefix = "first.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
@Primary
@Bean(name = "firstEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
@Qualifier("firstDataSource") DataSource dataSource) {
Map<String, String> props = new HashMap<String, String>();
props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
.
.
.
return builder.dataSource(dataSource).packages("com.first.entity").persistenceUnit("firstDB")
.properties(props)
.build();
}
@Primary
@Bean(name = "firstTransactionManager")
public PlatformTransactionManager firstTransactionManager(
@Qualifier("firstEntityManagerFactory") EntityManagerFactory firstEntityManagerFactory) {
return new JpaTransactionManager(firstEntityManagerFactory);
}
second.datasource.jdbc-url=jdbc-url
second.datasource.username=username
second.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
#Bean(name = "secondDataSource")
#ConfigurationProperties(prefix = "second.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "secondEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
#Qualifier("secondDataSource") DataSource dataSource) {
Map<String, String> props = new HashMap<String, String>();
props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
.
.
.
return builder.dataSource(dataSource).packages("com.second.entity").persistenceUnit("secondDB")
.properties(props)
.build();
}
#Bean(name = "secondTransactionManager")
public PlatformTransactionManager secondTransactionManager(
#Qualifier("secondEntityManagerFactory") EntityManagerFactory secondEntityManagerFactory) {
return new JpaTransactionManager(secondEntityManagerFactory);
}

Spring boot Read write Split / Master - Slave / Multiple Databases

I am following this link:
https://github.com/kwon37xi/replication-datasource
I have implemented the code, but STILL both my service functions are using the same database (the one marked primary).
Service Class
public class TableService{
@Autowired
private Table1Repo t1Repo;
@Transactional(readOnly = false)
public void saveTable1(Table1 t,int a, Table1 t2){
try{
t1Repo.save(t2);
}
catch(Exception e){
System.out.println("Inside");
}
}
@Transactional(readOnly = true)
public Table1 getTable(int id){
return t1Repo.findOne(id);
}
}
Then Added two Class(from the link)
ReplicationRoutingDataSource
public class ReplicationRoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
String dataSourceType = TransactionSynchronizationManager.isCurrentTransactionReadOnly() ? "read" : "write";
return dataSourceType;
}
}
WithRoutingDataSourceConfig
@Configuration
public class WithRoutingDataSourceConfig {
/*@Bean(destroyMethod = "shutdown")*/
@Bean
@Primary
@ConfigurationProperties(prefix="datasource.primary")
public DataSource writeDataSource() {
/* EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder()
.setName("routingWriteDb")
.setType(EmbeddedDatabaseType.H2)
.setScriptEncoding("UTF-8")
.addScript("classpath:/writedb.sql");
return builder.build();*/
return DataSourceBuilder.create().build();
}
/* #Bean(destroyMethod = "shutdown")*/
#Bean
#ConfigurationProperties(prefix="datasource.secondary")
public DataSource readDataSource() {
/*EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder()
.setName("routingReadDb")
.setType(EmbeddedDatabaseType.H2)
.setScriptEncoding("UTF-8")
.addScript("classpath:/readdb.sql");
return builder.build();*/
return DataSourceBuilder.create().build();
}
/**
* {@link org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource} implements
* {@link org.springframework.beans.factory.InitializingBean}, so you must either call
* afterPropertiesSet() explicitly or register it as a separate @Bean so that it goes through
* the Spring life cycle.
*/
@Bean
public DataSource routingDataSource(@Qualifier("writeDataSource") DataSource writeDataSource, @Qualifier("readDataSource") DataSource readDataSource) {
ReplicationRoutingDataSource routingDataSource = new ReplicationRoutingDataSource();
Map<Object, Object> dataSourceMap = new HashMap<Object, Object>();
dataSourceMap.put("write", writeDataSource);
dataSourceMap.put("read", readDataSource);
routingDataSource.setTargetDataSources(dataSourceMap);
routingDataSource.setDefaultTargetDataSource(writeDataSource);
return routingDataSource;
}
/**
* Wrap it in a {@link org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy} so that
* the actual connection is only obtained after transaction synchronization has been established.
*
* @param routingDataSource
* @return
*/
@Bean
public DataSource dataSource(@Qualifier("routingDataSource") DataSource routingDataSource) {
return new LazyConnectionDataSourceProxy(routingDataSource);
}
}
application.properties file
server.port=8089
spring.jpa.show-sql = true
spring.jpa.properties.hibernate.show_sql=true
# Primary DataSource configuration
datasource.primary.url=jdbc:mysql://127.0.0.1:3306/jpa
datasource.primary.username=root
datasource.primary.password=root
# Any of the other Spring supported properties below...
# Secondary DataSource configuration
datasource.secondary.url=jdbc:mysql://127.0.0.1:3306/jpa2
datasource.secondary.username=root
datasource.secondary.password=root
Repository
public interface Table1Repo extends JpaRepository<Table1, Integer>{}
The issue is that both my service functions are using the primary database. What am I missing? I only have these classes; apart from them I have one controller.
Edited:
I have made my code work by adding this class:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages="com.example")
public class ReplicationDataSourceApplicationConfig {
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(@Qualifier("dataSource") DataSource dataSource) {
LocalContainerEntityManagerFactoryBean emfb = new LocalContainerEntityManagerFactoryBean();
emfb.setDataSource(dataSource);
emfb.setPackagesToScan("com.example");
HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
emfb.setJpaVendorAdapter(jpaVendorAdapter);
return emfb;
}
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory);
return transactionManager;
}
@Bean
public PersistenceExceptionTranslationPostProcessor exceptionTranslationPostProcessor() {
return new PersistenceExceptionTranslationPostProcessor();
}
}
I'm the writer of the link you referred to.
Which data source do you use with Table1Repo?
You have to inject the specific bean "dataSource" into your JDBC call.
I guess the @Primary "writeDataSource" is injected into your repository.
Try to change @Primary to "dataSource", or find a way to inject "dataSource" into your repository.
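Concretely, that means wiring the lazy routing proxy (the bean named "dataSource") into the JPA layer by qualifier instead of letting the @Primary "writeDataSource" win, which is essentially what the edit above ends up doing. A minimal sketch:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(
        @Qualifier("dataSource") DataSource dataSource) {
    LocalContainerEntityManagerFactoryBean emfb = new LocalContainerEntityManagerFactoryBean();
    // Use the LazyConnectionDataSourceProxy-wrapped routing data source, not the @Primary one,
    // so the read/write decision is made per transaction rather than fixed at startup.
    emfb.setDataSource(dataSource);
    emfb.setPackagesToScan("com.example");
    emfb.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
    return emfb;
}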
You can also try it this way:
Spring Boot 2 with Multiple DataSource for Postgres Data Replication
and here is the source code on GitHub:
spring-boot-multi-data-source
The following explains how you can have multiple datasources.
DB1 (write):
@Configuration
@ConfigurationProperties("spring.datasource-write")
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactoryWrite",
transactionManagerRef = "transactionManagerWrite",
basePackages = {"com.ehsaniara.multidatasource.repository.writeRepository"}
)
public class DataSourceConfigWrite extends HikariConfig {
public final static String PERSISTENCE_UNIT_NAME = "write";
@Bean
public HikariDataSource dataSourceWrite() {
return new HikariDataSource(this);
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactoryWrite(
final HikariDataSource dataSourceWrite) {
return new LocalContainerEntityManagerFactoryBean() {{
setDataSource(dataSourceWrite);
setPersistenceProviderClass(HibernatePersistenceProvider.class);
setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
setPackagesToScan(MODEL_PACKAGE);
setJpaProperties(JPA_PROPERTIES);
}};
}
@Bean
public PlatformTransactionManager transactionManagerWrite(EntityManagerFactory entityManagerFactoryWrite) {
return new JpaTransactionManager(entityManagerFactoryWrite);
}
}
DB2 (read):
@Configuration
@ConfigurationProperties("spring.datasource-read")
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactoryRead",
transactionManagerRef = "transactionManagerRead",
basePackages = {"com.ehsaniara.multidatasource.repository.readRepository"}
)
public class DataSourceConfigRead extends HikariConfig {
public final static String PERSISTENCE_UNIT_NAME = "read";
@Bean
public HikariDataSource dataSourceRead() {
return new HikariDataSource(this);
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactoryRead(
final HikariDataSource dataSourceRead) {
return new LocalContainerEntityManagerFactoryBean() {{
setDataSource(dataSourceRead);
setPersistenceProviderClass(HibernatePersistenceProvider.class);
setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
setPackagesToScan(MODEL_PACKAGE);
setJpaProperties(JPA_PROPERTIES);
}};
}
@Bean
public PlatformTransactionManager transactionManagerRead(EntityManagerFactory entityManagerFactoryRead) {
return new JpaTransactionManager(entityManagerFactoryRead);
}
}
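Since both configuration classes extend HikariConfig and are bound with @ConfigurationProperties, the matching properties would look roughly like this (key names follow HikariConfig's jdbcUrl/username/password fields via relaxed binding; the URLs and credentials are placeholders):
spring.datasource-write.jdbc-url=jdbc:mysql://localhost:3306/writedb
spring.datasource-write.username=write_user
spring.datasource-write.password=write_password
spring.datasource-read.jdbc-url=jdbc:mysql://localhost:3306/readdb
spring.datasource-read.username=read_user
spring.datasource-read.password=read_password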

Cannot autowire beans when separating configuration classes

I have a JavaConfig-configured Spring Batch job. The main job configuration file is CrawlerJobConfiguration. Until now I had all the configuration (infrastructure, autowired beans, etc.) in this class and it worked fine. So I decided to separate the job configuration from the autowired beans and infrastructure bean configuration and created two more configuration classes, Beans and MysqlInfrastructureConfiguration.
But now I am having problems running my job. I'm receiving a NullPointerException when the application tries to use autowired fields, which indicates that autowiring is not working.
I put a breakpoint in the methods that create the autowired beans to make sure they are being called, and they really are, so I cannot work out what the problem could be.
java.lang.NullPointerException: null
at br.com.alexpfx.supermarket.batch.tasklet.StartCrawlerTasklet.execute(StartCrawlerTasklet.java:27) ~[classes/:na]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406) ~[spring-batch-core-3.0.6.RELEASE.jar:3.0.6.RELEASE]
Main job configuration class:
@Configuration
@EnableBatchProcessing
public class CrawlerJobConfiguration {
@Autowired
private InfrastructureConfiguration infrastructureConfiguration;
@Autowired
private StepBuilderFactory steps;
@Autowired
Environment environment;
@Bean
public Job job(JobBuilderFactory jobs) {
Job theJob = jobs.get("job").start(crawlerStep()).next(processProductStep()).build();
((AbstractJob) theJob).setRestartable(true);
return theJob;
}
@Bean
public Step crawlerStep() {
TaskletStep crawlerStep = steps.get("crawlerStep").tasklet(crawlerTasklet()).build();
crawlerStep.setAllowStartIfComplete(true);
return crawlerStep;
}
@Bean
public Step processProductStep() {
TaskletStep processProductStep = steps.get("processProductStep")
.<TransferObject, Product>chunk(10)
.reader(reader())
.processor(processor())
.writer(writer())
.build();
processProductStep.setAllowStartIfComplete(true);
return processProductStep;
}
private Tasklet crawlerTasklet() {
return new StartCrawlerTasklet();
}
private ItemProcessor<TransferObject, Product> processor() {
return new ProductProcessor();
}
private ItemReader<TransferObject> reader() {
return new ProductItemReader();
}
private ItemWriter<Product> writer() {
return new HibernateProductsItemWriter();
}
}
Beans configuration class:
@Configuration
@EnableBatchProcessing
public class Beans {
@Bean
public Crawler crawler() {
return new RibeiraoCrawler(new UserAgentFactory());
}
@Bean
public ProductBo productBo() {
return new ProductBoImpl();
}
@Bean
public ProductDao productDao() {
return new ProductDaoImpl();
}
@Bean
public CrawlerListener listener() {
CrawlerListener listener = new RibeiraoListener();
return listener;
}
@Bean
public ProductList getProductList() {
return new ProductList();
}
}
MysqlInfrastructureConfiguration:
@Configuration
@EnableBatchProcessing
@PropertySource("classpath:database.properties")
@EnableJpaRepositories(basePackages = {"br.com.alexpfx.supermarket.domain"})
public class MysqlInfrastructureConfiguration implements InfrastructureConfiguration {
#Value("${jdbc.url}")
String url;
#Value("${jdbc.driverClassName}")
String driverClassName;
#Value("${jdbc.username}")
String username;
#Value("${jdbc.password}")
String password;
#Bean
#Override
public DataSource getDataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(driverClassName);
dataSource.setUrl(url);
dataSource.setUsername(username);
dataSource.setPassword(password);
return dataSource;
}
@Bean
@Override
public PlatformTransactionManager transactionManager() {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory());
transactionManager.setDataSource(getDataSource());
return transactionManager;
}
@Bean
@Override
public EntityManagerFactory entityManagerFactory() {
LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
em.setDataSource(getDataSource());
em.setPackagesToScan(new String[]{"br.com.alexpfx.supermarket.domain"});
JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
em.setJpaVendorAdapter(vendorAdapter);
em.setJpaProperties(additionalJpaProperties());
em.afterPropertiesSet();
return em.getObject();
}
private Properties additionalJpaProperties() {
Properties properties = new Properties();
properties.setProperty("hibernate.hbm2ddl.auto", "create");
properties.setProperty("hibernate.dialect", "org.hibernate.dialect.MySQL5Dialect");
properties.setProperty("hibernate.show_sql", "true");
properties.setProperty("current_session_context_class", "thread");
return properties;
}
}
tasklet:
public class StartCrawlerTasklet implements Tasklet {
@Autowired
private Crawler crawler;
@Autowired
private CrawlerListener listener;
@Override
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) throws Exception {
crawler.setListener(listener);
crawler.setStopCondition(new TimeLimitStopCondition(1, TimeUnit.MINUTES));
crawler.crawl();
return RepeatStatus.FINISHED;
}
}
StartCrawlerTasklet uses the @Autowired annotation, so it needs to be a Spring-managed bean as well; @Autowired is only processed on beans created by the container, which is why the crawler field is null when the tasklet is instantiated with new. So change your code:
private Tasklet crawlerTasklet() {
return new StartCrawlerTasklet();
}
to a bean definition:
@Bean
public Tasklet crawlerTasklet() {
return new StartCrawlerTasklet();
}
