MyBatis+Spring MapperScan with Multiple Data Sources - spring

I am pulling data from two different databases using MyBatis 3.3.1 and Spring 4.3. The two configuration classes that scan for mappers look as follows:
@Configuration
@MapperScan(value="com.mapper1.map",
        SqlSessionFactoryRef="sqlSessionFactory1")
public class AppConfig {

    @Bean
    public DataSource getDataSource1() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/database1");
        dataSource.setUsername("user");
        dataSource.setPassword("pw");
        return dataSource;
    }

    @Bean
    public DataSourceTransactionManager transactionManager1() {
        return new DataSourceTransactionManager(getDataSource1());
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory1() throws Exception {
        SqlSessionFactoryBean sessionFactory = new SqlSessionFactoryBean();
        sessionFactory.setDataSource(getDataSource1());
        return sessionFactory.getObject();
    }
}
@Configuration
@MapperScan(value="com.mapper2.map",
        SqlSessionFactoryRef="sqlSessionFactory2")
public class AppConfig {

    @Bean
    public DataSource getDataSource2() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3307/database2");
        dataSource.setUsername("user");
        dataSource.setPassword("pw");
        return dataSource;
    }

    @Bean
    public DataSourceTransactionManager transactionManager2() {
        return new DataSourceTransactionManager(getDataSource2());
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory2() throws Exception {
        SqlSessionFactoryBean sessionFactory = new SqlSessionFactoryBean();
        sessionFactory.setDataSource(getDataSource2());
        return sessionFactory.getObject();
    }
}
The code deploys fine, but only the mappers from data source 1 work. When I try to use a mapper from data source 2, I get a "No table found" exception from my database. The problem is that although I am setting the specific SqlSessionFactory that I want to use in the @MapperScan, it ends up using the other SqlSessionFactory for all the mappers. If I comment out the SqlSessionFactory in configuration 1, then configuration 2 works.
Note that if I don't use MapperScan, but instead use a MapperScannerConfigurer bean, I am able to correctly retrieve data.
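For reference, the MapperScannerConfigurer setup that does work for me looks roughly like this (a sketch from memory; the package and bean names just match the configuration above):

@Bean
public static MapperScannerConfigurer mapperScannerConfigurer1() {
    // bind the mappers in this package to sqlSessionFactory1 by bean name
    MapperScannerConfigurer configurer = new MapperScannerConfigurer();
    configurer.setBasePackage("com.mapper1.map");
    configurer.setSqlSessionFactoryBeanName("sqlSessionFactory1");
    return configurer;
}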
Has anyone else had problems using @MapperScan with multiple data sources?

The only issue I see in your code is that SqlSessionFactoryRef should start with a lowercase letter: sqlSessionFactoryRef. Apart from that everything is fine; this approach works for me.
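In other words, the annotation should look like this (a minimal sketch reusing the bean name from the question; the class name here is only illustrative):

@Configuration
@MapperScan(value = "com.mapper2.map",
        sqlSessionFactoryRef = "sqlSessionFactory2")
public class AppConfig2 {
    // data source, transaction manager and SqlSessionFactory beans as in the question
}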
You can also look at ace-mybatis. It allows you to work with multiple data sources while configuring only one bean.

Related

How do I configure two databases in Spring Boot?

I am completely new to Spring Boot and have read some documentation about how to create a Spring Boot application.
I have created an application in Spring Boot that should run a Spring Batch job (I am rewriting an old Spring Batch job as a stand-alone Spring Boot application). I have created the structure of the job with steps and so on. All the job does right now is move files, and that works. I work with an embedded database (H2) during development, and Spring Boot has generated the whole database for Spring Batch. Really nice :)
Now to my problem: in one of the steps I have to fetch and store data in another database, and I don't understand how I should create this database and access it in the job.
So in my application.properties I have:
spring.h2.console.enabled=true
spring.h2.console.path=/h2
spring.datasource.url=jdbc:h2:mem:springdb
spring.datasource.username=sa
spring.datasource.password=
spring.datasource.driver-class-name=org.h2.Driver
persons.h2.console.enabled=true
persons.h2.console.path=/h2
persons.datasource.url=jdbc:h2:mem:persons
persons.datasource.username=sa
persons.datasource.password=
persons.datasource.driverClassName=org.h2.Driver
In test I will change the database to a SQL database on another server.
I have some entities (example):
public class Person {

    private Long id;
    private String name;
    private String familyname;
    private Long birthDate;

    public Person() {
    }

    // ...with getters and setters
}
In the configuration I have
@Primary
@Bean
@ConfigurationProperties(prefix = "persons.datasource")
public DataSource personsDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource batchDataSource() {
    return DataSourceBuilder.create().build();
}
And the job
@Autowired
PersonItemWriter itemWriter;

@Autowired
PersonItemProcessor itemProcessor;

@Autowired
PersonItemReader workReader;

@Bean(name = "personsImportJob")
public Job personImportJob() {
    Step downloadFiles = stepBuilderFactory.get("download-files")
            .tasklet(downloadTasklet())
            .build();

    Step syncDbAndSavePersons = stepBuilderFactory.get("syncandsave-persons")
            .<File, Person>chunk(50)
            .reader(workReader)
            .processor(itemProcessor)
            .writer(itemWriter)
            .build();

    Step deleteFiles = stepBuilderFactory.get("delete-files")
            .tasklet(deleteTasklet())
            .build();

    Job job = jobBuilderFactory.get("personsimport-job")
            .incrementer(new RunIdIncrementer())
            .flow(downloadFiles)
            .next(syncDbAndSavePersons)
            .next(deleteFiles)
            .end()
            .build();

    return job;
}
My writer
@Component
public class PersonItemWriter implements ItemWriter<Person> {

    private static final Logger LOGGER = LoggerFactory.getLogger(PersonItemWriter.class);

    @Override
    public void write(List<? extends Person> list) throws Exception {
        LOGGER.info("Write Person to db");
    }
}
This works; the step syncDbAndSavePersons does not do anything right now, but I want it to access another database and write records to the persons database.
Can I do this without JPA? The existing job doesn't use JPA, and if I have to use JPA there will be a lot of code changes, which I want to avoid. I just want to move the job with a minimum of changes.
If I run my application as it is, the only database that is created is the one for Spring Batch. How can I make sure that the other database is also created? Or is that impossible when I am working with embedded H2 databases?
Before I added the second database I didn't have any datasource configuration at all in my code, but the database was created anyway. I think Spring Boot just created the batch datasource. So maybe I won't need that configuration?
UPDATE:
I solved it by removing the properties for the database in the properties file. The only thing I left is:
spring:
  datasource:
    initialization-mode: never
  h2:
    console:
      enabled: true
I then created a class called EmbeddedDataSourceConfig:
@Profile("default")
@Configuration
public class EmbeddedDataSourceConfig {

    @Bean
    @Primary
    public DataSource dataSource() {
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        return builder
                .setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:/org/springframework/batch/core/schema-h2.sql")
                .build();
    }

    @Bean(name = "personsDataSource")
    public DataSource personsDataSource() {
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        return builder
                .setName("persons")
                .setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:persons_schema.sql")
                .build();
    }

    @Bean(name = "personsTransactionManager")
    public PlatformTransactionManager personsTransactionManager(
            @Qualifier("personsDataSource") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean(name = "personsJdbcTemplate")
    public JdbcTemplate personsJdbcTemplate(@Qualifier("personsDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
and in the job I changed to the following:
@Bean(name = "personsImportJob")
public Job personImportJob(@Qualifier("personsTransactionManager") PlatformTransactionManager personsTransactionManager) {
    Step downloadFiles = stepBuilderFactory.get("download-files")
            .tasklet(downloadTasklet())
            .build();

    Step syncDbAndSavePersons = stepBuilderFactory.get("syncandsave-persons")
            .<File, Person>chunk(50)
            .reader(workReader)
            .processor(itemProcessor)
            .writer(itemWriter)
            .transactionManager(personsTransactionManager)
            .build();
    ...
}
That's it. Now it generates two H2 in-memory databases.
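To actually write to the persons database, the writer can use the qualified JdbcTemplate. A sketch (this is not part of my original code; the table and column names are made up, and the Person getters are assumed from the fields shown above):

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class PersonItemWriter implements ItemWriter<Person> {

    private final JdbcTemplate personsJdbcTemplate;

    // inject the JdbcTemplate bound to the persons DataSource
    public PersonItemWriter(@Qualifier("personsJdbcTemplate") JdbcTemplate personsJdbcTemplate) {
        this.personsJdbcTemplate = personsJdbcTemplate;
    }

    @Override
    public void write(List<? extends Person> list) throws Exception {
        for (Person person : list) {
            personsJdbcTemplate.update(
                    "INSERT INTO PERSONS (ID, NAME, FAMILYNAME, BIRTH_DATE) VALUES (?, ?, ?, ?)",
                    person.getId(), person.getName(), person.getFamilyname(), person.getBirthDate());
        }
    }
}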

Spring: disable @Transactional from a Java configuration file

I have a code base which is used by two different applications. Some of my Spring service classes have the @Transactional annotation. On server start I would like to disable @Transactional based on some configuration.
Below is my configuration class.
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
public class WebAppConfig {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";

    @Resource
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        dataSource.setUrl(url);
        dataSource.setUsername(userId);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager txManager() {
        DefaultTransactionDefinition def = new DefaultTransactionDefinition();
        def.setIsolationLevel(TransactionDefinition.ISOLATION_DEFAULT);
        if (appName.equals("ABC")) {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_NEVER);
        } else {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
        }
        CustomDataSourceTransactionManager txM = new CustomDataSourceTransactionManager(def);
        txM.setDataSource(dataSource());
        return txM;
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource());
        return jdbcTemplate;
    }
}
I am trying to override methods in DataSourceTransactionManager to achieve this, but it still tries to commit/rollback the transaction at the end of the transaction. Since there is no database connection available, it throws an exception.
If I use @Transactional(propagation=Propagation.NEVER), everything works perfectly, but I cannot modify the annotation because another app uses the same code base and needs it.
I would like to know if there is a way to fully disable transactions from configuration without modifying the @Transactional annotations.
I'm not sure if it would work, but you can try to implement a custom TransactionInterceptor and override the method that wraps an invocation in a transaction, stripping out the transactional logic. Something like this:
public class NoOpTransactionInterceptor extends TransactionInterceptor {

    @Override
    protected Object invokeWithinTransaction(
            Method method,
            Class<?> targetClass,
            InvocationCallback invocation
    ) throws Throwable {
        // Simply invoke the original unwrapped code
        return invocation.proceedWithInvocation();
    }
}
Then you declare a conditional bean in one of your @Configuration classes:
// assuming this property is stored in the Spring application properties file
@ConditionalOnProperty(name = "turnOffTransactions", havingValue = "true")
@Bean
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public TransactionInterceptor transactionInterceptor(
        /* the default bean would be injected here */
        TransactionAttributeSource transactionAttributeSource
) {
    TransactionInterceptor interceptor = new NoOpTransactionInterceptor();
    interceptor.setTransactionAttributeSource(transactionAttributeSource);
    return interceptor;
}
You will probably need additional configuration; I can't verify that right now.
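For completeness, the property that would activate the no-op interceptor above (assuming it lives in application.properties) is simply:

turnOffTransactions=true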

multiple datasources in Spring Boot application

I'm trying to use two database connections within a Spring Boot (v1.2.3) application as described in the docs (http://docs.spring.io/spring-boot/docs/1.2.3.RELEASE/reference/htmlsingle/#howto-two-datasources).
The problem seems to be that the secondary datasource is getting constructed with the properties for the primary datasource.
Can someone point out what I'm missing here?
@SpringBootApplication
class Application {

    @Bean
    @ConfigurationProperties(prefix="spring.datasource.secondary")
    public DataSource secondaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public JdbcTemplate secondaryJdbcTemplate(DataSource secondaryDataSource) {
        return new JdbcTemplate(secondaryDataSource)
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix="spring.datasource.primary")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "primaryJdbcTemplate")
    public JdbcTemplate jdbcTemplate(DataSource primaryDataSource) {
        return new JdbcTemplate(primaryDataSource)
    }

    static void main(String[] args) {
        SpringApplication.run Application, args
    }
}
application.properties:
spring.datasource.primary.url=jdbc:oracle:thin:@example.com:1521:DB1
spring.datasource.primary.username=user1
spring.datasource.primary.password=
spring.datasource.primary.driverClassName=oracle.jdbc.OracleDriver
spring.datasource.secondary.url=jdbc:oracle:thin:@example.com:1521:DB2
spring.datasource.secondary.username=user2
spring.datasource.secondary.password=
spring.datasource.secondary.driverClassName=oracle.jdbc.OracleDriver
Both JdbcTemplate beans will be getting created with the primary DataSource. You can use #Qualifier to have the secondary DataSource injected into the secondary JdbcTemplate. Alternatively, you could call the DataSource methods directly when creating the JdbcTemplate beans.
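A minimal sketch of the @Qualifier option, written in Java (the bean names follow the question):

@Bean
public JdbcTemplate secondaryJdbcTemplate(
        @Qualifier("secondaryDataSource") DataSource secondaryDataSource) {
    // explicitly request the secondary DataSource instead of the @Primary one
    return new JdbcTemplate(secondaryDataSource);
}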

spring-boot-starter-jta-atomikos and spring-boot-starter-batch

Is it possible to use both these starters in a single application?
I want to load records from a CSV file into a database table. The Spring Batch tables are stored in a different database, so I assume I need to use JTA to handle the transaction.
Whenever I add @EnableBatchProcessing to my @Configuration class it configures a PlatformTransactionManager, which stops this being auto-configured by Atomikos.
Are there any spring boot + batch + jta samples out there that show how to do this?
Many Thanks,
James
I just went through this and I found something that seems to work. As you note, @EnableBatchProcessing causes a DataSourceTransactionManager to be created, which messes up everything. I'm using modular=true in @EnableBatchProcessing, so the ModularBatchConfiguration class is activated.
What I did was to stop using @EnableBatchProcessing and instead copy the entire ModularBatchConfiguration class into my project. Then I commented out the transactionManager() method, since the Atomikos configuration creates the JtaTransactionManager. I also had to override the jobRepository() method, because that was hardcoded to use the DataSourceTransactionManager created inside DefaultBatchConfiguration.
I also had to explicitly import the JtaAutoConfiguration class. This wires everything up correctly (according to the Actuator's "beans" endpoint - thank god for that). But when you run it the transaction manager throws an exception because something somewhere sets an explicit transaction isolation level. So I also wrote a BeanPostProcessor to find the transaction manager and call txnMgr.setAllowCustomIsolationLevels(true);
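That BeanPostProcessor looks roughly like this (a sketch, not the exact class from my project; it assumes the JTA transaction manager is a JtaTransactionManager bean, and it just needs to be registered as a bean itself):

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.transaction.jta.JtaTransactionManager;

public class AllowCustomIsolationLevelsPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        // relax the isolation-level restriction so the explicit isolation level set elsewhere is accepted
        if (bean instanceof JtaTransactionManager) {
            ((JtaTransactionManager) bean).setAllowCustomIsolationLevels(true);
        }
        return bean;
    }
}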
Now everything works, but while the job is running I cannot fetch the current data from the batch_step_execution table using JdbcTemplate, even though I can see the data in SQLYog. This must have something to do with transaction isolation, but I haven't been able to understand it yet.
Here is what I have for my configuration class, copied from Spring and modified as noted above. PS, I have my DataSource that points to the database with the batch tables annotated as @Primary. Also, I changed my DataSource beans to be instances of org.apache.tomcat.jdbc.pool.XADataSource; I'm not sure if that's necessary.
@Configuration
@Import(ScopeConfiguration.class)
public class ModularJtaBatchConfiguration implements ImportAware
{
    @Autowired(required = false)
    private Collection<DataSource> dataSources;

    private BatchConfigurer configurer;

    @Autowired
    private ApplicationContext context;

    @Autowired(required = false)
    private Collection<BatchConfigurer> configurers;

    private AutomaticJobRegistrar registrar = new AutomaticJobRegistrar();

    @Bean
    public JobRepository jobRepository(DataSource batchDataSource, JtaTransactionManager jtaTransactionManager) throws Exception
    {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(jtaTransactionManager);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        return getConfigurer(configurers).getJobLauncher();
    }

    // @Bean
    // public PlatformTransactionManager transactionManager() throws Exception {
    //     return getConfigurer(configurers).getTransactionManager();
    // }

    @Bean
    public JobExplorer jobExplorer() throws Exception {
        return getConfigurer(configurers).getJobExplorer();
    }

    @Bean
    public AutomaticJobRegistrar jobRegistrar() throws Exception {
        registrar.setJobLoader(new DefaultJobLoader(jobRegistry()));
        for (ApplicationContextFactory factory : context.getBeansOfType(ApplicationContextFactory.class).values()) {
            registrar.addApplicationContextFactory(factory);
        }
        return registrar;
    }

    @Bean
    public JobBuilderFactory jobBuilders(JobRepository jobRepository) throws Exception {
        return new JobBuilderFactory(jobRepository);
    }

    @Bean
    // hopefully this will autowire the Atomikos JTA txn manager
    public StepBuilderFactory stepBuilders(JobRepository jobRepository, JtaTransactionManager ptm) throws Exception {
        return new StepBuilderFactory(jobRepository, ptm);
    }

    @Bean
    public JobRegistry jobRegistry() throws Exception {
        return new MapJobRegistry();
    }

    @Override
    public void setImportMetadata(AnnotationMetadata importMetadata) {
        AnnotationAttributes enabled = AnnotationAttributes.fromMap(importMetadata.getAnnotationAttributes(
                EnableBatchProcessing.class.getName(), false));
        Assert.notNull(enabled,
                "@EnableBatchProcessing is not present on importing class " + importMetadata.getClassName());
    }

    protected BatchConfigurer getConfigurer(Collection<BatchConfigurer> configurers) throws Exception {
        if (this.configurer != null) {
            return this.configurer;
        }
        if (configurers == null || configurers.isEmpty()) {
            if (dataSources == null || dataSources.isEmpty()) {
                throw new UnsupportedOperationException("You are screwed");
            } else if (dataSources != null && dataSources.size() == 1) {
                DataSource dataSource = dataSources.iterator().next();
                DefaultBatchConfigurer configurer = new DefaultBatchConfigurer(dataSource);
                configurer.initialize();
                this.configurer = configurer;
                return configurer;
            } else {
                throw new IllegalStateException("To use the default BatchConfigurer the context must contain no more than" +
                        "one DataSource, found " + dataSources.size());
            }
        }
        if (configurers.size() > 1) {
            throw new IllegalStateException(
                    "To use a custom BatchConfigurer the context must contain precisely one, found "
                            + configurers.size());
        }
        this.configurer = configurers.iterator().next();
        return this.configurer;
    }
}

@Configuration
class ScopeConfiguration {

    private StepScope stepScope = new StepScope();
    private JobScope jobScope = new JobScope();

    @Bean
    public StepScope stepScope() {
        stepScope.setAutoProxy(false);
        return stepScope;
    }

    @Bean
    public JobScope jobScope() {
        jobScope.setAutoProxy(false);
        return jobScope;
    }
}
I found a solution where I was able to keep @EnableBatchProcessing but had to implement BatchConfigurer and the Atomikos beans; see my full answer in this SO answer.
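The shape of that BatchConfigurer is roughly the following (a sketch, not the code from the linked answer; it assumes the batch DataSource and the Atomikos-backed JtaTransactionManager are already defined as beans, and that your Spring Batch version exposes getJobExplorer() on the interface):

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.jta.JtaTransactionManager;

@Component
public class JtaBatchConfigurer implements BatchConfigurer {

    private final DataSource batchDataSource;
    private final JtaTransactionManager jtaTransactionManager;

    public JtaBatchConfigurer(DataSource batchDataSource, JtaTransactionManager jtaTransactionManager) {
        this.batchDataSource = batchDataSource;
        this.jtaTransactionManager = jtaTransactionManager;
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        // keep the batch metadata tables in the batch DataSource, but let JTA drive the transactions
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(jtaTransactionManager);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return jtaTransactionManager;
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(getJobRepository());
        launcher.afterPropertiesSet();
        return launcher;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}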

How to define HSQLDB properties in Spring JPA using annotations

I am running HSQLDB as an in-memory database, started via the Swing run manager. I have to connect to the HSQLDB server from a Spring JPA repository using annotations.
My repository class:
@RepositoryRestResource
public interface Vehicle extends JpaRepository<Vehicle, BigInteger> {
    public List<Vehicle> findAll(Sort sort);
}
Service method:
@Service
public class LocationService {

    @Autowired
    VehicletRepository vehicleRepository = null;

    /**
     * This method is to get all the locations from the repository
     */
    public List<Vehicle> getVehicless() {
        Order order = new Order(Direction.ASC, "vehicleCode");
        Sort sort = new Sort(order);
        List<Vehicle> vehicles = vehicleRepository.findAll(sort);
        System.out.println("inside service");
        return vehicles;
    }
}
Can anyone help me set up the Spring JPA connection to HSQLDB using annotations?
I assume you don't use Spring Boot:
You need a @Configuration class (it is basically the new way to configure Spring applications in Java) with @EnableJpaRepositories, which turns on Spring Data JPA / Spring Data REST for you. You will also have to specify your entity manager, transaction manager and data source beans. Example below:
@Configuration
@EnableJpaRepositories("your.package.with.repositories")
public class DBConfig {

    @Bean
    public JpaTransactionManager transactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return transactionManager;
    }

    @Bean
    public DataSource dataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        DBConfigurationCommon.configureDB(dataSource, your_jdbc_url_here, db_username_here, db_password_here);
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        // you may need to define new Properties(); here with hibernate dialect and add it to entity manager factory
        entityManagerFactoryBean.setPackagesToScan("your_package_with_domain_classes_here");
        return entityManagerFactoryBean;
    }
}
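For the HSQLDB setup in the question, the properties hinted at in the comment above could be filled in roughly like this (a sketch to drop into entityManagerFactory(); the dialect and hbm2ddl values are assumptions for an in-memory HSQLDB instance):

Properties jpaProperties = new Properties();
jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
// optional: let Hibernate create/update the schema in the in-memory database
jpaProperties.put("hibernate.hbm2ddl.auto", "update");
entityManagerFactoryBean.setJpaProperties(jpaProperties);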
