I need access to two datasources:
Spring Batch repository: in-memory H2
My step needs access to MSSQL.
I've seen several examples out there about how to create a custom batch configurer.
However, some of them:
implement BatchConfigurer
extend DefaultBatchConfigurer
Currently, my configuration files are:
.
├── steps
│ └── MssqlBatchConfigurationStep.java
└── MainJobBatchConfiguration.java
My step configuration is:
@Configuration
public class MssqlBatchConfigurationStep {
private DataSource dataSource;
/**
*
* @param dataSource
*/
public MssqlBatchConfigurationStep(DataSource dataSource) {
this.dataSource = dataSource;
}
/**
*
* @return
*/
public ItemReader<Unitat> reader() {
String sql = "SELECT operation,update_time,table_name,rowid,user_login,user_name, user_ip,application_name,application_version,new_value,old_value FROM renovastorage.data_log";
JdbcCursorItemReader<Unitat> jdbcCursorItemReader = new JdbcCursorItemReader<>();
jdbcCursorItemReader.setDataSource(this.dataSource);
jdbcCursorItemReader.setSql(sql);
jdbcCursorItemReader.setVerifyCursorPosition(false);
jdbcCursorItemReader.setRowMapper(new UnitatRowMapper());
return jdbcCursorItemReader;
}
/**
*
* @return
*/
public ItemWriter<UnitatDenormalized> writer() {
// write to solr
return null;
}
}
The problem here is that this step gets the default datasource, which is the same one used by Spring Batch.
To solve that, I want to create a "Batch Configurer" so the step uses a specific datasource instead of the default one.
Here you can see my job configuration:
@Configuration
@EnableBatchProcessing
// @EnableScheduling
public class MainJobBatchConfiguration {
private JobBuilderFactory jobBuilderFactory;
private StepBuilderFactory stepBuilderFactory;
private MssqlBatchConfigurationStep unitatBatchStep;
/**
*
* @param jobBuilderFactory
* @param stepBuilderFactory
*/
public MainJobBatchConfiguration(
JobBuilderFactory jobBuilderFactory,
StepBuilderFactory stepBuilderFactory,
MssqlBatchConfigurationStep unitatBatchStep
) {
this.jobBuilderFactory = jobBuilderFactory;
this.stepBuilderFactory = stepBuilderFactory;
this.unitatBatchStep = unitatBatchStep;
}
/**
*
* @return
*/
@Bean
public Step step() {
return this.stepBuilderFactory
.get("mssql")
.<Unitat, UnitatDenormalized>chunk(10)
.reader(this.unitatBatchStep.reader())
.writer(this.unitatBatchStep.writer())
.build();
}
/**
*
* @param step
* @return
*/
@Bean
public Job job(Step step) {
Job job = this.jobBuilderFactory.get("job1")
.flow(step)
.end()
.build();
return job;
}
}
You need to add a secondary datasource bean and autowire that datasource.
application.properties
spring.second-datasource.url = [url]
spring.second-datasource.username = [username]
spring.second-datasource.password = [password]
spring.second-datasource.driverClassName= [driverClassName]
Datasource config
@Primary
@Bean(value = "defaultDataSource")
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource datasource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
return dataSource;
}
@Bean(value = "secondDataSource")
@ConfigurationProperties(prefix = "spring.second-datasource")
public DataSource ticketDataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
return dataSource;
}
Autowire secondDataSource into your reader.
private DataSource dataSource;
/**
*
* @param dataSource
*/
public MssqlBatchConfigurationStep(@Qualifier("secondDataSource") DataSource dataSource) {
this.dataSource = dataSource;
}
My step needs access to MSSQL.
To solve that, I want to create a "Batch Configurer" to get a specific datasource instead of the default one.
Instead of a custom BatchConfigurer, I would add a qualifier on the datasource to specify which one should be used in the step:
@Configuration
public class MssqlBatchConfigurationStep {
private DataSource dataSource;
/**
*
* @param dataSource
*/
public MssqlBatchConfigurationStep(@Qualifier("YOUR_MSSQL_DATASOURCE_BEAN_NAME") DataSource dataSource) {
this.dataSource = dataSource;
}
}
With that, your reader should be pointing to the mssql datasource and read data from it.
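If you do still want Spring Batch itself pinned to a particular datasource (the original idea behind a custom BatchConfigurer), a minimal sketch could look like the following. This is an assumption-laden illustration, not code from the question: it targets the Spring Batch 4.x API (DefaultBatchConfigurer was removed in 5.x), and the bean name "batchDataSource" is hypothetical.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

// Sketch only: assumes Spring Batch 4.x and a hypothetical bean named
// "batchDataSource" holding the in-memory H2 datasource for the job repository.
@Configuration
public class BatchRepositoryConfigurer extends DefaultBatchConfigurer {

    // Pin the job repository/metadata tables to the H2 datasource; the MSSQL
    // datasource then stays free for readers/writers via @Qualifier injection.
    public BatchRepositoryConfigurer(@Qualifier("batchDataSource") DataSource batchDataSource) {
        super(batchDataSource);
    }
}
```

With this in place, the qualifier approach above and the batch repository no longer compete for the same default datasource.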
I have configured a Spring Batch app to use a different datasource in each step. Only the step whose datasource carries the @Primary annotation works; the other one errors out, since it tries to use that datasource instead of the one passed in to the reader/writer.
As soon as I move the @Primary annotation, the opposite step fails.
So essentially I can only get one step to work: whichever one I put @Primary on.
I thought I could just get rid of the annotation, since I am passing the datasources explicitly to the reader/writer, but then I get an error stating that @Primary is required...
Any suggestions?
Here is the datasource config class where I am pulling in values from my yaml file:
@Bean
@Primary
@ConfigurationProperties("spring.batch.datasource")
public DataSource dataSourceOne(CredentialsFactory credentialsFactory, ResourceLoader resourceLoader) {
ConjurSecret conjurSecret = credentialsFactory.getCredentials(dataSourceOneCredentialAlias);
return DataSourceBuilder.create(resourceLoader.getClassLoader()).username(conjurSecret.getUserName()).password(conjurSecret.getPassword()).url(dataSourceOnedbUrl).build();
}
@Bean
@ConfigurationProperties("spring.ds2.datasource")
public DataSource dataSourceTwoDataSource(CredentialsFactory credentialsFactory, ResourceLoader resourceLoader) {
ConjurSecret conjurSecret = credentialsFactory.getCredentials(dataSourceTwoCredentialAlias);
return DataSourceBuilder.create(resourceLoader.getClassLoader()).username(conjurSecret.getUserName()).password(conjurSecret.getPassword()).url(dataSourceTwoUrl).build();
}
}
And then in my main config class, I pass each datasource to each reader/writer accordingly:
@Autowired
public DataSource dataSourceOne;
@Autowired
public DataSource dataSourceTwo;
private String table1ReadQuery = "SELECT * FROM table1 a WHERE a.TYPE_CD='TESTX' with ur";
private String table1DeleteQuery = "DELETE FROM table1 WHERE TD_ID=:TdId";
private String table2ReadQuery = "SELECT * FROM table2 a WHERE a.CTGY_CD='TESTX' with ur";
private String table2DeleteQuery = "DELETE FROM table2 WHERE D_ID=:TdId";
@Bean
public Job importUserJob(Step readTableOne) {
return jobBuilderFactory.get("importUserJob")
.listener(jobListener())
.start(readTableOne)
// .next(readTableTwo)
.build();
}
@Bean
public Step readTableOne(ItemReader<Table1> table1Reader, ItemWriter<Table1> table1Writer){
return stepBuilderFactory.get("table 1")
.listener(stepExecutionListener())
.<Table1,Table1>chunk(2000)
.reader(table1Reader)
.writer(table1Writer)
.build();
}
@Bean
public Step readTableTwo(ItemReader<Table2> table2Reader, ItemWriter<Table2> table2Writer){
return stepBuilderFactory.get("table 2")
.listener(stepExecutionListener())
.<Table2,Table2>chunk(2000)
.reader(table2Reader)
.writer(table2Writer)
.build();
}
@Bean
public JdbcCursorItemReader<Table1> table1Reader(DataSource dataSourceOne) {
return new JdbcCursorItemReaderBuilder<Table1>()
.dataSource(dataSourceOne)
.name("Table1Reader")
.sql(table1ReadQuery)
.rowMapper(new Table1Mapper())
.build();
}
@Bean
public JdbcCursorItemReader<Table2> table2Reader(DataSource dataSourceTwo) {
return new JdbcCursorItemReaderBuilder<Table2>()
.dataSource(dataSourceTwo)
.name("Table2Reader")
.sql(table2ReadQuery)
.rowMapper(new Table2Mapper())
.build();
}
@Bean
ItemWriter<Table2> table2Writer() {
return new JdbcBatchItemWriterBuilder<Table2>()
.itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
.sql(table2DeleteQuery)
.assertUpdates(true)
.dataSource(dataSourceTwo)
.build();
}
@Bean
ItemWriter<Table1> table1Writer() {
return new JdbcBatchItemWriterBuilder<Table1>()
.itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
.sql(table1DeleteQuery)
.assertUpdates(true)
.dataSource(dataSourceOne)
.build();
}
I have a multi database application. Users can select the database on the login page.
Then queries are routed to the selected database thanks to Spring's AbstractRoutingDataSource.
I want to use HikariCP, but it needs a datasource URL, and my datasource URL changes dynamically. How can I configure HikariCP for multiple databases?
File application.properties:
#database1 properties
app.database1.connection.url = url1
app.database1.connection.username = sameusername
app.database1.connection.password = samepassword
#database2 properties
app.database2.connection.url = url2
app.database2.connection.username = sameusername
app.database2.connection.password = samepassword
My Datasource configuration class example:
public class DataSourceConfiguration {
@Autowired(required = false)
private PersistenceUnitManager persistenceUnitManager;
@Bean
@ConfigurationProperties(prefix = "app.database1.connection")
public DataSource database1DataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "app.database2.connection")
public DataSource database2DataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@Primary
public DataSource appDataSource() {
DataSourceRouter router = new DataSourceRouter();
final HashMap<Object, Object> map = new HashMap<>(3);
map.put(DatabaseEnvironment.DATABASE1, database1DataSource());
map.put(DatabaseEnvironment.DATABASE2, database2DataSource());
router.setTargetDataSources(map);
return router;
}
@Bean
@Primary
@ConfigurationProperties("app.connection.jpa")
public JpaProperties appJpaProperties() {
return new JpaProperties();
}
private JpaVendorAdapter createJpaVendorAdapter(JpaProperties jpaProperties) {
AbstractJpaVendorAdapter adapter = new HibernateJpaVendorAdapter();
adapter.setShowSql(jpaProperties.isShowSql());
adapter.setDatabase(jpaProperties.getDatabase());
adapter.setDatabasePlatform(jpaProperties.getDatabasePlatform());
adapter.setGenerateDdl(jpaProperties.isGenerateDdl());
return adapter;
}
My session-scoped class used instead of a context holder:
@Component
@Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class PreferredDatabaseSession implements Serializable {
/**
*
*/
private static final long serialVersionUID = 1L;
private DatabaseEnvironment preferredDb;
public DatabaseEnvironment getPreferredDb() {
return preferredDb;
}
public void setPreferredDb(DatabaseEnvironment preferredDb) {
this.preferredDb = preferredDb;
}
}
If I understand your requirement correctly, you intend to define two data sources and for a given request you want to route your queries to a particular data source based on some condition.
The solution is:
File application.properties
#database1 properties
app.database1.connection.url = url1
app.database1.connection.username = username1
app.database1.connection.password = password1
#database2 properties
app.database2.connection.url = url2
app.database2.connection.username = username2
app.database2.connection.password = password2
#default
default.datasource.key=dataSource1
File CommonRoutingDataSource.java
public class CommonRoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
return DataSourceContextHolder.getDataSourceName();
}
public void initDataSources(final DataSource dataSource1, final DataSource dataSource2,
final String defaultDataSourceKey) {
final Map<Object, Object> dataSourceMap = new HashMap<Object, Object>();
dataSourceMap.put("dataSource1", dataSource1);
dataSourceMap.put("dataSource2", dataSource2);
this.setDefaultTargetDataSource(dataSourceMap.get(defaultDataSourceKey));
this.setTargetDataSources(dataSourceMap);
}
}
File DataSourceContextHolder.java
public class DataSourceContextHolder {
private static final ThreadLocal<String> contextHolder = new ThreadLocal<>();
private DataSourceContextHolder() {
// Private no-op constructor
}
public static final void setDataSourceName(final String dataSourceName) {
Assert.notNull(dataSourceName, "dataSourceName cannot be null");
contextHolder.set(dataSourceName);
}
public static final String getDataSourceName() {
return contextHolder.get();
}
public static final void clearDataSourceName() {
contextHolder.remove();
}
}
File DataSourceConfig.java
public class DataSourceConfig {
@Autowired
private Environment env;
@Autowired
@Bean(name = "dataSource")
public DataSource getDataSource(final DataSource dataSource1, final DataSource dataSource2) {
final CommonRoutingDataSource dataSource = new CommonRoutingDataSource();
dataSource.initDataSources(dataSource1, dataSource2, env.getProperty("default.datasource.key"));
return dataSource;
}
@Bean(name = "dataSource1")
public DataSource getDataSource1() throws SQLException {
// The exact DataSource class imported shall be as per your requirement - HikariCP, or Tomcat etc.
final DataSource dataSource = new DataSource();
dataSource.setDriverClassName(env.getProperty("app.database1.connection.driverClassName"));
dataSource.setUrl(env.getProperty("app.database1.connection.url"));
// Set all remaining data source attributes from the application.properties file
return dataSource;
}
@Bean(name = "dataSource2")
public DataSource getDataSource2() throws SQLException {
// The exact DataSource class imported shall be as per your requirement - HikariCP, or Tomcat etc.
final DataSource dataSource = new DataSource();
dataSource.setDriverClassName(env.getProperty("app.database2.connection.driverClassName"));
dataSource.setUrl(env.getProperty("app.database2.connection.url"));
// Set all remaining data source attributes from the application.properties file
return dataSource;
}
}
Now, somewhere in your code (either an Aspect or Controller), you need to dynamically set the data source conditionally:
DataSourceContextHolder.setDataSourceName("dataSource1");
Note: It's better to declare the data source names as enums rather than strings "dataSource1", "dataSource2", etc.
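The routing mechanism itself is plain Java: a ThreadLocal holds the current lookup key, and the router resolves a target from a map, falling back to a default when no key is set. A framework-free sketch of that idea (all class and method names here are hypothetical, chosen to mirror determineCurrentLookupKey() and the enum suggestion above) is:

```java
import java.util.HashMap;
import java.util.Map;

// Framework-free sketch of the AbstractRoutingDataSource idea:
// a ThreadLocal lookup key selects a target from a map, with a default fallback.
public class RoutingSketch {
    enum DatabaseEnvironment { DATABASE1, DATABASE2 }

    private static final ThreadLocal<DatabaseEnvironment> CONTEXT = new ThreadLocal<>();
    private final Map<DatabaseEnvironment, String> targets = new HashMap<>();
    private final String defaultTarget;

    RoutingSketch(String url1, String url2, String defaultTarget) {
        targets.put(DatabaseEnvironment.DATABASE1, url1);
        targets.put(DatabaseEnvironment.DATABASE2, url2);
        this.defaultTarget = defaultTarget;
    }

    // Mirrors DataSourceContextHolder.setDataSourceName(...) / clearDataSourceName()
    static void setPreferred(DatabaseEnvironment env) { CONTEXT.set(env); }
    static void clear() { CONTEXT.remove(); }

    // Mirrors determineCurrentLookupKey() plus target resolution.
    String currentTarget() {
        DatabaseEnvironment key = CONTEXT.get();
        return key == null ? defaultTarget : targets.get(key);
    }

    public static void main(String[] args) {
        RoutingSketch router = new RoutingSketch("url1", "url2", "url1");
        System.out.println(router.currentTarget()); // no key set, falls back to default
        setPreferred(DatabaseEnvironment.DATABASE2);
        System.out.println(router.currentTarget()); // routed to the second target
        clear();
    }
}
```

Because the key lives in a ThreadLocal, each request thread can route independently, which is exactly why the context holder must be cleared after use to avoid leaking the key to a pooled thread.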
The below snippet works for me
first.datasource.jdbc-url=jdbc-url
first.datasource.username=username
first.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
@Primary
@Bean(name = "firstDataSource")
@ConfigurationProperties(prefix = "first.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
@Primary
@Bean(name = "firstEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
@Qualifier("firstDataSource") DataSource dataSource) {
Map<String, String> props = new HashMap<String, String>();
props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
.
.
.
return builder.dataSource(dataSource).packages("com.first.entity").persistenceUnit("firstDB")
.properties(props)
.build();
}
@Primary
@Bean(name = "firstTransactionManager")
public PlatformTransactionManager firstTransactionManager(
@Qualifier("firstEntityManagerFactory") EntityManagerFactory firstEntityManagerFactory) {
return new JpaTransactionManager(firstEntityManagerFactory);
}
second.datasource.jdbc-url=jdbc-url
second.datasource.username=username
second.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
@Bean(name = "secondDataSource")
@ConfigurationProperties(prefix = "second.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "secondEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
@Qualifier("secondDataSource") DataSource dataSource) {
Map<String, String> props = new HashMap<String, String>();
props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
.
.
.
return builder.dataSource(dataSource).packages("com.second.entity").persistenceUnit("secondDB")
.properties(props)
.build();
}
@Bean(name = "secondTransactionManager")
public PlatformTransactionManager secondTransactionManager(
@Qualifier("secondEntityManagerFactory") EntityManagerFactory secondEntityManagerFactory) {
return new JpaTransactionManager(secondEntityManagerFactory);
}
I am following this link:
https://github.com/kwon37xi/replication-datasource
I have implemented the code, but STILL both my service functions are using the same database (the one marked primary).
Service Class
public class TableService{
@Autowired
private Table1Repo t1Repo;
@Transactional(readOnly = false)
public void saveTable1(Table1 t,int a, Table1 t2){
try{
t1Repo.save(t2);
}
catch(Exception e){
System.out.println("Inside");
}
}
@Transactional(readOnly = true)
public Table1 getTable(int id){
return t1Repo.findOne(id);
}
}
Then I added two classes (from the link):
ReplicationRoutingDataSource
public class ReplicationRoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
String dataSourceType = TransactionSynchronizationManager.isCurrentTransactionReadOnly() ? "read" : "write";
return dataSourceType;
}
}
WithRoutingDataSourceConfig
@Configuration
public class WithRoutingDataSourceConfig {
/*@Bean(destroyMethod = "shutdown")*/
@Bean
@Primary
@ConfigurationProperties(prefix="datasource.primary")
public DataSource writeDataSource() {
/* EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder()
.setName("routingWriteDb")
.setType(EmbeddedDatabaseType.H2)
.setScriptEncoding("UTF-8")
.addScript("classpath:/writedb.sql");
return builder.build();*/
return DataSourceBuilder.create().build();
}
/* @Bean(destroyMethod = "shutdown")*/
@Bean
@ConfigurationProperties(prefix="datasource.secondary")
public DataSource readDataSource() {
/*EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder()
.setName("routingReadDb")
.setType(EmbeddedDatabaseType.H2)
.setScriptEncoding("UTF-8")
.addScript("classpath:/readdb.sql");
return builder.build();*/
return DataSourceBuilder.create().build();
}
/**
* {@link org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource} implements
* {@link org.springframework.beans.factory.InitializingBean}, so afterPropertiesSet() must
* either be called explicitly, or the router must be registered as a separate @Bean so it
* goes through the Spring life cycle.
*/
@Bean
public DataSource routingDataSource(@Qualifier("writeDataSource") DataSource writeDataSource, @Qualifier("readDataSource") DataSource readDataSource) {
ReplicationRoutingDataSource routingDataSource = new ReplicationRoutingDataSource();
Map<Object, Object> dataSourceMap = new HashMap<Object, Object>();
dataSourceMap.put("write", writeDataSource);
dataSourceMap.put("read", readDataSource);
routingDataSource.setTargetDataSources(dataSourceMap);
routingDataSource.setDefaultTargetDataSource(writeDataSource);
return routingDataSource;
}
/**
* Wraps the router in a {@link org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy}
* so that the actual connection is only obtained after transaction synchronization has taken place.
*
* @param routingDataSource
* @return
*/
@Bean
public DataSource dataSource(@Qualifier("routingDataSource") DataSource routingDataSource) {
return new LazyConnectionDataSourceProxy(routingDataSource);
}
}
File application.properties:
server.port=8089
spring.jpa.show-sql = true
spring.jpa.properties.hibernate.show_sql=true
# Primary DataSource configuration
datasource.primary.url=jdbc:mysql://127.0.0.1:3306/jpa
datasource.primary.username=root
datasource.primary.password=root
# Any of the other Spring supported properties below...
# Secondary DataSource configuration
datasource.secondary.url=jdbc:mysql://127.0.0.1:3306/jpa2
datasource.secondary.username=root
datasource.secondary.password=root
Repository
public interface Table1Repo extends JpaRepository<Table1, Integer>{}
The issue is that both my service functions are using the primary database. What am I missing? I only have these classes; besides them I have one controller.
Edit:
I have made my code work by adding this class:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages="com.example")
public class ReplicationDataSourceApplicationConfig {
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(@Qualifier("dataSource") DataSource dataSource) {
LocalContainerEntityManagerFactoryBean emfb = new LocalContainerEntityManagerFactoryBean();
emfb.setDataSource(dataSource);
emfb.setPackagesToScan("com.example");
HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
emfb.setJpaVendorAdapter(jpaVendorAdapter);
return emfb;
}
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory);
return transactionManager;
}
@Bean
public PersistenceExceptionTranslationPostProcessor exceptionTranslationPostProcessor() {
return new PersistenceExceptionTranslationPostProcessor();
}
}
I'm the writer of the link you referred to.
Which data source do you use with Table1Repo?
You have to inject the specific bean "dataSource" into your JDBC call.
I guess the @Primary "writeDataSource" is being injected into your repository.
Try changing @Primary to "dataSource", or find a way to inject "dataSource" into your repository.
You can also try it this way:
Spring Boot 2 with Multiple DataSource for Postgres Data Replication
and here is the source code on GitHub:
spring-boot-multi-data-source
The following link explains how you can have multiple datasources:
DB1 (write):
@Configuration
@ConfigurationProperties("spring.datasource-write")
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactoryWrite",
transactionManagerRef = "transactionManagerWrite",
basePackages = {"com.ehsaniara.multidatasource.repository.writeRepository"}
)
public class DataSourceConfigWrite extends HikariConfig {
public final static String PERSISTENCE_UNIT_NAME = "write";
@Bean
public HikariDataSource dataSourceWrite() {
return new HikariDataSource(this);
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactoryWrite(
final HikariDataSource dataSourceWrite) {
return new LocalContainerEntityManagerFactoryBean() {{
setDataSource(dataSourceWrite);
setPersistenceProviderClass(HibernatePersistenceProvider.class);
setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
setPackagesToScan(MODEL_PACKAGE);
setJpaProperties(JPA_PROPERTIES);
}};
}
@Bean
public PlatformTransactionManager transactionManagerWrite(EntityManagerFactory entityManagerFactoryWrite) {
return new JpaTransactionManager(entityManagerFactoryWrite);
}
}
DB2 (read):
@Configuration
@ConfigurationProperties("spring.datasource-read")
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactoryRead",
transactionManagerRef = "transactionManagerRead",
basePackages = {"com.ehsaniara.multidatasource.repository.readRepository"}
)
public class DataSourceConfigRead extends HikariConfig {
public final static String PERSISTENCE_UNIT_NAME = "read";
@Bean
public HikariDataSource dataSourceRead() {
return new HikariDataSource(this);
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactoryRead(
final HikariDataSource dataSourceRead) {
return new LocalContainerEntityManagerFactoryBean() {{
setDataSource(dataSourceRead);
setPersistenceProviderClass(HibernatePersistenceProvider.class);
setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
setPackagesToScan(MODEL_PACKAGE);
setJpaProperties(JPA_PROPERTIES);
}};
}
@Bean
public PlatformTransactionManager transactionManagerRead(EntityManagerFactory entityManagerFactoryRead) {
return new JpaTransactionManager(entityManagerFactoryRead);
}
}
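Since the two config classes above extend HikariConfig and bind a @ConfigurationProperties prefix directly onto it, the matching properties are Hikari pool settings under those prefixes (relaxed binding maps jdbc-url to HikariConfig's jdbcUrl). The concrete URLs and credentials below are illustrative assumptions, not values from the linked project:

```properties
# Illustrative only: Hikari settings bound onto the HikariConfig subclasses above
spring.datasource-write.jdbc-url=jdbc:postgresql://primary-host:5432/appdb
spring.datasource-write.username=writer
spring.datasource-write.password=secret
spring.datasource-read.jdbc-url=jdbc:postgresql://replica-host:5432/appdb
spring.datasource-read.username=reader
spring.datasource-read.password=secret
```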
The given EntityManager has no managed type for my class; it is the wrong EntityManager.
I thought that simply pointing the entity and repository scan at the specific namespace would handle this automatically, but it seems I am wrong (?!). Does anybody know how to handle this problem?
He has the class something.application.model.User but uses the EntityManager with the managed types of the namespace something.manager.model.
public class JpaConfiguration {
@Configuration
@EntityScan(basePackages = { "something.application.model" })
@EnableJpaAuditing
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = { "something.application.model" }, repositoryFactoryBeanClass = RepositoryFactoryBean.class, entityManagerFactoryRef = "applicationEntityManagerFactory", transactionManagerRef = "applicationTransactionManager")
public static class ApplicationJpaConfiguration {
@Autowired
private ApplicationContext applicationContext;
/**
* The datasource used by the system. Configuration see
* /src/main/resources/application.properties.
*/
@Primary
@Bean(name="applicationDataSource")
@ConfigurationProperties(prefix = "spring.application.datasource")
public DataSource applicationDataSource() {
return new TenantAwareDataSource();
}
/**
* The transaction manager used by the application system.
*
* @param loadTimeWeaver
* @return org.springframework.orm.jpa.JpaTransactionManager
*/
@Primary
@Bean(name="applicationTransactionManager")
public JpaTransactionManager applicationTransactionManager(LoadTimeWeaver loadTimeWeaver) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(applicationEntityManagerFactory(loadTimeWeaver).getObject());
return transactionManager;
}
/**
* The entity manager factory used by application system. Specific properties of the
* persistence provider eclipselink are configured here.
*/
@Primary
@Bean(name="applicationEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean applicationEntityManagerFactory(LoadTimeWeaver loadTimeWeaver) {
Map<String, String> jpaProperties = new HashMap<String, String>();
jpaProperties.put("eclipselink.ddl-generation", "create-or-extend-tables");
jpaProperties.put("eclipselink.session.customizer", HistoryBuildingSessionCustomizer.class.getName());
jpaProperties.put("eclipselink.logging.level.sql", "fine");
jpaProperties.put("eclipselink.logging.parameters", "true");
jpaProperties.put("eclipselink.temporal.mutable", "true");
jpaProperties.put("eclipselink.persistence-context.flush-mode", "commit");
jpaProperties.put("eclipselink.cache.shared.default", "false");
jpaProperties.put("eclipselink.query-results-cache", "false");
jpaProperties.put("eclipselink.target-database", "MySQL");
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(applicationDataSource());
entityManagerFactoryBean.setJpaVendorAdapter(new EclipseLinkJpaVendorAdapter());
entityManagerFactoryBean.setJpaPropertyMap(jpaProperties);
entityManagerFactoryBean.setLoadTimeWeaver(loadTimeWeaver);
entityManagerFactoryBean.setPersistenceUnitName("applicationPersistenceUnit");
return entityManagerFactoryBean;
}
/**
* A support object that wraps all available repositories and has some
* convenience functions.
*/
@Bean
public Repositories repositories() {
return new Repositories(applicationContext);
}
}
@Configuration
@EntityScan(basePackages = { "something.manager.model" })
@EnableJpaAuditing
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = { "something.manager.model" }, repositoryFactoryBeanClass = RepositoryFactoryBean.class, entityManagerFactoryRef = "managerEntityManagerFactory", transactionManagerRef = "managerTransactionManager")
public static class ManagerJpaConfiguration {
@Autowired
private ApplicationContext applicationContext;
/**
* The datasource used by the system. Configuration see
* /src/main/resources/application.properties.
*/
@Bean(name="managerDataSource")
@ConfigurationProperties(prefix = "spring.manager.datasource")
public DataSource managerDataSource() {
return new BasicDataSource();
}
/**
* The transaction manager used by the manager system.
*
* @param managerDataSource
* @param loadTimeWeaver
* @return org.springframework.orm.jpa.JpaTransactionManager
*/
@Bean(name="managerTransactionManager")
public JpaTransactionManager managerTransactionManager(DataSource managerDataSource, LoadTimeWeaver loadTimeWeaver) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(managerEntityManagerFactory(managerDataSource, loadTimeWeaver).getObject());
return transactionManager;
}
/**
* The entity manager factory used by manager system. Specific properties of the
* persistence provider eclipselink are configured here.
*/
@Bean(name="managerEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean managerEntityManagerFactory(DataSource managerDataSource, LoadTimeWeaver loadTimeWeaver) {
Map<String, String> jpaProperties = new HashMap<String, String>();
jpaProperties.put("eclipselink.ddl-generation", "create-or-extend-tables");
jpaProperties.put("eclipselink.session.customizer", HistoryBuildingSessionCustomizer.class.getName());
jpaProperties.put("eclipselink.logging.level.sql", "fine");
jpaProperties.put("eclipselink.logging.parameters", "true");
jpaProperties.put("eclipselink.temporal.mutable", "true");
jpaProperties.put("eclipselink.persistence-context.flush-mode", "commit");
jpaProperties.put("eclipselink.cache.shared.default", "false");
jpaProperties.put("eclipselink.query-results-cache", "false");
jpaProperties.put("eclipselink.target-database", "MySQL");
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(managerDataSource);
entityManagerFactoryBean.setJpaVendorAdapter(new EclipseLinkJpaVendorAdapter());
entityManagerFactoryBean.setJpaPropertyMap(jpaProperties);
entityManagerFactoryBean.setLoadTimeWeaver(loadTimeWeaver);
return entityManagerFactoryBean;
}
/**
* A support object that wraps all available repositories and has some
* convenience functions.
*/
@Bean
public Repositories repositories() {
return new Repositories(applicationContext);
}
}
}
I use Spring version 3.2.2.RELEASE.
I want to create a webapp with Hibernate and have the following configuration:
public class LifepulseServletInitializer extends AbstractAnnotationConfigDispatcherServletInitializer {
@Override
protected Class<?>[] getRootConfigClasses() {
return new Class[] { PersistenceConfig.class};
}
@Override
protected Class<?>[] getServletConfigClasses() {
return new Class[] { LifepulseWebConfig.class };
}
@Override
protected String[] getServletMappings() {
return new String[] { "/" };
}
@Override
protected WebApplicationContext createRootApplicationContext() {
return super.createRootApplicationContext();
}
}
My two @Configuration-annotated files look like this:
My PersistenceConfig should be used for the ContextLoaderListener:
@Configuration
@EnableTransactionManagement(proxyTargetClass=true, mode=AdviceMode.PROXY)
public class PersistenceConfig{
Logger logger = LoggerFactory.getLogger(PersistenceConfig.class);
/** The Application Context. */
@Autowired
ApplicationContext context;
/**
* Gets the database url.
*
* @return the database url
*/
@Bean(name="databaseUrl")
public String getDatabaseUrl(){
return "jdbc:mysql://localhost/lifepulse";
}
/**
* Gets the session factory properties.
*
* @return the session factory properties
*/
@Bean(name="sessionFactoryProperties")
public Properties getSessionFactoryProperties(){
Properties props = new Properties();
props.put("hibernate.dialect", MySQL5InnoDBDialect.class.getName());
props.put("hibernate.hbm2ddl.auto", "create-drop");
props.put("hibernate.show_sql", "true");
props.put("hibernate.format_sql", "true");
return props;
}
/** The Constant ANNOTATED_CLASSES. */
@SuppressWarnings("unchecked")
private static final Class<? extends Serializable>[] ANNOTATED_CLASSES=new Class[]{
Melder.class,
LogEntry.class
};
/**
* Gets the annotated classes.
*
* @return the annotated classes
*/
private static Class<? extends Serializable>[] getAnnotatedClasses(){
return ANNOTATED_CLASSES;
}
/**
* Gets the data source.
* This bean represents the application's MYSQL datasource, without using xml.
*
* @return the data source
*/
@Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName("com.mysql.jdbc.Driver");
dataSource.setUrl(getDatabaseUrl());
dataSource.setUsername("lifepulse");
dataSource.setPassword("lifepulse");
return dataSource;
}
/**
* Session factory.
* This bean represents the Hibernate Session Factory. By declaring this bean
* it can easily be injected into Spring DAOs later on.
*
* @return the local session factory bean
*/
@Bean
public LocalSessionFactoryBean sessionFactory() {
LocalSessionFactoryBean factory = new LocalSessionFactoryBean();
factory.setAnnotatedClasses(getAnnotatedClasses());
factory.setHibernateProperties(getSessionFactoryProperties());
factory.setDataSource(dataSource());
return factory;
}
@Bean
public GenericDao genericDao() {
return new HibernateDaoImpl();
}
@Bean
PlatformTransactionManager txManager(){
HibernateTransactionManager htm = new HibernateTransactionManager(sessionFactory().getObject());
return htm;
}
}
My LifepulseWebConfig should be used for the DispatcherSerlvet:
@Configuration
@EnableWebMvc
@ComponentScan(value= {"com.ansiworks.lifepulse.controllers"})
@EnableTransactionManagement(proxyTargetClass=true, mode=AdviceMode.PROXY)
public class LifepulseWebConfig extends WebMvcConfigurerAdapter{
@Autowired
ApplicationContext context;
@Bean
ViewResolver viewResolver(){
InternalResourceViewResolver resolver = new InternalResourceViewResolver();
//resolver.setPrefix("");
resolver.setSuffix(".jsp");
return resolver;
}
@Bean
MainService mainService(){
return new MainServiceImpl();
}
}
This configuration doesn't work. The dispatcher servlet doesn't load my PersistenceConfig, and so the beans are not inherited by the LifepulseWebConfig.
But: when I add the PersistenceConfig to the class array of the method LifepulseServletInitializer#getServletConfigClasses(), everything works fine.
However, I can't use Spring Security that way...
What am I doing wrong?! Why are the config classes returned by LifepulseServletInitializer#getRootConfigClasses() not read by the ContextLoaderListener?
I also tried it this way (without the LifepulseServletInitializer), with the same effect. The beans of my PersistenceConfig are simply not loaded...
public class MyWebAppInitializer implements WebApplicationInitializer {
@Override
public void onStartup(ServletContext container) {
// Create the 'root' Spring application context
AnnotationConfigWebApplicationContext rootContext =
new AnnotationConfigWebApplicationContext();
rootContext.register(PersistenceConfig.class);
// Manage the lifecycle of the root application context
container.addListener(new ContextLoaderListener(rootContext));
// Create the dispatcher servlet's Spring application context
AnnotationConfigWebApplicationContext dispatcherContext =
new AnnotationConfigWebApplicationContext();
dispatcherContext.register(LifepulseWebConfig.class);
// Register and map the dispatcher servlet
ServletRegistration.Dynamic dispatcher =
container.addServlet("dispatcher", new DispatcherServlet(dispatcherContext));
dispatcher.setLoadOnStartup(1);
dispatcher.addMapping("/");
}
}