Spring Boot - Issue with multiple DataSources

I have a REST service that needs to fetch data from two different DBs (Oracle and MySQL) and merge the data in the response.
I have the configuration below.
Config for DB 1:
@Configuration
public class DbConfig_DB1 {

    @Bean(name = "siebelDataSource")
    public EmbeddedDatabase siebelDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("schema.sql")
                .addScript("test-data.sql")
                .build();
    }

    @Autowired
    @Qualifier("siebelDataSource")
    @Bean(name = "siebelJdbcTemplate")
    public JdbcTemplate siebelJdbcTemplate(DataSource siebelDataSource) {
        return new JdbcTemplate(siebelDataSource);
    }
}
Config for DB2:
@Configuration
public class DbConfig_DB2 {

    @Bean(name = "brmDataSource")
    public EmbeddedDatabase brmDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("schema-1.sql")
                .addScript("test-data-1.sql")
                .build();
    }

    @Autowired
    @Qualifier("brmDataSource")
    @Bean(name = "brmJdbcTemplate")
    public JdbcTemplate brmJdbcTemplate(DataSource brmDataSource) {
        return new JdbcTemplate(brmDataSource);
    }
}
Data Access:
@Repository
public class SiebelDataAccess {

    protected final Logger log = LoggerFactory.getLogger(getClass());

    @Autowired
    @Qualifier("siebelJdbcTemplate")
    protected JdbcTemplate jdbc;

    public String getEmpName(Integer id) {
        System.out.println(jdbc.queryForObject("select count(*) from employee", Integer.class));
        Object[] parameters = new Object[] { id };
        String name = jdbc.queryForObject(
                "select name from employee where id = ?", parameters,
                String.class);
        return name;
    }
}
I am not able to start the app as I get the below error:
Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private javax.sql.DataSource org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration.dataSource;
nested exception is org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [javax.sql.DataSource] is defined: expected single matching bean but found 2: brmDataSource,siebelDataSource
The issue is with having two DataSource beans in the context. How can I resolve this?

You could mark one of them as @Primary so that the Spring Boot auto-configuration for the transaction manager knows which one to pick. If you need to manage transactions with both of them, then I'm afraid you will have to set up transaction management explicitly.
Please refer to the Spring Boot documentation:
Creating more than one data source works the same as creating the first one. You might want to mark one of them as @Primary if you are using the default auto-configuration for JDBC or JPA (then that one will be picked up by any @Autowired injections).
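As a minimal sketch (not the original poster's code, just an illustration reusing the bean names from the question), marking the Siebel data source as @Primary could look like this:
@Configuration
public class DbConfig_DB1 {

    // Marked as primary so Spring Boot's transaction manager auto-configuration
    // has a single unambiguous DataSource candidate.
    @Primary
    @Bean(name = "siebelDataSource")
    public EmbeddedDatabase siebelDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("schema.sql")
                .addScript("test-data.sql")
                .build();
    }

    // Qualifying the parameter keeps this template bound to the Siebel DataSource.
    @Bean(name = "siebelJdbcTemplate")
    public JdbcTemplate siebelJdbcTemplate(@Qualifier("siebelDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
The brmDataSource bean stays non-primary and is still injected into brmJdbcTemplate via its @Qualifier, exactly as in the question.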

Related

Spring Batch test case with multiple data sources

I have a Spring Batch Classifier to test, for which I've defined this test class:
@RunWith(SpringRunner.class)
@SpringBatchTest
@ContextConfiguration(classes = { BatchConfiguration.class })
class CsvOutputClassifierTest {

    @Autowired
    private FlatFileItemWriter<CsvData> createRequestForProposalWriter;

    @Autowired
    private FlatFileItemWriter<CsvData> createRequestForQuotationWriter;

    private final CsvOutputClassifier csvOutputClassifier = new CsvOutputClassifier(
            createRequestForProposalWriter,
            createRequestForQuotationWriter);

    @Test
    void shouldReturnProposalWriter() {
        ...
    }
}
The batch configuration class has this constructor:
public BatchConfiguration(
        final JobBuilderFactory jobBuilderFactory,
        final StepBuilderFactory stepBuilderFactory,
        @Qualifier("oerationalDataSource") final DataSource oerationalDataSource,
        final DwhFileManager dwhFileManager,
        final OperationalRepository operationalRepository)
And these beans:
@StepScope
@Bean
public FlatFileItemWriter<CsvData> createRequestForProposalWriter(
        @Value("#{jobParameters['startDate']}") String startDate) {
    FlatFileItemWriter<CsvData> writer = new FlatFileItemWriter<CsvData>();
    ...
    return writer;
}

@StepScope
@Bean
public FlatFileItemWriter<CsvData> createRequestForQuotationWriter(
        @Value("#{jobParameters['startDate']}") String startDate) {
    FlatFileItemWriter<CsvData> writer = new FlatFileItemWriter<CsvData>();
    ...
    return writer;
}
Running the test class, I'm not able to trigger the first test method as I'm getting:
Error creating bean with name 'batchConfiguration': Unsatisfied dependency expressed through constructor parameter 2; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {#org.springframework.beans.factory.annotation.Qualifier(value="oerationalDataSource")}
In fact, I defined two different data sources, one for the 'operational' data and the 'app' one for Spring Batch persistence:
@Configuration(proxyBeanMethods = false)
public class DataSourceConfiguration {

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource")
    public DataSourceProperties defaultDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.configuration")
    public HikariDataSource defaultDataSource(DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().type(HikariDataSource.class)
                .build();
    }

    @Bean
    @ConfigurationProperties("aggr.datasource")
    public DataSourceProperties oerationalDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("aggr.datasource.configuration")
    public HikariDataSource oerationalDataSource(
            @Qualifier("oerationalDataSourceProperties") DataSourceProperties oerationalDataSourceProperties) {
        return oerationalDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build();
    }

    @Bean
    public JdbcTemplate operationalJdbcTemplate(@Qualifier("oerationalDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
In the @SpringBatchTest documentation it is reported that just one DataSource should be found, or it should be marked as @Primary:
It should be noted that JobLauncherTestUtils requires a org.springframework.batch.core.Job bean and that JobRepositoryTestUtils requires a javax.sql.DataSource bean. Since this annotation registers a JobLauncherTestUtils and a JobRepositoryTestUtils in the test context, it is expected that the test context contains a single autowire candidate for a org.springframework.batch.core.Job and a javax.sql.DataSource (either a single bean definition or one that is annotated with org.springframework.context.annotation.Primary).
But I do have one marked as primary. So how do I fix it?
Update #1
Thanks to @Henning's tip, I've changed the annotations as follows:
@RunWith(SpringRunner.class)
@SpringBatchTest
@SpringBootTest(args = {"--mode=custom", "--startDate=2022-05-31T01:00:00.000Z", "--endDate=2022-05-31T23:59:59.999Z"})
@ContextConfiguration(classes = { BatchConfiguration.class, DataSourceConfiguration.class, LocalFileManager.class, AggregatorRepository.class })
@ActiveProfiles({"integration"})
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
Where:
@SpringBootTest is needed to avoid the 'Failed to determine a suitable driver class' exception
args is needed to provide the required parameters to the batch
But I'm still getting this exception:
Error creating bean with name 'scopedTarget.createRequestForProposalWriter' defined in BatchConfiguration: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.batch.item.file.FlatFileItemWriter]: Factory method 'createRequestForProposalWriter' threw exception; nested exception is java.lang.NullPointerException: text
raised in the implementation of:
@StepScope
@Bean
public FlatFileItemWriter<CsvData> createRequestForProposalWriter(
        @Value("#{jobParameters['startDate']}") String startDate)
as the parameter 'startDate' is null.
In my naivety, I assumed that I could test the classifier in isolation with something like this:
@Test
void shouldReturnProposalWriter() {
    CsvData csvData = ... // create some fake data
    CsvOutputClassifier csvOutputClassifier = new CsvOutputClassifier(
            createRequestForProposalWriter,
            createRequestForQuotationWriter);
    ItemWriter itemWriter = csvOutputClassifier.classify(csvData);
    // some assert about itemWriter properties
}
So now the question is: how to correctly test the classifier?
You need to list DataSourceConfiguration as an argument of @ContextConfiguration, i.e. your test class should start like this:
@RunWith(SpringRunner.class)
@SpringBatchTest
@ContextConfiguration(classes = { BatchConfiguration.class, DataSourceConfiguration.class })
class CsvOutputClassifierTest {
    ...
}
The DataSourceConfiguration is currently not known within the test, as you didn't declare it as part of the context or enable classpath scanning in any form.

DelegatingDataSource Spring boot

I am trying to implement Spring Boot AOP for a data-source pointcut, where before running any query I need to set a client context on the DB connection.
I was trying the approach of using DelegatingDataSource, but I am getting the below error during server startup:
org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'dataSource': Requested bean is currently in creation: Is there an unresolvable circular reference?
Please let me know how to use DelegatingDataSource for a JNDI-based DB lookup.
Edit 1: AOP - I tried to add the pointcut execution(public * javax.sql.DataSource+.getConnection(..)). This works only when a Spring data source is used with a username/password. Once I deploy in JBoss with JNDI, I get a WildFlyDataSource proxy error. So, instead of the AOP approach, I thought of using DelegatingDataSource.
// AOP Example
@Pointcut("execution(public * javax.sql.DataSource+.getConnection(..))")
void prepareConnectionPointcut() {
    logger.debug("prepareConnectionPointcut");
}

@AfterReturning(pointcut = "prepareConnectionPointcut()", returning = "connection")
void afterPrepareConnection(Connection connection) {
    // Set context in Connection - return same connection for query execution
}
But when I deploy this code in JBoss, I am getting a WildFlyDataSource bean creation error:
Error creating bean with name
'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaConfiguration':
Unsatisfied dependency expressed through constructor parameter 0;
nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'dataSource' defined in class path resource
[org/springframework/boot/autoconfigure/jdbc/JndiDataSourceAutoConfiguration.class]:
Initialization of bean failed; nested exception is
org.springframework.aop.framework.AopConfigException: Could not
generate CGLIB subclass of class
org.jboss.as.connector.subsystems.datasources.WildFlyDataSource:
Common causes of this problem include using a final class or a
non-visible class; nested exception is
org.springframework.cglib.core.CodeGenerationException:
java.lang.NoClassDefFoundError-->org/jboss/as/connector/subsystems/datasources/WildFlyDataSource
I have also added the proxyTargetClass flag during initialization:
@EnableAspectJAutoProxy(proxyTargetClass = true)
Thanks @M.Deinum for the recommendation of using a BeanPostProcessor and implementing DelegatingDataSource to set the client info. Please find below the snippet I implemented to accomplish this in Spring Boot; it works well with both a JBoss JNDI-based connection and a Spring Boot URL DataSource connection.
@Component
public class MyBeanPostProcessor implements BeanPostProcessor {

    private static Logger logger = LoggerFactory.getLogger(MyBeanPostProcessor.class);

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        if (bean instanceof DataSource) {
            // Check DataSource bean initialization & enclose it with DelegatingDataSource
            logger.debug("MyBeanPostProcessor:: postProcessAfterInitialization:: DataSource");
            DataSource beanDs = (DataSource) bean;
            return new MyDelegateDS(beanDs);
        }
        return BeanPostProcessor.super.postProcessAfterInitialization(bean, beanName);
    }

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (bean instanceof DataSource) {
            logger.debug("MyBeanPostProcessor:: postProcessBeforeInitialization:: DataSource");
        }
        logger.debug("MyBeanPostProcessor:: postProcessBeforeInitialization:: " + beanName);
        return BeanPostProcessor.super.postProcessBeforeInitialization(bean, beanName);
    }
}
My implementation of DelegatingDataSource, which handles each user request and sets the client context in the DB connection session:
public class MyDelegateDS extends DelegatingDataSource {

    private static Logger logger = LoggerFactory.getLogger(MyDelegateDS.class);

    public MyDelegateDS(DataSource delegate) {
        super(delegate);
        logger.debug("MyDelegateDS:: constructor");
    }

    @Override
    public Connection getConnection() throws SQLException {
        logger.debug("MyDelegateDS:: getConnection");
        // Set this context only for user requests - to avoid it during server initialization
        if (RequestContextHolder.getRequestAttributes() != null
                && ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes()).getRequest() != null) {
            logger.debug("MyDelegateDS:: getConnection: valid user request");
            HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes())
                    .getRequest();
            // Checking each user request & calling SP to set client context before invoking actual native query/SP
        }
        logger.debug("MyDelegateDS:: getConnection: Not User Request");
        return super.getConnection();
    }
}
Hope this is helpful for someone facing the same problem.

Want to configure Job without DataSource

I am able to run my Spring Batch application when a DataSource is configured in my batch configuration class. But I don't want a DataSource to be configured, so I used ResourcelessTransactionManager. See my configuration class below. Can someone guide me on how I can launch jobs without configuring a DataSource as part of the batch job configuration?
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
//@EnableAutoConfiguration(exclude={DataSourceAutoConfiguration.class})
public class SprintgBatchConfiguration {

    /*@Autowired
    private DBConfiguration dbConfig;*/

    /*@Autowired
    private DataSource dataSource;
    @Autowired
    private DataSourceTransactionManager transactionManager;
    */

    // Tomcat related configuration
    @Bean
    public MultipartConfigElement multipartConfigElement() {
        MultipartConfigFactory factory = new MultipartConfigFactory();
        factory.setMaxFileSize("124MB");
        factory.setMaxRequestSize("124MB");
        return factory.createMultipartConfig();
    }

    @Bean(name = "csvjob")
    public Job job(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory, ItemReader<List<CSVPojo>> itemReader, ItemProcessor<List<CSVPojo>, CsvWrapperPojo> itemProcessor, AmqpItemWriter<CsvWrapperPojo> itemWriter) {
        Step step = stepBuilderFactory.get("ETL-CSV").<List<CSVPojo>, CsvWrapperPojo>chunk(100)
                .reader(itemReader)
                .processor(itemProcessor)
                .writer(itemWriter)
                .build();
        Job csvJob = jobBuilderFactory.get("ETL").incrementer(new RunIdIncrementer())
                .start(step).build();
        return csvJob;
    }

    @Bean(name = "exceljob")
    public Job jobExcel(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory, ItemReader<List<ExcelPojo>> itemReader, ItemProcessor<List<ExcelPojo>, ExcelWrapperPojo> itemProcessor, AmqpItemWriter<ExcelWrapperPojo> itemWriter) {
        Step step = stepBuilderFactory.get("ETL-Excel").<List<ExcelPojo>, ExcelWrapperPojo>chunk(100)
                .reader(itemReader)
                .processor(itemProcessor)
                .writer(itemWriter)
                .build();
        Job excelJob = jobBuilderFactory.get("ETL-Excel").incrementer(new RunIdIncrementer())
                .start(step).build();
        return excelJob;
    }

    /*@Override
    public void setDataSource(DataSource dataSource){
        System.out.println("overriden");
    }*/

    /*@Bean
    public FlatFileItemReader<CSVPojo> fileItemReader(Resource resource){
        return null;
    }*/

    /*@Bean(name="dataSource")
    public DataSource dataSource() throws SQLException {
        //BasicDataSource dataSource = new BasicDataSource();
        return dataSource;
    }*/

    @Bean(name = "transactionManager")
    public ResourcelessTransactionManager transactionManager() throws SQLException {
        return new ResourcelessTransactionManager();
    }

    /*@Bean(name="transactionManager")
    public DataSourceTransactionManager transactionManager() throws SQLException {
        DataSourceTransactionManager transactionManager = new DataSourceTransactionManager(this.dataSource());
        return transactionManager;
    }*/

    /*@Bean
    public JobRepository jobRepository() throws Exception {
        JobRepositoryFactoryBean factoryBean = new JobRepositoryFactoryBean();
        factoryBean.setDatabaseType("ORACLE");
        factoryBean.setDataSource(dataSource);
        factoryBean.setTransactionManager(transactionManager);
        factoryBean.setIsolationLevelForCreate("ISOLATION_READ_UNCOMMITTED");
        return factoryBean.getObject();
    }*/

    @Bean
    public JobRepository jobRepository(ResourcelessTransactionManager transactionManager) throws Exception {
        MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean(transactionManager);
        mapJobRepositoryFactoryBean.setTransactionManager(transactionManager);
        return mapJobRepositoryFactoryBean.getObject();
    }
}
But I am getting the below exception when I am running the application:
ationConfigEmbeddedWebApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'fileProcessController': Unsatisfied dependency expressed through field 'jobLauncher'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration': Unsatisfied dependency expressed through field 'dataSources'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Tomcat.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.tomcat.jdbc.pool.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
Thanks in advance!!!!
Spring Boot is intended for building production grade applications. When it is used to build a Spring Batch application, it requires a data source to persist Spring Batch meta-data (See BATCH-2704).
But you can always use either:
an embedded datasource supported by Spring Boot (H2, HSQL or Derby) by just adding it to the classpath. This data source will be picked up automatically by Spring Batch
or provide a custom BatchConfigurer and use the MapJobRepository (See here); a minimal sketch of this option is shown after this answer
Hope this helps.
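As a minimal sketch of the second option (not part of the original answer, and assuming a Spring Batch 4.x-era BatchConfigurer contract where the Map-based repository classes still exist; older 3.0.x versions may not include getJobExplorer(), and the class name here is illustrative):
import javax.annotation.PostConstruct;

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.MapJobExplorerFactoryBean;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class InMemoryBatchConfigurer implements BatchConfigurer {

    private final ResourcelessTransactionManager transactionManager = new ResourcelessTransactionManager();
    private JobRepository jobRepository;
    private JobExplorer jobExplorer;
    private JobLauncher jobLauncher;

    @PostConstruct
    void init() throws Exception {
        // Map-based (in-memory) job repository: no DataSource required.
        MapJobRepositoryFactoryBean repositoryFactory = new MapJobRepositoryFactoryBean(transactionManager);
        repositoryFactory.afterPropertiesSet();
        this.jobRepository = repositoryFactory.getObject();

        MapJobExplorerFactoryBean explorerFactory = new MapJobExplorerFactoryBean(repositoryFactory);
        explorerFactory.afterPropertiesSet();
        this.jobExplorer = explorerFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;
    }

    @Override
    public JobRepository getJobRepository() {
        return jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return transactionManager;
    }

    @Override
    public JobLauncher getJobLauncher() {
        return jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() {
        return jobExplorer;
    }
}
With such a configurer in the context (typically combined with excluding DataSourceAutoConfiguration, as in the commented-out annotation in the question), Spring Batch no longer needs a DataSource for its meta-data. Note that the Map-based repository keeps job meta-data in memory only, so it suits tests and simple jobs rather than production restartability.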

How to configure MyBatis in Spring Boot

I have two config classes here:
@Configuration
public class DataConfig {

    @Value("${datasource.jdbcUrl}")
    private String jdbcUrl;
    @Value("${datasource.username}")
    private String username;
    @Value("${datasource.password}")
    private String password;
    @Value("${datasource.driverClassName:com.mysql.jdbc.Driver}")
    private String driverClassName;
    @Value("${datasource.initialSize:20}")
    private int initialSize;
    @Value("${datasource.maxActive:30}")
    private int maxActive;
    @Value("${datasource.minIdle:20}")
    private int minIdle;
    @Value("${datasource.transactionTimeoutS:30}")
    private int transactionTimeoutS;
    @Value("${datasource.basePackage:com.tg.ms.mapper}")
    private String basePackage;
    @Value("${datasource.mapperLocations}")
    private String mapperLocations;

    @Bean
    public DataSource dataSource() {
        DruidDataSource ds = new DruidDataSource();
        ds.setMaxWait(maxWait);
        ds.setValidationQuery(validationQuery);
        ds.setRemoveAbandoned(removeAbandoned);
        ds.setRemoveAbandonedTimeout(removeAbandonedTimeout);
        ds.setTestWhileIdle(testWhileIdle);
        ds.setTestOnReturn(testOnReturn);
        ds.setTestOnBorrow(testOnBorrow);
        ds.setMinIdle(minIdle);
        return ds;
    }

    @Bean
    public SqlSessionFactory sqlSessionFactoryBean() throws Exception {
        SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setDataSource(dataSource());
        PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        sqlSessionFactoryBean.setMapperLocations(resolver.getResources("classpath:/mybatis/*.xml"));
        return sqlSessionFactoryBean.getObject();
    }
}
---------- Another Config -------------
@Configuration
@AutoConfigureAfter(DataBaseConfig.class)
public class MapperScannerConfig {

    @Value("${datasource.basePackage:com.tg.ms.mapper}")
    private String basePackage;

    @Bean
    public MapperScannerConfigurer BPMapperScannerConfigurer() {
        System.out.println("mapper--1.----******----" + basePackage + "----*******");
        MapperScannerConfigurer mapperScannerConfigurer = new MapperScannerConfigurer();
        mapperScannerConfigurer.setBasePackage("com.tg.mapper");
        mapperScannerConfigurer.setSqlSessionFactoryBeanName("sqlSessionFactoryBean");
        return mapperScannerConfigurer;
    }
}
Can I put the @Bean public MapperScannerConfigurer BPMapperScannerConfigurer() method into DataConfig? I tried, but it prints:
Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'testController': Unsatisfied dependency expressed through field 'testMapper'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'testMapper' defined in file [/Users/twogoods/codesource/mainetset/target/classes/com/tg/mapper/TestMapper.class]: Cannot resolve reference to bean 'sqlSessionFactoryBean' while setting bean property 'sqlSessionFactory'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'sqlSessionFactoryBean' defined in class path resource [com/tg/config/DataConfig.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.ibatis.session.SqlSessionFactory]: Factory method 'sqlSessionFactoryBean' threw exception; nested exception is java.lang.NullPointerException
MapperScannerConfig is initialized earlier than DataConfig (I can see this from the log output). The @Value("${datasource.basePackage:com.tg.ms.mapper}") private String basePackage; field cannot get its value there (in DataConfig it can). Using @AutoConfigureAfter is useless; MapperScannerConfig is still initialized earlier, so I cannot configure the mapper basePackage.
Log: Cannot enhance @Configuration bean definition 'BPMapperScannerConfigurer' since its singleton instance has been created too early. The typical cause is a non-static @Bean method with a BeanDefinitionRegistryPostProcessor return type: Consider declaring such methods as 'static'.
I got the same problem. MapperScannerConfigurer is initialized too early by the Spring framework, and I think that causes the @AutoConfigureAfter annotation to become useless.
So I solved it by avoiding the use of MapperScannerConfigurer altogether.
Two ways:
just use @MapperScan("com.a.b.package") (a minimal sketch follows below)
use the annotation @org.apache.ibatis.annotations.Mapper on your MyBatis mapper interfaces.
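As a minimal sketch of the first option (not from the original answer; the base package and the sqlSessionFactoryRef value simply reuse names from the question), the scanner configuration collapses into a single annotation:
@Configuration
@MapperScan(basePackages = "com.tg.mapper", sqlSessionFactoryRef = "sqlSessionFactoryBean")
public class DataConfig {

    // dataSource() and sqlSessionFactoryBean() beans stay exactly as in the question;
    // the separate MapperScannerConfig class (and its MapperScannerConfigurer bean) is no longer needed.
}
This avoids declaring the MapperScannerConfigurer through a non-static @Bean method, which is what triggers the 'created too early' message in the log.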

Spring Data: inject 2 repositories with same name but in 2 different packages

Context
I want to use two different databases in the same Spring context; they have entities that share the same name, but not the same structure. I rely on Spring Data MongoDB and JPA/JDBC. I have two packages, containing among others the following files:
com.bar.entity
Car.class
com.bar.repository
CarRepository.class
RepoBarMarker.class
com.bar.config
MongoConfiguration.class
com.foo.entity
Car.class
com.foo.repository
CarRepository.class
RepoFooMarker.class
com.foo.config
JPAConfiguration.class
SpecEntityManagerFactory.class
The content of each Car.class is different; I cannot reuse them. bar uses Spring Data MongoDB and foo uses Spring Data JPA, and the repositories are initialised via the @EnableMongoRepositories and @EnableJpaRepositories annotations. When in one of my application components I try to access the foo version of the repository:
@Resource
private com.foo.repository.CarRepository carRepository;
I have the following exception when the class containing the @Resource field is created:
Caused by: org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'carRepository' must be of type [com.foo.repository.CarRepository], but was actually of type [com.sun.proxy.$Proxy31]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:374)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:198)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.autowireResource(CommonAnnotationBeanPostProcessor.java:446)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.getResource(CommonAnnotationBeanPostProcessor.java:420)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor$ResourceElement.getResourceToInject(CommonAnnotationBeanPostProcessor.java:545)
at org.springframework.beans.factory.annotation.InjectionMetadata$InjectedElement.inject(InjectionMetadata.java:155)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:87)
at org.springframework.context.annotation.CommonAnnotationBeanPostProcessor.postProcessPropertyValues(CommonAnnotationBeanPostProcessor.java:305)
... 26 more
It appears that Spring tries to convert a bar repository to a foo repository, instead of creating a new bean, as in the same stack I also have the following exception:
Caused by: java.lang.IllegalStateException: Cannot convert value of type [com.sun.proxy.$Proxy31 implementing com.bar.repository.CarRepository,org.springframework.data.repository.Repository,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised] to required type [com.foo.repository.CarRepository]: no matching editors or conversion strategy found
at org.springframework.beans.TypeConverterDelegate.convertIfNecessary(TypeConverterDelegate.java:267)
at org.springframework.beans.TypeConverterDelegate.convertIfNecessary(TypeConverterDelegate.java:93)
at org.springframework.beans.TypeConverterSupport.doConvert(TypeConverterSupport.java:64)
... 35 more
If I try instead to autowire the repository:
@Autowired
private com.foo.repository.CarRepository carRepository;
I get the following exception:
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire field: private com.foo.CarRepository com.shell.ShellApp.carRepository; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [com.foo.CarRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {#org.springframework.beans.factory.annotation.Autowired(required=true)}
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:509)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:87)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:290)
... 26 more
Spring-data configuration
In the foo (JPA) package, JPAConfiguration.class:
@Configuration
@EnableJpaRepositories(basePackageClasses = RepoFooMarker.class)
public class JPAConfiguration {

    @Autowired
    public DataSource dataSource;

    @Autowired
    public EntityManagerFactory entityManagerFactory;

    @Bean
    public EntityManager entityManager(final EntityManagerFactory entityManagerFactory) {
        return entityManagerFactory.createEntityManager();
    }

    @Bean
    public Session session(final EntityManager entityManager) {
        return entityManager.unwrap(Session.class);
    }

    @Bean
    public PlatformTransactionManager transactionManager() throws SQLException {
        final JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }

    @Bean
    public HibernateExceptionTranslator hibernateExceptionTranslator() {
        return new HibernateExceptionTranslator();
    }
}
SpecEntityManagerFactory.class:
@Configuration
public class SpecEntityManagerFactory {

    @Bean
    public EntityManagerFactory entityManagerFactory(final DataSource dataSource) throws SQLException {
        final HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(false);
        vendorAdapter.setDatabase(Database.POSTGRESQL);
        final LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        factory.setPackagesToScan("com.foo.entity");
        factory.setJpaProperties(getHibernateProperties());
        factory.setDataSource(dataSource);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    private Properties getHibernateProperties() {
        final Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.temp.use_jdbc_metadata_defaults", "false");
        return hibernateProperties;
    }
}
In the bar (MongoDB) package, MongoConfiguration.class:
@Configuration
@EnableMongoRepositories(basePackageClasses = RepoBarMarker.class)
public class MongoConfiguration extends AbstractRepoConfig {

    @Override
    @Bean
    public MongoOperations mongoTemplate() {
        final MongoClient mongo = this.getMongoClient();
        final MongoClientURI mongoUri = this.getMongoClientUri();
        final MongoTemplate mongoTemplate = new MongoTemplate(mongo, mongoUri.getDatabase());
        mongoTemplate.setReadPreference(ReadPreference.secondaryPreferred());
        mongoTemplate.setWriteConcern(WriteConcern.UNACKNOWLEDGED);
        return mongoTemplate;
    }
}
Question
If I change the entity name in the foo package to CarFoo.class and the repository to CarFooRepository.class, then everything works. But is there a way to avoid renaming them and still have real wiring per type, instead of by name (which is what seems to be done here), for Spring Data repositories?
In your case, you can use
@Repository("fooCarRepository")
on the interface declaration of
com.foo.repository.CarRepository
Although @Repository is not generally needed on the interface when using Spring Data, in your case you need to supply it. That's because you need Spring to register the implementation of the bean under a custom name (in this case fooCarRepository) in order to avoid the name collision.
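As a minimal sketch (not part of the original answer; the JpaRepository type parameters are only illustrative, since the Car entity's ID type isn't shown), the fix would look like this:
// com.foo.repository.CarRepository
@Repository("fooCarRepository")
public interface CarRepository extends JpaRepository<Car, Long> {
    // query methods as before
}

// Injection then resolves by the custom bean name instead of colliding on 'carRepository':
@Resource(name = "fooCarRepository")
private com.foo.repository.CarRepository carRepository;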
