Here is my case:
I have two databases: one Sybase and one MSSQL. I want to access both databases in a single service class. For example, I want to get some data from Sybase and then perform some updates on MSSQL.
I have set up two datasources based on multiple samples found online, but I'm unable to access my second database (Sybase).
Here is my code:
application.properties
spring.mvc.view.prefix: /WEB-INF/jsp/
spring.mvc.view.suffix: .jsp
# Database
# spring.datasource.jndi-name=jdbc/database1
spring.datasource.driver-class-name=net.sourceforge.jtds.jdbc.Driver
spring.datasource.url=jdbc:jtds:sqlserver://database1/db_aes
spring.datasource.username=user1
spring.datasource.password=password1
# Keep the connection alive if idle for a long time (needed in production)
spring.datasource.testWhileIdle = true
spring.datasource.validationQuery = SELECT 1
# 2nd Database
spring.secondDatasource.driver-class-name=net.sourceforge.jtds.jdbc.Driver
spring.secondDatasource.url=jdbc:jtds:sybase://database2/aidcconfig
spring.secondDatasource.username=user2
spring.secondDatasource.password=password2
spring.secondDatasource.hibernate.dialect = org.hibernate.dialect.SybaseASE15Dialect
spring.secondDatasource.testWhileIdle = true
spring.secondDatasource.validationQuery = SELECT 1
# Show or not log for each sql query
spring.jpa.show-sql = false
# Hibernate ddl auto (create, create-drop, update, validate)
spring.jpa.hibernate.ddl-auto = validate
# Naming strategy
spring.jpa.hibernate.naming-strategy = org.hibernate.cfg.EJB3NamingStrategy
# Use spring.jpa.properties.* for Hibernate native properties (the prefix is
# stripped before adding them to the entity manager)
# The SQL dialect makes Hibernate generate better SQL for the chosen database
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.SQLServerDialect
com.ibm.websphere.persistence.ApplicationsExcludedFromJpaProcessing=*
FileUploadServiceImpl
@Component("fileUploadService")
@Transactional
public class FileUploadServiceImpl implements FileUploadService {

    @Autowired
    @Qualifier("dbAesJdbcTemplate")
    JdbcTemplate dbAesJdbcTemplate;

    @Autowired
    @Qualifier("aidcconfigJdbcTemplate")
    JdbcTemplate aidcconfigJdbcTemplate;

    private int uploadId = 1;

    private void testDB() {
        String db = aidcconfigJdbcTemplate.queryForObject("select db_name()", String.class);
        System.out.println("database name: " + db);
    }
    ...
}
DbAesDataSource
package config.database;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
    entityManagerFactoryRef = "dbAesEntityManagerFactory",
    transactionManagerRef = "dbAesTransactionManager",
    basePackages = {"web.fileUpload.repo.db_aes.dao"}
)
public class DbAesDataSource {

    @Primary
    @Bean(name = "dbAesDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dbAesDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "dbAesJdbcTemplate")
    public JdbcTemplate dbAesJdbcTemplate(@Qualifier("dbAesDataSource") DataSource dbAesDataSource) {
        return new JdbcTemplate(dbAesDataSource);
    }

    @Bean(name = "dbAesEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean dbAesEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dbAesDataSource") DataSource dbAesDataSource) {
        return builder
                .dataSource(dbAesDataSource)
                .packages("web.fileUpload.repo.db_aes.models")
                .build();
    }

    @Bean(name = "dbAesTransactionManager")
    public PlatformTransactionManager dbAesTransactionManager(
            @Qualifier("dbAesEntityManagerFactory") EntityManagerFactory dbAesEntityManagerFactory) {
        return new JpaTransactionManager(dbAesEntityManagerFactory);
    }
}
AidcconfigDataSource
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
    entityManagerFactoryRef = "aidcconfigEntityManagerFactory",
    transactionManagerRef = "aidcconfigTransactionManager",
    basePackages = {"web.fileUpload.repo.aidcconfig.dao"}
)
public class AidcconfigDataSource {

    @Bean(name = "aidcconfigDataSource")
    @ConfigurationProperties(prefix = "spring.secondDatasource")
    public DataSource aidcconfigDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "aidcconfigJdbcTemplate")
    public JdbcTemplate aidcconfigJdbcTemplate(@Qualifier("aidcconfigDataSource") DataSource aidcconfigDataSource) {
        return new JdbcTemplate(aidcconfigDataSource);
    }

    @Bean(name = "aidcconfigEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean aidcconfigEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("aidcconfigDataSource") DataSource aidcconfigDataSource) {
        return builder
                .dataSource(aidcconfigDataSource)
                .packages("web.fileUpload.repo.aidcconfig.models")
                .build();
    }

    @Bean(name = "aidcconfigTransactionManager")
    public PlatformTransactionManager aidcconfigTransactionManager(
            @Qualifier("aidcconfigEntityManagerFactory") EntityManagerFactory aidcconfigEntityManagerFactory) {
        return new JpaTransactionManager(aidcconfigEntityManagerFactory);
    }
}
Here is my error:
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:1.3.5.RELEASE:run (default-cli) on project testUpload: An exception occurred while running. null: InvocationTargetException:
Error creating bean with name 'fileDownloadController': Injection of autowired dependencies failed;
nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private web.fileUpload.services.FileUploadService web.fileUpload.controller.FileDownloadController.fileUploadService;
nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'fileUploadService': Injection of autowired dependencies failed;
nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: org.springframework.jdbc.core.JdbcTemplate web.fileUpload.services.FileUploadServiceImpl.dbAesJdbcTemplate;
nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.jdbc.core.JdbcTemplate] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency.
Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true), @org.springframework.beans.factory.annotation.Qualifier(value=dbAesJdbcTemplate)} -> [Help 1]
If I remove the @Qualifier in FileUploadServiceImpl, then any JdbcTemplate only connects to my primary database, which is db_aes. How can I access my second datasource using JdbcTemplate?
Following are some of the references I used:
Spring Boot, Spring Data JPA with multiple DataSources
https://www.infoq.com/articles/Multiple-Databases-with-Spring-Boot
Multiple DataSource and JdbcTemplate in Spring Boot (> 1.1.0)
Trial#1
I noticed that it was unable to create the bean, so I placed some log statements in the AidcconfigDataSource class. I didn't see my methods being executed, so I assumed the application was not picking up my AidcconfigDataSource class.
I relocated the config folder from java/config to java/web/config.
Now I get another error:
[ERROR] Failed to execute goal org.springframework.boot:spring-boot-maven-plugin:1.3.5.RELEASE:run (default-cli) on project testUpload: An exception occurred while running. null: InvocationTargetException:
Error creating bean with name 'dataSourceInitializerPostProcessor': Injection of autowired dependencies failed;
nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private org.springframework.beans.factory.BeanFactory org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerPostProcessor.beanFactory;
nested exception is org.springframework.beans.factory.BeanDefinitionStoreException: Invalid bean definition with name 'aidcconfigDataSource' defined in class path resource [web/config/database/AidcconfigDataSource.class]: factory-bean reference points back to the same bean definition -> [Help 1]
Trial#2
I changed my bean name from aidcconfigDataSource to aidcconfigDS, and did the same for the primary datasource. I also added "spring.jpa.open_in_view = false" to my application.properties. However, another error occurs. How do I do this the right way?
2016-11-03 09:28:16.118 ERROR 11412 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed;
nested exception is org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.transaction.PlatformTransactionManager] is defined: expected single matching bean but found 2: dbAesTransactionManager,aidcconfigTransactionManager] with root cause
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.transaction.PlatformTransactionManager] is defined: expected single matching bean but found 2: dbAesTransactionManager,aidcconfigTransactionManager
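(For the "expected single matching bean but found 2" failure specifically, one common way out, shown as a hedged sketch below with the bean names from this post, is to mark one transaction manager as @Primary so that by-type lookups such as a plain @Transactional have an unambiguous candidate; the other manager can still be selected explicitly with @Transactional("aidcconfigTransactionManager"). In practice @Primary would simply be added to the existing dbAesTransactionManager method; the class wrapper here is only for illustration.)

import javax.persistence.EntityManagerFactory;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class PrimaryTransactionManagerConfig {

    // Marking one of the two JPA transaction managers as primary resolves the
    // NoUniqueBeanDefinitionException for PlatformTransactionManager.
    @Primary
    @Bean(name = "dbAesTransactionManager")
    public PlatformTransactionManager dbAesTransactionManager(
            @Qualifier("dbAesEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}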
I think Spring Boot is trying to instantiate two beans with the same name:
aidcconfigDataSource.
One is your configuration class AidcconfigDataSource itself, and the other is the bean:
@Bean(name = "aidcconfigDataSource")
@ConfigurationProperties(prefix = "spring.secondDatasource")
public DataSource aidcconfigDataSource() {
    return DataSourceBuilder.create().build();
}
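One sketch of a way around that collision (assuming Boot 1.3.x package locations; the class name below is purely illustrative): rename either the configuration class or the DataSource bean so the class's implicit bean name no longer clashes with the @Bean name, keeping the rest of the original class (@EnableJpaRepositories, the JdbcTemplate, entity manager factory, and transaction manager beans) as it was:

import javax.sql.DataSource;

import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Renamed so the implicit bean name of the configuration class
// ("aidcconfigDataSourceConfig") no longer collides with the DataSource
// bean ("aidcconfigDataSource").
@Configuration
public class AidcconfigDataSourceConfig {

    @Bean(name = "aidcconfigDataSource")
    @ConfigurationProperties(prefix = "spring.secondDatasource")
    public DataSource aidcconfigDataSource() {
        return DataSourceBuilder.create().build();
    }
}

Renaming just the @Bean (as in Trial#2 with aidcconfigDS) can work as well, provided the @Qualifier values at the injection points are updated to match.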
Related
We are getting this exception:
Error creating bean with name 'costsEntityManagerFactory' defined in class path resource...Invocation of init method failed; nested exception is javax.persistence.PersistenceException: [PersistenceUnit: consolidatedCosts] Unable to build Hibernate SessionFactory; nested exception is org.hibernate.search.util.common.SearchException: HSEARCH000573: Invalid configuration passed to Hibernate Search: some properties in the given configuration are obsolete. Configuration properties changed between Hibernate Search 5 and Hibernate Search 6. Check out the reference documentation and upgrade your configuration.
Here is the code snippet
@Bean
public LocalContainerEntityManagerFactoryBean costsEntityManagerFactory(
        final EntityManagerFactoryBuilder builder,
        final @Qualifier("awsCostsDataSource") DataSource costsDataSource) {
    final Map<String, String> properties = new HashMap<>();
    properties.put(
            "hibernate.dialect",
            "com.x.y.consolidatedcosts.client.xxxxxPostgresDialect");
    properties.put(
            "hibernate.physical_naming_strategy", CostsTableNamingStrategy.class.getName());
    return builder.dataSource(costsDataSource)
            .packages("com.xxx.xxx.consolidatedcosts")
            .persistenceUnit("consolidatedCosts")
            .properties(properties)
            .build();
}
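HSEARCH000573 generally means the configuration still carries Hibernate Search 5-era keys that Hibernate Search 6 rejects. A purely hypothetical sketch of the kind of migration involved (the obsolete keys in this particular project are not shown above, and the values below are made up):

import java.util.HashMap;
import java.util.Map;

final class SearchPropertiesMigrationExample {

    // Hypothetical: a Search 5 filesystem index configuration such as
    //   hibernate.search.default.directory_provider = filesystem
    //   hibernate.search.default.indexBase          = /var/lucene/indexes
    // would need to be rewritten with the Hibernate Search 6 equivalents:
    static Map<String, String> hibernateSearch6Properties() {
        Map<String, String> properties = new HashMap<>();
        properties.put("hibernate.search.backend.directory.type", "local-filesystem");
        properties.put("hibernate.search.backend.directory.root", "/var/lucene/indexes");
        return properties;
    }
}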
I have a Spring Boot application which connects to both Elasticsearch and a database. At first I used Spring Data features such as JpaRepository and ElasticsearchRepository, but I would like to start using JdbcTemplate and ElasticsearchTemplate. I deleted all JPA-related services and annotations such as @Entity and @Id, and created my own configuration, but Spring still wants to create an EntityManagerFactory, use Hibernate, and so on. What shall I do?
Here is my config
@Bean(name = DE_DATABASE_BEAN_NAME)
@ConfigurationProperties(prefix = "spring.de")
@Primary
public DataSource deDBDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean(name = DE_DATABASE_JDBC_TEMPLATE_NAME)
public JdbcTemplate jdbcTemplate(@Qualifier(DE_DATABASE_BEAN_NAME) final DataSource dataSource) {
    return new JdbcTemplate(dataSource);
}

@Bean(name = DE_ELASTICSEARCH_TEMPLATE_NAME)
public ElasticsearchTemplate elasticsearchTemplate() throws UnknownHostException {
    final String server = clusterNodes.split(":")[0];
    final Integer port = Integer.parseInt(clusterNodes.split(":")[1]);
    final Settings settings = Settings.builder().put("cluster.name", clusterName).build();
    final Client client = new PreBuiltTransportClient(settings)
            .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName(server), port));
    return new ElasticsearchTemplate(client);
}
And my stacktrace
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Unsatisfied dependency expressed through method 'entityManagerFactory' parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'entityManagerFactoryBuilder' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Unsatisfied dependency expressed through method 'entityManagerFactoryBuilder' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaVendorAdapter' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.orm.jpa.JpaVendorAdapter]: Factory method 'jpaVendorAdapter' threw exception; nested exception is java.lang.IllegalArgumentException: jdbcUrl is required with driverClassName.
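If the goal is to keep only JdbcTemplate and ElasticsearchTemplate, one approach, sketched below rather than taken from the original project, is to switch off the JPA auto-configuration so Boot stops building an EntityManagerFactory (removing spring-boot-starter-data-jpa from the build has a similar effect). The class name here is illustrative:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration;

// With no JPA usage left, excluding the Hibernate JPA auto-configuration prevents
// Boot from creating the entityManagerFactory and jpaVendorAdapter beans.
@SpringBootApplication(exclude = HibernateJpaAutoConfiguration.class)
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}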
I have a Spring Batch job written using Spring Boot. My batch only reads from MongoDB and prints the records.
I'm not using any SQL database, nor have I declared any such dependencies in the project, but while running it I'm getting the exception below:
s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'batchConfigurer' defined in class path resource [org/springframework/boot/autoconfigure/batch/BatchConfigurerConfiguration$JdbcBatchConfiguration.class]: Unsatisfied dependency expressed through method 'batchConfigurer' parameter 1; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
o.s.j.e.a.AnnotationMBeanExporter : Unregistering JMX-exposed beans on shutdown
ConditionEvaluationReportLoggingListener :
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2018-06-01 10:43:39.485 ERROR 15104 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 1 of method batchConfigurer in org.springframework.boot.autoconfigure.batch.BatchConfigurerConfiguration$JdbcBatchConfiguration required a bean of type 'javax.sql.DataSource' that could not be found.
- Bean method 'dataSource' not loaded because @ConditionalOnProperty (spring.datasource.jndi-name) did not find property 'jndi-name'
- Bean method 'dataSource' not loaded because @ConditionalOnClass did not find required class 'javax.transaction.TransactionManager'
Action:
Consider revisiting the conditions above or defining a bean of type 'javax.sql.DataSource' in your configuration.
In my pom.xml I've added the dependencies below:
spring-boot-starter-batch
spring-boot-starter-data-mongodb
spring-boot-starter-test
spring-batch-test
Here's my batch configuration class:
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private MongoTemplate mongoTemplate;

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job1").flow(step1()).end().build();
    }

    @Bean
    public Step step1() throws Exception {
        return stepBuilderFactory.get("step1").<BasicDBObject, BasicDBObject>chunk(10).reader(reader())
                .writer(writer()).build();
    }

    @Bean
    public ItemReader<BasicDBObject> reader() {
        MongoItemReader<BasicDBObject> reader = new MongoItemReader<BasicDBObject>();
        reader.setTemplate(mongoTemplate);
        reader.setCollection("test");
        reader.setQuery("{'status':1}");
        reader.setSort(new HashMap<String, Sort.Direction>() {
            {
                put("_id", Direction.ASC);
            }
        });
        reader.setTargetType((Class<? extends BasicDBObject>) BasicDBObject.class);
        return reader;
    }

    public ItemWriter<BasicDBObject> writer() {
        MongoItemWriter<BasicDBObject> writer = new MongoItemWriter<BasicDBObject>();
        return writer;
    }
}
Here is my launcher class:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class MyBatchApplication {
    ...
}
Spring Batch requires a relational data store for the job repository, and Spring Boot enforces that fact. To fix this, you'll need to add an in-memory database or create your own BatchConfigurer that uses the Map-based repository. For the record, the recommended approach is to add an in-memory database (HSQLDB, H2, etc.).
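A rough sketch of the custom BatchConfigurer route (assuming a Spring Batch 3.x/4.x classpath, where DefaultBatchConfigurer is still available; it was removed in Spring Batch 5):

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MapJobRepositoryConfig {

    // With no DataSource injected into it, DefaultBatchConfigurer falls back to the
    // Map-based job repository and a ResourcelessTransactionManager, so no relational
    // database is needed just to run the job. Supplying this bean also stops Boot's
    // JDBC-based BatchConfigurer from being auto-configured.
    @Bean
    public BatchConfigurer batchConfigurer() {
        return new DefaultBatchConfigurer();
    }
}

If the in-memory route is preferred instead, it is usually enough to add H2 or HSQLDB to the classpath and let Boot configure the job repository against it.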
Any help getting this config to work would be welcome.
I am trying to take over the automatic connection pool, datasource and JPA configuration from Spring Boot to allow me to bring DataNucleus into the mix instead of Hibernate.
My approach is to code up the pieces Boot says are missing on a trial and error basis. I had to remove the Hibernate dependencies to allow DataNucleus to run.
Maybe I've now coded up too much, or maybe I'm not far enough.
Spring falls over with the error:
Exception encountered during context initialization - cancelling refresh attempt:
[huge SNIP]
nested exception is org.springframework.beans.BeanInstantiationException:
Failed to instantiate [org.springframework.data.repository.support.Repositories]:
Factory method 'repositories' threw exception;
nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name 'symbolRepositoryImpl':
Unsatisfied dependency expressed through field 'entityManager';
nested exception is org.springframework.beans.factory.NoUniqueBeanDefinitionException:
No qualifying bean of type 'javax.persistence.EntityManager' available:
expected single matching bean but found 2:
org.springframework.orm.jpa.SharedEntityManagerCreator#0,
org.springframework.orm.jpa.SharedEntityManagerCreator#1
[SNIP]
2017-06-01 09:43:09.675 ERROR 9108 --- [ restartedMain] o.s.b.d.LoggingFailureAnalysisReporter
***************************
APPLICATION FAILED TO START
***************************
Description:
Field entityManager in com.bp.gis.tardis.repository.SymbolRepositoryImpl
required a single bean, but 2 were found:
- org.springframework.orm.jpa.SharedEntityManagerCreator#0: defined by method 'createSharedEntityManager' in null
- org.springframework.orm.jpa.SharedEntityManagerCreator#1: defined by method 'createSharedEntityManager' in null
Action:
Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans,
or using @Qualifier to identify the bean that should be consumed
I could spend hours debugging this further, but the breakpoint hits during the initialisation of one of the repositories, which should have an EntityManager injected.
This is what I'm manually instantiating:
@Configuration
@EnableJpaRepositories(
    basePackages = {"org.adam.repository"}
)
public class DataSourceConfig {

    @Bean
    @ConfigurationProperties(prefix = "adam.datasource")
    public AdamDataSourceProperties getDataSourceProperties() {
        return new AdamDataSourceProperties();
    }

    @Bean
    public DataSource getDataSource() {
        AdamDataSourceProperties props = getDataSourceProperties();
        return new HikariDataSource(props.getHikariConfig());
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean getEmfBean() {
        LocalContainerEntityManagerFactoryBean emfBean =
                new LocalContainerEntityManagerFactoryBean();
        emfBean.setDataSource(getDataSource());
        emfBean.setPersistenceUnitName("adam");
        return emfBean;
    }

    @Bean
    public EntityManagerFactory getEmf() {
        LocalContainerEntityManagerFactoryBean emfBean = getEmfBean();
        return emfBean.getNativeEntityManagerFactory();
    }
}
My AdamDataSourceProperties is initialised by Spring using the "adam.datasource"-prefixed values in application.properties, and it can then create a HikariConfig object used to instantiate the HikariDataSource. That bit is actually fine; it's the entity manager factory that is probably causing issues, or something else.
I've got no evidence that my last method getEmf() is actually helping.
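For what it's worth, that getEmf() bean may well be where the "found 2" comes from: LocalContainerEntityManagerFactoryBean is itself a FactoryBean<EntityManagerFactory>, so the container already exposes the EntityManagerFactory it builds, and declaring a second explicit EntityManagerFactory bean gives Spring Data two factories (and hence two shared EntityManagers) to choose from. A minimal sketch with only the factory bean (the class name and method-parameter DataSource injection are illustrative):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;

@Configuration
public class EntityManagerFactoryConfig {

    // LocalContainerEntityManagerFactoryBean is a FactoryBean<EntityManagerFactory>,
    // so this single declaration is enough for the EntityManagerFactory to be injectable.
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean emfBean = new LocalContainerEntityManagerFactoryBean();
        emfBean.setDataSource(dataSource);
        emfBean.setPersistenceUnitName("adam");
        return emfBean;
    }
}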
Also, I'm dubious that the error
Required a single bean, but 2 were found
or the suggested action is helpful; I don't fancy going into the Spring source code in order to annotate one of those methods on Spring's SharedEntityManagerCreator as @Primary.
UPDATE
DataNucleus won't run if it finds other JPA API classes on the classpath; it insists on its own version of the persistence API, hence removing the Hibernate packages was necessary.
Caused by: org.datanucleus.exceptions.NucleusUserException:
Found Meta-Data for class org.adam.entity.TimeSeriesEntity
but this class is either not enhanced or you have multiple copies
of the persistence API jar in your CLASSPATH!!
Make sure all persistable classes are enhanced before running
DataNucleus and/or the CLASSPATH is correct.
at org.datanucleus.metadata.MetaDataManagerImpl
.initialiseClassMetaData(MetaDataManagerImpl.java:2814)
So I have excluded Hibernate from spring-boot-starter-data-jpa, and that error disappears.
I changed the LocalContainerEntityManagerFactoryBean method name to entityManagerFactory:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    LocalContainerEntityManagerFactoryBean emfBean =
            new LocalContainerEntityManagerFactoryBean();
    emfBean.setDataSource(getDataSource());
    emfBean.setPersistenceUnitName("adam");
    return emfBean;
}
and to enable testing, I have to copy this @Configuration class and change the EMF method to accept Spring's test database:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(
        @Qualifier("dataSource") DataSource dataSource) {
    LocalContainerEntityManagerFactoryBean emfBean =
            new LocalContainerEntityManagerFactoryBean();
    emfBean.setDataSource(dataSource);
    emfBean.setPersistenceUnitName("adam");
    return emfBean;
}
That @Qualifier is for the sake of IntelliJ, whose Spring facet complains about two candidates for injection here.
I also discovered that with this configuration, the repository/DTO dependency injection for the EntityManager doesn't work with @Autowired. It has to be the native JPA annotation:
@PersistenceContext
private EntityManager entityManager;
With my previous Hibernate and OpenJPA configurations, Spring was happy to inject its own self-instantiated EntityManager in the presence of @Autowired.
This adds more fuel to my beef with Spring: too often it just doesn't do what it says on the tin. The Spring tests should find the @Configuration classes in the package hierarchy, but they don't; I need to use @Import. Spring should also find dependency injection candidates based on type (EntityManager, DataSource, etc.), but it doesn't; in some cases they have to be produced by methods with a particular name or by @Bean annotations declaring a name.
Still, it's done.
I have a datasource autowired with setters. I am trying to return the datasource with a @Bean declaration in a Spring JavaConfig file. For some reason, it is not being picked up, and the following error is shown:
Property 'dataSource' required
Any idea? Here is my bean in the JavaConfig file:
@Bean(name = "dataSource")
public DataSource dataSource() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("org.hsqldb.jdbcDriver");
    dataSource.setUrl("xyz");
    dataSource.setUsername("xyz");
    dataSource.setPassword("xyz");
    return dataSource;
}
and the log trace:
Error creating bean with name 'featureStoreSpringJDBC' defined
in URL [jar:file:/C:home/WEB-INF/lib/ff4j-store-springjdbc.jar!
/org/ff4j/store/FeatureStoreSpringJDBC.class]:
Initialization of bean failed; nested exception
is org.springframework.beans.factory.BeanInitializationException
Property 'dataSource' is required for bean 'featureStoreSpringJDBC'
Please note that the dataSource attribute is not annotated with @Autowired; as a consequence, you must explicitly invoke the setter and initialize the FeatureStore in your JavaConfig.
The reason is that you should define the whole FF4J as a bean in the JavaConfig. In versions before 1.3 it was autowired, but we ran into issues with the spreading of JavaConfig.
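A hedged sketch of that explicit wiring (the class name is illustrative, and the setters are the ones implied by the error message and the ff4j-store-springjdbc store; check them against the FF4J version in use):

import javax.sql.DataSource;

import org.ff4j.FF4j;
import org.ff4j.store.FeatureStoreSpringJDBC;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FF4jConfig {

    // The store's dataSource property is not autowired, so it is set explicitly
    // before handing the store to FF4J.
    @Bean
    public FF4j ff4j(DataSource dataSource) {
        FeatureStoreSpringJDBC featureStore = new FeatureStoreSpringJDBC();
        featureStore.setDataSource(dataSource);

        FF4j ff4j = new FF4j();
        ff4j.setFeatureStore(featureStore);
        return ff4j;
    }
}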