Micronaut - Configuring Oracle UCP Multiple Data Sources And JdbcOperations

I am developing my first Micronaut application and I'm having a problem configuring multiple Oracle datasources with UCP.
I'm following the official tutorial (https://micronaut-projects.github.io/micronaut-sql/latest/guide/), but when I try to perform a select, I get an error:
io.micronaut.transaction.jdbc.exceptions.CannotGetJdbcConnectionException: No current JDBC Connection found. Consider wrapping this call in transactional boundaries.
I checked the DataSourceFactory, and the PoolDataSource is set correctly.
What am I missing?
Thanks!
micronaut:
  application:
    name: myapp
datasources:
  first:
    url: url
    connectionFactoryClassName: oracle.jdbc.pool.OracleDataSource
    username: user
    password: password
    minPoolSize: 1
    maxPoolSize: 10
  second:
    url: url
    connectionFactoryClassName: oracle.jdbc.pool.OracleDataSource
    username: user
    password: password
    minPoolSize: 1
    maxPoolSize: 10
@JdbcRepository(dialect = Dialect.ORACLE)
public abstract class MyRepository {

    @Inject
    @Named("first")
    protected final DataSource dataSource;

    protected final JdbcOperations jdbcOperations;

    public MyRepository(JdbcOperations jdbcOperations, DataSource dataSource) {
        this.dataSource = dataSource;
        this.jdbcOperations = jdbcOperations;
    }
}
@Singleton
public class MyDao extends MyRepository {

    public MyDao(JdbcOperations jdbcOperations, DataSource dataSource) {
        super(jdbcOperations, dataSource);
    }

    @Transactional
    public Long find() {
        String sql = "select * from table where id = 1";
        return this.jdbcOperations.prepareStatement(sql, st -> {
            return 1L;
        });
    }
}

The first datasource needs to be called 'default'.
Also, when using a transaction with a non-default datasource, you need to use @TransactionalAdvice with the datasource name, e.g. @TransactionalAdvice("second").
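For illustration, a minimal sketch (not from the original answer) of a DAO that targets the second pool, assuming the first pool in the YAML above is renamed to default, that Micronaut Data exposes one JdbcOperations bean per configured datasource under its name, and that the imports match a recent Micronaut version (older versions use javax.inject instead of jakarta.inject):

import java.sql.ResultSet;

import io.micronaut.data.jdbc.runtime.JdbcOperations;
import io.micronaut.transaction.annotation.TransactionalAdvice;
import jakarta.inject.Named;
import jakarta.inject.Singleton;

@Singleton
public class SecondDbDao {

    private final JdbcOperations jdbcOperations;

    // JdbcOperations qualified with the name of the second datasource ("second")
    public SecondDbDao(@Named("second") JdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations;
    }

    // Open the transaction on the "second" datasource instead of "default"
    @TransactionalAdvice("second")
    public Long find() {
        String sql = "select id from my_table where id = 1"; // table/column names are illustrative
        return jdbcOperations.prepareStatement(sql, statement -> {
            try (ResultSet rs = statement.executeQuery()) {
                if (rs.next()) {
                    return rs.getLong("id");
                }
                return null;
            }
        });
    }
}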

Related

Amazon RDS Read Replica configuration for a Postgres database from a Spring Boot application deployed on PCF?

Hi all, currently we have deployed our Spring Boot code to PCF, which is running on AWS.
We are using an AWS database, for which we have a CUPS service and VCAP_SERVICES holding the DB parameters.
Below is our configuration to get the datasource:
@Bean
public DataSource dataSource() {
    if (dataSource == null) {
        dataSource = connectionFactory().dataSource();
        configureDataSource(dataSource);
    }
    return dataSource;
}

@Bean
public JdbcTemplate jdbcTemplate() {
    return new JdbcTemplate(dataSource());
}

private void configureDataSource(DataSource dataSource) {
    org.apache.tomcat.jdbc.pool.DataSource tomcatDataSource = asTomcatDatasource(dataSource);
    tomcatDataSource.setTestOnBorrow(true);
    tomcatDataSource.setValidationQuery("SELECT 1");
    tomcatDataSource.setValidationInterval(30000);
    tomcatDataSource.setTestWhileIdle(true);
    tomcatDataSource.setTimeBetweenEvictionRunsMillis(60000);
    tomcatDataSource.setRemoveAbandoned(true);
    tomcatDataSource.setRemoveAbandonedTimeout(60);
    tomcatDataSource.setMaxActive(Environment.getAsInt("MAX_ACTIVE_DB_CONNECTIONS", tomcatDataSource.getMaxActive()));
}

private org.apache.tomcat.jdbc.pool.DataSource asTomcatDatasource(DataSource dataSource) {
    Objects.requireNonNull(dataSource, "There is no DataSource configured");
    DataSource targetDataSource = ((DelegatingDataSource) dataSource).getTargetDataSource();
    return (org.apache.tomcat.jdbc.pool.DataSource) targetDataSource;
}
Now that we have read replicas created, what configuration do I need to modify so our Spring Boot application uses the read replicas?
Is just @Transactional(readOnly = true) on the get call enough, so that it will be taken care of automatically? Or do I need to add some more configuration?
@Repository
public class PostgresSomeRepository implements SomeRepository {

    @Autowired
    public PostgresSomeRepository(JdbcTemplate jdbcTemplate, RowMapper<Consent> rowMapper) {
        this.jdbcTemplate = jdbcTemplate;
        this.rowMapper = rowMapper;
    }

    @Override
    @Transactional(readOnly = true)
    public List<SomeValue> getSomeGetCall(List<String> userIds, String applicationName, String propositionName, String since, String... types) {
        //Some Logic
        try {
            return jdbcTemplate.query(sql, rowMapper, paramList.toArray());
        } catch (DataAccessException ex) {
            throw new ErrorGettingConsent(ex.getMessage(), ex);
        }
    }
}
Note: we have not added any Spring AWS JDBC dependency.
Let's assume the cloud service name is my_db.
Map the cloud service to the application config application-cloud.yml, used by default in CF (BTW, this is better than using the connector because you can customize the datasource):
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    # my_db
    url: ${vcap.services.my_db.credentials.url}
    username: ${vcap.services.my_db.credentials.username}
    password: ${vcap.services.my_db.credentials.password}
    hikari:
      poolName: Hikari
      auto-commit: false
      data-source-properties:
        cachePrepStmts: true
        prepStmtCacheSize: 250
        prepStmtCacheSqlLimit: 2048
        useServerPrepStmts: true
  jpa:
    generate-ddl: false
    show-sql: true
Put the service in the application manifest.yml:
---
applications:
  - name: my-app
    env:
      SPRING_PROFILES_ACTIVE: "cloud" # by default
    services:
      - my_db
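The answer above only covers binding the primary service; it does not address the read-replica routing asked about in the question (with the plain PostgreSQL driver, @Transactional(readOnly = true) alone does not redirect queries to a replica). For illustration only, a hedged sketch of one common approach: bind a second, hypothetical replica service (my_db_replica) the same way under its own property prefix and expose a dedicated read-only JdbcTemplate for it. All names here are assumptions, not part of the original answer.

import javax.sql.DataSource;

import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class ReplicaDataSourceConfig {

    // Bound from e.g. ${vcap.services.my_db_replica.credentials.*} placeholders in application-cloud.yml
    @Bean
    @ConfigurationProperties("spring.replica-datasource")
    public DataSourceProperties replicaDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "replicaDataSource")
    public DataSource replicaDataSource() {
        return replicaDataSourceProperties()
                .initializeDataSourceBuilder()
                .type(HikariDataSource.class)
                .build();
    }

    // Inject this template (instead of the default one) in repositories that should read from the replica
    @Bean(name = "replicaJdbcTemplate")
    public JdbcTemplate replicaJdbcTemplate(@Qualifier("replicaDataSource") DataSource replicaDataSource) {
        return new JdbcTemplate(replicaDataSource);
    }
}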

How to create multiple beans (same type) in one Spring Boot java config class (@Configuration)?

In my yaml file, I have config values as below:
myapp:
  rest-clients:
    rest-templates:
      - id: myService1
        username: chris
        password: li
        base-url: http://localhost:3000/service1
        read-timeout: 2s
        connect-timeout: 1s
      - id: myService2
        username: chris
        password: li
        base-url: http://localhost:3000/service1
        read-timeout: 2s
        connect-timeout: 1s
I want my Spring Boot 2 app to register a RestTemplate for each config item.
My configuration bean is below:
@Configuration
@AllArgsConstructor
public class MyAppRestClientsConfiguration {

    private MyAppRestClientsProperties properties;
    private GenericApplicationContext applicationContext;
    private RestTemplateBuilder restTemplateBuilder;

    @PostConstruct
    public void init() {
        properties.getRestTemplates().forEach(this::registerRestTemplate);
    }

    private void registerRestTemplate(MyAppRestTemplateConfig config) {
        // do some work
        applicationContext.registerBean(config.getId(), RestTemplate.class, () -> restTemplate);
    }
}
The problem is that when I inject my registered RestTemplate via @Autowired, this config bean has not finished init yet, so there is no RestTemplate bean that could be injected.
@Autowired
@Qualifier("myService1")
private RestTemplate client1;
The injection point has the following annotations:
- @org.springframework.beans.factory.annotation.Autowired(required=true)
- @org.springframework.beans.factory.annotation.Qualifier(value=myService1)
Is there any correct way to implement this requirement?
The problem with registering new beans in a @PostConstruct annotated method is that Spring is already past that particular point in the Spring life cycle (more info on the Spring life cycle). Sometimes an annotation such as @DependsOn (already mentioned), @Order, or @Lazy might help. However, as you mentioned you'd rather not force (Spring) implementation details upon projects that make use of your library, I've written a BeanFactoryPostProcessor that registers a RestTemplate bean:
@Component
public class DemoBeanFactoryPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory configurableListableBeanFactory) {
        GenericBeanDefinition genericBeanDefinition = new GenericBeanDefinition();
        genericBeanDefinition.setBeanClass(RestTemplate.class);

        HttpComponentsClientHttpRequestFactory factory = new HttpComponentsClientHttpRequestFactory();
        factory.setReadTimeout(Integer.valueOf(configurableListableBeanFactory.resolveEmbeddedValue("${rest-templates[0].read-timeout}")));
        factory.setConnectTimeout(Integer.valueOf(configurableListableBeanFactory.resolveEmbeddedValue("${rest-templates[0].connect-timeout}")));
        // etc

        ConstructorArgumentValues constructorArgumentValues = new ConstructorArgumentValues();
        constructorArgumentValues.addGenericArgumentValue(factory);
        genericBeanDefinition.setConstructorArgumentValues(constructorArgumentValues);

        String beanId = configurableListableBeanFactory.resolveEmbeddedValue("${rest-templates[0].id}");
        ((DefaultListableBeanFactory) configurableListableBeanFactory).registerBeanDefinition(beanId, genericBeanDefinition);
    }
}
application.yml:
rest-templates:
  - id: myService1
    username: chris
    password: li
    base-url: http://localhost:3000/service1
    read-timeout: 2000
    connect-timeout: 1000
  - id: myService2
    username: chris
    password: li
    base-url: http://localhost:3000/service1
    read-timeout: 2000
    connect-timeout: 1000
Accompanying test:
@RunWith(SpringRunner.class)
@SpringBootTest
public class DemoApplicationTests {

    @Autowired
    @Qualifier("myService1")
    private RestTemplate restTemplate;

    @Test
    public void demoBeanFactoryPostProcessor_shouldRegisterBean() {
        String stackOverflow =
                restTemplate.getForObject("https://stackoverflow.com/questions/57122343/how-to-create-multiple-beans-same-type-in-one-spring-boot-java-config-class", String.class);
        Assertions.assertThat(stackOverflow).contains("How to create multiple beans (same type) in one Spring Boot java config class (@Configuration)?");
    }
}
As the BeanFactoryPostProcessor is invoked before the application context is fully set up, I had to find a different way to retrieve the application properties. I used the method ConfigurableListableBeanFactory#resolveEmbeddedValue to retrieve placeholder values instead of having them injected by an @Value annotation or Environment#getProperty. Furthermore, I rewrote the property value 2s to 2000 as the HttpComponentsClientHttpRequestFactory required an int value.
You can leverage the binding mechanism that Spring Boot uses for @ConfigurationProperties. The corresponding beans can then be generated dynamically from a map of configurations:
@Configuration
public class MyConfiguration implements BeanFactoryPostProcessor, EnvironmentAware {

    private Environment environment;

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory configurableListableBeanFactory) {
        Binder.get(environment)
                .bind("com.example", FooMapProps.class)
                .get()
                .getMap()
                .forEach((name, props) -> configurableListableBeanFactory.registerSingleton(name + "FooService", new FooService(props.getGreeting())));
    }
}
In this way you can create multiple FooService beans with different configurations based on your YAML config:
com:
  example:
    map:
      hello:
        greeting: 'hello there!'
      hey:
        greeting: 'hey ho :)'
The corresponding properties class looks like this
@ConfigurationProperties(prefix = "com.example")
public class FooMapProps {

    private Map<String, FooProps> map;

    public Map<String, FooProps> getMap() {
        return map;
    }

    public void setMap(Map<String, FooProps> map) {
        this.map = map;
    }

    public static class FooProps {

        private String greeting;

        public String getGreeting() {
            return greeting;
        }

        public void setGreeting(String greeting) {
            this.greeting = greeting;
        }
    }
}
In this example, two FooService beans have been created with different greetings as constructor parameters. The beans can be accessed via the qualifiers 'helloFooService' and 'heyFooService'.
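For illustration, a hypothetical injection point (the GreetingClient class name is made up) showing how one of the dynamically registered beans could then be consumed:

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

@Service
public class GreetingClient {

    private final FooService helloFooService;

    // Matches the "helloFooService" singleton registered in postProcessBeanFactory above
    public GreetingClient(@Qualifier("helloFooService") FooService helloFooService) {
        this.helloFooService = helloFooService;
    }

    public String greet() {
        return helloFooService.getGreeting(); // assumes FooService exposes its configured greeting
    }
}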

Spring Boot connection to more than 2 data sources

I'm currently working on a Spring Boot project where I need to connect to more than 2 data sources (actually 4).
I found many examples of how to connect to 2 data sources and it works, but when I add the next one in the same way, it's not working:
...Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Not a managed type: class...
Is there any restriction on data sources?
Or is it possible to connect to more than 2 data sources?
Create as many datasources as you want. There are no restrictions I am aware of. It is just another bean. Refer here.
#first db
spring.datasource.url = [url]
spring.datasource.username = [username]
spring.datasource.password = [password]
spring.datasource.driverClassName = oracle.jdbc.OracleDriver

#second db ...
spring.secondDatasource.url = [url]
spring.secondDatasource.username = [username]
spring.secondDatasource.password = [password]
spring.secondDatasource.driverClassName = oracle.jdbc.OracleDriver

@Bean("firstds")
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource primaryDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean("secondds")
@ConfigurationProperties(prefix = "spring.secondDatasource")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
Adding to what @Surya said:
Step 1: Set up the database configuration in the application.properties file as mentioned above.
Step 2: You need to create beans. In your case you need to create 2 separate beans pointing to 2 different data sources.
@Bean("firstds")
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource primaryDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean("secondds")
@ConfigurationProperties(prefix = "spring.secondDatasource")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
You can create these 2 beans in a single class.
Now, how do you use these beans?
@Autowired
@Qualifier("firstds")
NamedParameterJdbcTemplate firstConnection;

result = firstConnection.query(your query goes here)
The above query will always hit the first DB via the bean you created.
Similarly, you can use the connection to hit DB 2 via the second bean.
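One caveat worth adding: the firstds/secondds qualifiers above name DataSource beans, so to inject a NamedParameterJdbcTemplate per pool you would normally declare one explicitly. A sketch, assuming it sits alongside the DataSource beans shown earlier (bean and class names are illustrative):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

@Configuration
public class JdbcTemplateConfig {

    // One template per pool; inject with @Qualifier("firstJdbcTemplate") / @Qualifier("secondJdbcTemplate")
    @Bean("firstJdbcTemplate")
    public NamedParameterJdbcTemplate firstJdbcTemplate(@Qualifier("firstds") DataSource firstds) {
        return new NamedParameterJdbcTemplate(firstds);
    }

    @Bean("secondJdbcTemplate")
    public NamedParameterJdbcTemplate secondJdbcTemplate(@Qualifier("secondds") DataSource secondds) {
        return new NamedParameterJdbcTemplate(secondds);
    }
}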
Thanks for your help.
Actually I did all the configuration right, but... (one small misspelling crashed it all :))
As I started including my code here I found it, fixed it, and now everything works fine.
But here is my code as an example (maybe someone will find it useful :))
All three entities and repositories (Shadow, Main, Project) are configured in exactly the same way, so I include each part only once. (The only difference is that the DataSource for Main is marked as @Primary.)
Thanks.
BR
application.properties
#-------------------------- first
spring.datasource.url=jdbc:mysql://
spring.datasource.username=
spring.datasource.password=
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
#-------------------------- second
second.datasource.url=jdbc:mysql://
second.datasource.username=
second.datasource.password=
second.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
#-------------------------- third
third.datasource.url=jdbc:mysql://
third.datasource.password=
third.datasource.username=
third.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
Shadow.java
package com.server.shadow.domain;
import ...
@Entity
@Table(name = "shadow")
public class Shadow {

    @Id
    private String id;
    private String shadow;

    public Shadow() {}

    //getters and setters
}
ShadowRepository.java
package com.server.shadow.repo;
import ...
@Repository
public interface ShadowRepository extends JpaRepository<Shadow, Long> {
}
ShadowDbConf.java
package com.server;
import ...
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "shadowEntityManagerFactory",
        transactionManagerRef = "shadowTransactionManager",
        basePackages = {"com.server.shadow.repo"}
)
public class ShadowDbConf {

    @Bean(name = "shadowDataSource")
    @ConfigurationProperties(prefix = "second.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "shadowEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean shadowEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("shadowDataSource") DataSource dataSource
    ) {
        return builder
                .dataSource(dataSource)
                .packages("com.server.shadow.domain")
                .persistenceUnit("shadow")
                .build();
    }

    @Bean(name = "shadowTransactionManager")
    public PlatformTransactionManager shadowTransactionManager(
            @Qualifier("shadowEntityManagerFactory") EntityManagerFactory shadowEntityManagerFactory
    ) {
        return new JpaTransactionManager(shadowEntityManagerFactory);
    }
}
MainController.java
package com.server;
import ...
@RestController
public class MainController {

    private final ShadowRepository shadowRepository;
    private final MainRepository mainRepository;
    private final ProjectRepository projectRepository;

    @Autowired
    MainController(ShadowRepository shadowRepository, MainRepository mainRepository, ProjectRepository projectRepository) {
        this.shadowRepository = shadowRepository;
        this.mainRepository = mainRepository;
        this.projectRepository = projectRepository;
    }

    @GetMapping(path = "/shadow")
    public @ResponseBody Iterable<Shadow> getShadows() {
        return shadowRepository.findAll();
    }
}
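For completeness, a hedged sketch of the Main configuration class the poster describes but does not include: it follows the same shape as ShadowDbConf, with the DataSource bound to spring.datasource and the beans marked @Primary. Package, bean, and persistence-unit names are guesses based on the naming pattern above, and the imports mirror ShadowDbConf.java.

MainDbConf.java (illustrative)
package com.server;
import ...
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "mainEntityManagerFactory",
        transactionManagerRef = "mainTransactionManager",
        basePackages = {"com.server.main.repo"}
)
public class MainDbConf {

    @Primary
    @Bean(name = "mainDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "mainEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean mainEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("mainDataSource") DataSource dataSource
    ) {
        return builder
                .dataSource(dataSource)
                .packages("com.server.main.domain")
                .persistenceUnit("main")
                .build();
    }

    @Primary
    @Bean(name = "mainTransactionManager")
    public PlatformTransactionManager mainTransactionManager(
            @Qualifier("mainEntityManagerFactory") EntityManagerFactory mainEntityManagerFactory
    ) {
        return new JpaTransactionManager(mainEntityManagerFactory);
    }
}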

Spring Cloud Task - specify database config

I have a Spring Cloud Task that loads data from SQL Server to Cassandra DB, which will be run on Spring Cloud Data Flow.
One of the requirements of Spring Cloud Task is to provide a relational database to persist metadata like the task execution state. But I don't want to use either of the above databases for that. Instead, I have to specify a third database for persistence. But it seems like Spring Cloud Task automatically picks up the SQL Server data source properties from application.properties. How can I specify another DB for task state persistence?
My current properties:
spring.datasource.url=jdbc:sqlserver://iphost;databaseName=dbname
spring.datasource.username=user
spring.datasource.password=password
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.jpa.show-sql=false
#spring.jpa.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
spring.jpa.hibernate.ddl-auto=none
spring.data.cassandra.contact-points=ip
spring.data.cassandra.port=9042
spring.data.cassandra.username=username
spring.data.cassandra.password=password
spring.data.cassandra.keyspace-name=mykeyspace
spring.data.cassandra.schema-action=CREATE_IF_NOT_EXISTS
Update 1:
I added the code below to point to the third database, as suggested by Michael Minella. Now Spring Cloud Task is able to connect to this DB and persist state. But now my batch job source queries are also connecting to this database. The only thing I changed was adding a datasource for the task.
spring.task.datasource.url=jdbc:postgresql://host:5432/testdb?stringtype=unspecified
spring.task.datasource.username=user
spring.task.datasource.password=password
spring.task.datasource.driverClassName=org.postgresql.Driver

@Configuration
public class DataSourceConfigs {

    @Bean(name = "taskDataSource")
    @ConfigurationProperties(prefix = "spring.task.datasource")
    public DataSource getDataSource() {
        return DataSourceBuilder.create().build();
    }
}
@Configuration
public class DDTaskConfigurer extends DefaultTaskConfigurer {

    @Autowired
    public DDTaskConfigurer(@Qualifier("taskDataSource") DataSource dataSource) {
        super(dataSource);
    }
}
Update #2:
@Component
@StepScope
public class MyItemReader extends RepositoryItemReader<Scan> implements InitializingBean {

    @Autowired
    private ScanRepository repository;

    private Integer lastScanIdPulled = null;

    public MyItemReader(Integer _lastIdPulled) {
        super();
        if (_lastIdPulled == null || _lastIdPulled <= 0) {
            lastScanIdPulled = 0;
        } else {
            lastScanIdPulled = _lastIdPulled;
        }
    }

    @PostConstruct
    protected void setUpRepo() {
        final Map<String, Sort.Direction> sorts = new HashMap<>();
        sorts.put("id", Direction.ASC);
        this.setRepository(this.repository);
        this.setSort(sorts);
        this.setMethodName("findByScanGreaterThanId");
        List<Object> methodArgs = new ArrayList<Object>();
        System.out.println("lastScanIdpulled >>> " + lastScanIdPulled);
        if (lastScanIdPulled == null || lastScanIdPulled <= 0) {
            lastScanIdPulled = 0;
        }
        methodArgs.add(lastScanIdPulled);
        this.setArguments(methodArgs);
    }
}
@Repository
public interface ScanRepository extends JpaRepository<Scan, Integer> {

    @Query("...")
    Page<Scan> findAllScan(final Pageable pageable);

    @Query("...")
    Page<Scan> findByScanGreaterThanId(int id, final Pageable pageable);
}
Update #3:
If I add a datasource config for the repository, I now get the exception below. Before you mention that one of the datasources needs to be declared @Primary: I already tried that.
Caused by: java.lang.IllegalStateException: Expected one datasource and found 2
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration.taskBatchExecutionListener(TaskBatchAutoConfiguration.java:65) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration$$EnhancerBySpringCGLIB$$baeae6b9.CGLIB$taskBatchExecutionListener$0(<generated>) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration$$EnhancerBySpringCGLIB$$baeae6b9$$FastClassBySpringCGLIB$$5a898c9.invoke(<generated>) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228) ~[spring-core-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:358) ~[spring-context-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfigu
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "myEntityManagerFactory",
        basePackages = { "com.company.dd.collector.tool" },
        transactionManagerRef = "TransactionManager"
)
public class ToolDbConfig {

    @Bean(name = "myEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean myEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("ToolDataSource") DataSource dataSource
    ) {
        return builder
                .dataSource(dataSource)
                .packages("com.company.dd.collector.tool")
                .persistenceUnit("tooldatasource")
                .build();
    }

    @Bean(name = "myTransactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("myEntityManagerFactory") EntityManagerFactory entityManagerFactory
    ) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}

@Configuration
public class DataSourceConfigs {

    @Bean(name = "taskDataSource")
    @ConfigurationProperties(prefix = "spring.task.datasource")
    public DataSource getDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "ToolDataSource")
    @ConfigurationProperties(prefix = "tool.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}
You need to create a TaskConfigurer to specify the DataSource to be used. You can read about this interface in the documentation here: https://docs.spring.io/spring-cloud-task/1.1.1.RELEASE/reference/htmlsingle/#features-task-configurer
The javadoc can be found here: https://docs.spring.io/spring-cloud-task/docs/current/apidocs/org/springframework/cloud/task/configuration/TaskConfigurer.html
UPDATE 1:
When using more than one DataSource, both Spring Batch and Spring Cloud Task follow the same paradigm in that they both have *Configurer interfaces that need to be used to specify what DataSource to use. For Spring Batch, you use the BatchConfigurer (typically by just extending the DefaultBatchConfigurer) and as noted above, the TaskConfigurer is used in Spring Cloud Task. This is because when there is more than one DataSource, the framework has no way of knowing which one to use.
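To make that concrete for the batch side of this application, a minimal sketch (the DDBatchConfigurer class name is illustrative) that mirrors the DDTaskConfigurer shown earlier, assuming the Spring Batch metadata should also live in the task database:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

@Configuration
public class DDBatchConfigurer extends DefaultBatchConfigurer {

    // Tell Spring Batch explicitly which DataSource holds its metadata tables,
    // so it does not have to guess when several DataSources are defined.
    @Autowired
    public DDBatchConfigurer(@Qualifier("taskDataSource") DataSource dataSource) {
        super(dataSource);
    }
}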

Override Default Datasource getConnection()

I am trying to convert a Spring application (for the most part) to a Spring Boot application. In the app, I have an HTTP basic filter that collects a username and password, which are then passed as variables into a DataSource implementation.
In this DataSource, the getConnection() method looks like this:
@Override
public Connection getConnection() throws SQLException {
    Statement stmt = null;
    try {
        ConnectionWrapper connection = this.authenticatedConnection.get();
        if (connection == null) {
            connection = new ConnectionWrapper(this.dataSource.getConnection());
            StringBuilder command;
            // The CONNECT command allows indicating a user name, a password
            // and a database to initiate a new session in the server with a new profile.
            command = new StringBuilder("CONNECT USER ").append(this.parameters.get().get(USER_NAME)).append(" PASSWORD ")
                    .append("'").append(this.parameters.get().get(PASSWORD_NAME)).append("'").append(" DATABASE ")
                    .append(this.parameters.get().get(DATA_BASE_NAME));
            this.authenticatedConnection.set(connection);
            stmt = connection.createStatement();
            stmt.execute(command.toString());
        }
        return connection;
    } catch (final SQLException e) {...
In Spring, I am able to implement @Autowired private DataSource dataSource without a problem. In Spring Boot, as I understand it, the object needs to be a bean to use @Autowired, but when I add @Bean before this implemented DataSource I get "The annotation @Bean is disallowed for this location".
How can I get it so that I can do a dataSource.getConnection() and get a connection from the primary DataSource, or be able to override the methods of the primary DataSource?
The way I see it, there are 4 possible solutions, listed here in order of preference:
1. Create a DataSource that actually overrides the spring.datasource's methods.
2. Get this implementation "beanified" so I can just @Autowired the dataSource again.
3. I think I can skip the @Autowired and simply set this.dataSource = [unknown reference to spring.datasource defined in application.properties].
4. Create another DataSource class ProgrammedDataSource configured with the spring.datasource properties, then set it as this.dataSource = new ProgrammedDataSource();
But my attempts at implementing any of these solutions have produced this question.
I figured it out. I didn't need to make the bean there, although I am still not sure why I was not allowed to put @Bean before the DataSource, but regardless.
In the application I had:
public class ServiceApplication {

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "AuthDataSource")
    public DataSource authDataSource() {
        return new AuthDataSource();
    }

    public static void main(String[] args) {
        SpringApplication.run(ServiceApplication.class, args);
    }
}
and in the controller I had:
@Controller
@RequestMapping("/service")
public class ServiceController {

    @Autowired
    public void MyBean(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = new JdbcTemplate(new AuthDataSource());
    } ...
However, since I was calling new AuthDataSource() inside that JdbcTemplate, it was not doing the autowiring. Now the controller looks like this and it works:
@Controller
@RequestMapping("/service")
public class ServiceController {

    @Autowired
    @Qualifier("AuthDataSource")
    private DataSource dataSource;

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void MyBean(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = new JdbcTemplate(this.dataSource);
    } ...
