Override Default Datasource getConnection() - spring-boot

I am trying to convert a Spring application (for the most part) to a Spring Boot application. In the app, I have an HTTP basic filter that collects a username and password, which are then passed as variables into a custom DataSource implementation.
In this DataSource, the getConnection() method looks like this:
@Override
public Connection getConnection() throws SQLException {
    Statement stmt = null;
    try {
        ConnectionWrapper connection = this.authenticatedConnection.get();
        if (connection == null) {
            connection = new ConnectionWrapper(this.dataSource.getConnection());
            StringBuilder command;
            // The CONNECT command allows indicating a user name, a password
            // and a database to initiate a new session in the server with a new profile.
            command = new StringBuilder("CONNECT USER ").append(this.parameters.get().get(USER_NAME)).append(" PASSWORD ")
                    .append("'").append(this.parameters.get().get(PASSWORD_NAME)).append("'").append(" DATABASE ")
                    .append(this.parameters.get().get(DATA_BASE_NAME));
            this.authenticatedConnection.set(connection);
            stmt = connection.createStatement();
            stmt.execute(command.toString());
        }
        return connection;
    } catch (final SQLException e) {...
In Spring, I am able to use @Autowired private DataSource dataSource without a problem. In Spring Boot, as I understand it, the object needs to be a bean to use @Autowired, but when I add @Bean to this implemented DataSource I get "The annotation @Bean is disallowed for this location".
How can I get it so that I can call dataSource.getConnection() and get a connection from the primary DataSource, or be able to override the methods of the primary DataSource?
The way I see it, there are four possible solutions, listed here in order of preference:
1. Create a DataSource that actually overrides the spring.datasource's methods.
2. Get this implementation "beanified" so I can just @Autowired the dataSource again.
3. Skip the @Autowired and simply set this.dataSource = [unknown reference to the spring.datasource defined in application.properties].
4. Create another DataSource class, ProgrammedDataSource, configured with the spring.datasource properties, then set it as this.dataSource = new ProgrammedDataSource();
My attempts at implementing any of these solutions have produced this question.

I figured it out. I didn't need to create the bean there, although I am still not sure why I was not allowed to put @Bean on the DataSource.
In the application I had:
public class ServiceApplication {

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "AuthDataSource")
    public DataSource authDataSource() {
        return new AuthDataSource();
    }

    public static void main(String[] args) {
        SpringApplication.run(ServiceApplication.class, args);
    }
}
and in the controller I had:
@Controller
@RequestMapping("/service")
public class ServiceController {

    @Autowired
    public void MyBean(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = new JdbcTemplate(new AuthDataSource());
    } ...
However, since I was calling new AuthDataSource() inside that JdbcTemplate, Spring was not doing any autowiring. Now the controller looks like this and it works:
@Controller
@RequestMapping("/service")
public class ServiceController {

    @Autowired
    @Qualifier("AuthDataSource")
    private DataSource dataSource;

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void MyBean(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = new JdbcTemplate(this.dataSource);
    } ...
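A further simplification (my sketch, not part of the original answer; it reuses the AuthDataSource bean name from the configuration above) would be to build the JdbcTemplate once in the configuration class and inject only the template into the controller:
@Bean
public JdbcTemplate authJdbcTemplate(@Qualifier("AuthDataSource") DataSource dataSource) {
    // Wires the qualified DataSource into a JdbcTemplate bean, so controllers
    // can @Autowired the template without touching the DataSource directly.
    return new JdbcTemplate(dataSource);
}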

Related

Spring Disable #Transactional from Configuration java file

I have a code base which is used by two different applications. Some of my Spring service classes have the annotation @Transactional. On server start I would like to disable @Transactional based on some configuration.
Below is my configuration class:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
public class WebAppConfig {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";

    @Resource
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        // url, userId and password were undefined in the original post;
        // presumably they come from the environment as well, e.g.:
        dataSource.setUrl(env.getRequiredProperty("db.url"));
        dataSource.setUsername(env.getRequiredProperty("db.username"));
        dataSource.setPassword(env.getRequiredProperty("db.password"));
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager txManager() {
        DefaultTransactionDefinition def = new DefaultTransactionDefinition();
        def.setIsolationLevel(TransactionDefinition.ISOLATION_DEFAULT);
        if (appName.equals("ABC")) { // appName is defined elsewhere in the original post
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_NEVER);
        } else {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
        }
        CustomDataSourceTransactionManager txM = new CustomDataSourceTransactionManager(def);
        txM.setDataSource(dataSource());
        return txM;
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource());
        return jdbcTemplate;
    }
}
I am trying to override methods in DataSourceTransactionManager to get this behavior, but it still tries to commit/rollback the transaction at the end of the transaction. Since no database connection is available, it throws an exception.
If I keep @Transactional(propagation=Propagation.NEVER), everything works perfectly, but I cannot modify it, as another app using the same code base needs transactions in that case.
I would like to know if there is a way to fully disable transactions from configuration without modifying the @Transactional annotations.
I'm not sure if it will work, but you can try to implement a custom TransactionInterceptor and override the method that wraps an invocation in a transaction, removing the transactional logic. Something like this:
public class NoOpTransactionInterceptor extends TransactionInterceptor {

    @Override
    protected Object invokeWithinTransaction(
            Method method,
            Class<?> targetClass,
            InvocationCallback invocation
    ) throws Throwable {
        // Simply invoke the original unwrapped code
        return invocation.proceedWithInvocation();
    }
}
Then you declare a conditional bean in one of your @Configuration classes:
// assuming this property is stored in the Spring application properties file
@ConditionalOnProperty(name = "turnOffTransactions", havingValue = "true")
@Bean
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public TransactionInterceptor transactionInterceptor(
        /* the default bean would be injected here */
        TransactionAttributeSource transactionAttributeSource
) {
    TransactionInterceptor interceptor = new NoOpTransactionInterceptor();
    interceptor.setTransactionAttributeSource(transactionAttributeSource);
    return interceptor;
}
You will probably need some additional configuration; I can't verify that right now.
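For example, the application that must run without transactions would then set the property named in the @ConditionalOnProperty above:
# application.properties (only in the app that must not open transactions)
turnOffTransactions=true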

How to register Hibernate Spring entity listeners

I have built an entity listener but have not figured out how to register it so that it will get called. This all runs, and I verified in the debugger that the registration code executes (apparently successfully) at startup. But the debugger never stops in the listener code.
This is my listener:
public class DirtyAwareListener implements PostLoadEventListener
{
    @Override
    public void onPostLoad(PostLoadEvent postLoadEvent)
    {
        if (postLoadEvent.getEntity() instanceof DirtyAware)
        {
            ((DirtyAware) postLoadEvent.getEntity()).commitFields();
        }
    }
}
and this is the registration component:
@Component
public class HibernateListenerConfigurer
{
    @PersistenceUnit
    private EntityManagerFactory emf;

    @Autowired
    private SessionFactory sessionFactory;

    @PostConstruct
    protected void init()
    {
        DirtyAwareListener listener = new DirtyAwareListener();
        // SessionFactoryImpl sessionFactory = emf.unwrap(SessionFactoryImpl.class);
        EventListenerRegistry registry = ((SessionFactoryImpl) sessionFactory).getServiceRegistry().getService(EventListenerRegistry.class);
        registry.getEventListenerGroup(EventType.POST_LOAD).appendListener(listener);
    }
}
Here is how my general Hibernate configuration code generates a session factory:
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource(getDataSource());
sessionFactory.setPackagesToScan("com.my.entities");
sessionFactory.setHibernateProperties(getHibernateProperties());
sessionFactory.setEntityInterceptor(new DirtyAwareInterceptor());
return sessionFactory;
Note that the interceptor does work as expected (but unfortunately does not have hooks where I need them).
To add entity listeners, implement org.hibernate.integrator.spi.Integrator. See this example: https://www.boraji.com/hibernate-5-event-listener-example
I got this working as desired using the Integrator approach Anton suggested. The link in his answer did not provide enough information for me to get it working; I had to reference multiple posts and do a bit of trial and error. Since I could not find a single post with all the info, here is how I did it:
The listener code is the same as above. The Configurer code is no longer needed, so I deleted it. Here is the new Integrator code:
@Component
public class EventListenerIntegrator implements Integrator
{
    @Override
    public void integrate(Metadata metadata, SessionFactoryImplementor sessionFactoryImplementor, SessionFactoryServiceRegistry sessionFactoryServiceRegistry)
    {
        EventListenerRegistry eventListenerRegistry =
                sessionFactoryServiceRegistry.getService(EventListenerRegistry.class);
        DirtyAwareListener t = new DirtyAwareListener();
        eventListenerRegistry.getEventListenerGroup(EventType.POST_LOAD).appendListener(t);
    }

    @Override
    public void disintegrate(SessionFactoryImplementor sessionFactoryImplementor, SessionFactoryServiceRegistry sessionFactoryServiceRegistry) {}
}
And here is the revised getSessionFactory method on my @Configuration class:
private static SessionFactory sessionFactory = null;

@Bean
public SessionFactory getSessionFactory()
{
    if (sessionFactory == null)
    {
        BootstrapServiceRegistry bootstrapRegistry =
                new BootstrapServiceRegistryBuilder()
                        .applyIntegrator(new EventListenerIntegrator())
                        .build();
        StandardServiceRegistryBuilder registryBuilder =
                new StandardServiceRegistryBuilder(bootstrapRegistry);
        registryBuilder.applySetting(org.hibernate.cfg.Environment.DATASOURCE, getDataSource());
        registryBuilder.applySettings(getHibernateProperties());
        StandardServiceRegistry registry = registryBuilder.build();
        MetadataSources sources = new MetadataSources(registry).addPackage("com.my.entities");
        sources.addAnnotatedClass(User.class);
        Metadata metadata = sources.getMetadataBuilder().build();
        sessionFactory = metadata.getSessionFactoryBuilder().build();
    }
    return sessionFactory;
}
Note: I think the addPackage call is not needed and does not do anything. I had hoped it would do the package scan the old code was doing, but it does not. I simply changed that to explicitly add each annotated class.
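If you do want the package scan back, here is a minimal sketch (my suggestion, not from the original post; it assumes the entities carry javax.persistence.Entity) that scans the classpath with Spring and feeds each class to MetadataSources in place of the addAnnotatedClass call:
// Class.forName throws ClassNotFoundException, so handle or declare it.
ClassPathScanningCandidateComponentProvider scanner =
        new ClassPathScanningCandidateComponentProvider(false);
scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
for (BeanDefinition bd : scanner.findCandidateComponents("com.my.entities")) {
    sources.addAnnotatedClass(Class.forName(bd.getBeanClassName()));
}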

Spring Java-based config "chicken and egg" situation

I just recently started looking into Spring, specifically its newer features like Java config.
I have this somewhat strange issue:
Java config snippet:
@Configuration
@ImportResource({"classpath*:application-context.xml", "classpath:ApplicationContext_Output.xml"})
@Import(SpringJavaConfig.class)
@ComponentScan(excludeFilters = {@ComponentScan.Filter(org.springframework.stereotype.Controller.class)}, basePackages = " com.xx.xx.x2.beans")
public class ApplicationContextConfig extends WebMvcConfigurationSupport {

    private static final Log log = LogFactory.getLog(ApplicationContextConfig.class);

    @Autowired
    private Environment env;

    @Autowired
    private IExtendedDataSourceConfig dsconfig;

    @PostConstruct
    public void initApp() {
        ...
    }

    @Bean(name = "transactionManagerOracle")
    @Lazy
    public DataSourceTransactionManager transactionManagerOracle() {
        return new DataSourceTransactionManager(dsconfig.oracleDataSource());
    }
IExtendedDataSourceConfig has two implementations; based on the active Spring profile, one or the other is instantiated. For this example, let's say this is the implementation:
@Configuration
@PropertySources(value = {
        @PropertySource("classpath:MYUI.properties")})
@Profile("dev")
public class MYDataSourceConfig implements IExtendedDataSourceConfig {

    private static final Log log = LogFactory.getLog(MYDataSourceConfig.class);

    @Resource
    private Environment env;

    public MYDataSourceConfig() {
        log.info("creating dev datasource");
    }

    @Bean
    public DataSource oracleDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        dataSource.setUrl(env.getProperty("oracle.url"));
        dataSource.setUsername(env.getProperty("oracle.user"));
        dataSource.setPassword(env.getProperty("oracle.pass"));
        return dataSource;
    }
The problem is that when the transactionManagerOracle bean is called (even if I try to mark it @Lazy), the dsconfig variable is null.
I guess the @Bean methods are processed first and then all the @Autowired injections. Is there a fix for this? How do I either tell Spring to inject dsconfig before creating the beans, or somehow create the @Bean methods after dsconfig is injected?
You can just specify the DataSource as a method parameter for the transaction manager bean. Spring will then automatically inject the datasource that is configured in the active profile:
@Bean(name = "transactionManagerOracle")
@Lazy
public DataSourceTransactionManager transactionManagerOracle(DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}
If you still want to do this through the configuration class, specify it as a parameter instead:
public DataSourceTransactionManager transactionManagerOracle(IExtendedDataSourceConfig dsconfig) {
    return new DataSourceTransactionManager(dsconfig.oracleDataSource());
}
Either way you declare a direct dependency on another bean, and Spring will make sure that the dependent bean exists and is injected before this one is created.

Multiple datasources in a Spring Boot application

I'm trying to use two database connections within a Spring Boot (v1.2.3) application as described in the docs (http://docs.spring.io/spring-boot/docs/1.2.3.RELEASE/reference/htmlsingle/#howto-two-datasources).
The problem seems to be that the secondary datasource is getting constructed with the properties of the primary datasource.
Can someone point out what I'm missing here?
@SpringBootApplication
class Application {

    @Bean
    @ConfigurationProperties(prefix="spring.datasource.secondary")
    public DataSource secondaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public JdbcTemplate secondaryJdbcTemplate(DataSource secondaryDataSource) {
        return new JdbcTemplate(secondaryDataSource)
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix="spring.datasource.primary")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "primaryJdbcTemplate")
    public JdbcTemplate jdbcTemplate(DataSource primaryDataSource) {
        return new JdbcTemplate(primaryDataSource)
    }

    static void main(String[] args) {
        SpringApplication.run Application, args
    }
}
application.properties:
spring.datasource.primary.url=jdbc:oracle:thin:@example.com:1521:DB1
spring.datasource.primary.username=user1
spring.datasource.primary.password=
spring.datasource.primary.driverClassName=oracle.jdbc.OracleDriver
spring.datasource.secondary.url=jdbc:oracle:thin:@example.com:1521:DB2
spring.datasource.secondary.username=user2
spring.datasource.secondary.password=
spring.datasource.secondary.driverClassName=oracle.jdbc.OracleDriver
Both JdbcTemplate beans will be created with the primary DataSource. You can use @Qualifier to have the secondary DataSource injected into the secondary JdbcTemplate. Alternatively, you could call the DataSource methods directly when creating the JdbcTemplate beans.
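For example, a sketch of the @Qualifier approach (the qualifier value matches the secondaryDataSource bean method above):
@Bean
public JdbcTemplate secondaryJdbcTemplate(@Qualifier("secondaryDataSource") DataSource secondaryDataSource) {
    // Resolves the injection by bean name instead of falling back to the @Primary DataSource
    return new JdbcTemplate(secondaryDataSource);
}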

spring-boot-starter-jta-atomikos and spring-boot-starter-batch

Is it possible to use both these starters in a single application?
I want to load records from a CSV file into a database table. The Spring Batch tables are stored in a different database, so I assume I need to use JTA to handle the transaction.
Whenever I add @EnableBatchProcessing to my @Configuration class it configures a PlatformTransactionManager, which stops it from being auto-configured by Atomikos.
Are there any Spring Boot + Batch + JTA samples out there that show how to do this?
Many Thanks,
James
I just went through this and found something that seems to work. As you note, @EnableBatchProcessing causes a DataSourceTransactionManager to be created, which messes up everything. I'm using modular=true in @EnableBatchProcessing, so the ModularBatchConfiguration class is activated.
What I did was stop using @EnableBatchProcessing and instead copy the entire ModularBatchConfiguration class into my project. Then I commented out the transactionManager() method, since the Atomikos configuration creates the JtaTransactionManager. I also had to override the jobRepository() method, because that was hardcoded to use the DataSourceTransactionManager created inside DefaultBatchConfiguration.
I also had to explicitly import the JtaAutoConfiguration class. This wires everything up correctly (according to the Actuator's "beans" endpoint - thank god for that). But when you run it, the transaction manager throws an exception because something somewhere sets an explicit transaction isolation level. So I also wrote a BeanPostProcessor to find the transaction manager and call txnMgr.setAllowCustomIsolationLevels(true).
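A minimal sketch of such a BeanPostProcessor (my reconstruction, not the poster's actual code; it assumes the Atomikos manager is exposed as a Spring JtaTransactionManager bean):
@Component
public class AllowCustomIsolationLevelsPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) {
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        // Relax the isolation level check on the JTA transaction manager
        if (bean instanceof JtaTransactionManager) {
            ((JtaTransactionManager) bean).setAllowCustomIsolationLevels(true);
        }
        return bean;
    }
}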
Now everything works, but while the job is running I cannot fetch the current data from the batch_step_execution table using JdbcTemplate, even though I can see the data in SQLYog. This must have something to do with transaction isolation, but I haven't been able to understand it yet.
Here is what I have for my configuration class, copied from Spring and modified as noted above. PS: I have the DataSource that points to the database with the batch tables annotated as @Primary. Also, I changed my DataSource beans to be instances of org.apache.tomcat.jdbc.pool.XADataSource; I'm not sure if that's necessary.
@Configuration
@Import(ScopeConfiguration.class)
public class ModularJtaBatchConfiguration implements ImportAware
{
    @Autowired(required = false)
    private Collection<DataSource> dataSources;

    private BatchConfigurer configurer;

    @Autowired
    private ApplicationContext context;

    @Autowired(required = false)
    private Collection<BatchConfigurer> configurers;

    private AutomaticJobRegistrar registrar = new AutomaticJobRegistrar();

    @Bean
    public JobRepository jobRepository(DataSource batchDataSource, JtaTransactionManager jtaTransactionManager) throws Exception
    {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(jtaTransactionManager);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        return getConfigurer(configurers).getJobLauncher();
    }

    // @Bean
    // public PlatformTransactionManager transactionManager() throws Exception {
    //     return getConfigurer(configurers).getTransactionManager();
    // }

    @Bean
    public JobExplorer jobExplorer() throws Exception {
        return getConfigurer(configurers).getJobExplorer();
    }

    @Bean
    public AutomaticJobRegistrar jobRegistrar() throws Exception {
        registrar.setJobLoader(new DefaultJobLoader(jobRegistry()));
        for (ApplicationContextFactory factory : context.getBeansOfType(ApplicationContextFactory.class).values()) {
            registrar.addApplicationContextFactory(factory);
        }
        return registrar;
    }

    @Bean
    public JobBuilderFactory jobBuilders(JobRepository jobRepository) throws Exception {
        return new JobBuilderFactory(jobRepository);
    }

    @Bean
    // hopefully this will autowire the Atomikos JTA txn manager
    public StepBuilderFactory stepBuilders(JobRepository jobRepository, JtaTransactionManager ptm) throws Exception {
        return new StepBuilderFactory(jobRepository, ptm);
    }

    @Bean
    public JobRegistry jobRegistry() throws Exception {
        return new MapJobRegistry();
    }

    @Override
    public void setImportMetadata(AnnotationMetadata importMetadata) {
        AnnotationAttributes enabled = AnnotationAttributes.fromMap(importMetadata.getAnnotationAttributes(
                EnableBatchProcessing.class.getName(), false));
        Assert.notNull(enabled,
                "@EnableBatchProcessing is not present on importing class " + importMetadata.getClassName());
    }

    protected BatchConfigurer getConfigurer(Collection<BatchConfigurer> configurers) throws Exception {
        if (this.configurer != null) {
            return this.configurer;
        }
        if (configurers == null || configurers.isEmpty()) {
            if (dataSources == null || dataSources.isEmpty()) {
                throw new UnsupportedOperationException("You are screwed");
            } else if (dataSources.size() == 1) {
                DataSource dataSource = dataSources.iterator().next();
                DefaultBatchConfigurer configurer = new DefaultBatchConfigurer(dataSource);
                configurer.initialize();
                this.configurer = configurer;
                return configurer;
            } else {
                throw new IllegalStateException("To use the default BatchConfigurer the context must contain no more than " +
                        "one DataSource, found " + dataSources.size());
            }
        }
        if (configurers.size() > 1) {
            throw new IllegalStateException(
                    "To use a custom BatchConfigurer the context must contain precisely one, found "
                            + configurers.size());
        }
        this.configurer = configurers.iterator().next();
        return this.configurer;
    }
}
@Configuration
class ScopeConfiguration {

    private StepScope stepScope = new StepScope();

    private JobScope jobScope = new JobScope();

    @Bean
    public StepScope stepScope() {
        stepScope.setAutoProxy(false);
        return stepScope;
    }

    @Bean
    public JobScope jobScope() {
        jobScope.setAutoProxy(false);
        return jobScope;
    }
}
I found a solution where I was able to keep @EnableBatchProcessing but had to implement BatchConfigurer and the Atomikos beans myself; see my full answer in this SO answer.
