Spring Batch not deserialising dates

I'm adding an object in the JobExecution context of spring batch, which contains an Instant field.
It's getting serialised as follows:
{
  "startFrom": {
    "nano": 0,
    "epochSecond": 1541116800
  }
}
However, Spring Batch doesn't seem to be able to deserialise it.
Caused by: java.lang.IllegalArgumentException: Unable to deserialize the execution context
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:325)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:309)
at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:93)
at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:60)
at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:667)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:605)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:657)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:688)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:700)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:756)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.getExecutionContext(JdbcExecutionContextDao.java:112)
at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobExecutionDependencies(SimpleJobExplorer.java:202)
at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobExecutions(SimpleJobExplorer.java:83)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
at org.springframework.aop.framework.ReflectiveMethod
While doing some research, I saw that Jackson has a JavaTimeModule to serialise/deserialise Instant and other date classes.
However, the Jackson2ExecutionContextStringSerializer class creates its ObjectMapper as follows, without registering that module:
public Jackson2ExecutionContextStringSerializer() {
    this.objectMapper = new ObjectMapper();
    this.objectMapper.configure(MapperFeature.DEFAULT_VIEW_INCLUSION, false);
    this.objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
    this.objectMapper.enableDefaultTyping();
    this.objectMapper.registerModule(new JobParametersModule());
}
Is there a reason for Spring Batch not to use the autowired ObjectMapper? Or a reason for them not to register the JavaTimeModule?
Is there maybe a workaround for this issue?
Thanks!
Edit:
I've found how to override this object mapper:
@Bean
public JobRepository createJobRepository() throws Exception {
    ObjectMapper objectMapper = new ObjectMapper().registerModule(new JavaTimeModule()).findAndRegisterModules();
    Jackson2ExecutionContextStringSerializer defaultSerializer = new Jackson2ExecutionContextStringSerializer();
    defaultSerializer.setObjectMapper(objectMapper);
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setSerializer(defaultSerializer);
    factory.afterPropertiesSet();
    return factory.getObject();
}
However, even with this, the issue persists.

The following worked for me. I am extending Mahmoud Ben Hassine's answer above, which didn't work for Nicolas Widart (I don't have enough reputation to just comment).
@Bean
public BatchConfigurer configurer(DataSource dataSource, PlatformTransactionManager transactionManager,
        ObjectMapper objectMapper) {
    return new DefaultBatchConfigurer(dataSource) {

        final Jackson2ExecutionContextStringSerializer serializer = new Jackson2ExecutionContextStringSerializer();

        @Override
        protected JobRepository createJobRepository() throws Exception {
            serializer.setObjectMapper(objectMapper);
            JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
            factory.setDataSource(dataSource);
            factory.setTransactionManager(transactionManager);
            factory.setSerializer(serializer);
            factory.afterPropertiesSet();
            return factory.getObject();
        }

        @Override
        protected JobExplorer createJobExplorer() throws Exception {
            serializer.setObjectMapper(objectMapper);
            JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
            jobExplorerFactoryBean.setSerializer(serializer);
            jobExplorerFactoryBean.setDataSource(dataSource);
            jobExplorerFactoryBean.afterPropertiesSet();
            return jobExplorerFactoryBean.getObject();
        }
    };
}
(I have defined an ObjectMapper bean in another @Configuration class.) The important thing here is to also override the createJobExplorer() method and set the correct ExecutionContextSerializer there as well, as this method calls getTarget() on the JobExplorerFactoryBean, which sets a vanilla ExecutionContextSerializer if none has been set, and that causes the error you described.

The code you showed is the default initialization of the serializer. You can override it by providing a custom object mapper that you pre-configure with the modules you need. Here is an example:
@Bean
public JobRepository jobRepository() throws Exception {
    ObjectMapper objectMapper = null; // configure the object mapper as required
    Jackson2ExecutionContextStringSerializer serializer = new Jackson2ExecutionContextStringSerializer();
    serializer.setObjectMapper(objectMapper);
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    jobRepositoryFactoryBean.setSerializer(serializer);
    // set other properties on the jobRepositoryFactoryBean
    jobRepositoryFactoryBean.afterPropertiesSet();
    return jobRepositoryFactoryBean.getObject();
}
Hope this helps.

UPDATE
Spring Batch now has a fix that allows you to customise your ObjectMapper while still getting all of the default setup that Jackson2ExecutionContextStringSerializer does internally. The issue was addressed in https://jira.spring.io/browse/BATCH-2828.
The fix is in 4.2.0, which at the time of writing is at RC1.
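I haven't verified the exact 4.2 API surface, but based on the issue resolution a sketch of the intended usage might look like this (assuming setObjectMapper now layers the serializer's own defaults on top of the customised mapper):
@Bean
public ExecutionContextSerializer executionContextSerializer() {
    // Assumption: with the BATCH-2828 fix, the serializer re-applies its own
    // defaults (default typing, job parameters handling) on top of this mapper,
    // so we only register the extra modules we need.
    ObjectMapper objectMapper = new ObjectMapper().registerModule(new KotlinModule());
    Jackson2ExecutionContextStringSerializer serializer = new Jackson2ExecutionContextStringSerializer();
    serializer.setObjectMapper(objectMapper);
    return serializer;
}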
Original answer
I've noticed similar problems. I need to customise the ObjectMapper to add the KotlinModule for dealing with Kotlin data classes. Unfortunately, the way Jackson2ExecutionContextStringSerializer is currently written is not very extensible. As you have seen from the default ObjectMapper that it creates, some defaults were added, including a fix for deserialisation issues with job parameters: https://jira.spring.io/browse/BATCH-2680.
The JobParametersModule they set up is a private class, so we cannot set up an ObjectMapper with the same features and add to it.
I have raised an issue for this in Jira: https://jira.spring.io/browse/BATCH-2828
The dirty workaround until this is fixed is to copy and paste the source code for JobParametersModule so you can register the same module in your override.
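As a rough sketch of that workaround (MyJobParametersModule is a hypothetical name for your copy of the private class), the override mirrors the default initialisation shown in the question:
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.configure(MapperFeature.DEFAULT_VIEW_INCLUSION, false);
objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
objectMapper.enableDefaultTyping();
objectMapper.registerModule(new MyJobParametersModule()); // hypothetical copy of the private JobParametersModule
objectMapper.registerModule(new KotlinModule());          // the extra module we actually need
Jackson2ExecutionContextStringSerializer serializer = new Jackson2ExecutionContextStringSerializer();
serializer.setObjectMapper(objectMapper);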


Spring Kafka @SendTo throws exception: a KafkaTemplate is required to support replies

I'm trying to get the consumer result, according to the Spring Kafka docs.
Based on this Stack Overflow question, it should be possible to do this only by using the @SendTo annotation, because Spring Boot "also auto configures a kafka template if there is not one already in the context."
But I can't get it to work; I still get
java.lang.IllegalStateException: a KafkaTemplate is required to support replies
at org.springframework.util.Assert.state(Assert.java:73) ~[spring-core-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.kafka.config.MethodKafkaListenerEndpoint.createMessageListener(MethodKafkaListenerEndpoint.java:156)
...
This is my listener method
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public List<LedgerEntry> consume(Invoice invoice) throws IOException {
    // do some processing
    var ledgerCredit = new LedgerEntry(invoice.getAmount(), "Credit side", 0, "");
    var ledgerDebit = new LedgerEntry(0, "", invoice.getAmount(), "Debit side");
    return List.of(ledgerCredit, ledgerDebit);
}
What did I miss?
This is the only @Configuration file I have on the consumer.
The consumer and producer are separate systems (e.g. a payment system produces invoices to Kafka; my program is an accounting system that takes that data and creates ledger entries).
@Configuration
public class KafkaConfig {

    @Autowired
    private KafkaProperties kafkaProperties;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        var properties = kafkaProperties.buildConsumerProperties();
        properties.put(ConsumerConfig.METADATA_MAX_AGE_CONFIG, "600000");
        return new DefaultKafkaConsumerFactory<>(properties);
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
application.yml
spring:
  kafka:
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
Trial and error 1
If I disable KafkaConfig, or enable debug during the run, I get this error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.accounting.kafkaconsumer.entity.LedgerEntry to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class com.accounting.kafkaconsumer.entity.LedgerEntry cannot be cast to class java.lang.String (com.accounting.kafkaconsumer.entity.LedgerEntry is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55) ~[kafka-clients-2.0.1.jar:na]
...
Trial and error 2
If I disable KafkaConfig and use this signature (returning String), it works. But this is not what I expect, since my configuration is in KafkaConfig:
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public String consume(Invoice invoice) throws IOException {
    // do some processing
    var listLedger = List.of(ledgerCredit, ledgerDebit);
    return objectMapper.writeValueAsString(listLedger);
}
I think the problem is here (in KafkaConfig): since I create a new instance of KafkaListenerContainerFactory, the replyTemplate is null.
What is the correct way to set up my KafkaConfig?
@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
    var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
If you override Boot's auto-configured container factory then it won't be... auto-configured, including applying the template. When you define your own factory, you are responsible for configuring it. It's not clear why you are overriding Boot's kafkaListenerContainerFactory bean, since all you are doing is injecting the consumer factory. Just remove that @Bean and use Boot's.
If you do override Boot's kafkaListenerContainerFactory, make sure that you set the reply template:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(KafkaTemplate<String, Object> kafkaTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setReplyTemplate(kafkaTemplate); // <============
    return factory;
}
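As a side note on Trial and error 1 above: the ClassCastException comes from the default StringSerializer trying to serialise a LedgerEntry. One option (an assumption on my part, not part of the original answer) is to switch Boot's auto-configured serde to JSON via properties, for example:
spring:
  kafka:
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        # assumed package, taken from the entity class in the stack trace
        spring.json.trusted.packages: "com.accounting.kafkaconsumer.entity"
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer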

Spring Batch Multiple DataSource Session/EntityManager Closed

I am trying to pull records from a SQL Server database and persist them into MySQL using Spring Boot and Spring Batch (JpaPagingItemReader and JpaItemWriter).
I have configured multiple data sources.
However, I am facing the error below.
org.springframework.batch.item.ItemStreamException: Error while closing item reader
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.close(AbstractItemCountingItemStreamItemReader.java:138)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.DisposableBeanAdapter.invokeCustomDestroyMethod(DisposableBeanAdapter.java:337)
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:271)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:571)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:543)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:957)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:504)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:964)
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1041)
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1017)
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:967)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.start(CommandLineJobRunner.java:377)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.main(CommandLineJobRunner.java:597)
at net.com.org.batch.MyApplication.main(MyApplication.java:15)
Caused by: java.lang.IllegalStateException: Session/EntityManager is closed
at org.hibernate.internal.AbstractSharedSessionContract.checkOpen(AbstractSharedSessionContract.java:344)
at org.hibernate.engine.spi.SharedSessionContractImplementor.checkOpen(SharedSessionContractImplementor.java:137)
at org.hibernate.internal.AbstractSharedSessionContract.checkOpenOrWaitingForAutoClose(AbstractSharedSessionContract.java:350)
at org.hibernate.internal.SessionImpl.close(SessionImpl.java:413)
at org.springframework.batch.item.database.JpaPagingItemReader.doClose(JpaPagingItemReader.java:232)
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.close(AbstractItemCountingItemStreamItemReader.java:135)
... 17 common frames omitted
20:10:55.875 [main] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Retrieved dependent beans for bean 'jpaMappingContext': [mySqljobRepository, sqlServerLogsRepository]
Below is my batch and step configuration:
@Autowired
private StepBuilderFactory stepBuilderFactory;

@Autowired
@Qualifier("mysqlEntityManager")
private LocalContainerEntityManagerFactoryBean mysqlLocalContainerEntityManagerFactoryBean;

@Autowired
@Qualifier("secondarySqlEntityManager")
private LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean;

@Autowired
@Qualifier("mysqlTransactionManager")
private PlatformTransactionManager mySqlplatformTransactionManager;

@Autowired
@Qualifier("secondaryTransactionManager")
private PlatformTransactionManager secondaryTransactionManager;

@Autowired
private JobBuilderFactory jobBuilderFactory;

@Bean
public JpaPagingItemReader itemReader(PlatformTransactionManager secondaryTransactionManager) {
    JpaPagingItemReader<SqlServerJobLogs> serverJobLogsJpaPagingItemReader = new JpaPagingItemReader<>();
    serverJobLogsJpaPagingItemReader.setMaxItemCount(1000);
    serverJobLogsJpaPagingItemReader.setPageSize(100);
    serverJobLogsJpaPagingItemReader.setEntityManagerFactory(localContainerEntityManagerFactoryBean.getNativeEntityManagerFactory());
    serverJobLogsJpaPagingItemReader.setQueryString("select p from SqlServerJobLogs p");
    return serverJobLogsJpaPagingItemReader;
}

@Bean
public ItemProcessor itemProcessor() {
    return new DataItemProcessor();
}

@Bean
public ItemWriter itemWriter(PlatformTransactionManager mySqlplatformTransactionManager) {
    DataWriter dataWriter = new DataWriter();
    return dataWriter;
}

@Bean
public Step step() {
    return stepBuilderFactory.get("myJob").chunk(100).reader(itemReader(secondaryTransactionManager)).processor(itemProcessor()).writer(itemWriter(mySqlplatformTransactionManager)).build();
}

@Bean(name = "myJob")
public Job myJob() throws Exception {
    return jobBuilderFactory.get("myJob").start(step()).build();
}

@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public JobRepository jobRepository() throws Exception {
    MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean(resourcelessTransactionManager());
    return mapJobRepositoryFactoryBean.getObject();
}

@Bean
public SimpleJobLauncher jobLauncher() throws Exception {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository());
    return simpleJobLauncher;
}
I have tried to configure a BatchConfigurer, but no luck.
Please let me know if I need to configure anything else apart from the above-mentioned details.
Thanks in advance
I would like to answer the question.
A big thanks to @MahmoudBenHassine.
Providing a custom transaction manager by overriding DefaultBatchConfigurer#getTransactionManager works only with Spring Batch v4.1+, as said in the comments. We need to use Spring Boot v2.1 to have Spring Batch 4.1.
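For completeness, a minimal sketch of that override (assuming Spring Batch 4.1+ and the secondaryTransactionManager qualifier from the question; the class name is illustrative):
@Configuration
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    // Hypothetical wiring: hand the source DB's transaction manager to the
    // step instead of the default one built from the primary DataSource.
    @Autowired
    @Qualifier("secondaryTransactionManager")
    private PlatformTransactionManager secondaryTransactionManager;

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return secondaryTransactionManager;
    }
}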

Migration to Spring Boot 2 and using Spring Batch 4

I am migrating Spring Boot from 1.4.2 to 2.0.0, which also includes migrating Spring Batch from 3.0.7 to 4.0.0, and it looks like the batch process no longer works when I try to run it with the new Spring Batch version.
When I tried to debug, I found a problem when batch tries to get data from batch_job_execution_context.
I can see that getting the data from the database works fine, but the new version of batch fails to parse the database data
{"map":[{"entry":[{"string":["name",""]},{"string":["sender",""]},{"string":["id",""]},{"string":["nav",""]},{"string":["created",140418]}]}]}
with this error:
com.fasterxml.jackson.databind.exc.MismatchedInputException: Unexpected token (START_OBJECT), expected VALUE_STRING: need JSON String that contains type id (for subtype of java.lang.Object) at [Source: (ByteArrayInputStream); line: 1, column: 9] (through reference chain: java.util.HashMap["map"])
I have found that when I delete all batch metadata tables and recreate them from scratch, batch seems to work again. It looks like the metadata JSON format has changed to this:
{"name":"","sender":"145844","id":"","nav":"","created":"160909"}
I do not want to delete old data to make this work again, so is there any way to fix this?
Has anyone else tried to do this upgrade? It would be nice to know if there are any other breaking changes that I may not have noticed.
Thanks
Before Spring Batch 4, the default serialization mechanism for the ExecutionContext was via XStream. Now it uses Jackson by default, which is not compatible with the old serialization format. We still have the old version available (XStreamExecutionContextStringSerializer), but you'll need to configure it yourself by implementing a BatchConfigurer and overriding the configuration in the JobRepositoryFactoryBean.
For the record, this is related to this issue: https://jira.spring.io/browse/BATCH-2575.
Based on Michael's answer above, this code block worked for me to extend the default configuration. I had to wire up the serializer for both the JobRepository and the JobExplorer:
@Configuration
@EnableBatchProcessing
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    private final DataSource dataSource;

    @Autowired
    public MyBatchConfigurer(final DataSource dataSource) throws Exception {
        this.dataSource = dataSource;
    }

    @Bean
    ExecutionContextSerializer getSerializer() {
        return new XStreamExecutionContextStringSerializer();
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        final JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setSerializer(getSerializer());
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    protected JobExplorer createJobExplorer() throws Exception {
        final JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
        jobExplorerFactoryBean.setDataSource(dataSource);
        jobExplorerFactoryBean.setSerializer(getSerializer());
        jobExplorerFactoryBean.afterPropertiesSet();
        return jobExplorerFactoryBean.getObject();
    }
}
In the solution by @anotherdave and @michael-minella, you could also replace the plain XStreamExecutionContextStringSerializer with an instance of the following class. It accepts both formats when deserializing and serializes to the new format.
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import org.springframework.batch.core.repository.ExecutionContextSerializer;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.dao.XStreamExecutionContextStringSerializer;

/**
 * Enables Spring Batch 4 to read both ExecutionContext entries written by earlier versions and the Spring 5 format.
 * Entries are written in the Spring 5 format.
 */
@SuppressWarnings("deprecation")
class XStreamOrJackson2ExecutionContextSerializer implements ExecutionContextSerializer {

    private final XStreamExecutionContextStringSerializer xStream = new XStreamExecutionContextStringSerializer();
    private final Jackson2ExecutionContextStringSerializer jackson = new Jackson2ExecutionContextStringSerializer();

    public XStreamOrJackson2ExecutionContextSerializer() throws Exception {
        xStream.afterPropertiesSet();
    }

    // The caller closes the stream; the decoration by ensureMarkSupported does not need any cleanup.
    @SuppressWarnings("resource")
    @Override
    public Map<String, Object> deserialize(InputStream inputStream) throws IOException {
        InputStream repeatableInputStream = ensureMarkSupported(inputStream);
        repeatableInputStream.mark(Integer.MAX_VALUE);
        try {
            return jackson.deserialize(repeatableInputStream);
        } catch (JsonProcessingException e) {
            repeatableInputStream.reset();
            return xStream.deserialize(repeatableInputStream);
        }
    }

    private static InputStream ensureMarkSupported(InputStream in) {
        return in.markSupported() ? in : new BufferedInputStream(in);
    }

    @Override
    public void serialize(Map<String, Object> object, OutputStream outputStream) throws IOException {
        jackson.serialize(object, outputStream);
    }
}
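Hypothetically, you would then plug it into the BatchConfigurer from the previous answer by returning it from getSerializer():
@Bean
ExecutionContextSerializer getSerializer() throws Exception {
    // drop-in replacement for the plain XStream serializer
    return new XStreamOrJackson2ExecutionContextSerializer();
}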

Quartz JDBCJobStore with RoutingDataSource

For my application, we are using Spring's
org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource
The target data sources are configured and chosen based on the request's domain URL.
E.g.:
qa.example.com ==> target datasource = DB1
qa-test.example.com ==> target datasource = DB2
Following is the configuration for the same
@Bean(name = "dataSource")
public DataSource dataSource() throws PropertyVetoException, ConfigurationException {
    EERoutingDatabase routingDB = new EERoutingDatabase();
    Map<Object, Object> targetDataSources = datasourceList();
    routingDB.setTargetDataSources(targetDataSources);
    return routingDB;
}

public class EERoutingDatabase extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        // This is derived from the request's URL/domain
        return SessionUtil.getDataSourceHolder();
    }
}
The task now is to use Quartz's JDBCJobStore to store the Quartz jobs/triggers.
The preferred option is JobStoreCMT.
We used the following config:
@Configuration
public class QuartzConfig {

    private static final Logger LOG = LoggerFactory.getLogger(QuartzConfig.class);
    private static final String QUARTZ_CONFIG_FILE = "ee-quartz.properties";

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    private ApplicationContext applicationContext;

    /**
     * Spring wrapper over the Quartz Scheduler bean
     */
    @Bean(name = "quartzRealTimeScheduler")
    SchedulerFactoryBean schedulerFactoryBean() {
        LOG.info("Creating QUARTZ Scheduler for real time Job invocation");
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setConfigLocation(new ClassPathResource(QUARTZ_CONFIG_FILE));
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setJobFactory(springBeanJobFactory());
        factory.setWaitForJobsToCompleteOnShutdown(true);
        factory.setApplicationContextSchedulerContextKey("applicationContext");
        return factory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        jobFactory.setIgnoredUnknownProperties("applicationContext");
        return jobFactory;
    }
}
and the following is the config in the Quartz properties file (ee-quartz.properties):
org.quartz.scheduler.instanceId=AUTO
org.quartz.jobStore.useProperties=false
org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
On starting the application, the following exception occurs:
Caused by: java.lang.IllegalStateException: Cannot determine target DataSource for lookup key [null]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.determineTargetDataSource(AbstractRoutingDataSource.java:202) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at com.expertly.config.EERoutingDatabase.determineTargetDataSource(EERoutingDatabase.java:60) ~[EERoutingDatabase.class:na]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.getConnection(AbstractRoutingDataSource.java:164) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:289) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:329) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.scheduling.quartz.LocalDataSourceJobStore.initialize(LocalDataSourceJobStore.java:149) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.quartz.impl.StdSchedulerFactory.instantiate(StdSchedulerFactory.java:1321) ~[quartz-2.2.2.jar:na]
at org.quartz.impl.StdSchedulerFactory.getScheduler(StdSchedulerFactory.java:1525) ~[quartz-2.2.2.jar:na]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.createScheduler(SchedulerFactoryBean.java:599) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.afterPropertiesSet(SchedulerFactoryBean.java:482) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1612) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1549) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
It seems that Quartz is trying to create connections with my data source upfront. Since my data source isn't a concrete one (it's a routing data source) and, in addition, doesn't know which target DB to connect to at config time, it fails.
Do we have any provision where Quartz can be used with a RoutingDataSource? If not, what would be the next best thing?
Ideally, you can try making the SchedulerFactoryBean @Lazy.
But it seems lazy initialization will not work (there is a bug for this); there is also a workaround listed in its comments: create the schedulerFactory bean dynamically after a ContextRefreshedEvent is received on the root context.
Let us know if this works.
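A rough, untested sketch of that workaround (the class name and wiring are my assumptions, not from the linked bug) could look like:
@Component
public class QuartzSchedulerBootstrap implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        // React only to the root context refresh, so the scheduler is built
        // after the context (and the routing data source) is fully set up.
        if (event.getApplicationContext().getParent() != null) {
            return;
        }
        try {
            SchedulerFactoryBean factory = new SchedulerFactoryBean();
            factory.setDataSource(dataSource);
            factory.setTransactionManager(transactionManager);
            factory.setApplicationContext(event.getApplicationContext());
            factory.afterPropertiesSet();
            factory.start();
        } catch (Exception e) {
            throw new IllegalStateException("Could not start the Quartz scheduler", e);
        }
    }
}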

Spring Batch JPA repository save not committing the data

I am using Spring Batch and JPA to process a batch job and perform updates. I am using the default repository implementations.
And I am using repository.save to save the modified object in the processor.
Also, I don't have any @Transactional annotation specified in the processor or writer.
I don't see any exceptions anywhere. The selects happen fine.
Is there any setting like "setAutoCommit(true)" that I should be using for JPA to save the data in the DB?
Here is my step, reader and writer config.
Also, my config class is annotated with @EnableBatchProcessing:
@EnableBatchProcessing
public class UpgradeBatchConfiguration extends DefaultBatchConfigurer {

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(getdataSource());
        factory.setTransactionManager(transactionManager);
        factory.setTablePrefix("CFTES_OWNER.BATCH_");
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean(name = "updateFilenetJobStep")
    public Step jobStep(StepBuilderFactory stepBuilderFactory,
            @Qualifier("updateFileNetReader") RepositoryItemReader reader,
            @Qualifier("updateFileNetWriter") ItemWriter writer,
            @Qualifier("updateFileNetProcessor") ItemProcessor processor) {
        return stepBuilderFactory.get("jobStep").allowStartIfComplete(true).chunk(1).reader(reader).processor(processor)
                .writer(writer).transactionManager(transactionManager).build();
    }

    @Bean(name = "updateFileNetWriter")
    public ItemWriter getItemWriter() {
        return new BatchItemWriter();
    }

    @Bean(name = "updateFileNetReader")
    public RepositoryItemReader<Page<TermsAndConditionsErrorEntity>> getItemReader(
            TermsAndConditionsErrorRepository repository) {
        RepositoryItemReader<Page<TermsAndConditionsErrorEntity>> reader = new RepositoryItemReader<Page<TermsAndConditionsErrorEntity>>();
        reader.setRepository(repository);
        reader.setMethodName("findAll");
        HashMap<String, Direction> map = new HashMap<String, Direction>();
        map.put("transactionId", Direction.ASC);
        reader.setSort(map);
        return reader;
    }
}
And in the writer, this is what I am using (Repository.save):
repository.save(entity);
I was able to solve this by injecting a JpaTransactionManager throughout (the job repository, job, step, etc.) instead of the autowired PlatformTransactionManager.
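A minimal sketch of that change (the EntityManagerFactory parameter and bean name are illustrative, not from the original answer):
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
    // A JPA-aware transaction manager, so the chunk transaction that wraps
    // the writer actually flushes and commits the repository.save() changes.
    return new JpaTransactionManager(entityManagerFactory);
}
With a bean like this in place, the autowired PlatformTransactionManager used for the job repository and step above resolves to the JpaTransactionManager.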
