Tasklet or ItemReader that makes calls to Google Cloud Datastore - spring

I'm attempting to create a Spring Batch tasklet that calls a DatastoreRepository. The tasklet's execute step:
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
        throws Exception {
    String syncId = this.stepExecutionContext.getJobParameters()
            .getString(JobParameterKeys.SYNC_ID);
    SyncJob syncJob = syncJobRepo.findById(Long.parseLong(syncId)).get(); // exception thrown here
When I attempt to call the syncJobRepo I receive the following exception: org.springframework.transaction.NoTransactionException: No transaction aspect-managed TransactionStatus in scope.
I have a custom configured datasource (mySql instance) backing Spring Batch for storing the job execution metadata.
I've attempted to define the DatastoreTransactionManager:
@Bean
DatastoreTransactionManager datastoreTransactionManager() {
    DatastoreTransactionManager manager =
            new DatastoreTransactionManager(DatastoreOptions.getDefaultInstance().getService());
    return manager;
}
My configuration is annotated with @EnableBatchProcessing.
The batch job config:
@Bean
public Job customersJob(StepBuilderFactory stepBuilderFactory,
                        JobBuilderFactory jobBuilderFactory,
                        Tasklet batchCustomerReader,
                        SyncJobNotificationListener listener,
                        DatastoreTransactionManager datastoreTransactionManager) {
    Step step = stepBuilderFactory.get(CUSTOMERS_BATCH_JOB_LOAD_FROM_ERP_STEP)
            .tasklet(batchCustomerReader)
            .transactionManager(datastoreTransactionManager)
            .listener(listener)
            .build();
    return jobBuilderFactory.get(CUSTOMERS_BATCH_JOB)
            .incrementer(new RunIdIncrementer())
            .start(step)
            .build();
}

If you use different datasources for the Spring Batch meta-data and your business data, then you need to configure an XA transaction manager to synchronize the transaction across both datasources. This way, both data and meta-data are kept in sync in a failure/restart scenario.
A similar Q/A can be found here: Seperate datasource for jobrepository and writer of Spring Batch
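If both stores expose XA-capable resources, a JTA provider such as Atomikos could be wired in. A minimal sketch of that setup follows; the class name, bean names, and the timeout value are illustrative assumptions, not taken from the question:
import com.atomikos.icatch.jta.UserTransactionImp;
import com.atomikos.icatch.jta.UserTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.jta.JtaTransactionManager;

@Configuration
public class XaTransactionConfig {

    // Atomikos' JTA TransactionManager; init/close are tied to the bean lifecycle
    @Bean(initMethod = "init", destroyMethod = "close")
    public UserTransactionManager atomikosTransactionManager() {
        UserTransactionManager transactionManager = new UserTransactionManager();
        transactionManager.setForceShutdown(false);
        return transactionManager;
    }

    // Spring PlatformTransactionManager delegating to the JTA provider, so a
    // single transaction can span the MySQL metadata store and the business datasource
    @Bean
    public JtaTransactionManager transactionManager() throws Exception {
        UserTransactionImp userTransaction = new UserTransactionImp();
        userTransaction.setTransactionTimeout(300); // seconds; illustrative
        return new JtaTransactionManager(userTransaction, atomikosTransactionManager());
    }
}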

Related

Spring Batch/Data JPA application not persisting/saving data to Postgres database when calling JPA repository (save, saveAll) methods

I am near wits' end. I have read and googled endlessly and tried the solutions from all the Google/Stack Overflow posts that describe this same issue (there are quite a few). Some seemed promising, but nothing has worked for me yet, though I have made some progress and I believe I am on the right track (at this point I suspect a conflict between the transaction managers of Spring Batch and Spring Data JPA).
References:
Spring boot repository does not save to the DB if called from scheduled job
JpaItemWriter: no transaction is in progress
Similar to the aforementioned posts, I have a Spring Boot application that uses Spring Batch and Spring Data JPA. It reads comma-delimited data from a .csv file, does some processing/transformation, and attempts to persist/save to the database using the JPA repository methods, specifically .saveAll() here (I also tried the .save() method and it did the same thing), since I'm saving a List<MyUserDefinedDataType> of a user-defined data type (a batch insert).
Now, my code was working fine on Spring Boot starter 1.5.9.RELEASE, but I recently attempted to upgrade to 2.X.X and found, after countless hours of debugging, that only version 2.2.0.RELEASE would persist/save data to the database. So an upgrade to >= 2.2.1.RELEASE breaks persistence. Everything is read fine from the .csv; it's just that the first time the code flow hits a JPA repository method like .save() or .saveAll(), the application keeps running but nothing gets persisted. I also noticed the Hikari pool logs "active=1, idle=4", but when I looked at the same log on version 1.5.9.RELEASE, it says "active=0, idle=5" immediately after persisting the data, so the application is definitely hanging. I went into the debugger and saw that after jumping into the repository calls, it goes into an almost infinite cycle through the Spring AOP libraries and such (all third party), and I don't believe it ever comes back to the real application/business logic that I wrote.
3c22fb53ed64 2021-05-20 23:53:43.909 DEBUG
[HikariPool-1 housekeeper] com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Pool stats (total=5, active=1, idle=4, waiting=0)
Anyway, I tried the most common solutions that worked for other people, which were:
Defining a JpaTransactionManager @Bean and injecting it into the step, while keeping the JobRepository on the PlatformTransactionManager. This did not work. Then I also tried using the JpaTransactionManager in the JobRepository @Bean; this also did not work.
Defining a @RestController endpoint in my application to manually trigger this job, instead of doing it from my main Application.java class (I talk about this more below). Per one of the posts referenced above, the data persisted correctly to the database this way, even on Spring >= 2.2.1, which makes me suspect even more that something with the Spring Batch persistence/entity/transaction managers is messed up.
The code is basically this:
BatchConfiguration.java
@Configuration
@EnableBatchProcessing
@Import({DatabaseConfiguration.class})
public class BatchConfiguration {

    // DataSource is a Postgres DB defined in a separate IntelliJ project that I add to my pom.xml
    DataSource dataSource;

    @Autowired
    public BatchConfiguration(@Qualifier("dataSource") DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Bean
    @Primary
    public JpaTransactionManager jpaTransactionManager() {
        final JpaTransactionManager tm = new JpaTransactionManager();
        tm.setDataSource(dataSource);
        return tm;
    }

    @Bean
    public JobRepository jobRepository(PlatformTransactionManager transactionManager) throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(dataSource);
        jobRepositoryFactoryBean.setTransactionManager(transactionManager);
        jobRepositoryFactoryBean.setDatabaseType("POSTGRES");
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(jobRepository);
        return simpleJobLauncher;
    }

    @Bean(name = "jobToLoadTheData")
    public Job jobToLoadTheData() {
        return jobBuilderFactory.get("jobToLoadTheData")
                .start(stepToLoadData())
                .listener(new CustomJobListener())
                .build();
    }

    @Bean
    @StepScope
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
        threadPoolTaskExecutor.setCorePoolSize(maxThreads);
        threadPoolTaskExecutor.setThreadGroupName("taskExecutor-batch");
        return threadPoolTaskExecutor;
    }

    @Bean(name = "stepToLoadData")
    public Step stepToLoadData() {
        TaskletStep step = stepBuilderFactory.get("stepToLoadData")
                .transactionManager(jpaTransactionManager())
                .<List<FieldSet>, List<myCustomPayloadRecord>>chunk(chunkSize)
                .reader(myCustomFileItemReader(OVERRIDDEN_BY_EXPRESSION))
                .processor(myCustomPayloadRecordItemProcessor())
                .writer(myCustomerWriter())
                .faultTolerant()
                .skipPolicy(new AlwaysSkipItemSkipPolicy())
                .skip(DataValidationException.class)
                .listener(new CustomReaderListener())
                .listener(new CustomProcessListener())
                .listener(new CustomWriteListener())
                .listener(new CustomSkipListener())
                .taskExecutor(taskExecutor())
                .throttleLimit(maxThreads)
                .build();
        step.registerStepExecutionListener(stepExecutionListener());
        step.registerChunkListener(new CustomChunkListener());
        return step;
    }
}
My main method:
Application.java
@Autowired
@Qualifier("jobToLoadTheData")
private Job loadTheData;

@Autowired
private JobLauncher jobLauncher;

@PostConstruct
public void launchJob() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
    JobParameters parameters = new JobParametersBuilder().addDate("random", new Date()).toJobParameters();
    jobLauncher.run(loadTheData, parameters);
}

public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
}
Now, normally I read this .csv from an Amazon S3 bucket, but since I'm testing locally, I am just placing the .csv in the project directory and reading it directly by triggering the job in the Application.java main class (as you can see above). Also, I do have some other beans defined in this BatchConfiguration class, but I don't want to over-complicate this post more than it already is, and from the googling I've done, the problem is likely in the methods I posted (hopefully).
Also, I would like to point out that, similar to one of the other posts on Google/Stack Overflow from a user with a similar problem, I created a @RestController endpoint that simply calls the .run() method of the JobLauncher, passing in the jobToLoadTheData bean, and it triggers the batch insert. Guess what? The data persists to the database just fine, even on Spring >= 2.2.1.
What is going on here? Is this a clue? Is something funky going wrong with some type of entity or transaction manager? I'll take any advice or tips! I can provide any more information that you may need, so please just ask.
You are defining a bean of type JobRepository and expecting it to be picked up by Spring Batch. This is not correct. You need to provide a BatchConfigurer and override getJobRepository. This is explained in the reference documentation:
You can customize any of these beans by creating a custom implementation of the
BatchConfigurer interface. Typically, extending the DefaultBatchConfigurer
(which is provided if a BatchConfigurer is not found) and overriding the required
getter is sufficient.
This is also documented in the Javadoc of @EnableBatchProcessing. So in your case, you need to define a bean of type BatchConfigurer and override getJobRepository and getTransactionManager, something like:
@Bean
public BatchConfigurer batchConfigurer(EntityManagerFactory entityManagerFactory, DataSource dataSource) {
    return new DefaultBatchConfigurer(dataSource) {
        @Override
        public PlatformTransactionManager getTransactionManager() {
            return new JpaTransactionManager(entityManagerFactory);
        }

        @Override
        public JobRepository getJobRepository() {
            try {
                JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
                jobRepositoryFactoryBean.setDataSource(dataSource);
                jobRepositoryFactoryBean.setTransactionManager(getTransactionManager());
                // set other properties as needed
                jobRepositoryFactoryBean.afterPropertiesSet();
                return jobRepositoryFactoryBean.getObject();
            } catch (Exception e) {
                // getObject() declares a checked Exception, which this override cannot propagate
                throw new IllegalStateException("Unable to create JobRepository", e);
            }
        }
    };
}
In a Spring Boot context, you could also override the createTransactionManager and createJobRepository methods of org.springframework.boot.autoconfigure.batch.JpaBatchConfigurer if needed.
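A rough sketch of that Spring Boot variant, assuming the constructor signature of JpaBatchConfigurer in your Boot version matches the one below and that its collaborators are available for injection:
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.springframework.boot.autoconfigure.batch.BatchProperties;
import org.springframework.boot.autoconfigure.batch.JpaBatchConfigurer;
import org.springframework.boot.autoconfigure.transaction.TransactionManagerCustomizers;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

// A hedged sketch: subclass Boot's JpaBatchConfigurer and override its factory
// methods instead of declaring raw JobRepository beans. Constructor arguments
// are supplied by Boot's auto-configuration.
@Component
public class CustomJpaBatchConfigurer extends JpaBatchConfigurer {

    public CustomJpaBatchConfigurer(BatchProperties properties, DataSource dataSource,
            TransactionManagerCustomizers transactionManagerCustomizers,
            EntityManagerFactory entityManagerFactory) {
        super(properties, dataSource, transactionManagerCustomizers, entityManagerFactory);
    }

    @Override
    protected PlatformTransactionManager createTransactionManager() {
        // customize the returned JpaTransactionManager here if needed
        return super.createTransactionManager();
    }
}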

Spring Batch Persist Job Meta Data

I wanted to persist the Spring Batch metadata related to jobs/steps instead of using MapJobRepository, as it is not recommended for production use.
I have extended DefaultBatchConfigurer and overridden the methods to use my primary DataSource and transaction manager by using JobRepositoryFactoryBean.
I have also registered a SimpleJobLauncher bean that uses the custom job repository, for which I used the code below:
@Bean
SimpleJobLauncher simpleJobLauncher() throws Exception {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    simpleJobLauncher.setJobRepository(customRepository());
    return simpleJobLauncher;
}

@Bean
public JobRepository customRepository() throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    jobRepositoryFactoryBean.setDataSource(myPrimaryDataSource);
    jobRepositoryFactoryBean.setTransactionManager(myTransactionManager);
    jobRepositoryFactoryBean.setDatabaseType("MYSQL");
    jobRepositoryFactoryBean.setTablePrefix("BATCH_");
    jobRepositoryFactoryBean.setMaxVarCharLength(1000);
    return jobRepositoryFactoryBean.getObject();
}
I am facing the below exception when I launch the job:
java.lang.NullPointerException: null
    at org.springframework.batch.core.repository.dao.MapJobExecutionDao.synchronizeStatus(MapJobExecutionDao.java:161)
Please let me know where exactly my configuration is wrong.
From your exception, it looks like Spring Batch is configured to use the Map-based job repository and not the persistent one.
Since you extended DefaultBatchConfigurer, you need to override createJobRepository and return your custom job repository. Declaring a bean of type JobRepository in the application context is not sufficient.
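A minimal sketch of what that can look like, reusing the names from your snippet and assuming your MySQL DataSource and transaction manager are available for injection:
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    @Autowired
    private DataSource myPrimaryDataSource;

    @Autowired
    private PlatformTransactionManager myTransactionManager;

    @Override
    protected JobRepository createJobRepository() throws Exception {
        // same settings as your customRepository() bean, but returned from the
        // hook that Spring Batch actually calls
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(myPrimaryDataSource);
        factory.setTransactionManager(myTransactionManager);
        factory.setDatabaseType("MYSQL");
        factory.setTablePrefix("BATCH_");
        factory.setMaxVarCharLength(1000);
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}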

Issues with Spring Batch

Hi, I have been working with Spring Batch recently and need some help.
1) I want to run my job using multiple threads, so I have used a TaskExecutor as below:
@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    taskExecutor.setConcurrencyLimit(4);
    return taskExecutor;
}

@Bean
public Step myStep() {
    return stepBuilderFactory.get("myStep")
            .<MyEntity, AnotherEntity>chunk(1)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .taskExecutor(taskExecutor())
            .throttleLimit(4)
            .build();
}
But while executing, I can see the below line in the console:
o.s.b.c.l.support.SimpleJobLauncher : No TaskExecutor has been set, defaulting to synchronous executor.
What does this mean? While debugging, I can see four SimpleAsyncTaskExecutor threads running. Can someone shed some light on this?
2) I don't want to run my batch application with the metadata tables that Spring Batch creates. I have tried adding spring.batch.initialize-schema=never, but it didn't work. I also saw a way to do this by using ResourcelessTransactionManager and MapJobRepositoryFactoryBean, but I have to make some database transactions for my job, so will it be all right if I use this?
I was also able to do this by extending DefaultBatchConfigurer and overriding:
@Override
public void setDataSource(DataSource dataSource) {
    // override to not set the datasource even if one exists;
    // initialization will then use a Map-based JobRepository (instead of the database)
}
Please guide me further. Thanks.
Update:
My full configuration class is below:
@EnableBatchProcessing
@EnableScheduling
@Configuration
public class MyBatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    /* @Override
    public void setDataSource(DataSource dataSource) {
        // override to not set the datasource even if one exists;
        // initialization will then use a Map-based JobRepository (instead of the database)
    } */

    @Bean
    public Step myStep() {
        return stepBuilderFactory.get("myStep")
                .<MyEntity, AnotherEntity>chunk(1)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .taskExecutor(taskExecutor()) // was executor(); the bean below is named taskExecutor()
                .throttleLimit(4)
                .build();
    }

    @Bean
    public Job myJob() {
        return jobBuilderFactory.get("myJob")
                .incrementer(new RunIdIncrementer())
                .listener(myJobListener())
                .flow(myStep())
                .end()
                .build();
    }

    @Bean
    public MyJobListener myJobListener() {
        return new MyJobListener();
    }

    @Bean
    public ItemReader<MyEntity> reader() {
        return new MyReader();
    }

    @Bean
    public ItemWriter<? super AnotherEntity> writer() {
        return new MyWriter();
    }

    @Bean
    public ItemProcessor<MyEntity, AnotherEntity> processor() {
        return new MyProcessor();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
        taskExecutor.setConcurrencyLimit(4);
        return taskExecutor;
    }
}
In the future, please break this up into two independent questions. That being said, let me shed some light on both questions.
SimpleJobLauncher : No TaskExecutor has been set, defaulting to synchronous executor.
Your configuration is configuring myStep to use your TaskExecutor. What that does is cause Spring Batch to execute each chunk in its own thread (based on the parameters of the TaskExecutor). The log message you are seeing has nothing to do with that behavior; it has to do with launching your job. By default, the SimpleJobLauncher launches the job on the same thread it is running on, thereby blocking that thread. You can inject a TaskExecutor into the SimpleJobLauncher, which will cause the job to be executed on a different thread from the JobLauncher itself. These are two separate uses of multiple threads by the framework.
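As a hedged sketch, wiring such an asynchronous launcher could look like this (the bean name is illustrative):
@Bean
public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
    // with its own TaskExecutor, run() returns immediately instead of
    // blocking the calling thread until the job finishes
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}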
I don't want to run my Batch application with the metadata tables that spring batch creates
The short answer here is to just use an in-memory database like HSQLDB or H2 for your metadata tables. This provides a production-grade data store (so that concurrency is handled correctly) without actually persisting the data. If you use the ResourcelessTransactionManager, you are effectively turning transactions off (a bad idea if you're using a database in any capacity) because that TransactionManager doesn't actually do anything (it's a no-op implementation).
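For example, a minimal sketch of an embedded H2 database dedicated to the metadata tables might look like this (assuming H2 and spring-jdbc are on the classpath; the schema script path is the one shipped in spring-batch-core):
@Bean
public DataSource batchMetadataDataSource() {
    // in-memory database holding only the BATCH_* metadata tables
    return new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.H2)
            .addScript("classpath:/org/springframework/batch/core/schema-h2.sql")
            .build();
}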

Which is the best transaction manager to use in spring batch application in production environment?

Can you please let me know which transaction manager should be used in a Spring Batch application in production? I am using the ResourcelessTransactionManager. Is that fine? I am facing this issue when reading data from an external Oracle DB:
[org.springframework.scheduling.support.TaskUtils$LoggingErrorHandler]
(pool-3130-thread-1) Unexpected error occurred in scheduled task.:
org.springframework.transaction.CannotCreateTransactionException:
Could not open JDBC Connection for transaction; nested exception is
java.sql.SQLRecoverableException: Closed Connection
@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager) throws Exception {
    MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
    factory.setTransactionManager(txManager);
    factory.setIsolationLevelForCreate("ISOLATION_READ_UNCOMMITTED");
    factory.afterPropertiesSet();
    return factory;
}

@Bean
public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
    return factory.getObject();
}

@Bean
public ThreadPoolTaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    taskExecutor.setCorePoolSize(5);
    taskExecutor.setMaxPoolSize(10);
    taskExecutor.setQueueCapacity(30);
    return taskExecutor;
}

@Bean
public JobLauncher jobLauncher(JobRepository jobRepository, ThreadPoolTaskExecutor taskExecutor) {
    SimpleJobLauncher launcher = new SimpleJobLauncher();
    launcher.setTaskExecutor(taskExecutor);
    launcher.setJobRepository(jobRepository);
    // note: this second call replaces the ThreadPoolTaskExecutor set above
    final SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
    launcher.setTaskExecutor(simpleAsyncTaskExecutor);
    return launcher;
}
As you have said in your question that you are using an Oracle DB, most likely you don't need a ResourcelessTransactionManager.
What your current code is doing is storing job metadata in a Map-based in-memory structure, and my guess is that you wouldn't do that in production; you would actually store job metadata in the DB, for later analysis, restartability, etc.
In Spring Batch there are two kinds of transactions: one for your business data and methods, and a second for the job repository. If, say, you want to read from a file, write to a file, and store job metadata in a MapJobRepository, then your code would work OK.
But the moment you define a DataSource, you can't use a ResourcelessTransactionManager. In fact, with databases, you don't need to define any transaction manager on your own: Spring Batch's chunk-oriented processing will take care of it and will store job metadata in the database.
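For illustration, a minimal sketch backed by the Oracle DataSource (assuming plain JDBC access to the business data; DataSourceTransactionManager comes from org.springframework.jdbc.datasource):
@Bean
public PlatformTransactionManager transactionManager(DataSource dataSource) {
    // a real, DataSource-backed transaction manager instead of the no-op
    // ResourcelessTransactionManager
    return new DataSourceTransactionManager(dataSource);
}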

run spring batch job from the controller

I am trying to run my batch job from a controller. It will be fired either by a cron job or by accessing a specific link.
I am using Spring Boot, no XML, just annotations.
In my current setup I have a service that contains the following beans:
@EnableBatchProcessing
@PersistenceContext
public class batchService {

    @Bean
    public ItemReader<Somemodel> reader() {
        ...
    }

    @Bean
    public ItemProcessor<Somemodel, Somemodel> processor() {
        return new SomemodelProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new CustomItemWriter();
    }

    @Bean
    public Job importUserJob(JobBuilderFactory jobs, Step step1) {
        return jobs.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
                      ItemReader<Somemodel> reader,
                      ItemWriter<Somemodel> writer,
                      ItemProcessor<Somemodel, Somemodel> processor) {
        return stepBuilderFactory.get("step1")
                .<Somemodel, Somemodel>chunk(100)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
As soon as I put the @Configuration annotation on top of my batchService class, the job starts as soon as I run the application. It finishes successfully; everything is fine. Now I am trying to remove the @Configuration annotation and run the job whenever I want. Is there a way to fire it from a controller?
Thanks!
You need to create an application.yml file in src/main/resources and add the following configuration:
spring.batch.job.enabled: false
With this change, the batch job will no longer execute automatically when Spring Boot starts; it will be triggered only when the specific link is accessed.
Check out my sample code here:
https://github.com/pauldeng/aws-elastic-beanstalk-worker-spring-boot-spring-batch-template
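For the cron-driven variant mentioned in the question, a minimal sketch could look like this (assuming @EnableScheduling is active; the cron expression and parameter name are illustrative):
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class JobScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job importUserJob;

    @Scheduled(cron = "0 0 2 * * *") // every day at 02:00
    public void runNightly() throws Exception {
        // unique parameter so each run creates a new JobInstance
        JobParameters params = new JobParametersBuilder()
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(importUserJob, params);
    }
}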
You can launch a batch job programmatically using JobLauncher which can be injected into your controller. See the Spring Batch documentation for more details, including this example controller:
@Controller
public class JobLauncherController {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    @RequestMapping("/jobLauncher.html")
    public void handle() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }
}
Since you're using Spring Boot, you should leave the @Configuration annotation in there and instead configure your application.properties to not launch the jobs on startup. You can read more about the autoconfiguration options for running jobs at startup (or not) in the Spring Boot documentation here: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#howto-execute-spring-batch-jobs-on-startup
