Spring Batch Job taking previous execution parameters

I am using Spring Cloud Data Flow and have created a Spring Cloud Task which contains a job. This job has an optional parameter called last_modified_date. In the code I have specified which date to use when last_modified_date is null, that is, when it has not been passed as a parameter. The issue is that if I pass last_modified_date for one instance of the job but not for the next one, the next instance picks up the value from the previous execution instead of receiving null and falling back to the default in the code.
@Component
@StepScope
public class SalesforceAdvertiserLoadTasklet implements Tasklet {

    @Value("#{jobParameters['last_modified_date']}")
    protected Date lastModifiedDate;

    private static final Logger logger =
            LoggerFactory.getLogger(SalesforceAdvertiserLoadTasklet.class);

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
            throws Exception {
        if (lastModifiedDate == null) {
            lastModifiedDate =
                    Date.from(LocalDate.now().minusDays(1).atStartOfDay(ZoneId.systemDefault()).toInstant());
        }
        logger.info("In Method: runSalesforceAdvertiserLoadJob launch started on last_modified_date {}",
                lastModifiedDate);
        logger.info("Getting advertisers from SalesForce");
        try {
            getAdvertisersFromSalesforceAndAddtoDb();
        } catch (JsonSyntaxException | IOException | ParseException e) {
            logger.error("ERROR--> {}", e.getMessage());
        }
        return RepeatStatus.FINISHED;
    }
}

@Bean
public JobParametersIncrementer runIdIncrementor() {
    return new RunIdIncrementer();
}

@Bean
public Job salesforceAdvertiserLoadJob() {
    return jobBuilderFactory.get(SalesforceJobName.salesforceAdvertiserLoadJob.name())
            .incrementer(runIdIncrementor())
            .listener(batchJobsExecutionListener)
            .start(stepsConfiguration.salesforceAdvertiserLoadStep())
            .build();
}
Is there a way I can stop the new job instance from taking parameters from the previous job instance?

I think you didn't provide a JobParametersIncrementer to your JobBuilder. Example:
Job job = jobBuilderFactory.get(jobName)
        .incrementer(new RunIdIncrementer())
        .start(step)
        .build();
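One detail that can explain the behaviour in the question: RunIdIncrementer seeds the next set of JobParameters from the previous execution's parameters and only bumps run.id, so when a new instance is launched through the incrementer (e.g. via JobOperator.startNextInstance), last_modified_date is carried over. Below is a minimal sketch of a custom incrementer that keeps a run.id sequence but deliberately starts from an empty parameter set; the class name and key handling are my own, not part of the question:

import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersIncrementer;

// Sketch only: like RunIdIncrementer it increments run.id, but it does not
// copy the previous execution's parameters, so last_modified_date is not
// carried into the next job instance.
public class FreshRunIdIncrementer implements JobParametersIncrementer {

    private static final String RUN_ID_KEY = "run.id";

    @Override
    public JobParameters getNext(JobParameters parameters) {
        long nextRunId = (parameters == null ? 0L : parameters.getLong(RUN_ID_KEY, 0L)) + 1;
        return new JobParametersBuilder() // deliberately not seeded with the old parameters
                .addLong(RUN_ID_KEY, nextRunId)
                .toJobParameters();
    }
}

Wiring this in place of runIdIncrementor() should make a launch without last_modified_date actually arrive as null in the tasklet, so the fallback date in the code takes effect.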

Related

Spring Batch exception handling sent as ResponseEntity

I'm new to Spring Boot, and I'm training on a small project with Spring Batch to get experience. Here is my context: I have two CSV files, one holding employees, the other containing all managers of the company. I have to read the files, then add each record to a database. To make it simple, I just need to call an endpoint from my controller, upload my CSV file (MultipartFile), and then the job will start. I was actually able to do that; my problem is the following.
I have to manage multiple kinds of validation (I'm using JSR 380 validation for my entities, and I also have to check business rules). One kind of business exception can be the following rule: an employee is supervised by a manager of his department (the employee can't be supervised by a manager who is not in the same department, otherwise an exception should be thrown). So for mistaken records, with invalid or illogical input, I have to skip them (not save them to the database) but store them in a Map or List that should be sent as a ResponseEntity to the client. That way the client would know which rows need to be fixed. I suppose I have to take a look at Listeners, but I really can't figure out how to store exceptions in a map or list and then send it as a ResponseEntity. Below is an example of what I want to achieve.
Screenshots of my CSV files
EmployeeBatchConfig.java
@Configuration
@EnableBatchProcessing
@AllArgsConstructor
public class EmployeeBatchConfig {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;
    private EmployeeRepository employeeRepository;
    private EmployeeItemWriter employeeItemWriter;

    @Bean
    @StepScope
    public FlatFileItemReader<EmployeeDto> itemReader(
            @Value("#{jobParameters[fullPathFileName]}") final String pathFile) {
        FlatFileItemReader<EmployeeDto> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(new FileSystemResource(new File(pathFile)));
        flatFileItemReader.setName("CSV-Reader");
        flatFileItemReader.setLinesToSkip(1);
        flatFileItemReader.setLineMapper(lineMapper());
        return flatFileItemReader;
    }

    private LineMapper<EmployeeDto> lineMapper() {
        DefaultLineMapper<EmployeeDto> lineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames("Username", "lastName", "firstName", "departement", "supervisor");
        BeanWrapperFieldSetMapper<EmployeeDto> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(EmployeeDto.class);
        lineMapper.setLineTokenizer(lineTokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        return lineMapper;
    }

    @Bean
    public EmployeeProcessor processor() {
        return new EmployeeProcessor(); // processor bean used to skip invalid rows
    }

    @Bean
    public RepositoryItemWriter<Employee> writer() {
        RepositoryItemWriter<Employee> writer = new RepositoryItemWriter<>();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return writer;
    }

    @Bean
    public Step step1(FlatFileItemReader<EmployeeDto> itemReader) {
        return stepBuilderFactory.get("slaveStep")
                .<EmployeeDto, Employee>chunk(5)
                .reader(itemReader)
                .processor(processor())
                .writer(employeeItemWriter)
                .faultTolerant()
                .listener(skipListener())
                .skip(SkipException.class)
                .skipLimit(10)
                .skipPolicy(skipPolicy())
                .build();
    }

    @Bean
    @Qualifier("executeJobEmployee")
    public Job runJob(FlatFileItemReader<EmployeeDto> itemReader) {
        return jobBuilderFactory
                .get("importEmployee")
                .flow(step1(itemReader))
                .end()
                .build();
    }

    @Bean
    public SkipPolicy skipPolicy() {
        return new ExceptionSkipPolicy();
    }

    @Bean
    public SkipListener<EmployeeDto, Employee> skipListener() {
        return new StepSkipListener();
    }

    /*
    @Bean
    public ExecutionContext executionContext() {
        return new ExecutionContext();
    }
    */
}
EmployeeProcessor.java
public class EmployeeProcessor implements ItemProcessor<EmployeeDto, Employee> {

    @Autowired
    private SupervisorService managerService;

    @Override
    public Employee process(@Valid EmployeeDto item) throws Exception {
        // retrieve the manager of the employee and compare departments
        ManagerDto manager = managerService.findSupervisorById(item.getSupervisor());
        if (!(manager.getDepartement().equals(item.getDepartement()))) {
            throw new SkipException("Manager Invalid", item);
        }
        return ObjectMapperUtils.map(item, Employee.class);
    }
}
ExceptionSkipPolicy.java
public class ExceptionSkipPolicy implements SkipPolicy {

    @Override
    public boolean shouldSkip(Throwable throwable, int skipCount)
            throws SkipLimitExceededException {
        return true;
    }
}
StepSkipListener.java
public class StepSkipListener implements SkipListener<EmployeeDto, Employee> {

    @Override // item reader
    public void onSkipInRead(Throwable throwable) {
        System.out.println("In OnSkipReader");
    }

    @Override // item writer
    public void onSkipInWrite(Employee item, Throwable throwable) {
        System.out.println("Nooooooooo ");
    }

    @Override // item processor
    public void onSkipInProcess(EmployeeDto employee, Throwable throwable) {
        System.out.println("Process... ");
        /* I guess this is where I should work, but how do I deal with the
           exception that occurs? How do I know which exception I would get? */
    }
}
SkipException.java
public class SkipException extends Exception {

    private Map<String, EmployeeDto> errors = new HashMap<>();

    public SkipException(String errorMessage, EmployeeDto employee) {
        super(errorMessage);
        this.errors.put(errorMessage, employee);
    }

    public Map<String, EmployeeDto> getErrors() {
        return this.errors;
    }
}
JobController.java
@RestController
@RequestMapping("/upload")
public class JobController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    @Qualifier("executeJobEmployee")
    private Job job;

    private final String EMPLOYEE_FOLDER = "C:/Users/Project/Employee/";

    @PostMapping("/employee")
    public ResponseEntity<Object> importEmployee(@RequestParam("file") MultipartFile multipartFile)
            throws IllegalStateException, IOException {
        try {
            String fileName = multipartFile.getOriginalFilename();
            File fileToImport = new File(EMPLOYEE_FOLDER + fileName);
            multipartFile.transferTo(fileToImport);
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("fullPathFileName", EMPLOYEE_FOLDER + fileName)
                    .addLong("startAt", System.currentTimeMillis())
                    .toJobParameters();
            JobExecution jobExecution = this.jobLauncher.run(job, jobParameters);
            ExecutionContext executionContext = jobExecution.getExecutionContext();
            System.out.println("My skipped items: " + executionContext.toString());
        } catch (ConstraintViolationException | FlatFileParseException | JobRestartException
                | JobInstanceAlreadyCompleteException | JobParametersInvalidException
                | JobExecutionAlreadyRunningException e) {
            e.printStackTrace();
            return new ResponseEntity<>(e.getMessage(), HttpStatus.BAD_REQUEST);
        }
        return new ResponseEntity<>("Employee inserted successfully", HttpStatus.OK);
    }
}
That requirement forces your implementation to wait for the job to finish before returning the web response, which is not the typical way of launching batch jobs from web requests. Typically, since batch jobs can run for several minutes/hours, they are launched in the background and a job ID is returned back to the client for later status check.
In Spring Batch, the SkipListener is the extension point that allows you to add custom code when a skippable exception happens when reading, processing or writing an item. I would add the business validation in an item processor and throw an exception with the skipped item and the reason for that skip (both encapsulated in the exception class that should be declared as skippable).
Skipped items are usually stored somewhere for later analysis (like a table or a file or the job execution context). In your case, you need to send them back in the web response, so you can read them from the store of your choice before returning them attached in the web response. In pseudo code in your controller, this should be something like the following:
- run the job and wait for its termination (the skip listener would write skipped items in the storage of your choice)
- get skipped items from storage
- return web response
For example, if you choose to store skipped items in the job execution context, you can do something like this in your controller:
JobExecution jobExecution = jobLauncher.run(job, jobParameters);
ExecutionContext executionContext = jobExecution.getExecutionContext();
// get skipped items from the execution context
// return the web response
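For the listener side, here is a minimal sketch, assuming the EmployeeDto/Employee types from the question and a "skippedItems" context key of my own choosing. StepListenerSupport implements both StepExecutionListener and SkipListener, which gives the skip callbacks access to the current StepExecution:

import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepListenerSupport;
import org.springframework.batch.item.ExecutionContext;

// Sketch only: collects skipped items in the job ExecutionContext so the
// controller can read them after jobLauncher.run(..) returns.
public class CollectingSkipListener extends StepListenerSupport<EmployeeDto, Employee> {

    private StepExecution stepExecution;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    public void onSkipInProcess(EmployeeDto item, Throwable t) {
        ExecutionContext jobContext = stepExecution.getJobExecution().getExecutionContext();
        @SuppressWarnings("unchecked")
        List<String> skipped = (List<String>) jobContext.get("skippedItems"); // key name is an assumption
        if (skipped == null) {
            skipped = new ArrayList<>();
            jobContext.put("skippedItems", skipped);
        }
        skipped.add(item.getUsername() + " -> " + t.getMessage()); // assumes a getUsername() accessor
    }
}

Register it on the step (both as a step execution listener and as a skip listener), and the controller can then read jobExecution.getExecutionContext().get("skippedItems") after the run, as in the pseudo code above.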

how to get Spring Batch job instance id from execute method in TASKLET

I am using Spring Batch version 3.0.
I create a Job and execute it by calling the JobLauncher's run method; the actual work is done in a tasklet.
To track more accurately whether the job was executed, I want to insert a record with the corresponding job id into my own tables (other than the meta tables) from inside the tasklet.
public class SampleScheduler {

    protected final Logger log = LoggerFactory.getLogger(this.getClass());

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job sampleJob;

    public void run() {
        try {
            String dateParam = new Date().toString();
            JobParameters param = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            JobExecution execution = jobLauncher.run(sampleJob, param);
            log.debug("###################################################################");
            log.debug("Exit Status : " + execution.getStatus());
            log.debug("###################################################################");
        } catch (Exception e) {
            // e.printStackTrace();
            log.error(e.toString());
        }
    }
}
The tasklet that gets called:
public class SampleTasklet implements Tasklet {

    @Autowired
    private SampleService sampleService;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        sampleService.query();
        return RepeatStatus.FINISHED;
    }
}
This is what I added to my tasklet code:
StepContext stepContext = chunkContext.getStepContext();
StepExecution stepExecution = stepContext.getStepExecution();
JobExecution jobExecution = stepExecution.getJobExecution();
long jobInstanceId = jobExecution.getJobId();
Is it right to try this in the TASKLET code above?
how to get Spring Batch job instance id from execute method in TASKLET
The org.springframework.batch.core.step.tasklet.Tasklet#execute method gives you access to the ChunkContext which in turn allows you to get the parent StepExecution and JobExecution. You can then get the job instance id from the job execution.
Is it right to try this in the TASKLET code above?
Yes, that's the way to go.
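Put together, a minimal sketch of a tasklet doing exactly that; the audit insert itself is left as a comment since it depends on your table:

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class AuditTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // Navigate ChunkContext -> StepExecution -> JobExecution -> JobInstance
        JobExecution jobExecution = chunkContext.getStepContext()
                .getStepExecution()
                .getJobExecution();
        long jobInstanceId = jobExecution.getJobInstance().getInstanceId();
        // insert (jobInstanceId, status, ...) into your own audit table here,
        // e.g. via an injected JdbcTemplate or repository
        return RepeatStatus.FINISHED;
    }
}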

How to use spring transaction support with Spring Batch

I am trying to use Spring Batch to read data from a .dat file and persist it into a database. My requirement says to either insert all of the data or insert none of it, i.e., atomicity. However, using Spring Batch I'm not able to achieve this: it reads data in chunks and inserts records as long as they are fine. If at some point a record is inappropriate and a DB exception is thrown, I want a complete rollback, which is not happening. Say we get the error at the 2051st record: my code then saves 2050 records, but I want a complete rollback, and if all the data is good, all N records should be persisted. Thanks in advance for any help or relevant approach that may solve my issue.
NOTE: I have already used Spring's @Transactional annotation on the caller method, but it's not working, and I'm reading data in a chunk size of 10 items.
MyConfiguration.java
@Configuration
public class MyConfiguration
{
    @Autowired
    JobBuilderFactory jobBuilderFactory;

    @Autowired
    StepBuilderFactory stepBuilderFactory;

    @Autowired
    @Qualifier("MyCompletionListener")
    JobCompletionNotificationListener jobCompletionNotificationListener;

    @StepScope
    @Bean(name="MyReader")
    public FlatFileItemReader<InputMapperDTO> reader(@Value("#{jobParameters['fileName']}") String fileName) throws IOException
    {
        FlatFileItemReader<InputMapperDTO> newBean = new FlatFileItemReader<>();
        newBean.setName("MyReader");
        newBean.setResource(new InputStreamResource(FileUtils.openInputStream(new File(fileName))));
        newBean.setLineMapper(lineMapper());
        newBean.setLinesToSkip(1);
        return newBean;
    }

    @Bean(name="MyLineMapper")
    public DefaultLineMapper<InputMapperDTO> lineMapper()
    {
        DefaultLineMapper<InputMapperDTO> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(lineTokenizer());
        Reader reader = new Reader();
        lineMapper.setFieldSetMapper(reader);
        return lineMapper;
    }

    @Bean(name="MyTokenizer")
    public DelimitedLineTokenizer lineTokenizer()
    {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setDelimiter("|");
        tokenizer.setNames("InvestmentAccountUniqueIdentifier", "BaseCurrencyUniqueIdentifier",
                "OperatingCurrencyUniqueIdentifier", "PricingHierarchyUniqueIdentifier", "InvestmentAccountNumber",
                "DummyAccountIndicator", "InvestmentAdvisorCompanyNumberLegacy", "HighNetWorthAccountTypeCode");
        tokenizer.setIncludedFields(0, 5, 7, 13, 29, 40, 49, 75);
        return tokenizer;
    }

    @Bean(name="MyBatchProcessor")
    public ItemProcessor<InputMapperDTO, FinalDTO> processor()
    {
        return new Processor();
    }

    @Bean(name="MyWriter")
    public ItemWriter<FinalDTO> writer()
    {
        return new Writer();
    }

    @Bean(name="MyStep")
    public Step step1() throws IOException
    {
        return stepBuilderFactory.get("MyStep")
                .<InputMapperDTO, FinalDTO>chunk(10)
                .reader(this.reader(null))
                .processor(this.processor())
                .writer(this.writer())
                .build();
    }

    @Bean(name="MyJob")
    public Job importUserJob(@Autowired @Qualifier("MyStep") Step step1)
    {
        return jobBuilderFactory
                .get("MyJob" + new Date())
                .incrementer(new RunIdIncrementer())
                .listener(jobCompletionNotificationListener)
                .flow(step1)
                .end()
                .build();
    }
}
Writer.java
public class Writer implements ItemWriter<FinalDTO>
{
    @Autowired
    SomeRepository someRepository;

    @Override
    public void write(List<? extends FinalDTO> listOfObjects) throws Exception
    {
        someRepository.saveAll(listOfObjects);
    }
}
JobCompletionNotificationListener.java
public class JobCompletionNotificationListener extends JobExecutionListenerSupport
{
    @Override
    public void afterJob(JobExecution jobExecution)
    {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED)
        {
            System.err.println("****************************************");
            System.err.println("*****    Batch Job Completed      ******");
            System.err.println("****************************************");
        }
        else
        {
            System.err.println("****************************************");
            System.err.println("*****      Batch Job Failed       ******");
            System.err.println("****************************************");
        }
    }
}
MyCallerMethod
@Transactional
public String processFile(String datFile) throws JobExecutionAlreadyRunningException, JobRestartException,
        JobInstanceAlreadyCompleteException, JobParametersInvalidException
{
    long st = System.currentTimeMillis();
    JobParametersBuilder builder = new JobParametersBuilder();
    builder.addString("fileName", datFile);
    builder.addDate("date", new Date());
    jobLauncher.run(job, builder.toJobParameters());
    System.err.println("****************************************");
    System.err.println("***** Total time consumed = " + (System.currentTimeMillis() - st) + " ******");
    System.err.println("****************************************");
    return response;
}
The operation I tried is not provided by Spring Batch out of the box: each chunk is committed in its own transaction, so a @Transactional around the launching method has no effect on the job's inserts. For my requirement, I implemented a custom delete that cleans the inserted rows from the database upon failure in any step.
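For illustration, a minimal sketch of such a compensating cleanup as a job listener, under two assumptions that go beyond the question's code: each persisted row records the id of the job execution that wrote it, and SomeRepository exposes a deleteByJobExecutionId method (both hypothetical):

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;
import org.springframework.beans.factory.annotation.Autowired;

// Sketch only: on failure, delete everything this execution inserted.
public class CleanupOnFailureListener extends JobExecutionListenerSupport {

    @Autowired
    private SomeRepository someRepository;

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.FAILED) {
            // deleteByJobExecutionId is a hypothetical repository method
            someRepository.deleteByJobExecutionId(jobExecution.getId());
        }
    }
}

The listener would be registered on the job in the same way as jobCompletionNotificationListener above.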

Spring Batch - Read query from file and execute it on database

I know I can simply read the file straight from step1, a moment before setting the SQL query on the reader, but I want to keep the process of reading the query separate from the database reading.
Here is my job configuration.
@Configuration
public class BatchConfiguration {
    [...]
    @Bean
    @StepScope
    public JdbcCursorItemReader<Map<String, Object>> dynamicSqlItemReader() {
        JdbcCursorItemReader<Map<String, Object>> jir = new JdbcCursorItemReader<>();
        jir.setSql((String) contextHolder.getContext().get("fileContent"));
        jir.setDataSource(dataSource);
        jir.setRowMapper(new ColumnMapRowMapper());
        return jir;
    }

    private FlatFileItemReader<String> flatFileItemReader() {
        [...]
    }

    private ItemWriter<? super String> sysoItemWriter() {
        return (ItemWriter<String>) list -> {
            for (String element : list) {
                System.out.println(element);
            }
            contextHolder.getContext().put("fileContent", list.get(0));
        };
    }

    @Bean
    public ItemWriter<Map<String, Object>> customerItemWriter() {
        return list -> {
            for (Map<String, Object> stringObjectMap : list) {
                System.out.println(stringObjectMap);
            }
        };
    }

    @Bean
    public Step step0() {
        return stepBuilderFactory.get("step0")
                .<String, String>chunk(1)
                .reader(flatFileItemReader())
                .writer(sysoItemWriter())
                .build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Map<String, Object>, Map<String, Object>>chunk(10)
                .reader(dynamicSqlItemReader())
                .writer(customerItemWriter())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .start(step0())
                .next(step1())
                .build();
    }
}
This throws a java.lang.IllegalArgumentException: The SQL query must be provided, because contextHolder.getContext().get("fileContent") is still null at the time the query is set.
Before step1, you could write a tasklet that builds the query and puts it into the execution context; that keeps it separate and also makes it available to step1. See more about tasklets here: Tasklet to delete a table in spring batch
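A minimal sketch of that idea inside the BatchConfiguration above, assuming the dataSource from the elided [...] part (and java.nio.file/StandardCharsets imports); the queryFile job parameter and the sqlQuery context key are names of my own choosing:

@Bean
@StepScope
public Tasklet readQueryTasklet(@Value("#{jobParameters['queryFile']}") String queryFile) {
    // Reads the SQL file and publishes its content in the job ExecutionContext
    // so that later steps can see it.
    return (contribution, chunkContext) -> {
        String sql = new String(Files.readAllBytes(Paths.get(queryFile)), StandardCharsets.UTF_8);
        chunkContext.getStepContext().getStepExecution()
                .getJobExecution().getExecutionContext()
                .putString("sqlQuery", sql);
        return RepeatStatus.FINISHED;
    };
}

@Bean
@StepScope
public JdbcCursorItemReader<Map<String, Object>> dynamicSqlItemReader(
        @Value("#{jobExecutionContext['sqlQuery']}") String sql) {
    // The query now comes from the job ExecutionContext instead of the custom
    // contextHolder, so it is resolved when step1 starts, after the tasklet ran.
    JdbcCursorItemReader<Map<String, Object>> jir = new JdbcCursorItemReader<>();
    jir.setSql(sql);
    jir.setDataSource(dataSource);
    jir.setRowMapper(new ColumnMapRowMapper());
    return jir;
}

step0 would then become a tasklet step, and because both beans are step-scoped, the SpEL expressions are only evaluated when their steps actually start.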
You are not using the contextHolder you created properly; that's why the value there is null.
Make sure you actually put your data into contextHolder in flatFileItemReader() as a plain map entry, because when you fetch the value you are calling contextHolder.getContext(). Since it's a simple map, not an ApplicationContext, the method you are using does not exist.

method annotated with @BeforeStep not getting called

My goal is passing some value from a tasklet to another step, which consists of an ItemReader, processor and writer. Based on the reference, I should use ExecutionContextPromotionListener. However, for some reason @BeforeStep is not being called. This is what I have.
Tasklet
@Component
public class RequestTasklet implements Tasklet {

    @Autowired
    private HistoryRepository historyRepository;

    @Override
    public RepeatStatus execute(StepContribution sc, ChunkContext cc) throws Exception {
        List<History> requests = historyRepository.findHistory();
        ExecutionContext stepContext =
                cc.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
        stepContext.put("someKey", requests);
        return RepeatStatus.FINISHED;
    }
}
ItemReader
@Component
public class RequestReader implements ItemReader<History> {

    private List<History> requests;

    @Override
    public History read() throws UnexpectedInputException,
            ParseException,
            NonTransientResourceException {
        System.out.println("requests====>" + requests);
        if (CollectionUtils.isNotEmpty(requests)) {
            History request = requests.remove(0);
            return request;
        }
        return null;
    }

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        System.out.println("====================here======================");
        JobExecution jobExecution = stepExecution.getJobExecution();
        ExecutionContext jobContext = jobExecution.getExecutionContext();
        this.requests = (List<History>) jobContext.get("someKey");
    }
}
Configuration
#Bean(name = "update-job")
public Job updateUserAttributes(
JobBuilderFactory jbf,
StepBuilderFactory sbf,
ExecutionContextPromotionListener promoListener,
HistoryProcessor processor,
RequestReader reader,
HistoryWriter writer,
RequestTasklet loadRequestTasklet) {
Step preStep = sbf.get("load-request-tasklet")
.tasklet(loadRequestTasklet)
.listener(promoListener)
.build();
Step step = sbf.get("update-step")
.<History, History>chunk(2)
.reader(reader)
.processor(processor)
.writer(writer)
.taskExecutor(taskExecutor())
.build();
return jbf.get("update-job")
.incrementer(new RunIdIncrementer())
.start(preStep).next(step).
build();
}
#Bean
public ExecutionContextPromotionListener promoListener() {
ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
listener.setKeys(new String[]{
"someKey",
});
return listener;
}
I also tried extending StepExecutionListenerSupport in the ItemReader but got the same result.
I googled around and looked at this question, which was answered as being caused by a Spring proxy, which is not my case: beforestep issue.
A little more information as I test: I set listener.setStrict(Boolean.TRUE); to see whether the keys are set, but I am getting
java.lang.IllegalArgumentException: The key [someKey] was not found in the Step's ExecutionContext.
at org.springframework.batch.core.listener.ExecutionContextPromotionListener.afterStep(ExecutionContextPromotionListener.java:61) ~[spring-batch-core-4.0.0.M1.jar:4.0.0.M1]
Appreciate any help.
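One thing stands out in the code above: ExecutionContextPromotionListener promotes keys from the step's ExecutionContext into the job's after the step completes, but the tasklet writes "someKey" directly into the job context via getJobExecution().getExecutionContext(). The step context therefore never contains the key, which is exactly what the strict-mode error reports. A minimal sketch of the execute method writing to the step context instead (this addresses the promotion error; it does not by itself explain a @BeforeStep that is never invoked):

@Override
public RepeatStatus execute(StepContribution sc, ChunkContext cc) throws Exception {
    List<History> requests = historyRepository.findHistory();
    // Put the list in the STEP execution context; after the step completes,
    // ExecutionContextPromotionListener copies "someKey" into the JOB context,
    // where the reader's beforeStep(..) can pick it up.
    ExecutionContext stepContext =
            cc.getStepContext().getStepExecution().getExecutionContext();
    stepContext.put("someKey", requests);
    return RepeatStatus.FINISHED;
}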
