Read New File While Doing Processing For A Field In Spring Batch - spring-boot

I am reading a fixed-length input file using Spring Batch.
I have already implemented the Job, Step, Processor, etc.
Here is the sample code.
@Configuration
public class BatchConfig {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;

    @Value("${inputFile}")
    private Resource resource;

    @Autowired
    public BatchConfig(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    @Bean
    public Job job() {
        return this.jobBuilderFactory.get("JOB-Load")
                .start(fileReadingStep())
                .build();
    }

    @Bean
    public Step fileReadingStep() {
        return stepBuilderFactory.get("File-Read-Step1")
                .<Employee, EmpOutput>chunk(1000)
                .reader(itemReader())
                .processor(new CustomFileProcesser())
                .writer(new CustomFileWriter())
                .faultTolerant()
                .skipPolicy(skipPolicy())
                .build();
    }

    @Bean
    public FlatFileItemReader<Employee> itemReader() {
        FlatFileItemReader<Employee> flatFileItemReader = new FlatFileItemReader<Employee>();
        flatFileItemReader.setResource(resource);
        flatFileItemReader.setName("File-Reader");
        flatFileItemReader.setLineMapper(lineMapper());
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<Employee> lineMapper() {
        DefaultLineMapper<Employee> defaultLineMapper = new DefaultLineMapper<Employee>();
        FixedLengthTokenizer fixedLengthTokenizer = new FixedLengthTokenizer();
        fixedLengthTokenizer.setNames(new String[] { "employeeId", "employeeName", "employeeSalary" });
        // Note: the ranges must not overlap; the original code had Range(10, 20) followed by Range(20, 30).
        fixedLengthTokenizer.setColumns(new Range[] { new Range(1, 9), new Range(10, 20), new Range(21, 30) });
        fixedLengthTokenizer.setStrict(false);
        defaultLineMapper.setLineTokenizer(fixedLengthTokenizer);
        defaultLineMapper.setFieldSetMapper(new CustomFieldSetMapper());
        return defaultLineMapper;
    }

    @Bean
    public JobSkipPolicy skipPolicy() {
        return new JobSkipPolicy();
    }
}
For processing I have added some sample code of what I need, but if I add a BufferedReader here, the job takes much more time.
@Component
public class CustomFileProcesser implements ItemProcessor<Employee, EmpOutput> {

    @Override
    public EmpOutput process(Employee item) throws Exception {
        EmpOutput emp = new EmpOutput();
        emp.setEmployeeSalary(checkSal(item.getEmployeeSalary()));
        return emp;
    }

    public String checkSal(String sal) {
        // Need to read another file, do some validation there,
        // and return the final result.
        // Note: this opens and scans the whole lookup file for every single item,
        // which is what makes the job slow.
        File f1 = new File("C:\\Users\\John\\New\\salary.txt");
        try (BufferedReader br = new BufferedReader(new FileReader(f1))) {
            String s = br.readLine();
            while (s != null) {
                String value = s.substring(5, 7);
                if (value.equals(sal)) {
                    return value; // match found
                }
                s = br.readLine();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return "5000"; // default when no match is found
    }

    // Other fields need to be checked by reading different files.
    // Each of these files contains more than 30k records.
    // All of them are fixed-length files, and I need to get each field by index.
}
While processing one or more fields, I need to check another file by reading it (a file that I read from the file system or from the cloud).
While processing the data for 5 fields, I need to read 5 different files; I check the field details inside those files and then generate the result, which is processed further.

You can cache the content of the file in memory and do your check against the cache instead of re-reading the entire file from disk for each item.
You can find an example here: Spring Batch With Annotation and Caching.
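As a minimal sketch of that idea (assuming the lookup file fits in memory; the cache field name is illustrative, and the "5000" default mirrors the question's code):

public class CustomFileProcesser implements ItemProcessor<Employee, EmpOutput> {

    // Illustrative cache: loaded once, reused for every item.
    private Set<String> salaryCache;

    @Override
    public EmpOutput process(Employee item) throws Exception {
        if (salaryCache == null) {
            salaryCache = loadSalaryCache(); // read the 30k-record file once, not once per item
        }
        EmpOutput emp = new EmpOutput();
        String sal = item.getEmployeeSalary();
        emp.setEmployeeSalary(salaryCache.contains(sal) ? sal : "5000");
        return emp;
    }

    private Set<String> loadSalaryCache() throws IOException {
        Set<String> cache = new HashSet<>();
        try (BufferedReader br = new BufferedReader(new FileReader("C:\\Users\\John\\New\\salary.txt"))) {
            String line;
            while ((line = br.readLine()) != null) {
                cache.add(line.substring(5, 7)); // same fixed-length slice as in the question
            }
        }
        return cache;
    }
}

The same pattern applies to the other four lookup files: load each one into its own set or map once (lazily as above, or in a @BeforeStep method), keyed by the index you need, and validate against those in-memory structures inside process().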

Related

Can anyone help me with a Spring Batch issue? (Unintended Spring Batch schedule)

The implemented function sends an LMS message to the user at the alarm time.
It sends a total of 4 alarms per day (9:00, 13:00, 19:00, 21:00).
A log is recorded regardless of success or failure.
Nothing was recorded in the log, but when I looked at the batch data in the DB, I found unintended COMPLETED executions.
Issue:
The batch was successfully executed at 9:00 and 13:00 on the 18th.
But at 13:37, which is not a scheduled time at all, it executed (and FAILED).
Subsequently, it executed at 13:38, 13:40, 13:42, and 13:44 (all COMPLETED).
Q1. Why was the batch executed when it wasn't even the scheduled execution time?
Q2. I save a log whenever the batch executes and sends an SMS. The log was printed normally at 9 and 13 o'clock,
but no log was saved for the unscheduled runs (13:37, 38, 40, 42, 44).
I checked the Spring Boot service and the Tomcat service on the single server;
CPU and memory usage are normal.
Batch problem details:
Spring Boot (2.2.6.RELEASE)
Spring Boot - Embedded Tomcat
===== Start Scheduler =====
@Component
public class DosageAlarmScheduler {

    public static final int MORNING_HOUR = 9;
    public static final int LUNCH_HOUR = 13;
    public static final int DINNER_HOUR = 19;
    public static final int BEFORE_SLEEP_HOUR = 21;

    // 'log', 'jobLauncher' and 'alarmJob' are assumed to be provided in the
    // original code (e.g. Lombok @Slf4j and dependency injection).

    @Scheduled(cron = "0 0 */1 * * *") // every hour
    public void executeDosageAlarmJob() {
        LocalDateTime nowDateTime = LocalDateTime.now();
        try {
            if (isExecuteTime(nowDateTime)) {
                log.info("[Send LMS], {}", nowDateTime);
                EatFixCd eatFixCd = currentEatFixCd(nowDateTime);
                jobLauncher.run(
                        alarmJob,
                        new JobParametersBuilder()
                                .addString("currentDate", nowDateTime.toString())
                                .addString("eatFixCodeValue", eatFixCd.getCodeValue())
                                .toJobParameters()
                );
            } else {
                log.info("[Not Send LMS], {}", nowDateTime);
            }
        } catch (JobExecutionAlreadyRunningException e) {
            log.error("[JobExecutionAlreadyRunningException]", e);
        } catch (JobRestartException e) {
            log.error("[JobRestartException]", e);
        } catch (JobInstanceAlreadyCompleteException e) {
            log.error("[JobInstanceAlreadyCompleteException]", e);
        } catch (JobParametersInvalidException e) {
            log.error("[JobParametersInvalidException]", e);
        } catch (Exception e) {
            log.error("[Exception]", e);
        }
    }

    /* Start private methods */
    private boolean isExecuteTime(LocalDateTime nowDateTime) {
        return nowDateTime.getHour() == MORNING_HOUR
                || nowDateTime.getHour() == LUNCH_HOUR
                || nowDateTime.getHour() == DINNER_HOUR
                || nowDateTime.getHour() == BEFORE_SLEEP_HOUR;
    }

    private EatFixCd currentEatFixCd(LocalDateTime nowDateTime) {
        switch (nowDateTime.getHour()) {
            case MORNING_HOUR:
                return EatFixCd.MORNING;
            case LUNCH_HOUR:
                return EatFixCd.LUNCH;
            case DINNER_HOUR:
                return EatFixCd.DINNER;
            case BEFORE_SLEEP_HOUR:
                return EatFixCd.BEFORE_SLEEP;
            default:
                throw new RuntimeException("Not Dosage Time");
        }
    }
    /* End private methods */
}
===== End Scheduler =====
===== Start Job =====
@Configuration
public class DosageAlarmConfiguration {

    private final int chunkSize = 20;
    private final JobBuilderFactory jobBuilderFactory;
    private final StepBuilderFactory stepBuilderFactory;
    private final EntityManagerFactory entityManagerFactory;
    // 'log' and the constructor injecting the final fields are assumed
    // (e.g. Lombok @Slf4j and @RequiredArgsConstructor) in the original code.

    @Bean
    public Job dosageAlarmJob() {
        log.info("[dosageAlarmJob execute]");
        return jobBuilderFactory.get("dosageAlarmJob")
                .start(dosageAlarmStep(null, null)).build();
    }

    @Bean
    @JobScope
    public Step dosageAlarmStep(
            @Value("#{jobParameters[currentDate]}") String currentDate,
            @Value("#{jobParameters[eatFixCodeValue]}") String eatFixCodeValue
    ) {
        log.info("[dosageAlarm Step execute]");
        return stepBuilderFactory.get("dosageAlarmStep")
                .<Object[], DosageReceiverInfoDto>chunk(chunkSize)
                .reader(dosageAlarmReader(currentDate, eatFixCodeValue))
                .processor(dosageAlarmProcessor(currentDate, eatFixCodeValue))
                .writer(dosageAlarmWriter(currentDate, eatFixCodeValue))
                .build();
    }

    @Bean
    @StepScope
    public JpaPagingItemReader<Object[]> dosageAlarmReader(
            @Value("#{jobParameters[currentDate]}") String currentDate,
            @Value("#{jobParameters[eatFixCodeValue]}") String eatFixCodeValue
    ) {
        log.info("[dosageAlarm Reader execute : {}, {}]", currentDate, eatFixCodeValue);
        if (currentDate == null) {
            return null;
        } else {
            JpaPagingItemReader<Object[]> jpaPagingItemReader = new JpaPagingItemReader<>();
            jpaPagingItemReader.setName("dosageAlarmReader");
            jpaPagingItemReader.setEntityManagerFactory(entityManagerFactory);
            jpaPagingItemReader.setPageSize(chunkSize);
            jpaPagingItemReader.setQueryString("select das from DosageAlarm das where :currentDate between das.startDate and das.endDate ");
            HashMap<String, Object> parameterValues = new HashMap<>();
            parameterValues.put("currentDate", LocalDateTime.parse(currentDate).toLocalDate());
            jpaPagingItemReader.setParameterValues(parameterValues);
            return jpaPagingItemReader;
        }
    }

    @Bean
    @StepScope
    public ItemProcessor<Object[], DosageReceiverInfoDto> dosageAlarmProcessor(
            @Value("#{jobParameters[currentDate]}") String currentDate,
            @Value("#{jobParameters[eatFixCodeValue]}") String eatFixCodeValue
    ) {
        log.info("[dosageAlarm Processor execute : {}, {}]", currentDate, eatFixCodeValue);
        ...
        convert to DosageReceiverInfoDto
        ...
    }

    @Bean
    @StepScope
    public ItemWriter<DosageReceiverInfoDto> dosageAlarmWriter(
            @Value("#{jobParameters[currentDate]}") String currentDate,
            @Value("#{jobParameters[eatFixCodeValue]}") String eatFixCodeValue
    ) {
        log.info("[dosageAlarm Writer execute : {}, {}]", currentDate, eatFixCodeValue);
        ...
        make List
        ...
        if (reqMessageDtoList != null) {
            sendMessages(reqMessageDtoList);
        } else {
            log.info("[reqMessageDtoList not Exist]");
        }
    }

    public SmsExternalSendResDto sendMessages(List<reqMessagesDto> reqMessageDtoList) {
        log.info("[receiveList] smsTypeCd : {}, contentTypeCd : {}, messages : {}", smsTypeCd.LMS, contentTypeCd.COMM, reqMessageDtoList);
        ...
        send Messages
    }
}
===== End Job =====
Thank you.
I want to fix my problem, and I hope this question helps other people too.

Loading values from a JSON file upon application startup

I want to load values from a JSON file when the Spring Boot application starts.
My code for the configuration class is like the below:
@Configuration
@Getter
public class FedexAPIConfig {

    private final static String JSON_FILE = "/static/config/fedex-api-credentials.json";
    private final boolean IS_PRODUCTION = false;
    private FedexAPICred apiCredentials;

    public FedexAPIConfig() {
        try (InputStream in = getClass().getResourceAsStream(JSON_FILE);
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            JSONObject json = new JSONObject();
            // this.apiCredentials = new JSONObject(new JSONTokener(reader));
            if (IS_PRODUCTION) {
                json = new JSONObject(new JSONTokener(reader)).getJSONObject("production");
            } else {
                json = new JSONObject(new JSONTokener(reader)).getJSONObject("test");
            }
            System.out.println(json.toString());
            this.apiCredentials = FedexAPICred.builder()
                    .url(json.optString("url"))
                    .apiKey(json.optString("api_key"))
                    .secretKey(json.optString("secret_key"))
                    .build();
        } catch (FileNotFoundException fnfe) {
            fnfe.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
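For reference, the file layout implied by the keys this constructor reads would be something like the following (all values are placeholders):

{
  "test": {
    "url": "...",
    "api_key": "...",
    "secret_key": "..."
  },
  "production": {
    "url": "...",
    "api_key": "...",
    "secret_key": "..."
  }
}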
With this, while the application is starting up, the values are successfully printed on the console.
When I try to access these values from an ordinary class, like the one below, it brings nothing and just throws a NullPointerException... What is my fault, and what shall I do?
public class FedexOAuthTokenManager extends OAuthToken {

    private static final String VALIDATE_TOKEN_URL = "/oauth/token";
    private static final String GRANT_TYPE_CLIENT = "client_credentials";
    private static final String GRANT_TYPE_CSP = "csp_credentials";

    @Autowired
    private FedexAPIConfig fedexApiConfig;

    @Autowired
    private Token token;

    @Override
    public void validateToken() {
        // This is the part where "fedexApiConfig" is null.
        FedexAPICred fedexApiCred = fedexApiConfig.getApiCredentials();
        Response response = null;
        try {
            RequestBody body = new FormBody.Builder()
                    .add("grant_type", GRANT_TYPE_CLIENT)
                    .add("client_id", fedexApiCred.getApiKey())
                    .add("client_secret", fedexApiCred.getSecretKey())
                    .build();
            response = new HttpClient().post(fedexApiCred.getUrl() + VALIDATE_TOKEN_URL, body);
            if (response.code() == 200) {
                JSONObject json = new JSONObject(response.body().string());
                token.setAccessToken(json.optString("access_token"));
                token.setTokenType(json.optString("token_type"));
                token.setExpiredIn(json.optInt("expires_in"));
                token.setExpiredDateTime(LocalDateTime.now().plusSeconds(json.optInt("expires_in")));
                token.setScope(json.optString("scope"));
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
fedexApiConfig is null even though I autowired it prior to the call.
And this FedexOAuthTokenManager is instantiated from another @Component class via new FedexOAuthTokenManager().
Did you try it like below?
Step 1: Create one configuration class like below:
public class DemoConfig implements ApplicationListener<ApplicationPreparedEvent> {

    @Override
    public void onApplicationEvent(ApplicationPreparedEvent event) {
        // Load the values from the JSON file and populate the application
        // properties dynamically.
        ConfigurableEnvironment environment = event.getApplicationContext().getEnvironment();
        Properties props = new Properties();
        props.put("spring.datasource.url", "<my value>");
        // Add more properties
        environment.getPropertySources().addFirst(new PropertiesPropertySource("myProps", props));
    }
}
To listen to a context event, a bean should implement the ApplicationListener interface, which has just one method: onApplicationEvent(). The ApplicationPreparedEvent is invoked very early in the lifecycle of the application.
Step 2: Customize in src/main/resources/META-INF/spring.factories
org.springframework.context.ApplicationListener=com.example.demo.DemoConfig
Step 3: @Value in Spring Boot is commonly used to inject configuration values into the application. Access the properties as you wish.
@Value("${spring.datasource.url}")
private String valueFromJSon;
Try this sample first on your local machine and then adapt it to your changes accordingly.
Refer - https://www.baeldung.com/spring-value-annotation
Refer - https://www.knowledgefactory.net/2021/02/aws-secret-manager-service-as.html

Writing in multiple unrelated tables in spring batch writer

Can we have a writer that writes to 2 different, unrelated tables simultaneously in Spring Batch? Along with the main data, I need to store some metadata in a different table. How can I go about it?
Please find an example below. Let's say you have 3 tables to write to:
@Bean
public CompositeItemWriter<YourDTO> compositeWriter() throws Exception {
    CompositeItemWriter<YourDTO> compositeItemWriter = new CompositeItemWriter<>();
    List<ItemWriter<? super YourDTO>> writers = new ArrayList<>();
    writers.add(firstTableWriter());
    writers.add(secondTableWriter());
    writers.add(thirdTableWriter());
    compositeItemWriter.setDelegates(writers);
    return compositeItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> firstTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO FIRSTTABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new FirstTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> secondTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO SECONDTABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new SecondTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> thirdTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO THIRDTABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new ThirdTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

// Setter class example
public class FirstTableSetter implements ItemPreparedStatementSetter<YourDTO> {

    @Override
    public void setValues(YourDTO yourDTO, PreparedStatement preparedStatement) throws SQLException {
        preparedStatement.setString(1, yourDTO.getMyValue());
    }
}
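To use the composite writer, plug it into the step like any other writer. A minimal sketch (the reader bean name and chunk size here are placeholders, not part of the original answer):

@Bean
public Step multiTableWriteStep(StepBuilderFactory stepBuilderFactory) throws Exception {
    return stepBuilderFactory.get("multiTableWriteStep")
            .<YourDTO, YourDTO>chunk(100)  // illustrative chunk size
            .reader(yourDtoReader())       // hypothetical reader bean producing YourDTO items
            .writer(compositeWriter())
            .build();
}

All delegates are invoked for each chunk within the same transaction, so either every table receives the chunk's rows or, on failure, none does.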

Spring Batch file header

I have a writer with a header callback that looks like this:
// Writer
@Bean(name = "cms200Writer")
@StepScope
public FlatFileItemWriter<Cms200Item> cmsWriter(@Value("#{jobExecutionContext}") Map<Object, Object> ec, //
        @Qualifier("cms200LineAggregator") FormatterLineAggregator<Cms200Item> lineAgg) throws IOException {
    @SuppressWarnings("unchecked")
    String fileName = ((Map<String, MccFtpFile>) ec.get(AbstractSetupTasklet.BATCH_FTP_FILES)).get("cms").getLocalFile();
    // Ensure the file can exist.
    PrintWriter fos = getIoHarness().getFileOutputStream(fileName);
    fos.close();
    FlatFileItemWriter<Cms200Item> writer = new FlatFileItemWriter<>();
    writer.setResource(new FileSystemResource(fileName));
    writer.setLineAggregator(lineAgg);
    Calendar cal = Calendar.getInstance();
    Date date = cal.getTime();
    DateFormat dateFormat = new SimpleDateFormat("HH:mm:ss");
    String formattedDate = dateFormat.format(date);
    writer.setHeaderCallback(new FlatFileHeaderCallback() {
        public void writeHeader(Writer writer) throws IOException {
            writer.write(" Test Company. " + formattedDate);
            writer.write("\n CMS200 CUSTOMER SHIPMENT MANIFEST AUTHORIZATION BY CUSTOMER NAME Page 1");
            writer.write("\n\n");
            writer.write(" CUSTOMER NAME CITY ST CONTROL MNFST ID AUTH CODE I03 CLS EDI EXPRESS POV MOST CURRENT DEACTIVE");
            writer.write("\n");
            writer.write(" NBR TRL 214 WORK ACCESS DATE ");
        }
    });
    return writer;
}
I want to print this header every time 53 records are processed. I can't figure out how to implement that logic in my Spring Batch job. I have the writeCount added to my execution context, but I'm not sure how to access it here, or whether that's the correct approach.
The writer I posted is in my BatchConfiguration.java file.
EDIT:
Below is my file step with the chunk size added:
@Bean(name = "cms200FileStep")
public Step createFileStep(StepBuilderFactory stepFactory, //
        @Qualifier("cms200Reader") ItemReader<Cms200Item> reader, //
        Cms200Processor processor, //
        @Qualifier("cms200Writer") ItemWriter<Cms200Item> writer) {
    return stepFactory.get("cms200FileStep") //
            .<Cms200Item, Cms200Item>chunk(100000) //
            .reader(reader) //
            .processor(processor) //
            .writer(writer).chunk(53) //
            .allowStartIfComplete(true) //
            .build();
}
Edit: Added job config
// Job
@Bean(name = "mccCMSCLRPTjob")
public Job mccCmsclrptjob(JobBuilderFactory jobFactory, //
        @Qualifier("cms200SetupStep") Step setupStep, //
        @Qualifier("cms200FileStep") Step fileStep, //
        @Qualifier("putFtpFilesStep") Step putFtpStep, //
        @Qualifier("cms200TeardownStep") Step teardownStep, //
        @Autowired SingleInstanceListener listener, //
        @Autowired ChunkSizeListener chunkListener) { //
    return jobFactory.get("mccCMSCLRPTjob") //
            .incrementer(new RunIdIncrementer()) //
            .listener(listener) //
            .start(setupStep) //
            .next(fileStep) //
            .next(putFtpStep) //
            .next(teardownStep) //
            .build();
}
Edit: adding the listener
@Bean(name = "cms200FileStep")
public Step createFileStep(StepBuilderFactory stepFactory, //
        @Qualifier("cms200Reader") ItemReader<Cms200Item> reader, //
        Cms200Processor processor, //
        @Qualifier("cms200Writer") ItemWriter<Cms200Item> writer, //
        @Autowired ChunkSizeListener listener) {
    return stepFactory.get("cms200FileStep") //
            .<Cms200Item, Cms200Item>chunk(100000) //
            .reader(reader) //
            .processor(processor) //
            .writer(writer).chunk(53) //
            .allowStartIfComplete(true) //
            .listener(listener) //
            .build();
}
EDIT: After a lot of back and forth this is where I'm at
// Utility Methods
@Bean(name = "cms200FileStep")
public Step createFileStep(StepBuilderFactory stepFactory, Map<Object, Object> ec, //
        @Qualifier("cms200Reader") ItemReader<Cms200Item> reader, //
        Cms200Processor processor, //
        @Qualifier("cms200Writer") ItemWriter<Cms200Item> writer) throws IOException {
    @SuppressWarnings("unchecked")
    String fileName = ((Map<String, MccFtpFile>) ec.get(AbstractSetupTasklet.BATCH_FTP_FILES)).get("cms").getLocalFile();
    return stepFactory.get("cms200FileStep") //
            .<Cms200Item, Cms200Item>chunk(100000) //
            .reader(reader) //
            .processor(processor) //
            .writer(writer).chunk(53) //
            .allowStartIfComplete(true) //
            // .listener((ChunkListener) listener) //
            .listener((ChunkListener) new ChunkSizeListener(new File(fileName))) //
            .build();
}
The FlatFileHeaderCallback is called only once, before the chunk-oriented step, i.e. before all chunks.
I want to print this header every time 53 records are processed
What you can do is set the chunk size to 53 and use a ChunkListener or ItemWriteListener to write the required data.
EDIT: Added an example:
class MyChunkListener extends StepListenerSupport {

    private FileWriter fileWriter;

    public MyChunkListener(File file) throws IOException {
        this.fileWriter = new FileWriter(file, true);
    }

    @Override
    public void beforeChunk(ChunkContext context) {
        try {
            fileWriter.write("your custom header");
            fileWriter.flush();
        } catch (IOException e) {
            System.err.println("Unable to write header to file");
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        try {
            fileWriter.close();
        } catch (IOException e) {
            System.err.println("Unable to close writer");
        }
        return super.afterStep(stepExecution);
    }
}
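One registration note: because a listener extending StepListenerSupport implements several listener interfaces at once, the step builder's listener(...) overloads can be ambiguous at compile time, which is presumably why the last edit in the question casts the listener to ChunkListener when registering it on the step.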

How to set the override on a compound trigger?

I have a Spring Integration application that normally polls daily for a file via SFTP using a cron trigger. But if it doesn't find the file it expects, it should poll every x minutes via a periodic trigger, up to y attempts. To do this I use the following component:
@Component
public class RetryCompoundTriggerAdvice extends AbstractMessageSourceAdvice {

    private final static Logger logger = LoggerFactory.getLogger(RetryCompoundTriggerAdvice.class);

    private final CompoundTrigger compoundTrigger;
    private final Trigger override;
    private final ApplicationProperties applicationProperties;
    private final Mail mail;
    private int attempts = 0;

    public RetryCompoundTriggerAdvice(CompoundTrigger compoundTrigger,
            @Qualifier("secondaryTrigger") Trigger override,
            ApplicationProperties applicationProperties,
            Mail mail) {
        this.compoundTrigger = compoundTrigger;
        this.override = override;
        this.applicationProperties = applicationProperties;
        this.mail = mail;
    }

    @Override
    public boolean beforeReceive(MessageSource<?> source) {
        return true;
    }

    @Override
    public Message<?> afterReceive(Message<?> result, MessageSource<?> source) {
        final int maxOverrideAttempts = applicationProperties.getMaxFileRetry();
        attempts++;
        if (result == null && attempts < maxOverrideAttempts) {
            logger.info("Unable to find load file after " + attempts + " attempt(s). Will reattempt");
            this.compoundTrigger.setOverride(this.override);
        } else if (result == null && attempts >= maxOverrideAttempts) {
            mail.sendAdminsEmail("Missing File");
            attempts = 0;
            this.compoundTrigger.setOverride(null);
        } else {
            attempts = 0;
            this.compoundTrigger.setOverride(null);
            logger.info("Found load file");
        }
        return result;
    }

    public void setOverrideTrigger() {
        this.compoundTrigger.setOverride(this.override);
    }

    public CompoundTrigger getCompoundTrigger() {
        return compoundTrigger;
    }
}
If a file doesn't exist, this works great. That is, the override (i.e. the periodic trigger) takes effect and polls every x minutes, up to y attempts.
However, if a file does exist but it's not the expected file (e.g. the data is for the wrong date), another class (the one that reads the file) calls the setOverrideTrigger method of the RetryCompoundTriggerAdvice class. But afterReceive is not subsequently called every x minutes. Why would this be?
Here's more of the application code:
SftpInboundFileSynchronizer:
@Bean
public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
    SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setRemoteDirectory(applicationProperties.getSftpDirectory());
    CompositeFileListFilter<ChannelSftp.LsEntry> compositeFileListFilter = new CompositeFileListFilter<ChannelSftp.LsEntry>();
    compositeFileListFilter.addFilter(new SftpPersistentAcceptOnceFileListFilter(store, "sftp"));
    compositeFileListFilter.addFilter(new SftpSimplePatternFileListFilter(applicationProperties.getLoadFileNamePattern()));
    fileSynchronizer.setFilter(compositeFileListFilter);
    fileSynchronizer.setPreserveTimestamp(true);
    return fileSynchronizer;
}
The session factory is:
@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory sftpSessionFactory = new DefaultSftpSessionFactory();
    sftpSessionFactory.setHost(applicationProperties.getSftpHost());
    sftpSessionFactory.setPort(applicationProperties.getSftpPort());
    sftpSessionFactory.setUser(applicationProperties.getSftpUser());
    sftpSessionFactory.setPassword(applicationProperties.getSftpPassword());
    sftpSessionFactory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<LsEntry>(sftpSessionFactory);
}
The SftpInboundFileSynchronizingMessageSource is set to poll using the compound trigger.
@Bean
@InboundChannelAdapter(autoStartup = "true", channel = "sftpChannel", poller = @Poller("pollerMetadata"))
public SftpInboundFileSynchronizingMessageSource sftpMessageSource() {
    SftpInboundFileSynchronizingMessageSource source =
            new SftpInboundFileSynchronizingMessageSource(sftpInboundFileSynchronizer());
    source.setLocalDirectory(applicationProperties.getScheduledLoadDirectory());
    source.setAutoCreateLocalDirectory(true);
    CompositeFileListFilter<File> compositeFileFilter = new CompositeFileListFilter<File>();
    compositeFileFilter.addFilter(new LastModifiedFileListFilter());
    compositeFileFilter.addFilter(new FileSystemPersistentAcceptOnceFileListFilter(store, "dailyfilesystem"));
    source.setLocalFilter(compositeFileFilter);
    source.setCountsEnabled(true);
    return source;
}

@Bean
public PollerMetadata pollerMetadata(RetryCompoundTriggerAdvice retryCompoundTriggerAdvice) {
    PollerMetadata pollerMetadata = new PollerMetadata();
    List<Advice> adviceChain = new ArrayList<Advice>();
    adviceChain.add(retryCompoundTriggerAdvice);
    pollerMetadata.setAdviceChain(adviceChain);
    pollerMetadata.setTrigger(compoundTrigger());
    pollerMetadata.setMaxMessagesPerPoll(1);
    return pollerMetadata;
}

@Bean
public CompoundTrigger compoundTrigger() {
    CompoundTrigger compoundTrigger = new CompoundTrigger(primaryTrigger());
    return compoundTrigger;
}

@Bean
public CronTrigger primaryTrigger() {
    return new CronTrigger(applicationProperties.getSchedule());
}

@Bean
public PeriodicTrigger secondaryTrigger() {
    return new PeriodicTrigger(applicationProperties.getRetryInterval());
}
Update
Here's the message handler:
@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler dailyHandler(SimpleJobLauncher jobLauncher, Job job, Mail mail) {
    JobRunner jobRunner = new JobRunner(jobLauncher, job, store, mail);
    jobRunner.setDaily("true");
    jobRunner.setOverwrite("false");
    return jobRunner;
}
JobRunner kicks off a Spring Batch job. After processing the job, my application checks whether the file had the data expected for the day. If not, it sets the override trigger.
That's the way triggers work: you only get an opportunity to change the trigger when the trigger fires.
Since you reset to the cron trigger, the next opportunity for change is when that trigger fires (if the poller thread is released by the downstream flow before the trigger is changed).
Are you handing off the file to another thread (queue channel or executor)? If not, I would expect any changes to the trigger to be applied, because nextExecutionTime() will not be called until the downstream flow returns.
If there's a thread handoff, you have no opportunity to change the trigger.
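To see why, consider a simplified sketch of what a compound trigger does (this is an illustration, not the actual Spring Integration source):

public class CompoundTriggerSketch implements Trigger {

    private final Trigger primary;
    private volatile Trigger override;

    public CompoundTriggerSketch(Trigger primary) {
        this.primary = primary;
    }

    public void setOverride(Trigger override) {
        this.override = override;
    }

    @Override
    public Date nextExecutionTime(TriggerContext triggerContext) {
        // The override is consulted only here, when the poller asks for the
        // next execution time after the current poll (including its downstream
        // flow) has finished. Calling setOverride() while the poller thread is
        // busy elsewhere therefore has no effect until the currently scheduled
        // trigger fires again.
        Trigger current = (override != null) ? override : primary;
        return current.nextExecutionTime(triggerContext);
    }
}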
