Records are not written to files when invoked from BillerOrderWriter, which implements ItemWriter, in Spring Batch

I am trying to write successful records using one writer and failed records using another writer.
I have written a BillerOrderWriter class which implements ItemWriter. I added some log statements and I can see that it logs the success billerOrderId or the failed billerOrderId. But it seems like it never invokes DatabaseToCsvFileJobConfig or SuccessfulOrdersToCsvFileJobConfig.
public class BillerOrderWriter implements ItemWriter<BillerOrder> {

    private static Logger log = LoggerFactory.getLogger("BillerOrderWriter.class");

    @Autowired
    SuccessfulOrdersToCsvFileJobConfig successfulOrdersToCsvFileJobConfig;

    @Autowired
    DatabaseToCsvFileJobConfig databaseToCsvFileJobConfig;

    @Override
    public void write(List<? extends BillerOrder> items) throws Exception {
        for (BillerOrder item : items) {
            log.info("item = " + item.toString());
            if (item.getResult().equals("SUCCESS")) {
                log.info(" Success billerOrderId = " + item.getBillerOrderId());
                successfulOrdersToCsvFileJobConfig.successfulDatabaseCsvItemWriter();
            } else {
                log.info("Failed billerOrderId = " + item.getBillerOrderId());
                databaseToCsvFileJobConfig.databaseCsvItemWriter();
            }
        }
    }
}
Here is the BatchConfig class.
@Bean
public BillerOrderWriter billerOrderWriter() {
    return new BillerOrderWriter();
}

@Bean
public Job importJobOrder(JobCompletionNotificationListner listener, Step step1) {
    return jobBuilderFactory.get("importJobOrder")
            .incrementer(new RunIdIncrementer())
            .listener(listener)
            .flow(step1)
            .end()
            .build();
}

@Bean(name = "step1")
public Step step1(BillerOrderWriter billerOrderWriter) {
    return stepBuilderFactory.get("step1")
            .<BillerOrder, BillerOrder>chunk(10)
            .reader((ItemReader<? extends BillerOrder>) reader())
            .processor(processor())
            .writer(billerOrderWriter)
            .build();
}
Here are my success-writer and failed-writer classes.
@Configuration
public class SuccessfulOrdersToCsvFileJobConfig {

    private static Logger log = LoggerFactory.getLogger("SuccessfulOrdersToCsvFileJobConfig.class");

    @Bean
    public ItemWriter<BillerOrder> successfulDatabaseCsvItemWriter() {
        log.info("Entering SuccessfulOrdersToCsvFileJobConfig...");
        FlatFileItemWriter<BillerOrder> csvFileWriter = new FlatFileItemWriter<>();
        String exportFileHeader = "BillerOrderId;SuccessMessage";
        OrderWriter headerWriter = new OrderWriter(exportFileHeader);
        csvFileWriter.setHeaderCallback(headerWriter);
        String exportFilePath = "/tmp/SuccessBillerOrderIdForRetry.csv";
        csvFileWriter.setResource(new FileSystemResource(exportFilePath));
        LineAggregator<BillerOrder> lineAggregator = createOrderLineAggregator();
        csvFileWriter.setLineAggregator(lineAggregator);
        return csvFileWriter;
    }

    private LineAggregator<BillerOrder> createOrderLineAggregator() {
        log.info("Entering createOrderLineAggregator...");
        DelimitedLineAggregator<BillerOrder> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setDelimiter(";");
        FieldExtractor<BillerOrder> fieldExtractor = createOrderFieldExtractor();
        lineAggregator.setFieldExtractor(fieldExtractor);
        return lineAggregator;
    }

    private FieldExtractor<BillerOrder> createOrderFieldExtractor() {
        log.info("Entering createOrderFieldExtractor...");
        BeanWrapperFieldExtractor<BillerOrder> extractor = new BeanWrapperFieldExtractor<>();
        extractor.setNames(new String[] {"billerOrderId", "successMessage"});
        return extractor;
    }
}
@Configuration
public class DatabaseToCsvFileJobConfig {

    private static Logger log = LoggerFactory.getLogger("DatabaseToCsvFileJobConfig.class");

    @Bean
    public ItemWriter<BillerOrder> databaseCsvItemWriter() {
        log.info("Entering databaseCsvItemWriter...");
        FlatFileItemWriter<BillerOrder> csvFileWriter = new FlatFileItemWriter<>();
        String exportFileHeader = "BillerOrderId;ErrorMessage";
        OrderWriter headerWriter = new OrderWriter(exportFileHeader);
        csvFileWriter.setHeaderCallback(headerWriter);
        String exportFilePath = "/tmp/FailedBillerOrderIdForRetry.csv";
        csvFileWriter.setResource(new FileSystemResource(exportFilePath));
        LineAggregator<BillerOrder> lineAggregator = createOrderLineAggregator();
        csvFileWriter.setLineAggregator(lineAggregator);
        return csvFileWriter;
    }

    private LineAggregator<BillerOrder> createOrderLineAggregator() {
        log.info("Entering createOrderLineAggregator...");
        DelimitedLineAggregator<BillerOrder> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setDelimiter(";");
        FieldExtractor<BillerOrder> fieldExtractor = createOrderFieldExtractor();
        lineAggregator.setFieldExtractor(fieldExtractor);
        return lineAggregator;
    }

    private FieldExtractor<BillerOrder> createOrderFieldExtractor() {
        log.info("Entering createOrderFieldExtractor...");
        BeanWrapperFieldExtractor<BillerOrder> extractor = new BeanWrapperFieldExtractor<>();
        extractor.setNames(new String[] {"billerOrderId", "errorMessage"});
        return extractor;
    }
}
Here is my job completion listener class.
@Component
public class JobCompletionNotificationListner extends JobExecutionListenerSupport {

    private static final org.slf4j.Logger log = LoggerFactory.getLogger(JobCompletionNotificationListner.class);

    @Override
    public void afterJob(JobExecution jobExecution) {
        log.info("In afterJob ...");
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            DatabaseToCsvFileJobConfig databaseToCsvFileJobConfig = new DatabaseToCsvFileJobConfig();
            SuccessfulOrdersToCsvFileJobConfig successfulOrdersToCsvFileJobConfig = new SuccessfulOrdersToCsvFileJobConfig();
        }
    }
}

Your BillerOrderWriter#write method is supposed to perform the actual write of items to a data sink. In your case, however, you are calling successfulOrdersToCsvFileJobConfig.successfulDatabaseCsvItemWriter() and databaseToCsvFileJobConfig.databaseCsvItemWriter(), which only create item writer beans; they never write anything. You should instead inject those delegate writers and call their write method when needed, something like:
public class BillerOrderWriter implements ItemWriter<BillerOrder> {

    private ItemWriter<BillerOrder> successfulDatabaseCsvItemWriter;
    private ItemWriter<BillerOrder> databaseCsvItemWriter;

    public BillerOrderWriter(ItemWriter<BillerOrder> successfulDatabaseCsvItemWriter,
                             ItemWriter<BillerOrder> databaseCsvItemWriter) {
        this.successfulDatabaseCsvItemWriter = successfulDatabaseCsvItemWriter;
        this.databaseCsvItemWriter = databaseCsvItemWriter;
    }

    @Override
    public void write(List<? extends BillerOrder> items) throws Exception {
        for (BillerOrder item : items) {
            if (item.getResult().equals("SUCCESS")) {
                successfulDatabaseCsvItemWriter.write(Collections.singletonList(item));
            } else {
                databaseCsvItemWriter.write(Collections.singletonList(item));
            }
        }
    }
}
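A minimal sketch of how those delegates could be wired in BatchConfig (an assumption on my part, relying on the two writer beans defined in SuccessfulOrdersToCsvFileJobConfig and DatabaseToCsvFileJobConfig being resolvable by parameter name):

@Bean
public BillerOrderWriter billerOrderWriter(
        ItemWriter<BillerOrder> successfulDatabaseCsvItemWriter,
        ItemWriter<BillerOrder> databaseCsvItemWriter) {
    // Constructor injection of the two delegate writers; Spring matches the
    // parameter names against the @Bean method names shown above.
    return new BillerOrderWriter(successfulDatabaseCsvItemWriter, databaseCsvItemWriter);
}

Keep in mind that FlatFileItemWriter is also an ItemStream, so the delegate writers still need to be opened, for example by registering them as streams on the step, which is what the ClassifierCompositeItemWriter variant below does.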

Instead of BillerOrderWriter, I wrote a BillerOrderClassifier class.
public class BillerOrderClassifier implements Classifier<BillerOrder, ItemWriter<? super BillerOrder>> {

    private static final long serialVersionUID = 1L;

    private ItemWriter<BillerOrder> successItemWriter;
    private ItemWriter<BillerOrder> failedItemWriter;

    public BillerOrderClassifier(ItemWriter<BillerOrder> successItemWriter, ItemWriter<BillerOrder> failedItemWriter) {
        this.successItemWriter = successItemWriter;
        this.failedItemWriter = failedItemWriter;
    }

    @Override
    public ItemWriter<? super BillerOrder> classify(BillerOrder billerOrder) {
        return billerOrder.getResult().equals("SUCCESS") ? successItemWriter : failedItemWriter;
    }
}
In BatchConfiguration, I wrote a classifierBillerOrderCompositeItemWriter method.
@Bean
public ClassifierCompositeItemWriter<BillerOrder> classifierBillerOrderCompositeItemWriter() throws Exception {
    ClassifierCompositeItemWriter<BillerOrder> compositeItemWriter = new ClassifierCompositeItemWriter<>();
    compositeItemWriter.setClassifier(new BillerOrderClassifier(
            successfulOrdersToCsvFileJobConfig.successfulDatabaseCsvItemWriter(),
            databaseToCsvFileJobConfig.databaseCsvItemWriter()));
    return compositeItemWriter;
}

@Bean(name = "step1")
public Step step1() throws Exception {
    return stepBuilderFactory.get("step1")
            .<BillerOrder, BillerOrder>chunk(10)
            .reader((ItemReader<? extends BillerOrder>) reader())
            .processor(processor())
            .writer(classifierBillerOrderCompositeItemWriter())
            .stream(successfulOrdersToCsvFileJobConfig.successfulDatabaseCsvItemWriter())
            .stream(databaseToCsvFileJobConfig.databaseCsvItemWriter())
            .build();
}
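One note on why this works: ClassifierCompositeItemWriter does not implement ItemStream, so it cannot open or close its delegate writers itself. Registering the two FlatFileItemWriters with .stream(...) on the step is what makes the framework call their open/update/close methods, so the files actually get created and flushed.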

Related

Listener not getting message in Redis Pub/Sub with Spring Boot

I am relatively new to Redis Pub/Sub. I have recently integrated it into my Spring Boot application.
Redis Pub/Sub configuration is as follows:
@Configuration
public class RedisPubSubConfiguration {

    @Bean
    public RedisMessageListenerContainer messageListenerContainer(
            RedisConnectionFactory connectionFactory,
            @Qualifier("topicAdapterPair") List<Pair<Topic, MessageListenerAdapter>> channelAdaperPairList) {
        RedisMessageListenerContainer container = new RedisMessageListenerContainer();
        for (Pair<Topic, MessageListenerAdapter> chanelAdapterPair : channelAdaperPairList) {
            container.addMessageListener(chanelAdapterPair.getValue(),
                    chanelAdapterPair.getKey());
        }
        container.setConnectionFactory(connectionFactory);
        return container;
    }
    @Bean("msg-listener-adptr-1")
    public MessageListenerAdapter messageListnerAdapter1(
            @Qualifier("message-listener-1") MessageListener listener) {
        return new MessageListenerAdapter(listener, REDIS_RECEIVER_METHOD_NAME);
    }

    @Bean("message-listener-1")
    public MessageListener messageListener1(ManagerProxy managerProxy) {
        return new MessageListener1(managerProxy);
    }

    @Bean("message-sender-1")
    public MessageSender messageSender1(RedisTemplate redisTemplate,
            @Value("${chnlTopicName1}") String channelTopicName) {
        return new MessageSender1(redisTemplate, channelTopicName);
    }

    @Bean
    @Qualifier("topicAdapterPair")
    public Pair<Topic, MessageListenerAdapter> getTopicListenerAdapterpair1(
            @Value("${chnlTopicName1}") String channelTopicName,
            @Qualifier("msg-listener-adptr-1") MessageListenerAdapter messageListenerAdapter) {
        return Pair.of(new ChannelTopic(channelTopicName), messageListenerAdapter);
    }

    @Bean("msg-listener-adptr-2")
    public MessageListenerAdapter messageListnerAdapter2(
            @Qualifier("message-listener-2") MessageListener listener) {
        return new MessageListenerAdapter(listener, REDIS_RECEIVER_METHOD_NAME);
    }

    @Bean("message-listener-2")
    public MessageListener messageListener2(NotificationServiceImpl notificationService) {
        return new MessageListener2(notificationService);
    }

    @Bean("message-sender-2")
    public MessageSender messageSender2(RedisTemplate redisTemplate,
            @Value("${chnlTopicName2}") String channelTopicName) {
        return new MessageSender2(redisTemplate, channelTopicName);
    }

    @Bean
    @Qualifier("topicAdapterPair")
    public Pair<Topic, MessageListenerAdapter> getTopicListenerAdapterPair2(
            @Value("${chnlTopicName2}") String channelTopicName,
            @Qualifier("msg-listener-adptr-2") MessageListenerAdapter messageListenerAdapter) {
        return Pair.of(new ChannelTopic(channelTopicName), messageListenerAdapter);
    }
}
MessageSender2 is as follows:
public class MessageSender2 implements MessageSender<MyDTO> {

    private final RedisTemplate<String, Object> redisTemplate;
    private final String chanelName;

    public MessageSender2(RedisTemplate<String, Object> redisTemplate, String chanelName) {
        this.redisTemplate = redisTemplate;
        this.chanelName = chanelName;
    }

    @Override
    public void send(MyDTO myDTO) {
        redisTemplate.convertAndSend(chanelName, myDTO);
    }
}
MessageListener2 is as follows:
public class MessageListener2 implements MessageListener<EventDTO> {

    private static final Logger LOGGER = LoggerFactory.getLogger(MessageListener2.class);

    private final NotificationService notificationService;

    public MessageListener2(NotificationServiceImpl notificationService) {
        this.notificationService = notificationService;
    }

    @Override
    public void receiveMessage(MyDTO message) {
        // <-- HERE THE MESSAGE IS NOT COMING, EVEN AFTER PUBLISHING A MESSAGE TO THE ASSOCIATED TOPIC FROM THE PUBLISHER
        LOGGER.info("Received message : {} ", message);
        Type type = message.getType();
        ...
    }
}
MessageSender1 is as follows:
public class MessageSender1 implements MessageSender<String> {

    private final RedisTemplate<String, Object> redisTemplate;
    private final String chanelName;

    public MessageSender1(RedisTemplate<String, Object> redisTemplate, String chanelName) {
        this.redisTemplate = redisTemplate;
        this.chanelName = chanelName;
    }

    @Override
    public void send(String message) {
        redisTemplate.convertAndSend(chanelName, message);
    }
}
The associated listener is as follows:
public class MessageListener1 implements MessageListener<String> {

    private static final Logger LOGGER = LoggerFactory.getLogger(MessageListener1.class);

    private final ManagerProxy managerProxy;

    public MessageListener1(ManagerProxy managerProxy) {
        this.managerProxy = managerProxy;
    }

    public void receiveMessage(String message) {
        LOGGER.info("Received message : {} ", message);
        managerProxy.refresh();
    }
}
MessageSender1 and its associated listener work fine, but I don't understand what I did wrong with MessageSender2 and its listener that prevents the message from being received.

How to use ClassifierCompositeItemProcessor in Spring Batch and write data into same table for Insert and Upsert?

I went through the link https://github.com/spring-projects/spring-batch/blob/master/spring-batch-infrastructure/src/test/java/org/springframework/batch/item/support/ClassifierCompositeItemProcessorTests.java, but could not get much out of it.
I am trying to replace ETL Informatica mapping logic with Spring Batch. I want to route Status=I and Status=U records to separate (individual) processors. For Status=I, perform a lookup, massage the data, and write those records directly into the table; for Status=U, perform more complex logic (lookups, massaging, match-and-merge) and then upsert those records into the same table.
I tried a POC where I segregate the records in the processor.
CustomerClassifier.java
public class CustomerClassifier implements Classifier<Customer, ItemProcessor<Customer, Customer>> {

    private ItemProcessor<Customer, Customer> insertCustomerProcessor;
    private ItemProcessor<Customer, Customer> updateCustomerProcessor;

    public CustomerClassifier(ItemProcessor<Customer, Customer> evenCustomerProcessor, ItemProcessor<Customer, Customer> oddCustomerProcessor) {
        this.insertCustomerProcessor = insertCustomerProcessor;
        this.updateCustomerProcessor = updateCustomerProcessor;
    }

    @Override
    public ItemProcessor<Customer, Customer> classify(Customer customer) {
        return customer.getStatus().equals("I") ? insertCustomerProcessor : updateCustomerProcessor;
    }
}
OddCustomerProcessor.java
public class OddCustomerProcessor implements ItemProcessor<Customer, Customer> {

    @Override
    public Customer process(Customer item) throws Exception {
        Customer customer = new Customer();
        // Perform some massaging and lookups here
        customer.setId(item.getId());
        customer.setFirstName(item.getFirstName());
        customer.setLastName(item.getLastName());
        customer.setBirthdate(item.getBirthdate());
        customer.setStatus(item.getStatus());
        return customer;
    }
}
EvenCustomerProcessor.java
public class EvenCustomerProcessor implements ItemProcessor<Customer, Customer> {

    @Override
    public Customer process(Customer item) throws Exception {
        Customer customer = new Customer();
        // Perform some massaging and lookups here
        customer.setId(item.getId());
        customer.setFirstName(item.getFirstName());
        customer.setLastName(item.getLastName());
        customer.setBirthdate(item.getBirthdate());
        customer.setStatus(item.getStatus());
        return customer;
    }
}
CustomLineAggregator.java
public class CustomLineAggregator implements LineAggregator<Customer> {

    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public String aggregate(Customer item) {
        try {
            return objectMapper.writeValueAsString(item);
        } catch (Exception e) {
            throw new RuntimeException("Unable to serialize Customer", e);
        }
    }
}
Customer.java
@Data
@AllArgsConstructor
@Builder
@NoArgsConstructor
public class Customer {
    private Long id;
    private String firstName;
    private String lastName;
    private String birthdate;
    private String status;
}
Error:
The method setClassifier(Classifier<? super Customer, ItemProcessor<?, ? extends Customer>>) in the type ClassifierCompositeItemProcessor<Customer, Customer> is not applicable for the arguments (CustomerClassifier)
Configuration
@Configuration
public class JobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource dataSource;

    @Bean
    public JdbcPagingItemReader<Customer> customerPagingItemReader() {
        // reading database records using JDBC in a paging fashion
        JdbcPagingItemReader<Customer> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(this.dataSource);
        reader.setFetchSize(1000);
        reader.setRowMapper(new CustomerRowMapper());
        // Sort keys
        Map<String, Order> sortKeys = new HashMap<>();
        sortKeys.put("id", Order.ASCENDING);
        // MySQL implementation of a PagingQueryProvider using database specific features.
        MySqlPagingQueryProvider queryProvider = new MySqlPagingQueryProvider();
        queryProvider.setSelectClause("id, firstName, lastName, birthdate");
        queryProvider.setFromClause("from customer");
        queryProvider.setSortKeys(sortKeys);
        reader.setQueryProvider(queryProvider);
        return reader;
    }
    @Bean
    public EvenCustomerProcessor evenCustomerProcessor() {
        return new EvenCustomerProcessor();
    }

    @Bean
    public OddCustomerProcessor oddCustomerProcessor() {
        return new OddCustomerProcessor();
    }

    @Bean
    public JdbcBatchItemWriter<Customer> customerItemWriter() {
        JdbcBatchItemWriter<Customer> batchItemWriter = new JdbcBatchItemWriter<>();
        batchItemWriter.setDataSource(dataSource);
        batchItemWriter.setSql(""); // Query goes here
        return batchItemWriter;
    }

    @Bean
    public ClassifierCompositeItemProcessor<Customer, Customer> classifierCustomerCompositeItemProcessor() throws Exception {
        ClassifierCompositeItemProcessor<Customer, Customer> itemProcessor = new ClassifierCompositeItemProcessor<>();
        itemProcessor.setClassifier(new CustomerClassifier(evenCustomerProcessor(), oddCustomerProcessor()));
        return itemProcessor;
    }
    @Bean
    public Step step1() throws Exception {
        return stepBuilderFactory.get("step1")
                .<Customer, Customer>chunk(10)
                .reader(customerPagingItemReader())
                .processor(classifierCustomerCompositeItemProcessor())
                .writer(customerItemWriter())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job")
                .start(step1())
                .build();
    }
}
You can remove the CustomerClassifier and define the composite item processor as follows:
@Bean
public ClassifierCompositeItemProcessor<Customer, Customer> classifierCustomerCompositeItemProcessor(
        EvenCustomerProcessor evenCustomerProcessor,
        OddCustomerProcessor oddCustomerProcessor) {
    ClassifierCompositeItemProcessor<Customer, Customer> itemProcessor = new ClassifierCompositeItemProcessor<>();
    itemProcessor.setClassifier(new Classifier<Customer, ItemProcessor<?, ? extends Customer>>() {
        @Override
        public ItemProcessor<?, ? extends Customer> classify(Customer customer) {
            return customer.getStatus().equals("I") ? evenCustomerProcessor : oddCustomerProcessor;
        }
    });
    return itemProcessor;
}
Then update your step definition as follows:
@Bean
public Step step1() throws Exception {
    return stepBuilderFactory.get("step1")
            .<Customer, Customer>chunk(10)
            .reader(customerPagingItemReader())
            .processor(classifierCustomerCompositeItemProcessor(evenCustomerProcessor(), oddCustomerProcessor()))
            .writer(customerItemWriter())
            .build();
}
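Since Classifier has a single abstract method, the setClassifier call above can also be written as a lambda, which reads a bit more compactly (same two processor beans assumed):

itemProcessor.setClassifier(customer ->
        customer.getStatus().equals("I") ? evenCustomerProcessor : oddCustomerProcessor);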

PayloadRootSmartSoapEndpointInterceptor Intercepts multiple EndPoints

I'm trying to add a custom interceptor to the interceptors list in my endpoint config, but I have a problem where PayloadRootSmartSoapEndpointInterceptor intercepts two of my endpoints instead of one. I have defined two SOAP endpoints using spring-ws.
@EnableWs
@Configuration
@Order(1)
public class Config extends WsConfigurerAdapter {

    private String namespaceBti = "http://tarim.bull.ro/BullTarimWS/BTIService";
    private String namespaceBtiLst = "http://tarim.bull.ro/BullTarimWS/BTILSTService";

    @Bean
    public ServletRegistrationBean messageDispatcherServlet(ApplicationContext applicationContext) {
        MessageDispatcherServlet servlet = new MessageDispatcherServlet();
        servlet.setApplicationContext(applicationContext);
        servlet.setTransformWsdlLocations(true);
        return new ServletRegistrationBean(servlet, "/public/btiWS/*");
    }

    // Service 1
    @Bean(name = "BTIService")
    public DefaultWsdl11Definition defaultWsdl11DefinitionBti(@Qualifier("BTISchema") XsdSchema certificateSchema) {
        DefaultWsdl11Definition wsdl11Definition = new DefaultWsdl11Definition();
        wsdl11Definition.setPortTypeName("BtiPort");
        wsdl11Definition.setLocationUri("/public/btiWS"); // <context-path>
        wsdl11Definition.setTargetNamespace(namespaceBti);
        wsdl11Definition.setRequestSuffix("Input");
        wsdl11Definition.setResponseSuffix("Output");
        wsdl11Definition.setSchema(certificateSchema);
        return wsdl11Definition;
    }

    @Bean(name = "BTISchema")
    public XsdSchema certificateSchemaBti() {
        return new SimpleXsdSchema(new ClassPathResource("xml-resources/GETBTI.xsd"));
    }

    // Service 2
    @Bean(name = "BTILSTService") // name of the wsdl in the URL
    public DefaultWsdl11Definition defaultWsdl11DefinitionBtiLst(@Qualifier("BTILSTSchema") XsdSchema certificateSchema) {
        DefaultWsdl11Definition wsdl11Definition = new DefaultWsdl11Definition();
        wsdl11Definition.setPortTypeName("BtiLstPort");
        wsdl11Definition.setLocationUri("/public/btiWS"); // <context-path>
        wsdl11Definition.setTargetNamespace(namespaceBtiLst);
        wsdl11Definition.setRequestSuffix("Input");
        wsdl11Definition.setResponseSuffix("Output");
        wsdl11Definition.setSchema(certificateSchema);
        return wsdl11Definition;
    }

    @Bean(name = "BTILSTSchema")
    public XsdSchema certificateSchemaBtiLst() {
        return new SimpleXsdSchema(new ClassPathResource("xml-resources/GETBTILST.xsd"));
    }

    @Autowired
    private WriteBtiDto writeBtiDto;
Adding a custom interceptor to the list:
    @Override
    public void addInterceptors(List<EndpointInterceptor> interceptors) {
        interceptors.add(new PayloadRootSmartSoapEndpointInterceptor(
                new BtiEndpointInterceptor(), // let Spring build and manage the bean, not me
                BtiEndpoint.getNamespaceUri(),
                BtiEndpoint.getLocalPart()
        ));
    }
}
BTI EndPoint
@Endpoint()
public class BtiEndpoint {

    private static final String NAMESPACE_URI = "http://tarim.bull.ro/BullTarimWS/BTIService";
    private static final String LOCAL_PART = "CXMLTYPE-GETBTIInput";

    @PayloadRoot(namespace = NAMESPACE_URI, localPart = LOCAL_PART)
    @ResponsePayload
    public CXMLTYPEGETBTIOutput getBTI(@RequestPayload CXMLTYPEGETBTIInput request) {
        CXMLTYPEGETBTIOutput response = new CXMLTYPEGETBTIOutput();
        return response;
    }

    // GETTERS AND SETTERS FOR NAMESPACE AND LOCAL PART
}
BTILST EndPoint
@Endpoint()
public class BtiLstEndpoint {

    private static final String NAMESPACE_URI = "http://tarim.bull.ro/BullTarimWS/BTILSTService";
    private static final String LOCAL_PART = "CXMLTYPE-GETBTILSTInput";

    @PayloadRoot(namespace = NAMESPACE_URI, localPart = LOCAL_PART)
    @ResponsePayload
    public CXMLTYPEGETBTILSTOutput getBTI(@RequestPayload CXMLTYPEGETBTILSTInput request) {
        CXMLTYPEGETBTILSTOutput response = new CXMLTYPEGETBTILSTOutput();
        return response;
    }

    // GETTERS AND SETTERS FOR NAMESPACE AND LOCAL PART
}
EndpointInterceptor
@Component
public class BtiEndpointInterceptor implements EndpointInterceptor {

    private static final Log LOG = LogFactory.getLog(BtiEndpointInterceptor.class);

    @Override
    public boolean handleRequest(MessageContext messageContext, Object o) throws Exception {
        LOG.info("1. Global Request Handling");
        return true;
    }

    @Override
    public boolean handleResponse(MessageContext messageContext, Object o) throws Exception {
        LOG.info("2. Global Response Handling");
        return true;
    }

    @Override
    public boolean handleFault(MessageContext messageContext, Object o) throws Exception {
        LOG.info("Global Exception Handling");
        return true;
    }

    @Override
    public void afterCompletion(MessageContext messageContext, Object endpoint, Exception ex) {
    }
}

Springboot Batch get CSV from rest API

I have a scenario for a Spring Boot batch application where I need to read CSV-1 and write it to another CSV-2. The problem is that I need to get CSV-1 from a REST endpoint: when I initiate the job, it should fetch CSV-1 from the endpoint and then continue the batch processing to write CSV-2.
But it seems like, to process CSV-1, we need to feed the Resource in advance at application startup [I am fairly new to batch so not sure if that's 100% correct].
Can anyone please guide me on the right approach to resolve this?
EDIT: Adding code. I was able to resolve it, but need advice on whether it's the right way to do things (please ignore the hard-coded data).
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobbuilder;

    @Autowired
    private StepBuilderFactory stepbuilder;

    @Autowired
    private CSVProcessor processor;

    @Autowired
    private CustomSkipPolicy customSkipPolicy;

    @Autowired
    private CSVResponse response;

    @Bean(name = "csv")
    public Job job_csv() throws Exception {
        Step step = stepbuilder
                .get("csv-step")
                .<Person, Person>chunk(5)
                .reader(new CSVReader(response.getResource()))
                .processor(processor)
                .writer(new CSVWriter().write(response.getFilename()))
                .faultTolerant()
                .skipPolicy(customSkipPolicy)
                .listener(new StepListener())
                .build();
        return jobbuilder
                .get("csv-job")
                .incrementer(new RunIdIncrementer())
                .listener(new JobListener())
                .flow(step)
                .end()
                .build();
    }
}
@Slf4j
public class CSVReader extends FlatFileItemReader<Person> {

    public CSVReader(Resource resource) throws Exception {
        super();
        setResource(resource);
        setStrict(false);
        setLinesToSkip(1);
        doOpen();
        DelimitedLineTokenizer dlt = new DelimitedLineTokenizer();
        dlt.setNames(new String[] {"id", "first_name", "last_name", "email", "gender", "ip_address", "dob"});
        dlt.setDelimiter(",");
        dlt.setStrict(false);
        BeanWrapperFieldSetMapper<Person> fsp = new BeanWrapperFieldSetMapper<>();
        fsp.setTargetType(Person.class);
        DefaultLineMapper<Person> dlp = new DefaultLineMapper<>();
        dlp.setLineTokenizer(dlt);
        dlp.setFieldSetMapper(fsp);
        setLineMapper(dlp);
    }
}
public class CSVWriter {

    private static final String DATA_PROCESSED = "C:/data/processed";

    public FlatFileItemWriter<Person> write(String filename) throws Exception {
        FlatFileItemWriter<Person> writer = new FlatFileItemWriter<>();
        writer.setResource(resource(filename));
        writer.setHeaderCallback(new Header());
        BeanWrapperFieldExtractor<Person> fe = new BeanWrapperFieldExtractor<>();
        fe.setNames(new String[] { "id", "first_name", "last_name", "age" });
        DelimitedLineAggregator<Person> dla = new DelimitedLineAggregator<>();
        dla.setDelimiter(",");
        dla.setFieldExtractor(fe);
        writer.setLineAggregator(dla);
        return writer;
    }

    private Resource resource(String filename) throws IOException {
        String processed_file = filename.replace(".csv", "").concat("_PROCESSED").concat(".csv");
        if (Files.notExists(Paths.get(DATA_PROCESSED), new LinkOption[] { LinkOption.NOFOLLOW_LINKS })) {
            Files.createDirectory(Paths.get(DATA_PROCESSED));
        }
        if (Files.notExists(Paths.get(DATA_PROCESSED + "/" + processed_file),
                new LinkOption[] { LinkOption.NOFOLLOW_LINKS })) {
            Files.createFile(Paths.get(DATA_PROCESSED + "/" + processed_file));
        }
        return new FileSystemResource(DATA_PROCESSED + "/" + processed_file);
    }
}
@Component
@Slf4j
public class CSVResponse {

    private static final String DATA_RECEIVE = "C:/data/receive";

    private String filename;

    public Resource getResource() throws IOException {
        ResponseEntity<Resource> resp = new RestTemplate().getForEntity("http://localhost:8081/csv", Resource.class);
        log.info("File Received : " + resp.getBody().getFilename());
        setFilename(resp.getBody().getFilename());
        Path path = Paths.get(DATA_RECEIVE);
        if (!Files.exists(path, new LinkOption[]{ LinkOption.NOFOLLOW_LINKS })) {
            Files.createDirectory(path);
        }
        Files.copy(resp.getBody().getInputStream(), Paths.get(DATA_RECEIVE + "/" + resp.getBody().getFilename()), StandardCopyOption.REPLACE_EXISTING);
        return resp.getBody();
    }

    public String getFilename() {
        return filename;
    }

    public void setFilename(String filename) {
        this.filename = filename;
    }
}
You can have your item reader implement the StepExecutionListener interface and fetch the CSV-1 file via RestTemplate in the reader's beforeStep() method.
You can see how to use RestTemplate at https://www.baeldung.com/rest-template
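A rough sketch of that idea, as an assumption rather than a definitive implementation (the endpoint URL and target directory are the placeholder values from the code above, and the Person line-mapper setup is elided):

@Component
@StepScope
public class RestCsvItemReader extends FlatFileItemReader<Person> implements StepExecutionListener {

    // Placeholder endpoint and target directory, same values as in the question
    private static final String CSV_ENDPOINT = "http://localhost:8081/csv";
    private static final String DATA_RECEIVE = "C:/data/receive";

    public RestCsvItemReader() {
        setLinesToSkip(1);
        setStrict(false);
        // configure the DelimitedLineTokenizer / DefaultLineMapper here, as in CSVReader above
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        try {
            // Download CSV-1 before the step opens the reader
            ResponseEntity<Resource> resp = new RestTemplate().getForEntity(CSV_ENDPOINT, Resource.class);
            Path target = Paths.get(DATA_RECEIVE, resp.getBody().getFilename());
            Files.createDirectories(target.getParent());
            Files.copy(resp.getBody().getInputStream(), target, StandardCopyOption.REPLACE_EXISTING);
            // beforeStep runs before the step opens its streams, so setting the resource here is early enough
            setResource(new FileSystemResource(target.toFile()));
        } catch (IOException e) {
            throw new IllegalStateException("Could not fetch CSV-1 from " + CSV_ENDPOINT, e);
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }
}

Depending on the Spring Batch version, you may need to register the reader explicitly as a listener on the step (e.g. .listener(restCsvItemReader)) so that beforeStep() is actually invoked.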

The FlatFileItemReader reads only one line from the CSV file - Spring Batch

I'm creating a Spring Batch job to populate data into a database table from a given CSV file.
I created a customized FlatFileItemReader.
My problem is that the read() method is invoked only once, so only the first line of my CSV file is inserted into the database.
@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {

    private MultipartFile[] files;

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory,
                   ItemReader<MyModelEntity> itemReader,
                   ItemWriter<MyModelEntity> itemWriter) {
        Step step = stepBuilderFactory.get("Load-CSV-file_STP")
                .<MyModelEntity, MyModelEntity>chunk(12)
                .reader(itemReader)
                .writer(itemWriter)
                .build();
        return jobBuilderFactory.get("Load-CSV-Files")
                .incrementer(new RunIdIncrementer())
                .start(step)
                .build();
    }

    @Bean
    ItemReader<MyModelEntity> myModelCsvReader() throws Exception {
        return new MyModelCsvReader();
    }
}
The MyModelCsvReader:
@Component
@StepScope
public class MyModelCsvReader implements ItemReader<MyModelEntity> {

    @Value("#{jobParameters['SDH']}")
    private String sdhPath;

    private boolean batchJobState = false;

    @Autowired
    MyModelFieldSetMapper myModelFieldSetMapper;

    public LineMapper<MyModelEntity> lineMapper() throws Exception {
        DefaultLineMapper<MyModelEntity> defaultLineMapper = new DefaultLineMapper<MyModelEntity>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[] {
                "clientId", "ddId", "institName", "progName",
                "qual", "startDate", "endDate", "eType", "country", "comments"
        });
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(myModelFieldSetMapper);
        return defaultLineMapper;
    }

    @Override
    public MyModelEntity read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        //if(!batchJobState )
        {
            FlatFileItemReader<MyModelEntity> flatFileItemReader = new FlatFileItemReader<MyModelEntity>();
            flatFileItemReader.setMaxItemCount(2000);
            flatFileItemReader.setResource(new UrlResource("file:\\" + sdhPath));
            flatFileItemReader.setName("CSV-Reader");
            flatFileItemReader.setLinesToSkip(1);
            flatFileItemReader.setLineMapper(lineMapper());
            flatFileItemReader.open(new ExecutionContext());
            batchJobState = true;
            return flatFileItemReader.read();
        }
        // return null;
    }
}
The FieldSetMapper implementation:
@Component
public class MyModelFieldSetMapper implements FieldSetMapper<MyModelEntity> {

    // private SiteService siteService = BeanUtil.getBean(SiteServiceImpl.class);
    @Autowired
    private SiteService siteService;

    @Override
    public MyModelEntity mapFieldSet(FieldSet fieldSet) throws BindException {
        if (fieldSet == null) {
            return null;
        }
        MyModelEntity educationHistory = new MyModelEntity();
        // setting MyModelAttributes Values
        return myModel;
    }
}
Any contribution is welcome. Thanks.
// the reader after extending FlatFileItemReader
@Component
@StepScope
public class CustomUserItemReader extends FlatFileItemReader<User> {

    @Value("#{jobParameters['UserCSVPath']}")
    private String UserCSVPath;

    private boolean batchJobState;

    public CustomUserItemReader() throws Exception {
        super();
        setResource(new UrlResource("file:\\" + UserCSVPath));
        setLineMapper(lineMapper());
        afterPropertiesSet();
        setStrict(false);
    }

    public LineMapper<User> lineMapper() throws Exception {
        DefaultLineMapper<User> defaultLineMapper = new DefaultLineMapper<User>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[] {"name", "dept", "salary", "endDate"});
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(new CustomUserFieldSetMapper());
        // defaultLineMapper.setFieldSetMapper(fieldSetMapper);
        return defaultLineMapper;
    }
    @Override
    public User read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        //if(!batchJobState )
        {
            // flatFileItemReader.setMaxItemCount(2000);
            this.setResource(new UrlResource("file:\\" + UserCSVPath));
            this.setName("CSV-Reader");
            this.setLinesToSkip(1);
            // flatFileItemReader.setLineMapper(lineMapper());
            this.open(new ExecutionContext());
            User e = this.read();
            batchJobState = true;
            return e;
        }
        // return null;
    }

    public String getUserCSVPath() {
        return UserCSVPath;
    }

    public void setUserCSVPath(String userCSVPath) {
        UserCSVPath = userCSVPath;
    }
}
Thanks for all your suggestions. Even though I had implemented ItemReader<>, I fixed the problem by moving the instantiation of the FlatFileItemReader out of the read() method.
That was creating a new FlatFileItemReader on each loop, which read only the first line each time.
Thanks.
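For reference, a minimal sketch of the reader built once as a step-scoped bean instead of inside read() (the names and the jobParameters['SDH'] key are taken from the code above; injecting the custom MyModelFieldSetMapper is an assumption):

@Bean
@StepScope
public FlatFileItemReader<MyModelEntity> myModelCsvReader(
        @Value("#{jobParameters['SDH']}") String sdhPath,
        MyModelFieldSetMapper myModelFieldSetMapper) throws Exception {
    DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
    lineTokenizer.setDelimiter(",");
    lineTokenizer.setStrict(false);
    lineTokenizer.setNames(new String[] {"clientId", "ddId", "institName", "progName",
            "qual", "startDate", "endDate", "eType", "country", "comments"});

    DefaultLineMapper<MyModelEntity> lineMapper = new DefaultLineMapper<>();
    lineMapper.setLineTokenizer(lineTokenizer);
    lineMapper.setFieldSetMapper(myModelFieldSetMapper);

    FlatFileItemReader<MyModelEntity> reader = new FlatFileItemReader<>();
    reader.setName("CSV-Reader");
    reader.setResource(new UrlResource("file:\\" + sdhPath));
    reader.setLinesToSkip(1);
    reader.setLineMapper(lineMapper);
    // No manual open()/read() here: the step opens the reader once and keeps
    // calling read() until it returns null.
    return reader;
}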
