How to write a custom Flat File Item Reader - spring

I am new to Spring Batch. I wanted to ask how I can write a custom flat file item reader.
I know there is a generic FlatFileItemReader available in Spring Batch, but we want to add some business logic while reading. How can I write a custom flat file reader?
Thanks in advance

I have done the same thing for MultiResourceItemReader: you can extend FlatFileItemReader and copy the complete code of FlatFileItemReader. Then you can add your own methods.

If the built-in logging provided in the FlatFileItemReader is not sufficient for your needs:
Grab the code for the FlatFileItemReader - https://github.com/spring-projects/spring-batch/blob/master/spring-batch-infrastructure/src/main/java/org/springframework/batch/item/file/FlatFileItemReader.java
Rename the class to your own class name/package name
Add loggers as needed and use it
I don't recommend this - in most cases you would be better off debugging your code with a debugger.
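A lighter-weight alternative to copying the whole class is to wrap the stock reader in a delegating ItemStreamReader and put the business logic in read(). A minimal sketch, assuming a hypothetical Trade domain type with a getAmount() accessor and a made-up skip rule:

```java
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.batch.item.file.FlatFileItemReader;

// Hypothetical example: Trade and the zero-amount rule are placeholders.
public class ValidatingTradeReader implements ItemStreamReader<Trade> {

    private final FlatFileItemReader<Trade> delegate;

    public ValidatingTradeReader(FlatFileItemReader<Trade> delegate) {
        this.delegate = delegate;
    }

    @Override
    public Trade read() throws Exception {
        Trade item = delegate.read();
        // Business logic while reading: e.g. silently skip zero-amount lines.
        while (item != null && item.getAmount() == 0) {
            item = delegate.read();
        }
        return item;
    }

    // Delegate the stream callbacks so restart state keeps working.
    @Override
    public void open(ExecutionContext ctx) throws ItemStreamException {
        delegate.open(ctx);
    }

    @Override
    public void update(ExecutionContext ctx) throws ItemStreamException {
        delegate.update(ctx);
    }

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }
}
```

Register the wrapper as the step's reader as you would any ItemStreamReader; because open/update/close are forwarded to the delegate, restartability is preserved.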

Related

Custom Spring Batch ItemReader that reads from a JPA repository using a custom query

I am new to Spring Batch and I would like to seek the advice of experienced programmers on how to implement the following:
I am writing an API that triggers a batch task to generate a report from a list of transactions in a JPA repository.
In Spring Batch, there is an ItemReader, ItemProcessor, and ItemWriter for each step.
How should I implement my ItemReader so that it can do the following:
1. Generate a custom query from the parameters set in the jobParameters (obtained from the API query)
2. Use the custom query from step 1 to obtain the list of transactions from the Transactions JPA repo, which will be processed in the ItemProcessor and subsequently turned into reports by the ItemWriter.
Question: how should I write the ItemReader? I am looking at JpaCursorItemReader (not sure if it is the right one) but could not find implementation examples online to refer to. Any help is appreciated. Thank you.
I am at the stage of trying to understand how Spring Batch works, and I hope to get proper guidance from experts in this field on the direction to take to accomplish this task.
Generate a custom query from the parameters set in the jobParameters (obtained from the API query)
You can define a step-scoped bean and inject job parameters to use them in your query:
@Bean
@StepScope
public JpaCursorItemReader jpaCursorItemReader(@Value("#{jobParameters['parameter']}") String parameter) {
    return new JpaCursorItemReaderBuilder()
            .queryString("..") // use the job parameter in your query here
            // set other properties on the item reader
            .build();
}
You can find more details on step-scoped components in the documentation here: Late Binding of Job and Step Attributes.

How to replace flatFileItemReader with openCSV in spring batch

I am using a MultiResourceItemReader in Spring Batch. I found that OpenCSV automatically binds CSV columns to Java objects. But how do I replace the FlatFileItemReader/MultiResourceItemReader with OpenCSV using CsvToBeanBuilder?
The line to object mapping logic in Spring Batch is implemented in a LineMapper, which is used by the FlatFileItemReader to map read lines to domain objects.
So the question "How to replace flatFileItemReader with openCSV in spring batch" is incorrect, those are not at the same level. What you can do is create a LineMapper implementation based on OpenCSV and use it with the FlatFileItemReader.
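Such a LineMapper built on OpenCSV's CsvToBeanBuilder might look like the following sketch. One assumption to note: since each line is parsed in isolation, there is no header row, so the target bean should use positional binding (@CsvBindByPosition) rather than header-based binding.

```java
import java.io.StringReader;
import java.util.List;

import com.opencsv.bean.CsvToBeanBuilder;
import org.springframework.batch.item.file.LineMapper;

// Maps one CSV line to a bean with OpenCSV. The target type should use
// @CsvBindByPosition, because no header row is available per line.
public class OpenCsvLineMapper<T> implements LineMapper<T> {

    private final Class<? extends T> type;

    public OpenCsvLineMapper(Class<? extends T> type) {
        this.type = type;
    }

    @Override
    public T mapLine(String line, int lineNumber) throws Exception {
        // Parse the single line as a one-row CSV document.
        List<T> beans = new CsvToBeanBuilder<T>(new StringReader(line))
                .withType(type)
                .build()
                .parse();
        return beans.isEmpty() ? null : beans.get(0);
    }
}
```

You would then plug it into the reader, e.g. reader.setLineMapper(new OpenCsvLineMapper<>(Person.class)), where Person is a hypothetical domain type.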

Is there a way to create a separate log file for each user ? (spring boot)

I am working on a Spring Boot application. I want to create a separate log file for each user of the application. Is this possible?
For example: user1.log, user2.log, user3.log, ....
Thanks.
It's possible, but it will create as many log files as you have users. Imagine your user base increasing to 20K. Unless you have a very strong need, don't go for it.
Instead, go for application-level and user-level logging. To achieve this, refer here - https://stackoverflow.com/a/9652239
Although I agree with Kann's answer that the best approach is to filter after the fact, the answer to your question would be the RoutingAppender when using Log4j 2, or the SiftingAppender when using Logback. They work similarly in that they create a new appender for each unique item, which can cause a problem with file handles. Log4j 2's RoutingAppender provides a PurgePolicy to handle that, while Logback provides a timeToLive attribute. To match a log event to an appender, Logback uses a Discriminator class, while Log4j 2 uses either a pattern that should contain a Lookup (Log4j 2's variable substitution mechanism) or a script.
If you are using java.util.logging you will have to write your own mechanism.
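To illustrate the Logback side, the SiftingAppender's discriminator is keyed on an MDC entry that your application sets (e.g. MDC.put("user", username) in a servlet filter). A configuration sketch, with the key name, timeout, and file paths as placeholders:

```xml
<configuration>
  <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- Route on the "user" MDC key set by the application -->
    <discriminator>
      <key>user</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <!-- Close idle per-user appenders to avoid leaking file handles -->
    <timeout>30 minutes</timeout>
    <sift>
      <appender name="FILE-${user}" class="ch.qos.logback.core.FileAppender">
        <file>logs/${user}.log</file>
        <encoder>
          <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
        </encoder>
      </appender>
    </sift>
  </appender>
  <root level="INFO">
    <appender-ref ref="SIFT"/>
  </root>
</configuration>
```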

Returning an object from a Spring Batch job / processor

I have a Spring Batch job that reads in a very large fixed-length file and maps it just fine to an object, and I have validated all of the data in the associated processing task.
Being rather new to Spring and Spring Batch, I am wondering whether it is possible to get a fully populated object out of the job, for a particular case where I run the job as part of another process that needs access to the data.
I realize that I could do the above without Batch, and that Batch seems to be designed with scope limitations for its purpose.
I could serialize the objects in the processor and go that route, but for my immediate purposes I am hoping there is a way around this.
Thanks
In my @Configuration class for the batch processing, I created a class variable (a list of the objects I want to get back) and instantiated it with the no-arg constructor.
My Step, ItemReader, and LineMapper are set up to use a list for input. The custom FieldSetMapper takes that list as a parameter and adds to it as the file is read and mapped. Similarly, my custom ItemProcessor takes the list as input and returns it.
Finally, I created a ReturnObjectList bean that returns the populated list.
In my main method I cast the result of AnnotationConfigApplicationContext.getBean to a list of that object type. I am now able to use the list of objects generated from the fixed-length file in the scope of my main application.
Not sure if this is a healthy workaround in terms of how Spring Java config is supposed to work, but it does give me what I need.
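Sketched in code, that workaround looks roughly like this (ReportRecord and the bean names are hypothetical placeholders; note it is only safe for a single-threaded, single-JVM job run, since it relies on a shared mutable list):

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ReportBatchConfig {

    // Shared accumulator, exposed as a bean so the caller can fetch it later.
    @Bean
    public List<ReportRecord> returnObjectList() {
        return new ArrayList<>();
    }

    // The mapper both maps the FieldSet and collects the result as a side effect.
    @Bean
    public FieldSetMapper<ReportRecord> accumulatingMapper(List<ReportRecord> returnObjectList) {
        return fieldSet -> {
            ReportRecord record = new ReportRecord(fieldSet.readString("name"));
            returnObjectList.add(record);
            return record;
        };
    }
}

// In the calling code, after the job completes:
// List<ReportRecord> results =
//         (List<ReportRecord>) context.getBean("returnObjectList");
```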

Spring 3 Field Formatting

I am looking at using Spring's field formatting, in particular the existing DateFormatter. I understand that I need to specify a pattern in an annotation on my POJO.
Instead of hard-coding the pattern, I need to be able to provide it dynamically; I know this is not feasible with annotations. To properly support internationalization I would need to look up a pattern from a properties file before passing it to a Formatter.
Can anyone suggest an approach I can take?
I'm not sure, but you could try implementing InitializingBean (or using an init-method) and setting the values dynamically, as suggested on the Spring forum for cron expressions.
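The lookup-then-format idea from the question can be sketched without Spring. Here a plain Map stands in for the per-locale properties file (in a real Formatter implementation, the same lookup would live inside print() and parse()):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

public class PatternLookupDemo {

    // Stand-in for a per-locale properties lookup (e.g. a message bundle).
    private static final Map<Locale, String> PATTERNS = new HashMap<>();
    static {
        PATTERNS.put(Locale.US, "MM/dd/yyyy");
        PATTERNS.put(Locale.GERMANY, "dd.MM.yyyy");
    }

    // Resolve the pattern at runtime instead of hard-coding it in an annotation.
    public static String format(Date date, Locale locale) {
        String pattern = PATTERNS.getOrDefault(locale, "yyyy-MM-dd");
        return new SimpleDateFormat(pattern, locale).format(date);
    }

    public static void main(String[] args) {
        Date d = new GregorianCalendar(2024, Calendar.MARCH, 5).getTime();
        System.out.println(format(d, Locale.US));      // 03/05/2024
        System.out.println(format(d, Locale.GERMANY)); // 05.03.2024
    }
}
```

The same lookup could read from a MessageSource or ResourceBundle keyed by locale, which is what "look up a pattern from a properties file" would amount to in practice.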
