Read a file that contains file names to be ingested (spring-batch, spring-boot)

I have the following case that I am trying to solve with Spring Boot and Spring Batch. I have to read a flat file (a sidecar file) in which every line is the name of a file to be ingested into a database.
I have configured a job to read the sidecar file, but I am having trouble deciding what the accepted method in Spring Batch is for processing the contained files. I have also configured steps that can read each individual file and insert its records into the database.
Any ideas how to configure the sidecar job with the steps I have written for the individual files?
I can provide actual configuration from my implementation if needed.

FactoryBean is your friend. In this case, you can create a FactoryBean that reads the file of file names and creates a list of Resource objects from it. That list can be injected into the MultiResourceItemReader, which will iterate over it.
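For illustration, a minimal sketch of such a FactoryBean, assuming the sidecar file lists one path per line; the class name and the sidecarFile property are illustrative, not from the original configuration:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.springframework.beans.factory.FactoryBean;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;

public class SidecarResourcesFactoryBean implements FactoryBean<Resource[]> {

    private Resource sidecarFile;   // the flat file whose lines are the files to ingest

    public void setSidecarFile(Resource sidecarFile) {
        this.sidecarFile = sidecarFile;
    }

    @Override
    public Resource[] getObject() throws Exception {
        // read the sidecar file and turn every non-empty line into a Resource
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(sidecarFile.getInputStream()))) {
            return reader.lines()
                    .filter(line -> !line.trim().isEmpty())
                    .map(FileSystemResource::new)   // each line is the path of a data file
                    .toArray(Resource[]::new);
        }
    }

    @Override
    public Class<?> getObjectType() {
        return Resource[].class;
    }

    @Override
    public boolean isSingleton() {
        return true;
    }
}

The Resource[] this bean produces can then be passed to MultiResourceItemReader.setResources(...), with the reader you already wrote for the individual files set as the delegate via setDelegate(...).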

Related

How to make properties file in Spring boot Primary and secondary

Hi, I am using Spring 4.3.7, Java 8, and Spring Boot. My requirement is that I have two properties files, one inside the classpath and another outside. I was able to load both using
@PropertySources({ @PropertySource("classpath:common.properties"), @PropertySource("classpath:anotherFile.properties") })
@PropertySource(value = {"file:${external.config.location}/config_one.properties"}, ignoreResourceNotFound = true)
The input values of both files will be almost the same, e.g. the file naming convention or the file creation location (besides DB details and a few other token details).
What has to be done is: if the external property file exists, read the property values from it, otherwise read from the one inside the classpath. Is this possible via any annotation in Spring Boot?
It works out of the box. The difference is that it does not "alternatively" read a property from one source or the other; rather, it reads all properties from the first source, then reads all properties from the next one (overriding any duplicates), then moves to the third source, and so on.
There are 17 "default" sources in total, and each has its own precedence over the others. See more in the docs:
https://docs.spring.io/spring-boot/docs/current/reference/html/spring-boot-features.html#boot-features-external-config
Please bear in mind that those sources are read "from bottom to top", so e.g. a key from the internal application.properties (#15) will be overridden by a key from the external application.properties (#14), and so on.
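For illustration, a minimal sketch of the annotation-based setup, assuming Spring Boot's auto-configured placeholder support; the classpath file name and the property key are illustrative. With @PropertySources, a key found in a later source overrides the same key from an earlier one, so declaring the external file last makes it win whenever it is present, while ignoreResourceNotFound falls back to the classpath copy when it is not:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.annotation.PropertySources;

@Configuration
@PropertySources({
        // classpath defaults are read first...
        @PropertySource("classpath:config_one.properties"),
        // ...and the external file, read last, overrides any duplicate keys;
        // ignoreResourceNotFound keeps startup working when it is absent
        @PropertySource(value = "file:${external.config.location}/config_one.properties",
                        ignoreResourceNotFound = true)
})
public class ExternalOverrideConfig {

    @Value("${file.naming.convention}")   // illustrative key, not from the question
    private String fileNamingConvention;
}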

Is there a way to create a separate log file for each user? (spring boot)

I am working on a Spring Boot application and I want to create a separate log file for each user of the application. Is this possible?
For example: user1.log, user2.log, user3.log, ....
Thanks.
It's possible, but it will create as many log files as there are users. Imagine if your user base increases to 20K. Unless you have a very strong need, don't go for it.
Instead, go for application-level and user-level logging. To achieve this, refer here: https://stackoverflow.com/a/9652239
Although I agree with Kann's answer that the best approach is to filter after the fact, the answer to your question would be to use the RoutingAppender when using Log4j 2, or the SiftingAppender when using Logback. They work similarly in that they create a new appender for each unique item, which can cause a problem with file handles: Log4j 2's RoutingAppender provides a PurgePolicy to handle that, while Logback provides a timeToLive attribute. Logback uses a Discriminator class to determine how to match the log event to an Appender, while Log4j 2 uses either a pattern that should contain a Lookup (Log4j 2's variable substitution mechanism) or a script to perform the matching.
If you are using java.util.logging you will have to write your own mechanism.
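Whichever backend you pick, both appenders need a per-user value to discriminate on, and the usual way to supply it is the MDC (ThreadContext in Log4j 2). A minimal sketch, assuming SLF4J and an illustrative "userId" key that your SiftingAppender discriminator or RoutingAppender ${ctx:userId} pattern would reference:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class PerUserLogging {

    private static final Logger log = LoggerFactory.getLogger(PerUserLogging.class);

    public void handleRequest(String userId) {
        MDC.put("userId", userId);            // key the appender discriminates on
        try {
            log.info("processing request");   // routed to this user's log file
        } finally {
            MDC.remove("userId");             // always clear the thread-bound value
        }
    }
}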

Returning an object from a Spring Batch job / processor

I have a Spring Batch job that reads in a very large fixed-length file and maps it just fine to an object, and I have validated all of the data in the associated processing task.
Being rather new to Spring and Spring Batch, I am wondering whether it is possible to get a fully populated object out of the job, for a particular case where I am running the job as part of another process that needs access to the data.
I realize that I could do the above without Batch, and that Batch seems to be designed with scope limitations for its purpose.
I could serialize the objects in the processor and go that route, but for my immediate satisfaction I am hoping there is a way to get around this.
Thanks
In my @Configuration class for the batch processing, I created a class variable (a list of the objects I want to get back) and instantiated it with the no-arg constructor.
My Step, ItemReader, and LineMapper are set up to use a list for input. The custom FieldSetMapper takes that list as a constructor parameter and adds to it as the file is read and mapped. Similarly, my custom ItemProcessor takes the list as input and returns it.
Finally, I created a ReturnObjectList bean that returns the populated list.
In my main I cast the result of AnnotationConfigApplicationContext.getBean() to a list of that object type. I am now able to use the list of objects generated from the fixed-length file in the scope of my main application.
I am not sure whether this is a healthy workaround in terms of how Spring Java config is supposed to work, but it does give me what I need.
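For illustration, a minimal sketch of this workaround; MyRecord, the bean names, and the omitted reader/step/job wiring are illustrative assumptions, while the shared list, the FieldSetMapper that fills it, and the getBean() cast mirror the description above:

import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class BatchConfig {

    // shared list that is filled while the file is read and mapped
    private final List<MyRecord> records = new ArrayList<>();

    @Bean
    public FieldSetMapper<MyRecord> recordFieldSetMapper() {
        return fieldSet -> {
            MyRecord record = new MyRecord(fieldSet.readString(0));
            records.add(record);          // collect every mapped record
            return record;
        };
    }

    @Bean
    public List<MyRecord> returnObjectList() {
        return records;                   // exposed so the caller can fetch the result
    }
}

class MyRecord {
    private final String value;
    MyRecord(String value) { this.value = value; }
}

class MainApp {
    public static void main(String[] args) {
        try (AnnotationConfigApplicationContext ctx =
                new AnnotationConfigApplicationContext(BatchConfig.class)) {
            // ... run the job with a JobLauncher here ...
            @SuppressWarnings("unchecked")
            List<MyRecord> loaded = (List<MyRecord>) ctx.getBean("returnObjectList");
            System.out.println("records read: " + loaded.size());
        }
    }
}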

Read properties file in original order from Spring Application Context

I need to read a properties file in its original order from the Spring application context.
example:
A.properties
1:abc
2:xyz
3:qwe
I need to get the entries of the above file back in the same order.

How to write a custom Flat File Item Reader

I am new to Spring Batch. I wanted to ask how I can write a custom flat file item reader in Spring Batch.
I know there is a generic FlatFileItemReader available in Spring Batch, but we want to add some business logic while reading; how can we write a custom flat file reader?
Thanks in advance
I have done the same thing for MultiResourceItemReader: you can extend FlatFileItemReader and copy its complete code, then add your own methods.
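Rather than copying the whole class, a lighter option (just a sketch, not the only way) is to wrap the standard reader and run your business logic on each item as it is read; BusinessRuleFlatFileItemReader and applyBusinessRules() are illustrative names:

import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.batch.item.file.FlatFileItemReader;

public class BusinessRuleFlatFileItemReader<T> implements ItemStreamReader<T> {

    private final FlatFileItemReader<T> delegate;   // does the actual file parsing

    public BusinessRuleFlatFileItemReader(FlatFileItemReader<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public T read() throws Exception {
        T item = delegate.read();
        if (item != null) {
            applyBusinessRules(item);    // custom logic applied while reading
        }
        return item;
    }

    private void applyBusinessRules(T item) {
        // placeholder for the business logic mentioned in the question
    }

    // forward the ItemStream callbacks so restartability keeps working
    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        delegate.open(executionContext);
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
        delegate.update(executionContext);
    }

    @Override
    public void close() throws ItemStreamException {
        delegate.close();
    }
}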
If the built-in logging provided in the FlatFileItemReader is not sufficient for your needs:
Grab the code for the FlatFileItemReader - https://github.com/spring-projects/spring-batch/blob/master/spring-batch-infrastructure/src/main/java/org/springframework/batch/item/file/FlatFileItemReader.java
Rename the class to your own class name/package name.
Add loggers as needed and use it.
I don't recommend this - in most cases you would be better off debugging your code with a debugger.
