I have a requirement to use Spring Batch to bulk-upload Excel data. A user would import this file from a UI, and the service is expected to take this file and import it into the database. I am new to Spring Batch, and with some analysis was able to infer that we cannot send the Excel file as a parameter. Is saving the file locally the only way to read it? Is there any way I can read the incoming file directly using Spring Batch?
If I understand your question correctly, you want a user to invoke a Spring service endpoint with a file, and then a Spring Batch job should pick that file up as input and start processing.
Yes, this is very much doable, and you do not need to explicitly save it locally yourself.
Here is what I would do:
Take the file as input to a POST endpoint using Spring's "org.springframework.web.multipart.MultipartFile" type. Let's call this object "file".
Then get the input stream from the MultipartFile object using "file.getInputStream()".
Set this input stream as the "resource" of Spring Batch's "FlatFileItemReader".
Sample code:
flatFileItemReader.setResource(new InputStreamResource(file.getInputStream()));
Once this is done and you start the Spring Batch job, this file will be processed in your job.
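Putting those steps together, a minimal sketch (the endpoint path, job bean, reader bean, and item type are illustrative assumptions, not a definitive implementation):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.core.io.InputStreamResource;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class FileUploadController {

    private final JobLauncher jobLauncher;
    private final Job importJob;                     // assumed job bean
    private final FlatFileItemReader<String> reader; // assumed reader bean

    public FileUploadController(JobLauncher jobLauncher, Job importJob,
                                FlatFileItemReader<String> reader) {
        this.jobLauncher = jobLauncher;
        this.importJob = importJob;
        this.reader = reader;
    }

    @PostMapping("/import")
    public ResponseEntity<String> importFile(@RequestParam("file") MultipartFile file)
            throws Exception {
        // Steps 2 and 3 above: wrap the upload's input stream as the reader's resource
        reader.setResource(new InputStreamResource(file.getInputStream()));
        // Unique parameters so each upload creates a new job instance
        jobLauncher.run(importJob, new JobParametersBuilder()
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters());
        return ResponseEntity.ok("Import finished");
    }
}

Note that the default JobLauncher runs the job synchronously inside the request, so the upload's input stream stays open while the job reads it; with an asynchronous launcher the stream may be gone before the reader opens it.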
I have a working Spring Boot application which embeds a Spring Batch job. The job is not run on a schedule; instead we kick it off with an endpoint. It is working as it should. The basics of the batch are:
Kick the endpoint to start the job
Reader reads from input file
Processor reads from Oracle database using JPA repository and simple Spring datasource config
Writer writes to output file
However, there are new requirements:
The schema of the repository database is from here on unknown at application startup. The tables are the same; it is just an unknown schema. This fact is out of our control, and you might think it is stupid, but there are reasons for it and this can't be changed. This means that with the current functionality we need to reconfigure the datasource when we know the new schema name, and restart the application. This is a job that we will run a number of times when migrating from one system to another, so it has a limited lifecycle and we just need a "quick fix" to be able to use it without rewriting the whole app. So what I would like to do is:
Send the schema name as a query param to the application, put it in the job parameters, and then get a new datasource when the processor reads from the repository. Would this be doable at all using Spring Batch? Any help appreciated!
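For illustration only, a minimal sketch of that idea, assuming a Spring Boot setup with Hikari as the connection pool; the URL, credentials, and parameter name are placeholders, and pointing your JPA EntityManagerFactory at this step-scoped DataSource (while keeping a fixed primary DataSource for the Spring Batch metadata tables) is the part that needs extra wiring:

import javax.sql.DataSource;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SchemaScopedDataSourceConfig {

    // Re-created for each step execution, so every job run can target the
    // schema passed in as a job parameter (e.g. ?schema=MIGRATION_42).
    @Bean
    @StepScope
    public DataSource schemaDataSource(@Value("#{jobParameters['schema']}") String schema) {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:oracle:thin:@//db-host:1521/service"); // placeholder
        config.setUsername("user");     // placeholder
        config.setPassword("password"); // placeholder
        // Oracle-specific: point every pooled connection at the requested schema.
        // Validate the schema name first, since it is interpolated into SQL.
        config.setConnectionInitSql("ALTER SESSION SET CURRENT_SCHEMA = " + schema);
        return new HikariDataSource(config);
    }
}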
We have Spring 4 and Spring Batch 3, and our app consumes CSV files as input. Currently we kick off the jobs manually from the command line, using CommandLineJobRunner with params, including the name of the file to process.
I want to kick off a job to process the file asynchronously as soon as the input file arrives in a monitored directory. How can we do that?
You may use java.nio.file.WatchService to monitor the directory for a file.
Once the file appears, you may start the actual processing (or kick off a job to process it asynchronously); see the sketch at the end of this answer.
You may also use FileReadingMessageSource.WatchServiceDirectoryScanner from Spring Integration (https://docs.spring.io/spring-integration/reference/html/files.html#watch-service-directory-scanner)
Comparing the release notes of Spring Batch (https://github.com/spring-projects/spring-batch/releases) to those of Spring Integration (https://github.com/spring-projects/spring-integration/releases), it looks like Spring Integration is released more often. It also has more features and integration points.
In this case, though, it looks like overkill to bring in Spring Integration if you just need to watch a directory for a file.
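For illustration, a minimal WatchService sketch along those lines; the directory, job bean, and parameter names are assumptions, and the JobLauncher would need an async TaskExecutor for truly asynchronous processing:

import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE;

import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class InputDirectoryWatcher implements Runnable {

    private final JobLauncher jobLauncher; // configure with an async TaskExecutor
    private final Job csvJob;
    private final Path inputDir;

    public InputDirectoryWatcher(JobLauncher jobLauncher, Job csvJob, Path inputDir) {
        this.jobLauncher = jobLauncher;
        this.csvJob = csvJob;
        this.inputDir = inputDir;
    }

    @Override
    public void run() {
        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            inputDir.register(watcher, ENTRY_CREATE);
            while (!Thread.currentThread().isInterrupted()) {
                WatchKey key = watcher.take(); // blocks until a file event arrives
                for (WatchEvent<?> event : key.pollEvents()) {
                    Path file = inputDir.resolve((Path) event.context());
                    // Pass the file name as a job parameter, as CommandLineJobRunner did
                    jobLauncher.run(csvJob, new JobParametersBuilder()
                            .addString("input.file", file.toString())
                            .addLong("run.ts", System.currentTimeMillis())
                            .toJobParameters());
                }
                key.reset();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } catch (Exception e) {
            throw new IllegalStateException("Directory watcher failed", e);
        }
    }
}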
I would recommend using the powerful combination of Spring Batch with Spring Integration. For example, you can use a FileInboundChannelAdapter from Spring Integration to monitor a directory and start a Spring Batch Job as soon as the input file arrives.
There is a code example for this typical use case in the reference documentation of Spring Batch here: https://docs.spring.io/spring-batch/4.0.x/reference/html/spring-batch-integration.html#launching-batch-jobs-through-messages
I hope this helps.
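As a sketch of what that documentation example looks like with the Java DSL (the directory, file pattern, and job parameter name here are placeholders):

import java.io.File;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@Configuration
public class FilePollingJobConfig {

    @Bean
    public JobLaunchingGateway jobLaunchingGateway(JobLauncher jobLauncher) {
        return new JobLaunchingGateway(jobLauncher);
    }

    @Bean
    public IntegrationFlow fileToJobFlow(JobLaunchingGateway gateway, Job job) {
        // Poll the monitored directory and launch the job for each new CSV file
        return IntegrationFlows
                .from(Files.inboundAdapter(new File("/data/in")).patternFilter("*.csv"),
                        e -> e.poller(Pollers.fixedDelay(1000).maxMessagesPerPoll(1)))
                .transform(File.class, file -> new JobLaunchRequest(job,
                        new JobParametersBuilder()
                                .addString("input.file.name", file.getAbsolutePath())
                                .toJobParameters()))
                .handle(gateway)
                .get();
    }
}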
I am using Spring Batch partitioning, and I would like to send a file list to each partition for processing.
I need to know how to invoke SftpOutboundGateway and get the list of remote file names from the Spring Batch Partitioner.
If that's all you need to do, consider just using FtpRemoteFileTemplate.list() or even Session.listNames().
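For illustration, a minimal Partitioner sketch along those lines; the remote directory, partition key, and context key names are assumptions:

import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.integration.file.remote.session.Session;
import org.springframework.integration.file.remote.session.SessionFactory;

public class RemoteFilePartitioner implements Partitioner {

    private final SessionFactory<?> sessionFactory;
    private final String remoteDir;

    public RemoteFilePartitioner(SessionFactory<?> sessionFactory, String remoteDir) {
        this.sessionFactory = sessionFactory;
        this.remoteDir = remoteDir;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        try {
            Session<?> session = sessionFactory.getSession();
            try {
                // List remote file names, as suggested above, one per partition
                String[] fileNames = session.listNames(remoteDir);
                for (int i = 0; i < fileNames.length; i++) {
                    ExecutionContext context = new ExecutionContext();
                    context.putString("fileName", fileNames[i]);
                    partitions.put("partition" + i, context);
                }
            } finally {
                session.close();
            }
        } catch (Exception e) {
            throw new IllegalStateException("Could not list remote files", e);
        }
        return partitions;
    }
}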
How can I write to multiple files with Spring Batch using a writer, where the set of files depends on the data returned from the database? Is there a good way to implement this in Spring Batch?
Use a ClassifierCompositeItemWriter
Calls one of a collection of ItemWriters for each item, based on a router pattern implemented through the provided Classifier.
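A minimal sketch, assuming a hypothetical Customer type with a region field that decides the target file:

import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ClassifierCompositeItemWriter;
import org.springframework.classify.Classifier;

public class RoutingWriterConfig {

    // Illustrative record type; substitute your own domain class
    public static class Customer {
        private String region;
        public String getRegion() { return region; }
        public void setRegion(String region) { this.region = region; }
    }

    // Routes each item to one of the delegate writers based on its region
    public static ClassifierCompositeItemWriter<Customer> classifierWriter(
            ItemWriter<Customer> domesticWriter, ItemWriter<Customer> foreignWriter) {
        ClassifierCompositeItemWriter<Customer> writer = new ClassifierCompositeItemWriter<>();
        writer.setClassifier((Classifier<Customer, ItemWriter<? super Customer>>) customer ->
                "DOMESTIC".equals(customer.getRegion()) ? domesticWriter : foreignWriter);
        return writer;
    }
}

One caveat: the composite does not open or close its delegates, so if they are FlatFileItemWriters, register each one as a stream on the step so their lifecycle is managed.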
I'm new to Spring Integration. I need to fetch a file via SFTP and immediately start processing its content. The SFTP Inbound Channel Adapter partially satisfies me, but it saves (as the documentation says) the fetched file in a local directory. I have no possibility of saving it on the local machine; I just want to start processing the content of that file, so it would be good for me to retrieve the remote file as a byte array or as an InputStream. How can I achieve this with Spring Integration?
Also, I want to configure my system to fetch that file periodically. I know that I can configure a Spring bean with a @Scheduled annotation on some method and start processing from that method, but maybe Spring Integration has a more elegant solution for such a case?
Spring Integration 3.0.1 has a new RemoteFileTemplate that you can use to receive a file programmatically as a stream. See the Javadocs.
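For example, a minimal sketch assuming an SFTP SessionFactory is already configured; the remote path is a placeholder:

import java.io.ByteArrayOutputStream;

import com.jcraft.jsch.ChannelSftp.LsEntry;

import org.springframework.integration.file.remote.RemoteFileTemplate;
import org.springframework.integration.file.remote.session.SessionFactory;

public class RemoteFileFetcher {

    private final RemoteFileTemplate<LsEntry> template;

    public RemoteFileFetcher(SessionFactory<LsEntry> sessionFactory) {
        this.template = new RemoteFileTemplate<>(sessionFactory);
    }

    // Streams the remote file's bytes into memory; nothing touches the local disk
    public byte[] fetch(String remotePath) {
        return this.template.execute(session -> {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            session.read(remotePath, out);
            return out.toByteArray();
        });
    }
}

For the periodic part, you can simply call such a method from a @Scheduled method or wire it behind a polled inbound flow; the streaming itself needs no local directory.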