I am trying to read a Parquet file in a Spring Batch job and write it to JDBC. Is there any sample code for a reader bean that can be used with the Spring Batch StepBuilderFactory?
Spring for Apache Hadoop has capabilities for reading and writing Parquet files. You can read more about that project here: https://spring.io/projects/spring-hadoop
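If Spring for Apache Hadoop is too heavy a dependency for your case, another option is to wrap the parquet-avro reader in a custom Spring Batch ItemReader. Here is a minimal sketch, assuming parquet-avro and a Hadoop client are on the classpath; the class name ParquetItemReader is my own, not part of any library:

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.hadoop.ParquetReader;
import org.springframework.batch.item.ItemReader;

// Sketch: adapts parquet-avro's ParquetReader to Spring Batch's ItemReader
// contract (return one item per call, null at end of input).
public class ParquetItemReader implements ItemReader<GenericRecord> {

    private final String path;
    private ParquetReader<GenericRecord> delegate;

    public ParquetItemReader(String path) {
        this.path = path;
    }

    @Override
    public GenericRecord read() throws Exception {
        if (delegate == null) {
            delegate = AvroParquetReader
                    .<GenericRecord>builder(new Path(path))
                    .build();
        }
        GenericRecord record = delegate.read(); // null signals end of input
        if (record == null) {
            delegate.close();
        }
        return record;
    }
}
```

Such a reader can then be plugged into a chunk-oriented step built with StepBuilderFactory, with something like a JdbcBatchItemWriter on the writing side.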
I have a requirement to use Spring Batch to bulk-upload Excel data. A user would import this file from a UI, and the service is expected to take this file and import it into a database. I am new to Spring Batch, and with some analysis was able to infer that we cannot send the Excel file as a job parameter. Is saving the file locally the only way to read it? Is there any way I can read the incoming file directly using Spring Batch?
If I understand your question correctly, you want a user to invoke a Spring service endpoint with a file, and then a Spring Batch job should pick that file up as input and start processing.
Yes, this is very much doable, and you do not need to save the file to local storage yourself.
Here is what I would do:
Take the file as input to a POST endpoint using Spring's "org.springframework.web.multipart.MultipartFile" type. Let's call this object "file".
Then get the input stream from the MultipartFile object using "file.getInputStream()".
Set this input stream as the "resource" of a Spring Batch "FlatFileItemReader".
Sample code:
flatFileItemReader.setResource(new InputStreamResource(file.getInputStream()));
Once this is done and you start the Spring Batch job, the file will be processed in your job.
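Putting the steps above together, here is a minimal sketch. The endpoint path, the PassThroughLineMapper, and the commented-out launchJob(...) helper are illustrative assumptions, not prescribed names:

```java
import java.io.IOException;

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.core.io.InputStreamResource;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

// Sketch: accept the uploaded file in a POST endpoint and feed its
// InputStream to a FlatFileItemReader via InputStreamResource.
@RestController
public class UploadController {

    @PostMapping("/upload")
    public String upload(@RequestParam("file") MultipartFile file) throws IOException {
        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        reader.setLineMapper(new PassThroughLineMapper()); // map each line to a String
        reader.setResource(new InputStreamResource(file.getInputStream()));
        // launchJob(reader); // hand the reader to your job/step configuration
        return "job started";
    }
}
```

One caveat: an InputStreamResource can only be read once, so Spring Batch features that need to re-open the resource (restart, for example) will not work with this setup.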
Here is my scenario:
I need to read a CSV file and store the output in Elasticsearch. I am using Spring Batch to read the CSV file. Can anyone give me an example of how to save to Elasticsearch using Spring Batch or Spring Batch Extensions?
It's an old question and you have probably found an answer by now, but here it goes...
To work with Elasticsearch, you need Spring Data: you write items from your writer as you normally would, but through a repository instance, e.g. repository.save(list), where list is the List of items passed from the Spring Batch processor to the writer.
Here, repository is an ElasticsearchRepository from Spring Data; you need to define a repository for each of your item types.
You also need to point Spring Data at your repository definitions by adding @EnableElasticsearchRepositories to your configuration, setting it to the actual repository package of your project, and defining the persistence layer accordingly.
Hope it helps!
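A minimal sketch of such a writer follows. "Person" and "PersonRepository" are placeholder names for your own Spring Data Elasticsearch entity and repository; note that in newer Spring Data versions save(Iterable) has been renamed saveAll, and in Spring Batch 5 the ItemWriter signature takes a Chunk rather than a List:

```java
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

// Placeholder repository for a placeholder @Document-annotated Person entity.
interface PersonRepository extends ElasticsearchRepository<Person, String> {
}

// Sketch: a writer that bulk-indexes each chunk through the repository.
public class PersonItemWriter implements ItemWriter<Person> {

    private final PersonRepository repository;

    public PersonItemWriter(PersonRepository repository) {
        this.repository = repository;
    }

    @Override
    public void write(List<? extends Person> items) {
        repository.saveAll(items); // bulk-index the chunk handed over by the step
    }
}
```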
Actually, I worked on a similar project, but instead of importing data from a CSV file I imported it from a MySQL relational database, reading and filtering the data with Spring Batch and writing it into Elasticsearch. This is the link to the project on GitHub; read the readme.md file carefully and you will find all the required configuration:
the github project link
I am aware that Spring Batch metadata tables are written to relational databases such as MySQL, H2, etc. But my question is whether the Spring Batch metadata tables can be written to Elasticsearch. If so, how do I proceed?
Have you checked the Spring Batch Extensions module? It provides an ItemReader and an ItemWriter for interacting with Elasticsearch. Hope it helps.
Does anyone have work or a guide on how to load a CSV file into a Gemfire database using a Spring XD job? A reference or example would help.
Thanks
If you want to run this as a batch job, then I believe you can use the filejdbc OOTB job module.
Spring XD doesn't have an OOTB job for raw Gemfire. As Ilaya points out, if you're using Gemfire's SQL support (via Sqlfire or Gemfire XD) you can use the filejdbc job. Without that support, you'd need to write a custom job that imports the data using Spring Batch's GemfireItemWriter support.
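For the custom-job route, Spring Batch's GemfireItemWriter is configured with a GemfireTemplate and a key mapper that derives the region key from each item. A sketch, where CsvRow and its getId() accessor are placeholder names for whatever your CSV rows are parsed into:

```java
import org.springframework.batch.item.data.GemfireItemWriter;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.gemfire.GemfireTemplate;

// Sketch: a GemfireItemWriter that stores each parsed CSV row in the
// region backing the given template, keyed by a field of the item.
public class GemfireWriterConfig {

    public GemfireItemWriter<String, CsvRow> gemfireItemWriter(GemfireTemplate template) {
        GemfireItemWriter<String, CsvRow> writer = new GemfireItemWriter<>();
        writer.setTemplate(template);
        writer.setItemKeyMapper(new Converter<CsvRow, String>() {
            @Override
            public String convert(CsvRow row) {
                return row.getId(); // region key derived from the item
            }
        });
        return writer;
    }
}
```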
If you want to load CSV files into GemFire XD, you can use the built-in system procedure SYSCS_UTIL.IMPORT_DATA_EX().
I have been trying to fetch details from multiple tables in a database and produce the output as an XML file using Spring Batch. Is there any working code for this task, or any idea how to do it?
The following links should guide you.
For simple XML:
http://www.mkyong.com/spring-batch/spring-batch-example-mysql-database-to-xml/
For complex XML using Spring, refer to these threads:
Complex XML using Spring Batch; StaxEventItemWriter ; Jaxb2Marshaller
Build non trivial XML file with StaxEventItemWriter
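The approach in those threads boils down to a StaxEventItemWriter backed by a Jaxb2Marshaller. A minimal sketch, where Report is a placeholder JAXB-annotated class that your reader/processor builds from the database rows, and the output path is arbitrary:

```java
import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;

// Sketch: marshal items into an XML file, one element per item,
// wrapped in a single root tag.
public class XmlWriterConfig {

    public StaxEventItemWriter<Report> xmlWriter() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setClassesToBeBound(Report.class); // Report needs @XmlRootElement

        StaxEventItemWriter<Report> writer = new StaxEventItemWriter<>();
        writer.setRootTagName("reports");             // wrapping root element
        writer.setMarshaller(marshaller);
        writer.setResource(new FileSystemResource("output/reports.xml"));
        return writer;
    }
}
```

Fields fetched from multiple tables can be assembled into one Report per item in an ItemProcessor before they reach this writer.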