How to import CSV data to PostgreSQL Database using Spring Batch Job
Spring Batch is a powerful module for conveniently implementing batch processing over large volumes of data.
You can find a lot of examples on Google, or you can look at this post: https://stackoverflow.com/a/47471465/4253361
To me, the best reference that explains in a simple way how Spring Batch works, with examples, is tutorialspoint/spring-batch. Try that.
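For a concrete picture, here is a minimal sketch of such a job in the Spring Batch 4 builder style, closely following the official Spring guide. It assumes a person.csv on the classpath with firstName,lastName columns, a matching people table in PostgreSQL, and a plain Person bean with those two properties (all names are placeholders, and the Person bean is not shown):

```java
import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
@EnableBatchProcessing
public class CsvToPostgresJobConfig {

    // Streams person.csv line by line and maps each row to a Person bean.
    @Bean
    public FlatFileItemReader<Person> reader() {
        return new FlatFileItemReaderBuilder<Person>()
                .name("personReader")
                .resource(new ClassPathResource("person.csv"))
                .delimited()
                .names("firstName", "lastName")
                .targetType(Person.class)
                .build();
    }

    // Inserts each chunk into PostgreSQL as a batched statement.
    @Bean
    public JdbcBatchItemWriter<Person> writer(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Person>()
                .dataSource(dataSource)
                .sql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)")
                .beanMapped()
                .build();
    }

    @Bean
    public Job importJob(JobBuilderFactory jobs, StepBuilderFactory steps,
                         FlatFileItemReader<Person> reader,
                         JdbcBatchItemWriter<Person> writer) {
        Step step = steps.get("csvToPostgresStep")
                .<Person, Person>chunk(1000) // commit every 1000 rows
                .reader(reader)
                .writer(writer)
                .build();
        return jobs.get("csvToPostgresJob").start(step).build();
    }
}
```

Each chunk of 1000 rows is written and committed in one transaction, which is what makes this approach practical for very large files.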
Related
As the title says, I am trying to read a CSV file that maps thousands of IP addresses to their respective countries. I want to import the CSV file into MongoDB using WebFlux. I haven't been able to find any resources on how to do this. I have come across Spring Batch, but I don't believe it supports WebFlux.
One way I thought of achieving this is to just read the CSV file, parse it, create DTOs from the values, and then save them to the database; however, I worry about performance.
Spring WebFlux is an alternative to the Spring MVC module; it is not meant for batch data processing. So if you want to solve your problem, do it the way you already described:
"One way I thought of achieving this is to just read the CSV file, parse it, create DTOs from the values, and then save them to the database."
And a "reactive way" won't be faster than batch processing just because it's "reactive".
Here is my scenario...
I need to read a CSV file and store the output in Elasticsearch. I am using Spring Batch to read the CSV file. Can anyone give me an example of how to save to Elasticsearch using Spring Batch or Spring Batch Extensions?
It's an old question and you have probably found an answer by now, but here it goes...
To work with Elasticsearch, you need Spring Data, and you simply write items from your writer as you normally do, but through a repository instance, e.g. repository.save(list) (saveAll(list) in newer Spring Data versions), where list is the List of items passed from the Spring Batch processor to the writer.
Here repository is an ElasticsearchRepository from Spring Data. You need to define a repository for each of your item types.
You hook your ElasticsearchRepository definitions up to your Elasticsearch instance via @EnableElasticsearchRepositories, defining the persistence layer as done here. Edit @EnableElasticsearchRepositories to point at the actual repository package of your project.
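As a rough sketch of such a writer (the Article document and ArticleRepository below are hypothetical, and the write signature shown is the pre-5.0, List-based one):

```java
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

// Hypothetical document type indexed into Elasticsearch.
@Document(indexName = "articles")
class Article {
    @Id
    private String id;
    private String title;
    // getters and setters omitted for brevity
}

// Spring Data generates the implementation at runtime.
interface ArticleRepository extends ElasticsearchRepository<Article, String> {}

public class ArticleItemWriter implements ItemWriter<Article> {

    private final ArticleRepository repository;

    public ArticleItemWriter(ArticleRepository repository) {
        this.repository = repository;
    }

    @Override
    public void write(List<? extends Article> items) throws Exception {
        // saveAll(...) bulk-indexes the chunk handed over by the step.
        repository.saveAll(items);
    }
}
```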
Hope it helps!
Actually, I worked on a similar project, but instead of importing the data from a CSV file I imported it from a relational database (MySQL), reading and filtering the data with Spring Batch and writing it into Elasticsearch. This is the link to the project on GitHub; read the readme.md file carefully and you will find all the required configuration:
the github project link
I am aware that Spring Batch metadata tables are written to relational database tables, say MySQL, H2, etc. But my question is whether the Spring Batch metadata tables can be written to Elasticsearch. If so, how do I proceed?
Have you checked the Spring Batch Extensions module? It provides an ItemReader and an ItemWriter for interacting with Elasticsearch. Hope it helps.
Does anyone have a working example or guide on how to load a CSV file into a Gemfire database using a Spring XD job? A reference or example would help.
Thanks
If you want to run this as a batch job, then I believe you can use the filejdbc out-of-the-box (OOTB) job module.
Spring XD doesn't have an OOTB job for raw Gemfire. As Ilaya points out, if you're using Gemfire's SQL support (via SQLFire or GemFire XD), you can use the filejdbc job. Without that support, you'd need to write a custom job that imports the data using Spring Batch's GemfireItemWriter support.
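For the custom-job route, the writer side boils down to a GemfireItemWriter that puts entries into a region through a GemfireTemplate. A minimal sketch, assuming a parsed CsvRecord type with a getId() key (both placeholders) and using the builder introduced in Spring Batch 4:

```java
import org.springframework.batch.item.data.GemfireItemWriter;
import org.springframework.batch.item.data.builder.GemfireItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.gemfire.GemfireTemplate;

@Configuration
public class GemfireWriterConfig {

    // Puts each parsed CSV record into the region backing the template,
    // keyed by the record's id field.
    @Bean
    public GemfireItemWriter<String, CsvRecord> gemfireWriter(GemfireTemplate template) {
        return new GemfireItemWriterBuilder<String, CsvRecord>()
                .template(template)
                .itemKeyMapper(CsvRecord::getId)
                .build();
    }
}
```

Pair it with a FlatFileItemReader that parses the CSV into CsvRecord items, and the chunked step does the rest.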
If you want to load CSV files into GemFire XD, you can use the built-in system procedure SYSCS_UTIL.IMPORT_DATA_EX().
We are trying to develop a framework on top of Spring Batch. Basically, it has to read configuration data from a database: field names, field order, file location, etc.
Are there any existing frameworks that achieve this? Otherwise, please shed some light on how to approach it...
Thanks,
MK
I don't think any such extension is available on top of the framework. You would have to write your own customizations to achieve a database-driven configuration for Spring Batch.
What do you want Spring Batch to read in? Have you looked at ItemReaders? http://docs.spring.io/spring-batch/trunk/reference/html/readersAndWriters.html
In your ItemReader constructor you can read in whatever configuration you require.
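For instance, a small factory could pull the file location and column layout out of configuration tables and assemble the reader at runtime. A sketch, assuming hypothetical FILE_CONFIG and FIELD_CONFIG tables:

```java
import java.util.List;

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.mapping.PassThroughFieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.JdbcTemplate;

public class DatabaseDrivenReaderFactory {

    private final JdbcTemplate jdbcTemplate;

    public DatabaseDrivenReaderFactory(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Looks up the file location and the ordered field names for a job,
    // then assembles a FlatFileItemReader from them at runtime.
    public FlatFileItemReader<FieldSet> createReader(String jobName) {
        String location = jdbcTemplate.queryForObject(
                "SELECT file_location FROM FILE_CONFIG WHERE job_name = ?",
                String.class, jobName);
        List<String> fields = jdbcTemplate.queryForList(
                "SELECT field_name FROM FIELD_CONFIG WHERE job_name = ? ORDER BY field_order",
                String.class, jobName);

        return new FlatFileItemReaderBuilder<FieldSet>()
                .name(jobName + "Reader")
                .resource(new FileSystemResource(location))
                .delimited()
                .names(fields.toArray(new String[0]))
                .fieldSetMapper(new PassThroughFieldSetMapper())
                .build();
    }
}
```

Returning raw FieldSet items keeps the reader generic; a downstream processor can map them to whatever type each configured job needs.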
And see:
Reading Records From a Database in Spring Batch