Is there a way for Spring Batch to read the field set configuration from a database? - spring

We are trying to develop a framework on top of Spring Batch; basically it has to read configuration from a database, such as the fields, field order, file location, etc.
Are there any existing frameworks to achieve this? Otherwise, please shed some light on this...
Thanks,
MK

I don't think there is any such extension available over the framework. You might have to write your own customizations to achieve a database-driven configuration for Spring Batch.

What do you want Spring Batch to read in? Have you looked at ItemReaders? http://docs.spring.io/spring-batch/trunk/reference/html/readersAndWriters.html
In your ItemReader constructor you can read in whatever configuration you require.
And see:
Reading Records From a Database in Spring Batch
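Building on that idea, here is a minimal sketch of a reader whose field names, field order and file location are loaded from the database when the reader is constructed. The configuration tables (batch_field_config, batch_file_config) and the MyRecord domain class are made-up assumptions for illustration, not anything Spring Batch provides:

    import java.util.List;

    import javax.sql.DataSource;

    import org.springframework.batch.item.file.FlatFileItemReader;
    import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
    import org.springframework.batch.item.file.mapping.DefaultLineMapper;
    import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
    import org.springframework.core.io.FileSystemResource;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class DatabaseConfiguredReaderFactory {

        private final JdbcTemplate jdbcTemplate;

        public DatabaseConfiguredReaderFactory(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        // Builds a reader whose field names, order and file location all come from config tables
        public FlatFileItemReader<MyRecord> createReader(String jobName) {
            // Hypothetical table: batch_field_config(job_name, field_name, field_order)
            List<String> fieldNames = jdbcTemplate.queryForList(
                    "SELECT field_name FROM batch_field_config WHERE job_name = ? ORDER BY field_order",
                    String.class, jobName);

            // Hypothetical table: batch_file_config(job_name, file_location)
            String fileLocation = jdbcTemplate.queryForObject(
                    "SELECT file_location FROM batch_file_config WHERE job_name = ?",
                    String.class, jobName);

            DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
            tokenizer.setNames(fieldNames.toArray(new String[0]));

            BeanWrapperFieldSetMapper<MyRecord> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
            fieldSetMapper.setTargetType(MyRecord.class);

            DefaultLineMapper<MyRecord> lineMapper = new DefaultLineMapper<>();
            lineMapper.setLineTokenizer(tokenizer);
            lineMapper.setFieldSetMapper(fieldSetMapper);

            FlatFileItemReader<MyRecord> reader = new FlatFileItemReader<>();
            reader.setResource(new FileSystemResource(fileLocation));
            reader.setLineMapper(lineMapper);
            return reader;
        }
    }

The same pattern works for any other reader property you want to drive from a table: query it up front, then apply it to the reader before the step runs.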

Related

spring batch problem importing a csv file to database

How to import CSV data to PostgreSQL Database using Spring Batch Job
Spring Batch is a powerful module for implementing batch processing of large amounts of data conveniently.
You can find a lot of examples on Google, or you can look at this post: https://stackoverflow.com/a/47471465/4253361
To me, the best reference that explained in a simple way how Spring Batch works, with examples, is tutorialspoint/spring-batch. Try that.
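For reference, here is a minimal sketch of such a job's reader and writer; the Person bean, the people.csv file with id,name,email columns, and the person table in PostgreSQL are illustrative assumptions:

    import javax.sql.DataSource;

    import org.springframework.batch.item.database.JdbcBatchItemWriter;
    import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
    import org.springframework.batch.item.file.FlatFileItemReader;
    import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
    import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
    import org.springframework.core.io.ClassPathResource;

    public class CsvToPostgresBatchConfig {

        // Reader: maps each line of the assumed people.csv (id,name,email) onto a Person bean
        public FlatFileItemReader<Person> reader() {
            BeanWrapperFieldSetMapper<Person> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
            fieldSetMapper.setTargetType(Person.class);

            return new FlatFileItemReaderBuilder<Person>()
                    .name("personItemReader")
                    .resource(new ClassPathResource("people.csv"))
                    .delimited()
                    .names(new String[] {"id", "name", "email"})
                    .fieldSetMapper(fieldSetMapper)
                    .build();
        }

        // Writer: batches INSERTs into an assumed person table using bean-property named parameters
        public JdbcBatchItemWriter<Person> writer(DataSource dataSource) {
            return new JdbcBatchItemWriterBuilder<Person>()
                    .dataSource(dataSource)
                    .sql("INSERT INTO person (id, name, email) VALUES (:id, :name, :email)")
                    .beanMapped()
                    .build();
        }
    }

The JdbcBatchItemWriter issues its INSERTs in batches, one per chunk, which is usually what makes this noticeably faster than row-by-row inserts.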

How to import a CSV file into MongoDB using a reactive way?

As the title says, I am trying to read a CSV file that maps thousands of IP addresses to their respective countries. I want to import the CSV file into MongoDB using WebFlux. I haven't been able to find any resources on how to do this. I have come across Spring Batch, but I don't believe it supports WebFlux.
One way I thought of achieving this is to just read the CSV file, parse it, create DTOs with the values and then save them into the database; however, I worry about performance.
Spring WebFlux is the alternative to the Spring MVC module; it is not meant for data processing. So if you want to solve your problem, go with the approach you already described:
One way I thought of achieving this is to just read the CSV file, parse it, create DTOs with the values and then save them into the database.
And a "reactive way" won't be faster than batch processing just because it's "reactive".

Spring Batch and ElasticSearch

Here is my scenario:
I need to read a CSV file and store the output in Elasticsearch. I am using Spring Batch to read the CSV file. Can anyone give me an example of how to save to Elasticsearch using Spring Batch or the Spring Batch Extensions?
It's an old question and you have probably found an answer by now, but here it goes...
To work with Elasticsearch you need Spring Data, and you simply write items from your writer as you normally would, but through a repository instance, e.g. repository.save(list), where list is the List of items passed from the Spring Batch processor to the writer.
Here repository is basically an ElasticsearchRepository from Spring Data; you need to define repositories for your item types.
You then point your ElasticsearchRepository definitions at your Elasticsearch instance by editing @EnableElasticsearchRepositories and defining the persistence layer as done here. Edit @EnableElasticsearchRepositories to point at the actual repository package of your project.
Hope it helps!!
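To make that concrete, here is a minimal sketch of such a writer. The Product document, its index name and the repository are assumptions, and it uses the Spring Batch 4 style write(List) signature (Spring Batch 5 passes a Chunk instead):

    import java.util.List;

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.elasticsearch.annotations.Document;
    import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

    // Assumed document type that the CSV lines are mapped to by the reader/processor
    @Document(indexName = "products")
    class Product {
        @Id
        private String id;
        private String name;
    }

    // Assumed Spring Data repository for that document
    interface ProductRepository extends ElasticsearchRepository<Product, String> {
    }

    // Custom ItemWriter that hands each chunk coming out of the processor to Spring Data Elasticsearch
    class ElasticsearchItemWriter implements ItemWriter<Product> {

        private final ProductRepository repository;

        ElasticsearchItemWriter(ProductRepository repository) {
            this.repository = repository;
        }

        @Override
        public void write(List<? extends Product> items) {
            // saveAll indexes the whole chunk through the repository
            repository.saveAll(items);
        }
    }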
Actually, I worked on a similar project, but instead of importing data from a CSV file I imported it from a relational MySQL database, reading and filtering the data with Spring Batch and writing it into Elasticsearch. This is the link to the project on GitHub; read the readme.md file carefully and you will find all the required configuration:
the github project link

How to load a CSV file into GemFire using a Spring XD Job

Does anyone have a guide or example of how to load a CSV file into a GemFire database using a Spring XD job? Any reference or example would help.
Thanks
If you want to run this as a batch job, then I believe you can use the filejdbc OOTB job module.
Spring XD doesn't have an OOTB job for raw GemFire. As Ilaya points out, if you're using GemFire's SQL support (via SQLFire or GemFire XD) you can use the filejdbc job. Without that support, you'd need to write a custom job that imports the data using Spring Batch's GemfireItemWriter support.
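For the custom-job route, here is a minimal sketch of configuring that writer, assuming an already-configured GemfireTemplate wrapping the target region and a made-up Customer item type:

    import org.springframework.batch.item.data.GemfireItemWriter;
    import org.springframework.data.gemfire.GemfireTemplate;

    // Assumed item type produced by the custom job's reader/processor
    class Customer {
        private String id;
        private String name;

        String getId() {
            return id;
        }
    }

    class CustomerRegionWriterFactory {

        // Builds a writer that puts each Customer into the GemFire region wrapped
        // by the template, keyed by the customer id
        GemfireItemWriter<String, Customer> writer(GemfireTemplate template) {
            GemfireItemWriter<String, Customer> writer = new GemfireItemWriter<>();
            writer.setTemplate(template);
            writer.setItemKeyMapper(Customer::getId);
            return writer;
        }
    }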
If you want to load CSV files into GemFire XD, you can use the built-in system procedure SYSCS_UTIL.IMPORT_DATA_EX().

How can I read a DB table instead of reading a properties file in Spring?

I want to read properties from database tables instead of reading property files in Spring.
How can I achieve this? At first I thought that I would have to override the spring-context related classes, but I think there is an easier way to implement this, or Spring already provides this feature.
Fortunately, I found a solution at the link below:
http://www.albinsblog.com/2014/07/loading-configuration-properties-from.html#.VNBK9Gjkep8
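In the same spirit as that article, here is a minimal sketch of one common approach: an ApplicationContextInitializer that reads a hypothetical app_properties(prop_key, prop_value) table and registers it as a property source. The table name and connection details are placeholders:

    import java.util.HashMap;
    import java.util.Map;

    import javax.sql.DataSource;

    import org.springframework.context.ApplicationContextInitializer;
    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.core.env.MapPropertySource;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;

    // Loads key/value pairs from an assumed app_properties(prop_key, prop_value) table
    // and registers them as a property source before the context is refreshed.
    public class DatabasePropertySourceInitializer
            implements ApplicationContextInitializer<ConfigurableApplicationContext> {

        @Override
        public void initialize(ConfigurableApplicationContext context) {
            // Placeholder connection details; the DataSource is created here because
            // the application context has not been refreshed yet.
            DataSource dataSource = new DriverManagerDataSource(
                    "jdbc:postgresql://localhost:5432/configdb", "user", "password");

            Map<String, Object> properties = new HashMap<>();
            for (Map<String, Object> row : new JdbcTemplate(dataSource)
                    .queryForList("SELECT prop_key, prop_value FROM app_properties")) {
                properties.put((String) row.get("prop_key"), row.get("prop_value"));
            }

            // Values become resolvable via @Value("${...}") and Environment#getProperty
            context.getEnvironment().getPropertySources()
                    .addFirst(new MapPropertySource("databaseProperties", properties));
        }
    }

The initializer can be registered through the contextInitializerClasses context parameter in web.xml or, in a Boot application, with SpringApplication#addInitializers.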
