Spring Batch uses a relational database to store job metadata - Elasticsearch

I am aware that Spring Batch metadata tables are written to relational database tables, say MySQL, H2, etc. But my question is whether the Spring Batch metadata tables can be written to Elasticsearch. If so, how do I proceed?

Have you checked the Spring Batch Extensions module? That module provides an ItemReader and an ItemWriter for interacting with Elasticsearch. Hope it helps.

Related

What is the best way to maintain queries in a Spring Boot application?

In my application, I am using the below technologies:
Spring Boot 2.7.x
Cassandra
Spring Batch 5.x
Java 11
As part of this, I need to extract data from the Cassandra database and write it out to a file, so I need queries to fetch the data.
I just want to know the best way to keep all queries in one place, so that if a query changes in the future I don't have to rebuild the app but only modify the query.
Using a repository class is necessary. If you are using JPA, I recommend a repository for each entity class. With JDBC it is possible to create a single repository that contains all the queries. To access the query methods I would use a service class. This way your code is well structured and maintainable for future changes.
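For example, a minimal sketch of the JDBC-style single-repository approach described above (the class, table, and column names are made up for illustration; with Cassandra the same idea applies, keeping the CQL statements as constants in one class):

    import java.time.LocalDate;
    import java.util.List;
    import java.util.Map;

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Repository;
    import org.springframework.stereotype.Service;

    // All queries live in one class, so a query change only touches this file.
    @Repository
    public class ReportQueryRepository {

        private static final String FIND_CUSTOMERS =
                "SELECT id, name, email FROM customer WHERE created_date >= ?";

        private final JdbcTemplate jdbcTemplate;

        public ReportQueryRepository(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        public List<Map<String, Object>> findCustomersSince(LocalDate from) {
            return jdbcTemplate.queryForList(FIND_CUSTOMERS, from);
        }
    }

    // In a separate file: callers go through a service, so the rest of the code never sees SQL.
    @Service
    public class ReportService {

        private final ReportQueryRepository repository;

        public ReportService(ReportQueryRepository repository) {
            this.repository = repository;
        }

        public List<Map<String, Object>> customersSince(LocalDate from) {
            return repository.findCustomersSince(from);
        }
    }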

Creating tables on the fly with Spring Boot Data JPA

Trying to build an application where tables and their fields are managed by a master table, and the creation of tables and their fields happens on demand, on the fly, based on user data posted to the server.
Tried to look at similar questions like this one but wasn't able to find a clue as to how to execute dynamic queries in the DB without creating a Repository and Entity in Spring Boot.
Robert Niestroj is correct when he writes in the comments:
If you have dynamic tables, JPA is the wrong approach. JPA is for a static schema.
I guess in theory you could do something where you generate code and possibly restart your ApplicationContext, but it is bound to be really painful and you won't benefit from using JPA.
You should look into more dynamic technologies. MyBatis and plain old JdbcTemplate or NamedParameterJdbcTemplate come to mind.
Possibly with your own abstraction layer on top if you have recurring scenarios.
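As a rough sketch of the JdbcTemplate route, the DDL can be assembled from the master-table definition (the service, table, and column names here are invented, and identifiers should be validated before being concatenated into SQL):

    import java.util.Map;
    import java.util.stream.Collectors;

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Service;

    @Service
    public class DynamicTableService {

        private final JdbcTemplate jdbcTemplate;

        public DynamicTableService(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        // columns e.g. {"name" -> "VARCHAR(255)", "age" -> "INT"}, taken from the master table
        public void createTable(String tableName, Map<String, String> columns) {
            String columnSql = columns.entrySet().stream()
                    .map(e -> e.getKey() + " " + e.getValue())
                    .collect(Collectors.joining(", "));
            // Whitelist/validate tableName and column names before building DDL like this.
            jdbcTemplate.execute("CREATE TABLE " + tableName
                    + " (id BIGINT PRIMARY KEY, " + columnSql + ")");
        }

        public int insertRow(String tableName, Map<String, Object> row) {
            String cols = String.join(", ", row.keySet());
            String params = row.keySet().stream().map(k -> "?").collect(Collectors.joining(", "));
            return jdbcTemplate.update(
                    "INSERT INTO " + tableName + " (" + cols + ") VALUES (" + params + ")",
                    row.values().toArray());
        }
    }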

Spring Batch and Elasticsearch

Here is my scenario:
I need to read a CSV file and store the output in Elasticsearch. I am using Spring Batch to read the CSV file. Can anyone give me an example of how to save to Elasticsearch using Spring Batch or Spring Batch Extensions?
It's an old question and you have probably found an answer by now, but here it goes...
To work with Elasticsearch, you need Spring Data, and you simply write items from your writer as you normally do, but through a repository instance, e.g. repository.save(list), where list is the List of items passed from the Spring Batch processor to the writer.
Here repository is basically an ElasticsearchRepository from Spring Data. You need to define repositories for your items.
You also need to point your ElasticsearchRepository definitions at your Elasticsearch instance via @EnableElasticsearchRepositories and define the persistence layer as done here. Set @EnableElasticsearchRepositories to the actual repository package of your project.
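For example, a minimal sketch of a writer that delegates to such a repository (this assumes the Spring Batch 4.x-style write(List) signature; the Product document, index name, and package are made up):

    import java.util.List;

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.elasticsearch.annotations.Document;
    import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
    import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;

    @Document(indexName = "products")
    class Product {
        @Id
        private String id;
        private String name;
        // getters and setters omitted
    }

    interface ProductRepository extends ElasticsearchRepository<Product, String> {
    }

    class ProductItemWriter implements ItemWriter<Product> {

        private final ProductRepository repository;

        ProductItemWriter(ProductRepository repository) {
            this.repository = repository;
        }

        @Override
        public void write(List<? extends Product> items) {
            // Each chunk handed over by the processor is saved to Elasticsearch.
            repository.saveAll(items);
        }
    }

    // Point @EnableElasticsearchRepositories at the package that holds your repositories.
    @Configuration
    @EnableElasticsearchRepositories(basePackages = "com.example.repository")
    class ElasticsearchConfig {
    }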
Hope it helps !!
Actually, I worked on a similar project, but instead of importing data from a CSV file I imported it from a relational MySQL database, reading and filtering the data with Spring Batch and writing it into Elasticsearch. This is the link to the project on GitHub; read the readme.md file carefully and you will find all the required configuration:
the github project link

How to read and write data from multiple databases using Spring batch update?

I am working on Spring batch update. I searched on Google but didn't find any solution for my problem.
I have two databases (MySQL, Oracle). I want to read data from MySQL and write it into Oracle using batch update.
Your problem is unclear.
You can first read the data from MySQL with one Spring JdbcTemplate object initialized with the MySQL data source, and then use another JdbcTemplate object, initialized with the Oracle data source, to write the data.
If you want to do it in one transaction, you will have to use distributed transaction/XA libraries, such as Atomikos, and Spring's distributed transaction manager. See https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-jta.html for details on Spring's integration with distributed transaction libraries.
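A minimal sketch of the two-JdbcTemplate approach (the data source qualifiers, table, and columns are invented, and the read and the write are not covered by a single XA transaction here):

    import java.util.List;

    import javax.sql.DataSource;

    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Service;

    @Service
    public class MySqlToOracleCopier {

        private final JdbcTemplate mysqlTemplate;
        private final JdbcTemplate oracleTemplate;

        public MySqlToOracleCopier(@Qualifier("mysqlDataSource") DataSource mysqlDataSource,
                                   @Qualifier("oracleDataSource") DataSource oracleDataSource) {
            this.mysqlTemplate = new JdbcTemplate(mysqlDataSource);
            this.oracleTemplate = new JdbcTemplate(oracleDataSource);
        }

        public void copyCustomers() {
            // Read from MySQL...
            List<Object[]> rows = mysqlTemplate.query(
                    "SELECT id, name, email FROM customer",
                    (rs, rowNum) -> new Object[] { rs.getLong("id"), rs.getString("name"), rs.getString("email") });

            // ...and batch-insert into Oracle.
            oracleTemplate.batchUpdate(
                    "INSERT INTO customer (id, name, email) VALUES (?, ?, ?)",
                    rows);
        }
    }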

Is there a way for Spring Batch to read the field set configuration from a database?

We are trying to develop a framework on top of Spring Batch; basically, it has to read configuration from a database, like the fields, field order, file location, etc.
Are there any existing frameworks to achieve this? Otherwise, please shed some light on this...
Thanks,
MK
I don't think there is any such extension available on top of the framework. You might have to write your own customizations to achieve a database-driven configuration for Spring Batch.
What do you want Spring Batch to read in? Have you looked at ItemReaders? http://docs.spring.io/spring-batch/trunk/reference/html/readersAndWriters.html
In your ItemReader constructor you can read in whatever configuration you require.
And see:
Reading Records From a Database in Spring Batch
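As a rough illustration of reading the configuration in when the reader is built, here is a FlatFileItemReader whose field names and file location are loaded from configuration tables (the table and column names are hypothetical):

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.item.file.FlatFileItemReader;
    import org.springframework.batch.item.file.mapping.DefaultLineMapper;
    import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.FileSystemResource;
    import org.springframework.jdbc.core.JdbcTemplate;

    @Configuration
    public class DatabaseDrivenReaderConfig {

        @Bean
        @StepScope
        public FlatFileItemReader<Map<String, Object>> configuredReader(JdbcTemplate jdbcTemplate) {
            // Field names (in order) and the file location come from configuration tables.
            List<String> fieldNames = jdbcTemplate.queryForList(
                    "SELECT field_name FROM file_layout WHERE file_id = 1 ORDER BY field_order", String.class);
            String filePath = jdbcTemplate.queryForObject(
                    "SELECT file_location FROM file_definition WHERE file_id = 1", String.class);

            DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
            tokenizer.setNames(fieldNames.toArray(new String[0]));

            DefaultLineMapper<Map<String, Object>> lineMapper = new DefaultLineMapper<>();
            lineMapper.setLineTokenizer(tokenizer);
            lineMapper.setFieldSetMapper(fieldSet -> {
                Map<String, Object> row = new LinkedHashMap<>();
                for (String name : fieldNames) {
                    row.put(name, fieldSet.readString(name));
                }
                return row;
            });

            FlatFileItemReader<Map<String, Object>> reader = new FlatFileItemReader<>();
            reader.setResource(new FileSystemResource(filePath));
            reader.setLineMapper(lineMapper);
            return reader;
        }
    }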
