Database event listener using Spring Boot

I need to attach a listener to a table in the database
which should call a Spring Boot method once a CRUD operation is performed on the table (pre-listeners and post-listeners).
The entry can be made from any source.
How can I do that in Spring Boot?

If the entity can be created from any source - e.g. a manual insert - this is something which is outside the scope and context of your running application.
What you're describing is known as the CDC (change data capture) pattern.
To implement CDC in this case you need to use the instrumentation of the underlying database - for example triggers.
As I see this is tagged with MongoDB, triggers are not an option, as MongoDB doesn't have support for triggers.
If you are using MongoDB v3.6+ you can leverage the new Change Streams feature. The official documentation provides an example with Java.
Change streams allow applications to access real-time data changes
without the complexity and risk of tailing the oplog. Applications can
use change streams to subscribe to all data changes on a single
collection, a database, or an entire deployment, and immediately react
to them. Because change streams use the aggregation framework,
applications can also filter for specific changes or transform the
notifications at will.
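For illustration, a minimal sketch of watching a single collection with the synchronous MongoDB Java driver (4.x) could look like the following; the connection string, database and collection names are placeholders, and change streams require a replica set or sharded cluster:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import org.bson.Document;

public class CollectionWatcher {

    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> collection =
                    client.getDatabase("shop").getCollection("orders");

            // Blocks and invokes the callback for every insert/update/replace/delete;
            // getFullDocument() is null for deletes
            collection.watch().forEach((ChangeStreamDocument<Document> change) ->
                    System.out.println(change.getOperationType() + ": " + change.getFullDocument()));
        }
    }
}

In a Spring Boot application this loop would typically run on its own thread (for example from an ApplicationRunner) so that it doesn't block startup.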
If you are using earlier versions of MongoDB you can monitor the oplog or use tailable cursors with capped collections.
Another approach would be to look into a 3rd-party solution that turns everything happening in the DB into event streams - for example Debezium.

This article explains how to call any program from a DB trigger.
Therefore, you can just create a Spring Boot Java app and make the system call to your app from the trigger.
A similar mechanism is also available in Oracle and other databases.
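A related variant of this pattern, instead of making a system call from the trigger, is PostgreSQL's LISTEN/NOTIFY: the trigger raises a notification (e.g. via pg_notify) and the Java application polls for it over a plain JDBC connection. The sketch below is only an illustration; the channel name table_changes, the connection details and the trigger wiring are assumptions:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import org.postgresql.PGConnection;
import org.postgresql.PGNotification;

public class TableChangeListener {

    public static void main(String[] args) throws Exception {
        // The trigger side (not shown) would raise: pg_notify('table_changes', row_to_json(NEW)::text)
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/app", "app", "secret")) {

            try (Statement stmt = conn.createStatement()) {
                stmt.execute("LISTEN table_changes");
            }

            PGConnection pgConn = conn.unwrap(PGConnection.class);
            while (true) {
                // Wait up to 5 seconds for notifications raised by the trigger
                PGNotification[] notifications = pgConn.getNotifications(5000);
                if (notifications != null) {
                    for (PGNotification n : notifications) {
                        System.out.println("Change on " + n.getName() + ": " + n.getParameter());
                    }
                }
            }
        }
    }
}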

Related

What is the best way to maintain queries in Spring boot application?

In my application I am using the below technologies:
Spring Boot 2.7.x
Cassandra
Spring Batch 5.x
Java 11
As part of this, I need to extract data from the Cassandra database and write it out to a file,
so I need to use queries to fetch the data.
I just want to know the best way to maintain all the queries in one place, so that if any query changes come in the future I shouldn't have to rebuild the app, but rather just modify the query.
Using a repository class is the way to go. If you are using JPA I recommend using a repository for each entity class. With JDBC it is possible to create a single repository which contains all the queries. To access the query methods I would use a service class. This way your code is well structured and maintainable for future changes.
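Since the question mentions Cassandra, one possible approach (an illustrative sketch, not the only option) is to keep the CQL strings in application.yml and bind them with @ConfigurationProperties, so a query change only needs a configuration update and a restart rather than a rebuild; the property names, the query and the CqlSession usage below are assumptions:

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.Row;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Service;

// application.yml (illustrative):
// queries:
//   export-customers: "SELECT id, name, email FROM customers"

// Register via @ConfigurationPropertiesScan or @EnableConfigurationProperties(QueryProperties.class)
@ConfigurationProperties(prefix = "queries")
public class QueryProperties {

    private String exportCustomers;

    public String getExportCustomers() { return exportCustomers; }
    public void setExportCustomers(String exportCustomers) { this.exportCustomers = exportCustomers; }
}

@Service
class CustomerExportService {

    private final CqlSession session; // auto-configured by spring-boot-starter-data-cassandra
    private final QueryProperties queries;

    CustomerExportService(CqlSession session, QueryProperties queries) {
        this.session = session;
        this.queries = queries;
    }

    void export() {
        // The CQL text lives in configuration, not in compiled code
        for (Row row : session.execute(queries.getExportCustomers())) {
            System.out.println(row.getString("id") + "," + row.getString("name"));
        }
    }
}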

Which datastore (database) should be used for Spring-boot REST API application with AZURE

There are many blogs available around this but I'm still not getting exactly what is needed.
I am trying to write a REST API with Spring Boot and store data in a database. Here the database structure may change (new tables can get introduced or some existing ones may get renamed).
Which DB can be used so that there would be minimal code changes needed both on the Java side and the DB side?
What would be the best design approach in this scenario, considering a technology stack of Spring Boot and Azure?
Please clarify what you envision for your persistent storage. Why Azure only? Consider refining the question.
E.g. an H2 database with Spring Boot is the most memory-efficient option;
see Lightest Database to be packed with an application.
About minimal code changes - I'd go with one of the ORMs - JPA (or Hibernate). That way you will only need to maintain the @Entity classes on the Java side.
Don't forget - minimal changes still need to be addressed on both the database and Java side.
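To make the "only maintain the @Entity class" point concrete, a minimal illustrative JPA mapping could look like this (the table and column names are made up); if the table or a column gets renamed in the database, only the @Table / @Column annotations have to change:

import jakarta.persistence.Column;   // javax.persistence on Spring Boot 2.x
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;

// If the DB table is renamed, only the @Table name changes; the Java code keeps using Customer
@Entity
@Table(name = "customers")
public class Customer {

    @Id
    private Long id;

    // If the column is renamed (e.g. to full_name), only this annotation changes
    @Column(name = "name")
    private String name;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}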

MongoDB Panache Best Practices for Multi-Document Transactions

Regarding the quote below in the MongoDB Panache documentation [https://quarkus.io/guides/mongodb-panache]
MongoDB offers ACID transactions since version 4.0. MongoDB with Panache doesn’t provide support for them.
As such, is there a recommended approach or a best practice on handling Multi-Document transactions to ensure atomicity?
Consider the example:
public void buyCarTest() {
    carRepository.increaseStock(1);
    cashRepository.decreaseCash(10000);
}
If we were to do it manually, it would be:
check if the write operation into the 2nd repository failed, and
if so, revert the changes made in carRepository.
This approach seems tenuous at best, especially if there are more than two repositories I'm writing into.
Thanks.
What you propose is what is called a compensation, and it is tricky to implement.
I'd rather use an event-based mechanism for this: you send the two events and they are processed asynchronously, so a failure of one of the participants (the stock manager) will not impact the second one.
You can also use a MongoDB transaction, but for this you will need to use the MongoDB API instead of Panache (so get the collection from your entity and use it).
Transaction support for MongoDB with Panache is a work in progress (see https://github.com/quarkusio/quarkus/pull/7222); you can watch this issue to be notified when it's implemented.
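For reference, a rough sketch of the "use the MongoDB API instead of Panache" option could look like the following; it assumes Panache entities named Car and Cash, an injected MongoClient, simplified update logic, and a replica set (required for multi-document transactions); javax.* imports apply to the Quarkus versions of that time, jakarta.* to newer ones:

import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoClient;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.Updates;
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

@ApplicationScoped
public class BuyCarService {

    @Inject
    MongoClient mongoClient;

    public void buyCar(String carId, String accountId) {
        try (ClientSession session = mongoClient.startSession()) {
            // withTransaction commits on success and aborts if the body throws,
            // so both updates succeed or neither does
            session.withTransaction(() -> {
                Car.mongoCollection().updateOne(session,
                        Filters.eq("_id", carId),
                        Updates.inc("stock", 1));
                Cash.mongoCollection().updateOne(session,
                        Filters.eq("_id", accountId),
                        Updates.inc("amount", -10000));
                return null;
            });
        }
    }
}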

Reactive Spring boot with SQL databases

I found many examples of using Spring Boot reactive with document databases, but none with SQL databases.
I see that it may not support SQL databases yet, probably because of some missing feature in the JPA/JDBC stack.
I also see that there is no point in using reactive services that depend on a SQL database with no reactive support.
The question here is: is there any ongoing development to make this happen (reactive JPA)?
There is a reactive feature built into many RDBMSs called "Change Data Capture" (CDC), which writes data changes to an asynchronous transaction log for an enabled table. Reactive systems built to stream this data are usually built on top of that feature. For example, a well-known open source tool that does this is Debezium. You can find other open source projects that do something similar, or write your own using the CDC functions that databases usually provide to support it.
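As an illustration of consuming such a change stream from Java, a minimal sketch with Debezium's embedded engine could look like this; the connector class, connection settings, offset storage and topic prefix below are placeholder assumptions:

import io.debezium.engine.ChangeEvent;
import io.debezium.engine.DebeziumEngine;
import io.debezium.engine.format.Json;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CdcConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("name", "engine");
        props.setProperty("connector.class", "io.debezium.connector.postgresql.PostgresConnector");
        props.setProperty("offset.storage", "org.apache.kafka.connect.storage.FileOffsetBackingStore");
        props.setProperty("offset.storage.file.filename", "/tmp/offsets.dat");
        props.setProperty("database.hostname", "localhost");
        props.setProperty("database.port", "5432");
        props.setProperty("database.user", "app");
        props.setProperty("database.password", "secret");
        props.setProperty("database.dbname", "app");
        props.setProperty("topic.prefix", "app");

        // Every row-level change captured from the database's CDC log arrives as a JSON change event
        DebeziumEngine<ChangeEvent<String, String>> engine = DebeziumEngine.create(Json.class)
                .using(props)
                .notifying(record -> System.out.println(record.value()))
                .build();

        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.execute(engine);
    }
}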

SDN4 - Entity lifecycle event handlers compatible with GraphRepository

I am using Spring Data Neo4j 4.0.0.RELEASE and would like to take advantage of the built-in data manipulation events to insert some audit information on the fly (e.g. timestamps). The documentation seems to suggest that this is only available to me if I am directly using Neo4jTemplate.
Are there any similar hooks available for the GraphRepository abstraction? That is, is there an out-of-the-box way for me to hook into graph repository operations (a la Spring Data JPA)? I've written some tests and can confirm that the documented events don't fire when I'm just using the GraphRepository.
AbstractGraphRepository is from the 3.x codebase, so is not directly relevant here.
As noted, SDN 4 does not yet provide automatic support for Spring's RepositoryEventListener interfaces. Implementing event listeners correctly in SDN 4.0 is complicated because of the nature of the underlying save mechanism, which persists an entire tree of "dirty" objects rather than just a single top-level entity. If the object you want to intercept is not the top-level entity being saved, the event listener for it won't fire.
The SDN development team is currently considering the best way to enable event handlers to fire for objects that may be persisted at any depth in the save tree.
In the meantime, the solution suggested by simonl should work.
