Spring Boot: how to implement workflow as synchronous REST services integrating external applications

I'm here to ask for a suggestion on which technology to use to add new features to an existing application based on Spring Boot.
The new features consist of some workflows, exposed as synchronous REST services, that must update the database and call REST services exposed by external applications, which will in turn update their own databases.
For example, a service could implement this workflow:
1) insert rows in the database
2) call the REST service of application X, which will update its database
3) update rows in the database
4) call the REST service of application Y, which will update its database
5) update rows in the database
This workflow must be synchronous, and it is started by a human operator who will receive the outcome within a few seconds.
If, for example, step 4) fails, I need to:
perform a compensation, calling another REST service of application X, in order to undo what it did in step 2)
roll back the inserts/updates made in my database in steps 1) and 3)
Which technology, framework, tool or other approach would you use? In the past I implemented a similar scenario using Oracle SOA, but in this case I would prefer to avoid introducing new infrastructure into my application based on Spring Boot.
Thank you

I guess you need to learn a bit more about Spring Framework and Spring Boot.
1. insert rows in the database: Spring Data JPA
2. call the REST service of application X, which will update its database: an HTTP client such as RestTemplate or WebClient
3. update rows in the database: Spring Data JPA (again)
4. call the REST service of application Y, which will update its database: RestTemplate or WebClient again
5. update rows in the database: Spring Data JPA
And so on...
If you want to make a real workflow, you can use Activiti.
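To make that concrete, here is a minimal orchestration sketch in plain Spring Boot. All the types (OrderWorkflowService, OrderRepository, Order, OrderRequest, XResponse, YResponse) and the URLs are hypothetical placeholders; the point is only that a thrown exception makes @Transactional roll back the local steps 1) and 3), while the call to X is compensated explicitly:

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;

@Service
public class OrderWorkflowService {

    private final OrderRepository orderRepository; // hypothetical Spring Data JPA repository
    private final RestTemplate restTemplate;

    public OrderWorkflowService(OrderRepository orderRepository, RestTemplate restTemplate) {
        this.orderRepository = orderRepository;
        this.restTemplate = restTemplate;
    }

    @Transactional
    public Order execute(OrderRequest request) {
        // 1) insert rows in the local database
        Order order = orderRepository.save(Order.from(request));

        // 2) call application X (URL and payload/response types are illustrative only)
        XResponse x = restTemplate.postForObject("http://app-x/api/items", order, XResponse.class);

        // 3) update rows in the local database (managed entity, flushed on commit)
        order.markProcessedByX(x.getId());

        try {
            // 4) call application Y
            YResponse y = restTemplate.postForObject("http://app-y/api/items", order, YResponse.class);

            // 5) update rows in the local database
            order.markProcessedByY(y.getId());
            return order;
        } catch (RestClientException e) {
            // Compensation: undo step 2) on X's side, then rethrow so that
            // @Transactional rolls back the local changes from steps 1) and 3).
            restTemplate.delete("http://app-x/api/items/{id}", x.getId());
            throw new IllegalStateException("Call to application Y failed, X was compensated", e);
        }
    }
}
```

Note that if the compensation call itself fails you are still left with an inconsistency, which is one reason to consider a real workflow engine such as the Activiti suggestion above once the flows get non-trivial.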

Related

Cache refresh in Spring Boot whenever a data update happens in the database from a third-party application (the update will not be made by our application)

I need to refresh my cache as soon as a DB change happens; the update might also come from another source, not only from my application (the DB is MySQL). How can I do that, given that MySQL does not allow calling a Java method from a trigger the way Oracle does? (Spring Boot application)

Transaction management in microservices

We are rewriting a legacy app using microservices. Each microservice has its own DB. There are certain API calls that require calling another microservice and persisting data into both DBs. How can we implement distributed transaction management effectively in this case?
Since we have not migrated completely to the new microservices environment, we still write data back to the old monolith. For this, when a microservice endpoint is called, we call a monolith service from the microservice API to write back the same data. How do we deal with the same problem in this case as well?
Thanks in advance.
There are different distributed transaction frameworks, usually included and maintained as part of heavyweight application servers like JBoss and WebLogic.
The standard usually used by such services is Jakarta Transactions (JTA; formerly Java Transaction API).
Tomcat and Spring don't support distributed transactions out of the box. You can add this functionality using a third-party framework like Atomikos (just googled, I've never used it).
But remember, a microservice with JTA is not "micro" anymore :-)
Here is a small overview of the available technologies and possible workarounds:
https://www.baeldung.com/transactions-across-microservices
If you can afford to write to the legacy system later (i.e. allow some latency between updating the microservice and the legacy system), you can use the outbox pattern.
Essentially that means that you write to the microservice database in a transactional way, both to the tables you usually write to and to an additional "outbox" table of changes to apply, and then have a separate process that reads that table and updates the legacy system.
You can also achieve something similar with a change data capture mechanism on the DB used in the microservice(s).
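As a rough illustration of the outbox approach in Spring terms (CustomerRepository, OutboxRepository, OutboxEntry and the legacy URL are hypothetical names; a real implementation would also need ordering, idempotency and error handling):

```java
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.client.RestTemplate;

@Service
public class CustomerService {

    private final CustomerRepository customerRepository; // hypothetical JPA repositories
    private final OutboxRepository outboxRepository;
    private final RestTemplate restTemplate;

    public CustomerService(CustomerRepository customerRepository,
                           OutboxRepository outboxRepository,
                           RestTemplate restTemplate) {
        this.customerRepository = customerRepository;
        this.outboxRepository = outboxRepository;
        this.restTemplate = restTemplate;
    }

    @Transactional
    public Customer update(Customer customer) {
        // The business write and the outbox write share one local transaction,
        // so they are either both committed or both rolled back.
        Customer saved = customerRepository.save(customer);
        outboxRepository.save(OutboxEntry.forCustomerUpdate(saved));
        return saved;
    }

    // The "separate process": here simply a scheduled method (needs @EnableScheduling)
    // that drains the outbox and applies the changes to the legacy system with some latency.
    @Scheduled(fixedDelay = 5000)
    @Transactional
    public void drainOutbox() {
        for (OutboxEntry entry : outboxRepository.findUnprocessed()) {
            restTemplate.postForObject("http://legacy-monolith/api/customers", entry.getPayload(), Void.class);
            entry.markProcessed();
        }
    }
}
```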
Check out this answer on "Why is 2-phase commit not suitable for a microservices architecture?": https://stackoverflow.com/a/55258458/3794744

Should microservices connected with Axon share the Axon Framework related tables?

I am starting a project where I want to have multiple services that communicate with each other using Axon Server.
I have more than one service with the following stack:
Spring Boot 2.3.0.RELEASE (with starters: data JPA, web, MySQL)
Axon Spring Boot Starter 4.2.1
Each one of the services uses a different schema in the MySQL server.
When I start the Spring Boot service with the Axon Framework activated, some tables for tokens, sagas, etc. are created in the database schema of each application.
I have two questions:
1) In the architecture that I am trying to build, should I have only one database for all the ‘axon enabled’ services, so the sagas, tokens, events, etc. are only in one place?
2) If so, can anyone provide an example of how to configure a custom EntityManagerProvider to have the database of the service separated from the database of Axon?
I assume each of your microservices models a sub-domain. Since the events do model a (sub)domain, along with aggregates, entities and value objects, I very much favor keeping the Axon-related schemas separated, most likely along with the databases/schemas corresponding to each service. I would, thus, prefer a modeling-first approach when considering such technical options.
It is what we're currently doing in our microservices ecosystem.
There is at least one more technical reason to go with the same schema (one per sub-domain, that is) for both Axon assets and application-specific assets; it was pointed out to me by my colleague Marian. If you (will) use event sourcing (thus reconstructing the state of an aggregate by fetching and applying all past events that resulted from handling the commands), then you will most likely need transactions which encompass this fetching as well as the command-handling code, which might in turn trigger (through events) writes to your microservice-specific database.
Axon can require up to five tables, depending on your usage of Axon, of course.
These are:
The Event table.
The Snapshot Event table.
The Token table.
The Saga table.
The Association Value Entry table.
When using Axon Server, tables 1 and 2 will not be created, since Axon Server is the storage solution for events and snapshots.
When not using Axon Server, I would indeed suggest having a dedicated data source for these.
Table 3, which serves the TokenStore, should be as close as possible to your query models. The tokens track how far a given EventProcessor has come in handling events. As these EventProcessors typically drive projectors which create your query models, keeping them together is sensible from a transactional perspective.
Tables 4 and 5 are both required for sagas. The "Saga table" stores the serialized sagas, whereas the "Association Value Entry table" carries the association values between events and sagas so that the framework can load the right sagas. I'd store these either in a dedicated database or along with the other tables of the given (micro)service.
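Regarding the second part of the question (a custom EntityManagerProvider so that Axon's tables live in their own database): the exact setup depends on your Axon and Spring versions, but a minimal sketch could look like the one below. It assumes a second persistence unit for the Axon tables whose EntityManagerFactory bean is named axonEntityManagerFactory and is configured elsewhere; please double-check against the Axon reference guide for your version.

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;

import org.axonframework.common.jpa.EntityManagerProvider;
import org.axonframework.common.jpa.SimpleEntityManagerProvider;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.SharedEntityManagerCreator;

@Configuration
public class AxonJpaConfig {

    // Overrides the default EntityManagerProvider so that Axon's tables
    // (tokens, sagas, etc.) live in the dedicated "axon" persistence unit.
    @Bean
    public EntityManagerProvider axonEntityManagerProvider(
            @Qualifier("axonEntityManagerFactory") EntityManagerFactory emf) {
        // A transaction-aware, container-managed EntityManager proxy for that factory
        EntityManager sharedEntityManager = SharedEntityManagerCreator.createSharedEntityManager(emf);
        return new SimpleEntityManagerProvider(sharedEntityManager);
    }
}
```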

Spring Realtime Data Sending With Database

I have a small e-commerce application built with Spring Boot, and I want the REST endpoint where the products are listed to work in realtime, i.e. the endpoint should reflect it whenever any product is updated or created.
I'm confused about how to listen to the database in realtime. I found SSE (Server-Sent Events) for realtime data sending, but I couldn't find out how to do this with the database.
Can you please suggest the best methodology?
If you are using Hibernate for managing data, then use Hibernate Envers and process the model after the create/update in the DB.
My suggestion: whenever a create/update action happens in Hibernate, Hibernate Envers will be triggered, and you can use an Akka actor to process the event.
Hibernate -> Hibernate Envers -> Akka (business layer) -> REST endpoint
Thanks,
Vimalesh
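For the SSE part mentioned in the question, a rough sketch of an endpoint that pushes product changes to all connected clients could look like this; ProductChangedEvent is a hypothetical application event that whatever observes the persistence layer (Hibernate Envers, a JPA entity listener, or the service itself) would publish after each create/update:

```java
import java.io.IOException;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.springframework.context.event.EventListener;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class ProductStreamController {

    private final List<SseEmitter> emitters = new CopyOnWriteArrayList<>();

    // Clients subscribe here and keep the connection open.
    @GetMapping(path = "/products/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public SseEmitter stream() {
        SseEmitter emitter = new SseEmitter(Long.MAX_VALUE); // effectively no timeout
        emitter.onCompletion(() -> emitters.remove(emitter));
        emitter.onTimeout(() -> emitters.remove(emitter));
        emitters.add(emitter);
        return emitter;
    }

    // Pushes every product change to all connected clients.
    @EventListener
    public void onProductChanged(ProductChangedEvent event) {
        for (SseEmitter emitter : emitters) {
            try {
                emitter.send(SseEmitter.event().name("product").data(event.getProduct()));
            } catch (IOException e) {
                emitters.remove(emitter); // drop dead connections
            }
        }
    }
}
```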

How to restart a Spring Boot application (or just a particular instance of a class) when flyway migrate is run or a new schema version is added

I have a particular instance of a class that loads some data from the database, so every time the database is updated the system should recreate the instance of that class to get the updated data.
There are several trivial solutions:
(1) Use scheduling to load the latest data from the DB periodically (see the sketch after this list).
(2) Provide a web service, such as a RESTful API, to reload the latest data from the DB on demand.
(3) If your DB supports event-driven listeners, you can trigger your application to achieve this, either by invoking a service as described in (2) or by sending a message to a queue and handling it with a consumer.
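For option (1), a small sketch of what the periodic reload could look like (ReferenceDataHolder, ReferenceData and ReferenceDataRepository are hypothetical names):

```java
import java.util.Collections;
import java.util.List;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ReferenceDataHolder {

    private final ReferenceDataRepository repository; // hypothetical Spring Data repository
    private volatile List<ReferenceData> cached = Collections.emptyList();

    public ReferenceDataHolder(ReferenceDataRepository repository) {
        this.repository = repository;
    }

    // Reloads the data every 60 seconds (requires @EnableScheduling on a configuration class).
    @Scheduled(fixedDelay = 60_000)
    public void reload() {
        this.cached = repository.findAll();
    }

    // Callers always see the most recently loaded snapshot.
    public List<ReferenceData> current() {
        return cached;
    }
}
```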
