How to make Spring Data JPA and MongoDb multi-tenant at the same time? - spring-boot

I am using Spring Data JPA and MongoDB in my REST application. My database structure is such that for each customer type we have a separate Oracle and MongoDB database. Whenever a customer makes an HTTP request to my REST server, I determine the customer type based on a request header parameter.
For instance, for customer type A there will be an "Oracle database A" and a "Mongo database A". Similarly there will be an "Oracle database B" and a "Mongo database B" for customer type B, and so on. The number of customer types is fixed.
Now what I want is: if Customer B makes an HTTP request, then for this particular thread all Oracle hits should go to Oracle database B and all Mongo hits should go to Mongo database B.
I am aware of AbstractRoutingDataSource for making JPA multi-tenant, but I cannot think of a way to make both MongoDB and Oracle multi-tenant at the same time.
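One way to approach this is a single thread-local tenant context that drives both sides: a servlet filter reads the header and stores the customer type, `AbstractRoutingDataSource` uses it to pick the Oracle `DataSource`, and an overridden `MongoDatabaseFactory` uses it to pick the Mongo database. A minimal sketch, where the header name, database naming scheme, and registration details are assumptions (use `jakarta.servlet` instead of `javax.servlet` on Spring Boot 3+):

```java
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import java.io.IOException;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

// Holds the tenant (customer type) for the current request thread.
final class TenantContext {
    private static final ThreadLocal<String> TENANT = new ThreadLocal<>();
    static void set(String tenant) { TENANT.set(tenant); }
    static String get() { return TENANT.get(); }
    static void clear() { TENANT.remove(); }
}

// Resolves the customer type once per request from a (hypothetical) header.
class TenantFilter implements Filter {
    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        try {
            TenantContext.set(((HttpServletRequest) req).getHeader("X-Customer-Type"));
            chain.doFilter(req, res);
        } finally {
            TenantContext.clear(); // don't leak the tenant to pooled threads
        }
    }
}

// JPA side: register one Oracle DataSource per customer type under keys "A", "B", ...
class TenantRoutingDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.get();
    }
}

// Mongo side: same idea, but the lookup happens on every database access.
class TenantMongoDatabaseFactory extends SimpleMongoClientDatabaseFactory {
    TenantMongoDatabaseFactory(MongoClient client) {
        super(client, "default"); // fallback database name (assumption)
    }

    @Override
    public MongoDatabase getMongoDatabase() {
        // Database-per-tenant naming convention is an assumption.
        return super.getMongoDatabase("mongo_" + TenantContext.get());
    }
}
```

To wire this up you would call `setTargetDataSources(...)` on the `TenantRoutingDataSource` with one Oracle `DataSource` per customer type, and expose `TenantMongoDatabaseFactory` as the `MongoDatabaseFactory` bean so that `MongoTemplate` and the Mongo repositories route through it.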

Related

How to send a REST request for only particular records in an Oracle database table using Spring Boot

I have created a simple service using Spring Boot. There is a table Message_Queue as shown in this image link -
Message_Queue Table
I am using an Oracle database. Here, msisdn is a number which is the primary key, and the table has some other fields like sim_number, activation_date and subscription_id.
There is another Spring Boot service which sends a REST API request to add the activation_date and subscription_id details to the respective record in the message_queue table. The message_queue table holds up to 2 million records.
I want to create a scheduled task in my Spring Boot application which will send a REST API request to some other third service, but only for those records which have the activation_date and subscription_id details filled in, and will then also delete those records from the table.
What is the best way to achieve that using the Spring Boot framework? Please answer with enterprise-level standards in mind.
Is it a good approach to fetch, say, 1000 records at a time with pagination until all records have been checked, and for each record that has activation_date and subscription_id, send a REST request for it and also a delete request to the DB for the same record?
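One common refinement of the approach described above is to let Oracle do the filtering with a derived query on the non-null columns, rather than scanning all 2 million rows in the application, and then page through the matches in fixed-size chunks. A rough sketch of such a scheduled task, where the entity, repository method, and third-service URL are all hypothetical names:

```java
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.repository.PagingAndSortingRepository;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

// Hypothetical repository; the derived query lets the database filter the
// rows instead of streaming all 2 million records to the application.
interface MessageQueueRepository extends PagingAndSortingRepository<MessageQueue, Long> {
    Page<MessageQueue> findByActivationDateIsNotNullAndSubscriptionIdIsNotNull(Pageable pageable);
}

@Component
public class MessageQueueDispatcher {

    private final MessageQueueRepository repository;
    private final RestTemplate restTemplate = new RestTemplate();

    public MessageQueueDispatcher(MessageQueueRepository repository) {
        this.repository = repository;
    }

    @Scheduled(fixedDelay = 60_000) // run one minute after the previous run finishes
    public void dispatchCompletedRecords() {
        Page<MessageQueue> page;
        do {
            // Always request page 0: processed rows are deleted, so the
            // remaining matches shift forward after every batch.
            page = repository.findByActivationDateIsNotNullAndSubscriptionIdIsNotNull(
                    PageRequest.of(0, 1000));
            for (MessageQueue record : page) {
                restTemplate.postForEntity(
                        "https://third-service.example/api/activations", record, Void.class);
                repository.delete(record); // delete only after the call succeeded
            }
        } while (page.hasNext());
    }
}
```

Deleting each record only after its REST call succeeds keeps the task safely re-runnable: if a run dies halfway, the next run simply picks up the rows that are still there.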

Spring Boot: how to implement workflow as synchronous REST services integrating external applications

I'm here to ask for a suggestion on which technology to use to add new features to an existing application based on Spring Boot.
The new features consist of some workflows, exposed as synchronous REST services, that must update the database and call REST services exposed by external applications, which consequently will update their own database.
For example, a service could implement this workflow:
1) insert rows in database
2) call the REST service of the application X, which will update its database
3) update rows in database
4) call the REST service of the application Y, which will update its database
5) update rows in database
This workflow must be synchronous, and it is started by a human operator who will receive the outcome in a few seconds.
If, for example, the step 4) fails, I need to:
make a compensation call to another REST service of the application X, in order to undo what it did in step 2)
roll back the insert/update made in my database in steps 1) and 3)
Which technology, framework, tool or other would you use? In the past I implemented a similar scenario using Oracle SOA, but in this case I would rather avoid introducing new infrastructure into my application based on Spring Boot.
Thank you
I guess you need to learn a bit more about Spring Framework and Spring Boot.
1. insert rows in database: Spring Data JPA
2. call the REST service of the application X, which will update its database: an HTTP client such as RestTemplate or WebClient
3. update rows in database: Spring Data JPA (again)
4. call the REST service of the application Y, which will update its database: RestTemplate or WebClient again
5. update rows in database: Spring Data JPA once more
And so on.
If you want to make a real workflow, you can use Activiti.
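The compensation in the failure path of step 4) can be expressed with a plain try/catch around the external calls: the local inserts/updates roll back automatically if the whole workflow runs inside one `@Transactional` method and the exception is rethrown, while the already-committed call to application X needs an explicit compensating call. A stripped-down sketch, where the `AppClient` interface and its endpoints are hypothetical:

```java
// Hypothetical REST clients for the external applications X and Y;
// in Spring these would wrap RestTemplate or WebClient calls.
interface AppClient {
    void create(String orderId);
    void cancel(String orderId); // the compensation endpoint
}

class OrderWorkflow {
    private final AppClient appX;
    private final AppClient appY;

    OrderWorkflow(AppClient appX, AppClient appY) {
        this.appX = appX;
        this.appY = appY;
    }

    // In the real service this method would be @Transactional, so the
    // local inserts/updates (steps 1, 3, 5) roll back when we rethrow.
    void execute(String orderId) {
        // step 1: insert rows in database (omitted)
        appX.create(orderId);          // step 2
        // step 3: update rows in database (omitted)
        try {
            appY.create(orderId);      // step 4
        } catch (RuntimeException e) {
            appX.cancel(orderId);      // compensate step 2
            throw e;                   // propagate -> local transaction rolls back
        }
        // step 5: update rows in database (omitted)
    }
}
```

This is essentially a hand-rolled saga with one compensation step; a workflow engine like Activiti becomes worthwhile once the number of steps and compensations grows beyond what a method like this can readably hold.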

AxonIQ AxonFramework MongoEventStorageEngine framework table creation on business DB

I am using AxonIQ AxonFramework version 4.5.3 with Spring Boot and custom event store.
I'm using MongoEventStorageEngine and configured a separate MongoDB database for the EventStorage.
I am doing some business logic with my business database through a microservice. In the same microservice, I've configured the custom EventStorage.
But a few tables (viz. association_value_entry, saga_entry, token_entry) are getting created on my business database which is a PostgresDB.
Why is Axon Framework creating new tables in my business database when I have already configured a separate MongoDB database for EventStorage? All the database objects Axon needs should ideally be created in the EventStorage database rather than in my business database.
The tables you mentioned should be part of your 'read' model (I believe that is what you called the business database).
They are not used for Event Storage or Event Sourcing, but rather for framework concerns handled on the client side. For example, token_entry is, among other things, the table where your app keeps track of the tokens and events it has already consumed - you can read more about it here. The same goes for the saga tables: sagas are stored on the client side and have nothing to do with the Event Store - you can read more about it here.
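Those tables stop appearing in the business database once the token and saga stores are also pointed at Mongo. With the axon-mongo extension that is roughly the following configuration; the bean wiring and database name are assumptions, and the builder/package details should be checked against the extension docs for your 4.5.x version:

```java
import com.mongodb.client.MongoClient;
import org.axonframework.eventhandling.tokenstore.TokenStore;
import org.axonframework.extensions.mongo.DefaultMongoTemplate;
import org.axonframework.extensions.mongo.MongoTemplate;
import org.axonframework.extensions.mongo.eventhandling.saga.repository.MongoSagaStore;
import org.axonframework.extensions.mongo.eventsourcing.tokenstore.MongoTokenStore;
import org.axonframework.modelling.saga.repository.SagaStore;
import org.axonframework.serialization.Serializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AxonMongoConfig {

    // Note: this is Axon's MongoTemplate, not Spring Data's. It points at the
    // same dedicated Mongo database already used by MongoEventStorageEngine.
    @Bean
    public MongoTemplate axonMongoTemplate(MongoClient mongoClient) {
        return DefaultMongoTemplate.builder()
                .mongoDatabase(mongoClient, "axon-eventstore") // name is an assumption
                .build();
    }

    // Keeps token_entry out of the JPA business database.
    @Bean
    public TokenStore tokenStore(MongoTemplate axonMongoTemplate, Serializer serializer) {
        return MongoTokenStore.builder()
                .mongoTemplate(axonMongoTemplate)
                .serializer(serializer)
                .build();
    }

    // Keeps saga_entry and association_value_entry out of it as well.
    @Bean
    public SagaStore<Object> sagaStore(MongoTemplate axonMongoTemplate, Serializer serializer) {
        return MongoSagaStore.builder()
                .mongoTemplate(axonMongoTemplate)
                .serializer(serializer)
                .build();
    }
}
```

Without these beans, Axon's Spring Boot autoconfiguration falls back to the JPA-based token and saga stores, which is why the tables show up in the Postgres business database.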

Should microservices connected with axon share the axon framework related tables?

I am starting a project where I want to have multiple services that communicate with each other using Axon Server.
I have more than one service with the following stack:
Spring Boot 2.3.0.RELEASE (with starters: Data, JPA, web, mysql)
Axon
Spring Boot Starter - 4.2.1
Each of the services uses a different schema in the MySQL server.
When I start the spring boot service with the axon framework activated, some tables for tokens, sagas, etc are created in the database schema of each application.
I have two questions:
1. In the architecture that I am trying to build, should I have only one database for all the 'axon enabled' services, so the sagas, tokens, events, etc. are in only one place?
2. If so, can anyone provide an example of how to configure a custom EntityManagerProvider to have the database of the service separated from the database of Axon?
I assume each of your microservices models a sub-domain. Since the events do model a (sub)domain, along with aggregates, entities and value objects, I very much favor keeping the Axon-related schemas separated, most likely along with the databases/schemas corresponding to each service. I would, thus, prefer a modeling-first approach when considering such technical options.
It is what we're currently doing in our microservices ecosystem.
There is at least one more technical reason to go with the same schema (one per sub-domain, that is) for both Axon assets and application-specific assets; it was pointed out to me by my colleague Marian. If you use Event Sourcing (thus reconstructing the state of an aggregate by fetching and applying all past events produced while handling commands), then you will most likely need transactions that encompass both this fetching and the command-handling code, which might in turn trigger (through events) writes to your microservice-specific database.
Axon can require up to five tables, depending on your usage of Axon. These are:
1. The Event table.
2. The Snapshot Event table.
3. The Token table.
4. The Saga table.
5. The Association Value Entry table.
When using Axon Server, tables 1 and 2 will not be created since Axon Server is the storage solution for events and snapshots.
When not using Axon Server, I would indeed suggest to have a dedicated datasource for these.
Table 3, which services the TokenStore, should be as close as possible to your Query Models. The tokens portray how far a given EventProcessor is with handling events. As these EventProcessors typically service projectors which create your query models, keeping them together is sensible from a transactional perspective.
Tables 4 and 5 are both required for Sagas. The "Saga table" stores the serialized sagas, whereas the "Association Value Entry table" carries the association values between events and sagas so that the framework can load the right sagas. I'd store these either in a dedicated database or along with the other tables of the given (micro)service.
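As for the second question above, Axon's `EntityManagerProvider` is a single-method interface, so once a second persistence unit exists for the Axon tables you can point Axon at it with a bean along these lines (the qualifier name is an assumption, and the definition of the second `EntityManagerFactory` itself, a normal `LocalContainerEntityManagerFactoryBean` bound to the dedicated datasource, is omitted):

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;

import org.axonframework.common.jpa.EntityManagerProvider;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.SharedEntityManagerCreator;

@Configuration
public class AxonPersistenceConfig {

    // Axon resolves its EntityManager through this provider, so all of its
    // tables (tokens, sagas, etc.) end up in the dedicated Axon schema
    // instead of the schema owned by the service's own entities.
    @Bean
    public EntityManagerProvider axonEntityManagerProvider(
            @Qualifier("axonEntityManagerFactory") EntityManagerFactory emf) {
        EntityManager shared = SharedEntityManagerCreator.createSharedEntityManager(emf);
        return () -> shared;
    }
}
```

`SharedEntityManagerCreator` hands out a transaction-aware proxy, so the provider can safely return the same instance for every call.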

How do microservices with independent databases communicate with each other?

For example, Microservice A has DB A and Microservice B has DB B. Now Microservice B wants some shared data from A. How can we handle such a scenario?
Microservice B can make a synchronous call (such as a REST call) to get the data it needs. If Microservice B needs this data frequently, it is good to avoid such synchronous calls to save the network cost.
For such cases it is recommended to replicate data with an event-driven architecture. With this architecture, data changes in Microservice A are published to a message broker like Kafka and then consumed by Microservice B. Microservice B updates its own database with the information from Microservice A's events. That way, tight coupling is also avoided.
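The consuming side of that replication can be as small as one listener in Microservice B that applies A's change events to B's own table. A sketch using Spring for Apache Kafka, where the topic, group id, event class, and repository are all made-up names for the example:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class CustomerEventsListener {

    private final LocalCustomerRepository repository;

    public CustomerEventsListener(LocalCustomerRepository repository) {
        this.repository = repository;
    }

    // Consumes change events that Microservice A publishes to Kafka and
    // mirrors them into Microservice B's own database, so B never has to
    // query DB A (or call A synchronously) on its hot path.
    @KafkaListener(topics = "customer-events", groupId = "microservice-b")
    public void onCustomerChanged(CustomerChangedEvent event) {
        repository.save(new LocalCustomer(event.getCustomerId(), event.getName()));
    }
}
```

Microservice B stores only the fields it actually needs from A's events; the copy is eventually consistent, which is the trade-off accepted in exchange for the looser coupling.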
They use a communication protocol, like HTTP.
Each microservice needs to own its database schema and should only query its own schema. In this case, since Microservice B needs data from DB A, Microservice A needs to expose this data, possibly through a REST API, and Microservice B then calls that REST API to get the data from DB A.
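The synchronous variant described in the answers above boils down to an endpoint in A plus a client call in B; a sketch with hypothetical paths, DTOs, and repository names:

```java
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;
import org.springframework.web.server.ResponseStatusException;

// In Microservice A: expose the data owned by DB A over HTTP.
@RestController
public class CustomerController {

    private final CustomerRepository repository;

    public CustomerController(CustomerRepository repository) {
        this.repository = repository;
    }

    @GetMapping("/api/customers/{id}")
    public CustomerDto get(@PathVariable Long id) {
        return repository.findById(id)
                .map(CustomerDto::from)
                .orElseThrow(() -> new ResponseStatusException(HttpStatus.NOT_FOUND));
    }
}

// In Microservice B: fetch the data through A's API instead of touching DB A.
@Service
public class CustomerClient {

    private final RestTemplate restTemplate = new RestTemplate();

    public CustomerDto fetchCustomer(Long id) {
        return restTemplate.getForObject(
                "http://microservice-a/api/customers/{id}", CustomerDto.class, id);
    }
}
```

The DTO is the contract between the two services: A can reshape its tables freely as long as the DTO it serves stays stable.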
