nestjs - how to use shared modules between nestjs applications - node-modules

Let's say I have a NestJS project that consists of the following parts, each with its own git repository and deployed on a different machine:
API service
Message-queue processing workers service
Both of the services require part #3 - a library for accessing RabbitMQ.
Let's say that #3 is some abstraction around the RabbitMQ client that defines entities and services that access the MQ, so modules #1 and #2 don't even know they are working with RabbitMQ. #3 is planned to be included/used as a library in #1 and #2, not as a microservice.
How should this be put together?
Let's look at it from the perspective of the API service:
It has the RabbitMQ module inside its node_modules folder.
I want to use e.g. dependency injection from that RabbitMQ library. Ideally, it would be exposed as a NestJS module, to be used in the @Module imports of the API module.
Should I declare NestJS as a peerDependency in the RabbitMQ module?
I have 3-4 shared modules that I need to share between #1 and #2, not just the MQ one.
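To make it concrete, the abstraction I have in mind for #3 is roughly this kind of contract (just a sketch; the names MessageQueue and MESSAGE_QUEUE are placeholders):

// mq.interfaces.ts - what #1 and #2 would program against; no RabbitMQ types leak out
export interface MessageQueue {
  publish(topic: string, payload: unknown): Promise<void>;
  subscribe(topic: string, handler: (payload: unknown) => Promise<void>): Promise<void>;
}

// Injection token, so consumers resolve the queue through Nest DI
// without ever referencing the RabbitMQ-backed implementation class.
export const MESSAGE_QUEUE = Symbol('MESSAGE_QUEUE');

A service in the API module would then inject it with @Inject(MESSAGE_QUEUE) private readonly queue: MessageQueue and never touch RabbitMQ directly.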

You can follow the approach of creating an npm package that contains the shared NestJS modules and declares the NestJS packages as peer dependencies. John Biundo has a good article on dev.to about creating NestJS modules and publishing them to npm.
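A minimal sketch of what the entry module of such a published package could look like (assuming a RabbitMqQueue implementation class and the MESSAGE_QUEUE token from the question's abstraction, and with @nestjs/common and the AMQP client declared as peerDependencies in the package's package.json so the host application's Nest version is reused):

import { DynamicModule, Module } from '@nestjs/common';
import { MESSAGE_QUEUE } from './mq.interfaces';
import { RabbitMqQueue } from './rabbitmq.queue'; // hypothetical RabbitMQ-backed implementation

export interface RabbitMqModuleOptions {
  uri: string; // e.g. 'amqp://localhost:5672'
}

@Module({})
export class RabbitMqModule {
  // forRoot() returns a DynamicModule so each consuming app can pass its own connection options
  static forRoot(options: RabbitMqModuleOptions): DynamicModule {
    return {
      module: RabbitMqModule,
      providers: [
        { provide: 'RABBITMQ_OPTIONS', useValue: options },
        { provide: MESSAGE_QUEUE, useClass: RabbitMqQueue },
      ],
      exports: [MESSAGE_QUEUE],
    };
  }
}

The API service (#1) and the worker service (#2) would then add RabbitMqModule.forRoot({ uri: ... }) to their @Module imports and inject MESSAGE_QUEUE, without depending on RabbitMQ directly.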

You can use the workspace concept of NestJS. I think monorepo mode is the one that can help here, without the complexity of building and publishing npm packages, etc.
https://docs.nestjs.com/cli/monorepo
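In monorepo mode, a shared library generated with nest generate library rabbitmq lives under libs/ and is consumed through a TypeScript path alias instead of a published package. A rough sketch of the API app's root module under that setup (the @app/rabbitmq alias is the CLI's default prefix; RabbitMqModule and its forRoot() options are illustrative):

// apps/api/src/app.module.ts
import { Module } from '@nestjs/common';
// '@app/rabbitmq' is the tsconfig path alias the Nest CLI creates for libs/rabbitmq
import { RabbitMqModule } from '@app/rabbitmq';

@Module({
  imports: [
    RabbitMqModule.forRoot({ uri: process.env.RABBITMQ_URI ?? 'amqp://localhost:5672' }),
  ],
})
export class AppModule {}

Note that this assumes the API and worker apps are moved into the same repository, since that is what monorepo mode manages.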

Related

Best practice - share Spring Boot Service and Repo layer code between applications

Need some best-practice recommendations for a classic requirement around modularising a Spring Boot application based on layers.
Some background info:
Small to medium size Spring Boot project with fewer than 10 developers
2 different Spring Boot applications with a shared Service and Repo layer and also shared models
A bit too late to go with a microservice approach with a full Model/Controller/Service/Repo per API.
Currently there is just one web application exposing the APIs for a frontend application.
The requirement is to add a new set of APIs for B2B integration, so the request/response formats will be quite different from the already available APIs, i.e. /webapi/v1/orders for the frontend client and /b2b/v1/orders will need to return different response formats.
The Service and Repository layer, along with the models, need to be shared between the 2 applications, so 3 modules were identified, similar to how it's explained in https://stackoverflow.com/a/50352532/907032:
-- Main app
-- Webapi (Got dependency to common, jar packaging)
-- b2b (Got dependency to common, jar packaging)
-- common (jar packaging)
The two applications need to be deployed separately and also be separated from a CI/CD perspective, so as not to build all the submodules every time (a change to a b2b controller should not affect common/webapi).
A change to the common module that is only required by the latest b2b module should preferably not trigger a build and deploy of webapi, i.e. webapi uses common-1.01 and the b2b module uses common-1.02. Understood that the new version common-1.02 should not break any feature from common-1.01; this is just about avoiding an unnecessary build & deploy of that module until required, if that makes sense.
The challenge
Should the modules be defined in the same repo or in 3 different repos?
All the talk about mono vs multi repo is about whether to keep entirely different projects together or not, but here, as you can see, these are modules that are closely related to each other.
If we define these as submodules in the same repo, how is versioning of the common module handled? If it always triggers a build of all three submodules, do we even gain any advantage from modularising the code?
As per your description, the module named "common" is not that common to the other two. I'd go the multi-module way, like so:
First, break that common module into three: common, utils-webapi, utils-b2b.
The first will strictly contain the things both webapi and b2b need at the same version. utils-webapi will be dedicated strictly to the things the web API needs; the same goes for utils-b2b.
b2b depends on utils-b2b, which depends on common. webapi depends on utils-webapi, which depends on common.
Versioning of the common module is always consistent; only the utils-X module version changes from the X module's perspective.
CI is thus independent for each build.
Note: you can go further, keep only utils-webapi and utils-b2b, and get rid of common, at the cost of some duplicated code.

Client Modules (DTOs) as a Multi-module or Separate Project Repo

Our project team has adopted Client Modules as a way to share DTOs with other microservices (references: https://www.vinsguru.com/microservices-architecture-how-to-share-dto-data-transfer-objects/ and https://www.baeldung.com/java-microservices-share-dto).
However, one question we had in mind was whether to structure the client module (DTOs) within the microservice as a multi-module project, or to locate it separately in a different project repo.
In this case, we envision the client module being uploaded to our internal Maven repository, while the microservice will be deployed in our Kubernetes clusters.
As such, we would like to seek opinions on:
how you would structure your Spring Boot projects if you adopted a Client Module to share DTOs, and
the pros and cons of structuring the client module (DTOs) as part of a multi-module project within the microservice versus locating it separately in a different project repo.
Feel free to comment if you have any questions. Thanks in advance! :)
Some of the enterprise projects that I worked on used a multi-module approach to separate the client and microservice modules. The broad idea is as follows.
Have 3 modules: client, integration testing and microservice.
In the client module, place all resources you wish to share with others: DTOs, exceptions, Feign clients, enums, etc. Package this module as a jar.
Place all service and data layer logic in the microservice module. This module will be packaged as a boot jar, which can be deployed to your targets.
Place integration tests in the integration testing module. The packaging is optional here.
The pros of this approach are as follows:
There is a clear separation of concerns between the client, service and testing modules.
Security is tight: you won't be exposing your service logic unless you want to.
The cons (purely my opinion):
Managing the artifacts is cumbersome.

Microservice project structure using Spring boot and Spring Cloud

I am trying to convert a normal monolithic web application into a microservices structure using Spring Boot and Spring Cloud. I am actually trying to create an Angular 2 front-end application that calls these microservices in the cloud, and I have already started to break the modules into an independent-process structure for the microservice architecture.
My doubt is: when designing the flow of control and the microservice architecture, can I use only one single Spring Boot project with different controllers for this entire web application back end?
Somewhere in my reading I found that all the microservices are developed as separate Spring Boot projects. I am new to Spring and Spring Cloud. Is it possible to create all services in a single project by using different modules?
Actually, it is possible to package all of those services into ONE project. But from a microservices point of view, you should separate them into many independent projects. There are several questions you can ask yourself before transforming the original architecture:
Is your application critical? Can users tolerate downtime while you re-deploy the whole package just to update one service?
If there is no dependency between services, why do you want to put them together? Isn't that harder to develop and maintain?
Is the usage rate of each service the same? Maybe you can isolate the services that are invoked most often and deploy them to a more powerful server.
Try reading the article Adopting Microservices at Netflix: Lessons for Architectural Design to understand the best practices for designing a microservices architecture. And for developing with Spring Cloud, you can also read the post Spring Cloud Netflix to see which components you should use in your architecture.
Currently I am working on microservices too. Based on my experience, we have designed our microservices following the steps below.
Maven
You should create separate projects, but you can also split a project into submodules. That makes the project easier to manage, and a submodule can be reused by other projects too.
Build the jar library and put it in your local repository. This can save you time: once you identify a common component or piece of functionality, build it as a jar and put it in your local repository, so every project that uses that functionality can pull it from the repository and you don't have to write the same code in many projects.
So finally, I would suggest you create separate Spring Boot projects, but also create submodules and publish them to a local repository.
By creating your modules in different projects you create a more flexible solution.
You could even use different languages and technologies for a particular service, e.g. one of your services could be Node.js and the rest Java/Spring.

Differentiate between Maven multi-module and Spring microservices?

I am reading about Spring microservices for my next project. The tutorial said: "In this architecture style the main application is divided into a set of sub-applications called microservices. One large application is divided into multiple collaborating processes." But we already have Maven multi-module projects, and in my experience I have separated projects that way. So why do we need microservices to separate a project? Please explain the difference. Thanks in advance.
Every service in a microservice architecture should be isolated, which means that the team in charge of that service is able to put continuous deployment into practice without needing to deploy other services. In practice, IMHO, we can use two approaches with our favourite build tool, such as Maven or Gradle:
Mono-project: domain, repositories, services and controllers are all in the same project.
Multi-project: domain, repositories, services and controllers can be grouped into different modules, e.g. the domain and repositories go in a repository module, services in another module with the same name, and controllers in a front module.
But no matter which approach you use, the project (mono or multi) should represent one service.

Migrating J2EE style Project to OSGi Style using the OSGi declarative services

I am new to OSGi and am using the Equinox "Virgo Tomcat Server" (VTS) along with Eclipse Blueprint, and I have a big assignment to do in a limited time.
There is an application already developed in the J2EE style,
using JSP -> Struts2 -> Spring -> MySQL and SOAP web services.
There are various layers in the existing architecture.
A simple request flow is as described below:
From the UI layer it goes to the Struts2 configuration, then to the Spring configuration; from the Spring configuration XML (that is, the module-wise application context XML) the Struts Action class is called. From the Struts Action class layer it goes to the Task layer -> Handler layer -> Service layer -> Adapter or DAO layer -> DB. In some cases the call also goes from the Service layer to the WebService layer and communicates with a back-end legacy system.
My queries are as follows:
Q1] From the UI/JSP up to the Struts2 action layer, the code for every module should be clubbed together into a single .war file, say "onlinebank.war", and from the Struts2 action onwards the module-wise code in every layer should go into module-wise OSGi bundles.
E.g. if there are 10 modules there should be 10 OSGi bundles,
and each module-wise bundle should contain the module-specific code from every layer after the action layer, and there should be communication between the single war "onlinebank.war" and the 10 OSGi bundles.
Q2] To take Q1 to the next level:
if there are 10 modules, then instead of crunching the module-specific code into one OSGi bundle,
I have to create 3 bundles for each module (XXXAPI, XXXMain, XXXConfig),
e.g. for TestModule:
I] TestModuleAPI (will contain only interfaces and abstract classes)
II] TestModuleMain (will contain implementations of the interfaces and abstract classes and will provide some default functionality)
III] TestModuleConfig (will access the default functionality of the Main bundle via the API bundle and also provide some customized/new functionality)
So if there are 10 modules and 3 OSGi bundles per module (API, Main, Config), there will be 3*10 = 30 bundles, and there should be proper communication between the single war "onlinebank.war" and the 30 bundles.
There should also be proper communication among the 30 bundles themselves to resolve the dependencies and work together properly/synchronously.
Any help will be greatly appreciated
Regards,
Gokul
This is a big task you are tackling. I did such a migration and am quite familiar with OSGi; still, it took about 2 months. So first of all, do not underestimate the problems you will be facing.
The next thing is that a typical Spring application is not well modularized. As there are no private/public packages in Java SE, developers tend to ignore module boundaries. The modules often also do not have a clean and small API, so people do not know what they should access and what they should not.
So I think your first task is to refactor the application so that each bundle offers a minimal API and other modules only access that API. For this task it might make sense to use an architecture tool that lets you define and control these accesses. While still in Spring, you create beans from a service interface in the API; later, in OSGi, the API will allow you to define clean OSGi services. If you skip this step then OSGi will not have big advantages: OSGi only works well if the application is strictly modularized.
Then, for the actual OSGi migration, I can only urge you to hire a specialist to help you. It will be a waste of time and resources if you do this alone.
