What is the purpose of the ui.router.router sub-module? - angular-ui-router

I found this in the ui-router documents recently:
ui.router.router sub-module
This module is a dependency of other sub-modules. Do not include this module as a dependency in your angular app (use ui.router module instead).
So what exactly is the purpose of this sub-module, and what is its use case?

Testing is the use case. As it appears, UI Router has a hierarchy of modules that can be tested separately.
In the case of ui.router.router this means that its components are tested apart from parent modules (i.e. ui.router.state).

Related

Spring Boot Multi Module and Fat jar with Shared Features

Experts,
I need some expert advice on how to approach the use case below in Spring Boot.
I need to have a maven multi-module approach to my project.
I need to have a single jar as output of the final build process.
There are to be common modules for controllers, data access and other functionality
Other modules are to be created based on functional domain, e.g. a module for Payroll, a module for Admin, etc.
Each domain functional module will then have their own controllers extending the common controller, exception handler and so on.
Each module will also have its own set of Thymeleaf pages.
The reason for following such an approach is we have development in phases and we will be rolling out based on functional modules.
Here are the issues that I can sense using this approach.
Where do I add the spring web dependency? If I add it to the parent pom, it gets replicated across the children and there will be port conflict issues as each module loads. The same issue appears the moment I add it to two child modules.
How do I build the fat jar which has all the jars from all modules and works as the final deployment?
In all the material I have read, I can't find anything even close to what I am trying to achieve.
Ad 1. They will not, unless you are trying to set up an independent application context in each module. Of course you can do that (it might be complicated, but I believe it's achievable), but for me it's overkill. Personally I think it's better to have one application context and rely on scanning the components that are present on the classpath.
Ad 2. The structure in Maven might be a little complicated and overwhelming at first glance, but it makes sense. Here's how I see it (a rough sketch of the parent descriptor follows the list):
Create a parent module that will aggregate each module in project and will declare library/plugin dependencies for submodules.
Create 1-N shared submodules that will be used in other modules, with some common logic, utils, etc.
Create 1-N submodules that will be handling your business logic
Create an application submodule that creates application context and loads configuration and components from classpath
Create a submodule that will be responsible for the packaging process, whether to a war, jar, uber-jar or whatever else you desire. The Maven JAR plugin can do that for you; for an executable uber-jar there is a dedicated plugin from Spring (spring-boot-maven-plugin).
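As a rough illustration of that layout (the module names are placeholders I made up, not something prescribed by Spring or Maven), the parent/aggregator descriptor could look roughly like this, with the packaging module applying the spring-boot-maven-plugin repackage goal to produce the executable fat jar:
<!-- hypothetical parent/aggregator pom.xml; module names are placeholders -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>banking-parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>common-core</module>   <!-- shared controllers, data access, utils -->
        <module>payroll</module>       <!-- domain/business module -->
        <module>admin</module>         <!-- domain/business module -->
        <module>application</module>   <!-- builds the single application context -->
        <module>packaging</module>     <!-- produces the fat jar via spring-boot-maven-plugin (repackage) -->
    </modules>
    <dependencyManagement>
        <!-- declare shared library versions here so submodules can omit <version> -->
    </dependencyManagement>
</project>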
Now you can choose between three ways (the ones I know of) of loading your modules.
1. Include some modules in the Maven build based on the build configuration via Maven profiles, and let the Spring IoC container load all the components it finds on the classpath (see the profile sketch after this list)
2. Include all of the modules in the Maven build and load them depending on Spring active profiles - you can think of it as a feature flag. You annotate your components or configuration classes with @Profile("XYZ"), telling the Spring IoC container whether or not to instantiate the component. You will then need (most flexible solution) to provide a property file which tells Spring which profiles are active and thus which modules should be loaded
3. A mix of the two above.
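To make option 1 concrete, here is a minimal sketch (the profile and module names are made up for illustration): a Maven profile in the parent adds a domain module to the reactor only when that profile is activated, e.g. with mvn package -Ppayroll. For option 2 the switch moves into code instead: configuration classes are annotated with @Profile("payroll") and activated via spring.profiles.active.
<!-- in the parent pom.xml: the payroll module is built only when -Ppayroll is active -->
<profiles>
    <profile>
        <id>payroll</id>
        <modules>
            <module>payroll</module>
        </modules>
    </profile>
</profiles>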
Solution 1 pros:
build is faster (modules that are not included will be skipped during build)
final build file is light (modules that are not included are... not included ;))
nobody can run a module that is not present
Solution 1 cons:
the Maven project descriptor may explode, as you might end up with many different profiles
Solution 2 pros:
it's fairly easy and fun to maintain modules from code
less mess in project descriptor
Solution 2 cons:
somebody can run a module that is not intended to be run, since it is present on the classpath and only excluded at runtime via Spring active profiles
the final build file might be overweight - unused code is still packaged
the build might take longer - unused code will still be compiled
Summary:
It's not easy to build a well-structured project from scratch. It's much easier to create a monolith and then split it into modules, because by the time you have created the project, you have probably already identified all the domains and the relations between them.
After 8 years of using Maven, I honestly and strongly recommend using Gradle, as it's far more flexible than Maven. Maven is a really great tool, but when it comes to unusual customization it often falls short, because its build capabilities rely on plugins. You can't write a piece of code on the fly to perform some custom build behaviour while building your project; you must have a dedicated plugin for doing that. If such a plugin exists, fine; if not, you will probably end up writing your own and handling its shipment, so anyone in your company can easily perform the project build.
I hope it helps. Have fun ;)

Equivalent of api for test dependency in gradle?

I have a multi-module Gradle project. In one of my modules I have an api dependency:
api('de.flapdoodle.embed:de.flapdoodle.embed.mongo')
I want to change it to a dependency that will be visible in tests across all modules. There is a testImplementation configuration, but there is no testApi.
I cannot have this dependency on production classpath anymore since I want to use real mongo instance instead of embedded one. On the other hand I have tests in different modules that depend on data access - in that case I want to run those test with embedded mongo on test classpath.
How can I make this dependency visible in the tests of all modules?
The question (as it appears to me) is about sharing test code across modules in a multi-module project.
Short answer - no - there is no direct test dependency sharing across modules.
To share test code between modules internally via build settings:
Official gradle route https://docs.gradle.org/current/userguide/java_testing.html#sec:java_test_fixtures
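A minimal sketch of that official route, assuming a hypothetical core-module that exposes the embedded-Mongo setup as test fixtures (the module names and the version are placeholders):
// core-module/build.gradle
plugins {
    id 'java-library'
    id 'java-test-fixtures'
}
dependencies {
    // exported to this module's tests and to every module consuming its test fixtures
    testFixturesApi 'de.flapdoodle.embed:de.flapdoodle.embed.mongo:3.0.0'
}

// any other module's build.gradle
dependencies {
    testImplementation testFixtures(project(':core-module'))
}
This keeps the embedded Mongo dependency off the production classpath while still making it available to the tests of every module that pulls in the fixtures.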
Simple hack
testImplementation files(project(':core-module').sourceSets.test.output.classesDirs)
Add the above line either individually where you need it, or in the root build inside subprojects() with an appropriate condition.
There are other possible routes as well, e.g. via configurations:
child.testImplementation extends parent.testImplementation (or runtime)
testCompileClasspath includes api dependencies, so you are all good here: de.flapdoodle.embed.mongo will be visible in your tests.

Should I rely on transitive dependencies in Maven if they come from other sub-module of my parent?

Suppose we are working on a mortgage sub-module and we are directly using Google Guava classes in the module code, but the dependency on Guava is defined in another sub-module under the same parent, and we have access to the Guava classes only through a transitive dependency on the "investment" module:
banking-system (parent pom.xml)
|
|-- investment (pom.xml defines <dependency>guava</dependency>)
|
|-- mortgage (pom.xml defines <dependency>investment</dependency>)
Should we still put a <dependency> to Guava in the mortgage pom.xml?
The con looks like duplication in our pom.xml; the pro is: if someone developing "investment" drops Guava, it will not stop our mortgage sub-module from being successfully built.
If yes, then what <version> should we specify? (none + <dependencyManagement> in the parent pom?)
If yes, should we use a <provided> scope in some module then?
Note: keep in mind that I am asking about the specific situation where the modules have a common parent pom (e.g. they form an application as a whole).
Maybe this structure was not the best example, imagine:
banking-app
banking-core (dep.on: guava, commons, spring)
investment (dep.on: banking-core)
mortgage (dep.on: banking-core)
Should investment still explicitly declare Spring when it uses @Component, and declare Guava if it uses Guava's LoadingCache?
we are directly using the Google Guava classes in module code, but the dependency for Guava is defined in another sub-module under the same parent and we have access to Guava classes only by transitive dependency on the "investment" module [...] Should we still put a <dependency> to Guava in the mortgage pom.xml?
Yes, you should declare the Google Guava dependency in your module and not expect it to be available as a transitive dependency. Even if it works with the current version, it may no longer be the case with later versions of your direct dependencies.
If your code depends on a module, your code should depend directly only on the classes of that module, not on a transitive dependency of that module. As you mentioned, there is no guarantee that the investment module will continue to depend on Guava in the future. You need to specify this dependency either in the parent's pom.xml or in the module itself to ensure it will be available without relying on transitive dependencies. It's not duplication as such; how else can you tell Maven your module depends on Guava?
I do not see any situation in which minimal best practices are respected where you would need to do otherwise.
If yes, then what <version> should we specify? (none + <dependencyManagement> in parent pom?)
Yes, using <dependencyManagement> in the parent and a <dependency> without a version in your child module is best: you will make sure all your modules use the same version of the dependency. As your modules form an application as a whole, this is probably better, as it will avoid issues such as different versions of the same dependency being present on the classpath and causing havoc.
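For illustration, a minimal sketch of that setup (the Guava version is just an example):
<!-- parent pom.xml: pin the version once -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>31.1-jre</version>
        </dependency>
    </dependencies>
</dependencyManagement>

<!-- mortgage/pom.xml: declare the direct dependency, no <version> needed -->
<dependencies>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
    </dependency>
</dependencies>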
Even if for some reason one of your modules using the same parent requires a different version of the dependency, it will still be possible to override the version for that specific module using <version>.
If yes, should we use a <provided> scope in some module then?
Probably not; having the dependency with compile scope is the best way to go with most packaging methods.
However you may have situations where you need or prefer to do this, for example if said modules requires to use a runtime environment specific version, or if your deployment or packaging model is designed in a way that demands it. Given the situation you expose, both are possible, though most of the time it should not be necessary.
Yes, declare the dependency. It's not duplication! The fact that compile dependencies are transitive was not intended by the Maven developers; it's forced by the Java language, because features like class inheritance require this behavior. The "pro" you already mentioned is the important fact.
See the (*) note in the transitive-scope-table
Yes, always declare the needed third-party library versions in your reactor parent with dependencyManagement. It's a pain to track down errors caused by different library versions at runtime. Avoid declaring versions of third-party libs in sub-modules of large reactors; always use dependencyManagement in the parent.
No, I would use "provided" only for dependencies provided by the runtime - in your example tomcat/jboss/wildfly/... for things like servlet-api/cdi-api - but not for third-party libraries.
Declare the "provided" scope as late as possible (i.e. your deployment(s) module war/ear) not in your business modules. This makes it easier to write tests.
For example:
investment (depends on guava scope:=provided)
mortgage (depends on investment, but don't need guava himself)
--> mortgage classpath doesn't contain guava.
If you write a unit test for mortgage that involves classes from investment, it will not work -> you need to declare at least guava with scope=test (or runtime) to run these tests...
When a module uses a 3rd party library, the module should explicitly depend on that library in its pom.xml too. Imagine another project uses the 'mortgage' module and doesn't already depend on Guava: it will fail, e.g. when a unit test hits a code path that involves Guava. An explicit dependency also covers you against the scenario where you refactor the 'investment' module so that it doesn't use Guava anymore. The 'investment' module should be free to change its own dependencies without breaking yours.
It's always correct to explicitly list your direct dependencies. When it comes to version, it's best to keep that in the dependencyManagement section of your parent pom so all child projects inherit that (same) version.

Spring boot parent pom with custom parent

I read a lot of posts regarding the ways to use spring-boot-starter-parent in a spring boot project.
Essentially, I read posts (the Spring documentation also talks about this) describing two ways to do this:
To use spring-boot-starter-parent as the project parent directly. It gives us the benefits of having the dependency management as well as the plugin management.
The other way is to import the Spring Boot dependency management into the project pom (we may need this in case we already have a parent pom for the project). It gives us the benefits of dependency management but not of plugin management (a sketch of this import follows).
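That import route is usually done by importing the spring-boot-dependencies BOM into the dependencyManagement section; a minimal sketch (the version is just an example):
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-dependencies</artifactId>
            <version>2.7.5</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>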
I am creating a new Maven multi module project. Ideally I would like to have my own custom parent and also get all the benefits of using the Spring-boot-starter-parent.
I was wondering if it made sense to create a custom parent for my maven projects. This parent would in turn be a child of the spring-boot-starter-parent.
If I am not missing anything, this way I could get the benefits of having the dependency management and plugin management from spring-boot-starter-parent and at the
same time have a custom parent for all my projects where I could define some other common dependencies or if needed override the dependencies defined in the
spring-boot-starter-parent which would then be inherited by all my projects.
Does this design make sense, or am I missing something?
What are the drawbacks of this approach?
There are no drawbacks -- this is exactly what you're meant to do if you want a multi-module Spring Boot project. However, consider this: typically multi-module projects have all modules versioned together, released together, and dependent on each other. This rarely makes sense in a group of Spring Boot modules, which are typically of the micro-service style and require independent evolution. So you should question whether you need a multi-module project at all.
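A rough sketch of the setup described in the question (the coordinates and version are placeholders): a company-wide parent that itself inherits from spring-boot-starter-parent, so every child project gets Spring Boot's dependency and plugin management plus whatever common settings are defined here:
<!-- my-company-parent/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.7.5</version>
        <relativePath/>
    </parent>
    <groupId>com.example</groupId>
    <artifactId>my-company-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <!-- common dependencies, dependencyManagement overrides and plugin defaults go here -->
</project>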

When to use maven multi module project

I was going through a couple of blogs to get the basics of Maven, and in the meantime I got confused about when I can use a multi-module project. It would be great if the answer included an example.
The main idea is that you have small modules that are dependent on each other and can be grouped together. It's not necessary that every sub-module in a multi-module project depends on every other sub-module.
Let's say you have multiple modules for an application (e.g. a social networking application) that belong together. These can range from smaller modules like a client consumer module, a server module that serves requests initiated by the client module, an EJB module that holds the beans used by both the server and the client modules, and a deployable web module that comprises your front-end application, etc.
This is usually handled via a multi-module build, which means all modules have the same version number and are bound together under a common platform (a social networking application in our example) but can still be used by others separately; a sketch of such a submodule follows.
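As a rough sketch of what one submodule in such a build looks like (the group id and module names are made up for illustration): it points at the shared parent and depends on a sibling module via ${project.version}, which is what keeps all modules on the same version:
<!-- web/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example</groupId>
        <artifactId>social-network</artifactId>
        <version>1.0.0</version>
    </parent>
    <artifactId>web</artifactId>
    <packaging>war</packaging>
    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>ejb</artifactId>
            <version>${project.version}</version>
        </dependency>
    </dependencies>
</project>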
Please check "How to assemble multimodule maven project into one WAR?" to learn how to package a multi-module project into a WAR file. Also, you can check the official Maven site's Introduction to the POM.
