Spring Boot, Maven, disable specific RestController in the derived project - spring

In my Maven, Spring Boot 2 project I have a Maven module called api1, where I have declared a number of @RestControllers.
In order to extend the logic of the api1 module, I have implemented another Maven module called api2 and added api1 there as a Maven dependency.
Right now all of the @RestControllers from the api1 project are initialized in api2, because all of them are present on the api2 classpath.
How can I disable a certain @RestController in the api2 project?

You may try using the Condition interface from Spring, which provides support for conditionally enabling/disabling beans based on a certain condition/expression.
Something like below:
@RestController
@ConditionalOnExpression("${api1.controller.enabled:false}")
@RequestMapping(value = "/", produces = "application/json;charset=UTF-8")
public class Api1Controller {

    @RequestMapping(value = "/greeting")
    public ResponseEntity<String> greeting() {
        return new ResponseEntity<>("Hello world", HttpStatus.OK);
    }
}
You have to set the property used in the conditional expression in some way (environment variable / property key). The Condition and @ConditionalOnExpression docs can guide you on the details.
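For example, with the default of false in the expression above, the controller stays switched off until the property is set; a minimal sketch of api2's application.properties (the property name is taken from the expression above) could be:

api1.controller.enabled=true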

I think the crucial fact to understand here is that Spring works at runtime only, while Maven matters at build time.
Maven sees that api2 depends on api1, so it understands that both modules have to be included in the artifact (in the case of Spring Boot, one big jar with all modules inside).
Now, when Spring starts, it "takes for granted" that all modules are accessible and, depending on the Spring configuration, it just defines beans to be loaded and processed; all rest controllers are among these beans, of course.
So I assume you don't mind having both modules in the artifact (and on the classpath).
In this case, you shouldn't touch the Maven part at all, but when the Spring Boot application starts it has to be "instructed" somehow that some rest controllers have to be excluded. The point is that it should be done not in terms of modules ("hey, Spring, this controller belongs to module api2, so it has to be excluded"), but in terms of business "jargon". For example, api1 contains all the "admin" functionality and api2 contains all the "applicative" stuff. So, if you work with Java configurations, for example, you can do the following:
Inside module api1:
@Configuration
@ConditionalOnProperty(name = "admin.enabled", havingValue = "true")
public class AdminControllersConfiguration {

    @Bean
    public AdminControllerFromModuleApi1 adminController() {
        return new AdminControllerFromModuleApi1();
    }
}
In module api2 you just define your rest controllers in a similar way but without the @ConditionalOnProperty annotation.
The thing with this annotation is that it allows you to "switch off" beans or entire configurations, like in my example.
So, when you start api2, you just define the following in application.properties (or similar):
admin.enabled=false
And your controllers won't be "deployed" by Spring, although the classes are certainly physically on the classpath.
Of course, since Spring allows different types of configurations, this method might not be applicable to your project, but the idea is still the same.
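For contrast, a minimal sketch of what the api2 side could look like under this approach (the class names here are hypothetical, not from the question):

@Configuration
public class ApplicativeControllersConfiguration {

    // No @ConditionalOnProperty here, so these controllers are always registered.
    @Bean
    public ApplicativeControllerFromModuleApi2 applicativeController() {
        return new ApplicativeControllerFromModuleApi2();
    }
}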

Related

Autowire FeignClient across maven Project

I have projects built with Spring Boot, Maven and Kotlin.
I want to expose some Services and FeignClients in a Maven project so that others can use them.
For a class with annotations like @Service that works well. But I also need to expose FeignClients, which are annotated with @FeignClient, and as it looks, other projects are not
able to inject those clients. Do I have to configure something in my pom.xml?
I'm using spring-cloud-starter-openfeign.
Here is some code.
My FeignClient looks like:
...
@FeignClient(name = "MyAPIClient", url = "\${url}", configuration = [MyApiClientConfiguration::class])
interface MyAPIClient {
...
And I try to inject that client in another project like this:
...
@Service
class MyService(val myAPIClient: MyAPIClient) {
...
The error is pretty clear. It says there is no bean with the name MyAPIClient, so it's not
visible or available: "Consider defining a bean of type 'com.mycomp.MyAPIClient' in your configuration."
Do I have to configure something explicitly to expose an OpenFeignClient to other projects in my pom.xml?
Thanks for your help
It would work automatically if you had the same package structure in both projects. See how the search for Feign clients is performed by default.
In other cases, you need to specify the basePackages or basePackageClasses attributes of the @EnableFeignClients annotation (in the app where you need to inject your Feign client). Note that if you do that, the default behavior (scanning the current package where this annotation is placed) stops working, so in that case you need to specify that package manually too.
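A minimal sketch of the consuming application (shown in Java; the Kotlin form is analogous). The package names are assumptions: "com.mycomp" stands for the library package containing MyAPIClient and "com.example.consumer" for the consuming application's own package:

import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.openfeign.EnableFeignClients;

@SpringBootApplication
// Listing basePackages disables the default "current package" scan,
// so the consumer's own package is listed explicitly as well.
@EnableFeignClients(basePackages = {"com.mycomp", "com.example.consumer"})
public class ConsumerApplication {
}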

How to add rest controller to spring boot library, and use it in another application?

I am working on a library xyz.jar that needs to add a UI page with mappings like this one:
@RestController
public class LibCtrl {

    @EventListener(ApplicationReadyEvent.class)
    @RequestMapping("/updateDomainList")
    String updateDomainList() {
        return "we can call a controller from another jar like this";
    }
}
This then needs to be called in my main Spring Boot application, myMainApplication.war, so when I call
http://localhost/myMainApplication/updateDomainList
I should see
we can call a controller from another jar like this
in the browser.
How can I achieve this? @Component also did not work for me. Once it begins to work, would @Autowired injection of a JdbcTemplate also work?
It was a simple fix. @ComponentScan allows multiple packages to be scanned. This made it possible for me to have my library packages managed by Spring. Just add the following to your application class:
@ComponentScan({"my.mainapplication.package", "my.library.package"})
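Put together on the main class, this could look roughly as follows (a sketch; MyMainApplication is a placeholder name and the two packages are the placeholders used above):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

@SpringBootApplication
@ComponentScan({"my.mainapplication.package", "my.library.package"})
public class MyMainApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyMainApplication.class, args);
    }
}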

How to define the spring.config.location on Spring Boot and JUnit tests?

How can we programmatically configure Spring Boot to define new values for the spring.config.name and spring.config.location properties when running JUnit tests?
For example, if we would like to define these properties when running the application itself, we could use something like this (in Kotlin):
fun main(args: Array<String>) {
    // SpringApplication.run(Application::class.java, *args)
    val applicationContext = SpringApplicationBuilder(Application::class.java)
        .properties(
            """spring.config.name:${getSpringConfigNames()}""",
            """spring.config.location:${getSpringConfigLocationPaths()}"""
        )
        .build()
        .run(*args)
    // val environment = applicationContext.getEnvironment()
}
But I wasn't able to find a way to configure this to use in the JUnit tests.
Edit
There is a complication here because of a Spring Boot limitation.
I would like to use an entire folder and its subfolders as valid locations to search for configuration files (so, for example, we could have folders for specific environments, databases, third parties, and so on).
When running the application this was possible by creating a method, in this case getSpringConfigLocationPaths(), which creates a comma-separated list of all the folders inside the "main" folder.
For example, for the main folder src/main/resources/configuration it will output:
src/main/resources/configuration,
src/main/resources/configuration/environments,
src/main/resources/configuration/environments/development,
src/main/resources/configuration/environments/staging,
src/main/resources/configuration/environments/testing,
src/main/resources/configuration/environments/production,
src/main/resources/configuration/environments/common
How could we solve this situation when using JUnit tests and Spring Boot?
Unfortunately Spring Boot doesn't allow something like src/main/resources/configuration/**/*.
Because we have organized the system with several properties files in different subfolders, we need to find a way to consider them dynamically.
I am using the latest Spring Boot 2.2.0, and from my experience both the @TestPropertySource and @SpringBootTest annotations can do the job, because they have a properties attribute.
So, you can do something like this:
@TestPropertySource(properties = ["spring.config.location=classpath:dev/", "spring.config.name=custom-app-name"])
@TestConfiguration
class DevTestCfg {} // this will make tests look for configs in resources/dev/custom-app-name.properties
Also notice that there is a spring.config.additional-location property if you want your properties to be loaded from multiple locations.
The only problem here is that values in properties attribute must be constant.
But you can create multiple configurations, one for each environment, and put a corresponding @Profile("envName") on each configuration class. Then run your tests with a different -Dspring.profiles.active and the corresponding test configuration should be picked up automatically.
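A minimal sketch of that profile-per-environment idea (shown in Java; class names and profile names are made up for illustration):

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.test.context.ActiveProfiles;

// One configuration class per environment, selected by the active profile.
@Profile("dev")
@Configuration
class DevConfig { /* dev-specific beans and properties */ }

@Profile("staging")
@Configuration
class StagingConfig { /* staging-specific beans and properties */ }

// A test can pin a profile explicitly...
@SpringBootTest
@ActiveProfiles("dev")
class DevProfileTest {
}
// ...or the profile can be supplied at run time, e.g. -Dspring.profiles.active=staging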
Tests that run Spring Boot should be carefully designed.
There is a whole testing framework for Spring Boot tests, so obviously consider using that framework.
When it comes to configuration management, I suggest considering the following:
There are basically two types of tests:
Tests that load a concrete, specific configuration (set of beans); for example, if you want to test only a DAO, you load a configuration for this DAO.
In this case, the configuration is something that should be "tailored" to the needs of a specific test, and no "full" configuration is required.
For example, if the microservice contains a configuration for a database (user, password, schema, etc) and for, say, messaging management, there is no need to specify a configuration of a messaging system when testing a DAO, messaging beans won't be loaded anyway.
Usually, the test of this "type" will look like this:
@SpringBootTest(classes = {RelationalDbDaoConfiguration.class})
public class MyDaoTest {
}
If you don't have a configuration for your needs you can use @MockBean to mock unnecessary beans, or even create a custom configuration in src/test/java so that it will only be on the test classpath. It makes sense to use @TestConfiguration, but that's beyond the scope of the question.
Now, in order to load the configuration for the DB only, there are many options, to name a few:
@ActiveProfiles("dao") on the test class + putting "application-dao.properties/yaml" into src/test/resources or src/test/resources/config
Use @TestPropertySource(locations = "classpath:whatever.properties") on the test
Create a special "DbProperties" bean and initialize it programmatically in Spring; this can make sense when you only know some details about the context in which the test runs during the actual test execution (for example, if you start a database before the test and the port is created dynamically, but it's really a fairly advanced setup and beyond the scope of this question), and the data source bean can then read these properties
Use @SpringBootTest's properties attribute to provide "fine-grained" property definitions (see the sketch after this list)
Kind of obvious, but I'll mention it anyway: put application.properties in src/test/resources and it will override the regular configuration
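For instance, the properties-attribute option could look like this (a sketch; the property values are made up for illustration):

import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest(properties = {
        "spring.config.name=custom-app-name",
        "spring.datasource.url=jdbc:h2:mem:testdb"
})
public class MyDaoPropertiesTest {
}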
The second type of tests is when you load the "entire" microservice; usually these are tests that do not have the "classes" parameter in the @SpringBootTest annotation:
@SpringBootTest // note, no actual configurations specified
public class MyMicroserviceTest {
...
}
Now, this definitely requires specifying a whole set of configurations, although the techniques for actually specifying these configurations are still applicable (just the content of the configuration files will be different).
I do not suggest using spring.config.location during tests, because this means that the test depends on some external resource, which makes the whole setup even more complicated.
If it's an XML-driven configuration:
@ContextConfiguration(locations = "/app-context.xml")
If it's annotation-driven by configuration classes:
@ContextConfiguration(classes = [AppConfig::class, AnotherConfig::class])
These would be defined at class level on the unit test class you run.
Further, if you have profiles for JUnit to consider,
@ActiveProfiles("myProfile") would be added to the test class.

Spring AOP aspect doesn't get applied if included from an external jar with different package name

I have a Spring Boot rest service that includes an external project in its pom as a dependency. That external project is basically a jar that has Spring AOP code.
The base package in my main application that includes this external jar with the Spring AOP code is x.y.z.
The class in the external jar where the @Before advice is, is under the package a.b.c.
With this class under the a.b.c package, it doesn't get recognized by the main application where I want to use the Spring AOP implementation and apply the aspect. However, when I change its package from a.b.c to x.y.z (which I really can't do in real life), it works fine.
I know that the Spring Boot service, which happens to be the including service, scans everything under the root package given in the application class, x.y.z in this case, and that is why the aspect works fine if its class is under x.y.z.
However, the problem is that this Spring AOP jar will be used across multiple applications, so changing the package name like this is not an option.
Is there a way to accomplish this without changing the package name of the class where the Spring AOP code is?
Probably component scan is only activated for your application class's package by default. You can extend it to multiple packages, including the aspect package:
XML style configuration:
<context:component-scan base-package="x.y.z, a.b.c" />
Annotation style configuration:
@ComponentScan(basePackages = {"x.y.z", "a.b.c"})
Disclaimer: I am not a Spring user, only an AspectJ expert. I just knew that you can configure component scan, googled the syntax for you and hope it is correct.
Please define the bean (of the jar project) inside the main application. Add @ComponentScan(basePackages = {"x.y.z", "a.b.c"}) as well as @EnableAspectJAutoProxy, and also include the piece of code below.
ex:
@Bean
public LoggingHandler loggingHandler() {
    return new LoggingHandler();
}
Also annotate the external jar code with:
@Aspect
@Component
public class LoggingHandler {
What @kriegaex suggests is correct. In addition to that, please make sure you are using @Component along with @Aspect. Since @Aspect is not a Spring annotation, Spring won't recognize it on its own, and hence your aspect won't be registered. So, using @Component is mandatory for getting aspects to work in a Spring environment.
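Taken together, the aspect class in the external jar could look roughly like this (a sketch; the pointcut expression and method body are only examples, not from the question):

package a.b.c;

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class LoggingHandler {

    // Example pointcut: advise every method under the main application's base package.
    @Before("execution(* x.y.z..*(..))")
    public void logBefore(JoinPoint joinPoint) {
        System.out.println("Entering: " + joinPoint.getSignature());
    }
}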

Plugin System in Spring Boot for modular applications

I am looking for a way to dynamically load jars in Spring Boot after compiling. For example, I will put jars in some folder, and when Spring Boot is started, all jars from this folder will be injected into the Spring Boot app. I don't know how I can do this with Spring Boot; if you know, can you help me with this, with some example?
I need these jars to have @Service and @Controller beans, as each will be a module (plugin) adding capabilities to my Spring Boot app.
Is it possible to do this with Spring Boot? If it is possible, please provide me with some sample code.
Thanks in advance.
UPDATE:
I found something https://www.youtube.com/watch?v=F-sw2pFdcDw https://code.google.com/p/jspf/
UPDATE 2: I can't get a @Controller bean from the plugin jar registered in Spring Boot.
Have a look at FlexiCore, an open-source framework that brings modularity to Spring Boot utilizing plugins (jars) loaded at runtime. See wizzdi and FlexiCore.
For example, FlexiCore allows you to create a project (compiled into a separate jar from your main application) that contains a Spring bean as follows:
@Component
@Extension
public class HelloWorldService implements ServicePlugin {

    public String hello() {
        return "Hello World!";
    }
}
It will automatically be loaded once placed inside the designated plugins folder. FlexiCore basically allows full support for most (all) Spring Boot features, so, for example, you can add a RestController bean to your jar as well; FlexiCore will automatically load that bean, allowing you to call the controller as if it were in your main application jar:
@RestController
@Extension
public class TestEntityController implements Plugin {

    private static final String template = "Hello, %s!";
    private final AtomicLong counter = new AtomicLong();

    @Autowired
    private TestEntityService testEntityService;

    @PostMapping("/createTestEntity")
    public TestEntity createTestEntity(@RequestParam(name = "name", required = false, defaultValue = "Stranger") String name) {
        return testEntityService.createTestEntity(name);
    }

    @GetMapping("{id}")
    public TestEntity getTestEntity(@PathVariable("id") String id) {
        return testEntityService.getTestEntity(id);
    }
}
Disclaimer: I am the CTO of wizzdi, the company powering FlexiCore.
One option is definitely to just use a broad @ComponentScan. If you add a new jar to the classpath, the annotated classes from that jar will get discovered via @ComponentScan, @Controllers will get mapped, etc.
The XML equivalent here would be placing XML configuration files somewhere on your classpath (the META-INF folder being the obvious choice) and importing them all using a wildcard. The idea is the same: if the plugin jar file is on the classpath, you will get the XML file imported and the beans (controllers, ...) will get loaded.
There are drawbacks to this approach, like the modules not being isolated, but it's definitely an option for simpler applications.
You can find a sample Spring Boot web project here.
By dynamically loading jars I assume you want to add dependencies to your project. For this you can update the pom.xml of the sample project and put your dependencies there.
