How do I include components, scanned from a Spring library, into the main application consuming that library?

We are building a Spring-based Java library that has several @Autowired injection points, which pick up @Bean instances from the library's classpath at runtime.
That works great when we test the application directly (with a @SpringBootApplication inside the library).
However, when we include the library as a dependency in other projects, the component scan does not cover the library's classes, so the @Autowired injections never happen.
Of course we could tell the application developers to add the library's package to their component scan, but that would lead to questions, errors, and frustration.
How can we tell Spring to do a component scan of the library's classes without explicitly including the scan base package in the consumer applications?

You could create a @Configuration class in your library which defines the components the library needs. This answer shows something like that.
Another way would be to create your own auto-configuration. This article in the Spring documentation describes how it works.
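To sketch the auto-configuration route (all package, class, and artifact names below are placeholders, not taken from the question): the library ships a @Configuration that scans its own packages and registers it in META-INF/spring.factories, so Spring Boot picks it up in the consumer application without any extra setup there. (On Spring Boot 2.7 and later the registration file is META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports instead, but the idea is the same.)

// src/main/java/com/example/mylib/MyLibAutoConfiguration.java (in the library)
package com.example.mylib;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@ComponentScan(basePackages = "com.example.mylib") // scan only the library's own packages
public class MyLibAutoConfiguration {
}

# src/main/resources/META-INF/spring.factories (in the library)
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
  com.example.mylib.MyLibAutoConfiguration

With this in place the consumer only needs the library on its classpath; the @Autowired injections inside the library are satisfied by the library's own scan.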

Related

How to split code in library for spring boot starter

Imagine that I have a library A that provides a given functionality which can be enabled like a spring-boot-starter, meaning that adding the dependency to my project makes it available.
On the other hand I have another library B that provides another functionality, similar to library A in the way it works, but a totally different feature.
Now I have a case where, if the project includes both libraries A and B, I would like to configure a bean in a specific way.
My question is how to decide where to write the code that needs both libraries; clearly it should be either:
library A having an optional dependency on B, with a specific @Configuration that enables that bean, or
the opposite: library B having an optional dependency on A, with a specific @Configuration that enables that bean.
I deliberately keep the names A and B as generic as possible; in my case, one library applies some specific configuration for Mongo and the other wraps the Mongock migration tool.
You could do the following:
Write your libraries A and B without Spring Boot in mind (no beans, no dependency injection etc.)
Declare both libraries as dependencies (Maven/Gradle) in your project
Define a @Configuration class that defines all the beans that you need, similar to this
Decide with Spring profiles and @Qualifier which bean you need when (see the sketch below)
I wrote a blog post about how you can use @Profile to decide between different Spring beans for a given scenario: https://medium.com/twodigits/keep-your-code-debuggable-4ad69e5e0ac7
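As a rough illustration of the last two points (class, profile, and qualifier names here are invented), the project-level configuration could look like this:

// Hypothetical consumer-project configuration
package com.example.app;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
public class MigrationConfig {

    // Active only when the application runs with the "mongo" profile
    @Bean
    @Profile("mongo")
    @Qualifier("mongoSetup")
    public MongoSetup mongoSetup() {
        return new MongoSetup(); // plain class from library A, no Spring annotations inside
    }

    // Active only when the application runs with the "migration" profile
    @Bean
    @Profile("migration")
    @Qualifier("mongockRunner")
    public MongockWrapper mongockRunner() {
        return new MongockWrapper(); // plain class from library B
    }
}

MongoSetup and MongockWrapper stand in for whatever plain (non-Spring) classes libraries A and B expose.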

SpringBoot creating a framework starter library

I am creating a library using Spring Boot (v2.1.6.RELEASE) as a starter project that will act as a base extension jar, responsible for configuring and starting up some of the components based on the client project's properties file.
The issue I am facing is that if the client project's Spring Boot application class is in the same package path as the library, everything works like a charm! But when the client project uses a different package path and includes @ComponentScan, it is not able to load or start the components from the library.
Has anyone encountered this issue? How do I make the client application auto-configure some of the components from the library jar?
Note: I am following the library creation example from here: https://www.baeldung.com/spring-boot-custom-starter
There are many things that can go wrong here; without seeing the relevant parts of the actual code it's hard to say anything concrete. Off the top of my head, here are a couple of points for consideration that will hopefully lead to the solution:
Since we use starters in our applications (and sometimes people use explicit component scanning in their Spring applications) and this obviously works, the issue is probably with the starter module itself. Don't assume that the mere fact that component scanning is used prevents the starter from being loaded ;)
Make sure the starter is a regular library and not packaged as a Spring Boot application (that is, you don't apply the Spring Boot plugin) and has <packaging>jar</packaging> in your pom.xml or whatever you use to build.
Make sure you have a src/main/resources/META-INF/spring.factories file (case sensitive and everything).
Make sure that this spring.factories file indeed contains a valid reference to your configuration (a Java class annotated with @Configuration). If you use component scanning in the same package, it will find and load this configuration even without spring.factories; in that case it is just another portion of your code that happens to be packaged as a separate jar. This point looks especially "suspicious" to me, so see the sketch after this list.
Make sure that the @Configuration doesn't have a @Conditional-something annotation; maybe the condition is not met and the configuration doesn't start. For debugging purposes you could even remove these @Conditional annotations just to ensure that the configuration starts. You can also add some logging inside the @Configuration class, like "loading my cool library".
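To make the spring.factories and logging checks concrete, here is a minimal sketch (artifact, package, and class names are placeholders, not taken from the question):

# src/main/resources/META-INF/spring.factories in the starter jar
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
  com.example.starter.StarterAutoConfiguration

// com/example/starter/StarterAutoConfiguration.java in the starter jar
package com.example.starter;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StarterAutoConfiguration {

    private static final Logger log = LoggerFactory.getLogger(StarterAutoConfiguration.class);

    // A bean plus a log line, so the client's startup log shows whether
    // the auto-configuration was picked up at all.
    @Bean
    public StarterComponent starterComponent() {
        log.info("loading my cool library");
        return new StarterComponent();
    }
}

Here StarterComponent is just a stand-in for whatever component the library actually starts up.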

Using SpringBoot as an application loader

I have a spring-boot app that acts as a small framework for other apps. It provides a couple of JMS queues and a DAO layer to retrieve and store data from a common set of data stores. The problem is that the original developer of this framework app scans the whole package "com.mycompany" (rather than com.mycompany.framework) so that it can load the beans of the specific app, which may be declared under com.mycompany.myapp1 or com.mycompany.myapp2 and whose JARs are bundled together with the JARs of the framework.
We only load a single app in the JVM (app1 or app2), but these apps may share other libraries, and sometimes we end up with beans in the context that we don't need (they may be needed in app1 but not in app2).
So, what would be your advice?
My problem is similar to what was described here:
https://github.com/spring-projects/spring-boot/issues/3300
I am debating whether each app should be aware of the framework and load it, or whether the framework should instantiate a class loader and create a new Spring context that loads the app-specific code, as suggested in the link above.
Perhaps you should consider leveraging some of Spring Boot's auto-configuration capabilities such as @ConditionalOnProperty or @ConditionalOnClass in your framework. That way, you only actually enable certain beans if and when the application using your framework takes some specific action (e.g. has a given jar on the classpath, or sets a configuration value). For reference check out: http://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-developing-auto-configuration
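A rough sketch of how that could look inside the framework (property names, class names, and packages are invented for illustration):

package com.mycompany.framework.autoconfig; // hypothetical framework package

import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FrameworkConditionalConfiguration {

    // Created only when the consuming app opts in via its properties file
    @Bean
    @ConditionalOnProperty(name = "framework.jms.enabled", havingValue = "true")
    public JmsQueueSetup jmsQueueSetup() {
        return new JmsQueueSetup();
    }

    // Created only when an app1-specific class is on the classpath
    @Bean
    @ConditionalOnClass(name = "com.mycompany.myapp1.App1Marker")
    public App1DaoSupport app1DaoSupport() {
        return new App1DaoSupport();
    }
}

JmsQueueSetup and App1DaoSupport are placeholders for the framework's real beans; the point is that each bean is only registered when its condition holds, so app2 no longer inherits beans that only app1 needs.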

OSGi: when to use component framework and when to create objects yourself

I've been an AEM developer for almost a year now. I know AEM uses the Declarative Services component framework to manage the life cycle of OSGi components.
Consider a scenario where I export a package from a bundle and import that package from another bundle; I could then create objects of the first bundle's classes inside the second bundle as well. It's an import-export contract in this case.
My question is: when should I use the component framework to manage the lifecycle of my objects, and when should I handle it myself by creating them when required?
In an ideal design, you would NOT in fact be able to create objects from the exported package; because that package would contain only interfaces. This makes it a "pure" contract (API) export. If there are classes in there that you can directly instantiate, then they are implementation classes.
In general it is far better to export only pure APIs and to keep implementation classes hidden. There are two main reasons:
Implementation classes tend to have downstream dependencies. If you depend directly from implementation class to implementation class then you get a very large and fragile dependency graph... and eventually that graph will contain a cycle. In fact it's almost inevitable that it will. At that point, your application is not modular because you cannot deploy or change any part of it independently.
Pure interfaces can be analysed for compatibility between versions. As a consumer or a provider of an API, you know exactly which versions of the API you can support because the API does not contain executable code. However if you have a dependency onto an implementation class, then you never really know when they break compatibility because the breakage could happen deep down in executable code that you can't easily analyse.
If your objects are services then there's no question, they have to be OSGi components.
For other things, my first choice is OSGi components, unless they're trivial objects like data holders or something similar.
If an object requires configuration or refers to OSGi services then it's also clearly an OSGi component.
In general, it's best IMO to think in services and define your package exports as the minimum that allows other bundles to use a bundle's services. Unless a bundle is clearly a reusable library like commons-io (to take a simple example).
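For example, an object that needs configuration and a reference to another service is a natural Declarative Services component. A rough sketch using the standard DS annotations (the service interface, package, and property names are made up):

package com.example.reports; // hypothetical package

import java.util.Map;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// DS creates, configures and wires this object; its lifecycle follows the
// bundle's state and the availability of the referenced service.
@Component(service = ReportGenerator.class)
public class ReportGenerator {

    @Reference
    private StorageService storage; // another OSGi service, injected by DS

    private String outputFormat;

    @Activate
    void activate(Map<String, Object> properties) {
        // configuration arrives as component properties (e.g. via Config Admin)
        this.outputFormat = (String) properties.getOrDefault("output.format", "pdf");
    }

    public void generate(byte[] content) {
        storage.store("report." + outputFormat, content);
    }
}

// In reality this contract would live in an exported API package of another bundle.
interface StorageService {
    void store(String name, byte[] content);
}

A trivial data holder, by contrast, would simply be instantiated with new wherever it is needed.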

Classpath scanning in OSGi

My project has a set of custom-defined annotations that could be present in any bundle deployed in the OSGi 4.3 framework. I want to find any class with these annotations on the classpath. I tried using BundleWiring.listResources(...) and Bundle.loadClass(...) for each class found. I have done some tests with a small set of bundles and it needs almost 200MB of Permanent Generation JVM memory space because all the classes get loaded.
Is there a way to free the PermGen memory of loaded classes when the program realizes that they do not have these annotations?
Is there a better way to look for annotated classes in an OSGi framework?
I think you should not do annotation scanning, as it slows down startup and needs a lot of memory. JEE application servers do annotation scanning at startup to make lazy programmers happy, and the result is very annoying (e.g. scanning for JPA or EJB annotations).
I guess you are implementing a technology where you can define the rules. I suggest that you define rules similar to these:
Annotate your class
Have a MANIFEST header where the annotated class must be listed.
An even better solution can be to use a custom capability namespace with specified attributes. E.g.:
Provide-Capability: myNamespace;classes=com.foo.myClass1,com.foo.myClass2
In your technology, you should write a BundleTracker that calls:
BundleWiring.getCapabilities("myNamespace");
If the namespace is present, you can find the classes that should be processed.
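A rough sketch of such a tracker (the "myNamespace" namespace and the "classes" attribute follow the convention suggested above; they are not standard OSGi names):

package com.example.scanner; // hypothetical package

import java.util.List;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleEvent;
import org.osgi.framework.wiring.BundleCapability;
import org.osgi.framework.wiring.BundleWiring;
import org.osgi.util.tracker.BundleTracker;

public class AnnotatedClassTracker extends BundleTracker<Bundle> {

    public AnnotatedClassTracker(BundleContext context) {
        super(context, Bundle.ACTIVE, null);
    }

    @Override
    public Bundle addingBundle(Bundle bundle, BundleEvent event) {
        BundleWiring wiring = bundle.adapt(BundleWiring.class);
        if (wiring == null) {
            return null;
        }
        List<BundleCapability> capabilities = wiring.getCapabilities("myNamespace");
        for (BundleCapability capability : capabilities) {
            // assumes the attribute is the plain comma-separated string shown above
            Object classes = capability.getAttributes().get("classes");
            if (classes != null) {
                process(bundle, classes.toString());
            }
        }
        return capabilities.isEmpty() ? null : bundle;
    }

    private void process(Bundle bundle, String classList) {
        for (String className : classList.split(",")) {
            try {
                Class<?> clazz = bundle.loadClass(className.trim());
                // hand the class over to whatever your technology does with it
                System.out.println("Processing " + clazz.getName());
            } catch (ClassNotFoundException e) {
                // listed in the capability but not loadable: log and move on
                e.printStackTrace();
            }
        }
    }
}

Only the bundles that declare the capability are touched, so no other classes are ever loaded.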
If you implement the technology, you can consider writing an extension to Bnd that fills in that MANIFEST header automatically. That extension can then be used when bnd is started from the command line or from build tools like Maven.
Btw.: you can use ASM to parse the class bytecode, or use Java's built-in facilities to build up an AST. Although those could work around the memory issue, I still think that you should define the list of classes directly in the MANIFEST header, as it makes things much clearer. You can read the MANIFEST headers and check the capabilities in the webconsole, but you cannot do the same with bytecode.
Usually, classpath scanning for annotations is a bad idea in an OSGi context, as the classpath is more like a graph. However, there are situations where this can be useful. Hence, OSGi encourages the usage of the Whiteboard Pattern.
What you could possibly do is register each of these classes as services in the OSGi registry. Then, create a separate bundle that just tracks these services and transforms/manipulates them in some way. For example, this project scans for all classes annotated with the @Path and @Provider annotations and transforms them into Jersey REST APIs.
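A very small sketch of that whiteboard idea (the Handler interface and its registration are invented for illustration): contributing bundles register their objects as services, and one tracking bundle picks them all up without the contributors knowing about it.

package com.example.whiteboard; // hypothetical package

import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;
import org.osgi.util.tracker.ServiceTracker;
import org.osgi.util.tracker.ServiceTrackerCustomizer;

// In reality this contract would live in an exported API package.
interface Handler {
    void handle();
}

public class HandlerWhiteboard implements ServiceTrackerCustomizer<Handler, Handler> {

    private final BundleContext context;
    private ServiceTracker<Handler, Handler> tracker;

    public HandlerWhiteboard(BundleContext context) {
        this.context = context;
    }

    public void open() {
        tracker = new ServiceTracker<>(context, Handler.class, this);
        tracker.open();
    }

    @Override
    public Handler addingService(ServiceReference<Handler> reference) {
        Handler handler = context.getService(reference);
        // transform or wire up the contributed object here
        handler.handle();
        return handler;
    }

    @Override
    public void modifiedService(ServiceReference<Handler> reference, Handler handler) {
    }

    @Override
    public void removedService(ServiceReference<Handler> reference, Handler handler) {
        context.ungetService(reference);
    }
}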
