AspectJ: how to use ajc in a modular way (OSGi)

I am trying to use the AspectJ compiler ajc in a modular (OSGi) setting. The standard way ajc seems to be used is to take aspects & Java code and turn everything in the -inpath, -aspectpath, and -sourceroots into a single JAR with all classes and resources.
I am trying to weave aspects into an OSGi executable JAR built by bnd. This executable JAR contains a set of bundles that need to be woven. However, in a modular system, the boundary is quite important. For one, the manifest often contains information highly relevant to that bundle or to one of the many extenders. Flattening all the classes into one big blob won't work.
I am therefore weaving each bundle separately. However, the output is then cluttered with the aspects. I'd like to import these instead, so the aspect modules stay proper modules. However, using the annotation programming model, I notice that ajc modifies the aspect modules, so I need to rewrite those as well. This is fine, but since I weave each bundle separately, it raises the question whether the weaving of an aspect could depend on the modules it gets woven into. That is,
does the modification of the annotated aspect depend on the classes it is woven into?
The other issue is what happens to resources with the same name. Since my -inpath is only one JAR (the bundle), I notice I end up with the correct manifest (META-INF/MANIFEST.MF) in the output. However, if the -inpath consists of many bundles, what will the manifest be? Or any other resource that has the same path and thus overlaps?
The last issue is external dependencies. I understand ajc wants to see the whole world and include this whole world in the output JAR. However, I must exclude the external dependencies of a bundle. Is there a way to mark JARs as 'use, but do not include', a bit like the Maven 'provided' scope?
Summary:
Does the modification of an @Aspect annotated class depend on the targets it is applied to?
Can I compile the @Aspect annotated classes into separate JARs?
How do I handle external dependencies that will be provided at runtime and thus must be excluded from the output JAR?
What are the rules around overlapping resource paths in the -inpath and -sourceroots?
UPDATE: In the meantime I've made an implementation in Bndtools.

Does the modification of an @Aspect annotated class depend on the targets it is applied to?
If you want to be 100% sure you have to read the AspectJ source code, but I would assume that an aspect's byte code is independent of its target classes, because otherwise you could not compile aspects separately and also not build aspect libraries.
Can I compile the @Aspect annotated classes into separate JARs?
Absolutely, see above.
How do I handle external dependencies that will be provided at runtime and thus must be excluded from the output JAR?
If I understand the question correctly, you probably want to put them on the class path during compilation, not on the inpath.
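To make the distinction concrete, here is a minimal sketch (not a definitive setup) that drives ajc programmatically via org.aspectj.tools.ajc.Main from aspectjtools; all JAR paths are made-up placeholders. Only what is on the -inpath ends up in the -outjar; -aspectpath and -classpath entries are used for weaving and resolution but are not copied into the output:

import org.aspectj.tools.ajc.Main;

// Sketch: weave a single bundle against a binary aspect library, keeping
// "provided" dependencies on the classpath so they are visible to the
// compiler but excluded from the woven output JAR. All paths are examples.
public class WeaveBundle {
    public static void main(String[] args) {
        Main.main(new String[] {                    // note: may call System.exit
            "-inpath", "bundles/my.bundle.jar",     // woven and copied to the output
            "-aspectpath", "libs/my.aspects.jar",   // aspects applied, not copied
            "-classpath", "libs/provided-dep.jar",  // resolved against, not copied
            "-outjar", "target/my.bundle-woven.jar"
        });
    }
}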
What are the rules around overlapping resource paths in the -inpath and -sourceroots?
Again, you would probably have to look at the source code. If I were you, I would simply assume that the selection order is undefined and make sure not to have duplicates in the first place. There should be Maven plugins that help you filter the result the way you want.
Bndtools seems to have close ties to Eclipse, and so does AspectJ as an Eclipse project. Maybe you can connect with Andy Clement, the AspectJ maintainer. He is so swamped with his day-time job, though, that he hardly ever has any free cycles. I am trying to unburden him as much as I can, but OSGi is one of my blind spots and I hardly know the AspectJ source code; I am rather an advanced user.

Related

Spring Boot Multi Module and Fat jar with Shared Features

Experts,
I need some expert advice on how to approach the below use case in Spring Boot.
I need to have a Maven multi-module approach to my project.
I need to have a single jar as the output of the final build process.
There are to be common modules for controllers, data access and other functionality
Other modules are to be created based on functional domain, e.g. a module for Payroll, a module for Admin, etc.
Each domain functional module will then have their own controllers extending the common controller, exception handler and so on.
Each module will also have its own set of Thymeleaf pages.
The reason for following such an approach is we have development in phases and we will be rolling out based on functional modules.
Here are the issues that I can sense using this approach.
Where do I add the spring web dependency? If I add it to the parent pom, it gets replicated across the children and there will be port conflict issues as each module loads. The same issue will also be there the moment I add it to two child modules.
How do I build the fat jar which has all the jars from all modules and works as the final deployment?
In all the text that I have read, I can't see anything even close to what I am trying to achieve.
AD1. They will not, unless you are trying to set up an independent application context in each module. Of course you can do that (it might be complicated, but I believe it's achievable), but for me it's overkill. Personally I think it's better to have one application context and rely on scanning the components that are present on the classpath.
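As a minimal sketch of that single-context approach (the com.example package is a made-up assumption), the application module owns the one entry point, and Spring picks up whatever components the other modules contribute to the classpath:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Single entry point for the whole multi-module application. Components
// from any module on the classpath are discovered, as long as they live
// under the scanned base package (an assumption for this sketch).
@SpringBootApplication(scanBasePackages = "com.example")
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}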
AD2. The structure in maven might be a little bit complicated and overwhelming at first glance but it makes sense. Here's how I see it:
Create a parent module that will aggregate each module in the project and will declare library/plugin dependencies for submodules.
Create 1-N shared submodules that will be used in other modules, with some common logic, utils, etc.
Create 1-N submodules that will be handling your business logic
Create an application submodule that creates the application context and loads configuration and components from the classpath
Create a submodule that will be responsible for the packaging process, whether war, jar, uber-jar or whatever else you desire. The Maven jar plugin should do that for you; for an executable uber-jar, you have a dedicated tool from Spring (the spring-boot-maven-plugin).
Now you can choose between three ways (these are the ways I know) of loading your modules.
1. Include some modules in the Maven build based on the build configuration via Maven profiles, and let the Spring IoC container load all the components it finds on the classpath
2. Include all of the modules in the Maven build and load them depending on Spring active profiles - you can think of it as a feature flag. You annotate your components or configuration class with @Profile("XYZ"), telling the Spring IoC container whether or not to instantiate the component (see the sketch after this list). You will need (as the most flexible solution) to provide a property file which tells Spring which profiles are active and thus which modules should be loaded
3. A mix of the two above.
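For option 2, a hedged sketch of such a profile-gated configuration (PayrollConfig and PayrollService are made-up names for one functional module):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// This configuration, and therefore the whole hypothetical "payroll"
// module, is only instantiated when the "payroll" profile is active,
// e.g. via spring.profiles.active=payroll in a property file.
@Configuration
@Profile("payroll")
public class PayrollConfig {

    @Bean
    public PayrollService payrollService() {
        return new PayrollService(); // hypothetical domain service
    }
}

class PayrollService {
}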
Solution 1 pros:
build is faster (modules that are not included will be skipped during build)
final build file is light (modules that are not included are... not included ;))
nobody can run a module that is not present
Solution 1 cons:
the project descriptor in Maven may explode, as you might have many different profiles
Solution 2 pros:
it's fairly easy and fun to maintain modules from code
less mess in project descriptor
Solution 2 cons:
somebody can run a module that is not intended to be run, as it is present on the classpath and merely excluded at runtime via Spring active profiles
the final build file might be overweight, as unused code is still present
the build might take longer, as unused code will still be compiled
Summary:
It's not easy to build a well-structured project from scratch. It's much easier to create a monolith and then split it into modules, because once you have created a project, you have probably already identified all the domains and the relations between them.
Over the past 8 years of using Maven, I honestly and strongly recommend using Gradle, as it's far more flexible than Maven. Maven is a really great tool, but when it comes to weird customization it often fails, as its build capabilities rely on plugins. You can't write a piece of code on the fly to perform some custom build behaviour while building your project; you must have a dedicated plugin for doing that. If such a plugin exists, that's fine; if not, you will probably end up writing your own and handling its shipment, so anyone in your company can easily perform the project build.
I hope it helps. Have fun ;)

When to use "optional" dependencies and when to use "provided" scope?

Dependencies decorated with <optional>true</optional> or <scope>provided</scope> are ignored when they would be pulled in transitively. I have read this, and my understanding is that it is like the difference between @Component and @Service in Spring: they only vary semantically.
Is it right?
In addition to the comment, there is a more important semantic difference: "provided" dependencies are expected to be supplied by the container, so if your container gives you Hibernate, you should mark Hibernate as provided.
Optional dependencies are mainly used to reduce the transitive burden of some libraries. For example: if a library can be used with 5 different database types but you usually only require one, you can mark the database-specific dependencies as optional, so that the user can supply the one they actually use. If you don't, you might get two types of problems:
The library pulls in a huge load of transitive dependencies, of which you actually need very few, so you blow up your project without reason.
More dangerously: You might pull two libraries with overlapping classes, so that the class loader cannot load both of them. This might lead to unexpected behaviour of your library.
A minor difference I'd like to point out is the treatment of optional vs. provided by various plugins that create packages.
Apparently the war plugin will not package optional dependencies, but there is an open bug about it: https://issues.apache.org/jira/browse/MWAR-351
The assembly plugin doesn't seem to provide any way to filter based on optional status, while it allows you to filter based on scope.
It seems the same is true for the shade plugin.
TL;DR: if you are not developing a library but a top-level application, the provided scope will give you more flexibility.

Classpath scanning in OSGi

My project has a set of custom-defined annotations that could be present in any bundle deployed in the OSGi 4.3 framework. I want to find any class with these annotations on the classpath. I tried using BundleWiring.listResources(...) and Bundle.loadClass(...) for each class found. I have done some tests with a small set of bundles, and it needs almost 200MB of permanent generation (PermGen) JVM memory space because all the classes are loaded.
Is there a way to free the PermGen memory space of loaded classes when the program realizes that they do not have these annotations?
Is there a better way to look for annotated classes in an OSGi framework?
I think you should not do annotation scanning, as it slows down startup and needs a lot of memory. JEE application servers do annotation scanning at startup to make lazy programmers happy, and the result is very annoying (e.g. scanning for JPA or EJB annotations).
I guess you are implementing a technology where you can define the rules. I suggest that you should define rules that are similar to these:
Annotate your class
Have a MANIFEST header where the annotated classes must be listed.
An even better solution can be to use a custom capability namespace with specified attributes. E.g.:
Provide-Capability: myNamespace;classes=com.foo.myClass1,com.foo.myClass2
In your technology, you should write a BundleTracker that calls:
BundleWiring.getCapabilities("myNamespace");
If the namespace is present, you can find the classes that should be processed.
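A rough sketch of such a tracker, assuming the myNamespace capability and its classes attribute from the example header above (the names are illustrative; the OSGi API calls are standard as of 4.3):

import java.util.List;

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleEvent;
import org.osgi.framework.wiring.BundleCapability;
import org.osgi.framework.wiring.BundleWiring;
import org.osgi.util.tracker.BundleTracker;

// Tracks active bundles and loads only the classes listed in the custom
// capability, instead of scanning every class on the classpath.
public class AnnotatedClassTracker extends BundleTracker<Bundle> {

    public AnnotatedClassTracker(BundleContext context) {
        super(context, Bundle.ACTIVE, null);
    }

    @Override
    public Bundle addingBundle(Bundle bundle, BundleEvent event) {
        BundleWiring wiring = bundle.adapt(BundleWiring.class);
        if (wiring == null) {
            return null; // nothing to inspect
        }
        List<BundleCapability> capabilities = wiring.getCapabilities("myNamespace");
        for (BundleCapability capability : capabilities) {
            Object classes = capability.getAttributes().get("classes");
            if (classes != null) {
                for (String name : classes.toString().split(",")) {
                    try {
                        Class<?> clazz = bundle.loadClass(name.trim());
                        // process the annotated class here
                    } catch (ClassNotFoundException e) {
                        // listed but not loadable: log and skip
                    }
                }
            }
        }
        return bundle;
    }
}

You would open the tracker from your bundle activator (new AnnotatedClassTracker(context).open()) and close it on stop.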
If you implement the technology, you can consider an extension to Bnd that fills in that MANIFEST header automatically. That extension can then be used when bnd is started from the command line or from build tools like Maven.
Btw.: you can use ASM to parse the class bytecode, or use the built-in facilities of Java to build up an AST. Although those could work to solve the memory issue, I still think that you should define the list of classes directly in the MANIFEST header, as it makes things much clearer. You can read the MANIFEST headers and check the capabilities in the webconsole, but you cannot do the same with bytecode.
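For completeness, the ASM route could look roughly like this: the bytecode is inspected without ever defining the class in the JVM, so classes without the annotation never touch PermGen. The annotation descriptor parameter (e.g. "Lcom/foo/MyAnnotation;") is whatever your custom annotation's descriptor happens to be:

import java.io.InputStream;
import java.net.URL;

import org.objectweb.asm.AnnotationVisitor;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.Opcodes;

// Checks whether a .class resource carries a given annotation by reading
// its bytecode with ASM, without loading the class.
public class AnnotationSniffer {

    public static boolean hasAnnotation(URL classResource, String annotationDesc)
            throws Exception {
        final boolean[] found = { false };
        try (InputStream in = classResource.openStream()) {
            ClassReader reader = new ClassReader(in);
            reader.accept(new ClassVisitor(Opcodes.ASM4) {
                @Override
                public AnnotationVisitor visitAnnotation(String desc, boolean visible) {
                    if (desc.equals(annotationDesc)) {
                        found[0] = true;
                    }
                    return null; // annotation values are not needed
                }
            }, ClassReader.SKIP_CODE | ClassReader.SKIP_DEBUG);
        }
        return found[0];
    }
}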
Usually, classpath scanning for annotations is a bad idea in an OSGi context, as the classpath is more like a graph. However, there are situations where this can be useful. Hence, OSGi encourages the usage of the Whiteboard Pattern.
What you could possibly do is register each of these classes as a service in the OSGi registry. Then create a separate bundle that just tracks these services and transforms/manipulates them in some way. For example, this project scans for all classes annotated with @Path and @Provider annotations and transforms them into Jersey REST APIs.
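A minimal sketch of the registration side of that whiteboard approach (MyResource is a hypothetical annotated class; a separate tracker bundle would consume these services):

import java.util.Hashtable;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Each bundle publishes its own annotated classes as services; nobody
// has to scan the classpath to find them.
public class Activator implements BundleActivator {

    @Override
    public void start(BundleContext context) {
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("annotated", Boolean.TRUE); // example marker property
        context.registerService(Object.class, new MyResource(), props);
    }

    @Override
    public void stop(BundleContext context) {
        // services registered via this context are unregistered automatically
    }
}

class MyResource {
}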

Should I exclude Spring Framework jars that are not required?

I am developing a simple web application, using Spring Framework.
When I add the Spring Framework to my classpath, I see that it has a lot of jars which I never use (for example spring-aop-3.2.3.RELEASE.jar).
Is it a good idea to keep the entire framework intact or remove unused jars?
If you need to remove unused jars, the best way is to use a dependency management tool like Ivy or Maven and let the tool decide what the required dependencies are. Otherwise it will not be apparent what is really unused until you break something.
For instance, if you are using declarative transactions, then removing the AOP jar will cause breakage, because AOP is used to implement that functionality.
If you would rather not use dependency management, it's better to leave everything intact.
There are some cases where you do want to remove/exclude jars. Replacing commons-logging with slf4j is one example. Another example is excluding the log4j dependencies that get dragged in on account of some appender that's packaged with log4j but that you know you will never use. Dependency management tools allow you to tell them what needs to be excluded.
Doing without dependency management and removing things because you never use them directly is too dangerous.

Expected or Recommended usage of the Maven JAXB 2.x Plugin

I'm new to XML Schema and to JAXB, and I'm wondering what the best or expected approach to using the Maven JAXB plugin (http://static.highsource.org/mjiip/maven-jaxb2-plugin/generate-mojo.html) is.
I have a simple XML document format for which I've defined a schema. I'm primarily interested in reading a compliant XML file into Java, but I'll probably also want to add extra properties to the POJOs which won't be in the XML, but will be used at runtime.
By default the plugin places generated code into ${project.build.directory}/generated-sources/xjc. What I think I want to do is copy the generated code into /src/main/java/whatever and add to/modify the code to add my extra properties. When I change the schema, I'd then merge changes from the newly generated POJOs into my own ones.
The alternative is to tell the plugin to place the generated source directly into /src/main/java and to perhaps subclass the POJOs to add my own properties, but I'm not sure whether the marshaling/unmarshaling can still be made to use my extended classes.
Anyone have any guidance on which approach is more normal or what the pitfalls of each are?
In your place I'd leave the generated sources where they are, so that the corresponding jar can be built by Maven without further configuration, and put your custom code in a different project that depends on the first one. This ensures that everything is built in the right order.
It is your choice whether to derive from the generated classes or just use instances of them in your code, as attributes or, even better, local variables. Personally I'd avoid derivation; after all JAXB is just low level machinery you use to perform I/O in a specific format.
Most importantly: forget about modifying the generated sources; why introduce an error prone manual step in your development process when you can get the same effect automatically?
(To provide a slight variation on Nicola's answer)
If your schema rarely changes it might make sense to have a completely separate build which just creates the JAXB generated code, jars it, versions it, and sticks it in your repository.
Then in your downstream code you use that jar as a dependency and subclass the JAXB code as necessary to add your new fields.
We went this route because we felt that having JAXB compile every time we did a build was unnecessary, as our schemas were pretty static.
Most importantly: forget about modifying the generated sources; why introduce an error prone manual step in your development process when you can get the same effect automatically?
Absolutely.
To elaborate and extend on a point already well-made... if there are a lot of implicit relationships and things you'd like to put "getters" on the JAXB code for, bite the bullet and wrap the JAXB class hierarchy in one that does exactly what you want where you want it.
With IDE-assisted delegation, this is only a little tedious, and factors a lot of straightforward, distracting, low-level code out of your main app.
Another benefit of this is that you'll spend a lot less time fighting JAXB to generate things exactly the way you want - the wrappers will make you care a whole lot less.
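To illustrate the wrapping idea with a hedged sketch (OrderType stands in for a class xjc would generate; Order is the hand-written wrapper):

import java.time.LocalDate;

// Stand-in for a JAXB-generated class; in a real build it would live under
// target/generated-sources/xjc and never be edited by hand.
class OrderType {
    private String id;
    private LocalDate closedDate;

    public String getId() { return id; }
    public LocalDate getClosedDate() { return closedDate; }
}

// Hand-written wrapper: delegates the plumbing and adds the derived
// "getters" the schema knows nothing about.
public class Order {

    private final OrderType jaxb;

    public Order(OrderType jaxb) {
        this.jaxb = jaxb;
    }

    public String getId() {
        return jaxb.getId(); // plain delegation
    }

    public boolean isOpen() {
        return jaxb.getClosedDate() == null; // derived property
    }
}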
