I am trying to create a Karaf assembly using Maven (and NetBeans). I create my bundles using declarative services, but I am having problems creating feature files. Part of my problem is the error messages that OSGi generates, but I also have some more general questions.
I have discovered that I can call karaf-maven-plugin in the project that creates my bundles and it generates what appears to be a comprehensively populated feature file based on the dependencies of the bundle. (Method 1)
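For reference, this is roughly the plugin configuration I mean for Method 1 (a sketch; the exact execution binding in my POM may differ):
<plugin>
    <groupId>org.apache.karaf.tooling</groupId>
    <artifactId>karaf-maven-plugin</artifactId>
    <version>4.1.0</version>
    <executions>
        <execution>
            <id>generate-features-descriptor</id>
            <goals>
                <goal>features-generate-descriptor</goal>
            </goals>
        </execution>
    </executions>
</plugin>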
However, I have read somewhere that creating a feature file with karaf-maven-plugin should normally only be done in a project with feature packaging. If I do this, it seems to me that I have to create the feature file by hand, which is not a lot of fun. (Method 2)
No matter which method I use, I have been unable to successfully generate a Karaf assembly that contains anything other than simple bundles without any dependencies. I am currently stuck trying to install a single bundle that needs to wrap some non-OSGi dependencies. Method 1 above generates the wrap-related entries (a <feature> definition and wrap: protocol URLs), but all I get is the following error:
Failed to execute goal org.apache.karaf.tooling:karaf-maven-plugin:4.1.0:assembly (default-assembly) on project EnoceanBridgeAdmin: Unable to build assembly: [wrap/0.0.0]
EnoceanBridgeAdmin is the project with karaf-assembly packaging that I'm trying to build. It has a dependency on the bundle that contains the generated feature file (where wrap is referenced):
<dependency>
    <groupId>net.winnall.enocean.service.impl</groupId>
    <artifactId>EnoceanBridgeSASS.Impl</artifactId>
    <version>0.99.99</version>
    <type>xml</type>
    <classifier>features</classifier>
</dependency>
So my questions:
Is method 1 above a correct usage?
Can I automatically generate a feature file to use method 2?
Will the error message disappear after I've got method 1 or 2 sorted?
Steve
I have resolved this myself.
Method 1 does not work because karaf-maven-plugin generates <feature> definitions for wrap. These cause the error mentioned above. Apparently – at least with Karaf 4.1.0 – the wrap: protocol should be used in a feature file without a prior <feature> definition.
Method 2 (writing the feature file yourself) is thus the only viable option, given this behaviour of karaf-maven-plugin.
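As an illustration (the coordinates of the wrapped jar are placeholders, not my actual artifacts), a hand-written feature file that wraps a non-OSGi dependency looks roughly like this:
<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0" name="example-features">
    <feature name="enocean-bridge" version="0.99.99">
        <!-- non-OSGi dependency wrapped on the fly via the wrap: protocol,
             with no separate <feature> definition for wrap itself -->
        <bundle>wrap:mvn:com.example/plain-java-library/1.2.3</bundle>
        <bundle>mvn:net.winnall.enocean.service.impl/EnoceanBridgeSASS.Impl/0.99.99</bundle>
    </feature>
</features>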
Yes, the error message disappeared :-)
I am working on creating an internal library/starter for my team that will add support for creating native images, providing all of the hints that our currently unsupported dependencies will need. I really like providing the metadata via code (using RuntimeHintsRegistrar), but there are also certain classes that need to be initialized at build time for whatever reason.
Right now I'm passing --initialize-at-build-time and the classes to the Spring Boot Maven Plugin via the BP_NATIVE_IMAGE_BUILD_ARGUMENTS environment variable, but ideally I'd like to avoid each consuming app having to include this in their own POM's plugin configuration.
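For context, this is roughly what each consuming app currently has to carry in its POM (the class names are placeholders; the real list comes from our unsupported dependencies):
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <image>
            <env>
                <BP_NATIVE_IMAGE_BUILD_ARGUMENTS>--initialize-at-build-time=com.example.SomeClass,com.example.OtherClass</BP_NATIVE_IMAGE_BUILD_ARGUMENTS>
            </env>
        </image>
    </configuration>
</plugin>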
I also understand that I can go more low-level and provide the argument inside of the META-INF/native-image directory in a native-image.properties file, but I'm not sure whether that will play nice with the Spring-provided RuntimeHintsRegistrar effectively creating that underneath the covers.
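For reference, that lower-level alternative would be a file like META-INF/native-image/<groupId>/<artifactId>/native-image.properties shipped inside the library, along the lines of (class name is a placeholder):
Args = --initialize-at-build-time=com.example.SomeClass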
What is the best way to tell native-image the classes that should be initialized at build time without each consuming app having to pass it in their own POM? Also, if I use the GraalVM tracing agent to generate hints, will those hints play nicely with the ones that RuntimeHintsRegistrar generates?
Thanks in advance!
Experts,
I need some expert advice on how to approach the below use case in spring boot.
I need to have a maven multi-module approach to my project.
I need to have a single jar as output of the final build process.
There are to be common modules for controllers, data access and other functionality.
Other modules are to be created based on functional domain, e.g. a module for Payroll, a module for Admin, and so on.
Each domain functional module will then have its own controllers extending the common controller, exception handler and so on.
Each module will also have its own set of Thymeleaf pages.
The reason for following such an approach is that we have development in phases and we will be rolling out based on functional modules.
Here are the issues that I can sense using this approach.
Where do I add the Spring Web dependency? If I add it to the parent POM, it gets replicated across the children and there will be port conflict issues as each module loads. The same issue will also be there the moment I add it to two child modules.
How do I build the fat jar which has all the jars from all modules and works as the final deployment?
In all the material I have read, I can't find anything even close to what I am trying to achieve.
Ad 1. The port conflicts will not happen unless you are trying to set up an independent application context in each module. Of course you can do that (it might be complicated, but I believe it's achievable), but for me it's overkill. Personally I think it's better to have one application context and rely on scanning the components that are present on the classpath.
Ad 2. The structure in Maven might be a little bit complicated and overwhelming at first glance, but it makes sense. Here's how I see it:
Create a parent module that will aggregate all modules in the project and declare library/plugin dependencies for the submodules (see the sketch after this list).
Create 1-N shared submodules that will be used by the other modules, with some common logic, utils, etc.
Create 1-N submodules that will handle your business logic.
Create an application submodule that creates the application context and loads configuration and components from the classpath.
Create a submodule that is responsible for the packaging process, whether to a war, jar, uber-jar or whatever else you desire. The Maven JAR plugin should do that for you; for an executable uber-jar, you have a dedicated tool from Spring (the Spring Boot Maven plugin).
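A minimal sketch of the aggregator POM from the first point (module names are placeholders):
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>common</module>
        <module>payroll</module>
        <module>admin</module>
        <module>application</module>
    </modules>
    <!-- dependencyManagement / pluginManagement for the submodules goes here -->
</project>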
Now you can choose between three ways (the ones I know) of loading your modules.
1. Include some modules in the Maven build, based on the build configuration, via Maven profiles, and let the Spring IoC container load all the components it finds on the classpath.
2. Include all of the modules in the Maven build and load them depending on Spring active profiles - you can think of it as a feature flag. You annotate your components or configuration class with @Profile("XYZ"), telling the Spring IoC container whether or not to instantiate the component (see the sketch after this list). You will need (most flexible solution) to provide a property file which tells Spring which profiles are active and thus which modules should be loaded.
3. Mix of these two above.
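A sketch of the profile-based approach from option 2 (the package and profile names are illustrative):
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// Instantiated only when the "payroll" profile is active,
// e.g. spring.profiles.active=payroll in the property file mentioned above.
@Configuration
@Profile("payroll")
@ComponentScan("com.example.payroll")
public class PayrollModuleConfiguration {
}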
Solution 1 pros:
build is faster (modules that are not included will be skipped during build)
final build file is light (modules that are not included are... not included ;))
nobody can run module that is not present
Solution 1 cons:
project descriptor in maven may explode as you might have many different profiles
Solution 2 pros:
it's fairly easy and fun to maintain modules from code
less mess in project descriptor
Solution 2 cons:
somebody can run a module that is not intended to be run, as it's present on the classpath and only excluded at runtime via Spring active profiles
final build file might be overweight - unused code is still present in the artifact
build might take longer - unused code will be compiled
Summary:
It's not easy to build a well-structured project from scratch. It's much easier to create a monolith and then split it into modules, because by the time you have created the project you have probably already identified all the domains and the relations between them.
After 8 years of using Maven, I honestly and strongly recommend using Gradle, as it's far more flexible than Maven. Maven is a really great tool, but when it comes to unusual customization it often fails, as its build capabilities rely on plugins. You can't write a piece of code on the fly to perform some custom build behaviour while building your project; you must have a dedicated plugin for doing that. If such a plugin exists, that's fine; if not, you will probably end up writing your own and handling its shipment, so anyone in your company can easily perform the project build.
I hope it helps. Have fun ;)
I'm creating a Liferay 7.1 OSGi bundle, which has some external dependencies. To save time, we opted to embed the external JARs in our OSGi bundle. I've managed to create a bnd file which includes all of the Elasticsearch dependencies and puts them on the bundle classpath. I've used the source code from GitHub (https://github.com/liferay/liferay-portal/blob/master/modules/apps/portal-search-elasticsearch6/portal-search-elasticsearch6-impl/build.gradle) and the bnd.bnd file to check what's imported.
When activating the bundle, an exception is thrown:
The activate method has thrown an exception
java.util.ServiceConfigurationError: org.elasticsearch.common.xcontent.XContentBuilderExtension: Provider org.elasticsearch.common.xcontent.XContentElasticsearchExtension not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.elasticsearch.common.xcontent.XContentBuilder.<clinit>(XContentBuilder.java:118)
at org.elasticsearch.common.settings.Setting.arrayToParsableString(Setting.java:1257)
The XContentBuilderExtension interface comes from elasticsearch-x-content-6.5.0.jar; the XContentElasticsearchExtension class is included in elasticsearch-6.5.0.jar. Both are included resources and have been put on the bundle classpath.
The activate method initializes a TransportClient in my other jar, hence it happens on activation ;).
Edit:
I've noticed that this error does NOT occur when installing the bundle for the first time, or when the portal restarts. It only occurs when I uninstall and reinstall the bundle. (This is functionality I really prefer to have!) Maybe a stupid thought, but could it be that there is some 'hanging thread'? That the bundle is not correctly installed, or that the TransportClient is still alive? I'm checking this out. Any hints are welcome!
Edit 2:
I fear this is an incompatibility between SPI and OSGi. I've checked: the High Level REST Client has the same issue (but then with another extension). I'm going to try the Low-Level REST Client; this should work, as there are minimal dependencies, I'm guessing. I'm still very curious why the incompatibility is there. I'm certainly no expert on OSGi, nor on SPI. (Time to learn new stuff!)
This seems like a case where OSGi uses your bundle to resolve a dependency from another bundle, probably one that used your bundle to resolve a package when the system started.
Looking at the symptoms: it does not occur on boot or restart, and the error is "not a subtype".
When OSGi uses that bundle to resolve a dependency, it will keep a copy around, even when you remove it. When the bundle comes back, a package that was previously used by another bundle may still be around, and you can have the situation where a class in use has two versions of itself, from different classloaders, meaning they are not the same class and therefore not a subtype.
Expose only what is necessary to minimize the effects of this. Import only what needs importing. If you are using the Liferay Gradle configuration to include the bundle inside, stop - it's a terrible way to include it, as it exposes a lot. If you are using the bnd file to include a resource and create an entry for the additional classpath location, do not expose what is not necessary. If you have several bundles using one as a dependency, make sure about the versions they use and whether they exchange objects of the problematic class; if they do, then extra care is required.
PS: you can include attributes when exporting and/or importing in order to be more specific and avoid wiring to packages from the wrong origin.
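For illustration only (the paths, package names and version ranges are placeholders), directives along these lines in the bnd file keep the embedded jars private and are explicit about what is imported, assuming the jars sit in a lib/ folder of the project:
Include-Resource: lib/elasticsearch-6.5.0.jar, lib/elasticsearch-x-content-6.5.0.jar
Bundle-ClassPath: ., lib/elasticsearch-6.5.0.jar, lib/elasticsearch-x-content-6.5.0.jar
# no Export-Package, so nothing from the embedded jars leaks to other bundles
# version (and, if needed, bundle-symbolic-name) attributes pin where imports come from
Import-Package: com.example.shared.api;version="[1.0,2)", *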
You can have two Elasticsearch connections inside one Java app, and Liferay by default does not expose the connection that it holds.
A way around it is to rebuild the Liferay ES connector. It's not a big deal, because you don't need to change the code, only the OSGi descriptor, to expose more services.
I did it in one POC project and it worked fine. The tricky thing is to rebuild the Liferay jar, but that was explained by Pettry in his Google-like-search blog posts: https://community.liferay.com/blogs/-/blogs/creating-a-google-like-search (it is a series, but it's kind of hard to navigate in the new Liferay blogs; Google will probably help). Either way, it is all nicely documented here: https://github.com/peerkar/liferay-gsearch
The only thing that then needs to be done is to add org.elasticsearch.* to the export section of the bnd.bnd file. You will then be able to work with the native Elasticsearch API.
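As a sketch, that addition would look roughly like this (the wildcard may export more than strictly needed):
Export-Package: org.elasticsearch.*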
I'm currently building a Spring Boot application which also uses some JavaScript. I use yarn as a package manager to manage the different JS libraries.
Now I wonder how I should include these resources in my Spring Boot project. Simply including the whole node_modules folder as a resource seems like overkill to me, as it doesn't necessarily contain only the required sources (for me it is more like my local Maven repository path). How do I identify which JavaScript resources should be included in my jar in the end, so that I can also reference them in my Thymeleaf HTML templates?
I already found the frontend-maven-plugin (https://github.com/eirslett/frontend-maven-plugin), which helps me install all my yarn dependencies during the build, but it doesn't take care of the bundling/build step itself, as far as I can see.
Thanks for your help!
Perhaps you should consider using webpack or some other JavaScript bundler/task runner to bundle your JavaScript and its required dependencies into a single file. Then you can simply include that bundled file in your jar. For example: http://justincalleja.com/2016/04/17/serving-a-webpack-bundle-in-spring-boot/
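As a sketch of how this can be wired into the Maven build with the frontend-maven-plugin already mentioned in the question (the plugin/node/yarn versions and the "build" script are assumptions; the script is expected to run webpack and write its bundle somewhere under src/main/resources/static):
<plugin>
    <groupId>com.github.eirslett</groupId>
    <artifactId>frontend-maven-plugin</artifactId>
    <version>1.6</version>
    <executions>
        <execution>
            <id>install-node-and-yarn</id>
            <goals><goal>install-node-and-yarn</goal></goals>
            <configuration>
                <nodeVersion>v8.11.3</nodeVersion>
                <yarnVersion>v1.7.0</yarnVersion>
            </configuration>
        </execution>
        <execution>
            <id>yarn-install</id>
            <goals><goal>yarn</goal></goals>
        </execution>
        <execution>
            <id>yarn-build</id>
            <goals><goal>yarn</goal></goals>
            <configuration>
                <arguments>run build</arguments>
            </configuration>
        </execution>
    </executions>
</plugin>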
I am trying to construct a Maven Acceleo generator.
The generator consists of multiple Acceleo projects (Maven artifacts), with inter-dependencies.
I am running into a problem with the dependencies between emtl files.
At runtime, I get errors stating that there are compilation errors in the mtl files (there are not).
I am guessing it may be the hrefs within the compiled emtl files.
There is an Acceleo Maven compile plugin that allows these to be either:
a) absolute paths (which end up pointing into the Maven repository)
b) 'platform:/plugin/...' paths.
(a) works if the Acceleo projects (Maven artifacts) are built on the same machine as the one on which we do the generation, but if the location of the Maven repository changes, we have a problem. Hence deploying the Maven artifacts ends up being pointless.
(b) doesn't work because, running from within Maven, 'platform:/plugin/' cannot be resolved.
I have tried to override the 'createURIConverter' method in the AbstractAcceleoGenerator; using a URLClassLoader we can then decode the 'platform:/plugin/' hrefs and find the correct emtl file.
I can verify that this seems to be working; however, I still get the reported
"org.eclipse.acceleo.engine.AcceleoEvaluationException: Unresolved compilation error in generation module"
Any suggestions?
The exception was caused by hrefs in the EMTL model resolving to null.
The problem with resolving the hrefs was just a lack of resource factories being registered (the exception informing me of this gets lost in EcoreUtil.resolve, which catches the exception with "// Failure to resolve is ignored." - most unhelpful).
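A minimal sketch of the kind of registration that was missing (the "mymodel" extension is a placeholder for the metamodel's actual file extension); it can be called on the generator's ResourceSet, e.g. from the registerResourceFactories override in the Acceleo-generated launcher:
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.xmi.impl.EcoreResourceFactoryImpl;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

public class ResourceFactorySetup {

    // Without these registrations, EcoreUtil.resolve silently leaves proxies unresolved.
    public static void register(ResourceSet resourceSet) {
        resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
                .put("ecore", new EcoreResourceFactoryImpl());
        // "mymodel" is a placeholder for the metamodel's own file extension
        resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
                .put("mymodel", new XMIResourceFactoryImpl());
    }
}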
So it seems that my approach of "override the ‘createURIConverter’ method, and get it to decode the ‘platform:/plugin/..’ using a URLClassLoader" does work.
I built the URLClassLoader using the same method as is done in the AcceleoParserMojo.