After feeling like I had a grasp on how OSGi works, I tried to add a third-party dependency, specifically log4j2, to my application, which runs on Apache Felix and is bundled with the maven-bundle-plugin. Unfortunately, I seem to be stuck in dependency hell.
I have tried numerous maven-bundle-plugin tactics such as Import-Package, Embed-Dependency, wrapImportPackage, Embed-Transitive, and pinning specific version numbers, to name a few. Below is what my POM looks like for this plugin:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>ParentId</artifactId>
<groupId>ParentGroupId</groupId>
<version>x.x.x</version>
</parent>
<groupId>ParentGroupId.ParentId</groupId>
<artifactId>thisModule</artifactId>
<packaging>bundle</packaging>
<name>thisModule</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.core</artifactId>
<version>6.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>AM</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.12.1</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.12.1</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>3.5.1</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>${project.name}</Bundle-SymbolicName>
<Bundle-Activator>moduleActivator</Bundle-Activator>
<Embed-Dependency>
AM,
gson,
log4j-api,
log4j-core
</Embed-Dependency>
</instructions>
</configuration>
</plugin>
</plugins>
<finalName>${project.artifactId}</finalName>
</build>
</project>
The most progress I have made is with the above POM, where I embed the log4j API and core directly into the bundle, but it seems OSGi cannot resolve the compile dependencies that log4j-api and log4j-core themselves depend on. The build succeeds in Maven, but when I deploy the EAR and JAR I get an error like this at runtime (while the plugin is trying to boot up):
Caused By: org.osgi.framework.BundleException: Unresolved constraint in bundle thisModule [2]: Unable to resolve 2.0: missing requirement [2.0] package; (package=com.conversantmedia.util.concurrent)
The error always names some specific package that log4j needs. What I DON'T want to do is list every single transitive dependency and their mother inside the Embed-Dependency tag.
What am I doing wrong here?
Also note: due to constraints, my only option here is to use Apache Felix and OSGi.
Below are other examples of modifications I made to the above POM and their resulting outputs:
Removing both log4j-api and log4j-core from Embed-Dependency and adding a <wrapImportPackage>;</wrapImportPackage> tag resulted in this output, which is extremely common and shows up whenever I try to import:
Caused By: org.osgi.framework.BundleException: Unresolved constraint in bundle thisModule [2]: Unable to resolve 2.0: missing requirement [2.0] package; (&(package=org.apache.logging.log4j)(version>=2.12.0)(!(version>=3.0.0)))
Adding * to Embed-Dependency as well as adding <Embed-Transitive>true</Embed-Transitive>:
Caused By: org.osgi.framework.BundleException: Unresolved constraint in bundle thisModule [2]: Unable to resolve 2.0: missing requirement [2.0] package; (package=android.dalvik)
Embedding a logging library is a bad idea. After all, you want to configure logging in a central place, which is very hard when each bundle embeds its own logging framework.
In most cases the safe bet is to simply keep the maven-bundle-plugin config empty and let it do its thing.
I personally always use slf4j for logging in OSGi. You simply depend on slf4j-api, and the maven-bundle-plugin creates the Import-Package statements automatically.
Then at runtime you simply deploy a logging framework that supports the logging API you want.
For Apache Karaf this is already the case by default. If you use bndtools or your own application assembly based on plain Felix, then check out my OSGi best practices example.
It shows how to use slf4j-api in your own bundles and how to configure the logging system in Karaf- and bndtools-based applications.
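For illustration, a minimal setup along those lines might look like the fragment below. The slf4j-api version is only an example; the plugin version is the one from the POM above.

<!-- dependencies: compile against the logging API only -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.30</version>
  <scope>provided</scope>
</dependency>

<!-- build/plugins: no Embed-Dependency at all; the plugin generates
     the Import-Package for org.slf4j automatically -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>3.5.1</version>
  <extensions>true</extensions>
</plugin>

At runtime a backend such as pax-logging then provides the actual implementation and the central configuration.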
Related
I'm trying to understand how to tell WildFly, through Maven, that certain libraries are needed.
I have a Maven project in eclipse-jee. When I call a JSP whose backing class stands alone, everything is OK. But when I call a JSP whose backing class uses a class from a library, I get a ClassNotFoundException. When I run that backing class locally instead of on the server, it works perfectly.
The library is added as a Maven dependency, which is fine as long as I stay local. But when I deploy the WAR to WildFly, the library isn't deployed with it.
Here's my POM:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<name>hello_neo4j</name>
<artifactId>myNeo4j</artifactId>
<dependencies>
<dependency>
<groupId>org.neo4j.driver</groupId>
<artifactId>neo4j-java-driver</artifactId>
<version>4.0.1</version>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>javax.servlet.jsp-api</artifactId>
<version>2.3.3</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>4.0.1</version>
</dependency>
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>2.0.SP1</version>
</dependency>
<dependency>
<groupId>org.reactivestreams</groupId>
<artifactId>reactive-streams</artifactId>
<version>1.0.3</version>
</dependency>
</dependencies>
<groupId>com.my-domain</groupId>
<version>0.0.1</version>
<packaging>war</packaging>
</project>
As soon as I try to instantiate org.neo4j.driver.Config, I get:
...
Caused by: java.lang.ClassNotFoundException: org.neo4j.driver.Config from [Module "deployment.hello_neo4j.war" from Service Module Loader]
at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:255)
at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:410)
at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:398)
at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:116)
... 59 more
So, how can I tell Eclipse that it should either package the library in the WAR or deploy it along with the WAR?
I guess somewhere in the POM there should be something saying that the dependency is to be deployed, but I don't know how.
In the effective POM, I see <scope>compile</scope>, and I thought that would mean the dependency gets packaged into the WAR. But obviously I need something more, but what?
Thanks in advance for your helpful comments!
Before posting the question, I had searched for quite a while... But now, shortly after posting it, I found the answer: just one more dependency is needed to enable the server to resolve the dependencies from the deployed POM:
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<version>3.0.0-M1</version>
<type>maven-plugin</type>
<scope>runtime</scope>
</dependency>
Hope this helps others spend less time searching than I did.
I'm trying to build a Blueprint bundle to run in Apache Felix. I tried to get it running but didn't succeed.
The Blueprint bundle works fine in Karaf but not in Felix. Is there any documentation or a running example on the web that explains how to run a Blueprint bundle with plain Felix only? I suppose I have to manually add Aries to the Felix platform, but that didn't seem to work.
To be more precise, I want a simple service just to see that it is loaded from a blueprint.xml config file as a Blueprint bundle. The service may have only one dummy method or even just a constructor with a println in it. I want to reference that service class in OSGI-INF/blueprint/blueprint.xml so it gets loaded when Felix loads the Blueprint bundle.
After spending some time on this problem I found the solution. You need the following bundles installed in your Felix (tested with v4.4.1) to get Aries Blueprint running:
org.apache.aries.blueprint : org.apache.aries.blueprint : 1.1.0
org.apache.aries : org.apache.aries.util : 1.1.0
org.apache.aries.proxy : org.apache.aries.proxy : 1.0.1
org.apache.felix : org.apache.felix.configadmin : 1.8.0
plus one implementation of SLF4J (in this case PAX Logging):
org.ops4j.pax.logging : pax-logging-api : 1.4
org.ops4j.pax.logging : pax-logging-service : 1.4 (you may exclude log4j : log4j because it is not needed)
These JARs enable Aries Blueprint in Felix (but only the XML configuration variant). If you want to use annotations, you also have to add the annotation-related JARs.
Here is a POM to ease your work. Just build it and all the JARs that need to be installed in Felix will end up in your target folder.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.aries</groupId>
<artifactId>blueprint-felix-assembly</artifactId>
<version>1.0-SNAPSHOT</version>
<name>Blueprint Felix Jar Assembly</name>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<pax.logging.version>1.4</pax.logging.version>
<aries.version>1.1.0</aries.version>
<aries.proxy.version>1.0.1</aries.proxy.version>
<felix.config.admin.version>1.8.0</felix.config.admin.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.felix</groupId>
<artifactId>org.apache.felix.configadmin</artifactId>
<version>${felix.config.admin.version}</version>
</dependency>
<dependency>
<groupId>org.ops4j.pax.logging</groupId>
<artifactId>pax-logging-api</artifactId>
<version>${pax.logging.version}</version>
</dependency>
<dependency>
<groupId>org.ops4j.pax.logging</groupId>
<artifactId>pax-logging-service</artifactId>
<version>${pax.logging.version}</version>
<exclusions>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.aries.blueprint</groupId>
<artifactId>org.apache.aries.blueprint</artifactId>
<version>${aries.version}</version>
</dependency>
<dependency>
<groupId>org.apache.aries</groupId>
<artifactId>org.apache.aries.util</artifactId>
<version>${aries.version}</version>
</dependency>
<dependency>
<groupId>org.apache.aries.proxy</groupId>
<artifactId>org.apache.aries.proxy</artifactId>
<version>${aries.proxy.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<excludeTransitive>true</excludeTransitive>
<outputDirectory>${project.build.directory}</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
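For reference, a minimal OSGI-INF/blueprint/blueprint.xml of the kind described in the question might look like this; the interface and class names are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- instantiate the dummy service class (its constructor can just println) -->
  <bean id="helloService" class="com.example.HelloServiceImpl"/>
  <!-- optionally publish it in the OSGi service registry -->
  <service ref="helloService" interface="com.example.HelloService"/>
</blueprint>

Once the bundles listed above are active, the Aries Blueprint extender picks this file up automatically when your bundle starts.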
Aries runs very well on Apache Felix; it does not require Apache Karaf. In fact, we use plain Equinox for our integration tests.
You can take a look at the integration test base class to see which bundles you need.
I have the following entries in my pom.xml:
<dependency>
<groupId>org.apache.mina</groupId>
<artifactId>mina-core</artifactId>
<version>2.0.4</version>
</dependency>
<dependency>
<groupId>org.apache.mina</groupId>
<artifactId>mina-filter-compression</artifactId>
<version>2.0.7</version>
</dependency>
I am getting a "Missing artifact org.apache.mina:mina-core:bundle:2.0.7" error in my pom.xml.
Could someone please help me resolve this error?
Add this to your POM file:
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
</plugin>
</plugins>
In addition to the accepted answer, an explanation of why this is necessary:
The various MINA dependencies rely on OSGi bundle artifacts rather than standard JAR files.
As such, it's necessary to add support for these bundles to Maven using the Apache Felix maven-bundle-plugin.
See https://stackoverflow.com/a/5409602 for a good explanation of OSGi bundles, with links to more info.
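For completeness, the snippet from the accepted answer goes inside the <build> section of the POM; pinning a plugin version is also a good idea (the version below is only an example):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <version>3.5.1</version>
      <!-- extensions=true registers the "bundle" packaging/artifact type
           with Maven, which is what lets it resolve bundle dependencies -->
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>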
I am trying to implement a Camel route that sends files to an HDFS server from an OSGi bundle, written in Java with no Blueprint, but I can't make it work because the hdfs scheme is not found while creating the route.
The code of the class has been tested as a plain JAR and works. The issue is in Karaf, which does not seem to be able to use camel-hdfs for the bundle, even though the camel-hdfs bundle shows up in the list command.
Here's the POM file of the project:
<modelVersion>4.0.0</modelVersion>
<groupId>the.group</groupId>
<artifactId>receiveFile</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>receiveFile</name>
<packaging>bundle</packaging>
<dependencies>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi_R4_core</artifactId>
<version>1.0</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-hdfs</artifactId>
<version>2.10.4</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>2.10.4</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.7</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Import-Package>*</Import-Package>
<Export-Package>activation</Export-Package>
<Private-Package>activation</Private-Package>
<Bundle-Activator>activation.Activator</Bundle-Activator>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
I also tried Embed-Dependency with the transitive option, but it still doesn't work and I'm kind of stuck right now.
Karaf prints the following error:
org.osgi.framework.BundleException: Exception in activation.Activator.start() of bundle group.receiveFile.
The log:
org.osgi.framework.BundleException: Exception in activation.Activator.start() of bundle group.receiveFile.
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:806)[osgi-3.6.2.R36x_v20110210.jar:]
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:755)[osgi-3.6.2.R36x_v20110210.jar:]
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:370)[osgi-3.6.2.R36x_v20110210.jar:]
at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:284)[osgi-3.6.2.R36x_v20110210.jar:]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1244)[6:org.apache.felix.fileinstall:3.2.4]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1216)[6:org.apache.felix.fileinstall:3.2.4]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startAllBundles(DirectoryWatcher.java:1205)[6:org.apache.felix.fileinstall:3.2.4]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:500)[6:org.apache.felix.fileinstall:3.2.4]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:291)[6:org.apache.felix.fileinstall:3.2.4]
Caused by: org.apache.camel.FailedToCreateRouteException: Failed to create route route1027 at: >>> To[hdfs://hadoopServer/received] <<< in route: Route[[From[file://toSend/]] -> [To[hdfs://hadoopServer/rece... because of Failed to resolve endpoint: hdfs://hadoopServer/received due to: No component found with scheme: hdfs
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:879)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:172)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:722)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:1789)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1575)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1444)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:60)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1412)
at activation.Activator.start(Activator.java:24)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:783)[osgi-3.6.2.R36x_v20110210.jar:]
at java.security.AccessController.doPrivileged(Native Method)[:1.7.0_11]
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:774)[osgi-3.6.2.R36x_v20110210.jar:]
... 8 more
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: hdfs://hadoopServer/received due to: No component found with scheme: hdfs
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:485)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:50)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:187)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:108)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:114)
at org.apache.camel.model.SendDefinition.resolveEndpoint(SendDefinition.java:61)
at org.apache.camel.model.SendDefinition.createProcessor(SendDefinition.java:55)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:461)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:179)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:876)
... 19 more
Karaf version: 2.2.8
Maven: m2e plugin, 3.0.4
Thanks in advance.
I would suggest using a Blueprint XML file to bootstrap your application, just to set up a <camelContext>.
Otherwise you would have to do a lot of manual OSGi setup and whatnot in your activator to properly set up Camel for OSGi. All of that is taken care of when using camel-blueprint.
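As a rough sketch (the endpoint URIs are taken from the stack trace above, everything else is illustrative), the Blueprint file could be as small as this:

<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <!-- camel-blueprint creates and registers the CamelContext,
       so no Activator code is needed -->
  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route>
      <from uri="file://toSend/"/>
      <to uri="hdfs://hadoopServer/received"/>
    </route>
  </camelContext>
</blueprint>

With camel-blueprint and camel-hdfs installed, the OSGi-aware CamelContext can then look up the hdfs component from the service registry instead of failing with "No component found with scheme: hdfs".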
If you still want to do it from an Activator, then check out camel-core-osgi and see what we do there.
I'm sorry in advance that I can't give the exact code, as I don't have it at hand right now. However, I ran into a problem while trying to package my project into a JAR. I used the maven-assembly-plugin in my pom.xml to assemble all dependencies and the project JAR in one place. But now I need all of those dependency JARs to have a custom manifest file. Is it possible to inject some properties with Maven itself somehow? The only solution I have come up with so far is to use Maven's shade plugin and create an uber-jar, but the problem is that some dependencies have custom manifests (like the Spring Framework ones) which get lost, and only one manifest is generated for the uber-jar. Is it possible to somehow tell Maven to unpack the dependencies, edit their manifest entries, pack them up again, and assemble them together with the project JAR in a zip?
Long story short: basically what I want to find out is whether it is possible to modify a file inside one dependency JAR, or inside all dependency JARs at once. Let's say my project depends on spring-beans. I would like to modify a specific file inside spring-beans.jar, namely META-INF/MANIFEST.MF, before I assemble everything into one zip that contains project.jar and spring-beans.jar (with the modified manifest). I think something similar is achievable with the maven-antrun-plugin?
Example:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany.app</groupId>
<artifactId>project</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Project</name>
<dependencies>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<configuration>
<descriptors>
<descriptor>src/main/assembly/src.xml</descriptor>
</descriptors>
</configuration>
</plugin>
</plugins>
</build>
</project>
I want to modify the MANIFEST.MF in spring-core.jar, spring-context.jar and spring-beans.jar. I know I can use the shade plugin to make an uber-jar which would have a single manifest I could edit within shade's configuration, but if it is possible to modify specific dependency JARs on their own, I think that would be more foolproof and I could reuse these libraries among several applets.
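For what it's worth, the unpack/edit/repack idea could be sketched with the maven-antrun-plugin, since Ant's <jar> task can update an existing archive in place (how the new entries are merged with the existing manifest is worth double-checking in the Ant documentation). Everything below is a placeholder: it assumes the dependency JARs were first copied to target/dependency, e.g. with maven-dependency-plugin's copy-dependencies goal (optionally with stripVersion so the file names are predictable), before the assembly picks them up.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>patch-manifests</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- update="true" modifies the copied JAR instead of recreating it;
               the attribute below is added to its META-INF/MANIFEST.MF -->
          <jar destfile="${project.build.directory}/dependency/spring-beans.jar" update="true">
            <manifest>
              <attribute name="Custom-Entry" value="custom-value"/>
            </manifest>
          </jar>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>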