How to start a bundle on boot in Apache Karaf - OSGi

I want to start Camel and ActiveMQ during boot when I start Karaf. What I've found is etc/org.apache.karaf.features.cfg, which lists the features that should be started during boot.
featuresBoot=config,ssh,management,camel,activemq,camel-jms,activemq-spring,activemq-camel
This works fine for all but 'activemq-camel' (fair enough, since it's not a feature).
To get the activemq-camel bundle installed I have to run the command:
karaf#...>osgi:install -s mvn:org.apache.activemq/activemq-camel/5.5.0
It works, but I'd rather just get it running at boot time.
Can I somehow get the activemq-camel bundle to be installed at boot time without creating a custom feature for it?

Apache ServiceMix has this feature already defined. To use it, add mvn:org.apache.servicemix/apache-servicemix/4.4.1/xml/features to the featuresRepositories property in etc/org.apache.karaf.features.cfg, and then add camel-activemq to the featuresBoot property.
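For example (just a sketch; keep whatever repository URLs are already listed in your installation, and the feature name here is the one from this answer, replacing the failing activemq-camel entry):
# etc/org.apache.karaf.features.cfg
featuresRepositories = <existing repositories>,mvn:org.apache.servicemix/apache-servicemix/4.4.1/xml/features
featuresBoot = config,ssh,management,camel,activemq,camel-jms,activemq-spring,camel-activemq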
Cheers,
Jon


Karaf + Camel + Maven + Docker

I am new to Karaf and Camel, and I'm trying to deploy custom Camel routes (Java). I'm facing a lot of problems when deploying the Camel bundle (.jar) in the hot-deploy directory.
What I got so far:
Apache Karaf 4.3.1 running in docker container
Bundle .jar with the java defined route
My idea is to have a /deploy directory mapped to the karaf container so any .jar that's added to that directory is deployed (or maybe build a new image for karaf).
When I tried to add my current bundle to the directory I got the following error message:
20:19:32.490 INFO [fileinstall-/opt/karaf/deploy] Installing bundle org.apache.karaf.examples.karaf-camel-example-java / 4.3.1
20:19:32.535 WARN [fileinstall-/opt/karaf/deploy] Error while starting bundle: file:/opt/karaf/deploy/karaf-camel-example-java-4.3.1.jar
org.osgi.framework.BundleException: Unable to resolve org.apache.karaf.examples.karaf-camel-example-java [111](R 111.0): missing requirement [org.apache.karaf.examples.karaf-camel-example-java [111](R 111.0)] osgi.wiring.package; (&(osgi.wiring.package=org.apache.camel)(version>=3.6.0)(!(version>=4.0.0))) Unresolved requirements: [[org.apache.karaf.examples.karaf-camel-example-java [111](R 111.0)] osgi.wiring.package; (&(osgi.wiring.package=org.apache.camel)(version>=3.6.0)(!(version>=4.0.0)))]
at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4368) ~[?:?]
at org.apache.felix.framework.Felix.startBundle(Felix.java:2281) ~[?:?]
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:998) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1260) [!/:3.6.8]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1233) [!/:3.6.8]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.doProcess(DirectoryWatcher.java:520) [!/:3.6.8]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:365) [!/:3.6.8]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:316) [!/:3.6.8]
I think this can be solved with a Maven bundle "wrap", but I'm not sure if this is correct, and if so, how should I wrap the bundle?
Thank you for reading :D
A bit late, but I hope this helps someone, as I've fiddled with this setup quite a bit for the past year while exploring OSGi, Karaf, Camel and Docker.
If you want to do local development with Karaf you can actually map your Maven repository to the container, which can make installing bundles and features quite a bit easier.
Example Docker compose for Karaf
Here's a docker-compose for Karaf 4.2.11, but you can probably change it to 4.3.1 without any problems (add :z to the volumes if you're using SELinux):
version: "2.4"
services:
karaf-runtime:
container_name: karaf
image: apache/karaf:4.2.11
ports:
- 8101:8101
- 8181:8181
- 1098:1098
volumes:
- ./karaf/etc:/opt/apache-karaf/etc
- ./karaf/deploy:/opt/apache-karaf/deploy
- karaf-data:/opt/apache-karaf/data
- ~/.m2:/root/.m2
- karaf-history:/root/.karaf
command: [ karaf, server ]
volumes:
karaf-data:
karaf-history:
Just save it to an empty folder somewhere as docker-compose.yml. Create a folder named karaf inside it and then fetch the default configuration from Karaf using a couple of docker commands:
# Start detached karaf container with name karaf
docker run --name karaf -d apache/karaf:4.2.11
# copy files from container to host-system
docker cp karaf:/opt/apache-karaf/etc ./karaf/
# stop the container
docker stop karaf
Setting Karaf's etc folder as a shared volume makes it easy to tweak the configuration and share it with other developers through version control.
To start Apache Karaf with Docker Compose you can use the following commands:
# Start
docker compose up -d
docker-compose up -d
# Stop
docker compose down
docker-compose down
# note: docker compose = newer version of docker-compose command
Creating bundles
An easy way to create bundles is to use one of the official archetypes, karaf-bundle-archetype or karaf-blueprint-archetype, when creating the project.
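For example, a new bundle project can be generated interactively; the coordinates below are the usual ones for the Karaf archetypes, but treat the version as an assumption and match it to your Karaf release:
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.karaf.archetypes \
  -DarchetypeArtifactId=karaf-bundle-archetype \
  -DarchetypeVersion=4.2.11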
For projects using Apache Camel it is generally easier to use karaf-blueprint-archetype. With it you configure the CamelContext in the XML blueprint file found in the project's resources/OSGI-INF/blueprint/ folder.
Example:
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
    <bean id="ExampleRoute" class="com.example.ExampleRouteBuilder" />
    <camelContext id="ExampleContext" xmlns="http://camel.apache.org/schema/blueprint">
        <routeBuilder ref="ExampleRoute" />
    </camelContext>
</blueprint>
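The ExampleRouteBuilder referenced above is a plain Camel RouteBuilder; a minimal sketch might look like this (the route itself is just an illustration, and the timer endpoint assumes the camel-timer component is available):
package com.example;

import org.apache.camel.builder.RouteBuilder;

public class ExampleRouteBuilder extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // fire every 5 seconds and write a line to the Karaf log
        from("timer:example?period=5000")
            .setBody(constant("Hello from ExampleRouteBuilder"))
            .log("${body}");
    }
}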
With the ~/.m2:/root/.m2 shared volume you can just package the project to your local Maven repository using mvn clean install, and then install the bundle in Karaf using bundle:install mvn:groupId/artifactId/version
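As a concrete sketch (the coordinates are just placeholders for your own project):
# on the host: build and install the bundle into the local ~/.m2 repository
mvn clean install
# in the Karaf console: install and start the bundle from the shared repository
bundle:install -s mvn:com.example/example-bundle/1.0.0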
If you want to use the deploy folder you can copy artifacts to the container using docker cp ./target/exampleBundle.jar karaf:/opt/apache-karaf/deploy
Adding the Camel feature to Karaf
As for Camel, you can follow the official guide on how to add the Camel feature repository and the features you need.
But the steps are basically:
# Add camel feature repo
feature:repo-add camel <version>
# Install camel feature
feature:install camel
# List available camel features for install
feature:list | grep camel
# Install camel features you need
feature:install <feature-name>
Missing requirements
When installing bundles you will often encounter a missing requirement exception, which tells you that a package the bundle depends on is missing from Karaf. This means you'll have to install a bundle or feature that exports the said package.
These messages are usually best read starting from the end:
(osgi.wiring.package=org.apache.camel)(version>=3.6.0)(!(version>=4.0.0))
The above tells you that the Karaf installation doesn't have Camel (any version >= 3.6.0 and < 4.0.0) installed. OSGi bundles expect the OSGi framework/runtime to satisfy their dependencies, which is quite a bit different from, say, standalone Spring Boot projects.
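The requirement itself comes from the bundle's own manifest; an Import-Package header along these lines (typically generated by the maven-bundle-plugin) is what produces that filter:
Import-Package: org.apache.camel;version="[3.6,4)"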
Shared volumes and new files
When it comes to sharing config files or the Karaf deploy folder, it's good to know that Docker has some issues related to new files in shared volumes. If a new file is added or created from the host's file system, there's a chance that Karaf will not detect the file or changes made to it.
It's generally better to use docker cp path/to/file/on/host karaf:/path/to/folder/on/container to deploy new files to the container even if it's a shared volume; otherwise you might have to shell into the container and make a copy of the file in question before Karaf picks it up.

Using Neo4j 3.1.5 with neo4j-elasticsearch 3.1.4

I am trying to link my Neo4j db with Elasticsearch using the recommended approach on the Neo4j website, with this GitHub repository https://github.com/neo4j-contrib/neo4j-elasticsearch
I have done all the steps that they say to do, but when I run it in terminal I get this error (everything works normally except that nothing is getting pushed to Elasticsearch):
Failed to load `org.apache.commons.logging.impl.AvalonLogger` from plugin jar `/Users/tkralj/Documents/Neo4j/default.graphdb/plugins/neo4j-elasticsearch-3.1.4.jar`: org/apache/avalon/framework/logger/Logger
2017-07-13 20:21:46.911+0000 WARN [o.n.k.i.p.Procedures] Failed to load `org.apache.commons.logging.impl.Log4JLogger` from plugin jar `/Users/tkralj/Documents/Neo4j/default.graphdb/plugins/neo4j-elasticsearch-3.1.4.jar`: org/apache/log4j/Category
2017-07-13 20:21:46.911+0000 WARN [o.n.k.i.p.Procedures] Failed to load `org.apache.commons.logging.impl.LogKitLogger` from plugin jar `/Users/tkralj/Documents/Neo4j/default.graphdb/plugins/neo4j-elasticsearch-3.1.4.jar`: org/apache/log/Logger
I am running Neo4j 3.1.5, while this plugin was created for 3.1.4 and I think that may be the issue; however, there is no plugin made for 3.1.5, and I cannot find a way to download the older version of Neo4j.
Looks like an old question, but still answering for future Googlers.
I was facing the same problem with the 3.1.x version. For me it works fine with version 3.2.

Jersey bug when starting Spring-boot app Unix style

A question has already been posted here at Spring-boot jersey maven failed to run war file, and its author has found a workaround to get the Spring-boot app to run, but it's still an annoying bug.
Basically, when running a Spring-boot app with a Jersey configuration that uses ResourceConfig's public final ResourceConfig packages(final String... packages) with the java -jar command, or Unix style ./app.jar start, the following exception occurs:
java.io.FileNotFoundException:/path/app-1.0-SNAPSHOT.war!/WEB-INF/classes (No such file or directory)
Caused by: org.glassfish.jersey.server.internal.scanning.ResourceFinderException:
This does not occur when the app is run with mvn spring-boot:run.
Not using packages("com.company.app.rest") is a workaround, but it's a pain not to be able to have Jersey scan a base package.
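For reference, the kind of configuration in question looks roughly like this (the class name is made up; the package is the one mentioned above):
import org.glassfish.jersey.server.ResourceConfig;
import org.springframework.stereotype.Component;

@Component
public class JerseyConfig extends ResourceConfig {

    public JerseyConfig() {
        // classpath scanning of a base package -- this is what fails
        // when the app runs from the packaged war/jar
        packages("com.company.app.rest");
    }
}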
Is this listed as a bug by the Spring-boot team?
I think it is but for some reason nobody complained hard enough. Can you please share your issue on #3413?

spring profile not activated on STS tc Server

I have a project with a Spring profile.
In my web.xml, I have
<context-param>
    <param-name>spring.profiles.default</param-name>
    <param-value>dev</param-value>
</context-param>
to set the default spring profile.
I build the maven project with
clean install -Dspring.profiles.active="prod"
Then, I choose the option Run As -> Run on Server to deploy the maven project to tc Server.
However, the active profile is still dev.
What is the correct way to activate a Spring profile on tc Server?
If you are running your app from within STS and the tc Server that comes with it, you can put the system property definition into the launch configuration of tc Server. Once you have started tc Server once, you can modify the launch config via "Run Configurations...", select the one for Pivotal tc Server, go to the VM arguments and add the -Dspring.profiles.active=prod setting.
Since this is a runtime option, it doesn't have any effect while building the app via Maven (the clean install way you tried).
Configuring Tomcat
There are two common options:
defining a context param in web.xml – that breaks the "one package for all environments" statement, so I don't recommend it
defining the system property -Dspring.profiles.active=your-active-profile
I believe that defining a system property is the much better approach. So how do you define a system property for Tomcat? Over the internet you can find a lot of advice like "modify catalina.sh", because you will not find any configuration file for doing stuff like that. Modifying catalina.sh is a dirty, unmaintainable solution. There is a better way to do that.
Just create a file setenv.sh in Tomcat's bin directory with the content:
JAVA_OPTS="$JAVA_OPTS -Dspring.profiles.active=prod"
and it will be loaded automatically when running catalina.sh start or run.
I don't know why the JAVA_OPTS method didn't work for me.
What does work is adding spring.profiles.active=dev in /conf/catalina.properties

Unable to auto-deploy bundle to Karaf

I am developing an OSGi-based application, which is deployed to a Karaf container. Karaf has an auto-deployment feature, whereby copying a bundle to its karaf/deploy directory should automatically deploy that bundle into the container. More often than not, however, I am getting errors similar to the one below when I copy bundles into the deploy directory:
org.osgi.framework.BundleException: Bundle symbolic name and version are not unique: legacy-services-impl:8.0.0.ALPHA-SPRINT9-SNAPSHOT
at org.apache.felix.framework.BundleImpl.createRevision(BundleImpl.java:1225)
at org.apache.felix.framework.BundleImpl.<init>(BundleImpl.java:95)
at org.apache.felix.framework.Felix.installBundle(Felix.java:2979)
at org.apache.felix.framework.BundleContextImpl.installBundle(BundleContextImpl.java:165)
at org.apache.felix.fileinstall.internal.DirectoryWatcher.installOrUpdateBundle(DirectoryWatcher.java:1030)[6:org.apache.felix.fileinstall:3.3.11.fuse-71-047]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.install(DirectoryWatcher.java:944)[6:org.apache.felix.fileinstall:3.3.11.fuse-71-047]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.install(DirectoryWatcher.java:857)[6:org.apache.felix.fileinstall:3.3.11.fuse-71-047]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:483)[6:org.apache.felix.fileinstall:3.3.11.fuse-71-047]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:291)[6:org.apache.felix.fileinstall:3.3.11.fuse-71-047]
Instead of redeploying an already deployed bundle, the container tells me that I am trying to deploy a duplicate bundle.
Karaf does indeed have that bundle deployed, but why wouldn't it redeploy the bundle? What is causing this behavior? How can I avoid such errors on auto-deploy?
Thank you,
Michael
I suspect that your bundle does not stop correctly. That may be the reason why Karaf thinks it is still there. Do you have some code in your activator that is executed when stopping? Perhaps you are also running some threads. You should make sure the stop method of your activator works and cleanly closes all resources and stops all threads of your bundle.
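A minimal sketch of what a clean shutdown in an activator could look like (the executor is just an example of a background resource that must be stopped):
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    private ExecutorService executor;

    @Override
    public void start(BundleContext context) {
        executor = Executors.newSingleThreadExecutor();
        // start background work here
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        // stop background threads so the old bundle can be removed cleanly
        executor.shutdownNow();
        executor.awaitTermination(5, TimeUnit.SECONDS);
    }
}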
