JRuby & OSGi - how to make them work together?

I want to create a simple OSGi bundle that runs a Ruby source file, so I am using jruby-complete. Here is a code example.
A bundle which runs a JRuby file:
package activator;

import org.jruby.embed.ScriptingContainer;

public class Main {
    public void runRubySource(String[] args) {
        try {
            System.out.println("JRUBYYYYYYYYYYYYYYYYYYYYYYYYy");
            ScriptingContainer container = new ScriptingContainer();
            container.setArgv(args);
            container.runScriptlet("require 'ruby/test.rb'");
        } catch (Exception ex) {
            System.out.println(ex.getMessage());
        }
    }
}
A bundle which uses the above bundle:
package activator;

import org.jruby.embed.ScriptingContainer;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Test implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        Main m = new Main();
        String[] args = {"-c", "C:\\fileconfig.conf"};
        m.runRubySource(args);
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        // nothing to do
    }
}
POM file for the OSGi bundle, built using Maven:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.insight</groupId>
    <artifactId>jruby</artifactId>
    <packaging>bundle</packaging>
    <name>JrubyDemo</name>
    <version>1.0</version>
    <dependencies>
        <dependency>
            <groupId>org.jruby</groupId>
            <artifactId>jruby-complete</artifactId>
            <version>1.7.10</version>
        </dependency>
        <dependency>
            <groupId>org.osgi</groupId>
            <artifactId>org.osgi.core</artifactId>
            <version>4.3.0</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <version>2.0.1</version>
                <extensions>true</extensions>
                <configuration>
                    <Embed-Transitive>true</Embed-Transitive>
                    <Export-Package>*</Export-Package>
                    <instructions>
                        <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
                    </instructions>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Steps:
Start Felix with OSGi
Start jruby-complete (this jar file is wrapped using pax-wrap to make it an OSGi bundle: https://ops4j1.jira.com/wiki/display/paxurl/Wrap+Protocol)
Start my bundle
Now it raises an error:
(LoadError) no such file to load -- jruby/jruby.rb
Certainly, jruby/jruby.rb is contained in jruby-complete.jar, not in my example bundle.
So, what do I have to do?

(LoadError) no such file to load -- jruby/jruby.rb
I think your bundle is probably corrupt.
BTW, jruby-complete is already an OSGi bundle, so try the same without wrapping: it should definitely be able to load its own classes.
However, for the next step:
container.runScriptlet("require 'ruby/test.rb'");
You're asking a class in the JRuby bundle to load a resource from another bundle.
The problem is that jruby-complete does not know about your bundle, as it's not wired to it via the normal OSGi mechanisms.
So you need some form of reverse-lookup mechanism which lets the JRuby bundle locate resources/classes in other bundles, without adding a direct dependency (Require-Bundle or Import-Package) to JRuby's bundle (that would not scale if you then want to load from other bundles, or reuse JRuby in other contexts).
I'm using Eclipse Equinox for a similar setup, so I'm "spoiled" with nasty treats like Buddy Policy. Apart from being specific to that container, it has its own disadvantages, but it's been good enough for me.
Currently the one generic OSGi equivalent of BuddyPolicy=Global seems to be DynamicImport-Package; however, it's only there as a last resort, as it is less flexible than the above.
Both of the above involve adding a line to the JRuby bundle's manifest (again jruby-complete.jar, but I happen to repackage the whole thing as org.jruby).
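For illustration, the two manifest headers discussed would look like this in the (repackaged) JRuby bundle's META-INF/MANIFEST.MF. They are alternatives, not meant to be combined: Eclipse-BuddyPolicy is Equinox-specific, while DynamicImport-Package is standard OSGi.

```
Eclipse-BuddyPolicy: global
DynamicImport-Package: *
```

Both widen JRuby's class visibility at the cost of bypassing normal wiring, which is why they are last resorts.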
A better solution is probably JRuby's own OSGiScriptingContainer, where you can pass the loading bundle into your class, something like this:
package activator;

import org.jruby.embed.ScriptingContainer;
import org.jruby.embed.osgi.OSGiScriptingContainer;

public class Main {
    public void runRubySource(String[] args) {
        try {
            System.out.println("JRUBYYYYYYYYYYYYYYYYYYYYYYYYy");
            ScriptingContainer container = new OSGiScriptingContainer(Activator.getBundle());
            container.setArgv(args);
            container.runScriptlet("require 'ruby/test.rb'");
        } catch (Exception ex) {
            System.out.println(ex.getMessage());
        }
    }
}
I haven't tried this myself, but I'm going to change my setup, as this seems to be the right way.

Scripting in general is a little different in OSGi: scripting happens at runtime, while OSGi wants you to specify things at build time (pulled this from the linked thread). For scripting with JRuby, you might be better off using JSR-223 to load your engine, as there has been work on getting these two slightly incompatible techniques to cooperate.
Here is a similar thread about working with JSR-223 in OSGi, with some solutions:
Is OSGi fundamentally incompatible with JSR-223 Scripting Language Discovery?
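To sketch the JSR-223 route: the engine is discovered by name through the standard javax.script API, so the lookup degrades gracefully when JRuby's service provider isn't visible, which is exactly the discovery problem the linked thread discusses. The class names below are standard; whether discovery succeeds inside a given OSGi container is the open question.

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class Jsr223Lookup {
    // Look up a scripting engine by name; returns null when the engine's
    // service provider is not visible on the current class path.
    static ScriptEngine findEngine(String name) {
        return new ScriptEngineManager().getEngineByName(name);
    }

    public static void main(String[] args) throws ScriptException {
        ScriptEngine jruby = findEngine("jruby");
        if (jruby == null) {
            // Typical outcome inside OSGi when jruby-complete is not wired in.
            System.out.println("JRuby engine not visible on this class path");
        } else {
            jruby.eval("require 'ruby/test.rb'");
        }
    }
}
```

The null check is the point: unlike a direct `new ScriptingContainer()`, a failed JSR-223 lookup is detectable before any Ruby code runs.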

Related

Glue code is not loaded when running with the cucumber-spring back-end from a jar file

I have been trying to get Spring-based Cucumber tests to run using a combination of JUnit (4.12), Cucumber-Java (4.1.1), Cucumber-Spring (4.1.1) and Cucumber-JUnit (4.1.1).
I have no issues loading glue code when running the tests from inside the IDE (IntelliJ 2018.3.4), but it seems that for some reason, when I run from a compiled jar file (which is a requirement in this case), Cucumber doesn't find the step definitions.
I've already tried multiple glue code formats such as:
"classpath:com.a.b.c.stepdefs"
"com.a.b.c.stepdefs"
"classpath:com/a/b/c/stepdefs"
I've also tried providing relative paths from the runner class up to the step definitions class (nested just one level below)
"stepdefs"
I also tried running with both JUnit and cucumber.cli.Main, and attempted different styles of step definitions (both cucumber expressions, which the missing-step snippets point me to, and regexes).
I am using the spring-boot-maven-plugin, so I am aware that it generally changes the jar structure.
All of the above variations fully work when running from the IDE, but not from the jar file.
Main Class:
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
@ComponentScan(basePackages = {"com.a.b.test.core.data",
        "com.a.b.c",
        "com.a.b.c.stepdefs"}
)
public class CucumberApplication {
    public static void main(String[] args) throws IOException, InterruptedException {
        SpringApplication.run(CucumberApplication.class, args);
        Result result = JUnitCore.runClasses(RunnerCentral.class);
        System.exit(result.wasSuccessful() ? 0 : 1);
    }
}
Runner Class:
package com.a.b.c;

@RunWith(Cucumber.class)
@CucumberOptions(features = "classpath:BOOT-INF/classes/features",
        glue = "classpath:com/a/b/c/stepdefs",
        plugin = "json:target/cucumber-html-reports/cucumber.json")
public class RunnerCentral {
}
POM config of spring-boot-maven-plugin:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <version>2.1.0.RELEASE</version>
    <configuration>
        <fork>true</fork>
        <mainClass>${start-class}</mainClass>
        <requiresUnpack>
            <dependency>
                <groupId>io.cucumber</groupId>
                <artifactId>cucumber-java</artifactId>
            </dependency>
            <dependency>
                <groupId>io.cucumber</groupId>
                <artifactId>cucumber-spring</artifactId>
            </dependency>
            <dependency>
                <groupId>io.cucumber</groupId>
                <artifactId>cucumber-junit</artifactId>
            </dependency>
        </requiresUnpack>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>repackage</goal>
            </goals>
        </execution>
    </executions>
</plugin>
I am expecting the behavior to be consistent between running from the IDE and running from a packaged source, although I may be missing something.
Another thing I want to mention: when swapping the back-end with cucumber-picocontainer, everything seems to work (Spring is a requirement, so a swap isn't possible).
This is the kind of issue that can have you launching your hot coffee at the nearest colleague.
Have you seen this post about using a custom ResourceLoader? https://github.com/cucumber/cucumber-jvm/issues/1320
I think you'd have to copy and paste the Cucumber.java class, providing the resource loader to the runtime from the application context, and change your RunnerCentral class to run with the new class.
FWIW, in my case I placed the raw project in a Docker container that on startup ran ./mvnw test, the Maven wrapper supplied in Spring Boot projects. You can do ./mvnw test -s /path/to/maven/settings.xml if using a corporate repository. If your container host can't access the corporate repository, run the image first on the Jenkins box (or wherever the image is being built), which will cause the dependency jars to be downloaded inside, then commit the Docker image and push that image out.
That way, the container can run the cucumber test phase using the local .m2 directory inside it, with the dependencies it needs already there.

vertx 3.5.1 missing classes

I started to look into developing with Vert.x, and I stumbled into problems with some classes that couldn't be resolved. I am posting a simple example.
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>demo.rabbit</groupId>
    <artifactId>rabbitmq-client</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>io.vertx</groupId>
            <artifactId>vertx-core</artifactId>
            <version>3.5.1</version>
        </dependency>
    </dependencies>
</project>
java code
import io.vertx.core.AbstractVerticle;
import io.vertx.core.AsyncResult;
import io.vertx.core.json.JsonObject;

public class RabbitMQVerticle extends AbstractVerticle {

    @Override
    public void start() throws Exception {
        AsyncResult ar;
        JsonObject jo;
    }
}
If I leave it like this, the compiler cannot resolve the AsyncResult and JsonObject imports, and thus cannot resolve both types.
In the external libraries view, those classes appear as part of the io.vertx.core library but the icon next to them indicates that they are missing from the library.
If I change the vertx-core version to 3.5.0 in the POM file, everything works great; switch back to 3.5.1 and nothing works again.
It's also my first time using Maven, so what am I missing?
I couldn't find any useful information anywhere on the web.
The mentioned classes are part of the core Vert.x library. Core classes never get deleted in mature libraries.
Here is the AsyncResult class, for example, under both versions:
AsyncResult under version 3.5.0
AsyncResult under version 3.5.1
Indeed, I think that even after changing the library version your project still compiles (using the command line or IntelliJ IDEA); you are facing a UI highlighting issue with your IDE.
You can try to:
Re-import all Maven modules using the Maven Projects Tool Window
Clean the system caches and restart the IDE
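If the artifact in the local repository really is corrupt (an interrupted download can produce exactly this "class is in the library but flagged missing" symptom), a cheap extra check is to purge and re-resolve it. This sketch uses the maven-dependency-plugin's purge-local-repository goal; the manualInclude parameter limits the purge to the one artifact.

```
mvn dependency:purge-local-repository -DmanualInclude=io.vertx:vertx-core
mvn -U clean compile
```

If the project then compiles on the command line but the IDE still flags the imports, that confirms it is an IDE indexing issue rather than a Maven one.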

Run a Maven plugin when the build fails

I am using a plugin to send a Slack message through Maven. I am wondering if it's possible to run a plugin when the build fails, so that I get automatically notified about the failed build.
You could do that within Maven itself, through the EventSpy mechanism, built in since Maven 3.0.2. At each step of the build, several events are raised by Maven itself, or by custom code, and it is possible to listen to those events to perform some actions. The execution events raised by Maven are represented by the ExecutionEvent class. Each event has a type that describes what kind of event it represents: project failure, Mojo failure, project skipped, etc. In this case, the project failure event is what you're looking for.
A custom spy on events is just a Java class that implements the EventSpy interface. Preferably, it should inherit from the AbstractEventSpy helper class. As an example, create a new project (let's call it my-spy), and add the following Java class under a package:
import org.apache.maven.eventspy.AbstractEventSpy;
import org.apache.maven.eventspy.EventSpy;
import org.apache.maven.execution.ExecutionEvent;
import org.codehaus.plexus.component.annotations.Component;
import org.codehaus.plexus.component.annotations.Requirement;
import org.codehaus.plexus.logging.Logger;

@Component(role = EventSpy.class)
public class BuildFailureEventSpy extends AbstractEventSpy {

    @Requirement
    private Logger logger;

    @Override
    public void onEvent(Object event) throws Exception {
        if (event instanceof ExecutionEvent) {
            ExecutionEvent executionEvent = (ExecutionEvent) event;
            if (executionEvent.getType() == ExecutionEvent.Type.ProjectFailed) {
                logger.info("My spy detected a build failure, do the necessary here!");
            }
        }
    }
}
This code simply registers the spy through Plexus' @Component annotation and logs a message when a project fails to build. To compile that class, you just need to add to the my-spy project a dependency on Maven core, and an execution of the plexus-component-metadata plugin to create the right Plexus metadata for the component.
<dependencies>
    <dependency>
        <groupId>org.apache.maven</groupId>
        <artifactId>maven-core</artifactId>
        <version>3.0.2</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.plexus</groupId>
            <artifactId>plexus-component-metadata</artifactId>
            <version>1.6</version>
            <executions>
                <execution>
                    <goals>
                        <goal>generate-metadata</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
Once this project is compiled and installed into your local repository (through mvn clean install), you can add it to the build of another project through the core extensions mechanism.
Before Maven 3.3.1, you had to drop the my-spy JAR into your ${MAVEN_HOME}/lib/ext folder, so that Maven could find it. As of 3.3.1, you don't need to fiddle with your Maven installation, and can create a file .mvn/extensions.xml in your project base directory (${maven.multiModuleProjectDirectory}/.mvn/extensions.xml). Its content would be
<?xml version="1.0" encoding="UTF-8"?>
<extensions>
<extension>
<groupId>my.spy</groupId>
<artifactId>my-spy</artifactId>
<version>0.0.1</version>
</extension>
</extensions>
which just declares an extension pointing to the Maven coordinates of the spy project. Maven (≥ 3.3.1) will look for that file by default, and, as such, your spy will be correctly registered and invoked throughout the build.
The only remaining thing to do is to code what the spy should do. In your case, it should invoke a Maven plugin, so take a look at the Mojo Executor library, which makes that very easy to do.

Find classes that implement interfaces or are subclasses/superclasses in the Maven CLASSPATH?

VisualVM OQL queries can't query for interfaces because the current heap dump format doesn't preserve this info.
To work around this issue, it is possible to find the classes that implement an interface and then perform the heap dump analysis.
I have an application managed by Maven. During the build, Maven knows the full application CLASSPATH.
Is it possible to query via the mvn command which classes in which packages implement a selected interface?
Or even more: to find the classes and packages in the application build CLASSPATH which are subclasses or superclasses of a selected class?
Does a plug-in exist that suits my needs?
UPDATE Interesting suggestion to use an IDE to get the list of known implementations.
I work with Emacs and NetBeans. NetBeans has a limited ability (Find Usages dialog via Alt+F7) to find known implementations, but its scope is limited to open projects. For example, I looked for org.hibernate.cfg.NamingStrategy implementations and NetBeans didn't help in my case.
Because I need that list for further scripting, GUI tools are not relevant unless they provide clean text export.
If you really need to achieve this via Maven or scripting, here is how I got it working.
Based on the approach suggested in another answer on Stack Overflow, I implemented the following simple class:
package com.sample;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Scanner;

import org.clapper.util.classutil.ClassFilter;
import org.clapper.util.classutil.ClassFinder;
import org.clapper.util.classutil.ClassInfo;

public class MainScan {

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.out.println("Missing options");
            System.exit(-1);
        }
        System.out.println("Filtering by: " + args[1]);
        ClassFinder finder = new ClassFinder();
        finder.addClassPath();
        loadClasspath(finder, args[0]);
        ClassFilter filter = new ImplementInterfaceFilter(args[1]);
        // you could also use as a filter: new SubclassClassFilter(AbstractFileFilter.class);
        // or make a concatenation of filters using an AndClassFilter
        Collection<ClassInfo> foundClasses = new ArrayList<ClassInfo>();
        finder.findClasses(foundClasses, filter);
        if (foundClasses.size() > 0) {
            for (ClassInfo classInfo : foundClasses) {
                System.out.println("- " + classInfo.getClassName());
                // consider also using classInfo.getClassLocation() to get the
                // jar file providing it
            }
        } else {
            System.out.println("No matches found.");
        }
    }

    static void loadClasspath(ClassFinder finder, String file) throws IOException {
        Scanner s = new Scanner(new File(file));
        s.useDelimiter(File.pathSeparator);
        try {
            while (s.hasNext()) {
                finder.add(new File(s.next()));
            }
        } finally {
            s.close();
        }
    }

    static class ImplementInterfaceFilter implements ClassFilter {

        private String interfaceName;

        public ImplementInterfaceFilter(String name) {
            this.interfaceName = name;
        }

        public boolean accept(ClassInfo info, ClassFinder finder) {
            for (String i : info.getInterfaces()) {
                if (i.endsWith(this.interfaceName)) {
                    return true;
                }
            }
            return false;
        }
    }
}
Note: the class is located in the com.sample package, but it can obviously be moved to another package. The main method expects two options, a classpath file and an interface name; it adds the classpath to the classpath finder and scans it, looking for classes implementing the provided interface name (via the custom filter also provided above). Both options are provided at runtime by Maven, as follows.
I used this library for the classpath scanning; hence, as suggested on its official page, we need to add a custom repository to our POM:
<repositories>
    <repository>
        <releases>
            <enabled>true</enabled>
            <updatePolicy>always</updatePolicy>
            <checksumPolicy>warn</checksumPolicy>
        </releases>
        <id>clapper-org-maven-repo</id>
        <name>org.clapper Maven Repo</name>
        <url>http://maven.clapper.org/</url>
        <layout>default</layout>
    </repository>
</repositories>
And the required dependency:
<dependencies>
    ...
    <dependency>
        <groupId>org.clapper</groupId>
        <artifactId>javautil</artifactId>
        <version>3.1.2</version>
    </dependency>
    ...
</dependencies>
Then we just need to configure the following in our Maven build:
<build>
    <plugins>
        ...
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.1</version>
            <executions>
                <execution>
                    <phase>validate</phase>
                    <goals>
                        <goal>build-classpath</goal>
                    </goals>
                    <configuration>
                        <outputFile>${project.build.directory}/classpath.txt</outputFile>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.1</version>
            <executions>
                <execution>
                    <phase>validate</phase>
                    <goals>
                        <goal>java</goal>
                    </goals>
                    <configuration>
                        <mainClass>com.sample.MainScan</mainClass>
                        <arguments>
                            <argument>${project.build.directory}/classpath.txt</argument>
                            <argument>${interfaceName}</argument>
                        </arguments>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        ...
    </plugins>
</build>
We are basically configuring the Maven Dependency Plugin to write the full Maven build classpath to a file, then using the Exec Maven Plugin to execute our custom Java main, passing it the classpath file and a parameter, ${interfaceName}. Both plugin executions are bound to the validate phase: we don't need to execute the full Maven build; we just invoke one of its first phases for this task.
As such, we can invoke the Maven build as follows:
mvn validate -DinterfaceName=Serializable -q
And have an output like the following:
Filtering by: Serializable
- org.apache.commons.io.ByteOrderMark
- org.apache.commons.io.comparator.CompositeFileComparator
- org.apache.commons.io.comparator.DefaultFileComparator
...
The Maven command directly invokes the concerned phase, validate, using the -q (quiet) option to skip the Maven build log and just get the output interesting to us. Moreover, we can dynamically pass the interface we want via the -DinterfaceName=<value_here> option. It will pass the value to the Exec Maven Plugin and, as such, to the Java main above.
Depending on further needs (scripting, output, format, etc.), the Java main can easily be adapted. Moreover, the plugins, dependency, and repository configuration could be moved to a Maven profile to keep things cleaner and better organized.
One last note: if you change the package of the Java main above, do not forget to change the Exec Maven Plugin configuration accordingly (the mainClass element).
So, back to your questions:
Is it possible to query via the mvn command which classes in which packages implement a selected interface? Yes, by applying the approach above.
Or even more: to find the classes and packages in the application build CLASSPATH which are subclasses or superclasses of a selected class? Yes: look at the SubclassClassFilter from the same library, change the main above accordingly, and you will get there.
Does a plug-in exist that suits my needs? I couldn't find any, but the code above could easily be converted into a new Maven plugin. Otherwise, the approach described here is a mix of Java code and existing Maven plugin usage, which may suit your need anyway.
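The per-class test that such a scanner performs can be sketched with plain reflection, assuming you already have candidate class names (for example, read from the classpath.txt generated above). `Class.isAssignableFrom` covers both cases at once: implementing the interface and extending the class. The class and method names here are illustrative, not from the library.

```java
import java.util.ArrayList;
import java.util.List;

public class AssignabilityCheck {
    // Return the candidate class names that are assignable to the target type
    // (i.e. implement the interface or extend the class), excluding the target itself.
    static List<String> assignableTo(Class<?> target, List<String> candidateNames) {
        List<String> matches = new ArrayList<>();
        for (String name : candidateNames) {
            try {
                // 'false' avoids running static initializers during the scan.
                Class<?> c = Class.forName(name, false, AssignabilityCheck.class.getClassLoader());
                if (!c.equals(target) && target.isAssignableFrom(c)) {
                    matches.add(name);
                }
            } catch (ClassNotFoundException | LinkageError e) {
                // Not resolvable on this class path; skip it, as a scanner would.
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<String> names = List.of("java.lang.String", "java.lang.Object", "java.util.ArrayList");
        // String and ArrayList implement Serializable; Object does not.
        System.out.println(assignableTo(java.io.Serializable.class, names));
    }
}
```

Bytecode scanners like the javautil library avoid loading classes at all, which matters for large classpaths, but the matching logic is the same idea.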

How do I inject properties or environment variables into my Maven plugin test case?

I've created a Maven plugin using AbstractMojo and I'm trying to test it.
I'm using the maven-plugin-testing-harness to do the testing and I'm having problems with injecting values for my plugin parameters.
I have the following pom.xml file (src/test/resources/pom.xml) for testing:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.vodori.pepper.docker.vm.unit</groupId>
    <artifactId>test-pom</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>Test VMStarter</name>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>com.vodori.common</groupId>
                <artifactId>pepper-docker-vm</artifactId>
                <version>1.0.1-SNAPSHOT</version>
                <configuration>
                    <dockerSnapshotSite>pepper-demo-site</dockerSnapshotSite>
                    <dockerSnapshotVersion>3.6.11</dockerSnapshotVersion>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
My Mojo looks like this (at the top):
public abstract class VMPlugin extends AbstractMojo {

    /**
     * Docker location
     */
    @Parameter(required = true, property = "docker.path", defaultValue = "${env.DOCKER_LOCATION}")
    String dockerPath;

    public void setDockerPath(String dockerPath) {
        this.dockerPath = dockerPath;
    }

    /**
     * Docker VM Site name
     */
    @Parameter(required = true, property = "docker.snapshot.site")
    String dockerSnapshotSite;

    /**
     * Version of Docker snapshot
     */
    @Parameter(required = true, property = "docker.snapshot.version")
    String dockerSnapshotVersion;
I'm using the MojoRule approach for testing, and my setup method looks like this:
@Before
public void setUp() throws Exception {
    vmStarter = (VMStarter) rule.lookupMojo("start-docker-vm", "src/test/resources/pom.xml");
    assertNotNull(vmStarter);
}
I use the setter for some of my test cases (the ones that test bad Docker locations), but for my happy-path testing I want to rely on the DOCKER_LOCATION environment variable. However, for some reason, dockerPath is just showing up as null. It's as if the defaultValue is being ignored.
I've tried dumping System.getEnv() onto STDERR and I can see that DOCKER_LOCATION is indeed set.
What am I missing here? Why isn't my #Parameter getting populated correctly?
Where did you get the syntax defaultValue = "${env.DOCKER_LOCATION}" from?
env.* is a property, and "you can use Maven properties in a pom.xml file or in any resource that is being processed by the Maven Resources Plugin's filtering features."
default-value requires an expression.
The Guide to Developing Java Plugins, in its introduction, mentions: "(more can be found in the "Parameter Expressions" document)". But I haven't found such a document so far. Thanks @khmarbaise: org.apache.maven.plugin.PluginParameterExpressionEvaluator.
help:expressions doesn't show ${env} with my Maven 3.2.1.
Though it's my experience that at least some, if not all, of the linked docs are not up to date and omit the latest enhancements.
A possible explanation from a program-logic point of view: if a default value has to be set outside the scope of a program, it can't be considered a default value, in the sense of Maven's convention over configuration.
EDIT: Added link to Expressions API documentation.
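One pragmatic workaround is to resolve the environment fallback in the Mojo's own code rather than in the annotation, so the testing harness (which populates parameters from the test POM) and a real Maven run behave the same. This is a sketch; `resolveDockerPath` is a hypothetical helper, not part of the harness or the plugin API.

```java
public class EnvFallback {
    // Prefer the injected/configured value; otherwise fall back to the value
    // read from the environment (which may still be null if the variable is unset).
    static String resolveDockerPath(String configured, String envValue) {
        if (configured != null && !configured.trim().isEmpty()) {
            return configured;
        }
        return envValue;
    }

    public static void main(String[] args) {
        // In the Mojo's execute():
        // dockerPath = resolveDockerPath(dockerPath, System.getenv("DOCKER_LOCATION"));
        System.out.println(resolveDockerPath(null, "/usr/local/bin/docker"));
    }
}
```

This keeps the `@Parameter` for real Maven runs while making the fallback explicit and unit-testable.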
