com.oracle.truffle.polyglot.PolyglotImpl (in unnamed module) cannot access class org.graalvm.polyglot.impl.AbstractPolyglotImpl - quarkus

I tried to extend the scaffolded Quarkus demo, https://code.quarkus.io/, with polyglot code for GraalVM:
@GET
@Produces(MediaType.TEXT_PLAIN)
public String hello() {
    String out = "From JS:";
    try (Context context = Context.create()) {
        Value function = context.eval("js", "x => x+1");
        assert function.canExecute();
        int x = function.execute(41).asInt();
        out = out + x;
        System.out.println(out);
    }
    return "hello";
}
I added dependencies to my pom.xml as suggested here: https://stackoverflow.com/questions/54384499/illegalstateexception-no-language-and-polyglot-implementation-was-found-on-the
<dependency>
    <groupId>org.graalvm.js</groupId>
    <artifactId>js</artifactId>
    <version>20.1.0</version>
</dependency>
<dependency>
    <groupId>org.graalvm.js</groupId>
    <artifactId>js-scriptengine</artifactId>
    <version>20.1.0</version>
</dependency>
<dependency>
    <groupId>org.graalvm.truffle</groupId>
    <artifactId>truffle-api</artifactId>
    <version>20.1.0</version>
</dependency>
But when I run
./mvnw clean package
on the command line, the test fails with an exception which I do not understand:
2020-06-22 19:26:56,328 ERROR [io.qua.ver.htt.run.QuarkusErrorHandler]
(executor-thread-1) HTTP Request to /hello failed, error id: 996b0479-d836-47a5-bbcb-67bd876f9277-1: org.jboss.resteasy.spi.UnhandledException:
java.lang.IllegalAccessError: superclass access check failed:
class com.oracle.truffle.polyglot.PolyglotImpl (in unnamed module #0x7bf61ba2) cannot access class org.graalvm.polyglot.impl.AbstractPolyglotImpl (in module org.graalvm.sdk)
because module org.graalvm.sdk does not export org.graalvm.polyglot.impl to unnamed module #0x7bf61ba2
UPDATE:
It looks like a regression in Quarkus, https://github.com/quarkusio/quarkus/issues/10226.
The app's test passes when used with Quarkus 1.2.1 (instead of 1.5.2).

Look into the mvn dependency:tree output -- it turns out that org.graalvm.js:js:20.1.0 depends on org.graalvm.sdk:graal-sdk:19.3.1. I'd personally call that a GraalVM JS bug.
If you add an explicit dependency on org.graalvm.sdk:graal-sdk:20.1.0, it should work.
(At least it did for me, but I was getting a different error than you, so I'm not sure.)
EDIT: as I was warned in the comments, it is not true that org.graalvm.js:js:20.1.0 depends on org.graalvm.sdk:graal-sdk:19.3.1. Instead, something else must be forcing graal-sdk to 19.3.1, perhaps something from Quarkus. Explicitly managing it to 20.1.0 should still help, e.g. as sketched below.
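A minimal sketch of pinning the SDK version in pom.xml (assuming the 20.1.0 GraalVM artifacts; adjust the version to the one you target):
<dependencyManagement>
    <dependencies>
        <!-- Force one consistent GraalVM SDK version across all transitive dependencies -->
        <dependency>
            <groupId>org.graalvm.sdk</groupId>
            <artifactId>graal-sdk</artifactId>
            <version>20.1.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>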

Are you maybe executing that on GraalVM 19.3.1? That is known to confuse the system. Our strong suggestion is to EITHER run on GraalVM (which automatically includes a proper version of Graal.js, no further input needed), OR run on a stock JDK and import the respective JARs from Maven. If you import (a different version of) our JARs from Maven on a GraalVM, then you might run into conflicts like that.
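A quick way to check which JVM is actually running the build:
java -version
A GraalVM distribution identifies itself in that banner (the exact wording varies by release); if it mentions GraalVM, rely on the bundled Graal.js instead of importing a different version of the JARs from Maven.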


Eureka Discovery Client is not working with java 11 modules [duplicate]

With Java 9 on the near horizon I thought it would be a good learning exercise to port some of my projects over to Java 9. In one of my projects I have dependencies for rxjava and rxjavafx:
dependencies {
    compile 'io.reactivex:rxjava:1.2.6'
    compile 'io.reactivex:rxjavafx:1.0.0'
    ...
}
I want to create this project as a named-module. To do this I need to create a module-info.java file and I need to specify the requirements for rxjava and rxjavafx here. However, these libs don't have any module info yet.
In order to work around this I've read that I need to create Automatic Modules. From what I understand, I need to rename the rxjava and rxjavafx jars to have a simple name and then list the jars in the --module-path parameter. I then add a requires directive in my module-info.java with the jar names.
module com.foo.bar {
    requires rxjavafx;
    requires rxjava;
}
I wrote a gradle task to edit the jar names for me, and it appears to be working in most cases. It takes all the jars that need to be compiled and renames them so they do not include version info or dashes. The files are then concatenated into a :-separated string:
tasks.withType(JavaCompile) {
    delete { delete '/tmp/gradle' }
    copy {
        from configurations.compile + configurations.testCompile
        into '/tmp/gradle'
        rename '(.*)-[0-9]+\\..*.jar', '$1.jar'
        rename { String fileName -> fileName.replace("-", "") }
    }
    options.compilerArgs += ['--module-path', fileTree(dir: '/tmp/gradle', include: '*.jar').getFiles().join(':')]
}
Naturally the rx libraries share some of their package names... this however causes the compiler to spit back errors such as:
error: module reads package rx.subscriptions from both rxjava and rxjavafx
error: module reads package rx.schedulers from both rxjava and rxjavafx
error: module reads package rx.observables from both rxjava and rxjavafx
error: module rxjava reads package rx.subscriptions from both rxjavafx and rxjava
error: module rxjava reads package rx.schedulers from both rxjavafx and rxjava
error: module rxjava reads package rx.observables from both rxjavafx and rxjava
error: module rxjavafx reads package rx.subscriptions from both rxjava and rxjavafx
error: module rxjavafx reads package rx.schedulers from both rxjava and rxjavafx
error: module rxjavafx reads package rx.observables from both rxjava and rxjavafx
It seems like the only way to get around this issue would be to re-package the contents of rxjava and rxjavafx into a single jar and add that as a single module. This doesn't seem like a good solution though...
So my questions are:
Am I using the new module system correctly?
What can I do about this error? and
Do these dependencies prevent me from updating, or should I just wait for rx to update their libs?
Note: I've tried running this with standard java/javac and they cause the same issues. Also, here is my Java version:
java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+140)
Java HotSpot(TM) 64-Bit Server VM (build 9-ea+140, mixed mode)
Am I using the new module system correctly?
Yes. What you are seeing is intended behavior, and this is because JPMS modules do not allow split packages.
In case you are not familiar with the term "split packages" it essentially means two members of the same package coming from two different modules.
For example:
com.foo.A (from moduleA.jar)
com.foo.B (from moduleB.jar)
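For illustration only, a hypothetical pair of module descriptors that reproduces the error above (the module and package names are made up, not taken from the rx jars):
// module-a/module-info.java
module moduleA {
    exports com.foo; // contains com.foo.A
}
// module-b/module-info.java
module moduleB {
    exports com.foo; // contains com.foo.B -- the same package in a second module
}
// Any module that requires both moduleA and moduleB fails to compile with:
// error: module <consumer> reads package com.foo from both moduleA and moduleB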
What can I do about this error?
You have two options:
(harder) "unsplit" the package dependencies. However this could be difficult or impossible if you are not familiar with the inner workings of the library
(easier) combine the two jars into a single jar (and therefore a single automatic module) as you mentioned above. I agree that it is not a "good" solution, but having split packages in the first place is generally not a good idea either.
Do these dependencies prevent me from updating, or should I just wait for rx to update their libs?
Hopefully rx will eventually update their libs to not have split packages at some point in the future. Until then, my recommendation would be to just smash the two jars together into a single jar (option #2), e.g. as sketched below.
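A minimal sketch of merging the two jars with the JDK's jar tool (file names are illustrative; watch out for clashing META-INF entries when extracting):
mkdir merged
cd merged
jar xf ../rxjava-1.2.6.jar
jar xf ../rxjavafx-1.0.0.jar
cd ..
jar cf rx-combined.jar -C merged .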
I had a similar problem:
error: module flyway.core reads package javax.transaction.xa from both jboss.transaction.api.1.2.spec and java.sql
error: module slf4j.api reads package javax.transaction.xa from both jboss.transaction.api.1.2.spec and java.sql
error: module hibernate.core reads package javax.transaction.xa from both jboss.transaction.api.1.2.spec and java.sql
.../src/main/java/module-info.java:1: error: module eu.com.x reads package javax.transaction.xa from both java.sql and jboss.transaction.api.1.2.spec
I could get rid of the split-packages compilation problem by checking my project's transitive dependencies ("gradle dependencies" or "mvn dependency:tree" can be helpful) and excluding them with code similar to:
configurations.all {
    exclude group: 'org.jboss.spec.javax.transaction', module: 'jboss-transaction-api_1.2_spec'
}
or
<dependencies>
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-core</artifactId>
        <version>5.2.10.Final</version>
        <exclusions>
            <exclusion>
                <groupId>org.jboss.spec.javax.transaction</groupId>
                <artifactId>jboss-transaction-api_1.2_spec</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
No jar repackaging was needed in my case. This problem did not occur on JDK 8. Excluding dependencies will probably not help in every project, though.
I have been facing the same issue, with the package javax.transaction.xa being read from both javaee and java.transaction.xa.
I fixed it by adding this line to my module-info.java:
opens javax.transaction.xa;
It worked fine, but a hint showed up saying the package javax.transaction.xa is empty or doesn't exist; the source code compiles correctly, however.

osgi unresolved requirement: osgi.native

I am trying to make a call from a bundle to a native library using JNA. The code itself works fine. As a container I am using Karaf, which I cannot change. The bundle sits in the state "Installed", and when I try to start it I get the following error:
Error executing command: Error executing command on bundles:
Error starting bundle 87: Could not resolve module: de.db.fkfmip.preparation.fkfmip-preparation-v2-gpio-nsb [87]
Unresolved requirement: Require-Capability: osgi.native; native.paths.0:List<String>="libf403.so"; native.paths.2:List<String>="libf403.so"; native.paths.1:List<String>="libf403.so"; native.paths.4:List<String>="libf403.so"; native.paths.3:List<String>="libf403.so"; native.paths.5:List<String>="libf403.so"; filter:="(|(&(osgi.native.osname~=Linux)(osgi.native.processor~=ARM))(&(osgi.native.osname~=Linux)(osgi.native.processor~=arm_le))(&(osgi.native.osname~=Linux)(osgi.native.processor~=arm_be))(&(osgi.native.osname~=Linux)(osgi.native.processor~=x86-64))(&(osgi.native.osname~=Linux)(osgi.native.processor~=x86_64))(&(osgi.native.osname~=Linux)(osgi.native.processor~=arm)))"
I have made sure that I added the Bundle-NativeCode tag in my osgi.bnd
Bundle-NativeCode: \
libf403.so;osname=Linux;processor=ARM,\
libf403.so;osname=Linux;processor=arm_le,\
libf403.so;osname=Linux;processor=arm_be,\
libf403.so;osname=Linux;processor=x86-64,\
libf403.so;osname=Linux;processor=x86_64,\
libf403.so;osname=Linux;processor=arm
It matches my system which is openSUSE 42.3.
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
I am not sure what I am missing. Do I need to install something extra for karaf to work with osgi.native?
I have encountered many problems using OSGi and JNA. Maybe this will help.
My first mistake was not getting all the dependencies right at the beginning. The following dependencies are needed; I got weird errors when I left the second one out.
<dependency>
    <groupId>net.java.dev.jna</groupId>
    <artifactId>jna</artifactId>
    <version>4.5.1</version>
</dependency>
<dependency>
    <groupId>net.java.dev.jna</groupId>
    <artifactId>jna-platform</artifactId>
    <version>4.5.0</version>
</dependency>
If you are using your own native library as I did, you need to make sure you have compiled it with a 32-bit or 64-bit compiler, depending on the system it will be running on.
Using JNA in your code is quite straightforward. You need to make an interface matching the methods of your native library and then load the library as shown below.
/** Shared library libf403.so */
private F403 f403;

/** JNA interface for libf403.so */
private interface F403 extends Library {
    int BoardConfig();
    int SetOutput(byte channel, byte status);
    int GetGroupInputs();
}

f403 = (F403) Native.loadLibrary("f403", F403.class);
f403.BoardConfig();
If you have the native library in a specific location you can also specify a path. For example:
f403 = (F403) Native.loadLibrary("/usr/lib/libf403.so", F403.class);
Now onto OSGi.
When you have your own library and want it to be inside the jar you can do the following.
Place the native library under your source folder /src/main/resources, in a folder you can name yourself. In my example it is linux-x86.
The bundle description then needs a Bundle-NativeCode tag. Under that tag you can define a whole list of libraries, OSs and processors. I know that my hardware and OS will never change. So that is fine for me.
Export-Package:
Import-Package: \
com.sun.jna*;resolution:=optional, \
*
Bundle-NativeCode: \
linux-x86/libf403.so;osname=Linux;processor=x86
-dsannotations: *
-dsannotations-options: norequirements
-metatypeannotations: *
Be aware that you must not add the Bundle-NativeCode tag if you do not have the library in your resources, and vice versa; otherwise you will only get compile errors.

Always getting exception "Wrong type at constant pool index" with Cucumber-Java8

I am trying to set up an example project for the Java8 dialect of Cucumber. My problem is that I don't get it running. I always get the following hierarchy of exceptions:
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.068 sec <<< FAILURE! - in soy.wimmer.CucumberIT
Feature: Cucumber with Java8 Time elapsed: 0.051 sec <<< ERROR!
cucumber.runtime.CucumberException: Failed to instantiate class soy.wimmer.CucumberStepdefs
[…]
Caused by: java.lang.reflect.InvocationTargetException: null
[…]
Caused by: cucumber.runtime.CucumberException: java.lang.IllegalArgumentException: Wrong type at constant pool index
[…]
Caused by: java.lang.IllegalArgumentException: Wrong type at constant pool index
at sun.reflect.ConstantPool.getMemberRefInfoAt0(Native Method)
at sun.reflect.ConstantPool.getMemberRefInfoAt(ConstantPool.java:47)
at cucumber.runtime.java8.ConstantPoolTypeIntrospector.getTypeString(ConstantPoolTypeIntrospector.java:37)
at cucumber.runtime.java8.ConstantPoolTypeIntrospector.getGenericTypes(ConstantPoolTypeIntrospector.java:27)
at cucumber.runtime.java.Java8StepDefinition.<init>(Java8StepDefinition.java:45)
at cucumber.runtime.java.JavaBackend.addStepDefinition(JavaBackend.java:162)
at cucumber.api.java8.En.Given(En.java:190)
at soy.wimmer.CucumberStepdefs.<init>(CucumberStepdefs.java:8)
[…]
Results :
Tests in error:
Failed to instantiate class soy.wimmer.CucumberStepdefs
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
I have no clue why I get this error nor how to fix it.
I have packaged everything in a Maven project. The layout is as follows:
./src/test/java/soy/wimmer/CucumberIT.java
./src/test/java/soy/wimmer/CucumberStepdefs.java
./src/test/resources/cucumber/cucumber-java8.feature
./pom.xml
The dependencies I include in the pom.xml are:
<dependencies>
    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-java8</artifactId>
        <version>1.2.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-junit</artifactId>
        <version>1.2.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
</dependencies>
Additionally the pom.xml only loads the compiler and the failsafe plugin.
My definition of CucumberIT.java:
package soy.wimmer;

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(features = "classpath:cucumber")
public class CucumberIT {
}
My feature definition:
Feature: Cucumber with Java8
  As a developer
  I want to use Cucumber-java8
  So that I have nicer step definitions

  Scenario: Let's try it
    Given I have some dummy code
    When I try to test it
    Then it should work with cucumber-java8
And these are my step definitions:
package soy.wimmer;

import cucumber.api.PendingException;
import cucumber.api.java8.En;

public class CucumberStepdefs implements En {
    public CucumberStepdefs() {
        Given("^I have some dummy code$", () -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });
        When("^I try to test it$", () -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });
        Then("^it should work with cucumber-java(\\d+)$", (Integer arg1) -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });
    }
}
Any idea what I'm doing wrong here?
The problem is caused because the Java8 dialect of Cucumber uses implementation details of Oracle's JDK8.
I was using OpenJDK8 as packaged by Debian which causes a different organisation of the constant pool. When I try the same with Oracle's JDK8 everything works as expected.
If you want to try it yourself, I published the complete example project on github: https://github.com/mawis/cucumber-java8-test
I also reported a bug at the issue tracker of cucumber-jvm here: https://github.com/cucumber/cucumber-jvm/issues/912
You might check the issue tracker to see whether the problem has been fixed in the meantime.
For now if you want to use cucumber-java8 it seems you have to use Oracle's implementation of the JDK.
(The fame for solving this problem belongs to Holger with his comments to the question. I just wanted to write this answer as a summary.)
Just use version 1.2.5, which has recently been released. It solves the bug referenced by the accepted answer.
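For example, bumping the version of the dependency shown in the question:
<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-java8</artifactId>
    <version>1.2.5</version>
    <scope>test</scope>
</dependency>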

Maven Jaxb Generate Fails When Compiling A Module That Depends On Multiple Modules

I have an Eclipse Maven project consisting of multiple modules, some of which contain XML schemas that I want to generate classes for (using JAXB). My project layout is as follows:
schemas\core (pom)
schemas\core\types (jar)
schemas\vehicle (pom)
schemas\vehicle\automobile (jar)
schemas\vehicle\civic (jar)
The projects that contain schemas are:
schemas\core\types (xsd\types.xsd)
schemas\vehicle\automobile (xsd\automobile.xsd)
schemas\vehicle\civic (xsd\civic.xsd)
Some of the modules contain schemas that import schemas from other modules:
automobile.xsd imports types.xsd
civic.xsd imports types.xsd, automobile.xsd
Since the schemas are located in different projects I use a classpath catalog resolver along with catalog files to resolve the location of the schemas.
The automobile project depends on schemas in the types project. Here is the entry in its catalog file (catalog.xml):
<rewriteSystem systemIdStartString="http://schemas/core/types/" rewritePrefix="classpath:xsd/" />
Note the use of classpath:xsd/ to tell the catalog resolver to find the schemas on the classpath.
I also use episodes to prevent the classes in types from being re-generated inside the automobile project. Here is a snippet from my pom.xml:
<plugin>
    <groupId>org.jvnet.jaxb2.maven2</groupId>
    <artifactId>maven-jaxb2-plugin</artifactId>
    <version>0.8.3</version>
    <configuration>
        <episodes>
            <episode>
                <groupId>schemas.core</groupId>
                <artifactId>types</artifactId>
                <version>1.0-SNAPSHOT</version>
            </episode>
        </episodes>
        <catalog>src/main/resources/catalog.xml</catalog>
        <catalogResolver>org.jvnet.jaxb2.maven2.resolver.tools.ClasspathCatalogResolver</catalogResolver>
        <extension>true</extension>
        ....
When I run mvn clean install on the automobile project everything works fine. The schema types.xsd is resolved on the classpath and the classes are ultimately generated.
Where I run into problems is trying to compile the project civic.
The civic project depends on both types.xsd and automobile.xsd. I use a catalog file (catalog.xml) to define the location of the schemas:
<rewriteSystem systemIdStartString="http://schemas/core/types/" rewritePrefix="classpath:xsd/" />
<rewriteSystem systemIdStartString="http://schemas/vehicle/automobile/" rewritePrefix="classpath:xsd/" />
I use episodes to prevent re-generation of the classes. Here is a snippet from the pom.xml for civic:
<plugin>
    <groupId>org.jvnet.jaxb2.maven2</groupId>
    <artifactId>maven-jaxb2-plugin</artifactId>
    <version>0.8.3</version>
    <configuration>
        <episodes>
            <episode>
                <groupId>schemas.core</groupId>
                <artifactId>types</artifactId>
                <version>1.0-SNAPSHOT</version>
            </episode>
            <episode>
                <groupId>schemas.vehicle</groupId>
                <artifactId>automobile</artifactId>
                <version>1.0-SNAPSHOT</version>
            </episode>
        </episodes>
        <catalog>src/main/resources/catalog.xml</catalog>
        <catalogResolver>org.jvnet.jaxb2.maven2.resolver.tools.ClasspathCatalogResolver</catalogResolver>
        <extension>true</extension>
        ...
When I try to run mvn clean install on the civic project I run into problems. It complains about not being able to resolve the public/system ids. Here are some of the error messages I get:
Could not resolve publicId [null], systemId [jar:file:/_m2repository/schemas/vehicle/automobile/1.0-SNAPSHOT/automobile-1.0-SNAPSHOT.jar!http://schemas/core/types/types.xsd]
[ERROR] Error while parsing schema(s).Location [].
com.sun.istack.SAXParseException2;
IOException thrown when processing "jar:file:/_m2repository/schemas/vehicle/automobile/1.0-SNAPSHOT/automobile-1.0-SNAPSHOT.jar!http://schemas/core/types/types.xsd".
Exception: java.net.MalformedURLException: no !/ in spec.
....
For some reason it cannot find types.xsd when trying to parse the jar file from the automobile project.
Does anyone know why this might be happening?
Thank you.
Note - I was experimenting around trying to get things to work and I did find one way. If I remove the episodes from the pom.xml file I no longer get the error; however, the project civic ends up with all the types from the dependent modules (which is something I am trying to avoid by using the episodes).
If you want to see the full catalog.xml and pom.xml files for each project please see the following links:
types: http://pastebin.com/Uym3DY6X
automobile: http://pastebin.com/VQM4MPuW
civic: http://pastebin.com/eGSVGwmE
Author of the maven-jaxb2-plugin here.
I have just released the 0.10.0 version of the maven-jaxb2-plugin. This release fixes the MAVEN_JAXB2_PLUGIN-82 issue which is related to the reported problems.
This was actually NOT a bug in the maven-jaxb2-plugin, but an issue (or, better to say a few issues) in the XJC itself:
https://java.net/jira/browse/JAXB-1044
https://java.net/jira/browse/JAXB-1045
https://java.net/jira/browse/JAXB-1046
These issues cause problems when catalog and binding files are used together. This was also the reason why the Maven artifact resolution did not work correctly in certain cases.
In the 0.10.0 release, I have implemented workarounds for JAXB-1044 and JAXB-1045. I will try to get my patches into the XJC via pull requests, but, you know, I'm not sure when/if the Oracle guys will accept my PRs.
In the maven-jaxb2-plugin I've now implemented quite reliable workarounds. See this test project here:
https://github.com/highsource/maven-jaxb2-plugin/tree/master/tests/MAVEN_JAXB2_PLUGIN-82
This does exactly what you want: resolves schema via catalog AND Maven resolver to the resource from another artifact. Basically, this rewriting:
REWRITE_SYSTEM "http://www.ab.org" "maven:org.jvnet.jaxb2.maven2:maven-jaxb2-plugin-tests-MAVEN_JAXB2_PLUGIN-82-a:jar::!"
now works fine.
In case of problems do mvn -X and check the output, you'll also see the statements of the catalog resolver in the log. This might give you hints, what does not work.
Here's another project which uses schemas, bindings and the catalog itself from one central artifact:
https://github.com/highsource/w3c-schemas
Snippets from the POM:
<schemas>
    <schema>
        <url>http://www.w3.org/1999/xlink.xsd</url>
    </schema>
</schemas>
<schemaIncludes/>
<bindings>
    <binding>
        <dependencyResource>
            <groupId>${project.groupId}</groupId>
            <artifactId>w3c-schemas</artifactId>
            <resource>globalBindings.xjb</resource>
            <version>${project.version}</version>
        </dependencyResource>
    </binding>
</bindings>
<catalogs>
    <catalog>
        <dependencyResource>
            <groupId>${project.groupId}</groupId>
            <artifactId>w3c-schemas</artifactId>
            <resource>catalog.cat</resource>
            <version>${project.version}</version>
        </dependencyResource>
    </catalog>
</catalogs>
Catalog:
REWRITE_SYSTEM "http://www.w3.org" "maven:org.hisrc.w3c:w3c-schemas:jar::!/w3c"
Binding:
<jaxb:bindings schemaLocation="http://www.w3.org/1999/xlink.xsd" node="/xs:schema">
    <jaxb:schemaBindings>
        <jaxb:package name="org.hisrc.w3c.xlink.v_1_0"/>
    </jaxb:schemaBindings>
</jaxb:bindings>
So how all of this works:
Schemas as well as the catalog and global bindings are stored in the central artifact w3c-schemas.
The project wants to compile the URL http://www.w3.org/1999/xlink.xsd.
The catalog rewrites this URL into the systemId maven:org.hisrc.w3c:w3c-schemas:jar::!/w3c/1999/xlink.xsd. (There's a /w3c/1999/xlink.xsd resource in the w3c-schemas jar).
This systemId is then resolved by the Maven catalog resolver (delivered by the maven-jaxb2-plugin) into the "real" URL, which will be some jar:... URL pointing to the resource within the w3c-schemas artifact JAR in the local repository.
So the schema is not downloaded from the Internet but taken from the local resource.
The workarounds keep the "original" systemIds, therefore you can customize the schema using its original URL. (The resolved systemId wouldn't be convenient.)
The catalog file and the global bindings file will be the same for all the individual projects, so they're also put into the central artifact and referenced there using the dependencyResource.
I have the same problem. Schema C imports B and A, B imports A. Generating sources for A works, B is also fine, and for C a MalformedURLException pops up.
I'm still investigating the error, but a workaround is to use the systemIdSuffix (OASIS spec 1.1) to match the systemId and rewrite it. You need to do the following:
Remove the 'catalogResolver' element from the plugin configuration in the poms.
Replace the content of the catalog file for the 'automobile' project with the following:
<systemSuffix systemIdSuffix="types.xsd" uri="maven:schemas.core:types!/types.xsd"/>
Replace the content of the catalog file for the 'civic' project with the following:
<systemSuffix systemIdSuffix="types.xsd" uri="maven:schemas.core:types!/types.xsd"/>
<systemSuffix systemIdSuffix="automobile.xsd" uri="maven:schemas.vehicle:automobile!/automobile.xsd"/>
Let me know if this works for you.
I faced similar problems. I used the sample projects found here.
I modified these projects in 2 ways:
1) Have an A project with 2 namespaces and a local catalog file. Have a project B depending on this, using the episode of A in B.
2) Have an A project, a B project and a C project. B relies on A and C relies on B.
In both cases I got the same exception as you. But I started to realize in situation 2 what is happening.
This is the exception:
com.sun.istack.SAXParseException2; IOException thrown when processing "jar:file:/Users/sjaak/.m2/repository/org/tst/b-working/1.0/b-working-1.0.jar!http://www.a1.org/a1/a1.xsd". Exception: java.net.MalformedURLException: no !/ in spec.
So, it tries to resolve namespace http://www.a1.org/a1/a1.xsd relative to project B when building project C. I traced the problem back to com.sun.tools.xjc.reader.internalizer.AbstractReferenceFinderImpl, method startElement.
The solution I use is adapting the org.jvnet.jaxb2.maven2:maven-jaxb2-plugin. I used their MavenCatalogResolver (the default one, as pointed out above) and made a small change: instead of offering the whole systemId jar:file:/Users/sjaak/.m2/repository/org/tst/b-working/1.0/b-working-1.0.jar!http://www.a1.org/a1/a1.xsd for resolution, it uses a pattern that only offers the part after the exclamation mark.
Here's the code:
package org.jvnet.jaxb2.maven2.resolver.tools;

import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;
import java.text.MessageFormat;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.jvnet.jaxb2.maven2.DependencyResource;
import org.jvnet.jaxb2.maven2.DependencyResourceResolver;

import com.sun.org.apache.xml.internal.resolver.CatalogManager;

public class MavenCatalogResolver extends
        com.sun.org.apache.xml.internal.resolver.tools.CatalogResolver {

    // Matches systemIds of the malformed shape jar:file:...jar!http://...
    private final static Pattern PTRN = Pattern.compile("^jar:file:(.*).jar!(.*)$");

    public static final String URI_SCHEME_MAVEN = "maven";

    private final DependencyResourceResolver dependencyResourceResolver;
    private final CatalogManager catalogManager;

    public MavenCatalogResolver(CatalogManager catalogManager,
            DependencyResourceResolver dependencyResourceResolver) {
        super(catalogManager);
        this.catalogManager = catalogManager;
        if (dependencyResourceResolver == null) {
            throw new IllegalArgumentException(
                    "Dependency resource resolver must not be null.");
        }
        this.dependencyResourceResolver = dependencyResourceResolver;
    }

    @Override
    public String getResolvedEntity(String publicId, String systemId) {
        String result;
        // Only offer the part after the exclamation mark to the catalog.
        Matcher matcher = PTRN.matcher(systemId);
        if (matcher.matches()) {
            result = super.getResolvedEntity(publicId, matcher.group(2));
        } else {
            result = super.getResolvedEntity(publicId, systemId);
        }
        if (result == null) {
            return null;
        }
        try {
            final URI uri = new URI(result);
            if (URI_SCHEME_MAVEN.equals(uri.getScheme())) {
                final String schemeSpecificPart = uri.getSchemeSpecificPart();
                try {
                    final DependencyResource dependencyResource = DependencyResource
                            .valueOf(schemeSpecificPart);
                    try {
                        final URL url = dependencyResourceResolver
                                .resolveDependencyResource(dependencyResource);
                        String resolved = url.toString();
                        return resolved;
                    } catch (Exception ex) {
                        catalogManager.debug.message(1, MessageFormat.format(
                                "Error resolving dependency resource [{0}].",
                                dependencyResource));
                    }
                } catch (IllegalArgumentException iaex) {
                    catalogManager.debug.message(1, MessageFormat.format(
                            "Error parsing dependency descriptor [{0}].",
                            schemeSpecificPart));
                }
                return null;
            } else {
                return result;
            }
        } catch (URISyntaxException urisex) {
            return result;
        }
    }
}
This actually fixed my problem. I'll investigate a bit more. I've got the feeling there might be some XJC arg that I could use, or perhaps the catalog XML format offers more possibilities.
Hope it helps.

How to suppress SLF4J Warning about multiple bindings?

My Java project has dependencies with different SLF4J versions. How do I suppress the annoying warnings?
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:xyz234/lib/slf4j-
log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:xyz123/.m2/repository/org/slf4j/slf4j-log4j12
/1.6.0/slf4j-log4j12-1.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
P.S.: This is not the same question as slf4j warning about the same binding being duplicate, the answer there is how to get rid of a false alarm warning, in my case it is a true warning however.
P.P.S.: Sorry, I forgot to mention: I use Maven, and SLF4J is included in the dependencies of my dependencies.
Remove either slf4j-log4j12-1.5.8.jar or slf4j-log4j12-1.6.0.jar from the classpath. Your project should not depend on different versions of SLF4J; I suggest you use just 1.6.0.
If you're using Maven, you can exclude transitive dependencies. Here is an example:
<dependency>
    <groupId>com.sun.xml.stream</groupId>
    <artifactId>sjsxp</artifactId>
    <version>1.0.1</version>
    <exclusions>
        <exclusion>
            <groupId>javax.xml.stream</groupId>
            <artifactId>stax-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
With the current slf4j-api implementation it is not possible to remove these warnings. The org.slf4j.LoggerFactory class prints the messages:
...
if (implementationSet.size() > 1) {
    Util.report("Class path contains multiple SLF4J bindings.");
    Iterator iterator = implementationSet.iterator();
    while (iterator.hasNext()) {
        URL path = (URL) iterator.next();
        Util.report("Found binding in [" + path + "]");
    }
    Util.report("See " + MULTIPLE_BINDINGS_URL + " for an explanation.");
}
...
The Util class is the following:
public class Util {
    static final public void report(String msg, Throwable t) {
        System.err.println(msg);
        System.err.println("Reported exception:");
        t.printStackTrace();
    }
    ...
The report method writes directly to System.err. A workaround could be to replace System.err via System.setErr() before the first LoggerFactory.getLogger() call, but you could lose other important messages if you do that.
Of course you can download the source and remove these Util.report calls and use your modified slf4j-api in your project.
PrintStream filterOut = new PrintStream(System.err) {
    public void println(String l) {
        if (!l.startsWith("SLF4J")) {
            super.println(l);
        }
    }
};
System.setErr(filterOut);
et voilà!
Have you read the URL referenced by the warning?
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Here is what the link states:
SLF4J API is designed to bind with one and only one underlying logging
framework at a time. If more than one binding is present on the class
path, SLF4J will emit a warning, listing the location of those
bindings. When this happens, select the one and only one binding you
wish to use, and remove the other bindings.
For example, if you have both slf4j-simple-1.6.2.jar and
slf4j-nop-1.6.2.jar on the class path and you wish to use the nop
(no-operation) binding, then remove slf4j-simple-1.6.2.jar from the
class path.
Note that the warning emitted by SLF4J is just that, a warning. SLF4J
will still bind with the first framework it finds on the class path.
If using Maven, always use the command
mvn dependency:tree
This will list all the dependencies added to the project, including the transitive dependencies of the jars we have included. Here we can pinpoint multiple versions, or multiple copies, of jars that come in with other jars that were added. Use
<exclusions>
    <exclusion>
        <groupId></groupId>
        <artifactId></artifactId>
    </exclusion>
</exclusions>
in the <dependency> element to exclude the ones which conflict. Always re-run the Maven command above after each exclusion if the issue persists.
Sometimes excluding the 2nd logger from the classpath would require too many contortions, and the warning is incredibly annoying. Masking out standard error at the very beginning of the program does seem to work, e.g.
public static void main(String[] args)
{
    org.apache.log4j.Logger.getRootLogger().setLevel(org.apache.log4j.Level.OFF);
    PrintStream old = System.err;
    System.setErr(new PrintStream(new ByteArrayOutputStream()));
    // ... trigger it ...
    System.setErr(old);
    ...
whereby the trigger it part should call some no-op function that accesses the logging system and would otherwise have caused the message to be generated.
I wouldn't recommend this for production code, but for skunkworks purposes it should help.
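A minimal sketch of such a trigger call (hypothetical; any first call into the SLF4J API that forces binding selection should do):
// First touch of SLF4J: binding resolution happens here, so the warning
// lands in the swallowed stream set up above.
org.slf4j.LoggerFactory.getLogger("startup-trigger");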
If you're using an old version of Jetty (say Jetty 6), you may need to change the classloading order for the webapp to make it higher priority than the container's. You can do that by adding this line to the container's xml file:
<Set name="parentLoaderPriority">false</Set>
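For context, a minimal sketch of the surrounding context XML, assuming Jetty 6 (whose WebAppContext lives in the org.mortbay packages; newer Jetty versions use org.eclipse.jetty instead):
<Configure class="org.mortbay.jetty.webapp.WebAppContext">
    <!-- Prefer the webapp's classes (and its SLF4J binding) over the container's -->
    <Set name="parentLoaderPriority">false</Set>
</Configure>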
