Always getting exception "Wrong type at constant pool index" with Cucumber-Java8

I am trying to set up an example project for the Java 8 dialect of Cucumber. My problem is that I can't get it running. I always get the following hierarchy of exceptions:
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.068 sec <<< FAILURE! - in soy.wimmer.CucumberIT
Feature: Cucumber with Java8 Time elapsed: 0.051 sec <<< ERROR!
cucumber.runtime.CucumberException: Failed to instantiate class soy.wimmer.CucumberStepdefs
[…]
Caused by: java.lang.reflect.InvocationTargetException: null
[…]
Caused by: cucumber.runtime.CucumberException: java.lang.IllegalArgumentException: Wrong type at constant pool index
[…]
Caused by: java.lang.IllegalArgumentException: Wrong type at constant pool index
at sun.reflect.ConstantPool.getMemberRefInfoAt0(Native Method)
at sun.reflect.ConstantPool.getMemberRefInfoAt(ConstantPool.java:47)
at cucumber.runtime.java8.ConstantPoolTypeIntrospector.getTypeString(ConstantPoolTypeIntrospector.java:37)
at cucumber.runtime.java8.ConstantPoolTypeIntrospector.getGenericTypes(ConstantPoolTypeIntrospector.java:27)
at cucumber.runtime.java.Java8StepDefinition.<init>(Java8StepDefinition.java:45)
at cucumber.runtime.java.JavaBackend.addStepDefinition(JavaBackend.java:162)
at cucumber.api.java8.En.Given(En.java:190)
at soy.wimmer.CucumberStepdefs.<init>(CucumberStepdefs.java:8)
[…]
Results :
Tests in error:
Failed to instantiate class soy.wimmer.CucumberStepdefs
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
I have no clue why I get this error or how to fix it.
I have packaged everything in a Maven project with the following layout:
./src/test/java/soy/wimmer/CucumberIT.java
./src/test/java/soy/wimmer/CucumberStepdefs.java
./src/test/resources/cucumber/cucumber-java8.feature
./pom.xml
The dependencies I include in the pom.xml are:
<dependencies>
    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-java8</artifactId>
        <version>1.2.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>info.cukes</groupId>
        <artifactId>cucumber-junit</artifactId>
        <version>1.2.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
</dependencies>
Additionally the pom.xml only loads the compiler and the failsafe plugin.
My definition of CucumberIT.java:
package soy.wimmer;

import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(features = "classpath:cucumber")
public class CucumberIT {
}
My feature definition:
Feature: Cucumber with Java8
  As a developer
  I want to use Cucumber-java8
  So that I have nicer step definitions

  Scenario: Let's try it
    Given I have some dummy code
    When I try to test it
    Then it should work with cucumber-java8
And these are my step definitions:
package soy.wimmer;

import cucumber.api.PendingException;
import cucumber.api.java8.En;

public class CucumberStepdefs implements En {
    public CucumberStepdefs() {
        Given("^I have some dummy code$", () -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });

        When("^I try to test it$", () -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });

        Then("^it should work with cucumber-java(\\d+)$", (Integer arg1) -> {
            // Write code here that turns the phrase above into concrete actions
            throw new PendingException();
        });
    }
}
Any idea what I'm doing wrong here?

The problem is caused by the fact that the Java 8 dialect of Cucumber relies on implementation details of Oracle's JDK 8.
I was using OpenJDK 8 as packaged by Debian, which organises the constant pool differently. When I try the same with Oracle's JDK 8, everything works as expected.
If you want to try it yourself, I published the complete example project on GitHub: https://github.com/mawis/cucumber-java8-test
I also reported a bug in the issue tracker of cucumber-jvm here: https://github.com/cucumber/cucumber-jvm/issues/912
You can check the issue tracker to see whether the problem has been fixed in the meantime.
For now, if you want to use cucumber-java8, it seems you have to use Oracle's implementation of the JDK.
(The credit for solving this problem belongs to Holger and his comments on the question. I just wanted to write this answer as a summary.)

Just use version 1.2.5, which has recently been released. It fixes the bug referenced by the accepted answer.
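For reference, a minimal sketch of the dependency bump, assuming the same info.cukes coordinates as above:

<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>cucumber-java8</artifactId>
    <!-- 1.2.5 contains the fix for the constant pool introspection bug -->
    <version>1.2.5</version>
    <scope>test</scope>
</dependency>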

Related

com.oracle.truffle.polyglot.PolyglotImpl (in unnamed module) cannot access class org.graalvm.polyglot.impl.AbstractPolyglotImpl

I tried to extend the scaffolded Quarkus demo from https://code.quarkus.io/ with polyglot code for GraalVM:
// imports needed by this snippet (org.graalvm.polyglot API)
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

@GET
@Produces(MediaType.TEXT_PLAIN)
public String hello() {
    String out = "From JS:";
    try (Context context = Context.create()) {
        Value function = context.eval("js", "x => x+1");
        assert function.canExecute();
        int x = function.execute(41).asInt();
        out = out + x;
        System.out.println(out);
    }
    return "hello";
}
I added dependencies to pom.xml as suggested here: https://stackoverflow.com/questions/54384499/illegalstateexception-no-language-and-polyglot-implementation-was-found-on-the
<dependency>
    <groupId>org.graalvm.js</groupId>
    <artifactId>js</artifactId>
    <version>20.1.0</version>
</dependency>
<dependency>
    <groupId>org.graalvm.js</groupId>
    <artifactId>js-scriptengine</artifactId>
    <version>20.1.0</version>
</dependency>
<dependency>
    <groupId>org.graalvm.truffle</groupId>
    <artifactId>truffle-api</artifactId>
    <version>20.1.0</version>
</dependency>
But when I run the following on the command line:
./mvnw clean package
the test fails with an exception which I do not understand:
2020-06-22 19:26:56,328 ERROR [io.qua.ver.htt.run.QuarkusErrorHandler]
(executor-thread-1) HTTP Request to /hello failed, error id: 996b0479-d836-47a5-bbcb-67bd876f9277-1: org.jboss.resteasy.spi.UnhandledException:
java.lang.IllegalAccessError: superclass access check failed:
class com.oracle.truffle.polyglot.PolyglotImpl (in unnamed module #0x7bf61ba2) cannot access class org.graalvm.polyglot.impl.AbstractPolyglotImpl (in module org.graalvm.sdk)
because module org.graalvm.sdk does not export org.graalvm.polyglot.impl to unnamed module #0x7bf61ba2
UPDATE:
It looks like a regression in Quarkus: https://github.com/quarkusio/quarkus/issues/10226.
The app test passes with Quarkus 1.2.1 (instead of 1.5.2).
Look into the output of mvn dependency:tree: it turns out that org.graalvm.js:js:20.1.0 depends on org.graalvm.sdk:graal-sdk:19.3.1. I'd personally call that a GraalVM JS bug.
If you add an explicit dependency on org.graalvm.sdk:graal-sdk:20.1.0, it should work.
(At least it did for me, but I was getting a different error than you, so I'm not sure.)
EDIT: as I was warned in the comments, it is not true that org.graalvm.js:js:20.1.0 depends on org.graalvm.sdk:graal-sdk:19.3.1. Instead, something else must be forcing graal-sdk to 19.3.1, perhaps something from Quarkus. Explicitly managing it to 20.1.0 should still help.
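As a minimal sketch, pinning the SDK explicitly in the pom.xml could look like this (using the 20.1.0 version discussed above):

<dependency>
    <groupId>org.graalvm.sdk</groupId>
    <artifactId>graal-sdk</artifactId>
    <!-- align graal-sdk with the 20.1.0 js/truffle artifacts -->
    <version>20.1.0</version>
</dependency>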
Are you maybe executing that on GraalVM 19.3.1? That is known to confuse the system. Our strong suggestion is to EITHER run on GraalVM (which automatically includes a proper version of Graal.js, no further input needed), OR run on a stock JDK and import the respective JARs from Maven. If you import (a different version of) our JARs from Maven on GraalVM, then you might run into conflicts like that.

How to use Evaluators in Java to score on a PMML using org.apache.spark?

I've implemented code for scoring a provided PMML file against a CSV data file (linear regression) using Spark and Java. For this I used the jpmml-evaluator-spark and spark-mllib_2.11 Maven artifacts, and it works fine.
Now I'm looking to replace the jpmml-evaluator-spark library, which is AGPL-licensed, with something similar that may be bundled within org.apache.spark (or any other fully open-source option).
I don't see evaluators for scoring against a PMML file in the org.apache.spark group of dependencies. Please confirm whether this is correct and suggest an alternative.
https://github.com/jpmml/jpmml-evaluator-spark
This is the PMML evaluator library for the Apache Spark cluster computing system (http://spark.apache.org/) and it is AGPL-licensed.
Also refer to: http://spark.apache.org/docs/latest/ml-guide.html
These suggest that what is packaged with Apache Spark covers algorithms, model creation, and training, but scoring against a PMML model is not available there; it is provided only by jpmml-evaluator-spark and its dependencies.
import org.apache.spark.ml.Transformer;
import org.apache.spark.sql.Dataset;
import org.jpmml.evaluator.Evaluator;
import org.jpmml.evaluator.EvaluatorBuilder;
import org.jpmml.evaluator.LoadingModelEvaluatorBuilder;
import org.jpmml.evaluator.spark.TransformerBuilder;
...
EvaluatorBuilder evaluatorBuilder = new LoadingModelEvaluatorBuilder()
        .setLocatable(false)
        .setVisitors(new DefaultVisitorBattery())
        .load(pmmlInputStream);
Evaluator evaluator = evaluatorBuilder.build();
evaluator.verify();

TransformerBuilder pmmlTransformerBuilder = new TransformerBuilder(evaluator)
        .withLabelCol("Predicted_SpeciesCategory")
        .exploded(true);
Transformer pmmlTransformer = pmmlTransformerBuilder.build();

Dataset<?> resultDataset = pmmlTransformer.transform(csvDataset);
...
Maven dependencies:
<dependency>
    <groupId>org.jpmml</groupId>
    <artifactId>jpmml-evaluator-spark</artifactId>
    <version>1.2.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.3</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.4.3</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.jpmml</groupId>
    <artifactId>jpmml-sparkml</artifactId>
    <version>1.5.4</version>
</dependency>
This code still has a dependency on the org.jpmml library, which I wish to remove. I'm looking for an alternative using the org.apache.spark library to achieve similar results.
You could use PMML4S-Spark to evaluate a PMML model on Spark, for example:
import org.pmml4s.spark.ScoreModel
val model = ScoreModel.fromInputStream(pmmlInputStream)
val resultDataset = model.transform(csvDataset)
If you want to use PMML4S-Spark in Java, it's also easy to use, similar to the Scala API, for example:
import org.pmml4s.spark.ScoreModel;
import org.apache.spark.sql.Dataset;
ScoreModel model = ScoreModel.fromInputStream(pmmlInputStream);
Dataset<?> resultDataset = model.transform(csvDataset);
BTW, PMML4S-Spark is licensed under the Apache License 2.0.
My answer might be totally irrelevant to your question, but since I faced an issue and kept coming back to this question, I don't want others to face it too. I somehow reached the solution by going through many Stack Overflow posts...
The problem
I was using Spark in Java with the old Maven dependency:
<dependency>
    <groupId>org.jpmml</groupId>
    <artifactId>pmml-evaluator-metro</artifactId>
    <version>1.6.3</version>
</dependency>
I thought this was perfect and would work, but it was not recognizing TransformerBuilder as one of its classes.
The dependency below should solve your problem if it is related to TransformerBuilder:
<dependency>
    <groupId>org.jpmml</groupId>
    <artifactId>jpmml-evaluator-spark</artifactId>
    <version>1.3.0</version>
</dependency>
That was it. You're welcome in advance 😉

Error: The method where(String) is undefined while using spring boot mongodb

I have added the MongoDB dependencies to my Spring Boot app; however, I am getting an "undefined" error for the where method:
ChangeStreamRequest<Person> request = ChangeStreamRequest.builder()
        .collection("person")
        .filter(newAggregation(Person.class, match(where("operationType").is("insert"))))
        .publishTo(pListener)
        .build();
POM configuration:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
</dependency>
Please advise me on this.
The reason for "undefined" is that there is no method where defined in your class.
You have to import the where method from Criteria.
You can use Criteria.where("operationType").is("insert") by adding the import statement below:
import org.springframework.data.mongodb.core.query.Criteria;
Alternatively, you can add a static import:
import static org.springframework.data.mongodb.core.query.Criteria.where;
Now you can use it directly:
where("operationType").is("insert")

Caused by: org.h2.jdbc.JdbcSQLException: Function "TO_TIMESTAMP" not found; SQL statement:

Can someone help me out here? I am getting the following error while running DML commands:
Caused by: org.h2.jdbc.JdbcSQLException: Function "TO_TIMESTAMP" not found; SQL statement:
Finally I got the answer: we were using an older version of the H2 database dependency in our project.
For a quick fix, take the latest dependency and add it to your pom.xml file:
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <version>1.4.195</version>
</dependency>
http://www.h2database.com/html/download.html
https://mvnrepository.com/artifact/com.h2database/h2
I had a similar exception (org.h2.jdbc.JdbcSQLSyntaxErrorException: Function "TO_TIMESTAMP" not found) when upgrading H2 from 1.4.2 to 2.2.202.
I don't know why, but it appears that the function has been removed.
To make it work, I replaced to_timestamp with parsedatetime.
parsedatetime follows the java.text.SimpleDateFormat semantics.
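For illustration, a before/after sketch of such a replacement (the timestamp literal and format patterns are made-up examples):

-- before: Oracle-style TO_TIMESTAMP, no longer resolved by H2 2.x by default
SELECT TO_TIMESTAMP('2021-01-01 10:00:00', 'YYYY-MM-DD HH24:MI:SS');
-- after: H2's PARSEDATETIME with a java.text.SimpleDateFormat pattern
SELECT PARSEDATETIME('2021-01-01 10:00:00', 'yyyy-MM-dd HH:mm:ss');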
Edit: another way would be to enable the Oracle compatibility mode (Mode=Oracle):
Example in Spring Boot: https://stackoverflow.com/a/64799048/1546137
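For instance, a minimal sketch in a Spring Boot application.properties, assuming an in-memory database (the name testdb is made up):

# enable H2's Oracle compatibility mode, as suggested in the answer linked above
spring.datasource.url=jdbc:h2:mem:testdb;MODE=Oracle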

Flink JDBCInputFormat cannot find method 'setRowTypeInfo'

I want to use flink-jdbc to get data from MySQL.
I have seen an example on the Apache Flink website:
// Read data from a relational database using the JDBC input format
DataSet<Tuple2<String, Integer>> dbData =
    env.createInput(
        JDBCInputFormat.buildJDBCInputFormat()
            .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")
            .setDBUrl("jdbc:derby:memory:persons")
            .setQuery("select name, age from persons")
            .setRowTypeInfo(new RowTypeInfo(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO))
            .finish()
    );
But when I try to write a demo, I can't find the method setRowTypeInfo. My code looks like this:
import org.apache.flink.api.common.typeinfo.BasicTypeInfo
import org.apache.flink.api.java.ExecutionEnvironment
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.java.typeutils.RowTypeInfo
import org.apache.flink.api.scala._

/**
 * Created by lulijun on 17/7/7.
 */
object FlinkJDBC {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.createLocalEnvironment()
    val dbData = env.createInput(
      JDBCInputFormat.buildJDBCInputFormat
        .setDrivername("com.mysql.jdbc.Driver")
        .setDBUrl("XXX")
        .setUsername("xxx")
        .setPassword("XXX")
        .setQuery("select name, age from persons")
        .setRowTypeInfo(new RowTypeInfo(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO))
        .finish)
    dbData.print()
    env.execute()
  }
}
The "setRowTypeInfo" method is always red, and the IDEA prompts
"cannot resolve symbol setRowTypeInfo"
The jar version of flink-jdbc i used is 1.0.0.
<dependencies>
    <!-- Use this dependency if you are using the DataSet API -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-scala_2.10</artifactId>
        <version>1.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.10</artifactId>
        <version>1.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-jdbc</artifactId>
        <version>1.0.0</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.36</version>
    </dependency>
</dependencies>
I have searched a lot; most people use the method exactly as in the official documentation, but no one mentioned this problem.
I doubt whether I used the wrong version of flink-jdbc, but I cannot find any information about the right way to use it.
If you know the problem, please teach me. Thank you.
I changed the flink-jdbc version from 1.0.0 to 1.3.0 and the problem was solved.
But when I searched for flink-jdbc on the Maven website, https://mvnrepository.com/search?q=flink-jdbc, I couldn't find the right information in the first few pages. This made me think that the version of flink-jdbc did not need to match the other Flink jars.
But the truth is that flink-jdbc 1.1.3 uses the RowTypeInfo class from the api.table package, while flink-jdbc 1.3.0 uses the RowTypeInfo class from the api.java package. They are closely tied to each other.
We must make sure the versions match.
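As a minimal sketch, the corrected dependency simply matches the 1.3.0 version of the other Flink artifacts above:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-jdbc</artifactId>
    <!-- keep flink-jdbc on the same version as the other Flink dependencies -->
    <version>1.3.0</version>
</dependency>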
