Processing 2.2.1 Video - AbstractMethodError - GStreamer / JNA mismatch? - processing

I am trying to use processing-video 2.2.1 as a library from my (Scala) project. I can run the demo capture sketches directly in the Processing IDE, but from my project I get an error that looks like a version mismatch:
Exception in thread "Animation Thread" java.lang.AbstractMethodError: com.sun.jna.Structure.getFieldOrder()Ljava/util/List;
at com.sun.jna.Structure.fieldOrder(Structure.java:868)
at com.sun.jna.Structure.getFields(Structure.java:894)
at com.sun.jna.Structure.deriveLayout(Structure.java:1042)
at com.sun.jna.Structure.calculateSize(Structure.java:966)
at com.sun.jna.Structure.calculateSize(Structure.java:933)
at com.sun.jna.Structure.allocateMemory(Structure.java:360)
at com.sun.jna.Structure.<init>(Structure.java:184)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GTypeInstance.<init>(GObjectAPI.java:114)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at com.sun.jna.Structure.newInstance(Structure.java:1635)
at com.sun.jna.Structure.newInstance(Structure.java:1621)
at com.sun.jna.Structure.size(Structure.java:950)
at com.sun.jna.Native.getNativeSize(Native.java:1076)
at com.sun.jna.Structure.getNativeSize(Structure.java:1927)
at com.sun.jna.Structure.getNativeSize(Structure.java:1920)
at com.sun.jna.Structure.validateField(Structure.java:1018)
at com.sun.jna.Structure.validateFields(Structure.java:1032)
at com.sun.jna.Structure.<init>(Structure.java:179)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GParamSpec.<init>(GObjectAPI.java:395)
at org.gstreamer.GObject.findProperty(GObject.java:656)
at org.gstreamer.GObject.set(GObject.java:87)
at processing.video.Capture.initGStreamer(Unknown Source)
at processing.video.Capture.<init>(Unknown Source)
at (my sketch)
The Maven POM file is here. I end up with the following libraries on the class path:
com.googlecode.gstreamer-java:gstreamer-java:1.5
net.java.dev.jna:jna:4.0.0
net.java.dev.jna:platform:3.4.0
org.processing:core:2.2.1
org.processing:video:2.2.1
My intuition says there is a mismatch between jna and platform - should they have the same version? That would indicate that the published POM is wrong. Which version does the Processing standalone use? Unfortunately the jars there are stripped of version information.
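To see which JNA actually wins at runtime, a quick diagnostic sketch like the following helps (it only prints the code-source locations of the JNA and gstreamer-java classes, so it works even though the bundled jars carry no version in their names):

// Print the jar each library was actually loaded from, to spot a stray or mismatched copy.
val jnaJar = classOf[com.sun.jna.Structure].getProtectionDomain.getCodeSource.getLocation
println(s"JNA loaded from: $jnaJar")

val gstJar = Class.forName("org.gstreamer.Gst").getProtectionDomain.getCodeSource.getLocation
println(s"gstreamer-java loaded from: $gstJar")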

Indeed, it seems the processing POM specifies an incompatible JNA version. In sbt, I could fix this with a dependencyOverrides declaration:
def processingVersion = "2.2.1"
def gstreamerVersion  = "1.5"
def jnaVersion        = "3.4.0"

libraryDependencies ++= Seq(
  "org.processing"                % "video"          % processingVersion,
  "com.googlecode.gstreamer-java" % "gstreamer-java" % gstreamerVersion
)

dependencyOverrides += "net.java.dev.jna" % "jna" % jnaVersion // !

For Gradle peeps, that's:
implementation("org.processing:core:3.3.7") {
    exclude group: 'net.java.dev.jna'
}
// https://mvnrepository.com/artifact/org.processing/video
implementation("org.processing:video:3.3.7") {
    exclude group: 'net.java.dev.jna'
}
// higher JNA versions have an abstract Structure.getFieldOrder which gstreamer-java doesn't implement
implementation "net.java.dev.jna:jna:3.4.0"

Related

Why does my deploy1Node task fail complaining about java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String?

I have a working Corda Gradle build. Our deploy1Node task works properly via IntelliJ, via cmd on Windows, and via iTerm on macOS.
We're using Corda 3.2 open source ( net.corda:corda-3.2:... ) for this particular build.
The problem also occurs with 3.3 open source, using Oracle Java "1.8.0_171" on Ubuntu and Oracle Java "1.8.0_152" on Mac.
When I try to execute it on a Linux box, I receive the following error
Caused by: java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String
as shown in the stack trace excerpt below:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':deploy1Node'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:100)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:70)
at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:63)
...
...
Caused by: java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String
at net.corda.nodeapi.internal.network.NetworkBootstrapper.generateWhitelist(NetworkBootstrapper.kt:323)
at net.corda.nodeapi.internal.network.NetworkBootstrapper.bootstrap(NetworkBootstrapper.kt:85)
at net.corda.plugins.Baseform.bootstrapNetwork(Baseform.kt:160)
at net.corda.plugins.Cordform.build(Cordform.kt:66)
...
...
How can I make this task work regardless of platform?
The open-source jar version is:
Corda-Revision: 5ae8325980ad22df8146b983afeaca344fc03c3e
Corda-Vendor: Corda Open Source
Corda-Release-Version: 3.2-corda
The enterprise version seems to be ok:
Corda-Revision: c9b23a4400923a5cfe88271ce2fedd75740eac40
Corda-Vendor: Corda Enterprise Edition
Corda-Release-Version: 3.1
Trying to track down this problem, I've found that the enterprise version displays the following in the Gradle build:
> Task :deploy1Node
Putting task artifact state for task ':deploy1Node' into context took 0.0 secs.
Executing task ':deploy1Node' (up-to-date check took 0.0 secs) due to:
Task has not declared any outputs.
Running Cordform task
Deleting ./build/nodes
Bootstrapping local test network in /mnt/builds/Cordapp/appname/build/nodes
Generating node directory for Node
Copying CorDapp JARs into node directories
and in the open source version, the output is:
> Task :deploy1Node
Putting task artifact state for task ':deploy1Node' into context took 0.0 secs.
Executing task ':deploy1Node' (up-to-date check took 0.0 secs) due to:
Task has not declared any outputs.
Running Cordform task
Deleting ./build/nodes
Bootstrapping local network in /mnt/builds/Cordapp/appname/build/nodes
Node config files found in the root directory - generating node directories
Generating directory for Node_node
Nodes found in the following sub-directories: [Node_node]
It seems a _node suffix is being appended where it should not be.
There is another reference to this problem, on a Russian site:
http://qaru.site/questions/16922067/why-does-my-deploy1node-task-fails-complaining-about-javalangclasscastexception-sunniofsunixpath-cannot-be-cast-to-javalangstring
Sharing the solution and answering my own question: to prevent this problem, make sure your ext.corda_release_group, ext.corda_release_version and ext.corda_gradle_plugins_version are defined once and only once across whatever buildscript sections your project happens to have.
The problem was caused by trying to get the build.gradle to choose between two different Corda distributions (open source and enterprise), and to change from the default choice after it was first used. For open source, the build.gradle's buildscript section must have settings like:
buildscript {
    ...
    ext.corda_release_group = 'net.corda'
    ext.corda_release_version = '3.2-corda'
    ext.corda_gradle_plugins_version = '3.0.9'
    ext.kotlin_version = '1.2.50'
    ...
Likewise, for the enterprise version, the settings have to look like:
buildscript {
    ...
    ext.corda_release_group = 'com.r3.corda'
    ext.corda_release_version = '3.1'
    ext.corda_gradle_plugins_version = '4.0.25'
    ext.kotlin_version = '1.2.50'
    ...
Once the selection had been made and then changed, the references became entangled, and the deploy1Node task failed in the open-source version.
One possibility is to use something like:
buildscript {
    ...
    ext.profile_name = "open-source"
    switch (profile_name) {
        case ~/enterprise/ :
            ext.cdb_node_location = 'nodes-enterprise'
            ext.corda_release_group = 'com.r3.corda'
            ext.corda_release_version = '3.1'
            ext.corda_gradle_plugins_version = '4.0.25'
            ext.kotlin_version = '1.2.50'
            break;
        case ~/open-source/ :
            ext.cdb_node_location = 'nodes'
            ext.corda_release_group = 'net.corda'
            ext.corda_release_version = '3.2-corda'
            ext.corda_gradle_plugins_version = '3.0.9'
            ext.kotlin_version = '1.2.50'
            break;
        default :
            ext.cdb_node_location = 'nodes'
            ext.corda_release_group = 'net.corda'
            ext.corda_release_version = '3.2-corda'
            ext.corda_gradle_plugins_version = '3.0.9'
            ext.kotlin_version = '1.2.50'
            break;
    }
    ...
}

How to connect to Oracle DB with slick 3.0.1?

I'm starting to learn and experiment with Slick.
I'm trying to connect to an Oracle dev database set up by our DBA.
However, I am encountering an issue and can't connect.
Here is what I have done so far:
oracledev = {
  url = "jdbc:oracle:thin:@//vdevdbms2:4208/TPSDEV.IADB.ORG"
  driver = com.typesafe.slick.driver.oracle.OracleDriver
  connectionPool = disable
  keepAliveConnection = true
}
I have the following in my build:
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"

libraryDependencies ++= Seq(
  "com.smartlogic.osclient" % "Semaphore-OS-Client" % "Semaphore-3.7.2",
  "com.typesafe.slick" %% "slick-extensions" % "3.1.0",
  "org.slf4j" % "slf4j-nop" % "1.6.4"
)
The code so far is simply:
object SlickSpike extends App {
  val db = Database.forConfig("oracledev")
}
I get the following error:
Exception in thread "main" java.lang.ClassNotFoundException: disable
at java.lang.ClassLoader.findClass(ClassLoader.java:530) at
java.lang.ClassLoader.loadClass(ClassLoader.java:424) at
java.lang.ClassLoader.loadClass(ClassLoader.java:357) at
slick.util.ClassLoaderUtil$$anon$1.loadClass(ClassLoaderUtil.scala:12)
at slick.jdbc.JdbcDataSource$.loadFactory$1(JdbcDataSource.scala:30)
at slick.jdbc.JdbcDataSource$.forConfig(JdbcDataSource.scala:39) at
slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forConfig(JdbcBackend.scala:268)
at slick.jdbc.JdbcBackend$$anon$3.forConfig(JdbcBackend.scala:33) at
SlickSpike$.delayedEndpoint$SlickSpike$1(SlickSpike.scala:16) at
SlickSpike$delayedInit$body.apply(SlickSpike.scala:14) at
scala.Function0$class.apply$mcV$sp(Function0.scala:34) at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76) at
scala.App$$anonfun$main$1.apply(App.scala:76) at
scala.collection.immutable.List.foreach(List.scala:381) at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76) at
SlickSpike$.main(SlickSpike.scala:14) at
SlickSpike.main(SlickSpike.scala) at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483) at
com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
If I remove the line connectionPool = disable, then I get the following error:
Exception in thread "main" java.lang.ClassNotFoundException:
slick.jdbc.hikaricp.HikariCPJdbcDataSource$ at
java.lang.ClassLoader.findClass(ClassLoader.java:530) at
java.lang.ClassLoader.loadClass(ClassLoader.java:424) at
java.lang.ClassLoader.loadClass(ClassLoader.java:357) at
slick.util.ClassLoaderUtil$$anon$1.loadClass(ClassLoaderUtil.scala:12)
at slick.jdbc.JdbcDataSource$.loadFactory$1(JdbcDataSource.scala:30)
at slick.jdbc.JdbcDataSource$.forConfig(JdbcDataSource.scala:35) at
slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forConfig(JdbcBackend.scala:268)
at slick.jdbc.JdbcBackend$$anon$3.forConfig(JdbcBackend.scala:33) at
SlickSpike$.delayedEndpoint$SlickSpike$1(SlickSpike.scala:16) at
SlickSpike$delayedInit$body.apply(SlickSpike.scala:14) at
scala.Function0$class.apply$mcV$sp(Function0.scala:34) at
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76) at
scala.App$$anonfun$main$1.apply(App.scala:76) at
scala.collection.immutable.List.foreach(List.scala:381) at
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76) at
SlickSpike$.main(SlickSpike.scala:14) at
SlickSpike.main(SlickSpike.scala)
What am I doing wrong?
I would simply like to have a connection pool of 10 and connect to the database, but I have no idea how to set it up. Can someone help?
Edit 2
I solved the initial issue, but I still have questions and can't get everything to work.
I changed my build as follows:
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api" % "1.7.13",
  "org.slf4j" % "slf4j-simple" % "1.7.13",
  "com.smartlogic.osclient" % "Semaphore-OS-Client" % "Semaphore-3.7.2" exclude("org.slf4j", "slf4j-log4j12"),
  "com.typesafe.slick" %% "slick" % "3.1.0",
  "com.typesafe.slick" %% "slick-extensions" % "3.1.0",
  "com.typesafe.slick" %% "slick-hikaricp" % "3.1.0",
  "com.oracle" % "ojdbc6" % "11.2.0.2.0"
)
I resorted to adding slick-hikaricp, even though I did not originally intend to use it.
I also now understand that the driver in the config is the actual Oracle JDBC driver, not the Slick one. This is reflected in the change I made to my config, as can be seen below:
oracledev = {
  url = "jdbc:oracle:thin:@//vdevdbms2:4208/TPSDEV.IADB.ORG"
  driver = oracle.jdbc.OracleDriver
  // connectionPool = disable
  keepAliveConnection = true
  // databaseName = "BRIKPOOLPARTYDEV"
  user = "*******"
  password = "*******"
}
Questions:
1 - Is slick-hikaricp required by default when using Oracle? Indeed, if I do not add it and comment out connectionPool = disable (which in my case does not work when uncommented anyway), the program does not compile.
2 - I'm still not able to connect. Am I missing something?
Please help.
The Oracle, DB2, and MS SQL drivers are not free. There is a separate 'slick-extensions' package that contains Slick drivers for them which you can use for development, but you have to fork over cold cash to use them in production.
You should download ojdbc7.jar, put it in a folder (lib) in your root project, and rebuild.
The connectionPool entry in the config should be disabled instead of disable.
This config entry is used when deciding which JdbcDataSource will be loaded; with disable, Slick tries to load a JdbcDataSourceFactory named disable.
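For completeness, a minimal sketch of how the corrected settings could be wired up. The HOCON is supplied inline via ConfigFactory.parseString purely for illustration (normally it lives in application.conf), and the credentials are placeholders:

import com.typesafe.config.ConfigFactory
import slick.jdbc.JdbcBackend.Database

object OracleConnectTest extends App {
  // With slick-hikaricp on the classpath, leaving connectionPool unset gives a HikariCP
  // pool; numThreads = 10 sets the thread pool size and drives the pool sizing.
  // To turn pooling off instead, use connectionPool = disabled (note the trailing "d").
  val config = ConfigFactory.parseString(
    """oracledev = {
      |  url = "jdbc:oracle:thin:@//vdevdbms2:4208/TPSDEV.IADB.ORG"
      |  driver = oracle.jdbc.OracleDriver
      |  keepAliveConnection = true
      |  numThreads = 10
      |  user = "scott"     # placeholder credentials
      |  password = "tiger"
      |}""".stripMargin)

  val db = Database.forConfig("oracledev", config)
  try {
    // ... run queries through slick-extensions' Oracle profile here ...
  } finally db.close()
}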

Spark streaming + json4s-jackson dependency problems

I am unable to use json4s-jackson 3.2.11 within my Spark 1.4.1 streaming application.
Thinking that it was the existing dependency within the spark-core project that was causing the problem (as explained here: Is it possible to use json4s 3.2.11 with Spark 1.3.0?), I built Spark from source with an adjusted core/pom.xml, changing the reference from json4s-jackson_2.10:3.2.10 to 3.2.11, since 3.2.10 does not support extracting to implicit types.
I have replaced the jars referenced in my IntelliJ IDEA project with the rebuilt jars; however, I am still getting the same errors as before. I fear that Spark must still be referencing json4s 3.2.10 somehow.
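To check whether the old 3.2.10 binary still wins on the runtime classpath, a quick sketch like this (just printing where the json4s classes are actually loaded from) can help:

// If this still points at a json4s-jackson 3.2.10 jar, the rebuilt Spark jars
// are not the ones on the runtime classpath.
val loadedFrom = org.json4s.jackson.JsonMethods.getClass
  .getProtectionDomain.getCodeSource.getLocation
println(s"json4s-jackson loaded from: $loadedFrom")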
Here is my simple test:
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import org.json4s._
import org.json4s.jackson.JsonMethods._

object StreamingPredictor {
  implicit val formats = DefaultFormats

  case class event(Key: String,
                   sensorId: String,
                   sessionId: String,
                   deviceId: String,
                   playerId: String,
                   impressionId: String,
                   time: String,
                   eventName: String,
                   eventProperties: Map[String, Any],
                   dl: Array[List[(String, Any)]],
                   $post: Boolean,
                   $sync: Boolean)

  def parser(json: String): String = {
    val parsedJson = parse(json)
    val foo = parsedJson.extract[event]
    foo.eventName
  }

  def main(args: Array[String]) {
    val zkQuorum = "localhost:2181"
    val group = "myGroup"
    val topic = Map("test" -> 1)
    val sparkContext = new SparkContext("local[4]", "KafkaConsumer")
    val ssc = new StreamingContext(sparkContext, Seconds(1))
    val json = KafkaUtils.createStream(ssc, zkQuorum, group, topic)
    val eventName = json.map(_._2).map(parser)
    eventName.print()
    ssc.start()
    ssc.awaitTermination() // keep the streaming context running
  }
}
The error I get when referencing json4s 3.2.11 in my application pom.xml file:
java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.render(Lorg/json4s/JsonAST$JValue;)Lorg/json4s/JsonAST$JValue;
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener.onBlockManagerAdded(EventLoggingListener.scala:174)
at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:46)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
And the error I get when I use json4s-jackson_2.10:3.2.10 in my application pom.xml file:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost): org.json4s.package$MappingException: No usable value for eventProperties
No information known about type
at org.json4s.reflect.package$.fail(package.scala:96)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:443)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:451)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$.extract(Extraction.scala:42)
at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
at com.pca.triggar.Streaming.StreamingPredictor$.parser(StreamingPredictor.scala:38)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.json4s.package$MappingException: No information known about type
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:465)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$.extract(Extraction.scala:316)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:431)
... 42 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
OK, I found the problem. As posted elsewhere, you need to compile against json4s 3.2.10. Apparently, doing so generates a binary which then works with Spark (version 1.5 in my case; the same holds for some earlier versions as well). It has to do with the default parameter in the render() method which shows up in 3.2.11.
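For example, if the application were built with sbt rather than the Maven POM in the question (the same pin can go in a Maven dependencyManagement section), keeping it on 3.2.10 would look roughly like this:

// build.sbt sketch: stay on the json4s version Spark 1.4/1.5 was compiled against
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.2.10"

// guard against a transitive dependency bumping it back to 3.2.11
dependencyOverrides += "org.json4s" %% "json4s-jackson" % "3.2.10"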
I had the same issue with EMR 4.3.0 and Spark 1.6.
I solved it by installing json4s in the bootstrap action:
Download the json4s jar and put it in S3.
Create the following shell script and put it in S3:
#!/bin/bash
set -e
wget -S -T 10 -t 5 https://s3.amazonaws.com/your-bucketname/json4s-native_2.10-3.2.4.jar
mkdir -p /home/hadoop/lib
mv json4s-native_2.10-3.2.4.jar /home/hadoop/lib/
Add it as a bootstrap step in the EMR launch steps.

How Do I Configure Buildr to run ScalaTest 2.11?

I'm using Buildr 1.4.20 (with Ruby 2.0.0 on 64-bit Linux) and trying to use ScalaTest with Scala 2.11.2, but I'm getting the following ClassNotFoundException every time I try to run buildr test.
Running tests in MyProject
ScalaTest "MySimpleTest"
An exception or error caused a run to abort. This may have been caused by a problematic custom reporter.
java.lang.NoClassDefFoundError: scala/xml/MetaData
at org.scalatest.tools.ReporterFactory.createJunitXmlReporter(ReporterFactory.scala:209)
at org.scalatest.tools.ReporterFactory.getReporterFromConfiguration(ReporterFactory.scala:230)
at org.scalatest.tools.ReporterFactory$$anonfun$createReportersFromConfigurations$1.apply(ReporterFactory.scala:242)
at org.scalatest.tools.ReporterFactory$$anonfun$createReportersFromConfigurations$1.apply(ReporterFactory.scala:241)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
at scala.collection.Iterator$class.foreach(Iterator.scala:743)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1177)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at org.scalatest.tools.ReporterConfigurations.foreach(ReporterConfiguration.scala:43)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
at org.scalatest.tools.ReporterConfigurations.map(ReporterConfiguration.scala:43)
at org.scalatest.tools.ReporterFactory.createReportersFromConfigurations(ReporterFactory.scala:241)
at org.scalatest.tools.ReporterFactory.getDispatchReporter(ReporterFactory.scala:245)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2720)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
at org.scalatest.tools.Runner$.run(Runner.scala:883)
at org.scalatest.tools.ScalaTestAntTask.execute(ScalaTestAntTask.scala:329)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
Caused by: java.lang.ClassNotFoundException: scala.xml.MetaData
at org.apache.tools.ant.AntClassLoader.findClassInComponents(AntClassLoader.java:1365)
at org.apache.tools.ant.AntClassLoader.findClass(AntClassLoader.java:1315)
at org.apache.tools.ant.AntClassLoader.loadClass(AntClassLoader.java:1068)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 19 more
Naturally, I thought I could fix this by adding a dependency with scala.xml.MetaData in it, so I added "org.scala-lang.modules:scala-xml_2.11:jar:1.0.2" to my test classpath, but I still get the same error.
I'm sure the class is indeed present in the .jar file:
atg#host:~> zipinfo ~/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.2/scala-xml_2.11-1.0.2.jar | grep MetaData
-rw---- 2.0 fat 1441 bl defN 14-May-20 10:09 scala/xml/MetaData$$anonfun$asAttrMap$1.class
-rw---- 2.0 fat 1312 bl defN 14-May-20 10:09 scala/xml/MetaData$$anonfun$toString$1.class
-rw---- 2.0 fat 1215 bl defN 14-May-20 10:09 scala/xml/MetaData$$anonfun$toString1$1.class
-rw---- 2.0 fat 4197 bl defN 14-May-20 10:09 scala/xml/MetaData$.class
-rw---- 2.0 fat 10489 bl defN 14-May-20 10:09 scala/xml/MetaData.class
... so I can only assume that test.with isn't the right way to add this dependency in a Scala project. Can anyone please offer any advice on how to fix this?
My entire buildfile is as follows:
# enable Scala 2.11.2
Buildr.settings.build['scala.version'] = "2.11.2"
Buildr.settings.build['scala.test'] = 'org.scalatest:scalatest_2.11:jar:2.2.2'
Buildr.settings.build['scala.check'] = 'org.scalacheck:scalacheck_2.11:jar:1.11.6'
require 'buildr/scala'
VERSION_NUMBER = "1.0-SNAPSHOT"
GROUP = "..."
COPYRIGHT = "..."
repositories.remote << "http://repo1.maven.org/maven2"
DEPS_COMPILE = "javax.servlet:javax.servlet-api:jar:3.1.0"
desc "..."
define "MyProject" do
project.version = VERSION_NUMBER
project.group = GROUP
manifest["Implementation-Vendor"] = COPYRIGHT
compile.with DEPS_COMPILE
test.with "org.scala-lang.modules:scala-xml_2.11:jar:1.0.2"
end
Your buildfile is OK; this seems to be a problem with Buildr.
As a workaround, you could add scala-xml to the Ant dependencies, like so:
VERSION_NUMBER = "1.0-SNAPSHOT"
GROUP = "..."
COPYRIGHT = "..."
Ant.dependencies << "org.scala-lang.modules:scala-xml_2.11:jar:1.0.2"
You should also file a bug in the Buildr repository.

Unable to use logback.groovy, but logback.xml works

I wanted to configure Logback using the Groovy DSL. The file is very simple:
import ch.qos.logback.classic.encoder.PatternLayoutEncoder
import ch.qos.logback.core.ConsoleAppender
import static ch.qos.logback.classic.Level.DEBUG
import static ch.qos.logback.classic.Level.INFO
appender("stdout", ConsoleAppender) {
encoder(PatternLayoutEncoder) {
pattern = "%d %p [%c] - <%m>%n"
}
}
root(INFO, ["stdout"])
I use Gradle to build my application and run it with jettyRun. I get the following error:
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object 'ch.qos.logback.core.ConsoleAppender[null]' with class 'ch.qos.logback.core.ConsoleAppender' to class 'ch.qos.logback.core.Appender'
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToType(DefaultTypeTransformation.java:360)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(ScriptBytecodeAdapter.java:599)
at ch.qos.logback.classic.gaffer.ConfigurationDelegate.appender(ConfigurationDelegate.groovy:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at org.codehaus.groovy.runtime.metaclass.MixinInstanceMetaMethod.invoke(MixinInstanceMetaMethod.java:53)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoMetaMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:308)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:46)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
However, when I switch to the equivalent XML configuration, everything works. What am I doing wrong?
I'm using Logback 1.0.0; I also tried Logback 1.0.3.
I figured the solution out, but some questions remain open. The problem was that I had no proper Groovy on the classpath. I decided to make an example project to demonstrate this bug. I started with a console application using Gradle's "application" plugin. I didn't include Groovy as a dependency.
apply plugin: 'application'
repositories {
    mavenCentral()
}

ext.logbackVersion = '1.0.3'
ext.slf4jVersion = '1.6.4'

dependencies {
    compile "ch.qos.logback:logback-classic:$ext.logbackVersion"
    compile "org.slf4j:jcl-over-slf4j:$ext.slf4jVersion"
    //runtime "org.codehaus.groovy:groovy:1.8.6" // the problem was here
}
mainClassName = "org.test.Main"
This gave me an error which is quite straightforward:
|-ERROR in ch.qos.logback.classic.LoggerContext[default] - Groovy classes are not available on the class path. ABORTING INITIALIZATION.
OK, cool. The dependency was missing - easy to fix. But why didn't I get the same error when I ran my web application? Adding the Groovy dependency solved the initial problem in the web application. I stripped my project down and will create a corresponding JIRA. Perhaps the Groovy-on-classpath detection is not quite correct.
The GroovyCastException "cannot cast ConsoleAppender as Appender" has all the markings of a class loader issue. Which version of Groovy is this? Could you open a bug report including a test case for reproducing this issue?
Colleagues, I have faced almost the same trouble today:
When I use logback.xml, everything works fine.
When I use logback.groovy in IntelliJ IDEA, everything works fine too.
When I use logback.groovy and start my script from the command line, I get a lot of errors like:
D:\Projects\PRDMonitoring\sources>groovy tray.groovy PRD
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Script1.groovy: 2: unable to resolve class ch.qos.logback.classic.filter.LevelFilter
@ line 2, column 1.
import ch.qos.logback.classic.filter.LevelFilter
^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.encoder.PatternLayoutEncoder
@ line -1, column -1.
Script1.groovy: 3: unable to resolve class ch.qos.logback.core.ConsoleAppender
@ line 3, column 1.
import ch.qos.logback.core.ConsoleAppender
^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.Level
@ line -1, column -1.
Script1.groovy: 6: unable to resolve class ch.qos.logback.core.spi.FilterReply
@ line 6, column 1.
import static ch.qos.logback.core.spi.FilterReply.ACCEPT
^
Script1.groovy: 7: unable to resolve class ch.qos.logback.core.spi.FilterReply
@ line 7, column 1.
import static ch.qos.logback.core.spi.FilterReply.DENY
But after a couple of minutes looking for a solution, I figured out that the following line before the @Grapes annotation fixes the class-loading trouble: @GrabConfig(systemClassLoader=true)
@GrabConfig(systemClassLoader=true)
@Grapes([
    @Grab(group = 'org.codehaus.groovy.modules.http-builder', module = 'http-builder', version = '0.6'),
    @Grab(group = 'org.apache.commons', module = 'commons-lang3', version = '3.0'),
    @Grab(group = 'commons-io', module = 'commons-io', version = '2.4'),
    @Grab(group = 'joda-time', module = 'joda-time', version = '2.9.4'),
    @Grab(group = 'ch.qos.logback', module = 'logback-classic', version = '1.1.7'),
    @Grab(group = 'ch.qos.logback', module = 'logback-core', version = '1.1.7')
])
