Unable to use logback.groovy, but logback.xml works - gradle

I wanted to configure Logback using Groovy DSL. The file is very simple:
import ch.qos.logback.classic.encoder.PatternLayoutEncoder
import ch.qos.logback.core.ConsoleAppender
import static ch.qos.logback.classic.Level.DEBUG
import static ch.qos.logback.classic.Level.INFO
appender("stdout", ConsoleAppender) {
encoder(PatternLayoutEncoder) {
pattern = "%d %p [%c] - <%m>%n"
}
}
root(INFO, ["stdout"])
I use Gradle to build my application and run it with jettyRun. I get the following error:
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object 'ch.qos.logback.core.ConsoleAppender[null]' with class 'ch.qos.logback.core.ConsoleAppender' to class 'ch.qos.logback.core.Appender'
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToType(DefaultTypeTransformation.java:360)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(ScriptBytecodeAdapter.java:599)
at ch.qos.logback.classic.gaffer.ConfigurationDelegate.appender(ConfigurationDelegate.groovy:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at org.codehaus.groovy.runtime.metaclass.MixinInstanceMetaMethod.invoke(MixinInstanceMetaMethod.java:53)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoMetaMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:308)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:46)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
However, when I switch to the equivalent XML configuration, everything works. What am I doing wrong?
I'm using Logback 1.0.0 and have also tried 1.0.3.
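For reference, the equivalent XML configuration looks roughly like this:
<configuration>
    <appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <pattern>%d %p [%c] - &lt;%m&gt;%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="stdout" />
    </root>
</configuration>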

I figured the solution out, but some questions remain open. The problem was that I did not have Groovy on the classpath. I decided to make an example project to demonstrate this bug. I started with a console application using Gradle's "application" plugin and didn't include Groovy as a dependency.
apply plugin: 'application'

repositories {
    mavenCentral()
}

ext.logbackVersion = '1.0.3'
ext.slf4jVersion = '1.6.4'

dependencies {
    compile "ch.qos.logback:logback-classic:$ext.logbackVersion"
    compile "org.slf4j:jcl-over-slf4j:$ext.slf4jVersion"
    //runtime "org.codehaus.groovy:groovy:1.8.6" // the problem was here
}

mainClassName = "org.test.Main"
This gave me an error, which is quite straightforward:
|-ERROR in ch.qos.logback.classic.LoggerContext[default] - Groovy classes are not available on the class path. ABORTING INITIALIZATION.
OK, cool. The dependency was missing, which is easy to fix. But why didn't I get the same error when I ran my web application? Adding the Groovy dependency solved the initial problem in the web application. I stripped my project down and will create a corresponding JIRA issue. Perhaps the detection of Groovy on the classpath is not quite correct.
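For completeness, the fixed dependencies block is just the one above with the Groovy line uncommented:
dependencies {
    compile "ch.qos.logback:logback-classic:$ext.logbackVersion"
    compile "org.slf4j:jcl-over-slf4j:$ext.slf4jVersion"
    runtime "org.codehaus.groovy:groovy:1.8.6" // needed so logback.groovy can be processed
}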

The GroovyCastException "cannot cast ConsoleAppender as Appender" has all the hallmarks of a class loader issue. Which version of Groovy is this? Could you open a bug report including a test case for reproducing this issue?

Colleagues,
I have faced almost the same problem today:
When I use logback.xml, everything works fine.
When I use logback.groovy in IntelliJ IDEA, everything works fine too.
When I use logback.groovy and start my script from the command line, I get a lot of errors like:
D:\Projects\PRDMonitoring\sources>groovy tray.groovy PRD
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Script1.groovy: 2: unable to resolve class ch.qos.logback.classic.filter.LevelFilter
 @ line 2, column 1.
   import ch.qos.logback.classic.filter.LevelFilter
   ^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.encoder.PatternLayoutEncoder
 @ line -1, column -1.
Script1.groovy: 3: unable to resolve class ch.qos.logback.core.ConsoleAppender
 @ line 3, column 1.
   import ch.qos.logback.core.ConsoleAppender
   ^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.Level
 @ line -1, column -1.
Script1.groovy: 6: unable to resolve class ch.qos.logback.core.spi.FilterReply
 @ line 6, column 1.
   import static ch.qos.logback.core.spi.FilterReply.ACCEPT
   ^
Script1.groovy: 7: unable to resolve class ch.qos.logback.core.spi.FilterReply
 @ line 7, column 1.
   import static ch.qos.logback.core.spi.FilterReply.DENY
But after a couple of minutes looking for a solution, I figured out that adding the following line before the @Grapes annotation fixes the class-loading trouble: @GrabConfig(systemClassLoader=true)
@GrabConfig(systemClassLoader=true)
@Grapes([
    @Grab(group = 'org.codehaus.groovy.modules.http-builder', module = 'http-builder', version = '0.6'),
    @Grab(group = 'org.apache.commons', module = 'commons-lang3', version = '3.0'),
    @Grab(group = 'commons-io', module = 'commons-io', version = '2.4'),
    @Grab(group = 'joda-time', module = 'joda-time', version = '2.9.4'),
    @Grab(group = 'ch.qos.logback', module = 'logback-classic', version = '1.1.7'),
    @Grab(group = 'ch.qos.logback', module = 'logback-core', version = '1.1.7')
])

Related

Overwrite Databricks Dependency

In our project we're using com.typesafe:config in version 1.3.4. According to the latest release notes, this dependency is already provided by Databricks on the cluster, but in a very old version (1.2.1).
How can I overwrite the provided dependency with our own version?
We use Maven; in our dependencies I have:
<dependency>
    <groupId>com.typesafe</groupId>
    <artifactId>config</artifactId>
    <version>1.3.4</version>
</dependency>
Our created jar file should therefore contain the newer version.
I created a Job by uploading the jar file. The Job fails because it can't find a method that was added after version 1.2.1, so it looks like the library we provided gets overwritten by the older version on the cluster.
In the end we fixed this by shading the relevant classes, adding the following to our build.sbt:
assemblyShadeRules in assembly := Seq(
    ShadeRule.rename("com.typesafe.config.**" -> "shadedSparkConfigForSpark.@1").inAll
)
We solved it in the end by utilizing Spark's ChildFirstURLClassLoader. The project is open source, so you can check it out yourself here and see the usage of the method here.
But for reference, here is the method in its entirety. You need to provide a Seq of jars that you want to override with your own; in our case it's the Typesafe config jar.
def getChildFirstClassLoader(jars: Seq[String]): ChildFirstURLClassLoader = {
  val initialLoader = getClass.getClassLoader.asInstanceOf[URLClassLoader]

  @tailrec
  def collectUrls(clazz: ClassLoader, acc: Map[String, URL]): Map[String, URL] = {
    val urlsAcc: Map[String, URL] = acc ++
      // add urls on this level to accumulator
      clazz.asInstanceOf[URLClassLoader].getURLs
        .map(url => (url.getFile.split(Environment.defaultPathSeparator).last, url))
        .filter { case (name, url) => jars.contains(name) }
        .toMap

    // check if any jars without URL are left
    val jarMissing = jars.exists(jar => urlsAcc.get(jar).isEmpty)
    // return accumulated if there is no parent left or no jars are missing anymore
    if (clazz.getParent == null || !jarMissing) urlsAcc else collectUrls(clazz.getParent, urlsAcc)
  }

  // search classpath hierarchy until all jars are found or we have reached the top
  val urlsMap = collectUrls(initialLoader, Map())

  // check if everything was found
  val jarsNotFound = jars.filter(jar => urlsMap.get(jar).isEmpty)
  if (jarsNotFound.nonEmpty) {
    logger.info(s"""available jars are ${initialLoader.getURLs.mkString(", ")} (not including parent classpaths)""")
    throw ConfigurationException(s"""jars ${jarsNotFound.mkString(", ")} not found in parent class loaders classpath. Cannot initialize ChildFirstURLClassLoader.""")
  }

  // create child-first classloader
  new ChildFirstURLClassLoader(urlsMap.values.toArray, initialLoader)
}
As you can see, it also has some logic to abort if the jar files you specified do not exist in the classpath.
Databricks supports initialization scripts (cluster scoped or global) so that you can install or remove any dependency. The details are at https://docs.databricks.com/clusters/init-scripts.html.
In your initialization script, you can remove the default jar file located on the Databricks driver/executor classpath at /databricks/jars/ and add the expected one there.
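As a rough, hypothetical sketch (the jar name pattern and the DBFS source path below are placeholders, not verified Databricks file names), such an init script could do something like:
#!/bin/bash
# Hypothetical sketch: jar names and the DBFS source path are placeholders, adjust for your cluster.
# Remove the old com.typesafe config 1.2.1 jar provided on the cluster classpath...
rm -f /databricks/jars/*com.typesafe*config*1.2.1*.jar
# ...and copy our own newer version there (assuming it was uploaded to DBFS beforehand).
cp /dbfs/FileStore/jars/config-1.3.4.jar /databricks/jars/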

Why does my deploy1Node task fail complaining about java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String?

I have a working Corda Gradle build. Our deploy1Node task works properly via IntelliJ, via cmd on Windows, and via iTerm on macOS.
We're using corda-3.2 open source ( net.corda:corda-3.2:... ) for this particular build.
The problem occurs with 3.3 open source as well, with
Oracle Java "1.8.0_171" on Ubuntu and Oracle Java "1.8.0_152" on Mac.
When I try to execute it on a Linux box, I receive the following error
Caused by: java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String
as shown in the stack trace excerpt below:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':deploy1Node'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:100)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:70)
at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:63)
...
...
Caused by: java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to java.lang.String
at net.corda.nodeapi.internal.network.NetworkBootstrapper.generateWhitelist(NetworkBootstrapper.kt:323)
at net.corda.nodeapi.internal.network.NetworkBootstrapper.bootstrap(NetworkBootstrapper.kt:85)
at net.corda.plugins.Baseform.bootstrapNetwork(Baseform.kt:160)
at net.corda.plugins.Cordform.build(Cordform.kt:66)
...
...
How can I make this task work regardless of platform?
The open-source jar version is:
Corda-Revision: 5ae8325980ad22df8146b983afeaca344fc03c3e
Corda-Vendor: Corda Open Source
Corda-Release-Version: 3.2-corda
The enterprise version seems to be ok:
Corda-Revision: c9b23a4400923a5cfe88271ce2fedd75740eac40
Corda-Vendor: Corda Enterprise Edition
Corda-Release-Version: 3.1
Trying to track down this problem, I found that the enterprise version
displays the following in the Gradle build:
> Task :deploy1Node
Putting task artifact state for task ':deploy1Node' into context took 0.0 secs.
Executing task ':deploy1Node' (up-to-date check took 0.0 secs) due to:
Task has not declared any outputs.
Running Cordform task
Deleting ./build/nodes
Bootstrapping local test network in /mnt/builds/Cordapp/appname/build/nodes
Generating node directory for Node
Copying CorDapp JARs into node directories
and in the open source version, the output is:
> Task :deploy1Node
Putting task artifact state for task ':deploy1Node' into context took 0.0 secs.
Executing task ':deploy1Node' (up-to-date check took 0.0 secs) due to:
Task has not declared any outputs.
Running Cordform task
Deleting ./build/nodes
Bootstrapping local network in /mnt/builds/Cordapp/appname/build/nodes
Node config files found in the root directory - generating node directories
Generating directory for Node_node
Nodes found in the following sub-directories: [Node_node]
It seems a _node suffix is being appended where it should not be.
There is another reference to this problem on a Russian site:
http://qaru.site/questions/16922067/why-does-my-deploy1node-task-fails-complaining-about-javalangclasscastexception-sunniofsunixpath-cannot-be-cast-to-javalangstring
Sharing the solution and answering my own question: to prevent this problem, make sure your ext.corda_release_group, ext.corda_release_version and ext.corda_gradle_plugins_version are defined once and only once in any buildscript block(s) your project happens to have.
The problem was caused by trying to get the build.gradle to choose between two different Corda distributions (open source and enterprise), and to change from the default choice after it was first used. For open source, the build.gradle's buildscript section must have settings like:
buildscript {
    ...
    ext.corda_release_group = 'net.corda'
    ext.corda_release_version = '3.2-corda'
    ext.corda_gradle_plugins_version = '3.0.9'
    ext.kotlin_version = '1.2.50'
    ...
Likewise, for the enterprise version, the settings have to be like:
buildscript {
    ...
    ext.corda_release_group = 'com.r3.corda'
    ext.corda_release_version = '3.1'
    ext.corda_gradle_plugins_version = '4.0.25'
    ext.kotlin_version = '1.2.50'
    ...
Once one distribution had been selected and later changed, the references became entangled, and the deploy1Node task failed in the open-source version.
One possibility is to use something like:
buildscript {
    ...
    ext.profile_name = "open-source"
    switch (profile_name) {
        case ~/enterprise/ :
            ext.cdb_node_location = 'nodes-enterprise'
            ext.corda_release_group = 'com.r3.corda'
            ext.corda_release_version = '3.1'
            ext.corda_gradle_plugins_version = '4.0.25'
            ext.kotlin_version = '1.2.50'
            break;
        case ~/open-source/ :
            ext.cdb_node_location = 'nodes'
            ext.corda_release_group = 'net.corda'
            ext.corda_release_version = '3.2-corda'
            ext.corda_gradle_plugins_version = '3.0.9'
            ext.kotlin_version = '1.2.50'
            break;
        default :
            ext.cdb_node_location = 'nodes'
            ext.corda_release_group = 'net.corda'
            ext.corda_release_version = '3.2-corda'
            ext.corda_gradle_plugins_version = '3.0.9'
            ext.kotlin_version = '1.2.50'
            break;
    }

Groovy: unable to resolve class when running from command line

I'm trying to run a Groovy script from the command line. It runs fine in a Gradle project in IntelliJ, but when I try to run it via the command line I get several errors like this one:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/home/tpulayan/jenkins-watcher/src/main/groovy/attempt_authentification.groovy: 1: unable to resolve class in.ashwanthkumar.slack.webhook.Slack
 @ line 1, column 1.
   import in.ashwanthkumar.slack.webhook.Slack
   ^
1 error
My code:
@Grapes(
    @Grab(group='in.ashwanthkumar', module='slack-java-webhook', version='0.0.7')
)
import in.ashwanthkumar.slack.webhook.Slack
import in.ashwanthkumar.slack.webhook.SlackMessage
import org.apache.http.HttpException
import org.apache.http.HttpHost
import org.apache.http.HttpRequest
import org.apache.http.HttpRequestInterceptor
import org.apache.http.auth.AuthScheme
import org.apache.http.auth.AuthScope
import org.apache.http.auth.AuthState
import org.apache.http.auth.UsernamePasswordCredentials
import org.apache.http.client.CredentialsProvider
import org.apache.http.client.methods.HttpGet
import org.apache.http.client.protocol.ClientContext
import org.apache.http.impl.auth.BasicScheme
import org.apache.http.impl.client.DefaultHttpClient
import org.apache.http.protocol.BasicHttpContext
import org.apache.http.protocol.ExecutionContext
import org.apache.http.protocol.HttpContext
import org.apache.http.util.EntityUtils
use(TimerMethods) {
    def timer = new Timer()
    def periodSeconds = 600
    new Slack()
            .icon(':exclamation:')
            .sendToChannel('jenkins-monitoring')
            .displayName('Jenkins Watcher')
            .push(new SlackMessage("Jenkins Watcher started at ${new Date()}. Started to monitor the status of deb-jenkins-prd..."))
    def task = timer.runEvery(1000, periodSeconds * 1000) {
        def response = getAuthentication().statusCode
        if (response != 200) {
            new Slack()
                    .icon(':exclamation:')
                    .sendToChannel('jenkins-monitoring')
                    .displayName('Jenkins Watcher')
                    .push(new SlackMessage("Login attempt by Jenkins Watcher failed at ${new Date()}. This could indicate that Jenkins is stuck! Request response code is: ${response}"))
        }
    }
}
I suppose it's something related to the classpath but I was unable to solve it.
Any help is much appreciated!
EDIT AFTER RESPONSE
I've updated my code with the @Grab, but unfortunately I get this error when running it from the command line:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
General error during conversion: Error grabbing Grapes -- [unresolved dependency: in.ashwanthkumar#slack-java-webhook;0.0.7: not found]
java.lang.RuntimeException: Error grabbing Grapes -- [unresolved dependency: in.ashwanthkumar#slack-java-webhook;0.0.7: not found]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:83)
at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:238)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:250)
at groovy.grape.GrapeIvy.getDependencies(GrapeIvy.groovy:464)
It seems you have defined the dependencies in build.gradle.
You can put all the dependencies (*.jar) into a ./lib directory and run your Groovy script like this:
groovy -cp ./lib/* MyScript.groovy
Or you can specify all the dependencies using Grape/Grab directly in your Groovy script; then your command line stays simple and Groovy will download all dependencies on script start:
groovy MyScript.groovy

Spark streaming + json4s-jackson dependency problems

I am unable to use json4s-jackson 3.2.11 within my Spark 1.4.1 Streaming application.
Thinking that it was the existing dependency within the spark-core project that was causing the problem, as explained here -> Is it possible to use json4s 3.2.11 with Spark 1.3.0?, I built Spark from source with an adjusted core/pom.xml, changing the reference from json4s-jackson_2.10:3.2.10 to 3.2.11, as the 3.2.10 version does not support extracting to implicit types.
I have replaced the source jars referenced in my IntelliJ IDEA project with the rebuilt jars, however I am still getting the same errors as before. I fear that Spark must still be referencing json4s 3.2.10 somehow?
Here is my simple test:
// imports needed for this snippet
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object StreamingPredictor {
  implicit val formats = DefaultFormats

  case class event(Key: String,
                   sensorId: String,
                   sessionId: String,
                   deviceId: String,
                   playerId: String,
                   impressionId: String,
                   time: String,
                   eventName: String,
                   eventProperties: Map[String, Any],
                   dl: Array[List[(String, Any)]],
                   $post: Boolean,
                   $sync: Boolean)

  def parser(json: String): String = {
    val parsedJson = parse(json)
    val foo = parsedJson.extract[event]
    foo.eventName
  }

  def main(args: Array[String]) {
    val zkQuorum = "localhost:2181"
    val group = "myGroup"
    val topic = Map("test" -> 1)
    val sparkContext = new SparkContext("local[4]", "KafkaConsumer")
    val ssc = new StreamingContext(sparkContext, Seconds(1))
    val json = KafkaUtils.createStream(ssc, zkQuorum, group, topic)
    val eventName = json.map(_._2).map(parser)
    eventName.print()
    ssc.start()
  }
}
The error I get when referencing json4s 3.2.11 in my application pom.xml file:
java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.render(Lorg/json4s/JsonAST$JValue;)Lorg/json4s/JsonAST$JValue;
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener.onBlockManagerAdded(EventLoggingListener.scala:174)
at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:46)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
And the error I get when I use json4s-jackson_2.10:3.2.10 in my application pom.xml file:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost): org.json4s.package$MappingException: No usable value for eventProperties
No information known about type
at org.json4s.reflect.package$.fail(package.scala:96)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:443)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:451)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$.extract(Extraction.scala:42)
at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
at com.pca.triggar.Streaming.StreamingPredictor$.parser(StreamingPredictor.scala:38)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.json4s.package$MappingException: No information known about type
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:465)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$.extract(Extraction.scala:316)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:431)
... 42 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
OK, I found the problem. As posted elsewhere, you need to compile against json4s 3.2.10. Apparently, doing so generates a binary which then works with Spark (version 1.5 in my case, and the same in some earlier versions as well). It has to do with the default parameter in the render() method, which shows up in 3.2.11.
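In other words, pin the dependency to the 3.2.10 line in your pom.xml (assuming the Scala 2.10 artifact used in the question):
<dependency>
    <groupId>org.json4s</groupId>
    <artifactId>json4s-jackson_2.10</artifactId>
    <version>3.2.10</version>
</dependency>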
I had the same issue with EMR 4.3.0 and Spark 1.6.
I solved it by installing json4s in a bootstrap action:
Download the json4s jar and put it in S3.
Create the following shell script and put it in S3:
#!/bin/bash
set -e
wget -S -T 10 -t 5 https://s3.amazonaws.com/your-bucketname/json4s-native_2.10-3.2.4.jar
mkdir -p /home/hadoop/lib
mv json4s-native_2.10-3.2.4.jar /home/hadoop/lib/
Add it as a bootstrap step in the EMR launch steps.

Processing 2.2.1 Video - AbstractMethodError - GStreamer / JNA mismatch?

I am trying to use processing-video 2.2.1 as a library from my (Scala) project. I can run the demo capture sketches directly in the Processing IDE, but from my project I get an error that looks like a version mismatch:
Exception in thread "Animation Thread" java.lang.AbstractMethodError: com.sun.jna.Structure.getFieldOrder()Ljava/util/List;
at com.sun.jna.Structure.fieldOrder(Structure.java:868)
at com.sun.jna.Structure.getFields(Structure.java:894)
at com.sun.jna.Structure.deriveLayout(Structure.java:1042)
at com.sun.jna.Structure.calculateSize(Structure.java:966)
at com.sun.jna.Structure.calculateSize(Structure.java:933)
at com.sun.jna.Structure.allocateMemory(Structure.java:360)
at com.sun.jna.Structure.<init>(Structure.java:184)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GTypeInstance.<init>(GObjectAPI.java:114)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at com.sun.jna.Structure.newInstance(Structure.java:1635)
at com.sun.jna.Structure.newInstance(Structure.java:1621)
at com.sun.jna.Structure.size(Structure.java:950)
at com.sun.jna.Native.getNativeSize(Native.java:1076)
at com.sun.jna.Structure.getNativeSize(Structure.java:1927)
at com.sun.jna.Structure.getNativeSize(Structure.java:1920)
at com.sun.jna.Structure.validateField(Structure.java:1018)
at com.sun.jna.Structure.validateFields(Structure.java:1032)
at com.sun.jna.Structure.<init>(Structure.java:179)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GParamSpec.<init>(GObjectAPI.java:395)
at org.gstreamer.GObject.findProperty(GObject.java:656)
at org.gstreamer.GObject.set(GObject.java:87)
at processing.video.Capture.initGStreamer(Unknown Source)
at processing.video.Capture.<init>(Unknown Source)
at (my sketch)
The Maven POM file is here. I end up with the following libraries on the class path:
com.googlecode.gstreamer-java:gstreamer-java:1.5
net.java.dev.jna:jna:4.0.0
net.java.dev.jna:platform:3.4.0
org.processing:core:2.2.1
org.processing:video:2.2.1
My intuition says there is a mismatch between jna and platform: should they have the same version? That would indicate that the published POM is wrong. Which versions does the Processing standalone distribution use? Unfortunately, the jars there are stripped of version information.
Indeed, it seems the processing POM specifies an incompatible JNA version. In sbt, I could fix this with a dependencyOverrides declaration:
def processingVersion = "2.2.1"
def gstreamerVersion = "1.5"
def jnaVersion = "3.4.0"
libraryDependencies ++= Seq(
  "org.processing" % "video" % processingVersion,
  "com.googlecode.gstreamer-java" % "gstreamer-java" % gstreamerVersion
)
dependencyOverrides += "net.java.dev.jna" % "jna" % jnaVersion // !
For Gradle peeps, that's:
implementation ("org.processing:core:3.3.7") {
exclude group: 'net.java.dev.jna'
}
// https://mvnrepository.com/artifact/org.processing/video
implementation ("org.processing:video:3.3.7) {
exclude group: 'net.java.dev.jna'
}
// higher jna versions have abstract Structure.getFieldOrder which gstreamer doesn't implement
implementation "net.java.dev.jna:jna:3.4.0"
