How to connect to Oracle DB with slick 3.0.1?

I'm starting to learn and experiment with Slick.
I'm trying to connect to an Oracle dev database set up by our DBA.
However, I am running into issues and can't connect.
Here is what I did so far:
oracledev = {
  url = "jdbc:oracle:thin:#//vdevdbms2:4208/TPSDEV.IADB.ORG"
  driver = com.typesafe.slick.driver.oracle.OracleDriver
  connectionPool = disable
  keepAliveConnection = true
}
I have the following in my build:
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"
libraryDependencies ++=
  Seq(
    "com.smartlogic.osclient" % "Semaphore-OS-Client" % "Semaphore-3.7.2",
    "com.typesafe.slick" %% "slick-extensions" % "3.1.0",
    "org.slf4j" % "slf4j-nop" % "1.6.4"
  )
The code so far is simply:
import slick.jdbc.JdbcBackend.Database

object SlickSpike extends App {
  val db = Database.forConfig("oracledev")
}
I get the following error:
Exception in thread "main" java.lang.ClassNotFoundException: disable
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at slick.util.ClassLoaderUtil$$anon$1.loadClass(ClassLoaderUtil.scala:12)
at slick.jdbc.JdbcDataSource$.loadFactory$1(JdbcDataSource.scala:30)
at slick.jdbc.JdbcDataSource$.forConfig(JdbcDataSource.scala:39)
at slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forConfig(JdbcBackend.scala:268)
at slick.jdbc.JdbcBackend$$anon$3.forConfig(JdbcBackend.scala:33)
at SlickSpike$.delayedEndpoint$SlickSpike$1(SlickSpike.scala:16)
at SlickSpike$delayedInit$body.apply(SlickSpike.scala:14)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at SlickSpike$.main(SlickSpike.scala:14)
at SlickSpike.main(SlickSpike.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
If I remove the line connectionPool = disable, then I get the following error:
Exception in thread "main" java.lang.ClassNotFoundException: slick.jdbc.hikaricp.HikariCPJdbcDataSource$
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at slick.util.ClassLoaderUtil$$anon$1.loadClass(ClassLoaderUtil.scala:12)
at slick.jdbc.JdbcDataSource$.loadFactory$1(JdbcDataSource.scala:30)
at slick.jdbc.JdbcDataSource$.forConfig(JdbcDataSource.scala:35)
at slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forConfig(JdbcBackend.scala:268)
at slick.jdbc.JdbcBackend$$anon$3.forConfig(JdbcBackend.scala:33)
at SlickSpike$.delayedEndpoint$SlickSpike$1(SlickSpike.scala:16)
at SlickSpike$delayedInit$body.apply(SlickSpike.scala:14)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at SlickSpike$.main(SlickSpike.scala:14)
at SlickSpike.main(SlickSpike.scala)
What am I doing wrong?
I would simply like to have a connection pool of 10 and connect to the database, but I have no idea how to set that up. Can someone help?
Edit 2
I solved the initial issue, but I still have questions and can't get everything to work.
I changed my build as follows:
libraryDependencies ++=
  Seq(
    "org.slf4j" % "slf4j-api" % "1.7.13",
    "org.slf4j" % "slf4j-simple" % "1.7.13",
    "com.smartlogic.osclient" % "Semaphore-OS-Client" % "Semaphore-3.7.2" exclude("org.slf4j", "slf4j-log4j12"),
    "com.typesafe.slick" %% "slick" % "3.1.0",
    "com.typesafe.slick" %% "slick-extensions" % "3.1.0",
    "com.typesafe.slick" %% "slick-hikaricp" % "3.1.0",
    "com.oracle" % "ojdbc6" % "11.2.0.2.0"
  )
I resorted to adding slick-hikaricp, even though I did not originally intend to use it.
I also understand now that the driver in the config is the actual Oracle JDBC driver, not the Slick one. This is reflected in the change I made to my config, as can be seen below:
oracledev = {
  url = "jdbc:oracle:thin:#//vdevdbms2:4208/TPSDEV.IADB.ORG"
  driver = oracle.jdbc.OracleDriver
  // connectionPool = disable
  keepAliveConnection = true
  // databaseName = "BRIKPOOLPARTYDEV"
  user = "*******"
  password = "*******"
}
Questions:
1 - Is slick-hikaricp required by default when using Oracle? Indeed, if I do not add it and comment out connectionPool = disable (which in my case does not work when uncommented anyway), the program does not run.
2 - I'm still not able to connect. Am I missing something?
Please help.

The Oracle, DB2, and MS SQL Server drivers are not free. There is a separate slick-extensions package that contains drivers for them that you can use for development, but you have to fork over cold cash to use them in production.
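With slick-extensions on the classpath, the Slick side of the setup looks roughly like this. This is a sketch only, assuming the oracledev config block from the question and the com.typesafe.slick.driver.oracle.OracleDriver profile shipped by slick-extensions 3.1.0:
import com.typesafe.slick.driver.oracle.OracleDriver.api._

import scala.concurrent.Await
import scala.concurrent.duration._

object SlickSpike extends App {
  // reads the "oracledev" block from application.conf
  val db = Database.forConfig("oracledev")
  try {
    // run a trivial query to verify the connection actually works
    val one = Await.result(db.run(sql"select 1 from dual".as[Int].head), 10.seconds)
    println(s"connected, got $one")
  } finally db.close()
}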

You should download ojdbc7.jar, put it in a lib folder in your project root, and rebuild.

The entry in the config for connectionPool should be disabled instead of disable.
This config entry is used when deciding which JdbcDataSource will be loaded.
With disable, Slick instead tries to load a JdbcDataSourceFactory class named disable, which is exactly the ClassNotFoundException: disable seen above.
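To answer the pool question as well: HikariCP is Slick 3.1's default connection pool, so slick-hikaricp must be on the classpath unless you set connectionPool = disabled. For the stated goal of a pool of 10, a config along these lines should work. This is a sketch, with host, port, and service name taken from the question and placeholder credentials; note that the standard thin-driver URL syntax is @//host:port/service, so the # in the question's URL may be a transcription artifact:
oracledev = {
  url = "jdbc:oracle:thin:@//vdevdbms2:4208/TPSDEV.IADB.ORG"
  // the JDBC driver class, not the Slick profile
  driver = oracle.jdbc.OracleDriver
  // HikariCP is the default; listed here for clarity
  connectionPool = HikariCP
  numThreads = 10
  maxConnections = 10
  user = "*******"
  password = "*******"
}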


I cannot install MySQL JDBC driver on Ubuntu 16.04

I downloaded the MySQL JDBC driver to connect Scala to a MySQL database, but I cannot install it with java -jar mysql-connector-java-5.1.45-bin.jar. It says: no main manifest attribute, in mysql-connector-java-5.1.45-bin.jar.
I cannot find any MANIFEST.MF file.
Any help would be appreciated.
EDIT: warnings after I run sbt run
[info] Loading project definition from /home/alessandro/Scala/tests/project
[info] Loading settings from build.sbt ...
[info] Set current project to MyProject (in build file:/home/alessandro/Scala/tests/)
[info] Running tests.ScalaJdbcConnectSelect
Sun Feb 18 21:51:33 CET 2018 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'pmanager.user' doesn't exist
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
at com.mysql.jdbc.Util.getInstance(Util.java:408)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:944)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3973)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3909)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2527)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2680)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2480)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2438)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1381)
at tests.ScalaJdbcConnectSelect$.delayedEndpoint$tests$ScalaJdbcConnectSelect$1(ScalaJdbcConnectSelect.scala:16)
at tests.ScalaJdbcConnectSelect$delayedInit$body.apply(ScalaJdbcConnectSelect.scala:5)
at scala.Function0.apply$mcV$sp(Function0.scala:34)
at scala.Function0.apply$mcV$sp$(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:389)
at scala.App.main(App.scala:76)
at scala.App.main$(App.scala:74)
at tests.ScalaJdbcConnectSelect$.main(ScalaJdbcConnectSelect.scala:5)
at tests.ScalaJdbcConnectSelect.main(ScalaJdbcConnectSelect.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at sbt.Run.invokeMain(Run.scala:93)
at sbt.Run.run0(Run.scala:87)
at sbt.Run.execute$1(Run.scala:65)
at sbt.Run.$anonfun$run$4(Run.scala:77)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at sbt.util.InterfaceUtil$$anon$1.get(InterfaceUtil.scala:10)
at sbt.TrapExit$App.run(TrapExit.scala:252)
at java.base/java.lang.Thread.run(Thread.java:844)
[success] Total time: 3 s, completed Feb 18, 2018, 9:51:34 PM
UPDATE:
I found two options:
The first: in your build.sbt, replace the line
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.24"
with this:
libraryDependencies ++= Seq(
  "mysql" % "mysql-connector-java" % "5.1.16"
)
Then, in the command line:
sbt compile
sbt run
The second: use this in the command line:
set fullClasspath in Compile += Attributed.blank(file("path-to-your-connector-jar"))
sbt compile
sbt run
After that, the problem with the driver should disappear, but double-check the connection settings for your database.
BEFORE UPDATE:
In your Scala project you can also use this:
package tests

import java.sql.{Connection, DriverManager}

object ScalaJdbcConnectSelect extends App {
  // connect to the database named "mysql" on port 8889 of localhost
  val url = "jdbc:mysql://localhost:8889/mysql"
  val driver = "com.mysql.jdbc.Driver"
  val username = "root"
  val password = "root"
  var connection: Connection = _
  try {
    Class.forName(driver)
    connection = DriverManager.getConnection(url, username, password)
    val statement = connection.createStatement
    val rs = statement.executeQuery("SELECT host, user FROM user")
    while (rs.next) {
      val host = rs.getString("host")
      val user = rs.getString("user")
      println("host = %s, user = %s".format(host, user))
    }
  } catch {
    case e: Exception => e.printStackTrace()
  } finally {
    // guard against an NPE when the connection was never established
    if (connection != null) connection.close()
  }
}
Link to source 1
Link to source 2
Please read this line:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'pmanager.user' doesn't exist
Check whether such a table actually exists. If it does, refer to it by the part after the dot (user).
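If you want to verify this from code, a quick query against information_schema does it. A sketch, reusing the connection from the snippet above and the pmanager schema named in the error message:
val check = connection.createStatement.executeQuery(
  "SELECT COUNT(*) AS n FROM information_schema.tables " +
  "WHERE table_schema = 'pmanager' AND table_name = 'user'")
check.next()
println(if (check.getInt("n") > 0) "pmanager.user exists" else "pmanager.user is missing")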

Deepwater threw java.lang.ArrayIndexOutOfBoundsException during training if balance_classes=TRUE

In AWS, I followed the instructions here and launched a g2.2xlarge EC2 instance using the community AMI ami-97591381.
On the docker image, I can run a simple deepwater tutorial without a problem. However, when I tried to train a deepwater model using my own data (which worked ok with a non-GPU deeplearning model), h2o gave me this exception:
java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 <= 186393 < 170807
at water.Futures.blockForPending(Futures.java:88)
at hex.deepwater.DeepWaterDatasetIterator.Next(DeepWaterDatasetIterator.java:99)
at hex.deepwater.DeepWaterTask.setupLocal(DeepWaterTask.java:168)
at water.MRTask.setupLocal0(MRTask.java:550)
at water.MRTask.dfork(MRTask.java:456)
at water.MRTask.doAll(MRTask.java:389)
at water.MRTask.doAll(MRTask.java:385)
at hex.deepwater.DeepWater$DeepWaterDriver.trainModel(DeepWater.java:345)
at hex.deepwater.DeepWater$DeepWaterDriver.buildModel(DeepWater.java:205)
at hex.deepwater.DeepWater$DeepWaterDriver.computeImpl(DeepWater.java:118)
at hex.ModelBuilder$Driver.compute2(ModelBuilder.java:173)
at hex.deepwater.DeepWater$DeepWaterDriver.compute2(DeepWater.java:111)
at water.H2O$H2OCountedCompleter.compute(H2O.java:1256)
at jsr166y.CountedCompleter.exec(CountedCompleter.java:468)
at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:974)
at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 <= 186393 < 170807
at water.fvec.Vec.elem2ChunkIdx(Vec.java:925)
at water.fvec.Vec.chunkForRow(Vec.java:1063)
at hex.deepwater.DeepWaterDatasetIterator$FrameDataConverter.compute2(DeepWaterDatasetIterator.java:76)
... 6 more
This is my code, which you can run as I made the S3 links public:
library(h2o)
library(jsonlite)
library(curl)
h2o.init()
df.truth <- h2o.importFile("https://s3.amazonaws.com/nw.data.test.us.east/df.truth.zeroed", header = T, sep=",")
df.truth$isFemale <- h2o.asfactor(df.truth$isFemale)
hotnames.truth <- fromJSON("https://s3.amazonaws.com/nw.data.test.us.east/hotnames.json", simplifyVector = T)
# Training and validation sets
splits <- h2o.splitFrame(df.truth, c(0.9), seed=1234)
train.truth <- h2o.assign(splits[[1]], "train.truth.hex")
valid.truth <- h2o.assign(splits[[2]], "valid.truth.hex")
dl.2.balanced <- h2o.deepwater(
  training_frame = train.truth, model_id = "dl.2.balanced",
  x = setdiff(hotnames.truth[1:(length(hotnames.truth)/2)], c("isFemale", "nwtcs")),
  y = "isFemale", stopping_metric = "AUTO", seed = 1000000,
  sparse = F,
  balance_classes = T,
  mini_batch_size = 20)
The h2o version is 3.13.0.356.
Update:
I think I found the h2o bug. If I set balance_classes to FALSE, then it runs without crashing.
Please note that Deep Water is a legacy project (as of December 2017), which means that it is no longer under active development. The H2O.ai team has no current plans to add new features, however, contributions from the community (in the form of pull requests) are welcome.

Spark streaming + json4s-jackson dependency problems

I am unable to use json4s-jackson 3.2.11 within my Spark 1.4.1 streaming application.
Thinking that the existing dependency within the spark-core project was causing the problem, as explained here -> Is it possible to use json4s 3.2.11 with Spark 1.3.0?, I built Spark from source with an adjusted core/pom.xml, changing the reference from json4s-jackson_2.10:3.2.10 to 3.2.11, as 3.2.10 does not support extracting to implicit types.
I have replaced the source jars referenced in my IntelliJ IDEA project with the rebuilt jars, however I am still getting the same errors as before. I fear that Spark must still be referencing json4s 3.2.10 somehow.
Here is my simple test:
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import org.json4s._
import org.json4s.jackson.JsonMethods._

object StreamingPredictor {
  implicit val formats = DefaultFormats
  case class event(Key: String,
                   sensorId: String,
                   sessionId: String,
                   deviceId: String,
                   playerId: String,
                   impressionId: String,
                   time: String,
                   eventName: String,
                   eventProperties: Map[String, Any],
                   dl: Array[List[(String, Any)]],
                   $post: Boolean,
                   $sync: Boolean)
  def parser(json: String): String = {
    val parsedJson = parse(json)
    val foo = parsedJson.extract[event]
    foo.eventName
  }
  def main(args: Array[String]) {
    val zkQuorum = "localhost:2181"
    val group = "myGroup"
    val topic = Map("test" -> 1)
    val sparkContext = new SparkContext("local[4]", "KafkaConsumer")
    val ssc = new StreamingContext(sparkContext, Seconds(1))
    val json = KafkaUtils.createStream(ssc, zkQuorum, group, topic)
    val eventName = json.map(_._2).map(parser)
    eventName.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
The error I get when referencing json4s 3.2.11 in my application pom.xml file:
java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.render(Lorg/json4s/JsonAST$JValue;)Lorg/json4s/JsonAST$JValue;
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$1.apply(EventLoggingListener.scala:143)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:143)
at org.apache.spark.scheduler.EventLoggingListener.onBlockManagerAdded(EventLoggingListener.scala:174)
at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:46)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1215)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
And the error I get when I use json4s-jackson_2.10:3.2.10 in my application pom.xml file:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost): org.json4s.package$MappingException: No usable value for eventProperties
No information known about type
at org.json4s.reflect.package$.fail(package.scala:96)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:443)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:451)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$.extract(Extraction.scala:42)
at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
at com.pca.triggar.Streaming.StreamingPredictor$.parser(StreamingPredictor.scala:38)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at com.pca.triggar.Streaming.StreamingPredictor$$anonfun$2.apply(StreamingPredictor.scala:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1276)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.json4s.package$MappingException: No information known about type
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:465)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at org.json4s.Extraction$$anonfun$extract$5.apply(Extraction.scala:316)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$.extract(Extraction.scala:316)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:431)
... 42 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Ok, I found the problem. As posted elsewhere, you need to compile against json4s 3.2.10. Apparently, doing so generates a binary that then works with Spark (version 1.5 in my case; the same holds for some earlier versions as well). It has to do with the default parameter in the render() method, which shows up in 3.2.11.
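In sbt terms, the pin looks something like this (a sketch; the same idea applies to a Maven pom.xml dependency):
// compile against the json4s version that is binary-compatible with the Spark runtime
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.2.10"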
I had the same issue with EMR 4.3.0 and Spark 1.6.
I solved it by installing json4s in a bootstrap action:
Download the json4s jar and put it in S3.
Create the following shell script and put it in S3:
#!/bin/bash
set -e
wget -S -T 10 -t 5 https://s3.amazonaws.com/your-bucketname/json4s-native_2.10-3.2.4.jar
mkdir -p /home/hadoop/lib
mv json4s-native_2.10-3.2.4.jar /home/hadoop/lib/
Add it as a bootstrap step in the EMR launch steps.

Processing 2.2.1 Video - AbstractMethodError - GStreamer / JNA mismatch?

I am trying to use processing-video 2.2.1 as a library from my (Scala) project. I can run the demo capture sketches directly in the Processing IDE, but from my project I get an error that looks like a version mismatch:
Exception in thread "Animation Thread" java.lang.AbstractMethodError: com.sun.jna.Structure.getFieldOrder()Ljava/util/List;
at com.sun.jna.Structure.fieldOrder(Structure.java:868)
at com.sun.jna.Structure.getFields(Structure.java:894)
at com.sun.jna.Structure.deriveLayout(Structure.java:1042)
at com.sun.jna.Structure.calculateSize(Structure.java:966)
at com.sun.jna.Structure.calculateSize(Structure.java:933)
at com.sun.jna.Structure.allocateMemory(Structure.java:360)
at com.sun.jna.Structure.<init>(Structure.java:184)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GTypeInstance.<init>(GObjectAPI.java:114)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at com.sun.jna.Structure.newInstance(Structure.java:1635)
at com.sun.jna.Structure.newInstance(Structure.java:1621)
at com.sun.jna.Structure.size(Structure.java:950)
at com.sun.jna.Native.getNativeSize(Native.java:1076)
at com.sun.jna.Structure.getNativeSize(Structure.java:1927)
at com.sun.jna.Structure.getNativeSize(Structure.java:1920)
at com.sun.jna.Structure.validateField(Structure.java:1018)
at com.sun.jna.Structure.validateFields(Structure.java:1032)
at com.sun.jna.Structure.<init>(Structure.java:179)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GParamSpec.<init>(GObjectAPI.java:395)
at org.gstreamer.GObject.findProperty(GObject.java:656)
at org.gstreamer.GObject.set(GObject.java:87)
at processing.video.Capture.initGStreamer(Unknown Source)
at processing.video.Capture.<init>(Unknown Source)
at (my sketch)
The Maven POM file is here. I end up with the following libraries on the class path:
com.googlecode.gstreamer-java:gstreamer-java:1.5
net.java.dev.jna:jna:4.0.0
net.java.dev.jna:platform:3.4.0
org.processing:core:2.2.1
org.processing:video:2.2.1
My intuition says there is a mismatch between jna and platform - should they have the same version? That would indicate that the published POM is wrong. Which version does the Processing standalone use? Unfortunately the jars there are stripped of version information.
Indeed, it seems the processing POM specifies an incompatible JNA version. In sbt, I could fix this with a dependencyOverrides declaration:
def processingVersion = "2.2.1"
def gstreamerVersion = "1.5"
def jnaVersion = "3.4.0"
libraryDependencies ++= Seq(
  "org.processing" % "video" % processingVersion,
  "com.googlecode.gstreamer-java" % "gstreamer-java" % gstreamerVersion
)
dependencyOverrides += "net.java.dev.jna" % "jna" % jnaVersion // !
For Gradle peeps, that's:
implementation("org.processing:core:3.3.7") {
  exclude group: 'net.java.dev.jna'
}
// https://mvnrepository.com/artifact/org.processing/video
implementation("org.processing:video:3.3.7") {
  exclude group: 'net.java.dev.jna'
}
// higher JNA versions have an abstract Structure.getFieldOrder which gstreamer-java doesn't implement
implementation "net.java.dev.jna:jna:3.4.0"

Unable to use logback.groovy, but logback.xml works

I wanted to configure Logback using Groovy DSL. The file is very simple:
import ch.qos.logback.classic.encoder.PatternLayoutEncoder
import ch.qos.logback.core.ConsoleAppender
import static ch.qos.logback.classic.Level.DEBUG
import static ch.qos.logback.classic.Level.INFO
appender("stdout", ConsoleAppender) {
encoder(PatternLayoutEncoder) {
pattern = "%d %p [%c] - <%m>%n"
}
}
root(INFO, ["stdout"])
I use Gradle to build my application and run it with jettyRun. I get the following error:
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object 'ch.qos.logback.core.ConsoleAppender[null]' with class 'ch.qos.logback.core.ConsoleAppender' to class 'ch.qos.logback.core.Appender'
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToType(DefaultTypeTransformation.java:360)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(ScriptBytecodeAdapter.java:599)
at ch.qos.logback.classic.gaffer.ConfigurationDelegate.appender(ConfigurationDelegate.groovy:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at org.codehaus.groovy.runtime.metaclass.MixinInstanceMetaMethod.invoke(MixinInstanceMetaMethod.java:53)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoMetaMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:308)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:46)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
However, when I switch to equivalent XML configuration, everything works. What am I doing wrong?
Using Logback 1.0.0. Tried with Logback 1.0.3.
I figured the solution out, but some questions remain open. The problem was that I had no proper Groovy on the classpath. I decided to make an example project to demonstrate this bug. I started with a console application using Gradle's "application" plugin. I didn't include Groovy as a dependency.
apply plugin: 'application'

repositories {
  mavenCentral()
}

ext.logbackVersion = '1.0.3'
ext.slf4jVersion = '1.6.4'

dependencies {
  compile "ch.qos.logback:logback-classic:$ext.logbackVersion"
  compile "org.slf4j:jcl-over-slf4j:$ext.slf4jVersion"
  //runtime "org.codehaus.groovy:groovy:1.8.6" // the problem was here
}

mainClassName = "org.test.Main"
This gave me an error, which is quite straightforward.
|-ERROR in ch.qos.logback.classic.LoggerContext[default] - Groovy classes are not available on the class path. ABORTING INITIALIZATION.
OK, cool. The dependency was missing - easy to fix. But why didn't I get the same error when I ran my web application? Adding the Groovy dependency solved the initial problem in the web application as well. I stripped my project down and will create a corresponding JIRA issue. Perhaps the detection of Groovy on the classpath is not quite correct.
The GroovyCastException "cannot cast ConsoleAppender as Appender" has all the bearings of a class loader issue. Which version of Groovy is this? Could you open a bug report including a test case for reproducing this issue?
Colleagues, I have faced almost the same trouble today:
When I use logback.xml, everything works fine.
When I use logback.groovy in IntelliJ IDEA, everything works fine too.
When I use logback.groovy and start my script from the command line, I get a lot of errors like:
D:\Projects\PRDMonitoring\sources>groovy tray.groovy PRD
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Script1.groovy: 2: unable to resolve class ch.qos.logback.classic.filter.LevelFilter
@ line 2, column 1.
import ch.qos.logback.classic.filter.LevelFilter
^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.encoder.PatternLayoutEncoder
@ line -1, column -1.
Script1.groovy: 3: unable to resolve class ch.qos.logback.core.ConsoleAppender
@ line 3, column 1.
import ch.qos.logback.core.ConsoleAppender
^
Script1.groovy: -1: unable to resolve class ch.qos.logback.classic.Level
@ line -1, column -1.
Script1.groovy: 6: unable to resolve class ch.qos.logback.core.spi.FilterReply
@ line 6, column 1.
import static ch.qos.logback.core.spi.FilterReply.ACCEPT
^
Script1.groovy: 7: unable to resolve class ch.qos.logback.core.spi.FilterReply
@ line 7, column 1.
import static ch.qos.logback.core.spi.FilterReply.DENY
But after a couple of minutes looking for a solution, I figured out that the following line before the @Grapes annotation fixes the class-loading trouble: @GrabConfig(systemClassLoader=true)
@GrabConfig(systemClassLoader=true)
@Grapes([
  @Grab(group = 'org.codehaus.groovy.modules.http-builder', module = 'http-builder', version = '0.6'),
  @Grab(group = 'org.apache.commons', module = 'commons-lang3', version = '3.0'),
  @Grab(group = 'commons-io', module = 'commons-io', version = '2.4'),
  @Grab(group = 'joda-time', module = 'joda-time', version = '2.9.4'),
  @Grab(group = 'ch.qos.logback', module = 'logback-classic', version = '1.1.7'),
  @Grab(group = 'ch.qos.logback', module = 'logback-core', version = '1.1.7')
])
