Play Framework and Java 8 - Spring

I have a Java 8 project, and this project is a dependency of a Play web app.
Now whenever I try to instantiate classes from the Java 8 project in the Play 2.2.3 web app, it gives me the following error:
play.PlayExceptions$CompilationException: Compilation error[error: cannot access MongoOperations]
at play.PlayReloader$$anon$1$$anonfun$reload$2$$anonfun$apply$14$$anonfun$apply$16.apply(PlayReloader.scala:304) ~[na:na]
at play.PlayReloader$$anon$1$$anonfun$reload$2$$anonfun$apply$14$$anonfun$apply$16.apply(PlayReloader.scala:304) ~[na:na]
How do I get Play to compile the code with Java 8 when I run 'play "run 8080"'? Why isn't Play able to access the class in the Java 8 project?
FYI: my JAVA_HOME is pointing to Java 8.
Here is what my build.sbt looks like.
Note that 'content-aggregator' is my local artifact installed in my local maven repo.
name := "web"
version := "1.0-SNAPSHOT"
resolvers += "Maven central" at "http://repo1.maven.org/maven2"
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
"de.undercouch" % "bson4jackson" % "2.1.0" force(),
"com.fasterxml.jackson.core" % "jackson-databind" % "2.1.0" force(),
"com.fasterxml.jackson.core" % "jackson-annotations" % "2.1.0" force(),
"com.fasterxml.jackson.core" % "jackson-core" % "2.1.0" force(),
"org.mongodb" % "mongo-java-driver" % "2.11.3",
"com.techr" % "content-aggregator" % "0.0.1-SNAPSHOT",
"org.jongo" % "jongo" % "1.0",
"uk.co.panaxiom" %% "play-jongo" % "0.6.0-jongo1.0"
)
play.Project.playJavaSettings
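(For reference, the Java language level can be pinned explicitly in build.sbt with the standard sbt javacOptions key rather than relying on whatever JDK sbt happens to run under; a minimal sketch, assuming sbt itself runs on JDK 8:

// force javac to accept Java 8 sources and emit Java 8 bytecode
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
)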
In the 'content-aggregator' (Java 8) project I am using Spring and inject beans by autowiring.
MongoOperations is autowired in one of the classes, and that is the class Play is yelling about.
SpringMongoConfig.java is a class from this project which is marked with the @Configuration annotation.
Now in the Play project I have created a config class which imports content-aggregator's config class.
@Configuration
@Import(SpringMongoConfig.class)
public class SpringConfig {
}
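For reference, a minimal sketch of how such a config would be bootstrapped from plain Spring (not from the original question; Play-specific wiring may differ):

import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.data.mongodb.core.MongoOperations;

// stand up the context from the importing config class, then pull the autowired bean
ApplicationContext ctx = new AnnotationConfigApplicationContext(SpringConfig.class);
MongoOperations mongoOps = ctx.getBean(MongoOperations.class);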

Related

Use one version of a Java library or another => make the code compatible with both

I'm working on a Spring Boot project built with Gradle and the main language is Kotlin.
In this project, there is one imported library (developed in Java) which, depending on the version I use, has 5 or 6 parameters in the constructor of a specific class I use.
For now, I switch between the versions manually by changing the version number in the build.gradle.kts file. So my question is: regardless of the version I use, how can my code work with all the versions?
So, basically,
library-version1.jar => Class(6 parameters)
library-version2.jar => Class(5 parameters)
project with library-version1.jar or library-version2.jar imported => universal code to create instance of Class
P.S.: I should add that I have to use both versions of the library.
I found the solution to my initial question:
val yourVariable = YourClass::class.java
val constructor = yourVariable.constructors[0] // the class I use has only one constructor
// implementationVersion is read from the jar manifest's Implementation-Version entry
val implementationVersion = yourVariable.`package`.implementationVersion
val instance = if (implementationVersion < "a specific version number") {
    constructor.newInstance(/* 6 parameters */) as YourClass
} else {
    constructor.newInstance(/* 5 parameters */) as YourClass
}
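A variant of the same idea that avoids comparing version strings is to branch on the constructor's arity instead. A sketch with hypothetical placeholder arguments (param1..param6 are placeholders; allArgs must hold the six possible constructor arguments in declaration order, assuming the 5-parameter version simply drops the last one):

val ctor = YourClass::class.java.constructors.first()
// Hypothetical placeholders: the six constructor arguments in declaration order.
val allArgs: List<Any?> = listOf(param1, param2, param3, param4, param5, param6)
// Pass exactly as many arguments as this library version's constructor declares.
val instance = ctor.newInstance(*allArgs.take(ctor.parameterCount).toTypedArray()) as YourClass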
But now I have a follow-up question that you can find in the comments (I can still post it here, though):
In my build.gradle.kts file, I have this line in my dependencies :
dependencies {
    implementation("myLibrary:1.0")
    ...
}
Obviously, I don't want to switch between myLibrary v1 and v2 manually. Just by changing the build.gradle.kts file, would it be possible to have:
build.gradle.kts -> appV1 (if myLibrary v1 used)
-> appV2 (if myLibrary v2 used)
?
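One way to get that effect without editing the file by hand (a sketch, not from the original thread) is to drive the version from a Gradle project property, so appV1 and appV2 are just two invocations of the same build. The property name myLibraryVersion is an assumption:

// build.gradle.kts: select the library version at invocation time,
// e.g. `gradle build -PmyLibraryVersion=2.0`
val myLibraryVersion: String = (findProperty("myLibraryVersion") as String?) ?: "1.0"

dependencies {
    implementation("myLibrary:$myLibraryVersion")
}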

Overwrite Databricks Dependency

In our project we're using com.typesafe:config in version 1.3.4. According to the latest release notes, this dependency is already provided by Databricks on the cluster, but in a very old version (1.2.1).
How can I overwrite the provided dependency with our own version?
We use maven, in our dependencies I have
<dependency>
  <groupId>com.typesafe</groupId>
  <artifactId>config</artifactId>
  <version>1.3.4</version>
</dependency>
Our created jar file should therefore contain the newer version.
I created a Job by uploading the jar file. The Job fails because it can't find a method that was added after version 1.2.1, so it looks like the library we provided gets overwritten by the older version on the cluster.
In the end we fixed this by shading the relevant classes, adding the following to our build.sbt:
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.typesafe.config.**" -> "shadedSparkConfigForSpark.@1").inAll
)
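The question itself uses Maven, where the equivalent is the maven-shade-plugin's relocation feature; a sketch (untested, plugin version illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.typesafe.config</pattern>
            <shadedPattern>shadedSparkConfigForSpark</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>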
We solved it in the end by utilizing Spark's ChildFirstURLClassLoader. The project is open source, so you can check it out yourself here and the usage of the method here.
But for reference, here is the method in its entirety. You need to provide a Seq of the jars that you want to override with your own; in our case it's the typesafe config.
// Imports for the snippet; ChildFirstURLClassLoader is assumed to be Spark's
// org.apache.spark.util.ChildFirstURLClassLoader. Environment, logger and
// ConfigurationException come from the surrounding project.
import java.net.{URL, URLClassLoader}
import scala.annotation.tailrec
import org.apache.spark.util.ChildFirstURLClassLoader

def getChildFirstClassLoader(jars: Seq[String]): ChildFirstURLClassLoader = {
  val initialLoader = getClass.getClassLoader.asInstanceOf[URLClassLoader]

  @tailrec
  def collectUrls(clazz: ClassLoader, acc: Map[String, URL]): Map[String, URL] = {
    // add urls on this level to the accumulator
    val urlsAcc: Map[String, URL] = acc ++
      clazz.asInstanceOf[URLClassLoader].getURLs
        .map(url => (url.getFile.split(Environment.defaultPathSeparator).last, url))
        .filter { case (name, url) => jars.contains(name) }
        .toMap
    // check if any jars without a URL are left
    val jarMissing = jars.exists(jar => urlsAcc.get(jar).isEmpty)
    // return the accumulated map if there is no parent left or no jars are missing anymore
    if (clazz.getParent == null || !jarMissing) urlsAcc else collectUrls(clazz.getParent, urlsAcc)
  }

  // search the classpath hierarchy until all jars are found or we have reached the top
  val urlsMap = collectUrls(initialLoader, Map())

  // check if everything was found
  val jarsNotFound = jars.filter(jar => urlsMap.get(jar).isEmpty)
  if (jarsNotFound.nonEmpty) {
    logger.info(s"""available jars are ${initialLoader.getURLs.mkString(", ")} (not including parent classpaths)""")
    throw ConfigurationException(s"""jars ${jarsNotFound.mkString(", ")} not found in parent class loaders classpath. Cannot initialize ChildFirstURLClassLoader.""")
  }
  // create the child-first classloader
  new ChildFirstURLClassLoader(urlsMap.values.toArray, initialLoader)
}
As you can see, it also has some logic to abort if the jar files you specified do not exist in the classpath.
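Hypothetical usage, assuming the newer jar appears on the classpath under the name config-1.3.4.jar: classes loaded through the returned loader resolve against that jar before the cluster-provided one.

// child-first: config 1.3.4 wins over the cluster's 1.2.1
val loader = getChildFirstClassLoader(Seq("config-1.3.4.jar"))
val configFactoryClass = loader.loadClass("com.typesafe.config.ConfigFactory")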
Databricks supports initialization scripts (cluster scope or global scope) so that you can install or remove any dependency. The details are at https://docs.databricks.com/clusters/init-scripts.html.
In your initialization script, you can remove the default jar file located on the Databricks driver/executor classpath at /databricks/jars/ and add the expected one there.
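A minimal sketch of such an init script, assuming the cluster-provided jar matches config-1.2.1 and the replacement has been uploaded to DBFS (both file names are assumptions):

#!/bin/bash
# replace the cluster-provided typesafe config with our own version
rm /databricks/jars/*config-1.2.1*
cp /dbfs/FileStore/jars/config-1.3.4.jar /databricks/jars/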

Spark 1.3 and Cassandra 3.0 problems with guava

I am trying to connect to Cassandra 3.0 from Spark 1.3. I know that there is a Cassandra connector for each Spark version, but spark-cassandra-connector-java_2.10:1.3.0 depends on cassandra-driver-core:2.1.5, which is why I am using the latest Cassandra connector, which in turn depends on the latest core driver. Anyway, so far this was not the problem. The problem, I suppose, is the com.google.guava package.
My pom looks like this:
...
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.5.0-M3</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>1.5.0-M3</version>
</dependency>
...
I have excluded google guava from everywhere with:
<exclusions>
  <exclusion>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
  </exclusion>
</exclusions>
so the only occurrence left in the dependency tree is com.google.guava:guava:jar:16.0.1, under com.datastax.spark:spark-cassandra-connector-java_2.10:jar:1.5.0-M3:compile.
However I am still getting the following error:
yarn.ApplicationMaster: User class threw exception: Failed to open native connection to Cassandra at {139.19.52.111}:9042
java.io.IOException: Failed to open native connection to Cassandra at {139.19.52.111}:9042
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
at com.ambiverse.tagging.dao.impl.DAOCassandra.createTable(DAOCassandra.java:45)
at com.ambiverse.tagging.dao.impl.DAOCassandra.createTable(DAOCassandra.java:64)
at com.ambiverse.tagging.dao.impl.DAOCassandra.savePairRDD(DAOCassandra.java:70)
at com.ambiverse.tagging.statistics.entitycorrelation.CorrelationStatisticsSparkRunner.run(CorrelationStatisticsSparkRunner.java:176)
at com.ambiverse.tagging.statistics.entitycorrelation.CorrelationStatisticsSparkRunner.main(CorrelationStatisticsSparkRunner.java:94)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
Caused by: java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
at com.datastax.driver.core.Connection.initAsync(Connection.java:178)
at com.datastax.driver.core.Connection$Factory.open(Connection.java:742)
at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:240)
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:187)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1393)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:402)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
Before somebody points me to this blog post for the solution: http://arjon.es/2015/10/12/making-hadoop-2-dot-6-plus-spark-cassandra-driver-play-nice-together/, note that I am using Maven as the build tool, not sbt. If you know how I can do the exact same thing with Maven, that would be great.
Although I work with Scala + sbt, I had several mismatches between different artifacts with Spark, and one of them was guava.
Here is how I solved it (dependencies in sbt):
val sparkVersion = "1.6.1" // alternatively "2.0.0-preview"
val sparkCassandraConnectorVersion = "1.6.0"
val scalaGuiceVersion = "4.0.1"
val cassandraUnitVersion = "3.0.0.1"
val typesafeConfigVersion = "1.3.0"
val findbugsVersion = "3.0.0"
val sparkRabbitmqVersion = "0.4.0.20160613"
val nettyAllVersion = "4.0.33.Final"
val guavaVersion = "19.0"
val jacksonVersion = "2.7.4"
val xbeanAsm5ShadedVersion = "4.5"
val commonsBeanutilsVersion = "1.8.0"
//IMPORTANT: all spark dependency magic is done in one place, to overcome the assembly mismatch errors
val sparkDependencies: List[ModuleID] = List(
  ("org.apache.spark" %% "spark-core" % sparkVersion).exclude("com.esotericsoftware.minlog", "minlog"),
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  ("com.datastax.spark" %% "spark-cassandra-connector"
    % sparkCassandraConnectorVersion).exclude("org.apache.cassandra", "cassandra-clientutil"),
  "com.stratio.receiver" % "spark-rabbitmq_1.6" % sparkRabbitmqVersion, // previously "0.3.0-b"
  "org.scalatest" %% "scalatest" % scalaTestVersion % "test", // scalaTestVersion: defined elsewhere in the original build
  "org.apache.xbean" % "xbean-asm5-shaded" % xbeanAsm5ShadedVersion, // see https://github.com/apache/spark/pull/9512/files
  "io.netty" % "netty-all" % nettyAllVersion,
  "commons-beanutils" % "commons-beanutils" % commonsBeanutilsVersion,
  "com.google.guava" % "guava" % guavaVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion, // fix jackson mismatch problem
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion, // fix jackson mismatch problem
  // override findbugs artifact versions (fix assembly issues)
  "com.google.code.findbugs" % "annotations" % findbugsVersion,
  "com.google.code.findbugs" % "jsr305" % findbugsVersion
).map(_.exclude("commons-collections", "commons-collections"))
I hope it helps.
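Since the question explicitly asks for a Maven route: the closest Maven equivalent of the blog post's sbt shading trick is the maven-shade-plugin relocating guava into a private namespace, so the driver's guava 16 calls no longer collide with the older guava that Spark/Hadoop put on the classpath. A sketch (untested, plugin version illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>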

Play framework 2.3.8, org.apache.poi dependency not found

I added org.apache.poi to my dependencies, but it just does not resolve.
libraryDependencies ++= Seq(
  "postgresql" % "postgresql" % "9.1-901-1.jdbc4",
  "net.sf.jasperreports" % "jasperreports" % "6.0.3",
  "net.sf.jasperreports" % "jasperreports-fonts" % "6.0.0",
  "com.typesafe.play" %% "play-mailer" % "2.4.1",
  "org.apache.poi" %% "poi" % "3.13",
  javaJdbc,
  javaEbean,
  cache,
  javaWs
)
I am getting an error saying that it does search for the dependency but cannot find it. The interesting part is this:
Warning:Play 2 Compiler: ==== public: tried
Warning:Play 2 Compiler: http://repo1.maven.org/maven2/org/apache/poi/poi_2.11/3.13/poi_2.11-3.13.pom
Error:Play 2 Compiler:
(*:update) sbt.ResolveException: unresolved dependency: org.apache.poi#poi_2.11;3.13: not found
But in reality, the location of the pom file is here:
https://repo1.maven.org/maven2/org/apache/poi/poi/3.13/poi-3.13.pom
Why does play framework append that 2.11 version there?
Just remove one percent symbol, i.e. use a single % instead of %%:
"org.apache.poi" % "poi" % "3.13",

Processing 2.2.1 Video - AbstractMethodError - GStreamer / JNA mismatch?

I am trying to use processing-video 2.2.1 as a library from my (Scala) project. I can run the demo capture sketches directly in the Processing IDE, but from my project I get an error that looks like a version mismatch:
Exception in thread "Animation Thread" java.lang.AbstractMethodError: com.sun.jna.Structure.getFieldOrder()Ljava/util/List;
at com.sun.jna.Structure.fieldOrder(Structure.java:868)
at com.sun.jna.Structure.getFields(Structure.java:894)
at com.sun.jna.Structure.deriveLayout(Structure.java:1042)
at com.sun.jna.Structure.calculateSize(Structure.java:966)
at com.sun.jna.Structure.calculateSize(Structure.java:933)
at com.sun.jna.Structure.allocateMemory(Structure.java:360)
at com.sun.jna.Structure.<init>(Structure.java:184)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GTypeInstance.<init>(GObjectAPI.java:114)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at java.lang.Class.newInstance(Class.java:442)
at com.sun.jna.Structure.newInstance(Structure.java:1635)
at com.sun.jna.Structure.newInstance(Structure.java:1621)
at com.sun.jna.Structure.size(Structure.java:950)
at com.sun.jna.Native.getNativeSize(Native.java:1076)
at com.sun.jna.Structure.getNativeSize(Structure.java:1927)
at com.sun.jna.Structure.getNativeSize(Structure.java:1920)
at com.sun.jna.Structure.validateField(Structure.java:1018)
at com.sun.jna.Structure.validateFields(Structure.java:1032)
at com.sun.jna.Structure.<init>(Structure.java:179)
at com.sun.jna.Structure.<init>(Structure.java:172)
at com.sun.jna.Structure.<init>(Structure.java:159)
at com.sun.jna.Structure.<init>(Structure.java:151)
at org.gstreamer.lowlevel.GObjectAPI$GParamSpec.<init>(GObjectAPI.java:395)
at org.gstreamer.GObject.findProperty(GObject.java:656)
at org.gstreamer.GObject.set(GObject.java:87)
at processing.video.Capture.initGStreamer(Unknown Source)
at processing.video.Capture.<init>(Unknown Source)
at (my sketch)
The Maven POM file is here. I end up with the following libraries on the class path:
com.googlecode.gstreamer-java:gstreamer-java:1.5
net.java.dev.jna:jna:4.0.0
net.java.dev.jna:platform:3.4.0
org.processing:core:2.2.1
org.processing:video:2.2.1
My intuition says there is a mismatch between jna and platform - should they have the same version? That would indicate that the published POM is wrong. Which version does the Processing standalone use? Unfortunately the jars there are stripped of version information.
Indeed, it seems the processing POM specifies an incompatible JNA version. In sbt, I could fix this with a dependencyOverrides declaration:
def processingVersion = "2.2.1"
def gstreamerVersion = "1.5"
def jnaVersion = "3.4.0"
libraryDependencies ++= Seq(
"org.processing" % "video" % processingVersion,
"com.googlecode.gstreamer-java" % "gstreamer-java" % gstreamerVersion
)
dependencyOverrides += "net.java.dev.jna" % "jna" % jnaVersion // !
For Gradle peeps, that's:
implementation ("org.processing:core:3.3.7") {
    exclude group: 'net.java.dev.jna'
}
// https://mvnrepository.com/artifact/org.processing/video
implementation ("org.processing:video:3.3.7") {
    exclude group: 'net.java.dev.jna'
}
// higher JNA versions have an abstract Structure.getFieldOrder which gstreamer-java doesn't implement
implementation "net.java.dev.jna:jna:3.4.0"
