How to import a Maven dependency using SBT

I am trying to get embedded-cassandra into my Scala/Play project, which uses sbt instead of Maven (https://github.com/nosan/embedded-cassandra/wiki).
I translated the following Maven dependencies into sbt:
<!-- Core API -->
<dependency>
<groupId>com.github.nosan</groupId>
<artifactId>embedded-cassandra</artifactId>
<version>2.0.1</version>
</dependency>
<!-- Test Extensions (Spring, JUnit, etc.) -->
<dependency>
<groupId>com.github.nosan</groupId>
<artifactId>embedded-cassandra-test</artifactId>
<version>2.0.1</version>
<scope>test</scope>
</dependency>
SBT conversion
"com.github.nosan"%"embedded-cassandra" % "2.0.1" % "test"
But I get a compilation error when I try to import embedded-cassandra in my unit test.
import com.github.nosan.embedded.cassandra.Cassandra
error
Error:(7, 12) object github is not a member of package com
import com.github.nosan.embedded.cassandra.Cassandra
What am I doing wrong?

It turns out the issue was that sbt hadn't downloaded the dependency. I re-imported the project and things worked. I also made another change: I removed the % "test" from the sbt entry, though to be honest I don't know whether that had any implications.
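For reference, a full sbt translation of the two Maven dependencies above would look roughly like this in build.sbt (Maven's <scope>test</scope> corresponds to sbt's Test configuration, and since these are plain Java artifacts the single % is used rather than %%):
libraryDependencies ++= Seq(
  // Core API (compile scope)
  "com.github.nosan" % "embedded-cassandra" % "2.0.1",
  // Test extensions, only on the test classpath
  "com.github.nosan" % "embedded-cassandra-test" % "2.0.1" % Test
)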

Related

How to include kotlin.test properly via Maven?

Our team is taking its first steps with Kotlin and I'm about to migrate a test. I tried a first example from mockk (https://github.com/mockk/mockk/blob/master/mockk/common/src/test/kotlin/io/mockk/it/InjectMocksTest.kt). For some reason it seems I'm not able to use kotlin.test, although I have added it via Maven. Do I have to include any other modules? Or do I have to change something else?
(The mockk example uses Gradle, so it doesn't help me.)
The kotlin.test imports I'd like to use in my Kotlin test file can't be found (at least not the packages I need).
(Restarting IntelliJ doesn't help, nor does running mvn separately.)
This is my Maven dependency (IntelliJ shows no error):
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-test</artifactId>
<version>${kotlin.version}</version> <!-- kotlin.version == 1.7.0 -->
<scope>test</scope>
</dependency>
The solution (see hotkey's answer) was to add the following Maven dependency:
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-test-junit5</artifactId>
<version>1.7.0</version>
<scope>test</scope>
</dependency>
You need to add a dependency on one of kotlin-test-junit (for JUnit 4), kotlin-test-junit5 (for JUnit Jupiter), or kotlin-test-testng (for TestNG), depending on what test framework you are using.
The artifact kotlin-test contains only the common code, asserters and other stuff that can be reused across the frameworks.
The kotlin.test annotations like @Test or @BeforeTest are shipped in the test-framework-specific artifacts as typealiases for the actual annotation types of those test frameworks.
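As a rough illustration, with kotlin-test-junit5 on the test classpath a test written purely against kotlin.test might look like this (the class and assertion are made up for the example):
import kotlin.test.BeforeTest
import kotlin.test.Test
import kotlin.test.assertEquals

class ExampleTest {

    @BeforeTest
    fun setUp() {
        // runs before each test; kotlin-test-junit5 maps this to JUnit 5's @BeforeEach
    }

    @Test
    fun additionWorks() {
        // @Test here is kotlin.test.Test, a typealias for org.junit.jupiter.api.Test
        assertEquals(4, 2 + 2)
    }
}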

Import of amqp from org.springframework gets an error

I'm working on an existing Scala project that uses the Spring framework, and I need to import org.springframework.amqp, but when I try to build the project I get:
Error:(15, 28) object amqp is not a member of package org.springframework
import org.springframework.amqp
It is really strange, since I can see it on the official website and in lots of examples on the web.
Any idea what is the problem?
A Maven dependency was missing. This is what I needed to add:
<dependency>
<groupId>org.springframework.amqp</groupId>
<artifactId>spring-amqp</artifactId>
<version>2.1.2.RELEASE</version>
</dependency>
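Since the question mentions a Scala project, it may be worth noting the equivalent entry in case the build uses sbt rather than Maven (same coordinates, just sbt syntax):
libraryDependencies += "org.springframework.amqp" % "spring-amqp" % "2.1.2.RELEASE"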

Why does spark-submit fail to find kafka data source unless --packages is used?

I am trying to integrate Kafka into my Spark app; here are the required entries from my POM file:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>${spark.stream.kafka.version}</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<version>${kafka.version}</version>
</dependency>
Corresponding artifact versions are:
<kafka.version>0.10.2.0</kafka.version>
<spark.stream.kafka.version>2.2.0</spark.stream.kafka.version>
I have been scratching my head over:
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html
I also tried supplying the jar with the --jars parameter, but it doesn't help. What am I missing here?
Code:
private static void startKafkaConsumerStream() {
    Dataset<HttpPackage> ds1 = _spark
        .readStream()
        .format("kafka")
        .option("kafka.bootstrap.servers", getProperty("kafka.bootstrap.servers"))
        .option("subscribe", HTTP_FED_VO_TOPIC)
        .load() // Getting the error here
        .as(Encoders.bean(HttpPackage.class));

    ds1.foreach((ForeachFunction<HttpPackage>) req -> System.out.print(req));
}
And _spark is defined as:
_spark = SparkSession
    .builder()
    .appName(_properties.getProperty("app.name"))
    .config("spark.master", _properties.getProperty("master"))
    .config("spark.es.nodes", _properties.getProperty("es.hosts"))
    .config("spark.es.port", _properties.getProperty("es.port"))
    .config("spark.es.index.auto.create", "true")
    .config("es.net.http.auth.user", _properties.getProperty("es.net.http.auth.user"))
    .config("es.net.http.auth.pass", _properties.getProperty("es.net.http.auth.pass"))
    .getOrCreate();
My imports are:
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.spark.api.java.function.ForeachFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;
However, when I run my code as mentioned here, with the --packages option:
--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0
it works.
Spark Structured Streaming supports Apache Kafka as the streaming source and sink using the external kafka-0-10-sql module.
The kafka-0-10-sql module is not available by default to Spark applications that are submitted for execution using spark-submit. The module is external, and to have it available you should define it as a dependency.
Unless you use kafka-0-10-sql module-specific code in your Spark application, you don't have to define the module as a dependency in pom.xml. You simply don't need a compilation dependency on the module, since no code uses the module's code. You code against interfaces, which is one of the reasons why Spark SQL is so pleasant to use (i.e. it requires very little code to have a fairly sophisticated distributed application).
spark-submit, however, will require the --packages command-line option, which you've reported worked fine.
However, when I run my code as mentioned here, with the --packages option:
--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.1.0
The reason it worked fine with --packages is that you have to tell the Spark infrastructure where to find the definition of the kafka format.
That leads us to the other "issue" (or rather a requirement) for running streaming Spark applications with Kafka: you have to specify the runtime dependency on the spark-sql-kafka module.
You specify a runtime dependency using the --packages command-line option (which downloads the necessary jars when you spark-submit your Spark application) or by creating a so-called uber-jar (or fat-jar).
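For illustration, the --packages route looks roughly like this on the command line (the main class and jar name below are placeholders, and the package version should match your Spark version):
spark-submit \
  --class com.example.HttpPackageStreamApp \
  --master yarn \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0 \
  my-spark-app.jar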
That's where pom.xml comes into play (and that's why people offered their help with pom.xml and the module as a dependency).
So, first of all, you have to specify the dependency in pom.xml.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.2.0</version>
</dependency>
And last but not least, you have to build an uber-jar, which you configure in pom.xml using the Apache Maven Shade Plugin.
With the Apache Maven Shade Plugin you create an uber-jar that includes all the "infrastructure" for the kafka format to work inside the Spark application jar file. As a matter of fact, the uber-jar will contain all the necessary runtime dependencies, so you can spark-submit with the jar alone (and no --packages option or similar).
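A rough sketch of what that plugin section in pom.xml could look like (the plugin version is illustrative; the ServicesResourceTransformer matters because Spark discovers data sources such as kafka through META-INF/services files, which have to be merged into the uber-jar):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <!-- merge META-INF/services entries so DataSourceRegister lookups still work -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>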
Add the dependency below to your pom.xml file.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.2.0</version>
</dependency>
Update your dependencies and versions. The dependencies given below should work fine:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.1.1</version>
</dependency>
PS: Note the provided scope in the first two dependencies.

Maven dependency for javax.mail

What is the Maven dependency I should add for
import javax.mail.*;
import javax.mail.internet.*;
Adding the Maven dependency from here (http://mvnrepository.com/artifact/javax.mail/mail/1.5.0-b01) results in an error where some of the Jersey dependencies cannot be retrieved. What do you think is going wrong?
Version 1.6.3 was the last version of JavaMail; since 2019-07-03, its new name is "Jakarta Mail".
Along with the name change, the Maven coordinates also changed:
<dependency>
<groupId>com.sun.mail</groupId>
<artifactId>jakarta.mail</artifactId>
<version>…</version>
</dependency>
The project homepage can be found here: https://eclipse-ee4j.github.io/mail/
We are using the following dependency:
<dependency>
<groupId>com.sun.mail</groupId>
<artifactId>javax.mail</artifactId>
</dependency>
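That entry omits a version (presumably it is managed elsewhere, e.g. in a parent POM); a self-contained declaration under the classic coordinates would pin one, for example:
<dependency>
<groupId>com.sun.mail</groupId>
<artifactId>javax.mail</artifactId>
<version>1.6.2</version>
</dependency>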

Adding POM type dependency using m2eclipse, unable to resolve

I am trying to add Hector dependencies to my POM. My IDE is Eclipse, and I am also using m2eclipse. Adding dependencies of JAR type is not a problem, but this dependency is of type POM. I have tried almost all the usual things, including cleaning, building, and using import scope, but nothing seems to have helped. When I try to add
import me.prettyprint.hector.api.Serializer;
I get the error "The import cannot be resolved".
Is there anything else I need to do to use POM-type dependencies, or is there a better way of using them in the project?
I believe this question is not as simple as just including the necessary dependency. I have experienced this problem too and am looking for a solution. The problem can be stated more clearly as follows:
Let's say I have two Maven projects (project A and project B). Project A is a simple web app which wants to include the dependencies stated in project B. However, project B's packaging type is "pom". This should allow all of project B's dependencies to be included in project A. Here is an example:
Project A (packaging is "war"):
<dependencies>
<dependency>
<groupId>com.foo</groupId>
<artifactId>B</artifactId>
<version>1.0</version>
<type>pom</type>
</dependency>
</dependencies>
Project B (packaging is "pom")
<dependencies>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.4</version>
</dependency>
</dependencies>
What we'd like to see in Eclipse is that, when you run mvn eclipse:eclipse on project A, the commons-lang-2.4.jar file shows up as a dependency under project A, so that you can resolve it in your code when imported. This is not happening, and I'm still looking for a solution.
The error indicates that the relevant class is missing from your classpath. A search for this class shows that it is available in hector-core.
This discussion shows how this dependency can be imported, viz. by adding the following entry to your project POM (or choosing it appropriately in m2eclipse):
<dependency>
<groupId>me.prettyprint</groupId>
<artifactId>hector-core</artifactId>
<version>0.7.0-29</version>
</dependency>
