ScalaTest for Scala 2.11.4? - maven

Is there a ScalaTest available for Scala 2.11.4 in Maven?
I tried what appeared to be the latest version.
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>3.0.0-SNAP2</version>
<scope>test</scope>
</dependency>
However, when building via mvn clean install, I see:
[WARNING] org.scalatest:scalatest_2.11:3.0.0-SNAP2 requires scala version: 2.11.2

http://www.scalatest.org/download
There you'll find all the configuration examples; I think this one is suitable for you.
Here's the code for your Maven POM (for Scala 2.11.0+):
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>2.2.4</version>
<scope>test</scope>
</dependency>
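If the "requires scala version" warning still shows up, one thing you could try (a sketch, assuming your build compiles Scala with net.alchim31.maven:scala-maven-plugin, the plugin that typically emits this check) is to have the plugin compare against the binary-compatible 2.11 series instead of the exact patch release:
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<configuration>
<!-- assumption: check dependencies against the 2.11 binary series, not the exact 2.11.x patch -->
<scalaCompatVersion>2.11</scalaCompatVersion>
</configuration>
</plugin>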
Also, why don't you use sbt? It's a better build tool for pure Scala development.

Related

WebDriverManager: The import io.github cannot be resolved

I added the below WebDriverManager Maven dependency in pom.xml:
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
In my Java class I am unable to import io.github.bonigarcia.wdm.WebDriverManager automatically. If I manually write the import, I get an error at io.github which says: The import io.github cannot be resolved.
What is the issue here? I tried clean, restart, and different versions of webdrivermanager in pom.xml.
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
The dependency you used is restricted to <scope>test</scope>, but what does that actually mean?
It indicates that the dependency is NOT required for normal use of the application; it is only available for the test compilation and test execution phases, not when compiling or running your main code.
The default scope is compile. Compile dependencies are available in all classpaths of the project.
EDIT:
<scope>test</scope> makes the dependency available only to your tests. What does that mean in practice?
It means the dependency is only on the classpath for the src/test folder of your project.
The default scope (compile) makes the dependency available to both src/main AND src/test. So if you write classes that manage WebDriver and put them under src/main, you should use a scope that makes the dependency available at compile time:
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>4.2.2</version>
<scope>compile</scope>
</dependency>
Replace the scope with compile instead of test and the import will resolve.
You can also omit the scope entirely; that works too, because compile is the default:
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>4.4.3</version>
</dependency>
Windows 10
Eclipse IDE for Enterprise Java Developers - 2020-12
Java JDK 15.0.2
Maven 3.6.3

In Maven, what is the difference between `package:artifact:jar:version` and `package:artifact:jar:tests:version`?

Using Maven 3.0.5
I'm trying to get spark-testing-base from com.holdenkarau to work with Hadoop 3.1. holdenkarau's dependency tree includes Hadoop 2.8.3, which is why I think I'm getting errors.
From my mvn dependency:tree I see the following lines:
[INFO] +- org.apache.hadoop:hadoop-common:jar:3.1.0:provided
...
[INFO] | +- org.apache.hadoop:hadoop-common:jar:tests:2.8.3:test
These lines come from these two lines in the pom.xml file:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>3.1.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.holdenkarau</groupId>
<artifactId>spark-testing-base_${scala.compat.version}</artifactId>
<version>${spark.version}_0.12.0</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils-core</artifactId>
</exclusion>
</exclusions>
</dependency>
I basically have two related questions:
What is the difference between org.apache.hadoop:hadoop-common:jar:3.1.0 and org.apache.hadoop:hadoop-common:jar:tests:2.8.3? What is that extra tests in there for? Where does it come from and what does it mean?
If I have a dependency that uses an older version of a package in the test scope, how do I force it to use a newer version; i.e., how do I force spark-testing-base to use Hadoop 3.1 in the test scope?
tests is called a classifier; that artifact contains code that is really only useful in the context of testing, such as an embedded HDFS system.
You could explicitly try pulling in a newer version like so, assuming it exists:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>3.1.0</version>
<scope>test</scope>
<classifier>tests</classifier>
</dependency>
You may also want to exclude the same artifact from the other dependency; however, you might then run into build issues, since that library is only written and tested against Hadoop 2.8.3.
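For example, here is a sketch of that exclusion applied to the spark-testing-base dependency from the question (Maven exclusions match on groupId and artifactId only, so this removes both the plain and the tests-classified hadoop-common artifacts that come in transitively):
<dependency>
<groupId>com.holdenkarau</groupId>
<artifactId>spark-testing-base_${scala.compat.version}</artifactId>
<version>${spark.version}_0.12.0</version>
<scope>test</scope>
<exclusions>
<!-- drop the Hadoop 2.8.3 artifacts pulled in transitively -->
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
</exclusion>
</exclusions>
</dependency>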

How to use com.fasterxml.jackson 2.8.1 and 2.6.5 in the same module of a Maven project?

I have a module which has Spark 2.1.0 and Presto 0.166.
Spark 2.1.0 requires com.fasterxml.jackson version 2.6.5 while Presto 0.166 strictly requires 2.8.1. How can I resolve this in the same pom.xml so that I can run them in the same module?
Simply specify the version of com.fasterxml.jackson in your pom file. The version declared directly here will override the versions requested by Spark 2.1.0 and Presto 0.166:
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.8.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>com.facebook.presto</groupId>
<artifactId>presto...</artifactId>
<version>0.166</version>
</dependency>
Since Spark 2.1.0 can work with com.fasterxml.jackson 2.8.1, you won't need two different versions of it in your module.
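If you would rather not list jackson-core as a direct dependency, a dependencyManagement entry is a sketch of an alternative way to pin the version; it controls what transitive resolution picks without adding the artifact to your own compile scope:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<!-- pin the version used wherever jackson-core is pulled in transitively -->
<version>2.8.1</version>
</dependency>
</dependencies>
</dependencyManagement>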
Resources -
Introduction to the Dependency Mechanism
You cannot use multiple versions of the same dependency in a single pom.xml. Exclude the com.fasterxml.jackson dependency from either Spark 2.1.0 or Presto 0.166, for example:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
</exclusions>
</dependency>
Are you trying to write a plugin for Presto? If so, the Presto SPI explicitly depends on only jackson-annotations and not the implementation. There should be no problem with using the newer version of the annotations with an older version of Jackson within your plugin. The version of Jackson used by the Presto engine can and will be different from the one used by your plugin as plugins are loaded in a separate class loader.
The Presto plugin system is designed to have very minimal dependencies and allow you to use whatever versions of libraries you want (as that is often necessary when writing a connector to a random system that uses older versions of libraries).

SolrJ with Maven

I am a newbie with Solr and Maven, and I want to make a small application that indexes all my database tables via SolrJ.
For that I looked at this tutorial, where they are using Maven.
I installed the libraries and jars (except Maven) but I got this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/http/HttpRequestInterceptor
I looked into the tutorial and saw that to resolve this problem I need to add this to my Maven configuration:
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.5.6</version>
</dependency>
Is there any way to do that without Maven?
Thank you
Use Maven. Even with it, it took me a fairly considerable amount of time to get the dependencies right; the tutorials were all a bit lacking. Below is my pom.xml with the relevant dependencies that I had Maven bring in. Perhaps it will help you.
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-core</artifactId>
<version>4.3.0</version>
</dependency>
<dependency>
<artifactId>solr-solrj</artifactId>
<groupId>org.apache.solr</groupId>
<version>4.3.0</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
Maven is the suggested build technology for SolrJ because it automates the management of third-party dependencies. Without dependency management it's a royal pain to decipher these relationships (jar hell).
What I can suggest is to use Ivy, which has a command-line mode.
First, download the Ivy jar:
http://search.maven.org/remotecontent?filepath=org/apache/ivy/ivy/2.3.0/ivy-2.3.0.jar
To retrieve the following Maven module and all its dependencies:
<dependency>
<artifactId>solr-solrj</artifactId>
<groupId>org.apache.solr</groupId>
<version>1.4.0</version>
<type>jar</type>
<scope>compile</scope>
</dependency>
Then run it as follows:
java -jar ivy.jar \
-dependency org.apache.solr solr-solrj 1.4.0 \
-retrieve "lib/[artifact]-[revision](-[classifier]).[ext]" \
-confs default
Retrieves into the lib directory:
lib/commons-httpclient-3.1.jar
lib/wstx-asl-3.2.7.jar
lib/slf4j-api-1.5.5.jar
lib/commons-codec-1.3.jar
lib/stax-api-1.0.1.jar
lib/geronimo-stax-api_1.0_spec-1.0.1.jar
lib/commons-logging-1.0.4.jar
lib/solr-solrj-1.4.0.jar
lib/commons-io-1.4.jar
lib/commons-fileupload-1.2.1.jar
Update
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/http/HttpRequestInterceptor
This is due to a missing httpcore.jar file. I found this out by browsing Maven Central:
http://search.maven.org/#search|ga|1|fc%3A%22org.apache.http.HttpRequestInterceptor%22
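If you do end up letting a tool manage this (Maven or Ivy), the missing class lives in the Apache HttpComponents httpcore artifact. Here is a minimal sketch of the extra dependency; the version below is an assumption, so align it with whatever httpclient/httpcore your SolrJ release expects:
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpcore</artifactId>
<!-- version is an assumption; match the httpclient version SolrJ pulls in -->
<version>4.3.2</version>
</dependency>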
The recommendation to use "slf4j-simple" is there to provide a logging implementation in case your application doesn't have one.
Finally, this demonstrates what I've been trying to say: in the absence of a dependency management tool (Ivy, Groovy, Maven) you're on your own in deciphering the third-party jar dependencies.

pig-0.9.0.pom does not contain all its runtime dependencies, unlike pig-0.8.1-cdh3u1.pom

Maven noob here, be patient...
I'm upgrading from CDH3u1 to Apache Hadoop 0.20.203.0 and Pig 0.9.0. I used to have:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>0.20.2-cdh3u1</version>
</dependency>
<dependency>
<groupId>org.apache.pig</groupId>
<artifactId>pig</artifactId>
<version>0.8.1-cdh3u1</version>
</dependency>
and running them from inside Eclipse with a JUnit run configuration worked great.
Now I have:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>0.20.203.0</version>
</dependency>
<dependency>
<groupId>org.apache.pig</groupId>
<artifactId>pig</artifactId>
<version>0.9.0</version>
</dependency>
and I got NoClassDefFoundError: jline/ConsoleReaderInputStream at runtime.
I ended up adding all these dependencies manually until it worked:
<dependency>
<groupId>jline</groupId>
<artifactId>jline</artifactId>
<version>0.9.94</version>
</dependency>
<dependency>
<groupId>org.antlr</groupId>
<artifactId>antlr-runtime</artifactId>
<version>3.2</version> <!-- this is 3.0.1 in cdh3u1, but probably changed in pig 0.9.0 -->
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>r06</version>
</dependency>
What gives? Why isn't Maven automatically pulling in these dependencies and putting them on the classpath?
Maven has a feature called transitive dependencies, so you don't have to specify the libraries that your own dependencies require.
ConsoleReaderInputStream is in the JLine JAR. When you were using Pig 0.8.1-cdh3u1, you didn't have to add the JLine dependency because it is declared in pig-0.8.1-cdh3u1.pom. pig-0.9.0.pom no longer declares a JLine dependency; that's why you had to add it yourself. As for why JLine was removed from Pig's pom, you would have to ask the developers of that project.
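To see this for yourself, you can filter the dependency tree for jline under each Pig version; a quick check, not a fix:
mvn dependency:tree -Dincludes=jline
With pig 0.8.1-cdh3u1, jline shows up as a transitive dependency of Pig; with pig 0.9.0 it only appears if you declare it yourself.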
