I have Hadoop and HBase installed, and both work fine as far as I can tell. When trying to run the built jar with hadoop, I get a
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
error, using HBase version 0.90.2 in my Maven dependency.
This is quite an old version of HBase, and I am unsure whether it is compatible with Hadoop 2.7.2 or even Java 8. So I tried using HBase version 0.99.2 in my Maven dependency instead, but then I get a
Failed to execute goal on project exercise_2: Could not resolve dependencies for project com.company.exercise_2:exercise_2:jar:1.0-SNAPSHOT: Failure to find org.apache.hbase:hbase:jar:0.99.2 in http://repo.maven.apache.org/maven2 was cached in the local repository
error from the Maven plugin. What am I doing wrong?
Here is my pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.99.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.1.2</version>
    <scope>provided</scope>
</dependency>
This looks like a jar caching issue; HBaseConfiguration is a common class regardless of which version of HBase is used.
Try manually deleting the HBase files from your local repository and running your mvn XXXX command once again.
Maven will then try to download the jars and fix the classpath.
For cross-checking, use the mvn ... -X option to see which version of the jar it is trying to download.
Since the scope of this jar is provided, cross-check the HBase version of this jar on your cluster by running "hbase classpath", and check whether that version closely matches the jar version resolved from your pom.xml.
That should fix it.
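A minimal shell sketch of those steps, assuming the default ~/.m2 local repository location and a package goal (both assumptions, since the original build command is elided):

# Remove the cached HBase artifacts so Maven re-resolves them
rm -rf ~/.m2/repository/org/apache/hbase

# Rebuild with debug output (-X) to see which jar versions Maven fetches
mvn clean package -X

# On the cluster, list the HBase jars actually provided at runtime
hbase classpath

Because the HBase dependencies are provided scope, the versions on the cluster classpath are the ones your job actually runs against, so they need to line up with what the code was compiled against.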
I have added a <scope>provided</scope> dependency to my Maven pom.xml for my Java adapter, because the referenced jars are provided by my application server, but Maven install is still including the jars in the .adapter file.
Is this the expected behavior? Is this a Maven issue, or something related to MobileFirst Platform Foundation's use of it? The MFP-provided libraries, by contrast, do seem to be correctly excluded.
Not included in .adapter file:
<dependency>
    <groupId>com.ibm.mfp</groupId>
    <artifactId>adapter-maven-api</artifactId>
    <scope>provided</scope>
    <version>8.0.2017012516</version>
</dependency>
Included:
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.3</version>
    <scope>provided</scope>
</dependency>
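To verify which jars actually end up inside the built artifact, you can list its contents directly; the file name below is hypothetical, and this assumes the .adapter file is zip-packaged like a jar:

# List any bundled jars inside the built adapter archive
jar tf target/my-adapter.adapter | grep '\.jar$'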
I configured my Nexus to proxy maven.oracle.com and I am able to browse the index, so I know it worked. Oracle's documentation from this year (https://blogs.oracle.com/dev2dev/entry/oracle_maven_repository_instructions_for) gives this as the dependency for their driver:
<dependency>
    <groupId>com.oracle.jdbc</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.0.2</version>
</dependency>
But I cannot find it there, and Maven can't resolve it. I do find an ojdbc7 jar under a different GAV in their Maven repo, though:
<dependency>
    <groupId>com.oracle.weblogic</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.3-0-0</version>
</dependency>
Have I configured it wrong? Or is their doc wrong?
I am migrating my application from Hadoop 1.0.3 to Hadoop 2.2.0, and the Maven build had hadoop-core marked as a dependency. Since hadoop-core is not present for Hadoop 2.2.0, I tried replacing it with hadoop-client and hadoop-common, but I am still getting this error for ant.filter. Can anybody please suggest which artifact to use?
Previous config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.3</version>
</dependency>
New config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>
Error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project event: Compilation failure: Compilation failure:
[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist
[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist
[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[180,59] cannot find symbol
[ERROR] symbol: class StringInputStream
[ERROR] location: class com.intel.event.EventContext
We mainly depend on the HDFS API for our application. When we migrated to Hadoop 2.x, we were surprised by the changes in dependencies, so we started adding them one at a time. Today we depend on the following core libraries:
hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-common-2.2.0
hadoop-hdfs-2.2.0
hadoop-mapreduce-client-core-2.2.0
In addition to these, we depend on test libraries too. Based on your needs, you may want to include hadoop-hdfs and hadoop-mapreduce-client in the dependencies along with hadoop-common.
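Note that your compile error points at org.apache.tools.ant.filters.StringInputStream, which lives in Ant rather than in any Hadoop artifact; the old hadoop-core 1.x jar presumably pulled Ant in transitively. A hedged sketch of declaring it directly (the version is an assumption; match whatever your build used before):

<dependency>
    <!-- StringInputStream comes from Ant; hadoop-client no longer drags it in -->
    <groupId>org.apache.ant</groupId>
    <artifactId>ant</artifactId>
    <version>1.9.4</version>
</dependency>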
Try these artifacts; they worked fine on my sample wordcount project:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
The Maven dependencies can be found at this link.
As far as the hadoop-core dependency goes, hadoop-core was the name for Hadoop 1.x, and just changing the version to 2.x won't help. Also, using the Hadoop 1.x dependency in a Hadoop 2.x project gives an error like
Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
so it is suggested not to use it. I have been using the following dependencies in my Hadoop projects:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>
You can try these.
I am a newbie with Solr and Maven, and I want to make a small application that indexes all my database tables via SolrJ.
For that I looked at this tutorial, where they are using Maven.
I installed the libraries and jars (except Maven), but I got this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/http/HttpRequestInterceptor
I looked into the tutorial and saw that to resolve this problem I need to add this to my Maven configuration:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.5.6</version>
</dependency>
Is there any way to do that without Maven?
Thank you
Use Maven. Even with it, it took me a fairly considerable amount of time to get the dependencies right, and the tutorials were all a bit lacking. Below is my pom.xml with the relevant dependencies that I had Maven bring in. Perhaps it will help you.
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-core</artifactId>
    <version>4.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>4.3.0</version>
    <type>jar</type>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>commons-logging</groupId>
    <artifactId>commons-logging</artifactId>
    <version>1.1.1</version>
</dependency>
<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>servlet-api</artifactId>
    <version>2.5</version>
</dependency>
Maven is the suggested build technology for SolrJ because it automates the management of third-party dependencies. Without dependency management it's a royal pain to decipher these relationships (jar hell).
What I would suggest is to use Apache Ivy, which has a command-line mode.
First, download the Ivy jar:
http://search.maven.org/remotecontent?filepath=org/apache/ivy/ivy/2.3.0/ivy-2.3.0.jar
To retrieve the following Maven module and all its dependencies:
<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>1.4.0</version>
    <type>jar</type>
    <scope>compile</scope>
</dependency>
Then run it as follows:
java -jar ivy.jar \
-dependency org.apache.solr solr-solrj 1.4.0 \
-retrieve "lib/[artifact]-[revision](-[classifier]).[ext]" \
-confs default
This retrieves the following into the lib directory:
lib/commons-httpclient-3.1.jar
lib/wstx-asl-3.2.7.jar
lib/slf4j-api-1.5.5.jar
lib/commons-codec-1.3.jar
lib/stax-api-1.0.1.jar
lib/geronimo-stax-api_1.0_spec-1.0.1.jar
lib/commons-logging-1.0.4.jar
lib/solr-solrj-1.4.0.jar
lib/commons-io-1.4.jar
lib/commons-fileupload-1.2.1.jar
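From there you can compile and run without any build tool at all; MyIndexer below is a hypothetical class name standing in for your application:

# Compile and run directly against the retrieved jars (no Maven or Ivy needed)
javac -cp "lib/*" MyIndexer.java
java -cp ".:lib/*" MyIndexer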
Update
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/http/HttpRequestInterceptor
This is due to a missing httpcore.jar file. I found this out by browsing Maven Central:
http://search.maven.org/#search|ga|1|fc%3A%22org.apache.http.HttpRequestInterceptor%22
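A hedged sketch of the dependency that supplies that class (the version is an assumption; align it with whatever SolrJ release you are actually using):

<dependency>
    <!-- org.apache.http.HttpRequestInterceptor lives in httpcore -->
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpcore</artifactId>
    <version>4.2.2</version>
</dependency>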
The recommendation to use "slf4j-simple" is there to provide a logging implementation in case your application doesn't have one.
Finally... this demonstrates what I've been trying to say: in the absence of a dependency management tool (Ivy, Groovy's Grape, Maven) you're on your own in deciphering the third-party jar dependencies.
Maven noob, be patient...
I'm upgrading from cdh3u1 to Apache Hadoop 0.20.203.0 and Pig 0.9.0. I used to have:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>0.20.2-cdh3u1</version>
</dependency>
<dependency>
    <groupId>org.apache.pig</groupId>
    <artifactId>pig</artifactId>
    <version>0.8.1-cdh3u1</version>
</dependency>
and running them from inside Eclipse with a JUnit run configuration worked great.
Now I have:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>0.20.203.0</version>
</dependency>
<dependency>
    <groupId>org.apache.pig</groupId>
    <artifactId>pig</artifactId>
    <version>0.9.0</version>
</dependency>
and at runtime I got NoClassDefFoundError: jline/ConsoleReaderInputStream.
I ended up adding all these dependencies manually until it worked:
<dependency>
    <groupId>jline</groupId>
    <artifactId>jline</artifactId>
    <version>0.9.94</version>
</dependency>
<dependency>
    <groupId>org.antlr</groupId>
    <artifactId>antlr-runtime</artifactId>
    <version>3.2</version> <!-- this is 3.0.1 in cdh3u1, but probably changed in Pig 0.9.0 -->
</dependency>
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>r06</version>
</dependency>
What gives? Why isn't Maven automatically pulling in my dependencies and putting them on the classpath?
Maven has a feature called transitive dependencies, so you don't have to specify the libraries that your own dependencies require.
ConsoleReaderInputStream is in the JLine jar. When you were using Pig 0.8.1-cdh3u1, you didn't have to add the JLine dependency because it is declared in the pig-0.8.1-cdh3u1 POM. The Pig 0.9.0 POM no longer declares a JLine dependency, and that's why you had to add it yourself. As for why JLine was removed from Pig, you would have to ask the developers of that project.
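You can see exactly which transitive dependencies each Pig version declares by asking Maven for the dependency tree; a minimal usage sketch:

# Print the resolved dependency tree, filtered to JLine; it shows up under
# pig 0.8.1-cdh3u1 but not under pig 0.9.0
mvn dependency:tree -Dincludes=jline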