Eclipse plugin error for Hadoop on Ubuntu

I installed Hadoop version 1.0.3 and its related Eclipse plugin successfully. All the Hadoop functionality and examples work well, but when I try to use the plugin in Eclipse, it cannot connect to HDFS and I get the error:
An internal error occurred during: "Connecting to DFS localhost".
org/apache/commons/configuration/Configuration.
Could anybody help me solve this problem?
Thanks

You are facing this problem because the plugin is missing some necessary jars. To solve it, you need to rebuild the plugin after including the necessary jars. I have seen this kind of question a lot on SO, and they all point to the same thing. Please see these links:
Eclipse Hadoop plugin issue(Call to localhost/127.0.0.1:50070 )Can any body give me the solution for this?
Hadoop eclipse mapreduce is not working?
Installing Hadoop's Eclipse Plugin

I followed the instructions in this blog post to build the Hadoop Eclipse plugin 1.0.4:
http://iredlof.com/part-4-compile-hadoop-v1-0-4-eclipse-plugin-on-ubuntu-12-10/
but it seems to have some missing parts, for example:
In MANIFEST.MF you should add:
/lib/commons-cli-1.2.jar
And in build-contrib.xml you should also add:
<property name="commons-cli.version" value="1.2"/>
I hope these are useful!
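For reference, a rough sketch of how the Bundle-ClassPath entry in META-INF/MANIFEST.MF might end up looking after that change (the exact jar names and versions are assumptions and depend on your Hadoop build):
Bundle-ClassPath: classes/,
 lib/hadoop-core-1.0.4.jar,
 lib/commons-configuration-1.6.jar,
 lib/commons-cli-1.2.jar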

You must start Hadoop from the command line first:
./[hadoop-path]/bin/start-all.sh

Related

Spark Cassandra NoClassDefFoundError guava/cache/CacheLoader

Running Cassandra 2.2.8, Win7, JDK8, Spark 2. Have these in the classpath: Cassandra core 3.12, spark-cassandra-2.11, spark-cassandra-java-2.11, Spark 2.11, spark-network-common_2.11, Guava-16.0.jar, scala2.11.jar, etc.
Trying to run a basic example: it compiles fine, but when I try to run it, I get an error at the very first line:
SparkConf conf = new SparkConf();
java.lang.NoClassDefFoundError: org/spark_project/guava/cache/CacheLoader
A missing spark-network-common is supposed to cause this error, but I do have it. Any conflicting jars?
Thanks
So the answer is: I don't exactly know the answer, but the problem was solved. I used the pom and created a Maven project in Eclipse. It brought in several dozen jars and it finally worked. So it was likely some conflicting/missing jar; I tried to look into it, but it was hard to figure out.
Maybe you should check your local repository for whether the jar has a .lastUpdated marker. If it does, delete those files and download again.
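A minimal sketch of that cleanup, assuming a Unix-like shell and the default local repository under ~/.m2 (the path and the -U re-resolve step are assumptions; adjust to your setup):
# remove stale .lastUpdated markers left behind by failed downloads (assumed repo path)
find ~/.m2/repository -name "*.lastUpdated" -delete
# force Maven to re-download the affected artifacts
mvn -U clean package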

Missing of hadoop-mapreduce-client-core-[0-9.]*.jar in hadoop1.2.1

I have installed Hadoop 1.2.1 on a three-node cluster. While installing Oozie, when I try to generate a war file for the web console, I get this error:
hadoop-mapreduce-client-core-[0-9.]*.jar' not found in '/home/hduser/hadoop'
I believe the version of Hadoop that I am using doesn't have this jar file (and I don't know where to find it). So can anyone please tell me how to create the war file and enable the web console? Any help is appreciated.
You are correct. You have two options:
1. Download the individual jars, put them inside your hadoop-1.2.1 directory, and generate the war file (see the sketch after this list).
2. Download Hadoop 2.x and use it while creating the war file, and once it has been built, continue using your hadoop-1.2.1.
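For option 1, a rough sketch of fetching the missing jar from Maven Central and dropping it into the Hadoop directory (the version number and target path are assumptions; pick whatever matches your setup):
# download the missing client-core jar (2.2.0 is just an example version)
wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-core/2.2.0/hadoop-mapreduce-client-core-2.2.0.jar
# drop it where oozie-setup.sh looks for the Hadoop jars
cp hadoop-mapreduce-client-core-2.2.0.jar /home/hduser/hadoop/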
For option 2, for example, from the oozie-3.3.2 directory:
bin/oozie-setup.sh prepare-war -hadoop hadoop-1.1.2 ~/hadoop-eco/hadoop-2.2.0 -extjs ~/hadoop-eco/oozie/oozie-3.3.2/webapp/src/main/webapp/ext-2.2
Here I have built Oozie 3.3.2 for use with hadoop-1.1.2, but using the hadoop-2.2.0 jars.
HTH

Hadoop and PiggyBank incompatibility

I am trying to use org.apache.pig.piggybank.storage.MultiStorage from piggybank.jar archive. I downloaded pig trunk and built piggybank.jar by following the instructions here. However, I get the error below when I use the MultiStorage class.
Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
By looking here, it looks like there is a version incompatibility between the piggybank build and the Hadoop version, but I am not able to fix this issue. I would really appreciate any help on this (I have spent an inordinate amount of time on it already).
pig.hadoop.version: 2.0.0-cdh4.1.0
> hadoop version
Hadoop 2.0.0-cdh4.1.0
Subversion file:///data/1/jenkins/workspace/generic-package-ubuntu64-10-04/CDH4.1.0-Packaging-Hadoop-2012-09-29_10-56-25/hadoop-2.0.0+541-1.cdh4.1.0.p0.27~lucid/src/hadoop-common-project/hadoop-common -r 5c0a0bddbc2aaff30a8624b5980cd4a2e1b68d18
Compiled by jenkins on Sat Sep 29 11:26:31 PDT 2012
From source with checksum 95f5c7f30b4030f1f327758e7b2bd61f
Though I am not able to figure out how to build a compatible piggybank.jar, I found that a compatible piggybank.jar is located under /usr/lib/pig/.
I faced a similar issue when I used piggybank version 0.13 with Hadoop version Hadoop 2.4.0.2.1.5.0-695. It however worked when I used the piggybank jar in the location you mentioned -- /usr/lib/pig.
An additional observation I made is that the piggybank jar in /usr/lib/pig is quite old and does not have XPath and other functions available. I believe the newer piggybank jar has dependencies on a later Hadoop version.
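If it helps, a minimal sketch of pointing a Pig script at that bundled jar (the paths, input schema, and split field index are made up for illustration):
-- register the piggybank that ships with the installed Pig rather than a self-built one
REGISTER /usr/lib/pig/piggybank.jar;
records = LOAD 'input' AS (id:chararray, value:chararray);
-- write one output directory per distinct value of field 0 (id)
STORE records INTO 'output' USING org.apache.pig.piggybank.storage.MultiStorage('output', '0');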

Problems compiling Hadoop

Here's the problem: I have written a simple Hadoop program to "clean" a graph saved in a text file, which I will use later (with Hadoop), but I can't compile it!
The compiler can't find the Hadoop classes (IntWritable, Text, etc.), and each time I get a "cannot find symbol" error.
I've tried with:
javac -classpath path/to/hadoop/root/hadoop-core-{version}.jar filename.java
I'm running Ubuntu 11.04, and the Hadoop version is 1.0.3.
The problem is that hadoop-core-{version}.jar depends on some other jars. You can find all the dependencies on the Maven repository website:
http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/1.0.3
You should use Maven or add all the dependencies to your project to be able to build it.
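A minimal sketch of the second approach, assuming the Hadoop 1.0.3 release is unpacked under /usr/local/hadoop (the path is an assumption; classpath wildcards require Java 6 or later):
# compile against hadoop-core plus every jar in its lib/ directory
javac -classpath "/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/hadoop/lib/*" filename.java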

Eclipse plugins in hadoop on windows

I'm new to Hadoop. I am trying to install Hadoop on my Windows machine with the help of the following link: http://blog.v-lad.org/archives/4#comment-43
I'm using Eclipse IDE: 3.3.1
Java JDK: 1.6.0_24
Hadoop: 0.21.0
Everything installs fine, but in the Eclipse IDE, when I select "New Hadoop location", the action is not performed. I don't understand the problem. Can anyone help me?
Have you added the eclipse-plugin to the installation directory of eclipse?
...\hadoop-0.21.0\mapred\contrib\eclipse-plugin\hadoop-0.21.0-eclipse-plugin.jar
to
...\eclipse\dropins
P.S. The Eclipse plugin for Hadoop 0.21.0 is not complete.
You can download a revised one at http://www.lifeba.org/wp-content/uploads/2012/03/hadoop-0.21.0-eclipse-plugin-3.6.rar
Although it's for Eclipse 3.6, I suppose it's also compatible with Eclipse 3.3.1.
——————————————————————————————
Oh, I just noticed the time... this was years ago... sorry.
I hope it is useful for those who have encountered this problem recently.
