hadoop ClassNotFoundException when running start-all.sh - hadoop

I tried to run ./hadoop start-all.sh
Unfortunately, this error is thrown:
Exception in thread "main" java.lang.NoClassDefFoundError: start/all/sh
Caused by: java.lang.ClassNotFoundException: start.all.sh
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
Could not find the main class: start.all.sh. Program will exit.
I thought it might have been the Hadoop path, but changing that does not fix the issue. The path I set in hadoop-env.sh is /usr/local/hadoop/bin.
I looked at other posts with similar titles, such as
Hadoop: strange ClassNotFoundException
asking what is considered the main class. I also tried changing the path to /usr/local/hadoop/bin/

It's a shell script, so running start-all.sh directly should do; you do not need the hadoop command. You can find more information here: http://hadoop.apache.org/common/docs/r0.19.2/quickstart.html

Just run it as follows:
/path/to/hadoop/home/bin/start-all.sh
In your case:
/usr/local/hadoop/bin/start-all.sh

Since you are already in the hadoop/bin folder, you do not need to run ./hadoop start-all.sh.
Instead, just run ./start-all.sh
It will not throw any error and will start your Hadoop processes.

Related

java.lang.NoClassDefFoundError: org/apache/hadoop/security/authorize/RefreshAuthorizationPolicyProtocol

I am currently trying to install Hadoop 2.3.0 on my cluster. However, when I run the command "bin/hdfs namenode -format", I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/security/authorize/RefreshAuthorizationPolicyProtocol
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.security.authorize.RefreshAuthorizationPolicyProtocol
Any ideas how to solve it?
There isn't much information in the question, but this error will occur if the environment variable HADOOP_COMMON_HOME is not correctly set to $HADOOP_HOME (the top-level directory where Hadoop is installed).
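As a minimal sketch, the environment setup that answer describes would look like this in hadoop-env.sh or your shell profile; the install path /usr/local/hadoop is an assumption, so point it at your own installation:

```shell
# Illustrative fragment; /usr/local/hadoop is an assumed install location.
export HADOOP_HOME=/usr/local/hadoop
# Point HADOOP_COMMON_HOME at the top-level Hadoop directory.
export HADOOP_COMMON_HOME=$HADOOP_HOME
```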

HiveServer Class Not Found Exception

When I run hive from the command prompt, it works absolutely fine. But when I try running HiveServer with the "hive --service hiveserver" command, I get the following exception:
Starting Hive Thrift Server
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hadoop.hive.service.HiveServer
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
I then tried the command "hive --service hiveserver2", but I am still not finding any solution.
Can anybody please suggest a solution to this problem?
Maybe another process (another HiveServer) is already listening on port 10000.
You can check with:
netstat -ntulp | grep ':10000'
If a process is found, kill it; otherwise, start the server on another port.
By the way, which version are you using?
This error occurred for me when Hadoop could not find hive-service-*.jar on its classpath. Either copy hive-service-*.jar into your Hadoop lib folder or export the classpath in hadoop-env.sh; I describe how to add it to the classpath below.
Add this line in hadoop-env.sh:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hive/lib/hive-*.jar
I used /usr/local/hive as the Hive path since that is where I have Hive installed; change it to point to your own installation.
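One caveat with that glob: the shell does not expand wildcards inside a variable assignment, and the Java launcher only understands a bare dir/* classpath wildcard, so hive-*.jar can end up on the classpath literally. A loop over the jars is a more robust sketch, assuming Hive is installed under /usr/local/hive:

```shell
# hadoop-env.sh fragment: append each Hive jar to Hadoop's classpath.
# /usr/local/hive is an assumption -- change it to your Hive location.
for jar in /usr/local/hive/lib/*.jar; do
  HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$jar
done
export HADOOP_CLASSPATH
```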

Hadoop getting error Exception in thread "main" java.lang.NoClassDefFoundError:

I am very new to Hadoop and MapReduce programming.
I downloaded version 1.2.1 and was trying to run one of the examples with the command:
bin/hadoop jar hadoop*example*.jar
With this command I get an exception. What is wrong here? Is there a problem with my installation?
Exception in thread "main" java.lang.NoClassDefFoundError: 1/2/1/hadoop-1/2/1/libexec////logs
Caused by: java.lang.ClassNotFoundException: 1.2.1.hadoop-1.2.1.libexec....logs
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:315)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:330)
at java.lang.ClassLoader.loadClass(ClassLoader.java:250)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:398)
The right command is:
bin/hadoop jar hadoop-examples-*.jar <program name>
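The wildcard placement matters: in Hadoop 1.2.1 the bundled jar is named hadoop-examples-1.2.1.jar, so the glob has to put the wildcard after "examples". You can check which glob matches with a stand-in file, no Hadoop install needed (the /tmp path is arbitrary):

```shell
# Create a stand-in for the examples jar and confirm the glob matches it.
mkdir -p /tmp/glob-demo
touch /tmp/glob-demo/hadoop-examples-1.2.1.jar
ls /tmp/glob-demo/hadoop-examples-*.jar
```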
If you are using a custom MapReduce class, try setting the jar by class in your main method:
job.setJarByClass(WordCount.class);
Reference: http://mydailylearningblog.blogspot.com.br/2011/06/javalangclassnotfoundexception.html

Problems running Manning's Hadoop in Practice 4.1 MapReduce code on Hadoop 1.0.3

I am attempting to run the 4.1 example code from Manning's "Hadoop in Practice" at http://www.manning.com/lam/
I am running Ubuntu 10.4 using hadoop 1.0.3 java 6.
Following the tutorial at http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/, I used the wordcount example to verify the installation.
I then tried running the 4.1 example using:
hduser#ubuntu:/usr/local/hadoop$ bin/hadoop jar MyJob.jar MyJob /user/hduser/4.1/input /user/hduser/4.1output
I get the error:
Exception in thread "main" java.lang.ClassNotFoundException: MyJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
The public run method in the example that works and the one in Manning's code appear to be different.
I appreciate your assistance!
Give the complete path of the jar. For example, if MyJob.jar is in your home directory, then:
hduser#ubuntu:/usr/local/hadoop$ bin/hadoop jar /home/hduser/MyJob.jar MyJob /user/hduser/4.1/input /user/hduser/4.1output
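The point about the complete path can be illustrated without Hadoop at all; every file and directory name below is a stand-in:

```shell
# A jar is only found by bare name if it sits in the current directory;
# an absolute path works from anywhere.
mkdir -p /tmp/path-demo/home /tmp/path-demo/hadoop
touch /tmp/path-demo/home/MyJob.jar
cd /tmp/path-demo/hadoop
ls MyJob.jar 2>/dev/null || echo "bare name: not found"
ls /tmp/path-demo/home/MyJob.jar
```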
I had the same problem with Hadoop 1.0.3.16 and Java 6, but I managed to get Manning's example 4.1 working by adding job.setJar("/path/to/MyJob.jar"); after job.setJobName("MyJob");
I thought of making this change because I was getting this warning:
WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
Do you get the same warning, Tariq?
I also tried adding job.setJarByClass(MyJob.class); instead but this did not work.
Cheers, Alex

Hadoop can't find example jar file

I am trying to run this in pseudo-distributed mode, following the directions in Hadoop in Action. It ran when I used local/standalone mode.
Now it can't seem to find the path to the jar file.
cd $HADOOP_HOME
jps
17559 JobTracker
17466 SecondaryNameNode
17791 TaskTracker
16993 NameNode
17942 Jps
bin/hadoop hadoop-examples-1.0.3.jar wordcount
Warning: $HADOOP_HOME is deprecated.
Exception in thread "main" java.lang.NoClassDefFoundError: hadoop-examples-1/0/3/jar
Caused by: java.lang.ClassNotFoundException: hadoop-examples-1.0.3.jar
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: hadoop-examples-1.0.3.jar. Program will exit.
My CLASSPATH is set to $HADOOP_HOME
Any ideas?
A few things don't look right:
You should also have a DataNode process running; check the logs to see what happened to it.
The correct command to use is: bin/hadoop jar hadoop-examples-1.0.3.jar wordcount
You should also have HADOOP_CONF_DIR set to point to the directory containing hdfs-site.xml and core-site.xml.
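A sketch of the environment piece of that last point, assuming the conf directory sits under /usr/local/hadoop (adjust to your install):

```shell
# Point HADOOP_CONF_DIR at the directory holding core-site.xml and
# hdfs-site.xml; /usr/local/hadoop/conf is an illustrative path.
export HADOOP_CONF_DIR=/usr/local/hadoop/conf
```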
