Hadoop namenode format not working

I've been trying to install Hadoop 2.7.0 on Ubuntu, but when I enter the hadoop namenode -format command I get the following message:
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
I've triple-checked all the configuration files but I can't seem to find where the problem is.
I followed this tutorial: http://www.bogotobogo.com/Hadoop/BigData_hadoop_Install_on_ubuntu_single_node_cluster.php
Can anyone please tell me why this is not working?

You have to add hadoop-hdfs-2.7.0.jar to your Hadoop classpath. Just add these lines to $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
export HADOOP_HOME=/path/to/hadoop
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.7.0.jar
Now stop all Hadoop processes and try to format the namenode again. Post the error if you get one.
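If the class still can't be found after that change, one quick sanity check (assuming a standard tarball layout and that bin/hadoop is on your PATH) is to confirm the HDFS JAR actually exists and shows up on the classpath Hadoop computes:
# list the computed classpath one entry per line and look for the HDFS JARs
hadoop classpath | tr ':' '\n' | grep hdfs
# verify the JAR is where the export above expects it
ls -l $HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.7.0.jar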

Related

How to find the default location of the Hadoop home path

I have installed Cloudera Hadoop 2.0.0 (CDH4) without specifying any home path, and it was working fine. But when I ran the jps command, it showed only the jps process.
So I tried to start Hadoop, but I am unable to find the location where it is actually stored. Can anyone please help me find the default location?
Is there a command to find the exact location of Hadoop on my system?
Please help me with this issue.
Thanks,
Anbu k
Yes, you could try:
find / -name hadoop
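A variant of the same command that is often easier to read: run it as root or discard the permission-denied noise, and restrict the search to directories so only install locations are listed:
# search the whole filesystem, silencing unreadable-path errors
find / -name hadoop -type d 2>/dev/null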

Unable to set up a pseudo-distributed Hadoop cluster

I am using CentOS 7. I downloaded and untarred Hadoop 2.4.0 and followed the instructions in the link Hadoop 2.4.0 setup.
I ran the following command:
./hdfs namenode -format
I got this error:
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
I see a number of posts with the same error but no accepted answers, and I have tried them all without any luck.
This error can occur if the necessary JAR files are not readable by the user running the ./hdfs command, or are misplaced so that they can't be found by hadoop/libexec/hadoop-config.sh.
Check the permissions on the JAR files under hadoop-install/share/hadoop/*:
ls -l share/hadoop/*/*.jar
If necessary, chmod them as the owner of the respective files to ensure they're readable. Something like chmod 644 should be sufficient to at least check whether that fixes the initial problem. For a more permanent fix, you'll likely want to run the Hadoop commands as the same user that owns all the files.
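For example, a one-shot version of that check-and-fix, run from the Hadoop install root as the owner of the files (644 is just the diagnostic setting suggested above, not a hardened configuration):
# make every bundled JAR readable by all users
find share/hadoop -name '*.jar' -exec chmod 644 {} +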
I followed the link Setup hadoop 2.4.0 and was able to get past the error message.
It seems the documentation on the Hadoop site is incomplete.

HBase installation on a single node

I have installed single-node Hadoop on Ubuntu 12.04. Now I am trying to install HBase on top of it (version 0.94.18). But I get the following errors (even though I have extracted it into /usr/local/hbase):
Error: Could not find or load main class org.apache.hadoop.hbase.util.HBaseConfTool
Error: Could not find or load main class org.apache.hadoop.hbase.zookeeper.ZKServerTool
starting master, logging to /usr/lib/hbase/hbase-0.94.8/logs/hbase-hduser-master-ubuntu.out
nice: /usr/lib/hbase/hbase-0.94.8/bin/hbase: No such file or directory
cat: /usr/lib/hbase/hbase-0.94.8/conf/regionservers: No such file or directory
To resolve this error:
Download the binary version of HBase.
Edit the conf files hbase-env.sh and hbase-site.xml.
Set up the HBase home directory.
Start HBase with start-hbase.sh (see the sketch below).
Explanation of the above error: "Could not find or load main class" means your downloaded version does not have the required JAR.
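A minimal sketch of those steps, assuming HBase was extracted to /usr/local/hbase as in the question (the JAVA_HOME path is an assumption; substitute whatever JDK your system actually has):
# point HBASE_HOME at the extracted binary distribution
export HBASE_HOME=/usr/local/hbase
# set JAVA_HOME in hbase-env.sh (path below is illustrative)
echo 'export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64' >> $HBASE_HOME/conf/hbase-env.sh
# start HBase
$HBASE_HOME/bin/start-hbase.sh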
Hi, can you tell us when this error occurs?
I think your environment is set up wrong.
You should enter the command below:
export HBASE_HOME="/usr/lib/hbase/hbase-0.94.18"
Then try HBase again; it should work.
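To make that setting survive new shells, a common approach (not part of the original answer) is to append it to your ~/.bashrc along with the bin directory on your PATH:
echo 'export HBASE_HOME="/usr/lib/hbase/hbase-0.94.18"' >> ~/.bashrc
echo 'export PATH="$PATH:$HBASE_HOME/bin"' >> ~/.bashrc
source ~/.bashrc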
If you want a shell script, you can download one from this link: https://github.com/tonyreddy/Apache-Hadoop1.2.1-SingleNode-installation-shellscript
It covers Hadoop, Hive, HBase, and Pig.
Thanks,
Tony
It is not recommended to run HBase from the source distribution directly. Instead, download the binary distribution as mentioned on the official site, follow the same instructions, and you will get it up and running.
You could try installing version 0.94.27.
Download it from: HBase 0.94.27 download
This one worked for me.
Follow the instructions specified in:
HBase installation guide
sed "s/<\/configuration>/<property>\n<name>hbase.rootdir<\/name>\n<value>hdfs:\/\/'$c':54310\/hbase<\/value>\n<\/property>\n<property>\n<name>hbase.cluster.distributed<\/name>\n<value>true<\/value>\n<\/property>\n<property>\n<name>hbase.zookeeper.property.clientPort<\/name>\n<value>2181<\/value>\n<\/property>\n<property>\n<name>hbase.zookeeper.quorum<\/name>\n<value>'$c'<\/value>\n<\/property>\n<\/configuration>/g" -i.bak hbase/conf/hbase-site.xml
sed 's/localhost/'$c'/g' hbase/conf/regionservers -i
sed 's/#\ export\ HBASE_MANAGES_ZK=true/export\ HBASE_MANAGES_ZK=true/g' hbase/conf/hbase-env.sh -i
Yes, just run these three commands; you need to replace $c with your hostname.
Then try again; it should work.
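For example, before running the three sed commands you could set $c from the machine itself (this assumes the hostname resolves to the address your cluster actually uses):
# capture the machine's hostname into $c
c=$(hostname)
# verify it before rewriting the config files
echo $c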

Issue with pseudo-distributed mode configuration of Hadoop

I am trying to set up a pseudo-distributed configuration of Hadoop version 2.0.4. The start-dfs.sh script works fine. However, start-mapred.sh fails to start the JobTracker and TaskTracker. Below is the error I am getting. From the error, it looks like it is not able to pick up the JAR file. Please let me know if you have any idea about this issue. Thanks.
FATAL org.apache.hadoop.mapred.JobTracker: java.lang.NoSuchMethodError: org/apache/hadoop/mapred/JobACLsManager.<init>(Lorg/apache/hadoop/mapred/JobConf;)V
at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:2182)
at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1895)
at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1889)
at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:311)
at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:302)
at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:297)
at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4820)
It seems I was using incorrect JARs, so first I replaced those. Then I created a new directory with the Hadoop conf files and formatted the namenode. Finally it worked. :)
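If you suspect the same JAR mix-up, one way to look for stray MRv1 JARs sitting alongside the 2.x MapReduce ones is a find over the install tree (a diagnostic sketch, not part of the original answer; hadoop-core-*.jar is the old MRv1 artifact name):
find $HADOOP_HOME -name 'hadoop-core-*.jar' -o -name 'hadoop-mapreduce-*.jar'
Pointing Hadoop at a fresh conf directory, as the answer describes, can be done with the standard HADOOP_CONF_DIR variable:
export HADOOP_CONF_DIR=/path/to/new/conf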

Can't find or load main class error in Hadoop

Here is the tutorial that I am following. I got an error when trying to execute the command bin/hadoop namenode -format.
The error is:
Error: Could not find or load main class org.apache.hadoop.util.PlatformName
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Error: Could not find or load main class org.apache.hadoop.util.PlatformName
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
Any help?
Here are my two cents: I had the same issue, so I reinstalled Hadoop in a Unix environment and it worked fine. Cygwin has some weird issues with Hadoop, and Windows is not a good platform for Hadoop.
hdfs namenode -format
Error: Could not find or load main class xyz
You may be getting this error because your username contains whitespace,
like "amit kumar" or "pavan singh".
To solve this problem:
Go to your C:\hadoop-2.7.7\etc\hadoop
and open the file hadoop-env.cmd in any text editor; at the end, replace
%USERNAME%
with any name that has no space.
REPLACE THIS
set HADOOP_IDENT_STRING=%USERNAME%
WITH THIS
set HADOOP_IDENT_STRING=AMIT
Or simply follow this video:
http://www.youtube.com/watch?v=Y6wck9dKVKQ
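To confirm this is your situation before editing anything, you can check the value from a Windows Command Prompt (matching the cmd syntax of the lines above):
echo %USERNAME%
If the output contains a space, the HADOOP_IDENT_STRING replacement above should apply.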
