"JAVA_HOME is not set and could not be found" error when installing Hadoop - hadoop

I'm new to Hadoop. During installation I gave hadoop-env.sh a JAVA_HOME path, but when I execute hdfs namenode -format it says that JAVA_HOME is not set. When I check again, the value is still saved in hadoop-env.sh. I can't bring up HDFS because of this. A detailed explanation is much appreciated.
Thank you. I've attached screenshots of hadoop-env.sh and the error message for reference.

Can you restart the HDFS service after adding JAVA_HOME to hadoop-env.sh?
Also try running echo $JAVA_HOME before running the hadoop namenode -format command.
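For example, a quick check in the shell might look like this (the OpenJDK path below is an assumption; substitute wherever Java actually lives on your machine):

echo $JAVA_HOME    # should print your JDK path; empty output means it is not set in this shell
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64    # assumed path; adjust to your install
hdfs namenode -format    # re-run the format once the variable is visible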

Make sure you have set the environment variables correctly.
https://hadoopwala.wordpress.com/2016/07/03/java/
Reference: Hadoop Pseudo-Distributed Mode
Hope this helps.

Related

Error formatting the namenode in a Hadoop single-node cluster

I am trying to install and configure Hadoop on Ubuntu 16.04, following the guidelines at https://data-flair.training/blogs/installation-of-hadoop-3-x-on-ubuntu/
All the steps ran successfully, but when I try to run the command hdfs namenode -format, I get an error message.
There is some problem with your .bashrc file. Check the variables inside .bashrc; I faced the same problem when I started with Hadoop. Set the correct path for each and every variable, and afterwards run source ~/.bashrc to apply the changes, as in the sketch below.
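A minimal sketch of the relevant .bashrc entries (both paths are assumptions; substitute your actual Java and Hadoop install locations):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64    # assumed JDK location
export HADOOP_HOME=/usr/local/hadoop                  # assumed Hadoop install directory
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin  # make the hadoop/hdfs scripts callable

Then reload the file so the current shell picks up the changes:

source ~/.bashrc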

Missing Hadoop installation in Windows

I have installed Hadoop 2.x and it's running fine on Windows 8.
G:\hadoop\hive2.1\bin>jps
10916 NameNode
1588 DataNode
3332 Jps
4200 ResourceManager
2444 NodeManager
I have also installed Hive on Windows, but when I start Hive it throws an error saying:
G:\hadoop\hive2.1\bin>hive
"Missing hadoop installation: G:\hadoop\winutils must be set"
HADOOP_HOME is already set to G:\hadoop\winutils in the environment variables.
Please help here.
You have set HADOOP_HOME incorrectly; it should point to the Hadoop installation directory, not the winutils folder.
Try the following.
In the user variables, configure HADOOP_HOME with the following value:
HADOOP_HOME-->D:\Hadoop-2.8.1
In the system variables, append the following to the existing Path value:
path-->;D:\Hadoop-2.8.1\bin;
If you are not comfortable with the above configuration, try the following instead.
Open a command prompt and set HADOOP_HOME and PATH directly:
C:>set HADOOP_HOME=D:\Hadoop-2.8.1
C:>set PATH=%PATH%;%HADOOP_HOME%\bin
Now start the Hadoop services from the same command prompt and then go to the Hive shell.
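As a sanity check (assuming the D:\Hadoop-2.8.1 layout above), you can confirm from the same prompt that winutils.exe really sits under %HADOOP_HOME%\bin, which is where Hive expects to find it:

C:>echo %HADOOP_HOME%
C:>dir %HADOOP_HOME%\bin\winutils.exe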
Hope this helps.

How to find the default location of the Hadoop home path

I have installed Cloudera Hadoop 2.0.0 (CDH4), and while doing so I did not mention any home path. It was working fine, but when I ran the jps command it showed only the jps process.
So I tried to start Hadoop, but I am unable to find the location where Hadoop is actually stored. Can anyone please help me find the default location?
Is there any command to find the exact location of Hadoop on my system?
Please help me with this issue.
Thanks,
Anbu k
Yes, you could try:
find / -name hadoop
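If the hadoop script is already on your PATH, a couple of quicker checks might help before resorting to a full filesystem scan (these are standard Linux commands, not CDH-specific):

which hadoop                     # shows which hadoop script would run
readlink -f $(which hadoop)      # resolves symlinks to the real install path

On CDH package installs the software typically lands under /usr/lib/hadoop, but treat that as an assumption and verify it with the commands above.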

Could not format the Namenode in hadoop 2.6?

I have installed Hadoop 2.6 on Ubuntu 14.04, following this blog.
While trying to format the namenode, I am hitting the error below:
hduser@data1:~$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
/usr/local/hadoop/bin/hdfs: line 276: /home/hduser/usr/lib/jvm/java-7-openjdk-amd64/bin/java: No such file or directory
/home/hduser/usr/lib/jvm/java-7-openjdk-amd64/bin/java: No such file or directory
This error occurs because the JAVA_HOME you have provided does not actually contain a java binary.
Just add this line in hadoop-env.sh and /home/hduser/.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
I think you have already set $JAVA_HOME but did it incorrectly (just a guess):
/home/hduser/usr/lib/jvm/java-7-openjdk-amd64/bin/java
It should be:
/usr/lib/jvm/java-7-openjdk-amd64/bin/java
You probably added ~ before the path when you exported JAVA_HOME, which prepended the home directory /home/hduser.
To check this, run java -version and see whether Java is working, and run echo $JAVA_HOME to verify the path manually.
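Putting those checks together, a quick verification session might look like this (the path is the one suggested above; adjust it for your machine):

echo $JAVA_HOME           # should print /usr/lib/jvm/java-7-openjdk-amd64, with no /home/hduser prefix
ls $JAVA_HOME/bin/java    # confirms the java binary actually exists at that path
java -version             # confirms Java runs at all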
I figured it out. The entry we made was for amd64, but these are really i386 machines. Verify the path and that should fix the issue.

"hadoop namenode -format" formats wrong directory

I'm trying to install Hadoop 1.1.2.21 on CentOS 6.3.
I've configured dfs.name.dir in the /etc/hadoop/conf/hdfs-site.xml file:
<property>
  <name>dfs.name.dir</name>
  <value>/mnt/ext/hadoop/hdfs/namenode</value>
</property>
But when I run the "hadoop namenode -format" command, it formats /tmp/hadoop-hadoop/dfs/name instead.
What am I missing?
I ran into this problem and solved it, so I'm updating this answer.
Make sure your HADOOP_CONF_DIR environment variable points to the directory where Hadoop can find all the XML files used for configuration. That solved it for me; see the sketch below.
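A minimal sketch of that fix (assuming the config directory from the question, /etc/hadoop/conf):

export HADOOP_CONF_DIR=/etc/hadoop/conf    # point Hadoop at the directory holding hdfs-site.xml
ls $HADOOP_CONF_DIR/hdfs-site.xml          # sanity-check that the file is really there
hadoop namenode -format                    # re-run the format with the config now visible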
It might be taking the path /tmp/hadoop-hadoop/dfs/name from hdfs-default.xml. Not sure why the value from hdfs-site.xml is not taken. Is dfs.name.dir marked as final in hdfs-default.xml?
Check whether some Hadoop process is already running in the background. This happens if you aborted a previous process that was never killed and has become a zombie process.
If that is the case, kill the process and then try to format the filesystem again.
You can also check the permissions of the directory.
Try giving a different location for the directory and see whether the change is reflected.
Please don't set HADOOP_CONF_DIR. Check your .bashrc file and remove it if it is set there.
