unable to start namenode after formatting in centos7 - hadoop

I am unable to start the namenode in HDP 2.3.4 on CentOS 7 after running the format command. I am getting the error below:
Error: Cannot find configuration directory: start
Below is the bashrc file:
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
export JAVA_HOME=$PATH/jdk1.7.0_71
export HADOOP_INSTALL=$PATH/hadoop-2.3.4
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
Below is the command I am executing to start namenode:
/usr/hdp/current/hadoop-hdfs-namenode/../hadoop/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start namenode

The error
Error: Cannot find configuration directory:
is thrown because the variable $HADOOP_CONF_DIR used in the command is not set in the environment. With the variable empty, --config consumes the next argument, so hadoop-daemon.sh treats the word start as the configuration directory and fails to find it.
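You can confirm this from the same shell (a quick check, not part of the original answer):
echo "$HADOOP_CONF_DIR"    # prints an empty line if the variable is unset
As a one-off workaround, pass an explicit directory; /etc/hadoop/conf is the usual HDP location, so adjust if your configs live elsewhere:
/usr/hdp/current/hadoop-hdfs-namenode/../hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode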
The .bashrc also sets JAVA_HOME and HADOOP_INSTALL relative to $PATH, which is not a directory. After fixing the environment variable assignments, the .bashrc should look like this (assuming the installation is through tarballs):
export JAVA_HOME=/<absolute_path_where_jdk_is_extracted>/jdk1.7.0_71
export HADOOP_INSTALL=/<absolute_path_where_hdp_is_extracted>/hadoop-2.3.4
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
export HADOOP_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin:$JAVA_HOME/bin
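Then reload the file and verify in the current shell (illustrative checks):
source ~/.bashrc
echo $HADOOP_CONF_DIR    # should print the .../hadoop-2.3.4/etc/hadoop path
which hadoop             # should resolve under $HADOOP_INSTALL/bin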

Update your .bashrc with the parameters below:
export JAVA_HOME=<location of the JDK, e.g. /usr/java/jdk1.x.x>
export HADOOP_HOME=<location of the Hadoop installation (user defined)>
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export PATH
Note: the Hadoop installation location should be set in HADOOP_HOME; it will be reflected in hadoop-env.sh.
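For reference, hadoop-env.sh (under $HADOOP_HOME/etc/hadoop in a tarball install) is where the Hadoop daemons pick up JAVA_HOME; a minimal sketch, with the JDK path as a placeholder:
export JAVA_HOME=/usr/java/jdk1.x.x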

Related

Why do I get a "-bash: hadoop: command not found" warning when I open a new terminal window in Mac OS?

After I installed hadoop on my macOS, I found the warning "-bash: hadoop: command not found" is displayed every time I open a new terminal window. What's wrong, and how can I fix it? Thanks.
Last login: Fri Jan 8 20:13:00 on ttys010
-bash: hadoop: command not found
SJ-DN0393:github admin$
Here is the content of my /etc/profile file:
# System-wide .profile for sh(1)
if [ -x /usr/libexec/path_helper ]; then
eval `/usr/libexec/path_helper -s`
fi
if [ "${BASH-no}" != "no" ]; then
[ -r /etc/bashrc ] && . /etc/bashrc
fi
export GITLAB_HOME=/Users/admin/dev/gitlab
export LDFLAGS="-L/usr/local/opt/python@3.7/lib"
export BASH_SILENCE_DEPRECATION_WARNING=1
export ZEPPELIN_HOME=/Users/admin/dev/zeppelin-0.9.0-preview2-bin-all
export SPARK_HOME=/Users/admin/dev/spark-3.0.1-bin-hadoop2.7
export ZOOKEEPER_HOME=/Users/admin/dev/apache-zookeeper-3.6.2-bin
export CONFLUENT_HOME=/Users/admin/dev/confluent-6.0.1
export HADOOP_HOME=/Users/admin/dev/hadoop-2.8.5
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_CLASSPATH=`hadoop classpath`
export FLINK_HOME=/Users/admin/dev/flink-1.12.0
export FLINK_CONF_DIR=$FLINK_HOME/conf
export FLINK_OPT_DIR=$FLINK_HOME/opt
export FLINK_PLUGINS_DIR=$FLINK_HOME/plugins
export FLINK_BIN_DIR=$FLINK_HOME/bin
export FLINK_LIB_DIR=$FLINK_HOME/lib
export MYSQL_HOME=/usr/local/mysql-5.7.31-macos10.14-x86_64
export HIVE_HOME=/Users/admin/dev/apache-hive-2.3.7-bin
export HBASE_HOME=/Users/admin/dev/hbase-2.2.6
export KAFKA_HOME=/Users/admin/dev/kafka_2.12-2.4.1
export JAVA_HOME=/Users/admin/.sdkman/candidates/java/current
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$HADOOP_HOME/share/hadoop/tools/lib/hadoop-aliyun-3.3.0.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.8.5.jar:$HADOOP_HOME/share/hadoop/common/lib/commons-cli-1.2.jar:$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.5.jar
export PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/usr/local/opt/python@3.7/bin:/usr/local/opt/redis@4.0/bin:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH:$FLINK_HOME/bin:$SPARK_HOME/bin:$MYSQL_HOME/bin:$ZEPPELIN_HOME/bin:$ZOOKEEPER_HOME/bin:$HBASE_HOME/bin:$KAFKA_HOME/bin:$CONFLUENT_HOME/bin:$HIVE_HOME/bin
You have `hadoop classpath` being invoked when you open the shell.
Since hadoop is not in the PATH until the very end of sourcing this file, the command wouldn't be found.
The appropriate solution would be to use $HADOOP_HOME/bin/hadoop classpath instead.
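A minimal sketch of the corrected /etc/profile line, spelling out the full path so the command resolves before PATH is extended:
export HADOOP_CLASSPATH=`$HADOOP_HOME/bin/hadoop classpath`
Alternatively, move the HADOOP_CLASSPATH line below the PATH export so hadoop is already resolvable when the file is sourced.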

ORACLE_HOME: command not found

Hello everybody, I have configured .bash_profile as follows:
# .bash_profile
Get the aliases and functions if [ -f ~/.bashrc ]; then . ~/.bashrc fi
User specific environment and startup programs PATH=$PATH:$HOME/bin export PATH
Oracle Settings TMP=/tmp; export TMP TMPDIR=$TMP; export TMPDIR ORACLE_HOSTNAME=merkez-rac1.localdomain; export ORACLE_HOSTNAME
ORACLE_UNQNAME=RAC; export ORACLE_UNQNAME ORACLE_BASE=/u01/app/oracle;
export ORACLE_BASE GRID_HOME=/u01/app/11.2.0/grid; export GRID_HOME
DB_HOME=$ORACLE_BASE/product/11.2.0/db_1; export DB_HOME
ORACLE_HOME=$DB_HOME; export ORACLE_HOME ORACLE_SID=RAC1; export
ORACLE_SID ORACLE_TERM=xterm; export ORACLE_TERM
BASE_PATH=/usr/sbin:$PATH; export BASE_PATH
PATH=$ORACLE_HOME/bin:$BASE_PATH; export PATH
LD_LIBRARY_PATH=$ORACLE_HOME/lib:/lib:/usr/lib; export LD_LIBRARY_PATH
CLASSPATH=$ORACLE_HOME/JRE:$ORACLE_HOME/jlib:$ORACLE_HOME/rdbms/jlib;
export CLASSPATH
if [ $USER = "oracle" ] ; then if [ $SHELL = "/bin/ksh" ] ; then
ulimit -p 16384 ulimit -n 65536
else ulimit -u 16384 -n 65536 fi fi
alias gid_env='. /home/oracle/gid_env' alias db_env='. /home/oracle/db_env'
But when I ran this on the terminal:
[oracle@merkez-rac1 ~]$ echo $ORACLE_HOME
[oracle@merkez-rac1 ~]$
it gives me nothing and doesn't show me the directory /u01/app/oracle/product/11.2.0/db_1.
Can't tell what the question is, but if it's about ORACLE_HOME, try putting a newline on that line:
ORACLE_HOME=$DB_HOME; export ORACLE_HOME ORACLE_SID=RAC1; export
ORACLE_SID ORACLE_TERM=xterm; export ORACLE_TERM
should be
ORACLE_HOME=$DB_HOME; export ORACLE_HOME
ORACLE_SID=RAC1; export ORACLE_SID
ORACLE_TERM=xterm; export ORACLE_TERM
and so on
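Once the merged lines are split, re-source the file and check (a quick verification, assuming the paths above):
. ~/.bash_profile
echo $ORACLE_HOME    # should now print /u01/app/oracle/product/11.2.0/db_1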

Hadoop fs shell commands not working

I'm not able to run hadoop fs shell commands from the CLI, but I am able to browse HDFS through the web UI, and other hadoop commands work fine (for example hadoop version). Below is the error I'm getting. Please help.
$ hadoop fs -ls /
-ls: For input string: "13.1067728"
Usage: hadoop fs [generic options] -ls [-d] [-h] [-R] [<path> ...]
Use
hdfs dfs -ls ...
If the above doesn't work, try adding these two lines to your .bashrc:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
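Then reload and retry (illustrative):
source ~/.bashrc
hadoop fs -ls /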
Export the Hadoop home according to your system in .bashrc:
#Hadoop variables
export JAVA_HOME=/usr/jdk1.8.0_11
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_HOME=/home/kishore/BigData/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
and execute this command in the terminal:
source ~/.bashrc
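You can then sanity-check the setup (illustrative commands):
which hadoop     # should resolve to $HADOOP_HOME/bin/hadoop
hadoop version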

Can not create hadoop cluster

I am following http://alanxelsys.com/hadoop-v2-single-node-installation-on-centos-6-5 to install hadoop
on my cluster
I have installed hadoop inside the /usr/local/hadoop/sbin directory, and when I try executing the bash script start-all.sh, the system gives the error below:
start-all.sh: command not found
What I have tried:
1. Tried setting SSH again
2. Recheck the java path
The variables I have set are:
export JAVA_HOME=/usr/java/latest
export HADOOP_INSTALL=/usr/local/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
The start-all.sh script says "This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh", so you should use start-dfs.sh and start-yarn.sh instead.
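For example, using the install path from the question (full paths shown because the "command not found" error also suggests sbin never made it onto the PATH; note the PATH line above references $HADOOP_HOME, which is never set, instead of the HADOOP_INSTALL variable that was defined):
/usr/local/hadoop/sbin/start-dfs.sh
/usr/local/hadoop/sbin/start-yarn.sh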

Hadoop Home Path for Datanode

I'm configuring a 3 node hadoop cluster in EC2.
For Namenode and Jobtracker:
export HADOOP_HOME=/usr/local/hadoop # Masternode
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib"
For Datanode, I attached additional EBS storage which is mounted on /vol,
export HADOOP_HOME=/vol/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HDFS_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export YARN_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib"
When I tried to run start-dfs.sh on the Namenode, I got the following error:
data01.hdp-dev.XYZ.com: bash: line 0: cd: /usr/local/hadoop: No such file or directory
data01.hdp-dev.XYZ.com: bash: /usr/local/hadoop/sbin/hadoop-daemon.sh: No such file or directory
Notice that the Namenode is trying to invoke hadoop on the Datanode from the wrong directory...
Any advice would help.
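For context (a hedged sketch of the mechanism, not from the thread): start-dfs.sh ssh-es into each worker listed in the slaves file and runs hadoop-daemon.sh at the same absolute path the master uses, roughly:
ssh data01.hdp-dev.XYZ.com "cd /usr/local/hadoop && sbin/hadoop-daemon.sh start datanode"
so the install path must resolve identically on every node. One workaround (an assumption, not a confirmed fix) is to symlink the master's path to the EBS mount on the Datanode:
ln -s /vol/hadoop /usr/local/hadoop    # hypothetical; run on the Datanode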
