Unable to start daemons using start-dfs.sh - hadoop

We are using the CDH4.0.0 distribution from Cloudera. We are unable to start the daemons using the command below.
>start-dfs.sh
Starting namenodes on [localhost]
hduser@localhost's password:
localhost: mkdir: cannot create directory `/hduser': Permission denied
localhost: chown: cannot access `/hduser/hduser': No such file or directory
localhost: starting namenode, logging to /hduser/hduser/hadoop-hduser-namenode-canberra.out
localhost: /home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/sbin/hadoop-daemon.sh: line 150: /hduser/hduser/hadoop-hduser-namenode-canberra.out: No such file or directory
localhost: head: cannot open `/hduser/hduser/hadoop-hduser-namenode-canberra.out' for reading: No such file or directory

Looks like you're using tarballs?
Try overriding the default HADOOP_LOG_DIR location in your etc/hadoop/hadoop-env.sh config file, like so:
export HADOOP_LOG_DIR=/path/to/hadoop/extract/logs/
Then retry sbin/start-dfs.sh, and it should work.
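As a concrete sketch, run from the extract directory and assuming the tarball path visible in your error output (/home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0):
mkdir -p /home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/logs
# in etc/hadoop/hadoop-env.sh:
export HADOOP_LOG_DIR=/home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/logs
# then:
sbin/start-dfs.sh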
In packaged environments, the start-stop scripts are tuned to provide a unique location for each type of service, via the same HADOOP_LOG_DIR env-var, so they do not have the same issue you're seeing.
If you are using packages instead, don't use these scripts; just run:
service hadoop-hdfs-namenode start
service hadoop-hdfs-datanode start
service hadoop-hdfs-secondarynamenode start

Related

Hadoop installation Issue: Permission denied

I followed this tutorial to install Hadoop. Unfortunately, when I run the dfs namenode -format script, the following error is printed on the console at the end:
dfs namenode -format
WARNING: /home/hdoop/hadoop-3.2.1/logs does not exist. Creating.
mkdir: cannot create directory ‘/home/hdoop/hadoop-3.2.1/logs’: Permission denied
ERROR: Unable to create /home/hdoop/hadoop-3.2.1/logs. Aborting.
Also, when I run:
./start-dfs.sh
Starting namenodes on [localhost]
localhost: WARNING: /home/hdoop/hadoop-3.2.1/logs does not exist. Creating.
Starting datanodes
Starting secondary namenodes [blabla]
blabla: Warning: Permanently added 'blabla,192.168.100.10' (ECDSA) to the list of known hosts.
Change the permissions of /home/hdoop to the correct ones!
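For example, a minimal fix, assuming the Hadoop user is hdoop as in the question (adjust the user and path to your setup):
sudo chown -R hdoop:hdoop /home/hdoop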
I solved it with the link here. According to my configuration, I hadn't set JAVA_HOME within PATH:
$ which java
$ echo $JAVA_HOME
Also, I changed the value of HADOOP_OPTS in hadoop-env.sh as given below.
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"
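As a sketch, the missing entries might look like the following in .bashrc, assuming the JDK reported by which java lives at /usr/lib/jvm/java-8-openjdk-amd64 (substitute the path from your own machine):
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin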
Create the directory: create the "logs" directory yourself using root access. In this case the directory is logs, so create it at /home/hdoop/hadoop-3.2.1/ (e.g. "/home/{username}/{extracted hadoop directory}/").
Give access to the directory: make it accessible using sudo chmod 777 {directory location}.
In this case: sudo chmod 777 /home/hdoop/hadoop-3.2.1/logs
This solved it by giving access to that directory.
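Put together, a minimal sketch of those two steps, using the hadoop-3.2.1 path from the question:
sudo mkdir -p /home/hdoop/hadoop-3.2.1/logs
sudo chmod 777 /home/hdoop/hadoop-3.2.1/logs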

start-all.sh: command not found. How do I fix this?

I tried installing Hadoop using this tutorial, link (timestamped to where the problem occurs in the video).
However, after formatting the namenode (hdfs namenode -format), I don't get the "name" folder in /abc.
Also, start-all.sh and the other /sbin commands don't work.
P.S. I did try installing Hadoop as a single node, which didn't work, so I removed it and redid everything as a two-node setup, which meant I had to reformat the namenode. I don't know if that affected this somehow.
EDIT 1: I fixed start-all.sh not being found; there was a mistake in .bashrc that I corrected.
However, I still get these error messages when running start-all.sh, start-dfs.sh, etc.:
hadoop@linux-virtual-machine:~$ start-dfs.sh
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.10.0/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop-2.10.0/logs': No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.10.0/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop-2.10.0/logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:a37ThJJRRW+AlDso9xrOCBHzsFCY0/OgYet7WczVbb0.
Are you sure you want to continue connecting (yes/no)? no
0.0.0.0: Host key verification failed.
EDIT 2: Fixed the above error by changing the permissions on the hadoop folder (in my case both hadoop-2.10.0 and hadoop).
start-all.sh works perfectly, but the namenode doesn't show up.
It's not clear how you set up your PATH variable, or how the scripts are not "working". Did you chmod +x them to make them executable? Is there any log output from them at all?
The start-all script is available in the sbin directory of your Hadoop download, so just /path/to/sbin/start-all.sh is all you really need.
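If you do want it on your PATH, a minimal sketch for .bashrc, assuming the extract lives at /usr/local/hadoop (adjust to your actual location):
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin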
Yes, the namenode needs to be formatted on a fresh cluster. The official Apache guide is the most up-to-date source and works fine for most people.
Otherwise, I would suggest you learn about Apache Ambari, which can automate your installation. Or just use a Sandbox provided by Cloudera, or one of the many Docker containers that already exist for Hadoop if you don't care about fully "installing" it.

Hadoop standalone mode not starting on the local machine; permission issues

I am not able to figure out what the problem is. I have checked all the links available for this problem and tried them, but I still have the same problem.
Please help, as the available sandbox needs a higher-spec configuration, such as more RAM.
hstart
WARNING: Attempting to start all Apache Hadoop daemons as adityaverma in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
localhost: adityaverma@localhost: Permission denied (publickey,password,keyboard-interactive).
Starting datanodes
localhost: adityaverma@localhost: Permission denied (publickey,password,keyboard-interactive).
Starting secondary namenodes [Adityas-MacBook-Pro.local]
Adityas-MacBook-Pro.local: adityaverma@adityas-macbook-pro.local: Permission denied (publickey,password,keyboard-interactive).
2018-05-30 11:07:03,084 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting resourcemanager
Starting nodemanagers
localhost: adityaverma@localhost: Permission denied (publickey,password,keyboard-interactive).
This error typically means you have not set up passwordless SSH. For example, the same error should happen with ssh localhost; it should not prompt for a password.
Check the Hadoop documentation again on SSH key generation, and add the key to your authorized_keys file.
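A minimal sketch of that setup, following the usual steps from the Apache single-node guide (default key type and locations; adjust if yours differ):
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
ssh localhost
The last command should now log you in without prompting for a password.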
I might suggest setting up a virtual machine anyway (for example, using Vagrant) if the sandbox requires too many resources. The Hortonworks and Cloudera installation docs are fairly detailed for installing a cluster from scratch.
This way, Hadoop isn't cluttering your Mac's hard drive, and a Linux server will more closely match Hadoop installations running in production environments.

Error in Hadoop 2.2 while starting on Windows

I am trying to install Hadoop on Windows 7. I have installed Cygwin, and when I run ./start-dfs.sh I get the following error:
Error: Could not find or load main class org.apache.hadoop.hdfs.tools.GetConf
Starting namenodes on []
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-kalai-namenode-kalai-PC.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-kalai-datanode-kalai-PC.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode
Error: Could not find or load main class org.apache.hadoop.hdfs.tools.GetConf
Can anyone let me know what I'm doing wrong here?
The above issue was resolved when I used a Command Prompt with admin privileges for formatting the namenode and starting the services:
Remove the C:\tmp and C:\data directories manually
Open cmd.exe with 'Run as Administrator'
Format the namenode and start the services.
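A sketch of that last step from the elevated prompt, assuming Hadoop's bin and sbin directories are on your PATH:
hdfs namenode -format
start-dfs.cmd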

start-all.sh error while installing Hadoop on Ubuntu 12.04 LTS

I have been referring to this link for the Hadoop 1.1.1 installation.
All my files and permissions have been set according to this link, but I am getting this error. Please help.
hduser@ubuntu:/usr/local/hadoop$ bin/start-all.sh
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out' for reading: No such file or directory
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out' for reading: No such file or directory
As the error suggests, you're having a permission problem.
You need to give hduser proper permissions. Try:
sudo chown -R hduser /usr/local/hadoop/
Run this command to change the permissions of the hadoop directory:
sudo chmod 750 /app/hadoop
Below are two very helpful suggestions:
It is good to check whether HADOOP_HOME and JAVA_HOME are set in the .bashrc file. Sometimes, not setting these environment variables can also cause errors while starting the Hadoop cluster (see the sketch after this list).
It is also useful to debug the error by going through the log files generated in the /usr/local/hadoop/logs directory.
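As a sketch of those checks, using the paths already shown in this question (the exact log file name will differ on your machine):
grep -E 'HADOOP_HOME|JAVA_HOME' ~/.bashrc
echo $HADOOP_HOME $JAVA_HOME
tail -n 50 /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out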
