I tried installing Hadoop using this tutorial: link (the video is timestamped at the point where the problem occurs).
However, after formatting the namenode (hdfs namenode -format) I don't get the "name" folder in /abc.
Also, start-all.sh and the other /sbin commands don't work.
P.S. I did try installing Hadoop as a single node first, which didn't work, so I removed it and redid everything as a two-node setup, which meant reformatting the namenode. I don't know if that affected this somehow.
EDIT 1: I fixed start-all.sh not working; there was a mistake in .bashrc that I corrected.
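For reference, the Hadoop-related lines in .bashrc usually look something like the following (the install path below is an assumption based on the /usr/local/hadoop-2.10.0 location in the logs; adjust it to your own setup):
# Hadoop environment entries in ~/.bashrc
export HADOOP_HOME=/usr/local/hadoop-2.10.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin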
However, I get these error messages when running start-all.sh or start-dfs.sh, etc.:
hadoop@linux-virtual-machine:~$ start-dfs.sh
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.10.0/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop-2.10.0/logs': No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-namenode-linux-virtual-machine.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.10.0/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop-2.10.0/logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out' for reading: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
localhost: /usr/local/hadoop-2.10.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.10.0/logs/hadoop-hadoop-datanode-linux-virtual-machine.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:a37ThJJRRW+AlDso9xrOCBHzsFCY0/OgYet7WczVbb0.
Are you sure you want to continue connecting (yes/no)? no
0.0.0.0: Host key verification failed.
EDIT 2: Fixed the above error by changing the permissions on the Hadoop folders (in my case both hadoop-2.10.0 and hadoop).
start-all.sh now works perfectly, but the namenode doesn't show up.
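Roughly, something like this (the hadoop user/group and the paths are specific to my setup; the point is that the user running the daemons must be able to write inside the install directory):
# Give the daemon user ownership of the Hadoop install directories
sudo chown -R hadoop:hadoop /usr/local/hadoop-2.10.0
sudo chown -R hadoop:hadoop /usr/local/hadoop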
It's not clear how you set up your PATH variable, or how the scripts are not "working". Did you chmod +x them to make them executable? Is there any log output from them at all?
The start-all script is available in the sbin directory of wherever you downloaded Hadoop, so /path/to/sbin/start-all.sh is all you really need.
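For example, assuming Hadoop was extracted to /usr/local/hadoop-2.10.0 (the path is an assumption; adjust it to your install):
# Make the scripts executable if they aren't already, then start everything
chmod +x /usr/local/hadoop-2.10.0/sbin/*.sh
/usr/local/hadoop-2.10.0/sbin/start-all.sh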
Yes, the namenode needs to be formatted on a fresh cluster. The official Apache guide is the most up-to-date source and works fine for most people.
Otherwise, I would suggest you learn about Apache Ambari, which can automate your installation. Or just use a sandbox provided by Cloudera, or use one of the many Docker containers that already exist for Hadoop if you don't care about fully "installing" it.
Related
I followed this tutorial for the installation of Hadoop. Unfortunately, when I run the dfs namenode -format script, the following error is printed at the end of the console output:
dfs namenode -format
WARNING: /home/hdoop/hadoop-3.2.1/logs does not exist. Creating.
mkdir: cannot create directory ‘/home/hdoop/hadoop-3.2.1/logs’: Permission denied
ERROR: Unable to create /home/hdoop/hadoop-3.2.1/logs. Aborting.
Thank you.
Also, when I run:
./start-dfs.sh
Starting namenodes on [localhost]
localhost: WARNING: /home/hdoop/hadoop-3.2.1/logs does not exist. Creating.
Starting datanodes
Starting secondary namenodes [blabla]
blabla: Warning: Permanently added 'blabla,192.168.100.10' (ECDSA) to the list of known hosts.
Change the permissions on /home/hdoop to the correct ones!
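Something along these lines, assuming hdoop is the user that runs the daemons (as in the tutorial):
# Make hdoop the owner of its home directory, including the Hadoop tree inside it
sudo chown -R hdoop:hdoop /home/hdoop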
I solved it with the help of this link here.
According to my configuration, I hadn't set JAVA_HOME within PATH:
$ which java
$ echo $JAVA_HOME
Also, I changed the value of HADOOP_OPTS in hadoop-env.sh as given below.
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"
(Before-and-after screenshots of the configuration were attached to the original post.)
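For reference, the relevant lines in etc/hadoop/hadoop-env.sh end up looking roughly like this (the JDK path below is an assumption; check yours with readlink -f $(which java)):
# etc/hadoop/hadoop-env.sh -- JAVA_HOME path is an example, use your own JDK location
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"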
I solved it by creating the missing directory and giving access to it:
Create the directory: create the "logs" directory yourself, using root access. In this case the missing directory was logs, so create it at /home/hdoop/hadoop-3.2.1/ (i.e. /home/{username}/{extracted hadoop directory}/).
Give access to the directory: make it accessible with sudo chmod 777 {directory location}. In my case: sudo chmod 777 /home/hdoop/hadoop-3.2.2/logs (a screenshot showing that this worked was attached to the original post).
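Putting the two steps together (the path here is the one from the question; adjust it to your own extract directory):
# Create the missing logs directory and make it writable
sudo mkdir -p /home/hdoop/hadoop-3.2.1/logs
sudo chmod 777 /home/hdoop/hadoop-3.2.1/logs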
I used this link to create a 4-node cluster: https://blog.insightdatascience.com/spinning-up-a-free-hadoop-cluster-step-by-step-c406d56bae42, but once I reach the part where I start the Hadoop cluster, I get errors like these:
$HADOOP_HOME/sbin/start-dfs.sh
Starting namenodes on [namenode_dns]
namenode_dns: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
namenode_dns: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
namenode_dns: starting namenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out' for reading: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
namenode_dns: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No such file or directory
ip-172-31-1-82: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-1-82.out
ip-172-31-7-221: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-7-221.out
ip-172-31-14-230: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-14-230.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-168.out: No such file or directory
Here is what happens when I run jps:
20688 Jps
I'm not sure where I went wrong with the configuration. I am new to Hadoop and MapReduce, so please keep it simple.
It's a permission-related issue. It looks like the user you're using to start the Hadoop services (I think it's ubuntu) doesn't have write permission in the log directory (/usr/local/hadoop/logs); you probably copied the Hadoop files as sudo/root. Try changing the ownership of the Hadoop home directory recursively, or give write access to the /usr/local/hadoop/logs directory.
sudo chown -R ubuntu:ubuntu /usr/local/hadoop
or
sudo chmod 777 /usr/local/hadoop/logs
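Either way, you can verify the result afterwards; once the logs directory exists and is writable by the user starting the daemons, the mkdir/chown errors should disappear:
# Check ownership and permissions of the Hadoop tree and its logs directory
ls -ld /usr/local/hadoop /usr/local/hadoop/logs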
I am trying to run a single-node Hadoop cluster on my machine with the following configuration:
Linux livingstream 3.2.0-29-generic #46-Ubuntu SMP Fri Jul 27 17:03:23 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
I am able to format the namenode without any problems. However, when I try to start the namenode using:
hadoop-daemon.sh start namenode
I get the following errors:
ishan@livingstream:/usr/local/hadoop$ hadoop-daemon.sh start namenode
Warning: $HADOOP_HOME is deprecated.
mkdir: cannot create directory `/var/log/hadoop/ishan': Permission denied
chown: cannot access `/var/log/hadoop/ishan': No such file or directory
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting namenode, logging to /var/log/hadoop/ishan/hadoop-ishan-namenode-livingstream.out
/usr/sbin/hadoop-daemon.sh: line 138: /var/run/hadoop/hadoop-ishan-namenode.pid: No such file or directory
/usr/sbin/hadoop-daemon.sh: line 137: /var/log/hadoop/ishan/hadoop-ishan-namenode-livingstream.out: No such file or directory
head: cannot open `/var/log/hadoop/ishan/hadoop-ishan-namenode-livingstream.out' for reading: No such file or directory
/usr/sbin/hadoop-daemon.sh: line 147: /var/log/hadoop/ishan/hadoop-ishan-namenode-livingstream.out: No such file or directory
/usr/sbin/hadoop-daemon.sh: line 148: /var/log/hadoop/ishan/hadoop-ishan-namenode-livingstream.out: No such file or directory
I did not create a separate user "hduser" for the Hadoop installation; I am using my existing username. Maybe that is why I am facing this problem.
Can someone please help me with this?
Exactly what permissions do I need to alter to get this working?
UPDATE
After fiddling around and getting past the permission problems, I have moved on to a new batch of errors, posted here: hadoop Namenode won't start
I will forever keep you guys in mind if you can nudge me in the right direction so that I can start some real work on this.
I have been referring to this link for the hadoop-1.1.1 installation.
All my files and permissions have been set according to that link, but I am getting the error below. Please help.
hduser@ubuntu:/usr/local/hadoop$ bin/start-all.sh
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-ubuntu.out' for reading: No such file or directory
mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out: No such file or directory
head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-ubuntu.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/libexec/../logs': Permission denied
localhost: chown: cannot access `/usr/local/hadoop/libexec/../logs': No such file or directory
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 136: /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-ubuntu.out' for reading: No such file or directory
As the error suggests, you're having a permission problem.
You need to give hduser the proper permissions. Try:
sudo chown -R hduser /usr/local/hadoop/
Run this command to change the permissions of the hadoop directory:
sudo chmod 750 /app/hadoop
Below are two very helpful suggestions:
It is good to check whether HADOOP_HOME and JAVA_HOME are set in the .bashrc file. Sometimes not setting these environment variables can also cause errors while starting the Hadoop cluster.
It is also useful to debug the error by going through the log files generated in the /usr/local/hadoop/logs directory (a quick way to do both checks is sketched below).
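For example (the log file name below just follows the usual hadoop-<user>-<daemon>-<hostname>.log pattern; yours may differ):
# Confirm the environment variables are set
echo $HADOOP_HOME
echo $JAVA_HOME
# Inspect the most recent NameNode log for the underlying error
tail -n 50 /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.log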
We are using the cdh4.0.0 distribution from Cloudera. We are unable to start the daemons using the command below.
>start-dfs.sh
Starting namenodes on [localhost]
hduser#localhost's password:
localhost: mkdir: cannot create directory `/hduser': Permission denied
localhost: chown: cannot access `/hduser/hduser': No such file or directory
localhost: starting namenode, logging to /hduser/hduser/hadoop-hduser-namenode-canberra.out
localhost: /home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/sbin/hadoop-daemon.sh: line 150: /hduser/hduser/hadoop-hduser-namenode-canberra.out: No such file or directory
localhost: head: cannot open `/hduser/hduser/hadoop-hduser-namenode-canberra.out' for reading: No such file or directory
Looks like you're using tarballs?
Try overriding the default HADOOP_LOG_DIR location in your etc/hadoop/hadoop-env.sh config file, like so:
export HADOOP_LOG_DIR=/path/to/hadoop/extract/logs/
Then retry sbin/start-dfs.sh; it should work.
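For example, with the tarball extracted to /home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0 (the path from the question), something like this run from inside that directory should do it:
# Create a writable logs directory inside the extract and point Hadoop at it
mkdir -p logs
echo 'export HADOOP_LOG_DIR=/home/hduser/work/software/cloudera/hadoop-2.0.0-cdh4.0.0/logs' >> etc/hadoop/hadoop-env.sh
sbin/start-dfs.sh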
In packaged environments, the start/stop scripts are tuned to provide a unique location for each type of service via the same HADOOP_LOG_DIR env-var, so they don't have the issue you're seeing.
If you are using packages instead, don't use these scripts; just do:
service hadoop-hdfs-namenode start
service hadoop-hdfs-datanode start
service hadoop-hdfs-secondarynamenode start