I have set up Hadoop on my local Mac. When I start DFS with the start-dfs.sh command as a separate hadoop user, I get the following error in the terminal:
0.0.0.0: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
Does anyone know how I can change the log directory for Hadoop? I installed Hadoop using Homebrew.
bash-3.2$ start-dfs.sh
14/03/31 09:04:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
localhost: chown: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: No such file or directory
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 151: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: head: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 166: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 167: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
localhost: chown: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: No such file or directory
The error indicates a permissions problem: the hadoop user needs the proper privileges on the Hadoop folder. Try running the following in Terminal:
sudo chown -R hadoop /usr/local/Cellar/hadoop/2.3.0/
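If you would rather move the logs than change the ownership of the Cellar tree, HADOOP_LOG_DIR lets you point the daemons at any directory the hadoop user can write to. A minimal sketch, assuming the Homebrew layout keeps the config under libexec/etc/hadoop and using /Users/hadoop/hadoop-logs as a stand-in path:
# In /usr/local/Cellar/hadoop/2.3.0/libexec/etc/hadoop/hadoop-env.sh:
export HADOOP_LOG_DIR=/Users/hadoop/hadoop-logs
# Create the directory and hand it to the hadoop user:
sudo mkdir -p /Users/hadoop/hadoop-logs
sudo chown -R hadoop /Users/hadoop/hadoop-logs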
Related
I am connected via SSH to my master node (parasilo-1) on Grid5000 and want to run an HDFS command. I can hop from node to node over SSH without any problems, for example from parasilo-1 to parasilo-10, and cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys doesn't change anything, unfortunately. Here is what happens:
user@parasilo-1:~$ ./hadoop/hadoop-3.3.4/sbin/start-dfs.sh
Starting namenodes on [parasilo-1.rennes.grid5000.fr]
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
Starting datanodes
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
parasilo-10.rennes.grid5000.fr: user@parasilo-10.rennes.grid5000.fr: Permission denied (publickey,password).
Starting secondary namenodes [parasilo-1.rennes.grid5000.fr]
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
2023-01-12 15:54:57,462 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Does anyone have an idea how to make this command run correctly?
You need to run ssh-copy-id against every datanode, not only edit the authorized_keys file on localhost. There should be no password prompt afterwards.
If it's not working, there's no harm in generating a new key and trying again.
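A sketch of the full key distribution, reusing the hostnames from the question (repeat the ssh-copy-id line for every host in your workers file):
# Generate a key on the master if one does not exist yet:
ssh-keygen -t rsa
# Copy it to the master itself and to every datanode:
ssh-copy-id user@parasilo-1.rennes.grid5000.fr
ssh-copy-id user@parasilo-10.rennes.grid5000.fr
# Each of these must now log in without prompting before start-dfs.sh will work:
ssh user@parasilo-10.rennes.grid5000.fr hostname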
Ubuntu 16.04.1 LTS
Hadoop 3.3.1
When I run start-dfs.sh, I get:
hadoop@ubuntu:~/hadoop/sbin$ start-dfs.sh
Starting namenodes on [ubuntu]
ubuntu: Warning: Permanently added 'ubuntu' (ECDSA) to the list of known hosts.
ubuntu: Permission denied (publickey,password).
Starting datanodes
localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
localhost: Permission denied (publickey,password).
Starting secondary namenodes [ubuntu]
ubuntu: Permission denied (publickey,password).
2021-06-25 18:05:42,961 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I also encountered this problem and resolved it with the following shell commands:
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
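To confirm the fix before rerunning start-dfs.sh, an ssh to localhost should now succeed without a password; sshd also rejects keys when ~/.ssh itself is too permissive, so tightening that is a cheap extra check:
chmod 700 ~/.ssh
ssh localhost   # must log in without a password prompt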
When I start the Hadoop daemons, I get the following error:
[hdp@localhost ~]$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hdp in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting datanodes
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting secondary namenodes [localhost.localdomain]
localhost.localdomain: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting resourcemanager
Starting nodemanagers
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
hduser@manoj-VirtualBox:/usr/local/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-datanode.pid: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/tmp’: Permission denied
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-manoj-VirtualBox.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-secondarynamenode.pid: No such file or directory
starting yarn daemons
mkdir: cannot create directory ‘/tmp’: Permission denied
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-manoj-VirtualBox.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-resourcemanager.pid: No such file or directory
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-nodemanager.pid: No such file or directory
I tried running
sudo chown -R hduser /usr/local/hadoop/
to give hduser ownership, but the result is the same. I also tried running sbin/start-dfs.sh and sbin/start-yarn.sh separately; they hit the same permission problems. After adding hduser to the sudoers file, other permission-denied errors appear:
hduser@manoj-VirtualBox:/usr/local/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-namenode.pid: Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-datanode.pid: Permission denied
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-manoj-VirtualBox.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-secondarynamenode.pid: Permission denied
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-manoj-VirtualBox.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-resourcemanager.pid: Permission denied
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-nodemanager.pid: Permission denied
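The failing path in these logs is /tmp, not the Hadoop tree, which is why the chown on /usr/local/hadoop changed nothing: the daemon scripts only write their .pid files there. Two possible ways out, sketched below. /tmp is normally world-writable with the sticky bit (mode 1777), and HADOOP_PID_DIR / YARN_PID_DIR are the standard overrides; the /home/hduser/hadoop-pids path is just an example:
# Option 1: restore the standard permissions on /tmp
sudo chmod 1777 /tmp
# Option 2: point the daemons at a PID directory hduser owns
mkdir -p /home/hduser/hadoop-pids
# in etc/hadoop/hadoop-env.sh:
export HADOOP_PID_DIR=/home/hduser/hadoop-pids
# in etc/hadoop/yarn-env.sh:
export YARN_PID_DIR=/home/hduser/hadoop-pids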
I am trying to run Hadoop in pseudo-distributed mode, following this tutorial: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
I can ssh to localhost and format the filesystem. However, I can't start the NameNode and DataNode daemons with this command:
sbin/start-dfs.sh
When I execute it with sudo I get:
ubuntu@ip-172-31-42-67:/usr/local/hadoop-2.6.0$ sudo sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: Permission denied (publickey).
localhost: Permission denied (publickey).
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Permission denied (publickey).
and when executed without sudo:
ubuntu@ip-172-31-42-67:/usr/local/hadoop-2.6.0$ sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out’ for reading: No such file or directory
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out’ for reading: No such file or directory
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
0.0.0.0: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out
0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory
0.0.0.0: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out’ for reading: No such file or directory
0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory
0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory
I also notice that listing the contents of HDFS directories fails:
ubuntu@ip-172-31-42-67:~/dir$ hdfs dfs -ls output/
ls: Call From ip-172-31-42-67.us-west-2.compute.internal/172.31.42.67 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Can anyone tell me what the problem could be?
I had the same problem, and the only solution I found was this post:
https://anuragsoni.wordpress.com/2015/07/05/hadoop-start-dfs-sh-localhost-permission-denied-how-to-fix/
which suggests generating a new ssh-rsa key.
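The steps there boil down to the same recipe shown earlier in this thread, roughly (a sketch, not the post verbatim):
ssh-keygen -t rsa -P ""
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
ssh localhost   # should no longer ask for a password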
The errors above suggest a permissions problem.
You have to make sure that the user running Hadoop (here, ubuntu) has the proper privileges on /usr/local/hadoop-2.6.0. For this purpose you can try:
sudo chown -R ubuntu /usr/local/hadoop-2.6.0/
or
sudo chmod 777 /usr/local/hadoop-2.6.0/
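Either way, you can check that the logs directory can now be created before retrying (the path comes from the error output above; the scratch file name is arbitrary):
ls -ld /usr/local/hadoop-2.6.0
touch /usr/local/hadoop-2.6.0/writetest && rm /usr/local/hadoop-2.6.0/writetest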
Also make sure that the configuration is done correctly; you need to edit four .xml files.
Edit the file hadoop-2.6.0/etc/hadoop/core-site.xml and, between the <configuration> and </configuration> tags, put:
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
Edit the file hadoop-2.6.0/etc/hadoop/hdfs-site.xml and, between the <configuration> tags, put in:
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
Edit the file hadoop-2.6.0/etc/hadoop/mapred-site.xml and, between the <configuration> tags, paste the following and save:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
Edit the file hadoop-2.6.0/etc/hadoop/yarn-site.xml and, between the <configuration> tags, paste the following and save:
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
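With those four files in place, the remaining steps from the linked tutorial apply: format HDFS once, start the daemons, and check which Java processes are running:
bin/hdfs namenode -format   # run once, from /usr/local/hadoop-2.6.0
sbin/start-dfs.sh
jps   # NameNode, DataNode and SecondaryNameNode should be listed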