When I start the Hadoop daemons I get the following error:
[hdp@localhost ~]$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hdp in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting datanodes
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting secondary namenodes [localhost.localdomain]
localhost.localdomain: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting resourcemanager
Starting nodemanagers
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Related
I am running into problems configuring Hadoop 3.2.1 while learning YARN. When I run sbin/start-all.sh I get two different failures depending on whether I run it as root or as the user host1. Can you tell me how to solve this, and whether it is related to SSH? Thank you very much.
In Root:
root@host1-virtual-machine:/home/host1/usr/hadoop-3.2.1# sbin/start-all.sh
Starting namenodes on [localhost]
ERROR: Attempting to operate on hdfs namenode as root
ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
Starting secondary namenodes [host1-virtual-machine]
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.
2020-02-12 14:40:27,093 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting resourcemanager
ERROR: Attempting to operate on yarn resourcemanager as root
ERROR: but there is no YARN_RESOURCEMANAGER_USER defined. Aborting operation.
Starting nodemanagers
ERROR: Attempting to operate on yarn nodemanager as root
ERROR: but there is no YARN_NODEMANAGER_USER defined. Aborting operation.
In host1 user:
host1@host1-virtual-machine:~/usr/hadoop-3.2.1$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as host1 in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
localhost: host1@localhost: Permission denied (publickey,password).
Starting datanodes
localhost: host1@localhost: Permission denied (publickey,password).
Starting secondary namenodes [host1-virtual-machine]
host1-virtual-machine: host1@host1-virtual-machine: Permission denied (publickey,password).
Starting resourcemanager
Starting nodemanagers
localhost: host1@localhost: Permission denied (publickey,password).
You need to set up a passwordless SSH connection between the nodes. This link might help:
http://mynotesonhadoop.blogspot.com/2017/07/configuring-passwordless-ssh-from.html?m=1
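For reference, here is a minimal sketch of what passwordless SSH to localhost usually looks like for the user that starts the daemons (the key name and paths are the usual OpenSSH defaults; adjust if yours differ):

# generate a key pair with an empty passphrase, accepting the default location
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# authorize the key for the same user on this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
# verify that this no longer asks for a password
ssh localhost

Once ssh localhost works without a prompt, start-all.sh should be able to reach the namenode, datanode and nodemanager processes without the Permission denied errors.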
I am installing Hadoop on Ubuntu 18.04 following this tutorial. I have tried to solve the problem by reading Stack Overflow answers, but I am still having the issue. Here is the output. What can I do?
hadoop@victor-GL552VW:~/hadoop/sbin$ ./start-yarn.sh
Starting resourcemanager
Starting nodemanagers
localhost: hadoop@localhost: Permission denied (publickey,password).
hduser@ubuntu:~$ start-dfs.sh
Starting namenodes on [localhost]
localhost: sign_and_send_pubkey: signing failed: agent refused operation
localhost: Permission denied (publickey,password).
Starting datanodes
localhost: sign_and_send_pubkey: signing failed: agent refused operation
localhost: Permission denied (publickey,password).
Starting secondary namenodes [ubuntu]
ubuntu: sign_and_send_pubkey: signing failed: agent refused operation
ubuntu: Permission denied (publickey,password).
hduser@ubuntu:~$
I think these commands will help you:
eval "$(ssh-agent -s)"
ssh-add
Restart the ssh-agent and then run ssh-add.
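If the key is protected by a passphrase, the agent has to be holding the key before the Hadoop scripts try to connect. A rough sequence, assuming the default key path ~/.ssh/id_rsa:

# start an agent in the current shell and load the key
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
# confirm the key is loaded and that localhost now accepts it
ssh-add -l
ssh localhost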
hduser@manoj-VirtualBox:/usr/local/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-datanode.pid: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/tmp’: Permission denied
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-manoj-VirtualBox.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-secondarynamenode.pid: No such file or directory
starting yarn daemons
mkdir: cannot create directory ‘/tmp’: Permission denied
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-manoj-VirtualBox.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-resourcemanager.pid: No such file or directory
localhost: mkdir: cannot create directory ‘/tmp’: Permission denied
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-nodemanager.pid: No such file or directory
I tried running
sudo chown -R hduser /usr/local/hadoop/
to give hduser ownership of the Hadoop directory, but the result is still the same.
I also tried running sbin/start-dfs.sh and sbin/start-yarn.sh, and these fail with the same permission problems.
After adding hduser to sudoers, different permission denied errors appear instead:
hduser@manoj-VirtualBox:/usr/local/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-namenode.pid: Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-datanode.pid: Permission denied
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-manoj-VirtualBox.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 165: /tmp/hadoop-hduser-secondarynamenode.pid: Permission denied
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-manoj-VirtualBox.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-resourcemanager.pid: Permission denied
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-manoj-VirtualBox.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 125: /tmp/yarn-hduser-nodemanager.pid: Permission denied
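Two things are worth checking here, sketched below: the pid files go to /tmp by default, so /tmp should still have its normal world-writable sticky-bit permissions (1777); alternatively, the pid directory can be pointed at a location hduser owns via HADOOP_PID_DIR and YARN_PID_DIR. The /usr/local/hadoop paths below simply follow the question and are assumptions:

# /tmp should normally look like drwxrwxrwt; restore it if it does not
ls -ld /tmp
sudo chmod 1777 /tmp

# one-time: create a pid directory that hduser owns
mkdir -p /usr/local/hadoop/pids

# then point the daemons at it, e.g. in etc/hadoop/hadoop-env.sh and etc/hadoop/yarn-env.sh
export HADOOP_PID_DIR=/usr/local/hadoop/pids
export YARN_PID_DIR=/usr/local/hadoop/pids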
I have set up Hadoop locally on my Mac. When I start DFS with the start-dfs.sh command as a separate hadoop user, I get the following error in the terminal:
0.0.0.0: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
Does anyone know how I can change the log directory for Hadoop? I installed Hadoop using Homebrew.
bash-3.2$ start-dfs.sh
14/03/31 09:04:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
localhost: chown: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: No such file or directory
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 151: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: head: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 166: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: /usr/local/Cellar/hadoop/2.3.0/libexec/sbin/hadoop-daemon.sh: line 167: /usr/local/Cellar/hadoop/2.3.0/libexec/logs/hadoop-hadoop-namenode-mymac.local.out: No such file or directory
localhost: mkdir: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: Permission denied
localhost: chown: /usr/local/Cellar/hadoop/2.3.0/libexec/logs: No such file or directory
The error indicates a permissions problem: the hadoop user needs the proper privileges on the Hadoop folder. Try running the following in Terminal:
sudo chown -R hadoop /usr/local/Cellar/hadoop/2.3.0/
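Alternatively, if you prefer not to change ownership of the Cellar tree, the log location itself can be overridden with HADOOP_LOG_DIR in hadoop-env.sh. A sketch, with an assumed writable path:

# e.g. in /usr/local/Cellar/hadoop/2.3.0/libexec/etc/hadoop/hadoop-env.sh
export HADOOP_LOG_DIR=/Users/hadoop/hadoop-logs   # any directory the hadoop user can write to

# create it once before starting the daemons
mkdir -p /Users/hadoop/hadoop-logs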