I followed this Hadoop tutorial here.
My end goal is to install Hadoop, Spark, and so on on my Mac, but honestly, I'm an amateur at this, at best.
So, I got up to step 2 under Execution of Pseudo-Distributed Operation, and then the following appears:
sudo start-dfs.sh
Password:
2014-06-10 18:42:01.200 java[6982:1303] Unable to load realm info from SCDynamicStore
14/06/10 18:42:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is 17:d5:20:eb:8d:f9:24:2f:c6:46:d7:e2:f5:6a:b8:c1.
Are you sure you want to continue connecting (yes/no)? yes
localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Password:
Password:
Password:
localhost: Permission denied (publickey,keyboard-interactive).
Basically, I typed in my password for user lanceguinto, and it is apparently incorrect. Of note is that I did not follow the Setup passphraseless ssh portion because I thought it was unnecessary; I can already ssh, but I'm not entirely sure what that code does.
During the setup, I also simply used my local user; I did not sudo su or anything.
So, how can I solve this problem? Thanks.
Check if this helps
You need to change the sshd_config file on the remote server (probably /etc/ssh/sshd_config).
Change
PasswordAuthentication no
to
PasswordAuthentication yes
And then restart the sshd daemon.
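How you restart sshd depends on the platform; a minimal sketch, assuming a systemd-based Linux or an older macOS where sshd is managed through launchd:
# Linux (systemd)
sudo systemctl restart sshd
# macOS: reload the launchd job that runs sshd
sudo launchctl unload /System/Library/LaunchDaemons/ssh.plist
sudo launchctl load /System/Library/LaunchDaemons/ssh.plist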
Related question: Permission denied (publickey,keyboard-interactive)
While installing Hadoop I got many errors, but this one just doesn't go away. No matter what I do, it keeps popping up again and again. As soon as I start Hadoop with the command ./start-all.sh, I get the error:
localhost: rajneeshsahai@localhost: Permission denied
(publickey,password,keyboard-interactive)
Error logs:
Starting namenodes on [localhost]
localhost: rajneeshsahai@localhost: Permission denied (publickey,password,keyboard-interactive).
Starting datanodes
localhost: rajneeshsahai@localhost: Permission denied (publickey,password,keyboard-interactive).
Starting secondary namenodes [MacBook-Air.local]
MacBook-Air.local: rajneeshsahai@macbook-air.local: Permission denied (publickey,password,keyboard-interactive).
2020-05-29 18:42:06,106 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting resourcemanager
resourcemanager is running as process 2937. Stop it first.
Starting nodemanagers
localhost: rajneeshsahai@localhost: Permission denied (publickey,password,keyboard-interactive).
I already tried the following things:
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
I think repeating this process has created multiple keys in my system.
sudo passwd
Configured /etc/ssh/sshd_config
(i) Changed PermitRootLogin prohibit-password to PermitRootLogin yes
(ii) Changed PasswordAuthentication no to PasswordAuthentication yes
I do have one doubt: do I have to remove the hash (#) at the start of those lines?
I am using macOS Catalina.
You can try the following:
$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ chmod 0600 ~/.ssh/authorized_keys
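After that, it is worth verifying that key-based login works and that macOS Remote Login is actually enabled; a quick check, assuming the default key location:
$ ssh localhost exit                  # should log in and exit without asking for a password
$ sudo systemsetup -getremotelogin    # on macOS, should print "Remote Login: On"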
On Windows, in a WSL2 Ubuntu container, you have to restart the ssh service to make it available to Hadoop. You could also try running Hadoop in a Docker container; see https://github.com/big-data-europe/docker-hadoop.
In the Ubuntu 20.04 container, I restart the ssh service each time before starting Hadoop:
sudo service ssh restart
For more details see the following tutorial https://dev.to/samujjwaal/hadoop-installation-on-windows-10-using-wsl-2ck1.
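A minimal sketch of that routine inside the WSL2 Ubuntu container, assuming Hadoop's sbin directory is already on your PATH:
sudo service ssh restart   # bring sshd up inside the container first
start-dfs.sh               # then start the HDFS daemons
start-yarn.sh              # and YARN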
When I try to ssh into localhost, I am prompted for a password. See below:
ssh connection to localhost:
[hadoop@mftrhel74 sbin]$ ssh localhost
hadoop@localhost's password:
Last login: Fri Aug 23 15:44:08 2019 from mah
The above means that a passwordless connection is not set up.
But when I try to start the Hadoop nodes as shown below, it doesn't prompt for a password, and the nodes do not start; I see the message below.
I think it should prompt me to enter the password for the user, just as it does when an SSH connection is established.
[hadoop#mftrhel74 ~]$ start-dfs.sh
Starting namenodes on [mftrhel74]
mftrhel74: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting datanodes
localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
Starting secondary namenodes [mftrhel74]
mftrhel74: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
I DO NOT WANT A PASSWORDLESS CONNECTION.
I suspect you are able to log in to one of the nodes with SSH, but you probably have not set up passwordless ssh between the nodes, so the steps you try to execute from that node will fail.
Here is some documentation that explains that you need to set up passwordless ssh, or otherwise install an Ambari client (assuming you work on HDP):
https://ambari.apache.org/1.2.2/installing-hadoop-using-ambari/content/ambari-chap1-5-2.html
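For reference, a minimal sketch of setting up passwordless ssh from the node that runs the scripts to the workers, with hypothetical hostnames node1 and node2 standing in for your own:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa   # skip if a key already exists
ssh-copy-id hadoop@node1                   # copy the public key to each worker node
ssh-copy-id hadoop@node2
ssh hadoop@node1 exit                      # verify: no password prompt expected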
I am new to Hadoop and I run it with the steps below:
ssh-keygen
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
ssh localhost
./start-all.sh
but I get the error below:
WARNING: Attempting to start all Apache Hadoop daemons as ... in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
localhost: Permission denied (publickey,password).
Starting datanodes
localhost: Permission denied (publickey,password).
Starting secondary namenodes [karbasi]
karbasi: Permission denied (publickey,password).
Starting resourcemanager
Starting nodemanagers
localhost: Permission denied (publickey,password).
Please help me in solving my problem.
Permission denied issue at Hadoop Installation
In the above link, you can find the reason for the issue.
The issue occurs if a mistake was made during ssh key generation, or if the Hadoop installation was not extracted and started by the same user.
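A quick way to check for that mismatch, sketched under the assumption that Hadoop is unpacked at a hypothetical /usr/local/hadoop:
whoami                                      # the user who runs start-all.sh
ls -ld /usr/local/hadoop                    # the owner should be that same user
sudo chown -R $(whoami) /usr/local/hadoop   # fix the ownership if it is not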
I'm following the official Hadoop tutorial to run Hadoop on my machine in pseudo-distributed mode.
I can use ssh to login in localhost without password:
admin@mycomputer:/usr/local/hadoop/hadoop-2.6.0$ ssh localhost
Welcome to Ubuntu 14.04.1 LTS (GNU/Linux 3.13.0-45-generic x86_64)
* Documentation: https://help.ubuntu.com/
4 packages can be updated.
0 updates are security updates.
Last login: Mon Feb 9 12:31:17 2015 from localhost
admin@mycomputer:~$
And I can also format the namenode without error, but I cannot start Hadoop with start-dfs.sh:
admin@mycomputer:/usr/local/hadoop/hadoop-2.6.0$ sudo sbin/start-dfs.sh
Starting namenodes on [localhost]
root@localhost's password:
localhost: Permission denied, please try again.
Why am I still asked to provide the root password when I can ssh into localhost without one?
I also tried:
sudo passwd
to reset the password, but I later encountered the same permission denied error; it seems to me that this password is not the password for root@localhost. How can I solve this problem?
I think you didn't change the permissions on the hadoop-2.6.0 folder. Give the admin user permission to this folder and try to start again.
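A sketch of what that could look like, assuming the path from the question and that admin is the user who should own and run Hadoop:
sudo chown -R admin /usr/local/hadoop/hadoop-2.6.0   # give admin ownership of the install
cd /usr/local/hadoop/hadoop-2.6.0
sbin/start-dfs.sh   # run as admin, without sudo, so ssh connects as admin rather than root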
Follow my blog link below: I provided detailed steps for installing on Ubuntu, building on another blog.
http://gubendran.blogspot.com/2015/01/install-hadoop-in-single-node-linux.html
I have a Hadoop installation on Amazon Elastic MapReduce, and whenever I try to restart the cluster I get the following error:
/stop-all.sh
no jobtracker to stop
The authenticity of host 'localhost (::1)' can't be established. RSA key fingerprint is
Are you sure you want to continue connecting (yes/no)? yes
localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
localhost: Permission denied (publickey).
no namenode to stop
localhost: Permission denied (publickey).
localhost: Permission denied (publickey).
Any idea on how to restart Hadoop?
The following hack worked for me:
I replaced the "ssh" command in sbin/slaves.sh and sbin/hadoop-daemon.sh with "ssh -i ~/.ssh/keyname".
I'm using Hadoop version 2.4 and this worked for me:
export HADOOP_SSH_OPTS="-i /home/hadoop/mykey.pem"
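To make that setting persistent, one option (a sketch, assuming a standard Hadoop 2.x layout and that the key path above is yours) is to add it to etc/hadoop/hadoop-env.sh:
# etc/hadoop/hadoop-env.sh: extra options passed to ssh by the start/stop scripts
export HADOOP_SSH_OPTS="-i /home/hadoop/mykey.pem"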
For the stop-all.sh script to work, you probably need to have the same user on all the machines as the user executing the stop-all.sh script.
Moreover, it seems you do not have passwordless ssh set up from the machine where you execute stop-all.sh to the rest of the machines, which would spare you from entering the password for each machine manually. Passwords might be different for the same user on different machines; please don't forget that.