When connecting to an EC2 instance with telnet, what would the username and password be?
I'm using Ubuntu and I didn't set any credentials.
Is there any way to disable authentication via telnet?
We installed OpenSSH from GitHub on a Windows Server 2016 machine. Everything works great except passwordless login. Here's the scenario:
Domain-joined CentOS server trying to SSH into Domain-joined Windows Server 2016 with OpenSSH using domain credentials.
On Centos server:
log in with domain user "user"
ssh-keygen -t rsa
key saved in /home/user/.ssh/id_rsa
On Windows Server
Copy id_rsa.pub entry from Centos server into "authorized_keys" file for user on Windows box (C:\Users\User\.ssh\authorized_keys). Save the file.
On Centos Server:
while logged in as domain user "user"
ssh user@WindowsServer
accept into known_hosts
get prompted for password
In theory, we shouldn't get prompted for a password, but we do. Any ideas?
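One common cause worth checking (an assumption, since the question doesn't say whether the Windows account is a local administrator): the Win32-OpenSSH default sshd_config redirects key lookups for administrative users to a shared file, so a per-user authorized_keys is ignored for them. The relevant stock lines look like this:

```
Match Group administrators
       AuthorizedKeysFile __PROGRAMDATA__/ssh/administrators_authorized_keys
```

If the domain user is in the local Administrators group, either append the public key to C:\ProgramData\ssh\administrators_authorized_keys (with its ACL restricted to SYSTEM and Administrators) or comment out that Match block, then restart the sshd service.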
How do I transfer files from an on-prem production server to an AWS EC2 instance in a PLE server using passwordless SCP, with the same service account ID on both the origination and destination servers?
The easiest option is to set up some authentication on your AWS machine. I assume you already have key-based SSH access to the server. For key-based access you can simply execute the following command:
scp -i <path_to_private_key> <source_file_path> username@PublicIP:/tmp/
path_to_private_key is the private key used for SSH to your AWS machine
source_file_path is the file to be copied
username is the SSH username used to SSH to your AWS machine
PublicIP is the IP of your AWS machine
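One detail that silently breaks both ssh -i and scp -i is the private key's file mode: OpenSSH refuses a key that is readable by group or others and falls back to prompting for a password. A minimal sketch (my-aws-key.pem is a hypothetical file name standing in for your real .pem):

```shell
# Create a placeholder key file for illustration; use your real .pem in practice.
touch my-aws-key.pem

# OpenSSH requires the private key to be readable by the owner only.
chmod 600 my-aws-key.pem

# Verify the permissions are 600.
stat -c '%a' my-aws-key.pem
# prints: 600
```

If the mode is anything looser, ssh prints a "Permissions ... are too open" warning and ignores the key entirely.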
We are midway through implementing Ansible CI for app deployment. For connecting to the remote host from the control host, we used passwordless SSH authentication (by adding the SSH key to authorized_keys).
But with recent changes, the Unix team no longer allows this on higher environments, per corporate Unix policy. So we have to use the password approach.
The user that Ansible runs as and connects to the remote machine with is a sudo user and does not have a password of its own.
So in this case, how do we connect from Control Host to Remote host, without the SSH key?
While running the ansible playbook, you get an option to specify the user to SSH as, via --user. The same can be achieved by providing the configuration in the inventory file:
ansible_user=<user_name>
For the password you can use Ansible Vault.
I am editing the answer to add that you can use a user other than the one Ansible was installed with: you can create a new user with password or passwordless authentication set up.
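To make this concrete, here is a sketch of what the inventory wiring could look like (the host name, group name, and deploy user are made up; ansible_user and ansible_password are standard Ansible connection variables):

```
[app_servers]
remote01.example.com ansible_user=deploy

# Option 1: prompt interactively instead of using a key:
#   ansible-playbook -i inventory site.yml --ask-pass --ask-become-pass
#
# Option 2: store the password encrypted with Ansible Vault,
# e.g. in group_vars/app_servers/vault.yml:
#   ansible_password: "{{ vaulted_ssh_password }}"
```

Note that for non-interactive SSH password authentication, Ansible requires the sshpass utility on the control host.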
Hope so this helps.
Cheers,
Yash
I have a Jenkins box; I have SSHed into it, and from there I want to access one of the EC2 instances in AWS. I tried ssh -i "mykeyname.pem" ec2-user@DNSname but it throws the error "Permission denied (publickey,gssapi-keyex,gssapi-with-mic)".
I have the PEM file of the EC2 instance I want to connect to. Is there any way I can SSH into the instance?
There are two possible reasons.
The default user name is not "ec2-user"
Check which AMI the target instance was launched from.
If its default user isn't "ec2-user", change the user name in your ssh command.
Your key pair is incorrect
An EC2 instance can only be accessed with the key pair it was created with.
Check the name of the key pair you are using.
FYI
Connecting to Your Linux Instance Using SSH
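For reference, the default login user depends on the AMI. These are the commonly documented defaults; verify against your AMI's description before relying on them:

```
# Default SSH user names by AMI (verify for your image):
#   Amazon Linux -> ec2-user
#   Ubuntu       -> ubuntu
#   CentOS       -> centos
#   Debian       -> admin
#   RHEL         -> ec2-user (some older RHEL AMIs use root)
#
# Example, assuming an Ubuntu AMI and the key from the question:
#   ssh -i mykeyname.pem ubuntu@DNSname
```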
I have a problem connecting to my Amazon EC2 server over SSH through a proxy.
I have a username and password for an HTTP proxy on port 8080 (I don't have control over the proxy).
I also have my connection string, which would work without the proxy:
ssh -i key.pem root@xx.compute.amazonaws.com
When I try to connect I get a "No route to host" error.
I tried to use PuTTY and configured the proxy + authentication file, but then I get this error:
"Unable to use this key file (OpenSSH SSH-2 private key)"
Also, I don't know how PuTTY inserts my proxy config into the ssh connection string, so that I could try it in a terminal.
I was facing the same problem, and this is what I used to connect, using corkscrew. My ~/.ssh/config file looks like this:
Host AWS
Hostname <Public DNS>
Port 443
#Write the appropriate username depending on your AMI, eg : ubuntu, ec2-user
User ubuntu
IdentityFile </path to key file>
ProxyCommand /usr/bin/corkscrew 10.10.78.61 3128 %h %p
then I simply use this command to connect
ssh AWS
and it works flawlessly.
Note: you must edit the sshd_config file on the server so it listens for SSH connections on port 443 (in addition to 22), then restart the SSH daemon.
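A sketch of that server-side change (paths per a typical Linux server; the service name may be ssh or sshd depending on the distro):

```
# /etc/ssh/sshd_config -- listen on both ports
Port 22
Port 443

# then restart the daemon, e.g.:
#   sudo systemctl restart sshd
```

Before doing this, make sure nothing else (such as a web server) is already bound to 443 on the instance, and that the instance's security group allows inbound traffic on that port.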
Are you sure you can login as root? Try logging in as ec2-user instead.
Also, if you have assigned an Elastic IP to your instance, the public DNS has probably changed. Log in to the AWS console and select your instance, then scroll down to look at the public DNS again and double-check you are using the correct xx.compute.amazonaws.com address.