Ansible: Provided hosts list is empty

I have the playbook below, where the remote host is a user input; I then try to gather facts about the remote host and copy them to a local file:
---
- hosts: localhost
  vars_prompt:
    - name: hostname
      prompt: "Enter Hostname"
  tasks:
    - name: Add hosts to known_hosts file
      add_host: name={{ hostname }} groups=new
    - name: Check if Host is reachable
      shell: ansible -m ping {{ hostname }}
    - name: Remove existing remote hosts
      shell: ssh-keygen -R {{ hostname }}
    - name: Setup passwordless SSH login
      shell: ssh-copy-id -i ~/.ssh/id_rsa user@{{ hostname }}
    - name: Display facts
      command: ansible {{ groups['new'] }} -m setup
      register: output
    - copy: content="{{ output }}" dest=/var/tmp/dir/Node_Health/temp
...
I get the below error in the temp file:
Node_Health]# cat temp
{"start": "2016-06-17 09:26:59.174155", "delta": "0:00:00.279268", "cmd": ["ansible", "[udl360x4675]", "-m", "setup"], "end": "2016-06-17 09:26:59.453423", "stderr": " [WARNING]: provided hosts list is empty, only localhost is available", "stdout": "", "stdout_lines": [], "changed": true, "rc": 0, "warnings":
I also tried the playbook below, which gives the same error:
---
- hosts: localhost
  vars_prompt:
    - name: hostname
      prompt: "Enter Hostname"
  tasks:
    - name: Add hosts to known_hosts file
      add_host: name={{ hostname }} groups=new
    - name: Check if Host is reachable
      shell: ansible -m ping {{ hostname }}
    - name: Remove existing remote hosts
      shell: ssh-keygen -R {{ hostname }}
    - name: Setup passwordless SSH login
      shell: ssh-copy-id -i ~/.ssh/id_rsa user@{{ hostname }}
- hosts: new
  tasks:
    - name: Display facts
      command: ansible {{ groups['new'] }} -m setup
      register: output
    - local_action: copy content="{{ output }}" dest=/var/tmp/dir/Node_Health/temp
...
Any help will be appreciated.

Ansible assumes that you have all your hosts in an inventory file somewhere.
add_host only adds your host to the currently running Ansible, and that doesn't propagate to the copy of Ansible you call.
You're going to have to either:
change the command to use an inline inventory list, like ansible all -i '{{ hostname }},' -m setup (more details on the use of -i '<hostname>,' here); see the sketch below
or write out the hostname to a file and use that as your inventory file
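A minimal sketch of the first option, keeping the rest of the posted playbook as-is and changing only the last two tasks; the trailing comma in -i is what tells Ansible this is an inline host list, not a file:
    - name: Display facts
      command: ansible all -i '{{ hostname }},' -m setup
      register: output
    - copy: content="{{ output }}" dest=/var/tmp/dir/Node_Health/temp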

Put your hosts in a hosts.ini file, with the following syntax:
[nodes]
node_u1 ansible_user=root ansible_host=127.0.0.1
node_u2 ansible_user=root ansible_host=127.0.1.1
node_u3 ansible_user=root ansible_host=127.0.2.1
node_u4 ansible_user=root ansible_host=127.0.3.1
node_u5 ansible_user=root ansible_host=127.0.4.1
To run Ansible, use:
ansible-playbook -i hosts.ini <playbook.yml>
You can also save your hosts file as /etc/ansible/hosts to avoid passing the inventory as a parameter; Ansible looks there by default. Then just run:
ansible-playbook <playbook.yml>

It looks to me like Ansible does not know where the inventory file is located.
I used this command:
ansible-playbook name.yml -i hostfile

The parent directory of the Ansible files should be located under $HOME.

Related

Ansible: How do I link Variables, stored in a Vault to a specific host?

I want to encrypt my host credentials in a central secrets.yml file.
How can I tell Ansible to use the variables?
I tried with this setup:
host_vars/test.yml
ansible_user: {{ test_user }}
ansible_become_pass: {{ test_pass }}
secrets.yml
# Credentials Test Server #
test_user: user
test_pass: password
inventory.yml
all:
  children:
    test:
      hosts:
        10.10.10.10
playbook.yml
---
- name: Update Server
  hosts: test
  become: yes
  vars_files:
    - secrets.yml
  tasks:
    - name: Update
      ansible.builtin.apt:
        update_cache: yes
For execution I use this command:
ansible-playbook -i inventory.yml secure_linux.yml --ask-vault-pass
During execution I get this error message:
fatal: [10.10.10.10]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: root@10.10.10.10: Permission denied (publickey,password).", "unreachable": true}
For those credentials to be used by all hosts, use the group_vars/all directory. So you will have the file group_vars/all/secrets.yml, which you will encrypt with ansible-vault.
ansible_user: user
ansible_password: password
You do not need a host_vars file.
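A sketch of that layout, assuming the inventory and playbook stay as posted (the playbook file name playbook.yml is taken from the question heading):
# group_vars/all/secrets.yml (contents before encryption)
ansible_user: user
ansible_password: password

# encrypt it in place, then run the playbook supplying the vault password
ansible-vault encrypt group_vars/all/secrets.yml
ansible-playbook -i inventory.yml playbook.yml --ask-vault-pass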
The solution was:
give the host_vars file the right name (10.10.10.10.yml)
add ansible_password as a variable
use quotation marks: "{{ test_user }}"
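A sketch of the corrected host_vars file under those three points, assuming secrets.yml keeps the variable names from the question:
# host_vars/10.10.10.10.yml
ansible_user: "{{ test_user }}"
ansible_password: "{{ test_pass }}"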

How to let Ansible run a command with host as argument locally

I installed servers on multiple hosts using an Ansible playbook. The hosts are defined in the inventory file:
[services]
host_ip1
host_ip2
...
Now I need to test whether each host works properly, using this command:
service-cli -h <host_ip1> -p 6380 ping
...
How do I write that using Ansible? The command must run locally, not on the remote hosts, so how do I pass each host_ipX to Ansible?
I know how to use delegate_to: 127.0.0.1, but I do not know how to get the IPs from the inventory file using with_items. What keyword should I use? services?
Your use case is literally documented in the extract filter examples:
- debug:
    msg: service-cli -h {{ item }} -p 6380 ping
  with_items: '{{ groups["services"] | map("extract", hostvars, "ansible_host") }}'
  delegate_to: localhost
  # this is required if you want the task as part of a playbook
  # that contains "hosts: all"
  run_once: yes
or, as an alternative to the "run_once" you can isolate that as its own play:
- hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        msg: service-cli -h {{ item }} -p 6380 ping
      with_items: '{{ groups["services"] | map("extract", hostvars, "ansible_host") }}'

- hosts: all
  ... # and now back to normal life
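If you want to actually run the check rather than just print it, a sketch that swaps debug for command, assuming service-cli is installed on the control node:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Ping each service host locally
      command: service-cli -h {{ item }} -p 6380 ping
      with_items: '{{ groups["services"] | map("extract", hostvars, "ansible_host") }}'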

Ansible Hostname as variable in ansible_user

Need help with Ansible. In our company we use the following method to SSH to a server.
If the IP of the server is 172.16.1.8, the username would be "empname~id~serverIP", e.g. john~1234~172.16.1.8. So the following SSH command is used:
> ssh john~1234~172.16.1.8@172.16.1.8 -i key.pem
Basically the username contains the hostname as a variable.
Now our inventory has just the IPs, under the group web.
> cat inventory.txt
[web]
172.16.1.8
172.16.x.y
172.16.y.z
My playbook YAML has ansible_user as follows.
> cat uptime.yml
- hosts: web
  vars:
    ansible_ssh_port: xxxx
    ansible_user: john~1234~{{inventory_hostname}}
  tasks:
    - name: Run uptime command
      shell: uptime
However, when I use the following ansible-playbook command, it gives an error for an incorrect username.
> ansible-playbook -v uptime.yml -i inventory.txt --private-key=key.pem
Please help me set the correct ansible_user in the playbook, with the hostname as a variable inside it.
You can define ansible_user in group_vars/web.yml
group_vars/web.yml:
---
ansible_ssh_port: xxxx
ansible_user: "john~1234~{{ inventory_hostname }}"
Using a group var helped -
ansible_ssh_port: xxxx
ansible_user: "john~1234~{{ inventory_hostname }}"
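A sketch of the resulting layout, assuming group_vars/ sits next to the playbook (it can also sit next to the inventory file); the run command stays exactly as in the question:
uptime.yml
inventory.txt
group_vars/
  web.yml      # ansible_ssh_port and ansible_user as shown above

ansible-playbook -v uptime.yml -i inventory.txt --private-key=key.pem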

Can Ansible deploy public SSH key asking password only once?

I wonder how to copy my SSH public key to many hosts using Ansible.
First attempt:
ansible all -i inventory -m local_action -a "ssh-copy-id {{ inventory_hostname }}" --ask-pass
But I got the error: The module local_action was not found in configured module paths.
Second attempt using a playbook:
- hosts: all
  become: no
  tasks:
    - local_action: command ssh-copy-id {{ inventory_hostname }}
In the end, I entered my password once per managed host:
ansible all -i inventory --list-hosts | while read h ; do ssh-copy-id "$h" ; done
How can I enter the password only once while deploying my public SSH key to many hosts?
EDIT: I succeeded in copying my SSH public key to multiple remote hosts using the following playbook, based on Konstantin Suvorov's answer.
- hosts: all
  tasks:
    - authorized_key:
        key: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
The field user should be mandatory according to the documentation, but it seems to work without it. The above generic playbook can therefore be used for any user when run with this command line:
ansible-playbook -i inventory authorized_key.yml -u "$USER" -k
Why don't you use the authorized_key module?
- hosts: all
  tasks:
    - authorized_key:
        user: remote_user_name
        state: present
        key: "{{ lookup('file', '/local/path/.ssh/id_rsa.pub') }}"
and run the playbook with -u remote_user_name -k
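A sketch of the full invocation, assuming the play above is saved as authorized_key.yml (the file name is just an example) and inventory is your inventory file; -k prompts for the SSH password once, and the module then installs the key on every host:
ansible-playbook -i inventory authorized_key.yml -u remote_user_name -k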

How to add a host to the known_host file with ansible?

I want to add the SSH key of my private Git server to the known_hosts file with Ansible 1.9.3, but it doesn't work.
I have the following entry in my playbook:
- name: add SSH host key
  known_hosts: name='myhost.com'
               key="{{ lookup('file', 'host_key.pub') }}"
I have copied /etc/ssh/ssh_host_rsa_key.pub to host_key.pub and the file looks like:
ssh-rsa AAAAB3NzaC1... root@myhost.com
If I run my playbook I always get the following error message:
TASK: [add SSH host key]
******************************************************
failed: [default] => {"cmd": "/usr/bin/ssh-keygen -F myhost.com -f /tmp/tmpe5KNIW", "failed": true, "rc": 1}
What am I doing wrong?
You can use ssh-keyscan directly within the Ansible task:
- name: Ensure servers are present in known_hosts file
  known_hosts:
    name: "{{ hostvars[item].ansible_host }}"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan {{ hostvars[item].ansible_host }}') }}"
    hash_host: true
  with_items: "{{ groups.servers }}"
In the above snippet, we iterate over all hosts in the group "servers" defined in your inventory, run ssh-keyscan against each of them, read the result with the pipe lookup, and add it using known_hosts.
If you have only one host that you want to add, it's even simpler:
- name: Ensure server is present in known_hosts file
  known_hosts:
    name: "myhost.com"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan myhost.com') }}"
    hash_host: true
Whether you need hash_host or not depends on your system.
Your copy of the remote host's public key needs a name, and that name needs to match what you specify in your known_hosts task.
In your case, prepend "myhost.com " to your host_key.pub key file as follows:
myhost.com ssh-rsa AAAAB3NzaC1... root@myhost.com
Reference:
Ansible known_hosts module, specifically the name parameter
Using ssh-keyscan to generate host_key.pub is another way.
ssh-keyscan myhost.com > host_key.pub
This command generates output in the following format:
$ ssh-keyscan github.com > github.com.pub
# github.com SSH-2.0-libssh-0.7.0
# github.com SSH-2.0-libssh-0.7.0
$ cat github.com.pub
github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==
