I ran into a surprising error with Ansible.
I have hosts: localhost in two consecutive plays.
Play 1 is fine, but in Play 2 Ansible 2.9 reports "localhost" as unreachable.
In host_vars/localhost, I have ansible_connection: local
---
############ Play 1 ###############
- name: Test Play 1
  hosts: localhost
  gather_facts: no
  tasks:
    - name: Set facts
      set_fact:
        action_host: localhost

############ Play 2 ###############
- name: Test Play 2
  hosts: localhost
  gather_facts: no
  tasks:
    - name: Test Play 2
      shell: |
        echo toto
Output:
PLAY [Test Play 1] ***********************************************************************************************
TASK [Set facts] *************************************************************************************************
ok: [localhost]
PLAY [Test Play 2] ***********************************************************************************************
TASK [Test Play 2] ***********************************************************************************************
fatal: [localhost]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}
PLAY RECAP *******************************************************************************************************
localhost : ok=1 changed=0 unreachable=1 failed=0 skipped=0 rescued=0 ignored=0
$ cat host_vars/localhost
ansible_connection: local
Thanks for all your answers.
Resolved: in the end this was simply because the playbook was not in the same directory as ansible.cfg.
I copied the file to ~/ansible/:
cd ~/ansible ; ansible-playbook <file>.yaml
Then all is well.
If I simply do:
cp <file>.yaml /tmp/.
ansible-playbook /tmp/<file>.yaml
Then I get the error:
TASK [Gathering Facts] *******************************************************************************************
fatal: [localhost]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}
This is because the default connection in /etc/ansible/ansible.cfg is ssh: run from /tmp, the playbook no longer picks up the project's ansible.cfg or the host_vars/localhost file next to its inventory, so ansible_connection: local is never applied and Ansible falls back to SSH.
Thanks all.
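A minimal sketch of a location-independent variant (my own suggestion, not from the original post): declare the connection in the play itself instead of relying on host_vars being found:

- name: Test Play 2
  hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: Test Play 2
      shell: |
        echo toto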
I need to run a task that checks for, and creates if missing, folders on the Ansible control node (where the ansible-playbook command is run); later tasks will copy some specified files into these local sub-folders.
I have a task:
tasks:
  - name: Create local directory
    file:
      path: "remotes/{{ inventory_hostname }}"
      state: directory
      recurse: yes
    delegate_to: localhost
    tags:
      localfolders
However, when I run with --check, it is going to "change" (create the folders) once for each remote:
TASK [Create local directory] ****************************************************************************************************************
changed: [ansible -> localhost]
changed: [remote1 -> localhost]
changed: [remote2 -> localhost]
Why doesn't the task run only on the local host?
The expected result is that on the ansible host (and only there), the following folders are created:
remotes/ansible
remotes/remote1
remotes/remote2
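The short answer: delegate_to changes where each host's copy of the task executes, not how many times the task is scheduled, so it still runs once per inventory host. A hedged sketch of one way to create all the folders in a single pass (run_once plus a loop over the built-in ansible_play_hosts_all list; adapt names to your play):

- hosts: all
  gather_facts: false
  tasks:
    - name: Create one local directory per inventory host
      file:
        path: "remotes/{{ item }}"
        state: directory
      delegate_to: localhost
      run_once: true
      loop: "{{ ansible_play_hosts_all }}"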
To get a better understanding of how Controlling where tasks run: delegation and local actions and run_once work, I've prepared a small test with an inventory file of
[test]
remote01.example.com
remote02.example.com
and a playbook local.yml
---
- hosts: test
  become: false
  gather_facts: false

  tasks:

    - name: Check where I am running on
      delegate_to: localhost
      shell:
        cmd: "hostname && hostname -i"
      register: result
      run_once: true

    - name: Show result
      debug:
        msg: "{{ result.stdout_lines }}"
      run_once: false
executed on the control.example.com node via
sshpass -p ${PASSWORD} ansible-playbook --user ${ACCOUNT} --ask-pass local.yml
resulting in an output of
PLAY [test] ********************************
TASK [Check where I am running on] *********
changed: [remote01.example.com -> localhost]
TASK [Show result] *************************
ok: [remote01.example.com] =>
msg:
- control.example.com
- 192.0.2.1
ok: [remote02.example.com] =>
msg:
- control.example.com
- 192.0.2.1
PLAY RECAP *************************************************************************************************
remote01.example.com : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
remote02.example.com : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Interesting documentation: IPv4 Address Blocks Reserved for Documentation (RFC 5737), which is where the 192.0.2.1 address above comes from.
When I run this playbook, it does not find the ansible_uptime_seconds variable, yet ansible hostname -m setup shows this variable. I am using Ansible 2.9.23.
- hosts: all
  become: yes
  become_method: sudo
  gather_facts: yes
  tasks:
    - name: Print all available facts
      ansible.builtin.debug:
        var: ansible_facts
I get this message:
'ansible_uptime_seconds' is undefined
How do I get this value in the playbook?
The fact name is uptime_seconds when you read it from the ansible_facts dictionary; it is ansible_uptime_seconds only when facts are injected as individual prefixed variables, as in the output of the setup module.
---
- name: Sample playbook
  connection: local
  # gather_facts: false
  hosts: localhost
  tasks:
    - name: print uptime sec
      debug:
        msg: "{{ ansible_facts.uptime_seconds }}"
Output of the above playbook is:
PLAY [Sample playbook] *********************************************************************************************************************************************
TASK [Gathering Facts] *********************************************************************************************************************************************
ok: [localhost]
TASK [print uptime sec] **********************************************************************************************************************************************************
ok: [localhost] => {
"msg": "172603"
}
PLAY RECAP *********************************************************************************************************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
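For completeness: when fact injection is enabled (the inject_facts_as_vars setting, which defaults to true), the same value is also reachable in the play under the prefixed top-level name. A sketch of an equivalent task:

- name: print uptime sec via the injected variable
  debug:
    msg: "{{ ansible_uptime_seconds }}"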
When you don't have any hosts in the inventory, running a playbook produces only a warning:
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
Is there a way to make that an error instead of a warning?
I found out that there is this parameter in ansible.cfg:
[inventory]
unparsed_is_failed = True
but it only returns an error when the inventory file you are trying to use does not exist; it does not look at the content.
One simple solution is:
Create the playbook "main.yml" like:
---
# Check first whether the supplied host pattern {{ RUNNER.HOSTNAME }} matches the inventory;
# otherwise force the playbook to fail (for Jenkins)
- hosts: localhost
  vars_files:
    - "{{ jsonfilename }}"
  tasks:
    - name: "Hostname validation | If OK, it will skip"
      fail:
        msg: "{{ RUNNER.HOSTNAME }} not found in the inventory group or hosts file {{ ansible_inventory_sources }}"
      when: RUNNER.HOSTNAME not in hostvars

# The main playbook starts
- hosts: "{{ RUNNER.HOSTNAME }}"
  vars_files:
    - "{{ jsonfilename }}"
  tasks:
    - Your tasks
    ...
Put your host variables in a json file "var.json":
{
  "RUNNER": {
    "HOSTNAME": "hostname-to-check"
  },
  "VAR1": {
    "CIAO": "CIAO"
  }
}
Run the command:
ansible-playbook main.yml --extra-vars="jsonfilename=var.json"
You can also adapt this solution as you like and pass the hostname directly with the command
ansible-playbook -i hostname-to-check, my_playbook.yml
but in this last case remember to put in your playbook:
hosts: all
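For that last variant, a minimal sketch of my_playbook.yml (the debug task is just a stand-in):

---
- hosts: all
  gather_facts: false
  tasks:
    - debug:
        msg: "Running on {{ inventory_hostname }}"

The trailing comma in -i hostname-to-check, is what makes Ansible treat the argument as an inline host list rather than an inventory file.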
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
Q: "Is there a way to make that Error instead of Warning?"
A: Yes. It is. Test it in the playbook. For example,
- hosts: localhost
  tasks:
    - fail:
        msg: "[ERROR] Empty inventory. No host available."
      when: groups.all|length == 0

- hosts: all
  tasks:
    - debug:
        msg: Playbook started
gives, with an empty inventory,
fatal: [localhost]: FAILED! => {"changed": false, "msg": "[ERROR] Empty inventory. No host available."}
Example of a project for testing
shell> tree .
.
├── ansible.cfg
├── hosts
└── pb.yml
0 directories, 3 files
shell> cat ansible.cfg
[defaults]
gathering = explicit
inventory = $PWD/hosts
shell> cat hosts
shell> cat pb.yml
- hosts: localhost
  tasks:
    - fail:
        msg: "[ERROR] Empty inventory. No host available."
      when: groups.all|length == 0

- hosts: all
  tasks:
    - debug:
        msg: Playbook started
gives
shell> ansible-playbook pb.yml
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit
localhost does not match 'all'
PLAY [localhost] *****************************************************************************
TASK [fail] **********************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "[ERROR] Empty inventory. No host available."}
PLAY RECAP ***********************************************************************************
localhost: ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Q: "Still I am getting a warning: [WARNING]: provided hosts list is empty, ..."
A: Feel free to turn the warning off. See LOCALHOST_WARNING.
shell> ANSIBLE_LOCALHOST_WARNING=false ansible-playbook pb.yml
PLAY [localhost] *****************************************************************************
TASK [fail] **********************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "[ERROR] Empty inventory. No host available."}
PLAY RECAP ***********************************************************************************
localhost: ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
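If you'd rather not set an environment variable on every run, the same LOCALHOST_WARNING switch can, per its config documentation, also live in ansible.cfg; extending the test project's config:

shell> cat ansible.cfg
[defaults]
gathering = explicit
inventory = $PWD/hosts
localhost_warning = false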
I want to validate a few things before I run my main play in Ansible. For example, the command below takes two input arguments from the user, and I want to validate them before executing the main tasks.
ansible-playbook -i my-inventory my-main.yml --tags=repodownload -e release_version=5.0.0-07 -e target_env=dev/prod/preprod
In the above case, release_version should not be empty and should be a version string (for example 5.0.0.34), and target_env must be one of dev/prod/preprod.
I want to display a message to the user about what is wrong. How do I achieve it?
Any help is appreciated.
If you absolutely need the user to provide the variables, I would first of all use vars_prompt so that the value is asked for interactively if the user forgot to provide it as an extra var. This also serves as good inline documentation.
Then you can use pre_tasks to validate the input, whether it was provided interactively or as an extra var. For validation, I usually use the fail module. The point here is to use run_once: true to force the test to run only once even if there are several hosts in your play.
Here is an example based on your input; adapt it to your exact needs.
---
- name: Prompt and validation demo
  hosts: all
  gather_facts: false

  vars:
    _allowed_envs:
      - dev
      - preprod
      - prod

  vars_prompt:
    - name: release_version
      prompt: "What is the release version ? [w.x.y-z]"
      private: no

    - name: target_env
      prompt: "What is the target environment ? [{{ _allowed_envs | join(', ') }}]"
      private: no

  pre_tasks:
    - name: Make sure version is ok
      fail:
        msg: >-
          Release version is not formatted correctly. Please make sure
          it is of the form w.x.y-zz
      when: not release_version is regex('\d*(\.\d*){2}-\d\d')
      run_once: true

    - name: Make sure target_env is allowed
      fail:
        msg: >-
          Environment "{{ target_env }}" is not allowed.
          Please choose a target environment in {{ _allowed_envs | join(', ') }}
      when: not target_env in _allowed_envs
      run_once: true

  tasks:
    - name: "Dummy task just to have a complete playbook for the example"
      debug:
        msg: "Deploying version {{ release_version }} for environment {{ target_env }} on {{ inventory_hostname }}"
And here are some examples launching the playbook:
##########################
# Fully interactive runs #
##########################
$ ansible-playbook -i localhost, playbook.yml
What is the release version ? [w.x.y-z]: wrong
What is the target environment ? [dev, preprod, prod]: prod
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Release version is not formatted correctly. Please make sure it is of the form w.x.y-zz"}
NO MORE HOSTS LEFT ***************************************************
PLAY RECAP **********************************************************
localhost : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
$ ansible-playbook -i localhost, playbook.yml
What is the release version ? [w.x.y-z]: 1.2.3-44
What is the target environment ? [dev, preprod, prod]: dev
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment dev on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
###############
# Hybrid run #
###############
$ ansible-playbook -i localhost, playbook.yml -e target_env=prod
What is the release version ? [w.x.y-z]: 1.2.3-44
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment prod on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
###################
# Fully automated #
###################
$ ansible-playbook -i localhost, playbook.yml -e target_env=prod -e release_version=1.2.3-44
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment prod on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
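An alternative sketch (not in the original answer): the two fail tasks can be collapsed into a single assert task, which reports fail_msg as soon as any condition is false:

  pre_tasks:
    - name: Validate user input
      assert:
        that:
          - release_version is regex('\d*(\.\d*){2}-\d\d')
          - target_env in _allowed_envs
        fail_msg: >-
          release_version must match w.x.y-zz and target_env must be
          one of {{ _allowed_envs | join(', ') }}
      run_once: true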
I am using the following Ansible playbook to import another playbook based on user input:
---
- hosts: localhost
  vars_prompt:
    - name: "cleanup"
      prompt: "Do you want to run cleanup? Enter [yes/no]"
      private: no

- name: run the cleanup yaml file
  import_playbook: cleanup.yml
  when: cleanup == "yes"
Execution log:
bash-$ ansible-playbook -i hosts cleanup.yml
Do you want to run cleanup? Enter [yes/no]: no
PLAY [localhost] *********************************************************************************************************************
TASK [Gathering Facts] ***************************************************************************************************************
ok: [127.0.0.1]
PLAY [master] ********************************************************************************************************************
TASK [Gathering Facts] ***************************************************************************************************************
fatal: [192.168.56.128]: FAILED! => {"msg": "The conditional check 'cleanup == \"yes\"' failed. The error was: error while evaluating conditional (cleanup == \"yes\"): 'cleanup' is undefined"}
to retry, use: --limit #/home/admin/playbook/cleanup.retry
PLAY RECAP ***************************************************************************************************************************
127.0.0.1 : ok=1 changed=0 unreachable=0 failed=0
192.168.56.128 : ok=0 changed=0 unreachable=0 failed=1
It throws the error in the imported playbook, not in the main playbook.
Please help me import a playbook based on user input.
vars_prompt variables are only defined in the play in which they were declared. To use them in other plays, a workaround is to use set_fact to bind the variable to a host, then use hostvars to access that value from the second play.
For instance:
---
- hosts: localhost
  vars_prompt:
    - name: "cleanup"
      prompt: "Do you want to run cleanup? Enter [yes/no]"
      private: no
  tasks:
    - set_fact:
        cleanup: "{{ cleanup }}"

    - debug:
        msg: 'cleanup is available in the play using: {{ cleanup }}'

    - debug:
        msg: 'cleanup is also available globally using: {{ hostvars["localhost"]["cleanup"] }}'

- name: run the cleanup yaml file
  import_playbook: cleanup.yml
  # vars_prompt answers are strings, so cast with | bool rather than comparing to True
  when: hostvars["localhost"]["cleanup"] | bool
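Note that vars_prompt answers arrive as strings; piping the stored value through the bool filter turns "yes"/"no" into a proper boolean, which is more reliable than comparing against a boolean literal.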