I am using the following Ansible playbook to import another playbook based on user input:
---
- hosts: localhost
  vars_prompt:
    - name: "cleanup"
      prompt: "Do you want to run cleanup? Enter [yes/no]"
      private: no

- name: run the cleanup yaml file
  import_playbook: cleanup.yml
  when: cleanup == "yes"
Execution log:
bash-$ ansible-playbook -i hosts cleanup.yml
Do you want to run cleanup? Enter [yes/no]: no
PLAY [localhost] *********************************************************************************************************************
TASK [Gathering Facts] ***************************************************************************************************************
ok: [127.0.0.1]
PLAY [master] ********************************************************************************************************************
TASK [Gathering Facts] ***************************************************************************************************************
fatal: [192.168.56.128]: FAILED! => {"msg": "The conditional check 'cleanup == \"yes\"' failed. The error was: error while evaluating conditional (cleanup == \"yes\"): 'cleanup' is undefined"}
to retry, use: --limit @/home/admin/playbook/cleanup.retry
PLAY RECAP ***************************************************************************************************************************
127.0.0.1 : ok=1 changed=0 unreachable=0 failed=0
192.168.56.128 : ok=0 changed=0 unreachable=0 failed=1
It throws the error in the imported playbook, not in the main playbook.
Please help me to import a playbook based on user input.
vars_prompt variables are only defined in the play in which they are declared. Because the when: on import_playbook is inherited by every task of the imported playbook and evaluated per host, the hosts of the imported play (which never saw the prompt) fail with "'cleanup' is undefined". To use the variable in other plays, a workaround is to use set_fact to bind the value to a host, then use hostvars to access that value from the second play.
For instance:
---
- hosts: localhost
  vars_prompt:
    - name: "cleanup"
      prompt: "Do you want to run cleanup? Enter [yes/no]"
      private: no
  tasks:
    - set_fact:
        cleanup: "{{ cleanup }}"
    - debug:
        msg: 'cleanup is available in the play using: {{ cleanup }}'
    - debug:
        msg: 'cleanup is also available globally using: {{ hostvars["localhost"]["cleanup"] }}'

- name: run the cleanup yaml file
  import_playbook: cleanup.yml
  when: hostvars["localhost"]["cleanup"] == "yes"
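The question does not show cleanup.yml itself; a minimal placeholder targeting the master group seen in the log could look like this (the task is purely illustrative):
- hosts: master
  tasks:
    - name: remove temporary files (placeholder task)
      file:
        path: /tmp/app_cache
        state: absent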
When I run this playbook, it does not find the ansible_uptime_seconds variable, but ansible hostname -m setup does return it. I am using Ansible version 2.9.23.
- hosts: all
  become: yes
  become_method: sudo
  gather_facts: yes
  tasks:
    - name: Print all available facts
      ansible.builtin.debug:
        var: ansible_facts
I am getting this message:
'ansible_uptime_seconds' is undefined
How can I get this value in the playbook?
The fact name is uptime_seconds when facts are collected without the setup module; however, it is ansible_uptime_seconds when collected with the setup module.
---
- name: Sample playbook
  connection: local
  # gather_facts: false
  hosts: localhost
  tasks:
    - name: print uptime sec
      debug:
        msg: "{{ ansible_facts.uptime_seconds }}"
Output of the above playbook is:
PLAY [Sample playbook] *********************************************************************************************************************************************
TASK [Gathering Facts] *********************************************************************************************************************************************
ok: [localhost]
TASK [print uptime sec] **********************************************************************************************************************************************************
ok: [localhost] => {
"msg": "172603"
}
PLAY RECAP *********************************************************************************************************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
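As a complementary sketch (not part of the original answer, and assuming the default inject_facts_as_vars setting), the fact can also be collected explicitly with the setup module and read under both names:
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Collect only the uptime fact
      ansible.builtin.setup:
        filter: "ansible_uptime_seconds"
    - name: Print the injected top-level variable
      ansible.builtin.debug:
        msg: "{{ ansible_uptime_seconds }}"
    - name: Print the same value through the ansible_facts dictionary
      ansible.builtin.debug:
        msg: "{{ ansible_facts.uptime_seconds }}"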
I want to validate a few things before I run my main play in Ansible. For example, the command below takes two input arguments from the user, so I want to validate them before executing the main tasks.
ansible-playbook -i my-inventory my-main.yml --tags=repodownload -e release_version=5.0.0-07 -e target_env=dev/prod/preprod
In the above case, release_version should not be empty (it should be a version string such as 5.0.0.34), and target_env must be one of dev, prod or preprod.
I want to display a message to the user about what is wrong. How do I achieve it?
Any help is appreciated.
If you absolutely need the user to provide the variables, I would first of all use vars_prompt so that the values are asked for interactively if the user forgets to provide them as extra vars. This also makes good inline documentation.
Then you can use pre_tasks to validate the input that was provided, either interactively or as an extra var. For validation, I usually use the fail module. The point here is to use run_once: true to force the test to run only once even if there are several hosts in your play.
Here is an example based on your input; adapt it to your exact needs.
---
- name: Prompt and validation demo
  hosts: all
  gather_facts: false

  vars:
    _allowed_envs:
      - dev
      - preprod
      - prod

  vars_prompt:
    - name: release_version
      prompt: "What is the release version ? [w.x.y-z]"
      private: no

    - name: target_env
      prompt: "What is the target environment ? [{{ _allowed_envs | join(', ') }}]"
      private: no

  pre_tasks:
    - name: Make sure version is ok
      fail:
        msg: >-
          Release version is not formatted correctly. Please make sure
          it is of the form w.x.y-zz
      when: not release_version is regex('\d*(\.\d*){2}-\d\d')
      run_once: true

    - name: Make sure target_env is allowed
      fail:
        msg: >-
          Environment "{{ target_env }}" is not allowed.
          Please choose a target environment in {{ _allowed_envs | join(', ') }}
      when: not target_env in _allowed_envs
      run_once: true

  tasks:
    - name: "Dummy task just to have a complete playbook for the example"
      debug:
        msg: "Deploying version {{ release_version }} for environment {{ target_env }} on {{ inventory_hostname }}"
And here are some examples launching the playbook:
##########################
# Fully interactive runs #
##########################
$ ansible-playbook -i localhost, playbook.yml
What is the release version ? [w.x.y-z]: wrong
What is the target environment ? [dev, preprod, prod]: prod
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Release version is not formatted correctly. Please make sure it is of the form w.x.y-zz"}
NO MORE HOSTS LEFT ***************************************************
PLAY RECAP **********************************************************
localhost : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
$ ansible-playbook -i localhost, playbook.yml
What is the release version ? [w.x.y-z]: 1.2.3-44
What is the target environment ? [dev, preprod, prod]: dev
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment dev on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
###############
# Hybrid run #
###############
$ ansible-playbook -i localhost, playbook.yml -e target_env=prod
What is the release version ? [w.x.y-z]: 1.2.3-44
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment prod on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
###################
# Fully automated #
###################
$ ansible-playbook -i localhost, playbook.yml -e target_env=prod -e release_version=1.2.3-44
PLAY [Prompt and validation demo] ************************************
TASK [Make sure version is ok] ***************************************
skipping: [localhost]
TASK [Make sure target_env is allowed] *******************************
skipping: [localhost]
TASK [Dummy task just to have a complete playbook for the example] ***
ok: [localhost] => {
"msg": "Deploying version 1.2.3-44 for environment prod on localhost"
}
PLAY RECAP ***********************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
Is there a way to make a playbook wait until a variable is defined?
To reduce the execution time of a playbook, I would like to split it into multiple playbooks and start them at the same time. Some of them need variables that are defined in the other playbooks.
Is it possible?
IMHO it's not possible. Global scope is set only by config, environment variables and the command line.
Other variables are shared in the scope of a play. It is possible to import several playbooks into one playbook with import_playbook and share variables among them, but it's not possible to let the imported playbooks run asynchronously and wait for each other.
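For illustration (the file names are made up), a parent playbook can pass a variable to imported playbooks like this, but the imports still execute one after the other, never in parallel:
# main.yml -- minimal sketch; part1.yml and part2.yml are placeholders
- name: Run the first part
  import_playbook: part1.yml
  vars:
    shared_var1: "some value"

- name: Run the second part
  import_playbook: part2.yml
  vars:
    shared_var1: "some value"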
An option would be to use an external shared memory (e.g. database) and to start such playbooks separately. For example, to share variables among the playbooks at the controller, a simple ini file would do the job.
$ cat shared-vars.ini
[global]
The playbook below
- hosts: localhost
  tasks:
    - wait_for:
        path: "{{ playbook_dir }}/shared-vars.ini"
        search_regex: "^shared_var1\\s*=(.*)"
    - debug:
        msg: "{{ lookup('ini', 'shared_var1 file=shared-vars.ini') }}"
waits for a variable shared_var1 in the file shared-vars.ini
$ ansible-playbook wait_for_var.yml
PLAY [localhost] *******************************************************
TASK [wait_for] ********************************************************
Next playbook
- hosts: localhost
  tasks:
    - ini_file:
        path: "{{ playbook_dir }}/shared-vars.ini"
        section: global
        option: shared_var1
        value: Test value set by declare_var.yml
writes the variable shared_var1 into the file shared-vars.ini
$ ansible-playbook declare_var.yml
PLAY [localhost] *******************************************************
TASK [ini_file] ********************************************************
changed: [localhost]
PLAY RECAP *************************************************************
localhost : ok=1 changed=1 unreachable=0 failed=0
The first playbook, which was waiting for the variable, continues:
TASK [debug] ***********************************************************
ok: [localhost] => {
"msg": "Test value set by declare_var.yml"
}
PLAY RECAP *************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0
The handlers I have are not being run by the playbook or tasks.
I have the following directory structure:
<project>
  - playbook.yml
  - <roles>
    - <handler>
      - main.yml
    - <meta>
    - <tasks>
      - main.yml
The problem is the handler is never called.
tasks/main.yml:
- name: run task1
  command: run_task
  notify: "test me now"
handler/main.yml:
- name: tested
  register: val1
  listen: "test me now"
The playbook just calls tasks/main.yml and has hosts: all.
Do I need an include/import? I tried it in the playbook but it didn't help.
The play below works
tasks:
  - include_tasks: tasks/main.yml
  - meta: flush_handlers
  - debug: var=val1.stdout
handlers:
  - import_tasks: handlers/main.yml
Handlers must be imported so they are present when a task notifies them. Tasks may be either included or imported.
A module is missing in handler/main.yml. This would cause:
ERROR! no action detected in task. This often indicates a misspelled module name, or incorrect module path.
Use some module in handler/main.yml. For example:
- name: tested
  command: "echo 'running handler'"
  register: val1
  listen: "test me now"
Running such a play gives
val1.stdout: running handler
Simplified example
Running the playbook below
shell> cat playbook.yml
- hosts: localhost
  tasks:
    - include_tasks: tasks/main.yml
  handlers:
    - import_tasks: handlers/main.yml

shell> cat tasks/main.yml
- command: date
  register: result
  notify: test me now

shell> cat handlers/main.yml
- name: test me now
  debug:
    msg: "{{ result.stdout }} Running handler."
gives
shell> ansible-playbook playbook.yml
PLAY [localhost] *****************************************************************************
TASK [Gathering Facts] ***********************************************************************
ok: [localhost]
TASK [include_tasks] *************************************************************************
included: /export/scratch/tmp8/test-801/tasks/main.yml for localhost
TASK [command] *******************************************************************************
changed: [localhost]
RUNNING HANDLER [test me now] ****************************************************************
ok: [localhost] =>
msg: Mon 25 Apr 2022 04:59:02 PM CEST Running handler.
PLAY RECAP ***********************************************************************************
localhost : ok=4 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
You should have the structure described in https://docs.ansible.com/ansible/latest/user_guide/playbooks_reuse_roles.html; in particular, the directory should be called handlers (and not handler).
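For reference, a layout following that convention could look like this (the role name <myrole> is just a placeholder):
<project>
  - playbook.yml
  - <roles>
    - <myrole>
      - <handlers>
        - main.yml
      - <meta>
        - main.yml
      - <tasks>
        - main.yml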
Is there a way to force commands (ansible-playbook, ansible-variable, etc.) to be executed with a --limit option (and otherwise deny them)?
I discovered that, on a cluster, a playbook can easily run against all nodes if you mistakenly run it without a limit. I'd like to prevent Ansible users from doing that.
Use the ansible_limit variable (added in ansible 2.5). You can test like this:
tasks:
  - fail:
      msg: "you must use -l or --limit"
    when: ansible_limit is not defined
    run_once: true
It's the opposite of the task I solved recently: my goal was to detect that there is a --limit and skip some plays.
https://medium.com/opsops/how-to-detect-if-ansible-is-run-with-limit-5ddb8d3bd145
In your case you can check this in the play and fail if it is a "full run":
- hosts: all
  gather_facts: no
  tasks:
    - set_fact:
        full_run: '{{ play_hosts == groups.all }}'
    - fail:
        msg: 'Use --limit, Luke!'
      when: full_run
You can use a different group instead of all, of course (change it in both the hosts and set_fact lines); see the sketch below.
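A minimal sketch of that variant, assuming a hypothetical webservers group:
- hosts: webservers
  gather_facts: no
  tasks:
    - set_fact:
        full_run: '{{ play_hosts == groups.webservers }}'
    - fail:
        msg: 'Use --limit, Luke!'
      when: full_run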
I did it that way in a task:
$ cat exit-if-no-limit.yml
---
- name: Verifying that a limit is set
  fail:
    msg: 'This playbook cannot be run with no limit'
  run_once: true
  when: ansible_limit is not defined

- debug:
    msg: "Limit is {{ ansible_limit }}, let's continue"
  run_once: true
  when: ansible_limit is defined
I include it in my playbooks when I need to prevent them from running on all the hosts:
- include_role:
    name: myrole
    tasks_from: "{{ item }}.yml"
  loop:
    - exit-if-no-limit
    - something
    - something_else
Easy to reuse when needed. It works like that:
TASK [myrole: Verifying that a limit is set]
fatal: [ahost]: FAILED! => {"changed": false, "msg": "This playbook cannot be run with no limit"}
or
TASK [myrole: debug]
ok: [anotherhost] => {
"msg": "Limit is anotherhost, let's continue"
}
This can be done with the assert module.
I like to do this in a separate play, at the start of the playbook, with fact gathering disabled. That way, the playbook fails instantly if the limit is not specified.
- hosts: all
  gather_facts: no
  tasks:
    - name: assert limit
      run_once: yes
      assert:
        that:
          - 'ansible_limit is defined'
        fail_msg: Playbook must be run with a limit (normally staging or production)
        quiet: yes
When a limit is not set, you get:
$ ansible-playbook site.yaml
PLAY [all] *********************************************************************
TASK [assert limit] ************************************************************
fatal: [host1.example.com]: FAILED! => {"assertion": "ansible_limit is defined", "changed": false, "evaluated_to": false, "msg": "Playbook must be run with a limit (normally staging or production)"}
NO MORE HOSTS LEFT *************************************************************
PLAY RECAP *********************************************************************
host1.example.com : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
And when a limit is set, you get:
$ ansible-playbook -l staging site.yaml
PLAY [all] *********************************************************************
TASK [assert limit] ************************************************************
ok: [host1.example.com]
PLAY [all] *********************************************************************
[... etc ...]
Functionally this is very similar to using the fail module guarded with when. The difference is that the assert task itself is responsible for checking the assertions, so if the assertions pass, the task reports ok. With the fail module, if the when condition is not met, the task is skipped instead.
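For comparison, a minimal sketch of the equivalent check with the fail module (not part of the original answer); when a limit is set, this task shows up as skipping rather than ok in the play output:
- hosts: all
  gather_facts: no
  tasks:
    - name: fail when no limit
      run_once: yes
      fail:
        msg: Playbook must be run with a limit (normally staging or production)
      when: ansible_limit is not defined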