My directory structure looks like this:
playbooks/
    Foo.yml
    tasks/
        Task1.yml
        Task2.yml
        AllTasks.yml
The Foo.yml playbook has a - import_tasks: tasks/AllTasks.yml task. AllTasks.yml has
- import_tasks: tasks/Task1.yml
- import_tasks: tasks/Task2.yml
This works perfectly fine when I execute the playbook Foo.yml. But when I execute a playbook located elsewhere (so not directly in this playbooks directory), the imports no longer work. The reason is that the imports resolve relative to the location of the playbook doing the importing, not the file that contains the import.
The same happens with tasks using other modules, such as copy. They look for files relative to the playbook location.
Is there a way to make my tasks work for playbooks located in different directories?
I know there is a playbook_dir variable which sadly I cannot override. I also came across inventory_dir, but for whatever reason that one is not defined.
A way to reference files relative to the file the reference is made in would work. Example:
- import_tasks: "{{ current_dir }}/Task1.yml"
- import_tasks: "{{ current_dir }}/Task2.yml"
Something relative to the inventory file of this project would also work. Example:
- import_tasks: "{{ inventory_dir }}/playbooks/tasks/Task1.yml"
- import_tasks: "{{ inventory_dir }}/playbooks/tasks/Task2.yml"
This latter approach would force me to add these paths all over the project though.
Q: "playbook_dir variable which sadly I cannot override"
A: The variable playbook_dir works as expected (but it's not needed in this case; see the second part below).
shell> cd /scratch/tmp
shell> cat Foo.yml
- hosts: localhost
  tasks:
    - import_tasks: "{{ playbook_dir }}/tasks/AllTasks.yml"
shell> cat tasks/AllTasks.yml
- import_tasks: "{{ playbook_dir }}/tasks/Task1.yml"
- import_tasks: "{{ playbook_dir }}/tasks/Task2.yml"
shell> cat tasks/Task1.yml
- debug:
    msg: Task1.yml
shell> cat tasks/Task2.yml
- debug:
    msg: Task2.yml
shell> cd /tmp
shell> pwd
/tmp
shell> ansible-playbook /scratch/tmp/Foo.yml
PLAY [localhost] ****
TASK [debug] ****
ok: [localhost] => {
"msg": "Task1.yml"
}
TASK [debug] ****
ok: [localhost] => {
"msg": "Task2.yml"
}
PLAY RECAP ****
localhost: ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Q: "Change where Ansible looks for files"
A: No changes are needed; the defaults work fine. See Search paths in Ansible. The values of the variable ansible_search_path below explain why the paths resolve correctly.
shell> cat Foo.yml
- hosts: localhost
  tasks:
    - debug:
        var: ansible_search_path
    - import_tasks: tasks/AllTasks.yml
shell> cat tasks/AllTasks.yml
- debug:
    var: ansible_search_path
- import_tasks: Task1.yml
- import_tasks: Task2.yml
shell> cat tasks/Task1.yml
- debug:
    msg: Task1.yml
shell> cat tasks/Task2.yml
- debug:
    msg: Task2.yml
give
"ansible_search_path": [
"/scratch/tmp"
]
"ansible_search_path": [
"/scratch/tmp/tasks",
"/scratch/tmp"
]
"msg": "Task1.yml"
"msg": "Task2.yml"
Related
We have a fairly large number of Ansible tasks in main.yaml, and we don't need to execute all of them; running only the new task 'House Keep Task' would be enough. So I tried the '--start-at-task' option as below, but Ansible can't find that task:
command:
ansible-playbook -u sysadmin -i ./inventories/dev/hosts --start-at-task='House Keep Task' --step batchservers.yml -K
message:
[ERROR]: No matching task "House Keep Task" found. Note: --start-at-task can only follow static includes.
batchserver.yaml
---
- hosts: batchservers
  become: true
  tasks:
    - import_role:
        name: batchservers
      tags: [batch]
ansible/roles/batchservers/tasks/main.yaml
---
- name: Add SHARED_DATA_PATH env
  lineinfile:
    dest: ~/.bash_profile
    line: export SHARED_DATA_PATH=/data

- name: Create /data if it does not exist
  file:
    path: /data
    state: directory
    mode: og+w

... other tasks include reboot task ...

- name: House Keep Task
  cron:
    name: "House Keep Task"
    user: "{{ batch_user }}"
    special_time: daily
    job: "/usr/bin/find /var/log -name '*.log' -mtime +6 -type f -delete"
    state: present
Is there any good way to execute a particular task, House Keep Task?
Our ansible version is core 2.11.12.
Any advice would be highly appreciated.
Q: "Execute particular task House Keep Task."
A: The ansible-playbook option --start-at-task works as expected
--start-at-task 'START_AT_TASK'
start the playbook at the task matching this name
Given the tree
shell> tree .
.
├── ansible.cfg
├── hosts
├── pb.yml
└── roles
    └── batchservers
        └── tasks
            └── main.yaml
3 directories, 4 files
The playbook
shell> cat pb.yml
- hosts: localhost
  gather_facts: false
  tasks:
    - import_role:
        name: batchservers
and the role
shell> cat roles/batchservers/tasks/main.yaml
- name: Add SHARED_DATA_PATH env
  debug:
    msg: Add SHARED_DATA_PATH env

- name: Create /data if it does not exist
  debug:
    msg: Create /data if it does not exist

- name: House Keep Task
  debug:
    msg: House Keep Task
give
shell> ansible-playbook pb.yml --start-at-task 'House Keep Task'
PLAY [localhost] *****************************************************************************
TASK [batchservers : House Keep Task] ********************************************************
ok: [localhost] =>
msg: House Keep Task
PLAY RECAP ***********************************************************************************
localhost: ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
ansible version: 2.9
How can I specify the hosts when I import a playbook with import_playbook?
My code (/project/first_pb.yml)
- import_playbook: /test/pb0.yml
  hosts: atlanta
Q: "A method to pass specific family hosts to the imported playbook?"
A: An imported playbook behaves no differently from one run directly. For example,
shell> cat pb-A.yml
- hosts: "{{ my_hosts|default('localhost') }}"
  tasks:
    - debug:
        var: inventory_hostname
shell> ansible-playbook pb-A.yml -e my_hosts=host1
...
inventory_hostname: host1
shell> cat pb-B.yml
- import_playbook: pb-A.yml
shell> ansible-playbook pb-B.yml -e my_hosts=host1
...
inventory_hostname: host1
There are many options on how to pass specific hosts and groups to a playbook. For example, see:
Patterns: targeting hosts and groups
add_host module – Add a host and group to the ansible-playbook in-memory inventory
Inventory plugins (e.g. constructed)
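As a minimal sketch of the add_host option (the group name app_servers and host name host1 below are hypothetical, not from the question), a first play can build an in-memory group that a later play then targets:

```yaml
# Sketch only: app_servers and host1 are hypothetical names.
- hosts: localhost
  gather_facts: false
  tasks:
    # Add host1 to the in-memory group app_servers
    - add_host:
        name: host1
        groups: app_servers

# A later play (which could just as well live in an imported playbook)
# targets the in-memory group created above.
- hosts: app_servers
  gather_facts: false
  tasks:
    - debug:
        var: inventory_hostname
```

The in-memory group exists only for the duration of the ansible-playbook run, which makes this handy for passing dynamically selected hosts to imported playbooks.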
I can filter imported playbooks with a "when:" condition, for example:
- import_playbook: /test/pb0.yml
  when: hostname != host1a*
- import_playbook: /test/pb0.yml
  when: '"north" not in hostname'
- import_playbook: /test/pb0.yml
  when: '"west" in hostname'
Is it possible to set variables for specific hosts in Ansible in the playbook itself, using the global vars?
So the playbook would be configured something like this:
---
- hosts:
    - host-1
    - host-2
  vars:
    host-1: # <- set vars for host-1 specifically
      a_var: yes
    host-2: # <- set vars for host-2 specifically
      a_var: no
I know that using group_vars, host_vars, an inventory file, or set_fact at runtime is possible, but this is not what I want.
The docs describe "playbook host_vars", but I haven't figured out how that is configured.
What you are referring to is not really in the playbook, per se. But it can be in the directory structure next to the playbook itself.
This is further explained in Organizing host and group variables.
Although you can store variables in the main inventory file, storing separate host and group variables files may help you organize your variable values more easily. Host and group variable files must use YAML syntax. Valid file extensions include ‘.yml’, ‘.yaml’, ‘.json’, or no file extension. See YAML Syntax if you are new to YAML.
Ansible loads host and group variable files by searching paths relative to the inventory file or the playbook file.
Source: https://docs.ansible.com/ansible/latest/user_guide/intro_inventory.html#organizing-host-and-group-variables, emphasis mine
So, what you can have is this:
.
├── host_vars
│   └── localhost.yml
└── play.yml
Where localhost.yml matches the name of the host we want to target and would contain something like:
foo: bar
And the file play.yml would be the playbook:
- hosts: all
  gather_facts: no
  tasks:
    - debug:
        var: foo
Then running it would give the expected:
PLAY [all] **********************************************************************************************************
TASK [debug] ********************************************************************************************************
ok: [localhost] => {
"foo": "bar"
}
PLAY RECAP **********************************************************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Q: "Set variables for specific hosts in Ansible in the playbook itself, using the global vars."
A: Put the host-specific variables into a global dictionary, e.g. my_hostvars. (In effect, you create your own hostvars.) For example,
- hosts: host-1,host-2
  vars:
    my_hostvars:
      host-1: # <- set vars for host-1 specifically
        a_var: yes
      host-2: # <- set vars for host-2 specifically
        a_var: no
  tasks:
    - debug:
        msg: "{{ my_hostvars[inventory_hostname]['a_var'] }}"
gives
ok: [host-1] =>
msg: true
ok: [host-2] =>
msg: false
It's up to you where you declare the dictionary and how you reference it in the playbook. See Variable precedence: Where should I put a variable?.
Put the variables into your own facts to simplify the access. For example
- set_fact:
    my_facts: "{{ my_hostvars[inventory_hostname] }}"
- debug:
    var: my_facts.a_var
give
ok: [host-1] =>
my_facts.a_var: true
ok: [host-2] =>
my_facts.a_var: false
You can simplify the access further by setting the variables if needed. (For example, to avoid rewriting a code already using the variables).
- set_fact:
    a_var: "{{ my_facts.a_var }}"
    b_var: "{{ my_facts.b_var }}"
    c_var: "{{ my_facts.c_var }}"
You can also use this to set or customize default values (set_fact is precedence 19) if needed. For example,
- set_fact:
    a_var: "{{ my_facts.a_var|default('a') }}"
    b_var: "{{ my_facts.b_var }}"
    c_var: "{{ my_facts.c_var }}"
I have around 250 Debian packages in a directory /home/emgda/del/ whose contents change periodically; they must be installed by the end of each day.
So I am trying to write an Ansible playbook that loops over this directory, holds the file names in a list, and then installs all the Debian packages sequentially using the command sudo dpkg -i file_name.
So far, the code below lists the files in the directory; I just need to add a command: task somehow to execute the command above.
---
- hosts: local
  gather_facts: false
  tasks:
    - command: "ls /home/emgda/del/"
      register: dir_out
    - debug: var={{item}}
      with_items: dir_out.stdout_lines
OUTPUT is
PLAY [local] ***********************************************************************************************************
TASK [command] ************************************************************************************************************************
changed: [localhost]
TASK [debug] ************************************************************************************************************************
ok: [localhost] => (item=dir_out.stdout_lines) => {
"dir_out.stdout_lines": [
"a.deb"
],
"item": "dir_out.stdout_lines"
}
PLAY RECAP ************************************************************************************************************************
localhost : ok=2 changed=1 unreachable=0 failed=0
Any help will be deeply appreciated.
Q: "I have Debian files in a directory /home/emgda/del/ which periodically changes and must be installed."
A: Find the packages with the find module and install them in a loop with apt
- find:
    path: '/home/emgda/del/'
    patterns: '*.deb'
  register: result

- apt:
    deb: '{{ item.path }}'
  loop: '{{ result.files }}'
It's also possible to query the fileglob lookup plugin and install the packages in a single task
- apt:
    deb: "{{ item }}"
  loop: "{{ query('fileglob', '/home/emgda/del/*.deb') }}"
Well, I solved it using the technique below.
---
- hosts: local
  gather_facts: false
  tasks:
    - name: Making a list of files
      shell: "ls /home/emgda/packages/"
      register: command_result

    - name: Installing Debian sequentially.
      become: yes
      shell: "dpkg -i /home/emgda/packages/{{ item }}"
      with_items:
        - "{{ command_result.stdout_lines }}"
I have 3 tasks in a playbook. My requirement is to complete the first task, after which the second and third tasks should run in parallel. By default these three tasks run one after the other; is there a way to run the second and third tasks in parallel once the first one is done?
- hosts: conv4
  remote_user: username
  tasks:
    - name: Start Services
      script: app-stop.sh
      register: console

- hosts: patchapp
  remote_user: username
  become_user: username
  become_method: su
  tasks:
    - name: Stop APP Services
      script: stopapp.sh
      register: console
    - debug: var=console.stdout_lines

- hosts: patchdb
  remote_user: username
  become_user: username
  become_method: su
  tasks:
    - name: Stop DB Services
      script: stopdb.sh
      register: console
    - debug: var=console.stdout_lines
I need to run the Start Services task first, and once it completes, run the Stop APP Services and Stop DB Services tasks in parallel.
1. Importing playbooks works fine.
play.yml
- import_playbook: play1.yml
- import_playbook: play2.yml
play1.yml
- hosts: localhost
  tasks:
    - debug:
        msg: 'play1: {{ ansible_host }}'
play2.yml
- hosts:
    - vm2
    - vm3
  tasks:
    - debug:
        msg: 'play2: {{ ansible_host }}'
2. Looping over include_tasks with delegate_to is a no-go.
An option would be to loop over include_tasks and use delegate_to inside the included task (see below). But there is an unresolved bug, "delegate_to with remote_user on ansible 2.0".
3. Workflows in Ansible Tower require a license.
Workflows are only available to those with Enterprise-level licenses.
play.yml
- name: 'Test loop delegate_to'
  hosts: localhost
  gather_facts: no
  vars:
    my_servers:
      - vm2
      - vm3
  tasks:
    - name: "task 1"
      debug:
        msg: 'Task1: Running at {{ ansible_host }}'
    - include_tasks: task2.yml
      loop: "{{ my_servers }}"
task2.yml
- debug:
    msg: 'Task2: Running at {{ ansible_host }}'
  delegate_to: "{{ item }}"
shell> ansible-playbook play.yml
PLAY [Test loop delegate_to]
TASK [task 1]
ok: [localhost] =>
msg: 'Task1: Running at 127.0.0.1'
TASK [include_tasks]
included: ansible-examples/examples/example-014/task2.yml for localhost
included: ansible-examples/examples/example-014/task2.yml for localhost
TASK [debug]
ok: [localhost -> vm2] =>
msg: 'Task2: Running at vm2'
TASK [debug]
ok: [localhost -> vm3] =>
msg: 'Task2: Running at vm3'
PLAY RECAP
localhost : ok=5 changed=0 unreachable=0 failed=0
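One more sketch, not part of the answers above: since all hosts in a single play execute each task concurrently (up to the forks limit), the question's second and third plays could be merged into one play targeting both groups, so the APP and DB services stop in parallel. This assumes the inventory defines the patchapp and patchdb groups exactly as in the question.

```yaml
# Sketch only: assumes inventory groups patchapp and patchdb from the question.
# Run this play after the "Start Services" play has completed.
- hosts: patchapp:patchdb
  remote_user: username
  become_user: username
  become_method: su
  tasks:
    - name: Stop APP and DB services in parallel
      # Each host selects its own script; all hosts in the play run
      # this task concurrently (up to the 'forks' setting).
      script: "{{ 'stopapp.sh' if inventory_hostname in groups['patchapp'] else 'stopdb.sh' }}"
      register: console
    - debug: var=console.stdout_lines
```

Because plays themselves run strictly in sequence, placing this merged play after the "Start Services" play gives the required ordering while still parallelizing the two stop operations across hosts.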