I have written a playbook that copies a file from source to destination on multiple hosts. The playbook works if all hosts are reachable, but it doesn't fail if one of the hosts is unreachable.
ansible-playbook -i "10.11.12.13,10.11.12.14," -e "hostid=12345" test.yml
E.g., if the host "10.11.12.13" is not reachable, task execution skips the unreachable host and moves on to the next host.
Sample playbook:
- hosts: localhost
  gather_facts: no
  tasks:
    - debug: msg="backup_restore.py file not found"

- name: Copy file
  hosts: all
  remote_user: test
  gather_facts: no
  vars:
    srcFolder: "/home/test"
    destFolder: "/opt/config"
  tasks:
    - block:
        - name: Copy file to node
          copy:
            src: '{{ srcFolder }}/self.config'
            dest: '{{ destFolder }}/self.config'
Is there a way to fail the task if any of the hosts are not reachable? I am using Ansible 2.6.1. Thank you in advance.
A brute-force solution is any_errors_fatal:
- name: Copy file
  hosts: all
  any_errors_fatal: true
An overview of other options is in Abort execution of remaining task if certain condition is failed.
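With any_errors_fatal set, any failed or unreachable host aborts the whole play for all hosts instead of being silently skipped. Applied to the copy play from the question, it might look like this (a sketch using the question's own variables and paths):

```yaml
- name: Copy file
  hosts: all
  remote_user: test
  gather_facts: no
  any_errors_fatal: true   # abort the entire play on the first host error
  vars:
    srcFolder: "/home/test"
    destFolder: "/opt/config"
  tasks:
    - name: Copy file to node
      copy:
        src: '{{ srcFolder }}/self.config'
        dest: '{{ destFolder }}/self.config'
```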
Related
I have a playbook that runs multiple tasks on localhost, like the one below. In one task, I need to store the result in a file on a remote server and use it in the next task as a condition based on the content of the file.
What is the best way to do this, and how do we define credentials for this server?
- hosts: localhost
  tasks:
    - name: run task1
      debug: msg="running task on localhost"

    - name: run task 2
      debug: msg="running all others also localhost"
      register: output

    - name: store output in remote storage server
      debug: msg="Copy the content of register output to a file in remote server"
      delegate_to: "remote.storageserver.com"
Make two plays, and at the end of the first play set the variables you need for the tasks on the remote host. Set hosts to only localhost for the first play, and to the remote host for the second play. Include whatever remote tasks you need in the second play. You can also define different credentials this way.
Ex:
- name: Local play
  hosts: localhost
  tasks:
    - name: run task1
      debug: msg="running task on localhost"

    - name: run task 2
      debug: msg="running all others also localhost"
      register: output

- name: Remote play
  hosts: remote.storageserver.com
  tasks:
    - name: store output in remote storage server
      debug: msg="Copy the content of register output to a file in remote server"
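One detail worth noting: a variable registered in the first play is stored per host, so in the second play it has to be read through hostvars. A minimal sketch of the second play's task (the destination path is an assumption for illustration):

```yaml
- name: Remote play
  hosts: remote.storageserver.com
  tasks:
    - name: store output in remote storage server
      copy:
        # read the variable registered on localhost in the first play
        content: "{{ hostvars['localhost']['output'] }}"
        dest: /tmp/task2_output.txt   # hypothetical path
```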
I have a playbook which will be run on many servers (say, ten). The first three tasks are run on the remote servers. The last task, merging, is done on localhost (the Ansible controller).
When I run this playbook, the merging happens each time (i.e. ten times).
I want the merging task to run only once, after all the preceding tasks have completed on all servers.
---
- name: Find the location
  debug:

- name: Extract details
  debug:

- name: Create csv file
  debug:

- name: Merge files
  debug:
  delegate_to: localhost
Use run_once to achieve this:
- hosts: all
  tasks:
    - name: do this on every host
      debug:

    - name: do this once on localhost
      debug:
      delegate_to: localhost
      run_once: true
Create a block containing 'Find', 'Extract', 'Create' that runs on the remote servers, and another block containing 'Merge' that runs only on localhost.
The preferred way is to create a role for the first block and another role for the second, and use them in the playbook:
- hosts: all
  roles:
    - find_extract_create

- hosts: localhost
  roles:
    - merge
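The two roles referenced above are just directories under roles/ next to the playbook; a minimal layout (names as in the playbook) might be:

```
roles/
  find_extract_create/
    tasks/
      main.yml    # Find, Extract, Create tasks
  merge/
    tasks/
      main.yml    # Merge task
```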
I want to restrict an Ansible play to a specific host
Here's a cut down version of what I want:
- hosts: some_host_group
  tasks:
    - name: Remove existing server files
      hosts: 127.0.0.1
      file:
        dest: /tmp/test_file
        state: present

    - name: DO some other stuff
      file:
        ...
I want to (as an early task) remove a local directory (I've created a file in the example as it's a more easily observed test). I was under the impression that I could limit a play to a set of hosts with the "hosts" parameter to the task, but I get this error:
ERROR! 'hosts' is not a valid attribute for a Task
$ ansible --version
ansible 2.3.1.0
Thanks.
PS I could wrap the ansible in a shell fragment, but that's ugly.
You should use delegate_to or local_action and tell Ansible to run the task only once (otherwise it will try to delete the directory as many times as there are target hosts in your play, although that won't be a problem).
You should also use absent, not present, if you want to remove the directory, as you stated.
- name: Remove existing server files
  delegate_to: 127.0.0.1
  run_once: true
  file:
    dest: /tmp/test_file
    state: absent
There are syntax errors in your playbook; have a look at Ansible Intro, Local Playbooks and Delegation.
- hosts: localhost
  tasks:
    - name: Remove existing server files
      file:
        dest: /tmp/test_file
        state: present

- hosts: some_host_group
  tasks:
    - name: DO some other stuff
      file:
        ...
For example, deploy.yml is an Ansible playbook. There are two plays in deploy.yml, play1 and play2.
$ cat deploy.yml
- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play1
      copy: src=test1 dest=/root

- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play2
      copy: src=test2 dest=/root
$ cat hosts
[nodes]
192.168.1.12
192.168.1.13
Running
ansible-playbook -i hosts deploy.yml
When play1 failed on 192.168.1.12 but succeeded on 192.168.1.13, deploy.retry listed only 192.168.1.12 and not 192.168.1.13.
$ cat deploy.retry
192.168.1.12
Then I ran
ansible-playbook -i hosts deploy.yml --limit @deploy.retry
and got the wrong result: play2 had not been run on 192.168.1.13! Does anyone know how to solve this problem?
The problem is in the playbook file: in fact, you have two independent plays in one file. I tested your setup with Ansible 2.2.1.0 and the second play ran correctly for the host without an error in play1, but there can be differences in configuration.
The correct playbook format for the expected behavior is:
- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play1
      copy: src=test1 dest=/root

    - name: play2
      copy: src=test2 dest=/root
Playbook below. I'm trying to replace test@ip: with a way to pull the IP of a group I've created from my inventory file.
- hosts: firewall
  gather_facts: no
  tasks:
    - name: run shell script
      raw: 'sh /home/test/firewall.sh'

- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@ip:/home/test/test.test /Users/dest/Desktop'
You need to change your task like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@{{ item }}:/home/test/test.test /Users/dest/Desktop'
      with_items: "{{ groups['your_group_name'] }}"
If you want to run it on all the hosts in the inventory, you can do it like this:
with_items: "{{ groups['all'] }}"
Hope that will help you.
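On recent Ansible versions, with_items has been superseded by the loop keyword; an equivalent sketch of the same task in the newer style (group name as in the answer) would be:

```yaml
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      # iterate over every host in the group taken from the inventory
      shell: 'scp test@{{ item }}:/home/test/test.test /Users/dest/Desktop'
      loop: "{{ groups['your_group_name'] }}"
```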