I am trying to use Ansible to telnet into Cisco switches and apply a copy startup-config disk0 command.
Ansible never seems to be able to send the (?i)"Destination filename": "work please" response through the expect module.
---
- hosts: all
  gather_facts: false
  connection: local
  tasks:
    - name: telnet,login and execute command
      ignore_errors: true
      expect:
        command: telnet "{{ inventory_hostname }}"
        responses:
          (?i)password: "{{ password}}"
          (?i)#: copy startup-config disk0
          (?i)"Destination filename": "{{ lookup('pipe','date') }"
        echo: yes
      register: telnet_output
What I am getting as output:
ansible-playbook 2.7.6
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, Oct 30 2018, 23:45:53) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)]
Using /etc/ansible/ansible.cfg as config file
/var/lib/awx/projects/6500/hosts did not meet host_list requirements, check plugin documentation if this is unexpected
/var/lib/awx/projects/6500/hosts did not meet script requirements, check plugin documentation if this is unexpected
PLAYBOOK: copy-startup.yml *************************************************************************************************************************************************************************************************************
1 plays in copy-startup.yml
PLAY [all] *****************************************************************************************************************************************************************************************************************************
META: ran handlers
TASK [telnet,login and execute command] ************************************************************************************************************************************************************************************************
task path: /var/lib/awx/projects/6500/copy-startup.yml:6
fatal: [66.90.19.18]: FAILED! => {"changed": true, "cmd": "telnet \"66.90.19.18\"", "delta": "0:00:30.370396", "end": "2019-02-12 10:09:41.473716", "msg": "command exceeded timeout", "rc": null, "start": "2019-02-12 10:09:11.103320", "stdout": "Trying 66.90.19.18...\r\r\nConnected to 66.90.19.18.\r\r\nEscape character is '^]'.\r\r\n\r\n\r\nUser Access Verification\r\n\r\nPassword: \r\nLAB-6500-SUP2T#copy startup-config disk0\r\nDestination filename [disk0]? ", "stdout_lines": ["Trying 66.90.19.18...", "", "Connected to 66.90.19.18.", "", "Escape character is '^]'.", "", "", "", "User Access Verification", "", "Password: ", "LAB-6500-SUP2T#copy startup-config disk0", "Destination filename [disk0]? "]}
...ignoring
PLAY RECAP *****************************************************************************************************************************************************************************************************************************
66.90.19.18 : ok=2 changed=1 unreachable=0 failed=0
It seems it never writes anything at the Destination filename [disk0]? prompt.
Any ideas?
(?i)"Destination filename" matches a string that literally contains the double quotes.
You need:
responses:
  '(?i)password': "{{ password }}"
  '(?i)#': copy startup-config disk0
  '(?i)Destination filename': "{{ lookup('pipe','date') }}"
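Applied to the task from the question, the quoted regex keys would look like this (a sketch of the corrected task; only the responses keys and the closing braces of the lookup change, and password is assumed to be defined in your inventory or vault as in the original):
- name: telnet,login and execute command
  ignore_errors: true
  expect:
    command: telnet "{{ inventory_hostname }}"
    responses:
      # quoting the keys keeps YAML from misreading the regexes,
      # and the literal double quotes are dropped from the pattern
      '(?i)password': "{{ password }}"
      '(?i)#': copy startup-config disk0
      '(?i)Destination filename': "{{ lookup('pipe','date') }}"
    echo: yes
  register: telnet_output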
---
- hosts: '6500'
  gather_facts: true
  connection: local
  tasks:
    - name: telnet,login and execute command
      ignore_errors: true
      expect:
        command: telnet "{{ inventory_hostname }}"
        responses:
          '(?i)Password': "{{ password }}"
          '(?i)Destination filename [disk0]?': "{{ lookup('pipe','date +%Y-%m-%d-%H-%M') }} {{ inventory_hostname }}"
          '(?i)#': copy startup-config disk0
          '(?i){{COMMAND}}': exit
        echo: yes
      register: telnet_output
This seems to be the best solution to what I need. I changed the order of operations and it was rocking.
My Ansible playbook.
- hosts: all
  vars:
    alias_name: '{{ alias }}'
    upload_file: '{{ upload }}'
    pack1: /home/ansible
  tasks:
    - name: Copy file with owner and permissions
      copy:
        src: '{{ upload_file }}'
        dest: '{{ pack1 }}'
        owner: ansible
        group: ansible
        mode: '0777'
    - name: return motd to registered var
      shell: "ls -Art | tail -n 1"
      args:
        chdir: '{{ pack1 }}'
        register: mymotd
    - name: Import SSL certificate from google.com to a given cacerts keystore
      java_cert:
        cert_path: '{{ pack1 }}/{{ mymotd.stdout }}'
        cert_alias: '{{ alias_name }}'
        keystore_path: '/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre/lib/security/cacerts'
        keystore_pass: changeit
        state: present
ERROR:
FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"_raw_params": "ls -Art | tail -n 1",
"chdir": "/home/ansible",
"register": "mymotd",
"warn": true
}
},
"msg": "Unsupported parameters for (command) module: register Supported parameters include: _raw_params, _uses_shell, argv, chdir, creates, executable, removes, stdin, stdin_add_newline, strip_empty_ends, warn"
}
PLAY RECAP ***********************************************************************************************************************************
172.16.217.129 : ok=2 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Is there any other way you can recommend to do the same? My main aim is to pass command output to a variable and use that variable later to build the cert path.
You need to correct the indentation of the "return motd to registered var" task as shown below; register belongs at the task level, not under args. Also use the shell module instead of command to be able to use operations like "<", ">", "|", ";" and "&".
Please format ansible playbook as per correct YAML syntax.
- name: return motd to registered var
  shell: "ls -Art | tail -n 1"
  args:
    chdir: '{{ pack1 }}'
  register: mymotd
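With register at the task level, later tasks can reference mymotd.stdout, e.g. in the cert path exactly as the original playbook intended (a sketch reusing the question's own variables):
- name: Import SSL certificate to a given cacerts keystore
  java_cert:
    cert_path: '{{ pack1 }}/{{ mymotd.stdout }}'  # newest file found by the shell task above
    cert_alias: '{{ alias_name }}'
    keystore_path: '/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.181-7.b13.el7.x86_64/jre/lib/security/cacerts'
    keystore_pass: changeit
    state: present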
I am having difficulty figuring out how to set a fact in my Ansible playbook that contains the IP address of a server listed in the /etc/hosts file on my controller. I am running a playbook against my web server, which needs the IP address of my file server. I run the command like this:
ansible-playbook deploy-webservers.yml -i inventory.ini -l webservers
My inventory file looks like this:
[fileservers]
prod-fs1.example.com
[webservers]
prod-web1.example.com
[localhost]
127.0.0.1 ansible_connection=local ansible_python_interpreter=/Users/jsmith/.virtualenvs/provision/bin/python
Here is the playbook:
---
- hosts: all
  gather_facts: yes
  become: yes
  pre_tasks:
    - name: get file server's IP address
      command: "grep prod-fs1 /etc/hosts | awk '{ print $1 }'"
      register: fs_ip_addr
      delegate_to: localhost
    - debug: var={{ fs_ip_addr }}
When I run it, I get this error:
TASK [get file server's IP address] ****************************************************************************************
fatal: [prod-web1.example.com -> localhost]: FAILED! => {"changed": true, "cmd": ["grep", "prod-fs1", "/etc/hosts", "|", "awk", "{ print $0 }"], "delta": "0:00:00.010303", "end": "2020-03-03 12:24:36.207656",
"msg": "non-zero return code", "rc": 2, "start": "2020-03-03 12:24:36.197353", "stderr": "grep: |: No such file or directory\ngrep: awk: No such file or directory\ngrep: { print $0 }:
No such file or directory", "stderr_lines": ["grep: |: No such file or directory", "grep: awk: No such file or directory", "grep: { print $0 }: No such file or directory"], "stdout": "/etc/hosts:45.79.99.99 prod-fs1.example.com prod-fs1", "stdout_lines": ["/etc/hosts:45.79.99.99 prod-fs1.example.com prod-fs1"]}
PLAY RECAP ****************************************************************************************************************
prod-web1.example.com : ok=7 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0;
It looks like Ansible has a problem parsing the command when it reaches the pipe symbol. Is there a way around this problem?
Try this. There is no need to delegate to localhost; lookup always runs on the controller.
- set_fact:
    fs_ip_addr: "{{ (lookup('file', '/etc/hosts').splitlines() |
                     list |
                     select('search', search_host) |
                     list |
                     first).split().0 }}"
  vars:
    search_host: "prod-fs1"
The code can be simplified a bit. Note that search_host is a var.
- set_fact:
    fs_ip_addr: "{{ lookup('file', '/etc/hosts').splitlines() |
                    select('search', search_host) |
                    first | split() | first }}"
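Alternatively, if you prefer the original grep pipeline, swapping the command module for shell also works, because shell hands the line to /bin/sh, which understands the pipe (a sketch of that variant of the question's task):
- name: get file server's IP address
  # shell (unlike command) runs through a real shell, so | and awk behave as expected
  shell: "grep prod-fs1 /etc/hosts | awk '{ print $1 }'"
  register: fs_ip_addr
  delegate_to: localhost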
I want to set one value as an environment variable in Ansible and then use it in another playbook.
Below is my playbook:
get-cwd.yaml
[root@a6296ab33a34 test_code]# vi get-cwd.yaml
- hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    #- name: Get directory
    #  shell: export ACWD="{{ playbook_dir }}"
    #  when: platform == 'jenkins'
    - name: Get CWD
      shell: "export ACWD=/test_code_demo"
      when: platform != 'jenkins'
    - name: DEMO
      shell: echo $ACWD
Output
[root@a6296ab33a34 test_code]# vi get-cwd.yaml
[root@a6296ab33a34 test_code]# ansible-playbook get-cwd.yaml --extra-vars="@deploy-vars.yaml" -vv
[WARNING] Ansible is being run in a world writable directory (/test_code), ignoring it as an ansible.cfg source. For more information see https://docs.ansible.com/ansible/devel/reference_appendices/config.html#cfg-in-world-writable-dir
ansible-playbook 2.8.4
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, Jun 20 2019, 20:27:34) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
PLAYBOOK: get-cwd.yaml *********************************************************************************************************************************************************************************************************************
1 plays in get-cwd.yaml
PLAY [localhost] ***************************************************************************************************************************************************************************************************************************
META: ran handlers
TASK [Get CWD] *****************************************************************************************************************************************************************************************************************************
task path: /test_code/get-cwd.yaml:11
changed: [localhost] => {"changed": true, "cmd": "export ACWD=/test_code_demo", "delta": "0:00:00.713703", "end": "2019-12-13 14:43:37.054390", "rc": 0, "start": "2019-12-13 14:43:36.340687", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [DEMO] ********************************************************************************************************************************************************************************************************************************
task path: /test_code/get-cwd.yaml:15
changed: [localhost] => {"changed": true, "cmd": "echo $ACWD", "delta": "0:00:00.705605", "end": "2019-12-13 14:43:37.919962", "rc": 0, "start": "2019-12-13 14:43:37.214357", "stderr": "", "stderr_lines": [], "stdout": "/test_code_dinesh", "stdout_lines": ["/test_code_dinesh"]}
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************************************************************************************************************************************************************************
localhost : ok=2 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
[root@a6296ab33a34 test_code]#
You can see that although I tried to set the value to /test_code_demo, the old value /test_code_dinesh is still showing.
Please let me know a way to resolve this issue.
Remember that when you set an environment variable (anywhere, not just in Ansible), it only affects the current process and its children.
When you run something like this:
- name: Get CWD
  shell: "export ACWD=/test_code_demo"
  when: platform != 'jenkins'
You are:
1. Spawning a shell
2. Setting the environment variable ACWD in that shell
3. Exiting the shell
At this point, the environment is destroyed. There's no way to set an environment variable in one task and have it affect another task. You can set per-task environment variables using the environment key on your task, like this:
- name: DEMO
  shell: echo $ACWD
  environment:
    ACWD: '/test_code_demo'
If you need to apply the environment setting to multiple tasks, you can set it on a play instead:
- hosts: localhost
  environment:
    ACWD: '/test_code_demo'
  tasks:
    - command: 'echo $ACWD'
      register: output1
    - command: 'echo $ACWD'
      register: output2
    - debug:
        msg:
          - "{{ output1.stdout }}"
          - "{{ output2.stdout }}"
I'm facing some (silly) problems with ansible-playbook. I don't understand the error: it is reported as a syntax error, but another, similar playbook runs successfully and this one does not.
ERROR! 'copy' is not a valid attribute for a Play
The error appears to have been in '/home/manu/monitorizacion/node-exporter/playbooks/deploy-node-exporter.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
---
- name: Upload project directory to '{{ docker_home }}'
^ here
We could be wrong, but this one looks like it might be an issue with
missing quotes. Always quote template expression brackets when they
start a value. For instance:
with_items:
- {{ foo }}
Should be written as:
with_items:
- "{{ foo }}"
The thing is that I have a playbook (install.yml) with 2 include_tasks:
---
- hosts: wargame
  vars:
    project_home: ../../node-exporter/
    scripts_dir: /home/docker/node-exporter/textfile-collector/
    docker_home: /home/docker/node-exporter/
  tasks:
    - include_tasks: packages.yml
    - include_tasks: deploy-node-exporter.yml
packages.yml is fine and is executed without problems:
---
- name: Instalar paquetes requeridos para Docker e instalación
  apt:
    update_cache: true
    name: "{{ packages }}"
  vars:
    packages:
      - lsb-core
      - apt-transport-https
      - ca-certificates
      - curl
      - gnupg2
      - python-setuptools
      - python-pip
      - git
      - smartmontools
- name: Instalar clave Docker
  apt_key:
    url: https://download.docker.com/linux/debian/gpg
- name: Instalar repositorio Docker
  apt_repository:
    repo: "deb [arch=amd64] https://download.docker.com/linux/debian {{ ansible_lsb.codename }} stable"
- name: Actualizar listado de paquetes
  apt:
    update_cache: yes
- name: Instalar Docker
  apt:
    name: docker-ce
- name: Instalar Docker-compose
  pip:
    name: docker-compose
- name: Grupo docker
  group:
    name: docker
    state: present
- name: Usuario docker
  user:
    name: docker
    group: docker
As you can see, the last executed task is "Usuario docker"
ok: [188.166.52.222] => {"append": false, "changed": false, "comment": "", "group": 998, "home": "/home/docker", "move_home": false, "name": "docker", "shell": "", "state": "present", "uid": 1002}
TASK [include_tasks] *****************************************************************************************************************************
fatal: [188.166.52.222]: FAILED! => {"reason": "no action detected in task. This often indicates a misspelled module name, or incorrect module path.\n\nThe error appears to have been in '/home/manu/monitorizacion/node-exporter/playbooks/deploy-node-exporter.yml': line 24, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n mode: '0755'\n- name: Deploy node-exporter\n ^ here\n"}
And now the error is in deploy-node-exporter.yml, but the syntax is the same as in install.yml:
---
- name: Upload project directory to '{{ docker_home }}'
  copy:
    src: "{{ project_home }}"
    dest: "{{ docker_home }}"
    directory_mode: yes
    follow: yes
    owner: docker
    group: docker
- name: Fix perms for dirs
  file:
    path: "{{ item }}"
    state: touch
    mode: '0755'
  with_items:
    - "{{ docker_home }}"
    - "{{ docker_home }}/textfile-collector"
    - "{{ docker_home }}/playbooks"
- name: Give exec perms to smartmon.sh
  file:
    path: "{{ scripts_dir }}/smartmon.sh"
    state: touch
    mode: '0755'
- name: Deploy node-exporter
  docker_compose:
    project_src: "{{ docker_home }}"
    build: no
    state: present
    recreate: always
  register: output
- name: Creates cron entry for smartmon.sh
  cron:
    name: Cron job for collecting smartctl stats
    minute: "*/15"
    user: root
    job: "{{ scripts_dir }}/smartmon.sh > {{ scripts_dir }}/smartmon.prom"
Either I'm blind and not able to find the syntax error, or it is an Ansible problem.
My currently installed version:
ansible 2.7.7
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.7.3 (default, Apr 3 2019, 05:39:12) [GCC 8.3.0]
--- update ---
I've put everything in a single file:
---
- hosts: myServer
  vars:
    project_home: ../../node-exporter/
    scripts_dir: /home/docker/node-exporter/textfile-collector/
    docker_home: /home/docker/node-exporter/
  tasks:
    - name: Instalar paquetes requeridos para Docker e instalación
      apt:
        update_cache: true
        name: "{{ packages }}"
      vars:
        packages:
          - lsb-core
          - apt-transport-https
          - ca-certificates
          - curl
          - gnupg2
          - python-setuptools
          - python-pip
          - git
          - smartmontools
    - name: Instalar clave Docker
      apt_key:
        url: https://download.docker.com/linux/debian/gpg
    - name: Instalar repositorio Docker
      apt_repository:
        repo: "deb [arch=amd64] https://download.docker.com/linux/debian {{ ansible_lsb.codename }} stable"
    - name: Actualizar listado de paquetes
      apt:
        update_cache: yes
    - name: Instalar Docker
      apt:
        name: docker-ce
    - name: Instalar Docker-compose
      pip:
        name: docker-compose
    - name: Grupo docker
      group:
        name: docker
        state: present
    - name: Usuario docker
      user:
        name: docker
        group: docker
    - name: Upload project directory to '{{ docker_home }}'
      copy:
        src: "../../node-exporter/"
        dest: "/home/docker/node-exporter/"
        directory_mode: yes
        follow: yes
        owner: docker
        group: docker
    - name: Fix perms for dirs
      file:
        path: "{{ item }}"
        state: touch
        mode: "0755"
      with_items:
        - "/home/docker/node-exporter/"
        - "/home/docker/node-exporter/textfile-collector"
        - "/home/docker/node-exporter/playbooks"
    - name: Give exec perms to smartmon.sh
      file:
        path: "{{ scripts_dir }}/smartmon.sh"
        state: touch
        mode: "0755"
    - name: Creating cron entry for smartmon.sh
      cron:
        name: Cron job for collecting smartctl stats
        minute: "*/15"
        user: root
        job: "{{ scripts_dir }}/smartmon.sh > {{ scripts_dir }}/smartmon.prom"
In this file I've removed
- name: Deploying node-exporter
  docker_compose:
    project_src: "{{ docker_home }}"
    build: no
    state: present
    recreate: always
  register: output
And now it works (without this part of the code):
┌─[root@hippi3c0w] - [~manu/monitorizacion/node-exporter/playbooks] - [sáb ago 31, 20:59]
└─[$] <git:(master*)> ansible-playbook install_new.yml --syntax-check
playbook: install_new.yml
But the problem comes back when I add this part to the YAML file, so the problem is without doubt the following part:
- name: Deploying node-exporter
  docker_compose:
    project_src: "{{ docker_home }}"
    build: no
    state: present
    recreate: always
  register: output
Do you know what could be the problem? Maybe docker_compose?
--- UPDATE ---
It was due to the Ansible version: 2.7 doesn't support the docker_compose module. Updated to 2.8.4 and now it works properly.
PLAY RECAP ***************************************************************************************************************************************
[MyIP] : ok=16 changed=5 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
┌─[root@hippi3c0w] - [~manu/monitorizacion/node-exporter/playbooks] - [dom sep 01, 08:34]
└─[$] <git:(master*)> ansible --version; ansible-playbook install.yml --syntax-check
ansible 2.8.4
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.16+ (default, Jul 8 2019, 09:45:29) [GCC 8.3.0]
playbook: install.yml
Would you please be more specific about which error you want to solve?
This error?
- name: Upload project directory to '{{ docker_home }}'
^ here
We could be wrong, but this one looks like it might be an issue with
missing quotes. Always quote template expression brackets when they
start a value. For instance:
with_items:
- {{ foo }}
Should be written as:
with_items:
- "{{ foo }}"
Or That one?
ok: [188.166.52.222] => {"append": false, "changed": false, "comment": "", "group":
998, "home": "/home/docker", "move_home": false, "name": "docker", "shell": "",
"state": "present", "uid": 1002}
TASK [include_tasks]
fatal: [188.166.52.222]: FAILED! => {"reason": "no action detected in task. This
often indicates a misspelled module name, or incorrect module path.\n\nThe error
appears to have been in '/home/manu/monitorizacion/node-exporter/playbooks/deploy-
node-exporter.yml': line 24, column 3, but may\nbe elsewhere in the file depending
on the exact syntax problem.\n\nThe offending line appears to be:\n\n mode: '
0755'\n- name: Deploy node-exporter\n ^ here\n"}
JFYI: copy is possible in my Ansible version:
tasks:
  - name: Test copy
    copy:
      src: "/home/manu/test"
      dest: "/home/user"
      follow: yes
      owner: user
      group: user
      mode: 0755
└─[$] <git:(master*)> ansible-playbook test.yaml --syntax-check
playbook: test.yaml
┌─[root@hippi3c0w] - [~manu/monitorizacion/node-exporter/playbooks] - [sáb ago 31, 20:33]
└─[$] <git:(master*)> ansible-playbook test.yaml -u user -v
Using /etc/ansible/ansible.cfg as config file
/etc/ansible/hosts did not meet host_list requirements, check plugin documentation if this is unexpected
/etc/ansible/hosts did not meet script requirements, check plugin documentation if this is unexpected
PLAY [myserver] ***********************************************************************************************************************************
TASK [Gathering Facts] ***************************************************************************************************************************
ok: [myIP]
TASK [Test copy] *********************************************************************************************************************************
changed: [myIP] => {"changed": true, "checksum": "55ca6286e3e4f4fba5d0448333fa99fc5a404a73", "dest": "/home/user/test", "gid": 1001, "group": "user", "mode": "0755", "owner": "user", "path": "/home/user/test", "size": 3, "state": "file", "uid": 1001}
PLAY RECAP ***************************************************************************************************************************************
myIP : ok=2 changed=1 unreachable=0 failed=0
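Since the root cause was the controller's Ansible version (the docker_compose module only exists from 2.8 onwards), a guard task can fail fast with a clear message on older versions. This is a sketch using the standard assert module and the built-in ansible_version variable:
- name: Ensure Ansible is new enough for docker_compose
  assert:
    that:
      # the version test compares version strings component by component
      - ansible_version.full is version('2.8', '>=')
    fail_msg: "docker_compose requires Ansible >= 2.8 (found {{ ansible_version.full }})"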
I have an Ansible playbook, where I would like a variable I register in a first play targeted on one node to be available in a second play, targeted on another node.
Here is the playbook I am using:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: main
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ foo.stdout }}"
But, when I try to access the variable in the second play, targeted on main, I get this message:
The task includes an option with an undefined variable. The error was: 'foo' is undefined
How can I access foo, registered on localhost, from main?
The problem you're running into is that you're trying to reference facts/variables of one host from those of another host.
You need to keep in mind that in Ansible, the variable foo assigned to the host localhost is distinct from the variable foo assigned to the host main or any other host.
If you want to access one hosts facts/variables from another host then you need to explicitly reference it via the hostvars variable. There's a bit more of a discussion on this in this question.
Suppose you have a playbook like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        var: foo
This will work because you're referencing the host localhost and localhost's instance of the variable foo in both plays.
The output of this playbook is something like this:
PLAY [localhost] **************************************************
TASK: [command] ***************************************************
changed: [localhost]
PLAY [localhost] **************************************************
TASK: [debug] *****************************************************
ok: [localhost] => {
"var": {
"foo": {
"changed": true,
"cmd": [
"echo",
"hello world"
],
"delta": "0:00:00.004585",
"end": "2015-11-24 20:49:27.462609",
"invocation": {
"module_args": "echo \"hello world\",
"module_complex_args": {},
"module_name": "command"
},
"rc": 0,
"start": "2015-11-24 20:49:27.458024",
"stderr": "",
"stdout": "hello world",
"stdout_lines": [
"hello world"
],
"warnings": []
}
}
}
If you modify this playbook slightly to run the first play on one host and the second play on a different host, you'll get the error that you encountered.
Solution
The solution is to use Ansible's built-in hostvars variable to have the second host explicitly reference the first host's variables.
So modify the first example like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: main
  gather_facts: no
  tasks:
    - debug:
        var: foo
      when: foo is defined

    - debug:
        var: hostvars['localhost']['foo']
        ## alternatively, you can use:
        # var: hostvars.localhost.foo
      when: hostvars['localhost']['foo'] is defined
The output of this playbook shows that the first task is skipped because foo is not defined by the host main.
But the second task succeeds because it's explicitly referencing localhost's instance of the variable foo:
TASK: [debug] *************************************************
skipping: [main]
TASK: [debug] *************************************************
ok: [main] => {
"var": {
"hostvars['localhost']['foo']": {
"changed": true,
"cmd": [
"echo",
"hello world"
],
"delta": "0:00:00.005950",
"end": "2015-11-24 20:54:04.319147",
"invocation": {
"module_args": "echo \"hello world\"",
"module_complex_args": {},
"module_name": "command"
},
"rc": 0,
"start": "2015-11-24 20:54:04.313197",
"stderr": "",
"stdout": "hello world",
"stdout_lines": [
"hello world"
],
"warnings": []
}
}
}
So, in a nutshell, you want to modify the variable references in your main playbook to reference the localhost variables in this manner:
{{ hostvars['localhost']['foo'] }}
{# alternatively, you can use: #}
{{ hostvars.localhost.foo }}
Use a dummy host and its variables
For example, to pass a Kubernetes token and hash from the master to the workers.
On master
- name: "Cluster token"
shell: kubeadm token list | cut -d ' ' -f1 | sed -n '2p'
register: K8S_TOKEN
- name: "CA Hash"
shell: openssl x509 -pubkey -in /etc/kubernetes/pki/ca.crt | openssl rsa -pubin -outform der 2>/dev/null | openssl dgst -sha256 -hex | sed 's/^.* //'
register: K8S_MASTER_CA_HASH
- name: "Add K8S Token and Hash to dummy host"
add_host:
name: "K8S_TOKEN_HOLDER"
token: "{{ K8S_TOKEN.stdout }}"
hash: "{{ K8S_MASTER_CA_HASH.stdout }}"
- name:
debug:
msg: "[Master] K8S_TOKEN_HOLDER K8S token is {{ hostvars['K8S_TOKEN_HOLDER']['token'] }}"
- name:
debug:
msg: "[Master] K8S_TOKEN_HOLDER K8S Hash is {{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}"
On worker
- name:
  debug:
    msg: "[Worker] K8S_TOKEN_HOLDER K8S token is {{ hostvars['K8S_TOKEN_HOLDER']['token'] }}"

- name:
  debug:
    msg: "[Worker] K8S_TOKEN_HOLDER K8S Hash is {{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}"

- name: "Kubeadm join"
  shell: >
    kubeadm join --token={{ hostvars['K8S_TOKEN_HOLDER']['token'] }}
    --discovery-token-ca-cert-hash sha256:{{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}
    {{ K8S_MASTER_NODE_IP }}:{{ K8S_API_SERCURE_PORT }}
I have had similar issues with even the same host, but across different plays. The thing to remember is that facts, not variables, are the persistent things across plays. Here is how I get around the problem.
#!/usr/local/bin/ansible-playbook --inventory=./inventories/ec2.py
---
- name: "TearDown Infrastructure !!!!!!!"
  hosts: localhost
  gather_facts: no
  vars:
    aws_state: absent
  vars_prompt:
    - name: "aws_region"
      prompt: "Enter AWS Region:"
      default: 'eu-west-2'
  tasks:
    - name: Make vars persistant
      set_fact:
        aws_region: "{{ aws_region }}"
        aws_state: "{{ aws_state }}"

- name: "TearDown Infrastructure hosts !!!!!!!"
  hosts: monitoring.ec2
  connection: local
  gather_facts: no
  tasks:
    - name: set the facts per host
      set_fact:
        aws_region: "{{ hostvars['localhost']['aws_region'] }}"
        aws_state: "{{ hostvars['localhost']['aws_state'] }}"
    - debug:
        msg: "state {{ aws_state }} region {{ aws_region }} id {{ ec2_id }}"

- name: last few bits
  hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        msg: "state {{ aws_state }} region {{ aws_region }}"
results in
Enter AWS Region: [eu-west-2]:
PLAY [TearDown Infrastructure !!!!!!!] ***************************************************************************************************************************************************************************************************
TASK [Make vars persistant] **************************************************************************************************************************************************************************************************************
ok: [localhost]
PLAY [TearDown Infrastructure hosts !!!!!!!] *********************************************************************************************************************************************************************************************
TASK [set the facts per host] ************************************************************************************************************************************************************************************************************
ok: [XXXXXXXXXXXXXXXXX]
TASK [debug] *****************************************************************************************************************************************************************************************************************************
ok: [XXXXXXXXXXX] => {
"changed": false,
"msg": "state absent region eu-west-2 id i-0XXXXX1 "
}
PLAY [last few bits] *********************************************************************************************************************************************************************************************************************
TASK [debug] *****************************************************************************************************************************************************************************************************************************
ok: [localhost] => {
"changed": false,
"msg": "state absent region eu-west-2 "
}
PLAY RECAP *******************************************************************************************************************************************************************************************************************************
XXXXXXXXXXXXX : ok=2 changed=0 unreachable=0 failed=0
localhost : ok=2 changed=0 unreachable=0 failed=0
You can use a known Ansible behaviour: the group_vars folder, which loads variables into your playbook. It is intended to be used together with inventory groups, but it still amounts to a global variable declaration. If you put a file or folder in there with the same name as the group in which you want some variable to be present, Ansible will make sure it happens!
For example, let's create a file called all and put a timestamp variable in it. Then, whenever you need it, you can call that variable, which will be available to every host declared in any play inside your playbook.
I usually do this to update a timestamp once in the first play and use the value to write files and folders using the same timestamp.
I'm using the lineinfile module to change the line starting with timestamp. Check if it fits your purpose.
On your group_vars/all
timestamp: t26032021165953
On the playbook, in the first play:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Set timestamp on group_vars
      lineinfile:
        path: "{{ playbook_dir }}/group_vars/all"
        insertafter: EOF
        regexp: '^timestamp:'
        line: "timestamp: t{{ lookup('pipe','date +%d%m%Y%H%M%S') }}"
        state: present
On the playbook, in the second play:
- hosts: any_hosts
  gather_facts: no
  tasks:
    - name: Check if timestamp is there
      debug:
        msg: "{{ timestamp }}"