Using Ansible, I'm having a problem registering a variable the way I want. With the implementation below I always have to call .stdout on the variable - is there a way I can do better?
My playbook:
Note the unwanted use of .stdout - I just want to be able to use the variable directly, without calling a property on it.
---
- name: prepare for new deployment
  hosts: all
  user: ser85
  tasks:
    - name: init deploy dir
      shell: echo ansible-deploy-$(date +%Y%m%d-%H%M%S-%N)
      # http://docs.ansible.com/ansible/playbooks_variables.html
      register: deploy_dir

    - debug: var=deploy_dir
    - debug: var=deploy_dir.stdout

    - name: init scripts dir
      shell: echo {{ deploy_dir.stdout }}/scripts
      register: scripts_dir

    - debug: var=scripts_dir.stdout
The output when I execute the playbook:
TASK [init deploy dir] *********************************************************
changed: [123.123.123.123]
TASK [debug] *******************************************************************
ok: [123.123.123.123] => {
"deploy_dir": {
"changed": true,
"cmd": "echo ansible-deploy-$(date +%Y%m%d-%H%M%S-%N)",
"delta": "0:00:00.002898",
"end": "2016-05-27 10:53:38.122217",
"rc": 0,
"start": "2016-05-27 10:53:38.119319",
"stderr": "",
"stdout": "ansible-deploy-20160527-105338-121888719",
"stdout_lines": [
"ansible-deploy-20160527-105338-121888719"
],
"warnings": []
}
}
TASK [debug] *******************************************************************
ok: [123.123.123.123] => {
"deploy_dir.stdout": "ansible-deploy-20160527-105338-121888719"
}
TASK [init scripts dir] ********************************************************
changed: [123.123.123.123]
TASK [debug] *******************************************************************
ok: [123.123.123.123] => {
"scripts_dir.stdout": "ansible-deploy-20160527-105338-121888719/scripts"
}
Any help or insights appreciated - thank you :)
If I understood it right, you want to assign deploy_dir.stdout to a variable that you can use without the stdout key. That can be done with the set_fact module:
tasks:
  - name: init deploy dir
    shell: echo ansible-deploy-$(date +%Y%m%d-%H%M%S-%N)
    # http://docs.ansible.com/ansible/playbooks_variables.html
    register: deploy_dir

  - set_fact: my_deploy_dir="{{ deploy_dir.stdout }}"

  - debug: var=my_deploy_dir
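The fact then behaves like any normal variable in later tasks. A minimal follow-up sketch (the file task and the /tmp prefix are my own assumptions, not part of the original question):

  - name: create the deploy dir (hypothetical follow-up)
    file:
      path: "/tmp/{{ my_deploy_dir }}"   # no .stdout needed any more
      state: directory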
Related
Club multiple variables and get one output in Ansible
I will be using the combined variable in many other parts of the code. Is there a way to do it?
- name:
  hosts: dummy
  gather_facts: True
  tasks:
    - block:
        # Capture package-Infra agent version
        # v1
        - name: Get package-Integration agent version ostype1.4
          shell: command1
          register: NRAV1
          when: ansible_distribution == "ostype" and ansible_distribution_version == "v1"

        - name: Debug
          debug:
            msg: "{{ NRAV1 }}"

        # v2
        - name: Get package-Integration agent version ostype2.2
          shell: command2
          register: NRAV2
          when: ansible_distribution == "ostype" and ansible_distribution_version == "v2"

        - name: Debug
          debug:
            msg: "{{ NRAV2 }}"

        # v3
        - name: Get package-Integration agent version ostype2.3
          shell: command3
          register: NRAV3
          when: ansible_distribution == "ostype" and ansible_distribution_version == "v3"

        - name: Debug
          debug:
            msg: "{{ NRAV3 }}"

        - name: Result
          debug:
            msg: "{{ NRAV1.stdout | NRAV2.stdout | NRAV3.stdout }}"
#Output:
TASK [setup] *******************************************************************
ok: [dummy]
TASK [Get package-Integration agent version ostype1.4] *************************
skipping: [dummy]
TASK [Debug] *******************************************************************
ok: [dummy] => {
"msg": {
"changed": false,
"skip_reason": "Conditional check failed",
"skipped": true
}
}
TASK [Get package-Integration agent version ostype2.2] *************************
changed: [dummy]
TASK [Debug] *******************************************************************
ok: [dummy] => {
"msg": {
"changed": true,
"cmd": "rpm -qa --last | grep package-infra | awk '{print $1}' | cut -d'-' -f 3,4 | cut -d'.' -f 1,2,3,4,5",
"delta": "0:00:00.524132",
"end": "2021-03-07 12:06:33.150778",
"rc": 0,
"start": "2021-03-07 12:06:32.626646",
"stderr": "",
"stdout": "1.15.1-1.ostype2.2",
"stdout_lines": [
"1.15.1-1.ostype2.2"
],
"warnings": [
"Consider using yum module rather than running rpm"
]
}
}
TASK [Get package-Integration agent version ostype2.3] *************************
skipping: [dummy]
TASK [Debug] *******************************************************************
ok: [dummy] => {
"msg": {
"changed": false,
"skip_reason": "Conditional check failed",
"skipped": true
}
}
TASK [Result] *******************************************************************
fatal: [dummy]: FAILED! => {"failed": true, "msg": "ERROR! template error while templating string: no filter named 'NRAV3.stdout'"}
PLAY RECAP *********************************************************************
dummy : ok=5 changed=1 unreachable=0 failed=1
My expected result would be NRAV2.stdout, i.e. "1.15.1-1.ostype2.2".
Ansible is unable to parse the line below, because inside a Jinja2 expression the | character is the filter operator, so the names after it are treated as filter names:
msg: "{{ NRAV1.stdout | NRAV2.stdout | NRAV3.stdout }}"
You need to replace it with something like this:
msg: "{{ NRAV1.stdout }} | {{ NRAV2.stdout }} | {{ NRAV3.stdout }}"
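Note that NRAV1 and NRAV3 were skipped on this host, so they have no stdout attribute at all. A hedged alternative using the default filter (my suggestion, not part of the original answer) prints whichever value exists without failing:

- name: Result
  debug:
    msg: "{{ NRAV1.stdout | default('') }}{{ NRAV2.stdout | default('') }}{{ NRAV3.stdout | default('') }}"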
Below is my simple playbook
- name: "test"
  hosts: webservers
  tasks:
    - name: Echo my_env_var
      shell: "echo $MY_ENV_VARIABLE"
      environment:
        MY_ENV_VARIABLE: whatever_value

    - name: Echo my_env_var again
      shell: "echo $MY_ENV_VARIABLE"
      register: stdd

    - debug: msg={{stdd.stdout_lines}}
My output is always msg: "" or msg: []. Why am I not able to see the value of the variable?
I took your example and changed it from debug msg to debug var. I also simplified it by only running the task once, and found the error in the process. The environment argument is specific to a task. You aren't including it in your second shell task.
Here's the example I used.
echo.yml
- hosts: localhost
  tasks:
    - name: Echo my_env_var
      shell: "echo $MY_ENV_VARIABLE"
      environment:
        MY_ENV_VARIABLE: whatever_value
      register: stdd

    - debug: var=stdd
execution
$ ansible-playbook -c local -i "localhost," echo.yml
PLAY [localhost] **************************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Echo my_env_var] *******************************************************
changed: [localhost]
TASK: [debug var=stdd] ********************************************************
ok: [localhost] => {
"var": {
"stdd": {
"changed": true,
"cmd": "echo $MY_ENV_VARIABLE",
"delta": "0:00:00.005332",
"end": "2016-07-25 19:42:54.320667",
"invocation": {
"module_args": "echo $MY_ENV_VARIABLE",
"module_complex_args": {},
"module_name": "shell"
},
"rc": 0,
"start": "2016-07-25 19:42:54.315335",
"stderr": "",
"stdout": "whatever_value",
"stdout_lines": [
"whatever_value"
],
"warnings": []
}
}
}
PLAY RECAP ********************************************************************
localhost : ok=3 changed=1 unreachable=0 failed=0
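If you want the variable visible to both of your original shell tasks, another option is to set environment at the play level rather than per task. A sketch of that variant (assuming your original webservers group):

- name: "test"
  hosts: webservers
  environment:
    MY_ENV_VARIABLE: whatever_value
  tasks:
    - name: Echo my_env_var
      shell: "echo $MY_ENV_VARIABLE"

    - name: Echo my_env_var again
      shell: "echo $MY_ENV_VARIABLE"
      register: stdd

    - debug: var=stdd.stdout_lines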
I am trying (newbie) to set up a playbook which will use a lookup plugin to fetch secrets from Vault (https://github.com/jhaals/ansible-vault), but it fails on missing environment variables every time. Can anyone help? Thanks for the help.
PS: the token is for test purposes only.
There is a condition in the lookup module:
url = os.getenv('VAULT_ADDR')
if not url:
    raise AnsibleError('VAULT_ADDR environment variable is missing')
Playbook:
---
- hosts: localhost
vars:
vault1_env:
VAULT_ADDR: https://localhost:8200/
VAULT_TOKEN: my-token-id
VAULT_SKIP_VERIFY: True
tasks:
- shell: echo VAULT_ADDR is $VAULT_ADDR, VAULT_TOKEN is $VAULT_TOKEN, VAULT_SKIP_VERIFY is $VAULT_SKIP_VERIFY
environment: "{{ vault1_env }}"
register: shellout
- debug: var=shellout
- debug: msg="{{ lookup('vault', 'secret/hello', 'value') }}"
output:
PLAY ***************************************************************************
TASK [setup] *******************************************************************
ok: [localhost]
TASK [command] *****************************************************************
changed: [localhost]
TASK [debug] *******************************************************************
ok: [localhost] => {
"shellout": {
"changed": true,
"cmd": "echo VAULT_ADDR is $VAULT_ADDR, VAULT_TOKEN is $VAULT_TOKEN, VAULT_SKIP_VERIFY is $VAULT_SKIP_VERIFY",
"delta": "0:00:00.001268",
"end": "2016-05-17 15:46:34.144735",
"rc": 0,
"start": "2016-05-17 15:46:34.143467",
"stderr": "",
"stdout": "VAULT_ADDR is https://localhost:8200/, VAULT_TOKEN is ab9b16c6-52d9-2051-0802-6f047d929b63, VAULT_SKIP_VERIFY is True",
"stdout_lines": [
"VAULT_ADDR is https://localhost:8200/, VAULT_TOKEN is ab9b16c6-52d9-2051-0802-6f047d929b63, VAULT_SKIP_VERIFY is True"
],
"warnings": []
}
}
TASK [debug] *******************************************************************
fatal: [localhost]: FAILED! => {"failed": true, "msg": "ERROR! VAULT_ADDR environment variable is missing"}
PLAY RECAP *********************************************************************
localhost : ok=3 changed=1 unreachable=0 failed=1
Here you are only setting environment variables for the shell module, and not for the others. If you want to use the variables across multiple modules, or for an entire play, you should set the environment attribute on all of the modules, or on the play itself, something like this:
---
- hosts: localhost
  environment:
    VAULT_ADDR: https://localhost:8200/
    VAULT_TOKEN: my-token-id
    VAULT_SKIP_VERIFY: True
Why don't you make use of the Ansible Vault feature to encrypt a variable file and then include that file in your playbook?
http://docs.ansible.com/ansible/playbooks_vault.html#running-a-playbook-with-vault
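A minimal sketch of that approach (the file and variable names are hypothetical): encrypt a vars file with ansible-vault, include it via vars_files, and supply the vault password at run time.

# encrypt the file first:   ansible-vault encrypt vars/secrets.yml
# run the playbook with:    ansible-playbook site.yml --ask-vault-pass
---
- hosts: localhost
  vars_files:
    - vars/secrets.yml          # hypothetical encrypted variables file
  tasks:
    - debug:
        msg: "{{ my_secret }}"  # hypothetical variable defined in secrets.yml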
I am trying to use Ansible to check whether there are any Aerospike migrations happening, and then run a task when migrations reach 0. To do that, I am using the shell module to output the total number of migrations to stdout, register that output with Ansible, and have Ansible test against it.
Ansible seems to be recording the output correctly, but it constantly displays the stdout as "Hello world!".
Here is my test playbook:
---
- hosts:
    - foo
    - bar
  serial: 1
  gather_facts: no
  tasks:
    - name: check for migrates
      shell: "echo 10"
      register: as_migrates

    - debug: var=as_migrates
    - debug: msg = "{{ as_migrates.stdout }}"
    - debug: msg = "{{ as_migrates.stdout_lines }}"
    - debug: msg = "{{ as_migrates }}"
Here is the output:
PLAY [foo;bar] ******************************************************
TASK: [check for migrates] ****************************************************
changed: [foo-10]
TASK: [debug var=as_migrates] *************************************************
ok: [foo-10] => {
"var": {
"as_migrates": {
"changed": true,
"cmd": "echo 10",
"delta": "0:00:00.001367",
"end": "2016-01-26 23:19:20.586245",
"invocation": {
"module_args": "echo 10",
"module_complex_args": {},
"module_name": "shell"
},
"rc": 0,
"start": "2016-01-26 23:19:20.584878",
"stderr": "",
"stdout": "10",
"stdout_lines": [
"10"
],
"warnings": []
}
}
}
TASK: [debug msg = "{{ as_migrates.stdout }}"] ********************************
ok: [foo-10] => {
"msg": "Hello world!"
}
TASK: [debug msg = "{{ as_migrates.stdout_lines }}"] **************************
ok: [foo-10] => {
"msg": "Hello world!"
}
TASK: [debug msg = "{{ as_migrates }}"] ***************************************
ok: [foo-10] => {
"msg": "Hello world!"
}
My question is: why does the debug var clearly show the correct stdout, while as_migrates.stdout displays "Hello world!"?
I know that "Hello world!" is the default message for the debug module. So is the register not persisting from one task to another? I feel like I am missing something obvious. I don't have another variable named "as_migrates" in my Ansible environment.
Ansible is "space" sensitive :-) You cannot have a space after msg.
Try this:
- debug: msg="{{ as_migrates.stdout }}"
- debug: msg="{{ as_migrates.stdout_lines }}"
- debug: msg="{{ as_migrates }}"
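Alternatively, the YAML dictionary form of debug sidesteps the key=value parsing entirely:

- debug:
    msg: "{{ as_migrates.stdout }}"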
I have an Ansible playbook, where I would like a variable I register in a first play targeted on one node to be available in a second play, targeted on another node.
Here is the playbook I am using:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: main
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ foo.stdout }}"
But, when I try to access the variable in the second play, targeted on main, I get this message:
The task includes an option with an undefined variable. The error was: 'foo' is undefined
How can I access foo, registered on localhost, from main?
The problem you're running into is that you're trying to reference facts/variables of one host from those of another host.
You need to keep in mind that in Ansible, the variable foo assigned to the host localhost is distinct from the variable foo assigned to the host main or any other host.
If you want to access one host's facts/variables from another host, then you need to explicitly reference them via the hostvars variable. There's a bit more of a discussion on this in this question.
Suppose you have a playbook like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        var: foo
This will work because you're referencing the host localhost and localhost's instance of the variable foo in both plays.
The output of this playbook is something like this:
PLAY [localhost] **************************************************
TASK: [command] ***************************************************
changed: [localhost]
PLAY [localhost] **************************************************
TASK: [debug] *****************************************************
ok: [localhost] => {
"var": {
"foo": {
"changed": true,
"cmd": [
"echo",
"hello world"
],
"delta": "0:00:00.004585",
"end": "2015-11-24 20:49:27.462609",
"invocation": {
"module_args": "echo \"hello world\",
"module_complex_args": {},
"module_name": "command"
},
"rc": 0,
"start": "2015-11-24 20:49:27.458024",
"stderr": "",
"stdout": "hello world",
"stdout_lines": [
"hello world"
],
"warnings": []
}
}
}
If you modify this playbook slightly to run the first play on one host and the second play on a different host, you'll get the error that you encountered.
Solution
The solution is to use Ansible's built-in hostvars variable to have the second host explicitly reference the first hosts variable.
So modify the first example like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - command: echo "hello world"
      register: foo

- hosts: main
  gather_facts: no
  tasks:
    - debug:
        var: foo
      when: foo is defined

    - debug:
        var: hostvars['localhost']['foo']
        ## alternatively, you can use:
        # var: hostvars.localhost.foo
      when: hostvars['localhost']['foo'] is defined
The output of this playbook shows that the first task is skipped because foo is not defined by the host main.
But the second task succeeds because it's explicitly referencing localhost's instance of the variable foo:
TASK: [debug] *************************************************
skipping: [main]
TASK: [debug] *************************************************
ok: [main] => {
"var": {
"hostvars['localhost']['foo']": {
"changed": true,
"cmd": [
"echo",
"hello world"
],
"delta": "0:00:00.005950",
"end": "2015-11-24 20:54:04.319147",
"invocation": {
"module_args": "echo \"hello world\"",
"module_complex_args": {},
"module_name": "command"
},
"rc": 0,
"start": "2015-11-24 20:54:04.313197",
"stderr": "",
"stdout": "hello world",
"stdout_lines": [
"hello world"
],
"warnings": []
}
}
}
So, in a nutshell, you want to modify the variable references in your main playbook to reference the localhost variables in this manner:
{{ hostvars['localhost']['foo'] }}
{# alternatively, you can use: #}
{{ hostvars.localhost.foo }}
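Applied to your playbook, the second play would look roughly like this (a sketch):

- hosts: main
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ hostvars['localhost']['foo']['stdout'] }}"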
Use a dummy host and its variables
For example, to pass a Kubernetes token and hash from the master to the workers.
On master
- name: "Cluster token"
  shell: kubeadm token list | cut -d ' ' -f1 | sed -n '2p'
  register: K8S_TOKEN

- name: "CA Hash"
  shell: openssl x509 -pubkey -in /etc/kubernetes/pki/ca.crt | openssl rsa -pubin -outform der 2>/dev/null | openssl dgst -sha256 -hex | sed 's/^.* //'
  register: K8S_MASTER_CA_HASH

- name: "Add K8S Token and Hash to dummy host"
  add_host:
    name: "K8S_TOKEN_HOLDER"
    token: "{{ K8S_TOKEN.stdout }}"
    hash: "{{ K8S_MASTER_CA_HASH.stdout }}"

- name:
  debug:
    msg: "[Master] K8S_TOKEN_HOLDER K8S token is {{ hostvars['K8S_TOKEN_HOLDER']['token'] }}"

- name:
  debug:
    msg: "[Master] K8S_TOKEN_HOLDER K8S Hash is {{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}"
On worker
- name:
  debug:
    msg: "[Worker] K8S_TOKEN_HOLDER K8S token is {{ hostvars['K8S_TOKEN_HOLDER']['token'] }}"

- name:
  debug:
    msg: "[Worker] K8S_TOKEN_HOLDER K8S Hash is {{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}"

- name: "Kubeadm join"
  shell: >
    kubeadm join --token={{ hostvars['K8S_TOKEN_HOLDER']['token'] }}
    --discovery-token-ca-cert-hash sha256:{{ hostvars['K8S_TOKEN_HOLDER']['hash'] }}
    {{ K8S_MASTER_NODE_IP }}:{{ K8S_API_SERCURE_PORT }}
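The same pattern works for any value you need to hand from one play to another, since add_host puts the dummy host into the in-memory inventory for the rest of the run. A stripped-down sketch with hypothetical host and variable names:

# play 1: stash a value on an in-memory dummy host
- hosts: source_host
  gather_facts: no
  tasks:
    - command: hostname
      register: src_name

    - add_host:
        name: "VALUE_HOLDER"
        some_value: "{{ src_name.stdout }}"

# play 2: any later play can read it back via hostvars
- hosts: target_hosts
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ hostvars['VALUE_HOLDER']['some_value'] }}"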
I have had similar issues with even the same host, but across different plays. The thing to remember is that facts, not variables, are the persistent things across plays. Here is how I get around the problem.
#!/usr/local/bin/ansible-playbook --inventory=./inventories/ec2.py
---
- name: "TearDown Infrastructure !!!!!!!"
  hosts: localhost
  gather_facts: no
  vars:
    aws_state: absent
  vars_prompt:
    - name: "aws_region"
      prompt: "Enter AWS Region:"
      default: 'eu-west-2'
  tasks:
    - name: Make vars persistant
      set_fact:
        aws_region: "{{aws_region}}"
        aws_state: "{{aws_state}}"

- name: "TearDown Infrastructure hosts !!!!!!!"
  hosts: monitoring.ec2
  connection: local
  gather_facts: no
  tasks:
    - name: set the facts per host
      set_fact:
        aws_region: "{{hostvars['localhost']['aws_region']}}"
        aws_state: "{{hostvars['localhost']['aws_state']}}"

    - debug:
        msg="state {{aws_state}} region {{aws_region}} id {{ ec2_id }} "

- name: last few bits
  hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        msg="state {{aws_state}} region {{aws_region}} "
results in
Enter AWS Region: [eu-west-2]:
PLAY [TearDown Infrastructure !!!!!!!] ***************************************************************************************************************************************************************************************************
TASK [Make vars persistant] **************************************************************************************************************************************************************************************************************
ok: [localhost]
PLAY [TearDown Infrastructure hosts !!!!!!!] *********************************************************************************************************************************************************************************************
TASK [set the facts per host] ************************************************************************************************************************************************************************************************************
ok: [XXXXXXXXXXXXXXXXX]
TASK [debug] *****************************************************************************************************************************************************************************************************************************
ok: [XXXXXXXXXXX] => {
"changed": false,
"msg": "state absent region eu-west-2 id i-0XXXXX1 "
}
PLAY [last few bits] *********************************************************************************************************************************************************************************************************************
TASK [debug] *****************************************************************************************************************************************************************************************************************************
ok: [localhost] => {
"changed": false,
"msg": "state absent region eu-west-2 "
}
PLAY RECAP *******************************************************************************************************************************************************************************************************************************
XXXXXXXXXXXXX : ok=2 changed=0 unreachable=0 failed=0
localhost : ok=2 changed=0 unreachable=0 failed=0
You can use a well-known Ansible behaviour: the group_vars folder, which loads variables into your playbook. It is intended to be used together with inventory groups, but it still acts as a global variable declaration. If you put a file or folder in there with the same name as the group for which you want some variable to be present, Ansible will make sure it happens!
As an example, let's create a file called all and put a timestamp variable there. Then, whenever you need it, you can call that variable, and it will be available to every host declared in any play inside your playbook.
I usually do this to update a timestamp once in the first play and use the value to write files and folders using the same timestamp.
I'm using the lineinfile module to change the line starting with timestamp:
Check if it fits your purpose.
In your group_vars/all:
timestamp: t26032021165953
In the playbook, in the first play:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Set timestamp on group_vars
      lineinfile:
        path: "{{ playbook_dir }}/group_vars/all"
        insertafter: EOF
        regexp: '^timestamp:'
        line: "timestamp: t{{ lookup('pipe','date +%d%m%Y%H%M%S') }}"
        state: present
In the playbook, in the second play:
- hosts: any_hosts
  gather_facts: no
  tasks:
    - name: Check if timestamp is there
      debug:
        msg: "{{ timestamp }}"
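From there the shared value can be used anywhere, for example to create a folder named with the run's timestamp, as described above (a sketch; the path is hypothetical):

    - name: Create a folder named with the shared timestamp
      file:
        path: "/tmp/deploy-{{ timestamp }}"
        state: directory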