Access to stdout_lines when using Ansible - debugging

Given I have "registered" files in directory using shell task in a var sonarqube_plugins_installed, when I "debug" this using
- name: Debug
  debug:
    var: sonarqube_plugins_installed.results
I see for example
TASK [sonarqube : Debug] ************************************************************************
ok: [sonarqube] => {
"sonarqube_plugins_installed.results": [
{
"ansible_loop_var": "item",
"changed": true,
"cmd": "ls /opt/sonarqube/sonarqube-6.7/extensions/plugins/sonar-build-breaker-plugin-*.jar",
"delta": "0:00:00.003748",
"end": "2019-09-18 04:04:54.355667",
"failed": false,
"invocation": {
"module_args": {
"_raw_params": "ls /opt/sonarqube/sonarqube-6.7/extensions/plugins/sonar-build-breaker-plugin-*.jar",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"item": "build_breaker",
"rc": 0,
"start": "2019-09-18 04:04:54.351919",
"stderr": "",
"stderr_lines": [],
"stdout": "/opt/sonarqube/sonarqube-6.7/extensions/plugins/sonar-build-breaker-plugin-2.2.jar",
"stdout_lines": [
"/opt/sonarqube/sonarqube-6.7/extensions/plugins/sonar-build-breaker-plugin-2.2.jar"
]
}
]
}
How can I access, for example, stdout in another task? I want to process each item in results using with_items and output its stdout. How can this be done?

The following should be a good start:
- name: show stdout for each result
  debug:
    var: item.stdout
  loop: "{{ sonarqube_plugins_installed.results }}"
As you guessed, this loops through each element of your results list.
Another approach, if you want to print all values at once, is to process the data structure with json_query and the relevant JMESPath expression to extract only the needed information:
- name: show stdout list for all results
  debug:
    msg: "{{ sonarqube_plugins_installed | json_query('results[].stdout[]') }}"
You could also combine the two and loop over the simple list of stdouts only
- name: show stdout for each result
  debug:
    var: item
  loop: "{{ sonarqube_plugins_installed | json_query('results[].stdout[]') }}"
You could also use the map filter to extract the desired attribute:
- name: show stdout list of all results
  debug:
    msg: "{{ sonarqube_plugins_installed.results | map(attribute='stdout') | list }}"
There are other solutions as well, which I'm sure you will find in the documentation for the Ansible-specific filters and the general Jinja2 filters.
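As a minimal follow-up sketch (my addition, assuming the sonarqube_plugins_installed variable registered above), you could also store the extracted paths in a fact for later tasks to use:
# Collect every stdout into a single fact
# (assumes sonarqube_plugins_installed was registered by the shell task above).
- name: Collect plugin paths into a fact
  set_fact:
    sonarqube_plugin_paths: "{{ sonarqube_plugins_installed.results | map(attribute='stdout') | list }}"

- name: Show the collected list
  debug:
    var: sonarqube_plugin_paths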

Related

How do I get Ansible's json_query to return a value from this JSON document

After several hours of beating my head against this (not to mention leaving it for a day) I'm pretty much stumped trying to figure out why I can't get JMESPath to return a value in Ansible.
I have a task which runs a shell command and returns the following output:
[
    {
        "ansible_loop_var": "item",
        "changed": false,
        "cmd": [
            "pvesh",
            "create",
            "/access/users/user#pve/token/pve-apikey",
            "-privsep=0",
            "--output=json"
        ],
        "delta": "0:00:00.707130",
        "end": "2022-09-22 12:28:43.746253",
        "failed": false,
        "invocation": {
            "module_args": {
                "_raw_params": "pvesh create /access/users/\"user#pve\"/token/\"pve-apikey\" -privsep=0 --output=json",
                "_uses_shell": false,
                "argv": null,
                "chdir": null,
                "creates": null,
                "executable": null,
                "removes": null,
                "stdin": null,
                "stdin_add_newline": true,
                "strip_empty_ends": true,
                "warn": false
            }
        },
        "item": {
            "token": "pve-apikey",
            "user": "user#pve"
        },
        "msg": "",
        "rc": 0,
        "start": "2022-09-22 12:28:43.039123",
        "stderr": "",
        "stderr_lines": [],
        "stdout": "{\"full-tokenid\":\"user#pve!pve-apikey\",\"info\":{\"privsep\":\"0\"},\"value\":\"dc2aa48f-daf6-4efe-b95e-83774a588988\"}",
        "stdout_lines": [
            "{\"full-tokenid\":\"user#pve!pve-apikey\",\"info\":{\"privsep\":\"0\"},\"value\":\"dc2aa48f-daf6-4efe-b95e-83774a588988\"}"
        ]
    }
]
I'm now trying to obtain the UUID returned as value in stdout_lines using json_query, and this is as far as I can get:
- debug:
    msg: "{{ token | community.general.json_query(query) }}"
  vars:
    query: '[].stdout'
This json_query returns the following output:
"msg": [
"{\"full-tokenid\":\"tfuser#pve!tfe-pve-apikey\",\"info\":{\"privsep\":\"0\"},\"value\":\"e47e82d4-6798-47ea-9592-c7cf55cc8b61\"}"
]
I believe that this is a list, so I've tried extending the json_query as [].stdout[].value but that returns null. I've tried various permutations but so far nothing seems to work.
Any advice on how to proceed would be very welcome!
The items of the list stdout_lines are strings. You can test it. For example,
- debug:
    var: output.0.stdout_lines.0|type_debug
gives
output.0.stdout_lines.0|type_debug: AnsibleUnsafeText
Convert the items to dictionaries. For example
- debug:
    var: output.0.stdout_lines.0|from_yaml
gives
output.0.stdout_lines.0|from_yaml:
  full-tokenid: user#pve!pve-apikey
  info:
    privsep: '0'
  value: dc2aa48f-daf6-4efe-b95e-83774a588988
To get the UUID, declare the variable
UUID: "{{ output|map(attribute='stdout_lines')|
map('map', 'from_yaml')|list|
json_query('[].value') }}"
This gives the list of the values
UUID:
  - dc2aa48f-daf6-4efe-b95e-83774a588988
Example of a complete playbook for testing
- hosts: localhost
  vars:
    output: "{{ lookup('file', 'output.json') }}"
    UUID: "{{ output|map(attribute='stdout_lines')|
              map('map', 'from_yaml')|list|
              json_query('[].value') }}"
  tasks:
    - debug:
        var: output.0.stdout_lines.0|type_debug
    - debug:
        var: output.0.stdout_lines.0|from_yaml
    - debug:
        var: UUID
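Since each stdout is itself a JSON document, a shorter sketch (my addition, assuming the registered list is named token as in the question) is to parse it with the from_json filter and read value directly:
# Minimal sketch: parse the embedded JSON in stdout, then read its "value" key.
# "token" is assumed to be the registered list shown in the question.
- name: Show the UUID from the first result
  debug:
    msg: "{{ (token[0].stdout | from_json).value }}"

- name: Show the UUIDs from all results
  debug:
    msg: "{{ token | map(attribute='stdout') | map('from_json') | map(attribute='value') | list }}"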

Unable to get product version. How to set CLASSPATH using ansible

I can get the weblogic version by doing this on the command prompt in unix.
$ . /app/wlserv*/server/bin/setWLSEnv.sh
$ java weblogic.version
I wish to grab the weblogic version using ansible on remote hosts so I wrote this playbook:
---
- hosts: dest_nodes
  tasks:
    - name: Get weblogic version
      shell: "/app/wlserv*/server/bin/setWLSEnv.sh;java weblogic.version"
      register: wlsversion

    - debug:
        msg: "{{ wlsversion }}"
However, I get this error:
fatal: [10.0.0.91]: FAILED! => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python"
    },
    "changed": true,
    "cmd": "/app/wlserv*/server/bin/setWLSEnv.sh;java weblogic.version",
    "delta": "0:00:00.271434",
    "end": "2020-05-15 16:31:44.209506",
    "invocation": {
        "module_args": {
            "_raw_params": "/app/wlserv*/server/bin/setWLSEnv.sh;java weblogic.version",
            "_uses_shell": true,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "msg": "non-zero return code",
    "rc": 1,
    "start": "2020-05-15 16:31:43.938072",
    "stderr": "Error: Could not find or load main class weblogic.version",
    "stderr_lines": [
        "Error: Could not find or load main class weblogic.version"
    ],
    "stdout": "CLASSPATH=/usr/java/jdk1.8.0_192-amd64/lib/tools.jar:/app/wlserver/modules/features/wlst.wls.classpath.jar:\n\nPATH=/app/wlserver/server/bin:/app/wlserver/../oracle_common/modules/thirdparty/org.apache.ant/1.9.8.0.0/apache-ant-1.9.8/bin:/usr/java/jdk1.8.0_192-amd64/jre/bin:/usr/java/jdk1.8.0_192-amd64/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/sbin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/app/wlserver/../oracle_common/modules/org.apache.maven_3.2.5/bin\n\nYour environment has been set.",
    "stdout_lines": [
        "CLASSPATH=/usr/java/jdk1.8.0_192-amd64/lib/tools.jar:/app/wlserver/modules/features/wlst.wls.classpath.jar:",
        "",
        "PATH=/app/wlserver/server/bin:/app/wlserver/../oracle_common/modules/thirdparty/org.apache.ant/1.9.8.0.0/apache-ant-1.9.8/bin:/usr/java/jdk1.8.0_192-amd64/jre/bin:/usr/java/jdk1.8.0_192-amd64/bin:/usr/lib64/qt-3.3/bin:/usr/kerberos/sbin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/app/wlserver/../oracle_common/modules/org.apache.maven_3.2.5/bin",
        "",
        "Your environment has been set."
    ]
}
From the output I can see that the classpath did get set, but the java weblogic.version command failed on the remote host.
Can you please suggest how I can get the WebLogic version registered in the wlsversion variable?
Wow, I found the solution after some effort. Posting it here, since I'm sure many face this issue.
---
- hosts: dest_nodes
  tasks:
    - name: Get weblogic CLASSPATH
      shell: "/app/wlserv*/server/bin/setWLSEnv.sh"
      register: wlsenv

    - name: Get weblogic Version
      shell: "java weblogic.version"
      environment:
        CLASSPATH: "{{ wlsenv.stdout }}"
      register: wlsversion

    - debug:
        msg: "{{ wlsversion }}"

Raise an error based on results of other task

I am using the postgresql_db module that is part of Ansible, for example with something like:
- name: Database
  postgresql_db:
    name: "{{ vars[item + '_database_name_version'] }}"
    login_host: "{{ vars[item + '_database_host'] }}"
    login_password: "{{ vars[item + '_database_admin_password'] }}"
    login_user: "{{ vars[item + '_database_admin_username'] }}"
    port: "{{ vars[item + '_database_port'] }}"
    state: restore
    target: "{{ backup_restore[item]['db_tar'] }}"
  when: backup_restore[item]['db_tar'] is defined
  with_items: '{{ backup_restore }}'
  register: db_restore
When I debug the db_restore output I see
TASK [backup : db_restore] *****************************************************
ok: [myapp] => {
    "db_restore": {
        "changed": true,
        "msg": "All items completed",
        "results": [
            {
                "ansible_loop_var": "item",
                "changed": true,
                "cmd": "cmd: ****",
                "failed": false,
                "invocation": {
                    "module_args": {
                        "ca_cert": null,
                        "conn_limit": "",
                        "db": "myapp_0_1_0",
                        "encoding": "",
                        "lc_collate": "",
                        "lc_ctype": "",
                        "login_host": "1.1.1.2",
                        "login_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
                        "login_unix_socket": "",
                        "login_user": "ansible",
                        "maintenance_db": "postgres",
                        "name": "myapp_0_1_0",
                        "owner": "",
                        "port": 5432,
                        "session_role": null,
                        "ssl_mode": "prefer",
                        "state": "restore",
                        "target": "/backup/tmp/myapp-myapp/myapp_daily/databases/PostgreSQL.sql.gz",
                        "target_opts": "",
                        "template": ""
                    }
                },
                "item": "myapp",
                "msg": "",
                "rc": 0,
                "stderr": "ERROR: relation \"myapp_table\" already exists\n",
                "stderr_lines": [
                    "ERROR: relation \"myapp_table\" already exists"
                ]
            }
        ]
    }
}
Although there is an error during execution of this module, as visible in stderr, the postgresql_db module still returns "failed": false. So it looks like it ignores any errors that occur while running the commands to restore the database.
Now I want to add a task to check db_restore for stderr attribute and if present, raise an error so that the user is made aware of the problems.
How can I raise an error? Is there an error module?
Yes, there is a module to raise errors, called fail (https://docs.ansible.com/ansible/latest/modules/fail_module.html):
# Example playbook using fail and when together
- fail:
    msg: The system may not be provisioned according to the CMDB status.
  when: cmdb_status != "to-be-staged"
Another possible way is to use failed_when.
- name: Fail task when the command error output prints FAILED
  command: /usr/bin/example-command -x -y -z
  register: command_result
  failed_when: "'FAILED' in command_result.stderr"
I would recommend reading the Ansible page on error handling (https://docs.ansible.com/ansible/latest/user_guide/playbooks_error_handling.html).
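Applied to the question, a minimal sketch (my addition, assuming the db_restore variable registered by the looped postgresql_db task above) could check each result's stderr and fail explicitly:
# Hedged sketch: raise an error if any restore wrote something to stderr.
- name: Fail when a restore reported errors on stderr
  fail:
    msg: "Database restore reported errors: {{ item.stderr }}"
  when: item.stderr | default('') | length > 0
  loop: "{{ db_restore.results }}"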
I strongly agree with @Stefan Wegener's answer, but want to add another possibility.
I'm curious if your problem only occurs because of the loop you wrapped around your task.
Another possible solution would be to add the any_errors_fatal: true clause at play level, as described in the Ansible documentation:
- hosts: somehosts
  any_errors_fatal: true
  roles:
    - myrole
See https://docs.ansible.com/ansible/latest/user_guide/playbooks_error_handling.html#aborting-the-play
for reference.

Browsing JSON variable in Ansible

For some reason I'm not allowed to use json_query with Ansible, so I'm trying to reach the stdout element of a results list in a variable registered from a shell call.
The JSON variable is saved this way:
"request": {
"changed": true,
"msg": "All items completed",
"results": [
{
"_ansible_ignore_errors": null,
"_ansible_item_result": true,
"_ansible_no_log": false,
"_ansible_parsed": true,
"changed": true,
"cmd": "echo \"****:********\" | grep -o -P '^*****:[^\\n]*$' | awk '{split($0,a,\":\"); print a[2]}'",
"delta": "0:00:00.003660",
"end": "2018-10-31 17:26:17.697864",
"failed": false,
"invocation": {
"module_args": {
"_raw_params": "echo \"**************\" | grep -o -P '^************:[^\\n]*$' | awk '{split($0,a,\":\"); print a[2]}'",
"_uses_shell": true,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": true
}
},
"item": "**********:************",
"rc": 0,
"start": "2018-10-31 17:26:17.694204",
"stderr": "",
"stderr_lines": [],
"stdout": "**********",
"stdout_lines": [
"*********"
]
}
]
}
}
I'm trying to browse my stdout element this way:
- name: Tarball copy
  copy: src= "{{ '%s/%s' | format( TARBALL_DIR , request.results[0].stdout ) }}" dest= "/tmp/tarball/"
I also tried:
- name: Tarball copy
  copy: src= "{{ '%s/%s' | format( TARBALL_DIR , request.results[.stdout] ) }}" dest= "/tmp/tarball/"

- name: Tarball copy
  copy: src= "{{ '%s/%s' | format( TARBALL_DIR , item.stdout ) }}" dest= "/tmp/tarball/"
  with_items: "{{ request.results }}"
I've no idea why I keep getting these errors:
- template error while templating string: unexpected '.'. String: {{ request.results[.stdout] }} (when trying with [.stdout])
- The task includes an option with an undefined variable (when using the [0] index)
I've finally solved my problem using:
- name: Tarball copy
  copy:
    src: "{{ '%s/%s' | format( TARBALL_DIR , request.results[0].stdout ) }}"
    dest: "/tmp/tarball/"
It seems that src and dest can't accept a space after the equals sign.
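If you need to copy every stdout from the results list rather than just the first one, a sketch in the same multi-line form (my addition, reusing the TARBALL_DIR and request names from the question) would be:
# Hedged sketch: loop over all results and copy each file reported in stdout.
- name: Tarball copy (all results)
  copy:
    src: "{{ '%s/%s' | format(TARBALL_DIR, item.stdout) }}"
    dest: /tmp/tarball/
  with_items: "{{ request.results }}"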

Ansible Dynamic Variables in Playbook

I'm provisioning a bunch of systems (VMs) on a physical host. I'm up to the step where the VMs are running. Now I need to ssh into the VMs via their DHCP addresses. I can pull the IP addresses from the server, but I need a way to set these as host vars. Here are my groups:
ok: [kvm01] => {
    "msg": {
        "all": [
            "kvm01",
            "dcos-master-1",
            "dcos-agent-1",
            "dcos-agent-2",
            "dcos_bootstrap"
        ],
        "dcos": [
            "dcos-master-1",
            "dcos-agent-1",
            "dcos-agent-2",
            "dcos_bootstrap"
        ],
        "dcos-agents": [
            "dcos-agent-1",
            "dcos-agent-2"
        ],
        "dcos-bootstraps": [
            "dcos_bootstrap"
        ],
        "dcos-masters": [
            "dcos-master-1"
        ],
        "kvm": [
            "kvm01"
        ],
        "ungrouped": []
    }
}
Here's my command:
- name: Get the IP of the VM (DHCP)
  command: "/getip.sh {{ item }}"
  register: "result"
  with_items: "{{ groups['dcos'] }}"

- name: List the output of the variables
  debug: msg="{{ item.stdout }}"
  with_items: "{{ result.results }}"
When I run the playbook I get results but they are the FULL JSON result of the command rather than the stdout. There's probably a way to pull out the stdout and assign it to a fact but it's a complex hash. Here's the last result:
TASK [vm_install : Get the IP of the VM (DHCP)] ***************************************************************************
changed: [kvm01] => (item=dcos-master-1)
changed: [kvm01] => (item=dcos-agent-1)
changed: [kvm01] => (item=dcos-agent-2)
changed: [kvm01] => (item=dcos_bootstrap)
TASK [vm_install : List the output of the variables] **********************************************************************
......
ok: [kvm01] => (item={'_ansible_parsed': True, 'stderr_lines': [u'] => {
"item": {
"changed": false,
"cmd": [
"/getip.sh",
"dcos_bootstrap"
],
"delta": "0:00:00.056193",
"end": "2017-09-18 15:45:45.409693",
"invocation": {
"module_args": {
"_raw_params": "/getip.sh dcos_bootstrap",
"_uses_shell": false,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"warn": true
}
},
"item": "dcos_bootstrap",
"rc": 0,
"start": "2017-09-18 15:45:45.353500",
"stderr": " ",
"stdout": "192.168.1.130",
"stdout_lines": [
"192.168.1.130"
]
},
"msg": "192.168.1.130"
}
How can I put the output of the command into an array so that I can use it later in my playbook?
So, like I said in my comment, you've already managed to extract the information you want into an array. You can iterate over those items using with_items, as in the following task, which will create an ip_address fact for each host:
- set_fact:
    ip_address: "{{ item.stdout }}"
  with_items: "{{ result.results }}"
  delegate_to: "{{ item.item }}"
  delegate_facts: true
Or you can create a single array containing all of the addresses using Jinja filters:
- set_fact:
    all_ip_addresses: "{{ result.results | map(attribute='stdout') | list }}"
Or you could create a dictionary with the same information:
- set_fact:
    all_ip_addresses: >
      {{ all_ip_addresses
         | default({})
         | combine({ item.item: item.stdout }) }}
  with_items: "{{ result.results }}"
