I have a key in an object variable that I can't escape in an Ansible playbook, possibly because it contains dots.
Here's the data structure of the variable:
"results":[
{
//snip//
"changed": false,
"hostvars[item].commandResult.stdout": "abc",
//snip//
},
{
//snip//
"changed": true,
"hostvars[item].commandResult.stdout": "xyz",
//snip//
}
]
I'm unable to extract "hostvars[item].commandResult.stdout" from it with this playbook:
- debug:
msg: "{{variable.results | map(attribute='hostvars[item].commandResult.stdout') }}"
I can get other values just fine, though:
- debug:
msg: "{{variable.results | map(attribute='changed') }}"
I tried escaping the . (dot) with \, '.', and {{...}}, but still no luck.
I suspect the dot is the problem because of this error message:
msg: |-
The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'hostvars[item]'
when running the ansible-playbook -vvv command.
How can I map "hostvars[item].commandResult.stdout"?
It turned out I could work around it by wrapping the value in an Ansible set_fact first:
- set_fact:
variable: "{{hostvars[item].commandResult.stdout_lines}}"
with_items: "{{ groups['servers'] }}"
register: all_result
- debug:
msg: "{{all_result.results | map(attribute='ansible_facts') | list | to_nice_json }}
The original question of how to escape the dots remains unanswered, though.
To address a key with dots, use array notation with single quotes instead of dot notation, i.e.:
- debug:
msg: "{{variable.results | map(attribute=['hostvars[item].commandResult.stdout']) }}"
This returns the value.
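For direct access to a single element, the same advice applies: put the whole key in bracket notation with single quotes. A minimal sketch, assuming you only want the first entry of results:
- debug:
    msg: "{{ variable.results[0]['hostvars[item].commandResult.stdout'] }}"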
cf.: Ansible FAQ
Related
I want to filter this variable hp, but it is getting printed as square brackets with "".
How do I remove the square brackets and "" to get only the value? Can someone please help here?
I was looking at regex but was not able to find the exact syntax.
srv_make1: '{{ basic_params | from_json | json_query("servers.server_details[*].srv_make") }}'
Thanks
I had something similar.
Was getting
["abc"]
To overcome it, I had to do two things:
append | [0] to the json query
use replace to get rid of the " character
so in your case instead of
srv_make1: '{{ basic_params | from_json | json_query("servers.server_details[*].srv_make") }}'
it will look something like
srv_make1: "{{ basic_params | from_json | json_query('servers.server_details[*].srv_make | [0]') | replace('\"','') }}"
Q: "How to remove square bracket & double quote?"
json_query always returns a list. How a list is displayed depends on the debug task. For example
vars:
srv_make1: [a,b,c]
tasks:
- debug:
var: srv_make1
- debug:
msg: "{{ srv_make1|to_yaml }}"
give
TASK [debug] ***
ok: [localhost] => {
"srv_make1": [
"a",
"b",
"c"
]
}
TASK [debug] ***
ok: [localhost] => {
"msg": "[a, b, c]\n"
}
It's possible to use the template module and write the list into a file without brackets and quotes. For example, the template
shell> cat srv_make1.conf.j2
{% for item in srv_make1 %}{{ item }} {% endfor %}
with the task
- template:
src: srv_make1.conf.j2
dest: srv_make1.conf
gives
shell> cat srv_make1.conf
a b c
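If the goal is just to show the bare values in the play output rather than write them to a file, the first and join filters are another option (a sketch, not part of the original answer):
- debug:
    msg: "{{ srv_make1 | first }}"
- debug:
    msg: "{{ srv_make1 | join(' ') }}"
The first task prints the single element a, the second prints a b c; the quotes still shown around the message are just the JSON rendering of the debug output, as explained above.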
I need to replace all the / with \ in a string stored in a variable.
I'm just trying to do it as simply as possible and test it with a debug, but no matter how I try it I don't get the expected result of just replacing one character with another. I think it's probably just a single/double quote problem, or maybe the \ needs to be escaped in some way I don't know about.
vars:
- SecGroup: '/stuff/foo/thing'
tasks:
- name: Display modified var
debug:
msg: "{{ SecGroup | replace('/','\') }}"
Expected output: \stuff\foo\thing
Output with different tries:
- name: Display modified var
debug:
msg: "{{ SecGroup | replace('/','\') }}"
TASK [Display modified var]
ok: [localhost] => {
"msg": "stufffoothing"
}
- name: Display modified var
debug:
msg: "{{ SecGroup | replace('/','\\') }}"
TASK [Display modified var]
fatal: [localhost]: FAILED! => {"msg": "Unexpected failure during module execution."}
- name: Display modified var
debug:
msg: "{{ SecGroup | replace('/','\\\') }}"
TASK [Display modified var]
fatal: [localhost]: FAILED! => {"msg": "Unexpected failure during module execution."}
- name: Display modified var
debug:
msg: "{{ SecGroup | replace('/','\\\\') }}"
TASK [Display modified var]
ok: [localhost] => {
"msg": "\\\\stuff\\\\foo\\\\thing"
}
I also tried reversing the quotes:
- name: Display modified var
debug:
msg: '{{ SecGroup | replace("/","\") }}'
TASK [Display modified var]
fatal: [localhost]: FAILED! => {"msg": "Unexpected failure during module execution."}
I can't explain the output of this one:
- name: Display modified var
debug:
msg: '{{ SecGroup | replace("/","\\") }}'
TASK [Display modified var]
ok: [localhost] => {
"msg": "\\\\stuff\\\\foo\\\\thing"
}
I think you've stumbled upon an edge case that involves the interaction between YAML escaping and Python escaping. The only way I was able to get it to work was introducing a guard character -- something to ensure that the \ isn't the last character in the expression, which we then remove with a subsequent replace() filter. Here I'm using a semicolon (;), but you could use anything that you're certain won't be in your SecGroup string. Note that your choice of quotes is significant; quoting the entire string with single quotes inhibits YAML escaping:
- name: With guard character
debug:
msg: '{{ SecGroup | replace("/","\;") | replace(";", "") }}'
Outputs:
TASK [With guard character] *******************************************************************************************************************************************************************
ok: [localhost] => {
"msg": "\\stuff\\foo\\thing"
}
Which is exactly what you want (remembering that a single \ is encoded as \\ in JSON output).
Regarding this:
- name: Display modified var
debug:
msg: '{{ SecGroup | replace("/","\\") }}'
TASK [Display modified var]
ok: [localhost] => {
"msg": "\\\\stuff\\\\foo\\\\thing"
}
You are successfully replacing / with two backslashes, \\. Since a backslash must be encoded as \\ in JSON output, a double backslash will end up represented as \\\\, so this:
"msg": "\\\\stuff\\\\foo\\\\thing"
Means you actually have the string:
\\stuff\\foo\\thing
I wanted to add an alternative solution:
If you're familiar with Python, you can just write a custom filter module and avoid multiple layers of escaping. E.g., if you were to create filter_plugins/reslash.py with the following content:
#!/usr/bin/python
def filter_reslash(val):
return val.replace('/', '\\')
class FilterModule(object):
filter_map = {
'reslash': filter_reslash
}
def filters(self):
return self.filter_map
You could then write your playbook like this:
---
- hosts: localhost
gather_facts: false
vars:
- SecGroup: '/stuff/foo/thing'
tasks:
- debug:
msg: "{{ SecGroup | reslash }}"
That's arguably a cleaner solution.
The solution by @larsks didn't entirely work for me as described. I needed to escape the backslash with a double backslash \\ plus the guard character in order for it to work in my Ansible playbook.
This works: replace('/','\\;') | replace(';', '')
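A minimal sketch of that variant, assuming the same SecGroup variable and a double-quoted msg (which is why the backslash itself is doubled here):
- name: With escaped backslash plus guard character
  debug:
    msg: "{{ SecGroup | replace('/','\\;') | replace(';', '') }}"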
Another easy solution is to leave escaping the backslash to Ansible itself. This is how I would do it:
- set_fact:
replacer: '\'
- name: With guard character
debug:
msg: '{{ SecGroup | replace("/",replacer)}}'
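The same idea also works without a separate set_fact task, by defining the backslash as a task-level variable (a sketch along the same lines, reusing the SecGroup variable from the question):
- name: With a helper variable
  debug:
    msg: '{{ SecGroup | replace("/", backslash) }}'
  vars:
    backslash: '\'
Because the replacement string comes from a variable rather than from a quoted literal inside the Jinja2 expression, no escaping is needed at all.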
The same workaround applies if you want to replace a single backslash with a double backslash in a Windows path.
- hosts: localhost
gather_facts: False
vars:
- iis_manager_logdir: 'C:\inetpub\logs\manager-logs'
tasks:
- set_fact:
iis_mng_logs: "{{ iis_manager_logdir | regex_replace('\\\\', '\\\\;') | regex_replace(';', '\\\\') }}"
- name: Original path
debug:
msg: "{{ iis_manager_logdir }}"
- name: New path
debug:
msg: "{{ iis_mng_logs }}"
Thanks to @larsks's answer, I've managed to replace backslashes in an Ansible string variable without an intermediate replace. It's possible by supplying a regex quantifier {1} in the regex_replace expression between the last backslash and the closing quote.
For example, an expression like {{ install_path | regex_replace('\\\\{1}', '/') }} replaces all occurrences of a backslash \ with a forward slash /. It was used to replace Windows path delimiters with Unix-like ones:
- name: install libs
win_shell: "pip install --no-index --find-links \"file://{{ install_path | regex_replace('\\\\{1}', '/') }}/libs\" attrs requests"
become: true
For what it's worth, after countless struggles, this is what has worked for me without any workarounds:
Forward Slash to Backslash
ForwardtoBackSlash: "{{ 'c:/test' | regex_replace('\\\/', '\\\\') }}"
output:
c:\test
Single Backslash to Double Backslash
SingleSlashtoDoble: "{{ 'C:\test\logs\logfile.txt'| regex_replace('\\\\', '\\\\\\\\') }}"
Output:
C:\\test\\logs\\logfile.txt
I hope it helps someone.
Hi, I'm using Ansible to create some Elasticsearch indexes using Laravel's artisan commands. The issue I have is that when I create an index I have to use the PHP class name, whose base name is snake-cased from camel case, e.g. UserConfigurator = user_configurator. In my vars I have the following:
elastic_search:
version: 6.3.2
indexes:
- "App\\ElasticSearch\\Configurators\\UserConfigurator"
- "App\\ElasticSearch\\Configurators\\ClientConfigurator"
models:
- "App\\Models\\User"
- "App\\Models\\Client"
and in my playbook I have the following:
- name: check if indexes exist
uri: url='http://localhost:9200/{{ index_name }}'
method=HEAD
status_code=200,404
- name: create indexes
command: php artisan elastic:create-index "{{ item }}"
args:
chdir: "{{site_root}}"
with_items: "{{elastic_search.indexes}}"
The playbook isn't sufficient to do what I want, due to my lack of experience. Any ideas how I can loop over elastic_search.indexes, convert each class basename to snake case, check whether the index exists, and push the names into two separate arrays, so I can use one of the new variables to create the indexes and the other to update them?
Any ideas how I may loop over each elastic_search.indexes and convert the class basename to a snake case
The algorithm I use for snake-casing something is to break the word at an upper-to-lower boundary, lowercase the entire thing, then reassemble the parts, being mindful that the first upper-to-lower transition is special and should not be separated by _.
We can leverage the behavior that when set_fact sees a block of text that looks like JSON, it will coerce that block of text into a dict (you'll see that behavior on the command line with ansible -e '{"hello": "world"}' too), along with the infinitely handy regex_replace jinja2 filter provided by ansible:
- hosts: all
gather_facts: no
vars:
class_names:
- 'App\ElasticSearch\Configurators\UserConfigurator'
- 'App\ElasticSearch\Configurators\ClientConfigurator'
tasks:
- set_fact:
index_names: >-
[
{% for cn in class_names -%}
{{ '' if loop.first else ',' }}
{{ cn |
regex_replace(".*\\([^\\]+$)", "\1") |
regex_replace("([A-Z])([a-z])", "\x00\1\2") |
lower | replace("\x00", "_") | regex_replace("^_", "") |
to_json }}
{% endfor %}
]
with_items: '{{ class_names }}'
- debug: var=index_names verbosity=0
- debug: var=item verbosity=0
with_items: '{{ index_names }}'
which produces the correct output:
TASK [debug] *******************************************************************
ok: [localhost] => (item=user_configurator) => {
"item": "user_configurator"
}
ok: [localhost] => (item=client_configurator) => {
"item": "client_configurator"
}
and now you can use those indices in a command of your choosing.
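As a sketch of closing the loop with the question's own tasks (reusing the index_names fact set above and the localhost:9200 URL from the question), the existence check can then simply iterate over the converted names:
- name: check if indexes exist
  uri:
    url: "http://localhost:9200/{{ item }}"
    method: HEAD
    status_code: [200, 404]
  register: index_check
  with_items: "{{ index_names }}"
Each entry of index_check.results then carries the HTTP status for one index, which can be used to decide between creating and updating.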
I am using Ansible 2.3.0.0 and I have tested with Ansible 2.4.0.0, obtaining the same result. My problem is very simple, but I cannot see what's wrong.
I have defined a list of objects in Ansible as follows:
vars:
password_text_to_encrypt:
- { line: "{{truststore_pass }}" , regexp: '\${TRUSTSTORE_PASS}*'}
- { line: "{{ keystore_pass }}" , regexp: '\${KEYSTORE_PASS}*'}
- { line: "{{ gp_pass }}" , regexp: '\${GP_PASS}*'}
- { line: "{{ datasource_password }}" , regexp: '\${DATASOURCE_PASS}*'}
- { line: "{{ server_password }}" , regexp: '\${SERVER_PASSWORD}*'}
- { line: "{{ sftp_password }}" , regexp: '\${SFTP_PASSWORD}*'}
- { line: "{{ db_userpassword }}" , regexp: '\${DB_PASSWORD}*'}
roles:
- basic_role
My basic_role just prints the items, and I would like to obtain the content of each line:
---
- name: "print password"
debug:
msg: "The content of the line is: {{ item.line}}"
with_nested:
- "{{password_text_to_encrypt}}"
But the result that I obtain is:
FAILED! => {"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'list object' has no attribute 'line'\n\nThe error appears to have been in.....
If I change item.line to just item, it works but it prints:
ok: [localhost] => (item=[u'regexp', u'line']) => {
"item": [
"regexp",
"line"
],
"msg": "The content of the line is: [u'regexp', u'line']"
}
...
To summarize, Ansible does not consider the content of line or regexp. I have been doing tests, and the variables used to initialize line and regexp are not empty.
Use with_items instead of with_nested.
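For example, a minimal sketch of the task with with_items, using the same vars as in the question:
- name: "print password"
  debug:
    msg: "The content of the line is: {{ item.line }}"
  with_items: "{{ password_text_to_encrypt }}"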
I think you want to read up on loops and includes, because you're getting a flattened list, which is expected (as per the documentation).
When should I use the from_json filter in Ansible?
I found out that using it sometimes has an effect and sometimes has none.
Please consider the following example which illustrates the inconsistency I am getting.
Included in reverse order are: the questions - expected result - actual result - the playbook - the data. The data is taken from this question and the playbook is based on this answer.
The question(s):
Why does storing the left part (before json_query) of the following expression in a variable, and then using json_query on the variable, cause the expression to be evaluated differently?
"{{ lookup('file','test.json') | json_query(query) }}"
Why does adding the from_json filter alter the results (but not when processing a variable):
"{{ lookup('file','test.json') | from_json | json_query(query) }}"
Expected result:
Last four tasks should give the same result. Alternatively, last two tasks should give the same result as previous two tasks.
Actual result (last four tasks only):
One task result differs.
TASK [This query is run against lookup value with from_json stored in a variable] ***
ok: [localhost] => {
"msg": [
678
]
}
TASK [This query is run against lookup value without from_json stored in a variable] ***
ok: [localhost] => {
"msg": [
678
]
}
TASK [This query is run directly against lookup value with from_json] **********
ok: [localhost] => {
"msg": [
678
]
}
TASK [This query is run directly against lookup value without from_json - the result is empty - why?] ***
ok: [localhost] => {
"msg": ""
}
The playbook:
---
- hosts: localhost
gather_facts: no
connection: local
tasks:
- set_fact:
from_lookup_with_from_json: "{{ lookup('file','test.json') | from_json }}"
- set_fact:
from_lookup_without_from_json: "{{ lookup('file','test.json') }}"
- name: Save the lookup value stored in a variable in a file for comparison
copy: content="{{ from_lookup_with_from_json }}" dest=./from_lookup_with_from_json.txt
- name: Save the lookup value stored in a variable in a file for comparison (they are the same)
copy: content="{{ from_lookup_without_from_json }}" dest=./from_lookup_without_from_json.txt
- name: This query is run against lookup value with from_json stored in a variable
debug: msg="{{ from_lookup_with_from_json | json_query(query) }}"
vars:
query: "Foods[].{id: Id, for: (Tags[?Key=='For'].Value)[0]} | [?for=='Tigger'].id"
- name: This query is run against lookup value without from_json stored in a variable
debug: msg="{{ from_lookup_without_from_json | json_query(query) }}"
vars:
query: "Foods[].{id: Id, for: (Tags[?Key=='For'].Value)[0]} | [?for=='Tigger'].id"
- name: This query is run directly against lookup value with from_json
debug: msg="{{ lookup('file','test.json') | from_json | json_query(query) }}"
vars:
query: "Foods[].{id: Id, for: (Tags[?Key=='For'].Value)[0]} | [?for=='Tigger'].id"
- name: This query is run directly against lookup value without from_json - the result is empty - why?
debug: msg="{{ lookup('file','test.json') | json_query(query) }}"
vars:
query: "Foods[].{id: Id, for: (Tags[?Key=='For'].Value)[0]} | [?for=='Tigger'].id"
The data (test.json):
{ "Foods" :
[ { "Id": 456
, "Tags":
[ {"Key":"For", "Value":"Heffalump"}
, {"Key":"Purpose", "Value":"Food"}
]
}
, { "Id": 678
, "Tags":
[ {"Key":"For", "Value":"Tigger"}
, {"Key":"Purpose", "Value":"Food"}
]
}
, { "Id": 911
, "Tags":
[ {"Key":"For", "Value":"Roo"}
, {"Key":"Purpose", "Value":"Food"}
]
}
]
}
json_query requires a Python object (dict) as input; if you feed it a string, it gives an empty string as the result.
You get different results because of the tricky way the Ansible templating engine works.
I should definitely write a post about it on my site...
After evaluating a jinja2 expression, Ansible tries to cast complex types to Python objects (like dict or list). See my other answer.
In your case:
1.
- set_fact:
from_lookup_with_from_json: "{{ lookup('file','test.json') | from_json }}"
from_lookup_with_from_json is a dict, because you manually convert the JSON string from the file to a dict with the from_json filter.
2.
- set_fact:
from_lookup_without_from_json: "{{ lookup('file','test.json') }}"
from_lookup_without_from_json also becomes a dict, because Ansible converts it when the jinja2 expression ends with }}. So from_json is actually unnecessary as the last filter in the chain.
3.
debug: msg="{{ lookup('file','test.json') | from_json | json_query(query) }}"
Again, you manually convert the JSON string here, so json_query gets a dict as input.
4.
debug: msg="{{ lookup('file','test.json') | json_query(query) }}"
In this case you feed a JSON string (not a dict) as input to the json_query filter. As everything happens inside one jinja2 expression, Ansible doesn't attempt to convert anything in between.
You can also get an empty string result with a variable this way:
- set_fact:
from_lookup_force_string: "{{ lookup('file','test.json') | string }}"
In this case from_lookup_force_string will not be converted by the Ansible templating engine, and json_query will give you an empty response for it.
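A quick way to see the difference yourself is the type_debug filter (a sketch, assuming an Ansible version that ships type_debug, i.e. 2.3 or later):
- debug:
    msg: "{{ lookup('file','test.json') | type_debug }}"               # a string type, e.g. str/AnsibleUnicode
- debug:
    msg: "{{ lookup('file','test.json') | from_json | type_debug }}"   # dict
- debug:
    msg: "{{ from_lookup_without_from_json | type_debug }}"            # dict, converted by Ansible after set_fact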