I have an Ansible playbook that gets its vars from an extra-vars.json file, passed in at the command line with --extra-vars "@extra-vars.json".
This is an abbreviated version of the vars file:
{
  "source": {
    "access_token": "abc",
    "git_instance_url": "foo.com",
    "repo": "some-group/some-project/some-repo"
  },
  "target": {
    "access_token": "xyz",
    "git_instance_url": "foo.bar.com",
    "repo_path": "lorem/ipsum"
  }
}
Because of the var structure, when I reference the vars in my playbook I have to use dot notation, e.g. {{ source.repo }} or {{ target.access_token }}. My problem is that I would like to remove a couple of these vars from extra-vars.json and pass them individually at the command line. If I remove source.git_instance_url from extra-vars.json I can pass it in without any precedence conflicts.
My issue is that I can't figure out how to pass dot notation vars in at the command line. I don't want to change my playbook to do this. If I pass in --extra-vars "source.git_instance_url=bar.baz.com" I get an error source is undefined.
I tried using bracket notation source[git_instance_url]=bar.baz.com with no success.
Is there a way to pass dot notation vars at the command line or am I going to have to change my playbook from {{ source.git_instance_url }} ==> {{ source_git_instance_url }} to be able to accomplish this?
When passing extra vars with the key=value syntax, they are always strings. I learned that the hard way when trying to pass in boolean variables.
You could pass them as JSON:
ansible-playbook -e '{"source": { "git_instance_url": "foo" }}' playbook.yml
But I don't know offhand whether they get merged with the source var from your vars file; I'd guess one overwrites the other, so you probably end up with either the source var from your JSON string or the one from the file.
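One way to sidestep the merge question without touching the playbook is to repeat the whole source dict on the command line. With Ansible's default hash_behaviour=replace, a later -e definition of source should win over the one loaded from the file, so every key has to be present (the values below are just the ones from the question, with the new git_instance_url):

ansible-playbook playbook.yml -e "@extra-vars.json" \
  -e '{"source": {"access_token": "abc", "git_instance_url": "bar.baz.com", "repo": "some-group/some-project/some-repo"}}'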
I created a Variable (from Airflow UI):
Key: env_variables
Value: {'xx': 'yy'}
and I am trying to access it using var.json.env_variables.xx inside a BashOperator. But the value of "xx" is dynamic; I am passing it from a REST API and accessing it with dag_run.conf['param'] (dag_run.conf['param'] should return xx). Eventually I want to resolve var.json.env_variables.xx. How can I achieve that inside a BashOperator in Airflow?
I am trying to run the following code but it's showing a Jinja syntax error.
task = BashOperator(
    bash_command="export KUBECONFIG=$KUBECONFIG:{{ var.json.env_variables.{{ dag_run.conf['param'] }} }}"
)
What I want:
It should fetch the value of var.json.env_variables.xx. I have also tried using params, but no luck; params.path returns None.
task = BashOperator(
    bash_command="export KUBECONFIG=$KUBECONFIG:{{ params.path }}",
    params={
        "path": env_variables.get('{{ dag_run.conf["param"] }}')
    }
)
Any kind of help is highly appreciated.
You don't need the extra {{...}} in the Jinja expression around dag_run.conf and you'll also need the typical item access of a dictionary. Try using a template expression like this:
nested_vars = BashOperator(
task_id="nest_vars",
bash_command="export KUBECONFIG=$KUBECONFIG:{{ var.json.env_variables[dag_run.conf['param']] }}"
)
This is the log entry when passing in a triggering config of {"param": "xx"} to the operator instantiation above:
INFO - Running command: ['bash', '-c', 'export KUBECONFIG=$KUBECONFIG:yy']
I have a problem with a Jinja2 template I'm writing (called from Ansible).
The resultant file is a JSON file that I will send to an API (either with Ansible URI module or with curl). The template looks like this and basically works:
{
  "description": "my description",
  "pipeline": "{% include 'root/pipeline.j2' %}"
}
The problem is that the content of root/pipeline.j2 is quite complex and includes multiple lines, quote characters and any number of other things that make the JSON file I'm creating invalid. What I want to do is pass the included file through a filter to convert it to a valid JSON string; something like this:
{
  "description": "my description",
  "pipeline": "{% include 'root/pipeline.j2' | to_json %}"
}
But that doesn't work, probably because the filter is acting on the filename, not the included content.
Just for a little clarity, when I render the template at the moment I see pipeline gets set to something like this:
"pipeline": "input {
"input1" {
<snipped>
"
It should appear thus:
"pipeline": "input {\n \"input1\" {<snipped>"
NB: I'm only giving the first couple of lines and I am using 'snipped' where I have removed the rest of the config.
Can anyone tell me how I can use an include within a Jinja2 template so that the result renders as a single-line, valid JSON string?
Thanks in advance for any assistance.
I finally managed to find a solution to my own question. Within the template that provides the JSON API payload, I now set a variable to the content of the pipeline template, which makes it easy to filter with to_json:
{% set pipeline = lookup('template', 'root/pipeline.j2') %}
{
  "description": "my description",
  "pipeline": {{ pipeline | to_json }}
}
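For completeness, a hedged sketch of how the rendered payload could then be posted from Ansible; the outer template name payload.json.j2 and the URL are placeholders I've made up, not details from the question:

- name: Send the rendered payload to the API
  uri:
    url: "https://api.example.com/pipelines"    # placeholder URL
    method: POST
    body: "{{ lookup('template', 'payload.json.j2') }}"
    body_format: json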
I will leave this question open for a while in case anyone can supply a better answer or explain why this is not a good one.
Thanks.
Here is my problem: I need to get a dict value by key, but the key is itself a variable.
For example, I have an Ansible role.
In vars/main.yml, I defined vars as below:
---
location: "USA"
source: {
"China": "/net/server1/patha",
"USA": "/net/server2/pathb",
"Japan": "/net/server3/pathc"
}
So in my tasks (tasks/main.yml), how do I get "/net/server2/pathb" using these vars? I tried the following in my tasks; neither worked.
- shell: "perl run.perl {{ source.location }}/script.pl"
- shell: "perl run.perl {{ source.{{ location }} }}/script.pl"
This may be a simple question, but I searched many posts for a long time and still cannot find the right answer. So please help, and many thanks.
The answer is {{ source[location] }}.
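In task form, a minimal sketch of the question's command using that bracket lookup (the script and paths are the ones from the question):

- name: Show the resolved path for the current location
  debug:
    msg: "{{ source[location] }}"

- name: Run the script from the location-specific source path
  shell: "perl run.perl {{ source[location] }}/script.pl"

With location: "USA" from vars/main.yml, the lookup resolves to /net/server2/pathb.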
I have an item list of IPs:
server_hosts:
- { host: '1.1.1.1' }
- { host: '10.10.10.10' }
I want to pass only one of the items on the command line:
ansible-playbook base.yml -i ${host}, --extra-vars "env_name=lab server_hosts={host:'${1.1.1.1}'} "
but this gives an error of:
{"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'unicode' object has no attribute 'host'\n\nThe error
Any advice how to pass a specific item from list in the command line?
Please pay attention to a side note here:
Note: Values passed in using the key=value syntax are interpreted as strings. Use the JSON format if you need to pass in anything that shouldn’t be a string (Booleans, integers, floats, lists etc).
So, you should use:
--extra-vars '{"env_name":"lab","server_hosts":{"host":"1.1.1.1"}}'
Otherwise you end up with server_hosts as a string, not an object.
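And if the playbook still treats server_hosts as a list (as in the original definition), a hedged variant wraps the single item in a list so any loop over it keeps working:

ansible-playbook base.yml -i 1.1.1.1, --extra-vars '{"env_name": "lab", "server_hosts": [{"host": "1.1.1.1"}]}'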
I am using gathered facts to get size information about hosts. For some servers the ansible_devices fact contains the key "sda", and for a few servers it contains the key "cciss!c0d0".
Problem: when I use {{ ansible_devices.sda.size }} in my playbook and the sda key is not found in ansible_devices, it obviously gives me an error:
fatal: [xyz101] => One or more undefined variables: dict object has no element sda
I get values in the ansible_devices variable like below:
"ansible_devices": {
"sda": {
"size": "68.33 GB",
........
}
},
"item": ""
or
"ansible_devices": {
"cciss!c0d0": {
"size": "68.33 GB",
........
}
},
"item": ""
I can access the size using {{ ansible_devices.sda.size }} in the first case, but I am unable to fetch the value with {{ ansible_devices.cciss!c0d0.size }} in the second case.
It might be that the special character in the JSON key is why I am unable to fetch its value.
Is there any way to access this variable through a key index, e.g. {{ ansible_devices[0].size }}?
Or is there a better way to access it?
You could use a conditional:
when: ansible_devices.sda is defined
Or you can iterate through ansible_devices.keys() using with_items, as in the sketch below.
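A minimal sketch of that second suggestion, looping over whatever device keys the facts report so it covers both sda and cciss!c0d0:

- debug:
    msg: "{{ item }} is {{ ansible_devices[item].size }}"
  with_items: "{{ ansible_devices.keys() | list }}"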
We can check for the key by using has_key in the Ansible playbook, like below.
when: ansible_devices.has_key('sda')
The above check resolved my fatal error, as I added two tasks for these two keys. But I am still looking for a solution where I can get the key's value through an index number; it would replace multiple conditions with one.
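As a side note, when: "'sda' in ansible_devices" is the more portable form of that check, since has_key no longer exists on Python 3 dicts. For the index-style access asked about, a hedged sketch that takes the first reported device key, whatever it is called, so one expression covers both sda and cciss!c0d0 (it assumes at least one device is reported and that the first key is the one you care about):

- set_fact:
    first_device_size: "{{ ansible_devices[ansible_devices.keys() | list | first].size }}"

- debug:
    msg: "First device size is {{ first_device_size }}"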