Can Ansible dictionaries refer to themselves?

I'm trying to build certain software on each machine locally. The playbook would download the source tarball (using get_url), configure and build it.
I'd like to define the list of items to build as something like the below:
srcpkg:
  python:
    ver: "3.7.0"
    sha: "0382996d1ee6aafe59763426cf0139ffebe36984474d0ec4126dd1c40a8b3549"
url: "https://www.python.org/ftp/python/{{ srcpkg.python.ver }}/python/Python-{{ srcpkg.python.pyver }}.tar.xz"
Unfortunately, such references to itself (the url refers to ver in the above example) cause Ansible to throw a "recursive loop detected" error at runtime.
Is there a way -- either in Ansible or, maybe, simply in Yaml -- to define things so that I wouldn't have to repeat the version in more than one place?
Update: tried to use anchor/reference:
srcpkg:
  python:
    ver: &ver "3.7.0"
    sha: "0382996d1ee6aafe59763426cf0139ffebe36984474d0ec4126dd1c40a8b3549"
    url: "https://www.python.org/ftp/python/{{ *ver }}/python/Python-{{ *ver }}.tar.xz"
to no avail: Ansible complains of "unexpected '*'".

When you write the following in YAML:
url: "https://www.python.org/ftp/python/{{ *ver }}/python/Python-{{ *ver }}.tar.xz"
the right side of the : specifies a scalar value, and YAML aliases are not resolved inside parts of a scalar.
Ansible thus creates a string variable with the value: https://www.python.org/ftp/python/{{ *ver }}/python/Python-{{ *ver }}.tar.xz.
And for Jinja2, *ver is a syntax error.
What you can do is use a helper Ansible variable (YAML uses eager evaluation for aliases, while Jinja2 uses lazy evaluation for variables):
srcpkg:
  python:
    ver: &ver "3.7.0"
    sha: "0382996d1ee6aafe59763426cf0139ffebe36984474d0ec4126dd1c40a8b3549"
    url: "https://www.python.org/ftp/python/{{ python_version }}/python/Python-{{ python_version }}.tar.xz"
python_version: *ver
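With the helper variable in place, the nested values can then be consumed as usual, for example in a download task (a sketch; the dest path is an assumption):
- name: Download the Python source tarball
  get_url:
    url: "{{ srcpkg.python.url }}"
    checksum: "sha256:{{ srcpkg.python.sha }}"
    dest: "/tmp/Python-{{ python_version }}.tar.xz"  # destination path is an assumption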

Related

Determine return type for object returned from a task

I have a playbook which should retrieve artifacts from a Maven repo, extract them to a temp dir and copy some files to a destination folder. Currently it works fine: artifacts are downloaded using the maven_artifact task. But some requirements have changed and I need to use the get_url task now. After changing to get_url, the rest of the playbook is broken because the objects returned from maven_artifact and get_url are of different types. How can I determine what type, with what fields, is returned from a task?
Best regards
No matter which Ansible module you use, you can create variables from the output of a task by using register.
The Ansible documentation states which return values are available to you when doing so; see, for example, the common return values reference: https://docs.ansible.com/ansible/latest/reference_appendices/common_return_values.html
In that case you may do something like the following to retrieve the status code of the get_url module:
- name: Download foo.conf
  get_url:
    url: http://example.com/path/file.conf
    dest: /etc/foo.conf
    mode: '0440'
  register: my_result

- name: Print status code of get_url
  debug:
    var: my_result.status_code
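If you are not sure which fields a module returns, you can also print the whole registered object and inspect it (a sketch reusing my_result from above):
- name: Show everything get_url registered
  debug:
    var: my_result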
Each module returns an object of a different type.
There is no way within Ansible to identify the type of a registered variable (i.e. what attributes you can read from it); however, a given module will always return an object of the same type.
The return values of a module are listed at the bottom of that module's documentation page.

How to create environment variables in Github Actions using other variables

I want to use the GITHUB_SHA with a variable, like this:
name: build
on: ["push"]
env:
  PACKAGE: package-$GITHUB_SHA
But when I use it, YAML does not expand the variable and I get the literal string. How can I do it?
According to https://help.github.com/en/actions/automating-your-workflow-with-github-actions/contexts-and-expression-syntax-for-github-actions#github-context:
As part of an expression, you may access context information using one of two syntaxes.
Index syntax: github['sha']
Property dereference syntax: github.sha
In this case:
name: build
on: ["push"]
env:
  PACKAGE: package-${{ github.sha }}
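The expanded value is then available to later steps through the environment, for example (a minimal sketch; the job layout is an illustration, not from the question):
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "Building $PACKAGE"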

google deployment manager, can you import files in jinja template that you call directly with --template?

https://cloud.google.com/deployment-manager/docs/configuration/templates/create-basic-template
I can deploy a template directly like this: gcloud deployment-manager deployments create a-single-vm --template vm_template.jinja
But what if that template depends on other files that need to be imported? If you use a --config file you can define imports in that file and call the template as a resource, but you can't pass parameters/properties to a config file. I want to call a template directly so I can pass --properties via the command line, but that template also needs to import other files.
EDIT: What I needed was a top-level jinja template instead of a config. My confusion was that you can't use imports in a jinja template without a schema file; it was failing and I thought it wasn't supported. So the solution was simply to swap out the config for a jinja template (with a schema file), and then I can use --properties.
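For illustration, a minimal schema file that enables imports for a top-level template might look like this (the file and import names are assumptions):
# vm_template.jinja.schema (hypothetical file name)
info:
  title: VM template
imports:
- path: helpers.jinja  # hypothetical file imported by the template
properties:
  zone:
    type: string
    default: us-central1-a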
Maybe you can try importing the dependent files into your config file as follows:
imports:
- path: vm-template.jinja
- path: vm-template-2.jinja

# In the resources section below, the properties of the resources are replaced
# with the names of the templates.
resources:
- name: vm-1
  type: vm-template.jinja
- name: vm-2
  type: vm-template-2.jinja
and set arbitrary metadata in it to create a special variable that you can pass and might use in other applications outside of Deployment Manager:
properties:
  size:
    type: integer
    default: 2
    description: Number of Mongo Slaves
    variable-x: ultra-secret-sauce
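When the templates are imported through the config above, each resource can also receive properties for the template to consume (a sketch; the values are illustrative):
imports:
- path: vm-template.jinja
resources:
- name: vm-1
  type: vm-template.jinja
  properties:
    size: 2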
More info about gcloud deployment-manager deployments create optional flags and examples can be found here.
More info about passing properties using a schema can be found here.
Hope it helps.

How to prevent Ruby's YAML parser from trying to parse {{var-name}}

I have a bunch of concourse pipeline files that look like the following:
---
resources:
- name: example
  type: git
  source:
    uri: git#github.internal.me.com:me/example.git
    branch: {{tracking_branch}}
    private_key: {{ssh_key}}
    paths:
    - code/src/do/teams/sampleapp
    params:
      depth: 1
- name: deploy-image
  type: docker-image
  source:
    repository: {{docker_image_url}}
And I want to parse them in Ruby to perform a bunch of transformations (like validating them and updating some keys if they are missing).
The problem is, whenever I try to load them and then dump them back to files, the pieces that have {{something}} become:
branch:
  ? tracking_branch:
  :
private_key:
  ? ssh_key:
  :
Why is it doing this and is there any way I can configure the parser not to do this? Just leave these variables as they are?
To avoid conflict with YAML's internal syntax you need to quote your values:
---
resources:
- name: example
  type: git
  source:
    uri: git#github.internal.me.com:me/example.git
    branch: '{{tracking_branch}}'
    private_key: '{{ssh_key}}'
    paths:
    - code/src/do/teams/sampleapp
    params:
      depth: 1
This sort of thing comes up in Ansible configuration files all the time for similar reasons.
The { and } characters are used in YAML for flow mappings (i.e. hashes). If you don't provide a value for a mapping entry you get nil.
So in the case of branch: {{tracking_branch}}, since there are two pairs of braces, you get a hash with a key branch and value (in Ruby) of
{{"tracking_branch"=>nil}=>nil}
When this is dumped back out to YAML you get the somewhat awkward and verbose:
branch:
  ? tracking_branch:
  :
The solution is simply to quote the value:
branch: "{{tracking_branch}}"
Completely forgot that Concourse now offers ((var-name)) for templating; I just switched to that instead of {{var-name}} in the pipelines and the YAML parser is now happy!

Ansible archive module "no action detected in task"

I want to use the archive module of Ansible but it is unfortunately not working. I have the following version installed: ansible 2.3.0 (devel 2131eaba0c)
my playbook looks like this:
- archive: path="{{path_dir}}" dest="{{dest_dir}}/foo.zip" format=zip
The output looks like this:
"failed": true, "reason": "no action detected in task. This often indicates a misspelled module name, or incorrect module path.
The error appears to have been in '/prj/sndbox1/app/jenkins/jobs/release/workspace/tasks/build_rpclient.yml': line 125, column 3, but may be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- archive: path="{{path_dir}}" dest="{{dest_dir}}/foo.zip" format=zip
^ here
We could be wrong, but this one looks like it might be an issue with missing quotes. Always quote template expression brackets when they start a value. For instance:
with_items:
  - {{ foo }}
Should be written as:
with_items:
  - "{{ foo }}"
As far as I understood the docs, the extra modules are shipped with Ansible, so I assume I don't need to install this module separately.
However, what am I doing wrong? Is there any configuration I need to change in order to tell Ansible where to look for the extra modules?
Thanks in advance!
Edit: Included the full log message
Edit 2:
I tried to put the archive.py directly into my working directory --> [library]/archive.py
Now I get the following error:
"failed": true, "msg": "Could not find imported module support code for archive. Looked for either get_exception or pycompat24"
I have the same use case here; I'm using Ansible version 2.2.0.0, installed via brew on macOS Sierra.
I've managed to solve the issue by adding the following to my local ansible.cfg file.
In the [defaults] section, you must change the library entry:
[defaults]
library = /usr/local/Cellar/ansible/2.2.0.0_2/libexec/lib/python2.7:library
If you have a different setup, check pip show ansible for the install location, or /usr/share/ansible if you are on a Linux machine.
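For reference, once the module can be found, the task from the question written in block form looks like this (same variables as the question; the task name is illustrative):
- name: Create zip archive
  archive:
    path: "{{path_dir}}"
    dest: "{{dest_dir}}/foo.zip"
    format: zip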
