I want to set up a remote environment inside a Docker container using an Ansible playbook. This playbook will run from gitlab-ci with variables I set in the GitLab CI/CD configuration. How can I achieve that?
Here is the template I want to use. How do I set the user_id and password from the CI/CD variables?
tasks:
  - name: Run XYZ Container
    docker_container:
      name: XYZ
      restart_policy: on-failure
      image: xxxxxxxxxxx
      container_default_behavior: "compatibility"
      env:
        USER_ID= $USER_ID
        PASSWORD= $PASSWORD
Since gitlab-ci variables are just environment variables inside your job, and since your Ansible controller runs inside that job, you can use the env lookup to read them from the controller.
Please note that:
- the docker_container module's env parameter expects a dict, not a newline-separated string of bash-like variable definitions as in your example.
- as a security measure, you should either check that the vars are defined before using them (with an assert or fail task, see the sketch after the example below) or provide a default value in case they are not. My example uses a default value. For more on providing default values, see the Ansible documentation (and the original Jinja2 documentation to understand that d is an alias for default).
tasks:
  - name: Run XYZ Container
    docker_container:
      name: XYZ
      restart_policy: on-failure
      image: xxxxxxxxxxx
      container_default_behavior: "compatibility"
      env:
        USER_ID: "{{ lookup('env', 'USER_ID') | d('defaultuser', true) }}"
        PASSWORD: "{{ lookup('env', 'PASSWORD') | d('defaultpass', true) }}"
I wanted to use the CI_JOB_TOKEN, so I used:
tasks:
  - include_role:
      name: role_name
    vars:
      ci_job_token: "{{ lookup('env', 'CI_JOB_TOKEN') }}"
In my playbook I have tasks that use the hostname of a server and extrapolate data to set location and environment based on that. But some servers have unique names and I'm not sure how to set variables for those. I'd prefer not to use Ansible facts since I would like to share the playbook with a team. One way I was thinking of is shown below, but I'm running into issues. Could someone please guide me?
Create a vars_file inventory:
---
customservers:
  customhostname1:
    env: test
    location: hawaii
  customhostname2:
    env: prod
    location: alaska
In the playbook:
---
tasks:
  - name: set hostname
    shell: echo "customhostname1"
    register: my_hostname

  - name: setting env var
    set_fact:
      env: "{{ item.value.env }}"
    when: my_hostname == "{{ item.key }}"
    with_dict: "{{ customservers }}"

  - name: outputting env var
    debug:
      msg: the output is {{ env }}
Expected output should be test.
Thank you.
In my playbook i have tasks that use the hostname of a server and
extrapolate data to set location and environment based on that.
Bad Idea.
But some servers have unique names and I'm not sure how to set variables on those
And that is why.
The second Bad Idea is to have TEST and PROD in the same inventory. That's just begging for a disaster. They should be two completely separate inventories, though perhaps under the same parent directory:
inventories/
inventories/test/
inventories/test/hosts
inventories/test/host_vars/
inventories/test/host_vars/customhostname1.yml
inventories/prod/
inventories/prod/hosts
inventories/prod/host_vars/
inventories/prod/host_vars/customhostname2.yml
So inventories/prod/hosts could look like this (I prefer the ini format):
[customservers]
customhostname2 location=alaska
Or:
[customservers]
customhostname2
[customhostname2:vars]
location=alaska
But in any case, DO NOT combine test and prod inventories.
If you still need that env variable, you can either put it in group_vars/all.yml or right in the hosts file like so:
[all:vars]
env=prod
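For completeness, a minimal sketch of one of the host_vars files from the layout above (values taken from the question) could look like this:
# inventories/test/host_vars/customhostname1.yml
env: test
location: hawaii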
We're using an Ansible playbook with GitLab CI in this project, where we pass some variables from ENV_FILE through the Ansible playbook and then render a Jinja2 template with them.
The problem occurs when a variable has a $ in its value, which seems to be interpreted as a shell variable at some point, so the final value is rendered incorrectly.
For example, in ENV_FILE (set via the GitLab CI Settings > CI/CD > Variables menu):
export FIRST_VAR=...
export SOME_VAR='123$abc#xyz'
export SOME_OTHER_VAR=...
And the final result in docker-compose.yaml becomes 123#xyz
EDIT: We just tried changing to export SOME_VAR='123''$''abc#xyz', the final result becomes 123abc#xyz, still missing the $.
gitlab-ci.yaml
deploy:
  stage: deploy
  environment:
    name: dev
  script:
    - source $ENV_FILE
    - cd ansible && ansible-playbook -i inventory/dev.ini runapp.yaml --vault-password-file=${ANSIBLE_VAULT_FILE}
runapp.yaml
- hosts: app
  become: yes
  roles:
    - { role: some_app }
  vars:
    SOME_VAR: "{{ lookup('env', 'SOME_VAR') }}"
Task File:
- name: "Templating docker-compose file"
  become: yes
  template:
    src: app-docker-compose.yaml.j2
    dest: /opt/someapp/docker-compose.yaml
app-docker-compose.yaml.j2
someapp-svc:
  image: someapp:version
  restart: always
  ports:
    - ####:####
  environment:
    SOME_VAR: {{ SOME_VAR }}
Any hint about this?
Thanks!
I can reproduce that behavior when setting a CI/CD variable containing $; the docs hint at it, although they are written as if the problem only applies when setting variables inside .gitlab-ci.yml, which is demonstrably false.
If you want a CI/CD variable to contain a literal $, it needs to be doubled, so SOME_VAR would need to be written as 123$$abc#xyz on the CI/CD configuration page in order for it to materialize as 123$abc#xyz inside the pipeline (although, as the comments correctly point out, you will want to be exceedingly careful about the use of source to avoid further interpolation).
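As an illustration (a minimal sketch, assuming the variable were declared directly in .gitlab-ci.yml instead of on the settings page, where the same doubling rule applies):
variables:
  # $$ is GitLab's escape for a literal $, so the job sees 123$abc#xyz
  SOME_VAR: "123$$abc#xyz"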
I'm developing a role where I want to launch a Docker container. Among the tasks in my role I have one using the docker_container module to do that:
- name: Launch docker container
  docker_container:
    name: abc
    ...
This works fine, but now I want to have a variable that defines whether this container needs to be attached to a particular Docker network.
If I make it required, it is fine:
- name: Launch docker container
  docker_container:
    name: abc
    networks:
      - name: '{{ network_name_var }}'
    ...
But I want to allow users not to define it, in which case no networks: ... property should be added.
I have found no easy way of achieving this; is there one?
Semantically I want something like this:
- name: Launch docker container
  docker_container:
    name: abc
    {% if network_name_var is defined %}
    networks:
      - name: '{{ network_name_var }}'
    ...
    {% endif %}
Here is a possible scenario you can use. The key points:
- We keep your single network_name_var that is exposed to your user. I took for granted that this var could be either undefined or empty.
- We define the full network list dynamically if the var has a value set; the list stays unset otherwise.
- We use the omit placeholder so that no networks parameter is passed to the module if need be.
- name: demo playbook for omit
  hosts: localhost
  tasks:
    - name: set the list of networks for our container
      # don't define anywhere else: it should only exist
      # if network_name_var is set
      set_fact:
        my_networks:
          - name: '{{ network_name_var }}'
      when: network_name_var | default('') | length > 0

    - name: make sure container is started
      docker_container:
        name: abc
        networks: "{{ my_networks | default(omit) }}"
Hi, I have a template in an Ansible YAML file:
env:
  "{{ env }}"
This value is assigned when I execute:
ansible-playbook template.yaml --extra-vars @vars.yaml
vars.yaml has:
env:
  CONSUL_HOST: 'x.x.x.x:8500'
  SERVICE_NAME: 'backoffice-fe'
That works fine. Now I want to add another variable in template.yaml with the value of CONSUL_HOST.
I tried this without luck:
env:
  "{{ env }}"
NEW_VAR: env.CONSULT_HOST
I need to do this in template.yaml because it is used by a lot of modules; I don't want to modify all the modules because there are more than 100!
Is that possible?
Thanks in advance!
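One possible direction (a sketch that was not part of the thread, assuming the goal is to derive the extra key from the same env dict passed via --extra-vars) is the combine filter:
# template.yaml (sketch): merge NEW_VAR into the dict coming from vars.yaml
env:
  "{{ env | combine({'NEW_VAR': env.CONSUL_HOST}) }}"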
I am trying to craft a list of environment variables to use in tasks where the path may differ slightly on each host due to version differences.
For example, /some/common/path/v_123/rest/of/path
I created a list of these variables in a vars file that gets imported via roles.
roles/somerole/vars/main.yml contains the following:
somename:
  somevar: 'coolvar'
  env:
    SOME_LIB_PATH: /some/common/path/{{ unique_part.stdout }}/rest/of/path
I then have a task that runs something like this
- name: Get unique path part
  shell: 'ls /some/common/path/'
  register: unique_part
  tags: workflow

- name: Perform some actions that need some paths
  shell: 'binary argument argument'
  environment: somename.env
But I get some Ansible errors about variables not being defined.
Alternatively, I tried to predefine unique_part.stdout in the hope that register would overwrite the predefined variable, but then I got other Ansible errors: failure to template.
Is there another way to craft these variables based on command returns?
You can also use facts:
http://docs.ansible.com/set_fact_module.html
# Prepare unique variables
- hosts: myservers
  tasks:
    - name: Get unique path part
      shell: 'ls /some/common/path/'
      register: unique_part
      tags: workflow

    - name: Add as a fact for each host
      set_fact:
        library_path: "{{ unique_part.stdout }}"

# Launch roles that use those unique variables
- hosts: myservers
  roles:
    - somerole
This way you can dynamically add variables to your hosts before using them.
The vars file gets evaluated when it is read by Ansible. Your only option is to include a placeholder which you then have to replace yourself later, like this:
somename:
  somevar: 'coolvar'
  env:
    SOME_LIB_PATH: '/some/common/path/[[ unique_part.stdout ]]/rest/of/path'
And then later in your playbook you can replace that placeholder:
- name: Perform some actions that need some paths
  shell: 'binary argument argument'
  environment: '{{ somename.env | replace("[[ unique_part.stdout ]]", unique_part.stdout) }}'