Can I define a var directly in a role?
With the following in ansible/roles/myrole/tasks/main.yml I get an error:
vars:
  source: /var/www/test.xxx.com/proj/assets
  dest: /var/www/test.xxx.com/

- name: eb copy files
  shell: rsync -a {{ source }} {{ dest }}
or with this:
source=var/www/test.xxx.com/proj/assets
dest=var/www/test.xxx.com/
I get:
ERROR: Syntax Error while loading YAML script
To define variables inside of a task, you can use the set_fact module.
- set_fact:
    source: /var/www/test.xxx.com/proj/assets
    dest: /var/www/test.xxx.com/
You can also define variables for a role by placing them in the vars directory:
ansible/roles/myrole/vars/main.yml
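For example, a minimal vars file reusing the paths from the question (a sketch; adjust to your layout) could be:

# ansible/roles/myrole/vars/main.yml
source: /var/www/test.xxx.com/proj/assets
dest: /var/www/test.xxx.com/

Tasks in ansible/roles/myrole/tasks/main.yml can then use {{ source }} and {{ dest }} directly, without a vars: section.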
Related
I want to set a playbook-level environment, but only after executing a couple of tasks. I have found that I can define a playbook-level environment variable before the definition of any tasks, or task-level environment variables. But I haven't found how to set up an environment variable that can be used by all tasks following a given task.
- name: server properties
  hosts: kafka_broker
  vars:
    ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    ansible_host_key_checking: false
    date: "{{ lookup('pipe', 'date +%Y%m%d-%H%M%S') }}"
    copy_to_dest: "/export/home/kafusr/kafka/secrets"
    server_props_loc: "/etc/kafka"
    secrets_props_loc: "{{ server_props_loc }}/secrets"
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"
  tasks:
    - name: Create a directory if it does not exist
      file:
        path: "{{ copy_to_dest }}"
        state: directory
        mode: '0755'

    - name: Find files from "{{ server_props_loc }}"
      find:
        paths: /etc/kafka/
        patterns: "server.properties*"
        # ... the rest of the task
      register: etc_kafka_server_props

    - name: Find files from "{{ secrets_props_loc }}"
      find:
        paths: /etc/kafka/secrets
        patterns: "*"
        # ... the rest of the task
      register: etc_kafka_secrets_props

    - name: Copy the files
      copy:
        src: "{{ item.path }}"
        dest: "{{ copy_to_dest }}"
        remote_src: yes
      loop: "{{ etc_kafka_server_props.files + etc_kafka_secrets_props.files }}"

    - name: set masterkey content value
      set_fact:
        contents: "{{ lookup('file', '/export/home/kafusr/kafka/secrets/masterkey.txt') }}"
        extract_key2: "{{ contents.split('\n').2.split('|').2|trim }}"
I want to set CONFLUENT_SECURITY_MASTER_KEY after the set_fact task.
Is it possible to set a playbook-level environment variable, but after defining some tasks?
Thank you
UPDATE
Initially, when I was executing the playbook as originally defined, I was getting the error
fatal: [kafkaserver1]: FAILED! => {"msg": "The field 'environment' has an invalid value,
which includes an undefined variable. The error was: 'extract_key2' is undefined"}
which was expected, as the variable extract_key2 had not been set before the files were copied to the desired directory.
After @Zeitounator's suggestion, when I added a default to the environment variable's definition,
CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 | default('') }}"
I now get a different error:
TASK [set masterkey content value] ********************
fatal: [kafkaserver1]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'contents' is undefined

The error appears to be in '/export/home/kafuser/tmp/so-71538207-question.yml': line 43, column 7, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

- name: set masterkey content value
  ^ here"}
I get this on all 3 brokers in the console, and I have checked that the file exists. I did a cat on that file, copying the path from the error to make sure there was no typo, and its contents are displayed on the console.
Update 2
I am trying to figure out how to use slurp to get the info, with the same approach as @Zeitounator's example using lookup.
This is what I am trying. The current definition is, of course, erroneous; I just wanted to show what I am trying to do. But can it be done with slurp, and am I on the right path?
environment:
  CONFLUENT_SECURITY_MASTER_KEY: >-
    {{
      (
        ((slurp: src: /export/home/z8tpush/kafka/secrets/masterkey.txt)['content'] | b64decode).split('\n').2.split('|').2|trim
      )
    }}
@Zeitounator - will you be able to direct me to an example where a slurp or fetch module is used to set up an environment variable, and where the value gets updated after the tasks that create the file have executed, similar to what you have shown with the lookup filter? I would really appreciate it.
Note:
Ultimately, I want to use Ansible to create a new Kafka user with Confluent's CLI commands (using the shell or command module), verify it in my directory and, once satisfied, encrypt the security.properties file using the masterkey and copy it to the appropriate location where Confluent is installed.
As already mentioned, you can

- set environment variables globally with Ansible Configuration Settings
- set the remote environment in a task
Regarding your question
I haven't found how can I set-up an environment variable that can be used by all tasks following a task.
You can also set the environment at block level, for a logical group of tasks:
Setting the remote environment: "When you set a value with environment: at the play or block level, it is available only to tasks within the play or block that are executed by the same user."
This means you would need to define a block for the next tasks
- name: Block of next task(s)
  block:
    - name: Next task
      ...
  environment:
    CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"
Regarding your question
Is it possible to set playbook level environment variable, but after defining some tasks?
No, not at that level in that run, since the playbook is already running.
Another option might be to move the tasks in question into another role, playbook, or task file and include_* it.
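A sketch of that last option (untested; the file name remaining_tasks.yml is made up for illustration): move the tasks that need the key into a separate task file and apply the environment to everything it contains.

- name: Run the remaining tasks with the master key in their environment
  include_tasks:
    file: remaining_tasks.yml  # hypothetical task file holding the later tasks
    apply:
      environment:
        CONFLUENT_SECURITY_MASTER_KEY: "{{ extract_key2 }}"

Because the include runs after your set_fact task, extract_key2 is already defined when the environment is templated.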
You cannot set_fact a var that depends on another var declared in the same set_fact task. Moreover, there is absolutely no need for set_fact here, as long as your relevant tasks can live with an empty environment var until it is fully defined. The following environment declaration (untested) should work and return the key for every task running after your file exists.
environment:
  CONFLUENT_SECURITY_MASTER_KEY: >-
    {{
      (
        (
          lookup('file', '/export/home/kafusr/kafka/secrets/masterkey.txt', errors='ignore')
          | default('')
        ).split('\n').2
        | default('')
      ).split('|').2
      | default('')
      | trim
    }}
I feel I must be missing the answer, as it seems obvious, but I've read a number of posts and have not been able to get this working.
Currently I am loading and then templating vars from files depending on inventory hostnames, like so:
- name: load unique dev vars from file
  include_vars:
    file: ~/ansible/env-dev.yml
  when: inventory_hostname in groups['devs']

- name: load unique prod vars from file
  include_vars:
    file: ~/ansible/env-prod.yml
  when: inventory_hostname == 'prod'

- name: copy .env dev file with templated vars
  ansible.builtin.template:
    src: ~/ansible/env-dev.j2
    dest: /home/{{ inventory_hostname }}/.env
    owner: '{{ inventory_hostname }}'
    group: '{{ inventory_hostname }}'
    mode: '0600'
  when: inventory_hostname in groups['devs']
This works fine, but ultimately it requires me to create a ton of .yml files when I would rather include some variables in certain steps instead.
Is it possible to load vars for a specific task? I've tried a number of solutions but haven't been able to make it work yet. See below for one method I tried, using vars at the end of the task.
- name: copy .env dev file with templated vars
  ansible.builtin.template:
    src: ~/ansible/env-dev.j2
    dest: /home/{{ inventory_hostname }}/.env
    owner: '{{ inventory_hostname }}'
    group: '{{ inventory_hostname }}'
    mode: '0600'
  when: inventory_hostname in groups['devs']
  vars:
    NODE_ENV: development
    PORT: 66
The key to organizing your Ansible code is to rely on group vars.
This feature loads variables according to the groups a host belongs to. There are several ways to use it; one of the clearest is YAML files named after each group in the group_vars folder (plus an all.yaml matching all hosts). Ansible will pick them up automatically, so you can get rid of your first two include_vars tasks. You can combine them with variables specific to the role and/or the playbook, so you end up with a set of variables coming from the host (the target) and from the role / playbook (the task to achieve).
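For instance, the layout could look like this (file names are illustrative and must match your inventory group names):

group_vars/
  all.yaml   # variables shared by every host
  dev.yaml   # variables for hosts in the 'dev' group
  prod.yaml  # variables for hosts in the 'prod' group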
To replace the hardcoded src: ~/ansible/env-dev.j2 you could for example define a variable in each group.
---
# dev.yaml
template_name: "env-dev.j2"
---
# prod.yaml
template_name: "env-prod.j2"
And then use it in your playbook / role src: "{{ template_name }}".
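Put together, the templating task from the question no longer needs per-environment duplication, since the right template is chosen by group membership (a sketch reusing the question's own task):

- name: copy .env file with templated vars
  ansible.builtin.template:
    src: "~/ansible/{{ template_name }}"  # template_name comes from group_vars
    dest: /home/{{ inventory_hostname }}/.env
    owner: '{{ inventory_hostname }}'
    group: '{{ inventory_hostname }}'
    mode: '0600'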
In one of my Ansible roles, I have created a variable via the defaults/main.yml file. I am able to reference this variable just fine in the tasks/main.yml file. However, the variable does not appear to work in files/some_file.txt.
Is this expected?
Yes, that's expected. tasks/main.yml is parsed by Ansible, which replaces variables as you've seen.
Generally, files/some_file.txt should contain static files or scripts to be used with the copy module. As you've discovered, it will not be parsed beyond that.
If you want to use variables in a file, you should use the template module. Create a templates directory and copy your files there, e.g. templates/some_file.txt. Note that it's common to rename the file with a .j2 extension to indicate that it is a Jinja template, e.g. some_file.j2, but this is not required.
- name: Create fact
  set_fact:
    my_variable: 123456

- name: Create file from template
  template:
    src: some_file.j2
    dest: "/tmp/some_file.txt"
    mode: 0755
some_file.j2 might contain:
This file contains this sentence and the number {{ my_variable }}
After the template task runs, /tmp/some_file.txt will look like:
This file contains this sentence and the number 123456
Use template. For example, the role
shell> cat roles/role-17/defaults/main.yml
var1: value1
shell> cat roles/role-17/files/some_file.txt
[{{ var1 }}]
shell> cat roles/role-17/tasks/main.yml
- debug:
    msg: task/main.yml [{{ var1 }}]

- debug:
    msg: files/some_file.txt {{ lookup('template', 'files/some_file.txt') }}
and the playbook
shell> cat test-17.yml
- hosts: localhost
  roles:
    - role-17
give
"msg": "task/main.yml [value1]"
"msg": "files/some_file.txt [value1]\n"
I am trying to do a custom install of openedx, and I have a bunch of .yml files with environment variables in them, at paths that look like this:
playbooks/roles/<component-name>/defaults/main.yml
Then, while running a playbook that installs all such components, I'm using a command like this:
ansible-playbook ./openedx_native.yml -e"@roles/<component-name-1>/defaults/main.yml" -e"@roles/<component-name-2>/defaults/main.yml"
Now I want to be able to use the main.yml files from all components, and there are about 20-25 of them, so I'm looking for a way to include them using a wildcard, something like this:
ansible-playbook ./openedx_native.yml -e"@roles/*/defaults/main.yml"
This, of course, doesn't work, and Ansible throws an error like this:
ERROR! the file_name
'/var/tmp/configuration/playbooks/roles/*/defaults/main.yml' does not
exist, or is not readable
How do I achieve this? Please help!
An option would be to find the files and include_vars.
tasks:
  - command: "sh -c 'find {{ playbook_dir }}/roles/*/defaults/main.yml'"
    register: result

  - include_vars:
      file: "{{ item }}"
    loop: "{{ result.stdout_lines }}"
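If you prefer to avoid shelling out, an untested alternative sketch uses the find module on the controller and filters the results down to the defaults directories:

tasks:
  - name: Locate all role defaults files on the controller
    find:
      paths: "{{ playbook_dir }}/roles"
      patterns: main.yml
      recurse: yes
    delegate_to: localhost
    register: found_defaults

  - name: Load every roles/*/defaults/main.yml
    include_vars:
      file: "{{ item.path }}"
    loop: "{{ found_defaults.files | selectattr('path', 'search', '/defaults/') | list }}"

Keep in mind that vars loaded this way do not have the same precedence as -e extra vars (extra vars always win), which may matter if values overlap.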
If you have the flexibility to change and re-arrange the environment variables and their values in group_vars/all.yaml, like:

environments:
  - { name: 'development', profile: 'small' }
  - { name: 'staging', profile: 'medium' }
  - { name: 'production', profile: 'complex' }
then you can use this variable in any task, for example to create a folder for each environment name:
- name: create folders for Environment
  file:
    path: "{{ target }}/{{ item.name }}"
    state: directory
    mode: 0755
  with_items: "{{ environments }}"
I'm using the ec2 module with ansible-playbook, and I want to set a variable to the contents of a file. Here's how I'm currently doing it:
1. A var with the filename.
2. A shell task to cat the file.
3. The result of the cat is passed to the ec2 module.

Example contents of my playbook:
vars:
  amazon_linux_ami: "ami-fb8e9292"
  user_data_file: "base-ami-userdata.sh"
tasks:
  - name: user_data_contents
    shell: cat {{ user_data_file }}
    register: user_data_action
  - name: launch ec2-instance
    local_action:
      ...
      user_data: "{{ user_data_action.stdout }}"
I assume there's a much easier way to do this, but I couldn't find it while searching the Ansible docs.
You can use lookups in Ansible in order to get the contents of a file, e.g.
user_data: "{{ lookup('file', user_data_file) }}"
Caveat: This lookup will work with local files, not remote files.
Here's a complete example from the docs:
- hosts: all
  vars:
    contents: "{{ lookup('file', '/etc/foo.txt') }}"
  tasks:
    - debug: msg="the value of foo.txt is {{ contents }}"
You can use the slurp module to fetch a file from the remote host (thanks to @mlissner for suggesting it). Note that slurp returns the content base64-encoded, hence the b64decode filter below:
vars:
  amazon_linux_ami: "ami-fb8e9292"
  user_data_file: "base-ami-userdata.sh"
tasks:
  - name: Load data
    slurp:
      src: "{{ user_data_file }}"
    register: slurped_user_data

  - name: Decode data and store as fact  # You can skip this if you want to use the right-hand side directly...
    set_fact:
      user_data: "{{ slurped_user_data.content | b64decode }}"
You can use the fetch module to copy files from remote hosts to the local machine, and the file lookup to read the content of the fetched files.
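An untested sketch of that combination (the /tmp/fetched destination and /etc/foo.txt path are made up; by default, fetch stores the file under dest/<inventory_hostname>/<remote path>):

- name: Fetch the file from the remote host to the controller
  fetch:
    src: /etc/foo.txt
    dest: /tmp/fetched
- name: Read the fetched copy on the controller
  set_fact:
    contents: "{{ lookup('file', '/tmp/fetched/' ~ inventory_hostname ~ '/etc/foo.txt') }}"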
lookup only works on localhost. If you want to retrieve variables from a variables file you made remotely, use include_vars: {{ varfile }}. The contents of {{ varfile }} should be a dictionary of the form {"key":"value"}; you will find Ansible gives you trouble if you include a space after the colon.