In Ansible, how is the environment keyword used?

I have a playbook to install PythonBrew. In order to do this, I have to modify the shell environment. Because shell steps in Ansible are not persistent, I have to prepend export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; to the beginning of each of my PythonBrew-related commands:
- name: Install python binary
  shell: export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; pythonbrew install ${python.version}
         executable=/bin/bash
- name: Switch to python version
  shell: export PYTHONBREW_ROOT=${pythonbrew.root}; source ${pythonbrew.root}/etc/bashrc; pythonbrew switch ${python.version}
         executable=/bin/bash
I'd like to eliminate that redundancy. On the Ansible discussion group, I was referred to the environment keyword. I've looked at the examples in the documentation, but it's not clicking for me. To me, the environment keyword doesn't look much different from any other variable.
I've looked for other examples but have only been able to find this very simple example.
Can someone demonstrate how the environment keyword functions in Ansible, preferably with the code sample I've provided above?

Not sure if it fits your need, but this is how I see it:
- hosts: all
  vars:
    env:
      PYTHONBREW_ROOT: "{{ pythonbrew.root }}"
  tasks:
    - name: Install python binary
      shell: pythonbrew install {{ python.version }} executable=/bin/bash
      environment: env
    - name: Switch to python version
      shell: pythonbrew switch {{ python.version }} executable=/bin/bash
      environment: env
It simply sets a variable named env and reuses it as the environment for both of your shell commands. This way your shell commands will have PYTHONBREW_ROOT set.
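For reference, the environment keyword can also be set at the play level so it applies to every task; a minimal sketch, assuming hypothetical pythonbrew_root and python_version variables in place of the dotted names above:

- hosts: all
  vars:
    pythonbrew_root: /opt/pythonbrew   # assumed install location, adjust to your setup
    python_version: "2.7.18"           # assumed version
  environment:                         # applied to every task in this play
    PYTHONBREW_ROOT: "{{ pythonbrew_root }}"
  tasks:
    - name: Install python binary
      shell: source {{ pythonbrew_root }}/etc/bashrc && pythonbrew install {{ python_version }}
      args:
        executable: /bin/bash
    - name: Switch to python version
      shell: source {{ pythonbrew_root }}/etc/bashrc && pythonbrew switch {{ python_version }}
      args:
        executable: /bin/bash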

I have a very similar issue; I'd like to have Ansible do its stuff inside a Python virtualenv (after it's made sure the virtualenv is set up for me, of course).
Here's one way I've done the environment preconditions so far; essentially I have had to add (and optionally remove) lines to .bashrc:
tasks:
  - name: "Enable virtualenv in .bashrc"
    lineinfile: dest=.bashrc
                line="source {{ PROJECT_HOME }}/venv/bin/activate"

  # Put tasks that rely on this environmental precondition here (?)

  - name: "Disable virtualenv in .bashrc"
    lineinfile: dest=.bashrc
                line="source {{ PROJECT_HOME }}/venv/bin/activate"
                state=absent
I don't know if I'm "Doing It Wrong", but until I figure it out or someone comes along to tell me how to do it better, I suppose this will work.
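An alternative worth sketching (an assumption on my part, not something I've tested against your setup): instead of editing .bashrc, point the play's environment at the virtualenv's bin directory so subsequent tasks pick up its python and pip first:

- hosts: all
  vars:
    venv_dir: "{{ PROJECT_HOME }}/venv"   # reuses the PROJECT_HOME variable from above
  environment:
    VIRTUAL_ENV: "{{ venv_dir }}"
    PATH: "{{ venv_dir }}/bin:{{ ansible_env.PATH }}"   # needs gathered facts for ansible_env
  tasks:
    - name: Check which python the tasks will use
      command: which python
      register: which_python
      changed_when: false

    - name: Show the interpreter path
      debug:
        var: which_python.stdout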

Related

How to set different environment variables for each managed host in ansible

I have an inventory file which looks like below:
[abc]
dsad1.jkas.com
dsad2.jkas.com
[def]
dsad3.jkas.com
dsad4.jkas.com
[main:children]
abc
def
[main:vars]
ansible_user="{{ lookup('env', 'uid') }}"
ansible_password="{{ lookup('env', 'pwd') }}"
ansible_connection=paramiko
main.yaml looks like below:
---
- hosts: "{{my-hosts}}"
roles:
- role: dep-main
tags:
- main
- role: dep-test
tags:
-test
cat roles/dep-main/tasks/main.yaml
- name: run playbook
  script: path/scripts/dep-main.sh
where I have a scripts folder containing dep-main.sh; I'm using the script module to run the shell script on the remote machine.
ansible-playbook -i inventory -e "my_hosts=main" --tags main main.yaml
I am following the above design for a new requirement. Now the challenge is that I need to set environment variables for each host, and the variables differ per host. How can I achieve this?
There are around 15 env key/value pairs that need to be exported to each host, of which 10 are common (I'll simply put those in the shell script above), whereas the other 5 key/value pairs differ per host, like below:
dsad1.jkas.com
sys=abc1
cap=rty2
jam=yup4
pak=hyd4
jum=563
dsad2.jkas.com
sys=abc45
cap=hju
jam=upy
pak=upsc
jum=y78
please help.
Please see the Ansible documentation for this, as it's rather self-explanatory:
https://docs.ansible.com/ansible/latest/user_guide/intro_inventory.html
I am editing my initial answer to try to make it clearer. But please note that I am assuming a few things here:
you are using Linux
you use bash as your shell (Linux offers a variety of shells, and the syntax differs)
when you say "export environment variables", I assume you mean copying the variables to the remote host in some form and exporting them to the shell
Also, there are a few things left to do, but I have to leave them to you:
You need to "load" the env file so the variables are available to the shell (exported). You can add an extra line to the ".bashrc" file to source the file, like "source /etc/env_variables", but that will depend on how you want to use the vars. You can easily do that with your shell script or with Ansible (lineinfile).
I recommend you use a separate file like I did rather than editing .bashrc with all the variables. It will make your life easier in the future if you want to update the variables.
Rename the env file properly so you know what it's for, and add some backup mechanism - check the documentation: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/blockinfile_module.html
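As a rough sketch of that "load the file" step with lineinfile (assuming bash and the /etc/env_variables path used below):

- name: Source the env file from the user's .bashrc
  lineinfile:
    path: ~/.bashrc                      # resolves to the home of the user the task runs as
    line: "source /etc/env_variables"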
Inventory file with the variables:
all:
  hosts:
    <your_host_name1>:
      sys: abc1
      cap: rty2
      jam: yup4
      pak: hyd4
      jum: 563
    <your_host_name2>:
      sys: abc45
      cap: hju
      jam: upy
      pak: upsc
      jum: y78
My very simplistic playbook:
- hosts: all
  become: yes
  become_method: sudo
  tasks:
    - name: Create env file and add variables
      blockinfile:
        path: /etc/env_variables
        create: yes
        block: |
          export sys={{ sys }}
          export cap={{ cap }}
          export jam={{ jam }}
          export pak={{ pak }}
          export jum={{ jum }}
The final result on the servers I used for testing is the following:
server1:
# BEGIN ANSIBLE MANAGED BLOCK
export sys=abc1
export cap=rty2
export jam=yup4
export pak=hyd4
export jum=563
# END ANSIBLE MANAGED BLOCK
server2:
# BEGIN ANSIBLE MANAGED BLOCK
export sys=abc45
export cap=hju
export jam=upy
export pak=upsc
export jum=y78
# END ANSIBLE MANAGED BLOCK
Hope this is what you are asking for.
For Windows, you may have to find where the user variables are set and adjust the playbook to edit that file, if they are in a text file...

Ansible Builtin Lineinfile to ~/.bashrc

I'm relatively new to Ansible, so apologies if this question misses something.
My goal is to add a line to the ~/.bashrc file with Ansible. I think the best way to do that is with the ansible.builtin.lineinfile module.
Unfortunately, when I run the module it appears to run properly on the target host machine and reports back 'changed' on the first run (and 'ok' on subsequent runs), but no changes are actually made in the ~/.bashrc file.
Appreciate any help in figuring out what changes would be needed to create the desired outcome.
---
- hosts: setup
  become: true
  vars_files:
    - /etc/ansible/vars.yml
  tasks:
    - name: Test lineinfile
      ansible.builtin.lineinfile:
        path: ~/.bashrc
        line: "test lineinfile"
Changed path: ~/.bashrc to path: .bashrc and it worked.
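A plausible explanation (not confirmed in the thread) is that with become: true, ~ expands to the elevated user's home, so the edit lands in root's .bashrc instead of the intended one. A sketch that targets a specific user's file explicitly, assuming a conventional /home/<user> layout:

- name: Test lineinfile against an explicit home directory
  ansible.builtin.lineinfile:
    path: "/home/{{ ansible_user }}/.bashrc"   # assumes the login user's home is under /home
    line: "test lineinfile"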
Multiple lines could be done this way:
- name: Add an environment variable to the remote user's shell.
  lineinfile:
    dest: "~/.bashrc"
    line: |
      Line 1
      Line 2
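If the goal is a managed multi-line block rather than a single line, blockinfile (used elsewhere on this page) may be a better fit; a sketch with placeholder variable names and values:

- name: Add several environment variables to the remote user's shell
  blockinfile:
    path: ~/.bashrc
    block: |
      export FIRST_VAR=value1
      export SECOND_VAR=value2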

Ansible lookup env with $ and inject into Jinja2 template not working

We're using an Ansible playbook with GitLab CI in this project, where we pass some variables from ENV_FILE through the Ansible playbook and then render a Jinja2 template with them.
The problem occurs when some variable has $ in its value, which seems to be interpreted as a shell variable at some point, so the final value is rendered incorrectly.
For example, in ENV_FILE
(set via GitLab CI Settings > CI/CD > Variables menu):
export FIRST_VAR=...
export SOME_VAR='123$abc#xyz'
export SOME_OTHER_VAR=...
And the final result in docker-compose.yaml becomes 123#xyz
EDIT: We just tried changing to export SOME_VAR='123''$''abc#xyz', and the final result becomes 123abc#xyz, still missing the $.
gitlab-ci.yaml
deploy:
  stage: deploy
  environment:
    name: dev
  script:
    - source $ENV_FILE
    - cd ansible && ansible-playbook -i inventory/dev.ini runapp.yaml --vault-password-file=${ANSIBLE_VAULT_FILE}
runapp.yaml
- hosts: app
  become: yes
  roles:
    - { role: some_app }
  vars:
    SOME_VAR: "{{ lookup('env', 'SOME_VAR') }}"
Task File:
- name: "Templating docker-compose file"
become: yes
template:
src: app-docker-compose.yaml.j2
dest: /opt/someapp/docker-compose.yaml
app-docker-compose.yaml.j2
someapp-svc:
  image: someapp:version
  restart: always
  ports:
    - ####:####
  environment:
    SOME_VAR: {{ SOME_VAR }}
Any hint about this?
Thanks!
I can reproduce that behavior when setting a CI/CD variable containing $; the docs kind of hint at it, although they are written as if the problem only applies when setting variables inside .gitlab-ci.yml, which is demonstrably false.
If you want a CI/CD variable to contain a literal $, it needs to be doubled, so SOME_VAR would need to be written as 123$$abc#xyz in the CI/CD configuration page in order for it to materialize as 123$abc#xyz inside the pipeline (although, as the comments correctly point out, one will want to be exceedingly careful about the use of source to avoid further interpolation).
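For illustration, the same doubling would apply if the variable were declared in .gitlab-ci.yml itself (a sketch, not this project's actual configuration):

variables:
  SOME_VAR: "123$$abc#xyz"   # $$ is GitLab CI's escape for a literal $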

How to get ansible to update PATH

Is there a way to do:
export PATH=$PATH:/new/path/to/bin
in Ansible, if possible without using shell or command?
I've tried this
- name: Add another bin dir to system-wide $PATH.
  copy:
    dest: /etc/profile.d/custom-path.sh
    content: 'PATH=$PATH:{{ my_custom_path_var }}'
That I got from:
https://www.jeffgeerling.com/comment/reply/2799
But it doesn't work, as PATH ends up as:
\$PATH:/new/path/to/bin
which breaks the system's PATH.
Thanks!
Using shell or command would be:
- name: Add pm2 to PATH
  shell: echo "PATH=$PATH:/new/path/to/bin" > /etc/environment
  become: true
But I'd still prefer an option that doesn't use shell/command.
Solution without using the shell module: instead of
content: 'PATH=$PATH:{{ my_custom_path_var }}'
use
content: 'export PATH=$PATH:{{ my_custom_path_var }}'
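Put together, the corrected task might look like this (a sketch; the filename and variable are taken from the question):

- name: Add another bin dir to system-wide $PATH
  copy:
    dest: /etc/profile.d/custom-path.sh
    content: 'export PATH=$PATH:{{ my_custom_path_var }}'
    mode: '0644'   # readable by all login shells
  become: true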

Running Python script via ansible

I'm trying to run a Python script from an Ansible playbook. I would think this would be an easy thing to do, but I can't figure it out. I've got a project structure like this:
playbook-folder
  roles
    stagecode
      files
        mypythonscript.py
      tasks
        main.yml
  release.yml
I'm trying to run mypythonscript.py within a task in main.yml (which is a role used in release.yml). Here's the task:
- name: run my script!
  command: ./roles/stagecode/files/mypythonscript.py
  args:
    chdir: /dir/to/be/run/in
  delegate_to: 127.0.0.1
  run_once: true
I've also tried ../files/mypythonscript.py. I thought the path for ansible would be relative to the playbook, but I guess not?
I also tried debugging to figure out where I am in the middle of the script, but no luck there either.
- name: figure out where we are
  stat: path=.
  delegate_to: 127.0.0.1
  run_once: true
  register: righthere

- name: print where we are
  debug: msg="{{ righthere.stat.path }}"
  delegate_to: 127.0.0.1
  run_once: true
That just prints out ".". So helpful ...
Try using the script module; it works for me.
My main.yml:
---
- name: execute install script
  script: get-pip.py
and the get-pip.py file should be in the files directory of the same role.
If you want to be able to use a relative path to your script rather than an absolute path, then you might be better off using the role_path magic variable to find the path to the role and work from there.
With the structure you are using in the question the following should work:
- name: run my script!
  command: ./mypythonscript.py
  args:
    chdir: "{{ role_path }}/files"
  delegate_to: 127.0.0.1
  run_once: true
An alternative/straightforward solution:
Let's say you have already built your virtual env under ./env1 and used pip3 to install the needed Python modules.
Now write the playbook task like:
- name: Run a script using an executable in a system path
  script: ./test.py
  args:
    executable: ./env1/bin/python
  register: python_result

- name: Get stdout or stderr from the output
  debug:
    var: python_result.stdout
If you want to execute an inline script without having a separate script file (for example, as a molecule test), you can write something like this:
- name: Test database connection
  ansible.builtin.command: |
    python3 -c
    "
    import psycopg2;
    psycopg2.connect(
      host='127.0.0.1',
      dbname='db',
      user='user',
      password='password'
    );
    "
You can even insert Ansible variables in this string.
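For instance, a sketch that templates hypothetical db_host and db_password variables into the same inline call:

- name: Test database connection with templated credentials
  ansible.builtin.command: |
    python3 -c
    "
    import psycopg2;
    psycopg2.connect(
      host='{{ db_host }}',
      dbname='db',
      user='user',
      password='{{ db_password }}'
    );
    "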
