Reload PATH for the entire playbook - ansible

As part of most of the Ansible playbooks I need to install Node and Mongo from internally hosted tarballs. Sudo privileges and internet access are not available. All of the Ansible runs happen against localhost.
One of the problems of this setup is that after untarring node/mongo, they need to be added to PATH or subsequent roles/tasks won't be able to rely on them. Unfortunately, I don't seem to be able to find a way to amend PATH within an Ansible playbook run.
I've tried using shell and command tasks to export PATH and to source .bashrc, but neither seems to help. Is there a way to use my node installation within the same playbook? The yum module would do the trick, but it isn't available to me here.

Have you tried using 'environment'?
You can get your local PATH into a variable:
environment:
  PATH: "{{ lookup('env', 'PATH') }}"
or you can set the PATH:
environment:
  PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"
The above assumes you can register the paths to Mongo and Node as vars and make them available to later plays.
Info on using environment & PATH locally and remotely is here:
https://serverfault.com/questions/577188/how-can-i-prepend-to-path-while-running-ansibles-pip-module
- hosts: localhost
  gather_facts: False
  vars:
    path1: "{{ lookup('env', 'PATH') }}"
  tasks:
    - shell: echo $PATH
      environment:
        PATH: 'mypath2'
      register: path2

    - shell: echo $PATH
      environment:
        PATH: 'mypath3'
      register: path3

    - shell: echo $PATH
      environment:
        PATH: "{{ path1 }}"
      register: path4

    - debug: msg={{ path1 }}
    - debug: msg={{ path2.stdout }}
    - debug: msg={{ path3.stdout }}
    - debug: msg={{ path4.stdout }}
    - debug: msg={{ lookup('env', 'PATH') }}
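Applied back to the original question, a minimal sketch could set PATH once at play level so every subsequent task and role sees the unpacked binaries (the /opt/... paths below are assumptions; point them at wherever your tarballs are extracted):

- hosts: localhost
  vars:
    node_path: /opt/node/bin    # assumed unpack location, adjust as needed
    mongo_path: /opt/mongo/bin  # assumed unpack location, adjust as needed
  # a play-level environment applies to every task and role in the play
  environment:
    PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"
  tasks:
    - name: node is now resolvable for the rest of the play
      command: node --version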

Related

Ansible set environment variable from a file and access it within the playbook

I have the below Ansible script which runs on localhost
- name: set env
  shell: ". /tmp/testenv"

- name: get env
  debug:
    msg: "{{ lookup('env','TEST') }}"
In the above script I'm trying to source the file and then access the environment variable with the lookup, but the variable never seems to be set. Is there any way I can get this to work?
The shell task runs in one process, while the lookup looks in the environment of the Ansible process, which is a different one. You have to echo the environment variable in your shell command and register the result.
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: set env
      shell: "echo 'export TEST=test' > /tmp/testenv"

    - name: check env
      shell: ". /tmp/testenv; echo $TEST"
      register: result

    - name: get env
      debug:
        msg: "{{ result.stdout }}"

Ansible - environment variables from .env file

I am trying to set up a playbook which will run a command to check the status of a service installed on the target machine. The command only works after the .env file has been sourced. The command to execute the .env file is . ./.env_file_name (a dot, a space, then the path), and the file contains a list of environment variables like export JAVA_HOME=/optware/java/jdk/1.2.
I tried to execute the environment file before running the command with the below playbook, but it is not working.
- hosts: name
  tasks:
    - name: execute env file
      command: . ./.env_file_name
      register: result
Is there a way in a playbook to execute the environment file, so the environment it sets is present on the target machine when we then run our command?
First, the . ./.env_file_name syntax is shell syntax and cannot work with the command module; you need to use the shell module.
Secondly, the shell environment is reset at every task, since each one is its own ssh round-trip (and therefore a new shell session), so loading the environment variables in one task will not make them available to the next tasks.
Depending on your context, you have some options:
1. Inventory environment variables
The best option is to keep the environment on the inventory side, in a variable with a different value per group/host (through group_vars/host_vars), and then pass it to the environment keyword:
# host_vars/my_host.yml
---
env_vars:
  VAR1: key1
  VAR2: key2

- hosts: my_host
  tasks:
    - name: Display environment variables
      command: env
      environment: "{{ env_vars }}"
Pros:
full ansible solution
will work for environment of every module
Cons:
need to know the environment variables at ansible side
2. Loading environment variables for every task
If your tasks are all shell/command (which I don't advise, as it's better to use the appropriate Ansible module whenever possible), you can simply load the env file every time with the shell module:
- hosts: my_host
  tasks:
    - name: Display environment variables
      shell: |
        . ./.env_file_name && env

    - name: Do another action
      shell: |
        . ./.env_file_name && do_something_else
Pros:
no need to know the environment variables at ansible side
Cons:
limited to tasks with shell module
3. Load environment variables from env_file into ansible fact
This option parses the env file once and for all and loads it into an Ansible fact to be used with the environment keyword.
- hosts: my_host
  tasks:
    - name: Get env file content
      slurp:
        src: ./.env_file_name
      register: env_file_content

    - name: Parse environment
      set_fact:
        env_vars: "{{ ('{' + (env_file_content.content | b64decode).split('\n') | select | map('regex_replace', '([^=]*)=(.*)', '\"\\1\": \"\\2\"') | join(',') + '}') | from_json }}"

    - name: Display environment variables
      command: env
      environment: "{{ env_vars }}"
Or, if the env file needs to be executed instead of parsed directly:
- hosts: my_host
  tasks:
    - name: Get env file content
      shell: . ./.env_file_name && env
      register: env_file_result

    - name: Parse environment
      set_fact:
        env_vars: "{{ ('{' + env_file_result.stdout_lines | map('regex_replace', '([^=]*)=(.*)', '\"\\1\": \"\\2\"') | join(',') + '}') | from_json }}"

    - name: Display environment variables
      command: env
      environment: "{{ env_vars }}"
Pros:
will work for environment of every module
no need to know the environment variables at ansible side
Cons:
could fail on bad formatting of file
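One extra caveat with the slurp variant: it parses the file literally, so lines written as export JAVA_HOME=... (as in the question) would keep the export prefix inside the key. A small tweak to the regex, my adjustment rather than part of the original answer, strips an optional prefix:

- name: Parse environment, tolerating an optional 'export ' prefix
  set_fact:
    env_vars: "{{ ('{' + (env_file_content.content | b64decode).split('\n') | select | map('regex_replace', '^(export )?([^=]*)=(.*)', '\"\\2\": \"\\3\"') | join(',') + '}') | from_json }}"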

How to permanently set environment variable?

Host is Ubuntu 16.04
I'm trying to set an environment variable for a user, with:
- hosts: all
  remote_user: user1
  tasks:
    - name: Adding the path in the bashrc files
      lineinfile: dest=/home/user1/.bashrc line='export MY_VAR=TEST' insertafter='EOF' state=present

    - name: Source the bashrc file
      shell: . /home/user1/.bashrc

    - debug: msg={{ lookup('env','MY_VAR') }}
Unfortunately it outputs:
TASK [debug] *******************************************************************
ok: [xxxxx.xxx] => {
"msg": ""
}
How can I export variable so next time I run some tasks on this machine I can use {{ lookup('env', 'MY_VAR') }} to get value of this variable?
Because lookups happen locally, and because each task runs in its own process, you need to do something a bit different:
- hosts: all
  remote_user: user1
  tasks:
    - name: Adding the path in the bashrc files
      lineinfile: dest=/home/user1/.bashrc line='export MY_VAR=TEST' insertafter='EOF' state=present

    - shell: . /home/user1/.bashrc && echo $MY_VAR
      args:
        executable: /bin/bash
      register: myvar

    - debug: var=myvar.stdout
In this example I am sourcing the .bashrc and checking the var in the same command, storing the value with register.
All lookups in Ansible are local. See documentation for details:
Note
Lookups occur on the local computer, not on the remote computer.
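A quick way to see the difference for yourself (illustration only, run against any remote host):

- hosts: all
  tasks:
    - name: HOME of the user running ansible-playbook (the control machine)
      debug:
        msg: "{{ lookup('env', 'HOME') }}"

    - name: HOME on the managed host
      shell: echo $HOME
      register: remote_home

    - debug:
        var: remote_home.stdout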

Ansible Set Dynamic Environment Variables

I know about Ansible's environment: keyword at the top of a playbook, but I don't think that will work for me, since I don't know the variables' values prior to running the playbook. I'm trying to retrieve package versions and PHP modules and log them to a file. I want to use a regex to capture each version and store it in an environment variable, then write variable=value lines to an environment file with a shell command. I also want to pull an array from the environment and loop through it. The problem is that Ansible doesn't persist the shell environment, so the environment variable gets wiped out between commands. This is simple in Bash. Is it possible in Ansible? I'm trying:
---
- hosts: all
  become: yes
  vars:
    site_variables:
      code_directory: /home/
      dependency_versions:
        WGET_VERSION: placeholder
        PHP_MODULES: placeholder
  tasks:
    - name: Retrieve Environment
      shell: export WGET_VERSION=$(wget --version | grep -o 'Wget [0-9]*.[0-9]*\+')
      shell: export PHP_MODULES=$(php -m)
      shell: echo "export {{ item }}={{ lookup('env', item ) }}" >> {{ site_variables.code_directory }}/.env.log
      with_items:
        - WGET_VERSION

    - name: Write PHP Modules Out
      shell: export PHP_MODULES=$(php -m)
      shell: export PHP_MODULES=$(echo {{ lookup('env', 'PHP_MODULES') }} | sed 's/\[PHP Modules\]//g')
      shell: export PHP_MODULES=$(echo {{ lookup('env', 'PHP_MODULES') }} | sed 's/\[Zend Modules\]//g')
      shell: export PHP_MODULES=({{ lookup('env', 'PHP_MODULES') }})
      shell: echo "# - {{ item.0 }}" >> {{ site_variables.code_directory }}/.env.log
      with_items:
        - "{{ lookup('env', 'PHP_MODULES') }}"
There's a lot going on here.
First, lookup always runs on the ansible control host, while the script that you pass to the shell module is running on the remote server. So you will never be able to get a remote environment variable using lookup.
For details: https://docs.ansible.com/ansible/playbooks_lookups.html
Secondly, environment variables don't propagate from a child process to its parent. If you have a script that does this...
export MYVARIABLE=foo
...and you run that script, your current environment will not suddenly have a variable named MYVARIABLE. This is just as true for processes spawned by Ansible as it is for processes spawned by your shell.
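A two-task playbook makes that scoping visible (illustration only; MYVARIABLE is the hypothetical variable from above):

- hosts: localhost
  gather_facts: false
  tasks:
    - name: export is visible inside this task's own shell process
      shell: export MYVARIABLE=foo && echo "inside: $MYVARIABLE"
      register: inside

    - name: the next task gets a fresh shell, so the variable is gone
      shell: echo "outside: ${MYVARIABLE:-unset}"
      register: outside

    - debug:
        msg: "{{ inside.stdout }} / {{ outside.stdout }}"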
If you want to set an ansible variable, consider using the register keyword to get the value:
- hosts: localhost
  gather_facts: false
  tasks:
    - name: get wget version
      command: wget --version
      register: wget_version_raw

    - name: extract wget version
      set_fact:
        wget_version: "{{ wget_version_raw.stdout_lines[0].split()[2] }}"

    - name: show wget version
      debug:
        msg: "wget version is: {{ wget_version }}"

Is there with_fileglob that works remotely in ansible?

Mainly, I want something similar to with_fileglob, but that globs the files on the remote/target machine, not on the one that is running Ansible.
Use the find module to filter the files, then process the resulting list:
- name: Get files on remote machine
  find:
    paths: /path/on/remote
  register: my_find

- debug:
    var: item.path
  with_items: "{{ my_find.files }}"
All of the with_* looping mechanisms are local lookups, unfortunately, so there's no really clean way to do this in Ansible. Remote operations by design must be enclosed in tasks, since they need to deal with connections, inventory, etc.
What you can do is generate your fileglob by shelling out on the remote host, registering the output, and looping over the stdout_lines part of the result.
So a trivial example may be something like this:
- name: get files in /path/
  shell: ls /path/*
  register: path_files

- name: fetch these back to the local Ansible host for backup purposes
  fetch:
    src: "{{ item }}"   # ls /path/* already returns full paths
    dest: /path/to/backups/
  with_items: "{{ path_files.stdout_lines }}"
This would connect to the remote host (e.g., host.example.com), get all the file names under /path/, and then copy them back to the Ansible host under /path/to/backups/host.example.com/.
Using ls /path/* didn't work for me, so here's an example that uses find and some simple regex to delete all nginx managed virtual hosts:
- name: get all managed vhosts
  shell: find /etc/nginx/sites-enabled/ -type f -name \*-managed.conf
  register: nginx_managed_virtual_hosts

- name: delete all managed nginx virtual hosts
  file:
    path: "{{ item }}"
    state: absent
  with_items: "{{ nginx_managed_virtual_hosts.stdout_lines }}"
You could use it to find all files with a specific extension or any other mix. For instance, to simply get all files in a directory: find /etc/nginx/sites-enabled/ -type f.
Here's a way to do it so that you can loop through everything found. In my example, I had to look for all instances of pip to wipe out awscli in preparation for installing awscli v2.0. I've done something similar with lineinfile to strip out vars in /etc/skel dotfiles.
- name: search for pip
  find:
    paths: [ /usr/local/bin, /usr/bin ]
    file_type: any
    pattern: pip*
  register: foundpip

- name: Parse out pip paths (say that 3 times fast)
  set_fact:
    pips: "{{ foundpip | json_query('files[*].path') }}"

- name: List all the found versions of pip
  debug:
    msg: "{{ pips }}"

# upgrading pip often leaves broken symlinks or older wrappers behind,
# which doesn't affect pip but breaks playbooks, so ignore errors
- name: remove awscli with found versions of pip
  pip:
    name: awscli
    state: absent
    executable: "{{ item }}"
  loop: "{{ pips }}"
  ignore_errors: yes
