Host is Ubuntu 16.04
I'm trying to set an environment variable for a user with:
- hosts: all
  remote_user: user1
  tasks:
    - name: Adding the path in the bashrc files
      lineinfile: dest=/home/user1/.bashrc line='export MY_VAR=TEST' insertafter='EOF' state=present
    - name: Source the bashrc file
      shell: . /home/user1/.bashrc
    - debug: msg={{lookup('env','MY_VAR')}}
Unfortunately it outputs:
TASK [debug] *******************************************************************
ok: [xxxxx.xxx] => {
    "msg": ""
}
How can I export the variable so that the next time I run tasks on this machine I can use {{ lookup('env', 'MY_VAR') }} to get its value?
Because lookups happen locally, and because each task runs in its own process, you need to do something a bit different.
- hosts: all
  remote_user: user1
  tasks:
    - name: Adding the path in the bashrc files
      lineinfile: dest=/home/user1/.bashrc line='export MY_VAR=TEST' insertafter='EOF' state=present
    - shell: . /home/user1/.bashrc && echo $MY_VAR
      args:
        executable: /bin/bash
      register: myvar
    - debug: var=myvar.stdout
In this example I am sourcing the .bashrc and checking the variable in the same command, storing the value with register.
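If a later task in the same play needs that value, one option is to hand the registered output to it through the environment keyword. A minimal sketch reusing the myvar register from above (the task and register names are placeholders):

- name: Use the captured value in a later task
  shell: echo "MY_VAR is $MY_VAR"
  environment:
    MY_VAR: "{{ myvar.stdout }}"
  register: echoed

- debug: var=echoed.stdout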
All lookups in Ansible are local. See the documentation for details:
Note: Lookups occur on the local computer, not on the remote computer.
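If you actually need to inspect the remote machine's environment, the gathered ansible_env fact is the usual place to look instead of lookup('env'). A minimal sketch, with the caveat that ansible_env reflects the environment of the shell Ansible logs in with, so a variable exported only from .bashrc may still not appear there:

- hosts: all
  remote_user: user1
  gather_facts: yes
  tasks:
    # Dump the environment Ansible sees on the remote host
    - debug: var=ansible_env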
Related
I am trying to convert an existing shell script into an Ansible role. In this role, I read two environment variables, but Ansible does not display them even though they are available on the host. Can anyone please help me understand what I am doing wrong?
Note: I cannot hardcode env.sh into my Ansible role as each region will have its own settings.
/etc/synopsys/bin/env.sh:
#!/bin/sh
SITEID="us01-savvis"
MYGLOBAL="/remote/kickstart"
export SITEID MYGLOBAL
Ansible code:
---
- name: Gather Facts
  setup:
    gather_subset:
      - '!all'
      - '!any'
      - facter
      - network
      - hardware
  async: 300
  poll: 20

- name: Check if env.sh exists
  stat:
    path: /etc/synopsys/bin/env.sh
  register: stat_result

- name: Source env.sh file if it exists
  shell: "source /etc/synopsys/bin/env.sh"
  when: stat_result.stat.exists == True

- name: Printing all the environment variables in Ansible
  debug:
    # msg: "{{ ansible_env }}"
    msg: "{{ lookup('env','SITEID','MYGLOBAL','HOME','SHELL') }}"
Ansible output (Note that SITEID and MYGLOBAL are not visible):
TASK [common/run_pkg_checker/v1 : Printing all the environment variables in Ansible] *************************************************
ok: [ansible-poc-cos6] => {
    "msg": ",,/u/subburat,/usr/local/bin/tcsh"
}
Linux environment variables (SITEID and MYGLOBAL defined):
[root@ansible-poc-cos6 ~]# env | grep MYGLOBAL
MYGLOBAL=/remote/kickstart
[root@ansible-poc-cos6 ~]# env | grep SITEID
SITEID=us01-savvis
First, each Ansible task runs in its own sub-process.
Second, a sub-process cannot change the environment of its parent process.
So the task that runs the source env.sh command has no effect on the Ansible process or on the tasks that follow it.
For your case, you can source env.sh before running the ansible-playbook command, or use the --extra-vars / -e option to avoid hard-coding values in your playbook:
source /etc/synopsys/bin/env.sh
ansible-playbook your-playbook.yml
# OR
ansible-playbook -e SITEID=xxx -e MYGLOBAL=yyy your-playbook.yml
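If the values really have to come from the remote env.sh itself, another option is to echo them from a shell task and keep them with register/set_fact. A minimal sketch reusing the stat_result register from the playbook above (the fact names are placeholders):

- name: Read the values from env.sh on the remote host
  shell: ". /etc/synopsys/bin/env.sh && echo $SITEID $MYGLOBAL"
  register: env_out
  when: stat_result.stat.exists

- name: Keep them as facts for later tasks
  set_fact:
    siteid: "{{ env_out.stdout.split()[0] }}"
    myglobal: "{{ env_out.stdout.split()[1] }}"

Later tasks can then use {{ siteid }} and {{ myglobal }} without hard-coding anything per region.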
I have the Ansible tasks below, which run on localhost:
- name: set env
  shell: ". /tmp/testenv"

- name: get env
  debug:
    msg: "{{ lookup('env','TEST') }}"
In the tasks above I'm trying to source the file and then access the environment variable with the lookup, but it seems the variable is not set. Is there any way I can get this to work?
The shell command is one process and the lookup looks in the environment of the Ansible process, which is a different one. You have to echo the environment variable in your shell command and register the result.
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: set env
      shell: "echo 'export TEST=test' > /tmp/testenv"
    - name: check env
      shell: ". /tmp/testenv; echo $TEST"
      register: result
    - name: get env
      debug:
        msg: "{{ result.stdout }}"
How do I execute shell commands with special characters in Ansible?
---
- name: Displaying the ORACLE_HOME
  hosts: "{{ hostname }}"
  tasks:
    - name:
      shell: echo $ORACLE_HOME
I want the output of echo $ORACLE_HOME
You need to write something like this:
---
- name: Displaying the ORACLE_HOME
  hosts: "{{ hostname }}"
  tasks:
    - name:
      shell: echo $ORACLE_HOME
      register: var
    - debug: msg="{{ var }}"
Or you can use the registered variable as var.stdout in your code.
As per your comment, use the command below to read the shell variable in your playbook:
shell: source <Absolute-Path>/.bash_profile && echo $ORACLE_HOME
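One caveat: source is a bash built-in and the shell module may default to /bin/sh, so it can be safer to force bash (or use the POSIX "." command). A minimal sketch with register (the register name is a placeholder; the profile path placeholder is kept from above):

- name: Read ORACLE_HOME from the profile
  shell: source <Absolute-Path>/.bash_profile && echo $ORACLE_HOME
  args:
    executable: /bin/bash
  register: oracle_home_out

- debug: var=oracle_home_out.stdout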
I want to execute a script on a remote host via Ansible and get the result file back from the remote.
I wrote a playbook like the one below:
---
- name: script deploy
  hosts: all
  vars:
    timestamp: "{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}"
  become: true
  tasks:
    - name: script deployment
      script: ./exe.sh {{ ansible_nodename }}_{{ timestamp }}
      args:
        chdir: /tmp
exe.sh executes successfully on the remote and redirects its result to an output file such as remote_20170806065817.data.
Script execution takes a few seconds, and I tried to fetch the result file after execution was done.
But {{ timestamp }} is re-evaluated and has changed by the time I fetch, so fetch cannot find the file name the script produced.
What I want is to assign an immutable (constant) value in my playbook. Is there any workaround?
Ansible uses lazy evaluation, so variables are evaluated at the moment of their use.
You should set a fact instead, which is evaluated only once:
---
- name: script deploy
  hosts: all
  become: true
  tasks:
    - set_fact:
        timestamp: "{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}"
    - name: script deployment
      script: ./exe.sh {{ ansible_nodename }}_{{ timestamp }}
      args:
        chdir: /tmp
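Once the timestamp is frozen with set_fact, later tasks can refer to exactly the same file name, for example to pull the result back to the controller. A minimal sketch added to the tasks above (the /tmp location and the .data suffix are assumptions based on the example file name in the question):

    - name: Fetch the result file produced by exe.sh
      fetch:
        src: "/tmp/{{ ansible_nodename }}_{{ timestamp }}.data"
        dest: results/
        flat: yes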
As part of most of the Ansible playbooks I need to install Node and Mongo from internally hosted tarballs. Sudo privileges and internet access are not available. All of the Ansible runs happen against localhost.
One of the problems of this setup is that after untarring node/mongo, they need to be added to PATH or subsequent roles/tasks won't be able to rely on them. Unfortunately, I don't seem to be able to find a way to amend PATH within an Ansible playbook run.
I've tried using shell and command tasks to export PATH and source .bashrc, but neither of those seems to help. Is there a way to use my node installation within the same playbook? A yum task seems to do the trick, but it's not available to me now.
Have you tried using 'environment'?
You can get your local PATH into a variable:
environment:
  PATH: "{{ lookup('env', 'PATH') }}"
or you can set the PATH:
environment:
  PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"
The above assumes you can register the path to mongo & Node as vars, and make them available to later plays.
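For example, you could record the unpacked locations as facts right after untarring and prepend them only for the tasks that need the tools. A minimal sketch, with placeholder /opt paths standing in for wherever the tarballs actually land:

- name: Remember where node and mongo were unpacked
  set_fact:
    node_path: /opt/node/bin    # placeholder
    mongo_path: /opt/mongo/bin  # placeholder

- name: Run a command that needs node on PATH
  command: node --version
  environment:
    PATH: "{{ node_path }}:{{ mongo_path }}:{{ lookup('env', 'PATH') }}"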
Info on using environment & PATH locally and remotely is here:
https://serverfault.com/questions/577188/how-can-i-prepend-to-path-while-running-ansibles-pip-module
- hosts: localhost
  gather_facts: False
  vars:
    path1: "{{lookup('env', 'PATH')}}"
  tasks:
    - shell: echo $PATH
      environment:
        PATH: 'mypath2'
      register: path2
    - shell: echo $PATH
      environment:
        PATH: 'mypath3'
      register: path3
    - shell: echo $PATH
      environment:
        PATH: "{{ path1 }}"
      register: path4
    - debug: msg={{path1}}
    - debug: msg={{path2}}
    - debug: msg={{path3}}
    - debug: msg={{path4}}
    - debug: msg={{lookup('env', 'PATH')}}