Converting if-else shell to Ansible - ansible

I am trying to convert this shell snippet to an Ansible task, but I am not having any luck. The when conditional alone will not help.
if [ ${EnvType} == "PRE" ]
then
EnvPrefix="RP"
else
EnvPrefix=$(echo "${EnvType}" | cut -c1,3)
fi
export EnvPrefix
Essentially I need to export EnvPrefix based on EnvType. I can run a shell command to find out if EnvType is PRE, but I receive a blank value when I try to export it using the shell module.
- name: Set Envprefix for other environment
  shell: |
    EnvPrefix=$(echo "${EnvType}" | cut -c1,3)
    export EnvPrefix
  when: output.stdout != "PRE"

This is the equivalent
- set_fact:
    EnvPrefix: "{{ (EnvType == 'PRE') |
                   ternary('RP', [EnvType.0, EnvType.2] | join) }}"
Note that a follow-up task such as command: "export {{ EnvPrefix }}" cannot work: export is a shell builtin, and each task runs in its own process, so an exported variable would not survive the task anyway.
If the list of selected characters is longer, map/extract might be more efficient
- set_fact:
    EnvPrefix: "{{ (EnvType == 'PRE') |
                   ternary('RP', [0, 2] | map('extract', EnvType) | list | join) }}"
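Since an exported variable cannot persist across tasks, the usual pattern is to hand the fact to later tasks via the environment keyword instead. A minimal sketch, assuming EnvPrefix has been set by the set_fact above (the echo command is illustrative):

```yaml
- shell: echo "prefix is $EnvPrefix"
  environment:
    EnvPrefix: "{{ EnvPrefix }}"
  register: prefix_out

- debug:
    var: prefix_out.stdout
```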

Related

Ansible - Environment variables setting

I need to set the environment in the target machine. The environment variables are present in the file called .env337. There are several variables inside that file like
export AB_HOME=/tl/dev/abinitio/abinitio-V3 #/gcc3p32 # for 32-bit
export PATH=${AB_HOME}/bin:${PATH}
I have tried the below playbook to set the environment and register the environment variables in order to use them in the environment keyword to run the other commands in the registered environment, but it didn't work.
- hosts: dev
  gather_facts: false
  tasks:
    - name: To set the environment
      shell: . ./.env337
      register: output
Is there any other way to resolve this?
Q: "Set the environment and register the environment variables in order to use them in the environment keyword to run the other commands."
A: Environment variables set in a shell task cannot persist. The shell process terminates after the execution of the command(s). For example
- shell: |
    cat ./.env337
    . ./.env337
    echo "AB_HOME = $AB_HOME"
    echo "PATH = $PATH"
    exit 0
  register: result
- debug:
    var: result.stdout_lines
- shell: |
    echo "AB_HOME = $AB_HOME"
    echo "PATH = $PATH"
    exit 0
  register: result
- debug:
    var: result.stdout_lines
give
"result.stdout_lines": [
"export AB_HOME=/tl/dev/abinitio/abinitio-V3 #/gcc3p32 # for 32-bit",
"export PATH=${AB_HOME}/bin:${PATH}",
"",
"AB_HOME = /tl/dev/abinitio/abinitio-V3",
"PATH = /tl/dev/abinitio/abinitio-V3/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/home/admin/bin"
]
"result.stdout_lines": [
"AB_HOME = ",
"PATH = /sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/home/admin/bin"
]
As expected the variables are missing in the second shell task.
Q: "what if I don't know what are all the variables present inside the env file. Is there any other way to print all variables instead of using echo."
A: Short answer: Create a dictionary env_dict_add with the environment variables, then use it with the environment keyword of the shell module: environment: "{{ env_dict_add }}".
Details
1) Create a list of unknown variables. For example
- shell: cat ./.env337
  register: result
- set_fact:
    env_list: "{{ env_list|default([]) +
                  [item.split('=').0.split(' ').1|trim] }}"
  loop: "{{ result.stdout_lines }}"
- debug:
    var: env_list
gives
"env_list": [
"AB_HOME",
"PATH"
]
2) Create a dictionary with the environment. For example
- shell: |
    . ./.env337
    set
  register: result
- set_fact:
    env_dict: "{{ env_dict|default({})|
                  combine({my_key: my_value}) }}"
  vars:
    my_key: "{{ item.split('=').0 }}"
    my_value: "{{ item.split('=').1|default('') }}"
  loop: "{{ result.stdout_lines }}"
3) Use any environment variable from the dictionary. For example, print whatever variables have been exported by sourcing the file .env337
- debug:
    msg: "var: {{ item }} value: {{ env_dict[item] }}"
  loop: "{{ env_list }}"
gives
"msg": "var: AB_HOME value: /tl/dev/abinitio/abinitio-V3"
"msg": "var: PATH value: /tl/dev/abinitio/abinitio-V3/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/home/admin/bin"
4) Create a dictionary with the additional environment variables only. For example
- set_fact:
    env_dict_add: "{{ env_dict_add|default({})|
                      combine({item: env_dict[item]}) }}"
  loop: "{{ env_list }}"
- debug:
    var: env_dict_add
gives
"env_dict_add": {
"AB_HOME": "/tl/dev/abinitio/abinitio-V3",
"PATH": "/tl/dev/abinitio/abinitio-V3/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/home/admin/bin"
}
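As a side note, steps 2) and 4) can be collapsed into a single expression with the zip filter; a sketch assuming env_list and env_dict from the previous steps:

```yaml
- set_fact:
    env_dict_add: "{{ dict(env_list |
                           zip(env_list | map('extract', env_dict) | list)) }}"
```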
5) Use the dictionary env_dict_add to create environment variables in the shell command. For example
- shell: echo ${{ item }}
loop: "{{ env_list }}"
register: result
environment: "{{ env_dict_add }}"
- debug:
msg: "{{ dict(result.results|json_query('[].[item, stdout]')) }}"
give
"msg": {
"AB_HOME": "/tl/dev/abinitio/abinitio-V3",
"PATH": "/tl/dev/abinitio/abinitio-V3/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/home/admin/bin"
}

How to read line-by-line in a file on remote machine

first line: /u01/app/oracle/oradata/TEST/
second line: /u02/
How can I read both lines into the same variable, and by using that variable find out the present working directory through shell commands in Ansible?
You can use command to read a file from disk
- name: Read a file into a variable
  command: cat /path/to/your/file
  register: my_variable
And then do something like below to loop over the lines in the file.
- debug: msg="line: {{ item }}"
  loop: "{{ my_variable.stdout_lines }}"
The task below creates the list of the lines from a file
- set_fact:
    lines_list: "{{ lines_list|default([]) + [item] }}"
  with_lines: cat /path/to/file
It's possible to create both a list
"lines_list": [
"/u01/app/oracle/oradata/TEST/",
"/u02/"
]
and a dictionary
"lines_dict": {
"0": "/u01/app/oracle/oradata/TEST/",
"1": "/u02/"
}
with the combine filter
- set_fact:
    lines_dict: "{{ lines_dict|default({})|combine({idx: item}) }}"
  with_lines: cat /path/to/file
  loop_control:
    index_var: idx
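Note that the with_lines lookup runs cat on the control node. To read a file on the remote machine without a shell command, the slurp module can be used instead; a minimal sketch with an assumed path:

```yaml
- name: Read the remote file
  slurp:
    src: /path/to/file
  register: slurped

- set_fact:
    lines_list: "{{ (slurped.content | b64decode).splitlines() }}"
```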
"Present working directory through shell commands in ansible" can be printed from the registered variable. Note that the command module does not expand shell variables, so shell is needed here. For example
- shell: echo $PWD
  register: result
- debug:
    var: result.stdout
(not tested)

ansible: how do I display the output from a task that uses with_items?

Ansible newbie here
Hopefully there is a simple solution to my problem
I'm trying to run SQL across a number of Oracle databases on one node. I generate a list of databases from ps -ef and use with_items to pass the dbname values.
My question is how do I display the output from each database running the select statement?
tasks:
  - name: Exa check | find db instances
    become: yes
    become_user: oracle
    shell: |
      ps -ef|grep pmon|grep -v grep|grep -v ASM|awk '{ print $8 }'|cut -d '_' -f3
    register: instance_list_output
    changed_when: false
    run_once: true
  - shell: |
      export ORAENV_ASK=NO; export ORACLE_SID={{ item }}; export ORACLE_HOME=/u01/app/oracle/database/12.1.0.2/dbhome_1; source /usr/local/bin/oraenv; $ORACLE_HOME/bin/sqlplus -s \"/ as sysdba\"<< EOSQL
      select * from v\$instance;
      EOSQL
    with_items:
      - "{{ instance_list_output.stdout_lines }}"
    register: sqloutput
    run_once: true
The loop below might work.
- debug:
    msg: "{{ item.stdout }}"
  loop: "{{ sqloutput.results }}"
If it does not, take a look at the content of the variable and decide how to use it.
- debug: var=sqloutput
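To keep the loop output readable when there are many instances, each result can be labelled with the database it came from; a sketch assuming the registered sqloutput from above:

```yaml
- debug:
    msg: "{{ item.stdout }}"
  loop: "{{ sqloutput.results }}"
  loop_control:
    label: "{{ item.item }}"   # show the SID instead of the whole result
```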

Use awk with ansible to run command

I have a playbook below:
- hosts: localhost
vars:
folderpath:
folder1/des
folder2/sdf
tasks:
- name: Create a symlink
shell: "echo {{folderpath}} | awk -F'/' '{system(\"mkdir \" $1$2 );}'"
register: result
#- debug:
# msg: "{{ result.stdout }}"
with_items:
- " {{folderpath}} "
However, when I run the playbook I get 2 folders made. The first one is:
1- folder1des (as expected)
2- folder2 (this should ideally be folder2sdf)
I have tried many combinations and still it doesn't work. What do I need to change to make it work properly?
I do not have an Ansible environment at the moment, but the following should work:
- hosts: localhost
  tasks:
    - name: Create a symlink
      shell: "echo {{item}} | awk -F'/' '{system(\"mkdir \" $1$2 );}'"
      register: result
      #- debug:
      #    msg: "{{ result.stdout }}"
      with_items:
        - folder1/des
        - folder2/sdf
Reference: Ansible Loops Example
Explanation:
You were passing a single string to with_items: the multi-line value of folderpath folds into the one string "folder1/des folder2/sdf", so the loop runs only once with that whole string as the item. awk then splits that single line on '/' into the fields folder1, "des folder2" and sdf, so system("mkdir " $1$2) runs mkdir folder1des folder2, which is exactly why you got folder1des and folder2. Passing a proper list of items to with_items makes it iterate over each path separately.
Hope this helps!
Maybe
- hosts: localhost
  vars:
    folderpath:
      folder1/des
      folder2/sdf
  tasks:
    - name: Create a symlink
      file:
        state: link
        path: "{{ item | regex_replace('[0-9]/','_') }}"
        src: "{{ item }}"
      with_items: " {{ folderpath }} "
Nothing in your given code creates symlinks. Is that really what you meant to do?
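If the goal is really just to create the concatenated directories, a list-valued folderpath plus the file module avoids awk entirely; a minimal sketch (stripping the '/' is an assumption about the intended directory names):

```yaml
- hosts: localhost
  vars:
    folderpath:
      - folder1/des
      - folder2/sdf
  tasks:
    - name: Create the concatenated directories
      file:
        path: "{{ item | replace('/', '') }}"   # folder1/des -> folder1des
        state: directory
      loop: "{{ folderpath }}"
```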

Change several files with specific file extension to another extension in a folder

In a folder containing files with different extensions (*.rules and *.rules.yml), I need to change the file extension based on certain condition:
*.rules => *.rules.yml, or
*.rules.yml => *.rules
In shell, I can do it as:
Case # 1
for file in ./*.rules; do mv "$file" "${file%.*}.rules.yml" ; done
# from *.rules to *.rules.yml
Case # 2
for file in ./*.rules.yml ; do mv "$file" "${file%.*.*}.rules" ; done
# from *.rules.yml to *.rules
Any idea in ansible to do the same thing?
Any help will be appreciated :)
Assuming the difficulty you are having is with YAML quoting, you may experience better luck with the "pipe literal":
tasks:
  - shell: |
      for i in *.rules; do
        /bin/mv -iv "$i" "`basename "$i" .rules`.rules.yml"
      done
  - shell: |
      for i in *.rules.yml; do
        /bin/mv -v "$i" "`basename "$i" .rules.yml`.rules"
      done
One will also notice that I used the more traditional basename rather than trying to do "crafty" variable expansion tricks, since it should run with any POSIX shell.
Or if you are experiencing that your target system uses dash, or zsh, or ksh, or whatever, you can also be explicit in the shell you wish for ansible to use:
tasks:
  - shell: echo "hello from bash"
    args:
      executable: /bin/bash
Thanks for the help, Matthew L Daniel. It works quite well.
The final working solution is enclosed for reference:
- name: Run in local to replace suffix in a folder
  hosts: 127.0.0.1
  connection: local
  vars:
    - tmpRulePath: "rules"
    - version: "18.06" # change the version here to change the suffix from rules/rules.yml to rules.yml/rules
    - validSuffix: "rules.yml"
    - invalidSuffix: "rules"
  tasks:
    - name: Prepare the testing resources
      shell: mkdir -p {{ tmpRulePath }}; cd {{ tmpRulePath }}; touch 1.rules 2.rules 3.rules.yml 4.rules.yml; cd -; ls {{ tmpRulePath }};
      register: result
    - debug:
        msg: "{{ result.stdout_lines }}"
    - name: Check whether it's old or not
      shell: if [ {{ version }} \< '18.06' ]; then echo 'true'; else echo 'false'; fi
      register: result
    - debug:
        msg: "Is {{ version }} less than 18.06 {{ result.stdout }}"
    - name: Update validSuffix and invalidSuffix
      set_fact:
        validSuffix: "rules"
        invalidSuffix: "rules.yml"
      when: result.stdout == "true"
    - debug:
        msg: "validSuffix is {{ validSuffix }} while invalidSuffix {{ invalidSuffix }}"
    - name: Replace the invalid suffix with valid
      shell: |
        cd {{ tmpRulePath }};
        for i in *.{{ invalidSuffix }}; do
          /bin/mv -v "$i" "`basename "$i" .{{ invalidSuffix }}`.{{ validSuffix }}"
        done
    - name: Check the latest files
      shell: ls {{ tmpRulePath }}
      register: result
    - debug:
        msg: "{{ result.stdout_lines }}"
    - name: Clean up
      shell: rm -rf {{ tmpRulePath }}
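As a side note, the shell string comparison in the "Check whether it's old or not" task can be replaced with Ansible's built-in version test, which also makes the helper shell task and its registered result unnecessary; a sketch assuming the same version variable:

```yaml
- name: Update validSuffix and invalidSuffix
  set_fact:
    validSuffix: "rules"
    invalidSuffix: "rules.yml"
  when: version is version('18.06', '<')
```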
