Ansible if else using shell script

I am using the following Ansible tasks to trigger a certain task based on the user's choice.
This is working:
tasks:
  - name: Run python script for generating Repos Report
    command: python GetRepos.py -o {{ org }} -p {{ pat }}
    register: result
  - debug: msg="{{result.stdout}}"
    when: choice == "Repos"
  - name: Run python script for generating projects Report
    command: python Getprojects.py -o {{ org }} -p {{ pat }}
    register: result
  - debug: msg="{{result.stdout}}"
    when: choice == "projects"
But I want to use a shell script with an if/else statement to run this in one task, as below:
tasks:
  - name: run python script
    shell: |
      if [choice == "repos"]
      then
      cmd: python GetRepos.py -o {{ org }} -p {{ pat }}
      elif [choice == "projects"]
      then
      cmd: python Getprojects.py -o {{ org }} -p {{ pat }}
      fi
    register: cmd_output
  - debug: msg="{{cmd_output.stdout}}"
But this does not execute the task; it just ends without error.
Is this the right syntax for shell?
How can I achieve these 2 separate working tasks in just one task using the shell module?

The cmd: in a shell script will try to run cmd: as a command, which you don't want.
Also, the brackets in the if conditions need spaces around them; otherwise, the shell would try to run [choice as a command, which you also don't want.
Also, prefer a single equals sign to a double equals inside the test, to make it more portable (the remote hosts could be running various different shells!).
The other issue is that choice as used inside the shell script is just a literal string. You need to add the braces {{ }} to interpolate the value, as done elsewhere in the playbook.
Taking into consideration the above, the following should work for you:
tasks:
  - name: run python script
    shell: |
      if [ "{{ choice }}" = "repos" ]
      then
        python GetRepos.py -o "{{ org }}" -p "{{ pat }}"
      elif [ "{{ choice }}" = "projects" ]
      then
        python Getprojects.py -o "{{ org }}" -p "{{ pat }}"
      fi
    register: cmd_output
  - debug:
      msg: "{{ cmd_output.stdout }}"

Related

Running different scripts based on extension in Ansible

I have an ansible playbook to run different scripts. I need to change the executable based on the extension. This is my playbook:
- name: run nodejs script
  command: node "{{item}}" arg1
  loop: "{{ lookup('fileglob', '{{path}}/{{deploy_version}}/*.js', wantlist=True) }}"
- name: running python script
  command: python "{{item}}" arg1
  loop: "{{ lookup('fileglob', '{{path}}/{{deploy_version}}/*.py', wantlist=True) }}"
Since there is an order in which the scripts must run, I need to use an "if, else" statement. But I'm unable to find a way to run the scripts in alphabetical order based on extension. How can I achieve this?
The splitext filter extracts the extension from the filename, which can then be used to look up the command in a dict that maps file extensions to commands. That can be applied to all the globs you wish, which, if I understand correctly, you want to run in alphabetical order regardless of which fileglob matched.
- name: run the scripts
  command: '{{ cmd_by_ext[item | splitext | last] }} {{ item }} arg1'
  loop: '{{ (js_files + py_files) | sort }}'
  vars:
    script_dir: '{{ path + "/" + deploy_version }}'
    js_files: '{{ lookup("fileglob", script_dir + "/*.js", wantlist=True) }}'
    py_files: '{{ lookup("fileglob", script_dir + "/*.py", wantlist=True) }}'
    cmd_by_ext:
      '.js': 'node'
      '.py': 'python'
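For example (using a made-up path purely for illustration), "scripts/01-setup.js" | splitext splits the name into ('scripts/01-setup', '.js'), so | last picks out '.js', which is then the key looked up in cmd_by_ext to select node.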

Ansible: Command module not interpolating shell variable

The following command fails, most likely because it does not interpolate the shell variable packdir:
- name: archive_artifacts.yml --> Clear git history from packs directories
  command: 'for packdir in {{ packs_dir }}/*; do rm -rf {{ packs_dir }}/"${packdir}"/.git; done'
  args:
    chdir: "{{ temp_build_directory }}"
packs_dir is a variable in the defaults/main.yml of the specific role:
packs_dir: "packs"
Is there a way of having the command module substituting both ansible and shell variables?
Yes. Make sure the shell variable is present in the environment of the host, user and shell you run the command in.
- hosts: localhost
  vars:
    env_variable: SHELL
  tasks:
    - command: "echo ${{ env_variable }}"
      register: result
    - debug: msg="{{ result.stdout }}"

Iterating over stdout

I am writing a playbook to locate a string pattern in a sequence of files. If I run my utility through the command module it will generate one or more strings on STDOUT. To run this across a number of systems I would like to run the command with_items:
- command: "findstring {{ item }}"
with_items:
- "string1"
- "string2"
register: found
failed_when: found.rc >= 2
And then iterate over the result to post process the info:
- name: Print strings we found
  debug:
    var: "{{ item }}"
  with_items: found.results
Is there something equivalent to loop.index that can be used with "results" in the task above? This would allow me to do something like {{ item[INDEX].stdout }} to get the strings that were generated. I haven't been able to find an answer in the official documentation so I thought I would post here to see what the gurus think.
If you need to iterate over every line from all commands, use:
- debug:
    msg: "Do smth for line {{ item }}"
  with_items: "{{ found | json_query('results[].stdout_lines[]') }}"
This takes every element from found.results, and then every element from each element's stdout_lines.
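(Note that the json_query filter depends on the jmespath Python library being installed on the control node; without it, the filter raises an error.)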
For me this works in ansible [core 2.11.6]:
- name: Command which outputs multiple lines
  ansible.builtin.command:
    cmd: ls -l /
  register: _ls_cmd
  changed_when: no
- name: Debug to show each line
  ansible.builtin.debug:
    msg: "ITEM: {{ item }}"
  with_items: "{{ _ls_cmd.stdout_lines }}"

Ansible Set Dynamic Environment Variables

I know about Ansible's environment: keyword at the top of a playbook, but I don't think that will work for me, since I don't know the variables' values prior to the execution of the playbook. I'm trying to retrieve package versions and PHP modules and log them to a file. I want to use regex to capture the version and store it in an environment variable. Then I want to write that variable and its value to an environment file with a shell command. I also want to pull an array from the environment and loop through it. Ansible doesn't seem to persist the shell environment, and the environment variable gets wiped out between commands. This is simple in Bash. Is this possible in Ansible? I'm trying:
---
- hosts: all
  become: yes
  vars:
    site_variables:
      code_directory: /home/
    dependency_versions:
      WGET_VERSION: placeholder
      PHP_MODULES: placeholder
  tasks:
    - name: Retrieve Environment
      shell: export WGET_VERSION=$(wget --version | grep -o 'Wget [0-9]*.[0-9]*\+')
      shell: export PHP_MODULES=$(php -m)
      shell: echo "export {{ item }}={{ lookup('env', item ) }}" >> {{ site_variables.code_directory }}/.env.log
      with_items:
        - WGET_VERSION
    - name: Write PHP Modules Out
      shell: export PHP_MODULES=$(php -m)
      shell: export PHP_MODULES=$(echo {{ lookup('env', 'PHP_MODULES') }} | sed 's/\[PHP Modules\]//g')
      shell: export PHP_MODULES=$(echo {{ lookup('env', 'PHP_MODULES') }} | sed 's/\[Zend Modules\]//g')
      shell: export PHP_MODULES=({{ lookup('env', 'PHP_MODULES') }})
      shell: echo "# - {{ item.0 }}" >> {{ site_variables.code_directory }}/.env.log
      with_items:
        - "{{ lookup('env', 'PHP_MODULES') }}"
There's a lot going on here.
First, lookup always runs on the ansible control host, while the script that you pass to the shell module is running on the remote server. So you will never be able to get a remote environment variable using lookup.
For details: https://docs.ansible.com/ansible/playbooks_lookups.html
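A quick way to see the difference (a minimal sketch; any environment variable will do): the following prints HOME from the control host, even when the play targets remote machines.
- debug:
    msg: "{{ lookup('env', 'HOME') }}"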
Secondly, environment variables don't propagate from a child to parent. If you have a script that does this...
export MYVARIABLE=foo
...and you run that script, your current environment will not suddenly have a variable named MYVARIABLE. This is just as true for processes spawned by Ansible as it is for processes spawned by your shell.
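The same is true between tasks: each shell task runs in its own child process, so a variable exported in one task is gone in the next. A minimal sketch of the pitfall, in which the final debug prints "unset":
- shell: export MYVARIABLE=foo
- shell: echo "${MYVARIABLE:-unset}"
  register: check
- debug:
    msg: "{{ check.stdout }}"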
If you want to set an ansible variable, consider using the register keyword to get the value:
- hosts: localhost
  gather_facts: false
  tasks:
    - name: get wget version
      command: wget --version
      register: wget_version_raw
    - name: extract wget version
      set_fact:
        wget_version: "{{ wget_version_raw.stdout_lines[0].split()[2] }}"
    - name: show wget version
      debug:
        msg: "wget version is: {{ wget_version }}"

Combine with_fileglob with another list in Ansible Playbooks

So I have an Ansible playbook and I'm trying to call a command for each item in a list, but also run that command over a fileglob. There is a "with_nested" in Ansible, and it can take variable names, but if I add a "with_fileglob," it just inserts "with_fileglob" as the filename instead of actually doing a glob.
vars:
  repo_versions:
    - version: trusty
      distribution: Ubuntu
    - version: wheezy
      distribution: Debian
...
- command: reprepro -b /var/www/html includedeb {{ item[0].version }} {{ item[1] }}
  with_nested:
    - repo_versions
  with_fileglob: /home/repoman/debs/*.deb
  when: debs_available.stat.exists == True
I've tried a couple of different combinations and I can't seem to get it to process the command in a double for loop (for each .version, for each .deb file)
This should be what you are trying to accomplish.
I used the shell module to register the output of the file glob, and then used the stdout_lines property of the registered variable in the loop. I have converted the task from my test to your actual commands and paths, so you might need to double check:
vars:
  repo_versions:
    - version: trusty
      distribution: Ubuntu
    - version: wheezy
      distribution: Debian
tasks:
  - shell: ls -1 /home/repoman/debs/*.deb
    register: repo_list
  - command: reprepro -b /var/www/html includedeb {{ item[0].version }} {{ item[1] }}
    with_nested:
      - repo_versions
      - repo_list.stdout_lines
