I have an ansible playbook to run different scripts. I need to change the executable base on the extension. This is my playbook:
- name: run nodejs script
  command: node "{{ item }}" arg1
  loop: "{{ lookup('fileglob', '{{path}}/{{deploy_version}}/*.js', wantlist=True) }}"
- name: running python script
  command: python "{{ item }}" arg1
  loop: "{{ lookup('fileglob', '{{path}}/{{deploy_version}}/*.py', wantlist=True) }}"
Since the scripts must run in a specific order, I need something like an "if/else" statement. But I'm unable to find a way to run the scripts in alphabetical order based on extension. How can I achieve this?
The splitext filter extracts the extension from a filename. The extension can then be used to look up the right interpreter in a dict that maps extensions to commands, and the same task can be applied to all the globs you wish. If I understand correctly, you want to run the scripts in alphabetical order regardless of which fileglob each one matches.
- name: run the scripts
  command: '{{ cmd_by_ext[item | splitext | last] }} {{ item }} arg1'
  loop: '{{ (js_files + py_files) | sort }}'
  vars:
    script_dir: '{{ path + "/" + deploy_version }}'
    js_files: '{{ lookup("fileglob", script_dir + "/*.js", wantlist=True) }}'
    py_files: '{{ lookup("fileglob", script_dir + "/*.py", wantlist=True) }}'
    cmd_by_ext:
      '.js': 'node'
      '.py': 'python'
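For reference, splitext splits a filename into a (root, extension) pair, so `| last` yields the extension with its leading dot. A minimal debug task to illustrate (the path here is purely illustrative):

```yaml
- debug:
    msg: "{{ '/opt/scripts/10-setup.js' | splitext | last }}"  # -> ".js"
```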
I am trying to make a playbook that loops over the number of files in a directory and then use those files in another playbook.
My playbook as it is now:
---
- name: Run playbooks for Raji's testing
  register: scripts
  roles:
    - prepare_edge.yml
    - prepare_iq.yml
    - scriptor.yml
  with_fileglob: ~/ansible/test_scripts/*
When I run this it doesn't work. I've tried "register: scripts" to create a variable to reference inside scriptor.yml, but the playbook still fails. Any advice or help you can provide would be much appreciated.
Thanks!
P.S. I am super new to ansible
here is the scriptor.yml
---
- hosts: all
  tasks:
    - name: Create directory
      command: mkdir /some/path/
    - name: If file is a playbook
      copy:
        src: "{{ scripts }}"
        dest: /some/path/
      when: "{{ scripts }}" == "*.yml"
    - name: if file is a script
      shell: . ${{ scripts }}
      when: "{{ scripts }}" == "*.sh"
P.P.S. prepare_edge.yml and prepare_iq.yml don't reference anything and just need to be called in the loop before scriptor.yml.
here is the error:
ERROR! 'register' is not a valid attribute for a Play
The error appears to have been in '/Users/JGrow33/ansible/raji_magic_playbook.yml': line 3, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: Run playbooks for Raji's testing
^ here
The error message you're getting is telling you that you can't use register at the play level.
You can accomplish what you're looking for by doing something like the following in your scriptor.yml file:
- hosts: all
  tasks:
    - name: Create directory
      command: mkdir /some/path/
    - name: If file is a playbook
      copy:
        src: "{{ item }}"
        dest: /some/path/
      with_fileglob: ~/ansible/test_scripts/*.yml
    - name: if file is a script
      shell: . {{ item }}
      with_fileglob: ~/ansible/test_scripts/*.sh
References
How can Ansible "register" in a variable the result of including a playbook?
I have the following in a makefile:
some_file: some_script
	bash some_script > $@
and would like to replicate this in an ansible playbook task, where running the task creates the file anew but only if the generating script file has changed, otherwise it's a no-op. Is there any way to accomplish this using the stock ansible modules?
You'd have to execute that in two phases: gather info, then conditionally execute
vars:
  files:
    - some_file
    - some_script
tasks:
  - stat: path={{ item }}
    register: stats
    with_items: '{{ files }}'
  - debug:
      msg: '{{ stats.results[0].stat.path }} is newer than {{ stats.results[1].stat.path }}'
    when: stats.results[0].stat.mtime > stats.results[1].stat.mtime
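Completing the pattern, a hedged sketch of the conditional rebuild itself, assuming the same stats registration as above (index 0 is some_file, index 1 is some_script); the exists check covers the first run, when some_file does not yet exist:

```yaml
- name: Rebuild some_file only when some_script is newer
  shell: bash some_script > some_file
  when: >-
    not stats.results[0].stat.exists or
    stats.results[1].stat.mtime > stats.results[0].stat.mtime
```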
I am writing a playbook to locate a string pattern in a sequence of files. If I run my utility through the command module it will generate one or more strings on STDOUT. To run this across a number of systems I would like to run the command with_items:
- command: "findstring {{ item }}"
  with_items:
    - "string1"
    - "string2"
  register: found
  failed_when: found.rc >= 2
And then iterate over the result to post process the info:
- name: Print strings we found
  debug:
    var: "{{ item }}"
  with_items: found.results
Is there something equivalent to loop.index that can be used with "results" in the task above? This would allow me to do something like {{ item[INDEX].stdout }} to get the strings that were generated. I haven't been able to find an answer in the official documentation so I thought I would post here to see what the gurus think.
If you need to iterate over every line from all commands, use:
- debug:
    msg: "Do smth for line {{ item }}"
  with_items: "{{ found | json_query('results[].stdout_lines[]') }}"
This will take every element from found.results, then every element from each stdout_lines.
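If the jmespath library that backs json_query isn't available on the controller, the same flattening can be sketched with core Jinja2 filters (flatten requires Ansible 2.5+):

```yaml
- debug:
    msg: "Do smth for line {{ item }}"
  with_items: "{{ found.results | map(attribute='stdout_lines') | flatten }}"
```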
For me this works in ansible [core 2.11.6]:
- name: Command which outputs multiple lines
  ansible.builtin.command:
    cmd: ls -l /
  register: _ls_cmd
  changed_when: no

- name: Debug to show each line
  ansible.builtin.debug:
    msg: "ITEM: {{ item }}"
  with_items: "{{ _ls_cmd.stdout_lines }}"
I'm creating a deployment playbook for our web services. Each web service is in its own directory, such as:
/webapps/service-one/
/webapps/service-two/
/webapps/service-three/
I want to check to see if the service directory exists, and if so, I want to run a shell script that stops the service gracefully. Currently, I am able to complete this step by using ignore_errors: yes.
- name: Stop services
  with_items: services_to_stop
  shell: "/webapps/scripts/stopService.sh {{item}}"
  ignore_errors: yes
While this works, the output is very messy if one of the directories doesn't exist or a service is being deployed for the first time. I effectively want to do something like one of these:
This:
- name: Stop services
  with_items: services_to_stop
  shell: "/webapps/scripts/stopService.sh {{item}}"
  when: shell: [ -d /webapps/{{item}} ]
or this:
- name: Stop services
  with_items: services_to_stop
  shell: "/webapps/scripts/stopService.sh {{item}}"
  stat:
    path: /webapps/{{item}}
  register: path
  when: path.stat.exists == True
I'd collect facts first and then do only necessary things.
- name: Check existing services
  stat:
    path: "/tmp/{{ item }}"
  with_items: "{{ services_to_stop }}"
  register: services_stat

- name: Stop existing services
  with_items: "{{ services_stat.results | selectattr('stat.exists') | map(attribute='item') | list }}"
  shell: "/webapps/scripts/stopService.sh {{ item }}"
Also note that bare variables in with_items don't work since Ansible 2.2, so you should template them.
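As a minimal before/after sketch of that change:

```yaml
# Bare variable -- no longer works since Ansible 2.2:
# with_items: services_to_stop

# Templated form:
with_items: "{{ services_to_stop }}"
```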
This will let you get a list of existing directory names into the list variable dir_names (use recurse: no to read only the first level under webapps):
---
- hosts: localhost
  connection: local
  vars:
    dir_names: []
  tasks:
    - find:
        paths: "/webapps"
        file_type: directory
        recurse: no
      register: tmp_dirs
    - set_fact: dir_names="{{ dir_names + [item['path']] }}"
      no_log: True
      with_items:
        - "{{ tmp_dirs['files'] }}"
    - debug: var=dir_names
You can then use dir_names in your "Stop services" task via with_items. It looks like you intend to use only the name of the directory under "webapps", so you probably want the | basename Jinja2 filter to get that, something like this:
- name: Stop services
  with_items: "{{ dir_names }}"
  shell: "/webapps/scripts/stopService.sh {{ item | basename }}"
I'm using the ec2 module with ansible-playbook. I want to set a variable to the contents of a file. Here's how I'm currently doing it:

1. A var with the filename
2. A shell task to cat the file
3. Use the result of the cat to pass to the ec2 module.
Example contents of my playbook.
vars:
  amazon_linux_ami: "ami-fb8e9292"
  user_data_file: "base-ami-userdata.sh"
tasks:
  - name: user_data_contents
    shell: cat {{ user_data_file }}
    register: user_data_action
  - name: launch ec2-instance
    local_action:
      ...
      user_data: "{{ user_data_action.stdout }}"
I assume there's a much easier way to do this, but I couldn't find it while searching Ansible docs.
You can use lookups in Ansible to get the contents of a file, e.g.:
user_data: "{{ lookup('file', user_data_file) }}"
Caveat: This lookup will work with local files, not remote files.
Here's a complete example from the docs:
- hosts: all
  vars:
    contents: "{{ lookup('file', '/etc/foo.txt') }}"
  tasks:
    - debug: msg="the value of foo.txt is {{ contents }}"
You can use the slurp module to fetch a file from the remote host (thanks to @mlissner for suggesting it):
vars:
  amazon_linux_ami: "ami-fb8e9292"
  user_data_file: "base-ami-userdata.sh"
tasks:
  - name: Load data
    slurp:
      src: "{{ user_data_file }}"
    register: slurped_user_data
  - name: Decode data and store as fact  # You can skip this if you want to use the right hand side directly
    set_fact:
      user_data: "{{ slurped_user_data.content | b64decode }}"
You can use the fetch module to copy files from remote hosts to the local machine, and a lookup to read the content of the fetched files.
lookup only works on localhost. If you want to retrieve variables from a variables file you made remotely, use include_vars: {{ varfile }}. The contents of {{ varfile }} should be a dictionary of the form {"key":"value"}; you will find Ansible gives you trouble if you include a space after the colon.
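A hedged sketch of that fetch-then-load pattern (all paths and filenames here are illustrative):

```yaml
- name: Copy the remote vars file to the controller
  fetch:
    src: /etc/myapp/remote_vars.yml  # hypothetical remote path
    dest: /tmp/fetched/
    flat: yes

- name: Load the fetched variables into the play
  include_vars: /tmp/fetched/remote_vars.yml
```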