Running Python script via Ansible

I'm trying to run a Python script from an Ansible playbook. I would think this would be an easy thing to do, but I can't figure it out. I've got a project structure like this:
playbook-folder
  roles
    stagecode
      files
        mypythonscript.py
      tasks
        main.yml
  release.yml
I'm trying to run mypythonscript.py within a task in main.yml (which is a role used in release.yml). Here's the task:
- name: run my script!
  command: ./roles/stagecode/files/mypythonscript.py
  args:
    chdir: /dir/to/be/run/in
  delegate_to: 127.0.0.1
  run_once: true
I've also tried ../files/mypythonscript.py. I thought paths in Ansible would be relative to the playbook, but apparently not?
I also tried debugging to figure out where I am in the middle of the run, but no luck there either.
- name: figure out where we are
  stat: path=.
  delegate_to: 127.0.0.1
  run_once: true
  register: righthere

- name: print where we are
  debug: msg="{{ righthere.stat.path }}"
  delegate_to: 127.0.0.1
  run_once: true
That just prints out ".". So helpful ...

Try the script module; it works for me.
My main.yml:
---
- name: execute install script
  script: get-pip.py
The get-pip.py file should live in the files directory of the same role.
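For reference, a minimal sketch of the layout this relies on (the role name is illustrative); the script module resolves bare file names against the role's files directory:
roles
  myrole
    files
      get-pip.py
    tasks
      main.yml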

If you want to use a relative path to your script rather than an absolute one, you might be better off using the role_path magic variable to find the path to the role and work from there.
With the structure you are using in the question the following should work:
- name: run my script!
  command: ./mypythonscript.py
  args:
    chdir: "{{ role_path }}/files"
  delegate_to: 127.0.0.1
  run_once: true
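If the script lacks a shebang line or the execute bit, a variant of the same task that calls the interpreter explicitly should also work (a sketch; assumes python3 is available on the control machine):
- name: run my script with an explicit interpreter
  command: python3 ./mypythonscript.py   # no shebang or exec bit needed
  args:
    chdir: "{{ role_path }}/files"
  delegate_to: 127.0.0.1
  run_once: true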

An alternative, straightforward solution:
Let's say you have already built your virtual env under ./env1 and used pip3 to install the needed Python modules.
Now write playbook task like:
- name: Run a script using an executable in a system path
  script: ./test.py
  args:
    executable: ./env1/bin/python
  register: python_result

- name: Get stdout or stderr from the output
  debug:
    var: python_result.stdout
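For completeness, the virtual env itself can be built by an earlier task instead of by hand. A minimal sketch, assuming a requirements file at an illustrative path on the target host:
- name: Build the virtual env the script will use
  pip:
    requirements: /tmp/requirements.txt   # assumed path, lists the needed modules
    virtualenv: ./env1
    virtualenv_command: python3 -m venv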

If you want to execute an inline script without having a separate script file (for example, in a Molecule test), you can write something like this:
- name: Test database connection
  ansible.builtin.command: |
    python3 -c
    "
    import psycopg2;
    psycopg2.connect(
      host='127.0.0.1',
      dbname='db',
      user='user',
      password='password'
    );
    "
You can even insert Ansible variables in this string.
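For example, with a db_host variable defined elsewhere in the play (the variable name is illustrative):
- name: Test database connection to a templated host
  ansible.builtin.command: |
    python3 -c
    "
    import psycopg2;
    psycopg2.connect(
      host='{{ db_host }}',
      dbname='db',
      user='user',
      password='password'
    );
    "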

Related

How to use a command in an environment variable in an Ansible playbook

I am trying to use the hostname -f command to set a variable in an ansible-playbook. After setting the variable, I will use it in a sed command. When I execute the commands manually they work, but with Ansible the variable does not work.
When I echo $hostn, the output is empty.
---
- hosts: test
  become: true
  become_user: root
  tasks:
    - name: test
      shell: "{{ item }}"
      with_items:
        - hostn=`hostname -f`              # <- not working
        - echo $hostn                      # <- not working
        - sed -i "s/test/$hostn/g" /file   # <- works manually
Can you help me?
My advice would be that you should not try to fit all your commands into Ansible's shell module this way, but rather translate them into the corresponding Ansible modules. In particular, each with_items entry runs in its own shell process, so a variable set by one item is gone by the time the next item runs; that is why echo $hostn prints nothing.
What you are looking to achieve can be done with the replace module (in place of sed) and the Ansible fact ansible_hostname (in place of hostname -f; strictly speaking, hostname -f prints the fully qualified name, which is the ansible_fqdn fact).
This would be the equivalent playbook:
- hosts: test
  become: true
  become_user: root
  tasks:
    - replace:
        path: /file
        regexp: test
        replace: "{{ ansible_hostname }}"

Is it possible to specify groups from two different inventory files for an Ansible playbook?

My playbook (/home/user/Ansible/dist/playbooks/test.yml):
- hosts: regional_clients
  tasks:
    - shell: /export/home/user/ansible_scripts/test.sh
      register: shellout
    - debug: var=shellout

- hosts: web_clients
  tasks:
    - shell: /var/www/html/webstart/release/ansible_scripts/test.sh
      register: shellout
    - debug: var=shellout
    - command: echo catalina.sh start
      register: output
    - debug: var=output
The [regional_clients] group is specified in /home/user/Ansible/webproj/hosts and the [web_clients] group is specified in /home/user/Ansible/regions/hosts.
Is there a way I could make the above work? Currently, running the playbook will fail since neither [regional_clients] nor [web_clients] is defined in the default inventory file /home/user/Ansible/dist/hosts.
Yes, you can write a simple shell script:
#!/bin/sh
cat /home/user/Ansible/webproj/hosts /home/user/Ansible/regions/hosts
and call it as a dynamic inventory in Ansible:
ansible-playbook -i my_script test.yml
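Note that Ansible 2.4 and later also accept multiple inventory sources directly on the command line, which avoids the wrapper script entirely:
ansible-playbook -i /home/user/Ansible/webproj/hosts -i /home/user/Ansible/regions/hosts test.yml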
This question, however, looks to me like a problem with your organisation, not a technical one. If your environment is so complex and maintained by different parties, then use some kind of configuration database (with a dynamic inventory in Ansible to retrieve the data) instead of individual files in users' paths.

Execute local script on remote without copying it across in Ansible

Is it possible to execute a local script on a remote host in Ansible without copying it across and then executing it?
The script, shell and command modules all seem like they might be the answer, but I'm not sure which is best.
The script module describes itself as "Runs a local script on a remote node after transferring it" but the examples given don't suggest a copy operation - e.g. no src, dest - so maybe this is the answer?
script module FTW
tasks:
  - name: Ensure docker repo is added
    script: "{{ role_path }}/files/add-docker-repo.sh"
    register: dockeraddrepo
    notify: Done dockeraddrepo
    when: ansible_local.dockeraddrepo | d(0) == 0

handlers:
  - name: Done dockeraddrepo
    copy:
      content: '{{ dockeraddrepo }}'
      dest: /etc/ansible/facts.d/dockeraddrepo.fact
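One caveat: the handler writes to /etc/ansible/facts.d, which must already exist on the target. If it might not, a task along these lines (a sketch) can create it first:
- name: Ensure the custom facts directory exists
  file:
    path: /etc/ansible/facts.d
    state: directory
    mode: "0755"
Also bear in mind that ansible_local facts are read during fact gathering, so a fact file written mid-play only becomes visible after facts are gathered again (for example with the setup module).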

Ansible playbook error in Roles vars section while copying the main.yaml file of vars from a remote machine

I am trying to execute an Ansible playbook using roles. I have some variables defined in main.yaml of the vars section, and I am copying these variables (main.yaml) from another remote machine.
My question is: my playbook throws an error the first time it runs, even though it does copy the main.yaml file into my vars section. When I run it a second time, it executes fine. My understanding is that on the first run, although the file gets copied, it is not read because it was not present before the playbook started. Is there a way to run it successfully, without an error, the first time?
(The original question included screenshots of the error message and of the role directory structure.)
site.yaml:
---
- name: Testing the Mini project
  hosts: all
  roles:
    - test
tasks/main.yaml:
---
- name: Converting Mysql to CSV file
  command: mysqldump -u root -padmin -T/tmp charan test --fields-terminated-by=,
  when: inventory_hostname == "ravi"

- name: Converting CSV file to yaml format
  shell: python /tmp/test.py > /tmp/main.yaml
  when: inventory_hostname == "ravi"

- name: Copying yaml file from remote node to vars
  shell: sshpass -p admin scp -r root@192.168.56.101:/tmp/main.yaml /etc/ansible/Test/roles/vars/main.yaml
  when: inventory_hostname == "charan"

- name: Install Application as per the table
  apt: name={{ item.Application }} state=present
  when: inventory_hostname == {{ item.Username }}
  with_items: user_app
/vars/main.yaml (this will be copied from the remote machine):
---
user_app:
  - { Username: '"ravi"', Application: curl }
  - { Username: '"charan"', Application: git }
Take a look at the include_vars task. It may do what you need. It looks like you need to explicitly include /vars/main.yaml in a task before the apt task that references the variables.
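A minimal sketch, using the path from the question, placed before the apt task:
- name: Load the freshly copied variables
  include_vars: /etc/ansible/Test/roles/vars/main.yaml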

In Ansible, how can I set the log file name dynamically?

I'm currently developing an Ansible script to build and deploy a Java project.
I can set the log_path like below:
log_path=/var/log/ansible.log
but it is hard to look up the build history.
Is it possible to append a datetime to the log file name? For example:
ansible.20150326145515.log
I don't believe there is a built-in way to generate the date on the fly like that, but one option is to use a lookup which can shell out to date. Example:
log_path="/var/log/ansible.{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}.log"
Here is an option using the ANSIBLE_LOG_PATH environment variable via a Bash shell alias:
alias ansible="ANSIBLE_LOG_PATH=ansible-\`date +%Y%m%d%H%M%S\`.log ansible"
Feel free to use an absolute path if you prefer.
I found it.
Just add a task to copy (or mv) the log file locally:
- name: Copy ansible.log
  connection: local
  command: mv ./logs/ansible.log ./logs/ansible.{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}.log
  run_once: true
Thanks to @jarv.
How about this:
- shell: date +%Y%m%d%H%M%S
  register: timestamp

- debug: msg="foo.{{ timestamp.stdout }}.log"
Output:
TASK [command] *****************************************************************
changed: [blabla.example.com]

TASK [debug] *******************************************************************
ok: [blabla.example.com] => {
    "msg": "foo.20160922233847.log"
}
According to the nice folks at the #ansible freenode IRC, this can be accomplished with a custom callback plugin.
I haven't done it yet because I can't install the Ansible Python library on this machine. Specifically, Windows 7 can't have directory names > 260 chars in length, and pip tries to make lengthy temporary paths. But if someone gets around to it, please post it here.
Small improvement on @ickhyun-kwon's answer:
- name: "common/_ansible_log_path.yml: rename ansible.log"
connection: local
shell: |
mkdir -vp {{ inventory_dir }}/logs/{{ svn_deploy.release }}/ ;
mv -vf {{ inventory_dir }}/logs/ansible.log {{ inventory_dir }}/logs/{{ svn_deploy.release }}/ansible.{{ svn_deploy.release }}.{{ lookup('pipe', 'date +%Y-%m-%d-%H%M') }}.log args:
executable: /bin/bash
chdir: "{{ inventory_dir }}"
run_once: True
ignore_errors: True
This keeps separate log directories per svn release and ensures the log directory actually exists before the mv command runs.
Ansible interprets ./ as the current playbook directory, which may or may not be the root of your Ansible repository; mine live in ./playbooks/$project/$role.yml. For me {{ inventory_dir }}/logs/ happens to correspond to the ~/ansible/log/ directory, though alternative layout configurations do not guarantee this.
I am unsure of the correct way to formally extract the absolute ansible.cfg::log_path value.
Also note that in the date format string the month is %m, while %M is minutes.
I have faced a similar problem while trying to set dynamic log paths for various playbooks.
A simple solution seems to be to pass the log filename dynamically through the ANSIBLE_LOG_PATH environment variable. See https://docs.ansible.com/ansible/latest/reference_appendices/config.html
In this particular case just export the environment variable when running the intended playbook on your terminal:
export ANSIBLE_LOG_PATH=ansible.`date +%s`.log; ansible-playbook test.yml
Else, if the intended filename cannot be generated by the terminal, you can always use a runner playbook which runs the intended playbook from within:
---
- hosts:
    - localhost
  gather_facts: false
  ignore_errors: yes
  tasks:
    - name: set dynamic variables
      set_fact:
        task_name: dynamic_log_test
        log_dir: /path/to/log_directory/

    - name: Change the working directory and run the ansible-playbook as a shell command
      shell: "export ANSIBLE_LOG_PATH={{ log_dir }}log_{{ task_name|lower }}.txt; ansible-playbook test.yml"
      register: shell_result
This should log the result of test.yml to /path/to/log_directory/log_dynamic_log_test.txt
Hope you find this helpful!
