How to append a date/timestamp to the Ansible log file name?

How do I append a date/timestamp to the Ansible log file name?
Currently I have it as log_path=/var/ansible-playbooks/ansible.log in ansible.cfg.
Every time I run a playbook, I need the log to be saved to a file whose name carries the timestamp,
for example ansible-20160808142400.log.

Use the ANSIBLE_LOG_PATH environment variable.
Execute the playbook as follows:
ANSIBLE_LOG_PATH=/tmp/ansible_$(date "+%Y%m%d%H%M%S").log ansible-playbook myplaybook.yml
Alternatively, you can write your own callback plugin that will log what you want, where you want it.

If you're running on a UNIX-based system you can take advantage of the behavior of inodes. Define a log path in your ansible.cfg; I created a directory in $HOME/.ansible:
log_path = $HOME/.ansible/log/ansible.log
Create a pre_tasks section in your playbook and include the following task:
- name: Create the log file for this run
  shell: /bin/bash -l -c "mv {{ lookup('env', 'HOME') }}/.ansible/log/ansible.log {{ lookup('env', 'HOME') }}/.ansible/log/ansible.log-{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}"
  delegate_to: localhost
  become: yes
  become_user: "{{ lookup('env', 'USER') }}"
When ansible starts running a playbook it creates the log file and starts writing to it. The log file is then renamed to ansible.log-YYYYmmddHHMMSS and the ansible process continues to write to it because even though the log file's name has changed the inode associated with it hasn't.
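For orientation, here is a minimal sketch of where such a task could sit in a playbook; the hosts pattern, the run_once flag and the placeholder task body are illustrative additions, not part of the original answer:
---
- hosts: all
  pre_tasks:
    # Rotate the current log before any real work starts; Ansible's open file
    # handle keeps writing to the renamed file because the inode is unchanged.
    - name: Create the log file for this run
      shell: /bin/bash -l -c "mv {{ lookup('env', 'HOME') }}/.ansible/log/ansible.log {{ lookup('env', 'HOME') }}/.ansible/log/ansible.log-{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}"
      delegate_to: localhost
      run_once: true
  tasks:
    - name: Actual work goes here
      debug:
        msg: "playbook body"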

Related

ansible-playbook gather information from a file

I want Ansible to read a file, find a specific string in it, and store all matches in a file on my localhost.
For example, there is a /tmp/test file on every host and I want to grep for a specific string in this file and store all matches in a file in my home directory.
What should I do?
There might be many ways to accomplish this; the choice of Ansible modules (or even tools) can vary.
One approach (using only Ansible) is:
Slurp the remote file
Write a new file with the filtered content
Fetch the file to the control machine
Example:
- hosts: remote_host
  tasks:
    # Slurp the file
    - name: Get contents of file
      slurp:
        src: /tmp/test
      register: testfile

    # Filter the contents to a new file
    - name: Save contents to a variable for looping
      set_fact:
        testfile_contents: "{{ testfile.content | b64decode }}"

    - name: Write a filtered file
      lineinfile:
        path: /tmp/filtered_test
        line: "{{ item }}"
        create: yes
      when: "'TEXT_YOU_WANT' in item"
      with_items: "{{ testfile_contents.split('\n') }}"

    # Fetch the file
    - name: Fetch the filtered file
      fetch:
        src: /tmp/filtered_test
        dest: /tmp/
This will fetch the file to /tmp/<ANSIBLE_HOSTNAME>/tmp/filtered_test.
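If you would rather have the fetched copy land at a flat path instead of under that hostname-based directory tree, the fetch module's flat parameter can do that; a small sketch, with the hostname appended to the destination as an assumption to avoid collisions when several hosts are targeted:
    - name: Fetch the filtered file to a flat path
      fetch:
        src: /tmp/filtered_test
        dest: /tmp/filtered_test_{{ inventory_hostname }}
        flat: yes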
You can use the Ansible fetch module to download files from the remote system to your local system. You can then do the processing locally, as shown in this Ansible cli example:
REMOTE=[YOUR_REMOTE_SERVER]; \
ansible -m fetch -a "src=/tmp/test dest=/tmp/ansiblefetch/" $REMOTE && \
grep "[WHAT_YOU_ARE_INTERESTED_IN]" /tmp/ansiblefetch/$REMOTE/tmp/test > /home/user/ansible_files/$REMOTE
This snippet runs Ansible in ad-hoc mode, calling the fetch module with the source path (on the remote) and the destination folder (local) as arguments. Fetch copies the file into the folder [DEST]/[REMOTE_NAME]/[SRC], from which we then grep what we are interested in and write the output to /home/user/ansible_files/$REMOTE.

How to skip a task if a file has been modified

I am working on an Ansible playbook; granted, I am kind of new at this. Anyway, in the playbook I modify a file, and when I rerun the playbook I don't want that file to be modified again.
- name: Check if the domain config.xml has been edited
  stat: path={{ domainshome }}/{{ domain_name }}/config/config.xml
  register: config

- name: Config.xml modified
  debug: msg="The Config.xml has been modified"
  when: config.changed

- name: Edit the config.xml - remove extra file-store bad tag
  shell: "sed -i '776,780d' {{ domainshome }}/{{ domain_name }}/config/config.xml"
  when: config.changed
When I run it for the first time, it skips this step.
I need this step to run once and be skipped when the playbook is rerun.
I am trying to write an Ansible playbook that removes entries from the config file only when it is executed for the first time, so that the JVM can run.
Q: "Remove entries from config file only when it’s executed for the first time."
A: It's possible to use the creates parameter of the shell module to make sure the configuration file has been edited only once. For example
- name: Edit the config.xml - remove extra file-store bad tag
  shell: "sed -i '776,780d' {{ domainshome }}/{{ domain_name }}/config/config.xml"
  args:
    creates: "{{ domainshome }}/{{ domain_name }}/config/config.xml.lock"

- name: Create lock file
  file:
    state: touch
    path: "{{ domainshome }}/{{ domain_name }}/config/config.xml.lock"
Notes
Quoting: creates: A filename, when it already exists, this step will not be run.
Adjust the path and name of the lock file to your needs.
The stat module only returns information about a file and never changes it. The variable registered with register: config in that task would therefore never report that the file has been changed.
The file module with state: touch is not idempotent. Quoting: an existing file or directory will receive updated file access and modification times (similar to the way touch works from the command line).
A better solution would be to modify the command and create the lock file together with the sed call, e.g. "sed -i ... && touch /path-to-lockfile/lockfile", as in the sketch below.
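A minimal sketch of that combined approach, reusing the variables and line range from the question; the && ensures the lock file is created only if sed succeeds, and creates keeps the task from ever running a second time:
- name: Edit the config.xml - remove extra file-store bad tag (run once)
  shell: "sed -i '776,780d' {{ domainshome }}/{{ domain_name }}/config/config.xml && touch {{ domainshome }}/{{ domain_name }}/config/config.xml.lock"
  args:
    creates: "{{ domainshome }}/{{ domain_name }}/config/config.xml.lock"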

Ansible playbook error in Roles vars section while copying the main.yaml file of vars from a remote machine

I am trying to execute an Ansible playbook using roles. I have some variables, which I defined in the main.yaml of the vars section. I am copying these variables (main.yaml) from another remote machine.
My question is: my playbook throws an error the first time, even though it copies the main.yaml file into my vars section. When I run it a second time, the playbook executes fine. My understanding is that, the first time, even though the file is copied, it is not read because it was not present before the playbook started running. Is there a way I can run it successfully without an error the first time?
(The original post included a screenshot of the roles directory layout showing the roles and their sub-files.)
site.yaml
---
- name: Testing the Mini project
  hosts: all
  roles:
    - test
tasks/main.yaml
---
- name: Converting Mysql to CSV file
  command: mysqldump -u root -padmin -T/tmp charan test --fields-terminated-by=,
  when: inventory_hostname == "ravi"

- name: Converting CSV file to yaml format
  shell: python /tmp/test.py > /tmp/main.yaml
  when: inventory_hostname == "ravi"

- name: Copying yaml file from remote node to vars
  shell: sshpass -p admin scp -r root@192.168.56.101:/tmp/main.yaml /etc/ansible/Test/roles/vars/main.yaml
  when: inventory_hostname == "charan"

- name: Install Application as per the table
  apt: name={{ item.Application }} state=present
  when: inventory_hostname == {{ item.Username }}
  with_items: user_app
/vars/main.yaml (this will be copied from the remote machine):
---
user_app:
  - { Username: '"ravi"', Application: curl }
  - { Username: '"charan"', Application: git }
Take a look at the include_vars task. It may do what you need. It looks like you need to explicitly include /vars/main.yaml in a task that runs before the apt task where you reference the variables, as in the sketch below.
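A minimal sketch of such a task, placed before the apt task; the absolute path reuses the destination from the question's scp command, and the sketch assumes the copied file ends up on the Ansible control machine at that path (include_vars reads files on the controller):
- name: Load the freshly copied variables before using them
  include_vars: /etc/ansible/Test/roles/vars/main.yaml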

In Ansible, how can I set the log file name dynamically?

I'm currently developing an Ansible script to build and deploy a Java project.
I can set the log_path like below:
log_path=/var/log/ansible.log
but it is hard to look up the build history.
Is it possible to append a datetime to the log file name?
For example:
ansible.20150326145515.log
I don't believe there is a built-in way to generate the date on the fly like that but one option you have is to use a lookup which can shell out to date. Example:
log_path="/var/log/ansible.{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}.log"
Here is an option using the ANSIBLE_LOG_PATH environment variable, thanks to a Bash shell alias:
alias ansible="ANSIBLE_LOG_PATH=ansible-\`date +%Y%m%d%H%M%S\`.log ansible"
Feel free to use an absolute path if you prefer.
I found it.
Just add a task to copy (or mv) the log locally:
- name: Copy ansible.log
  connection: local
  command: mv ./logs/ansible.log ./logs/ansible.{{ lookup('pipe', 'date +%Y%m%d%H%M%S') }}.log
  run_once: true
Thanks to @jarv.
How about this:
- shell: date +%Y%m%d%H%M%S
  register: timestamp

- debug: msg="foo.{{ timestamp.stdout }}.log"
Output:
TASK [command] *****************************************************************
changed: [blabla.example.com]

TASK [debug] *******************************************************************
ok: [blabla.example.com] => {
    "msg": "foo.20160922233847.log"
}
According to the nice folks on the #ansible freenode IRC channel, this can be accomplished with a custom callback plugin.
I haven't done it yet because I can't install the Ansible Python library on this machine. Specifically, Windows 7 can't have directory names longer than 260 characters, and pip tries to create lengthy temporary paths. But if someone gets around to it, please post it here.
Small improvement on @ickhyun-kwon's answer:
- name: "common/_ansible_log_path.yml: rename ansible.log"
connection: local
shell: |
mkdir -vp {{ inventory_dir }}/logs/{{ svn_deploy.release }}/ ;
mv -vf {{ inventory_dir }}/logs/ansible.log {{ inventory_dir }}/logs/{{ svn_deploy.release }}/ansible.{{ svn_deploy.release }}.{{ lookup('pipe', 'date +%Y-%m-%d-%H%M') }}.log args:
executable: /bin/bash
chdir: "{{ inventory_dir }}"
run_once: True
ignore_errors: True
This keeps separate log directories per svn release and ensures the log directory actually exists before the mv command runs.
Ansible interprets ./ as the current playbook directory, which may or may not be the root of your Ansible repository; mine live in ./playbooks/$project/$role.yml. For me {{ inventory_dir }}/logs/ happens to correspond to the ~/ansible/log/ directory, though alternative layout configurations do not guarantee this.
I am unsure of the correct way to formally extract the absolute ansible.cfg log_path value (one possible approach is sketched after these notes).
Also, the date directive for the month is %m, not %M, which is minutes.
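On reasonably recent Ansible versions the config lookup plugin can read effective configuration values; a hedged sketch of using it to retrieve the configured log path, assuming DEFAULT_LOG_PATH (the setting behind log_path) is available in your version:
- name: Show the configured log path
  debug:
    msg: "{{ lookup('config', 'DEFAULT_LOG_PATH') }}"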
I have faced a similar problem while trying to set dynamic log paths for various playbooks.
A simple solution seems to be to pass the log filename dynamically via the ANSIBLE_LOG_PATH environment variable. See https://docs.ansible.com/ansible/latest/reference_appendices/config.html
In this particular case, just export the environment variable when running the intended playbook in your terminal:
export ANSIBLE_LOG_PATH=ansible.`date +%s`.log; ansible-playbook test.yml
Otherwise, if the intended filename cannot be generated by the terminal, you can always use a runner playbook which runs the intended playbook from within:
---
- hosts:
    - localhost
  gather_facts: false
  ignore_errors: yes
  tasks:
    - name: set dynamic variables
      set_fact:
        task_name: dynamic_log_test
        log_dir: /path/to/log_directory/

    - name: Change the working directory and run the ansible-playbook as a shell command
      shell: "export ANSIBLE_LOG_PATH={{ log_dir }}log_{{ task_name|lower }}.txt; ansible-playbook test.yml"
      register: shell_result
This should log the result of test.yml to /path/to/log_directory/log_dynamic_log_test.txt
Hope you find this helpful!

Finding file name in files section of current Ansible role

I'm fairly new to Ansible and I'm trying to create a role that copies a file to a remote server. The local file can have a different name every time I run the playbook, but it needs to be copied to the same name remotely, something like this:
- name: copy file
  copy: src=*.txt dest=/path/to/fixedname.txt
Ansible doesn't allow wildcards, so when I wrote a simple playbook with the tasks in the main playbook I could do:
- name: find the filename
  connection: local
  shell: "ls -1 files/*.txt"
  register: myfile

- name: copy file
  copy: src="files/{{ item }}" dest=/path/to/fixedname.txt
  with_items: "{{ myfile.stdout_lines }}"
However, when I moved the tasks to a role, the first action didn't work anymore, because the relative path is resolved relative to the role, while the playbook executes from the directory that contains the 'roles' directory. I could add the path to the role's files dir, but is there a more elegant way?
It sounds like you need a task that looks up information locally and then uses that information as input to the copy module.
There are two ways to get local information:
use local_action:. That's shorthand for running the task against 127.0.0.1; more info can be found here. (This is what you've been using.)
use a lookup. This is a plugin system specifically designed for getting information locally. More info here.
In your case, I would go for the second method, using a lookup. You could set it up like this example:
vars:
  local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"
tasks:
  - name: copy file
    copy: src="{{ local_file_name }}" dest=/path/to/fixedname.txt
Or, more directly:
tasks:
  - name: copy file
    copy: src="{{ lookup('pipe', 'ls -1 files/*.txt') }}" dest=/path/to/fixedname.txt
With regards to paths:
The lookup plugin runs in the context of the task (playbook vs. role), which means it will behave differently depending on where it's used.
In the setup above, the tasks are run directly from a playbook, so the working dir will be:
/path/to/project -- this is the folder where your playbook is.
If you were to add the task to a role, the working dir would be:
/path/to/project/roles/role_name/tasks
In addition, the file and pipe lookups run from within the role/files folder if it exists:
/path/to/project/roles/role_name/files -- this means your command becomes ls -1 *.txt (see the sketch below).
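Under that assumption, a minimal sketch of how the task might look inside a role (the role name and destination path are placeholders):
# roles/role_name/tasks/main.yml
- name: copy file
  copy: src="{{ lookup('pipe', 'ls -1 *.txt') }}" dest=/path/to/fixedname.txt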
Caveat:
The plugin is called every time you access the variable. This means you cannot trust debugging the variable in your playbook and then rely on the variable to have the same value when it is used later in a role!
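If you need the value to stay stable for the rest of the run, one workaround (not from the original answer) is to evaluate the lookup once and store the result in a fact, since set_fact captures the evaluated value rather than the lookup expression:
- name: Freeze the discovered file name
  set_fact:
    local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"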
I do wonder, though, about the use case for a file that resides inside a project's Ansible folders but whose name is not known in advance. Where does such a file come from? Isn't it possible to add a layer between the generation of the file and its use in Ansible, or to have a fixed local path as a variable? Just curious ;)
Just wanted to throw in an additional answer... I have the same problem as you, where I build an Ansible bundle on the fly and copy artifacts (RPMs) into a role's files folder, and my RPMs have versions in their filenames.
When I run the Ansible play, I want it to install all RPMs, regardless of filenames.
I solved this by using the with_fileglob mechanism in Ansible:
- name: Copy RPMs
  copy: src="{{ item }}" dest="{{ rpm_cache }}"
  with_fileglob: "*.rpm"
  register: rpm_files

- name: Install RPMs
  yum: name={{ item }} state=present
  with_items: "{{ rpm_files.results | map(attribute='dest') | list }}"
I find it a little bit cleaner than the lookup mechanism.
