Copy and run config files programmatically in Ansible

I am writing some Ansible code that does two tasks: first, copy a configuration file to the target instances in a group; second, run that config file to install the application.
I create the configuration files and the inventory programmatically, so the same suffix is added to both the config file name and the group name in the inventory:
Config file name example    Equivalent group name
myappconf1                  [myapp1]
                            hostname
myappconf2                  [myapp2]
                            hostname
This is my code for copying the files:

- hosts: all
  tasks:
    - name: Copy file.role1 to host1
      copy: src=/tmp/myconf1 dest=/tmp
      when:
        - "'myapp1' in group_names"

    - name: Copy config file two to Ldap2
      copy: src=/tmp/myconf2 dest=/tmp
      when:
        - "'myapp2' in group_names"
This is my code for running the config file:

- hosts: myapp1
  tasks:
    - command: "/tmp/mainapp/update.sh -f myappconf1"

- hosts: myapp2
  tasks:
    - command: "/tmp/mainapp/update.sh -f myappconf2"
But depending on user input, an arbitrary number of conf files and groups can be created, so I would like to do these tasks more generically. The desired code might look like:

[task for copying the file]
- hosts: ~(myapp)
  tasks:
    - copy: # copy the appropriate file to the host
            # example: copy myappconf4 to myapp4
    - command: # run the command with the appropriate file
               # example: for myapp3, command: /tmp/mainapp/update.sh -f myappconf3

Can someone please suggest what I can use to make my code more generic and efficient?

You can use group variables like this:

# File: "myapp1/group_vars/all/vars_file.yml"
# Application settings
app_name: "myapp1"
copy_file: "myapp1conf1"
.
.
# other variables used for myapp1

Then use a second folder for the second app:

# File: "myapp2/group_vars/all/vars_file.yml"
# Application settings
app_name: "myapp2"
copy_file: "myapp2conf2"
.
.
# other variables used for myapp2
Then modify your code as below:

- hosts: all
  tasks:
    - name: Copy file.role1 to host1
      copy: src=/tmp/{{ copy_file }} dest=/tmp

    - name: Execute script
      command: "/tmp/mainapp/update.sh -f {{ app_name }}"
Now, depending on the environment, pass the user's choice at run time as below:
ansible-playbook -i myapp1 main.yml
or
ansible-playbook -i myapp2 main.yml
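For clarity, the directory layout this answer seems to assume is one inventory directory per app, each with its own group_vars; only the group_vars path is given in the answer, so the inventory file name hosts is an assumption:

myapp1/
    hosts                  # inventory listing the myapp1 hosts (name assumed)
    group_vars/
        all/
            vars_file.yml
myapp2/
    hosts
    group_vars/
        all/
            vars_file.yml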

I populated host variables with my Python code in the format below to resolve the issue:
[myapp]
host1 conf_file="myappconf1"
host2 conf_file="myappconf2"
Then in my playbook I used that variable for both the copy task and the config-file execution task.
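A minimal sketch of what that looks like in the playbook, assuming the inventory above and the paths from the question; the single conf_file host variable drives both tasks:

- hosts: myapp
  tasks:
    - name: Copy the host-specific config file
      copy:
        src: "/tmp/{{ conf_file }}"
        dest: /tmp

    - name: Run the update script against the same config file
      command: "/tmp/mainapp/update.sh -f {{ conf_file }}"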

Related

Send the output from Ansible to a file [duplicate]

I am trying to gain knowledge in Ansible and solve a few problems.
I am not sure if this is even possible: can the output be saved locally on the server the playbook is run from?
In the example below I just print to the terminal where I run the playbook. That is not much use when there is a large amount of data; I would like it saved to a file on the server I run the playbook from instead.
---
- name: list os version
  hosts: test
  become: true
  tasks:
    - name: hostname
      command: hostname
      register: command_output

    - name: cat /etc/redhat-release
      command: cat redhat-release chdir=/etc

    - name: Print output to console
      debug:
        msg: "{{ command_output.stdout }}"
I really want the output to go to a file, but I can't find anything about whether this is possible.
As the Ansible documentation explains, you can create a local configuration file ansible.cfg in the directory where your playbook lives and set the appropriate log setting there so that all the playbook output is written to a file: Ansible output documentation
By default Ansible sends output about plays, tasks, and module arguments to your screen (STDOUT) on the control node. If you want to capture Ansible output in a log, you have three options:
To save Ansible output in a single log on the control node, set the log_path configuration file setting (see the snippet after this list). You may also want to set display_args_to_stdout, which helps to differentiate similar tasks by including variable values in the Ansible output.
To save Ansible output in separate logs, one on each managed node, set the no_target_syslog and syslog_facility configuration file settings.
To save Ansible output to a secure database, use AWX or Red Hat Ansible Automation Platform. You can then review history based on hosts, projects, and particular inventories over time, using graphs and/or a REST API.
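For the first option, a minimal ansible.cfg sketch placed next to the playbook might look like this (the log file name is just an example):

[defaults]
# write everything Ansible prints on the control node to this file
log_path = ./ansible.log
# optionally include module arguments in the output, as mentioned above
display_args_to_stdout = True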
If you just want to write the result of a task to a file, use the copy module delegated to localhost:
---
- name: list os version
  hosts: test
  become: true
  tasks:
    - name: hostname
      command: hostname
      register: command_output

    - name: cat /etc/redhat-release
      command: cat redhat-release chdir=/etc

    - name: Create your local file on the control node
      ansible.builtin.file:
        path: /your/local/file
        state: touch   # ensure the file exists; the default state fails if it is missing
        owner: foo
        group: foo
        mode: '0644'
      delegate_to: localhost

    - name: Print output to file
      ansible.builtin.copy:
        content: "{{ command_output.stdout }}"
        dest: /your/local/file
      delegate_to: localhost
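If the play runs against several hosts and one combined file is wanted, a sketch along these lines (an assumption, not part of the original answer) gathers the registered output from every host and writes it once on the control node:

    - name: Write all hosts' output to a single local file
      ansible.builtin.copy:
        content: |
          {% for host in ansible_play_hosts %}
          {{ host }}: {{ hostvars[host].command_output.stdout }}
          {% endfor %}
        dest: /your/local/file
      delegate_to: localhost
      run_once: true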

Executing playbooks in groupings created in hosts.yaml file

Ansible Version: 2.8.3
I have the following hosts.yaml file for use with Ansible.
I have applications that I want to deploy to potentially both rp_1 and rp_2:
---
all:
  vars:
    docker_network_name: devopsNet
    http_protocol: http
    http_host: ansiblenode01_new.example.com
    http_url: "{{ http_protocol }}://{{ http_host }}:{{ http_port }}/{{ http_context }}"
  hosts:
    ansiblenode01_new.example.com:
    ansiblenode02_new.example.com:
  children:
    ##################################################################
    rp_1:
      children:
        httpd:
          hosts:
            ansiblenode01_new.example.com:
          vars:
            number_of_tools: 6
            outside_port: 443
        jenkins:
          hosts:
            ansiblenode01_new.example.com:
          vars:
            http_port: 4444
            http_context: jenkins
        artifactory:
          hosts:
            ansiblenode01_new.example.com:
          vars:
            http_port: 8000
            http_context: artifactory
    rp_2:
      children:
        httpd:
          hosts:
            ansiblenode02_new.example.com:
          vars:
            number_of_tools: 4
            outside_port: 7090
        jenkins:
          hosts:
            ansiblenode02_new.example.com:
          vars:
            http_port: 7990
            http_context: jenkins
        artifactory:
          hosts:
            ansiblenode02_new.example.com:
          vars:
            http_port: 8000
            http_context: artifactory
The following Python wrapper script calls ansible-playbook in a loop to deploy the applications:
#!/usr/bin/python
import yaml
import os
import getpass

with open('hosts.yaml') as f:
    var = yaml.load(f)

sudo_pass = getpass.getpass(prompt="Please enter sudo password: ")

# Running an individual ansible-playbook deployment for each application listed
# and uncommented under the 'applications' object.
for network in var['all']['children']:
    for app in var['all']['children'][network]['children']:
        os.system('ansible-playbook deploy.yml --extra-vars "application=' + app + ' ansible_sudo_password=' + sudo_pass + '"')
The problem, as I see it, is that both Ansible and Python use the hosts.yaml file, but not in the way I thought they would, as I'm not too familiar with Ansible.
The hosts.yaml file is written in the format required by Ansible.
The Python script opens the YAML file, builds a dictionary from it, and steps through that dictionary looking for application names to pass on the command line. The problem is that Python only passes the app name as a string when it invokes ansible-playbook; the dictionary structure obviously isn't passed along. Ansible then opens hosts.yaml itself, but all it does is look for the first occurrence of the app name that was passed as an argument, completely disregarding the structure I've created in the YAML file.
So basically only the rp_1 group in the YAML file gets executed, since Ansible (I think) reads the YAML top down and stops at the first occurrence. As a result, all or part of the rp_2 group is never processed if it contains some or all of the same apps as rp_1, and the same deployment ends up running twice.
Is there a way to invoke Ansible, or to set the playbooks up, so that Ansible recognizes that my hosts file contains networks (rp_1, rp_2) that I want to set up, and executes the playbooks according to the groupings I've created in the YAML file?
Ansible already has this built-in. You do not need a wrapper script.
To run the deploy.yml playbook on all hosts in your hosts.yaml (this is called "inventory" btw.) do this:
ansible-playbook -i hosts.yaml deploy.yml -bK
To only run it on rp_1, do this:
ansible-playbook -i hosts.yaml deploy.yml --limit rp_1 -bK
-b makes ansible become root
-K will make ansible ask for the password to become root
-i <file> specifies the inventory file
--limit <host/group> limits the execution to certain hosts or groups; you can also give more than one, as a comma-separated list (e.g., rp_1,rp_2)
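For example, to reproduce a single iteration of the Python wrapper loop directly, combining the flags above with the extra variable the original script passes:

ansible-playbook -i hosts.yaml deploy.yml --limit rp_2 -bK --extra-vars "application=jenkins"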
You can also specify a list of hosts/groups in your playbook like this:
- name: do whatever you like
  hosts:
    - rp_1
    - rp_2
  become: yes
  tasks:
    - debug:
        msg: "I'm running on {{ inventory_hostname }}!"
Further reading:
Discovering variables: facts and magic variables
How to build your inventory
Special variables
Using variables
Ansible examples
Accessing variables of "other" hosts: on serverfault and stackoverflow

Ansible copy file using command syntax

The assignment is as follows:
Let's create a file (touch afile.txt) prior to creating tasks
Create a playbook test.yml to
copy afile.txt from your control machine to host machine at /home/ubuntu/ location as afile_copy.txt and
debug the above task to display the returned value
Execute your playbook (test.yml) and observe the output
I did the following:
Created afile_copy.txt using touch
Created the playbook as follows:

- name: copy files
  hosts: all
  tasks:
    - name: copy file
      command: cp afile.txt /home/ubuntu/afile_copy.txt
      register: output

    - debug: var=output
When I run the playbook using the command
ansible-playbook -i myhosts test.yml
it fails with the error message
stderr: cp: cannot stat 'afile.txt' : no such file or directory
afile.txt is present in the directory /home/scrapbook/tutorial.
You should use the copy module instead of the command module; the command module executes on the remote node, so it looks for afile.txt there.
1) First execute an ad-hoc command for the copy:
ansible all -i myhosts -m copy -a "src=afile.txt dest=/home/ubuntu/"
2) After executing the above command, execute this playbook:

- hosts: all
  tasks:
    - stat: path=/home/ubuntu/afile_copy.txt
      register: st

    - name: rename
      command: mv afile.txt /home/ubuntu/afile_copy.txt
      when: not st.stat.exists
      register: output

    - debug: var=output
The copy module should be used instead of the command module:

- name: copy files
  hosts: all
  tasks:
    - name: copy file
      copy: src=afile.txt dest=/home/ubuntu/afile_copy.txt
      register: output

    - debug: var=output
---
- name: copy files
  hosts: all
  tasks:
    - name: copy file
      copy:
        src: afile.txt
        dest: /home/ubuntu/afile_copy.txt
      register: output

    - debug: var=output

dynamic include var files at playbook level [duplicate]

I have created my own custom library and added it to the common folder of my repository. I need to pass variables to it dynamically. One of them is a confidential password, so I am using Vault in Ansible.
My requirement is to use include_vars in tasks\main.yml before the hosts section.
e.g. mytasks.yml:

- include_vars: sample_vault.yml
- include: sample_tasks.yml

- hosts: localhost
  tasks:
    - name: "free task"
      command: ls -a
My directory structure is like this:

myfolder
- common
    - library
        - my file.py
    - sample_tasks.yml
- mytasks
    - mytasks.yml (my main master playbook file)
    - sample_vault.yml (note: I created this with Vault because it is confidential)
- roles
    - myrole

Here I need to run the sample_tasks file, using the variables passed in the sample_vault.yml file, before I execute the hosts tasks with Ansible. If I pass the password as an extra variable it is visible, so I don't want that.
When I use include_vars in my tasks/main.yml file, it shows the following error:
ERROR! 'include_vars' is not a valid attribute for a Play
You can't use include_vars this way; it's only available for use under tasks.
If sample_tasks.yml is a list of tasks, you also can't include it at playbook level. See my other answer for an explanation.
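For reference, a minimal sketch of include_vars used where it is valid, i.e. as a task (file and task names taken from the question):

- hosts: localhost
  tasks:
    - include_vars: sample_vault.yml

    - name: "free task"
      command: ls -a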
You can use vars_files like this:

- hosts: localhost
  vars_files:
    - sample_vault.yml
  tasks:
    - name: "free task"
      command: ls -a
Or pass the file as extra variables:
ansible-playbook --ask-vault-pass -e @sample_vault.yml myplaybook.yml

A newbie trying to use Ansible and a weird error

I have this playbook:
- name: main playbook
  hosts: 127.0.0.1
  connection: local
  sudo: Yes
  gather_facts: True
  vars_files:
    - /home/core/REPO/alonso/core2door-integration/workflows/core2door.ansible/vars-feed.yml
  tasks:
    - include: tasks/feed-adapter.yml
tasks/feed-adapter.yml
# This playbook deploys feed-adapter application in this host. This app needs a jar that basically writes to HDFS and write a log in a system,
# actually Impala. Variables are described within vars-feed.yml file.
- name: feed-adapter playbook
hosts: "{{host_feed}}"
remote_user: "{{remote-user}}"
sudo: Yes
- name: Creates feed_adapter_outputpath directory
file: path=/var/app/feed-adapter/ state=directory
- name: Creates check-feed-adapter-folders_outputpath directory
file: path=/var/app/check-feed-adapter-folders/ state=directory
- name: Copy supervisor conf files to /etc/supervisor.d folder
copy: src=/home/core/REPO/alonso/core2door-integration/feed-adapter/feed_adapter_Sip_Pub.conf dest=/etc/supervisor.d/feed_adapter_Sip_Pub.conf owner=root group=root mode=0644
- name: Copy generated jar to destination folder
copy: src=/home/core/REPO/alonso/core2door-integration/feed-adapter/target/feed-adapter-0.0.1-SNAPSHOT.jar dest=/var/app/check-feed-adapter-folders/feed-adapter.jar mode=0644
- name: Copy necessary .properties files to destination folder
copy: src=/home/core/REPO/alonso/core2door-integration/feed-adapter/feed-adapter-SIP-Pub.properties dest=/var/app/check-feed-adapter-folders/feed-adapter-SIP-Pub.properties mode=0644
- name: Copy check-feed-adapter-folders jar to destination folder
copy: src=/home/core/REPO/alonso/core2door-integration/CheckFeedAdapterFolders/target/CheckFeedAdapterFolders-0.0.1-SNAPSHOT.jar dest=/var/app/check-feed-adapter-folders/CheckFeedAdapterFolders.jar mode=0644
- name: Copy check.sh script to destination folder
copy: src=/home/core/REPO/alonso/core2door-integration/CheckFeedAdapterFolders/check.sh dest=/var/app/check-feed-adapter-folders/check.sh mode=0644
When I try to run the playbook, it returns this output:
$ ansible-playbook playbook.yml --syntax-check --i hosts.txt
/usr/lib/python2.6/site-packages/setuptools-12.0.5-py2.6.egg/pkg_resources/__init__.py:1222: UserWarning:
/home/core/.python-eggs is writable by group/others and vulnerable to
attack when used with get_resource_filename. Consider a more secure
location (set with .set_extraction_path or the PYTHON_EGG_CACHE
environment variable).
playbook: playbook.yml
ERROR: copy is not a legal parameter at this level in an Ansible Playbook
Please help, I am a newbie with Ansible.
UPDATE
After Daniel's advice, I have carefully indented the playbook and I am getting this error:
$ vi tasks/feed-adapter.yml
$ ansible-playbook playbook.yml --syntax-check --i hosts.txt
/usr/lib/python2.6/site-packages/setuptools-12.0.5-py2.6.egg/pkg_resources/__init__.py:1222: UserWarning:
/home/core/.python-eggs is writable by group/others and vulnerable to
attack when used with get_resource_filename. Consider a more secure
location (set with .set_extraction_path or the PYTHON_EGG_CACHE
environment variable).
playbook: playbook.yml
ERROR: file is not a legal parameter at this level in an Ansible Playbook
UPDATE 2:
The execution of ansible-playbook with -vvvv shows the same output:
ERROR: file is not a legal parameter at this level in an Ansible Playbook
[core#dub-vcd-vms171 core2door.ansible]$ ansible-playbook --i hosts.txt -vvvv playbook.yml
/usr/lib/python2.6/site-packages/setuptools-12.0.5-py2.6.egg/pkg_resources/__init__.py:1222: UserWarning: /home/core/.python-eggs is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable).
ERROR: file is not a legal parameter at this level in an Ansible Playbook
I guess that the error comes from these lines:
- name: Creates feed_adapter_outputpath directory
file: path=/var/app/feed-adapter/ state=directory
You have to be very careful with line indentation when working with YAML files. The error here is probably caused by the copy element in line 28:

- name: ...
copy: ...

Add two whitespaces in front of the copy, so it looks like this:

- name: ...
  copy: ...
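Applied to the included file, a correctly indented start of tasks/feed-adapter.yml would look like the sketch below, with each module line nested under its - name:. Note also (not covered by the answer above) that play-level keys such as hosts, remote_user and sudo belong in the parent playbook.yml, not in a file included under tasks:

- name: Creates feed_adapter_outputpath directory
  file: path=/var/app/feed-adapter/ state=directory

- name: Creates check-feed-adapter-folders_outputpath directory
  file: path=/var/app/check-feed-adapter-folders/ state=directory

- name: Copy supervisor conf files to /etc/supervisor.d folder
  copy: src=/home/core/REPO/alonso/core2door-integration/feed-adapter/feed_adapter_Sip_Pub.conf dest=/etc/supervisor.d/feed_adapter_Sip_Pub.conf owner=root group=root mode=0644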
