Looking for an Ansible solution to read standalone.xml files on WildFly

I'm looking for a solution to gather and organize standalone.xml files from various WildFly servers, grouped by "staging" or "production" in my hosts file.
I'd like something with the same output functionality as:
ansible wildfly -m setup --tree config
which creates a file per host with the requested data.
For example, if I have 4 servers, each having a file named exactly the same, in the same path, but with different contents, I could have them copied to a local directory and named after the server each came from, e.g.:
standalone.server1.myserver.com
standalone.server2.myserver.com

Use the Ansible fetch module, which has a few examples in its documentation.
A very simple playbook may look like:
---
- hosts: wildfly
  tasks:
    - name: Store file into /tmp/fetched/{hostname}/tmp/somefile
      fetch:
        src: /tmp/somefile
        dest: /tmp/fetched
Run the playbook:
ansible-playbook playbook.yml

You can use the fetch module, e.g. as an ad hoc command:
ansible wildfly -i myInventory -m fetch -a "src=/myRemotePathname/standalone dest=/myLocalPathName/myDir" -u myUser
You'll get the remote standalone file from the remote directory /myRemotePathname of every host belonging to the wildfly group defined in the myInventory file.
Local files are stored under the local /myLocalPathName/myDir directory, in a subdirectory named after the remote host, and under that the remote directory path.
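To get exactly the naming the question asks for (one local file per host, named after the host), the fetch module's flat parameter can be combined with the inventory_hostname variable. A minimal sketch; the wildfly group name and the WildFly configuration path are assumptions based on the question and a default install:

```yaml
---
- hosts: wildfly
  tasks:
    - name: Fetch standalone.xml, named after the source host
      fetch:
        # Path assumed from a default WildFly install; adjust as needed.
        src: /opt/wildfly/standalone/configuration/standalone.xml
        # flat: yes suppresses the usual hostname/remote-path directory
        # layout, so dest is used literally as the local file name.
        dest: "config/standalone.{{ inventory_hostname }}"
        flat: yes
```

Run against 4 hosts, this produces config/standalone.server1.myserver.com and so on, on the control machine.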

Related

AnsibleFileNotFound error during Ansible playbook execution via AWS SSM

I zipped up an Ansible playbook and a configuration file, pushed the .zip file to S3, and I'm triggering the Ansible playbook from AWS SSM.
I'm getting an AnsibleFileNotFound error: AnsibleFileNotFound: Could not find or access '/path/to/my_file.txt' on the Ansible Controller.
Here is my playbook:
- name: Copies a configuration file to a machine.
  hosts: 127.0.0.1
  connection: local
  tasks:
    - name: Copy the configuration file.
      copy:
        src: /path/to/my_file.txt
        dest: /etc/my_file.txt
        owner: root
        group: root
        mode: '0644'
      become: true
my_file.txt exists in the .zip file that I uploaded to S3, and I've verified that it's being extracted (via the AWS SSM output). Why wouldn't I be able to copy that file over? What do I need to do to get Ansible to save this file to /etc/ on the target machine?
EDIT:
Using remote_src: true makes sense because the .zip file is presumably unpacked by AWS SSM to somewhere on the target machine. The problem is that this is unpacked to a random temp directory, so the file isn't found anyway.
I tried a couple of different absolute paths - I am assuming the path here is relative to the .zip root.
The solution here is a bit horrendous:
The .zip file is extracted to the machine into an ephemeral directory with a random name which is not known in advance of the AWS SSM execution.
remote_src must be true. Yeah, it's your file that you've uploaded to S3, but Ansible isn't really smart enough to know that in this context.
A path relative to the playbook has to be used if you're bundling configuration files with the playbook.
That relative path has to be interpolated.
So using src: "{{ playbook_dir | dirname }}/path/to/my_file.txt" solved the problem in this case.
Note that this approach should not be used if configuration files contain secrets, but I'm not sure what approach AWS SSM offers for that type of scenario when you are using it in conjunction with Ansible.
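Putting those pieces together, the failing task from the question could be rewritten along these lines (a sketch; the path/to/ segment is kept from the question as a placeholder for the actual location inside the bundle):

```yaml
- name: Copy the configuration file.
  copy:
    # The bundle is unpacked on the target machine, so the source
    # file is remote from Ansible's point of view.
    remote_src: true
    # playbook_dir resolves to wherever AWS SSM unpacked the playbook,
    # so the file is located relative to it rather than by a fixed
    # absolute path.
    src: "{{ playbook_dir | dirname }}/path/to/my_file.txt"
    dest: /etc/my_file.txt
    owner: root
    group: root
    mode: '0644'
  become: true
```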

Fetching files from pythonless remote hosts

I know that I can manage remote hosts that don't have Python installed (or whose Python is too old) using the 'raw' and 'script' modules. It's possible to upload a file and execute it as a script, or run some arbitrary command. But how can I fetch a file from the remote host? Something like the 'fetch' module, which requires Python?
You can execute scp on the Ansible host to fetch files.
tasks:
  - name: Fetch a file
    local_action: ansible.builtin.command scp {{ inventory_hostname }}:/path/to/file /tmp/.
See delegating tasks in the documentation.
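The same idea in the equivalent delegate_to form, which some find easier to read than local_action; a sketch assuming the control node has SSH key access to the remote host:

```yaml
tasks:
  - name: Fetch a file from a python-less host via scp
    # Runs scp on the control node instead of the managed host,
    # so no Python is needed on the remote side.
    ansible.builtin.command: scp {{ inventory_hostname }}:/path/to/file /tmp/.
    delegate_to: localhost
```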

ansible-pull on remote hosts

I want to run a playbook on a remote host, where the playbook is in GitHub. So, I followed this blog and forked the repo https://github.com/vincesesto/ansible-pull-example
Inside the repo, I modified the hosts file to my server IP. When I run ansible-pull:
veeru#carb0n:~/ansible-example$ ansible-pull -U https://github.com/veerendra2/ansible-pull-example -i hosts
Starting Ansible Pull at 2019-06-26 16:26:30
/usr/local/bin/ansible-pull -U https://github.com/veerendra2/ansible-pull-example -i hosts
[WARNING]: Could not match supplied host pattern, ignoring: carb0n
ERROR! Specified hosts and/or --limit does not match any hosts
Not sure why it is picking the current server name carb0n even though I specified the -i hosts argument.
Here is my hosts file:
[hydrogen]
10.250.30.11
local.yml
---
- hosts: all
  tasks:
    - name: install example application
      copy:
        src: ansible_test_app
        dest: /tmp/
        owner: root
        group: root
I changed local.yml to hydrogen.yml, but I still get the same error.
Not sure why it is picking the current server name carb0n even though I specified the -i hosts argument.
Sure, because ansible-pull is designed to always run against the current host. If you want to run against a remote server, then you are supposed to use ansible or ansible-playbook, and then your specification of a host list and the connection mechanism would start to make sense again.
ansible-pull is designed for cases where it is either impossible, or highly undesirable, for something to connect to the managed host. That can be due to firewalls, security policies, or any number of reasons. But policies are usually less strict about what a managed host can, itself, connect to, and that's why pulling configuration onto the host can be easier.
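Since ansible-pull checks out the repo on the managed host itself and runs the playbook there, a local.yml for it would target the local machine rather than a remote group. A sketch based on the playbook in the question:

```yaml
---
# ansible-pull runs this on the host that pulled the repo,
# so target localhost with a local connection.
- hosts: localhost
  connection: local
  tasks:
    - name: install example application
      copy:
        src: ansible_test_app
        dest: /tmp/
        owner: root
        group: root
```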

Ansible node not able to access a file in my host System

I am trying to copy a file from my host Mac system to CentOS on a VM through Ansible roles.
I have a folder called Ansible Roles, and under that I have used the ansible-galaxy command to create a role called tomcatdoccfg. helloworld.war is present in the root Ansible Roles folder.
The folder structure is as shown (screenshot not included here). The role's tasks/main.yml playbook on the Mac is as below:
- name: Copy war file to tmp
  copy:
    src: ⁨helloworld.war
    dest: /tmp/helloworld.war
The helloworld.war file is accessible to the user abhilashdk (my default Mac username). The CentOS VM also has a user called abhilashdk. I have configured SSH keys, meaning I generated them with ssh-keygen -t rsa and copied them to the CentOS VM using ssh-copy-id, and I am able to ping the VM using the ansible -i hosts node1 -m ping command. I am also able to install Docker on my node1 machine using Ansible.
I have a main.yml file in the root Ansible Roles folder the contents of which is as below:
---
- hosts: node1
  vars:
    webapp:
      app1:
        PORT: 8090
        NAME: webapp1
      app2:
        PORT: 8091
        NAME: webapp2
  become: true
  roles:
    - docinstall
    - tomcatdoccfg
Now when I run the command ansible-playbook -i hosts main.yml I get the below error for Copy war file to tmp:
TASK [tomcatdoccfg : Copy war file to tmp] ************************************************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: /Files/DevOps/Ansible/Ansible_roles/⁨helloworld.war
fatal: [node1]: FAILED! => {"changed": false, "msg": "Could not find or access '⁨helloworld.war'\nSearched in:\n\t/Files/DevOps/Ansible/Ansible_roles/tomcatdoccfg/files/⁨helloworld.war\n\t/Files/DevOps/Ansible/Ansible_roles/tomcatdoccfg/⁨helloworld.war\n\t/Files/DevOps/Ansible/Ansible_roles/tomcatdoccfg/tasks/files/⁨helloworld.war\n\t/Files/DevOps/Ansible/Ansible_roles/tomcatdoccfg/tasks/⁨helloworld.war\n\t/Files/DevOps/Ansible/Ansible_roles/files/⁨helloworld.war\n\t/Files/DevOps/Ansible/Ansible_roles/⁨helloworld.war"}
I don't understand what permissions I should give the helloworld.war file so that CentOS on the VM will be able to access it through the Ansible playbook/roles.
Could anybody help me solve this issue?
Thanks in advance.
Adding this as an answer, so I can show the non-Latin characters included in the log you attached to the question:
note the invisible character right before helloworld.war.
That could be the reason why Ansible can't find the file on the FS.
To be on the safe side, I would delete the whole main.yml and rewrite it.
ansible-playbook -i hosts main.yml --ask-sudo-pass or ansible-playbook -i hosts main.yml --ask-pass
These params will ask you for the sudo password for playbook operations.
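Retyping the task by hand removes the stray invisible character before the file name. A sketch of the cleaned-up tasks/main.yml, based on the task in the question:

```yaml
---
# Retyped by hand so 'helloworld.war' contains only plain ASCII;
# copy-pasting the old line would carry the invisible character along.
- name: Copy war file to tmp
  copy:
    src: helloworld.war
    dest: /tmp/helloworld.war
```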

Simple Ansible playbook: Location of file for task to copy file to target servers?

As a proof of concept, I'm trying to create what is probably the simplest ansible playbook ever: copying a single file from the ansible server out to the server farm.
For completeness, ansible is installed properly. The ping module works great! LoL
The playbook for my POC reads:
---
- hosts: Staging
  tasks:
    - name: Copy the file
      copy: src=/root/Michael/file.txt dest=/tmp/file.txt
When I run the command...
ansible-playbook book.yml
I get the following output (summarized)...
msg: could not find src=/root/Michael/file.txt
Various docs and web pages I've read said the path to the file can either be absolute or relative to the playbook. I've tried both without success.
Where should my file be to get ansible to copy it over to the target servers?
Thanks!
Found the error in my ways. The playbook and files were located in a directory that was not accessible to the account running the ansible-playbook command. So while the ansible-playbook process could read the playbook (I called the command from the directory where the file was located), it could not read the directory containing the file, and as a result could not find the file.
The solution was to move the playbook and files into a directory that could be read by the account running ansible. After that, the playbook worked exactly as expected!
