Fetching files from pythonless remote hosts - ansible

I know that I can manage remote hosts that don't have Python installed (or whose Python is too old) using the 'raw' and 'script' modules. It's possible to upload a file and execute it as a script, or to run an arbitrary command. But how can I fetch a file from a remote host? Something like the 'fetch' module, which requires Python?

You can execute scp on the Ansible host to fetch files.
tasks:
  - name: Fetch a file
    local_action: ansible.builtin.command scp {{ inventory_hostname }}:/path/to/file /tmp/.
See delegating tasks in the documentation.
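If you prefer the delegate_to form over local_action, a roughly equivalent sketch (assuming the control node can reach the host over SSH with key-based auth) would be:

- name: Fetch a file from a Python-less host
  ansible.builtin.command: scp {{ inventory_hostname }}:/path/to/file /tmp/.
  delegate_to: localhost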

Related

Is it possible to call ansible or ansible-playbook directly on a target host using a script or ansible itself?

I need to know if it's possible to call / execute Ansible playbooks from the target machine. I think I saw a vendor do it, or at least something similar: they downloaded a script and it ran the playbook.
If this is possible, how would it be done?
My goal is to run Ansible as a centralized server in AWS to perform tasks in multiple environments. Most are behind firewalls; any recommendations/thoughts would be appreciated.
Sure. If your host will install Ansible on the target and feed it all the playbooks, then you can run it like any other executable. Whether you should do that is another story, but technically there's no obstacle.
You can run ansible and ansible-playbook as you would any other binary on the target's $PATH, so any tool that facilitates running remote commands would work.
Because you are in AWS, one way might be to use AWS Systems Manager.
If you wanted to use Ansible itself to do this you could use the shell or command modules:
- hosts: target
  become: false
  gather_facts: false
  tasks:
    - name: ansible in ansible
      command: ansible --version
    - name: ansible-playbook in ansible
      command: ansible-playbook --version
Though, as with any situation where you reach for the shell or command modules, you have to be vigilant to maintain playbook idempotency yourself.
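For instance, since ansible --version never changes anything on the host, one small sketch (not part of the original answer) is to tell Ansible that explicitly so the task never reports "changed":

- name: ansible in ansible
  command: ansible --version
  changed_when: false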
If your requirement is just the ability to execute Ansible commands remotely, you might look into AWX, which is the upstream project for Red Hat's Ansible Tower. It wraps Ansible in a nice user interface that lets you trigger Ansible playbooks remotely, with nice-to-haves like RBAC.
If you're OK with executing tasks remotely over SSH, take a look at Sparrowdo; it has out-of-the-box facilities to run bash scripts (read: the ansible executable) remotely from one master host to another. You can even use it to install all the Ansible dependencies or whatever else you need for your scope.

Looking for ansible solution to read standalone.xml files on wildfly

I'm looking for a solution to gather and organize standalone.xml files from various WildFly servers, grouped by "staging" or "production" in my hosts file.
I'm looking to see if something is available with the same output functionality as:
ansible wildfly -m setup --tree config
which creates a file per host with the requested data.
For example, if I have 4 servers, each one having a file named exactly the same, in the same path, but with different contents, I could have them copied to a local directory and named after the server they came from:
(e.g.:
standalone.server1.myserver.com
standalone.server2.myserver.com
)
Use the Ansible fetch module, which has a few examples in its documentation.
A very simple playbook may look like:
- hosts: wildfly
  tasks:
    - name: Store file into /tmp/fetched/{hostname}/tmp/somefile
      fetch:
        src: /tmp/somefile
        dest: /tmp/fetched
Run the playbook:
ansible-playbook playbook.yml
You can use the fetch module, e.g as ad hoc command:
ansible wildfly -i myInventory -m fetch -a "src=/myRemotePathname/standalone dest=/myLocalPathName/myDir" -u myUser
You'll get the remote standalone file from the remote directory /myRemotePathname of every host belonging to the wildfly group defined in the myInventory file.
Local files are stored under the local /myLocalPathName/myDir directory, with a subdirectory named after each remote host and, below that, the remote directory path.
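If you want the flat, per-host file names from the question (e.g. standalone.server1.myserver.com) instead of the default per-host directory tree, a minimal sketch using fetch's flat option could look like this (the source path is a placeholder for wherever standalone.xml lives on your servers):

- hosts: wildfly
  tasks:
    - name: Fetch standalone.xml into one local directory, named per host
      fetch:
        src: /path/to/standalone.xml
        dest: /tmp/fetched/standalone.{{ inventory_hostname }}
        flat: yes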

Is it possible to use Ansible fetch and copy modules for SCPing files

I'm trying to SCP a .sql file from one server to another, and I am trying to use an Ansible module to do so. I stumbled across the fetch and copy modules, but I am not sure how to specify which host I want to copy the file from.
This is my current Ansible file:
# firstDB database dump happens on separate host
- hosts: galeraDatabase
  become: yes
  remote_user: kday
  become_method: sudo
  tasks:
    - name: Copy file to the galera server
      fetch:
        dest: /tmp/
        src: /tmp/{{ tenant }}Prod.sql
        validate_checksum: yes
        fail_on_missing: yes
Basically, I want to take the dump file from the firstDB host and then get it over to the other galeraDatabase host. How would I do this? In order to use fetch or copy, I would need to pass it the second hostname to copy the files from, and I don't see any parameter for that in the documentation. Should I be using a different method altogether?
Thanks
Try using the synchronize module with delegate_to, or if you don't have rsync then use the copy module. There are some good answers relating to this topic already on Stack Overflow.
Also, check the Ansible documentation for more info on the copy and synchronize modules, along with the delegate_to task parameter.
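As an illustration, a minimal sketch of the synchronize-with-delegate_to approach (assuming the dump lives on a host called firstDB, as in the question, and that rsync is available on both hosts) might be:

- hosts: galeraDatabase
  tasks:
    - name: Push the dump from firstDB to the galera host
      synchronize:
        src: /tmp/{{ tenant }}Prod.sql
        dest: /tmp/
      delegate_to: firstDB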
hth.

Ansible execute command locally and then on remote server

I am trying to start a server using the Ansible shell module with ipmitool and then change its configuration once it's up.
The server with Ansible installed also has ipmitool.
On the server with Ansible, I need to execute ipmitool to start the target server and then execute playbooks on it.
Is there a way to execute local IPMI commands on the server running Ansible to start the target server, and then execute all playbooks over SSH on the target server?
You can run any command locally by providing the delegate_to parameter.
- shell: ipmitool ...
  delegate_to: localhost
If ansible complains about connecting to localhost via ssh, you need to add an entry in your inventory like this:
localhost ansible_connection=local
or in host_vars/localhost:
ansible_connection: local
See behavioral parameters.
Next, you're going to need to wait until the server is booted and accessible through SSH. Here is an article from Ansible covering this topic, and this is the task they have listed:
- name: Wait for Server to Restart
  local_action:
    wait_for
    host={{ inventory_hostname }}
    port=22
    delay=15
    timeout=300
  sudo: false
If that doesn't work (since it is an older article and I think I previously had issues with this solution) you can look into the answers of this SO question.
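Putting the pieces together, a rough sketch of the whole flow (the BMC address/credential variables and the exact ipmitool arguments are placeholders you'd adapt to your setup) could be:

- hosts: target_server
  gather_facts: false
  tasks:
    - name: Power on the target via its BMC (runs on the Ansible host)
      command: ipmitool -H {{ bmc_address }} -U {{ bmc_user }} -P {{ bmc_password }} chassis power on
      delegate_to: localhost

    - name: Wait for SSH on the target
      wait_for:
        host: "{{ inventory_hostname }}"
        port: 22
        delay: 15
        timeout: 300
      delegate_to: localhost

    - name: Gather facts now that the host is reachable
      setup: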

Simple Ansible playbook: Location of file for task to copy file to target servers?

As a proof of concept, I'm trying to create what is probably the simplest ansible playbook ever: copying a single file from the ansible server out to the server farm.
For completeness, ansible is installed properly. The ping module works great! LoL
The playbook for my POC reads:
---
- hosts: Staging
  tasks:
    - name: Copy the file
      copy: src=/root/Michael/file.txt dest=/tmp/file.txt
When I run the command...
ansible-playbook book.yml
I get the following output (summarized)...
msg: could not find src=/root/Michael/file.txt
Various docs and web pages I've read said the path to the file can either be absolute or relative to the playbook. I've tried both without success.
Where should my file be to get ansible to copy it over to the target servers?
Thanks!
Found the error in my ways. The playbook and files were located in a directory that was not accessible to the account running the ansible-playbook command. So while the ansible-playbook process could read the playbook (I called the command from the directory where the file was located), it could not read the directory containing the file and, as a result, could not find it.
The solution was to move the playbook and files into a directory that could be read by the account running ansible. After that, the playbook worked exactly as expected!
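For reference, a common layout (just an illustrative sketch, not the poster's exact setup) keeps the file next to the playbook and references it with a relative path; Ansible typically resolves a relative copy src against the playbook directory and a files/ subdirectory beside it:

# Directory layout (hypothetical):
#   site.yml
#   files/file.txt
---
- hosts: Staging
  tasks:
    - name: Copy the file
      copy:
        src: file.txt
        dest: /tmp/file.txt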
