AnsibleFileNotFound error during Ansible playbook execution via AWS SSM - ansible

I zipped up an Ansible playbook and a configuration file, pushed the .zip file to S3, and I'm triggering the Ansible playbook from AWS SSM.
I'm getting an AnsibleFileNotFound error: AnsibleFileNotFound: Could not find or access '/path/to/my_file.txt' on the Ansible Controller.
Here is my playbook:
- name: Copies a configuration file to a machine.
  hosts: 127.0.0.1
  connection: local
  tasks:
    - name: Copy the configuration file.
      copy:
        src: /path/to/my_file.txt
        dest: /etc/my_file.txt
        owner: root
        group: root
        mode: '0644'
      become: true
my_file.txt exists in the .zip file that I uploaded to S3, and I've verified that it's being extracted (via the AWS SSM output). Why wouldn't I be able to copy that file over? What do I need to do to get Ansible to save this file to /etc/ on the target machine?
EDIT:
Using remote_src: true makes sense because the .zip file is presumably unpacked by AWS SSM to somewhere on the target machine. The problem is that this is unpacked to a random temp directory, so the file isn't found anyway.
I tried a couple of different absolute paths - I am assuming the path here is relative to the .zip root.

The solution here is a bit horrendous:
The .zip file is extracted to the machine into an ephemeral directory with a random name which is not known in advance of the AWS SSM execution.
remote_src must be true. Yeah, it's your file that you've uploaded to S3, but Ansible isn't really smart enough to know that in this context.
A path relative to the playbook has to be used if you're bundling configuration files with the playbook.
That relative path has to be interpolated.
So using src: "{{ playbook_dir | dirname }}/path/to/my_file.txt" solved the problem in this case.
Note that this approach should not be used if configuration files contain secrets, but I'm not sure what approach AWS SSM offers for that type of scenario when you are using it in conjunction with Ansible.
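Putting those points together, a sketch of the working task (the path segment after the playbook directory is illustrative, mirroring the placeholder path in the question):

```yaml
# The bundled file travels inside the .zip, so its location is anchored
# to the playbook's own (randomly named) extraction directory, and
# remote_src is set because the file is already on the target machine.
- name: Copy the configuration file.
  copy:
    src: "{{ playbook_dir | dirname }}/path/to/my_file.txt"
    dest: /etc/my_file.txt
    remote_src: true
    owner: root
    group: root
    mode: '0644'
  become: true
```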

Related

Copy to remote shared path using ansible

I am trying to copy some files to a remote shared path.
---
- hosts: localhost
  tasks:
    - name: Test
      copy:
        src: /tmp/log/test.csv
        dest: \\xyz_prod.com\public\app\
The playbook ran fine and displayed changed=1 on the first run. When I ran it again, it was still successful, with changed=0. But if I navigate to the shared location manually, test.csv is not present under that folder. Can anyone please suggest what is wrong here?
dest must not include a URL or UNC path. Quoting from the copy module documentation:
Remote absolute path where the file should be copied to.
Try the play below
- hosts: xyz_prod.com
  tasks:
    - name: Test
      copy:
        src: /tmp/log/test.csv
        dest: /public/app
For Windows remote hosts use win_copy which "Copies files to remote locations on windows hosts".
To copy from a remote server
use fetch – Fetch files from remote nodes.
See: Ansible - fetch files from one remote node to another.
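For the Windows-share case in the question, a minimal win_copy sketch, assuming the share host is reachable as a managed Windows node and that C:\public\app is the local path behind the share (both are assumptions, not from the question):

```yaml
# Target the Windows host directly and use win_copy with a local
# Windows path rather than a UNC path in dest.
- hosts: xyz_prod.com
  tasks:
    - name: Copy CSV to the shared folder
      win_copy:
        src: /tmp/log/test.csv
        dest: C:\public\app\test.csv
```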

How to copy file from localhost to remote host in Ansible playbook?

I have a directory:
/users/rolando/myfile
I want to copy "myfile" to hostname "targetserver" in directory:
/home/rolando/myfile
What is the syntax in the playbook to do this? Examples I found with the copy command look like it's more about copying a file from a source directory on a remote server to a target directory on the same remote server.
The line in my playbook .yml I tried that failed:
- copy:
    src: '/users/rolando/myfile'
    dest: 'rolando@targetserver:/home/rolando/myfile'
What am I doing wrong?
From copy synopsis:
The copy module copies a file on the local box to remote locations.
- hosts: targetserver
  tasks:
    - copy:
        src: /users/rolando/myfile
        dest: /users/rolando/myfile
Here is an updated answer. The answer above copies the file from the control machine to the remote host. If the file already exists on the remote machine, this is easy using the remote_src parameter available in the copy module:
- name: Copy a "sudoers" file on the remote machine for editing
  copy:
    src: /users/rolando/myfile
    dest: /home/rolando/myfile
    remote_src: yes

Is it possible to use Ansible fetch and copy modules for SCPing files

I'm trying to SCP a .sql file from one server to another using an Ansible module. I stumbled across the fetch and copy modules, but I'm not sure how to specify which host I want to copy the file from.
This is my current Ansible file:
# firstDB database dump happens on separate host
- hosts: galeraDatabase
  become: yes
  remote_user: kday
  become_method: sudo
  tasks:
    - name: Copy file to the galera server
      fetch:
        dest: /tmp/
        src: /tmp/{{ tenant }}Prod.sql
        validate_checksum: yes
        fail_on_missing: yes
Basically, I want to take the dump file from the firstDB host and get it over to the other galeraDatabase host. How would I do this? In order to use fetch or copy, I would need to pass it the second hostname to copy the files from, and I don't see any parameters for that in the documentation. Should I be using a different method altogether?
Thanks
Try using the synchronize module with delegate_to, or if you don't have rsync then use the copy module. There are already some good answers relating to this topic on Stack Overflow.
Also, check the ansible documentation for more info on the copy and synchronize modules along with the delegate_to tasks parameter.
hth.
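A sketch of the synchronize + delegate_to approach, using the host and path names from the question (it assumes the dump host appears in inventory as firstDB, and that rsync and SSH connectivity exist between the two servers):

```yaml
# Run the transfer from firstDB (via delegate_to) and push the dump
# directly to the galera host that the play targets.
- hosts: galeraDatabase
  become: yes
  tasks:
    - name: Push the dump from firstDB to the galera server
      synchronize:
        src: /tmp/{{ tenant }}Prod.sql
        dest: /tmp/
      delegate_to: firstDB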

Ansible with_fileglob is skipping

I am putting together an Ansible Playbook designed to build webservers. However I am stuck when trying to use with_fileglob because Ansible keeps reporting that it's skipping the copy of nginx vhost files.
My script looks like this:
- name: Nginx | Copy vhost files
  copy: src={{ item }} dest=/etc/nginx/sites-available owner=root group=root mode=600
  with_fileglob:
    - "{{ templates_dir }}/nginx/sites-available/*"
  notify:
    - nginx-restart
{{ templates_dir }} has been defined elsewhere as roles/common/templates. In this directory I have a file called webserver1 that I'm hoping Ansible will copy into /etc/nginx/sites-available/
I have found other people discussing this issue but no responses have helped me solve this problem. Why would Ansible be skipping files?
Edit: I should point out that I want to use with_fileglob (rather than straight copy) as I want to iterate over other virtual hosts in the future.
Look at http://docs.ansible.com/playbooks_loops.html#looping-over-fileglobs, Note 1:
When using a relative path with with_fileglob in a role, Ansible resolves the path relative to the role's files directory.
So to access a file in the templates directory, you can start the relative path with ../templates
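Applied to the task in the question, a sketch (the vhost path and handler name are taken from the question, not verified):

```yaml
# with_fileglob resolves relative paths against the role's files/
# directory, so step up into templates/ explicitly.
- name: Nginx | Copy vhost files
  copy:
    src: "{{ item }}"
    dest: /etc/nginx/sites-available/
    owner: root
    group: root
    mode: '0600'
  with_fileglob:
    - "../templates/nginx/sites-available/*"
  notify:
    - nginx-restart
```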

Simple Ansible playbook: Location of file for task to copy file to target servers?

As a proof of concept, I'm trying to create what is probably the simplest ansible playbook ever: copying a single file from the ansible server out to the server farm.
For completeness, ansible is installed properly. The ping module works great! LoL
The playbook for my POC reads:
---
- hosts: Staging
  tasks:
    - name: Copy the file
      copy: src=/root/Michael/file.txt dest=/tmp/file.txt
When I run the command...
ansible-playbook book.yml
I get the following output (summarized)...
msg: could not find src=/root/Michael/file.txt
Various docs and web pages I've read said the path to the file can either be absolute or relative to the playbook. I've tried both without success.
Where should my file be to get ansible to copy it over to the target servers?
Thanks!
Found the error of my ways. The playbook and files were located in a directory that was not accessible to the account running the ansible-playbook command. So while the ansible-playbook process could read the playbook itself (I ran the command from the directory where the file was located), it could not read the directory containing the file, and as a result could not find the file.
The solution was to move the playbook and files into a directory that could be read by the account running ansible. After that, the playbook worked exactly as expected!