Remote Server's "/home"
Remote Server User
1. bitnami
2. take02
3. take03
4. take04
But the local host has only the ubuntu user.
I would like to copy the /home directory of the remote host with Ansible, keeping the owner information.
This is my playbook:
---
- hosts: discovery_bitnami
  gather_facts: no
  become: yes
  tasks:
    - name: "Creates directory"
      local_action: >
        file path=/tmp/{{ inventory_hostname }}/home/ state=directory

    - name: "remote-to-local sync test"
      become_method: sudo
      synchronize:
        mode: pull
        src: /home/
        dest: /tmp/{{ inventory_hostname }}/home
        rsync_path: "sudo rsync"
Playbook result is:
PLAY [discovery_bitnami] *******************************************************
TASK [Creates directory] *******************************************************
ok: [discovery_bitnami -> localhost]
TASK [remote-to-local sync test] ***********************************************
fatal: [discovery_bitnami]: FAILED! => {"changed": false, "cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/ubuntu/.ssh/red_LightsailDefaultPrivateKey.pem -S none -o StrictHostKeyChecking=no -o Port=22' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"bitnami@54.236.34.197:/home/\" \"/tmp/discovery_bitnami/home\"", "failed": true, "msg": "rsync: failed to set times on \"/tmp/discovery_bitnami/home/.\": Operation not permitted (1)\nrsync: recv_generator: mkdir \"/tmp/discovery_bitnami/home/bitnami\" failed: Permission denied (13)\n*** Skipping any contents from this failed directory ***\nrsync: recv_generator: mkdir \"/tmp/discovery_bitnami/home/take02\" failed: Permission denied (13)\n*** Skipping any contents from this failed directory ***\nrsync: recv_generator: mkdir \"/tmp/discovery_bitnami/home/take03\" failed: Permission denied (13)\n*** Skipping any contents from this failed directory ***\nrsync: recv_generator: mkdir \"/tmp/discovery_bitnami/home/take04\" failed: Permission denied (13)\n*** Skipping any contents from this failed directory ***\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1655) [generator=3.1.1]\n", "rc": 23}
to retry, use: --limit @/home/ubuntu/work/esc_discovery/ansible_test/ansible_sync_test.retry
PLAY RECAP *********************************************************************
discovery_bitnami : ok=1 changed=0 unreachable=0 failed=1
But the failed "cmd" works fine when run with sudo on the console:
$ sudo /usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/ubuntu/.ssh/red_PrivateKey.pem -S none -o StrictHostKeyChecking=no -o Port=22' --rsync-path="sudo rsync" --out-format='<<CHANGED>>%i %n%L' bitnami@54.236.34.197:/home/ /tmp/discovery_bitnami/home
How do I run "task" with sudo?
P.S. If I remove become: yes, then all the permissions are "ubuntu".
I guess you are out of options for the synchronize module. It runs locally without sudo, and that behaviour is hardcoded.
On the other hand, in the first task you create a directory under /tmp as root, so the permissions are limited to the root user. As a result you get a "permission denied" error.
Either:
refactor the code so that you don't need root permissions for the local destination (or add become: no to the "Creates directory" task); since you use the archive option, which implies permission preservation, this might not be an option;
or:
create your own version of the synchronize module and add sudo to the front of the cmd variable;
or:
use the command module with sudo /usr/bin/rsync as the call.
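The last option could be sketched as follows. This is a hedged sketch, not a drop-in fix: the key path is copied from the error output in the question, ansible_host is assumed to resolve to the remote address, and it assumes the connecting user has passwordless sudo on the control machine (just as running the command manually with sudo did):

```yaml
- name: "remote-to-local sync via command module"
  # rsync runs on the control machine; sudo is spelled out in the
  # command itself, so the root-owned destination under /tmp is
  # writable. become is not needed because sudo is explicit here.
  command: >
    sudo /usr/bin/rsync --archive --compress
    --rsh 'ssh -i /home/ubuntu/.ssh/red_LightsailDefaultPrivateKey.pem -o StrictHostKeyChecking=no'
    --rsync-path 'sudo rsync'
    bitnami@{{ ansible_host }}:/home/
    /tmp/{{ inventory_hostname }}/home
  delegate_to: localhost
  become: no
```

The trade-off is that the command module gives none of synchronize's change reporting, so the task will always report "changed".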
Mind that the synchronize module is a non-standard one: there have been changes in the past regarding the accounts it uses, and requests for further changes.
On top of everything, the current documentation for the module is pretty confusing. On one hand it states strongly:
The user and permissions for the synchronize dest are those of the remote_user on the destination host or the become_user if become=yes is active.
But in another place it only hints that the source and destination meaning is reversed when using pull mode:
In pull mode the remote host in context is the source.
So for the case from this question, the following passage is relevant, even though it incorrectly states the "src":
The user and permissions for the synchronize src are those of the user running the Ansible task on the local host (or the remote_user for a delegate_to host when delegate_to is used).
Related
I am using the synchronize module to copy a directory from an NFS share to a local path. The user (SVC12345) which runs the playbook from Ansible Tower is not present in /etc/passwd or /etc/group.
When the synchronize task is invoked, it fails with the error below:
"msg": "Warning: Permanently added 'hostname,1.2.3.4' to the list of known hosts\r\n/usr/bin/id: cannot find name for group ID 12345\nrsync: change_dir \"/app/nfs_share_path/DIR1\" failed: No such file or directory"
"rc": "23"
"cmd": "sshpass -d4 /bin/rsync --delay-updates -F --compress --dry-run --archive --rsh='/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null' --rsync-path='sudo rsync' --out-format='<<CHANGED>>%i %n%L' /app/nfs_share_path/DIR1 SVC12345@hostname:/app/path/util"
My Ansible task:
- name: Test
  become: yes
  become_user: local_user
  synchronize:
    src: /app/nfs_share_path/DIR1   # shared directory
    dest: /app/path/util
    owner: yes
    group: yes
I am expecting this task to be executed as "local_user" (since I have specified become_user); instead it performs the task as the SVC12345 user.
The end goal is for me to copy file.txt from Host2 over to Host1. However, I keep getting the same error whenever I run the playbook. I have triple-checked my spacing and made sure I spelled everything correctly, but nothing seems to work.
Command to start the playbook:
ansible-playbook playbook_name.yml -i inventory/inventory_name -u username -k
My Code:
- hosts: Host1
  tasks:
    - name: Synchronization using rsync protocol on delegate host (pull)
      synchronize:
        mode: pull
        src: rsync://Host2.linux.us.com/tmp/file.txt
        dest: /tmp
      delegate_to: Host2.linux.us.com
Expected Result:
Successfully working
Actual Result:
fatal: [Host1.linux.us.com]: FAILED! => {"changed": false, "cmd": "sshpass", "msg": "[Errno 2] No such file or directory", "rc": 2}
I had the same problem as you. Installing sshpass on the target host makes it work normally:
yum install -y sshpass
SSH Authentication problem
TASK [backup : Gather facts (ops)] *************************************************************************************
fatal: [10.X.X.X]: FAILED! => {"msg": " [WARNING] Ansible is being run
in a world writable directory
(/mnt/c/Users/AnirudhSomanchi/Desktop/KVM/Scripting/Ansible/network/backup),
ignoring it as an ansible.cfg source. For more information see
https://docs.ansible.com/ansible/devel/reference_appendices/config.html#cfg-in-world-writable-dir\n{\"socket_path\":
\"/home/SaiAnirudh/.ansible/pc/87fd82198c\", \"exception\":
\"Traceback (most recent call last):\n File
\\"/usr/bin/ansible-connection\\", line 104, in start\n
self.connection._connect()\n File
\\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/network_cli.py\\",
line 327, in _connect\n ssh = self.paramiko_conn._connect()\n
File
\\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/paramiko_ssh.py\\",
line 245, in _connect\n self.ssh = SSH_CONNECTION_CACHE[cache_key]
= self._connect_uncached()\n File \\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/paramiko_ssh.py\\",
line 368, in _connect_uncached\n raise
AnsibleConnectionFailure(msg)\nAnsibleConnectionFailure: paramiko:
The authenticity of host '10.X.X.X' can't be established.\nThe
ssh-rsa key fingerprint is 4b595d868720e28de57bef23c90546ad.\n\",
\"messages\": [[\"vvvv\", \"local domain socket does not exist,
starting it\"], [\"vvvv\", \"control socket path is
/home/SaiAnirudh/.ansible/pc/87fd82198c\"], [\"vvvv\", \"loaded
cliconf plugin for network_os vyos\"], [\"log\", \"network_os is set
to vyos\"], [\"vvvv\", \"\"]], \"error\": \"paramiko: The authenticity
of host '10.X.X.X' can't be established.\nThe ssh-rsa key fingerprint
is 4b595d868720e28de57bef23c90546ad.\"}"}
Command used-- ansible-playbook playbook.yml -i hosts --ask-vault-pass
Tried changing
host_key_checking = False in Ansible.cfg
and ran the same command again: ansible-playbook playbook.yml -i hosts --ask-vault-pass
=====================================================================
playbook.yml
---
- hosts: all
  gather_facts: false
  roles:
    - backup
===============================================================
Main.yml
C:\Ansible\network\backup\roles\backup\tasks\main.yml
- name: Gather facts (ops)
  vyos_facts:
    gather_subset: all

- name: execute Vyos run to initiate backup
  vyos_command:
    commands:
      - sh configuration commands | no-more
  register: v_backup

- name: local_action
  local_action:
    module: copy
    dest: "C:/Users/Desktop/KVM/Scripting/Ansible/network/backup/RESULTS/Backup.out"
    content: "{{ v_backup.stdout[0] }}"
===============================================================
Ansible.cfg
[defaults]
host_key_checking = False
===============================================================
Hosts
[all]
10.X.X.X
[all:vars]
ansible_user=
ansible_ssh_pass=
ansible_connection=network_cli
ansible_network_os=vyos
We need the backup of the vyatta devices but getting the following error
Vault password:
PLAY [all] *************************************************************************************************************
TASK [backup : Gather facts (ops)] *************************************************************************************
fatal: [10.X.X.X]: FAILED! => {"msg": " [WARNING] Ansible is being run
in a world writable directory
(/mnt/c/Users/AnirudhSomanchi/Desktop/KVM/Scripting/Ansible/network/backup),
ignoring it as an ansible.cfg source. For more information see
https://docs.ansible.com/ansible/devel/reference_appendices/config.html#cfg-in-world-writable-dir\n{\"socket_path\":
\"/home/SaiAnirudh/.ansible/pc/87fd82198c\", \"exception\":
\"Traceback (most recent call last):\n File
\\"/usr/bin/ansible-connection\\", line 104, in start\n
self.connection._connect()\n File
\\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/network_cli.py\\",
line 327, in _connect\n ssh = self.paramiko_conn._connect()\n
File
\\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/paramiko_ssh.py\\",
line 245, in _connect\n self.ssh = SSH_CONNECTION_CACHE[cache_key]
= self._connect_uncached()\n File \\"/usr/lib/python2.7/dist-packages/ansible/plugins/connection/paramiko_ssh.py\\",
line 368, in _connect_uncached\n raise
AnsibleConnectionFailure(msg)\nAnsibleConnectionFailure: paramiko:
The authenticity of host '10.X.X.X' can't be established.\nThe
ssh-rsa key fingerprint is 4b595d868720e28de57bef23c90546ad.\n\",
\"messages\": [[\"vvvv\", \"local domain socket does not exist,
starting it\"], [\"vvvv\", \"control socket path is
/home/SaiAnirudh/.ansible/pc/87fd82198c\"], [\"vvvv\", \"loaded
cliconf plugin for network_os vyos\"], [\"log\", \"network_os is set
to vyos\"], [\"vvvv\", \"\"]], \"error\": \"paramiko: The authenticity
of host '10.X.X.X' can't be established.\nThe ssh-rsa key fingerprint
is 4b595d868720e28de57bef23c90546ad.\"}"}
PLAY RECAP *************************************************************************************************************
10.X.X.X : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
[WARNING] Ansible is in a world writable directory, ignoring it as an ansible.cfg source.
I also got this warning, but it will not cause your command execution in Ansible to fail. It is just a warning: it means you have given full permissions (i.e. drwxrwxrwx, or 777) to your directory, which is not secure. You can remove the warning by changing the permissions with the chmod command (e.g. sudo chmod 755).
For me that was not enough.
I needed to unmount it and then mount it again with another command:
sudo umount /mnt/c
sudo mount -t drvfs C: /mnt/c -o metadata
Then go to the actual folder and run:
chmod o-w .
More info: Link1, Link2
If you don't want to change the permissions, all you need to do is append your config to the default Ansible config file. On Linux it is /etc/ansible/ansible.cfg.
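For example, to make the host-key setting from this question global, the lines appended to /etc/ansible/ansible.cfg would be (this simply reuses the question's own Ansible.cfg content):

```ini
[defaults]
host_key_checking = False
```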
I'm new to Ansible. I'm trying to become $USER, then create a .ssh folder inside the $HOME directory, and I'm getting Permission denied:
---
- hosts: amazon
  gather_facts: False
  vars:
    ansible_python_interpreter: "/usr/bin/env python3"
    account: 'jenkins'
    home: "{{ out.stdout }}"
  tasks:
    - name: Create .SSH directory
      become: true
      become_method: sudo
      become_user: "{{ account }}"
      shell: "echo $HOME"
      register: out

    - file:
        path: "{{ home }}/.ssh"
        state: directory
My output is:
MacBook-Pro-60:playbooks stefanov$ ansible-playbook variable.yml -v
Using /Users/stefanov/.ansible/ansible.cfg as config file
PLAY [amazon] *************************************************************************************************************************************************************************************
TASK [Create .SSH directory] **********************************************************************************************************************************************************************
changed: [slave] => {"changed": true, "cmd": "echo $HOME", "delta": "0:00:00.001438", "end": "2017-08-21 10:23:34.882835", "rc": 0, "start": "2017-08-21 10:23:34.881397", "stderr": "", "stderr_lines": [], "stdout": "/home/jenkins", "stdout_lines": ["/home/jenkins"]}
TASK [file] ***************************************************************************************************************************************************************************************
fatal: [slave]: FAILED! => {"changed": false, "failed": true, "msg": "There was an issue creating /home/jenkins/.ssh as requested: [Errno 13] Permission denied: b'/home/jenkins/.ssh'", "path": "/home/jenkins/.ssh", "state": "absent"}
to retry, use: --limit @/Users/stefanov/playbooks/variable.retry
PLAY RECAP ****************************************************************************************************************************************************************************************
slave : ok=1 changed=1 unreachable=0 failed=1
I'm guessing - name and - file are dicts and are considered different tasks,
and what was executed in - name is no longer valid in - file?
Because I switched to the jenkins user in - name, while in - file I'm likely the account I SSH with.
Then how can I concatenate both tasks into one?
What is the right way to do this?
Another thing: how can I do sudo with the file module? I can't see such an option:
http://docs.ansible.com/ansible/latest/file_module.html
Or should I just do shell: mkdir -pv $HOME/.ssh instead of using the file module?
Then how can I concatenate both tasks in one?
You cannot do it, but you can just add become to the second task, which will make it run with the same permissions as the first one:
- file:
    path: "{{ home }}/.ssh"
    state: directory
  become: true
  become_method: sudo
  become_user: "{{ account }}"
Another thing: how can I do sudo with the file module? I can't see such an option
Because become (and the other become_* keywords) is not a parameter of a module, but a general declaration for any task (and play).
I'm guessing - name and - file are dicts and considered different tasks.
The first task is shell, not name. You can add a name to any task (just like become).
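Alternatively, since both tasks need the same identity, the become settings can be declared once at play level instead of being repeated per task. A minimal sketch reusing the names from the question (not the only possible layout):

```yaml
- hosts: amazon
  gather_facts: False
  become: true
  become_method: sudo
  become_user: "{{ account }}"
  vars:
    account: 'jenkins'
  tasks:
    # Both tasks now run as the jenkins user, so the $HOME that is
    # echoed and the directory that is created agree.
    - shell: "echo $HOME"
      register: out

    - file:
        path: "{{ out.stdout }}/.ssh"
        state: directory
```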
I'm running Ansible 2.3.1.0 on my local machine (macOS) and trying to achieve:
connecting to user1@host1
copying a file from user2@host2:/path/to/file to user1@host1:/tmp/path/to/file
I'm on my local machine, with host1 as hosts and user1 as remote_user:
- synchronize: mode=pull src=user2@host2:/path/to/file dest=/tmp/path/to/file
Wrong output:
/usr/bin/rsync (...) user1@host1:user2@host2:/path/to/file /tmp/path/to/file
Conclusion
I've been trying different options. I've debugged ansible. I can't understand what's wrong.
Help!
Edit 1
I've also tried adding delegate_to:
- synchronize: mode=pull src=/path/to/file dest=/tmp/path/to/file
  delegate_to: host2
It gives:
fatal: [host1]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,password,keyboard-interactive).\r\n", "unreachable": true}
And also:
- synchronize: mode=pull src=/path/to/file dest=/tmp/path/to/file
  delegate_to: user2@host2
Which gives:
fatal: [host1 -> host2]: FAILED! => {"changed": false, "cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh=/usr/bin/ssh -S none -o StrictHostKeyChecking=no --rsync-path=sudo rsync --out-format=<<CHANGED>>%i %n%L host1:/path/to/file /tmp/path/to/file", "failed": true, "msg": "Permission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [Receiver]\nrsync error: unexplained error (code 255) at io.c(235) [Receiver=3.1.2]\n", "rc": 255}
NB: ssh user1@host1 and then ssh user2@host2 works with SSH keys (no password required).
Please pay attention to these notes from the module's docs:
For the synchronize module, the “local host” is the host the synchronize task originates on, and the “destination host” is the host synchronize is connecting to.
The “local host” can be changed to a different host by using delegate_to. This enables copying between two remote hosts or entirely on one remote machine.
I guess, you may want to try (assuming Ansible can connect to host2):
- synchronize:
    src: /path/to/file
    dest: /tmp/path/to/file
  delegate_to: host2
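For context, the same task embedded in a full play might look like this. It is a sketch under the assumption that the Ansible control machine can also reach host2 over SSH, since delegate_to makes host2 the machine that actually runs rsync:

```yaml
- hosts: host1
  remote_user: user1
  tasks:
    # delegate_to makes host2 the "local host" of synchronize; in the
    # default push mode the file is therefore copied from host2 (the
    # delegate) to host1 (the inventory host).
    - synchronize:
        src: /path/to/file
        dest: /tmp/path/to/file
      delegate_to: host2
```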