Ansible: remote_user gets ignored - ansible

My Ansible playbook fails because $USER and $HOME are not set correctly on the remote server.
I have a user called lala on remote-server.
Most tasks get executed as user root, directly via ssh:
[all:vars]
ansible_user = root
But this task should get executed as user lala via ssh:
- name: migrate
  command: /home/lala/bin/manage.py migrate
  remote_user: lala
This fails. The uid of the remote process belongs to lala, but environment variables like $HOME and $USER are still those of user root.
I would like ansible to connect to user lala via ssh directly.
I can see this clearly if I use -vvvv:
<coffee-and-sugar.club> SSH: EXEC ssh -vvv ... -o 'User="root"' ...
How do I make Ansible connect via ssh as lala@remote-server?

I found a solution. I don't use remote_user; instead, I set the variable ansible_user for the task like this:
- name: migrate
  command: /home/lala/bin/manage.py migrate
  vars:
    ansible_user: lala
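With that task-level var in place, a -vvvv run should now show the new login user in the SSH command (inventory and playbook names here are placeholders):
ansible-playbook -i inventory playbook.yml -vvvv
# expected in the output, instead of -o 'User="root"':
# <remote-server> SSH: EXEC ssh -vvv ... -o 'User="lala"' ...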

Related

How to run playbook on my application server using sudo?

I am writing a playbook for my application Tomcat node. It will copy, deploy and stop/start Tomcats.
I have a hop box serverA, another hop box serverB and a Tomcat node tomcatC. Manually, using PuTTY, I use the steps below to get onto the Tomcat node:
Login to serverA using userId1
ssh to serverB using userId2
ssh to tomcatC using userId1
sudo to tomcat user.
Also, I am able to ssh directly to tomcatC from serverA, and my Ansible master is also serverA, from where I am running the playbooks.
How do I run my playbook for this? Below is the playbook I am using as of now, but it's not working.
ansible-playbook -i my-inventory my-V3.yml --tags=download,copy,deploy -e release_version=5.7 -e target_env=tomcatC -u userId1 --ask-pass
And my-V3.yml looks like below:
- hosts: '{{ target_env }}'
  #serial: 1
  remote_user: userId1
  become: yes
  become_user: tomcat
Getting this error now:
GATHERING FACTS ***************************************************************
fatal: [tomcatC] => Missing become password
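That "Missing become password" simply means Ansible has no sudo password for the escalation to the tomcat user. A minimal fix, reusing the command line from the question, is to prompt for it as well:
ansible-playbook -i my-inventory my-V3.yml --tags=download,copy,deploy -e release_version=5.7 -e target_env=tomcatC -u userId1 --ask-pass --ask-become-pass
# on very old Ansible releases the flag is --ask-sudo-pass instead of --ask-become-pass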
You can set the user a command is run as like so:
- cron:
    name: "Clear Root Mail"
    minute: "0"
    hour: "22"
    job: "sudo rm /var/spool/mail/root"
    user: myuser
Or use become: true like so:
- name: Start Server
  shell: "nohup /home/myuser/StartServer.sh &"
  become: true
You can also have Ansible call shell scripts from your jump box that run the commands you need. Your problem looks like you don't have the correct SSH key applied, though.
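If the key really is the issue, something along these lines gets it onto the node (the key path is an assumption; user and host are taken from the question):
ssh-copy-id -i ~/.ssh/id_rsa.pub userId1@tomcatC
# quick check that Ansible can now reach the node without a password prompt
ansible tomcatC -i my-inventory -m ping -u userId1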

Ansible root/password login

I'm trying to use Ansible to provision a server and the first thing I want to do is test the ssh access. If I use ssh directly I can log in fine...
ssh root@server
root@backups's password:
If I use Ansible I can't...
user@ansible:~$ ansible backups -m ping --user root --ask-pass
SSH password:
backups | UNREACHABLE! => {
"changed": false,
"msg": "Invalid/incorrect password: Permission denied, please try again.",
"unreachable": true
}
The password I'm using is correct - 100%.
Before anyone suggests using SSH keys - that's part of what I'm looking to automate.
The issue was caused by the getting started documentation setting a trap.
It instructs you to create an inventory file with servers, use ansible all -m ping to ping those servers and to use the -u switch to change the remote user.
What it doesn't tell you is that if, like me, not all your servers have the same user, the advised way to specify a user per server is in the inventory file...
server1 ansible_connection=ssh ansible_user=user1
server2 ansible_connection=ssh ansible_user=user2
server3 ansible_connection=ssh ansible_user=user3
I was provisioning a server, and the only user I had available to me at the time was root. But trying to do ansible server3 --user root --ask-pass failed to authenticate. After a couple of wasted hours I discovered the --user switch is only effective if the inventory file doesn't have a user. This is intended precedence behaviour. There are a few gripes about this in GitHub issues, but a firm 'intended behaviour' mantra is the response you get if you challenge it. It seems to go against the grain to me.
I subsequently discovered that you can specify -e 'ansible_ssh_user=root' to override the inventory user - I will see about creating a pull request to improve the docs.
While you're here, I might be able to save you some time with some further gotchas. This behaviour is the same if you use playbooks. In there you can specify a remote_user, but this isn't honoured - presumably also because of precedence. Again, you can override the inventory user with -e 'ansible_ssh_user=root'.
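For example, forcing root for one run regardless of what the inventory says (ansible_ssh_user is the older variable name; on current Ansible, ansible_user behaves the same):
ansible backups -m ping --ask-pass -e 'ansible_ssh_user=root'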
Finally, until I realised Linode could provision a server with an SSH key deployed, I was trying to specify the root password to an ad-hoc command. You have to encrypt the password, which gives you a long string that is almost certainly going to include $ characters, which bash will treat as substitutions. Make sure you escape these.
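To illustrate the escaping gotcha (the module and names below are purely an example, assuming you are passing a crypted password to the user module): single-quote the whole argument, or backslash-escape every $, so bash leaves the hash alone.
ansible backups -m user --user root --ask-pass -a 'name=deploy password=$6$somesalt$longcrypthash'
# with double quotes you would need: password=\$6\$somesalt\$longcrypthash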
The lineinfile module behaviour isn't intuitive either.
Write your hosts file like this. It will work.
192.168.2.4
192.168.1.4
[all:vars]
ansible_user=azureuser
Then execute the following command to check before configuring: ansible-playbook --ask-pass -i hosts main.yml --check
Also create an ansible.cfg file and paste the following contents there:
[defaults]
inventory = hosts
host_key_checking = False
Note: All three files, namely main.yml, ansible.cfg and hosts, must be in the same folder.
Also, the code is tested for devices connected to a private network using private IPs. I haven't checked using public IPs. If using Azure/AWS, create a test VM and connect it to the VPN of the other devices.
Note: You need to install the sshpass package to be able to authenticate with a password.
For Ubuntu: apt-get install sshpass
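main.yml itself isn't shown above; purely as a placeholder, a minimal one that just proves the password-based connection works could look like this (your real tasks go here instead):
- hosts: all
  gather_facts: no
  tasks:
    - name: confirm SSH connectivity
      ping: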

Running a command in an ansible-playbook from the ansible host using variables from the current ansible process

Having hit a brick wall with troubleshooting why one shell script is hanging when I'm trying to run it via Ansible on the remote host, I've discovered that if I run it in an ssh session from the ansible host it executes successfully.
I now want to build that into a playbook as follows:
- name: Run script
  local_action: shell ssh $TARGET "/home/ansibler/script.sh"
I just need to know how to access the $TARGET that this playbook is running on from the selected/limited inventory so I can concatenate it into that local_action.
Is there an easy way to access that?
Try with ansible_host:
- name: Run script
  local_action: 'shell ssh {{ ansible_host }} "/home/ansibler/script.sh"'
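If your inventory lists resolvable host names instead of setting ansible_host explicitly, inventory_hostname should work just as well (assuming those names resolve from the control node):
- name: Run script
  local_action: 'shell ssh {{ inventory_hostname }} "/home/ansibler/script.sh"'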

Ansible have shell command use stored password

Looking to have a way for a password in a file to be used when I call the shell script below. I don't want to have to type the password in for a lot of machines just to copy one file over. I need to use SCP or it won't work.
I'm also using Ansible Vault.
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: "scp test@{{ item }}:/home/test/*.csv /location/on/localhost"
      with_items: "{{ groups['firewall'] }}"
Answer provided below:
- name: Copy File to Local Machine
  shell: "sshpass -p {{ ansible_ssh_pass }} scp test@{{ item }}:/home/test/*.csv /destination"
  with_items: "{{ groups['firewall'] }}"
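Since the question mentions Ansible Vault, one way to keep that password out of the playbook (a sketch; file name and variable placement are assumptions) is a vaulted vars file loaded by the play:
# secrets.yml, encrypted with: ansible-vault encrypt secrets.yml
ansible_ssh_pass: "the-scp-password"
# in the play, add: vars_files: [secrets.yml], then run with --ask-vault-pass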
Alternatively, you can cache the key's passphrase with ssh-agent first.
$ cat config.sh
eval `ssh-agent -s`
ssh-add ~/.ssh/default # <- replace the ssh key with yours
Have it ready and source it before running the Ansible playbook:
$ chmod +x config.sh
$ . config.sh
Agent pid 87414
Enter passphrase for /home/test/.ssh/default:
Identity added: /home/test/.ssh/default (/home/test/.ssh/default)
Then you shouldn't have a password prompt issue.

Ansible execute remote ssh task

I have tried connecting to a remote "custom" Linux VM and copied my SSH public key to it, yet I'm unable to get Ansible to ping / connect to it, as I keep getting "remote host unreachable". I think it's because it's running a custom version of Linux and SSH auth might be behaving differently.
The following remote ssh command works
# ssh -t user@server_ip "show vd"
password: mypassword
I'm trying to convert the above command to an Ansible playbook:
---
- hosts: server_ip
  gather_facts: no
  remote_user: root
  tasks:
    - name: check device status
      shell: ssh -i server_ip "show vd"
      register: output
    - expect:
        command: sh "{{ output.stdout }}"
        responses:
          (?i)password: user
I'm unable to get it to work, and I'm not sure if this is the right way of doing it; your input is highly appreciated.
Whenever you have ssh connection problems you should add the -vvvv parameter to your ansible-playbook command line. That way it will give detailed information about the ssh connection and errors.
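For example (inventory and playbook names are placeholders):
ansible-playbook -i inventory site.yml -vvvv
# or the same verbosity on a quick ad-hoc connectivity check
ansible server_ip -i inventory -m ping -vvvv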
