I have an inventory file which has multiple users for a server, like below.
[TEST]
server1 ansible_user=user1
server1 ansible_user=user2
server1 ansible_user=user3
server1 ansible_user=user4
When I run a playbook using this inventory, it only runs on "server1 ansible_user=user4", ignoring the first 3 users. How can I run the playbook for all 4 users?
With this inventory you have a single inventory entry, server1, and each subsequent line overrides the ansible_user variable.
If you really want to make this happen (what is the use case?), use host aliases:
[TEST]
s1_u1 ansible_host=server1 ansible_user=user1
s1_u2 ansible_host=server1 ansible_user=user2
s1_u3 ansible_host=server1 ansible_user=user3
s1_u4 ansible_host=server1 ansible_user=user4
But be prepared for possible concurrency issues, such as contention for the APT lock.
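If that contention is a concern, one option (just a sketch, assuming the TEST group from the inventory above and a purely illustrative debug task) is to serialize the play so only one alias runs at a time:

- hosts: TEST
  serial: 1   # run the play against one alias at a time to avoid e.g. APT lock contention
  tasks:
    - name: Show which user this run connects as
      debug:
        var: ansible_user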
I am trying to run certain tasks using delegate_to: localhost or connection: local, and other tasks on the remote hosts. However, the task is executed on localhost multiple times when I use delegate_to: localhost.
My inventory:
localhost ansible_host=127.0.0.1 ansible_connection=local ansible_python_interpreter="{{ansible_playbook_python}}"
[master1]
ip-10-90-148-195.ap-southeast-1.compute.internal
[master2]
ip-10-90-149-130.ap-southeast-1.compute.internal
[master3]
ip-10-90-150-239.ap-southeast-1.compute.internal
[master:children]
master1
master2
master3
[worker]
ip-10-90-148-206.ap-southeast-1.compute.internal
ip-10-90-149-178.ap-southeast-1.compute.internal
ip-10-90-150-86.ap-southeast-1.compute.internal
[all:vars]
ansible_user="core"
ansible_ssh_private_key_file="~/.ssh/id_rsa"
My task:
- name: host name
  shell: echo `hostname` >> /tmp/123
  # delegate_to: localhost
  # connection: local
If I leave delegate_to: localhost and connection: local commented out, I get a file /tmp/123 on each remote host with its own hostname inside it, which is the expected result.
However, if I uncomment either of them, the task is executed 6 times on localhost, meaning /tmp/123 on localhost ends up with localhost's hostname printed 6 times in it.
My goal is simple: I want to run certain tasks on all hosts as defined by the playbook's hosts: groupa:groupb, and certain tasks on localhost, but only once. I thought this would be straightforward, but I have spent hours on it with no result.
If your hosts: line contains groupa:groupb, then it makes sense that the task runs 6 times; with delegate_to: localhost, each of those 6 runs is executed on localhost.
You need to add run_once: true at the task level,
or move that task into a separate play that targets localhost only.
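A minimal sketch of the first option, reusing the echo task from the question (the task names are just illustrative):

- name: host name on every remote host
  shell: echo `hostname` >> /tmp/123

- name: host name on the control node, once per play
  shell: echo `hostname` >> /tmp/123
  delegate_to: localhost
  run_once: true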
I have set up 2 servers for testing purposes, which are behind a NAT network, so I configured port forwarding to the SSH port for both of them.
My inventory file looks like this:
[webservers]
example.com:12021
example.com:12121
[webservers:vars]
ansible_user=root
ansible_ssh_private_key_file=~/test/keys/id_ed25519
But Ansible only identifies one of them (whichever is first in the list). My "hack" to run ansible-playbook commands on both of them is to change the order in the host list and run the playbook twice.
So, is there any way to identify the hosts by port number and not by hostname?
Thanks in advance.
Use any label you like:
[webservers]
server1 ansible_host=example.com ansible_port=12021
server2 ansible_host=example.com ansible_port=12121
I have to check the status of my databases, which run on many servers with different user IDs. For example: on server1 it runs with user1, on server2 it runs with user2, etc. In my playbook I have the code to check the database status, but I don't know how to run this on different hosts with different user IDs.
I have written a playbook which can check the database status, but I don't know how to make it run with different user IDs on different servers.
Playbook:
---
- hosts: all
  become: true
  become_user: db2inst1
  tasks:
    - name: Start DB2
      command: /home/db2inst1/sqllib/adm/db2_ps
Inventory:
[db-servers]
192.168.4.30
You should take a look at the "Connecting to hosts: behavioral inventory parameters" section of the Ansible docs. It explains all the variables you can set to modify Ansible's behavior for each host.
In your case, you are going to have to define the ansible_user variable for each host, along with the corresponding authentication details (passwords, SSH keys, etc.).
You can configure these variables in your inventory file. For example, suppose you need to use the user root for one server and ubuntu for another. You can configure your inventory like this.
[db-servers]
192.168.4.30 ansible_user=root
192.168.4.31 ansible_user=ubuntu
I hope it helps.
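If the user the database runs under (the become_user) also differs per server, the same idea applies with an ordinary host variable. A rough sketch, where db_owner is just a made-up variable name and the second IP is illustrative:

[db-servers]
192.168.4.30 ansible_user=root db_owner=db2inst1
192.168.4.31 ansible_user=ubuntu db_owner=db2inst2

The playbook from the question can then reference it:

---
- hosts: db-servers
  become: true
  tasks:
    - name: Check DB2 status
      command: "/home/{{ db_owner }}/sqllib/adm/db2_ps"
      become_user: "{{ db_owner }}"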
I have a playbook with multiple tasks for a single role. I want to divide the tasks, say 80% to the first host and the remaining 20% to the second host, where the first and second hosts will be picked from
ansible-playbook -i 1.2.3.4,2.3.4.5, update.yml
where 1.2.3.4 is the first server's IP and 2.3.4.5 is the second server's IP. How can I achieve this?
To recap:
You have one role with 10 tasks, 8 of which you want to execute on server 1 and the rest on server 2.
One way would be to write 2 different playbooks, each including only the tasks you want to execute on the corresponding host.
Another might be to tag each task and run ansible-playbook with --tags, specifying the tags at the play level:
- hosts: all
  tags:
    - foo
  roles:
    ...

- hosts: all
  tags:
    - bar
  roles:
    ...
ref https://docs.ansible.com/ansible/latest/user_guide/playbooks_tags.html
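For example (assuming the inline two-host inventory from the question and the tag names above), the first set of tasks could then be run on its own with:

ansible-playbook -i 1.2.3.4,2.3.4.5, update.yml --tags foo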
Playbook task execution can be controlled by tags or blocks. My previous answer was about running the tasks on only some of the hosts (I misunderstood).
For example,
serial: "80%"
would mean that all the tasks are performed on 80% of the hosts first, and then on the remaining hosts.
For a playbook to execute some tasks on some hosts and other tasks on others, you can use when conditions that match against ansible_hostname (or inventory_hostname) for the relevant hosts.
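A minimal sketch of that approach, assuming the two IPs from the question are the inventory host names and using placeholder commands:

- hosts: all
  tasks:
    - name: Part of the work meant for the first host
      command: /bin/true   # placeholder for the ~80% of tasks
      when: inventory_hostname == "1.2.3.4"

    - name: Part of the work meant for the second host
      command: /bin/true   # placeholder for the remaining ~20%
      when: inventory_hostname == "2.3.4.5"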
I have a complex Ansible setup with several hosts in my group file, something like this:
# hosts/groups
[local]
127.0.0.1
[server1]
server1.domain.com
[server2]
server2.domain.com
[group1]
local
server1
[group2]
local
server2
This way, I can run both groups against localhost:2222, which is my Vagrant box; however, they will both be executed. For testing, I would very much prefer to choose which setup I want to test. I have experimented with --extra-vars arguments and conditionals, which is pretty ugly. Is there a way to use the extra_vars argument together with the host configuration, using a command like ...
ansible-playbook playbook.yml -i hosts -l 127.0.0.1:2222 --extra-vars "vhost=server1.domain.com"
Or am I entirely wrong?
I don't think there's an easy way to do this via alterations to how you execute Ansible.
The best option I can think of involves re-organizing your playbooks. If you create group1.yaml and group2.yaml, each containing the plays necessary for setting up group1 and group2 respectively, then you can run something like
[$]> ansible-playbook -l 127.0.0.1:2222 group1.yaml
to only run the group1 configuration against your development instance.
If you still want a convenient way to run all tasks, alter your playbook.yaml to include the other playbooks:
- include: group1.yaml
- include: group2.yaml
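On more recent Ansible versions (an assumption about your environment), the equivalent composition is spelled with import_playbook:

- import_playbook: group1.yaml
- import_playbook: group2.yaml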