Dynamic File names in vars_files with inventory variables - ansible

Following is a simple playbook:
- name: Create VM and associated resources
  hosts: linux
  connection: local
  vars_files:
    - vars_files/{{ env_name }}_vars.yml
    - vars_files/base_vars.yml
  roles:
    - linux
And my inventory file is TEST.yml:
all:
  vars:
    env_name: TEST
linux:
  host:
    TEST-SERVER:
      ansible_host: 10.10.10.10
When I run the playbook with ansible-playbook -vvv playbook_test.yml, I receive the following error:
skipping vars_file 'vars_files/{{ env_name }}_vars.yml' due to an undefined variable
Any idea how I can use a variable from my inventory in the file name?
Any help is greatly appreciated.
Thanks,

The inventory file is wrong because of the key "host":
all:
  vars:
    env_name: TEST
linux:
  host:
    TEST-SERVER:
      ansible_host: 10.10.10.10
You should have seen a warning:
[WARNING]: Skipping unexpected key (host) in group (linux), only "vars", "children" and "hosts" are valid
Fix the inventory:
all:
  vars:
    env_name: TEST
linux:
  hosts:
    TEST-SERVER:
      ansible_host: 10.10.10.10
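With the fixed inventory, env_name is defined for TEST-SERVER before the play starts, so the templated vars_files path can resolve. You can confirm the variable is picked up with, for example:
ansible-inventory -i TEST.yml --host TEST-SERVER
which should list env_name among the host's variables.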

Related

ansible-pull playing playbook on remote host: how to point to the localhost in inventory? (Could not match supplied host pattern)

I'm trying to use ansible-pull locally on a remote host, where ansible-pull retrieves a playbook and an inventory from a git repo.
I don't know how to point to the remote host in the inventory and in group vars. I have tried with the machine name (user@machine_name in bash), but it tells me:
[WARNING]: Could not match supplied host pattern, ignoring:
This is my inventory:
all:
  children:
    webserver:
      hosts:
        ip-172-31-21-218: # this is the machine name
    dbserver:
      hosts:
        node2:
          ansible_host: 34.201.53.127
I want ansible-pull to run the playbook locally on the host machine, but the machine name in the inventory has to match the host machine name...
This is a sample from the official documentation; keep reading the comments :)
all:
  hosts: # you can ignore this line
    mail.example.com: # and this line also
  children:
    webservers:
      hosts:
        foo.example.com:
        bar.example.com:
    dbservers:
      hosts:
        one.example.com:
        two.example.com:
        three.example.com:
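For completeness, a minimal ansible-pull invocation against such an inventory might look like this (the repository URL and file names here are placeholders):
ansible-pull -U https://github.com/example/repo.git -i inventory.yml local.yml
ansible-pull checks out the repository and runs the playbook locally, so the play's hosts pattern still has to match the machine's name as it appears in the bundled inventory.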
Regards.

`remote_user` is ignored in playbooks and roles

I have defined the following in my ansible.cfg
# default user to use for playbooks if user is not specified
# (/usr/bin/ansible will use current user as default)
remote_user = ansible
However, I have a playbook bootstrap.yaml where I connect as root rather than ansible:
---
- hosts: "{{ target }}"
  become: no
  gather_facts: false
  remote_user: root
  vars:
    os_family: "{{ osfamily }}"
  roles:
    - role: papanito.bootstrap
However, it seems that remote_user: root is ignored, as I always get a connection error because Ansible uses the user ansible instead of root for the ssh connection:
fatal: [node001]: UNREACHABLE! => {"changed": false,
  "msg": "Failed to connect to the host via ssh:
  ansible@node001: Permission denied (publickey,password).",
  "unreachable": true}
The only workaround I could find is calling the playbook with -e ansible_user=root. But this is not convenient, as I want to call multiple playbooks from site.yaml, where the first playbook has to run with ansible_user root, whereas the others have to run with ansible:
- import_playbook: playbooks/bootstrap.yml
- import_playbook: playbooks/networking.yml
- import_playbook: playbooks/monitoring.yml
Any suggestions on what I am missing or how to fix it?
Q: "remote_user: root is ignored"
A: The playbook works as expected
- hosts: test_01
gather_facts: false
become: no
remote_user: root
tasks:
- command: whoami
register: result
- debug:
var: result.stdout
gives
"result.stdout": "root"
But the variable can be overridden in the inventory. For example, with the inventory
$ cat hosts
all:
  hosts:
    test_01:
  vars:
    ansible_connection: ssh
    ansible_user: admin
the result is
"result.stdout": "admin"
Double-check the inventory with the command
$ ansible-inventory --list
Notes
It might also be necessary to double-check the role - role: papanito.bootstrap
See Controlling how Ansible behaves: precedence rules
I faced a similar issue, where an EC2 instance required a different username to ssh with. You could try the below example:
- import_playbook: playbooks/bootstrap.yml
  vars:
    ansible_ssh_user: root
Try this: instead of remote_user: root, use remote_user: ansible and additionally become: yes, become_user: root, and become_method: sudo (or su).
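A minimal sketch of that suggestion, keeping the role from the question (it assumes the ansible user can sudo to root on the targets):
- hosts: "{{ target }}"
  gather_facts: false
  remote_user: ansible   # connect over ssh as ansible...
  become: yes            # ...then escalate to root
  become_user: root
  become_method: sudo
  roles:
    - role: papanito.bootstrap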

Hosts are not being picked from the Inventory file

I'm trying to perform the tasks in the playbook for the hosts mentioned in my inventory file, which are grouped under "jira", but for some reason my group is not being picked up. For the contents of the files, please look below.
How can I run all the tasks mentioned in the playbook against all the hosts in the inventory?
I have an inventory file, Hosts.yml, with the below contents:
all: # the all group contains all hosts
  hosts:
    ansible:
      ansible_host: #{ansible-controller}
      ansible_user: root
    crowd:
      ansible_host: #{crowd}
      ansible_user: root
    jira:
      ansible_host1: 53.31.54.56
      ansible_host2: 53.31.54.55
I have a playbook with this content:
---
- name: Install Jira Application
  hosts: jira
  gather_facts: true
  become: true
  remote_user: root
  roles:
    - ansible-preparation
    #- jira-applicationsetup
I always get the below error message:
root@sedcagse0550:/usr/Anil/InfraAutomation/gsep-infrastructure-automation : ansible-playbook jira-fullinstall.yml
[WARNING]: Could not match supplied host pattern, ignoring: jira
PLAY [Install Jira Application] *************************************************************
skipping: no hosts matched
PLAY RECAP **********************************************************************************
How can I perform all the tasks on all the hosts mentioned in the inventory file?
You should run ansible-playbook with the inventory parameter (-i), like this:
ansible-playbook -i Hosts.yml jira-fullinstall.yml
Otherwise, Ansible checks the default inventory file location, which is /etc/ansible/hosts.
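If you don't want to pass -i on every run, you can instead point Ansible at the file in ansible.cfg (assuming Hosts.yml sits next to the playbook):
[defaults]
inventory = ./Hosts.yml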
If your Hosts.yml is already in that location, check your inventory file. In the Ansible 2.7 User Guide, YAML inventory files look like this:
all:
  children:
    jira:
      hosts:
        53.31.54.56:
        53.31.54.55:
If I understood it correctly, your inventory file should look as below:
ansible: # Group Name
  hosts:
    ansible_host: # Host name
      ansible_user: root # Host Variable
crowd:
  hosts:
    ansible_host: #{crowd}
      ansible_user: root
jira:
  hosts:
    ansible_host1:
      ansible_host: 53.31.54.56
    ansible_host2:
      ansible_host: 53.31.54.55
Please refer to this link for detailed formatting of YAML-based inventories.
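Either way, you can quickly check whether the jira group is now recognized (assuming the file is named Hosts.yml):
ansible-inventory -i Hosts.yml --graph
This prints the group/host tree as ansible-playbook will see it.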

Exclude a group that contains localhost

Given the following inventory:
[group1]
myserver.domain.com ansible_ssh_user=myUser
[group2]
localhost ansible_connection=local
How can I execute my playbook only on the group1 host(s)?
When I use --limit=group1, it also includes localhost.
I tried --limit='!group2'; it does not work either.
Any idea?
Thx in advance
EDIT:
I am using ansible 1.9.2.
I can't test it on your Ansible version. I suggest a workaround: change the target hosts definition in your playbook, so you'll have something like this:
- name: Test limit
  hosts: "{{ hosts_nodes | default('all') }}"
  tasks:
    - file: path=/tmp/mydir state=directory
and run the playbook passing hosts_nodes as an extra variable:
ansible-playbook -i test.inventory test.yml -e hosts_nodes=group1
What Ansible version are you using? It works correctly in version 2.1.2.0.
This is my test.inventory file:
[group1]
myserver.domain.com ansible_ssh_user=myUser
[group2]
localhost ansible_connection=local
This is my test playbook, test.yml:
- name: Test limit
  hosts: all
  tasks:
    - file: path=/tmp/mydir state=directory
I get the expected result both when running
ansible-playbook -i test.inventory --limit group2 test.yml
and
ansible-playbook -i test.inventory --limit '!group1' test.yml
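On newer Ansible versions you can also combine patterns in a single limit, which expresses the original intent directly (untested on 1.9.2):
ansible-playbook -i test.inventory --limit 'group1:!group2' test.yml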

Make Ansible included playbooks run on same hosts as their parent

Hello, what is the best way to make an included playbook run on the same hosts as the playbook that called it?
I've tried declaring a variable in the parent playbook with the host name and then passing it to the included playbook, but I get an error telling me that the variable is undefined.
Below is my playbook:
---
# Main staging configuration playbook
- vars:
    host_name: "stage_ansible"
  hosts: "{{ host_name }}"
  remote_user: ubuntu
  tasks:
    - name: test connection
      ping:
      remote_user: ubuntu

- include: NginxDefinitions.yml
  vars:
    service_name: "interaction.qmerce.com"
    env_name: "stage4"
    host_name_pass: "{{ host_name }}"
...
and the error I'm receiving:
ERROR! 'host_name' is undefined
If you want to define the hosts at runtime and avoid hard-coding them in the playbook, you can pass the hosts as extra variables on the command line.
To do so, remove the vars definition from your first play and add the following to the ansible-playbook command line:
--extra-vars host_name=localhost
or, when you have multiple hosts:
--extra-vars '{"host_name":["host1","host2","host3"]}'
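Putting it together, the parent play would keep hosts: "{{ host_name }}" but drop the vars block, and the run would look something like this (the playbook file name here is just an example):
ansible-playbook main.yml --extra-vars host_name=stage_ansible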
