Ansible: Get Variable with inventory_hostname

I have the following passwords file vault.yml:
---
server1: "pass1"
server2: "pass2"
server3: "pass3"
I am loading these values in a variable called passwords:
- name: Get Secrets
  set_fact:
    passwords: "{{ lookup('template', './vault.yml') | from_yaml }}"
  delegate_to: localhost

- name: debug it
  debug:
    var: passwords.{{ inventory_hostname }}
The debug task shows exactly the result I want: the password for the specific host.
But if I set the following in a variables file:
---
ansible_user: root
ansible_password: passwords.{{ inventory_hostname }}
This does not give me the desired result: ansible_password treats "passwords" as a literal string rather than as a variable lookup.
How can I achieve the same result I got when debugging the passwords.{{ inventory_hostname }}?

Regarding the part
... if I set the following in a variables file ...
I am not sure, since some information about your use case and data flow is missing. In general, however, the syntax ansible_password: "{{ PASSWORDS[inventory_hostname] }}" should work for you:
---
- hosts: localhost
  become: false
  gather_facts: false
  vars:
    PASSWORDS:
      SERVER1: "pass1"
      SERVER2: "pass2"
      SERVER3: "pass3"
      localhost: "pass_local"
  tasks:
    - name: Debug var
      debug:
        var: PASSWORDS

    - name: Set Fact 'ansible_password'
      set_fact:
        ansible_password: "{{ PASSWORDS[inventory_hostname] }}"

    - name: Debug var
      debug:
        var: ansible_password
That way you can access an element of the dictionary by name.
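Applied back to the original vault.yml lookup, a minimal sketch of the same idea (same file layout as in the question; the bracket notation resolves inventory_hostname before the dictionary access):

- name: Get Secrets
  set_fact:
    passwords: "{{ lookup('template', './vault.yml') | from_yaml }}"
  delegate_to: localhost

- name: Set per-host connection password
  set_fact:
    # passwords is the dict loaded above; index it by the current host's name
    ansible_password: "{{ passwords[inventory_hostname] }}"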

Related

Ansible: how to set_fact with condition, or var based on condition

I have 3 types of servers: dev, qa, and prod. I need to send files to server-specific home directories, i.e.:
Dev Home Directory: /dev/home
QA Home Directory: /qa/home
PROD Home Directory: /prod/home
I have vars set as booleans to determine the server type and I'm thinking of using set_fact with a condition to assign home directories for the servers. My playbook looks like this:
---
- hosts: localhost
  var:
    dev: "{{ True if <hostname matches dev> | else False }}"
    qa: "{{ True if <hostname matches qa> | else False }}"
    prod: "{{ True if <hostname matches prod> | else False }}"
  tasks:
    - set_facts:
        home_dir: "{{ '/dev/home/' if dev | '/qa/home' if qa | default('prod') }}"
However, when I ran the playbook, I got an error along the lines of "template: expected token 'name', got string". Anyone know what I did wrong? Thanks!
Use the match test. For example, the playbook
shell> cat pb.yml
- hosts: localhost
  vars:
    dev_pattern: '^dev_.+$'
    qa_pattern: '^qa_.+$'
    prod_pattern: '^prod_.+$'
    dev: "{{ hostname is match(dev_pattern) }}"
    qa: "{{ hostname is match(qa_pattern) }}"
    prod: "{{ hostname is match(prod_pattern) }}"
  tasks:
    - set_fact:
        home_dir: /prod/home
    - set_fact:
        home_dir: /dev/home
      when: dev|bool
    - set_fact:
        home_dir: /qa/home
      when: qa|bool
    - debug:
        var: home_dir
gives (abridged)
shell> ansible-playbook pb.yml -e hostname=dev_007
home_dir: /dev/home
Notes:
The variable prod is not used because /prod/home is the default.
/prod/home is assigned to home_dir first because it is the default; subsequent tasks can then conditionally overwrite home_dir.
Without the variable hostname defined, the playbook will fail (see the sketch below for one way to guard against that).
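One way to guard against an undefined hostname is to give it an empty default inside the match expressions. This is a sketch of the idea, not part of the original answer:

    dev: "{{ (hostname | default('')) is match(dev_pattern) }}"
    qa: "{{ (hostname | default('')) is match(qa_pattern) }}"
    prod: "{{ (hostname | default('')) is match(prod_pattern) }}"

With the empty default none of the patterns match, so home_dir keeps the default /prod/home.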
A simpler solution that gives the same result is creating a dictionary of 'pattern: home_dir' pairs. For example:
- hosts: localhost
  vars:
    home_dirs:
      dev_: /dev/home
      qa_: /qa/home
      prod_: /prod/home
  tasks:
    - set_fact:
        home_dir: "{{ home_dirs |
                      dict2items |
                      selectattr('key', 'in', hostname) |
                      map(attribute='value') |
                      list |
                      first | default('/prod/home') }}"
    - debug:
        var: home_dir
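For illustration, hypothetical runs of this variant (same -e hostname pattern as above, output abridged):

shell> ansible-playbook pb.yml -e hostname=qa_007
home_dir: /qa/home

shell> ansible-playbook pb.yml -e hostname=db_007
home_dir: /prod/home

The second run falls through to default('/prod/home') because no key of home_dirs is a substring of db_007.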
Adding an alternative method to achieve this. It extends the group_vars suggestion given by @mdaniel in his comment.
Ansible has a ready-made mechanism to build variables based on the inventory hosts and groups. If you organize your inventory accordingly, you can avoid a lot of complication in trying to match host patterns.
Below is a simplified example; please go through the Ansible inventory documentation for more options.
Consider an inventory file /home/user/ansible/hosts:
[dev]
srv01.dev.example
srv02.dev.example
[qa]
srv01.qa.example
srv02.qa.example
[prod]
srv01.prod.example
srv02.prod.example
Using group_vars:
Then you can have below group_var files in /home/user/ansible/group_vars/ (matching inventory group names):
dev.yml
qa.yml
prod.yml
In dev.yml:
home_dir: "/dev/home"
In qa.yml:
home_dir: "/qa/home"
In prod.yml:
home_dir: "/prod/home"
Using host_vars:
Or you can have variables specific to hosts in host_vars directory /home/user/ansible/host_vars/:
srv01.dev.example.yml
srv01.prod.example.yml
# and so on
In srv01.dev.example.yml:
home_dir: "/dev/home"
In srv01.prod.example.yml:
home_dir: "/prod/home"
These variables will be picked up based on which hosts you run the playbook against. For example, the below playbook:
---
- hosts: dev
  tasks:
    - debug:
        var: home_dir    # will be "/dev/home"

- hosts: prod
  tasks:
    - debug:
        var: home_dir    # will be "/prod/home"

- hosts: srv01.dev.example
  tasks:
    - debug:
        var: home_dir    # will be "/dev/home"

- hosts: srv01.prod.example
  tasks:
    - debug:
        var: home_dir    # will be "/prod/home"
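As a usage sketch, a hypothetical invocation against that inventory (site.yml stands in for the playbook above):

shell> ansible-playbook -i /home/user/ansible/hosts site.yml

If a variable is defined in both places, the host_vars value takes precedence over the group_vars value.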

Set path when file exists in Ansible YAML code

I'm trying to set a var only when a file exists. Here is one of my attempts:
---
- hosts: all
  tasks:
    - stat:
        path: '{{ srch_path_new }}/bin/run'
      register: result
    - vars: srch_path="{{ srch_path_new }}"
      when: result.stat.exists
This also didn't work
- vars: srch_path:"{{ srch_path_new }}"
The task you are looking for is called set_fact: and is the mechanism Ansible uses to declare arbitrary "host variables", sometimes called "hostvars" or (also confusingly) "facts".
The syntax would be:
- set_fact:
    srch_path: "{{ srch_path_new }}"
  when: result.stat.exists
Also, while vars: is a legal keyword on a task, its syntax is the same as that of set_fact: (or the vars: on the playbook): a YAML dictionary, not a key:value pair as you had. For example:
- debug:
    msg: hello, {{ friend }}
  vars:
    friend: Jane Doe
Also be aware that vars: on a task exists only for that task.
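A minimal sketch of that scoping behavior (the default filter in the second task is only there so it doesn't fail):

- debug:
    msg: hello, {{ friend }}
  vars:
    friend: Jane Doe        # defined for this task only

- debug:
    # 'friend' is no longer defined here, so the default kicks in
    msg: hello again, {{ friend | default('stranger') }}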

ansible - Incorrect type. Expected "object"

site.yaml
---
- name: someapp deployment playbook
  hosts: localhost
  connection: local
  gather_facts: no
  vars_files:
    - secrets.yml
  environment:
    AWS_DEFAULT_REGION: "{{ lookup('env', 'AWS_DEFAULT_VERSION') | default('ca-central-1', true) }}"
  tasks:
    - include: tasks/create_stack.yml
    - include: tasks/deploy_app.yml
create_stack.yml
---
- name: task to create/update stack
  cloudformation:
    stack_name: someapp
    state: present
    template: templates/stack.yml
    template_format: yaml
    template_parameters:
      VpcId: "{{ vpc_id }}"
      SubnetId: "{{ subnet_id }}"
      KeyPair: "{{ ec2_keypair }}"
      InstanceCount: "{{ instance_count | default(1) }}"
      DbSubnets: "{{ db_subnets | join(',') }}"
      DbAvailabilityZone: "{{ db_availability_zone }}"
      DbUsername: "{{ db_username }}"
      DbPassword: "{{ db_password }}"
    tags:
      Environment: test
  register: cf_stack

- name: task to output stack output
  debug: msg={{ cf_stack }}
  when: debug is defined
Error at line debug: msg={{ cf_stack }} saying:
This module prints statements during execution and can be useful for debugging variables or expressions without necessarily halting the playbook. Useful for debugging together with the 'when:' directive.
This module is also supported for Windows targets.
Incorrect type. Expected "object".
The Ansible documentation allows the above syntax, as shown here.
$ ansible --version
ansible 2.5.1
....
How to resolve this error?
You still need to remember to quote values starting with {, even when using the short-hand notation:
- debug: msg="{{ cf_stack }}"
This would be more obvious using full YAML notation:
- debug:
    msg: "{{ cf_stack }}"
Also, given this is a variable, you could just do:
- debug:
    var: cf_stack
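Since cf_stack is a registered result, you can also drill into it. For instance, the cloudformation module returns the stack outputs under stack_outputs; a sketch, assuming the stack template defines outputs:

- debug:
    var: cf_stack.stack_outputs   # just the CloudFormation outputs, not the whole result
  when: debug is defined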

Making prompted vars usable across the playbook

I have one playbook with multiple tasks that have to run on different hosts.
At the beginning of the play, I want to prompt the operator for their credentials, which are the same for every host in the play. I want those credentials "stored" somewhere so they can be used across the tasks to log in to the provided host(s).
The playbook looks as follows:
---
- name: Ask for credentials
  vars_prompt:
    - name: username
      prompt: "Username?"
    - name: password
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
    - set_fact:
        password: "{{ password }}"

- hosts: Host1
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff

- hosts: Host2
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff
...
From the moment the play hits the first task, it fails with the following error:
msg: 'The field ''remote_user'' has an invalid value, which includes an undefined variable. The error was: ''username'' is undefined'
Does anyone have experience making prompted vars usable across the whole play and all tasks?
Q: "Make prompted vars usable across the whole play and all tasks>"
A: Run the first play with the group of all hosts that should be connected later. Run once the set_fact task. This will create the variables username and password for all hosts in the group.
For example, if the group test_jails comprises the hosts test_01, test_02, and test_03, the play
- hosts: test_jails
  vars_prompt:
    - name: "username"
      prompt: "Username?"
    - name: "password"
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
        password: "{{ password }}"
      run_once: true

- hosts: test_01
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - debug:
        msg: "{{ ansible_user }} {{ ansible_password }}"
gives
ok: [test_01] => {
    "msg": "admin 1234"
}
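Note that vars_prompt is skipped for variables that are already supplied on the command line, so the same playbook can also run non-interactively; a hypothetical invocation (pb.yml stands in for the playbook file):

shell> ansible-playbook pb.yml -e username=admin -e password=1234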

How to set a fact which is visible on all hosts in an Ansible role

I'm setting a fact in a role:
- name: Check if manager already configured
  shell: >
    docker info | perl -ne 'print "$1" if /Swarm: (\w+)/'
  register: swarm_status

- name: Init cluster
  shell: >-
    docker swarm init
    --advertise-addr "{{ ansible_default_ipv4.address }}"
  when: "'active' not in swarm_status.stdout_lines"

- name: Get worker token
  shell: docker swarm join-token -q worker
  register: worker_token_result

- set_fact:
    worker_token: "{{ worker_token_result.stdout }}"
Then I want to access worker_token on other hosts. Here's my main playbook; the fact is defined in the swarm-master role:
- hosts: swarm_cluster
  become: yes
  roles:
    - docker

- hosts: swarm_cluster:&manager
  become: yes
  roles:
    - swarm-master

- hosts: swarm_cluster:&node
  become: yes
  tasks:
    - debug:
        msg: "{{ worker_token }}"
I'm getting an undefined variable error. How can I make it visible globally?
Of course, it works perfectly if I run debug on the same host.
If your goal is just to access worker_token from another host, you can use the hostvars variable and iterate through the group where you've defined the variable, like this:
- hosts: swarm_cluster:&node
  tasks:
    - debug:
        msg: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"
If your goal is to define the variable globally, you can add a play to define it on all hosts, like this:
- hosts: all
  tasks:
    - set_fact:
        worker_token_global: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"

- hosts: swarm_cluster:&node
  tasks:
    - debug:
        var: worker_token_global
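If any single manager is good enough as a source, a shorter variant is to index the group directly; a sketch, assuming groups['manager'] is non-empty and the fact was set on its first host:

- hosts: swarm_cluster:&node
  tasks:
    - debug:
        # read the fact straight from the first manager's hostvars
        msg: "{{ hostvars[groups['manager'][0]]['worker_token'] }}"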
