Ansible - Can't access value - Got error: 'dict object' has no attribute - ansible

---
- hosts: localhost
  gather_facts: no
  vars:
    var_folder_path: /home/play
  tasks:
    - name: Include all yaml files in directories
      include_vars:
        dir: "{{ var_folder_path }}/vars"
        extensions:
          - 'yaml'
    - name: "Print Variable Name"
      shell: echo "{{ item }}"
      loop:
        - "{{ global.globalname.property.Name }}"
        - "{{ S3.secret }}"
My var files under /home/play/vars:
example_1.yaml
global:
  globalname:
    property:
      cipher: DEFAULT
      client:
        type: dynamic
      Name: test-run
example_2.yaml
gcp:
  keyname: sample-run
S3:
  secret: run
  Resources: false
celery:
  resources:
    limits:
      cpu: 5
When I execute the playbook, I get the error below. I'm not sure why the values are not loading.
fatal: [localhost]: FAILED! => {"msg": "'dict object' has no attribute 'globalname'"}

To debug a playbook when an error occurs, it is very often a good idea to insert a debug task.
In your case, insert a debug task before the shell task and output the whole global variable:
- debug:
    var: global
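If global itself prints as expected, you can narrow the problem down by debugging the nested path as well; a minimal sketch (an undefined level will simply be reported as not defined):
- debug:
    var: global.globalname
- debug:
    var: global.globalname.property.Name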

As mentioned in my comment above, there are no issues with the playbooks you shared, except for the "S3" variable definition in example_2.yaml, which should be "s3" (in lower case).
One possible cause for the error you reported is that there is more than one "global" var definition among the var files at /home/play/vars, and it is overriding the global definition from example_1.yaml.
Default Ansible merging is in ASCII order, i.e., the last group loaded overwrites the previous groups. See how-variables-are-merged in the official Ansible documentation for more details on how variables are merged, and update your var files accordingly.
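For illustration only (a hypothetical extra file, not one of yours): if any other file in /home/play/vars also defined a top-level global, the dictionary loaded from example_1.yaml would be replaced wholesale under the default hash behaviour, and global.globalname would no longer exist:
# hypothetical example_3.yaml, loaded after example_1.yaml in ASCII order
global:
  othername:
    Name: other-run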
As mentioned by user #phanaz in the other answer, it's good practice to use the "debug" module to print the vars for validation in such scenarios.

Related

Setting and reading environment variables in Ansible does not work [duplicate]

I am deploying a CentOS machine, and one of the tasks is to read a file that is rendered by the Consul service, which places it under /etc/sysconfig. I am trying to later read it into a variable using the lookup module, but it throws the error below:
fatal: [ansible_vm1]: FAILED! => {"failed": true, "msg": "could not locate file in lookup: /etc/sysconfig/idb_EndPoint"}
But I am running the lookup task well after the point where the idb_EndPoint file is generated, and I also logged in manually to verify the file is available.
- name: importing the file contents to variable
  set_fact:
    idb_endpoint: "{{ lookup('file', '/etc/sysconfig/idb_EndPoint') }}"
  become: true
I also tried privilege escalation with another user (become_user: deployuser along with become: true), but it still didn't work. I am using Ansible version 2.2.1.0.
All lookup plugins in Ansible are executed locally on the control machine.
Instead, use the slurp module:
- name: importing the file contents to variable
  slurp:
    src: /etc/sysconfig/idb_EndPoint
  register: idb_endpoint_b64
  become: true

- set_fact:
    idb_endpoint: "{{ idb_endpoint_b64.content | b64decode }}"
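Note that slurp returns the file content base64-encoded in the registered result, which is why the b64decode filter is applied above. A quick check of the decoded value (a minimal sketch):
- debug:
    var: idb_endpoint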

Read a file locally and use the vars remote in Ansible

I read a YAML file locally with the following playbook:
- name: Ensure the deploy_manifest var is defined and read deploy manifest
  hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - assert:
        that: deploy_manifest is defined
        msg: |
          Error: Must provide providers config path. Fix: Add '-e deploy_manifest=/path/to/manifest' to the ansible-playbook command
    - name: Read deploy manifest
      include_vars:
        file: "{{ deploy_manifest }}"
        name: manifest
      register: manifest
    - debug:
        msg: "[{{ manifest.key }}]: {{ manifest.value }}"
      with_dict: "{{ manifest.ansible_facts }}"
and then in the same playbook YAML file I run:
- name: Deploy Backend services
  hosts: backend
  remote_user: ubuntu
  gather_facts: False
  vars:
    env: "{{ env }}"
    services: "{{ manifest.ansible_facts }}"
  tasks:
    - include_role:
        name: services_backend
      when: backend | default(true) | bool
However, it doesn't work because debug fails. It says that manifest is empty.
What is the best way to read a YAML file, or generally a configuration, in a playbook and then have the variables passed to another playbook?
Your debug module doesn't say "that manifest is empty", it says the key manifest.key does not exist because it does not.
You registered a fact named manifest with:
register: manifest
You try to refer to a key of the above manifest named key and another key (!) named value:
msg: "[{{ manifest.key }}]: {{ manifest.value }}"
Please read the Looping over Hashes chapter and note that (without using loop control) you refer to the iterated variable using item.
Please note that with name: manifest and register: manifest you read your vars file into manifest.ansible_facts.manifest.
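Putting that together, a corrected debug loop might look like this (a sketch reusing your names, iterating with item.key / item.value from with_dict):
- debug:
    msg: "[{{ item.key }}]: {{ item.value }}"
  with_dict: "{{ manifest.ansible_facts.manifest }}"
Alternatively, keep only one of name: manifest or register: manifest so the data is not nested twice.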

Trying to include a list of tasks from a playbook in Ansible

My folder structure:
First I'll give you this so you can see how this is laid out and reference it when reading below:
/environments
  /development
    hosts                  // Inventory file
    /group_vars
      proxies.yml
    /custom_tasks
      firewall_rules.yml   // File I'm trying to bring in
playbook.yml               // Root playbook, just brings in the plays
rev-proxy.yml              // Reverse-proxy playbook, included by playbook.yml
playbook.yml:
---
- include: webserver.yml
- include: rev-proxy.yml
proxies.yml just contains firewall_custom_include_file: custom_tasks/firewall_rules.yml
firewall_rules.yml:
tasks:
  - name: "Allowing traffic from webservers on 80"
    ufw: src=10.10.10.3, port=80, direction=in, rule=allow
  - name: "Allowing traffic all on 443"
    ufw: port=443, rule=allow
and finally rev-proxy.yml play:
---
- hosts: proxies
  become: yes
  roles:
    - { role: firewall }
    - { role: geerlingguy.nginx }
  pre_tasks:
    # jessie-backports for nginx-extras 1.10
    - name: "Adding jessie-backports repo"
      copy: content="deb http://ftp.debian.org/debian jessie-backports main" dest="/etc/apt/sources.list.d/jessie-backports.list"
    - name: Updating apt-cache.
      apt: update_cache="yes"
    - name: "Installing htop"
      apt:
        name: htop
        state: present
    - name: "Copying SSL certificates"
      copy: src=/vagrant/ansible/files/ssl/ dest=/etc/ssl/certs force=no
  tasks:
    - name: "Including custom firewall rules."
      include: "{{ inventory_dir }}/{{ firewall_custom_include_file }}.yml"
      when: firewall_custom_include_file is defined
  vars_files:
    - ./vars/nginx/common.yml
    - ./vars/nginx/proxy.yml
What I'm trying to do:
Using Ansible 2.2.1.0
I'm trying to include a list of tasks that will be run if a variable firewall_custom_include_file is set. The list is included relative to the inventory directory by doing "{{ inventory_dir }}/{{ firewall_custom_include_file }}.yml" - in this case that works out to /vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml
Essentially the idea here is that I need different firewall rules to be executed based on what environment I'm in and which hosts are being provisioned.
To give a simple example: I might want to whitelist a database server IP on the production webserver, but not on the reverse proxy, and also not on my development box.
The problem:
Whenever I include firewall_rules.yml like above, it tells me:
TASK [Including custom firewall rules.] ****************************************
fatal: [proxy-1]: FAILED! => {"failed": true, "reason": "included task files must contain a list of tasks"}
I'm not sure what it's expecting. I tried taking out the tasks: at the beginning of the file, making it:
- name: "Allowing traffic from webservers on 80"
ufw: src=10.10.10.3, port=80, direction=in, rule=allow
- name: "Allowing traffic all on 443"
ufw: port=443, rule=allow
But then it gives me the error:
root@ansible-control:/vagrant/ansible# ansible-playbook -i environments/development playbook.yml
ERROR! Attempted to execute "/vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml" as inventory script: problem running /vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml --list ([Errno 8] Exec format error)
Attempted to read "/vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml" as YAML: 'AnsibleSequence' object has no attribute 'keys'
Attempted to read "/vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml" as ini file: /vagrant/ansible/environments/development/custom_tasks/firewall_rules.yml:2: Expected key=value host variable assignment, got: name:
At this point I'm not really sure what it's looking for in the included file, and I can't seem to really find clear documentation on this, or other people having this issue.
Try executing with -i environments/development/hosts instead of the directory.
But I'd bet that storing a tasks file inside the inventory is far from best practice.
You may want to define the list of custom rules as an inventory variable, e.g.:
custom_rules:
  - src: 10.10.10.3
    port: 80
    direction: in
    rule: allow
  - port: 443
    rule: allow
And instead of an include task, do something like this:
- ufw:
    port: "{{ item.port | default(omit) }}"
    rule: "{{ item.rule | default(omit) }}"
    direction: "{{ item.direction | default(omit) }}"
    src: "{{ item.src | default(omit) }}"
  with_items: "{{ custom_rules }}"
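This keeps the per-environment differences as plain data: each environment's group_vars (e.g. environments/development/group_vars/proxies.yml in your layout) can carry its own custom_rules list, so the production webservers can whitelist the database server IP while the development box and the reverse proxy get different rules, with no per-environment task files at all.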

Parametrized ansible task include - 'parameter is undefined'

I have a problem using a parametrized Ansible include.
I have created the following file, named tasks/haproxy.yml:
- name: "change node state to {{state}} in haproxy"
tags:
- "haproxy-{{state}}"
become: yes
become_user: root
haproxy:
state: "{{ state }}"
wait: yes
host: "{{ inventory_hostname }}"
backend: app
socket: /var/container_data/haproxy/run/haproxy.sock
delegate_to: "{{ item }}"
with_items: "{{ groups.haproxy }}"
I am including this file in my playbook.yml, passing the value of the state parameter:
- include: tasks/haproxy.yml state=enabled
I am getting the following error:
TASK [include] *****************************************************************
included: /home/bb/tasks/haproxy.yml for 172.16.224.68, 172.16.224.69
ERROR! 'state' is undefined
state is my parameter, passed when doing the include (as described in http://docs.ansible.com/ansible/playbooks_roles.html#task-include-files-and-encouraging-reuse).
What's wrong?
I am using Ansible 2.0.2.0.
edit:
Using the alternative syntax for passing parameters:
- include: tasks/haproxy.yml
  vars:
    state: enabled
gives exactly the same error message.
Resolved by removing a single leading space (!!) when using the alternative syntax (vars).
So the correct parametrized include is:
- include: tasks/haproxy.yml
  vars:
    state: enabled
The vars keyword must be at the same level as the include keyword.
Otherwise it does not work, with the message ERROR! 'state' is undefined.
The shortened syntax (- include: tasks/haproxy.yml state=enabled) still does not work.
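For what it's worth, on newer Ansible releases the same pattern is usually written with include_tasks, which takes vars the same way; a minimal sketch:
- include_tasks: tasks/haproxy.yml
  vars:
    state: enabled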

how to read json file using ansible

I have a json file in the same directory where my ansible script is. The following is the content of the json file:
{ "resources":[
{"name":"package1", "downloadURL":"path-to-file1" },
{"name":"package2", "downloadURL": "path-to-file2"}
]
}
I am trying to download these packages using get_url. The following is my approach:
---
- hosts: localhost
  vars:
    package_dir: "/var/opt/"
    version_file: "{{lookup('file','/home/shasha/devOps/tests/packageFile.json')}}"
  tasks:
    - name: Printing the file.
      debug: msg="{{version_file}}"
    - name: Downloading the packages.
      get_url: url="{{item.downloadURL}}" dest="{{package_dir}}" mode=0777
      with_items: version_file.resources
The first task prints the content of the file correctly, but in the second task I am getting the following error:
[DEPRECATION WARNING]: Skipping task due to undefined attribute, in the future this
will be a fatal error.. This feature will be removed in a future release. Deprecation
warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
You have to add a from_json jinja2 filter after the lookup:
version_file: "{{ lookup('file','/home/shasha/devOps/tests/packageFile.json') | from_json }}"
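With the parsed JSON in version_file, the download task can then loop over the templated list; a minimal sketch keeping the names from the question:
- name: Downloading the packages.
  get_url:
    url: "{{ item.downloadURL }}"
    dest: "{{ package_dir }}"
    mode: "0777"
  with_items: "{{ version_file.resources }}"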
In case you need to read JSON-formatted text and store it as a variable, it can also be handled by include_vars:
- hosts: localhost
  tasks:
    - include_vars:
        file: variable-file.json
        name: variable
    - debug: var=variable
For future visitors: if you are looking to read a remote json file, this won't work,
as Ansible lookups are executed on the local (control) machine.
You should use a module like slurp.
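A minimal sketch of that approach (the remote path here is hypothetical):
- name: Read the remote JSON file
  slurp:
    src: /var/opt/packageFile.json   # hypothetical path on the managed host
  register: remote_json

- name: Parse it into a variable
  set_fact:
    version_file: "{{ remote_json.content | b64decode | from_json }}"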
