I'm trying to run a playbook:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
where the task file looks like:
---
- name:
  stat: >
    path="{{ azure_subscription_authorization_configuration_file_dir }}"
  register: stat_dir_result
  tags:
    - azure
and the defaults main file looks like:
---
azure_subscription_authorization_configuration_file_dir: '~/.azure/'
The directory tree looks like:
├── hosts
├── playbooks
│   └── azure_authorization_playbook.yml
├── roles
│   ├── az_auth
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
Ansible version: 2.9.1
Ansible playbook command line snippet:
/> ansible-playbook "/Users/user/Dev/Ansible/playbooks/azure_authorization_playbook.yml"
Output:
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
ERROR! A malformed block was encountered while loading a block
I have no idea which block was encountered, or while loading which block. Can anyone tell me where the issue is? Thanks!
The error is clearly coming from your playbook, because it doesn't call any roles or load any other playbooks. That is, if I put this in a file:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
And try to run it, I get the same error. The issue is the entry in your tasks block. A task should be a dictionary, but you've provided only a string:
tasks:
  - azure_authorization_configuration
You include an example of a correctly written task in your question. If we put that into your playbook, it would look like:
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - name:
      stat: >
        path="{{ azure_subscription_authorization_configuration_file_dir }}"
      register: stat_dir_result
      tags:
        - azure
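As a side note, if the original intention was to apply the role itself from the tasks section, the dictionary-style way to do that is import_role; a minimal sketch, assuming the role name matches the role directory, which the question's tree shows as az_auth:

  tasks:
    - name: Apply the az_auth role as a task
      import_role:
        name: az_auth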
I got this error because I had a syntax error in my playbook. Note the use of colons (':') in your playbook.
OK, now I know how my playbook should look. It was:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
Should be:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  roles:
    - azure_authorization_configuration
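Note that in the directory tree from the question, roles/ sits next to playbooks/ rather than inside it, and the role directory is named az_auth. The name listed under roles: has to match a role directory that Ansible can find, so it may also be necessary to point Ansible at that roles folder, for example via roles_path in an ansible.cfg at the repository root (a sketch, assuming that layout):

[defaults]
roles_path = ./roles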
In my case the error was in the role: I missed a ':'.
The wrong code:
$ cat main.yml
---
# tasks file for db.local
- include pre_install.yml
- include my_sql.yml
The good code:
$ cat main.yml
---
# tasks file for db.local
- include: pre_install.yml
- include: my_sql.yml
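As a side note, the bare include directive has long been deprecated in favour of import_tasks / include_tasks, so on newer Ansible versions the same files would typically be pulled in like this (a sketch using the same file names):

---
# tasks file for db.local
- import_tasks: pre_install.yml
- import_tasks: my_sql.yml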
Related
I want to setup a server using Ansible. This is my file structure:
group_vars/
    all.yml
    development.yml
    production.yml
    vault/
        all.yml
        development.yml
        production.yml
playbooks/
    development.yml
    production.yml
roles/
    common/
        tasks/
            main.yml
        vars/
            main.yml
ansible.cfg
hosts
This is my ansible.cfg:
[defaults]
vault_password_file = ./vault_pass.txt
host_key_checking = False
inventory = ./hosts
The development.yml playbook:
- hosts: all
  name: Development Playbook
  become: true
  roles:
    - ../roles/common
  vars_files:
    - ../group_vars/development.yml
    - ../group_vars/all.yml
    - ../group_vars/vault/development.yml
    - ../group_vars/vault/all.yml
And the tasks/main.yml file of the common role:
# Set hostame
- name: Set hostname
  become: true
  ansible.builtin.hostname:
    name: "{{ server.hostname }}"

# Set timezone
- name: Set timezone
  become: true
  community.general.timezone:
    name: "{{ server.timezone }}"

# Update all packages
- name: Update all packages
  become: true
  ansible.builtin.apt:
    upgrade: dist
    update_cache: true
The group_vars/all.yml file looks like this:
server:
  hostname: "myhostname"
  timezone: "Europe/Berlin"
When running the playbook using ansible-playbook playbooks/development.yml, I get this error:
fatal: [default]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'hostname'. 'dict object' has no attribute 'hostname'\n\nThe error appears to be in '/ansible/roles/common/tasks/main.yml': line 6, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n# Set hostame\n- name: Set hostname\n ^ here\n"}
Can someone explain to me why the vars_files does not work and how to fix this?
Ansible automatically imports files and directories in group_vars that match the name of an active group. That is, if you are targeting the production group, and you have a file group_vars/production.yaml, the variables defined in this file will be imported automatically.
If instead of a file you have the directory group_vars/production/, then all files in that directory will be imported for hosts in the production group.
So your files in group_vars/vault/ will only be imported automatically for hosts in the vault hostgroup, which isn't the behavior you want.
Without knowing all the details about your deployment, I would suggest:
Create the directories group_vars/{all,development,production}.
Rename the group_vars/all.yml inventory file to group_vars/all/common.yml, and similarly for development.yml and production.yml (the name common.yml isn't special; you can use whatever name you want).
Rename group_vars/vault/all.yml to group_vars/all/vault.yml, and similarly for the other files.
This will give you the following layout:
group_vars/
├── all
│   ├── common.yml
│   └── vault.yml
├── development
│   ├── common.yml
│   └── vault.yml
└── production
    ├── common.yml
    └── vault.yml
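With this layout the vars_files entries can be dropped from the playbook, because Ansible loads the matching group_vars automatically for the groups a host belongs to; a sketch of the trimmed-down play, assuming the development and production groups exist in the inventory:

- hosts: all
  name: Development Playbook
  become: true
  roles:
    - ../roles/common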
An Ansible raw command works via SSH, but this playbook does not work with the same Cisco command, show version.
It is giving this error message, related to SSH:
fatal: [192.168.1.15]: FAILED! => {"changed": false, "msg": "Connection type ssh is not valid for this module"}
Here is the inventory, under inventory/host-file:
[routers]
192.168.1.15
[routers:vars]
ansible_network_os=ios
ansible_user=admin
ansible_password=admin
ansible_connection=network_cli
And the playbook, playbooks/show_version.yml:
---
- name: Cisco show version example
  hosts: routers
  gather_facts: false
  tasks:
    - name: run show version on the routers
      ios_command:
        commands: show version | incl Version
      register: output

    - name: print output
      debug:
        var: output.stdout_lines
Here is my file structure
.
├── ansible.cfg
├── hosts
├── inventory
│   └── host-file
└── playbooks
    └── show_version.yml
I run the playbook with the command below, from the playbooks folder:
ansible-playbook show_version.yml
Is this an SSH issue?
Can anyone share some experience?
According to your error message
Connection type ssh is not valid for this module
at least the variable ansible_connection wasn't set to the value it should have been. A minimal working example from a running environment:
---
- name: Cisco show version example
  hosts: routers
  gather_facts: false
  vars:  # for execution environment
    ansible_connection: ansible.netcommon.network_cli
    ansible_network_os: cisco.ios.ios
    ansible_become: yes
    ansible_become_method: enable

  tasks:
    - name: Gather only the config and default facts
      cisco.ios.ios_facts:
        gather_subset:
          - config

    - name: Show facts
      debug:
        msg: "{{ ansible_facts.net_version }}"
With respect to the given comments, maybe you can add a debug task to your playbook
- name: Show 'ansible_*' values
  debug:
    msg:
      - "{{ ansible_network_os }}"
      - "{{ ansible_connection }}"

- meta: end_play
to see what actually gets loaded.
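One more thing worth checking (an assumption based on the command line in the question, not something the error message proves): running ansible-playbook show_version.yml from inside the playbooks folder will not pick up inventory/host-file unless ansible.cfg points at it or it is passed explicitly, and in that case the [routers:vars] values, including ansible_connection, never reach the play. A sketch of an explicit invocation from the repository root:

ansible-playbook -i inventory/host-file playbooks/show_version.yml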
I'm trying to build an Ansible configuration with loosely coupled roles, but I'm struggling to find the best setup.
First of all, I created many roles that are as elementary as possible, and here is what my Ansible folder looks like:
.
├── group_vars
│   └── all
├── inventory.yml
├── playbook.yml
└── roles
    ├── app_1
    │   └── defaults
    ├── app_2
    │   └── defaults
    ├── postgres
    │   └── defaults
    ├── rabbitmq
    │   └── defaults
    └── redis
        └── defaults
inventory.yml
all:
  children:
    db:
      hosts:
        db.domain.com:
    app1:
      hosts:
        app1.domain.com:
    app2:
      hosts:
        app2.domain.com:
    cache:
      hosts:
        cache.domain.com:
playbook.yml
- name: "play db"
hosts: db
roles:
- postgres
- name: "play cache"
hosts: cache
roles:
- redis
- name: "play app1"
hosts: app1
roles:
- app_1
- rabbitmq
- name: "play app2"
hosts: app2
roles:
- app_2
- rabbitmq
The problem here is that I have no idea how different roles can share variables, because they are applied to different hosts. For example, app_1 and app_2 need variables defined in redis and postgres.
I have two solutions:
Define all variables in group_vars/all => the problem is that there are a lot of variables and my file will become too big, besides the duplication of variables (locally in the role + globally).
In each role I could say: if you need a variable from postgres, use hostvars from the group "db". But I think a role is not supposed to know anything about the hosts' configuration.
I really have no idea how to solve this problem and keep a clean configuration.
Thank you!
For the purpose of testing, any role needs to have its own variables, so you can test them individually.
Variables also have a scope and precedence; see: variable precedence.
So when you declare a variable at role scope, it will not be available to other roles. If you need a variable to be global, add it at group_vars scope, host_vars scope, play scope, or extra_vars scope (CLI). Either way, you will need to include it.
One way to reuse the variables from other roles or group_vars is to use vars_files to load them for the play you want.
For example, if your app1 hosts require variables defined in redis/defaults/main.yml:
- name: "play app1"
hosts: app1
vars_files:
- roles/redis/defaults/main.yml
roles:
- app_1
- rabbitmq
Or, a better option in my opinion would be to have the variables segregated into group_vars and load them the same way for other hosts.
- name: "play app2"
hosts: app2
vars_files:
- group_vars/db.yml
roles:
- app_2
- rabbitmq
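For illustration only (the file contents and variable names here are hypothetical, not taken from the question), group_vars/db.yml could then hold the connection details that both the db play and the app plays refer to:

# group_vars/db.yml (hypothetical variable names)
postgres_host: db.domain.com
postgres_port: 5432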
My playbook directory structure:
/ansible_repo/
└── playbooks/
    ├── playbooks1.yml
    ├── playbooks2.yml
    ├── somerole.yml   --> main playbook with roles
    └── roles/
        └── somerole
            ├── default
            │   └── main.yml
            ├── handler
            │   └── main.yml
            ├── tasks
            │   └── main.yml
            └── vars
                └── main.yml
playbooks1.yml :
---
- hosts: all
  tasks:
    - pause:
        minutes: 3

    - name: ping host
      win_ping:
somerole.yml :
---
- hosts: ci_host
  roles:
    - somerole
somerole\tasks\main.yml :
---
- include: playbooks/playbooks1.yml
When I run the role on some host:
ansible-playbook role-test.yml -vv --limit somehost
I get this error:
fatal: [somehost]: FAILED! =>
  reason: |-
    conflicting action statements: hosts, tasks
If I change the included file to just this, it passes:
- pause:
    minutes: 3

- name: ping host
  win_ping:
I'm trying to understand how to set hosts and tasks in both the role's tasks/main.yml and the playbook, and how to include the playbook in the role's tasks.
If there is a conflict, can I configure a host hierarchy?
The error indicates that you are including a playbook inside a role, and the hosts and tasks play keywords are not allowed inside a role.
As somerole.yml is your main playbook, you can invoke other playbooks and roles as necessary.
Example:
- name: run playbook playbook1
  import_playbook: playbooks/playbooks1.yml

- hosts: ci_host
  roles:
    - somerole

- name: run playbook playbook2
  import_playbook: playbooks/playbooks2.yml
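Alternatively, if the goal is to reuse those steps from inside the role rather than at the playbook level, the included file must contain only tasks (no hosts: or tasks: keywords), and the role's tasks/main.yml can pull it in with import_tasks; a sketch, assuming the steps are moved into a tasks-only file inside the role (the file name here is hypothetical):

---
# roles/somerole/tasks/main.yml
- import_tasks: playbooks1_tasks.yml  # tasks-only copy of playbooks1.yml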
ERROR! vars file vars not found on the Ansible Controller. If you are using a module and expect the file to exist on the remote, see the remote_src option.
---
# tasks file for user-management
- hosts: linux
  become: yes
  vars_files:
    - vars/active-users
    - vars/remove-users
  roles:
    - { role: user-management }
  tasks:
    - import_tasks: /tasks/user-accounts.yml
    - import_tasks: /tasks/authorized-key.yml
I'm trying to run main.yml on a server to execute against remote hosts (linux). The vars directory contains two vars files (active-users, remove-users).
Your vars folder should be at the same level as your playbook.
If it is, could it be that you missed the .yml extension on your active-users and remove-users files?
Check the relative path this way:
vars_files:
  - ../vars/active-users.yaml
  - ../vars/remove-users.yaml
I was stuck at the same error; the issue I found was that the vars_files location defined in my playbook was incorrect.
Steps I followed:
1> Run: find / -name '<vars_files>.yaml'   # this searches for your <vars_files>.yaml starting from the root directory; note the location of the vars file.
2> Make sure the location of the vars files in the playbook matches what you found in step 1.
For instance:
- hosts: localhost
  connection: local
  gather_facts: no
  become: yes
  vars_files:
    - /root/secrets.yaml
@MoYaMoS, thanks! This was the resolution for me. It wasn't obvious at first, as I have a top-level play calling two plays in the plays/ directory. The domain-join play is the one that uses group_vars in this case.
├── group_vars
│   └── linux
├── plays
│   ├── create-vm.yml
│   └── domain-join.yml
└── provision.yml
I just specified the relative path in the domain-join.yml as you mentioned. Works perfectly.
vars_files:
  - ../group_vars/linux