ERROR! vars file vars not found on the Ansible Controller

ERROR! vars file vars not found on the Ansible Controller. If you are using a module and expect the file to exist on the remote, see the remote_src option.
---
# tasks file for user-management
- hosts: linux
  become: yes
  vars_files:
    - vars/active-users
    - vars/remove-users
  roles:
    - { role: user-management }
  tasks:
    - import_tasks: /tasks/user-accounts.yml
    - import_tasks: /tasks/authorized-key.yml
I am trying to run main.yml on a server to execute against remote hosts (the linux group). The vars directory contains two variable files (active-users, remove-users).

Your vars folder should be at the same level as your playbook.
If it is, check whether you are missing the .yml extension on active-users and remove-users.

Check the relative path this way:
vars_files:
  - ../vars/active-users.yaml
  - ../vars/remove-users.yaml

I was stuck at the same error; the issue I found was that the vars_files location defined in my playbook was incorrect.
Steps I followed:
1. Run find / -name "<vars_files>.yaml" — this searches for your vars file starting from the root directory, so you can check its actual location.
2. Make sure the location of the vars files in the playbook matches what step 1 found.
For instance:
- hosts: localhost
  connection: local
  gather_facts: no
  become: yes
  vars_files:
    - /root/secrets.yaml

@MoYaMoS, thanks! This was the resolution for me. It wasn't obvious at first, as I have a top-level play calling two plays in the plays/ directory. The domain-join play is the one that loads group_vars in this case.
├── group_vars
│   └── linux
├── plays
│   ├── create-vm.yml
│   └── domain-join.yml
└── provision.yml
I just specified the relative path in the domain-join.yml as you mentioned. Works perfectly.
vars_files:
  - ../group_vars/linux

Related

Why does including var files using vars_files not work in Ansible?

I want to setup a server using Ansible. This is my file structure:
group_vars/
  all.yml
  development.yml
  production.yml
  vault/
    all.yml
    development.yml
    production.yml
playbooks/
  development.yml
  production.yml
roles/
  common/
    tasks/
      main.yml
    vars/
      main.yml
ansible.cfg
hosts
This is my ansible.cfg:
[defaults]
vault_password_file = ./vault_pass.txt
host_key_checking = False
inventory = ./hosts
The development.yml playbook:
- hosts: all
  name: Development Playbook
  become: true
  roles:
    - ../roles/common
  vars_files:
    - ../group_vars/development.yml
    - ../group_vars/all.yml
    - ../group_vars/vault/development.yml
    - ../group_vars/vault/all.yml
And the tasks/main.yml file of the common role:
# Set hostame
- name: Set hostname
  become: true
  ansible.builtin.hostname:
    name: "{{ server.hostname }}"

# Set timezone
- name: Set timezone
  become: true
  community.general.timezone:
    name: "{{ server.timezone }}"

# Update all packages
- name: Update all packages
  become: true
  ansible.builtin.apt:
    upgrade: dist
    update_cache: true
The group_vars/all.yml file looks like this:
server:
  hostname: "myhostname"
  timezone: "Europe/Berlin"
When running the playbook using ansible-playbook playbooks/development.yml, I get this error:
fatal: [default]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'hostname'. 'dict object' has no attribute 'hostname'\n\nThe error appears to be in '/ansible/roles/common/tasks/main.yml': line 6, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n# Set hostame\n- name: Set hostname\n ^ here\n"}
Can someone explain to me why the vars_files does not work and how to fix this?
Ansible automatically imports files and directories in group_vars that match the name of an active group. That is, if you are targeting the production group, and you have a file group_vars/production.yaml, the variables defined in this file will be imported automatically.
If instead of a file you have the directory group_vars/production/, then all files in that directory will be imported for hosts in the production group.
So your files in group_vars/vault/ will only be imported automatically for hosts in the vault hostgroup, which isn't the behavior you want.
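For instance, with a minimal INI inventory (the hostname is a placeholder), any host listed under the production group automatically picks up the variables from group_vars/production.yml:

```
# hosts — INI inventory; web1.example.com is hypothetical
[production]
web1.example.com
```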
Without knowing all the details about your deployment, I would suggest:
Create directories group_vars/{all,development,production}.
Rename group_vars/all.yml to group_vars/all/common.yml, and similarly for development.yml and production.yml (the name common.yml isn't special; you can use whatever name you want).
Rename group_vars/vault/all.yml to group_vars/all/vault.yml, and similarly for the other files.
This will give you the following layout:
group_vars/
├── all
│   ├── common.yml
│   └── vault.yml
├── development
│   ├── common.yml
│   └── vault.yml
└── production
    ├── common.yml
    └── vault.yml

gitlab-runner: "local" ansible role not found

The Ansible docs say:
If Ansible were to load ansible.cfg from a world-writable current working directory, it would create a serious security risk.
That makes sense, but it causes a problem in the CI pipeline for my project:
.
├── group_vars
├── host_vars
├── playbooks
├── resources
├── roles
│   ├── bootstrap
│   └── networking
├── ansible.cfg
├── inventory.yml
├── requirements.yml
├── site.yml
└── vault.yml
I have two "local" roles which are checked into source control under ./roles of the Ansible project, but the roles are not found when I run ansible-playbook --syntax-check site.yml:
$ ansible-playbook --syntax-check site.yml
[WARNING] Ansible is being run in a world writable directory (/builds/papanito/infrastructure), ignoring it as an ansible.cfg source. For more information see https://docs.ansible.com/ansible/devel/reference_appendices/config.html#cfg-in-world-writable-dir
[WARNING]: provided hosts list is empty, only localhost is available. Note
that the implicit localhost does not match 'all'
ERROR! the role 'networking' was not found in /builds/papanito/infrastructure/playbooks/roles:/root/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles:/builds/papanito/infrastructure/playbooks
The error appears to have been in '/builds/papanito/infrastructure/playbooks/networking.yml': line 14, column 7, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
roles:
- { role: networking, become: true }
^ here
ERROR: Job failed: exit code 1
Obviously, that's because roles are searched for in:
A roles/ directory, relative to the playbook file.
Thus my ansible.cfg is defined to look in ./roles:
# additional paths to search for roles in, colon separated
roles_path = ./roles
So, based on the Ansible docs, I can use the environment variable ANSIBLE_CONFIG, which I do as follows in gitlab-ci.yml:
variables:
  SITE: "site.yml"
  PLAYBOOKS: "playbooks/**/*.yml"
  ANSIBLE_CONIG: "./ansible.cfg"

stages:
  - verify

before_script:
  .....

ansible-verify:
  stage: verify
  script:
    - ansible-lint -v $SITE
    - ansible-lint -v $PLAYBOOKS
    - ansible-playbook --syntax-check $SITE
    - ansible-playbook --syntax-check $PLAYBOOKS
But I still get the error above. What am I missing?
site.yml
- import_playbook: playbooks/networking.yml
- import_playbook: playbooks/monitoring.yml
playbooks/networking.yml
- name: Setup default networking
  hosts: all
  roles:
    - { role: networking, become: true }
    - { role: oefenweb.fail2ban, become: true }
I know the topic is old, but you have a typo in your config file. You are missing an F in ANSIBLE_CONFIG, so write this instead:
variables:
  SITE: "site.yml"
  PLAYBOOKS: "playbooks/**/*.yml"
  ANSIBLE_CONFIG: "./ansible.cfg"
By the way, spotting this also solved my own problem.
Looks like a hierarchy setup issue: there are no tasks associated with the roles bootstrap and networking; instead, the playbooks live in a separate folder called playbooks.
Refer to the directory layout guide: https://docs.ansible.com/ansible/latest/user_guide/playbooks_best_practices.html
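For reference, a sketch of the conventional layout this answer suggests, in which the role would be found (file names taken from the question; the role's task content is assumed):

```
.
├── site.yml
├── playbooks
│   └── networking.yml      # play that references the role
└── roles
    └── networking
        └── tasks
            └── main.yml    # the role's tasks must live here
```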

Ansible copy module ignores role files directory

I'm having the following issue and am not sure if it's a bug or my setup is wrong. I've created a role ssh with the following structure:
.
└── roles
    └── ssh
        ├── files
        │   └── sshd_config
        └── tasks
            └── main.yml
The main.yml file looks like this:
---
- hosts: all
  tasks:
    - name: "Set sshd configuration"
      copy:
        src: sshd_config
        dest: /etc/ssh/sshd_config
Because sshd_config is stored in the recommended files directory, I expected the copy command to automatically fetch that file when referencing it from the task.
Instead, Ansible looks for sshd_config in the following directories:
ansible.errors.AnsibleFileNotFound: Could not find or access 'sshd_config'
Searched in:
    <redacted>/roles/ssh/tasks/files/sshd_config
    <redacted>/roles/ssh/tasks/sshd_config
    <redacted>/roles/ssh/tasks/files/sshd_config
    <redacted>/roles/ssh/tasks/sshd_config on the Ansible Controller
Notice it does look in a files directory, but does so in the tasks folder!
My main goal is to send a local file (on my host machine) to the remote server.
I run the playbook with the following command:
ansible-playbook -i hosts ./roles/ssh/tasks/main.yml -vvv
Questions:
Is my assumption right that Ansible should look for the file in the files directory adjacent to the tasks directory?
Did I mess up my setup?
I think you confused roles with playbooks. You created a playbook in the place where the role should be. You should instead create a role, and then create a playbook (outside the /roles dir) that uses it.
Here's example /roles/ssh/tasks/main.yml:
- name: "Set sshd configuration"
  copy:
    src: sshd_config
    dest: /etc/ssh/sshd_config
and playbook using ssh role:
---
- hosts: all
  tasks:
    - import_role:
        name: ssh
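With that split, the resulting layout would look roughly like this (the playbook filename is assumed); the copy module then searches the role's own files/ directory first, so src: sshd_config resolves without a path:

```
.
├── playbook.yml        # the thin playbook above (name assumed)
├── hosts
└── roles
    └── ssh
        ├── files
        │   └── sshd_config
        └── tasks
            └── main.yml
```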

Ansible - malformed block was encountered while loading a block

I'm trying to run a playbook:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
where the task looks like:
---
- name:
  stat: >
    path="{{ azure_subscription_authorization_configuration_file_dir }}"
  register: stat_dir_result
  tags:
    - azure
and the defaults main file looks like:
---
azure_subscription_authorization_configuration_file_dir: '~/.azure/'
The directory tree looks like:
├── hosts
├── playbooks
│   └── azure_authorization_playbook.yml
├── roles
│   ├── az_auth
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
Ansible version: 2.9.1
Ansible playbook command line snippet:
/> ansible-playbook "/Users/user/Dev/Ansible/playbooks/azure_authorization_playbook.yml"
Output:
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
ERROR! A malformed block was encountered while loading a block
I have no idea which block was encountered while loading which block; can anyone tell me where the issue is? Thanks!
The error is clearly coming from your playbook, because it doesn't call any roles or load any other playbooks. That is, if I put this in a file:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
And try to run it, I get the same error. The issue is the entry in your tasks block. A task should be a dictionary, but you've provided only a string:
tasks:
  - azure_authorization_configuration
You include an example of a correctly written task in your question. If we put that into your playbook, it would look like:
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - name:
      stat: >
        path="{{ azure_subscription_authorization_configuration_file_dir }}"
      register: stat_dir_result
      tags:
        - azure
I got this error because I had a syntax error in my playbook. Note the use of colons (':') in your playbook.
OK, now I know what my playbook should look like. It was:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  tasks:
    - azure_authorization_configuration
Should be:
---
- name: azure authorization
  hosts: localhost
  become: yes
  gather_facts: true
  roles:
    - azure_authorization_configuration
In my case it was an error in the role: I missed a ":".
The wrong code:
$ cat main.yml
---
# tasks file for db.local
- include pre_install.yml
- include my_sql.yml
The good code:
$ cat main.yml
---
# tasks file for db.local
- include: pre_install.yml
- include: my_sql.yml
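As a side note, on current Ansible the bare include directive is itself deprecated (it was removed in ansible-core 2.16); import_tasks or include_tasks is the replacement:

```yaml
---
# tasks file for db.local — modern equivalent of the include lines above
- import_tasks: pre_install.yml
- import_tasks: my_sql.yml
```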

Multiple environment deployment

I know that you can change between different inventory files using the -i flag which can be used to switch between different hosts.
In my case, the hosts to be acted upon change between deployments, so I take the hosts in as --extra-vars and use delegate_to to deploy to that host (see below for more details).
I was hoping for a way to switch between files containing environment variables in a similar fashion. For example, lets say I have the following simplified directory structure:
/etc/ansible/
├── ansible.cfg
├── hosts
└── project/
    └── environments/
        ├── dev/
        │   └── vars.yml
        └── prd/
            └── vars.yml
The structure of vars.yml in both environments would be exactly the same, just with the variables having different values due to the differences between environments.
I've found a few places that talk about doing something similar such as these:
https://rock-it.pl/managing-multiple-environments-with-ansible-best-practices/
http://rosstuck.com/multistage-environments-with-ansible
http://www.geedew.com/setting-up-ansible-for-multiple-environment-deployments/
In those guides, they act against statically declared hosts. One thing that seems to help is the group_vars directories: a file there matching an inventory group name is picked up automatically, and its variables presumably apply when the hosts: directive of a play targets host(s) listed under that inventory group header.
However, since I dynamically read in the servers we're acting against via the CLI flag --extra-vars, I can't take that approach, because I will always have something like this in my plays:
...
- hosts: localhost
  tasks:
    ...
    - name: do something
      ...
      delegate_to: "{{ item }}"
      with_items: "{{ given_hosts }}"
Or I run a task first that takes the servers and adds them to a new group, like this:
- name: Extract Hosts
  hosts: localhost
  tasks:
    - name: Adding given hosts to new group...
      add_host:
        name: "{{ item }}"
        groups: some_group
      with_items:
        - "{{ list_of_hosts | default([]) }}"
and then uses the dynamically created group:
- name: Restart Tomcat for Changes to Take Effect
  hosts: some_group
  tasks:
    - name: Restarting Tomcat...
      service:
        name: tomcat
        state: restarted
So I need to find a way to specify which vars.yml to use. Because I use Jenkins to kick off the Ansible playbook via CLI over SSH, I was hoping for something like the following:
ansible-playbook /path/to/some/playbook.yml --include-vars /etc/ansible/project/dev/vars.yml
At the least, how would I explicitly include a vars.yml file in a playbook to use the variables defined within?
You can use:
extra vars with @: --extra-vars @/etc/ansible/project/dev/vars.yml
or
include_vars:
- include_vars: "/etc/ansible/project/{{ some_env }}/vars.yml"
to load different variables depending on your environment.
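Putting it together, a sketch of selecting the environment at invocation time (some_env and the playbook name are assumed, and the path follows the environments/ layout from the question):

```yaml
# playbook.yml — loads the environment-specific vars file chosen on the CLI
- hosts: localhost
  tasks:
    - include_vars: "/etc/ansible/project/environments/{{ some_env | default('dev') }}/vars.yml"
```

Invoked as, for example: ansible-playbook playbook.yml --extra-vars "some_env=prd".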