Getting Ansible example (jboss-standalone) to work with Vagrant

I need some assistance with getting https://github.com/ansible/ansible-examples.git / jboss-standalone to work with Vagrant. I think I am making a mistake somewhere in my Vagrant configuration.
My Vagrantfile is here:
VAGRANTFILE_API_VERSION = "2"

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "chef/centos-6.6"
  config.vm.network "forwarded_port", guest: 80, host: 8080
  config.vm.hostname = "webserver1"

  config.vm.provision :ansible do |ansible|
    ansible.playbook = "site.yml"
    ansible.verbose = "vvvv"
    ansible.inventory_path = "/Users/miledavenport/vagrant-ansible/jboss-standalone/hosts"
  end
end
My hosts file is here:
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222
[jboss-servers]
webserver1
[webserver1]
127.0.0.1 ansible_connection=local
[localhost]
127.0.0.1
I am fairly new to Ansible and want to "play" with it by using Vagrant.
"vagrant up" produces the following error:
TASK: [jboss-standalone | Install Java 1.7 and some basic dependencies] *******
FATAL: no hosts matched or all hosts have already failed -- aborting
"vagrant ssh" works OK.
site.yml is:
---
# This playbook deploys a simple standalone JBoss server.
- hosts: jboss-servers
  user: root
  roles:
    - jboss-standalone
I don't understand why I am getting the error:
FATAL: no hosts matched
The hosts file contains webserver1, which is the same as the hostname in the Vagrantfile.
Can someone please help me resolve this error?
Thanks :)
Miles.

Maybe your intent is to create a parent group called jboss-servers, with a subgroup called webserver1.
Try changing [jboss-servers] to [jboss-servers:children].
This will make the group jboss-servers also contain 127.0.0.1 as its host, and your playbook should run. See the Ansible inventory documentation on groups of groups.
At the moment, since webserver1 does not have an ansible_ssh_host=<ip> variable associated with it, it is just a hostname without an IP to connect to. Make it a subgroup of jboss-servers only if you don't have webserver1 mapped to some IP in your /etc/hosts file or similar :)
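Putting that together, the inventory from the question would look something like this (host and group names taken from the question itself):

```ini
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222

[jboss-servers:children]
webserver1

[webserver1]
127.0.0.1 ansible_connection=local
```

With this layout, `hosts: jboss-servers` in site.yml resolves through the webserver1 subgroup to 127.0.0.1.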

Related

How to render a YAML list in a Vagrantfile

I have a Vagrantfile which is referencing a YAML file to make multi-host configuration easier.
It's mostly working, but I'm using the Ansible provisioner and I need to reference a list/array for the ansible.groups item.
The YAML looks like this:
hosts:
  - host:
    provision:
      ansible:
        enable: yes
        playbook_path: webserver.yml
        groups:
          - web
          - vagrant
          - live
I'm trying to reference it in the Vagrantfile using:
if host['provision']['ansible']['enable'] == true
  vmhost.vm.provision "ansible" do |ansible|
    ansible.verbose = "vv"
    ansible.config_file = "./ansible.cfg"
    ansible.playbook = host['provision']['ansible']['playbook_path']
    ansible.tags = host['provision']['ansible']['tags']
    ansible.groups = host['provision']['ansible']['groups']
  end
end
But this gives me this error when building the actual VMs:
undefined method `each_pair' for ["web", "vagrant", "dev"]:Array
I searched and haven't found anything that addresses ansible_groups specifically, though I have seen various patterns for reading lists/arrays in the Vagrantfile. Any ideas?
The Ansible groups value is not supposed to be an array but a hash, mapping group names to machine names. It should look like:
hosts:
  - host:
    provision:
      ansible:
        enable: yes
        playbook_path: webserver.yml
        groups:
          web:
            - machine1
            - machine2
          vagrant:
            - machine3
            - machine4
          live:
            - machine5
See the documentation for more details on how this can be used.
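Alternatively, if you would rather keep the flat list of group names in the YAML, you can build the hash Vagrant expects inside the Vagrantfile. A minimal sketch (the group and machine names are hypothetical):

```ruby
# Build the Hash that Vagrant's ansible.groups option expects from a flat
# list of group names attached to a single machine.
group_names = ["web", "vagrant", "live"]
machine     = "webserver1"

groups = group_names.each_with_object({}) do |name, hash|
  (hash[name] ||= []) << machine  # each group maps to an array of machines
end
# groups == {"web"=>["webserver1"], "vagrant"=>["webserver1"], "live"=>["webserver1"]}
```

You can then assign `ansible.groups = groups` even though the YAML stores a plain list per host.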

Ansible called by Vagrant does not prompt for vault password

Summary
I have a Vagrantfile provisioning a Virtualbox VM with Ansible. The Ansible playbook contains an Ansible Vault-encrypted variable. My problem is that Vagrant provisioning does not prompt for the password although I pass the option to do so.
Minimal, complete example
Vagrantfile:
Vagrant.configure(2) do |config|
  config.vm.provider "virtualbox" do |vb|
    # Build a master VM for this box and clone it for individual VMs
    vb.linked_clone = true
  end

  config.vm.box = "bento/ubuntu-16.04"
  config.vm.hostname = "test-vm"

  config.vm.provision :ansible do |ansible|
    ansible.verbose = true
    ansible.playbook = "playbook.yml"
    ansible.ask_vault_pass = true
    # ansible.raw_arguments = "--ask-vault-pass"
    # ansible.raw_arguments = ["--vault-id", "@prompt"]
    # ansible.raw_arguments = ["--vault-id", "dev@prompt"]
  end
end
playbook.yml:
---
- name: Test
  hosts: all
  vars:
    foo: !vault |
      $ANSIBLE_VAULT;1.1;AES256
      65306264626234353434613262613835353463346435343735396138336362643535656233393466
      6331393337353837653239616331373463313665396431390a313338333735346237363435323066
      66323435333331616639366536376639626636373038663233623861653363326431353764623665
      3663636162366437650a383435666537626564393866643461393739393434346439346530336364
      3639
  tasks:
    - name: print foo's value
      debug:
        msg: "foo -> {{ foo }}"
The Ansible Vault password is abc.
When I call vagrant up on the first execution of the Vagrantfile, or vagrant provision later, I do not get the expected prompt to enter the password. Instead the task print foo's value prints the (red) message:
fatal: [default]: FAILED! => {"msg": "Attempting to decrypt but no vault secrets found"}
I also tried the commented-out alternatives in the Vagrantfile to make Ansible prompt for the password. I can see all of them in the ansible-playbook call printed by Vagrant.
Additionally, I tried several options when encrypting foo with ansible-vault encrypt_string, which did not help either.
What can I do to make Ansible prompt for the password when called with Vagrant?
Versions
kubuntu 16.04
Vagrant 1.8.1 and Vagrant 2.0.0
Ansible 2.4.0.0
Update
This is the Ansible call as printed by Vagrant:
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --ask-vault-pass --limit="default" --inventory-file=/opt/vagrantVM/.vagrant/provisioners/ansible/inventory -v playbook.yml
If I execute this directly without Vagrant the password prompt works as expected! So it must be Vagrant, which somehow suppresses the prompt.
In Ansible 2.4.0.0, the Vault password prompt (i.e. --ask-vault-pass) is skipped when no tty is present (the getpass.getpass function is never executed).
With the Ansible 2.4.0.0 prompt implementation, the Vagrant provisioner integration doesn't receive an interactive prompt.
Note that the --ask-pass and --ask-become-pass implementations haven't changed (i.e. there is no mechanism to skip the getpass function) and still work fine with Vagrant.
I plan to report the issue upstream to the Ansible project, but for the moment you can resolve the situation by downgrading to Ansible 2.3 (or by using the vault_password_file provisioner option instead).
References:
https://github.com/ansible/ansible/pull/22756/files#diff-bdd6c847fae8976ab8a7259d0b583f34L176
https://github.com/hashicorp/vagrant/issues/2924
See the related Bug Fix Pull Request (#31493) in Ansible project
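As a sketch of the vault_password_file workaround, the provisioner block would change roughly like this (".vault_pass" is an arbitrary local file containing the password; keep it out of version control):

```ruby
# Vagrantfile fragment: read the Vault password from a local file
# instead of prompting interactively.
config.vm.provision :ansible do |ansible|
  ansible.verbose = true
  ansible.playbook = "playbook.yml"
  ansible.vault_password_file = ".vault_pass"
end
```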

Using Ansible hosts raises `--limit` does not match any hosts

My Vagrantfile looks like:
Vagrant.configure("2") do |config|
  config.vm.box = "vag-box"
  config.vm.box_url = "boxes/base.box"
  config.vm.network :private_network, ip: "192.168.100.100"

  config.vm.provision :setup, type: :ansible_local do |ansible|
    ansible.playbook = "playbook.yml"
    ansible.provisioning_path = "/vagrant"
    ansible.inventory_path = "/vagrant/hosts"
  end
end
My playbook file looks like:
---
- name: Setup system
  hosts: localhost
  become: true
  become_user: root
  roles:
    - { role: role1 }
    - { role: role2 }
My hosts file looks like:
[localhost]
localhost # 192.168.100.100
During ansible execution I get the following error:
ERROR! Specified --limit does not match any hosts
First: "localhost" is a name that is assigned to the 127.0.0.1 address by convention. This refers to the loopback address of the local machine. I don't think this is what you are trying to use it for, based on the comment in your hosts file.
Second: The Ansible provisioner in Vagrant usually creates a custom inventory file, with the required contents to provision the Vagrant box. For example:
# Generated by Vagrant
myvagrantbox ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user='vagrant' ansible_ssh_private_key_file='/home/sage/ansible/vagrant/myvagrantbox/.vagrant/machines/myvagrantbox/virtualbox/private_key'
As you are overriding the inventory file, you will need to specify a similar line for the hostname of your vagrant box.
If you omit the ansible.inventory_path = "/vagrant/hosts" line from your config, it should JustWork(tm). You might also wish to specify config.vm.hostname = "myvagrantboxname" in your configuration, so you know what hostname will be used.
See the Using Vagrant and Ansible and the Vagrant -- Ansible Provisioner documentation for more details.
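Dropping the custom inventory, the provisioner block from the question reduces to a sketch like this (assuming the playbook targets hosts: all so it matches whatever machine name Vagrant generates):

```ruby
# Vagrantfile fragment: no inventory_path, so Vagrant generates an
# inventory that matches the VM it creates.
config.vm.hostname = "myvagrantboxname"

config.vm.provision :setup, type: :ansible_local do |ansible|
  ansible.playbook = "playbook.yml"
  ansible.provisioning_path = "/vagrant"
end
```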

Ansible & Vagrant - give args to ansible provision

A colleague of mine wrote a script to automate Vagrant installations, including Ansible scripts. So if I run ansible provision, the playbook ansible/playbooks/provision.yml is run on the vagrant machine(s).
The downside of this script is that the Ansible playbook can only be deployed on the machine with ansible provision.
Now, as I'm writing code and working, I am noticing the downsides, because normally I can give ansible-playbook parameters / arguments, such as ansible-playbook -i inventory provision.yml -vvv --tags "test". But this is not possible here because of an architectural problem.
So, instead of solving the real problem (which I am trying to avoid), are there any gurus out there who can point me in the right direction, to make it possible to give ansible provision arguments? E.g. ansible provision -vvv.
I looked at https://www.vagrantup.com/docs/cli/provision.html but without help.
Thanks.
Not completely sure I have understood correctly, but maybe this config (from one of my projects) in the Vagrantfile could help:
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "ansible/playbook.yml"
  ansible.limit = 'all'
  ansible.tags = 'local'
  ansible.sudo = true
  ansible.verbose = 'v'
  ansible.groups = {
    "db" => ["db"],
    "app" => ["app"],
    "myproject" => ["myproject"],
    "fourth" => ["fourth"],
    "local:children" => ["db", "app", "myproject", "fourth"]
  }
end
In this Vagrantfile, I configured 4 Vagrant VMs.
The generated vagrant_ansible_inventory looks like this:
# Generated by Vagrant
db ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_private_key_file=/home/user/.vagrant.d/insecure_private_key
app ansible_ssh_host=127.0.0.1 ansible_ssh_port=2200 ansible_ssh_private_key_file=/home/user/.vagrant.d/insecure_private_key
myproject ansible_ssh_host=127.0.0.1 ansible_ssh_port=2201 ansible_ssh_private_key_file=/home/user/.vagrant.d/insecure_private_key
fourth ansible_ssh_host=127.0.0.1 ansible_ssh_port=2202 ansible_ssh_private_key_file=/home/user/.vagrant.d/insecure_private_key
[db]
db
[app]
app
[myproject]
myproject
[fourth]
fourth
[local:children]
db
app
myproject
fourth
https://www.vagrantup.com/docs/provisioning/ansible_local.html
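The vagrant provision command itself does not appear to accept extra ansible-playbook flags, but a common workaround is to read environment variables in the Vagrantfile; a sketch (the variable names ANSIBLE_TAGS and ANSIBLE_VERBOSE are made up for this example):

```ruby
# Vagrantfile fragment: pass-through knobs for the Ansible provisioner,
# e.g. `ANSIBLE_TAGS=test ANSIBLE_VERBOSE=vvv vagrant provision`
config.vm.provision "ansible" do |ansible|
  ansible.playbook = "ansible/playbooks/provision.yml"
  ansible.tags = ENV['ANSIBLE_TAGS'] if ENV['ANSIBLE_TAGS']
  ansible.verbose = ENV['ANSIBLE_VERBOSE'] if ENV['ANSIBLE_VERBOSE']
end
```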

vagrant with ansible error

If I have Vagrantfile with ansible provision:
Vagrant.configure(2) do |config|
  config.vm.box = 'hashicorp/precise32'
  config.vm.network "forwarded_port", guest: 80, host: 8080

  config.vm.provision :ansible do |ansible|
    ansible.playbook = "playbook.yml"
    ansible.inventory_path = "hosts"
    ansible.limit = 'all'
    ansible.sudo = true
  end
end
My hosts file is very simple:
[local]
web ansible_connection=local
and playbook.yml is:
---
- hosts: local
  sudo: true
  remote_user: vagrant
  tasks:
    - name: update apt cache
      apt: update_cache=yes
    - name: install apache
      apt: name=apache2 state=present
When I start the machine with vagrant up I get the error:
failed: [web] => {"failed": true, "parsed": false}
[sudo via ansible, key=daxgehmwoinwalgbzunaiovnrpajwbmj] password:
What's the problem?
The error is occurring because Ansible assumes key-based SSH authentication, but the VM that Vagrant creates uses password-based authentication.
One way you can solve this issue is to run your ansible playbook as:
ansible-playbook playbook.yml --ask-pass
This tells Ansible not to assume key-based authentication, and instead to use password-based SSH authentication, asking for the password before execution.
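If you want the same behaviour when provisioning through Vagrant rather than calling ansible-playbook by hand, the flags can be passed via the provisioner's raw_arguments option; a sketch (on this Ansible era, --ask-sudo-pass is the sudo-password counterpart):

```ruby
# Vagrantfile fragment: ask for the SSH and sudo passwords instead of
# assuming key-based authentication.
config.vm.provision :ansible do |ansible|
  ansible.playbook = "playbook.yml"
  ansible.inventory_path = "hosts"
  ansible.limit = 'all'
  ansible.raw_arguments = ["--ask-pass", "--ask-sudo-pass"]
end
```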
