How to use Ansible with Vagrant locally?

I followed the official tutorial from: http://docs.ansible.com/ansible/guide_vagrant.html
My Vagrantfile looks like:
VAGRANTFILE_API_VERSION = '2'

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = 'bento/centos-7.2'

  # SSH default root user
  config.ssh.username = 'root'
  config.ssh.password = 'vagrant'

  # Networking
  # config.vm.network :private_network, ip: '192.168.33.16'

  # Provisioning
  config.vm.provision :ansible do |ansible|
    ansible.playbook = 'playbooks/main.yml'
  end
end
And my test.sh script:
ansible-playbook \
  --private-key=.vagrant/machines/default/virtualbox/private_key \
  -u vagrant \
  -i .vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory \
  playbooks/main.yml \
  $@
But when I run the script, I receive the following error:
fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh.", "unreachable": true}
to retry, use: --limit @playbooks/main.retry
PLAY RECAP *********************************************************************
default : ok=0 changed=0 unreachable=1 failed=0
How to test ansible-playbook with Vagrant?

If you want to run Ansible against the guest this way, note that Vagrant already has an ansible_local provisioner that runs the playbook from inside the guest.
To use it you first have to install Ansible on the guest machine if it is not already installed; this can be done from a Vagrantfile shell provisioner with easy_install or pip.
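A minimal sketch of that approach, reusing the box and playbook path from the question (the inline pip command is only one way to get Ansible onto the guest and assumes pip is available there):
Vagrant.configure('2') do |config|
  config.vm.box = 'bento/centos-7.2'

  # Make sure Ansible exists on the guest (assumes pip is already installed)
  config.vm.provision :shell, inline: 'command -v ansible-playbook || pip install ansible'

  # Run the playbook from inside the guest instead of from the host
  config.vm.provision :ansible_local do |ansible|
    ansible.playbook = 'playbooks/main.yml'
  end
end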

Related

Ansible called by Vagrant does not prompt for vault password

Summary
I have a Vagrantfile provisioning a VirtualBox VM with Ansible. The Ansible playbook contains an Ansible Vault-encrypted variable. My problem is that Vagrant provisioning does not prompt for the password although I pass the option to do so.
Minimal, complete example
Vagrantfile:
Vagrant.configure(2) do |config|
  config.vm.provider "virtualbox" do |vb|
    # Build a master VM for this box and clone it for individual VMs
    vb.linked_clone = true
  end

  config.vm.box = "bento/ubuntu-16.04"
  config.vm.hostname = "test-vm"

  config.vm.provision :ansible do |ansible|
    ansible.verbose = true
    ansible.playbook = "playbook.yml"
    ansible.ask_vault_pass = true
    # ansible.raw_arguments = --ask-vault-pass
    # ansible.raw_arguments = ["--vault-id", "@prompt"]
    # ansible.raw_arguments = ["--vault-id", "dev@prompt"]
  end
end
playbook.yml:
---
- name: Test
  hosts: all
  vars:
    foo: !vault |
      $ANSIBLE_VAULT;1.1;AES256
      65306264626234353434613262613835353463346435343735396138336362643535656233393466
      6331393337353837653239616331373463313665396431390a313338333735346237363435323066
      66323435333331616639366536376639626636373038663233623861653363326431353764623665
      3663636162366437650a383435666537626564393866643461393739393434346439346530336364
      3639
  tasks:
    - name: print foo's value
      debug:
        msg: "foo -> {{ foo }}"
The Ansible Vault password is abc.
When I call vagrant up for the first run of the Vagrantfile, or vagrant provision later, I do not get the expected prompt to enter the password. Instead the task print foo's value prints the (red) message:
fatal: [default]: FAILED! => {"msg": "Attempting to decrypt but no vault secrets found"}
I also tried the commented-out alternatives in the Vagrantfile to make Ansible prompt for the password. I can see all of them in the ansible-playbook call printed by Vagrant.
Additionally, I tried several options when encrypting foo with ansible-vault encrypt_string, which did not help either.
What can I do to make Ansible prompt for the password when called with Vagrant?
Versions
Kubuntu 16.04
Vagrant 1.8.1 and Vagrant 2.0.0
Ansible 2.4.0.0
Update
This is the Ansible call as printed by Vagrant:
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --ask-vault-pass --limit="default" --inventory-file=/opt/vagrantVM/.vagrant/provisioners/ansible/inventory -v playbook.yml
If I execute this directly without Vagrant the password prompt works as expected! So it must be Vagrant, which somehow suppresses the prompt.
In Ansible 2.4.0.0, the Vault password prompt (i.e. --ask-vault-pass) is skipped when no tty is present (the getpass.getpass function is never executed).
With the Ansible 2.4.0.0 prompt implementation, the Vagrant provisioner integration doesn't receive an interactive prompt.
Note that the --ask-pass and --ask-become-pass implementations haven't changed (i.e. there is no mechanism to skip the getpass call) and they still work fine with Vagrant.
I plan to report the issue upstream to the Ansible project, but for the moment you can resolve the situation by downgrading to Ansible 2.3 (or by using the vault_password_file provisioner option instead).
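For the vault_password_file route, the provisioner block would look roughly like this (a sketch; .vault_pass.txt is a placeholder file containing the vault password, here abc, and should be kept out of version control):
config.vm.provision :ansible do |ansible|
  ansible.verbose = true
  ansible.playbook = "playbook.yml"
  # Read the vault password from a file instead of prompting for it
  ansible.vault_password_file = ".vault_pass.txt"
end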
References:
https://github.com/ansible/ansible/pull/22756/files#diff-bdd6c847fae8976ab8a7259d0b583f34L176
https://github.com/hashicorp/vagrant/issues/2924
See the related bug fix pull request (#31493) in the Ansible project

Trying to provision my Vagrant with Ansible

I am trying to provision a virtual machine with an Ansible playbook.
Following the documentation, I ended up with this simple Vagrantfile:
Vagrant.configure("2") do |config|
config.vm.box = "ubuntu/xenial64"
config.vm.network "private_network", ip: "192.168.50.5"
config.vm.provision "ansible" do |ansible|
ansible.verbose = "vvv"
ansible.playbook = "playbook.yml"
end
end
As you can see, I am trying to provision a xenial64 machine (Ubuntu 16.04) from a playbook.yml file.
When I launch vagrant provision, here is what I get:
$ vagrant provision
==> default: Running provisioner: ansible...
default: Running ansible-playbook...
PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="default" --inventory-file=/home/mmarteau/Code/ansible-arc/.vagrant/provisioners/ansible/inventory -vvv playbook.yml
Using /etc/ansible/ansible.cfg as config file
statically included: /home/mmarteau/Code/ansible-arc/roles/user/tasks/ho-my-zsh.yml
statically included: /home/mmarteau/Code/ansible-arc/roles/webserver/tasks/nginx.yml
statically included: /home/mmarteau/Code/ansible-arc/roles/webserver/tasks/php.yml
statically included: /etc/ansible/roles/geerlingguy.composer/tasks/global-require.yml
statically included: /etc/ansible/roles/geerlingguy.nodejs/tasks/setup-RedHat.yml
statically included: /etc/ansible/roles/geerlingguy.nodejs/tasks/setup-Debian.yml
PLAYBOOK: playbook.yml *********************************************************
1 plays in playbook.yml
PLAY RECAP *********************************************************************
So my file seems to be read, because I get the "statically included" messages for the roles referenced in my playbook.yml file.
However, the script stops very quickly, and I don't have any information to debug or to see any errors.
How can I debug this process?
EDIT: More info
Here is my playbook.yml file:
---
- name: Installation du serveur
  # hosts: web
  hosts: test
  vars:
    user: mmart
    apps:
      dev:
        branch: development
        domain: admin.test.dev
      master:
        branch: master
        domain: admin.test.fr
    bitbucket_repository: git@bitbucket.org:Test/test.git
    composer_home_path: '/home/mmart/.composer'
    composer_home_owner: mmart
    composer_home_group: mmart
    zsh_theme: agnoster
    environment_file: arc-parameters.yml
    ssh_agent_config: arc-ssh-config
  roles:
    - apt
    - user
    - webserver
    - geerlingguy.composer
    - geerlingguy.nodejs
    - deploy
    - deployer
...
Here is my hosts file:
[web]
XX.XX.XXX.XXX ansible_ssh_private_key_file=/somekey.pem ansible_become=true ansible_user=ubuntu
[test]
Here is the hosts file generated by Vagrant in .vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory:
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user='ubuntu' ansible_ssh_private_key_file='/home/mmart/Code/ansible-test/.vagrant/machines/default/virtualbox/private_key'
Is this correct? Shouldn't the ansible_ssh_user be set to vagrant?
In your playbook, use default as the hosts value, since Vagrant by default only creates an inventory entry for that particular host:
---
- name: Installation du serveur
  hosts: default
  (...)

Getting Ansible example (jboss-standalone) to work with Vagrant

I need some assistance with getting https://github.com/ansible/ansible-examples.git / jboss-standalone to work with Vagrant. I think I am making the same mistake with my Vagrant configuration.
My Vagrantfile is here:
VAGRANTFILE_API_VERSION = "2"
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
config.vm.box = "chef/centos-6.6"
config.vm.network "forwarded_port", guest: 80, host: 8080
config.vm.hostname = "webserver1"
config.vm.provision :ansible do |ansible|
ansible.playbook = "site.yml"
ansible.verbose = "vvvv"
ansible.inventory_path = "/Users/miledavenport/vagrant-ansible/jboss-standalone/hosts"
end
end
My hosts file is here:
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222
[jboss-servers]
webserver1
[webserver1]
127.0.0.1 ansible_connection=local
[localhost]
127.0.0.1
I am fairly new to Ansible and want to "play" with it by using Vagrant.
"vagrant up" produces the following error:
TASK: [jboss-standalone | Install Java 1.7 and some basic dependencies] *******
FATAL: no hosts matched or all hosts have already failed -- aborting
"vagrant ssh" works OK.
site.yml is:
---
# This playbook deploys a simple standalone JBoss server.
- hosts: jboss-servers
  user: root
  roles:
    - jboss-standalone
I don't understand why I am getting the error:
FATAL: no hosts matched
The hosts file contains webserver1, which is the same as the hostname in the Vagrantfile.
Can someone please help me to resolve this error.
Thanks :)
Miles.
Maybe your intent is to create a parent group called jboss-servers, with a subgroup called webserver1.
Try changing [jboss-servers] to [jboss-servers:children].
This will make the group jboss-servers also contain 127.0.0.1 as one of its hosts, and your playbook should run (see the Ansible docs on inventory groups).
At the moment, since webserver1 does not have an ansible_ssh_host=<ip> key-value pair associated with it, it is just a hostname without an IP to connect to. Make it a subgroup of jboss-servers only if you don't have webserver1 mapped to some IP in your /etc/hosts file or something similar :)
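Put together, the reworked hosts file could look like this (a sketch based on the inventory shown in the question):
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222

[webserver1]
127.0.0.1 ansible_connection=local

[jboss-servers:children]
webserver1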

Vagrant with Ansible error

If I have a Vagrantfile with an Ansible provisioner:
Vagrant.configure(2) do |config|
  config.vm.box = 'hashicorp/precise32'
  config.vm.network "forwarded_port", guest: 80, host: 8080

  config.vm.provision :ansible do |ansible|
    ansible.playbook = "playbook.yml"
    ansible.inventory_path = "hosts"
    ansible.limit = 'all'
    ansible.sudo = true
  end
end
My hosts file is very simple:
[local]
web ansible_connection=local
and playbook.yml is:
---
- hosts: local
  sudo: true
  remote_user: vagrant
  tasks:
    - name: update apt cache
      apt: update_cache=yes
    - name: install apache
      apt: name=apache2 state=present
When I start the machine with vagrant up, I get this error:
failed: [web] => {"failed": true, "parsed": false}
[sudo via ansible, key=daxgehmwoinwalgbzunaiovnrpajwbmj] password:
What's the problem?
The error occurs because Ansible assumes key-based SSH authentication, but the VM Vagrant creates uses password-based authentication.
There are two ways you can solve this issue.
You can run your Ansible playbook as:
ansible-playbook playbook.yml --ask-pass
This tells Ansible not to assume key-based authentication but to use password-based SSH authentication, prompting for the password before execution.
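If you want Vagrant itself to forward that flag when it runs the provisioner, the raw_arguments option (used in the vault question above) should be able to pass it through; this is a sketch, not taken from the original answer:
config.vm.provision :ansible do |ansible|
  ansible.playbook = "playbook.yml"
  ansible.inventory_path = "hosts"
  ansible.limit = 'all'
  ansible.sudo = true
  # Forward --ask-pass to the underlying ansible-playbook call
  ansible.raw_arguments = ["--ask-pass"]
end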

Ansible & Vagrant - apt-get: command not found

I am new to using Vagrant and Ansible. Currently I am stuck on Ansible telling me that it can't find the apt-get command.
My Vagrant box runs on Ubuntu and here are the relevant files:
// Vagrantfile
Vagrant.configure("2") do |config|
config.vm.box = "precise32"
config.vm.box_url = "http://files.vagrantup.com/precise32.box"
config.vm.network :private_network, :ip => "192.168.33.10"
# make sure apt repo is up to date
config.vm.provision :shell, :inline => 'apt-get -qqy update'
config.vm.provision "ansible" do |ansible|
ansible.playbook = "playbook.yml"
end
end
// vagrant_ansible_inventory_default
# Generated by Vagrant
default ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222
// playbook.yml
---
- name: Install MySQL, Nginx, Node.js, and Monit
  hosts: 127.0.0.1
  user: root
  # remote_user: user
  # sudo: yes
  roles:
    - nginx
// roles/nginx/tasks/main.yml
---
- name: Installs nginx web server
  apt: pkg=nginx state=installed update_cache=true
  notify:
    - start nginx
When I run vagrant provision, I get
[default] Running provisioner: shell...
[default] Running: inline script
stdin: is not a tty
[default] Running provisioner: ansible...
PLAY [Install MySQL, Nginx, Node.js, and Monit] *******************************
GATHERING FACTS ***************************************************************
ok: [127.0.0.1]
TASK: [nginx | Installs nginx web server] *************************************
failed: [127.0.0.1] => {"cmd": "apt-get update && apt-get install python-apt -y -q",
"failed": true, "item": "", "rc": 127}
stderr: /bin/sh: apt-get: command not found
msg: /bin/sh: apt-get: command not found
FATAL: all hosts have already failed -- aborting
PLAY RECAP ********************************************************************
to retry, use: --limit @/Users/foosbar/playbook.retry
127.0.0.1 : ok=1 changed=0 unreachable=0 failed=1
Ansible failed to complete successfully. Any error output should be
visible above. Please fix these errors and try again.
What am I missing?
What happens if you run the play against the IP address you've actually assigned to vagrant?
// playbook.yml
---
- name: Install MySQL, Nginx, Node.js, and Monit
  hosts: 192.168.33.10
The host on which you want to run the play is the Vagrant VM. hosts in this case doesn't refer to the control machine, but to the managed nodes.
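Alternatively (not part of the original answer), since the inventory Vagrant generates names the machine default, you can match that name instead of hard-coding an IP:
// playbook.yml
---
- name: Install MySQL, Nginx, Node.js, and Monit
  hosts: default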
