Install an apt package using chef.json on Vagrant solo-provisioner

I would like to install several arbitrary APT packages using Vagrant's Chef Solo provisioner.
chef.json seems to allow you to execute Chef commands, but I'm unclear as to how to do this. Something like:
chef.json = {
  apt: {
    package: { 'libssl-dev': { action: 'install' } }
  }
}
?

Chef uses recipes to define resources that are applied to nodes via chef-client.
A recipe is basically a definition of what to do (a script).
A resource is a particular element you are configuring (a file, a service, a package, etc.).
A node is the machine running chef-client.
The JSON that you are setting up for chef-solo defines attributes, which are like variables that your recipes can use to decide what to do.
So you have a hash of attributes for Chef to use, but you still need a recipe that configures resources based on that hash and is executed on your node.
In your case you need to configure the package resource:
package "name" do
some_attribute "value"
action :action
end
The package resource supports lots of different package back ends, including apt, so you don't need to worry about the differences (except for package names).
To install the packages from your hash you can create a recipe like:
node[:apt][:package].each do |pkg, pkg_data|
  package pkg do
    action pkg_data[:action].to_sym
  end
end
Individual recipes are then packaged up into cookbooks, which are logical groupings of related recipes. Generally a cookbook covers a single piece of software, say httpd or mysql.
As Tensibia mentions, read through the Vagrant Chef Solo documentation for where to put your recipe/cookbook and go from there.
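To tie the pieces together, here is a minimal Vagrantfile sketch, assuming the recipe above lives in a cookbook named my_packages under ./cookbooks (the box and cookbook names are illustrative):
# Vagrantfile (sketch)
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"   # illustrative box

  config.vm.provision "chef_solo" do |chef|
    chef.cookbooks_path = "cookbooks"
    # These attributes end up on the node as node[:apt][:package]
    chef.json = {
      apt: {
        package: { 'libssl-dev' => { action: 'install' } }
      }
    }
    # Put my_packages::default on the run list so the recipe actually runs
    chef.add_recipe "my_packages"
  end
end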

chef.json does not execute or define commands.
It defines attributes for the node which can be used by recipes.
I would recommend reading THIS and THIS.
Some of the JSON content is generated by Vagrant itself, such as the run_list attribute defined by the chef.add_recipe keyword in the Vagrantfile.
For your use case you should have a cookbook with a recipe that parses node['apt'] and uses the apt_package (or generic package) resource.
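As a sketch, such a wrapper recipe could look like the following (the cookbook layout is illustrative, and apt_package could equally be the generic package resource):
# recipes/default.rb of the wrapper cookbook (sketch)
# Declares one apt_package resource per entry defined in chef.json
node['apt']['package'].each do |pkg_name, pkg_data|
  apt_package pkg_name do
    action pkg_data['action'].to_sym
  end
end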

Related

Dependency loop during Puppet provisioning due to missing OS package

I'm trying to provision my development server using Vagrant and Puppet. Below is part of my Puppet manifest at this point. The issue I'm having is that I end up in a dependency loop, which is of course correct; the only problem is that I don't see a way to do it otherwise, so I need some help.
I'm using the latest version of the box provided by Puppet Labs, named puppetlabs/ubuntu-14.04-64-puppet. While adding a PPA to the package manager I receive an error that apt-add-repository is not available; to get it you need to install the software-properties-common package.
The only problem is that before installing this package you need to run apt-get update. The second problem is that the manifest won't accept that and will try to add the PPA first so that (which is of course a logical conclusion) it only has to update the package manager once. But with that approach I end up in a loop, which triggers this error:
==> default: Error: Failed to apply catalog: Found 1 dependency cycle:
==> default: (Exec[add-apt-repository-ppa:ondrej/php-7.0] => Class[Apt::Update] => Exec[apt_update] => Class[Apt::Update] =>
Package[git] => Class[Systempackages] => Apt::Ppa[ppa:ondrej/php-7.0]
=> Exec[add-apt-repository-ppa:ondrej/php-7.0])
class systempackages {
  package { ['git', 'curl', 'acl', 'unattended-upgrades', 'vim', 'software-properties-common']:
    ensure  => "installed",
    require => [
      Class['apt::update'],
    ],
  }
}
/*===========================================*/
## System
Exec { path => [ "/bin/", "/sbin/", "/usr/bin/", "/usr/sbin/" ] }

class { 'systempackages': }

# APT
class { 'apt':
  update => {
    frequency => 'always',
  },
}

apt::ppa { 'ppa:ondrej/php-7.0':
  before  => Package['php7.0-cli'],
  require => Class['systempackages'],
}

# PHP
package { 'php7.0-cli':
  ensure => 'installed',
}
Given that this is on vagrant, I suggest installing package software-properties-common manually as part of your Vagrantfile.
Something like config.vm.provision "shell", inline: "apt-get update && apt-get install -y software-properties-common" should work.
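A sketch of how that fits into the Vagrantfile (provisioners run in the order they are declared; the Puppet paths shown are illustrative):
# Vagrantfile (sketch): the shell provisioner runs before the Puppet provisioner
Vagrant.configure("2") do |config|
  config.vm.box = "puppetlabs/ubuntu-14.04-64-puppet"

  # Make apt-add-repository available before Puppet ever runs
  config.vm.provision "shell",
    inline: "apt-get update && apt-get install -y software-properties-common"

  config.vm.provision "puppet" do |puppet|
    puppet.manifests_path = "puppet/manifests"   # illustrative path
    puppet.manifest_file  = "default.pp"
  end
end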
The circular dependency reflects the fact that Puppet is not a provisioning system. It can be used by a provisioning system or in conjunction with one, but it depends on a fairly substantial software stack being available before it can get off the ground. If Package 'software-properties-common' is necessary for full functioning of the Apt subsystem, then your best bet is to rely on your provisioning system to install it, so that it is available before Puppet ever runs, and to avoid declaring any relationship between that package and the classes and resources of the Apt module.
You are also impacted by the puppetlabs-apt module being quite good about declaring the relationships needed to ensure proper order of application. This is a double-edged sword, however: people cause themselves trouble with surprising frequency by declaring their own relationships with classes or defined types from that module that conflict with the ones it declares itself. In particular, it is asking for trouble to have your Apt::ppa resource require a class containing resources that themselves require any class or resource from the Apt module.
In any case, class apt::update is not a public class of the module. The main implication is that code outside the module should not reference it in any way. You should instead rely on the value you provided for class parameter $apt::update to instruct Puppet to perform an apt-get update at a suitable time.

override attributes in elasticsearch cookbook

I am trying to use the "elasticsearch/cookbook-elasticsearch" cookbook with my wrapper cookbook. I want to override the following default attributes from cookbook-elasticsearch in my wrapper cookbook.
default.elasticsearch[:rpm_url] = "https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.90.12.noarch.rpm"
default.elasticsearch[:rpm_sha] = "ab7ea2e00d8f1b73642e3ea44d9647b11e6b0b96"
Cookbook : https://github.com/elasticsearch/cookbook-elasticsearch
How do I do this in the my-elasticsearch cookbook?
cat site-cookbooks/my-elasticsearch/attributes/default.rb
override.elasticsearch[:rpm_url] = "https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.4.4.noarch.rpm"
override.elasticsearch[:rpm_sha] = "ec8b41c54a6d897479645b2507476e0824bc71db"
Is this the correct way to do it?
I want to use this cookbook with Chef 10.
Any help is appreciated.
To add information to #mark-oconnor's comment:
See the documentation about attributes in Chef 10.
The recommended notation would be override['elasticsearch']['rpm_url'] = "new_value"; the method and symbol ways of accessing attributes have been problematic in the past.
As the cookbook loading order in Chef 10 is not always clearly predictable, you have to use the override level to ensure the correct value is used when compiling recipes.
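With that notation, the wrapper cookbook's attributes file from the question would become (values taken from the question):
# site-cookbooks/my-elasticsearch/attributes/default.rb
override['elasticsearch']['rpm_url'] = "https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.4.4.noarch.rpm"
override['elasticsearch']['rpm_sha'] = "ec8b41c54a6d897479645b2507476e0824bc71db"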
Edit after comments:
In version 0.3.13 of the elasticsearch cookbook, the default recipe installs from a tarball.
If you wish to use a packaged install you have to call the corresponding recipe before the default one, as the default recipe has a guard that skips the tarball install if Elasticsearch is already installed.
The correct recipe order in the wrapper cookbook for this particular case is:
include_recipe 'elasticsearch::rpm' # picks up the overridden attributes and installs the package
include_recipe 'elasticsearch'      # no need for ::default; it is the recipe loaded when omitted

Can you mix chef-zero, chef-metal, chef-metal-vagrant (Vagrant) and berkshelf?

I want to leverage chef-metal and chef-zero with my existing cookbooks and chef-repo (already leveraging Berkshelf and Vagrant for dev).
I started with the example provided at https://github.com/opscode/chef-metal#vagrant
I've got a vagrant_linux.rb
require 'chef_metal_vagrant'

vagrant_box 'CentOS-6.4-x86_64' do
  url 'http://developer.nrel.gov/downloads/vagrant-boxes/CentOS-6.4-x86_64-v20130427.box'
end

with_machine_options :vagrant_options => {
  'vm.box' => 'CentOS-6.4-x86_64'
}
I also have dev_server.rb
require 'chef_metal'

with_chef_local_server :chef_repo_path => '~/workspace/git/my-chef-repo'

machine 'dev_server' do
  tag 'dev_server'
  recipe 'myapp'
  converge true
end
If I put my myapp cookbook under ~/workspace/git/my-chef-repo/cookbooks, the above works fine.
Using the following command, I get a Vagrant-managed VM named dev_server converging (applying the myapp recipe):
chef-client -z vagrant_linux.rb dev_server.rb
But now I'd like to keep my cookbooks folder empty and use Berkshelf.
It does not look like that is supported by chef-zero at the moment, is it?
How could I do that?
You can pass :cookbook_path that contains multiple paths as an Array like so: https://github.com/opscode/ec-metal/blob/master/cookbooks/ec-harness/recipes/vagrant.rb#L12-L13
with_chef_local_server :chef_repo_path => repo_path,
:cookbook_path => [ File.join(repo_path, 'cookbooks'),
File.join(repo_path, 'vendor', 'cookbooks') ]
Then you can use berks to vendor upstream cookbooks into a different path (vendor/cookbooks/), while putting your own cookbooks into cookbooks/ like so: https://github.com/opscode/ec-metal/blob/master/Rakefile#L114
berks vendor vendor/cookbooks/
The "berks vendor" command is how I generally do that--use "berks vendor" and add the vendored path to your cookbook path.

List all the declared packages in chef

I'm working on an infrastructure where some servers don't have access to the internet, so I have to push the packages to the local repo before declaring them to be installed with Chef.
However, we've been in a situation where Chef failed to install a package because the package wasn't in the local repo on some boxes, while it succeeded on others.
What I want to do is run a Ruby/RSpec test before applying the Chef config on the nodes to make sure the packages declared in the recipes actually exist in the repo.
In order to do that I need to be able to list all the packages that exist in our recipes.
My question is: is there any way to list all the declared packages in Chef? I had a quick look at Chef::Platform and ChefSpec but unfortunately couldn't find anything useful for my problem.
Do you have any idea where the best place to look would be?
If you use ChefSpec you can find all the packages by calling chef_run.find_resources(:package) inside some test. See the source code. Like this:
require 'chefspec'

describe 'example::default' do
  let(:chef_run) { ChefSpec::Runner.new.converge(described_recipe) }

  it 'does something' do
    chef_run.find_resources(:package)...
  end
end
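Building on that, here is a hedged sketch of a spec that compares the declared package names against a list of packages known to be in the local repo (repo_packages and the package names are illustrative):
require 'chefspec'

describe 'example::default' do
  let(:chef_run) { ChefSpec::Runner.new.converge(described_recipe) }

  it 'only declares packages that exist in the local repo' do
    # package_name may be a String or an Array of names, hence the flatten
    declared = chef_run.find_resources(:package).map(&:package_name).flatten
    # Hypothetical list loaded from your local repository metadata
    repo_packages = %w(libssl-dev git curl)
    declared.each { |pkg| expect(repo_packages).to include(pkg) }
  end
end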
You could install one or more of the community Ohai plugins. For example, the following will return information about installed software:
debian
Redhat
windows
Once the plugins are enabled they will add additional node attributes that will be searchable from chef-server.

ChefSpec should not test included recipe

I have built a cookbook for installing Jenkins CI. It uses the key and repository resources from the yum cookbook, so I end up with the following recipe:
yum_key "RPM-GPG-KEY-jenkins" do
url "http://pkg.jenkins-ci.org/redhat/jenkins-ci.org.key"
action :add
end
yum_repository "jenkins" do
description "Jenkins-CI 3rd party repository"
url "http://pkg.jenkins-ci.org/redhat"
key "RPM-GPG-KEY-jenkins"
action :add
end
When I include this recipe in another recipe:
include_recipe 'sp_jenkins::default'
and I test this with the following ChefSpec test
it 'includes the `sp_jenkins::default` recipe' do
  expect(chef_run).to include_recipe('sp_jenkins::install')
end
my ChefSpec test fails with the following output:
NameError:
Cannot find a resource for yum_key on chefspec version 0.6.1
(I'm not sure why it says version 0.6.1, gem list tells me it's using 3.0.2)
The sp_jenkins cookbook does depend on the yum cookbook (metadata.rb), and runs fine, however, the cookbook I'm currently writing does not depend on the yum cookbook and therefore doesn't have the yum_key and yum_repository methods available.
Is there a way to prevent ChefSpec from 'descending' into included recipes/cookbooks and just test the current cookbook?
Ohai! Julian is correct - ChefSpec actually does a Chef Solo run in memory on your local machine. It rewrites the provider actions to be a noop, but creates a registry of all the actions taken (including those that would be taken if notifications were executed).
So just like you need the yum cookbook to converge this recipe on a real node, you need it to converge during your unit tests with ChefSpec. The easiest way to accomplish this is by using the Berkshelf or Librarian resolvers. To use the Berkshelf resolver, simply require 'chefspec/berkshelf' after requiring chefspec:
# spec_helper.rb
require 'chefspec'
require 'chefspec/berkshelf'
If you have Berkshelf installed on your system, it will pull all the cookbooks into a temporary directory and run ChefSpec for you.
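(For the Berkshelf resolver to find yum, the only requirement is the dependency the question says sp_jenkins already declares, roughly this line:)
# sp_jenkins/metadata.rb (sketch of the relevant declaration)
depends 'yum'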
You may also want to take a look at Strainer, which aims to solve a similar problem.
On a somewhat unrelated note, I am working on a fairly large refactor to the Jenkins cookbook that may better suit your needs.
Sources:
I wrote it...
No, there's no way to prevent it from descending, because it's trying to converge an entire Chef run in memory.
However, if you use the Berkshelf functionality in ChefSpec, the Berkshelf dependency resolver will feed all dependent cookbooks to the in-memory Chef run, and you'll be golden.
It is absolutely valid to expect to test your cookbook in isolation, and not include other projects' code into the scope of your tests. Unfortunately there appears to be no supported, "clean" way to do this, that I can find. I was able to achieve this, but it comes at a price.
To use this technique, do not require 'chefspec/berkshelf' anywhere in your test code, only chefspec itself, as you are intentionally not gathering other cookbook source. Here is a template of my working test module (not my complete test code, as I have omitted RSpec config options):
describe 'mycookbook::recipe' do
  let(:chef_run) do
    ChefSpec::SoloRunner.new(platform: 'x', version: 'x') {
      # ...
    }.converge(described_recipe)
  end

  before :each do
    allow_any_instance_of(Chef::RunContext::CookbookCompiler).to receive(:cookbook_order) do
      Chef::Log.debug 'Attempt to source external cookbooks blocked'
      [described_cookbook]
    end
    # Note: with allow_any_instance_of, the block receives the instance first
    allow_any_instance_of(Chef::Recipe).to receive(:include_recipe) do |_instance, recipe|
      Chef::Log.debug "Attempt to include #{recipe} blocked"
    end
  end

  it 'works' do
    # ...
  end
end
You need both of these in your before block. The one that took some work is the intercept of the :cookbook_order method; I had to drill down into the Chef internals to discover it. Keep in mind that this worked for me using Chef 14, but there is no guarantee that it will be future-safe. After upgrading Chef you might have to find another solution if the implementation of CookbookCompiler ever changes. (The intercept of Chef::Recipe.include_recipe, however, is a supported API and therefore should be at least somewhat future-safe.)
And, as I mentioned, this comes at a price (other than using an unsupported hack!). You will not be able to set any expectations on your recipe or attribute includes, except within your own cookbook. A test case like this will fail, because the recipe can't actually be included, since you are preventing that:
it 'includes othercookbook::recipe' do
  expect_any_instance_of(Chef::Recipe).to receive(:include_recipe).with('othercookbook::recipe')
end
Also, you must now satisfy in your before blocks all attributes and other preconditions that might otherwise be fulfilled by other recipes in your run list. So you may be signing yourself up for considerable pain by doing this. But, once you have finished, you will have much less brittle tests. (Although to achieve 100% purity regarding external dependencies, you must also surrender fauxhai, which will be even more painful.)
