Chef-solo: deploy: access to release_path - ruby

I have the following Chef cookbook:
deploy "/home/prj" do
repo "https://path_to_repo"
user node.project_owner
group node.project_owner
symlink_before_migrate({})
end
How can I access the provider's release path? In my case it will be /home/prj/releases/20120506125222/.

It depends on where you want to access the release path. "Inside" the resource, i.e. in the callbacks, that's easily possible using something like
deploy "/home/prj" do
before_migrate do
gemfile = File.read("#{release_path}/Gemfile")
end
end
Outside of the resource, you don't have the release_path variable available. You can however use the current symlink which points to the currently deployed version, i.e. the last release:
current_path = "home/prj/current"
release_path = File.readlink(current_path)
Most of the time, you can do things directly in the current_path without having to resort to resolving the symlink target.
That said, you typically don't want to actually do things in there directly. Instead, you are encouraged to generate additional files in the shared directory (i.e. /home/prj/shared) and let Chef symlink those files into the release during deployment. That's exactly what symlink_before_migrate is for. That way, you don't need to know the release path yourself but can let Chef handle it for you.
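For example (a minimal sketch; the database.yml file and its template are made up for illustration), you could render the file into the shared directory and let the deploy resource link it into each new release:
directory "/home/prj/shared/config" do
  owner node.project_owner
  group node.project_owner
  recursive true
end

# Render a config file into the shared directory (file name is hypothetical)
template "/home/prj/shared/config/database.yml" do
  source "database.yml.erb"
  owner node.project_owner
  group node.project_owner
end

deploy "/home/prj" do
  repo "https://path_to_repo"
  user node.project_owner
  group node.project_owner
  # Link shared/config/database.yml into config/database.yml of each release
  symlink_before_migrate({"config/database.yml" => "config/database.yml"})
end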

Related

How to set dynamic variable once it is identified to all recipes in a cookbook

I need help setting a global variable in Chef recipes.
I have the following series of recipes.
A recipe discovers the Tomcat home from the path attribute in attributes/default.rb:
default['tomcat_cookbook']['tomcathome'] = ['/home/tomcat', '/home/ApacheTomcat']
This recipe identifies the Tomcat installation, since only one of these two directories will exist on a given server.
Let's say it sets tomcathome to the directory "/home/tomcat"; I also have subsequent recipes such as start/stop/restart Tomcat.
Currently I run the discovery logic inside every start/stop recipe, even though on a particular server tomcathome is already known to be "/home/tomcat".
Is there any way I can remove the duplicated Tomcat home discovery code and reuse the identified tomcathome value in the remaining recipes?
Please suggest.
I think this would be a good use of libraries. I'll assume the cookbook name is tomcat_cookbook. In the libraries folder in a cookbook, create a file called path.rb.
Add the following code into the path.rb file. I prefer to namespace my libraries to organize my methods using CookbookName::ModuleName format.
libraries/path.rb:
module TomcatCookbook
  module Path
    # Return the first candidate directory that exists on this node
    def install_path
      node['tomcat_cookbook']['tomcathome'].each do |path|
        return path if ::Dir.exist?(path)
      end
    end
  end
end
Within any recipe, you can include this module and use the methods in it:
# Use this include for use in the recipe
Chef::Recipe.send(:include, TomcatCookbook::Path)
# Use this include for using methods in the directory resource itself
Chef::Resource::Directory.send(:include, TomcatCookbook::Path)

Chef::Log.info("Install Path: #{install_path}")

directory "tomcat_install_path" do
  path install_path
  action :create
end
In certain situations, I have needed to create a common cookbook which includes only libraries which I can use across multiple cookbooks.
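For example (the cookbook and module names here are illustrative), the consuming cookbook only needs to declare the dependency and include the shared module; Chef loads library files from dependency cookbooks automatically:
# metadata.rb of the consuming cookbook
depends 'common_cookbook'

# recipes/default.rb
Chef::Recipe.send(:include, CommonCookbook::Path)
Chef::Log.info("Install Path: #{install_path}")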

capistrano 3.8, shared_path seems to be partly ignored

I'm working on a Capistrano deployment configuration and would like to put the shared folder in another place. Background: I want to use a wildcard deployment (review app), where the target directory is generated on the fly (which means there isn't a shared folder inside it), and I want to share the folder with the assets and configuration across ALL review apps in this environment.
Therefore I have directories on the server:
/var/www/review/application_name
/var/www/review/application_name/shared/... (here are the assets and configurations I would like to share across ALL review apps)
/var/www/review/application_name/branch-name/ - this is the deployment path which will be created by capistrano when deploying a specific branch to the review stage.
I have set shared_path:
set :shared_path, "/var/www/review/#{fetch(:application)}"
which works fine for the linked_dirs, but NOT for the linked_files. I get the error message:
00:01 deploy:check:linked_files
ERROR linked file /var/www/review/www.app.tld/123/shared/myfile does not exist on review.app.tld
which is true - but I don't know how to tell cap to put it in place. Of course the named file is in the shared folder
/var/www/review/www.app.tld/shared/
but Capistrano seems to search in the wrong place when checking the linked_files (again: the linked_dirs are processed correctly).
Any hints? Thanks in advance!
The shared_path is not something you can configure directly. Using set will not have any effect.
The shared path in Capistrano is always a directory named shared inside your :deploy_to location. Therefore if you want to change the shared path, you must set :deploy_to, like so:
set :deploy_to, -> { "/var/www/review/#{fetch(:application)}" }
This will effectively cause shared_path to become:
"/var/www/review/#{fetch(:application)}/shared"
Keep in mind that :deploy_to is used as the base directory for many things: releases, repo, current, etc. So if you change :deploy_to you will affect all of them.
If your :application variable is defined at some later point, or changed, make sure to use the deferred (lambda) form shown above for :deploy_to; it evaluates the string on demand instead of in advance.
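As a sketch (the stage file and linked entries are illustrative), a review stage could look like this, with the paths Capistrano derives shown in comments:
# config/deploy/review.rb
set :application, "www.app.tld"
set :deploy_to,   -> { "/var/www/review/#{fetch(:application)}" }

# Derived by Capistrano from :deploy_to:
#   shared_path   -> /var/www/review/www.app.tld/shared
#   releases_path -> /var/www/review/www.app.tld/releases
#   current_path  -> /var/www/review/www.app.tld/current

append :linked_files, "config/database.yml"   # looked up under shared_path
append :linked_dirs,  "public/assets"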

List all the declared packages in chef

I'm working on an infrastructure where some servers don't have access to the internet, so I have to push the packages to the local repo before declaring them to be installed by Chef.
However, we've hit a situation where Chef failed to install a package on some boxes because it wasn't in the repo, while it succeeded on other boxes.
What I want to do is run a Ruby/RSpec test before applying the Chef config on the nodes, to make sure the packages declared in the recipes actually exist in the repo.
In order to do that, I need to be able to list all the packages that exist in our recipes.
My question is: is there any way to list all the declared packages in Chef? I had a quick look at Chef::Platform and ChefSpec but unfortunately couldn't find anything useful for my problem.
Do you have any idea where the best place to look is?
If you use ChefSpec you can find all the packages by calling chef_run.find_resources(:package) inside a test (see the source code), like this:
require 'chefspec'

describe 'example::default' do
  let(:chef_run) { ChefSpec::Runner.new.converge(described_recipe) }

  it 'does something' do
    chef_run.find_resources(:package)...
  end
end
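Building on that, a sketch of the check the question describes could look like this (the repo_packages.txt file listing the packages available in the local repo is hypothetical):
require 'chefspec'

describe 'example::default' do
  let(:chef_run) { ChefSpec::Runner.new.converge(described_recipe) }

  # Packages known to exist in the local repo (where this list comes from is up to you)
  let(:repo_packages) { File.readlines('repo_packages.txt').map(&:strip) }

  it 'only declares packages that exist in the local repo' do
    declared = chef_run.find_resources(:package).map(&:package_name)
    expect(repo_packages).to include(*declared)
  end
end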
You could install one or more of the community Ohai plugins. For example, the following will return information about installed software:
debian
Redhat
windows
Once the plugins are enabled they will add additional node attributes that will be searchable from chef-server.

ChefSpec should not test included recipe

I have built a cookbook for installing Jenkins CI. It uses the key and repository resources from the yum cookbook, so I end up with the following recipe:
yum_key "RPM-GPG-KEY-jenkins" do
url "http://pkg.jenkins-ci.org/redhat/jenkins-ci.org.key"
action :add
end
yum_repository "jenkins" do
description "Jenkins-CI 3rd party repository"
url "http://pkg.jenkins-ci.org/redhat"
key "RPM-GPG-KEY-jenkins"
action :add
end
When I include this recipe in another recipe:
include_recipe 'sp_jenkins::default'
and I test this with the following ChefSpec test
it 'includes the `sp_jenkins::default` recipe' do
  expect(chef_run).to include_recipe('sp_jenkins::install')
end
my ChefSpec test fails with the following output:
NameError:
Cannot find a resource for yum_key on chefspec version 0.6.1
(I'm not sure why it says version 0.6.1, gem list tells me it's using 3.0.2)
The sp_jenkins cookbook does depend on the yum cookbook (in metadata.rb) and runs fine; however, the cookbook I'm currently writing does not depend on the yum cookbook and therefore doesn't have the yum_key and yum_repository resources available.
Is there a way to prevent ChefSpec from 'descending' into included recipes/cookbooks and just test the current cookbook?
Ohai! Julian is correct - ChefSpec actually does a Chef Solo run in memory on your local machine. It rewrites the provider actions to be a noop, but creates a registry of all the actions taken (including those that would be taken if notifications were executed).
So just like you need the yum cookbook to converge this recipe on a real node, you need it to converge during your unit tests with ChefSpec. The easiest way to accomplish this is by using the Berkshelf or Librarian resolvers. To use the Berkshelf resolver, simply require 'chefspec/berkshelf' after requiring chefspec:
# spec_helper.rb
require 'chefspec'
require 'chefspec/berkshelf'
If you have Berkshelf installed on your system, it will pull all the cookbooks into a temporary directory and run ChefSpec for you.
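As a minimal sketch (assuming the cookbook already declares its dependencies in metadata.rb), the Berksfile just needs to point at the metadata so the resolver can fetch the yum cookbook:
# Berksfile
source 'https://supermarket.chef.io'
metadata

# metadata.rb (already present; shown for completeness)
depends 'yum'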
You may also want to take a look at Strainer, which aims to solve a similar problem.
On a somewhat unrelated note, I am working on a fairly large refactor to the Jenkins cookbook that may better suit your needs.
Sources:
I wrote it...
No, there's no way to prevent it from descending, because it's trying to converge an entire Chef run in memory.
However, if you use the Berkshelf functionality in ChefSpec, the Berkshelf dependency resolver will feed all dependent cookbooks to the in-memory Chef run, and you'll be golden.
It is absolutely valid to expect to test your cookbook in isolation, and not include other projects' code into the scope of your tests. Unfortunately there appears to be no supported, "clean" way to do this, that I can find. I was able to achieve this, but it comes at a price.
To use this technique, do not require 'chefspec/berkshelf' anywhere in your test code, only chefspec itself, as you are intentionally not gathering other cookbook source. Here is a template of my working test module (not my complete test code, as I have omitted RSpec config options):
describe 'mycookbook::recipe' do
  let(:chef_run) do
    ChefSpec::SoloRunner.new(platform: 'x', version: 'x') {
      # ...
    }.converge(described_recipe)
  end

  before :each do
    allow_any_instance_of(Chef::RunContext::CookbookCompiler).to receive(:cookbook_order) do
      Chef::Log.debug 'Attempt to source external cookbooks blocked'
      [described_cookbook]
    end
    # With any_instance_of, the implementation block receives the instance as its first argument
    allow_any_instance_of(Chef::Recipe).to receive(:include_recipe) do |_recipe_obj, recipe|
      Chef::Log.debug "Attempt to include #{recipe} blocked"
    end
  end

  it 'works' do
    # ...
  end
end
You need both of these stubs in your before block. The one I had to work to find was the intercept of the :cookbook_order method; I had to drill down into the Chef internals to discover it. Keep in mind that this worked for me using Chef 14, but there is no guarantee that it is future-safe. After upgrading Chef you might have to find another solution if the implementation of CookbookCompiler ever changes. (The intercept of Chef::Recipe#include_recipe, however, is a supported API and therefore should be at least somewhat future-safe.)
And, I mention that this comes at a price. (Other than using an unsupported hack!) You will not be able to do any expects for your recipe or attribute includes, except within your own cookbook. A test case like this will fail, because the recipe can't actually be included, as you are preventing that:
it 'includes othercookbook::recipe' do
  expect_any_instance_of(Chef::Recipe).to receive(:include_recipe).with('othercookbook::recipe')
end
Also, you must now satisfy in your before blocks all attributes and other preconditions that might otherwise be fulfilled by other recipes in your run list. So you may be signing yourself up for considerable pain by doing this. But, once you have finished, you will have much less brittle tests. (Although to achieve 100% purity regarding external dependencies, you must also surrender fauxhai, which will be even more painful.)
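For example (the attribute names are illustrative), any preconditions that the excluded cookbooks would normally provide have to be set by hand in the runner block:
let(:chef_run) do
  ChefSpec::SoloRunner.new(platform: 'centos', version: '7') do |node|
    # Attributes that would otherwise be set by the excluded cookbooks
    node.normal['othercookbook']['install_dir'] = '/opt/whatever'
    node.normal['othercookbook']['user'] = 'appuser'
  end.converge(described_recipe)
end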

What is the best way to store project specific config info in ruby Rake tasks?

I have Rake tasks for getting the production database from the remote server, etc. It's always the same tasks, but the server info changes per project. I have the code here: https://gist.github.com/868423 In the last task, I'm getting an error because @local_db_dir_path is nil.
I don't think I want to use shell environment variables because I don't want to set them up each time I use rake or open a new shell.
Stick the settings in a YAML file, and read it like this:
require 'yaml'
config = YAML.load_file("config.yaml") # or wherever
$remote_host = config['remote_host']
$ssh_username = config['ssh_username']
# and so on
Or you can just read one big config hash:
$config = YAML.load_file("config.yaml")
Note that I'm using globals here, not instance variables, so there's no chance of being surprised by variable scope.
config.yaml would then look like this:
---
remote_host: some.host.name
ssh_username: myusername
other_setting: foo
whatever: bar
I tend to keep a config.yaml.sample checked in with the main body of the code which has example but non-working settings for everything which I can copy across to the non-versioned config.yaml. Some people like to keep their config.yaml checked in to a live branch on the server itself, so that it's versioned, but I've never bothered with that.
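A sketch of how a Rake task might consume this (the task name, config keys, and dump command are illustrative only):
require 'yaml'

CONFIG_FILE = File.expand_path('config.yaml', __dir__)

task :fetch_production_db do
  abort 'Copy config.yaml.sample to config.yaml and fill it in first' unless File.exist?(CONFIG_FILE)
  config = YAML.load_file(CONFIG_FILE)
  # Dump the remote database and copy it down (example command only)
  sh "ssh #{config['ssh_username']}@#{config['remote_host']} 'pg_dump mydb' > dump.sql"
end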
You should be using Capistrano for this. You could use the multistage extension, or just separate the host setting into its own task; an example Capistrano setup would look like this:
task :development do
  server "development.host"
end

task :backup do
  run "cd #{current_path}; rake db:dump"
  download "remote_path", "local_path"
end
and call it like this:
cap development backup
