I have code that compares the SHA-256 value of a file against the one from Artifactory and throws an error if the checksums do not match. But I want to compute the checksum in my own recipe rather than read it from Artifactory. Below is the code I have so far; I am trying to find out whether there are any functions or methods I can use in my recipe to compute the SHA-256 of a file.
Thanks in advance
only_if { node['abc'] }
not_if { ::File.exist?(checksum_file) && ::File.read(checksum_file).strip == coordinates['checksum'].strip }
message 'The previously deployed checksum is not aligned with the actual value'
level :debug
notifies :create, 'remote_file[download file]', :immediately
end
Chef is built on top of Ruby, so we can unleash the power of Ruby within recipes.
If you would like to compute the SHA-256 checksum of a file, the following Ruby code might be handy:
require 'digest'
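# returns the file's SHA-256 digest as a lowercase hex string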
Digest::SHA256.file('/path/to/file').hexdigest
The snippet in your post does not specify which Chef resource you are using, although it looks like the log resource.
Since you mentioned that you want to download the file from Artifactory: the remote_file resource can do that, and it has a checksum property:
checksum: Optional, see use_conditional_get. The SHA-256 checksum of the file. Use to prevent a file from being re-downloaded. When the local file matches the checksum, Chef Infra Client does not download it.
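Putting the two together, a minimal sketch might look like this (the URL, target path, and attribute name below are placeholders, not taken from your code):

remote_file '/tmp/myapp.jar' do
  source 'https://artifactory.example.com/repo/myapp.jar'
  # SHA-256 hex string; when the local file already matches, the download is skipped
  checksum node['myapp']['sha256']
end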
I am trying to create an LWRP that will call the resource that is defined within itself. My cookbook's structure is as follows:
In the machine cookbook's provider, I have a code snippet as follows:
require 'chef/provisioning' # driver for creating machines
require '::File'
def get_environment_json
@@environment_template = JSON.parse(File::read(new_resource.template_path + "environment.json"))
return @@environment_template
end
The code is only trying to read a JSON file, and I am using File::read for it.
I keep getting an error as follows:
LoadError
cannot load such file -- ::File
Does anyone know how I can use File::read inside my LWRP's provider?
OK, so the prior two answers are both half right. You have two problems.
First, you can't require ::File as it's already part of Ruby. This is the cause of your error.
Second, if you call File.read you will grab Chef's File, not Ruby's. You need to write ::File.read to use Ruby's File class.
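Applied to the provider from the question, a sketch (keeping the method name from the question, and assuming the JSON library is already available in the Chef run) might look like:

def get_environment_json
  # ::File is Ruby's File class; a bare File here could resolve to Chef's file resource
  JSON.parse(::File.read(new_resource.template_path + 'environment.json'))
end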
require '::File'
is incorrect and is causing the LoadError. Delete this line; you don't need it. File is part of the Ruby core and doesn't need to be required.
To further explain, the string argument to require represents the file name of the library you want to load. So, it should look like require "file", or require "rack/utils".
This happens because Chef already has a file resource, so inside a recipe the bare name File can collide with it. Writing ::File reaches Ruby's File class and fixes the issue. For example:
execute 'apt-get-update' do
  command 'apt-get update'
  ignore_failure true
  only_if { apt_installed? }
  not_if { ::File.exist?('/var/lib/apt/periodic/update-success-stamp') }
end
Source: https://docs.chef.io/ruby.html#ruby-class
I have built a cookbook for installing Jenkins CI. It uses the key and repository resources from the yum cookbook, so I end up with the following recipe:
yum_key "RPM-GPG-KEY-jenkins" do
url "http://pkg.jenkins-ci.org/redhat/jenkins-ci.org.key"
action :add
end
yum_repository "jenkins" do
description "Jenkins-CI 3rd party repository"
url "http://pkg.jenkins-ci.org/redhat"
key "RPM-GPG-KEY-jenkins"
action :add
end
When I include this recipe in another recipe:
include_recipe 'sp_jenkins::default'
and I test this with the following ChefSpec test
it 'includes the `sp_jenkins::default` recipe' do
  expect(chef_run).to include_recipe('sp_jenkins::install')
end
my ChefSpec test fails with the following output:
NameError:
Cannot find a resource for yum_key on chefspec version 0.6.1
(I'm not sure why it says version 0.6.1, gem list tells me it's using 3.0.2)
The sp_jenkins cookbook does depend on the yum cookbook (in its metadata.rb) and runs fine. However, the cookbook I'm currently writing does not depend on the yum cookbook, and therefore doesn't have the yum_key and yum_repository resources available.
Is there a way to prevent ChefSpec from 'descending' into included recipes/cookbooks and just test the current cookbook?
Ohai! Julian is correct: ChefSpec actually does a Chef Solo run in memory on your local machine. It rewrites the provider actions to be no-ops, but creates a registry of all the actions taken (including those that would have been taken if notifications were executed).
So just like you need the yum cookbook to converge this recipe on a real node, you need it to converge during your unit tests with ChefSpec. The easiest way to accomplish this is by using the Berkshelf or Librarian resolvers. To use the Berkshelf resolver, simply require 'chefspec/berkshelf' after requiring chefspec:
# spec_helper.rb
require 'chefspec'
require 'chefspec/berkshelf'
If you have Berkshelf installed on your system, it will pull all the cookbooks into a temporary directory and run ChefSpec for you.
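For the resolver to find the yum cookbook, the cookbook under test needs a Berksfile next to its metadata.rb; a minimal sketch (assuming a current Berkshelf) looks like:

# Berksfile
source 'https://supermarket.chef.io'
metadata # resolve the dependencies declared in metadata.rb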
You may also want to take a look at Strainer, which aims to solve a similar problem.
On a somewhat unrelated note, I am working on a fairly large refactor to the Jenkins cookbook that may better suit your needs.
Sources:
I wrote it...
No, there's no way to prevent it from descending, because it's trying to converge an entire Chef run in memory.
However, if you use the Berkshelf functionality in ChefSpec, the Berkshelf dependency resolver will feed all dependent cookbooks to the in-memory Chef run, and you'll be golden.
It is absolutely valid to expect to test your cookbook in isolation, and not include other projects' code in the scope of your tests. Unfortunately, there appears to be no supported, "clean" way to do this that I can find. I was able to achieve it, but it comes at a price.
To use this technique, do not require 'chefspec/berkshelf' anywhere in your test code, only chefspec itself, as you are intentionally not gathering other cookbook source. Here is a template of my working test module (not my complete test code, as I have omitted RSpec config options):
describe 'mycookbook::recipe' do
  let(:chef_run) do
    ChefSpec::SoloRunner.new(platform: 'x', version: 'x') {
      # ...
    }.converge(described_recipe)
  end

  before :each do
    allow_any_instance_of(Chef::RunContext::CookbookCompiler).to receive(:cookbook_order) do
      Chef::Log.debug 'Attempt to source external cookbooks blocked'
      [described_cookbook]
    end
    # note: with allow_any_instance_of, RSpec passes the receiving instance
    # as the first block argument; the recipe name comes second
    allow_any_instance_of(Chef::Recipe).to receive(:include_recipe) do |_instance, recipe|
      Chef::Log.debug "Attempt to include #{recipe} blocked"
    end
  end

  it 'works' do
    # ...
  end
end
You need both of these in your before block. The one I had to dig for is the intercept of the :cookbook_order method; I had to drill down into the Chef internals to discover it. Keep in mind this worked for me on Chef 14, but there is no guarantee it is future-safe. After upgrading Chef you might have to find another solution, if the implementation of CookbookCompiler ever changes. (The intercept of Chef::Recipe.include_recipe, however, is a supported API and therefore should be at least somewhat future-safe.)
And, as I mentioned, this comes at a price (beyond relying on an unsupported hack!). You will not be able to write expectations for recipe or attribute includes, except within your own cookbook. A test case like this will fail, because the recipe can't actually be included; you are preventing that:
it 'includes othercookbook::recipe' do
  expect_any_instance_of(Chef::Recipe).to receive(:include_recipe).with('othercookbook::recipe')
end
Also, you must now satisfy in your before blocks all attributes and other preconditions that might otherwise be fulfilled by other recipes in your run list. So you may be signing yourself up for considerable pain by doing this. But, once you have finished, you will have much less brittle tests. (Although to achieve 100% purity regarding external dependencies, you must also surrender fauxhai, which will be even more painful.)
I am playing with roles with Chef Solo (11.4.4 and 11.6.0) and am a bit confused.
For Chef Solo runs, should roles be written in Ruby or JSON?
As per the official docs: About Roles, Roles can be stored as domain-specific Ruby (DSL) files or JSON data.
NOTE: chef-client uses Ruby for roles; when these files are uploaded to the Chef server, they are converted to JSON. Whenever the chef-repo is refreshed, the contents of all domain-specific Ruby files are re-compiled to JSON and re-uploaded to the server.
My question is, if the requirement is to run Chef in solo mode without a server and roles are needed, should the roles be written in Ruby or JSON (we don't have a server to convert Ruby to JSON)?
My guess is the latter. Does anyone know the correct answer?
BTW: I've seen people mixing Ruby and JSON in role files...
What is the Ruby DSL equivalent for rbenv.rb below?
For example, run the rbenv + ruby-build cookbooks to install rbenv on Ubuntu.
rbenv.json
{
  "run_list": ["role[rbenv]"]
}
roles/rbenv.rb
name "rbenv"
description "rbenv + ruby-build"
run_list(
"recipe[rbenv]",
"recipe[ruby_build]"
)
override_attributes(
:rbenv => {
:git_repository => "https://github.com/sstephenson/rbenv.git"
},
:ruby_build => {
:git_repository => "https://github.com/sstephenson/ruby-build.git"
}
)
The Chef Solo run chef-solo -c solo.rb -j rbenv.json -l debug works as expected. The point is to clone via HTTPS, because that is easier behind the firewall.
However, using a Ruby DSL version of role rbenv.rb like below
name "rbenv"
description "rbenv + ruby-build"
run_list "recipe[rbenv]", "recipe[ruby_build]"
# default_attributes ":rbenv" => {":install_prefix" => "/opt"}
override_attributes ":rbenv" => {":git_repository" => "https://github.com/sstephenson/rbenv.git"}, ":ruby_build" => {":git_repository" => "https://github.com/sstephenson/ruby-build.git"}
It didn't seem to work: the run still used the default attributes (cloning via the git:// URL instead of HTTPS).
I am new to Ruby so most likely I made some mistakes in the DSL code, please help;-)
* git[/opt/rbenv] action sync
[2013-09-03T03:44:53+00:00] INFO: Processing git[/opt/rbenv] action sync (rbenv::default line 91)
[2013-09-03T03:44:53+00:00] DEBUG: git[/opt/rbenv] finding current git revision
[2013-09-03T03:44:53+00:00] DEBUG: git[/opt/rbenv] resolving remote reference
================================================================================
Error executing action `sync` on resource 'git[/opt/rbenv]'
================================================================================
Mixlib::ShellOut::ShellCommandFailed
------------------------------------
Expected process to exit with [0], but received '128'
---- Begin output of git ls-remote "git://github.com/sstephenson/rbenv.git" master* ----
STDOUT:
STDERR: fatal: unable to connect to github.com:
github.com[0: 192.30.252.128]: errno=Connection timed out
---- End output of git ls-remote "git://github.com/sstephenson/rbenv.git" master* ----
Ran git ls-remote "git://github.com/sstephenson/rbenv.git" master* returned 128
I prefer to use JSON format wherever possible for one simple reason - it's easy to parse and validate with a script. Here are three things that you can do if all your Chef data is in JSON format:
1. Easily perform a syntax check in a git pre-commit hook, something that's much harder to do when the file is in the Ruby DSL format.
2. Validate the keys and values in a data bag entry. This can be useful to check that you are not going to deploy invalid or nonsensical data bag entries to production.
3. Compare (with a little extra work; key ordering in a dictionary needs to be taken into account) the value of an object on a server with what's in git. The --format json argument is useful here.
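As an aside on the Ruby DSL role in the question above: ":rbenv" is a string that happens to contain a colon character, not the symbol :rbenv, so the override never matches the rbenv attribute namespace; that would explain why the default git:// URL was still used. A sketch of the role with bare symbol keys, which should behave like the JSON version:

name "rbenv"
description "rbenv + ruby-build"
run_list "recipe[rbenv]", "recipe[ruby_build]"
override_attributes(
  :rbenv => { :git_repository => "https://github.com/sstephenson/rbenv.git" },
  :ruby_build => { :git_repository => "https://github.com/sstephenson/ruby-build.git" }
)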
So I use the following recipe:
include_recipe "build-essential"
node_packages = value_for_platform(
  [ "debian", "ubuntu" ] => { "default" => [ "libssl-dev" ] },
  [ "amazon", "centos", "fedora" ] => { "default" => [ "openssl-devel" ] },
  "default" => [ "libssl-dev" ]
)

node_packages.each do |node_package|
  package node_package do
    action :install
  end
end

bash "install-node" do
  cwd Chef::Config[:file_cache_path]
  code <<-EOH
    tar -xzf node-v#{node["nodejs"]["version"]}.tar.gz
    (cd node-v#{node["nodejs"]["version"]} && ./configure --prefix=#{node["nodejs"]["dir"]} && make && make install)
  EOH
  action :nothing
  not_if "#{node["nodejs"]["dir"]}/bin/node --version 2>&1 | grep #{node["nodejs"]["version"]}"
end

remote_file "#{Chef::Config[:file_cache_path]}/node-v#{node["nodejs"]["version"]}.tar.gz" do
  source node["nodejs"]["url"]
  checksum node["nodejs"]["checksum"]
  notifies :run, resources(:bash => "install-node"), :immediately
end
It successfully installed nodejs on my Vagrant VM, but whenever chef-client runs again, the install gets executed again. How do I prevent this? I'm not that good at reading Ruby code.
To make the remote_file resource idempotent (i.e. not to download an already-present file again), you have to specify the file's checksum correctly. You do this in your code using the node["nodejs"]["checksum"] attribute. However, this only works if the checksum is the SHA-256 hash of the downloaded file; no other algorithm (especially not MD5) is supported.
If the checksum is not correct, your recipe will still work. However, on the next run Chef will notice that the checksum of the existing file differs from the one you specified and will download the file again, thus notifying the install-node resource and redoing the whole compile step.
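If you need the correct value, one way (a sketch; the path below is a placeholder) is to compute it once in Ruby and paste the result into the node attribute:

require 'digest'
# prints the SHA-256 hex digest that remote_file's checksum property expects
puts Digest::SHA256.file('/path/to/node-tarball.tar.gz').hexdigest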
With Chef, it's important that recipes be idempotent, meaning they should be able to run over and over again without changing the outcome. Chef expects to be able to run all the recipes on a node periodically, and that should be OK.
Do you have a way of knowing which resource within that recipe is causing you problems? The remote_file one is the only one I'm suspicious of being non-idempotent, but I'm not sure offhand.
Looking at the Chef wiki, I find this:
Deprecated Behavior: In Chef 0.8.x and earlier, Remote File was also used to fetch files from the files/ directory in a cookbook. This behavior is now provided by Cookbook File, and use of Remote File for this purpose is deprecated (though still valid) in Chef 0.9.0 and later.
Anyway, the way Chef tends to work, it will look to see whether the file that "#{Chef::Config[:file_cache_path]}/node-v#{node["nodejs"]["version"]}.tar.gz" resolves to exists, and if it does, it should skip that resource. Is it possible that install-node deletes that file when it's finished installing? If so, Chef will re-fetch it every time.
You can run a recipe only once by overriding the run list with the -o option.
sudo chef-client -o "recipe[cookbook::recipe]"
-o RunlistItem,RunlistItem..., --override-runlist    Replace current run list with specified items
In my experience remote_file always runs when executing chef-client, even if the target file already exists. I'm not sure why, though; I haven't dug into the Chef code to find the exact cause.
You can always write a not_if or only_if to control the execution of the remote_file resource, but usually it's harmless to just let it run every time.
The rest of your code looks like it's already idempotent, so there's no harm in running the client repeatedly.
There's an action you can specify for remote_file that will make it run conditionally:
remote_file 'target' do
  source 'wherever'
  action :create_if_missing
end
See the docs.
If you want to test whether your recipe is idempotent, you may be interested in ToASTER, a framework for systematic testing of Chef scripts.
http://cloud-toaster.github.io/
Chef recipes are executed with different configurations in isolated container environments (Docker VMs), and ToASTER reports various metrics such as system state changes, convergence properties, and idempotence issues.
Is there any way to get Ruby's require statement to download a file from somewhere like GitHub rather than just the local file system?
Update: Sorry, I should have made the question clearer. I want to download a file that contains a Ruby module and import it into my script, rather than just downloading an image or some other arbitrary file.
In other words something like this
require 'http://github.com/myrepo/snippet.rb'
puts 'hi'
By default, this is not possible, and it's not a good idea for security reasons.
You have a couple of alternatives. If the file you want to include is a gem hosted in a Git repository, you can use Bundler to download and package the dependency in your project. Then you'll be able to require the file directly in your source code.
This is the best and safest way to include an external dependency.
If you trust the source and you really know what you are doing, you can download the file using Net::HTTP (or any other HTTP library) and eval the body directly in your Ruby code.
You can package everything in a custom require_remote function.
You could download and eval it
require "open-uri"
alias :require_old :require
def require(path)
  # already loaded? mimic require's normal behavior
  return false if $".include?(path)
  # delegate non-HTTP paths to the original require
  return require_old(path) unless path =~ /\Ahttp:\/\//
  # fetch the remote source and evaluate it
  eval(open(path).read)
  $" << path
  true
end
Be aware: this code has no error checking for network outages, nonexistent files, etc. I also believe it is generally not a good idea to require libraries this way; there are security and reliability problems with this approach. But maybe you have a valid use case for it.
You can include a remote gem from within a Gemfile; it will then be downloaded when you run bundle install.
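For example, a Gemfile entry pointing Bundler at a Git repository might look like this (the gem name and URL are placeholders):

# Gemfile
gem 'my_snippets', git: 'https://github.com/myuser/myrepo.git'

Provided the repository contains a gemspec, require 'my_snippets' then works like any other gem after bundle install.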
After reading this question and the answers, I wanted something a little more bulletproof and verbose that used the paradigm of creating a local file from a repo and then requiring it, only if it didn't already exist locally. The request for the repo version is explicit via the method repo_require. Used on files you control, this approach improves security, IMO.
# try local load
def local_require(filename, relative_path)
  relative_flname = File.join(relative_path, filename)
  require_relative(relative_flname)
end

# try loading locally first, fall back to the repo version on load error
# caution: only use with files you control access to!
def repo_require(raw_repo_prefix, filename, relative_path = '')
  local_require(filename, relative_path)
rescue LoadError => e
  puts e.message
  require 'open-uri'
  require 'tmpdir'    # for Dir.mktmpdir
  require 'fileutils' # for FileUtils.remove_entry
  tempdir = Dir.mktmpdir('repo_require-')
  temp_flname = File.join(tempdir, File.basename(filename))
  return false if $LOADED_FEATURES.include?(temp_flname)
  remote_flname = File.join(raw_repo_prefix, filename)
  puts "file not found locally, checking repo: #{remote_flname}"
  begin
    File.open(temp_flname, 'w') do |f|
      f.write(URI.parse(remote_flname).read)
    end
  rescue OpenURI::HTTPError => e
    raise "Error: Can't load #{filename} from repo: #{e.message} - #{remote_flname}"
  end
  require(temp_flname)
  FileUtils.remove_entry(tempdir)
end
Then you could call repo_require like this:
repo_require('https://raw.githubusercontent.com/username/reponame/branch',
'filename', 'relative_path')
The relative_path would be the relative path you would use for the file if the repo were locally installed. For example, you may have something like require_relative '../lib/utils.rb'. In this example, filename='lib/utils.rb' and relative_path='..'. This information allows the repo URL to be constructed correctly, as it does not use the relative path portion.