windows_zipfile Chef resource failing due to 'rubyzip' gem file download

We are facing a situation where end users' Windows VMs do not have internet connectivity and only have access to a file store.
We are using the windows_zipfile resource in one of our recipes, so cookbook execution fails in the windows cookbook because it cannot download rubyzip from rubygems.org.
We are thinking of solving the issue in one of two ways:
1. Replace the windows_zipfile code with a powershell_script resource and implement the logic using PowerShell commands.
2. Load the rubyzip gem and its dependencies into the file store and install the gems before calling the windows_zipfile resource.
Please provide suggestions for handling this scenario, and let me know if there is any other way to solve it.

You should be able to install a chef_gem from a local path after downloading it from a source inside your network (just replace the https://rubygems.org URL):
{"httpclient" => "2.7.1", "rubyzip" => "1.1.7"}.each do |gem,version|
filename = "#{gem}-#{version}.gem"
remote_file File.join(Chef::Config[:file_cache_path], filename) do
source "https://rubygems.org/downloads/#{filename}"
end
chef_gem gem do
source File.join(Chef::Config[:file_cache_path], filename)
version version
end
end
As the gem is used by Chef's own Ruby, make sure to use the chef_gem resource.
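If you go with option 1 instead, here is a minimal sketch using a powershell_script resource. This is only an outline, assuming PowerShell 5.0+ (for Expand-Archive); the paths are placeholders:

powershell_script 'extract application archive' do
  code <<-'EOH'
    Expand-Archive -Path 'C:\filestore\app.zip' -DestinationPath 'C:\app' -Force
  EOH
  # Hypothetical guard: skip if the target directory already exists
  not_if { ::File.directory?('C:\app') }
end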

Related

Chef recipe 'include_recipe' takes precedence over other code and resources

I'm attempting to create a Chef cookbook that, for now, is mostly just a wrapper cookbook for another cookbook (the audit cookbook). I'm still learning Chef, but from what I can gather from the About Recipes documentation and the Resources Reference documentation, Chef recipes should execute in the order that they're defined (via Ruby code and/or Chef resources).
In the About Recipes documentation, it mentions that
When a recipe is included, the resources found in that recipe will be
inserted (in the same exact order) at the point where the
include_recipe keyword is located.
In the Resources Reference documentation, they have an apt_update resource that presumably executes before the include_recipe method due to the fact that it's defined earlier in the recipe.
My wrapper cookbook has a single recipe, default.rb, which is literally these two lines:
package 'ruby-dev'
include_recipe 'audit'
However, during a chef-client or chef-solo run I see that the audit::inspec recipe runs before the security::default recipe, which causes things to break because InSpec has some other dependencies that need to be installed beforehand. Before I used the package resource I was using the execute resource to explicitly run apt-get install ruby-dev or yum install ruby-dev depending on the platform (via a case statement), but I had the same problem: all of that code was skipped and the include_recipe method was called first.
In case it's useful, I'm using Chef 12 which I realize is EOL but I have other dependencies that require me to stick with this version of Chef for now.
I may very well just be misunderstanding how Chef converges work and the order in which execution occurs, but it's causing me a lot of grief so I'd really appreciate some pointers! Does include_recipe always occur before other code within your recipe? Is there any way around this? Am I missing something?
-- EDIT --
I was able to get the desired functionality (install other packages and gems before an include_recipe call triggered installation of a dependency gem) using the following code in my cookbook recipe:
package 'build-essential' do
  action :nothing
end.run_action(:install)

chef_gem 'train' do
  version "1.4.4"
  action :install
end

chef_gem 'signet' do
  version "0.11.0"
  action :install
end

include_recipe 'audit'
Note that I ended up installing the build-essential package rather than the ruby-dev package from my original code snippet, and also installed two gems for Chef client to use. This all gets installed in the order I expected during the compile phase of the Chef run.
Sources:
https://docs.chef.io/resource_reference.html#run-in-compile-phase
https://docs.chef.io/resource_chef_gem.html
If you examine the audit::inspec recipe, you will find that it uses a compile-time installation of the inspec rubygem (see the last line):
inspec_gem 'inspec' do
  version node['audit']['inspec_version']
  source node['audit']['inspec_gem_source']
  action :nothing
end.run_action(:install)
From the Chef documentation:
run_action
Use .run_action(:some_action) at the end of a resource block to run the specified action during the compile phase.
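To make the two phases concrete, here is a small sketch (not from the audit cookbook): both resources are compiled top to bottom, but the second one executes first because run_action forces it into the compile phase.

# Runs during converge, after the whole recipe has been compiled
log 'converge time' do
  message 'executes second, during converge'
end

# Runs immediately, during the compile phase, despite appearing later
log 'compile time' do
  message 'executes first, during compile'
  action :nothing
end.run_action(:write)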

Creating Ruby gems with same name executables?

I want to have a Ruby Gem that will have the same executable as another Gem.
When called with command-line args it will either do something itself or pass the command on to the other gem.
The first problem I have is that it isn't possible to run two executables with the same name. I get this error:
Bundler is using a binstub that was created for a different gem. This is deprecated, in future versions you may need to bundle binstub yourgem to work around a system/bundle conflict.
How can I have Gems with the same named executables and ensure that the target one executes?
You cannot rely on Bundler or RubyGems to manage this for you. All it does is copy an executable that you specified in your gemspec to its bin/ directory.
The first problem you'll have is that the executable that runs may be dependent on the order in which the gems were installed which you can't guarantee.
Another problem that you'll have is that you cannot execute code on gem installation so you will be unable to run code that would try to automate this set up for people who install your gem.
I believe your gem should provide a non-conflicting executable. You can then supply post-install instructions in your gemspec that are displayed to a user installing the gem, as well as in the README, in a blog post, etc. Tell the user that they need to set up an alias that points to your executable; in all shells that I'm aware of, aliases are resolved before filesystem executables.
For the times when people want to bypass your alias and execute the original executable, you can tell them to escape the command, e.g. \original-gem. That bypasses alias and function lookup in most shells, so users get your version as the default (through the alias) and still have an easy way to reach the original.
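As a sketch of that approach: post_install_message is a standard Gem::Specification attribute you can use to print the alias instructions once at install time. The gem and executable names here are hypothetical:

# mytool.gemspec (hypothetical gem that shadows "original-gem")
Gem::Specification.new do |spec|
  spec.name        = 'mytool'
  spec.version     = '0.1.0'
  spec.summary     = 'Wrapper around original-gem'
  spec.authors     = ['You']
  spec.files       = Dir['lib/**/*.rb']
  spec.executables = ['mytool'] # deliberately NOT named "original-gem"

  # Printed once after `gem install mytool`
  spec.post_install_message = <<~MSG
    To make mytool the default, add this alias to your shell profile:
      alias original-gem='mytool'
  MSG
end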

can you get the Gem object to return a list of available gems in the path?

I am having issues getting my local gems to work on my web server. I keep getting "no such file to load" errors even though I have added my local gem dir to my .gemrc paths. I am hoping to find a way to at least see what gems I DO have access to.
I have tried adding my local gem path a couple of ways, including:
Gem.path.push "/myHome/usrName/ruby/gems"
with no luck. How do I do something like
Gem.available_gems.each do |g|
  puts g
end
?
Try using Bundler. It manages your gems for you and lets your project know where to find them when employed properly. On production servers it can install a local copy of the gem so that you don't need to worry about setting up your path vars correctly.
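That said, if you just want to see which gems RubyGems can currently find, Gem::Specification is enumerable on modern RubyGems, so something like this works:

require 'rubygems'

# Print every gem visible on the active Gem.path
Gem::Specification.each do |spec|
  puts "#{spec.name} (#{spec.version})"
end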

Ruby gems in lib - spare tire principle

I'm working on a console Ruby application (not Rails!). I will be installing this application on several machines. I was wondering if there is a way I can build it so I don't have to install the gems I'm using for the app on each machine. I'd like to be able to just copy the directory to each machine and run it. Ideally, I'd like to put the gems in the lib folder or something and reference them from there, so I don't even have to install them on my dev machine. Is there a way to do this?
In .NET, we call this the "spare tire" principle.
Thanks,
Craig
How about using Bundler?
Then you can include a Gemfile that specifies all the necessary gems and just run bundle install on each machine to pull them down.
If you really want to bundle the gems with the app, run bundle package and they will be stored in vendor/cache.
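For example, assuming Bundler is already installed on each machine:

$ bundle package           # on the dev machine: cache .gem files into vendor/cache
$ bundle install --local   # on each target machine: install from vendor/cache only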
You could take the same approach as Rails and "vendor" your gems. This involves creating a new directory (Rails uses vendor/gems) and unpacking each gem into it with gem unpack.
You then configure your load path to include all of the lib sub-folders below that.
Edit
You can configure your load path by doing something like this:
Dir.glob(File.join("vendor", "gems", "*", "lib")).each do |lib|
$LOAD_PATH.unshift(File.expand_path(lib))
end

How to develop a gem in staging environment?

I am trying to hack on a forked gem (buildr). I cloned it from GitHub and began to butcher the code. The official gem is installed on my system (under /usr/lib/ruby.../gems/buildr...). There is an executable which I need to use in my dev process: buildr.
Now I want the buildr executable and the library to point to my forked repo and not the default gem installation, for this gem only. That way, the changes I make against the forked repo are usable directly for testing and so forth.
I would guess I need to load my library prior to the system gem loading. Can somebody recommend the best way to do so?
I did something similar for work when the Spreadsheet gem broke backward compatibility. I put the previous version's code in its own module, renamed the gem my-spreadsheet, and installed that (I really wanted some of the features of the new gem, but I also didn't want to rewrite all my previous code at that point).
If it's just a binary you want to override, you could always do some PATH magic: put the directory containing your binary first so it always wins. But personally I'd prefer making my own copy with a new name and installing that.
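The PATH approach would look something like this in your shell profile (the fork location is hypothetical):

$ export PATH="$HOME/src/buildr-fork/bin:$PATH"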
You could bump the version in the gemspec for your fork. Then when you install your version of the gem, it will be used by default because it is the newest installed version.
Change buildr.gemspec:
#...
spec.version = '1.3.4.dev'
#...
Then
$ gem build buildr.gemspec
$ sudo gem install buildr-1.3.4.dev.gem
and it should work.
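A related option, if your project uses Bundler, is to point the Gemfile at your local checkout (the path is a placeholder):

# Gemfile: resolve buildr from the forked working copy instead of the installed gem
gem 'buildr', path: '/home/you/src/buildr-fork'

Running the executable via bundle exec buildr then picks up the fork's code.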
