What is the best way to store project-specific config info in Ruby Rake tasks?

I have rake tasks for getting the production database from the remote server, etc. It's always the same tasks, but the server info changes per project. I have the code here: https://gist.github.com/868423. In the last task, I'm getting an error because @local_db_dir_path is nil.
I don't think I want to use shell environment variables, because I don't want to set them up each time I use rake or open a new shell.

Stick the settings in a YAML file, and read it like this:
require 'yaml'
config = YAML.load_file("config.yaml") # or wherever
$remote_host = config['remote_host']
$ssh_username = config['ssh_username']
# and so on
Or you can just read one big config hash:
$config = YAML.load_file("config.yaml")
Note that I'm using globals here, not instance variables, so there's no chance of being surprised by variable scope.
config.yaml would then look like this:
---
remote_host: some.host.name
ssh_username: myusername
other_setting: foo
whatever: bar
I tend to keep a config.yaml.sample checked in with the main body of the code, which has example but non-working settings for everything, and which I can copy across to the non-versioned config.yaml. Some people like to keep their config.yaml checked in to a live branch on the server itself, so that it's versioned, but I've never bothered with that.
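To tie that back to the question, a Rakefile can load the same file once and use it inside the tasks. Here is a minimal sketch; the remote_db_dump_path and local_db_dir_path keys and the scp command are assumptions standing in for whatever your gist actually does:
# Rakefile -- sketch only; key names and the copy command are illustrative
require 'yaml'

$config = YAML.load_file('config.yaml')

namespace :db do
  desc "Copy the remote production database dump to a local directory"
  task :fetch do
    remote = "#{$config['ssh_username']}@#{$config['remote_host']}"
    sh "scp #{remote}:#{$config['remote_db_dump_path']} #{$config['local_db_dir_path']}"
  end
end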

You should be using Capistrano for this. You could use the multistage extension, or just separate the host settings into their own tasks; an example Capistrano config would look like this:
task :development do
  server "development.host"
end

task :backup do
  run "cd #{current_path}; rake db:dump"
  download "remote_path", "local_path"
end
and call it like this:
cap development backup
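If you end up with many hosts or stages, the multistage extension cuts the repetition. A rough sketch, assuming Capistrano 2 and the capistrano-ext gem:
# config/deploy.rb -- sketch assuming Capistrano 2 + the capistrano-ext gem
require 'capistrano/ext/multistage'

set :stages, %w(development production)
set :default_stage, "development"
# per-stage settings then live in config/deploy/development.rb, config/deploy/production.rb, etc.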

Related

Environment Variables on Heroku and Mailgun Problems with Phoenix Framework

I was following this guide on deploying to Heroku and this one for sending email.
Everything works fine in development. My variables are set in Heroku:
heroku config
...
MAILGUN_DOMAIN: https://api.mailgun.net/v3/xxxxxx.mailgun.org
MAILGUN_KEY: key-3-xxxxxx
...
And loaded from the config files like so:
config :take_two, Mailer,
  domain: System.get_env("MAILGUN_DOMAIN"),
  key: System.get_env("MAILGUN_KEY")
However when I try to send email on Heroku when the Mailgun config is set from environment variables I get this error:
** (FunctionClauseError) no function clause matching in IO.chardata_to_string/1
    (elixir) lib/io.ex:346: IO.chardata_to_string(nil)
    (elixir) lib/path.ex:467: Path.join/2
    (elixir) lib/path.ex:449: Path.join/1
    lib/client.ex:44: Mailgun.Client.send_without_attachments/2
This happens when the domain is not set for the Mailgun Client. But it is supposed to be set from the environment variable. I made a simple module to test:
defmodule TakeTwo.Mailer do
  require Logger

  use Mailgun.Client,
    Application.get_env(:take_two, Mailer)

  def blank_shot do
    Logger.info Application.get_env(:take_two, Mailer)[:domain]
    Logger.info Application.get_env(:take_two, Mailer)[:key]
    send_email from: "steve@xxx.com", to: "speggy@xxx.com", subject: "Hello", text: "This is a blank shot"
  end
end
When I run TakeTwo.Mailer.blank_shot I see the correct domain/key variables logged followed by the error. I am not sure how to debug the Mailgun client remotely.
Finally, if I recreate the above module in the shell (after running heroku run iex -S mix) it works just fine!?
I feel like the environment variables may not have been loaded yet at the point where the original module is being loaded?
The answer was a little buried in a comment so I wanted to make it easier to find. As the other answer mentions, the environment variables aren't available, but the buildpack lets you configure them to be:
I created an elixir_buildpack.config file and added the following:
config_vars_to_export=(DATABASE_URL MAILGUN_DOMAIN MAILGUN_KEY SECRET_KEY_BASE)
The environment variables aren't available at build time. I had the same issue and decided to get rid of the macro carrying the configuration. You can use this patch to move on.

Running Cucumber tests on different environments

I'm using Cucumber and Capybara for my automated front end tests.
I have two environments that I would like to run my tests on. One is a staging environment, and the other is the production environment.
Currently, I have my tests written to access staging directly.
visit('https://staging.somewhere.com')
I would like to re-use the tests in production (https://production.somewhere.com).
Would it be possible to store the URL in a variable in my step definitions
visit(domain)
and define domain using an environment variable called from the command line? Like
$> bundle exec cucumber features DOMAIN=staging
if I want to point the tests to my staging environment, or
$> bundle exec cucumber features DOMAIN=production
if I want it to run in production?
How do I go about setting this up? I'm fairly new to Ruby and I've been searching the forums for straightforward information but could not find any. Let me know if I can provide more information. Thanks for your help!
In the project's config folder, create a config.yml file:
---
staging:
  :url: https://staging.somewhere.com
production:
  :url: https://production.somewhere.com
The extra colon in the yml file allows the hash key to be called as a symbol.
In your support/env.rb file, add the following
require 'yaml'
ENV['TEST_ENV'] ||= 'staging'
project_root = File.expand_path('../..', __FILE__)
$BASE_URL = YAML.load_file(project_root + "/config/config.yml")[ENV['TEST_ENV']][:url]
This will default to the staging environment unless you override the TEST_ENV. Then, from your step or hook, you can call:
visit($BASE_URL)
or, if you need it interpolated into a string:
visit "#{$BASE_URL}"
This will allow you to use
bundle exec cucumber features TEST_ENV=production
I don't use cucumber much but you should be able to do
bundle exec cucumber features DOMAIN=staging
then in your tests use ENV['DOMAIN'] || YOUR_DEFAULT_DOMAIN to utilize this variable. YOUR_DEFAULT_DOMAIN should probably be your test environment.
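For example, a features/support/env.rb sketch for that approach; the URL mapping and the default are placeholders:
# features/support/env.rb -- sketch; URLs and default are placeholders
DOMAINS = {
  'staging'    => 'https://staging.somewhere.com',
  'production' => 'https://production.somewhere.com'
}
DOMAIN = DOMAINS.fetch(ENV['DOMAIN'] || 'staging')

# then in a step definition: visit(DOMAIN)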

Using Rake, which loads my db from a YAML file, can I set environments?

My database.yml looks like:
adapter: mysql
database: my_db
username: user1
password: '123'
host: localhost
This is a non-rails application, just using rake/ruby for some scripting.
Can I set a default (dev) and production in this yaml file, or is that rails specific?
If yes, when running something like:
rake user:create
How do I pass in if it is production and therefore use the production db settings in the yaml file?
Where you read the yaml file into memory and parse it is a good place to put the logic to use the current environment (or a default if none is set). A simple way to do this is to rely on an environment variable (don't use RAILS_ENV if it's not a rails app, or RACK_ENV if it's not a rack app).
Use something like:
my_env = ENV['MY_ENV_VARIABLE'] || 'development'
db_settings = YAML::load(File.open(yml_file))[my_env]
Then you can call rake via:
MY_ENV_VARIABLE=production rake my_task
Or if you want to add a param to the task itself, you can set it up to use
rake my_task[production]
(but that can get messy depending on if you have to quote the whole thing, like in zsh).
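Concretely, a task that takes the environment as an argument might look roughly like this; the task name, YAML layout, and connect step are illustrative:
# Rakefile -- sketch; task name and YAML layout are illustrative
require 'yaml'

task :my_task, [:env] do |_t, args|
  env = args[:env] || 'development'
  db_settings = YAML.load_file('database.yml')[env]
  # ... connect and do the work using db_settings
end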
Another approach that some libraries use (like heroku_san) is to have a separate task that sets the environment variable, and rely on calling multiple tasks, so you'd have a task :production that sets the environment variable and can then call
rake production my_task
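A minimal sketch of that pattern (names are illustrative); running rake production my_task invokes :production first, so :my_task sees the variable:
# Rakefile -- sketch; the variable and task names are illustrative
task :production do
  ENV['MY_ENV_VARIABLE'] = 'production'
end

task :my_task do
  env = ENV['MY_ENV_VARIABLE'] || 'development'
  db_settings = YAML.load_file('database.yml')[env]
  # ... use db_settings
end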

What is the best way to write specs for code that depends on environment variables?

I am testing some code that pulls its configuration from environment variables (set by Heroku config vars in production; for local development I use foreman).
What's the best way to test this kind of code with RSpec?
I came up with this:
before :each do
  ENV.stub(:[]).with("AWS_ACCESS_KEY_ID").and_return("asdf")
  ENV.stub(:[]).with("AWS_SECRET_ACCESS_KEY").and_return("secret")
end
If you don't need to test different values of the environment variables, I guess you could set them in spec_helper instead.
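For instance, near the top of spec_helper.rb (the values are obviously dummies):
# spec_helper.rb -- sketch; dummy values used for the whole suite
ENV['AWS_ACCESS_KEY_ID']     ||= 'asdf'
ENV['AWS_SECRET_ACCESS_KEY'] ||= 'secret'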
You also can stub the constant:
stub_const('ENV', {'AWS_ACCESS_KEY_ID' => 'asdf'})
Or, if you still want the rest of the ENV:
stub_const('ENV', ENV.to_hash.merge('AWS_ACCESS_KEY_ID' => 'asdf'))
That would work.
Another way would be to put a layer of indirection between your code and the environment variables, like some sort of configuration object that's easy to mock.
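A sketch of such a wrapper; the AppConfig name is made up:
# Sketch of a thin indirection layer; AppConfig is a hypothetical name
module AppConfig
  def self.aws_access_key_id
    ENV.fetch('AWS_ACCESS_KEY_ID')
  end

  def self.aws_secret_access_key
    ENV.fetch('AWS_SECRET_ACCESS_KEY')
  end
end

# In a spec you then stub the wrapper instead of ENV:
#   allow(AppConfig).to receive(:aws_access_key_id).and_return('asdf')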
This syntax works for me:
module SetEnvVariable
  def set_env_var(name, value)
    # Old syntax:
    #   ENV.stub(:[])
    #   ENV.stub(:[]).with(name).and_return(value)

    # Stub a default value first if the message might be received with other args as well.
    allow(ENV).to receive(:[])
    allow(ENV).to receive(:[]).with(name).and_return(value)
  end
end
As Heroku suggests, you can use Foreman's .env file to store environment variables for development.
If you do that, you can use foreman run to run your specs:
foreman run bundle exec rspec spec
If you're using dotenv to set up your environment during tests but need to modify an env variable for a specific test, then the following approach can be useful.
A simpler method than stubbing ENV is to replace the environment for the duration of the test, and then restore it afterwards like so:
with_environment("FOO" => "baz") do
  puts ENV.fetch("FOO")
end
Using a helper like this:
module EnvironmentHelper
  def with_environment(replacement_env)
    original_env = ENV.to_hash
    ENV.update(replacement_env)
    yield
  ensure
    ENV.replace(original_env)
  end
end
By using ensure, the original environment is restored even if the test fails.
There's a handy comparison of methods for setting and modifying environment variables during tests, including stubbing ENV, replacing values before/after the test, and gems like ClimateControl.
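For reference, a sketch of the block-scoped style using the climate_control gem:
# Sketch assuming the climate_control gem is installed
require 'climate_control'

ClimateControl.modify AWS_ACCESS_KEY_ID: 'asdf' do
  # ENV['AWS_ACCESS_KEY_ID'] is 'asdf' only inside this block
end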
I'd avoid ENV.stub(:[]): it does not work if other things are using ENV, such as pry (you'll get an error about needing to stub DISABLE_PRY).
#stub_const works well as already pointed out.
You can use https://github.com/littleowllabs/stub_env to achieve this. It allows you to stub individual environment variables without stubbing all of them as your solution suggested.
Install the gem then write
before :each do
  stub_env('AWS_ACCESS_KEY_ID', 'asdf')
  stub_env('AWS_SECRET_ACCESS_KEY', 'secret')
end
What you want is the dotenv gem.
Running tests under foreman, as @ciastek suggests, works great when running specs from the CLI. But that doesn't help me run specs with Ruby Test in Sublime Text 2. Dotenv does exactly what you need, transparently.
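A minimal setup sketch, assuming the dotenv gem and a .env file containing lines like AWS_ACCESS_KEY_ID=asdf:
# spec_helper.rb -- sketch assuming the dotenv gem
require 'dotenv'
Dotenv.load # populates ENV from .env before the examples run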

Capistrano: How to Include common settings in multiple project deploy.rb files

This is probably a newbie Ruby question. I have several libraries and apps that I need to deploy to several different hosts. All of the apps and libs will share some common settings for those hosts, e.g. host name, database server/user/pass, etc.
My goal is to do something like:
cap host1 stage deploy
cap host2 stage deploy
cap host1 prod deploy
# ...
My question is how do you include these common settings in all of your deploy.rb files? More specifically, I want to create an rb file that I can include that has some common settings and several host-specific task definitions:
set :use_sudo, false
# set some other options

task :host1 do
  role :app, "host1.example.com"
  role :web, "host1.example.com"
  role :db,  "host1.example.com", :primary => true
  set :rodb_host, "dbhost"
  set :rodb_user, "user"
  set :rodb_pass, "pass"
  set :rodb_name, "db"
end

task :host2 do
  #...
end

deploy.task :carsala do
  transaction do
    setup
    update_code
    symlink
  end
end
And then "include" this file in all of my deploy.rb files where I define stage, prod, etc., and overwrite any "common" configuration parameters as necessary. Any suggestions would be appreciated. I've tried a few different things, but I get errors from cap for all of them.
Edit: I've tried
require 'my_module'
But I get errors complaining about an undefined task object.
I just experimented with it a little more and what I discovered is that you have to:
load 'config/my_module'
I can put all of my common definitions here and just load it into my deploy.rb.
It appears from the docs that load loads and executes the file, whereas require attempts to load the library specified. I'm not totally sure about the real difference, but it appears that there is some separation between the current app's symbol space and the library you require (hence the errors about the undefined task object) that isn't a problem when you do a load.
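So a per-project deploy.rb can start by loading the shared file and then override as needed; a rough sketch (file and setting names are illustrative):
# config/deploy.rb -- sketch; shared settings live in config/my_module.rb
load 'config/my_module'

set :application, "my_app"   # project-specific overrides go here

task :stage do
  set :deploy_to, "/var/www/stage/my_app"
end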
require 'my_extension'
Save your extensions in my_extension.rb
Jon has it right; that's the simplest way to go: just save it in a separate file and use require 'filename'. You could also use something fancy like Webistrano for deployment, which also supports this in the form of Capistrano 'Recipes'. I've been using it for a while on a few projects and have come to love it.
I'm not sure how complex your needs are, but this works well for me for deployment:
set :application, "app"
set :scm, :subversion
# ... set all your common variables

task :staging do
  set :repository, "http://app/repository/trunk/"
  # ... set other uncommon variables in task
end

task :production do
  set :repository, "http://app/repository/production/"
  # ...
end
Deployment is just
cap staging deploy
or
cap production deploy
