Rails 4: using memcached as the session store with Dalli

I am not able to use memcached as the session store with Rails 4 and the dalli gem.
Here's what I have done.
I added the dalli gem to my Gemfile:
gem 'dalli'
I added the following line to config/initializers/session_store.rb:
Rails.application.config.session_store ActionDispatch::Session::CacheStore, :expire_after => 20.minutes
And I added the following line to development.rb:
config.cache_store = :dalli_store
Now when I start my development server (thin) without starting the memcached server, I can still log in as usual. Shouldn't I get an error like "no memcached server running"?
I am not sure whether Rails is using memcached as the session store or not.
Can someone tell me what I have missed in using memcached as the session store in the development environment?
For your information, I am using Devise as the authentication gem.
Thanks

Yes, you should see an error like this in the console:
DalliError: No server available
However, you will still get the session cookie, since Rails generates it and sends it to the browser; it's just that Rails has nowhere to store the data associated with the cookie.
So, for a demo, try this:
Stop memcached. In some controller action, do this:
def some_action
  puts session[:test]
  session[:test] = "hello"
end
You should not see "hello" in STDOUT.
Now, restart memcached and hit the action again (you might need to refresh the browser twice).
This time, you should see "hello".
If you again stop memcached, the "hello" will no longer be displayed.
I hope that makes it clear that the generation of the cookie (containing the session key)
and the storage of data against the value of the cookie (i.e. the session key) are two different things. And of course, ensure that memcached really is stopped.
As for being able to log in even with memcached stopped, check that you have cleared all cookies for the domain (localhost) and that you restarted the Rails server after making the change. Also, clear out the tmp/cache directory.
PS. If you do not see the error DalliError: No server available, then memcached is probably still running somewhere. Try accessing memcached through Dalli from the Rails console and see if you are able to store/get data.
PPS. If you see files being stored in tmp (like tmp/cache/D83/760/_session_id%3A4d65e5827354d0e1e8153a4664b3caa1), then that means that Rails is falling back to FileStore for storing the session data.
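A quick console check might look like this (a sketch run inside `rails console`; it assumes memcached listens on the default `localhost:11211`, and requires a live memcached to return anything):

```ruby
# In `rails console`: talk to memcached directly through Dalli, bypassing
# the Rails cache store, to confirm the server is actually reachable.
require 'dalli'

dc = Dalli::Client.new('localhost:11211')
dc.set('dalli_check', 'pong')
dc.get('dalli_check')   # "pong" when memcached is running; raises/logs
                        # DalliError: No server available when it is not
```

If this round-trip fails while your app still "works", the session data is being stored somewhere else (such as the FileStore fallback mentioned above).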

Related

How do I share Redis key/values across multiple instances with rackup/Sinatra?

I'm trying to use Redis as my session store, which seems to work just fine. However, I can't figure out how to let multiple instances of Sinatra access the same session. This is what I have in my config.ru:
require 'redis-rack'
use Rack::Session::Redis, :redis_server => "redis://#{ENV['REDIS_HOST']}:6379/0"
I must be missing an argument to set this, but the documentation is lacking for this case:
https://github.com/redis-store/redis-rack
Maybe that's not the right way to achieve this behavior?
The end goal is to deploy my Sinatra application with Docker to a clustered environment so I can release new versions without downtime. So whatever lets me share the Rack session between multiple instances works. I suppose I could create a Redis object manually and not use the session keyword; it just seems like the wrong way to do it.
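For reference, the setup described might be laid out like this in config.ru (a sketch; the app file name is hypothetical). Rack::Session::Redis keeps session data server-side in Redis and puts only the session id in the cookie, so instances that point at the same Redis server see the same sessions:

```ruby
# config.ru -- sketch; assumes the redis-rack gem and a shared Redis host
# reachable by every instance via ENV['REDIS_HOST'].
require 'redis-rack'
require './app'   # hypothetical file defining the Sinatra app

# Session data lives in Redis, keyed by the session id from the cookie,
# so any instance pointed at the same Redis can serve any request.
use Rack::Session::Redis,
    redis_server: "redis://#{ENV['REDIS_HOST']}:6379/0",
    expire_after: 86_400   # optional session TTL in seconds

run Sinatra::Application
```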

Rails 4 Simple Model Caching With Redis

I am new to Redis and Rails caching, and would like to perform simple model caching. I have just read these 2 articles :
http://www.sitepoint.com/rails-model-caching-redis/
http://www.victorareba.com/tutorials/speed-your-rails-app-with-model-caching-using-redis
Since Redis model caching consists of storing JSON strings in Redis and retrieving them with code like
def fetch_snippets
  snippets = $redis.get("snippets")
  if snippets.nil?
    snippets = Snippet.all.to_json
    $redis.set("snippets", snippets)
  end
  @snippets = JSON.load snippets
end
I don't understand why there is a need to use
gem 'redis-rails'
gem 'redis-rack-cache'
I don't see where the cache store or other caching mechanisms come into play in that kind of example, since it consists only of reading from and writing to Redis.
Thank you for any help.
Here is what I have in my Gemfile
gem 'redis'
gem 'readthis'
gem 'hiredis'
gem 'redis-browser'
readthis - recently implemented a nice feature to not crash Rails when Redis is down ("Disable Rails caching if Redis is down"). It also supports advanced Redis data types (not just strings, like redis-rails).
hiredis - is a little faster.
redis-browser - allows me to see what is actually cached (easier than the CLI).
Here is my application.rb
config.cache_store = :readthis_store, { expires_in: 1.hour.to_i, namespace: 'foobar', redis: { host: config.redis_host, port: 6379, db: 0 }, driver: :hiredis }
Then in my models I do:
def my_method_name
  Rails.cache.fetch("#{cache_key}/#{__method__}", expires_in: 1.hour) do
    # put my code here
  end
end
I used https://github.com/MiniProfiler/rack-mini-profiler to see which queries were firing lots of DB requests and determined what I should cache.
The snippet you posted isn't really clever: it assumes the snippet collection never changes, because it sets no expiration on the content stored in Redis, so stale data will be served forever.
As for the gems, you don't need them at all if your goal is the example you posted.
redis-rails is likely a plugin to connect Rails to Redis. However, connecting to Redis is as easy as creating an initializer file and opening a new connection with the correct Redis URL using the Ruby redis gem.
The second gem seems to add a Redis-based storage for Rack cache. If you don't know what it is, it's probably better if you don't use it at all.
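The missing expiration could be sketched in plain Ruby like this (an in-memory Hash stands in for Redis here; with a real connection you would pass a TTL to the write, e.g. `$redis.set(key, value, ex: ttl)`):

```ruby
require 'json'

# Minimal fetch-with-expiration pattern: return the cached value if it is
# still fresh, otherwise run the block, cache its result with a TTL, and
# return it. A Hash stands in for Redis for illustration.
class TtlCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store = {}
  end

  def fetch(key, ttl_seconds)
    entry = @store[key]
    return entry.value if entry && Time.now < entry.expires_at

    value = yield
    @store[key] = Entry.new(value, Time.now + ttl_seconds)
    value
  end
end

cache = TtlCache.new
snippets = cache.fetch("snippets", 3600) { [{ "id" => 1 }].to_json }
puts JSON.load(snippets).first["id"]   # => 1
```

On a second call within the hour the block is not run again; after the TTL passes, the data is recomputed, which is exactly what the posted snippet never does.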

Using redis with heroku

This is my first time using Redis, and the only reason I am is that I'm trying out an autocomplete search tutorial. The tutorial works perfectly in development, but I'm having trouble setting up Redis for Heroku.
I already followed the steps in the Heroku docs for setting up Redis, but when I run heroku run rake db:seed I get Redis::CannotConnectError: Error connecting to Redis on 127.0.0.1:6379 (Errno::ECONNREFUSED)
I'm not very familiar with heroku so if you guys need any more information let me know.
Edit
I've completed the initializer steps shown here and when I run heroku config:get REDISCLOUD_URL the result is exactly the same as the Redis Cloud URL under the config vars section of my Heroku settings.
Following the documentation, I then set up config/initializers/redis.rb like so:
if ENV["REDISCLOUD_URL"]
  $redis = Redis.new(:url => ENV["REDISCLOUD_URL"])
end
Just to check, I tried substituting the actual URL for redis cloud inside the if block instead of just the REDISCLOUD_URL variable but that didn't work. My error message hasn't changed when I try to seed the heroku db.
It's not enough to just create a $redis variable that points to the installed Redis server; you also need to tell Soulmate about it, otherwise it will default to localhost.
From the Soulmate README you should be able to do something like this in an initializer (instead of your current redis.rb initializer, which you won’t need unless you are using Redis somewhere else in your app):
if ENV["REDISCLOUD_URL"]
  Soulmate.redis = ENV["REDISCLOUD_URL"]
end
Looking at the Soulmate source, an easier way may be to set the REDIS_URL environment variable to the Redis url, either instead of or as well as REDISCLOUD_URL, as it looks like Soulmate checks this before falling back to localhost.
Your code is trying to connect to a local Redis instance, instead the one from Redis Cloud - make sure you've completed the initializer step as detailed in order to resolve this.
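If you go the REDIS_URL route, one way to wire it up is with the Heroku CLI (a sketch; it assumes REDISCLOUD_URL is already set on the app, as described above):

```shell
# Copy the Redis Cloud URL into REDIS_URL so libraries that check
# REDIS_URL before falling back to localhost pick it up.
heroku config:set REDIS_URL="$(heroku config:get REDISCLOUD_URL)"
```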

ruby neo4j session persistence

I'm using neo4j in a Ruby CLI app.
Each time a command is run from the command line, session = Neo4j::Session.open(:server_db) is re-established, which is quite slow.
Is there any way to persist the session on first use and re-use it in subsequent command invocations from the command line?
Regards
The neo4j-core gem uses the faraday gem to make persistent HTTP connections. That's defined here:
https://github.com/neo4jrb/neo4j-core/blob/master/lib/neo4j-server/cypher_session.rb#L24
That uses the NetHttpPersistent Faraday adapter here:
https://github.com/lostisland/faraday/blob/master/lib/faraday/adapter/net_http_persistent.rb
Which I believe uses the net-http-persistent library:
https://github.com/drbrain/net-http-persistent
When calling open on Session, you can pass in a second argument Hash of options. You can specify a connection key in that hash, which is a Faraday connection object you've created. That might allow you to save some token/string somewhere and then reload the Faraday object each time from that, to pick up the session from where it left off.
The other option is to have a daemon in the background which keeps the connection open.
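The options-hash approach might be sketched like this (hedged: whether :server_db accepts a :connection option, and the exact keys, depend on the neo4j-core version linked above; the URL is a placeholder for your server):

```ruby
# Sketch: build a Faraday connection with a persistent HTTP adapter and
# hand it to Session.open, so repeated opens reuse the same connection
# object instead of negotiating a fresh one each time.
require 'faraday'
require 'neo4j-core'

faraday = Faraday.new(url: 'http://localhost:7474') do |f|
  f.adapter :net_http_persistent
end

session = Neo4j::Session.open(:server_db, 'http://localhost:7474',
                              connection: faraday)
```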

ActiveRecord: how to reconnect to PostgreSQL automatically when connection drops?

I am using ActiveRecord with Sinatra and PostgreSQL. When the database connection drops (due to temporary network failure or postgres server restarting), my app doesn't re-acquire connection automatically. I'll have to restart the app in order to connect to postgres again. I remember I didn't have this problem when I was using Rails in another project.
Do I need to put some configuration or code to tell ActiveRecord to reconnect to PostgreSQL automatically?
ActiveRecord::Base.verify_active_connections! was removed back in 2012 in Rails commit 9d1f1b1ea9e5d637984fda4f276db77ffd1dbdcb, so we can't use that method.
What follows is the result of a short investigation. I am no expert in Rails ActiveRecord, so take it with caution (but I hope it helps).
A comment in connection_pool.rb says:
# 1. Simply use ActiveRecord::Base.connection as with Active Record 2.1 and
# earlier (pre-connection-pooling). Eventually, when you're done with
# the connection(s) and wish it to be returned to the pool, you call
# ActiveRecord::Base.clear_active_connections!. This will be the
# default behavior for Active Record when used in conjunction with
# Action Pack's request handling cycle.
So maybe you (and I; I have the same situation as you) have to return the connection to the pool.
To return the connection to the pool in Sinatra, following Action Pack's request handling cycle, use ActiveRecord::ConnectionAdapters::ConnectionManagement:
use ActiveRecord::ConnectionAdapters::ConnectionManagement
Then, as stated in Rails commit 9d1f1b1ea9e5d637984fda4f276db77ffd1dbdcb, checkout_and_verify is always called when using Base.connection, obeying the Action Pack lifecycle:
def connection
  # this is correctly done double-checked locking
  # (ThreadSafe::Cache's lookups have volatile semantics)
  @reserved_connections[current_connection_id] || synchronize do
    @reserved_connections[current_connection_id] ||= checkout
  end
end
UPDATED 2019-01-11: As of Rails 4.2 I have to use
ActiveRecord::Base.clear_active_connections!
and ActiveRecord will reconnect on the next query. This also works from the Rails console, which is rather convenient.
From https://www.new-bamboo.co.uk/blog/2010/04/11/automatic-reconnection-of-mysql-connections-in-active-record/
If you use Active Record outside Rails, or at least outside controller actions, you have to verify connections on your own before executing a database statement. This can be done with the following code:
ActiveRecord::Base.verify_active_connections!
Since Active Record uses one connection per thread, in multi-threaded applications this verification has to be executed for each thread separately.
The blog post is about reconnecting to MySQL, but I'm guessing it would be the same regardless of the engine used, as it's abstracted away. The blog also mentions a reconnect option in the configuration, but you'll have to find out whether that works for Postgres.
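In a Sinatra app, the per-request connection checkin that Action Pack provides can be approximated with the middleware mentioned above or with an after filter (a sketch; it relies on clear_active_connections! returning the connection to the pool so a dead connection is verified and replaced on the next checkout):

```ruby
# Sketch: return the ActiveRecord connection to the pool after every
# request, so the next request checks out (and verifies) a connection
# rather than reusing a possibly dead one.
require 'sinatra'
require 'active_record'

after do
  ActiveRecord::Base.clear_active_connections!
end
```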
