We're using the MongoHQ addon on Heroku, with the Mongoid 3.0 adapter. The addon plans come with a size limit, and Mongo will silently fail writes once the DB limit has been reached (unless configured for safe mode, in which case it'll throw exceptions).
I'm trying to query from within the app how close we are and send an alert if we've reached the limit. How can I run something like the db.stats() command but using Mongoid?
I've found out how to do this in Mongoid 3.x, which uses Moped as its driver rather than the 10gen Ruby driver. The answer came from the author of Moped himself, in response to a GitHub issue raised on the matter.
Mongoid.default_session.command(collstats: 'collection_name')
This will return the same results as calling .stats() on the collection from the Mongo console. As an additional bonus, if the collection is capped, there'll be a flag in the return values indicating that.
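For the database-wide numbers that db.stats() itself reports, the dbStats command can be issued the same way. Below is a minimal sketch of the alerting check; the plan limit constant and the 90% threshold are assumptions for illustration:

LIMIT_BYTES = 512 * 1024 * 1024  # e.g. a 512 MB plan; adjust to your addon plan

stats = Mongoid.default_session.command(dbstats: 1)
if stats['dataSize'].to_f / LIMIT_BYTES > 0.9
  # getting close to the plan limit; send your alert here
end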
You can call the .db method on your object (e.g. a Document) and call .stats on that.
For example:
MyBlog.db.stats
For versions prior to Mongoid 3.0.0, Mongoid.master.stats should also work.
I'm running a very basic Sinatra server, which simply shows a Chartkick graph of some data I have through the Sequel gem. I'm noticing that the data on the chart doesn't seem to update unless I quit the Sinatra server script and rerun it. I don't really understand how that would be possible... the only non-standard option I'm using when reading my database with Sequel is the read-only option. Would that cause this?
It turns out, from reading another post on here:
First, by default, multiple processes can have the same SQLite database open at the same time, and several read accesses can be satisfied in parallel.

In case of writing, a single write to the database locks the database for a short time, during which nothing, not even reading, can access the database file at all.

Beginning with version 3.7.0, a new “Write Ahead Logging” (WAL) option is available, in which reading and writing can proceed concurrently.

By default, WAL is not enabled. To turn WAL on, refer to the SQLite documentation.
I currently have script A, which maintains a connection to the DB file and writes to it regularly, and script B, my Sinatra server, which reads information from that DB file. I worked around this issue by using a block connection in my Sinatra script, so the connection is opened and closed for each read. I don't know how to turn on WAL with Sequel, though...
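For what it's worth, WAL is enabled per database file with a PRAGMA, and Sequel can issue that as plain SQL. A minimal sketch, where the file name is an assumption:

require 'sequel'

DB = Sequel.sqlite('events.db')
# Switch the journal mode to write-ahead logging; the setting is
# persistent, since it's stored in the database file itself.
DB.run "PRAGMA journal_mode=WAL;"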
So I'm essentially using a column in my Parse Server DB as configuration for my app. I now realize Parse has a Config feature specifically for this.

My question is: which is faster/better to use, and why? Hitting a query every single time I want to check, or checking the Parse Config? Isn't checking the config variable just another query?

If anyone can point me to proper (or any) documentation, that would help too...
ParseConfig is faster because it stores the config in a cache, while a ParseQuery retrieves it from the DB. But you can't use ParseConfig for every query operation, only to get things that are relevant to all of your users, like a "Message of the day" and so on.

Another very cool thing about ParseConfig is that if you don't have network connectivity, it will automatically fall back to the client cache.
I'm attempting to write a simple Ruby/Nokogiri scraper to get event information from multiple pages and then output it to a CSV that is attached to an email sent out weekly.
I have completed the scraping components and the CSV component and it's working perfectly. However, I now realize that I need to know when new events are added, which means I need some sort of database. Ideally I would just store this locally.
I've dabbled a bit with using the ruby gem 'sequel', but the data does not seem to persist beyond the running of the program. Do I need to download some database software to work with 'sequel'? Also I'm not using the Rails framework, just Ruby.
Any and all guidance is deeply appreciated!
I'm guessing you did Sequel.sqlite, as in the first example in the Sequel README, which creates an in-memory SQLite database. To create a database in your filesystem instead of memory, just pass it a path, e.g.:
Sequel.sqlite("./my-database.db")
This is, of course, assuming that you have the sqlite3 gem installed. If the given file doesn't exist, it will be created.
This is covered in the Sequel docs.
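To make the persistence concrete, here is a minimal sketch of storing scraped events on disk and spotting new ones; the file name, table layout, and sample data are assumptions for illustration:

require 'sequel'

DB = Sequel.sqlite('./events.db')   # on-disk file, survives restarts

DB.create_table? :events do         # no-op if the table already exists
  primary_key :id
  String :name
  String :url, unique: true
end

events = DB[:events]
scraped = [{ name: 'Example Event', url: 'https://example.com/event' }]

scraped.each do |event|
  next unless events.where(url: event[:url]).empty?  # skip events we've seen
  events.insert(event)
  puts "New event: #{event[:name]}"
end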
I am using ActiveRecord with Sinatra and PostgreSQL. When the database connection drops (due to a temporary network failure or the Postgres server restarting), my app doesn't re-acquire a connection automatically. I have to restart the app in order to connect to Postgres again. I remember I didn't have this problem when I was using Rails in another project.
Do I need to put some configuration or code to tell ActiveRecord to reconnect to PostgreSQL automatically?
ActiveRecord::Base.verify_active_connections! was removed back in 2012, in Rails commit 9d1f1b1ea9e5d637984fda4f276db77ffd1dbdcb, so we can't use that method.
The notes below are the result of a short investigation. I'm no expert in Rails' ActiveRecord, so read with caution (but I hope this is helpful).
A comment in connection_pool.rb says:
# 1. Simply use ActiveRecord::Base.connection as with Active Record 2.1 and
# earlier (pre-connection-pooling). Eventually, when you're done with
# the connection(s) and wish it to be returned to the pool, you call
# ActiveRecord::Base.clear_active_connections!. This will be the
# default behavior for Active Record when used in conjunction with
# Action Pack's request handling cycle.
So maybe you (and I; I'm in the same situation as you) have to return the connection to the pool.

To return connections to the pool in Sinatra, following Action Pack's request handling cycle, use ActiveRecord::ConnectionAdapters::ConnectionManagement:
use ActiveRecord::ConnectionAdapters::ConnectionManagement
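In a Rack/Sinatra app that middleware goes in config.ru; a minimal sketch, where the app file and class names are assumptions:

# config.ru
require 'active_record'
require './app'  # your classic-style Sinatra app

use ActiveRecord::ConnectionAdapters::ConnectionManagement
run Sinatra::Application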
Then, as stated in Rails commit 9d1f1b1ea9e5d637984fda4f276db77ffd1dbdcb, a different approach is used, as in the lines below: connections always go through checkout_and_verify when you use Base.connection, by obeying the Action Pack lifecycle.
def connection
  # this is correctly done double-checked locking
  # (ThreadSafe::Cache's lookups have volatile semantics)
  @reserved_connections[current_connection_id] || synchronize do
    @reserved_connections[current_connection_id] ||= checkout
  end
end
UPDATE 2019-01-11: As of Rails 4.2 I have to use
ActiveRecord::Base.clear_active_connections!
and ActiveRecord will reconnect on the next query. This also works from the Rails console, which is rather convenient.
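In a Sinatra app, one minimal way to apply this (a sketch, assuming a classic-style app) is an after filter that releases the connection once each request finishes:

require 'sinatra'
require 'active_record'

after do
  # Return the connection to the pool after every request; if it has
  # died in the meantime, ActiveRecord checks out a fresh one on the
  # next query instead of reusing the dead connection.
  ActiveRecord::Base.clear_active_connections!
end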
From https://www.new-bamboo.co.uk/blog/2010/04/11/automatic-reconnection-of-mysql-connections-in-active-record/
If you use Active Record outside Rails or at least outside controller actions you have to verify connections on your own before executing a database statement. This can be done with the following code:
ActiveRecord::Base.verify_active_connections!
Since Active Record uses one connection per thread, in multi-threaded applications this verification has to be executed for each thread separately.
The blog post is about reconnecting to MySQL but I'm guessing it would be the same regardless of the engine used, as it's abstracted away. The blog also mentions a reconnect option in the configuration, but you'll have to find out if that works for Postgres.
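For reference, with the mysql2 adapter that reconnect option is just another connection parameter, so outside Rails you could pass it when establishing the connection; whether the Postgres adapter honors anything similar is exactly what you'd have to check. A sketch, with the database name as an assumption:

require 'active_record'

ActiveRecord::Base.establish_connection(
  adapter:   'mysql2',
  database:  'myapp_production',  # name is an assumption
  reconnect: true                 # mysql2's auto-reconnect flag
)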
I've been using the connection option :connecttimeoutms when setting up MongoDB connections using the MongoDB Ruby gem, like so:
connection = Connection.from_uri(uri, :connecttimeoutms => connect_timeout)
Since a recent bundle update, I'm now getting warnings that:

connecttimeoutms is not a valid option for Mongo::Connection
Has this disappeared? Does anyone know what I should replace it with?
@sumskyi is absolutely right, of course.

After further investigation, it seems the option is now :connect_timeout and is measured in seconds.
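So the earlier call becomes something like this, with the value now given in seconds (the 10 here is an assumption):

connection = Connection.from_uri(uri, :connect_timeout => 10)  # seconds, not ms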
Thanks