Ruby MongoDB: multiple records of same value

I'm new to MongoDB and databases in general. I'm using Ruby and I would like to query against a specific UUID in the database.
The ID is stored as _id and the value is '101b437a-be16-44f6-b0b0-0201cdee6510'
I have the following that usually queries my database:
field = '_id'
value = '101b437a-be16-44f6-b0b0-0201cdee6510'

def query_field(field, value)
  query = { field => value }
  @result = @mongo_interface.get(query)
  expect(@result.count).to be >= 1
  puts "Number of matched values: #{@result.count}"
end

def get(param_hash, collection_name = nil)
  col_name = (collection_name.nil? || collection_name.empty?) ? @collection : collection_name
  @docs = @db[col_name].find(param_hash)
end
When I look at the _id field in the database, it appears to be stored as some sort of binary key, which is presumably why my search doesn't find it.
Is there some conversion I could/should do to make the query above work?
Thank you.
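If the _id really was written as a binary UUID, one conversion that should let the raw query match is to wrap the string in a BSON binary value before querying. This is only a sketch, assuming the bson gem and the standard :uuid subtype; the subtype has to match whatever the writing driver actually used.

require 'bson'

value = '101b437a-be16-44f6-b0b0-0201cdee6510'
# Pack the hex UUID into 16 raw bytes and wrap it as a BSON binary value.
# Drivers that wrote legacy UUIDs used the :uuid_old subtype instead of :uuid.
binary_uuid = BSON::Binary.new([value.delete('-')].pack('H*'), :uuid)
query = { '_id' => binary_uuid }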

Using an ODM like Mongoid will ease your pain. Add it to your Gemfile:
gem 'mongoid'
and run bundle install. Make sure you skim through the installation guide and add all the necessary configs.
Then include the following line in your model/class, for example:
class Product
include Mongoid::Document
...
end
You'll be able to query the records like Product.find(id) right after.
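A rough usage sketch (the class and field names are just illustrative, and this assumes the _id is stored as a plain string):

# Find by primary key
product = Product.find('101b437a-be16-44f6-b0b0-0201cdee6510')

# Or query on an arbitrary field
matches = Product.where(name: 'widget')
puts "Number of matched values: #{matches.count}"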

Related

Select a record based on translated name

On the Rails console, when I do Resource.all it gives me the following:
[#<Resource id: "...", name_translated: {"en"=>"vehicle", "fr"=>"véhicule"}>, ...]
How do I make a selection such that Resource.find_by_name_translated("vehicle") returns the matching record?
This would work, though I don't think it's the most efficient way:
# app/models/resource.rb
def self.find_by_english_name(name)
  Resource.all.select do |resource|
    resource.name_translated['en'] == name
  end
end
If you want to be able to find by multiple languages (defaulting to English) with one method, try this:
def self.find_by_name(name, language = 'en')
  Resource.all.select do |resource|
    resource.name_translated[language] == name
  end
end
Since you're using Postgres this can also be written as follows:
def self.find_by_name(name, language = 'en')
  Resource.where("name_translated ->> '#{language}' = '#{name}'")
end
I'd fall back to a pattern-match (LIKE) query if your DB doesn't allow querying by JSON fields:
Resource.where("name_translated LIKE '#{translated_name}%'")
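As a small variation on the Postgres version (a hedged suggestion, not required): passing the language and name as bind parameters avoids interpolating user input straight into the SQL string.

def self.find_by_name(name, language = 'en')
  Resource.where('name_translated ->> ? = ?', language, name)
end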

Mongo Ruby Driver #find() on Specified field Values

Ruby: ruby 1.9.3p194 (2012-04-20 revision 35410) [x86_64-linux]
RubyGem: mongo (2.0.4)
I need help querying a MongoDB database with their gem and updating the appropriate fields.
EDIT: I'm trying to loop over documents of a Mongo database, pull down the values of specific fields in those documents, and update them later in the script.
Objectives
Query the database for documents where the field partner_id is "partner" and where the field state is "provisioned", and return only the values under the _id and config fields.
After this point, I'll be iterating over each document, generating a password, and updating another database.
Update the database with the newly generated password to each documents config field.
I'm at my wit's end, as I've seen about half a dozen different ways to write the syntax, and the documentation is little help unless I already knew how to do these things. Any assistance would be greatly appreciated!
#!/usr/bin/env ruby
require 'json'
require 'net/http'
require 'mongo'

# Fetch the addons database URI and connect.
db_uri = ENV['DATABASE_URI']
client = Mongo::Client.new(db_uri)

# Connect to the needed collection and pull down each document to be looped over individually.
# **Having trouble getting this to work. The result is just '= []' - don't know what I'm doing wrong.
client[:collection].find("partner_id" => "partner", "state" => "provisioned", :fields => ["_id", "config"]).each {
  # Need something here to pull down the values from each document's '_id' and 'config' fields and assign them to variables.
  user_id =
  user_config =
  user_config = JSON.parse(user_config)

  # ...generating password and updating other database...

  # Convert the Hash of the user's new configuration into JSON, and update the original database with it.
  # Not sure if any of this is correct. When querying to check, the database doesn't seem to be updated.
  user_config = user_config.to_json
  client[:collection].update(
    { "_id" => user_id },
    { '$set' => { "config" => user_config } }
  )
}
You're not finding anything because this:
:fields => ["_id", "config"]
argument to find isn't specifying the fields you want; find just sees it as a third document field to match. Your documents probably don't have a field called fields whose value is an array of those strings, so the query silently finds nothing at all.
If you want to limit your query, you need to use projection:
client[:collection].find("partner_id" => "partner", "state" => "provisioned")
                   .projection('_id' => 1, 'config' => 1)
                   .each { |doc| ... }
Then inside the each block doc will be a Hash, so you can say:
user_id = doc['_id']
user_config = doc['config']
If I'm reading your code right, user_config should already be a Hash at that point, so you probably won't need to parse it yourself.
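Putting the pieces together, the whole loop might look roughly like this; generate_password is a hypothetical stand-in for whatever your script actually does, and update_one is the mongo 2.x name for the update call:

collection = client[:collection]

collection.find('partner_id' => 'partner', 'state' => 'provisioned')
          .projection('_id' => 1, 'config' => 1)
          .each do |doc|
  user_id     = doc['_id']
  user_config = doc['config'] # already a Hash-like BSON::Document

  user_config['password'] = generate_password # hypothetical helper

  collection.update_one(
    { '_id' => user_id },
    { '$set' => { 'config' => user_config } }
  )
end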

Changing ID to something more friendly

When creating a record the URL generated to view that record ends with its id
/record/21
I would like to be able to change that to something easier to read, such as my name and reference attributes from the model. I have looked at friendly_id but had trouble implementing a custom method to generate the URL:
class Animal < ActiveRecord::Base
  extend FriendlyId
  friendly_id :name_and_ref

  def name_and_ref
    "#{name}-#{reference}"
  end
end
I ended up getting an error
PG::UndefinedColumn: ERROR: column animals.name_and_ref does not exist LINE 1: SELECT "animals".* FROM "animals" WHERE "animals"."name_an... ^ : SELECT "animals".* FROM "animals" WHERE "animals"."name_and_ref" = 'Clawd-A123456' ORDER BY "animals"."id" ASC LIMIT 1
def show
  @animal = Animal.friendly.find(params[:id])
end
I then came across the to_param method which Rails provides; in my model I have
def to_param
  "#{self.id}-#{self.name}"
end
which will generate a URL for me of
/19-clawd
This works, but when I do the following it throws an error
def to_param
  "#{self.name}-#{self.reference}"
end
My question, though, is how I can generate the URL from name and reference without it throwing
Couldn't find Animal with 'id'=Clawd-A123456
If you would like to use your own "friendly id" then you'll need to adjust the find statement in your controller to something like
id = params[:id].split(/-/, 2).first
@animal = Animal.find(id)
Similarly, for the name/reference combination
name, reference = params[:id].split(/-/, 2)
@animal = Animal.find_by(name: name, reference: reference)
The second choice is a little more difficult because you'll have to do some work in the model to guarantee that the name/reference pair is unique.
The easiest way is to go with friendly_id and simply add the missing database column. Keep in mind that you will need to ensure this new column is unique for every record; it basically acts as a primary key.
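For the friendly_id route, one way to provide that missing column is the slugged module, which stores the generated value in a dedicated slug column. A sketch, assuming you add (and uniquely index) a slug string column on animals:

# migration: add_column :animals, :slug, :string
#            add_index  :animals, :slug, unique: true

class Animal < ActiveRecord::Base
  extend FriendlyId
  friendly_id :name_and_ref, use: :slugged

  def name_and_ref
    "#{name}-#{reference}"
  end
end

# the controller stays as before:
# @animal = Animal.friendly.find(params[:id])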

Mongoid: convert embedded document into referenced/own collection

I need to convert an embedded document onto its own collection, so it can be referenced from another collection.
Let's suppose I have a Parent that embeds many Childs.
I was thinking of something along this:
Parent.all.each do |p|
  p.childs.all.each do |c|
    c.raw_attributes['parent_id'] = p.id
  end
  p.save! # will save parent and cascade persist all childs onto their own coll
end
Is this an option? Ideally I would run this in a console and I would only change mongoid mappings from embed_* to has_*, so I wouldn't need to change the rest of my code or use another collection as staging.
I think the code should look more like this (didn't test):
child_coll = Mongoid.database.collection('children')

Parent.all.each do |p|
  p.childs.all.each do |c|
    c.attributes['parent_id'] = p.id
    child_coll.insert c.attributes # save children to separate collection
  end
  p.childs = nil # remove embedded data
  p.save
end
After that, you can change your embeds_many to has_many and (hopefully) it should work well.
Too little rep to comment, but I think Sergio's (otherwise very helpful) answer may be outdated. With Mongoid 3.0.5 I couldn't use
child_coll = Mongoid.database.collection('children')
but instead used
child_coll = Mongoid.default_session[:children]
which did the trick for me.
For me, I needed to remove the '_id' attribute before inserting, otherwise I would get a duplicate key error.
Here is an updated version of Sergio Tulentsev's approach with Pencilcheck's addition and an update of sbauch's correction.
First, leave the embeds_many/embedded_in statements in place in your models.
Second, run something like this block of code:
child_coll = Mongoid.client(:default).database.collection(:children)

Parent.all.each do |p|
  p.childs.all.each do |c|
    dup = c.attributes.dup
    dup.delete('_id') # drop the embedded _id so the insert doesn't hit a duplicate key error
    dup['parent_id'] = p.id
    child_coll.insert_one dup # save children to separate collection
    c.destroy
  end
  p.childs = nil # remove embedded data
  p.save
end
Third, change your embeds_many to has_many and your embedded_in to belongs_to.
Fini.
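For reference, the step-three model change would look roughly like this (class names assumed from the example above, keeping the relation name as spelled there):

class Parent
  include Mongoid::Document
  # was: embeds_many :childs
  has_many :childs
end

class Child
  include Mongoid::Document
  # was: embedded_in :parent
  belongs_to :parent
end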

Get inserted ID from Sequel prepare

I have a prepared insert statement in Sequel (using Oracle).
prepared_statement = DB[:table_name].prepare(:insert, :name, :value=>:$value)
When I call it the row gets added just fine.
prepared_statement.call :value=>'Some value'
I have a trigger and a sequence set up so the ID will be auto generated. I would like to get back the row (or the id) I just added, but I can't see how. I can't use insert because value is a CLOB and may be greater than 4000 characters.
In JRuby, using the JDBC adapter you can override the insert and pass in the returning clause. The tricky part is that you don't always know what the primary key is at this level so you may have to use ROWID or request all of the columns back.
You end up with something that looks similar to this:
module Sequel
  module JDBC
    class Database
      def execute_insert_with_returning(conn, sql, opts = {})
        columns = opts[:key_columns] || ["ROWID"]
        q = "{ call #{sql} returning #{columns.join(',')} into #{columns.collect {|_| '?'}.join(',')} }"
        stmt = conn.prepare_call(q)
        raise "Unable to prepare call for insert" if stmt.nil?
        begin
          columns.each_with_index do |_, index|
            stmt.registerOutParameter(index + 1, JavaSQL::Types::VARCHAR)
          end
          return nil if 0 == stmt.executeQuery
          values = (1..columns.count).inject({}) do |memo, index|
            key = columns[index - 1].downcase.to_sym rescue nil
            memo[key] = stmt.get_string(index) unless key.nil?
            memo
          end
          values
        ensure
          stmt.close
        end
      end # #execute_insert_with_returning

      alias execute_without_specialized_insert execute

      def execute(sql, opts = {}, &block)
        if opts[:type] == :insert
          synchronize(opts[:server]) do |conn|
            execute_insert_with_returning conn, sql, opts
          end
        else
          execute_without_specialized_insert sql, opts, &block
        end
      end # #execute
    end # Database
  end # JDBC
end # Sequel
I've done something pretty much like this and it works pretty well. I think we had to override Sequel::Model as well so it passes the primary key in as opts[:key_columns], but I may be remembering incorrectly.
This is a bit of a load-bearing kludge that gets the job done. It would be more elegant to specialize it to the Oracle JDBC adapter and to ensure that all of the error handling code from the original execute is preserved. Given the time, I'd love to get something better and give it back to the Sequel project.
The way to get the populated sequence values is through the RETURNING clause of the INSERT
statement, as I discuss in this response to a similar question regarding CodeIgniter.
I'm not sure whether the base version of RoR supports that syntax, but it appears to be possible to extend ActiveRecord to handle it. Find out more.
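For completeness, the same RETURNING ... INTO idea can also be driven through ruby-oci8 directly with an out bind, outside of Sequel. This is an untested sketch; the connection details, table and column names are assumptions, and whether the out bind is populated depends on the driver's DML-returning support:

require 'oci8'

conn = OCI8.new('user', 'password', '//host:1521/service_name')
cursor = conn.parse('INSERT INTO table_name (value) VALUES (:value) RETURNING id INTO :id')
cursor.bind_param(':value', 'Some value', String)
cursor.bind_param(':id', nil, Integer) # out bind filled in by RETURNING ... INTO
cursor.exec
new_id = cursor[':id']
cursor.close
conn.logoff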
Sequel's Oracle adapter doesn't have native prepared statement support, so it falls back to issuing a regular query. If you can use JRuby, the jdbc adapter has native prepared statement support, so it should just work there. If you can't use JRuby, you'll have to work on adding native prepared statement support to the Oracle adapter. I don't have access to an Oracle installation, so I can't test any support, but I'll be happy to provide advice if you run into problems.
