I have a group of radio buttons returning the following hash:
{"1"=>"1", "3"=>"2"}
The key represents the event_id and the value represents the regoption_id. I need to insert these into the subscriptions table, preferably all at once.
I tried the following:
params[:children].each do |child|
  Subscription.create({:event_id => child[0], :regoption_id => child[1]}).save
end
This ends up saving only one radio group, not every pair in the hash. Any ideas on how to do this?
There's a gem called activerecord-import that will insert multiple records efficiently. It works with many popular DB backends, and will just do the right thing with most of them. This does exactly what you want: it accepts an array of object instances, or an array-of-hashes-of-values, and inserts them into a table in a single statement.
Here's a usage example right from the gem documentation:
books = []
10.times do |i|
  books << Book.new(:name => "book #{i}")
end
Book.import books
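Applied to your subscriptions case, a minimal sketch might look like this (assuming your activerecord-import version accepts an array of model instances, which recent versions do):
subscriptions = params[:children].map do |event_id, regoption_id|
  Subscription.new(:event_id => event_id, :regoption_id => regoption_id)
end
Subscription.import subscriptions # one INSERT for all rows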
I'm a newbie to Sequel and Ruby, and I need your help with one thing.
In a word, I can't access database query results with the dot operator.
I am using the Sequel adapter in a Padrino Ruby project.
For example,
persons = Person.all
persons.each do |p|
  puts p.name # this outputs the correct person name, e.g. 'john'
end
But if I do some query
persons = Person.where(:age => 20)
persons.each do |p|
  puts p.name # this line causes an error
end
I compared their data types and they are different from each other.
puts persons # in first case - array
#<Gig:0x007fbdb6d64ef0>
#<Gig:0x007fbdb6d64838>
#<Gig:0x007fbdb6d641f8>
puts persons # in second case - object
#<Sequel::Postgres::Dataset:0x007fbdbc614898>
So, in the second case, I tried to convert the result to an array of hashes in order to access the fields:
persons_hash = persons.collect do |p|
  p.to_hash
end
In this case I was able to access the user's name with persons_hash[0][:name], but I still couldn't access it with the dot operator.
So I want to know what I have to do to access Sequel query results using the dot operator.
Thanks :)
persons.each do |p|
  puts p.name # this line causes an error
end
What exact error are you getting here? I'm guessing an undefined method error? It seems you may be familiar with ActiveRecord syntax. I have not used Sequel myself, but it is a bit different from AR. According to their docs, you would do something like this:
persons.map(:name) # => ['Harry', 'Steve', 'Israel', ...]
The all method returns an array of hashes, where each hash corresponds to a record.
For your above example, I would try the following:
persons.each do |p|
  puts p[:name] # here we access the :name hash key instead of calling a name method
end
You want to access the name key of each hash being iterated over, because you are iterating through an array of hashes. That is why you could access the data with persons_hash[0][:name]: you were taking the 0th item of the persons_hash array and accessing its :name hash key.
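For reference, here is a small side-by-side sketch of the two access styles (hedged: it assumes Person is a Sequel-backed model, and note that Sequel model instances also support hash-style access via []):
persons = Person.where(:age => 20)

persons.each do |p|
  puts p[:name] # hash-style access works on plain dataset rows and model instances alike
end

puts persons.map(:name).inspect # pulls just the name column, e.g. ["Harry", "Steve"]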
I've been trying to find an answer for a little while now, and I'm not sure if I'm asking the wrong questions, but I've had no luck so far.
I have an array generated by querying an SQLite table. I also have an array of times generated from a CSV file. I am trying to take time[0] (an id number) from each CSV row and check whether it exists in the array from the SQL table:
array = []
SQLite3::Database.new("t.db") do |db|
  db.execute("SELECT t FROM ts") do |row|
    array << row
  end
end
times = CSV.read('times1.csv')
times.each do |time|
  #puts "This is the trip id: #{time[0]}"
  if array.include? time[0]
    puts time[0]
  end
end
When I do this, I get no results. I know for a fact that there should be a few matches. When I try to iterate over the array like this:
array.each do |row|
  if row = 2345
    puts "Match found"
  end
end
Here is the strange part that has really stumped me. I know that 2345 only comes up once or twice in the array, yet when I run this code it seems to print "Match found" for every element (it says "Match found" something like 5,000 times). I feel like something is wrong with the array generated from SQLite, but I can't for the life of me figure it out.
Any ideas? Thanks.
In here:
db.execute ("SELECT t FROM ts") do |row|
your row is an array whose sole element is the t you're after. That means you end up with an array of arrays in array, rather than an array of values, and of course your array.include? time[0] check fails because array only contains other arrays.
You probably want to say:
array << row[0]
to collect the t values in array.
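A minimal sketch of the corrected loop, assuming the same t.db schema and times1.csv file from the question:
array = []
SQLite3::Database.new("t.db") do |db|
  db.execute("SELECT t FROM ts") do |row|
    array << row[0] # row is a one-element array, so keep just the value
  end
end

times = CSV.read('times1.csv')
times.each do |time|
  # note: CSV values are strings, so if t is stored as an integer you may
  # need to compare against time[0].to_i instead
  puts time[0] if array.include?(time[0])
end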
When I execute the code below, my tasks array ends up with the last row from the DBI call repeated once for each row in the database.
require 'dbi'
require 'pp'
dbh = DBI.connect('DBI:SQLite3:test', 'test', '')
dbh.do("DROP TABLE IF EXISTS TASK;")
dbh.do("CREATE TABLE TASK(ID INT, NAME VARCHAR(20))")
# Insert two rows
1.upto(2) do |i|
  sql = "INSERT INTO TASK (ID, NAME) VALUES (?, ?)"
  dbh.do(sql, i, "Task #{i}")
end
sth = dbh.prepare('select * from TASK')
sth.execute
tasks = Array.new
while row = sth.fetch do
  p row
  p row.object_id
  tasks.push(row)
end
pp(tasks)
sth.finish
So if I have two rows in my TASK table, then instead of getting this in the tasks array:
[[1, "Task 1"], [2, "Task 2"]]
I get this
[[2, "Task 2"], [2, "Task 2"]]
The full output looks like this:
[1, "Task 1"]
19877028
[2, "Task 2"]
19876728
[[2, "Task 2"], [2, "Task 2"]]
What am I doing wrong?
It seems there is some strange behavior in the row object, which appears to be some kind of singleton, and that's why the dup method won't solve it.
Jumping into the source code, it seems that the to_a method duplicates the inner row elements, which is why it works. So the answer is to call to_a on the row object, or, if you want, you can also transform it into a Hash to preserve the metadata.
while row = sth.fetch do
  tasks.push(row.to_a)
end
But I recommend the more idiomatic Ruby way:
sth.fetch do |row|
  tasks << row.to_a
end
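If you would rather keep the column names (the "transform it into a Hash" option mentioned above), a hedged sketch, assuming your ruby-dbi version provides DBI::Row#to_h:
sth.fetch do |row|
  tasks << row.to_h # e.g. {"ID" => 1, "NAME" => "Task 1"}
end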
Are you sure you have copied your code exactly as it is? AFAIK the code you have written shouldn't work at all... You mix two constructs that are not intended to be used that way.
Am I wrong to assume that you come from a C or Java background? Iteration in Ruby is very different; let me try to explain.
A while loop in Ruby has this structure:
while condition
  # code to be executed as long as condition is true
end
A method with a block has this structure:
sth.fetch do |element|
  # code to be executed once per element in the sth collection
end
Now there is something really important to understand: fetch, or any other method of this kind in Ruby, is not an iterator as you would encounter in C, for example - you do not have to call it again and again until the iterator hits the end of the collection.
You just call it once and give it a block as an argument, which is a kind of anonymous function (as in JavaScript). The fetch method will then pass ("yield") each element of the collection, one after another, to this block.
So the correct syntax in your case should be:
sth.fetch do |row|
  p row
  tasks.push row
end
which could otherwise be written like this, in a more "old school" fashion:
# define a function
# = this is your block
def process(row)
  p row
  tasks.push row
end
# pass each element of a collection to this function
# = this is done inside the fetch method
for row in sth
  process row
end
I would advise you to read more about blocks / procs / lambdas, because they are all over the place in Ruby, and IMHO they are one of the reasons this language is so awesome. Iterators are just the beginning; you can do a LOT more with these... If you need good reference docs, the Pickaxe is considered one of the best sources among Rubyists, and I can tell you more if you want.
I don't know how your code works entirely, but I guess that if you change tasks.push(row) into tasks.push(row.dup), then it should work. If that is the case, then sth.fetch keeps giving you the same array (same object id) each time, even if its content is renewed, and you are pushing the same array into tasks repeatedly.
There are many things that could be happening, but try this.
First, wrap the condition in parens so it is clear what the while loop is evaluating:
while (row = sth.fetch) do
  p row
  tasks.push(row)
end
Then the idiomatic Ruby way:
sth.fetch do |row|
  p row
  tasks << row # same as push
end
I would just like to return true if my array of Contact(s) (model) contains a Contact with an id equal to some value. For example:
@contacts = Contact.all
@someval = "alskjdf"
find_val(@contacts, @someval)
def find_val(contacts, val)
  contacts.each do |c|
    if c.id == val
      return true
    end
  end
  return false
end
I have to do this repeatedly in my app (at most about 100 times for this particular action), in order to exclude some data from an external API that has a list of contacts. Is this going to be too expensive?
I thought I might be able to do something faster, similar to an ActiveRecord find on the array after it's been pulled down from the db, but I can't figure that out. Would it be too expensive to call ActiveRecord like this?
Contact.find_by_id(@someval)
The above line would have to be called hundreds of times... I figure iterating through the array would be less expensive. Thanks!
The best approach would be to index the contacts in a hash, using the contact id as the key after you retrieve them all from the db:
contacts = Contact.all.inject({}) {|hash, contact| hash[contact.id] = contact; hash }
Then you can easily get a contact with contacts[id] in a performant way.
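With that in place, the "does this contact id exist?" check from the question becomes a constant-time key lookup. A hedged usage sketch, reusing @someval from the question (just make sure it has the same type as contact.id):
contacts = Contact.all.inject({}) { |hash, contact| hash[contact.id] = contact; hash }

contacts.key?(@someval) # => true or false, no array scan
contacts[@someval]      # => the matching Contact, or nil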
One way to reduce the amount of code you have to write to search the array is to open up the Array class and add a custom instance method:
class Array
  def haz_value?(someval)
    if self.first.respond_to? :id
      self.select { |contact| contact.id == someval }.length > 0
    else
      false
    end
  end
end
Then you can call @contacts.haz_value? @someval. In terms of efficiency, I haven't done a comparison, but both ways use Array's built-in iterators. It would probably be faster to create a stored procedure in your database and call it through ActiveRecord; here is how you can do that.
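As an aside, a hedged alternative to the select-based check is Enumerable#any?, which stops at the first match instead of building a whole filtered array:
@contacts.any? { |contact| contact.id == @someval }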
What's the most efficient way to iterate through an entire table using Datamapper?
If I do this, does Datamapper try to pull the entire result set into memory before performing the iteration? Assume, for the sake of argument, that I have millions of records and that this is infeasible:
Author.all.each do |a|
  puts a.title
end
Is there a way that I can tell Datamapper to load the results in chunks? Is it smart enough to know to do this automatically?
Thanks, Nicolas, I actually came up with a similar solution. I've accepted your answer since it makes use of Datamapper's dm-pagination system, but I'm wondering if this would do equally well (or worse):
offset = 0 # start at the beginning of the table
while authors = Author.slice(offset, CHUNK) do
  authors.each do |a|
    # do something with a
  end
  offset += CHUNK
end
Datamapper will run just one SQL query for the example above, so it will have to keep the whole result set in memory.
I think you should use some sort of pagination if your collection is big.
Using dm-pagination you could do something like:
PAGE_SIZE = 20
pager = Author.page(:per_page => PAGE_SIZE).pager # This will run a count query
(1..pager.total_pages).each do |page_number|
  Author.page(:per_page => PAGE_SIZE, :page => page_number).each do |a|
    puts a.title
  end
end
You can play around with different values for PAGE_SIZE to find a good trade-off between the number of sql queries and memory usage.
What you want is the dm-chunked_query plugin: (example from the docs)
require 'dm-chunked_query'

MyModel.each_chunk(20) do |chunk|
  chunk.each do |resource|
    # ...
  end
end
This will allow you to iterate over all the records in the model, in chunks of 20 records at a time.
EDIT: the example above had an extra #each after #each_chunk, and it was unnecessary. The gem author updated the README example, and I changed the above code to match.
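Adapted to the Author example from the question, it would look something like this (the chunk size of 1_000 is just an illustrative choice):
require 'dm-chunked_query'

Author.each_chunk(1_000) do |chunk|
  chunk.each do |author|
    puts author.title
  end
end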