Take total price from all products for many users - Ruby

I have an issue.
The Driver model has a has_many :orders association:
class Driver < User
  has_many :orders
end
and Order belongs_to :user:
class Order < ActiveRecord::Base
  belongs_to :user
end
Order also has a price column.
I want to display all drivers in a table: the first column will be first_name, the next last_name,
and I want to display the total price for each user, calculated by summing the prices of all of that user's orders.
The problem is N+1: how can I display the total price for each user without sending a separate query to the DB for every one of them?
Example of the index page:
first_name | last_name | Price
Arian      | Lain      | 2500
Brain      | Kokun     | 4700

You can get a sum for grouped values using a single SQL query with SUM and GROUP BY. In Rails, this can look like this:
@sums = Order.group(:user_id).sum(:price)
# {1 => 2500, 2 => 4700, ...}
In your view, you can then fetch the sum for the respective user using the user's id as a key, e.g. with this (assuming you have the current driver / user in the driver variable):
<%= @sums[driver.id] %>
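Wired up end to end, this could look roughly like the following (illustrative sketch; the controller and view names are assumptions, not taken from the question):

# app/controllers/drivers_controller.rb (hypothetical name)
class DriversController < ApplicationController
  def index
    @drivers = Driver.all
    # Single query: SELECT SUM(price) FROM orders GROUP BY user_id
    @sums = Order.group(:user_id).sum(:price)
  end
end

<%# app/views/drivers/index.html.erb (hypothetical) %>
<% @drivers.each do |driver| %>
  <tr>
    <td><%= driver.first_name %></td>
    <td><%= driver.last_name %></td>
    <td><%= @sums[driver.id] || 0 %></td>
  </tr>
<% end %>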

Related

How to group and sum by foreign key?

I have these two models in my Rails app:
class Person < ApplicationRecord
  has_many :payments
end

class Payment < ApplicationRecord
  belongs_to :person
end
How can I group the payments by person and order them by amount?
Right now I have...
Payment.group(:person_id).sum("amount")
...which works but doesn't include the persons' names. It returns something like this:
{ 1 => 1200.00, 2 => 2500.00 }
How can I replace the IDs / integers with the persons' names and also sort the whole thing by amount?
Thanks for any help!
Just be a bit more specific:
Payment.select('people.name, SUM(payments.amount)').joins(:person).group(:person_id)
Assuming that the persons table is named people in your application.
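If you also want the sorting by amount that the question asks for, one possible variation (an untested sketch, not part of the original answer) is to group on both id and name and order by the aggregate; the result is a hash keyed by [id, name]:

Payment.joins(:person)
       .group('people.id', 'people.name')
       .order(Arel.sql('SUM(payments.amount) DESC'))
       .sum(:amount)
# => { [1, "Mike"] => 2500.0, [2, "John"] => 1200.0, ... }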
This will return an ActiveRecord::Relation that you can work with:
Person.joins(:payments).group('people.id').select('people.id, people.name, SUM(payments.amount) AS amount_sum')
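The selected alias becomes a method on each returned record, so usage might look like this (a quick illustrative example, assuming the amount_sum alias above):

people_with_totals = Person.joins(:payments)
                           .group('people.id')
                           .select('people.id, people.name, SUM(payments.amount) AS amount_sum')
people_with_totals.each do |person|
  puts "#{person.name}: #{person.amount_sum}"
end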
This works only if the name values are unique.
Assuming the Person model has a name attribute, the solution can look like this:
Payment.joins(:person).group(:name).order('sum_amount DESC').sum(:amount)
It generates this query:
SELECT SUM("payments"."amount") AS sum_amount, "name" AS name FROM "payments" INNER JOIN "people" ON "people"."id" = "payments"."person_id" GROUP BY "name" ORDER BY sum_amount DESC
and returns a hash like this:
=> {"Mike"=>22333.0, "John"=>5676.0, "Alex"=>2000.0, "Carol"=>2000.0}

How to limit the number of votes in 'acts_as_votable' gem

I'm currently using the acts_as_votable gem as the voting system in my project, but I want to limit the total votes of every user (users are handled with the Devise gem).
The target of the vote is a model called Pin:
class Pin < ActiveRecord::Base
  acts_as_votable
end
Should I put a method in a before_action to make sure that the vote you're casting won't let your total votes exceed, say, 10?
Updated: 8/18/2015
Now I have a new question:
I created another model, Group, and declared the relationships:
(group.rb)
has_many :pins
(pin.rb)
belongs_to :group
So here is the question: if I want to limit the votes in every group, say 10 in group 1, 10 in group 2, 10 in group 3, and so on,
how can I accomplish that?
You can do something like this:
def upvote
  @pin = Pin.find(params[:id])
  # check the user's total votes
  if current_user.find_voted_items.size < 10
    @pin.vote_by :voter => current_user
  else
    # ... your code
    flash[:notice] = "your total votes exceed the limit"
    redirect_to pins_path
  end
end
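For the per-group limit from the update, one possible approach (an untested sketch, not from the original answer; it assumes Pin belongs_to :group with a group_id column and that the voted items fit comfortably in memory) is to count the user's existing votes on pins in the same group before allowing the new vote:

def upvote
  @pin = Pin.find(params[:id])
  # find_voted_items comes from acts_as_votable; count only pins in this pin's group
  votes_in_group = current_user.find_voted_items.to_a.count do |item|
    item.is_a?(Pin) && item.group_id == @pin.group_id
  end

  if votes_in_group < 10
    @pin.vote_by :voter => current_user
  else
    flash[:notice] = "You have used all 10 votes for this group"
    redirect_to pins_path
  end
end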

Cache queries when creating sub records?

I have an application which handles orders with line items. The line items come in as part of the order in JSON format, e.g.:
{
  "customer_id": 24,
  "line_items": [
    {
      "variant_id": "1423_101_10",
      "quantity": "5",
      "product_id": "1423"
    },
    {
      "variant_id": "2396_101_12",
      "quantity": "3",
      "product_id": "2396"
    }
  ]
}
So this will set up an order in the orders table, e.g.:
id | customer_id
1 | 24
And line items in the line_items table, e.g.:
id | order_id | product_id | variant_id  | quantity | price*
1  | 1        | 1423       | 1423_101_10 | 5        | 10
2  | 1        | 2396       | 2396_101_10 | 3        | 15
*price doesn't come from the order JSON, it's retrieved via a lookup
However, when the new records are created, a SELECT for the order is issued for each line_item added. This wouldn't be an issue in the example above, but this application can and does have hundreds and sometimes thousands of line items for a particular order, so it seems inefficient and is potentially a cause of the Heroku server running out of memory. Is there a way to only load the Order once, rather than for each line item?
Another potential bottleneck is that a lookup is done against a Products table to get the price. In the example above, there's no possible caching, but if multiple variants of the same Product are selected, it seems inefficient to look up the Product each time when it may already have been loaded. For example, 1423_101_10, 1423_101_12, 1423_102_10 and 1423_102_12 are all the same Product with the same price. Is it better to try and cache Products already looked up or would that complicate things further?
Edit:
Completely forgot to add any code!
Order Model:
class Order < ActiveRecord::Base
  has_many :line_items, :dependent => :destroy
end
Line Item Model:
class LineItem < ActiveRecord::Base
  before_create :set_price
  belongs_to :order
  belongs_to :product, :primary_key => "product_id", :conditions => proc { "season = '#{order.season}'" }

  def set_price
    write_attribute :price, product.prices[order.currency] if price.nil? && product && order
  end
end
Product Model:
class Product < ActiveRecord::Base
Edit 2:
OrdersController (simplified)
class OrdersController < ApplicationController
  def create
    @order = Order.new(order_params)
    authorize! :create, @order
    if @order.save
      render_order_json
    end
  end

  def order_params
    permitted = params.permit(:customer_id, :line_items => line_item_params)
    permitted[:line_items_attributes] = permitted.delete("line_items") if permitted["line_items"]
    permitted
  end

  def line_item_params
    [:product_id, :variant_id, :quantity]
  end
end
Edit 3: An example of the SQL I see reported:
Order Load (1.0ms) SELECT "orders".* FROM "orders" WHERE "orders"."id" = $1 ORDER BY "orders"."id" ASC LIMIT 1 [["id", 1]]
Product Load (1.0ms) SELECT "products".* FROM "products" WHERE "products"."product_id" = $1 AND (season = 'AW14') ORDER BY "products"."id" ASC LIMIT 1 [["product_id", 1423]]
SQL (2.0ms) INSERT INTO "line_items" ("order_id", "price", "product_id", "quantity", "variant_id") VALUES ($1, $2, $3, $4, $5) RETURNING "id" [["order_id", 1], ["price", 10.0], ["product_id", 1423], ["quantity", 5], ["variant_id", "1423_101_10"]]
Order Load (1.0ms) SELECT "orders".* FROM "orders" WHERE "orders"."id" = $1 ORDER BY "orders"."id" ASC LIMIT 1 [["id", 1]]
Product Load (2.0ms) SELECT "products".* FROM "products" WHERE "products"."product_id" = $1 AND (season = 'AW14') ORDER BY "products"."id" ASC LIMIT 1 [["product_id", 2396]]
SQL (1.0ms) INSERT INTO "line_items" ("order_id", "price", "product_id", "quantity", "variant_id") VALUES ($1, $2, $3, $4, $5) RETURNING "id" [["order_id", 1], ["price", 15.0], ["product_id", 2396], ["quantity", 3], ["variant_id", "2396_101_10"]]
If you want to speed up the create action, you have several options:
removing the database intensive callbacks
speeding up the callbacks through caching
delay creation to be executed through background tasks
Depending on your application's needs, these are all viable options, listed roughly in order of how much they impact your code base and infrastructure.
This totally depends on what you have already set up, so the order might be the other way around.
If you remove the callback (set_price) that creates the 1+N problem in your create code, you will have to add a lookup method that fetches all the prices at once and applies them to the order (see the sketch below).
Caching could go into the set_price method, so that the lookup is only done once. You will have to take care of cache expiry when a price changes, which might be non-trivial.
Using a background job library like Resque or Sidekiq, you can take the order and do all the processing without the response timing out. You will have to check asynchronously for the order to be processed to make it visible in the frontend.
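A rough sketch of the first option (illustrative only, not from the original answer; it assumes Product#prices is a hash keyed by currency, as in the models above, and the method name assign_prices is made up): load every product for the order in a single query, then set the prices in memory before saving:

class Order < ActiveRecord::Base
  has_many :line_items, :dependent => :destroy

  # Fetch all products for this order in one query and assign prices in memory,
  # instead of one Order/Product lookup per line item.
  def assign_prices
    # Note: product_id types must match between line_items and products.
    products = Product.where(:product_id => line_items.map(&:product_id).uniq,
                             :season     => season)
                      .index_by(&:product_id)

    line_items.each do |item|
      product = products[item.product_id]
      item.price ||= product.prices[currency] if product
    end
  end
end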
In the end it just took a bit of a workflow change to speed up the Order creation.
Instead of calling set_price in a before_create callback, it's done in the OrdersController via the Order model. So now my code looks like this:
Order Model:
class Order < ActiveRecord::Base
  has_many :line_items, :dependent => :destroy

  def set_prices
    self.line_items.each do |item|
      item.set_price
    end
  end
end
LineItem model:
class LineItem < ActiveRecord::Base
  belongs_to :order
  belongs_to :product, :primary_key => "product_id", :conditions => proc { "season = '#{order.season}'" }

  def set_price
    self.price = Product.where(:product_id => product_id, :season => season).first.prices[currency]
  end
end
OrdersController:
class OrdersController < ApplicationController
  def create
    @order = Order.new(order_params)
    authorize! :create, @order
    @order.set_prices
    if @order.save
      render_order_json
    end
  end

  def order_params
    permitted = params.permit(:customer_id, :line_items => line_item_params)
    permitted[:line_items_attributes] = permitted.delete("line_items") if permitted["line_items"]
    permitted
  end

  def line_item_params
    [:product_id, :variant_id, :quantity]
  end
end
This question was also related and also sped things up.
Stop child models updating when parent is updated

Ruby - Datamapper - Collecting records by joining two models

I have two tables
users
-------
id
name
organization_id
organizations
----------
id
org_name
org_unique_num
abc
xyz
organization_id in the users table is a foreign key referencing the organizations table's id:
class Organization
  include DataMapper::Resource
  property :id, Serial
  property :org_name, String
  property :org_unique_num, Integer
  property :abc, String
  property :xyz, String
  has n, :users
end

class User
  include DataMapper::Resource
  property :id, Serial
  property :name, String
  property :organization_id, Integer
  property :age, Integer
  belongs_to :organization
end
I want to grab the users' records joined with the Organization table where the user's age > 25, so the result should look like this:
user_id | name | organization_id | org_name | org_unique_num | age
12      | John | 356             | ATT      | 76763          | 38
35      | Lisa | 981             | IBM      | 2376           | 28
So how can I achieve this? Please note I don't want the columns abc and xyz in the result.
User.all(:age.gt => 25)
This will just give me the users with age > 25, but I want to grab each user's org info as well. Is it possible to do it in one statement, or will I have to do it in multiple steps, like collecting all the user ids and then passing them to the Organization model with an id IN (...) query? That would be ugly.
Any help will be appreciated.
DataMapper will do all the join work for you.
You do not need to fetch the organization for each user yourself; this is done automatically.
So, you simply fetch your users with this: User.all(:age.gt => 25)
And each user will have its organization attached to it:
User.all(:age.gt => 25).each do |user|
  p user.name
  # organization not yet fetched, only referenced
  p user.organization
  # now the organization is fetched
  p user.organization.id       # display org ID
  p user.organization.org_name # display org name
  # etc.
end
Regard "i do not need abc and xyz", if they are Text columns, DataMapper will load them lazily, meant the data will be fetched only when requested via user.abc and user.xyz

How can I avoid running singular expressions against arrays in activerecord and rails 3?

I am sorry if I am asking the question poorly. I have a Rails 3.1 app with models (simplified) like so:
class Employee < ActiveRecord::Base
  has_many :merged_children, :class_name => 'Employee', :foreign_key => "merge_parent_id"
  has_many :timesheets

  def total_time
    merged_children.timesheets.in_range(range).hours_minutes.sum
  end
end

class Timesheet < ActiveRecord::Base
  belongs_to :employee

  def in_range(range)
    # filter records based on transaction_date in range
  end

  def hours_minutes
    (hours + minutes/60.0).to_f
  end
end
Note: The in_range method acts as a scope, essentially, and hours_minutes is a calculation. hours_minutes is valid for each timesheet record in the resulting dataset, and then total_time should sum those values and return the amount.
The "total_time" method is not working because employee.merged_children returns an array and timesheets is meant to run against a single Employee object.
Is there any way to structure the "total_time" so that it still sends one query to the db? It seems inelegant to iterate over the merged_children array, issuing a query for each. Not sure if a direct call to an Arel table would help or hurt, but I am open to ideas.
If we get it right, the resulting SQL should effectively look something like:
SELECT sum(hours + minutes/60.0)
FROM employees e1 join employees e2 on e1.id = e2.merge_parent_id join timesheets t on t.employee_id = e2.id
WHERE e1.id = [@employee.id] and t.transaction_date BETWEEN [@range.begin] and [@range.end]
Thanks so much!
The easiest thing here might be to add
has_many :children_timesheets, :through => :merged_children, :source => :timesheets
to your Employee model.
Then (assuming in_range is actually a scope, or a class method that does a find)
children_timesheets.in_range(...)
should be the collection of timesheets you're interested in, and you can do something like:
children_timesheets.in_range(...).collect(&:hours_minutes).sum
Untested with actual data.
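Put together, the Employee model might look roughly like this (an untested sketch based on the suggestion above; it assumes in_range is a scope or class method on Timesheet):

class Employee < ActiveRecord::Base
  has_many :merged_children, :class_name => 'Employee', :foreign_key => "merge_parent_id"
  has_many :timesheets
  has_many :children_timesheets, :through => :merged_children, :source => :timesheets

  def total_time(range)
    # One query to fetch the timesheets, then the per-row calculation in Ruby
    children_timesheets.in_range(range).collect(&:hours_minutes).sum
  end
end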
range = ((1.day.ago)...(2.days.ago))
merge_parent = Employee.find(some_id)
Timesheet.where(:transaction_date => range)
.joins(:employee).where(:employees => {:merge_parent_id => merge_parent.id})
.sum('hours*60 + minutes')
(0.3ms) SELECT SUM(hours*60 + minutes) AS sum_id FROM "timesheets" INNER JOIN "employees" ON "employees"."id" = "timesheets"."employee_id" WHERE "employees"."merge_parent_id" = 1 AND ("timesheets"."created_at" >= '2011-12-13 03:04:35.085416' AND "timesheets"."created_at" < '2011-12-12 03:04:
Returns "0" for me. So hopefully it will return something nicer for you
