I'm trying to get a limited set of results in a sub-collection.
Basically I have this:
user = Model::User.find(token)
playlists = user.playlists
playlists.each_with_index do |playlist, index|
criteria = playlist.tracks.limit(4) # I want to limit these to return max 4 tracks
# the criteria is perfect here, but the decorator still returns all the tracks
# setting this criteria on the user saves the new tracks list limited to 4,
# like this:
playlists[index].tracks = criteria
end
decorator = Decorator::PlaylistCollection.new(playlists)
respond_with decorator
This isn't working, and my question is: how can I limit every playlist to return 4 tracks max?
What I want is for the response to contain all the playlists, each with at most 4 tracks (and the MongoDB queries should be optimized):
display all the playlists
every playlist contains max 4 tracks
prevent Mongoid/MongoDB from fetching all the tracks of each playlist
Thanks!
I got the solution: Mongoid keeps an array of ids (track_ids) containing the ids of all the tracks in the list.
By overriding track_ids instead of tracks, nothing extra is queried or stored by Mongoid.
user = Model::User.find(token)
playlists = user.playlists
playlists.each do |playlist|
playlist.track_ids = playlist.track_ids.take(4)
end
decorator = Decorator::PlaylistCollection.new(playlists)
respond_with decorator
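As a plain-Ruby aside (not Mongoid-specific), the trimming relies on Array#take, which is non-destructive and safe when the array is shorter than requested, so it works for playlists of any size. A small sketch with hypothetical track ids:

```ruby
track_ids = %w[t1 t2 t3 t4 t5 t6]  # hypothetical track ids

four = track_ids.take(4)   # => ["t1", "t2", "t3", "t4"]
track_ids.length           # => 6, take does not mutate the source array

short = %w[t1 t2].take(4)  # => ["t1", "t2"], no error on short arrays
```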
I think that the following code would do the trick.
user = Model::User.find(token)
playlists = user.playlists[0..3]
decorator = Decorator::PlaylistCollection.new(playlists)
respond_with decorator
Related
I've got a question. I don't know if this is possible, but suppose these models:
class Event < ApplicationRecord
has_many :gifs
end
class Gif < ApplicationRecord
belongs_to :event
end
Starting from a blank database, in the Rails console you do this:
gif_1 = Gif.create
gif_2 = Gif.create
gif_3 = Gif.create
event_1 = Event.new
event_1.gifs = [gif_1, gif_3]
event_1.save
event_2 = Event.new
event_2.gifs = [gif_2]
event_2.save
How would you order events by the created_at attribute of their most recently created gif?
Here is an example of what I've tried, but it doesn't produce the correct result:
ordered_events = Event.includes(:gifs).joins(:gifs).order("gifs.created_at DESC")
ordered_events.first.id
=> 2 # I want this to return 1
Now I understand why my attempt probably didn't work: I think it only looked at the first Gif to do the ordering.
On top of this I had another thought, and here I have no idea where to begin in a query: what if an Event has 0 Gifs? From what I wrote, it seems events with no gifs are simply relegated to after the ones that do have gifs, but that would not work for me.
Here's another context in the Rails console which is more realistic, since normally you'd need an event first to store the Gif:
event_1 = Event.create
event_2 = Event.create
gif_1 = Gif.create(event_id: event_1.id)
gif_2 = Gif.create(event_id: event_2.id)
event_3 = Event.create
gif_3 = Gif.create(event_id: event_1.id)
Now what I would like to get back from my query is something like [event_1, event_3, event_2], because event_3 has no gifs, so I want to use its created_at for the ordering.
I know how I could do this by hand via some helper function or other, but I would really love to be able to do this kind of thing in one query directly.
As an example:
Event.joins(:gifs)
.group('events.id')
.order('MAX(gifs.created_at) DESC')
This query takes events, joins them with gifs, groups by events.id (to eliminate duplicate events in case one event has several gifs), and sorts the result by the latest (maximum) created_at of each event's gifs, in descending order.
Since it uses the joins method, which translates to an INNER JOIN in the SQL query, events without gifs won't be returned. To fix that, use left_outer_joins:
Event.left_outer_joins(:gifs)
.group('events.id')
.order('MAX(gifs.created_at) DESC')
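One detail: with left_outer_joins, events without gifs get NULL for MAX(gifs.created_at), and where NULLs sort varies by database. Since the asker wanted gif-less events ordered by their own created_at, something like COALESCE(MAX(gifs.created_at), events.created_at) in the ORDER BY should cover it (an untested sketch). The fallback logic itself can be shown in plain Ruby, with hypothetical Struct stand-ins for the ActiveRecord models:

```ruby
require 'time'

# Hypothetical plain-Ruby stand-ins for the ActiveRecord models.
Event = Struct.new(:id, :created_at, :gifs)
Gif   = Struct.new(:created_at)

def order_events(events)
  # Mirror of COALESCE(MAX(gifs.created_at), events.created_at) DESC:
  # use the newest gif's timestamp, or the event's own when it has none.
  events.sort_by { |e| e.gifs.map(&:created_at).max || e.created_at }.reverse
end

t = Time.parse('2020-01-01 00:00')
event_1 = Event.new(1, t,      [Gif.new(t + 10), Gif.new(t + 50)])
event_2 = Event.new(2, t + 1,  [Gif.new(t + 20)])
event_3 = Event.new(3, t + 30, [])  # no gifs: falls back to created_at

order_events([event_1, event_2, event_3]).map(&:id)
# => [1, 3, 2]
```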
Suppose I have an array of ids like the one below:
@all_ids = [1, 2, 3, 4]
My requirement: using the very last id, I will fetch a record from one table (User), and I will fetch records from another table (Payment) using the rest of the ids in the array. I will do this operation in the method below.
users_controller.rb:
class UsersController < ApplicationController
def update
@all_ids = [1, 2, 3, 4]
end
end
Please help me to resolve this problem.
The pop method removes and returns the last element of the array. It is a destructive method, in that it alters the contents of the array: the array will then hold all the values minus the last one.
user = User.find(@all_ids.pop)
payments = Payment.find(@all_ids)
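In plain Ruby the destructive behaviour looks like this:

```ruby
all_ids = [1, 2, 3, 4]

last_id = all_ids.pop  # removes and returns the last element

last_id  # => 4
all_ids  # => [1, 2, 3], the popped id is gone
```

So User.find receives 4, while Payment.find receives [1, 2, 3].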
I have records with a 'resources' field which can contain multiple resources. When I return this data, I need to iterate over this field and return an individual record for each value in the field. I am currently using Sinatra and am able to iterate over the fields okay, but I am having difficulty replacing the field in the JSON array.
For example
event: Name
resources: resourceA, resourceB, resourceC
This record needs to be returned as 3 unique records/events with only one resource per record.
With the code listed below, I am getting three records, but all three come back with the same resource value (resourceC).
Here is my code
docs = @db.view('lab/events', :startkey => params[:startDate], :endkey => endSearch)['rows']
rows = Array.new
docs.each do |doc|
resources = doc['value']['resources'].split(",")
resources.each do |r|
doc['value']['resources'] = r
rows.push(doc['value'])
end
end
Any help is greatly appreciated.
Thanks
Chris
If you use the Ruby gem "json", you can convert the JSON string to a hash:
require 'json'
converted_hash = JSON(json_string).to_hash
This should be much easier to manage.
You can then turn the hash to a JSON string:
new_json_string = converted_hash.to_json
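For example, with a made-up json_string (the real payload comes from the question's CouchDB view):

```ruby
require 'json'

json_string = '{"event":"Name","resources":"resourceA,resourceB,resourceC"}'

converted_hash = JSON(json_string).to_hash
converted_hash['resources']  # => "resourceA,resourceB,resourceC"

new_json_string = converted_hash.to_json
```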
Basically what is happening is that Ruby sees all three records as the same record, so as the hash value is updated on one record, it impacts all the other records that were created from the same doc. To get around this, I actually needed to create a duplicate record each time through and modify its value.
docs = @db.view('lab/events', :startkey => params[:startDate], :endkey => endSearch)['rows']
rows = Array.new
docs.each do |doc|
resources = doc['value']['resources'].split(",")
resources.each do |r|
newDoc = doc['value'].dup # <= create a duplicate record and update its value
newDoc["resources"] = r
rows.push(newDoc)
end
end
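The aliasing problem is easy to reproduce in plain Ruby, independent of CouchDB:

```ruby
doc = { 'resources' => 'a,b,c' }

# Without dup: rows holds three references to the same hash object,
# so the last assignment wins for all of them.
rows = []
doc['resources'].split(',').each do |r|
  doc['resources'] = r
  rows.push(doc)
end
without_dup = rows.map { |h| h['resources'] }  # => ["c", "c", "c"]

# With dup: each pushed hash is an independent shallow copy.
rows = []
%w[a b c].each do |r|
  copy = doc.dup
  copy['resources'] = r
  rows.push(copy)
end
with_dup = rows.map { |h| h['resources'] }  # => ["a", "b", "c"]
```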
I'm trying to set up something for a movie store website (using ASP.NET, EF4, SQL Server 2008). In my scenario, I want to allow a "Member" store to import their catalog of movies, stored in a text file containing ActorName, MovieTitle, and CatalogNumber, as follows:
Actor, Movie, CatalogNumber
John Wayne, True Grit, 4577-12 (repeated for each record)
This data will be used to look up an actor and movie and create a "MemberMovie" record, and my import speed is terrible if I import more than 100 or so records using these tables:
Actor Table: Fields = {ID, Name, etc.}
Movie Table: Fields = {ID, Title, ActorID, etc.}
MemberMovie Table: Fields = {ID, CatalogNumber, MovieID, etc.}
My methodology for importing data into the MemberMovie table from a text file is as follows (after the file has been uploaded successfully):
Create a context.
For each line in the file, look up the actor in the Actor table.
For each Movie of that Actor, look up the matching title.
If a matching Movie is found, add a new MemberMovie record to the context and call ctx.SaveChanges().
The performance of my implementation is terrible. My expectation is that this can be done with thousands of records in a few seconds (after the file has been uploaded), and I've got something that times out the browser.
My question is this: What is the best approach for performing bulk lookups/inserts like this? Should I call SaveChanges only once rather than for each newly created MemberMovie? Would it be better to implement this using something like a stored procedure?
A snippet of my loop is roughly this (edited for brevity):
while ((fline = file.ReadLine()) != null)
{
string[] token = fline.Split(separator);
string actor = token[0];
string movie = token[1];
string catNumber = token[2];
Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
if (found_actor == null)
continue;
Movie found_movie = found_actor.Movies.Where(s => s.Title.Equals(movie, StringComparison.CurrentCultureIgnoreCase)).FirstOrDefault();
if (found_movie == null)
continue;
ctx.MemberMovies.AddObject(new MemberMovie()
{
MemberProfileID = profile_id,
CatalogNumber = catNumber,
Movie = found_movie
});
try
{
ctx.SaveChanges();
}
catch
{
}
}
Any help is appreciated!
Thanks, Dennis
First:
Some time ago I wrote an answer about calling SaveChanges after 1, n, or all rows:
When should I call SaveChanges() when creating 1000's of Entity Framework objects? (like during an import)
It is actually better to call SaveChanges after more than 1 row, but not just once after all of them.
Second:
Make sure you have an index on Name in the Actors table and on Title in Movies; that should help. Also, you shouldn't select the whole Actor if you only need its ID:
Instead of:
Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
you can select:
int? found_actor_id = ctx.Actors.Where(a => a.Name.Equals(actor)).Select(a => a.ID).FirstOrDefault();
and then
Something.ActorID = found_actor_id;
This can be faster because it doesn't materialize the whole Actor entity and doesn't require additional lookups, especially when combined with an index.
Third:
If you upload a very large file, there is still a probability of a timeout, even with good performance. You should run this import in a separate thread and return a response immediately. You can assign some kind of identifier to every import and allow the user to check its status by that ID.
How can I interact with objects I've created based on their given attributes in Ruby?
To give some context, I'm parsing a text file that might have several hundred entries like the following:
ASIN: B00137RNIQ
-------------------------Status Info-------------------------
Upload created: 2010-04-09 09:33:45
Upload state: Imported
Upload state id: 3
I can parse the above with regular expressions and use the data to create new objects in a "Product" class:
class Product
attr_reader :asin, :creation_date, :upload_state, :upload_state_id
def initialize(asin, creation_date, upload_state, upload_state_id)
@asin = asin
@creation_date = creation_date
@upload_state = upload_state
@upload_state_id = upload_state_id
end
end
After parsing, the raw text from above will be stored in objects that look like this:
[#<Product:0x00000101006ef8 @asin="B00137RNIQ", @creation_date="2010-04-09 09:33:45 ", @upload_state="Imported ", @upload_state_id="3">]
How can I then interact with the newly created class objects? For example, how might I pull all the creation dates for objects with an upload_state_id of 3? I get the feeling I'm going to have to write class methods, but I'm a bit stuck on where to start.
You would need to store the Product objects in a collection. I'll use an array:
product_collection = []
# keep adding parsed products into the collection, as many as there are
product_collection << parsed_product_obj
# next, select the subset where upload_state_id == 3
state_3_products = product_collection.select{|product| product.upload_state_id == 3}
attr_reader is a declarative way of defining properties/attributes on your Product class, so you can access each value as obj.attribute, like I have done for upload_state_id above.
select picks the elements in the target collection that meet a specific criterion: each element is assigned to product, and if the criterion evaluates to true, the element is placed in the output collection.
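Putting it together with the asker's Product class (sample data invented for illustration; note the parsed values are strings here, so the comparison uses '3'):

```ruby
class Product
  attr_reader :asin, :creation_date, :upload_state, :upload_state_id

  def initialize(asin, creation_date, upload_state, upload_state_id)
    @asin = asin
    @creation_date = creation_date
    @upload_state = upload_state
    @upload_state_id = upload_state_id
  end
end

product_collection = []
product_collection << Product.new('B00137RNIQ', '2010-04-09 09:33:45', 'Imported', '3')
product_collection << Product.new('B00137XXXX', '2010-04-10 11:00:00', 'Pending',  '1')

# select the subset with upload_state_id of "3"
state_3_products = product_collection.select { |p| p.upload_state_id == '3' }

# pull all their creation dates, as asked in the question
state_3_products.map(&:creation_date)
# => ["2010-04-09 09:33:45"]
```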