limit the results of a query server side - voltrb

I would like to know how to limit a query server side (my background is Meteor). For example, fetching the comments of a particular post instead of retrieving all the comments in the database.
I want it limited server side so that a malicious client can't fetch all the data in the database.
Thanks.

I think we talked about this on Gitter, but I'm planning to add a query_restrictions API where you can reject queries you don't like. I'll probably also add a default, globally configurable limit (which you can override on a per-model basis).

You could use limit. For example:
post.comments.limit(10)

Related

Data integration from Magento to QuickBooks

I'm new to Talend and learning through videos and documentation, so I'm not sure how to approach/implement this with best practices.
Goal
Integrate Magento and QuickBooks using Talend.
My thoughts
My first thought was to set up a direct DB connection to Magento, take the relevant data I need, process it, and send it to QuickBooks using the REST APIs (specifically the bulk APIs, in batches).
But then I thought it would be a little hectic to query the Magento database myself (multiple joins), so another option is to use Magento's REST API.
As I'm not very familiar with the tool, I'm struggling a little to find the most suitable approach, so any help is appreciated.
What I've done so far
I've saved my auth (for QuickBooks) and DB (Magento) credentials in a file, and using tFileInputDelimited and tContextLoad I'm storing them in context variables so they're accessible globally.
I've successfully configured the database connection and DB input, but I haven't used metadata for the connection (should I, and if so, how can I pass dynamic values there?). I've used my context variables in the DB connection settings.
I've taken only the relevant fields for now, but if I want more fields a simple query won't be enough, since Magento stores data across multiple tables for Customer etc. It's not a big deal, but I think it will increase my work.
That's what I've built so far. My next step is to send the data to QuickBooks over REST, getting an access_token and saving it to a context variable, and then storing the QuickBooks reference back into the Magento DB.
I've also decided to use the QuickBooks bulk APIs, but I'm not sure how to process data in chunks in Talend (I checked multiple resources but had no luck). That is, if Magento returns 500 rows, I want to process them in chunks of 30, since the QuickBooks batch maximum is 30; I'll send each chunk via REST and, as I said, store the QuickBooks reference ID back in Magento (so I can update it later).
Also, this will all be local at first; how can I do the same in production? How can I maintain separate development and production environments?
Resources I'm referring
For REST and Auth best practices - https://community.talend.com/t5/How-Tos-and-Best-Practices/Using-OAuth-2-0-with-Talend-to-Access-Goo...
Nice example for batch processing here:
https://community.talend.com/t5/Design-and-Development/Batch-processing-in-talend-job/td-p/51952
Redirect your input to a tFileOutputDelimited.
Enter the output filename, tick the option "Split output in several files" under "Advanced settings", and enter 1000 in the field "Rows in each output file". This will create n files based on the filename, with 1000 rows in each.
In the next subjob, use a tFileList to iterate over these files and read the records from each one.
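
Outside of Talend, the batching step itself is small. Below is a minimal TypeScript sketch of the same idea, splitting rows into chunks of 30 (the QuickBooks batch limit mentioned above); sendBatch is a hypothetical stand-in for the actual REST call, not a real QuickBooks client:

type Row = Record<string, unknown>;

// Split an array into consecutive chunks of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Send every chunk in order; sendBatch is hypothetical and would
// POST one batch of up to 30 rows to the bulk endpoint.
async function sendAll(rows: Row[], sendBatch: (batch: Row[]) => Promise<void>) {
  for (const batch of chunk(rows, 30)) {
    await sendBatch(batch);
  }
}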

Laravel pagination in DataTables

I am using the DataTables plugin in Laravel. I have about 3000 records in a table.
But when I load the page it pulls all 3000 records into the browser and then builds the pagination, which slows down the page load.
How can I fix this, or what is the correct way to do it?
Use server-side processing.
Get help from a Laravel package, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
Generally you can solve pagination on the front end, on the back end (server or database side), or with a combination of both.
Server-side processing, without a package, would mean using TOP/FETCH (or LIMIT/OFFSET, depending on your database) to page the rows being returned from your server.
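
For context, in server-side mode DataTables sends start and length (plus a draw counter) with each request and expects a response containing draw, recordsTotal, recordsFiltered, and data. Here's a rough sketch of such an endpoint, written in TypeScript/Express just for illustration (the queryPage helper is hypothetical; in Laravel the equivalent would live in a controller):

import express from "express";

const app = express();

// Hypothetical data-access helper: returns one page of rows plus counts.
// In SQL terms: SELECT ... LIMIT :limit OFFSET :offset, plus COUNT(*) queries.
async function queryPage(offset: number, limit: number) {
  return { rows: [] as unknown[], total: 3000, filtered: 3000 };
}

app.get("/posts/data", async (req, res) => {
  const draw = Number(req.query.draw ?? 0);      // echoed back to DataTables
  const start = Number(req.query.start ?? 0);    // row offset
  const length = Number(req.query.length ?? 20); // page size
  const { rows, total, filtered } = await queryPage(start, length);
  res.json({ draw, recordsTotal: total, recordsFiltered: filtered, data: rows });
});

app.listen(3000);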

You could also load a small amount (say 20) and then, when the user scrolls to the bottom of the list, load another 20 or so, as in the sketch below. I mention front-end processing as well because I'm not sure what your use cases are, but I imagine it's pretty rare that any given user actually needs to see 3000 rows at once.
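
A browser-side TypeScript sketch of that incremental pattern, assuming a paginated endpoint like the /posts/data example above (the URL and response shape are assumptions):

// Hypothetical loader hitting a paginated endpoint.
async function loadRows(offset: number, limit: number): Promise<unknown[]> {
  const res = await fetch(`/posts/data?start=${offset}&length=${limit}`);
  return (await res.json()).data;
}

let offset = 0;
let loading = false;

async function loadMore() {
  if (loading) return;
  loading = true;
  const rows = await loadRows(offset, 20); // 20 rows at a time
  offset += rows.length;
  // ...append `rows` to the table here...
  loading = false;
}

// Fire when the user scrolls within 200px of the bottom of the page.
window.addEventListener("scroll", () => {
  if (window.innerHeight + window.scrollY >= document.body.offsetHeight - 200) {
    void loadMore();
  }
});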

Given that DataTables has built-in functionality for paginating data, I think that #tersakyan is essentially correct: what you want is some form of back-end filtering or paginating of rows to limit what's being sent to the front end.

I don't know if that package works for you or what your setup looks like, but pagination can also be achieved directly from the database via SQL (using TOP/FETCH, for example) or implemented in a controller or service by tracking pages of data and loading a page at a time, both from the server and then into the table. All you would need is a unique key to associate each set of pages with a specific request.
For performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is grabbed or processed at any stage of your application, the more performant your application will be.




What is the most efficient way to filter a search?

I am working with Node.js and MongoDB.
I'm setting up a database and using socket.io for real-time updates, which will either re-query the DB or push the new update to the client.
I am trying to figure out the best way to filter the database.
Some more information in regards to what is being queried and what the real time updates are:
A document in the database will include information such as an address, city, time, number of packages, name, price.
Filters include city/price/name/time (meaning only to see addresses within the same city, or within the same time period)
Real-time info: includes adding a new document to the database which will essentially update the admin on the website with a notification of a new address added.
Method 1: Query the db with the filters being searched?
Method 2: Query the db for all searches and then filter it on the client side (Javascript)?
Method 3: Query the db for all searches then store it in localStorage then query localStorage for what the filters are?
I'm trying to figure out the fastest way for the user to filter.
Also, if that differs from the most cost-effective way, I'd like to know the most cost-effective approach as well (which I assume means fewer DB queries)...
It's hard to say because we don't see exact conditions of the filter, but in general:
Mongo can use only one index per query condition, so whatever fields that index covers can be used for efficient filtering; otherwise it may do a full collection scan, which is slow. If you are using an index, you are probably doing the most efficient query. (Mongo can still use another index for sorting, though.)
Sometimes you will be forced to do processing on the client side because Mongo can't do what you want, or doing it server side would take too many queries.
The least efficient option is to store results somewhere else, simply because I/O is slow. That would only benefit you if you use the stored results as a cache and avoid recalculating.
Also consider the overhead and latency of the network. If you have to send lots of data back to the client, it will be slower. In general, Mongo will do a better job filtering than you would on the client.
From what you describe, if you can filter addresses by time period, an index could cut out most documents. You most likely need a compound index over multiple fields.
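
For illustration, here's a minimal TypeScript sketch with the official mongodb driver, creating a compound index over the two filter fields from the question and running a query that can use it (the database/collection names and the "Boston" filter are made up):

import { MongoClient } from "mongodb";

async function main() {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const deliveries = client.db("app").collection("deliveries");

  // Compound index over the common filters (field names taken from the
  // question: city and time).
  await deliveries.createIndex({ city: 1, time: 1 });

  // This filter is covered by the index above, so Mongo can avoid a
  // full collection scan.
  const results = await deliveries
    .find({ city: "Boston", time: { $gte: new Date("2024-01-01") } })
    .toArray();

  console.log(results.length);
  await client.close();
}

main().catch(console.error);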

Do different RETS clients fetch different amounts of data from the same server?

I am using VieleRETS as the client application to fetch data into our MySQL database. If I use another RETS client application, such as RETS Connector, to fetch data, will the same amount of data be updated, or will it vary?
Not 100% sure what you are asking.
If you are asking whether 2 different MLS Systems have the same schema, the answer is no. Well, not likely at least.
If you are asking whether 2 clients, viele and some other client, will see the same schema from the same MLS system the answer is yes. And in this scenario you should see the same record counts, assuming no defect in the data.
A simple example of a data defect: the DMQL queries ListingID=0+ and ModificationTimestamp=1900-01-01T00:00:00+ should each return all listings. If a listing's timestamp or ID were null, the counts would be off.

Persisting Data Across Requests MVC3 and Razor

I am working on an MVC3 and Razor website. The user has to select their way through a few choices before finally working on the data.
For example:
Client List -> Version List (Filtered by client) -> Etc (Filtered by version)
Once a user selects a client, they select a version for that client, so I'm passing the client ID on the query string. In every action of the version controller I'm passing around the client ID, and on views where I want to show the client name, I'm querying the database for the client and stuffing it into the ViewBag. This seems very inefficient. I feel like I could use a cookie to hold the client ID and name.
Now that my version controller is done, I'm facing the same pattern again with each subsequent controller, except now I need to persist both the client and the version...
What is a preferred approach for persisting information like this across requests?
This seems very inefficient
That's what databases are made and optimized for: querying data based on fields. If you put indexes on those fields, it will be screamingly fast. Of course, Session, Cookies, and Cache are common techniques you could employ to limit the number of queries to the database, but you then have to accept the possible staleness of the data you get this way (if some other thread/process modifies the data in the database, you no longer get correct results).
So before doing any premature optimization, here's what I would recommend: hammer your database until you discover that it is actually a bottleneck for your application. Databases can become a bottleneck in some very high-traffic applications, where you should resort to one of the aforementioned techniques (or in some poorly written applications, of course, but let's exclude that possibility for the moment). The sketch below makes the staleness tradeoff concrete.
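
A minimal TypeScript sketch of a time-based cache in front of a lookup (fetchClientName is a hypothetical DB call; nothing here is MVC3-specific). Any change committed to the database within the TTL window is simply not seen by readers:

// Cache entries expire after ttlMs; until then, DB changes are invisible.
const cache = new Map<string, { value: string; expires: number }>();
const ttlMs = 60_000; // one minute

async function getClientName(
  id: string,
  fetchClientName: (id: string) => Promise<string> // hypothetical DB lookup
): Promise<string> {
  const hit = cache.get(id);
  if (hit && hit.expires > Date.now()) return hit.value; // possibly stale
  const value = await fetchClientName(id);
  cache.set(id, { value, expires: Date.now() + ttlMs });
  return value;
}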
You should use TempData, which allows you to pass data between the current and next HTTP requests. Be sure to keep in mind that it uses the session.
Greg Shackles has a great article all about TempData here
See this similar question: MVC3 multi step form - How to persist model object
