Laravel limit records with pagination - laravel-5

I'm using Laravel 5.2 and I cannot limit the result when working with pagination. When I try $query->take($x), the result is x records, but when I try $query->take($x)->paginate(5), it gives me all records, five per page.
Could anyone who has worked with Laravel pagination give a hint on how to resolve this problem?
Thank you so much!

It is not just about fetching the records; you will also have to add a number of features to the grid, e.g. export, searching, filtering, etc. I have used the Nayjest grid and would recommend you use its successor instead. I ran into an issue, contacted the community, and this was their reply:
Overall, I would recommend migrating to the next major version of grids, which has moved to a separate repository: https://github.com/view-components/grids
Tip: if you make heavy use of nayjest/grids, you can run view-components/grids in parallel for your new reports.
So you should go for:
https://github.com/view-components/grids
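As for the limiting itself: paginate() builds its own LIMIT/OFFSET clause, so an earlier take() gets overwritten. A common workaround (a minimal sketch, assuming $query is an Eloquent/query builder and using Laravel 5.2's LengthAwarePaginator; $cap and $perPage are illustrative values) is to fetch the capped set and paginate it manually:

```php
use Illuminate\Pagination\LengthAwarePaginator;

$cap     = 50; // hard limit on the total number of records
$perPage = 5;
$page    = LengthAwarePaginator::resolveCurrentPage();

// Fetch at most $cap records, then slice out the current page.
$capped = $query->take($cap)->get();
$items  = $capped->forPage($page, $perPage);

$paginator = new LengthAwarePaginator(
    $items,
    $capped->count(), // total is the capped count, not the full table count
    $perPage,
    $page,
    ['path' => LengthAwarePaginator::resolveCurrentPath()]
);
```

Fetching all $cap rows on every request is fine for a small cap; for a large one, push the slicing into the query with skip()/take() and clamp the last page yourself.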

Related

How to look for S&P 500 Constituents history, added and removed dates etc

I am trying to get a historical list of the S&P 500 constituents: all tickers, the dates they were added to the index, and the dates they were removed, so that I can reconstruct the mix for each period over the years. I did some searching but don't seem to have any luck.
If anyone can provide some good search keywords or suggest a place to look, it would be appreciated.
This is something very specific.
I currently use backtrader to work on some data. If there is a systematic way to get the data, please let me know as well.
Many thanks.
You can access this data systematically in QuantRocket, via data provider Sharadar:
https://www.quantrocket.com/data/?filter=sharadar

Pagination using jqgrid for randomly changing data

I am developing a page in an application using Spring and Hibernate. On that page I have to display records fetched from the database in a datatable using jqGrid. The problem here is that records are added to the database by the hundreds every hour; the total is already 300,000 and keeps growing. What is the best solution to this problem (in terms of pagination and loading data)?
Here is a solution for this:
store your data in the session after retrieving it from the database and apply paging there.
http://docs.spring.io/spring/docs/3.0.x/api/org/springframework/beans/support/PagedListHolder.html
I hope this will work.
Yes, I get your problem.
Twitter faces the same problem when returning tweets,
and solves it using max_id and since_id.
The documentation for that flow is here; please refer to it:
https://dev.twitter.com/docs/working-with-timelines
You can apply the same algorithm in your paging and it will solve the issue; a sketch follows.
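In essence, max_id/since_id is keyset pagination: each page is anchored to the last id the client saw, so rows inserted in the meantime cannot shift page boundaries. A minimal sketch of the query shape (in PHP/PDO only for brevity; the table and column names are hypothetical, and the WHERE id < :maxId pattern translates directly to a Hibernate query):

```php
// Keyset (max_id) pagination: page boundaries are anchored to an id,
// so rows inserted after the first request cannot shift existing pages.
function fetchPage(PDO $db, ?int $maxId, int $pageSize): array
{
    $sql = 'SELECT id, payload FROM records'
         . ($maxId !== null ? ' WHERE id < :maxId' : '')
         . ' ORDER BY id DESC LIMIT :limit';

    $stmt = $db->prepare($sql);
    if ($maxId !== null) {
        $stmt->bindValue(':maxId', $maxId, PDO::PARAM_INT);
    }
    $stmt->bindValue(':limit', $pageSize, PDO::PARAM_INT);
    $stmt->execute();

    // The caller passes the smallest id from this page as maxId next time.
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
```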
Does this solve your issue?

Joomla getItems running out of memory with big results set

I've got just over 10,000,000 records in the database of my component, and I think getItems/getListQuery is trying to load every single one of them into memory. The search form on the site is extremely slow or comes back saying PHP is out of memory.
phpMyAdmin seems to be able to handle displaying this data - why not Joomla?
The strange thing is that the items are then displayed correctly using the globally set list limit of 5 per page.
I've just looked and Joomla's cache is disabled - is that screwing me up here?
Many thanks in advance!
I fixed it in the end by copying getPagination, getTotal, getItems, etc. from the library's list.php into my model (to override them). Then, in each method, I made sure the results were returned directly instead of being sent to the cache.
The stock getTotal function seems to count the fetched rows instead of doing a separate COUNT(*). That's OK with a few thousand records, but over half a million it's asking for trouble! A sketch of the override is below.
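For anyone attempting the same fix, a minimal sketch of the getTotal() part (assuming a Joomla JModelList subclass; the model class name is hypothetical and the clone-and-clear approach is illustrative, not the library's own code):

```php
class MyComponentModelItems extends JModelList
{
    // Override getTotal() so the total comes from a real COUNT(*)
    // instead of fetching every row just to count it.
    public function getTotal()
    {
        $db = $this->getDbo();

        // Reuse the WHERE clauses of getListQuery(), but select only the count.
        $query = clone $this->getListQuery();
        $query->clear('select')->clear('order')->select('COUNT(*)');

        $db->setQuery($query);

        return (int) $db->loadResult();
    }
}
```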

Better than Jqgrid?

I need some advice. I am using jqGrid as a grid tool, but I am running into complications showing very big data: I am talking about 450 records with large values across 10 columns.
Is there a better grid you would suggest working with that gives better performance?
I've always used DataTables and I always recommend it because it's easy to use and configure. It is fast at displaying data (I have used it with tables of more than 100,000 rows with no problems, once server-side processing is configured correctly; see the sketch below).
The only thing you must know is that (as far as I know) it doesn't support colspan in the body of the table, so if your layout requires that, using DataTables becomes impossible. (I usually found other ways to show things rather than using colspan, but for some this is a blocker.)
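To make "configuring server-side processing correctly" concrete: in that mode DataTables sends draw/start/length parameters and expects a JSON envelope back, so the server returns one page at a time. A minimal sketch of such an endpoint (the PDO connection, table, and column names are placeholders; search filtering is omitted):

```php
<?php
// Sketch of a DataTables server-side processing endpoint.
// $pdo, the table, and the columns are assumptions for illustration.
$draw   = (int) ($_GET['draw'] ?? 1);
$start  = (int) ($_GET['start'] ?? 0);
$length = (int) ($_GET['length'] ?? 10);

$total = (int) $pdo->query('SELECT COUNT(*) FROM items')->fetchColumn();

// Fetch only the requested page, never the whole table.
$stmt = $pdo->prepare('SELECT id, name FROM items ORDER BY id LIMIT :start, :length');
$stmt->bindValue(':start', $start, PDO::PARAM_INT);
$stmt->bindValue(':length', $length, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode([
    'draw'            => $draw,
    'recordsTotal'    => $total,
    'recordsFiltered' => $total, // no search filter applied in this sketch
    'data'            => $stmt->fetchAll(PDO::FETCH_NUM),
]);
```

The key point is that only $length rows ever leave the database per request, so the total table size stops mattering.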

Magento - Migrate products by copying all database tables with catalog_ prefix

In order to migrate only the products and categories, I manually copied all the database tables with the catalog_ prefix from one db to another, and it seems to have worked rather well... so far.
But does anyone know if there is anything potentially bad in doing this?
It might be bad if you have custom EAV attributes. Also, even core EAV attribute ids can mismatch on different Magento instances (if you installed different Magento versions). One quick sanity check is sketched below.
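One way to check that risk before trusting the copied tables (a sketch; the connection details are placeholders, and entity_type_id = 4 is catalog_product on a stock Magento 1 install - verify it on your own instances):

```php
<?php
// Sketch: compare product EAV attribute ids between two Magento 1 databases.
function attributeMap(PDO $db): array
{
    $stmt = $db->query(
        'SELECT attribute_code, attribute_id FROM eav_attribute WHERE entity_type_id = 4'
    );
    return $stmt->fetchAll(PDO::FETCH_KEY_PAIR);
}

$source = attributeMap(new PDO('mysql:host=localhost;dbname=shop_old', 'user', 'pass'));
$target = attributeMap(new PDO('mysql:host=localhost;dbname=shop_new', 'user', 'pass'));

// Any attribute whose id differs will break the copied catalog_ value tables.
foreach ($source as $code => $id) {
    if (isset($target[$code]) && $target[$code] !== $id) {
        echo "Mismatch: $code is $id in source, {$target[$code]} in target\n";
    }
}
```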
Time will tell. The tables in Magento are pretty much all relational - so if you've missed something with a foreign key dependency, you're bound to run into issues.
What about your custom attributes, attribute sets, historic orders that relate to a certain entity ID, etc.?
You would be better off exporting and re-importing your catalogue for a "cleaner" approach, although it will take some time if you have a large catalogue (100k+).
Have a look at Unirgy RapidFlow - it supports the features you're looking for, and we recommend it to a lot of clients as a drop-in replacement for Dataflow.
Thanks for the answers, guys.
In case anyone is thinking of trying this, some issues did creep in. When creating new products through the admin, I suddenly found I couldn't get them to show up in the front-end.
Also, (this may or may not have been related) I noticed the image upload buttons seemed to have vanished in the Add Product screen.
In the end, the paranoia got too much and I was attributing every glitch to the potentially ropey DB migration. I scrapped it and took a totally different approach.
