I need to generate a report from a database with thousands of records. The report is generated on a monthly basis, and at times the user might want a report spanning something like 3 months. Based on the current records, a single month's data set can already reach around 5000 rows.
I am currently using vue-excel, which makes a call to a Laravel API; the API returns a resource that vue-excel then exports. The resource does not return only the model data; there are related data sets I also need to fetch.
This works fine for smaller data sets, i.e. when I am fetching around 3000 records, but for anything larger the server times out.
I have also tried Laravel Excel with the query concern. I actually timed both approaches and they take the same amount of time, because Laravel Excel was also mapping the rows to fetch the relations.
So basically, my question is: is there a better way to do this so that I can get this data faster and avoid the timeouts?
Just put this at the start of the function:
ini_set('max_execution_time', 84000); // 84000 is in seconds
This will override the default maximum script execution time for that request.
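For reference, the Laravel Excel route mentioned in the question can also be queued, so that the chunked query and the relation mapping run outside the web request entirely. A minimal sketch, assuming maatwebsite/excel 3.x; the Report model, the relation names and the mapped columns are placeholders, not the actual schema:

<?php

namespace App\Exports;

use App\Models\Report;                       // placeholder model
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithMapping;

class MonthlyReportExport implements FromQuery, WithMapping, ShouldQueue
{
    use Exportable;

    public function __construct(private string $from, private string $to)
    {
    }

    public function query()
    {
        // Eager-load the related data once per chunk instead of once per row.
        return Report::query()
            ->with(['customer', 'items'])
            ->whereBetween('created_at', [$this->from, $this->to]);
    }

    public function map($report): array
    {
        return [
            $report->id,
            $report->customer->name,
            $report->items->sum('amount'),
        ];
    }
}

// Usage (e.g. in a controller): queue the export and hand the user the file when the job finishes.
// (new \App\Exports\MonthlyReportExport('2024-01-01', '2024-03-31'))->queue('reports/q1.xlsx');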
I wanted to check how quickly my web application will display results for the query SELECT * FROM orders.
The query returns about 20k records on one page and takes about 15 seconds.
Why does the response stop after two seconds in every browser? Is it because the browser has trouble displaying so many records on one page? At 70k records it runs out of memory.
Database: MySQL on a hosting provider.
(Screenshots: the problem vs. the correct response time.)
If you want to check how long the web app takes to process the request, you can add logging before and after running the query.
You could also log the current time when receiving the request and just before returning the response.
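A minimal sketch of that timing idea, assuming the backend is a Laravel controller; the controller name, table and log messages are placeholders:

<?php

namespace App\Http\Controllers;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;

class OrderController extends Controller
{
    public function index()
    {
        $received = microtime(true);                 // when the request arrives

        $queryStart = microtime(true);
        $orders = DB::table('orders')->get();        // the SELECT * FROM orders
        Log::info('query took '.round(microtime(true) - $queryStart, 3).'s');

        Log::info('total took '.round(microtime(true) - $received, 3).'s');  // just before responding
        return response()->json($orders);
    }
}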
As for why the request stops after two seconds, I don't think we have enough information to decide.
It could come from the default configuration of the web server you use.
In my opinion, displaying 20k records might not be an efficient approach.
Besides the query time and the response time, you might also want to consider the looping that happens on the front end.
Personally, I would recommend paging at a lower number, and if you need to display all the data at once, you might consider lazy loading as an option (see the sketch below).
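For instance, a minimal sketch of the paging idea, assuming a Laravel backend (the route and page size are arbitrary); the front end then requests one page at a time instead of all 20k rows:

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Route;

Route::get('/orders', function () {
    // 50 rows per page; the client asks for ?page=2, ?page=3, ... as it needs them.
    return DB::table('orders')->orderBy('id')->paginate(50);
});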
I know this is a very generic answer, but hopefully, this could help you out.
I have an around 6 GB collection with more than 7 lakh (700,000) records.
I am trying the query below:
Location::whereIn('location_id', $id)->get();
But it is taking more than 15 seconds to fetch data for 200 records. I cannot use pagination; I need it in one go.
I am using Laravel's jenssegers/mongodb package.
Searching large data is what's taking the time, likely not the small response.
Look into adding indexes.
Another option would be archiving old data, so that the data set being queried is smaller.
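For the index, here is a minimal sketch assuming the jenssegers/mongodb package from the question; the connection name and collection name are assumptions:

use Illuminate\Support\Facades\Schema;

Schema::connection('mongodb')->table('locations', function ($collection) {
    // Index the field used in the whereIn() lookup so MongoDB can avoid scanning the whole collection.
    $collection->index('location_id');
});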
I have to pull data from a table that has 2 million rows. The Eloquent query looks like this:
$imagesData = Images::whereIn('file_id', $fileIds)
->with('image.user')
->with('file')
->orderBy('created_at', 'DESC')
->simplePaginate(12);
The $fileIds array used in whereIn can contain 100s or even 1000s of file ids.
The above query works fine on a small table, but on the production site, which has over 2 million rows in the Images table, it takes over 15 seconds to get a reply. I use Laravel for the API only.
I have read through other discussions on this topic. I changed paginate() to simplePaginate(). Some suggest that a DB:: query with whereRaw might work better than whereIn; some say it might be due to how PDO in PHP processes whereIn; and some recommend using Images::whereIn, which I already use.
I use MariaDB with InnoDB as the DB engine, and the data is loaded into RAM. SQL performs well for all other queries; only the ones that have to gather data from a huge table like this take time.
How can I optimise the above Laravel query to bring the response down to a couple of seconds, if possible, when the table has millions of rows?
You need indexing, which segments your data by certain columns. You are filtering on file_id and sorting on created_at, so the following composite index will help performance:
$table->index(['file_id', 'created_at']);
Indexing will increase insert time and can sometimes produce unexpected execution plans. If you run SQL EXPLAIN on the query before and after adding the index, you can verify that it actually helps.
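A quick way to do that from Laravel (a sketch, reusing the query from the question; adjust the Images import to your namespace):

use App\Models\Images;               // adjust to your model's namespace
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;

$query = Images::whereIn('file_id', $fileIds)->orderBy('created_at', 'DESC');

// MariaDB/MySQL: prepend EXPLAIN to the generated SQL and reuse the bindings.
$plan = DB::select('EXPLAIN '.$query->toSql(), $query->getBindings());
Log::info(print_r($plan, true));     // run before and after the index migration and compare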
Here is an update on steps taken to speed up the page load.
The real culprit for such a slow query was not the above query alone. After that query fetches the data, the PHP code iterates over it and runs a sub-query inside the loop to check something. That sub-query searched by the filename column on each iteration; since filename is a string and not indexed, each sub-query crawled through 1.5 million rows inside a foreach loop, which is why the controller took so long to respond. Once I removed this sub-query, the loading time dropped dramatically.
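Purely to illustrate that change (the image_meta table and its columns are invented for the example, not the real schema): the per-row lookup by unindexed filename versus a single query keyed by the indexed id:

use Illuminate\Support\Facades\DB;

// Before: one sub-query per iteration, scanning the unindexed filename column every time.
foreach ($imagesData as $image) {
    $meta = DB::table('image_meta')->where('filename', $image->filename)->first();
}

// After: one query keyed by the indexed id, then an in-memory lookup per row.
$metaById = DB::table('image_meta')
    ->whereIn('image_id', $imagesData->pluck('id'))
    ->get()
    ->keyBy('image_id');

foreach ($imagesData as $image) {
    $meta = $metaById->get($image->id);
}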
Secondly, I added the index to file_id and created_at as suggested by #mrhn and #ceejayoz above. I created a migration file like this:
Schema::table('images', function (Blueprint $table) {
$table->index(['file_id', 'created_at']);
});
Optimised the PHP script further. I removed all queries that search by fileName and changed them to fetch results by id. Doing this made a huge difference throughout the app and also improved server speed due to less CPU work during peak hours.
Lastly, I optimised the server by performing the following steps:
Completed any yum updates
Updated Litespeed
Tweaked the Apache profile; this included enabling some caching modules and installing EA-PHP 7.3
Updated cPanel
Tuned MySQL to allow the server to utilize more of its resources.
Once I applied all of the above steps, I found a huge difference in the loading speed. Here are the results:
(Before and after screenshots of the load times omitted.)
Thanks to everyone who has commented on this. Your comments helped me perform all of the above steps, and the result was fruitful. Thanks heaps.
I am using the DataTables plugin in Laravel. I have about 3000 records in some of my tables.
But when I load such a page, it loads all 3000 records in the browser and then creates the pagination, which slows down the page load.
How can I fix this, or what is the correct way to handle it?
Use server-side processing.
Get help from a Laravel package, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
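A minimal sketch of what that looks like with the Yajra package; the Record model and the route are assumptions:

use App\Models\Record;                        // assumed model
use Illuminate\Support\Facades\Route;
use Yajra\DataTables\Facades\DataTables;

Route::get('/records/data', function () {
    // Builds the paged/sorted/filtered response DataTables expects,
    // so only the current page of rows is sent to the browser.
    return DataTables::of(Record::query())->toJson();
});

On the front end, the table then needs serverSide: true and an ajax option pointing at that route.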
Generally you can solve pagination either on the front end, the back end (server or database side), or a combination of both.
Server-side processing without a package would mean setting up TOP/FETCH or otherwise limiting the rows being returned from your server.
You could also load a small amount (say 20) and then when the user scrolls to the bottom of the list, load another 20 or so. I mention the inclusion of front end processing as well because I’m not sure what your use cases are, but I imagine it’s pretty rare any given user actually needs to see 3000 rows at a time.
Given that Data Tables seems to have built-in functionality for paginating data, I think that #tersakyan is essentially correct — what you want is some form of back-end filtering or paginating of rows of data to limit what’s being sent to the front end.
I don't know if that package works for you or what your setup looks like, but pagination can also be achieved directly from the database via SQL (using TOP/FETCH, for example), or it could be implemented in a controller or service by tracking pages of data and "loading a page at a time", both from the server and then into the table. All you would need is a unique key to associate each "set of pages" with a specific request.
But for performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is grabbed or processed at any stage of your application, the more performant your application will be in principle.
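To make the package-free, page-at-a-time option concrete, here is a rough sketch of a Laravel endpoint that honours the start/length/draw parameters DataTables sends in server-side mode; the Record model and the route are assumptions:

use App\Models\Record;                        // assumed model
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::get('/records/data', function (Request $request) {
    $total = Record::count();

    // Return only the slice of rows the table asked for.
    $rows = Record::query()
        ->skip((int) $request->input('start', 0))
        ->take((int) $request->input('length', 20))
        ->get();

    return response()->json([
        'draw'            => (int) $request->input('draw', 1),
        'recordsTotal'    => $total,
        'recordsFiltered' => $total,          // no search filter applied in this sketch
        'data'            => $rows,
    ]);
});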
I have an application that aggregates data from different social network sites. The back-end processing is done in Java and works great.
Its front end is a Rails application. The deadline for some analytics filter and report tasks was 3 weeks; a few days are still left and it is almost complete.
When I started, I implemented map-reduce for different states, and it worked great over 100,000 records on my local machine.
Then my colleague gave me the current, updated database with 2.7 million records. My expectation was that it would still run fine, since I specify a date range and filter before the map_reduce execution; my belief was that map_reduce would only run over the result set of that filter, but that is not the case.
Example
I have a query that just shows stats for records loaded in the last 24 hours.
The result comes back as 0 records found, but only after 200 seconds with 2.7 million records; before, it came back in milliseconds.
Code example below.
filter is a hash of the conditions expected to be applied before map_reduce.
(The map and reduce functions are not shown here.)
SocialContent.where(filter).map_reduce(map, reduce).out(inline: true).entries
Suggestions please: what would be the ideal solution in the remaining time frame, given that the database is growing exponentially day by day?
I would suggest you look at a few different things:
Does all your data still fit in memory? You have a lot more records now, which could mean that MongoDB needs to go to disk a lot more often.
M/R can not make use of indexes. You have not shown your Map and Reduce functions so it's not possible to point out mistakes. Update the question with those functions, and what they are supposed to do and I'll update the answer.
Look at using the Aggregation Framework instead; it can make use of indexes and also run concurrently. It's also a lot easier to understand and debug. There is information about it at http://docs.mongodb.org/manual/reference/aggregation/