Improve Load Speed of ajax Select2

I am loading around 40,000 searchable entries in a select2 and it takes a long time to load. Is there any way of dumping the dataset to a faster source such as memcache or redis in order to massively improve loading of the select2 on search?

You can try the Transient API. When a persistent object cache such as Memcached or Redis is configured, WordPress stores transients in that cache instead of the database, which is exactly the kind of faster source you are asking about.
$my_data = array(
    1, 2, 3, 4, 5,
    //.....
    1000000
);
// Cache the array; 100000 is the expiration time in seconds.
set_site_transient( 'my_transient_data', $my_data, 100000 );
// Get the cached array.
$my_data = get_site_transient( 'my_transient_data' );
// Delete the transient.
delete_site_transient( 'my_transient_data' );
Transient API Codex
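To make the search itself fast, the ajax endpoint can then filter the cached array instead of hitting the database on every keystroke. A minimal sketch, assuming a hypothetical wp_ajax action named my_select2_search, that the Select2 ajax config sends the search term as 'q', and that each cached entry is an array with id and text keys (adjust to your real data shape):
add_action( 'wp_ajax_my_select2_search', 'my_select2_search' );
function my_select2_search() {
    // Assumes the Select2 ajax data function sends the search term as 'q'.
    $term    = sanitize_text_field( wp_unslash( isset( $_GET['q'] ) ? $_GET['q'] : '' ) );
    $entries = get_site_transient( 'my_transient_data' );
    if ( false === $entries ) {
        $entries = array(); // Rebuild and re-cache here if the transient has expired.
    }
    // Keep only entries whose text contains the search term.
    $matches = array_filter( $entries, function ( $entry ) use ( $term ) {
        return '' === $term || false !== stripos( $entry['text'], $term );
    } );
    // Select2 expects { results: [ { id: ..., text: ... }, ... ] }; cap the page at 50.
    wp_send_json( array( 'results' => array_slice( array_values( $matches ), 0, 50 ) ) );
}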

Slow on fetching records from DynamoDB table

I have a DynamoDB table with 151 records, table size is 666 kilobytes and average item size is 4,410.83 bytes.
This is my table schema:
uid - partition key
version - sort key
status - published
archived - boolean
... // other attributes
This is the scan operation I'm doing from AWS Lambda:
module.exports.searchPois = async (event) => {
  const klass = await getKlass.byEnv(process.env.STAGE, 'custom_pois')
  const status = (event.queryStringParameters && event.queryStringParameters.status) || 'published'
  const customPois = await klass.scan()
    .filter('archived').eq(false)
    .and()
    .filter('status').eq(status)
    .exec()
  return customPois
}
This request takes 7+ seconds to fetch. I'm thinking of adding a GSI so I can perform a query operation instead. But before I add it: is scan really this slow, and if I add a GSI, will it fetch faster, like 1-3 seconds?
Using Scan should be avoided if at all possible. For your use-case a GSI would be much more efficient, and a sparse index would be even better: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-indexes-general-sparse-indexes.html
That said, for the small number of items you have it should not take 7 seconds. This is likely caused by making infrequent requests to DynamoDB: DynamoDB relies on caching metadata to improve latency, and if your requests are infrequent that metadata will not exist in the cache, increasing response times.
I suggest re-using your connections: create your client outside of the Lambda event handler, and ensure you keep active traffic on the table.

How to reduce size of data in Laravel for ajax response?

I am fetching data from the database, which returns almost 5000 records. When I look at the resource size over the network, it is 10 MB.
The table has almost fifty columns. Is there any way I can reduce the size shown in the network tab of the browser?
Here is the query:
$consignments = Consignment::query();
$consignments->where('delivery_run_id', null);
$consignments = $consignments->orderBy('id', 'desc')->limit(5000)->get();
return Response::json(['status' => 'success', 'consignment' => $consignments]);
I tried to use select() but it had no effect.
Try this:
$consignments = $consignments->orderBy('id', 'desc')->limit(5000)->select(['column_one', 'column_two', 'column_three'])->get();
I solved it this way: I selected only the columns that were required for the ajax response and ignored all the others, which reduced the payload from 10 MB to 2.5 MB.
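For completeness, here is the whole query from the question with that fix applied - a sketch assuming hypothetical column names id, reference and status (use whatever columns your ajax response actually needs):
$consignments = Consignment::query()
    ->where('delivery_run_id', null)
    ->select(['id', 'reference', 'status']) // only the columns the response uses
    ->orderBy('id', 'desc')
    ->limit(5000)
    ->get();
return Response::json(['status' => 'success', 'consignment' => $consignments]);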

Laravel 5.8 pagination without Eloquent

I need custom pagination, but in my case I do not use Eloquent for fetching data. I use an API where I fetch data.
I've been looking into the Pagination class, but as far as I know it takes a collection and paginates it. That's not what I need.
What I need is to create a paginator object based on a subset of records returned by a search query. Let's say the query has a total of 10000 records but I only get back an array of 50 items, so each page has 50 elements. I need to create the pagination links based on this info.
Is there any way of accomplishing this?
EDIT:
$models = array(
    'total'        => $n_results,
    'per_page'     => 30,
    'current_page' => 1,
    'last_page'    => ceil($n_results / 30),
    'next_page'    => "******",
    'prev_page'    => "******",
    'from'         => 1,
    'to'           => 30,
    'data'         => $items
);
Based on what I understood, here is what I would do:
1- If you cannot pass a limit and offset on each call to the API and the API gives you all the results at once, but you want to show them 50 at a time, then I would create a temp table, insert the data into it, and treat it as if it were in my own database; then I would be able to sort, limit and offset by myself.
2- If the total result set is not that big (say, fewer than 500 rows) and you just don't want the user to be overwhelmed by all the results at once, you can load all the results in the Blade view and hide the elements beyond the current page. (It's not what I would recommend, but it will do the trick if you don't want a temp table.)
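For reference, Laravel also ships with Illuminate\Pagination\LengthAwarePaginator, which can be built by hand from exactly the pieces shown in the EDIT above (a page of items plus the known total). A minimal sketch, assuming $items holds the 50 records of the current page and $n_results is the total the API reports:
use Illuminate\Pagination\LengthAwarePaginator;

$perPage = 50;

$paginator = new LengthAwarePaginator(
    $items,                                             // current page of items from the API
    $n_results,                                         // total records the API reports
    $perPage,
    LengthAwarePaginator::resolveCurrentPage(),         // reads ?page= from the request
    ['path' => LengthAwarePaginator::resolveCurrentPath()]
);

// In Blade, $paginator->links() then renders the pagination controls.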

Simple pagination in datatable using ajax without sending total count from server

I'm using DataTables 1.10.5. My table uses server-side processing via ajax:
$('#' + id).dataTable({
    processing: true,
    serverSide: true,
    ajax: 'server-side-php-script-url',
    pagingType: 'simple_incremental_bootstrap'
});
Everything works properly if I send 'recordsTotal' in the server response. But I don't want to count the total entries because of performance issues. So I tried to use the pagination plugin simple_incremental_bootstrap. However it is not working as expected. The next button always returns the first page itself. If I send 'recordsTotal' in the server response, this plugin works properly. I found out that if we don't send 'recordsTotal', the 'start' param sent by DataTables to the server-side script is always 0, so my server-side script always returns the first page.
According to this discussion, server-side processing without calculating the total count is not possible because "DataTables uses the record count that is passed back to it to deal with the paging controls". The suggested workaround is: "So the display records are needed, but it would be possible to just pass back a static number (like 1'000'000 or whatever) which would make DataTables think there are a million rows. You could hide the information element if this information is totally bogus!"
I wonder if anybody have a solution for this. Basically I want to have a simple pagination in my datatable with ajax without sending total count from server.
A workaround worth trying:
If we don't send recordsTotal from the server, the pagination won't work properly. If we send a high static number as recordsTotal, the table shows an active Next button even if there is no data on the next page.
So I ended up with a solution which utilizes two parameters received by the ajax script - 'start' and 'length'.
If the number of rows on the current page is less than 'length', there is no data on the next page, so the total count is 'start' + the current page count. This disables the Next button on the last page.
If the number of rows on the current page is equal to 'length', there may be more data on the following pages, so I fetch the data for the next page. If there is at least one row on the next page, I send a recordsTotal larger than 'start' + 'length'. This displays an active Next button.
Sample code:
$draw   = require_param('draw');
$limit  = require_param('length');
$offset = require_param('start');

$current_page_data  = fn_to_calculate_data($limit, $offset); // in my case, a mysqli result.
$data               = mysqli_fetch_all($current_page_data, MYSQLI_ASSOC); // fetch the rows for this page.
$current_page_count = mysqli_num_rows($current_page_data);

if ($current_page_count >= $limit) {
    $next_page_data  = fn_to_calculate_data($limit, $offset + $limit);
    $next_page_count = mysqli_num_rows($next_page_data);
    if ($next_page_count >= $limit) {
        // Not the exact count, just an indication that we have more pages to show.
        $total_count = $offset + (2 * $limit);
    } else {
        $total_count = $offset + $limit + $next_page_count;
    }
} else {
    $total_count = $offset + $current_page_count;
}
$filtered_count = $total_count;

send_json(array(
    'draw'            => $draw,
    'recordsTotal'    => $total_count,
    'recordsFiltered' => $filtered_count,
    'data'            => $data
));
However, this solution adds some load to the server as it additionally calculates the row count of the next page. Anyway, that is a small load compared to calculating the total row count.
We also need to hide the count information in the table footer and use simple pagination:
dtOptions = {};
dtOptions.pagingType = "simple";
dtOptions.fnDrawCallback = function() {
    $('#' + table_id + "_info").hide();
};
$('#' + table_id).dataTable(dtOptions);

Insert lots of data at once using Laravel migrations?

I currently parse a CSV file to insert data into a database, but the problem is that because it's 20,000 rows, it takes a very long time. Is there a way to insert more rows at once using Laravel migrations?
This is what I am doing at the moment:
foreach ($towns as $town) {
    DB::table('town')->insert(
        array(
            // data goes here
        )
    );
}
I think maybe my question is a bit vague. I want to know what the format is for mass inserting multiple items in one query, and whether this will actually make a difference in speed.
You can mass insert by filling an array with your data:
foreach ($towns as $town) {
    $array[] = array( /* ... your data goes here ... */ );
}
And then run the insert just once:
DB::table('town')->insert($array);
But I really don't know how much faster it can be. You can also disable query log:
DB::disableQueryLog();
It uses less memory and is usually faster.
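One caveat: a single insert of 20,000 rows can run into database limits such as MySQL's max_allowed_packet or the prepared statement placeholder limit, so the batch is commonly split into chunks. A minimal sketch using array_chunk on the same $array as above (the chunk size of 500 is an arbitrary choice):
// Insert in batches of 500 rows to stay under packet/placeholder limits.
foreach (array_chunk($array, 500) as $chunk) {
    DB::table('town')->insert($chunk);
}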
