Get number of returned rows from a query using DB::listen() - Laravel

I'm adding some database logging to a Laravel (5.8) application and I have registered a DB::listen callback, but it seems I'm fairly limited to the data the $query object has populated.
It does have the statement and the time taken to execute, so it must be logged after the query is run, so it would make sense for it to be possible to return the number of rows impacted/returned.
I've configured a custom channel for the DB logs, and only enable them when a config value is set.
My implementation looks like the below.
if (config('app.sql_profiler')) {
    DB::listen(function ($query) {
        Log::channel('db')->debug(
            $query->sql,
            [$query->bindings, $query->time]
        );
    });
}
I would like to extend it to look like
if (config('app.sql_profiler')) {
    DB::listen(function ($query) {
        Log::channel('db')->debug(
            $query->sql,
            [
                $query->bindings,
                $query->time,
                // add $query->resultCount.
            ]
        );
    });
}
Any suggestions as to where to begin looking would be very helpful.
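For reference, the object passed to the listener in 5.8 is an Illuminate\Database\Events\QueryExecuted event. A minimal sketch of what it exposes (note it has no built-in row/result count property):

use Illuminate\Database\Events\QueryExecuted;
use Illuminate\Support\Facades\DB;

DB::listen(function (QueryExecuted $query) {
    // Public properties available on the event in Laravel 5.8:
    $sql      = $query->sql;            // the executed statement
    $bindings = $query->bindings;       // the bound parameters
    $time     = $query->time;           // execution time in milliseconds
    $conn     = $query->connection;     // the Illuminate\Database\Connection instance
    $name     = $query->connectionName; // e.g. "mysql"
});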

Related

How to make a pagination API in Laravel using the POST method

Suppose I need to fetch user money transactions. There are 100+ transactions in the database and I need to send all the user transactions through the API to an Android app. I know how to do this using the GET method, but using GET it's not dynamic.
In the API I'm sorting data by 4-5 parameters passed in the API input using the POST method.
I want this API to support infinite scrolling.
I'm using a stored procedure for getting the data.
So how can I achieve Laravel pagination with the POST method?
My current response looks something like this:
{
    "Transactions": [
        {
            "name": "food",
            "amount": 100
        }
    ]
}
Actually, you can do it in multiple ways, but the way I use most is like below:
public function getTransaction(Request $request)
{
    $transactions = Transactions::where('SOME-Condition', $request->Condition)->paginate(10);

    $sorted = $transactions->sortBy([
        ['name', 'asc'],
        ['age', 'desc'],
    ]);

    return $sorted;
}
Or you can also do it like this:
public function getTransaction(Request $request)
{
    $transactions = Transactions::where('SOME-Condition', $request->Condition)
        ->orderBy('abc')
        ->orderBy('def')
        ->paginate(10);

    return $transactions;
}
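Since the page number arrives in the POST body rather than the query string, you can also read it from the request and pass it explicitly as the fourth argument to paginate(). A minimal sketch along the lines of the answer above (the Transactions model and the condition are the same placeholders used there):

public function getTransaction(Request $request)
{
    // paginate($perPage, $columns, $pageName, $page) accepts the page explicitly,
    // so it can come from the POSTed body instead of ?page=N.
    $page = (int) $request->input('page', 1);

    $transactions = Transactions::where('SOME-Condition', $request->Condition)
        ->orderBy('abc')
        ->orderBy('def')
        ->paginate(10, ['*'], 'page', $page);

    return $transactions;
}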

How to attach the id of a table row to the request in Laravel while validating?

The Laravel form request validator runs a query when using the exists rule:
'item_name' => 'required|exists:items,name'
After validating, to save the data I need to run the same query again to find items.id.
Can I prevent this extra query?
I'm using this validation for 1000 rows in a CSV. Please suggest other optimization techniques if any.
I finally used a collection of all items, loaded inside the FormRequest constructor:
Item::all();
This fetches a huge amount of data, but it prevents running the query n times.
Inside the withValidator function:
public function withValidator(Validator $validator, array $row): Validator
{
    $validator->after(function ($validator) use ($row) {
        $exists = $this->item_collection->firstWhere('id', $this->id);

        if (! $exists) {
            $validator->errors()->add('id', 'id not found');
        }
    });

    return $validator;
}
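If the firstWhere() scan over the whole collection itself becomes slow for 1000 rows, one refinement (a sketch, assuming the same $this->item_collection property populated from Item::all() as above) is to key the collection once so each per-row check is a hash lookup:

// Assumption: $this->item_collection was loaded once via Item::all().
// keyBy('id') builds an id-indexed map once, so has() is O(1) per row
// instead of a linear firstWhere() scan.
$itemsById = $this->item_collection->keyBy('id');

$validator->after(function ($validator) use ($itemsById) {
    if (! $itemsById->has($this->id)) {
        $validator->errors()->add('id', 'id not found');
    }
});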

How to utilize Laravel Cache in API?

In my company we have three user roles: admin, physician and client. All of them can view one of the record tables, which has about 1 million rows, and we need to cache the results from the database.
I've read dozens of posts on Stack Overflow and elsewhere, but I'm still trying to figure out the proper way to cache.
What I've read is that the proper way is to cache per page, so I cache page 1, page 2, etc. based on the user's page selection. This all works fine.
BUT each user role sees a different dataset with different filters selected by them, and this is where the problem starts. Caching the results and then filtering the paginated 10 rows seems kind of redundant.
Should I cache results for each user role with the selected parameters?
Or should I cache all the results first, then load the needed relationships, filter the collection with the parameters from the user, and then create the pagination?
Or shouldn't I be using a cache at all in this example and just use simple pagination?
// Set the cache time
$time_in_minutes = 5 * 60;

// Request page and if not set then default page is 1
$page = $paginationObject['page'];

// Set items per page
$per_page = $paginationObject['perpage'] ? $paginationObject['perpage'] : 10;

// Set the cache key based on country
$cache_key = "l04ax_pct_dispensing_forms_{$request->get('country')}_page_{$page}_per_page_$per_page";
// Cache::forget($cache_key);

// Set base query for results
$baseQuery = $this->model->with(['details', 'patient']);

// Assign appropriate relations based on user role
if (Auth::user()->isPhysician()) {
    $baseQuery->physicianData();
} else if (Auth::user()->isManufacturer()) {
    $baseQuery->manufacturerData();
} else if (Auth::user()->isSuperAdmin() || Auth::user()->isAdmin()) {
    $baseQuery->adminData();
}

//--------------------------------------
// Add filtering params from request
//--------------------------------------
$baseQuery->when($request->has('atc_code'), function ($query) use ($request) {
    if ($request->get('atc_code') === NULL) {
        throw new RequestParameterEmpty('atc_code');
    }
    $query->whereHas('details', function ($subQuery) use ($request) {
        $subQuery->where('atc_code', $request['atc_code']);
    });
})
->when($request->has('id'), function ($query) use ($request) {
    if ($request->get('id') === NULL) {
        throw new RequestParameterEmpty('id');
    }
    $query->where('l04ax_dispensing_forms.id', $request['id']);
})
->when($request->has('pct_patients_hematology_id'), function ($query) use ($request) {
    if ($request->get('patient_id') === NULL) {
        throw new RequestParameterEmpty('patient_id');
    }
    $query->where('patient_id', $request['patient_id']);
})
->when($request->has('physician_id'), function ($query) use ($request) {
    if ($request->get('physician_id') === NULL) {
        throw new RequestParameterEmpty('physician_id');
    }
    $query->where('physician_id', $request['physician_id']);
})
->when($request->has('date'), function ($query) use ($request) {
    if ($request->get('date') === NULL) {
        throw new RequestParameterEmpty('date');
    }
    $query->whereDate('created_at', Carbon::parse($request->get('date'))->toDateString());
})
->when($request->has('deleted'), function ($query) use ($request) {
    if ($request->get('only_deleted') === NULL) {
        throw new RequestParameterEmpty('only_deleted');
    }
    $query->onlyTrashed();
})
->when($request->has('withTrashed'), function ($query) use ($request) {
    if ($request->get('withTrashed') === NULL) {
        throw new RequestParameterEmpty('withTrashed');
    }
    $query->withTrashed();
});

// Remember results per page into cache
return Cache::remember($cache_key, $time_in_minutes, function () use ($baseQuery, $per_page, $page) {
    return new L0axPctDispensingFormsCollection($baseQuery->paginate($per_page, ['*'], 'page', $page));
});
In this example the results are cached per page, but when a different user logs in, the results are wrong.
What would be the best way to approach this?
I wouldn't recommend caching this because of the problem you have already encountered. Caching is massively helpful in some areas (e.g. for reference data like a persistent list of countries or currencies), but for user-specific data I would avoid it.
If you really did want to cache, you could use cache tagging (supported by taggable stores such as Redis) to tag by user id. However, as mentioned, I wouldn't recommend it in this scenario!
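If you do go down the tagging route, a rough sketch (assuming a taggable store such as Redis, and reusing the key and variables from the question) would look something like:

// Hypothetical sketch only: tags require a taggable cache store (e.g. Redis).
// Tagging by user id keeps each user's cached pages separate and flushable.
$user = Auth::user();

return Cache::tags(['dispensing_forms', "user_{$user->id}"])
    ->remember($cache_key, $time_in_minutes, function () use ($baseQuery, $per_page, $page) {
        return new L0axPctDispensingFormsCollection(
            $baseQuery->paginate($per_page, ['*'], 'page', $page)
        );
    });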
If your desire to cache is driven by pages loading slowly, I would recommend installing Laravel Debugbar and checking how many queries your API calls are generating.
If you find a single API call is generating more queries than the number of records you are loading, then you likely have the 'n + 1 problem' and need to eager load any nested relationships rather than call them in your resource.
P.S. You can immediately reduce the number of queries generated by this controller method by only calling Auth::user() once, e.g. $user = Auth::user() and then $user->isSuperAdmin().
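For example, a sketch of that refactor applied to the role checks from the question:

// Resolve the authenticated user once and reuse it for every role check.
$user = Auth::user();

if ($user->isPhysician()) {
    $baseQuery->physicianData();
} elseif ($user->isManufacturer()) {
    $baseQuery->manufacturerData();
} elseif ($user->isSuperAdmin() || $user->isAdmin()) {
    $baseQuery->adminData();
}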

Laravel 6 whereHas error 500 when using AJAX

I'm new to Laravel and facing an interesting issue right now in my app.
I have 3 tables.
Producers
id
producer_name
Types
id
type_name
Models
id
model_name
device_type_id
device_producer_id
Within my Producers model I have defined the following filter method:
public function scopeFilterByType($query, $type_id)
{
    $query->whereHas('models', function ($q) use $type_id { $q->where('device_type_id', $type_id); });
}
Using Tinker I can do the following:
App\DeviceProducer::filterByType(3)->get()
And get a full response with my Producers associated with the given type.
I created a function so that when a user selects a device type, AJAX will load all Producers of that type.
public function reqProducer(Request $request)
{
    $producers = DeviceProducer::filterByType($request->type_id)->get();

    return response()->json($producers);
}
But when AJAX calls my endpoint it gets an HTTP 500 error.
I figured out that when using a request without whereHas, for example:
$producers = DeviceProducer::where('id', $request->producer_id)->get();
it just works fine and I get my results. So it seems to have something to do with whereHas. I know I could solve this by first querying the Models table and then creating a foreach loop, but that solution would be less readable than my first attempt.
Does anyone have a suggestion as to what I'm doing wrong, or is there simply no AJAX support for whereHas queries?
Kind regards
Mike
I think this is your issue: use $type_id.
Please fix it as:
public function scopeFilterByType($query, $type_id)
{
    $query->whereHas('models', function ($q) use ($type_id) {
        $q->where('device_type_id', $type_id);
    });
}
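If the project is on PHP 7.4 or later (an assumption - Laravel 6 itself only requires 7.2), the same scope can be written with a short closure, which captures $type_id automatically and avoids the use clause that caused the error:

public function scopeFilterByType($query, $type_id)
{
    // fn() closures auto-capture $type_id, so there is no `use` clause to get wrong.
    return $query->whereHas('models', fn ($q) => $q->where('device_type_id', $type_id));
}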

Can I optimize this script updating ~6000 rows with a lot of data

I have ~5-6k $items that I need to update in the database. Each item needs an HTTP request to get the data from the page. In the HTTP GET request I get arrays that are massive (~500-2500 entries) and I need to insert only those lines that are not already in the database. It seems to take a lot of time with my current script (1 item every 2-4 minutes) on my Vagrant Scotch Box.
Simplified example:
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use App\Http\Requests;
use GuzzleHttp\Client;
use App\Item;
use App\ItemHistory;
use Carbon\Carbon;
use DB;

class UpdateController extends Controller
{
    public function getStart()
    {
        // Don't cancel the script
        ignore_user_abort(true);
        set_time_limit(0);

        $client = new Client();

        $items = Item::where('updated_at', '<=', Carbon::now()->subDay())->get();

        foreach ($items as $item) {
            $response = $client->request('GET', 'API_URL');

            // get the body
            $body = $response->getBody()->getContents();

            $hugeArray = $body['history']; // can be from 100 to 5 000 lines and I use regex to get the "history" array from the body

            $arrayCollection = collect($hugeArray);

            foreach ($arrayCollection->take(-100) as $row) { // I take the last 100 since each row = 1 hour, so I get items in the last 100 hours
                $date = new \DateTime($row['created_at']);

                if (! ItemHistory::whereItemId($item->id)->whereSoldAt($date)->count()) { // Checking if it already exists
                    // I insert the new rows..
                    $history = new ItemHistory;
                    // ....
                    $history->save();
                }
            }
        }
    }
}
I actually crawl the data and use regex to find the arrays in the body response.
Am I doing something wrong? It takes quite a while before it moves on to the next $item.
I can provide a simplified answer covering three things: synchronous execution, object hydration, and bulk database queries.
Consider the following example:
$requests = function () use ($items) {
    foreach ($items as $item) {
        // build $method / $uri for the item here
        yield new GuzzleHttp\Psr7\Request($method, $uri);
    }
};

$client = new GuzzleHttp\Client();

$promises = [];

foreach ($requests() as $request) {
    $promises[] = $client->sendAsync($request)
        ->then(function (Psr\Http\Message\ResponseInterface $response) {
            // process the response into an array
            return $arrayFromResponse;
        })
        ->then(function ($unfilteredArray) {
            // filter the array as necessary
            return $filteredArray;
        })
        ->then(function ($filteredArray) {
            // create the array for bulk insert / update
            return $sqlArray;
        })
        ->then(function ($sqlArray) {
            // perform bulk db operations
        });
}

// The chains above only run to completion once the promises are resolved
GuzzleHttp\Promise\settle($promises)->wait();
Synchronous HTTP queries - The above example highlights some of Guzzle's asynchronous capabilities, while breaking out the processing steps. The code you posted above is synchronous: perform a request, wait for a response, process the response, rinse and repeat. Asynchronous HTTP requests ensure that data is being downloaded while other information is being processed. I should note that your results will vary and, depending on your particular use case, you may see increased resource usage.
Object hydration - aka what your ORM does when you perform a query and it returns an object instance (rather than an array) - is time consuming and memory intensive. #orcamius (one of Doctrine's developers) wrote a fairly technical article on the subject. While it is not Eloquent specific, it does provide insight into the operations that go on behind the scenes for all ORMs. Your code snippet performs many of these (see $itemHistory, $history, Item::where).
Bulk database operations - it is a widely known fact that database operations are slow. This time is further increased when coupled with object hydration. It is much better to perform a single insert with 1000 records than 1000 single-row inserts. To do this, the code will have to be modified from using the ORM to using the DB tables directly. Bulk inserts can be performed with DB::table('itemHistory')->insert($arrayOfValues) as seen in the docs.
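Applied to the question's inner loop, a rough sketch of that bulk insert (the item_histories table and column names are assumptions based on the ItemHistory usage above, and it still needs some strategy - such as a unique index or a pre-loaded set of existing sold_at values - to replace the per-row existence check):

$rows = [];

foreach ($arrayCollection->take(-100) as $row) {
    $rows[] = [
        'item_id'    => $item->id,
        'sold_at'    => Carbon::parse($row['created_at'])->toDateTimeString(),
        'created_at' => Carbon::now(),
        'updated_at' => Carbon::now(),
    ];
}

// One multi-row INSERT instead of one count query plus one INSERT per row.
DB::table('item_histories')->insert($rows);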
Update: Although not shown above, then() has a method signature of then(callable $onFulfilled = null, callable $onRejected = null). If something goes awry with the request you could do something like:
// promise returned from a request
$p->then(
    function (Psr\Http\Message\ResponseInterface $response) use ($p) {
        if ($response->getStatusCode() >= 400) {
            $p->cancel();
        }
        // perform processing
        return $someArray;
    },
    function (RequestException $e) {
        echo $e->getMessage() . "\n";
        echo $e->getRequest()->getMethod();
    }
)
->then(
    function ($someArray) use ($p) {
        // filter or other processing
    }
);
Additional information on Guzzle's promises can be found in the GitHub repo.
