Laravel Eloquent query slow with 100,000 records

I have this simple Eloquent query. It works fine with a few records, but once the database grows to around 100,000 records it becomes very slow.
I read that I should use chunk instead of get. How can I implement it for this query?
$collection = Contact::with('shop');
$collection = $collection->orderBy('created_at', 'DESC');
$collection = $collection->get();
$json = $collection->map(function ($contact) {
    return [
        'id' => $contact->id,
        'name' => $contact->name,
        // ...about 50 columns more.
        'shop' => [
            'id' => optional($contact->shop)->id,
            'name' => optional($contact->shop)->name
        ],
        // ...about 6 relations more.
    ];
});
$json = $json->paginate(50);
return response()->json(['contacts' => $json], 200);

You are fetching all the data, 1M records or however many the table has, then mapping the whole collection and paginating it just to return 50 rows. That is a huge performance problem in your code.
You can call it directly like this:
return response()->json(['contacts' => Contact::with('shop')->orderBy('created_at', 'DESC')->paginate(50)], 200);
If you only need id and name for contacts:
return response()->json(['contacts' => Contact::select('id', 'name', 'created_at')->orderBy('created_at', 'DESC')->paginate(50)], 200);
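If you still need the custom array shape, you can paginate first and then transform only the 50 rows on the current page. A minimal sketch, assuming a Laravel version where the paginator exposes through() (Laravel 6+); on older versions you can map over $paginator->getCollection() instead:
$contacts = Contact::with('shop')
    ->orderBy('created_at', 'DESC')
    ->paginate(50)             // LIMIT query: only 50 rows leave the database
    ->through(function ($contact) {
        // Shape only the current page, not the whole table.
        return [
            'id' => $contact->id,
            'name' => $contact->name,
            'shop' => [
                'id' => optional($contact->shop)->id,
                'name' => optional($contact->shop)->name
            ],
        ];
    });
return response()->json(['contacts' => $contacts], 200);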

Related

make better code in laravel many to many relation

Hi, I wrote this code and it works just fine, but I think it's not the best way to do it!
I want to get all the jobs for one company.
Each company can have many addresses, and each address can have many jobs.
Here is my code:
$company = Company::find($id)->with('addresses.jobDetails.job')->first();
$jobs = [];
foreach ($company->addresses as $address) {
    foreach ($address->jobDetails as $detail) {
        array_push($jobs, [
            'id' => $detail->job->id,
            'title' => $detail->job->title,
            'country' => $detail->job->country,
            'city' => $detail->job->city,
            'type' => $detail->job->type,
            'work_types' => JobType::where('job_id', $detail->job->id)->pluck('title'),
            'income' => $detail->income,
        ]);
    }
}
return $jobs;
Can anyone help me change this to better code, please?
Thank you in advance.
You can do the opposite and start with JobDetail:
$jobDetails = JobDetail::whereHas('address.company', function ($companyQuery) use ($id) {
    $companyQuery->where('id', $id);
})->whereHas('job', function ($jobQuery) {
    $jobQuery->where('is_active', 1);
})->with('job')->get();

$jobs = [];
foreach ($jobDetails as $detail) {
    array_push($jobs, [
        'id' => $detail->job->id,
        'title' => $detail->job->title,
        'country' => $detail->job->country,
        'city' => $detail->job->city,
        'type' => $detail->job->type,
        'work_types' => JobType::where('job_id', $detail->job->id)->pluck('title'),
        'income' => $detail->income,
    ]);
}
return $jobs;
EDIT:
In your query
Company::find($id)->with('addresses.jobDetails.job')->first();
you run four queries with eager loading, one for each model. You can check in the result that all the data is present in the $company variable.
The example I gave runs only two queries. The first one (on job_details) filters the results by the id of the companies table (you can make it faster by filtering on the company_id field in the addresses table).
The second one is for the job relation, using eager loading.
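A minimal sketch of that shortcut, assuming the addresses table carries a company_id foreign key, so the query no longer needs to reach the companies table at all:
$jobDetails = JobDetail::whereHas('address', function ($addressQuery) use ($id) {
    // Filter directly on the foreign key instead of joining through companies.
    $addressQuery->where('company_id', $id);
})->with('job')->get();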

Laravel - updateOrCreate with large number of records

I have potentially 500,000 records to either insert or update in my database.
I'm using the updateOrCreate function in Laravel, but it's still taking a very long time.
I'm currently using a foreach loop wrapped in a database transaction, but is there a better solution?
DB::transaction(function () use ($items, $client) {
    foreach ($items as $item) {
        $data = array(
            'external_id' => $item->external_id,
            'comment' => $item->comment,
            'code_id' => $item->code_id,
            'client_id' => $client->id
        );
        Registration::updateOrCreate(
            [
                'user_id' => $item->user_id,
                'date' => Carbon::parse($item->date)->format('Y-m-d'),
                'session' => $item->session
            ],
            $data
        );
    }
});
Well, since you have so many records, it's inevitable that it takes a long time. My suggestion is to chunk the data you are getting, like so:
foreach ($items->chunk(1000) as $chunk) {
    foreach ($chunk as $item) {
        // ...
    }
}
The above method will go over 1,000 (or however many you want) items at a time, and should decrease the load somewhat. But I still don't think you can make it a lot faster this way.
I think you should use Laravel's ShouldQueue approach here, and instead of the updateOrCreate method use the query builder to update a single row with where('id', $id); this will make it faster.
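As a hedged alternative, assuming Laravel 8+ where Eloquent offers upsert(): one bulk statement per chunk replaces the per-row SELECT plus INSERT/UPDATE. This requires a unique index covering (user_id, date, session):
foreach ($items->chunk(1000) as $chunk) {
    Registration::upsert(
        $chunk->map(function ($item) use ($client) {
            return [
                'user_id' => $item->user_id,
                'date' => Carbon::parse($item->date)->format('Y-m-d'),
                'session' => $item->session,
                'external_id' => $item->external_id,
                'comment' => $item->comment,
                'code_id' => $item->code_id,
                'client_id' => $client->id,
            ];
        })->all(),
        ['user_id', 'date', 'session'],                      // unique-by columns
        ['external_id', 'comment', 'code_id', 'client_id']   // columns to update
    );
}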

Best Way to Count Record Many Table - Laravel

I want to display the number of records from several tables at once in one view.
I've tried it using Eloquent's count:
public function index() {
    $order = Order::count();
    $owner = Owner::count();
    $room = Room::count();
    $member = Transaction::where([
        ['status', 'waiting'],
        ['type', 1]
    ])->count();
    $highlight = Transaction::where([
        ['status', 'waiting'],
        ['type', 2]
    ])->count();
    return view('admin.index', [
        'order' => $order,
        'owner' => $owner,
        'room' => $room,
        'member' => $member,
        'highlight' => $highlight
    ]);
}
Is there a better way?
You could also use view composers, but that is not better, just different. You have to run the queries anyway, so why do you think there should be a better way?
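For reference, a minimal sketch of the view-composer variant mentioned above, registered for example in a service provider's boot() method (the counts shown are illustrative):
use Illuminate\Support\Facades\View;

View::composer('admin.index', function ($view) {
    // Runs every time admin.index is rendered; the queries still execute.
    $view->with([
        'order' => Order::count(),
        'owner' => Owner::count(),
        'room' => Room::count(),
    ]);
});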

Pass array into a single column in Laravel

I am trying to pass some values into a single column in a Laravel database table.
The values are like this: 20,45,67,89
but I want them entered into the column like this:
===USER_ID====
20
45
67
89
I have tried the code below, but it's not working. Any suggestions?
foreach ($request->val2 as $value) {
    $str_explode = explode(",", $value);
    DB::table('retirement')->insertGetId([
        'user_id' => $str_explode,
        'amount' => $request->val1,
        'week' => $request->week
    ]);
}
Hope this will work:
foreach ($request->val2 as $value) {
    $str_explode = explode(",", $value);
    $insert = [];
    foreach ($str_explode as $str) {
        $insert[] = [
            'user_id' => $str,
            'amount' => $request->val1,
            'week' => $request->week
        ];
    }
    // Bulk insert one row per exploded value.
    DB::table('retirement')->insert($insert);
}
I'm not sure I understood your question clearly; I'm assuming you want to insert an array into a column:
Did you try setting the column to JSON in the migration?
Did you set $casts in the model to json or array?
protected $casts = [ 'user_id' => 'array' ];
Then you can have an array added to that column, like:
Posts::create(['user_id' => [1, 2, 3, 4]]);
Normally the user_id field is set to unsignedBigInteger(); that type will not accept anything but integers, so you should check the migration column type first.
explode() returns an array, not a single value; that's why it fails. Instead, you should loop through all the values like this:
foreach ($request->val2 as $value) {
    $str_explode = explode(",", $value);
    foreach ($str_explode as $str) {
        DB::table('retirement')->insertGetId([
            'user_id' => $str,
            'amount' => $request->val1,
            'week' => $request->week
        ]);
    }
}
As a side note, since you are not saving the id returned by insertGetId(), you can simply use insert(). Moreover, it's usually good practice to use create() on an Eloquent model, because that way the created_at and updated_at timestamps are saved as well.
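A minimal sketch of that create() variant, assuming a hypothetical Retirement model whose $fillable includes these columns; the timestamps are then filled in automatically:
foreach ($request->val2 as $value) {
    foreach (explode(",", $value) as $str) {
        // created_at and updated_at are set by Eloquent.
        Retirement::create([
            'user_id' => $str,
            'amount' => $request->val1,
            'week' => $request->week,
        ]);
    }
}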

Why are duplicate rows stored in the table

I have a custom query for adding parsed remote posts to a table:
foreach ($parsedPosts as $post) {
    $hasRemotePostAlready = Post::where('remote_post_id', $post->id)->first();
    if (null === $hasRemotePostAlready) {
        $data = [
            'title' => $post->title,
            'description' => $post->description,
            'remote_post_id' => $post->id
        ];
        Post::create($data);
    }
}
The variable $parsedPosts holds more than 3,500 posts, and when I run my script to add them, some remote posts end up duplicated. Why are the posts duplicated, and why doesn't my condition work:
$hasRemotePostAlready = Post::where('remote_post_id', $post->id)->first();
How can I fix the duplicate-rows problem in my case?
You can use the firstOrCreate or updateOrCreate methods for this case; take a look at the documentation.
In your case, try this:
$hasRemotePostAlready = Post::firstOrCreate(
    [
        'remote_post_id' => $post->id // columns to check if the record exists
    ],
    [
        'title' => $post->title,
        'description' => $post->description
    ]
);
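Note that firstOrCreate() still runs a SELECT before the INSERT for each post, so two overlapping runs of the script can both pass the check and insert the same post twice. As a hedged supplement, assuming a posts table, a unique index on remote_post_id lets the database itself reject duplicates (inside a migration):
Schema::table('posts', function (Blueprint $table) {
    // The database now enforces uniqueness even under concurrent runs.
    $table->unique('remote_post_id');
});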
