Export large data using whereIn and cursor in Laravel

I have a collection of 25k rows that I get from a CSV file, which contains a URL column.
How can I perform an optimized query using the value of the URL column?
This is what I have so far:
public function export()
{
    $collection = FastExcel::import($file);
    $urls = $this->getData($collection);

    FastExcel::data($urls)->export('file.xlsx');
}
// generator function
public function getData($collection)
{
    $query = MyModel::select('url', 'views')
        ->whereIn('url', $collection->pluck('url'))
        ->cursor();

    foreach ($query as $row) {
        yield $row;
    }
}
This works using little memory, but the query time is very high.
Memory usage: 54 MB
Query: select `url`, `views` from `my_table` where `url` in ... (297.36 ms)
I am using Laravel 9, MySQL 5.7

Build your query with the query builder (DB raw) to get a single result set, then do your processing on it.
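One hedged reading of that suggestion (an untested sketch): run the lookup through the query builder instead of the Eloquent model, still streaming with cursor(). The table name my_table is taken from the query log shown in the question; everything else reuses the question's getData() generator.
// sketch: query-builder version of the generator (assumes an index on `url`)
public function getData($collection)
{
    $rows = DB::table('my_table')
        ->select('url', 'views')
        ->whereIn('url', $collection->pluck('url')->all())
        ->cursor();

    foreach ($rows as $row) {
        yield $row;
    }
}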

Related

Eloquent: Scope for Model where latest related Model meets permission

I have a Product model that has many Price models.
Every day there is a different price for the product.
Now I am trying to create a scope that gives me all products whose latest price is between two values.
I tried this with a whereHas query:
public function scopePriceBetween($query, ...$priceRange) {
    return $query->whereHas('price', function ($query) use ($priceRange) {
        $query->latestPrice()->whereBetween('price', $priceRange);
    });
}
with this scope on the Price model:
public function scopeLatestPrice($query) {
    return $query->latest('date')->limit(1);
}
But this gives me all the products where any price was within the range, not just the latest price.
Is there a way to do this with acceptable performance in eloquent or do I need to add a latest_price column to my product model?
For the latest price you can use a database temp table, or you can use Redis, but I recommend the temp table.
First Solution: Temporary Table
DB::statement("CREATE TEMPORARY TABLE last_prices
    SELECT prices.* FROM prices
    JOIN products ON products.id = prices.product_id
        AND prices.id = (
            SELECT id FROM prices
            WHERE prices.product_id = products.id AND prices.deleted_at IS NULL
            ORDER BY id DESC LIMIT 1
        );");
$query = Product::select("products.*")
    ->join("last_prices", "products.id", "last_prices.product_id");
In this example every product has many prices; you query the database once to build a temporary table holding only the latest price per product, then join against it.
Second Solution: Using a Cache Server
A DBMS temp table is fast, but you can gain further performance with a cache server (for example Redis).
You can store every product's latest price in the cache server, keyed by product_id:
public function getLastPriceAttribute()
{
    // cache for an hour
    $p_id = $this->id;

    return Cache::tags(['product'])->remember($p_id, 60 * 60, function () use ($p_id) {
        return Price::where('product_id', $p_id)
            ->latest()
            ->first();
    });
}
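For reference, a small usage sketch: because the accessor is named getLastPriceAttribute(), the cached value is exposed as a dynamic attribute on the model.
// hypothetical usage of the accessor defined above
$product = Product::find(1);
$lastPrice = $product->last_price; // latest Price model (or null), served from the cache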
Third Solution: A Scheduled last_prices Table
If your price updates are daily and you don't have, or don't want to use, a cache server, you can create a database table named last_prices and update it daily with the Laravel scheduler, as follows, in App\Console\Kernel.php:
// suggestion, not tested
protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        foreach (Product::all() as $product) {
            LastPrices::updateOrInsert(
                ['product_id' => $product->id],
                [
                    'price_value' => Price::where('product_id', $product->id)
                        ->latest()
                        ->first()->price_value,
                ]
            );
        }
    })->dailyAt('05:30');
}
UPDATE
for this:
Product::latestPriceBetween([100,200])->category('electronics');
you can use the suggested third solution to get a last_prices table,
and define the scope with a join, using this nice package: https://github.com/fico7489/laravel-eloquent-join
It looks something like this:
public function scopePriceBetween($query, ...$priceRange) {
    return $query->join("last_prices", "last_prices.product_id", "products.id")
        ->whereBetween('last_prices.value', $priceRange);
}
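With that scope in place, a hedged usage example (the category() scope comes from the asker's own snippet above and is assumed to exist):
// hypothetical usage of the join-based scope
$products = Product::priceBetween(100, 200)->category('electronics')->get();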

`Row 1 must be array` in Laravel

I am trying to import a CSV file in Laravel with the help of Maatwebsite. I have a query that brings data from two tables that have a relation with each other. It exports only one table's data; when I try to fetch the data of both tables it gives me the error `Row 1 must be array`.
$data = SaleOrder::where('id', $id)->with('customers')->get()->toArray();

return Excel::create('Packlist Sale Order '.$id, function ($excel) use ($data) {
    $excel->sheet('mySheet', function ($sheet) use ($data) {
        foreach ($data as $customer) {
            $sheet->fromArray($customer['customers']);
        }

        $sheet->fromArray($data);
    });
})->download('xlsx');
I want to fetch the data of both tables into the CSV file.
You are using with('customers'), which means $data is a multi-dimensional array with the customers already nested inside it, and that is likely what breaks $sheet->fromArray($data);.
If you remove the with('customers') from your query, keep $data as a collection of models (no ->toArray()), and do this:
foreach ($data as $salesOrder) {
    $sheet->fromArray($salesOrder->customers()->get()->toArray());
}
This will load it on demand and leave it out of $data.
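Put together, a hedged sketch of the full export under that change (assuming $data stays a collection of SaleOrder models and the same Laravel Excel 2.x API used in the question):
$data = SaleOrder::where('id', $id)->get();

return Excel::create('Packlist Sale Order '.$id, function ($excel) use ($data) {
    $excel->sheet('mySheet', function ($sheet) use ($data) {
        foreach ($data as $salesOrder) {
            // load the related customers on demand and write them to the sheet
            $sheet->fromArray($salesOrder->customers()->get()->toArray());
        }
    });
})->download('xlsx');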

Laravel fetch raw query result

I don't want Laravel to format my query result into an array or object, etc. All I want is to run the query against the database and then do the fetching manually in my own code.
At the moment, I run my select query and get the result as an array. The reason: the result is huge and I want to stream it directly to the API.
$result = self::$db->select('select * from customer');
How can I tell laravel, to return my query result set without any format at all?
You can use DB::raw like:
$results = DB::table('users')->select(DB::raw("*"))->get();
Or
$results = DB::select('select * from users where id = ?', [1]);
These two return plain results without any casts, relations, etc. You can also build whatever object or array your API needs from simple Eloquent models, by the way. Please explain more about the data type you want to extract from the model query.
You can use ->toSql() or ->dd().
Example:
Customer::toSql(); // select * from `customer`
If you want to include some conditions:
$query = Customer::where(/* some conditions */);
$sql = $query->toSql();
$bindings = $query->getBindings();

$sql = str_replace('?', '%s', $sql);
$sql = sprintf($sql, ...$bindings);
Thanks everyone. I ended up writing a raw function to query the data I want from the database.
public static function dataStreamJSON($stmt, $headers)
{
    return Response::stream(function () use ($stmt) {
        $conn = self::getConnection();
        $result = sqlsrv_query($conn, "exec $stmt");

        echo '
        {
            "Customers": {
                "Customer": [';

        $counter = 0;

        while ($customer = sqlsrv_fetch_object($result)) {
            if ($counter !== 0) {
                echo ",";
            }
            $counter++;

            $row = [
                'Firstname' => $customer->Firstname,
                'Lastname'  => $customer->Lastname,
                ...
            ];

            echo json_encode($row);

            unset($row);
            unset($customer);
        }

        echo ']
            }
        }';

        #sqlsrv_free_stmt($result);
        #sqlsrv_close($conn);
    }, 200, $headers);
}
The purpose of this code is to stream the data out as JSON to the browser without storing the whole dataset in a variable, which would cause an "out of memory" error.
I managed to stream 700 MB of JSON data to the browser without any error; with this approach you should not run into an "out of memory" error.
The best way to test this is to use curl to access your API and download the data to a JSON file. If you open it in a browser it will freeze your screen, because the browser can't handle that much data.
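For illustration, a hedged sketch of how this might be called from a controller action in the same class (the stored procedure name GetAllCustomers is made up, not from the original code):
// hypothetical controller action returning the streamed response
public function customers()
{
    $headers = ['Content-Type' => 'application/json'];

    return self::dataStreamJSON('GetAllCustomers', $headers);
}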
You can use toArray() or toJson() methods like below:
$array = Customer::all()->toArray();
$json = Customer::all()->toJson();
echo '<pre>';
print_r($array);
print_r($json);
If you want to run raw SQL Queries, you can do as below
$users = DB::select('select * from users where 1');
echo '<pre>';
print_r($users);
You can use:
1) The query builder way:
DB::table('your_table_name')->select('your_col_names')->get();
e.g. DB::table('shop')->select('product_id', 'product_name')->get();
2) Laravel raw expressions:
$orders = DB::table('orders')->selectRaw('price * ? as price_with_tax', [1.0825])->get();
3) A raw select expression:
$product_count = DB::table('product')->select(DB::raw('count(*) as total_product_count'))->where('status', 1)->get();

Laravel: How to update pivot tables without using first()

I'm new to Laravel. I want to update my leaves pivot table. I am trying the code below, but it only updates a single row. I have multiple rows in the DB with the same leave_id, and I want to update all of them where leave_id = xyz.
I have the following function in my Leave model:
public function relLeave()
{
    return $this->belongsToMany(User::class)->withPivot('days');
}
LeaveController:
public function saveUpdate(Request $request)
{
    $leave = Leave::find($request->id);
    $msg = $leave->relLeave()->where('leave_id', $request->id)->get()->first();
    $msg->pivot->days = $request->days;
    $msg->pivot->save();
}
I followed #option's instruction and it works for me; I removed the first(). Below is my updated code.
$msg = $leave->relLeave()->where('leave_id', $request->id)->get();

foreach ($msg as $msgs) {
    $msgs->pivot->days = $request->days;
    $msgs->pivot->save();
}
You can update extra fields in the pivot table when updating the relationship:
$leave->relLeave()->sync([$request['id'] => ['days' => $request['days']]]);
You can use Query Builder for that if it's an option:
DB::table('leave_user')->where('leave_id', $request->id)->update(['days' => $request->days]);
This is just one DB query, and it's a pretty simple one.
If you want Eloquent solution, use updateExistingPivot() in a loop:
$leave = Leave::find($request->id);
$usersIds = $leave->relLeave()->pluck('id')->toArray();

foreach ($usersIds as $userId) {
    $leave->relLeave()->updateExistingPivot($userId, ['days' => $request->days]);
}

Symfony 1.4 improve doctrine save() method

I have 500 entries in my DB. In my backend I have an action. For example:
public function executeMyAction(sfWebRequest $request)
{
    // Get some data from a table
    $templates = Doctrine_Core::getTable('SeoTemplates')->findOneByEntity('training');

    // Get data from another table (500 items)
    $trainings = Doctrine::getTable('Training')->getTraining();

    // Make some operations with the data
    foreach ($trainings as $training) {
        $training->setSomeValue1('some_data');
        $training->setSomeValue2('some_data');
        $training->setSomeValue2('some_data');
    }

    // Problem part (try to save)
    $trainings->save();
}
save() takes a very long time. How can I solve this problem? Is it possible?
In the problem part I get the well-known error: Fatal error: Maximum execution time of 30 seconds exceeded.
Save each record individually instead of the whole collection:
$templates = Doctrine_Core::getTable('SeoTemplates')->findOneByEntity('training');
$trainings = Doctrine::getTable('Training')->getTraining();

foreach ($trainings as $training) {
    $training->setSomeValue1('some_data');
    $training->setSomeValue2('some_data');
    $training->setSomeValue2('some_data');
    $training->save();
}
Or use Doctrine to update the records with a single query:
$q = Doctrine_Query::create()
    ->update('TABLE')
    ->set('field1', '?', $val1)
    ->set('field2', '?', $val2)
    ->set('field3', '?', $val3)
    ->where('id = ?', $id)
    ->execute();
