I am loading data from an Excel file. In a foreach loop I check, for each record, whether it already exists in the database:
$recordExists = $this->checkIfExists($record);
function checkIfExists($record) {
    $foundRecord = $this->repository->newQuery()
        ->where(..., $record[...])
        ->where(..., $record[...])
        ...
        ->get();

    return $foundRecord;
}
When the Excel file contains up to 1,000 values, which is a relatively small piece of data, the code runs for around 2 minutes. I am guessing this is a very inefficient way to do it.
I was thinking of passing the array of loaded data to the checkIfExists method, but then I could not run queries on that data.
What would be a way to proceed?
You can use a Laravel queue if you want to do a lot of work within a very short time. Your code will run in the background, so the client will not notice the processing; just show the client a message that the work has been queued. That's it.
You can check the official documentation at the URL below:
https://laravel.com/docs/5.8/queues
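For example, a minimal sketch of a queued job (the ProcessExcelImport class name, the $loadedExcelRecords variable, and the import logic inside handle() are placeholders, not from the original question):

// app/Jobs/ProcessExcelImport.php
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessExcelImport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $records;

    public function __construct(array $records)
    {
        $this->records = $records;
    }

    public function handle()
    {
        // run the existing existence checks / import logic here
    }
}

// In the controller: dispatch the job and respond immediately
ProcessExcelImport::dispatch($loadedExcelRecords);

A worker started with php artisan queue:work then processes the job in the background.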
If you pass all the data from the database to the function (so no more queries to the database), you can use Laravel collection functions to filter it.
One of them is where: https://laravel.com/docs/5.8/collections#method-where
function checkIfExists($record, Collection $fetchedDataFromDatabase) {
    // Laravel collection 'where' function
    $foundRecord = $fetchedDataFromDatabase
        ->where(..., $record[...])
        ->where(..., $record[...]);

    return $foundRecord;
}
Other helpful functions:
filter
contains
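Put together, a rough sketch of that approach (the col_a/col_b column names are placeholders for whatever fields you are matching on):

// fetch everything once, before the loop
$existing = $this->repository->newQuery()->get();

foreach ($records as $record) {
    $found = $existing
        ->where('col_a', $record['col_a'])
        ->where('col_b', $record['col_b'])
        ->isNotEmpty();

    // ... handle $found
}

This way the database is hit once instead of once per Excel row.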
What would be the fastest way, using as little memory as possible, to seed 1M records in Laravel?
for ($i = 0; $i < 1000; $i++) {
    User::factory()->count(1000)->make();
}
Or should I use the chunk method, or make the records and empty the variable every time the loop reaches 1,000 records? Or are there other ways this could be done faster?
Eloquent models add significant overhead. Laravel factories are a nice pattern to get some data into your database, but you will notice a slow-down for thousands of models, since each model first gets constructed from the PHP class (and if you persist the factory models using ->create() instead of ->make(), a bunch of creating/created/saving/saved events are fired as well). So the most performant way forward is to not use the factory code but instead use simple DB statements, like so:
$data = [['username' => 'foo', ...], ['username' => 'bar', ...]];
Illuminate\Support\Facades\DB::table('users')->insert($data);
At a certain $data size you could run into an SQL prepared statement placeholder limit, which basically means the query is becoming too big. In that case, you can also just chunk the code above to:
// Use `collect()` to create a Collection object (glorified array)
// which can chunk/paginate the data:
collect($data)->chunk(100)->each(function ($chunk) {
    // insert() expects a plain array, so convert the chunk
    DB::table('users')->insert($chunk->toArray());
});
That being said, the Laravel factories are not bad to use and they are very handy in testing.
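If you still want the convenience of factories for generating the fake data, here is a rough sketch that combines ->make() with a bulk insert (assuming a standard User factory; getAttributes() is used to get the raw column values regardless of $hidden or casts):

$data = User::factory()
    ->count(1000)
    ->make()
    ->map(function ($user) {
        return $user->getAttributes(); // raw attributes as a plain array
    })
    ->toArray();

collect($data)->chunk(100)->each(function ($chunk) {
    DB::table('users')->insert($chunk->toArray());
});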
I have a collection that contains an array with 2 objects in it. I'm trying to get the email attribute. Right now there are only 2 objects, but there could be 100, so I'm not sure if I need a loop, array_merge, or something else.
I've attached a screenshot of the data; I'm not showing the attributes, but just know that one of them is email.
I'm die/dumping a variable called $tokens.
In the end, it should return a list of emails, for example:
john.doe@email.com
jane@email.com
thomas.brown@email.com
And I need this logic in the controller, not in the view (.blade), because I'm trying to save this data in a CSV file using fputcsv; that's the main objective and idea here.
Thank you to @jagcweb for the inspiration; his code/answer just needed a couple of tweaks. The following works and does what I was trying to do:
$emails_array = array();
for ($i = 0; $i <= sizeof($tokens) - 1; $i++) {
    array_push($emails_array, $tokens[$i]["email"]);
}
Now $emails_array contains both emails.
This sounds like what pluck is for:
$tokens->pluck('email')
Laravel 8.x Docs - Collections - pluck
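To tie it back to the CSV goal, a rough sketch of using pluck together with fputcsv in the controller (the storage path is just an example):

$emails = $tokens->pluck('email');

$handle = fopen(storage_path('app/emails.csv'), 'w');
foreach ($emails as $email) {
    fputcsv($handle, [$email]);
}
fclose($handle);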
I am querying a large data set from the table and then iterating through a loop to create a JSON file.
$user = App\User::all();
foreach ($user as $val) {
// logic goes here for creating the json file
}
Now the problem I am facing is that iterating through the loop consumes memory and I am getting the error 'Allowed memory size exhausted'. The CPU usage of the server also becomes very high.
My question is how I should use Laravel lazy collections to get rid of this issue. I have gone through the official docs but couldn't find the way.
Just replace the all method with the cursor one.
$user = App\User::cursor();
foreach ($user as $val) {
// logic goes here for creating the json file
}
For more information about the methods you can chain, refer to the official documentation.
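As a rough sketch of how that helps with the JSON file (the file path and the fields written out are just examples), cursor() hydrates one model at a time, so the file can be built incrementally:

$handle = fopen(storage_path('app/users.json'), 'w');
fwrite($handle, '[');

$first = true;
foreach (App\User::cursor() as $user) {
    if (! $first) {
        fwrite($handle, ',');
    }
    fwrite($handle, json_encode(['id' => $user->id, 'email' => $user->email]));
    $first = false;
}

fwrite($handle, ']');
fclose($handle);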
I was thinking about using the cache/storage to store the source data for API routes that don't update that often. I mean I would like to update the stored data every minute (for example), and my routes would use that data as the source instead of the database. That would mean fewer database queries. Am I thinking about this the right way? How can I achieve that in Laravel?
You could set up an instance of Memcached, which is supported by Laravel. The Laravel caching documentation is a good place to start learning how to set up a cache driver for Memcached. Once set up, your logic could look something like this:
if (Cache::has('key')) {
    // your key exists in the cache. get it.
    $value = Cache::get('key');
    // and use it
    useMyValue($value);
} else {
    // the cache does not contain the key you are looking for,
    // so get the value from the DB and cache it (here for 60 seconds).
    // the next time your function runs it will get the value from the
    // cache instead of reading from the db
    $value = DB::table(...)->get();
    Cache::put('key', $value, 60);
    // and use your value now like normal
    useMyValue($value);
}
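Since Cache::remember already checks the cache and stores the result when it is missing, the whole if/else above can be collapsed into a single call. A rough sketch (the key name and table are placeholders, and this assumes the cache driver is configured, e.g. CACHE_DRIVER=memcached in .env):

$value = Cache::remember('key', 60, function () {
    return DB::table('my_table')->get();
});

useMyValue($value);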
I am trying to do a string replace on the entries of a column inside a DB table. This is how far I have got:
$misa = DB::table('mis')->pluck('name');
for($i=0;;$i++)
{
$misa[$i] = substr_replace("$misa[$i]","",-3);
}
The error I am getting is "Undefined offset: 443".
P.S. I am not a full-fledged programmer, only trying to develop a few simple programs for my business. Thank you.
Since it's a collection, use the transform() collection method to transform it and avoid this kind of error. Also, you can just use the str_before() helper to transform each string:
$misa = DB::table('mis')->pluck('name');
$misa->transform(function ($i) {
    return str_before($i, ':ut');
});
There are a few ways to make this query prettier and FASTER! The beauty of Laravel is that we have both Eloquent for pretty queries and Collections to manage the data in a user-friendly way. So, first let's clean up the query. You can instead use a DB::raw select and do all of the string replacing in the query itself, like so:
$misa = DB::table('mis')->select(DB::raw("REPLACE(name, ':ut', '') as name"))->get();
Now we have a collection containing only the name column, with ':ut' (in your specific case) replaced by an empty string, all within the MySQL query itself.
Surprise! That's it. No further php manipulation is required making this process much faster (will be noticeable in large data sets - trust me).
Cheers!
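If the goal is to actually update the rows in the table rather than just read them, a rough sketch of doing the replacement in a single UPDATE statement (assuming the same mis table and name column):

DB::table('mis')->update([
    'name' => DB::raw("REPLACE(name, ':ut', '')"),
]);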