I have a Coin model with id, name, price.
In a function, I fetch all the coins and create a comma-separated string with all the ids:
$coins = Coin::all();
$coinsIds = $coins->pluck('id')->toArray();
$coinsIdsString = implode(',', $coinsIds);
After that I make a call to an external API:
$url = 'https://myapi.com?ids=' . $coinsIdsString;
$response = Http::get($url)->json();
The $response value is an array of coins, something like:
[
    {
        "id": "1",
        "name": "A",
        "price": "1.2"
    },
    ...
]
What would be the best way to update and save my Coin models with the price values from the API?
Unfortunately, you're not going to be able to do anything other than update a single record at a time. That is, loop through the results array and perform a database update on each record. My recommendation is:
$results = ...; // result of the API call
foreach ($results as $result) {
    DB::table('coins')
        ->where('id', $result['id'])
        ->update(['price' => $result['price']]);
}
I would then create a scheduled command (sketched below) to periodically perform the update, since it is likely to be resource-intensive depending on the volume of calls.
https://laravel.com/docs/8.x/scheduling#scheduling-artisan-commands
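A minimal sketch of the scheduling side, assuming the loop above is wrapped in a hypothetical coins:update-prices artisan command:

// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // 'coins:update-prices' is an illustrative command name
    $schedule->command('coins:update-prices')->hourly();
}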
I am trying to think of a way that a user can add a type (medical) on the frontend, but a verifier has to approve that record. I can't seem to figure out the best solution for this; does anyone have any suggestions on how to make this work? The record is getting stored in the medical table but not in the Verifier table.
In simple words, I am using the medicalRecord controller to store the medical and Verifier details.
'document_id' = the medical_id
'submitted_by' = who created the record
public function store(Request $request, $id)
{
    if (auth_user_cannot(Capability::CREATE_DRIVER_MEDICAL_RECORD)) {
        return redirect($group['name'] . "/" . $group['userId']);
    }
    $all = $request->all();
    if ($request->hasFile('myfile')) {
        $result = upload($request->file('myfile'), 'nonzip', 'Medical');
        $all['upload'] = $result['upload'];
        $all['uploadId'] = $result['uploadId'];
    }
    // You'll need to save the medical to the user first, then save the verifier to the medical.
    $driverMedical = DriverMedicalRecord::create($all);
    // Then, for the relation:
    $medical = $driverMedical->find($id);
    $verificationData = DocumentVerification::create([
        'document_id' => $medical,
        'submitted_by' => Auth::id(),
    ]);
    $request->user()->posts()->save($post);
    $driverMedical->verifier()->save($verificationData);
}
Set a flag in the same table and change its value when the record gets verified.
For example, add an is_verified column: it should be 0 while the record is not approved, and change it to 1 once it is approved (this is what I understand from your question).
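A minimal sketch of that flag approach, assuming the table is named driver_medical_records (the table and column names here are illustrative, not taken from the question):

// In a migration: add the flag, defaulting to "not approved"
Schema::table('driver_medical_records', function (Blueprint $table) {
    $table->boolean('is_verified')->default(false);
});

// Later, when a verifier approves the record:
DriverMedicalRecord::where('id', $id)->update(['is_verified' => true]);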
I have ~5-6k $items that I need to update in the database. Each item needs an HTTP request to get its data from the page. From each HTTP GET request I receive a massive array (~500-2,500 rows), and I need to insert only the rows that are not already in the database. With my current script this seems to take a lot of time (1 item every 2-4 minutes) on my Vagrant Scotch Box.
Simplified example:
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use App\Http\Requests;
use GuzzleHttp\Client;
use App\Item;
use App\ItemHistory;
use Carbon\Carbon;
use DB;

class UpdateController extends Controller
{
    public function getStart()
    {
        // Don't cancel the script
        ignore_user_abort(true);
        set_time_limit(0);

        $client = new Client();
        $items = Item::where('updated_at', '<=', Carbon::now()->subDay())->get();

        foreach ($items as $item) {
            $response = $client->request('GET', 'API_URL');
            // Get the body
            $body = $response->getBody()->getContents();
            // Can be from 100 to 5,000 rows; I use a regex to extract the "history" array from the body
            $hugeArray = $body['history'];
            $arrayCollection = collect($hugeArray);
            // I take the last 100 since each row = 1 hour, so I get items from the last 100 hours
            foreach ($arrayCollection->take(-100) as $row) {
                $date = new \DateTime($row['created_at']);
                // Check whether it already exists
                if (! ItemHistory::whereItemId($item->id)->whereSoldAt($date)->count()) {
                    // Insert the new rows..
                    $history = new ItemHistory;
                    // ....
                    $history->save();
                }
            }
        }
    }
}
I actually crawl the data and use a regex to find the arrays in the response body.
Am I doing something wrong? It takes quite a while before it moves on to the next $item.
I can provide a simplified answer covering three things: synchronous execution, object hydration, and bulk database queries.
Consider the following example:
$requests = function () use ($items) {
    foreach ($items as $item) {
        // $method and $uri are placeholders for your API call
        yield new GuzzleHttp\Psr7\Request($method, $uri);
    }
};

$client = new GuzzleHttp\Client();
$promises = [];

foreach ($requests() as $request) {
    $promises[] = $client->sendAsync($request)
        ->then(
            function (Psr\Http\Message\ResponseInterface $response) {
                // process the response into an array
                return $arrayFromResponse;
            })
        ->then(
            function ($unfilteredArray) {
                // filter the array as necessary
                return $filteredArray;
            })
        ->then(
            function ($filteredArray) {
                // create the array for the bulk insert / update
                return $sqlArray;
            })
        ->then(
            function ($sqlArray) {
                // perform bulk db operations
            });
}

// Wait for all requests to complete so the promise chains actually run
GuzzleHttp\Promise\settle($promises)->wait();
Synchronous Http queries - The above example highlights some of Guzzle's asynchronous capabilities while breaking out the processing steps. The code you posted is synchronous: perform a request, wait for a response, process the response, rinse and repeat. Asynchronous HTTP requests ensure that data is being downloaded while other information is being processed. I should note that your results will vary and, depending on your particular use case, you may see increased resource usage.
Object Hydration - aka what your ORM does when you perform a query and it returns an object instance (rather than an array) - is time consuming and memory intensive. @ocramius (one of Doctrine's developers) wrote a fairly technical article on the subject. While it is not Eloquent-specific, it does provide insight into the operations that go on behind the scenes for all ORMs. The code snippet performs many of these (see ItemHistory, $history, and Item::where).
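For instance, a sketch of how the existence check from the question could skip hydration entirely (the item_histories table name is an assumption based on the ItemHistory model):

// Plain query builder: no ItemHistory model instances are built
$exists = DB::table('item_histories')
    ->where('item_id', $item->id)
    ->where('sold_at', $date)
    ->count() > 0;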
Bulk Database Operations - it is widely known that database operations are slow. This time is further increased when coupled with object hydration. It is much better to perform a single insert with 1000 records than 1000 single-record inserts. To do this, the code will have to be modified from using the ORM to using the DB tables directly. Bulk inserts can be performed with DB::table('itemHistory')->insert($arrayOfValues), as seen in the docs.
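A sketch of building $arrayOfValues inside the loop and flushing it in one query (the column names here are illustrative):

$arrayOfValues = [];
foreach ($arrayCollection->take(-100) as $row) {
    $arrayOfValues[] = [
        'item_id' => $item->id,
        'sold_at' => $row['created_at'],
    ];
}
// One insert with N rows instead of N single-row inserts
DB::table('itemHistory')->insert($arrayOfValues);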
Update: Although not shown above, then() has a method signature of then(callable $onFulfilled, callable $onRejected). If something goes awry with the request, you could do something like:
// promise returned from a request
$p->then(
    function (Psr\Http\Message\ResponseInterface $response) use ($p) {
        if ($response->getStatusCode() >= 400) {
            $p->cancel();
        }
        // perform processing
        return $someArray;
    },
    function (RequestException $e) {
        echo $e->getMessage() . "\n";
        echo $e->getRequest()->getMethod();
    })
    ->then(
        function ($someArray) use ($p) {
            // filter or other processing
        });
Additional information on Guzzle's promises can be found in the GitHub repo.
I've created an API using Laravel and I'm trying to find out how to cache Eloquent models. Let's take this example as one of the API endpoints, /posts, to get all the posts. Within the method there are various filter options (such as category and search), and there is also the option to expand the user.
public function index()
{
    $posts = Post::active()->ordered();

    if (Input::get('category')) $posts = $posts->category(Input::get('category'));
    if (Input::get('search')) $posts = $posts->search(Input::get('search'));
    if ($this->isExpand('user')) $posts = $posts->with('user');

    $posts = $posts->paginate($this->limit);

    return $this->respondWithCollection($this->postTransformer->transformCollection($posts->all()), $posts);
}
I have been reading up, and found that in Laravel 4 you could cache a model like this:
return Post::remember($minutes);
But I see this has been removed in Laravel 5.1, and now you have to cache using the Cache facade, where an entry is only retrievable by a single key string.
$posts = Cache::remember('posts', $minutes, function () {
    return Post::paginate($this->limit);
});
As you can see, my controller method contains different options, so for the cache to be effective I would have to create a unique key for each combination of options, like posts_category_5, posts_search_search_term, posts_category_5_search_search_term_page_5, and this will clearly get ridiculous.
So either I'm not coming across the right way to do this, or the Laravel cache appears to have gone backwards. What's the best solution for caching this API call?
As the search is arbitrary, using a key based on the search options appears to be the only option here. I certainly don't see it as "ridiculous" to add a cache key for expensive DB search queries. I may be wrong, as I came across this post looking for a solution to your exact problem. My code:
$itemId = 1;
$platform = Input::get('platform'); // (android|ios|web)
$cacheKey = 'item:' . $itemId . ':' . $platform;
$item = Item::find($itemId);

if (Cache::tags('items')->has($cacheKey)) {
    $result = Cache::tags('items')->get($cacheKey);
} else {
    $result = $this->response->collection($item, new ItemTransformer($platform));
    // Cache for 60 minutes (or whatever), tagged to be able to clear the lot in one go...
    Cache::tags('items')->put($cacheKey, $result, 60);
}

return $result;
I realise that my example has less complexity but it seems to cover all the bases for me. I then use an observer to clear the cache on update.
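A minimal sketch of that observer (assuming a taggable cache store such as Redis or Memcached; the class name and hook are illustrative):

class ItemObserver
{
    public function saved(Item $item)
    {
        // Flush everything tagged 'items' so stale entries are dropped
        Cache::tags('items')->flush();
    }
}

// Registered, e.g. in a service provider's boot() method:
Item::observe(new ItemObserver);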
I am not sure how to increment the value of a column using an Eloquent model in Laravel 4.
This is what I currently have, and I am not sure how correct it is:
$visitor = Visitor::where('token', '=', 'sometoken')->first();

if (isset($visitor)) {
    $visitor->increment('totalvisits');
} else {
    Visitor::create(array(
        'token' => 'sometoken',
        'totalvisits' => 0,
    ));
}
With the Query Builder we could do it using:
DB::table('visitors')->increment('totalvisits');
Looks like the code that I posted worked after all:
$visitor = Visitor::where('token', '=', 'sometoken')->first();

if (isset($visitor)) {
    $visitor->increment('totalvisits');
} else {
    Visitor::create(array(
        'token' => 'sometoken',
        'totalvisits' => 0,
    ));
}
Prior to a fix a few weeks ago the increment method actually fell through to the query builder and would be called on the entire table, which was undesirable.
Now calling increment or decrement on a model instance will perform the operation only on that model instance.
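For example, contrasting the two behaviours (using the code from the question):

// Query builder: increments the column on every row of the table
DB::table('visitors')->increment('totalvisits');

// Model instance after the fix: increments only this visitor's row
$visitor = Visitor::where('token', '=', 'sometoken')->first();
$visitor->increment('totalvisits');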
Laravel 5 now has atomic increment:
public function increment($column, $amount = 1, array $extra = [])
{
    if (! is_numeric($amount)) {
        throw new InvalidArgumentException('Non-numeric value passed to increment method.');
    }

    $wrapped = $this->grammar->wrap($column);

    $columns = array_merge([$column => $this->raw("$wrapped + $amount")], $extra);

    return $this->update($columns);
}
which essentially works like:
Customer::query()
    ->where('id', $customer_id)
    ->update([
        'loyalty_points' => DB::raw('loyalty_points + 1'),
    ]);
Below is the old answer for Laravel 4, where the built-in increment was a separate select and then an update, which of course leads to bugs with multiple users:
If you'd like to accurately count your visitors by ensuring the update is atomic, try putting this in your Visitor model:
public function incrementTotalVisits()
{
    // Increment regardless of the current value in this model.
    $this->where('id', $this->id)->update(['totalVisits' => DB::raw('last_insert_id(totalVisits + 1)')]);

    // Update this model in case we would like to use it.
    $this->totalVisits = DB::getPdo()->lastInsertId();

    // Remove from the dirty list to prevent any saves overwriting the newer database value.
    $this->syncOriginalAttribute('totalVisits');

    // Return it, because why not.
    return $this->totalVisits;
}
I'm using it for a change-tag system, but it might work for your needs too.
Does anyone know what to replace $this->where('id', $this->id) with? Since we are already dealing with this Visitor instance, it seems redundant.
I'm working with the latest CodeIgniter release, and I'm also working with the jQuery DataTables plugin from datatables.net.
I've written this function: https://gist.github.com/4478424 which works fine as-is, except when I filter by typing something into the text box. The filter itself happens, but my count is completely off.
I tried adding $res = $this->db->count_all_results() before my get(), and it stops the get() from working at all. What I need to accomplish, if ($data['sSearch'] != ''), is to run the entire query without the limit to see how many total rows match the search filter.
If you need to see any code other than what's in my gist, just ask and I will post it.
$this->db->count_all_results() replaces $this->db->get() in a database call.
I.e. you can call either count_all_results() or get(), but not both.
You need to do two separate active record calls: one to get the result count, and one to get the actual results.
Something like this for the count:
$this->db->select('id');
$this->db->from('table');
$this->db->where($your_conditions);
$num_results = $this->db->count_all_results();
And for the actual query (which you should already have):
$this->db->select($your_columns);
$this->db->from('table');
$this->db->where($your_conditions);
$this->db->limit($limit);
$query = $this->db->get();
Have you read up on https://www.codeigniter.com/userguide2/database/active_record.html#caching ?
I see you are trying to do some pagination where you need the "real" total results and, at the same time, a limited per-page result.
This is the practice I use in most of the CI code I write:
$this->db->start_cache();

// All your conditions, without limit
$this->db->from();
$this->db->where(); // etc.

$this->db->stop_cache();

$total_rows = $this->db->count_all_results(); // This will get the real total rows

// Now limit the rows, to return the per-page result
$this->db->limit($per_page, $offset);
$result = $this->db->get();

return array(
    'total_rows' => $total_rows,
    'result' => $result,
); // Return this back to the controller.
I typed the code above without testing, but it should work something like this; I do this in all of my projects.
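On the controller side, usage would look something like this sketch (the model and method names here are illustrative):

$data = $this->report_model->get_paged($per_page, $offset);
$total_rows = $data['total_rows']; // feed this to your pagination library
foreach ($data['result']->result() as $row) {
    // render one row of the current page
}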
You don't actually have to have the from() either; you can include the table name in count_all_results() like so:
$this->db->count_all_results('table_name');
Count first, with the reset flag set to FALSE so the query builder state is preserved for the subsequent get():
$this->db->count_all_results('', FALSE);
$rows = $this->db->get()->result_array();
system/database/DB_query_builder.php
public function count_all_results($table = '', $reset = TRUE) { ... }
The $this->db->count_all_results() call actually replaces the $this->db->get() call, so you can't have both.
If you want to have both the results and the row count from the same query, you can easily do this:
$this->db->from(....);
$this->db->where(....);
$db_results = $this->db->get();

$results = $db_results->result();
$num_rows = $db_results->num_rows();
Try this:
/**
 * @param array $column_name Column name(s) to select
 * @param array $where       Conditions for the WHERE clause
 * @param array $table_name  Name of the database table
 * Description: count all results
 */
function count_all_results($column_name = array(), $where = array(), $table_name = array())
{
    $this->db->select($column_name);

    // Apply the conditions only if $where is not empty
    if (!empty($where) && count($where) > 0) {
        $this->db->where($where);
    }

    // Return the count
    return $this->db->count_all_results($table_name[0]); // table_name array sub 0
}
Then simply call the method like this:
$this->my_model->count_all_results(['column_name'], ['where'], ['table name']);
If your query contains a group_by, using count_all_results() fails. I wrote a simple method to work around this. The key to avoiding writing your queries twice is to put them all inside a private method that can be called twice. Here is some sample code:
class Report extends CI_Model {

    ...

    public function get($page = 0)
    {
        $this->_complex_query();
        $this->db->limit($this->results_per_page, $page * $this->results_per_page);
        $sales = $this->db->get()->result(); // no table needed in get()

        $this->_complex_query();
        $num_results = $this->_count_results();
        $num_pages = ceil($num_results / $this->results_per_page);

        // return data to your controller
    }

    private function _complex_query()
    {
        $this->db->where('a', $value);
        $this->db->join('(subquery) as s', 's.id = table.s_id');
        $this->db->group_by('table.column_a');
        $this->db->from('table'); // crucial - we specify all tables here
    }

    private function _count_results()
    {
        $query = $this->db->get_compiled_select();
        $count_query = "SELECT count(*) as num_rows FROM (" . $query . ") count_wrap";
        $r = $this->db->query($count_query)->row();
        return $r->num_rows;
    }
}