Hi, I have been trying to develop a script which will basically stress test one of my API calls. I decided to use Guzzle and have been following this tutorial: http://docs.guzzlephp.org/en/stable/quickstart.html?highlight=getasync#concurrent-requests
I think I may have a misunderstanding of how concurrent requests should work, or I have an error somewhere in my code.
Below is my entire script; it's quite simple but will hopefully do the job.
use GuzzleHttp\Client;
use GuzzleHttp\Exception\RequestException;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Psr7\Response;

$client = new Client();
$start = time();

$requests = function ($total) use ($client) {
    $uri = '';
    for ($i = 0; $i < $total; $i++) {
        yield function () use ($client, $uri) {
            $bulk = [];
            $count1 = 0;
            while ($count1 < 40) {
                $bulk[] = json_decode('{}');
                $count1++;
            }
            $payload = [
                "drone_id" => 56,
                "bulk" => $bulk
            ];
            $jws = new \Gamegos\JWS\JWS();
            $headers = array(
                'alg' => 'HS256',
                'typ' => 'JWT'
            );
            $key = '';
            $token = $jws->encode($headers, $payload, $key);
            $request = new Request('POST', $uri, $headers, $token);
            return $client->sendAsync($request);
        };
    }
};

$pool = new Pool($client, $requests(2500), [
    'concurrency' => 10,
]);

// Initiate the transfers and create a promise
$promise = $pool->promise();

// Force the pool of requests to complete.
$promise->wait();

echo 'Request took ' . (time() - $start) . ' seconds' . PHP_EOL;
My plan was to send concurrent requests to try and enhance the stress test on the API. I originally had the pool's concurrency property set to 5 and the script took 85 seconds to run. I then upped this to 10, which in my head means 10 requests should be sent in parallel, reducing the total time by half, but after running the script it still took 80 seconds.
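For reference, a stripped-down sketch like the one below (hypothetical endpoint, not my real API) is how I picture checking whether the requests actually overlap: it logs per-request timing through Guzzle's on_stats option and the pool's fulfilled/rejected callbacks.

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\TransferStats;

$client = new Client([
    // Log how long each individual transfer took.
    'on_stats' => function (TransferStats $stats) {
        echo $stats->getEffectiveUri() . ' took ' . $stats->getTransferTime() . "s\n";
    },
]);

$requests = function ($total) {
    for ($i = 0; $i < $total; $i++) {
        // Hypothetical endpoint.
        yield new Request('GET', 'https://example.com/endpoint');
    }
};

$pool = new Pool($client, $requests(50), [
    'concurrency' => 10,
    'fulfilled' => function ($response, $index) {
        echo "#$index finished at " . microtime(true) . "\n";
    },
    'rejected' => function ($reason, $index) {
        echo "#$index failed\n";
    },
]);

$pool->promise()->wait();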
Could someone please check my code? I believe I must have done something wrong or misunderstood how the pool concurrency works.
I'm creating 192 HTML pages using this function in the web.php file, but when it creates the fifth page it gives me an error.
function()
{
    $homepage = Page::find(1);

    $articles_show = Article::where(function ($query) {
        $query->where(function ($query2) {
            $query2->where('draft', 0)->whereNull('publication_date');
        })->orWhere(function ($query2) {
            $query2->where('draft', 0)->where('publication_date', '<=', DateHelper::currentDateTime());
        });
    })->orderBy('publicated_at', 'DESC')->get();

    $last_page = ceil($articles_show->count() / 8);

    $articles = [];
    $count = 0;

    for ($i = 5; $i <= $last_page; $i++) {
        $filename = 'page'.$i.'.html';
        $max_desc_id = $i * 8;
        $min_desc_id = $max_desc_id - 7;

        foreach ($articles_show as $article) {
            $count++;
            if ($article->desc_id >= $min_desc_id && $article->desc_id <= $max_desc_id) {
                $articles[] = $article;
            }
            if (($count % 8) == 0) {
                $count = 0;
                File::put(
                    resource_path('views/allArticlesHtml/'.$filename),
                    view('allArticlesTemplate')->with([
                        "articles" => $articles,
                        "homepage" => $homepage,
                        "this_page" => $i,
                        "last_page" => $last_page
                    ])->render()
                );
                continue;
            }
        }

        $articles = [];
        // if(($count % 8) == 0){
        // }
    }
}
The pages are created correctly, but it's way too slow.
I don't really know if I am doing this the right way or not; I'm still very new to programming and I don't know how to improve or rework this code.
I had this issue the other day. You can use this workaround:
Every time you create a file inside that loop, put this right after it:
set_time_limit(30);
Each call resets the time limit back to 30 seconds, so the script never exceeds it between two files.
Reference: https://www.php.net/manual/en/function.set-time-limit.php
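For example, a minimal placement sketch against the loop above (with $html standing in for the rendered view, not the original code):

// ...inside the existing page-generation loop, right after each file is written:
File::put(resource_path('views/allArticlesHtml/'.$filename), $html);
set_time_limit(30); // restart the 30-second execution budget before the next page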
I read your code. I'll try to suggest the best solution.
Why are you trying to create HTML files with the same content?
You can easily use the Blade engine and extend a template in different files.
To solve the error Maximum execution time of 60 seconds exceeded,
you need to increase the max_execution_time value in the php.ini file,
or put ini_set('max_execution_time', 180) at the top of the PHP file.
But if you want an alternative way of creating the files, describe your initial problem; a rough sketch of the Blade approach is below.
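As a rough sketch of that alternative (the route, the page size of 8, and the use of now() are my assumptions, not your code), you could let Laravel paginate the query and render the Blade template on demand instead of writing static files. This goes in web.php alongside your existing imports:

// Hypothetical route: render the article list on demand instead of
// pre-generating static HTML files.
Route::get('/articles', function () {
    $articles = Article::where('draft', 0)
        ->where(function ($q) {
            $q->whereNull('publication_date')
              ->orWhere('publication_date', '<=', now());
        })
        ->orderBy('publicated_at', 'DESC')
        ->paginate(8); // 8 articles per page, matching the original files

    return view('allArticlesTemplate', ['articles' => $articles]);
});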
I am consuming a service using cURL. I am able to see all the functions of the service, in the form of a CXF Service List, using the following code.
$client = \Config\Services::curlrequest();
$response = $client->request('GET', 'www.soapservice.co.za/service');
var_dump($response->getBody());
The var_dump returns a string with the available services.
The service has 10 functions, listed in this manner: getDataFunction.
How do I invoke a function? Or how do I get the contents of the body and start using the service functions?
Any help would be appreciated.
This is how I resolved my issue using CodeIgniter 4 and SoapClient.
public function ownersearch()
{
    helper(['form']);

    $username = "";
    $password = "";
    $url = 'getProperty?wsdl';

    $first_name = $this->request->getVar('first_name');
    $second_name = $this->request->getVar('second_name');
    $last_name = $this->request->getVar('last_name');
    $user_id = $this->request->getVar('user_id');
    $session_id = session()->get('user_id');

    $client = new \SoapClient($url);
    $usagemodel = new UsageLogModel();

    $args = [
        'idNumber'   => $user_id,
        'username'   => $username,
        'password'   => $password,
        'officeCode' => 1
    ];
    $usageData = [
        'user_id' => $session_id
    ];

    if ($user_id != null) {
        $response = $client->getFunctionByIDNumber($args);
        $ndata = $response->return;

        if ($ndata->errorResponse->errorCode == 64) {
            $error = $ndata->errorResponse->errorDescription;
            return $this->fail($error);
        }
        if ($ndata->errorResponse->errorCode == 550) {
            $error = $ndata->errorResponse->errorDescription;
            return $this->fail($error);
        } else {
            $usagemodel->insert($usageData);
            return $this->respond($ndata, 200);
        }
    }
}
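Stripped of the CodeIgniter specifics, the general pattern is just PHP's built-in SoapClient pointed at the WSDL. In the sketch below the ?wsdl URL, the function name, and the argument key are placeholders taken from the question, not the real service definition:

// Point SoapClient at the WSDL rather than the plain CXF service-list page.
$client = new \SoapClient('http://www.soapservice.co.za/service?wsdl');

// List the operations the service actually exposes.
var_dump($client->__getFunctions());

// Call one of them; the exact argument structure depends on the WSDL.
$response = $client->getDataFunction(['someParameter' => 'someValue']);
var_dump($response);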
I have many pages of products on my development store, but when I try to use products.json to get all of my products, it only returns 50 listings (one page). Could anybody clear this up for me?
<?php
$url = 'https://myshopifystore/admin/api/2019-07/products.json';
$result = file_get_contents($url);
$data = json_decode($result, true);
dd($data);
My result looks like this:
The limit parameter on the request defaults to 50 and goes up to 250. If you want more than that, use a loop to walk through every page until the end.
GET /admin/api/2019-07/products.json?limit=250&page=1
Use
GET /admin/api/2019-07/products/count.json
to get the total number of products you want to fetch.
More info about paginated pages here.
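For example, a rough sketch of that loop (credentials omitted, and note the edit below: the page parameter only works on API versions before 2020-07):

$base = 'https://myshopifystore/admin/api/2019-07';

// Total number of products, used to work out how many pages of 250 exist.
$count = json_decode(file_get_contents($base.'/products/count.json'), true)['count'];
$pages = (int) ceil($count / 250);

$products = [];
for ($page = 1; $page <= $pages; $page++) {
    $result = file_get_contents($base.'/products.json?limit=250&page='.$page);
    $products = array_merge($products, json_decode($result, true)['products']);
}
echo count($products) . " products fetched\n";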
EDIT 1: The paginated ?page= parameter is now deprecated and will be removed in version 2020-07. You should use cursor-based pagination instead.
The function below can help fetch resources with cursor-based pagination in the new Shopify API:
public function request($method, $url, $param = [])
{
    $client = new \GuzzleHttp\Client();
    $url = 'https://'.$this->username.':'.$this->password.'@'.$this->domain.'/admin/api/2019-10/'.$url;

    $parameters = [
        'headers' => [
            'Content-Type' => 'application/json',
            'Accept' => 'application/json'
        ]
    ];
    if (!empty($param)) {
        $parameters['json'] = $param;
    }

    $response = $client->request($method, $url, $parameters);
    $responseHeaders = $response->getHeaders();

    $tokenType = 'next';
    if (array_key_exists('Link', $responseHeaders)) {
        // Shopify returns the cursor for the next/previous page in the Link header.
        $link = $responseHeaders['Link'][0];
        $tokenType = strpos($link, 'rel="next') !== false ? "next" : "previous";
        $tobeReplace = ["<", ">", 'rel="next"', ";", 'rel="previous"'];
        $tobeReplaceWith = ["", "", "", ""];
        parse_str(parse_url(str_replace($tobeReplace, $tobeReplaceWith, $link), PHP_URL_QUERY), $op);
        $pageToken = trim($op['page_info']);
    }

    // Back off when we get close to the API call limit.
    $rateLimit = explode('/', $responseHeaders["X-Shopify-Shop-Api-Call-Limit"][0]);
    $usedLimitPercentage = (100 * $rateLimit[0]) / $rateLimit[1];
    if ($usedLimitPercentage > 95) {
        sleep(5);
    }

    $responseBody = json_decode($response->getBody(), true);
    $r['resource'] = (is_array($responseBody) && count($responseBody) > 0) ? array_shift($responseBody) : $responseBody;
    $r[$tokenType]['page_token'] = isset($pageToken) ? $pageToken : null;

    return $r;
}
Using the above function in a controller:
$ids = [];
$nextPageToken = null;

do {
    $response = $shop->request('get', 'products.json?limit=250&page_info='.$nextPageToken.'&rel=next');
    foreach ($response['resource'] as $product) {
        array_push($ids, $product['id']);
    }
    $nextPageToken = $response['next']['page_token'] ?? null;
} while ($nextPageToken != null);
I am using Laravel 5.6.
My script to insert a large amount of data is like this:
...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no' => $value['Item_No'],
        'entry_no' => $value['Entry_No'],
        'document_no' => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ];
    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);
I have tried to insert 100 records with the script and it works; it successfully inserts the data.
But if I try to insert 50,000 records it becomes very slow. I've waited about 10 minutes and it did not complete. There was an error like this:
504 Gateway Time-out
How can I solve this problem?
As was stated, chunks won't really help you in this case if it is an execution-time problem. I think the bulk insert you are trying to use cannot handle that amount of data, so I see 2 options:
1 - Reorganise your code to properly use chunks; this will look something like this:
$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no' => $value['Item_No'],
        'entry_no' => $value['Entry_No'],
        'document_no' => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ];
    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// It will chunk the dataset into smaller collections containing 500 values each.
// Play with the value to get the best result.
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
This way each bulk insert will contain less data and be processed rather quickly.
2 - In case your host supports runtime overrides, you can add a directive right before the code starts to execute:
ini_set('max_execution_time', 120); // time in seconds
$insert_data = [];
foreach ($json['value'] as $value)
{
    ...
}
To read more go to the official docs
It makes no sense to use an array and then convert it to a collection.
We can get rid of arrays.
$insert_data = collect(); // Build the collection directly instead of a plain array

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');

    $insert_data->push([
        'item_no' => $value['Item_No'],
        'entry_no' => $value['Entry_No'],
        'document_no' => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ]);
}

foreach ($insert_data->chunk(500) as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
Here is a very good and very fast solution for inserting data:
$no_of_data = 1000000;
$test_data = array();

for ($i = 0; $i < $no_of_data; $i++) {
    $test_data[$i]['number'] = "1234567890";
    $test_data[$i]['message'] = "Test Data";
    $test_data[$i]['status'] = "Delivered";
}

// Insert in batches of 1000 rows
$chunk_data = array_chunk($test_data, 1000);

if (isset($chunk_data) && !empty($chunk_data)) {
    foreach ($chunk_data as $chunk_data_val) {
        DB::table('messages')->insert($chunk_data_val);
    }
}
I used the code below to test updating or inserting data for 11 thousand rows. I hope it is useful for you.
$insert_data = [];

for ($i = 0; $i < 11000; $i++) {
    $data = [
        'id' => 'user_'.$i,
        'fullname' => 'Pixs Nguyen',
        'username' => 'abc@gmail.com',
        'timestamp' => '2020-03-23 08:12:00',
    ];
    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// It will chunk the dataset into smaller collections containing 500 values each.
// Play with the value to get the best result.
$accounts = $insert_data->chunk(500);

// In the case of updating or inserting, the code below takes about 35 seconds to execute
for ($i = 0; $i < count($accounts); $i++) {
    foreach ($accounts[$i] as $key => $account) {
        DB::table('yourTable')->updateOrInsert(['id' => $account['id']], $account);
    }
}

// In the case of only inserting, the code below takes about 0.35 seconds to execute
foreach ($accounts as $key => $account) {
    DB::table('yourTable')->insert($account->toArray());
}
Can anyone help me? I am using a Redis cache, but I see the same results on every page when I use pagination. How can I fix it? Thanks.
You should cache your results per page, with a key that is the current page.
$currentPage = request()->get('page', 1);

$category = Cache::remember('sellcategory-' . $currentPage, 10, function () {
    return DB::table('elans')->orderBy('updated_at', 'desc')->where(['derc' => 1, 'elaninnovu' => 'Satılır'])->paginate(10);
});
This is a solution to cache and clear pagination caches.
How to cache:
$page = request()->get('page', 1);
$limit = request()->get('limit', 10);

$users = Cache::remember('admin' . $page, 10, function () use ($limit) {
    return DB::table('users')->paginate($limit);
});
To clear them, you can use a loop that checks the keys with that prefix and deletes them, like this:
public static function forgetCaches($prefix)
{
    // Increase the loop limit if you need to; the loop stops when a key is not found
    for ($i = 1; $i < 1000; $i++) {
        $key = $prefix . $i;
        if (Cache::has($key)) {
            Cache::forget($key);
        } else {
            break;
        }
    }
}
Clear caches:
forgetCaches('admin');
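For example (a hypothetical controller method, assuming forgetCaches lives on the same class), you would call it whenever the cached data changes so stale pages are dropped:

public function store(Request $request)
{
    DB::table('users')->insert($request->only(['name', 'email']));

    // The cached pages are now stale, so drop every 'admin{page}' entry.
    self::forgetCaches('admin');
}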