Magento programmatically saving order takes too long - AJAX

I am using the following code in a controller of a custom module to change the order status via AJAX calls, and I found it runs very slowly. I searched on Google, but there is no topic on this.
I added the time() lines to test the execution time. It takes 30-40 seconds to finish each call. I use this same code in Mage_Adminhtml_Sales_OrderController so that I can change the order status in the backend, but there it returns very quickly and the exetime is equal to 0. I don't think this is hardware related, as it runs on 8 GB of memory and an 8-core CPU. Are there any possible reasons for Magento to take so long to save the order? Does this call have some limit that makes it faster in the backend?
public function changestatusAction()
{
    // Validate the AJAX parameters before touching the order.
    if (!isset($_POST['oid']) || !$_POST['oid']) {
        echo '{"error":1}';
        return;
    }
    if (!isset($_POST['status']) || !$_POST['status']) {
        echo '{"error":1}';
        return;
    }
    $errstr = "";
    try {
        // Load the order by its increment id and append a status history comment.
        $order = Mage::getModel('sales/order')->loadByIncrementId($_POST['oid']);
        $order->addStatusHistoryComment('', $_POST['status'])
            ->setIsVisibleOnFront(false)
            ->setIsCustomerNotified(false);
        // Time only the save call, which is where the 30-40 seconds are spent.
        $starttime = time();
        $order->save();
        $endtime = time();
        $exetime = $endtime - $starttime;
    } catch (Exception $e) {
        $errstr .= $e->getMessage();
    }
    if ($errstr) {
        echo '{"changed":0,"err":"' . $errstr . '"}';
    } else {
        echo '{"changed":1,"exetime":"' . $exetime . '"}';
    }
    return;
}
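Note, too, that time() only has one-second resolution, so the backend exetime of 0 could be anything under a second; a sketch of a finer measurement with microtime(true):
// microtime(true) returns a float with microsecond precision.
$starttime = microtime(true);
$order->save();
$exetime = microtime(true) - $starttime; // e.g. 0.183 seconds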

I think the problem is that Magento tries to send an email to the customer. Change the parameter of the setIsCustomerNotified method to true, or disable Magento's email communication. If that helps, you need to configure your server to send email properly.
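If mail does turn out to be the bottleneck, one quick way to confirm is to enable the "Disable Email Communications" setting (System > Configuration > Advanced > System) and retry the save. A sketch in code, assuming the Magento 1.x config path system/smtp/disable (verify the path on your install):
// Disable all Magento email communications for the default scope,
// then retry the AJAX save; an instant return points to mail sending.
Mage::getConfig()->saveConfig('system/smtp/disable', 1, 'default', 0);
Mage::getConfig()->cleanCache();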

Related

Laravel queues Call to a member function storeAs() on string

I am trying to store a heavy image through the Laravel queue as follows, because I don't want my user to wait until the file gets stored:
This code is in the controller:
if ($request->hasFile('featuringPhoto')) {
    $prefixFilename = date("Y-m-d-H-i-s");
    $coverFilename = $prefixFilename . $request->featuringPhoto->getClientOriginalName();
    ProceessFiles::dispatch($request->featuringPhoto->getRealPath(), $coverFilename);
} else {
    $coverFilename4 = NULL;
}
The code below is in the job:
protected $files, $filename;

public function __construct($files, $filename)
{
    $this->files = $files;
    $this->filename = $filename;
}

public function handle()
{
    if ($this->files) {
        $coverFilename = $prefixFilename . $this->filename;
        $img = file_get_contents($this->files);
        $img->storeAs('images/photosUploadedByUser', $coverFilename, 'public');
    }
}
It gives me an error saying Call to a member function storeAs() on string.
I tried this solution from Stack Overflow, but it didn't work.
Any suggestion will be appreciated. Thank you.
I think it is wrong to assume you are going to save a lot of time by executing the save operation in a queue, not least because the web server has already received the file. As you scale, queues are often moved to dedicated worker servers, and with this approach that will no longer work.
In the spirit of the question and of Stack Overflow, here is an explanation of what is not working: file_get_contents() returns the content of the file as a string, so you should simply store that result. You obviously cannot call methods on strings.
$coverFilename = $prefixFilename . $this->filename;
// file_get_contents() yields the file contents as a string, so hand that
// string to the Storage facade (Illuminate\Support\Facades\Storage).
$img = file_get_contents($this->files);
Storage::put('images/photosUploadedByUser/' . $coverFilename, $img);
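Put back into the job, the corrected handle() would look roughly like this (a sketch; it assumes the job still receives the temp path and target filename, and that the 'public' disk from the original storeAs() call is intended):
use Illuminate\Support\Facades\Storage;

public function handle()
{
    if ($this->files) {
        // file_get_contents() gives the raw bytes as a string...
        $img = file_get_contents($this->files);
        // ...which Storage can write directly to the public disk.
        Storage::disk('public')->put(
            'images/photosUploadedByUser/' . $this->filename,
            $img
        );
    }
}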

Laravel update runs well on my local machine but on the live server returns error "Creating default object from empty value"

I have this code which runs very well on my local machine, but on the live server it returns the error "Creating default object from empty value". I have checked whether the value is null, but it is not.
What the piece of code does: if the value is not found in the database, create a new record, else update it. The create works very well, but the update does not work on the live server even though it works on my local machine.
The problem is in the else branch, where $id_num >= 1, on this line: $ex->subject=$exams[$i]['value'];
public function saveExam(Request $request)
{
    if ($request->id == "") {
        return response()->json(['error' => 'Please select student']);
    }
    $exams = array_slice($request->exam, 0, count($request->exam) - 2);
    $class_rec = array_slice($request->exam, count($request->exam) - 2, 1);
    // return response()->json(['success'=>$class[0]['value']]);
    $data;
    $id = $request->id;
    $sess = settings_session::find(1);
    $session = $sess->session;
    $term = $sess->term;
    $id_num = 0;
    $table;
    if ($request->level == "primary") {
        $table = "App\\exam_report";
        $id_num = exam_report::where('student_id', $id)->where('session', $session)->where('term', $term)->count();
    } else if ($request->level == "nursery") {
        $table = "App\\nursery_exam_report";
        $id_num = nursery_exam_report::where('student_id', $id)->where('session', $session)->where('term', $term)->count();
    }
    //else if($request->level=="pre-nursery"){
    //    $table="App\\pnursery_exam_report";
    //}
    else if ($request->secondary) {
        $table = "App\\secondary_exam_report";
    }
    if ($id_num <= 0) {
        for ($i = 0; $i < count($exams); $i++) {
            $ex = new $table;
            $ex->subject = $exams[$i]['value'];
            $i++;
            $ex->first_test = $exams[$i]['value'];
            $i++;
            $ex->second_test = $exams[$i]['value'];
            $i++;
            $ex->exam = $exams[$i]['value'];
            $i++;
            $ex->total = $exams[$i]['value'];
            $i++;
            $ex->grade = $exams[$i]['value'];
            $ex->student_id = $id;
            $ex->class = $class_rec[0]['value'];
            $ex->term = $term;
            $ex->session = $session;
            $ex->save();
        }
    } else if ($id_num >= 1) {
        $exa = $table::where('student_id', $id)->first();
        $ids = $exa->id;
        $id_it = $ids;
        for ($i = 0; $i < count($exams); $i++) {
            $ex = $table::find($id_it);
            $ex->subject = $exams[$i]['value'];
            $i++;
            $ex->first_test = $exams[$i]['value'];
            $i++;
            $ex->second_test = $exams[$i]['value'];
            $i++;
            $ex->exam = $exams[$i]['value'];
            $i++;
            $ex->total = $exams[$i]['value'];
            $i++;
            $ex->grade = $exams[$i]['value'];
            $ex->student_id = $id;
            $ex->class = $exams[$i]['value'];
            $ex->term = $term;
            $ex->session = $session;
            $ids = $exa->id;
            $ex->save();
            $id_it++;
        }
    }
    return response()->json(['success' => 'Success']);
}
It is expected to update the table, but the error occurs on $ex->subject=$exams[$i]['value'] in the branch where $id_num >= 1.
Check all the data, collations, and names on the remote MySQL server. Also check the capitalization: on Windows, case doesn't matter for table names, but on Linux it does.
I found the answer. The problem is that on the live database the ids are not arranged serially, and each time I do this
$exa=$table::where('student_id',$id)->first();
it picks the highest id, so when the loop tries to load the next id, which does not exist, I get the error. On my local server the database ids are arranged sequentially, so it worked there; I don't know why the behaviour differs. The solution was to change that line to the following:
$exa=$table::where('student_id',$id)->get();
$ids=$exa->min('id');
$id_it=$ids;
Instead of pulling the first record, I get all the records and find the minimum id.
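An equivalent sketch that lets the database pick the smallest id instead of loading every record into memory (assuming Eloquent's orderBy; untested against this schema):
$exa = $table::where('student_id', $id)->orderBy('id', 'asc')->first();
$id_it = $exa->id;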

No definition found for Table yahoo.finance.xchange

I have a service which uses the Yahoo! Finance table yahoo.finance.xchange. This morning I noticed it has stopped working, because Yahoo! suddenly started to return an error saying:
{
    "error": {
        "lang": "en-US",
        "description": "No definition found for Table yahoo.finance.xchange"
    }
}
This is the request URL. Interesting fact: if I try to refresh the query multiple times, I sometimes get back a correct response, but this happens very rarely (about 10% of the time). Everything was fine in the days before.
Does this mean the Yahoo API is down, or am I missing something because the API was changed? I would appreciate any help.
Since I have the same problem, it started today as well, others came to post at exactly the same time, and it still works most of the time, the only explanation I can find is that they have some random database errors on their end, and we can hope this will be solved soon. I also see a roughly 20% failure rate when refreshing the query page.
My guess is that they use many servers to handle the requests (say, 8) and that one of them is empty or is missing that table for some reason, so whenever the query is directed to that server, the error is returned.
Temporary solution: just modify your script to retry 3-4 times. That did it for me, because out of 5 attempts at least one succeeds.
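In PHP that could be as simple as the following sketch (fetchWithRetry and the half-second back-off are illustrative, not from any particular library):
// Call $fetch up to $attempts times, backing off briefly between tries.
function fetchWithRetry(callable $fetch, $attempts = 4)
{
    for ($i = 0; $i < $attempts; $i++) {
        $result = $fetch();
        if ($result !== false) {
            return $result; // success, stop retrying
        }
        usleep(500000); // wait half a second before the next try
    }
    return false; // every attempt failed
}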
I solved this issue by using quote.yahoo.com instead of the query.yahooapis.com service. Here's my code:
function devise($currency_from, $currency_to, $amount_from)
{
    $url = "http://quote.yahoo.com/d/quotes.csv?s=" . $currency_from . $currency_to . "=X" . "&f=l1&e=.csv";
    $handle = fopen($url, "r");
    $exchange_rate = fread($handle, 2000);
    fclose($handle);
    $amount_to = $amount_from * $exchange_rate;
    return round($amount_to, 2);
}
EDIT: the above no longer works. At this point, let's just forget about Yahoo, lol. Use this instead:
function convertCurrency($from, $to, $amount)
{
    $url = file_get_contents('https://free.currencyconverterapi.com/api/v5/convert?q=' . $from . '_' . $to . '&compact=ultra');
    $json = json_decode($url, true);
    // With compact=ultra the response is a single {"FROM_TO": rate} pair,
    // so imploding the decoded array yields the bare rate.
    $rate = implode(" ", $json);
    $total = $rate * $amount;
    return $total;
}
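A hedged usage example (the free endpoint worked without an API key at the time; that may have changed since):
// Convert 100 USD to EUR; the result is a float such as 92.4.
echo convertCurrency('USD', 'EUR', 100);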
Same error; I migrated to http://finance.yahoo.com.
Here is a C# example:
private static readonly ILog Log = LogManager.GetCurrentClassLogger();
private int YahooTimeOut = 4000;
private int Try { get; set; }

public decimal GetRate(string from, string to)
{
    var url = string.Format(
        "http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s={0}{1}=X", from, to);
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.UseDefaultCredentials = true;
    request.ContentType = "text/csv";
    request.Timeout = YahooTimeOut;
    try
    {
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            var resStream = response.GetResponseStream();
            using (var reader = new StreamReader(resStream))
            {
                var html = reader.ReadToEnd();
                var values = Regex.Split(html, ",");
                var rate = Convert.ToDecimal(values[1], new CultureInfo("en-US"));
                if (rate == 0)
                {
                    Thread.Sleep(550);
                    ++Try;
                    return Try < 5 ? GetRate(from, to) : 0;
                }
                return rate;
            }
        }
    }
    catch (Exception ex)
    {
        Log.Warning("Get currency rate from Yahoo fail " + ex);
        Thread.Sleep(550);
        ++Try;
        return Try < 5 ? GetRate(from, to) : 0;
    }
}
I've got the same issue.
I need exchange rates in my app, so I decided to use the currencylayer.com API instead - they offer 168 currencies, including precious metals and Bitcoin.
I've also written a microservice using webtask.io to cache rates from currencylayer and do cross-rate calculations.
And I've written a blog post about it 🤓
Check it out if you want to run your own microservice, it's pretty easy 😉
I found a solution: in my case, just changing http to https made everything work fine.

How to prevent additional page requests after response sent

I have configured a listener on kernel.request which sets a new response with a redirect when the session time has reached a certain value. The listener works fine and redirects to a certain page on the next request after the session has ended. But my problem is that the page has many links, and if I press the same link multiple times, the initial request containing the redirect is cancelled/stopped and a new request is made for the last link pressed; this bypasses my redirect even though the session has ended and been destroyed. So my question is: how can I prevent additional requests/link presses after the first request is made?
Here is my code:
public function onKernelRequestSession(GetResponseEvent $event)
{
    $request = $event->getRequest();
    $route = $request->get('_route');
    $session = $request->getSession();
    if ((false === strpos($route, '_wdt')) && ($route != null)) {
        $session->start();
        $time = time() - $session->getMetadataBag()->getCreated();
        if ($route != 'main_route_for_idle_page') {
            if (!$session->get("active") && $route == 'main_route_for_site_pages') {
                $session->invalidate();
                $session->set("active", "1");
            } else {
                if ($time >= $this->sessionTime) {
                    // Session has been idle too long: wipe it and redirect.
                    $session->clear();
                    $session->invalidate();
                    $event->setResponse(new RedirectResponse($this->router->generate('main_route_for_idle_page')));
                }
            }
        } else {
            if ($session->get("active")) {
                $session->clear();
                $session->invalidate();
            }
        }
    }
}
Thank you.
Idea #1: Simple incremental counter
Each request sends a sequence number as a parameter, which the server verifies against the value it expects.
The server increments the number and sends it back via the response.
The new number is used in future requests.
Basically, if the server expects the sequence number to be 2 and the client sends 1, the request is rejected (see the sketch below).
Idea #2: Unique hash each time
Similar to the idea above, but using unique hashes to eliminate the predictable nature of an incremental sequence.
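A rough PHP sketch of idea #1, assuming a session-backed counter; the parameter name and responses are illustrative:
// Reject any request whose sequence number doesn't match the expected one.
session_start();
$expected = isset($_SESSION['seq']) ? $_SESSION['seq'] : 0;
$received = isset($_POST['seq']) ? (int)$_POST['seq'] : -1;
if ($received !== $expected) {
    http_response_code(409); // duplicate or out-of-order request
    exit;
}
// Accept the request and advance the counter for the next one.
$_SESSION['seq'] = $expected + 1;
echo json_encode(array('seq' => $_SESSION['seq']));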
I resolved the issue using jQuery: when a link is pressed, the other ones are disabled, so only one request is made from the page:
var isClicked = false;
$(".menu-link").click(function(e) {
    if (!isClicked) {
        // First click: latch the flag and let the request through.
        isClicked = true;
    } else {
        // Swallow every further click.
        e.preventDefault();
    }
});
Thanks.

PHP Redis Session not saving

EDIT
I tried debugging this with Xdebug and NetBeans. It's weird that the exports will work during the debug session if I put in some breakpoints. However, with no breakpoints (a more realistic environment), the exports don't work.
I've tried adding sleeps into some parts of the code.
I think that maybe PHP is ending before the Redis commit is completed. Maybe the Redis connections are being made asynchronously, but I checked Predis and the default is a synchronous connection.
I am working on a reporting tool.
Here is the basic issue.
We store a report in the session object, but on later requests, when we try to get at the report in the session object, it's gone.
Here is a more detailed version.
I store a 'report' object into the session like so:
$_SESSION['report_name_unixtimestamp'] = gzcompress( serialize( $reportObject ) );
The user sees the report in some table form and then if they want they can export it. The report could change so the idea behind storing it in the session like this is that when the user exports it to PDF, Excel, etc, they'll be getting a report identical to the one they are viewing.
The user clicks on an export button and on the PHP side it will go into the session, fetch the report via the key provided as a get parameter (uncompresses and unserializes it), create the export and send it to the user for download.
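Concretely, the fetch side is just the reverse of the store step (a sketch; the key name in $_GET is illustrative):
// Fetch from the session, uncompress, unserialize.
$reportObject = unserialize( gzuncompress( $_SESSION[ $_GET['report_key'] ] ) );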
This has worked well up until the point that we tried to introduce the Redis caching server as a tool for better session management.
What happens now is the following:
The first time we run the report it will get stored into the cache and the export will work successfully.
We will run the report again, with the same user account in the same session. This changes the unix timestamp, so there should be two entries in $_SESSION ( $_SESSION['report_name_oldertimestamp'] and $_SESSION['report_name_newertimestamp'] ). When we click on the export button again, we get an error saying that the file doesn't exist (because it hasn't been sent by the server).
If we check the Redis server for the newer version of the report, it isn't there, but the older timestamp is still there.
Now, this worked with file-based session management but not with Redis. We've tried the redis module for PHP as well as the pure PHP client Predis.
Does anyone have any ideas?
Here are a few more details :
Redis has NOT run out of memory. We've checked this many times.
We already know that, to unserialize the report object from the session, the report class has to be included already. (Remember, the first export works fine but anything after that fails.)
If we check the PHP session object during the request in which the report is generated, it WILL contain the newer report, but the report never makes it to Redis.
Below is the save handler that is being used with Predis.
redis_session_init is the function I call right before session_start() so that the handler gets registered. I'm not sure how the redis_session_write function works, though, so maybe someone can help me with that.
<?php
namespace RedisSession
{
    $redisTargetPrefix = "PHPREDIS_SESSION:";
    $unpackItems = array( );
    $redisServer = "tcp://cache.emcweb.com";

    function redis_session_init( $unpack = null, $server = null, $prefix = null )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;
        if( $unpack !== null )
        {
            $unpackItems = $unpack;
        }
        if( $server !== null )
        {
            $redisServer = $server;
        }
        if( $prefix !== null )
        {
            $redisTargetPrefix = $prefix;
        }
        session_set_save_handler( 'RedisSession\redis_session_open', 'RedisSession\redis_session_close', 'RedisSession\redis_session_read', 'RedisSession\redis_session_write', 'RedisSession\redis_session_destroy', 'RedisSession\redis_session_gc' );
    }

    function redis_session_read( $id )
    {
        global $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        return base64_decode( $redisConnection->get( $redisTargetPrefix . $id ) );
    }

    function redis_session_write( $id, $data )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        $ttl = ini_get( "session.gc_maxlifetime" );
        $redisConnection->pipeline( function ($r) use (&$id, &$data, &$redisTargetPrefix, &$ttl, &$unpackItems)
        {
            $r->setex( $redisTargetPrefix . $id, $ttl, base64_encode( $data ) );
            foreach( $unpackItems as $item )
            {
                $keyname = $redisTargetPrefix . $id . ":" . $item;
                if( isset( $_SESSION[ $item ] ) )
                {
                    $r->setex( $keyname, $ttl, $_SESSION[ $item ] );
                }
                else
                {
                    $r->del( $keyname );
                }
            }
        } );
    }

    function redis_session_destroy( $id )
    {
        global $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        $redisConnection->del( $redisTargetPrefix . $id );
        $unpacked = $redisConnection->keys( $redisTargetPrefix . $id . ":*" );
        foreach( $unpacked as $unp )
        {
            $redisConnection->del( $unp );
        }
    }

    // These functions are all no-ops for various reasons... opening has no practical meaning in
    // terms of non-shared Redis connections, the same for closing. Garbage collection is handled by
    // Redis anyway.
    function redis_session_open( $path, $name )
    {
    }

    function redis_session_close()
    {
    }

    function redis_session_gc( $age )
    {
    }
}
The issue was solved, and it was much dumber than I thought.
The save handler doesn't implement locking in any way. On the report pages, multiple requests are made to the server via AJAX and the like, and one of those AJAX requests starts before the report gets saved to the session. That request reads the session early and then writes the session back at the end of its own run.
Since the report executes faster every time, the report would get cached to the session in Redis but would then be overwritten by the other script, which held an older version of the session.
I had help from one of my co-workers. Ugh! This was a headache I'm glad to be over.
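For anyone hitting the same thing: a crude per-session lock around the read-write cycle would serialize the competing requests. A minimal sketch with Predis (the lock key and timeout are illustrative, and a production version would need more care):
// Block until we hold the lock for this session id, with a safety TTL.
function redis_session_lock( \Predis\Client $r, $id, $timeout = 30 )
{
    $lockKey = "PHPREDIS_SESSION_LOCK:" . $id;
    // SET key value EX timeout NX: succeeds only if nobody holds the lock.
    while( !$r->set( $lockKey, 1, 'EX', $timeout, 'NX' ) )
    {
        usleep( 100000 ); // wait 100 ms and retry
    }
}

function redis_session_unlock( \Predis\Client $r, $id )
{
    $r->del( "PHPREDIS_SESSION_LOCK:" . $id );
}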
