I'm developing a PHP application written using the Laravel 4.1 framework. So far I only have a few MySQL queries per page, and many of them are cached with the Cache class (backed by a Redis server) where possible.
Currently I'm developing locally with an internal (but not localhost) MySQL database, using Apache 2.2.24 and PHP 5.4.17.
Using Chrome's Developer Tools, I'm checking the Network tab to see how long the page takes to load, but I'm seeing some weird results. The page spends a long time waiting for the content to be served, as you can see below:
As you can see, the new page spends 682ms waiting for the content to be sent back to the browser. Is there any way I can improve this? Why does Laravel have such a big overhead?
Apart from a custom Facade that we use to make using Entypo easier, there are no extra packages beyond the defaults that come with Laravel.
Does anybody know how this can be improved?
If I were you I would install the Chrome Clockwork extension plus the Laravel Clockwork package from Composer. Clockwork gives you a timeline where you can see what it is that takes so long, plus a database tab where you can see how long each query takes.
Happy hunting (:
I have a Laravel website running on Apache 2 across multiple servers for multiple clients. There is only one web server where the website loads slowly. It doesn't load slowly on every page load; it loads slowly at random. Also, after a page has loaded quickly, if you leave the site for, say, 5 minutes and then navigate to another page, the page loads slowly yet again.
I'm not sure if it's in fact Apache that causes the slowness of the site, or a third-party plugin, because as the page is loading it mentions m.stripe.com loading, and on another page load it could be another plugin.
Are there any tools I can use to diagnose this issue?
It is very difficult to say why a Laravel website is slow. There are many possible reasons behind a slow Laravel application. You have to debug and decide where your application needs improvement. Here is a list of things I focus on when I develop an application.
Database
How many database queries run on each page? You have to make sure no recursive or N+1 queries exist, cache data efficiently, and check that each query takes only a little time to execute.
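The N+1 pattern mentioned above is the most common culprit: one query to fetch a list, then one extra query per row. A self-contained sketch of the difference using plain PDO with an in-memory SQLite database (inside Laravel you would avoid the lazy pattern with eager loading via `with()`; the table and data here are made up for illustration):

```php
<?php
// Sketch: N+1 queries vs a single IN() query, counted by hand.
// Uses an in-memory SQLite DB so the example is self-contained.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)');
$pdo->exec("INSERT INTO posts (title) VALUES ('a'), ('b'), ('c')");

$queries = 0;

// N+1 pattern: one query per id -- this is what lazy loading does.
$titles = [];
foreach ([1, 2, 3] as $id) {
    $stmt = $pdo->prepare('SELECT title FROM posts WHERE id = ?');
    $stmt->execute([$id]);
    $queries++;
    $titles[] = $stmt->fetchColumn();
}

// Better: one query for all ids -- what eager loading compiles to.
$stmt = $pdo->query('SELECT title FROM posts WHERE id IN (1, 2, 3)');
$queries++;
$eagerTitles = $stmt->fetchAll(PDO::FETCH_COLUMN);

echo "queries: $queries\n"; // 3 lazy queries + 1 eager query
```

With 3 rows the lazy loop already triples the query count; with a 100-row page it becomes 100 extra round trips.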
Network Connection
If your Redis, database, or queue runs over a separate network connection, make sure those connections are well optimized and take as little time as possible to connect and serve data.
Cache Files
Cache your Blade views, routes, and config files, and keep those caches optimized. Also, reduce autoloaded services as much as possible.
Optimize your Images
Optimize your images. I recommend not serving them as local files; try to use a cloud service.
Minify CSS and JS Files
You should minify your CSS and JS files, and try to load common libraries from a CDN.
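As a rough illustration of what minification buys you, here is a hypothetical, deliberately naive CSS minifier in PHP. Real tools (webpack plugins, cssnano, etc.) handle many edge cases this sketch ignores, such as strings and `calc()` expressions:

```php
<?php
// Naive CSS minifier sketch: strips comments and collapses whitespace.
// Not production-safe -- it ignores edge cases like quoted strings.
function minifyCss(string $css): string
{
    $css = preg_replace('!/\*.*?\*/!s', '', $css);        // strip /* comments */
    $css = preg_replace('/\s+/', ' ', $css);              // collapse whitespace
    $css = preg_replace('/\s*([{};:,])\s*/', '$1', $css); // tighten around punctuation
    return trim(str_replace(';}', '}', $css));            // drop last semicolon in a block
}

$css = "/* header */\nbody {\n    color: #333;\n    margin: 0 ;\n}\n";
echo minifyCss($css), "\n"; // body{color:#333;margin:0}
```

Even this toy version typically cuts a hand-written stylesheet by a third; dedicated minifiers do better and are safer.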
Use Queue
Use queues where possible, e.g. for sending email or generating PDFs.
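The point of queueing is simply to move slow work out of the request/response cycle. A toy sketch of the principle in plain PHP (in Laravel you would dispatch a queued Job class instead, with Redis or a database as the backend; the job names here are made up):

```php
<?php
// Toy queue sketch: the request only enqueues work; a worker
// processes it later, so the HTTP response can return immediately.
$queue = [];

// During the request: enqueue instead of doing the slow work inline.
$enqueue = function (string $job, array $payload) use (&$queue): void {
    $queue[] = ['job' => $job, 'payload' => $payload];
};

$enqueue('send_email', ['to' => 'user@example.com']);
$enqueue('generate_pdf', ['invoice' => 42]);
echo 'response sent, jobs queued: ', count($queue), "\n"; // 2

// Later, in a separate worker process (cf. `php artisan queue:work`):
$processed = [];
while ($item = array_shift($queue)) {
    $processed[] = $item['job']; // a real worker would send the email etc.
}
echo implode(',', $processed), "\n"; // send_email,generate_pdf
```

The user waits only for the enqueue, not for the mail server or the PDF renderer.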
Found that there were two A records with different IP addresses for the same domain. I would have thought some error would have been raised because of this, but the resolver randomly selected an IP to serve the site, and of course load times were long for the IP that does not exist.
Minify your HTML, CSS, and JavaScript. It's a great way to increase page speed. Depending on the IDE you use, you could install a minifier extension, use webpack to bundle your code, or use an online minifier like https://fixcode.org/minify
Use a CDN to load static files like scripts, images, and style sheets. You could also serve your entire site via a CDN.
You could also consider lazy-loading parts of your script, especially in cases where you compile scripts with webpack.
I have a market prices table that is updated every millisecond by a third-party service.
I need something to display the live prices in my HTML view, and I don't want to use AJAX.
Asynchronous JavaScript is probably the easiest and lowest-latency (best) solution, although it still likely won't give you to-the-millisecond accuracy.
Presenting the raw data source in an HTML frame that is repeatedly refreshed via JavaScript may be an option depending on formatting needs, but this has its own challenges and is a bit of a hack.
You might be able to achieve this with scheduled static site generation (rebuild) tasks, but you would be looking at even larger latency in that case, and people would also need to refresh the page to see the updated data unless you refresh the page on an interval.
When dealing with markup, the content will either need to be refreshed by JavaScript or by a browser refresh.
To display data live, yes! Websockets are the answer. Though you can't run them inside the same process that serves your default Laravel app: you'll need to create a separate long-running server via an Artisan console command or the task scheduler, listening on a different port. Make sure your hosting provider supports listening on ports other than 80/443; you may need a VPS or similar to run such an application.
The only reliable PHP websocket library I know of that integrates well with Laravel is Ratchet.
Though I'd advise Node.js with Socket.IO when it comes to websocket applications.
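Whichever server you choose, the pattern is the same: the long-running process pushes a small JSON frame to subscribed clients whenever a price ticks. A hypothetical framing helper (the function name and payload shape are made up for illustration; Ratchet or Socket.IO would carry these strings over the socket):

```php
<?php
// Hypothetical helper: serialize a price tick into the JSON frame a
// websocket server (Ratchet, Socket.IO, ...) would push to clients.
function priceFrame(string $symbol, float $price, int $timestampMs): string
{
    return json_encode([
        'event'  => 'price.update',
        'symbol' => $symbol,
        'price'  => $price,
        'ts'     => $timestampMs,
    ]);
}

$frame = priceFrame('EURUSD', 1.0842, 1700000000000);
echo $frame, "\n";
```

The browser-side handler then just parses the frame and patches the one cell in the HTML view, instead of re-rendering the whole page.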
I'm creating a PHP/MySQL real-time chat app. A friend told me it is very bad to use PHP for real-time apps and that it would kill the server CPU.
I know that PHP isn't the ideal choice, but I want some advice on getting better performance while using PHP for a real-time chat app.
I'd also like to know why some developers prefer using PHP 7 over Node.js/Socket.IO/Go, etc.?
Thanks.
I would prefer Node.js + Socket.IO for a real-time chat application for one reason: it is JavaScript all the way from the browser/mobile client to the server, and, if you use a NoSQL database, down to the database as well.
But your question of PHP 7 vs Node.js left me wondering too, and looking for the answer I came across this blog post. It might help you as well; it's too long to summarize here.
PHP7 vs Nodejs
I have tested my site freshdeals.co.in on http://www.webpagetest.org/result/150325_HZ_TAN/
I didn't understand why my site was taking so much time in TTFB (time to first byte), and what I can do to reduce it.
Also, I would like to know: does every HTTP request on the page include its own TTFB or not?
I think your server is overloaded, and that may be why you are seeing this issue. I suggest you monitor your server and optimize MySQL and the web server installed on it.
Also, please check the PageSpeed suggestions for your site and apply them: https://developers.google.com/speed/pagespeed/insights/?url=freshdeals.co.in&tab=desktop
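On the second part of the question: yes, every HTTP request has its own TTFB; the one tools report most prominently is for the main document. You can measure it yourself from PHP with the curl extension, for example (the URL is a placeholder):

```php
<?php
// Measure time-to-first-byte for a URL using PHP's curl extension.
function ttfb(string $url): float
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // don't print the body
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects
    ]);
    curl_exec($ch);
    // Seconds from the start of the request until the first byte arrived.
    $seconds = curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME);
    curl_close($ch);
    return $seconds;
}

printf("TTFB: %.3f s\n", ttfb('https://example.com/'));
```

Running this from different networks (or from the server itself) helps separate network latency from server processing time.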
How customized is the implementation? Server capacity might be an issue, but for a 15-second page load time it is more likely that a customization is the cause. Excessive iteration is a fairly consistent problem in Magento implementations, often showing up as many calls to Mage_Core_Model_Abstract::load().
Take the public dataset, load it on a local computer, and run a profiler against it. XDebug and Zend Debugger both have profilers, or try New Relic. If you aren't able to do that, turn on the query log in MySQL and load a page. Even in the worst case you should not see much more than 100 queries; if you see more than that, you will likely need to re-architect some customizations.
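If a full profiler isn't available, even a crude hand-rolled timer around suspect sections can point at the slow block. A minimal sketch (the section names and sleeps are stand-ins for real work):

```php
<?php
// Crude section timer: a poor man's profiler for narrowing down
// which block of a slow page is actually eating the time.
$timings = [];

function timeSection(string $name, callable $fn, array &$timings)
{
    $start = microtime(true);
    $result = $fn();
    $timings[$name] = microtime(true) - $start;
    return $result;
}

timeSection('load_products', function () {
    usleep(50000); // stand-in for repeated model load() calls
}, $timings);

timeSection('render_template', function () {
    usleep(10000); // stand-in for template rendering
}, $timings);

arsort($timings); // slowest section first
foreach ($timings as $name => $seconds) {
    printf("%-16s %.3f s\n", $name, $seconds);
}
```

Once the slow section is isolated this way, the query log or a real profiler can be pointed at just that code path.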
Also, make sure that you're using an opcache for PHP code and that your data cache is working. But even if both of those are turned off you should not be seeing 15 second page load times.
In the past, when I checked the site's speed in Google PageSpeed and similar tools, the site got very high scores (good CSS & JS optimization). I installed the Advanced CSS/JS Aggregation module and the Boost module to get those high scores.
Then, suddenly, I started getting a message in Google PageSpeed (and other tools) saying my server response time is slow: around 3 seconds.
My site built with Drupal 7 and hosted on Bluehost Shared hosting.
Bluehost technical support says that the problem is not on their side.
What do you think is causing the server to be slow?
How can I fix it?
Or at least, what should I check (images, caching, something else)?
The first thing to figure out is what a desirable response time is. For example, if you have lots of modules and a pretty heavy site/homepage, then maybe 3 seconds is OK unless something is done to change the processing time (caching, using fewer modules, etc.).
Back to your question of what to check:
Check your homepage and what views and other components load for it to be rendered. Then make a list and go through it one by one, asking:
Is it optimal / can it be improved? Maybe something is defeating the caching (dynamic parameters injected on each request for the item, etc.).
If you're using Views, enable the SQL display to see what SQL statements it runs, and use tools to analyze/improve them (this could be a question by itself).
Look at the modules that are loaded/used to make sure you need them.
Check on the drupal caching(/admin/config/development/performance) and make sure the correct checkboxes are checked.
This could just as well be Bluehost's problem: if they host many sites on one server, the server will start evicting some sites from memory and load them back as visitors request them, hence the slow load (the server re-bootstraps the site, Drupal loads it from the database, etc.).
You can ask specific questions after you check those.