Google Lighthouse Thinks Cloudflare Bot Fight Mode is Deprecated - performance

Google Lighthouse is giving me the following error under "Best Practices:"
window.webkitStorageInfo is deprecated. Please use
navigator.webkitTemporaryStorage or
navigator.webkitPersistentStorage instead.
The source file is called invisible.js. It doesn't take much legwork to find out that this is related to Cloudflare's Bot Fight Mode setting. What I haven't been able to figure out is what to do about it. Is this a false positive on Google's part? Is there something I need to update? Is Cloudflare running deprecated code?

I'm getting the same thing. It also consumes a lot of resources, which hurts page load time. I'm starting to wonder whether this layer of security is worth it, since longer page load times may increase bounce rate. This is also a cybersecurity question.

Related

Google APIs - API Key not Counting Against Queries

I've come to you today hoping to get some support regarding the Google Distance Matrix API. Currently I'm using it in a very simple way, with a Web Services request through the HTTP interface, and I have no problems getting results. Unfortunately my project seems to be running into query limits due to the 2,500-query quota. I have added billing to the project to allow going over 2,500 queries, and the increased quota is reflected in my project. What's odd, though, is that the console is not showing any usage, so I'm not sure whether these requests are being run against what I have set up.
I am using a single API key for the project, which is present in my requests, and as I said before the requests ARE working. But I'm hoping someone can shed some light on why I might not be seeing my queries reflected in my usage, and on how I can verify that my requests are being run under the project to which I have attached billing.
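For illustration, the requests look roughly like this (a minimal Python sketch; the endpoint and parameter names come from Google's public Distance Matrix documentation, while the key value and locations here are placeholders):

```python
import requests  # third-party HTTP library, assumed to be installed

API_KEY = "YOUR_API_KEY"  # placeholder for the single project key described above

response = requests.get(
    "https://maps.googleapis.com/maps/api/distancematrix/json",
    params={
        "origins": "Seattle,WA",           # placeholder origin
        "destinations": "San Francisco,CA", # placeholder destination
        "key": API_KEY,                     # key attached to the billed project
    },
)

data = response.json()
# "status" is "OK" while under quota and "OVER_QUERY_LIMIT" once it is exhausted.
print(data["status"])
```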
If there is any information I can provide to help find an answer, please let me know and I'll be happy to share what I can.
After doing some digging I was able to find the following relevant thread to answer my question:
Google API Key hits max request limit despite billing enabled

My WordPress website and dashboard are both too slow; the server responds in 11 seconds

The domain of my blog is codesaviour.
Since last month, my blog and wp-admin dashboard have slowed down to a frustrating level. I have already removed post revisions after reading about speeding up WordPress.
Here is the Google PageSpeed Insights report for my blog. According to it, the server response time is 11 s.
I even read the following threads on Stack Overflow:
link. I tried to implement the steps, but the blog is still slow; no change.
My host is Hostgator.in. Their online support asked me to enable gzip compression as instructed at link, so I followed the instructions. Since there was no .htaccess file on the server, I created one and pasted in the code from that link, but nothing helped. The site is as slow as before, and even online reports don't show that gzip is working.
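One way to double-check whether gzip is actually coming back, independent of the online reports, is to look at the Content-Encoding response header. A minimal sketch using Python's requests library (assumed to be installed):

```python
import requests  # pip install requests

url = "http://codesaviour.com/"  # the blog's landing page

# requests sends "Accept-Encoding: gzip, deflate" by default, so if the server
# has gzip enabled, the response should come back compressed.
response = requests.get(url)

print("Content-Encoding:", response.headers.get("Content-Encoding", "none"))
print("Uncompressed size:", len(response.content), "bytes")
```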
Here is a report from GTmetrix that includes the PageSpeed and YSlow reports. The third tab, Timeline, shows that it took 11.46 s in receiving.
The main problem is the server response time of 11 s (Google PageSpeed report) or 11.46 s (GTmetrix report).
Google suggests reducing it to under 200 ms. How can I reduce it?
#Constantine responded in this link that many WordPress websites are going through the same slow phase.
I am using following plugins:
Akismet
Google Analyticator
Google XML Sitemaps
Jetpack by WordPress.com
Revision Control
SyntaxHighlighter Evolved
WordPress Gzip Compression
WordPress SEO
WP Edit
Every time I select Add New plugin, the following error is reported:
An unexpected error occurred. Something may be wrong with
WordPress.org or this server’s configuration.
Also, whenever I install a plugin using the upload option, it gives me this error:
Can't load versions file.
http_request_failed
Please help me increase the speed of my blog and dashboard, and please suggest fixes for the errors I am receiving.
Edit
Automatically, without any changes on my part, the 11.46 s has been reduced to 1.26 s.
I will focus on the speed issue. Generally, when things start to be slow, it is a good idea to test by gradually switching off features until the site is fast again; the last thing you switched off before it became fast is the slow one. Then look at that thing in detail: split it into subtasks and repeat the process until you find the exact cause of the problem. I would do that with the plugins as well. After the testing is finished, put the features back.
Use an effective caching plugin like "WP Super Cache"; it drastically improves your page's load time. Optimizing your images is also essential for your site's speed, and WP-SmushIt performs great for this. The last plugin I highly recommend is WP-Optimize, which basically cleans up and optimizes your WordPress database without manual queries. It sometimes gives errors when you have installed the same plugin more than once; in that case, delete the plugin via your FTP program instead of through the WordPress admin (otherwise it won't work properly because of the errors), then install the same plugin again.
If you're going to maintain a site about programming then you really have to fix the performance. It really is awful.
The advice you get from automated tools isn't always that good.
Looking at the link you provided, the biggest problem is the HTML content generation for GET http://codesaviour.com/, which is taking 11.46 seconds (there are problems elsewhere, but that is by far the worst). 99% of the time the browser is just waiting; it only takes a fraction of a second to transfer the content across the network. WordPress is notorious for poor performance, often due to overloading pages with plugins. Your landing page should be fast and cacheable, and it currently fails on both counts.
even online reports doesn't show that gzip is even working
The HAR file you linked to says it is working. But compression isn't going to make much impact - it's only 8.4Kb uncompressed. The problem is with the content generation.
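If you want to see that split yourself, time how long the server takes to send the first byte versus how long the body takes to transfer. A rough illustration in Python (just a sketch with the requests library; a proper profiler on the server will tell you far more):

```python
import time
import requests  # pip install requests

url = "http://codesaviour.com/"  # landing page from the report above

start = time.time()
# stream=True separates "waiting for the server" from "downloading the body":
# get() returns as soon as the response headers arrive.
response = requests.get(url, stream=True)
first_byte = time.time()          # server has started answering
body = response.content           # now pull the rest of the body
done = time.time()

print(f"Time to first byte: {first_byte - start:.2f} s")   # content generation
print(f"Transfer time:      {done - first_byte:.2f} s")    # network transfer
print(f"Body size:          {len(body)} bytes")
```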
You should certainly use a Wordpress serverside cache module (here's a good comparison).
DO NOT USE the Wordpress Gzip plugin - do the compression on the webserver - it's much faster and more flexible.
In an ideal world you should be using ESI - but you really need control over the infrastructure to implement that properly.
Diagnosing performance problems is hard; fixing them is harder, even when you have full access to the system they run on. I would recommend setting up a local installation of your stack and seeing how it performs there - hopefully you can reproduce the behaviour and isolate the cause. Start by running a profiler such as XHProf and checking the MySQL query log (I'm guessing these aren't available from your hosting company). You will, however, be able to check the state of your opcode cache - there are free tools for both APC and Zend OPcache. Also check the health of your MySQL query cache.
Other things to try: disable each of the plugins in turn and measure the impact (you can get waterfalls in Firefox using the Firebug extension, and in Chrome using the bundled developer tools).
You might also want to read up a bit on performance optimization - note that most books tend to focus on client-side problems, but your problems are on the server. You might even consider switching to a provider who specializes in WordPress, or using a different CMS.
symcbean's answer is good, but I would add a few things:
This is a server-side issue
This has been said by others, but I want to further emphasize that this is a server-side issue, so all those client-side speed-testing tools are going to be of very limited value.
HostGator isn't high-performance hosting
I don't know about India, but HostGator in the US is generally very slow for dynamic, database-driven sites (like yours). It absolutely shouldn't take 11 seconds to load the page, especially since your site doesn't look particularly complex, but unless you're serving a totally static site, HostGator probably won't ever give you really stellar performance.
Shared hosting leaves you at the mercy of badly-behaved "neighbors"
If you're using one of HostGator's standard shared hosting packages (I'm assuming you are), you could have another site on the same machine using too many resources and crippling the performance of your site. See if you can get HostGator to look into that.
Why not use a service built for this?
This looks like a totally standard blog, so a service like Tumblr or Wordpress.com (not .org) might be a better choice for your needs. Performance will be excellent and the cost should be very low, even with a custom domain name. If you aren't experienced in managing WordPress and don't have any interest in learning how (don't blame you), why not leave all that to the experts?
You need to make some adjustments to speed up WordPress.
The first step is: remove any unwanted plugins you have in WordPress.
The second step is: delete the themes you do not use.
The third step is: compress all images with lossless quality.
The fourth step is: clean up the database.
If you have done all these steps, your WordPress should be fixed. If you want more details, check out this link: How to fix WordPress dashboard slow.
Other than the usual suggestions: if you are hosting your MySQL db on a different host from the web server, check the latency between the two. WordPress is unbelievably chatty with its db (50+ db calls to load each dashboard page, for example). By moving the db onto the same host as the web server, I got excellent performance.
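If you want a rough number for that latency, timing a plain TCP connection to the database port is usually enough to show whether it's a problem. A minimal Python sketch (the hostname below is a placeholder for your actual MySQL host):

```python
import socket
import time

DB_HOST = "db.example.com"  # placeholder - your MySQL host
DB_PORT = 3306              # MySQL's default port

# Average the connect time over a few attempts.
samples = []
for _ in range(5):
    start = time.time()
    with socket.create_connection((DB_HOST, DB_PORT), timeout=5):
        samples.append((time.time() - start) * 1000)

avg_ms = sum(samples) / len(samples)
print(f"Average connect latency: {avg_ms:.1f} ms")
# At 50+ queries per dashboard page, even a few milliseconds each adds up quickly.
```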

Automatic scaling of app

I was wondering whether it is possible to autoscale if the demand for requests escalates. What do people do if the app they just created goes viral in the middle of the night and people start getting error codes instead of data? Or is such functionality in the pipeline?
If your app hits its request limit, your extra requests will begin to fail with error code 155 (RequestLimitExceeded). To prevent the requests from failing, you should adjust the request limit slider for the relevant app on the Account Overview page.
Now, coming to your question: can this be done automatically? As of now, I'd say no. Parse currently requires you to do that manually. Having gone through all their blog posts, I see no hints of this functionality coming in the near future. In any case, this question can only be answered 100% "correctly" by someone from Parse; we at Stack Overflow can only guess.
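Until Parse offers that, about all you can do client-side is detect the failure and back off before retrying. A generic sketch of that idea in Python (call_with_backoff and call_parse are hypothetical helpers, not part of any Parse SDK; 155 is the RequestLimitExceeded code mentioned above):

```python
import time

def call_with_backoff(call_parse, max_retries=5):
    """Retry a request when it fails with code 155 (RequestLimitExceeded).

    call_parse is a hypothetical zero-argument helper that performs one
    request and returns the decoded JSON response as a dict.
    """
    delay = 1.0
    result = {}
    for _ in range(max_retries):
        result = call_parse()
        if result.get("code") != 155:   # not rate-limited: return as-is
            return result
        time.sleep(delay)               # rate-limited: wait and retry
        delay *= 2                      # exponential backoff
    return result                       # give up after max_retries
```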
This is a great question you raised! As I see it, Parse is a good PaaS with all the "cloudy" features, and even the pricing looks new-generation (hourly based). However, if it lacks the automation to adjust the limits, you will still pay for unused capacity over time, just as in old datacenters, which really bothers me (unless you pay someone to continuously monitor performance and manually set the limits).

Using the geocoder gem to make more requests

I am working with the geocoder gem and would like to process a larger number of requests from one IP. By default, the Google API allows only 2,500 requests per day.
Please share your thoughts on how I can make more requests than the limit.
As stated before: using only the Google API, the only way around the limitation is to pay for it - or, in a shadier way, make the requests from more than one IP/API key, which I would not recommend.
But to stay on the safe side, I would suggest mixing the services up, since there are a few more geocoding APIs out there - for free.
With the right gem mixing them is also not a big issue:
http://www.rubygeocoder.com/
It supports a couple of them with a nice interface. You would pretty much only have to add some rate-limiting counters to make sure you stay within the limits of each provider.
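To illustrate the rate-limiting idea, here is a minimal sketch in Python (the provider names and daily limits are placeholders - check each provider's current terms; with the geocoder gem you would switch the configured lookup service accordingly):

```python
from datetime import date

# Placeholder daily limits per provider - verify against each provider's terms.
DAILY_LIMITS = {"google": 2500, "nominatim": 10000, "bing": 10000}

usage = {}  # (provider, day) -> number of requests made that day

def pick_provider():
    """Return the first provider that still has quota left today, or None."""
    today = date.today()
    for provider, limit in DAILY_LIMITS.items():
        if usage.get((provider, today), 0) < limit:
            return provider
    return None

def record_request(provider):
    """Count one request against today's quota for the given provider."""
    key = (provider, date.today())
    usage[key] = usage.get(key, 0) + 1

# Usage: provider = pick_provider(); geocode with it; then record_request(provider).
```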
Or go the heavy way and implement your own geocoding, for example with your own running OpenStreetMap database. The data can be downloaded here: http://wiki.openstreetmap.org/wiki/Planet.osm#Worldwide_data
Which way is best depends on what your actual requirements are and what resources you have available.

SSL speed - specifics with Magento

I have seen some general questions regarding speed of SSL, but most answers are generic and ask for specifics to give a better answer.
Well, here are my specifics; I really hope someone can give me some advice on what to do.
Question:
I would prefer to keep SSL on throughout the site, instead of only where Magento enables it by default, such as logging in, account edits, orders and payments. So basically, also during product browsing, reading CMS pages, etc.
But at what performance cost? I'm only worried about performance that a user would actually notice.
I'm running a Magento multistore site on a dedicated server with 4 GB of memory and a dual-core processor with gigabit connectivity, running CentOS 5 and the latest LAMP versions. I have a Comodo multi-domain Extended Validation SSL certificate (the 'green bar').
Ask me for any details that would help you give better advice :-)
In short, the answer is that you will most definitely see a performance hit. This is why Magento was built the way it was: secure the pages that have private content, and leave the rest open.
For each request made over HTTPS, the client and server must verify the certificate, exchange keys, and encrypt and decrypt the data. This adds quite an overhead to Apache and the OS. You will also lose the efficiency of local caching of static content, such as stylesheets, JavaScript, images, etc.
As a result, the client will see an increase in load times, Google will ding you for a slow website, conversion will most likely decrease, and there may be other unforeseen consequences.
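If you want to put a number on the per-connection overhead for your own server, comparing a plain TCP connect with a full TLS handshake gives a rough idea. A minimal Python sketch (the hostname is a placeholder for your store's domain):

```python
import socket
import ssl
import time

HOST = "shop.example.com"  # placeholder - your Magento store's hostname

# Plain TCP connect (what an http:// request starts with).
start = time.time()
with socket.create_connection((HOST, 80), timeout=10):
    tcp_ms = (time.time() - start) * 1000

# TCP connect plus TLS handshake (what an https:// request starts with).
context = ssl.create_default_context()
start = time.time()
with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST):
        tls_ms = (time.time() - start) * 1000

print(f"TCP connect:         {tcp_ms:.0f} ms")
print(f"TCP + TLS handshake: {tls_ms:.0f} ms")
# The difference is roughly the per-connection cost HTTPS adds; keep-alive and
# session resumption reduce how often that cost is paid in practice.
```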
Here's a conversation from Magentocommerce about constant HTTPS: magentocommerce
In the end, it's not a great idea. Magento does a very good job knowing which pages should be secure and which are fine without.
But, if you MUST, it is possible. Watch your conversion and analytics numbers closely. If you have Google Analytics installed, add the page speed _trackPageLoadTime call to your site. Then, at least, you will know what the damage is.
