Why is YSlow not detecting my cookie-free domains? - performance

I have moved all my static assets to cloudfront.net, and when I view my source code, my CSS, JS, and images are already hosted on cloudfront.net. But when I check GTmetrix.com, my "use cookie-free domains" rule is still graded F, and my main domain is still shown in the list instead of CloudFront.
I have already cleared my cache (Cloudflare cache, browser cache, and every other kind of cache), but YSlow in GTmetrix still doesn't detect that I'm using a CDN (cloudfront.net).
Has anyone here encountered the same problem?
Actual GTMetrix Result:
https://gtmetrix.com/reports/www.flyskyjetair.com/SgHBKXsJ
Actual Code:
view-source:https://www.flyskyjetair.com/

If you have the strip cookies and cache cookies options enabled but are still receiving a warning when running your site through YSlow, this is due to a YSlow false positive. If you set your cookies on the top-level domain (e.g. yourwebsite.com), all of your subdomains will also include the cookies that are set. This also includes your custom CDN URL, if you are using one (e.g. cdn.yourwebsite.com).
However, as long as you have the strip cookies option enabled, this warning is incorrect even when it appears. YSlow does not take into account that the CDN actually strips the cookie, and therefore may continue to throw the error. However, if you run a cURL command on the asset or check it in the Chrome DevTools Network tab, you won't see any Set-Cookie headers. Therefore this YSlow warning can be safely ignored.
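For example, a quick way to verify this yourself is a cURL HEAD request against one of the CDN-hosted assets (the CloudFront hostname and file path below are only placeholders; substitute one of your own static files):

# No output from grep means the CDN response carries no Set-Cookie header
curl -sI https://dxxxxxxxxxxxx.cloudfront.net/assets/style.css | grep -i 'set-cookie'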
If you are using Cloudflare, then you simply won't be able to achieve 100 on YSlow. Cloudflare appends a __cfduid cookie to every request, which cannot be removed for security reasons.

Related

Google Cloud CDN "Force Cache All Content" does NOT cache all content

I am using Google Cloud CDN for my WordPress website https://cdn.datanumen.com. I have enabled the "Force Cache All Content" option. However, the web pages, CSS files, and JavaScript files are still not cached. Only the images are cached.
For example, when I test the page at https://cdn.datanumen.com/, I use Ctrl + F5 to refresh the webpage many times, but I always get the same results.
Below is the web page I try to load:
There are "Cache-Control" field in the response header, but no "Age" field. Based on Google document, if a cache hits and cached content is served, there will be a "Age" field. So without "Age" means the file is not cached.
I also check the log:
In the log, cacheFillBytes is 26776 and cacheLookup is true. It seems that Google Cloud CDN is trying to look up the cache and fill it with the contents. But statusDetails shows "response_sent_by_backend", so the contents are still served from the backend. Normally this should only occur the first time I visit the website. But in my case, even if I press Ctrl + F5 to refresh my website many times, I always get the same result; statusDetails never shows "response_sent_by_cache" for a page such as https://cdn.datanumen.com/
Why?
Update:
I notice there is a "Vary" field in the response header:
Based on https://cloud.google.com/cdn/docs/caching#non-cacheable_content, if the Vary header has a value other than Accept, Accept-Encoding, or Origin, then the content will not be cached. Since in my case the "Vary" header is "Accept-Encoding,Cookie,User-Agent", it is not cached. But my question is: how do I deal with this issue and force the content to be cached?
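For reference, this is how I check the headers from the command line (a plain curl request against my CDN URL; the header names are standard, though what you see will depend on the setup):

# A response served from Cloud CDN's cache should carry an "Age" header;
# a "Vary" value beyond Accept, Accept-Encoding or Origin prevents caching
curl -sI https://cdn.datanumen.com/ | grep -iE '^(age|cache-control|vary):'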
Update 2
I have changed the site to a real WordPress site, since that is what I need in the end. I plan to purchase Google Cloud support to see if they can help with this case.
According to the Google Cloud CDN documentation, the best way to solve your problem is actually to use the CACHE_ALL_STATIC cache mode:
CACHE_ALL_STATIC: Automatically caches static content that doesn't have the no-store or private directive. Origin responses that set valid caching directives are also cached. This is the default behavior for Cloud CDN-enabled backends created by using the gcloud command-line tool or the REST API.
USE_ORIGIN_HEADERS: Requires origin responses to set valid cache directives and valid caching headers. Responses without these directives are forwarded from the origin.
FORCE_CACHE_ALL: Unconditionally caches responses, overriding any cache directives set by the origin. This mode is not appropriate if the backend serves private, per-user content, such as dynamic HTML or API responses.
But in the case of the last cache mode, there are two warnings about its usage:
When you set the cache mode to FORCE_CACHE_ALL, the default time to live (TTL) for content caching is 3600 seconds (1 hour), unless you explicitly set a different TTL. Accepting the new default TTL of 1 hour might cause some entries that were previously considered fresh (due to having longer TTLs from origin headers) to now be considered stale.
The FORCE_CACHE_ALL mode overrides cache directives (Cache-Control and Expires) but does not override other origin response headers. In particular, a Vary header is still honored, and may suppress caching even in the presence of FORCE_CACHE_ALL. For more information, see Vary headers.
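If you do decide to change the cache mode, a rough sketch of the gcloud commands would look like this (assuming a reasonably recent gcloud and a global backend service named my-backend-service, which is only a placeholder; adjust the name and TTL to your setup):

# Let Cloud CDN cache static content automatically (the documented default for new backends)
gcloud compute backend-services update my-backend-service \
    --global --cache-mode=CACHE_ALL_STATIC

# Or cache everything regardless of origin cache directives, with an explicit one-hour TTL.
# Note that a restrictive Vary header from the origin still suppresses caching in this mode.
gcloud compute backend-services update my-backend-service \
    --global --cache-mode=FORCE_CACHE_ALL --default-ttl=3600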

Why do some websites that have SSL not work properly, yet still load, when using the HTTPS version? How can I avoid this if I make a website?

Sometimes, if I go to a website, such as this one through an HTTP link, it looks fine and works as apparently intended:
However, if you change the address to be HTTPS, the page loads without any browser warnings but looks really weird and seems broken—spacing is messed up, the colors are wrong, fonts don't load, etc.:
All of this same stuff happens in both Firefox and Chrome on my computer.
What causes this to happen? How can I avoid this if I make an HTTPS-secured website?
For me, the browser tells you what is wrong in a warning message: parts of the page are not secure (such as images).
What does this mean? The developer of the site has linked some content such as CSS, JS, or images using HTTPS links and some using HTTP links.
Why is this a problem? Since some content is being retrieved over an insecure connection (HTTP), it would be possible for malicious content to be injected into your browser, which could then grab information that was transmitted over HTTPS. Browsers have had this warning for a very long time, but in the interest of security they now err on the more secure side.
What will fix this? There is nothing we can do as consumers of the website. The owner of the site should fix the problem. If you are really interested in viewing the site and not concerned about security, you can temporarily disable this protection from the URL bar warning message in Firefox.
As #micker explained, the page looks weird because not all of the resources are loading: their connections could not be made securely, and the browser denies loading them because they are not referenced over a secure connection.
To elaborate further, in case it's still not quite clear: for styling a webpage, Cascading Style Sheets (CSS) is the language used to describe the presentation of a document (a webpage, in this case), and it tells the browser how elements should be rendered on the screen. If you think of these stylesheets as building blocks that you can combine to define different areas of a webpage and build one masterpiece, then you can see why having multiple building blocks for a site is perfectly normal.
To save even more time, rather than figure out the code for each and every stylesheet or "building block" that I want to include, I can borrow someone else's stylesheet that has the properties I want and link to it as a resource instead of making or hosting it myself. Now, if we pretend that there's a stylesheet for every font size change, font color variance, or font placement, then that means we're going to need a building block to define each of those.
Now, if I am on a secure connection, the browser ensures that the connection stays secure by only connecting to other sites, or resources, that are also secure. If any of the sites hosting the CSS building blocks I want to use are not secure, i.e. not using SSL (indicated by the lack of an "s" in "http://" in their address), then the browser will prevent those connections from happening and thus prevent the resources from loading, because it considers them a risk to your current secure connection.
In your example's particular case, things looked fine when you entered only http:// (without the https://) because the site you were visiting doesn't force visitors to use SSL and lets you connect over the less secure HTTP protocol. That means your browser is not connecting to it securely and therefore won't take the extra step of blocking insecure resources, since you're already on an insecure connection anyway. The browser doesn't need to block resources coming from insecure sites because your connection is already exposed, so it can freely connect wherever it needs to and load any resource, regardless of whether it can be transferred securely or not.
So when you go to the "https://" version of the site, there are no browser warnings because you're connecting to that site over a secure connection. Unfortunately, that also means that if the designer of the page linked resources from somewhere that doesn't offer SSL, or simply didn't update the links to use https://, those resources are considered insecure; since you're on a secure connection, the browser blocks those connections, which means those resources cannot load, and the page renders incomplete, missing some of its building blocks. Those are the building blocks that tell your screen to move all the text on the right into a panel, give it a blue font color, and switch to a different font face. The definitions describing the look and appearance didn't make it through, so those sections fall back to whatever stylesheet is present, which normally doesn't match what was intended to be there.

Resolving Mixed Content warning from external insecure server

I have an https site and need to show content from other sites that may or may not themselves be https. Predictably enough, I'm getting warning messages like this in the console...
"Mixed Content: The page at 'https://www.example.com/' (my server) was loaded over HTTPS, but requested an insecure image 'http://www.aninsecuredomain.com/image.jpg'. (not my server) This content should also be served over HTTPS."
(not to mention the fact that I no longer see the little padlock displayed properly in most browsers, which now consider my site's network insecure).
I've read through a bunch of posts on SO on this topic, but I can't seem to find a definitive answer on whether there's anything I can do when I don't own the external servers (so I can't guarantee they'll have an https version). I'd appreciate any thoughts on whether this is possible, and if so, how I could go about achieving it!
When you need to include content from another domain in an https webpage, you can:
Make the owner of the other domain commit to https by explaining the security reasons behind it to them
Proxy the content through your website or host it yourself, if you have the right to do so (see the sketch below)
(If you don't see the padlock anymore, it's because your page is no longer secure, since it includes insecure elements that could have been tampered with: it's not that they "consider my site's network insecure", it is indeed insecure!)
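As a rough sketch of the second option (hosting a copy yourself), you could periodically mirror the insecure asset into your own document root and then reference that local copy over https. The local path below is only an example:

# Fetch the remote image over plain http and store it under your own https site
curl -s -o /var/www/example.com/static/mirrored/image.jpg \
    http://www.aninsecuredomain.com/image.jpg

# Your page then links to https://www.example.com/static/mirrored/image.jpg
# instead of the original http:// URL, so no mixed-content warning is triggered.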
You should use the // prefix (instead of http[s]://).
On an https page, the secure version will be loaded.
On a plain http page, the plain http version will be loaded.
Edit your theme, replacing every occurrence of http://fonts.googleapis.com/... with //fonts.googleapis.com/...
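For example, a one-off search and replace over the theme files could look like this (GNU sed syntax; the theme directory name is just a placeholder, and you should back up the theme first):

# Replace hard-coded http:// Google Fonts links with protocol-relative ones
grep -rl 'http://fonts.googleapis.com' wp-content/themes/your-theme/ \
    | xargs sed -i 's|http://fonts.googleapis.com|//fonts.googleapis.com|g'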

How do I know if my page is being cached?

I have a WordPress site that is doing a few weird things, and I believe it is because it is being cached. I changed the contents of a CSS stylesheet, and the change took around 10 minutes to appear live.
I can't, however, find any caching mechanism set up. I've looked through cPanel and can't see anything configured there. The IP of the site resolves to the IP that cPanel is showing.
I've looked for plugins in WordPress and can't see any caching plugins (although if it were a caching plugin, would a stylesheet request even be cached?).
Any tips on how I can see if the page is being cached on the server or by a plugin?
Put a JavaScript beacon ("web bug") on the page that crafts a random URL and requests it, then compare the number of page requests to the number of random-URL requests. Bear in mind, though, that there are many scenarios in which a browser can cache a page even in the absence of caching information.
If your website is behind the Cloudflare network or something similar, this is normal behavior.
Try running the following command (Windows Command Prompt / Linux terminal):
ping www.yoursite.com
and visit the resolved IP address in your browser; this may tell you whether you are behind a caching network.
Take a look at this article: http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
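Another quick check is to request the page with curl and look at the response headers. The exact header names depend on what kind of cache (if any) sits in front of the site, but these are common ones:

# Headers such as Age, X-Cache or CF-Cache-Status usually reveal a cache in front of the site
curl -sI https://www.yoursite.com/ | grep -iE '^(age|x-cache|cf-cache-status|cache-control|expires):'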

Force Browser Caching Across Browser Sessions

I help maintain several WordPress-based websites that publish news and reference information.
We have been working hard to make pages at the websites load as fast as possible.
One of the things we've done is implement very long "max-age" times in the "Cache-Control" HTTP headers for most of our static files, such as images and CSS files.
The particular cache-control setting we're using is "public, max-age=31536000". 31,536,000 seconds is 365 days.
The upside is that this setting does, in fact, cause the static files to be cached as visitors browse through different pages of our sites.
But here's the rub. This cache-control setting doesn't do much for us across browser sessions. Even though the setting is supposed to tell the browser "cache this file for an entire year", if a visitor to our site shuts down their browser, then starts it up just five minutes later and comes back to our site, the browser insists on re-loading all the static files, even though it still has them in its cache.
I've checked this carefully in Firefox, viewing the headers with Live HTTP Headers. But I can also qualitatively see the same thing happening in other browsers.
Apparently, browsers insist on re-loading all content for a website if the content hasn't been loaded once during the current browser session.
So ... Is there any way we can "politely suggest" to browsers that they always load cached content from the cache, even if the browser hasn't been to our site during the current browser session?
Check the ETag, Expires, and Last-Modified headers as well.
You need an Expires header, and sometimes ETag and Last-Modified can defeat caching.
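A quick way to see what the server is actually sending for one of your static files (the asset URL below is only a placeholder):

# Inspect the freshness and validation headers the browser will rely on
curl -sI https://www.example.com/wp-content/themes/your-theme/style.css \
    | grep -iE '^(cache-control|expires|etag|last-modified):'

# If ETag or Last-Modified are present without a far-future Expires header,
# the browser may still choose to revalidate the file in a new session.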
