I am using Cloudflare as a proxy to protect my site against attacks, but when I turn it on, some CSS requests suddenly take up to 15 seconds (for files that weigh 500 B).
What could be causing this issue? The irony is that Cloudflare is supposed to decrease response times...
That could be tied to your SSL settings. You may want to review:
Error 522: Connection timed out
Or you could try loading your assets with protocol-relative URLs (e.g. href="//example.com/style.css"):
Why are my images/css/js files missing when I load my page over HTTPS?
If neither of those articles offers a solution, you can also try community.cloudflare.com or submit a ticket to support@cloudflare.com.
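If you want to see where those 15 seconds actually go, curl can break a single request down into phases (a minimal sketch; the URL is a placeholder for one of your slow CSS files):

curl -o /dev/null -s \
  -w "dns: %{time_namelookup}s tcp: %{time_connect}s tls: %{time_appconnect}s ttfb: %{time_starttransfer}s total: %{time_total}s\n" \
  https://www.yoursite.com/styles.css

A large tls value points at the SSL handshake itself, while a large gap between tls and ttfb points at the origin or the proxy in front of it.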
Related
I have moved all my static assets to cloudfront.net, and when I view my source code, my CSS, JS, and images are already hosted on cloudfront.net. But when I check GTmetrix.com, my "Use cookie-free domains" score is still graded F, and my main domain is still shown in the list instead of CloudFront.
I have already cleared my cache, the Cloudflare cache, my browser cache, and every other kind of cache, but YSlow in GTmetrix still doesn't detect that I'm using a CDN (cloudfront.net).
Has anyone here encountered the same problem?
Actual GTmetrix result:
https://gtmetrix.com/reports/www.flyskyjetair.com/SgHBKXsJ
Actual Code:
view-source:https://www.flyskyjetair.com/
If you have the strip-cookies and cache-cookies options enabled but are still receiving a warning when you run your site through YSlow, it is a YSlow false positive. If you set your cookies on the top-level domain (e.g. yourwebsite.com), all of your subdomains will also include the cookies that are set, including your custom CDN URL if you use one (e.g. cdn.yourwebsite.com).
However, as long as you have the strip-cookies option enabled, the warning is incorrect even if you receive it. YSlow does not take into account that the CDN actually strips the cookie, and may therefore continue to throw the error. But if you run a cURL command against the asset, or check it in the Chrome DevTools Network tab, you won't see any Set-Cookie headers, so this YSlow warning can be safely ignored.
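For example, a quick check from the command line (a sketch; the asset path is hypothetical):

curl -sI https://cdn.yourwebsite.com/asset.css | grep -i set-cookie

If the command prints nothing, the CDN really is stripping the cookie and the warning is a false positive.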
If you are using Cloudflare, you simply won't be able to achieve 100 on YSlow. Cloudflare appends a __cfduid cookie to every request, and it cannot be removed, for security reasons.
I just learned about and implemented the Varnish reverse proxy to speed up my website.
Everything works fine but something minor bothers me.
For some reason, when I check the page's TTFB the first time, I get 0.999 s; when I rerun the test, the number drops to 0.237 s.
I use the following website to check TTFB:
https://www.webpagetest.org
and my website is:
https://www.findfestival.com/
It makes me wonder whether the first request to the website ever hits the cache. When I use curl I can see the X-Varnish header, but it's still strange that clicking a link the first time is slower than clicking it the second time (specifically on mobile).
Can you please help me understand why the Varnish cache doesn't hit the first time?
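For reference, this is roughly how I've been checking whether a response came from the cache (as far as I understand, on a hit the X-Varnish header carries two transaction IDs and Age is greater than 0):

curl -sI https://www.findfestival.com/ | grep -iE '^(x-varnish|age):'

I run it twice in a row and compare the two outputs.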
This is my default.vcl:
Thanks,
P.S. I have seen this post and already tried the solution, with no luck:
Varnish Cache first time hit
Seeing that you have X-Mod-Pagespeed in your headers and a minimalistic VCL, the conclusion is that you need to take a look at Downstream Caching and make sure that PageSpeed does not send Cache-Control: max-age=0, no-cache, which breaks Varnish caching for the most part.
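One quick way to confirm what PageSpeed is actually sending is to inspect the response headers of your homepage (a sketch):

curl -sI https://www.findfestival.com/ | grep -i cache-control

If that prints Cache-Control: max-age=0, no-cache, Varnish's default logic will treat the response as uncacheable, which would explain the slow first hit.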
In my own experience, PageSpeed does not play well with Varnish even with a downstream-caching configuration applied.
It "loves" to send the aforementioned header no matter what. Even if you manage to turn this behaviour off, it results in PageSpeed's own assets not having proper Cache-Control headers, plus a few more interesting issues, such as triggering Varnish's "hit-for-pass" when rebeaconing has to take place, which is really bad and breaks caching further.
Also have a look at the possible configurations. You might want to put PageSpeed at the SSL-terminator level (option #1); that way you don't need a downstream-cache configuration, and PageSpeed will sit "in front of" Varnish.
I have a WordPress site that is doing a few weird things, and I believe it is because it is being cached. I changed the contents of a CSS stylesheet, and the change took around 10 minutes to appear live.
However, I can't find any caching mechanism set up. I've looked through cPanel and can't see anything configured there. The site's domain resolves to the IP that cPanel is showing.
I've looked for plugins in WordPress and can't see any caching plugins (although if it were a caching plugin, would a stylesheet request even be cached?).
Any tips on how I can see if the page is being cached on the server or by a plugin?
Put a JavaScript bug on the page that crafts a random URL and requests it, then compare the number of page requests to the number of random-URL requests. Bear in mind, though, that there are plenty of scenarios in which a browser can cache a page in the absence of caching information.
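A command-line variant of the same idea, sketched with curl (cb is just an arbitrary cache-buster, and $RANDOM assumes a bash shell):

cb=$RANDOM
curl -sI "https://www.yoursite.com/?cb=$cb" | grep -iE '^(age|x-cache|cf-cache-status):'
curl -sI "https://www.yoursite.com/?cb=$cb" | grep -iE '^(age|x-cache|cf-cache-status):'

If the second request comes back with an Age header or a HIT marker, something between you and the origin is caching the page.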
If your website is behind the Cloudflare network or something similar, this is normal behavior.
Try running the following command (Windows command prompt / Linux terminal):
ping www.yoursite.com
and visit the resolved IP address in your browser; this may tell you whether you are behind a caching network.
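The response headers will also tell you directly whether a proxy such as Cloudflare is in the path (a sketch):

curl -sI https://www.yoursite.com/ | grep -iE '^(server|cf-ray|via|x-cache):'

A Server: cloudflare or CF-Ray header means your requests are being routed through Cloudflare.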
Take a look at this article: http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
My server delivers the page via HTTPS (as does every resource on the page). The page itself comes from the main domain, domain.com, while other resources such as images and CSS come from the following subdomains:
img.domain.com and css.domain.com, respectively. The image and CSS resources have Cache-Control in their response headers, i.e. they should be cached.
The question is: why does the browser request the image and CSS resources each time I open the page?
On the other hand, if all resources come from the same domain, everything is taken from the cache. Why is that? It also looks like the behavior differs between browsers: in some I see conditional requests, while in others the requests are unconditional. Is there a standard for this case?
Is Cache-Control set to public for the resources from the alternative domains?
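You can check from the command line (a sketch; the asset path is hypothetical):

curl -sI https://css.domain.com/style.css | grep -i cache-control

Historically, some browsers would not write HTTPS responses to the disk cache unless they were explicitly marked public, which would produce exactly the re-requesting you describe.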
This is a fairly ancient (2009) article, but it might help - http://blog.httpwatch.com/2009/01/15/https-performance-tuning/
I'd also consider whether you want to serve the CSS from a separate host...
the browser is going to have to do a DNS lookup and open a new TCP connection;
if you used the same host, the DNS lookup goes away and the connection overhead may go away (IE9 and Chrome speculatively open a second connection), but you get the overhead of the HTTPS negotiation back.
I don't know the right answer without measuring but it's worth thinking about.
My site, 4603prospect.com, is really, really slow. Sometimes it's OK, sometimes it's slow. I am caching thumbnails, and I don't understand what to make of this Pingdom report: http://tools.pingdom.com/?url=4603prospect.com&treeview=0&column=objectID&order=1&type=0&save=false
Thanks
Todd
Things that might be worth checking:
Are you using shared hosting? Your server may be responding slowly because of activity from other websites on the same machine.
Are your images falling foul of an .htaccess rule that results in redirects?
How many images are you loading at once? Most browsers only make about six to eight HTTP requests per host at a time, so you could try combining some of your CSS and JS files to cut the number of requests (a minimal sketch follows this list).
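A minimal concatenation sketch, with hypothetical file names (point your HTML at the combined files afterwards):

cat a.css b.css c.css > combined.css
cat jquery.plugins.js site.js > combined.js

The same works for scripts, as long as the concatenation order preserves any dependencies between the files.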