What is the optimum number of HTTP/2 push links for mobile sites? - http2

I have LEMP-stack WordPress sites (responsive theme) and use mod_pagespeed with Cloudflare. I have enabled the mod_pagespeed filter hint_preload_subresources, and I can see that there are over 20 HTTP/2 push URLs.
13 URLs were added by me in the nginx config file; the rest were added by mod_pagespeed.
I want to know: is there an optimum number of HTTP/2 pushes in the header for mobile sites?
Link </wp-content/themes/fonts/economica/economica-bold.woff2>; rel=preload; as=font; type=font/woff2; crossorigin
Link </wp-content/themes/fonts/lora/lora-regular.woff2>; rel=preload; as=font; type=font/woff2; crossorigin
Link </wp-content/themes/fonts/economica/economica-regular.woff2>; rel=preload; as=font; type=font/woff2; crossorigin
Link </wp-content/themes/fonts/lora/lora-regularitalic.woff2>; rel=preload; as=font; type=font/woff2; crossorigin
Link </wp-content/themes/fonts/lora/lora-bold.woff2>; rel=preload; as=font; type=font/woff2; crossorigin
Link </wp-includes/js/jquery/jquery.js>; rel=preload; as=script; crossorigin
Link <https://pagead2.googlesyndication.com/>; rel=preconnect; crossorigin
Link <https://fonts.gstatic.com>; rel=preconnect; crossorigin
Link <https://cdn.onesignal.com>; rel=preconnect; crossorigin
Link <https://tpc.googlesyndication.com>; rel=preconnect; crossorigin
Link <https://www.googletagservices.com>; rel=preconnect; crossorigin
Link <https://securepubads.g.doubleclick.net>; rel=preconnect; crossorigin
Link <https://fonts.googleapis.com>; rel=preconnect; crossorigin
X-Page-Speed Powered By ngx_pagespeed
Link </dalgona-coffee-recipe/_,Mjo.HNzDPsM1gt.js.pagespeed.jm.e1WOIS1Zb_.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/jquery-updater/js/jquery-3.5.1.min.js.pagespeed.jm.A8biqtTJrt.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/jquery-updater/js/jquery-migrate-3.3.0.min.js.pagespeed.jm.KuaEtw4rAV.js>; rel=preload; as=script; nopush
Link </wp-content/themes/focus-pro/js/responsive-menu.js.pagespeed.jm.nP4yb5h4IW.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/ewww-image-optimizer/includes/lazysizes-pre.js.pagespeed.jm.rCxbQtytJ3.js>; rel=preload; as=script; nopush
Link </dalgona-coffee-recipe/_.pagespeed.jo.M8QchvC_7_.js>; rel=preload; as=script; nopush
Link </dalgona-coffee-recipe/_,Mjo.WekTyeyJeu.js.pagespeed.jm.OxNrKbQqy7.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/content-views-query-and-display-post-page/public/assets/js/cv.js.pagespeed.jm.UGPwgHHoMK.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/quicklink/quicklink.min.js.pagespeed.ce.XAGoxleb0H.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/wp-recipe-maker/dist/public-modern.js.pagespeed.ce.SOOy8iGsGt.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/simple-lightbox/themes/baseline/js/prod/client.js.pagespeed.ce.3VunenzJ26.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/simple-lightbox/themes/default/js/prod/client.js.pagespeed.ce.-9aYp48Jjk.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/simple-lightbox/template-tags/item/js/prod/tag.item.js.pagespeed.ce.HKRnB5RaiU.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/simple-lightbox/template-tags/ui/js/prod/tag.ui.js.pagespeed.ce.82hZO-SaY-.js>; rel=preload; as=script; nopush
Link </wp-content/plugins/simple-lightbox/content-handlers/image/js/prod/handler.image.js.pagespeed.ce.m7L4ntgvbt.js>; rel=preload; as=script; nopush

A very subjective question.
There are two potential problems where pushing can be the wrong thing to do:
Pushing resources when there are better things to use the network for, i.e. overly prioritising less important assets by pushing them.
Pushing when the client already has the resources being pushed, i.e. a wasted push.
Let's start with the first. Browsers are very, very good at knowing what order to request things in. Shoving everything down the pipe with no thought for that throws it all out the window. Of course a site owner can know better than a browser (which is, after all, generic and has to do its best for most sites), but doing so is very hard.
For example, I see you are pushing your fonts first (or at least they are defined first in the config). Are those the most critical things on your page? Are all of them important? Or by pushing them are you holding up the CSS, without which nothing can be painted to the screen (not even the text the fonts are used for)? I don't know your site well enough to know the answer to that, but I suspect pushing as much as possible is doing more harm than good.
Google published a paper advising sites to "push just enough to fill idle network time, and no more." The problem is that is a very hard thing to get right.
More critical requests can also come in - for example, the user cancels the current page and navigates to a new one. If you're still pushing everything, they have to wait for all of that to arrive before they get what they asked for. Even if the server could reprioritise (and most servers are bad at reprioritising!), the payloads may have long since left the server and be clogged up in the network, so reprioritising is of only limited use in this case.
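To make "just enough" concrete, here is a minimal nginx sketch that pushes only one render-blocking asset and leaves everything else to the browser's own scheduling (the CSS path is illustrative, not taken from your headers; certificates omitted for brevity):

    server {
        listen 443 ssl http2;
        # ssl_certificate / ssl_certificate_key omitted for brevity

        location / {
            # Push only the one asset that blocks first paint; let the
            # browser request everything else in its own preferred order.
            http2_push /wp-content/themes/focus-pro/css/style.css;
        }
    }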
The second issue with blindly pushing is that it doesn't take into account what's in the browser's cache. Back to the fonts you are pushing (not to pick on them - it's just an example, and what follows applies to the other pushed resources too): maybe pushing them is the right thing to do so the first page loads nice and fast. But when the visitor goes to a second page on the site, you push the fonts again - even though the browser already has them in its cache! You are now wasting bandwidth, which is more expensive for the user in terms of wasted cost (maybe not on broadband, but few people have an unlimited plan on mobile!), and the new content is queued behind this pointless push. It's possible to avoid this with a cookie-based approach, but that requires extra setup. Some servers (like Apache) track which resources have already been pushed on a connection to avoid this, but that only works if the same connection is reused, not if you come back later that day.
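For illustration, the cookie-based approach can be sketched in nginx roughly like this (cookie name, lifetime and the chosen asset are placeholders): send the preload header, and therefore the push, only when the cookie is absent.

    map $http_cookie $push_font {
        "~*h2pushed=1" "";   # cookie present: no header, so nothing is pushed
        default        "</wp-content/themes/fonts/lora/lora-regular.woff2>; rel=preload; as=font; type=font/woff2; crossorigin";
    }

    server {
        listen 443 ssl http2;
        http2_push_preload on;   # convert preload Link headers into pushes

        location / {
            add_header Set-Cookie "h2pushed=1; Path=/; Max-Age=86400";
            add_header Link $push_font;   # an empty value omits the header
        }
    }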
In short, the optimum amount to push with HTTP/2 is often a lot less than you think, and some would argue "push nothing" is the right answer. Google have repeatedly voiced the opinion that HTTP/2 push has not proven to be a net positive, and have stated (including very recently) that they could drop support for it from Chrome.
Resource hints, and taking that further with the (as yet very poorly supported) 103 Early Hints status code, are seen as far less risky - since they are only hints, the browser remains in control - and they give almost the same performance benefits.
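For comparison, here is the hint-only version of one of the font pushes above. The nopush attribute (which mod_pagespeed already adds to its own hints, as seen in your headers) tells push-aware servers and CDNs to pass the header through as a plain hint the browser is free to act on or ignore:

    add_header Link "</wp-content/themes/fonts/lora/lora-regular.woff2>; rel=preload; as=font; type=font/woff2; crossorigin; nopush";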

Related

Facebook won't detect my image

Hey guys, I am having a problem that I see others having online, but I have yet to find a fix for my website.
Facebook stopped detecting my images as of a couple of hours ago. I tried disabling hotlink protection in Cloudflare, the Facebook debugger and scraping tool, etc., with no solution.
For example, if you try to share this page, no picture appears: http://www.yardhype.com/12-year-old-jamaican-girl-wins-master-chef-junior-competition/
ERROR message:
"Provided og:image URL, ........testsite/wblob/5433A6CF57A599/2803/3DF62/w3l6_ERf2C3ADvxqhgccQg/jasmin.png could not be downloaded because it exceeded the maximum allowed sized of 8Mb."
The image is not over 8 MB.
Also, the correct URL is shown in the raw tag area.
When I opened that link (http://www.yardhype.com/12-year-old-jamaican-girl-wins-master-chef-junior-competition/), I didn't see the image either. I opened up Chrome's debugger and saw this:
A Parser-blocking, cross site (i.e. different eTLD+1) script, http://ajax.cloudflare.com/cdn-cgi/nexp/dok3v=85b614c0f6/cloudflare.min.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message. See https://www.chromestatus.com/feature/5718547946799104 for more details.
So the way that page is displaying the image is likely not up to standards.

How to disable Google Web Light for a website

When I open my website on a slow internet connection, Google serves it through its Google Web Light feature so that it loads faster. But the website is then not properly aligned and looks ugly. I want to disable Google Web Light for the website.
I am using "no-transform" to disable it:
<meta http-equiv="Cache-control" content="no-transform" />
I don't know whether this is the right approach or not, and how do I check whether Google Web Light is disabled? Google detects the connection automatically, so sometimes it loads the normal user interface and sometimes the ugly (Google Web Light) one.
Give me some ideas for this... thanks.
I tested the website with https://www.google.com/webmasters/tools/transcoder?pli=1#url=http%3A%2F%2Fwww.winni.in%2Fbangalore
I have seen the link "Can I disable Google Web Light for my website?" and also put <meta http-equiv="Cache-control" content="no-transform"> in the header, but it is not working.
Google Web Light won't appear for a light and fast website, so the main key to permanently disabling the Google Web Light service is simply to make your website lighter and faster to load. There are many tricks to make your site lighter and faster than before: compress your pages using gzip, avoid graphics, Flash, or other media that make the page slow to load, use a CDN, and many other tricks to speed up your website. You can see the proof on my own site: Google Web Light never appears there since I applied all these tricks. Here is a full reference on how to do this:
How to Disable Permanently Google Weblight Service from Your Website
Maybe it's useful.
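For what it's worth, the opt-out Google documented is the Cache-Control: no-transform HTTP response header rather than the <meta> tag used in the question. A minimal nginx sketch combining that with the gzip suggestion above (directives shown without the surrounding config):

    server {
        # The documented opt-out is the HTTP header, not a <meta> tag.
        add_header Cache-Control "no-transform";

        # Basic gzip compression, one of the speed-ups suggested above.
        gzip on;
        gzip_types text/css application/javascript application/json image/svg+xml;
        gzip_min_length 1024;
    }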

AdSense on history.pushState enabled page

First off, I know this has been discussed over and over again. But let's take this as a "late 2012 edition", since things tend to change rapidly on the internet.
I have a "classical" web page with full page refreshes: every internal click produces new content. We can show AdSense ads this way without a problem.
Now I have started looking into "ajaxifying" (PJAX) the whole page for performance reasons (I've actually made a prototype version, and it works superbly). The whole thing works only in browsers that support history.pushState; whenever a user clicks on an internal link, an AJAX request is triggered that fetches only the content part of the page (everything between the header and footer) and replaces the old content with it.
The end result is that the user is presented with a brand-new page (including the changed URL and what not); only the mechanism for delivering the page has changed (full reload vs. AJAX). As far as Google (and older browsers) are concerned, this is still a regular page with regular links (progressive enhancement and all that). A rough sketch of the pattern follows.
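For context, a minimal sketch of the PJAX pattern described, written with today's APIs; the data-pjax attribute, the X-PJAX header and the #content id are placeholders, not from the actual site:

    // Intercept internal link clicks and swap only the content area.
    document.addEventListener('click', function (e) {
      var link = e.target.closest('a[data-pjax]');
      // Older browsers fall through to a normal full-page navigation.
      if (!link || !window.history || !window.history.pushState) return;
      e.preventDefault();

      fetch(link.href, { headers: { 'X-PJAX': 'true' } })
        .then(function (res) { return res.text(); })
        .then(function (html) {
          // Server returns only the fragment between header and footer.
          document.getElementById('content').innerHTML = html;
          history.pushState(null, '', link.href);   // update the URL
        });
    });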
And yet there isn't a way to display AdSense, what with the document.write calls and AdSense's TOS ruining the party.
My question: is there a Google-approved way (I'm not interested in hacks that will get us banned) to display AdSense ads on a page like this? I haven't found one. Or, if there isn't, does Google have any plans to support this in the future? (Again, I haven't found anything related to this.)
Update
After some more digging around, I came across Google DFP, which seems to support async loading of ads. But I'm not sure I can load AdSense ads through it dynamically without breaking the TOS. I'm 100% sure I can load other ads this way, but not AdSense. Could somebody clear this up for me?
According to this page, when loading AdSense ads through DFP you are subject to both the DFP and AdSense terms. So I guess that if you are following the current AdSense terms, you are not allowed to do what you are talking about... at the same time, Google provides a rather easy method to do exactly what you want to do with DFP...
It's still a grey area...
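For reference, DFP's asynchronous tags (Google Publisher Tag) look roughly like this. It's only a sketch: the ad unit path '/1234/content-slot' and the 'ad-div' element are placeholders, and whether refreshing AdSense-backed slots on pushState navigations complies with the AdSense terms is exactly the grey area described above.

    // Command-queue stub so calls work before gpt.js finishes loading.
    window.googletag = window.googletag || { cmd: [] };

    var contentSlot;
    googletag.cmd.push(function () {
      contentSlot = googletag
        .defineSlot('/1234/content-slot', [300, 250], 'ad-div')
        .addService(googletag.pubads());
      googletag.enableServices();
      googletag.display('ad-div');   // initial render
    });

    // After each pushState navigation, ask GPT for a fresh ad in the slot.
    function refreshAdsAfterNavigation() {
      googletag.cmd.push(function () {
        googletag.pubads().refresh([contentSlot]);
      });
    }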

Dynamic (Ajax) share buttons (Facebook, Google+ and Twitter) to share an image, with a link and a description

It's been a few days that I've been looking for a solution, and I can't figure out why it's still not working.
Here is my goal:
I have a website with a slideshow. The images are changed dynamically (with previous and next buttons). I just want to share an image on social networks (Facebook, Google+ and Twitter) and actually see the image on my wall with a little description and a link to a page.
To be precise:
The image is a thumbnail (so, not the same URL) of the main image, and the link I want to publish is neither the page I'm on (whose URL is static due to Ajax) nor the image's own.
My tries:
I have almost got it working on Facebook, but the image loading failed, and that was with a Share button (which seems to be deprecated in favour of Like). For Google+, the +1 button becomes red after I click it... I tried XFBML and OpenGraph, but the problem is with Ajax (the URL is the one of the page, or is not changed even with createElement("
Questions:
1. Is there any packaged solution (like AddThis, but working the way I want)?
2. Or do you have one (or a clue) for me, please?
3. Am I the only one to think that the official Facebook and Google+ Ajax documentation is lame?
Thanks a lot.
Hugo
PS: if I could have a fly-out to edit a comment with the content I'm about to share, it would be fantastic!
One way to accomplish this is with cloaking.
Set up a page which provides the image, title, and description to the social application (i.e. Facebook, Google+). You can then use JavaScript to redirect the user to the page you actually want them to see. For users without JavaScript, the page should display a link to the target page with a "Click here if you are not automatically redirected" message. The image should exist on the page, but you can place it in a div with style="display: none;" so the user doesn't actually see it.
A more advanced technique is to use the IP address and browser name (user agent) to determine whether the visitor is a user or a social-network robot, and to use a 302 temporary redirect to send real users to the page you want them to see. The social robots would be shown a page with the image, title and description.
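A sketch of that user-agent check in nginx - the crawler substrings are the real Facebook and Twitter crawler names, but the paths and the list itself are illustrative:

    map $http_user_agent $is_social_bot {
        default                  0;
        "~*facebookexternalhit"  1;
        "~*Twitterbot"           1;
    }

    server {
        location /share/ {
            # Real visitors get a 302 to the page they should actually see;
            # social crawlers fall through to the static page with og: tags.
            if ($is_social_bot = 0) {
                return 302 /slideshow/;
            }
        }
    }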
The social sharing buttons you're using all have one thing in common: they work best when there's a URL representation of the object you're sharing. Some of them, namely Facebook's Like button and Google's +1 button, use the contents of that page to create the snippet that's shared.
This isn't a new problem, though; it's the same problem faced with search indexing of AJAX applications. Sadly there's no easy solution. Here are a couple of challenging ones:
Programmatic Solution
You can improve your back end so that it's capable of rendering a page for each shareable step in the state of your slideshow: one page per image. As you step through the slideshow, you can destroy and re-create the social sharing plugins each time, targeting them at this machine-accessible version of the current image (a sketch follows).
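A sketch of the destroy-and-re-create step, assuming the Facebook JavaScript SDK is already loaded; the container id and the per-image URL are placeholders:

    function retargetLikeButton(imagePageUrl) {
      var container = document.getElementById('share-buttons');

      // Replace the old plugin markup with one pointing at the crawlable
      // per-image page.
      container.innerHTML =
        '<div class="fb-like" data-href="' + imagePageUrl + '"></div>';

      // Ask the SDK to scan the container and render the new button.
      if (window.FB && FB.XFBML) {
        FB.XFBML.parse(container);
      }
    }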
Snapshot Solution
You can use a crawler tool that is capable of executing JavaScript to make snapshots of the different states of your application, and then target the social sharing buttons at the snapshot of the current state.
This might require less back-end work, but it may be challenging to keep up to date.

Facebook Like Button Comments Not Working - https issue

I'm using FBML, and everything was working before. Then I noticed that when you click Like, the comment box no longer shows up. I checked out the dev site and saw that the way it's done has now changed.
I updated my code, but it still doesn't show the comment box; the actual liking action works fine.
Here is a link: http://fez.nu/Oniir
EDIT 2: I'm using XFBML; I don't know if that makes a difference. I've heard that FBML is deprecated.
EDIT 3: I browse Facebook with HTTPS. I turned it off and the comment boxes show up on my site. So that problem is solved, but how do I make it work for users who browse Facebook securely when my site is not served over HTTPS?
You can use an HTML inline frame.
<iframe src="https://mytab.example.com/tabs/"></iframe>
This question has produced an accepted solution to what appears to be your problem.
Abstract
You can avoid SSL warnings for domains that support SSL by not being specific about the transport protocol: e.g. instead of including http:// or https://, use //.
<!-- Instead of this: -->
<iframe src="http://www.facebook.com/plugins/like.php?params"></iframe>
<!-- Do this: -->
<iframe src="//www.facebook.com/plugins/like.php?params"></iframe>
Security considerations
Please note that there are some security considerations with this approach. I recommend you read the articles and questions below.
Are there security issues with embedding an https iframe on an http page (SE question)
HTTP and HTTPS iframe (SO question)
Other considerations when serving mixed (http/https) content (external site)
Edit 1
In the FAQ for the Like button, it's stated that the button needs 400 pixels in width to give the user the option to add a comment. If you don't have 400 pixels available, I think you will have to either sacrifice the option to post a comment or use a popup window instead.
This problem started last night.
I thought it was my code at first, but a quick rollback to code I KNOW worked (it was QA tested) manifested the same issue.
I went to the source itself, Facebook's OWN Like button generation tool, and THEY have the same problem.
Then I picked BOTH a random CNN and a random Yahoo news article, and that is what you call a trifecta: all 3 sites could not properly render the comments UI elements for the Like button.
If it walks, quacks and moves its head like a duck... WTF? :))
