Supporting A/B tests without hurting CLS metric - core-web-vitals

We are a 3rd-party vendor that adds components / UI elements to our clients' websites. We sometimes hide/change the size of this container in run-time, based on contextual parameters or as part of A/B testing.
It is impossible for the website owner to know the final size of the element before we have all the contextual data, so the height cannot be set on the server-side.
To minimize the effect on CLS, the website owner can set an initial height for the container, but this has two issues:
It does not completely eliminate CLS, only reduces it slightly
It creates a bad UX where the page loads up with a white space which then disappears / changes height
What is the recommended approach for eliminating the CLS impact of such an element?

We sometimes hide/change the size of this container in run-time, based on contextual parameters or as part of A/B testing.
Any time you change the size of content at runtime, you risk shifting other content on the page around, and that can negatively affect the user experience. Before you spend a lot of time trying to "fix" CLS for your use case, you might want to consider whether your use case is the right experience for users.
If you cannot change your system and just want to minimize its impact on CLS, here are some options:
Collapse the area only after user input (perhaps ask users to close the container, or wait for some other expected reason for page re-layout).
Only collapse if all affected content is outside of the viewport. It sounds like you already do this for below-the-fold content? For above-the-fold content, you may be able to simultaneously remove the content and adjust the scroll position by the exact same amount (see the sketch after this list).
And, perhaps some broader alternatives are:
Don't collapse the area, but replace it with some default content that cannot fail.
Likely not an option, but maybe there are ways to delay showing content until you know whether the conditional content will be needed? This depends on how late your content loads, and it will have negative trade-offs with loading performance if you cannot answer that question quickly...
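As a rough illustration of the scroll-compensation idea above, a minimal sketch might look like this (the function name and display logic are placeholders, not a drop-in solution):

```js
// Collapse a container that sits entirely above the current viewport and
// compensate the scroll position by the same amount in the same step,
// so the content the user is currently looking at does not move.
function collapseAboveViewport(container) {
  const rect = container.getBoundingClientRect();
  if (rect.bottom <= 0) {                  // fully above the visible area
    const removedHeight = container.offsetHeight;
    container.style.display = 'none';
    window.scrollBy(0, -removedHeight);    // keep on-screen content in place
  }
}
```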

Related

Lighthouse Field data is not updated for a very long time, what should I do?

Due to a poor UX design decision, I got a very low CLS score. However, I fixed the mistake over a month ago.
But the field data still has not been updated.
What should I do to force update the Field data?
What should I do to force update the Field data?
You can't, I'm afraid.
However if you want to see your Cumulative Layout Shift data in real time (to catch any problems / confirm fixes early) you can use the "web vitals library" - please see the final level 3 heading ("Tracking using JavaScript for real time data") at the end of this answer.
What is actually going on here?
Field data is calculated on a 28-day rolling basis, so if you made the change over a month ago and the score has not improved, the problem still persists in the field.
Just because the lab tests yield a cumulative layout shift of 0 does not mean that this is the case in the field.
In field data the Cumulative Layout Shift (CLS) is calculated (and accumulates) until the page reaches unload. (See this answer from Addy Osmani, who works at Google on Lighthouse, the engine behind Page Speed Insights).
Because of this, you could have issues further down the page, or issues that only occur after some time, which cause layout shifts that are not picked up by automated tests.
This means that if layout shifts occur once you scroll the page (due to lazy loading not working effectively, for example), they will start affecting the CLS field data.
Also bear in mind that field data is collected across all devices.
Probable Causes
Here are a couple of probable causes:
Screen sizes
Just because the site doesn't show a CLS on the mobile and desktop sizes that Page Speed Insights uses does not mean that CLS does not occur at different sizes. It could be that tablets or certain mobile screen widths cause an item to "jump around" the screen.
JavaScript layout engines
Another possible cause is using JavaScript for layout. Looking at your "Time to Interactive" and "Total Blocking Time", I would guess your site is using JavaScript for layout / templating (as they are both high, indicating a large JavaScript payload).
Bear in mind that if your end users are on slower machines (both desktop and mobile) then a huge JavaScript payload may also be having a severe effect on layout shifts as the page is painted.
Fonts
Font swapping causes a lot of CLS issues: as a new font is swapped in, it can cause word wrapping to change and therefore change the height (and width, if the width is not fixed / fluid) of containers.
If for some reason your font is slow to load, or is very late in the load order, this could be causing large CLS.
Yet again, this is likely to happen on slower connections such as 4G, where network latency can cause issues. Automated tests may not pick this up as they throttle loading based on a simulation (via an algorithm), rather than applying throttling (actual latency and throughput slowdown) to each request.
Additionally, if you are using font icons such as Font Awesome, then this is a highly probable cause of CLS. If this is the cause, use inline SVGs instead.
Identifying the issue
Here is a question (and answer, as nobody else answered me) I created on how to identify problems with CLS. The answer I gave to my own question was the most effective way I could find to identify the problem, but I am still hopeful someone will improve upon it as more people get used to correcting CLS issues. The same principle works for finding font word-wrapping issues.
If the issue is JavaScript-related, as I suspect, then changing the CPU slowdown in developer tools will allow you to spot this.
Go to Developer Tools -> Performance -> Click the "gear" icon if needed on the top right -> "CPU". Set it to 6x slowdown.
Then go to the "rendering" tab and switch on "Paint Flashing", "Layout Shift Regions" and "Layer borders". You may have to enable the "rendering" tab using the 3 vertical dots drop down to the left of the lower panel menu bar.
Now reload your page and look for any issues as you start navigating the page. Keep a close eye out for blue flashes, as they highlight items that were shifted. I have found that once I spot a potential shift, it is useful to toggle the first two options on and off individually and repeat the action, as sometimes layout shifts are not as noticeable as repaints, and both together can be confusing.
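If you prefer to see shifts in the console rather than as flashes, a small sketch along these lines (an assumption about how you might instrument the page, not part of the original workflow) logs each layout-shift entry as you interact with it:

```js
// Log every layout shift as it happens so you can reproduce the problem
// manually (scrolling, resizing, waiting for late content) and inspect
// which elements moved via entry.sources. Chromium-based browsers only.
let runningTotal = 0;

new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Shifts that follow recent user input are excluded from CLS.
    if (!entry.hadRecentInput) {
      runningTotal += entry.value;
      console.log('Layout shift:', entry.value.toFixed(4),
                  'running total:', runningTotal.toFixed(4),
                  entry.sources);
    }
  }
}).observe({ type: 'layout-shift', buffered: true });
```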
So do I have to wait 28 days to see if I have fixed the problem?
No. If you watch your CLS score for about 7 days after a fix, you will see a slow and steady improvement as people "in the red" disappear from the rolling 28-day average.
If your percentage in the red drops from 22% to below 18% after 7 days then the odds are you have fixed the issue (you would also see a similar drop for people "in the orange").
The actual CLS number (0.19 in your screenshot) may not change until after 28 days so ignore that unless it jumps upwards.
Tracking using JavaScript for real time data
You may want to check out The web vitals library and implement your own tracking of CLS (and other key metrics), this way you can have real-time user data instead of waiting for the Field Data to update.
I have only just started playing with this myself, but so far it seems pretty straightforward. I am currently trying to implement my own endpoint for the data, rather than Google Analytics, so I can have real-time data under my control. If I get that sorted before the bounty runs out, I will update the answer accordingly.
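As a rough sketch of that kind of self-hosted tracking (newer versions of the web-vitals library expose onCLS / onLCP, older ones use getCLS / getLCP, and the '/vitals' endpoint here is hypothetical, to be replaced by your own collector):

```js
import { onCLS, onLCP } from 'web-vitals';

function report(metric) {
  const body = JSON.stringify({
    name: metric.name,   // e.g. "CLS"
    value: metric.value, // the current metric value
    id: metric.id,       // unique id for this page load
  });
  // sendBeacon survives the page being unloaded, which matters because
  // CLS keeps accumulating until very late in the page lifecycle.
  if (!(navigator.sendBeacon && navigator.sendBeacon('/vitals', body))) {
    fetch('/vitals', { method: 'POST', body, keepalive: true });
  }
}

onCLS(report);
onLCP(report);
```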
What should I do to force update the Field data?
I am not sure if YOU can do anything to change this data, as it is collected based on the Chrome User Experience Report, as mentioned here:
The Chrome User Experience Report is powered by real user measurement of key user experience metrics across the public web, aggregated from users who have opted-in to syncing their browsing history, have not set up a Sync passphrase, and have usage statistic reporting enabled.
About your question as to why it is not being updated, and why your lab data shows 0 CLS while the field data does not: again, it depends on a variety of factors. Lab data is basically the report run in a controlled environment (mostly your machine), while field data is the result of aggregated data from a variety of users with a variety of networks and devices, and most likely they will not be the same as yours unless your target audience uses a similar network and device to the one on which the lab report was run.
You can find a few similar threads by searching the webmasters forum.

Web Dev: Strategy to maximise user website display speed

There are many specific 'display speed' amelioration techniques discussed on Stack Overflow; however, you need to be aware that a particular option exists before you can search for how to do it.
Therefore, a guideline strategy would be useful, particularly for those new to website development.
Because this concept covers numerous individual areas, the answer may prove best as a collaborative one.
My website involves the usual suspects: js, css, google-fonts, images, and I have yet to embed 3 youtube videos.
As such, it likely represents a standard web site, encompassing the needs of many users.
I propose to commence an answer that can subsequently be added to, so that a general strategy can be determined.
Web Dev: Strategy to maximise user website display speed
(work in progress)
Step 1. Compression & Caching
Enable Gzip, or deflate compression, and set cache parameters.
(Typically achieved in .htaccess, unless you have control over your server.)
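If you do control the server, here is a minimal sketch of the same idea for a Node/Express setup rather than .htaccess (the port, paths and cache lifetime are illustrative assumptions):

```js
const express = require('express');
const compression = require('compression');

const app = express();

// Gzip/deflate responses on the fly.
app.use(compression());

// Serve static assets with long-lived cache headers.
app.use(express.static('public', { maxAge: '30d' }));

app.listen(8080);
```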
Step 2. Fonts
Best option is to stick with 'safe fonts'.
(Tested 'linked Google Roboto' in Dev-Tools - 2.8 seconds to download!… ditched it for zero delay.)
Step 3. Images
A single image might be larger than all other resources, therefore images must be the first in line for optimisation.
First crop the image to the absolute minimum required content.
Experiment with compression formats - photos (jpg), block colours (gif), pixels (png).
Then re-size the image (scale it) to the maximum size at which it will be displayed on the site (controlled using css).
A reduced maximum display size allows greater jpg compression.
(if a photo site - provide a link to full size, to allow user discretion.)
Consider delayed image loading, e.g. LightLazyLoader, where images are only loaded when the user scrolls to view more images.
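As a minimal sketch of that idea without a library, assuming images are written as <img data-src="..."> so the browser does not fetch them up front:

```js
// Lazy-load images with IntersectionObserver: only start the real download
// once the image scrolls near the viewport.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // start the real download
      obs.unobserve(img);        // each image only needs loading once
    }
  });
});

document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```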
Step 4. Arrange load sequence to provide initial user display
This requires the sequential positioning of linked resources.
Eg. If using responsiveslides, the script must be read before the images begin to display, otherwise many images will be displayed untidily until the script has been read.
Ie. Breaking the ground rule and loading scripts early can present a working display, whilst other 'un-required elements' continue to load.
Step 5. Combine CSS and JS resources according to load sequence
Simply open a css resource in Notepad++ and copy it into, and above, your 'my.css' file… and repeat for JS.
Step 6. Minify the combined resources
Various programs exist to achieve this… Google tools offer this online… upload the file and download the minified version with your comments and white space stripped out.
(Remember to 'version name' the files, so that you keep your comments.)
Notes:
Keep testing the changes that you make (say) in google-dev-tools Ctrl+shift+I, Tab: Network.
This graphically shows element load times, and the overall timeline until fully loaded.
Where multiple elements begin loading at the same time… this is ideal.
Typically, another group will begin loading when the previous lot have finished.
It is preferable that all elements are downloaded simultaneously.
We are advised that this can be achieved by storing the different elements in different locations, to facilitate multi-threaded downloads (where elements cannot be combined).
Comment
Using 2Mb/s throttling, and a clean load, my 1st image display is now ready at 2 seconds (down from 15 seconds).
The 2s delay is due to other threads being occupied.
My objective is to commence the '1st image' load earlier, whilst ensuring that in less than 500ms the 2nd image is ready to be displayed.
This is a work in progress.

What's causing this excessive "Composite Layers", "Recalculate Style" and "Update Layer Tree" cycle?

I am very intrigued by the excessive number of "composite layers", "recalculate style" and then "update layer tree" events in one of our webapps. I'm wondering what's causing them here.
If you point your Chrome to one of our fast-moving streams, say https://choir.io/player/beachmonks/github, and turn on your "FPS meter", you can see that the app can achieve about 60fps most of the time when we are at the top.
However, as soon as I scroll down a few messages and leave the screen as it is, the FPS rate drops dramatically to around 10 or even lower. What the code does here is render each incoming message, prepend it to the top, and scroll the list up N px (the height of the new message) to keep the viewport position intact.
(I understand that scrollTop will invalidate the screen, but I have carefully ordered the operations to avoid layout thrashing. I am also aware of the synchronous repaint that happens every second; it's caused by jquery.sparkline, but it's not relevant to this discussion.)
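For context, a stripped-down sketch of that prepend-and-compensate pattern might look like this (element names are illustrative, not the app's actual code):

```js
function prependKeepingViewport(list, messageEl) {
  // Insert the new message at the top of the scrollable list.
  list.prepend(messageEl);
  // One read (the inserted height) immediately followed by one write
  // (scrollTop), so reads and writes are not interleaved across messages.
  const added = messageEl.offsetHeight;
  list.scrollTop += added;
}
```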
Here is what I see when I tried to profile it.
What do you think might be causing the large number of layer operations?
The CSS property will-change: transform on all elements needing a repaint solved the problem with too many Composite Layers for me.
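If you want to try that fix from script rather than a stylesheet, a small sketch could look like this (the selector is an assumption about which elements repaint frequently):

```js
// Promote frequently repainted elements to their own compositor layer.
// Use sparingly: too many layers costs memory and can make things worse.
document.querySelectorAll('.message').forEach((el) => {
  el.style.willChange = 'transform';
});
```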
I had the same issue. I fixed it by reducing the size of the images.
There were some thumbnails in the scrollable list. The size of each thumbnail was 3000x1800, but they were resized by CSS to 62x44. Using 62x44 images reduced the time consumed by "Composite Layers".
Some info about Composite Layers that may help.
From what I see here, it says:
Event: Composite Layers
Description: Chrome's rendering engine composited image layers.
For the meaning of the word "composite", from Wikipedia:
Compositing is the combining of visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene.
So this is the process of producing the page we actually see, by taking the output of decoding/resizing images, parsing the HTML and parsing the CSS, and combining them into the final page.

Coordinating graphic elements with streaming media

If you were watching the State of the Union address (http://www.whitehouse.gov/state-of-the-union-2013), you would have seen graphic supplements that appeared alongside the video stream of the President and served to illustrate his key points.
The video on the site is a composite of this, but during the live streaming these were handled separately.
My question is: what is the best approach for doing this? Especially if one wanted very tight control of the appearance of the graphics (i.e. right when the point is made, not before and not long after).
I'm wondering if any tools exist to facilitate this? I've been scouring Google, but I don't think I have the correct technical vocabulary for what I'm describing, because I'm coming up blank.
I imagine AJAX would be a good starting point, but I'm not sure how to achieve the level of control that they had, or how to handle the back end of things.
For anyone who might encounter this challenge we devised two ways to solve it:
The first is a bit Mickey Mouse: it requires that you know how many images, etc. you want to use beforehand (which in most cases you would). We wrote a script that repeatedly requests an image and inserts it into the page, and on finding an image, requests the next image in the chain.
Ie. Display default image -> request image 1
then, displaying image 1 -> request image 2
etc
From your end, you can simply drop the images into a folder on your server when you are ready for them to go in. An advantage of this is that the images can be interactive, with links to other content, etc.
The big disadvantage, of course, is a lot of unnecessary requests to your page. In our case we anticipated enough traffic that it didn't seem wise. Also, there are plenty of opportunities for mistakes, and depending on how frequently your timer fires, there are likely to be timing discrepancies.
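As an illustrative sketch of that polling approach (the image naming scheme, element id and poll interval are assumptions, not the original script):

```js
// Probe for the next numbered graphic; once it exists on the server,
// display it and start waiting for the one after it.
let next = 1;

function pollForNextImage() {
  const probe = new Image();
  probe.onload = () => {
    document.getElementById('graphic').src = probe.src;
    next += 1;
  };
  // A load error simply means the file is not there yet; keep polling.
  probe.src = '/graphics/slide-' + next + '.png?ts=' + Date.now();
}

setInterval(pollForNextImage, 5000); // poll every 5 seconds
```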
The second costs money: we found the program Ustream (http://www.ustream.tv/producer), which allows us all the image control we require in terms of timing, with the advantage of providing support for media clips etc. It also allows you to record everything streamed.
The disadvantage is that what the user sees is an integrated video on your site, so you have to handle links to related content and provide images (if you want your users to have access to them) separately.
Hope this comes in handy for someone.
I would still welcome any suggestions on how to make the first method more effective.

Increase page loading speed

In my app, I have a panorama page which contains around 10 panorama items. Each panorama item has some path drawings, a list picker and a few input fields. The problem I'm facing is that whenever I navigate to this page, the navigation is very slow due to the amount of content to initialize. If I comment out InitializeComponent();, the loading becomes fast. I thought of adding the XAML content in code, but the problem is that I have to access the input fields by their name in code, so that didn't work. Any idea how I can speed up navigation to this page? Thanks.
From the UI Guide:
Use either a single color background or an image that spans the entire panorama. If you decide to use an image, any UI image type that is supported by Silverlight is acceptable, but JPEGs are recommended, as they generally have smaller file sizes than other formats.
You can use multiple images as a background, but you should note that only one image should be displayed at any given time.
Background images should be between 480 x 800 pixels and 1024 x 800 pixels (width x height) to ensure good performance, minimal load time, and no scaling.
Consider hiding panorama sections until they have content to display.
Also, 10 PanoramaItems seems like a lot, since the recommended maximum is 4. You should either cut down on the number, or hide the content until it's required. Have a read of the best practice guide for Panoramas on MSDN.
I think you could improve page performance by creating user controls for the specific panorama items, adding an empty panorama control to your page (with only the headers), and, as picypg suggests, loading these user controls when they are needed.
Another way could be to load the first page and show it to the user straight away. In the background you could start loading the other panorama items.
My suggested approach would be the first one, using the lazy-loading principle.
I would assume that your delays are due to the number of items on the page. This will lead to a very large object graph which will take a long time to create. I'd also expect it's using lots of memory, and you have a very high fill rate which is slowing down the GPU.
Having input items/fields on PanoItems can cause UX issues if you're not careful.
That many panoItems could also cause potential navigation issues for the user.

Resources