I have created a page with various items on it, and people vote on them by clicking "recommend" (like they have on levi.store.com). The items are sorted by the number of "recommends" they receive. The problem is that there are 100 of these items, and displaying them all becomes far too slow. Is there a way to do this more effectively? Here is some pseudo-code of what I have (I am using WordPress):
<?php
$theCategory = 'the-item-category'; // every item is a post placed in this category
$items = new WP_Query('category_name=' . $theCategory); // get all the items in that category
while ($items->have_posts()) : $items->the_post(); ?>
    <h1><?php the_title(); ?></h1>
    <iframe src="http://www.facebook.com/plugins/like.php?href=<?php echo urlencode(get_permalink()); ?>&amp;layout=button_count&amp;show_faces=false&amp;width=450&amp;action=recommend&amp;colorscheme=light&amp;height=21" scrolling="no" frameborder="0" style="border:none; overflow:hidden; width:140px; height:21px;" allowTransparency="true"></iframe>
<?php endwhile; ?>
I would recommend using the XFBML version of the Like button. You can then render the buttons on demand, like TechCrunch does on story mouse-over, or start loading them after page load (i.e. on DOM ready). Turn off automatic XFBML parsing in your Facebook init and then call FB.XFBML.parse() on each button's DOM element to render it.
Using iframes directly means you are trying to load 100 web pages during page load. That's a lot, especially since browsers will only open a maximum of about 8 connections per domain, and some open fewer. So with 100 Like buttons, it will take over a dozen "rounds" of requests to load everything.
I'm assuming you are fetching and caching the number of Likes each story has on the server.
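A minimal sketch of that on-demand approach, assuming the Facebook JS SDK has been initialized with automatic parsing turned off and each button placeholder carries the class fb-like (the function name and class selector are illustrative, not from the original page):

```javascript
// Render Like buttons only after the DOM is ready, instead of letting
// 100 iframes compete with the initial page load.
// Assumes the Facebook SDK object (global FB) is passed in as `fb`.
function renderLikeButtons(fb, placeholders) {
  placeholders.forEach(function (el) {
    fb.XFBML.parse(el); // render one button at a time
  });
}

// In the page:
// document.addEventListener('DOMContentLoaded', function () {
//   renderLikeButtons(FB, Array.prototype.slice.call(
//     document.querySelectorAll('.fb-like')));
// });
```

You could go further and only call renderLikeButtons for placeholders near the viewport, or on mouse-over as TechCrunch did.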
I am trying to improve the overall loading performance (especially of images) of this website, https://muckenthaler.de/, using the Google Chrome Lighthouse extension.
When the performance test finishes I get a list of opportunities; please see this screenshot or test for yourself: https://capture.dropbox.com/YV5ii1vrj0xpfWwK
Under "Serve images in next-gen formats" some image URLs are listed (like this one: https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg) that don't even appear on the specific page, yet somehow get loaded into it and hurt performance.
How could I prevent this and why are these image resources loaded?
Here are the WebPageTest results for this page: https://www.webpagetest.org/result/220529_AiDcZP_759/1/details/#waterfall_view_step1
All you need to know from this thumbnail is that yellow rows indicate HTTP redirects and purple bars represent images. The longer the bar, the longer it took for the resource to load.
So we can tell a few things from this waterfall image:
there are many redirects
there are many images
many images take a very long time to load, relative to other resources
When I look up https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg in the response headers, I see at request 19 that there's a redirect to that image from https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg.
Looking up that image in your source code, I see
<img src="https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg"
alt=""
loading="eager">
Also note that this content is inside something called <div class="hidden-elements">. These elements of class emotion--element are set to display: none so that their contents are not shown on screen, but loading="eager" on the images forces them to be loaded anyway.
It seems like maybe your CMS (Shopware) is trying to eagerly preload images that will be used on other pages. That's not a terrible idea if you have a small number of lightweight images and users are very likely to navigate to those pages, but in this case it's loading dozens of images totalling over 30 MB. So definitely not recommended.
According to the CWV Tech Report, Shopware websites tend to only load 2 MB of images and have pretty good Core Web Vitals performance compared to other CMS and ecommerce platforms. That leads me to believe that there might be a misconfiguration on your end, or you may have installed a bad plugin.
First things first, a big thanks to Rick Viscomi for the research!
I found the answer: these are Shopware 5 hidden elements, which can be shown and then removed by clicking the number next to the chain icon.
Here is a screenshot.
I store images on my server.
Other websites/pages can display these images.
I want to prevent old images from showing on other pages after I change them on my server.
Other websites can use this code:
<a href="link" target="_blank">
<img src="https://www.example.com/image.gif?1222259157.415"/>
</a>
*where "1222259157.415" is a random number that never changes.
Will this prevent caching?
Or will the image (image.gif?1222259157.415) be cached?
If this works, can I replace image.gif on my server and have the new image displayed on the other websites too?
Yes, it will. This is a typical cache-busting approach.
You could make the number not random but, for example, based on the file's write time: anything that changes only when the image changes, so caching still applies for repeated loads of the same unchanged image.
I have searched the net for a solution but can't seem to get anywhere.
My page (php) is loading with one url (let's say www.mysite.com)
On the page you can run several searches on music (albums) and the tracks are shown, without refreshing the page; the info comes from a database.
So the url stays the same.
During this search process the Facebook meta tags (description, url, title) also stay the same, because I never reload the page; I only load content into divs.
I would like to be able to 'like' the album, and backlink to it. So I have created the function to load the album by using the url: www.mysite.com?album=12345
I can show a popup with this url to share this.
So, if you go to this url, the content is automatically loaded based on the url parameter.
And on this spot (where you can see the url with the parameter ?album=12345) I would like to show the 'like' button as well. (I generated the url, so I use this in the code:)
echo '<div style="overflow:visible" class="fb-like" data-href="http://mysite.com/?album='.$albumid.'" data-send="false" data-width="300" data-show-faces="false"></div>';
it works so far... (after I added the parse code to enable the button)
However, the Like button picks up the default meta tags (description, title, etc.), not ones specific to this album or artist, so it's not unique.
Note: if I remove the meta[property=og:url] tag from the header, I can make the button link back to the right url with the ?album parameter; otherwise it goes back to the default root of the site, mysite.com. (This does make the lint tool report an error about the missing meta tag.)
I have tried to add into this same function something like:
$("meta[property=og\\:url]").attr("content", "http://mysite.com/?album=<?php echo $albumid; ?>");
$("meta[property=og\\:title]").attr("content", "<?php echo $artistname; ?>");
$("meta[property=og\\:description]").attr("content", "<?php echo $albumname; ?>");
I did this so the meta tags would be changed, just to let the Like button show the right description etc. However, this doesn't work.
I understand that Facebook scrapes the page (I used the lint tool etc.), but it will never execute JavaScript, so the meta tags will stay at their defaults (from when the page first loads).
What can I do to make a unique Like button, with its own description (album name etc.), without making an HTML page for each one of them (millions of albums in the database...)?
I hope it makes sense.
I can't seem to figure this one out, help please :-)
Based on the comments below I used the following solution:
you should create the right fb meta tags when the url (with the params ?alb_id=12345) is opened.
That's enough for the like button to do its job.
Your logic is fine, up to the point where you're setting the meta tags using jquery.
They should be set using PHP. As you can imagine the scraper won't execute the jquery, but if it's fed the already PHP-customized meta tags it will use them (as provided).
Just have the og:tags prepared server-side, depending on the albumId requested, and it should work. It might not work right away, I remember there used to be occasional caching issues with the scraper before.
In short, index.php?album=123 will send a different set of og:tags to the scraper than say index.php?album=321. Just set them up server-side.
<meta property="og:url" content="http://mysite.com/?album=<?php echo (int)$albumid; ?>"/>
<meta property="og:title" content="<?php echo htmlspecialchars($artistTitle); ?>"/>
What can I do to make a unique Like button, with its own description (album name etc.), without making an HTML page for each one of them (millions of albums in the database...)?
You can't, because Open Graph objects are URLs (or rather, are represented and identified by their URLs).
One URL == one Open Graph object.
But where’s the problem in having one URL for each album? Since it all works using parameters, it’s not like you have to create a page for each album URL manually …
On my website I have an image rotator. The page takes some time to load the pictures (how long varies with how many pictures are in the slideshow), which delays the page and makes it take longer to appear. How can I run this script after the page has finished loading? That way visitors could see the page, and then a few seconds later the image rotator would appear. The image rotator is not essential, so it's fine if it does not show up right away.
<?php require(DIR_WS_INCLUDES . 'slideshow.php'); ?>
I'll take any language: PHP, JavaScript/jQuery, or Ajax. I do not know Ajax, so it would be helpful if the answer were in one of the first two.
Thanks, Luc
PHP won't be a good solution for this - the whole PHP page will have been executed by the time the page is shown to the user.
Instead, you might look at JavaScript solutions - searching for JavaScript image carousel or rotator should lead you to many popular methods and libraries.
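A minimal JavaScript sketch of the idea, assuming the rotator's markup can be fetched from a URL such as /slideshow.php and injected into an empty placeholder div (both names are assumptions, not from the original page):

```javascript
// Defer the rotator: wait for the window "load" event (page fully rendered),
// then fetch the slideshow HTML and insert it.
// fetchHtml(callback) and insert(html) are passed in so the timing logic
// stays independent of the transport (XMLHttpRequest, fetch, jQuery, ...).
function showRotatorAfterLoad(win, fetchHtml, insert) {
  win.addEventListener('load', function () {
    fetchHtml(function (html) {
      insert(html);
    });
  });
}

// In the page:
// showRotatorAfterLoad(window,
//   function (cb) { fetch('/slideshow.php').then(function (r) { return r.text(); }).then(cb); },
//   function (html) { document.getElementById('slideshow').innerHTML = html; });
```

Because the fetch only starts on the window load event, the rest of the page renders first and the rotator appears a moment later, which matches what the question asks for.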
PHP is a server-side program; it cannot do multithreaded work and only shows the page after all the code has completed. So I am wondering: can I send some data to other pages when the main page opens, split the processing across several pages, and then return the data back to the main page?
for example:
main.php, there has 3 divs.
<div id="a">aaa</div>
<div id="b">bbb</div>
<div id="c">ccc</div>
How can I open main.php and then automatically send the word aaa to a.php, the word bbb to b.php, and the word ccc to c.php?
I would prefer to do it with Ajax, but I have searched the web and cannot find a tutorial that suits me. Can anyone teach me a little? Thanks.
If you are doing Ajax, are you planning on using only native Javascript, or are you open to a framework?
For example, you can do what you want with jQuery.
$(document).ready(function() {
$("#a").load("/aaa.php");
$("#b").load("/bbb.php");
$("#c").load("/ccc.php");
});
What this does is wait until the DOM is ready and then uses .load to call the URL and put the results in each specified div.
Doing this without a framework takes a bit more code, but then you don't have to include the framework (in case it would be overkill).
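For completeness, a framework-free sketch of the same pattern, using the endpoint URLs from the jQuery example above (a fetch-capable browser is assumed):

```javascript
// Load a URL's response text into an element, like jQuery's .load()
function loadInto(el, url) {
  return fetch(url)
    .then(function (res) { return res.text(); })
    .then(function (html) {
      el.innerHTML = html;
      return el;
    });
}

// Once the DOM is ready:
// document.addEventListener('DOMContentLoaded', function () {
//   loadInto(document.getElementById('a'), '/aaa.php');
//   loadInto(document.getElementById('b'), '/bbb.php');
//   loadInto(document.getElementById('c'), '/ccc.php');
// });
```

The three requests run in parallel, and each div is filled in as soon as its response arrives.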