How does Lighthouse measure our page speed? Does it go through all the indexed pages from the sitemap, or does it only care about the homepage?
My use case is that I have optimised the homepage to score over 50 points, but the other subpages are below 30. Would that affect how Google treats my site?
Lighthouse runs a barrage of tests against the single page you point it at (it could be the homepage, an about page, a product page, etc.) and then generates a report on how well that page did. It does not crawl your sitemap, so subpages are only scored if you run it against each of their URLs.
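If you want scores for more than the homepage, you can script the runs. Below is a minimal sketch using Lighthouse's programmatic Node API; it assumes the `lighthouse` and `chrome-launcher` npm packages are installed, and the URLs are placeholders for your own pages.

```ts
// Sketch: auditing several pages in one go with Lighthouse's Node API.
// Assumes the `lighthouse` and `chrome-launcher` npm packages; the URLs
// are placeholders for your own pages.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const urls = [
  'https://example.com/',          // homepage
  'https://example.com/about',     // a subpage
  'https://example.com/products',  // another subpage
];

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

for (const url of urls) {
  // Each run audits exactly one page, like the Lighthouse panel in DevTools.
  const result = await lighthouse(url, { port: chrome.port, output: 'json' });
  const score = (result?.lhr.categories.performance.score ?? 0) * 100;
  console.log(`${url}: performance ${Math.round(score)}`);
}

await chrome.kill();
```

Launching one headless Chrome and reusing its port across runs keeps the loop simple; each `lighthouse()` call is still an independent single-page audit.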
I have a Joomla page which is a menu item of type 'wrapper' and which also displays a couple of modules beside that wrapper.
I can see how many hits an article has had; however, this page has no article, so how do I know how many visitors it has had?
Joomla doesn't keep track of the number of hits on pages; it keeps track of hits on certain content items, such as articles. So when an article is displayed within a page, the hit count for that article is increased.
Now, if you want to know the number of hits on a non-article page, you have three options:
Add an empty article to that page and then look at the hits on that article.
Use a grep command on your logs to count the hits (see the sketch after this list).
Check Google Analytics for the number of hits on that page.
The last option is the simplest, but it requires that you install Google Analytics on the website (or at least on that particular page).
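For the log option, the usual shell one-liner is something along the lines of `grep -c "GET /your-page" access.log`. If you'd rather script it, here is a minimal sketch in TypeScript (Node), assuming a combined-format access log; the log path and page path are hypothetical placeholders.

```ts
// Sketch: counting hits for one page from a web server access log.
// Log path and page path are placeholders; query strings and other
// HTTP methods are ignored for simplicity.
import { readFileSync } from 'node:fs';

const logFile = '/var/log/apache2/access.log'; // adjust to your server
const pagePath = '/my-wrapper-page';           // the Joomla page to count

const hits = readFileSync(logFile, 'utf8')
  .split('\n')
  .filter((line) => line.includes(`"GET ${pagePath} `)).length;

console.log(`${pagePath}: ${hits} hits`);
```

Note that raw log counts include bots and crawlers, which is one reason the Google Analytics option usually gives a more useful "visitors" number.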
I want to test different variations of product pages/layouts that I set up on Magento. It would be simple with a CMS page, but one cannot just create a new product: customers should be able to buy exactly the same product, just entering on different pages.
NB: The Google Website Optimizer thing (which I could never get working anyway) is apparently dead now, replaced by Content Experiments in Google Analytics.
CMS pages are simple A/B tests, whereas product pages are multivariate. That means you get something ridiculous like 32 recipes for a product if you go and set up different descriptions, titles, pictures and whatever else. It all works just fine in 1.7, but there is some effort needed to follow the screencasts, test and tinker.
If you want A/B testing for products, consider setting up two stores/website views in the backend, e.g. 'store A' and 'store B'. Now edit index.php and set the store code to 'A' for people with even IP addresses and 'B' for people with odd IP addresses (this assumes IPv4, since ip2long() returns false for IPv6 addresses):
// Low bit 0 (even numeric IP) => store 'A'; low bit 1 (odd) => store 'B'
$mageRunCode = ((ip2long($_SERVER['REMOTE_ADDR']) & 1) ? 'B' : 'A');
Consider testing the layout change for only one product. The normal product page will be your control page (page "A").
If the price on the control page doesn't get updated frequently, simply create the challenger page (page "B") as a static page somewhere on your server.
After you've done that, set up the experiment in Analytics, add the experiment code to page A, and you're done.
I've just been given the task of turning around a site's plummeting SEO. One of their issues is that their well-ranking deep product pages have been cut off because they are now shown only through a dynamic faceted search. That search can't and won't be indexed, yet faceted navigation is important to the way they need to display their products.
You could create a rewrite rule to make the faceted results appear to be static pages, but I would recommend not doing that.
Ask yourself whether Google would want to crawl search result pages: probably not. It's very common to mark search result and tag pages with `<meta name="robots" content="noindex, nofollow">`, because they are low quality in terms of content.
Here's a free extension to help you accomplish just that.
Here's a helpful article on SEO in relation to Magento.
Sounds like they got hit by Panda/Penguin. You should be focusing on building links to your main categories and your homepage. Clean up poor-quality pages (e.g. search result pages). Build deep links to products that perform well, but vary your anchor text considerably. Without a link to the site we can't give much more advice than that.
Imagine a site with 1,000 pages and 20 pics on each page. Sorting is done by rating, and since the site is very popular, the ratings change very frequently (a rating is based on the number of votes). One pic might have a rating of 5.6, and 5 minutes later 5.9, and so on.
There is a problem when a visitor is browsing the site: on some pages they'll see pics they already saw on one of the previous pages (because in the meantime the rating has changed for some pics).
I don't know what the proper solution for this is. Do I need to change the script so the site remembers the order of ALL pics when the user starts browsing? Also, new pics are added every minute or two, so they need to be visible too.
What is the best approach for this?
Thanks
Think about how the ratings will actually change: if the site really is that popular, ratings will not move much unless the image is new, because each additional vote shifts an established average less and less. For new images a rating could change considerably, but the divergence decreases with every rating. You may just want to label such images as 'hot' or 'trending' or something similar.
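If duplicates across pages still bother you, the approach the question hints at (remembering the order when the user starts browsing) is workable: snapshot the sorted list of pic IDs once per visitor and paginate from that snapshot, so live rating changes can't reshuffle pages mid-session. A minimal sketch, assuming a per-visitor session object and a hypothetical fetchIdsSortedByRating() query:

```ts
// Sketch: stable pagination from a per-session snapshot of the ordering.
// `Session` stands in for whatever session mechanism the site uses;
// `fetchIdsSortedByRating` is a hypothetical data-store query.
interface Session {
  picOrder?: number[]; // pic IDs frozen on the visitor's first page view
}

const PICS_PER_PAGE = 20;

function fetchIdsSortedByRating(): number[] {
  // Placeholder for something like: SELECT id FROM pics ORDER BY rating DESC
  return [];
}

function getPage(session: Session, page: number): number[] {
  if (!session.picOrder) {
    // Freeze the ordering the first time this visitor requests a page.
    session.picOrder = fetchIdsSortedByRating();
  }
  const start = (page - 1) * PICS_PER_PAGE;
  return session.picOrder.slice(start, start + PICS_PER_PAGE);
}
```

The trade-off is the one the question already notes: pics added mid-session won't show up until the snapshot is refreshed, so in practice you'd expire the snapshot after a few minutes rather than keep it for the whole visit.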
Google will now parse certain microdata (for example, reviews) on your web pages and display the info in search results. They call this Rich Snippets.
I am wondering: is this page-specific or domain-specific?
I keep all my reviews on a separate review page that's linked to from the home page. But the review page itself is very unlikely to be displayed in a search result; more likely to be displayed is my homepage or a product landing page. The review microdata is not on those pages, though it is on the website, so I am wondering whether rich snippets will be shown for them.
They're tied to the page, effectively; a result which returns the homepage won't include content from another page. As with any other organic ranking scenario, Google aims to return the best individual page for a query; as such, if it perceives your homepage to be a more authoritative resource and result for the search query, it'll return that rather than the page containing the microformatted data.
I'd tentatively suggest that the wider problem is one of value attribution, and that some page-level SEO to clearly signpost content and context, and to ensure that content is distinct and relevant at page level (and in one place for one topic), might help.
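To make that concrete: if you want the product landing page itself to be eligible for review stars, the rating markup has to sit on that page. A minimal illustration using schema.org microdata (the product name and figures are placeholders, and whether rich snippets actually display is always at Google's discretion):

```html
<!-- Sketch: aggregate-rating microdata placed on the product page itself,
     rather than only on the separate reviews page. All values are
     placeholders. -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.4</span>/5
    based on <span itemprop="reviewCount">89</span> reviews
  </div>
</div>
```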