I have a Google News sitemap which, according to Google's spec, should contain only articles posted in the last 2 days. Now, suppose no articles have been published on my site in the last 2 days; my news sitemap would be empty. Is this the desired behaviour? Do I need to show something else if no articles are posted in the last 2 days? Will Webmaster Tools invalidate the sitemap?
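For reference, here is a minimal sketch of what such a sitemap looks like with one fresh article; the URL, publication name, date, and title are all placeholders. With nothing published in the window, the file would contain only the empty urlset element.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
      <!-- One entry per recent article; with no recent articles,
           <urlset> would simply have no children -->
      <url>
        <loc>http://www.example.com/articles/example-article.html</loc>
        <news:news>
          <news:publication>
            <news:name>Example Publication</news:name>
            <news:language>en</news:language>
          </news:publication>
          <news:publication_date>2015-06-01T12:00:00+00:00</news:publication_date>
          <news:title>Example Article Title</news:title>
        </news:news>
      </url>
    </urlset>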
I don't see what other choice there is for a sitemap generator if you have not created any news stories within the last two days. You could perhaps choose not to update your news sitemap? :)
Will Webmaster Tools invalidate the sitemap?
Yes, it will... but then, GWT judges it as it would any regular XML sitemap and does not account for the rules specific to Google News sitemaps.
So the more important question is:
Is this the desired behaviour? Do I need to show something else if no
articles are posted in the last 2 days?
... to which I cannot find any clear answer anywhere :(
I have a Blogger blog, and I used to have Blogger's own comment system on it. I didn't like some parts of it, so I tried switching to Google+ comments instead.
I have no problems with the comment box; it's implemented well, works fine, etc. But when I had Blogger comments, I could see the newest comments my visitors had posted site-wide, and I also received email notifications when someone posted a comment on any post of my blog.
However, now, with Google+ comments, I don't seem to get any sort of notification (no emails, not even that alert thingy in the top-right corner of Google that only ever shows YouTube comments I don't care about). Also, I know of no way to check the most recent comments on my website.
I need one of these features (most recent comments / notifications) so I can reply to people when they post comments on my blog. After all, I have dozens of posts; it's not viable to check every single one of them for new comments every single day.
How can I view the most recent Google+ comments within a website? Or at least receive an email when a new Google+ comment is posted on my website?
P.S.: I'm not interested in an API for these. There should be an actual user interface somewhere for these things, right?
As it currently stands, this feature has not worked since October 2016.
According to a post by a Google employee in the official Blogger forum on 2nd February 2017:
Hi all,
Thanks for posting.
Just wanted to let you know that the concerned team is aware of this
issue and is working on it. I will keep you all posted as soon as I
get an update from them.
Best,
Theo
Any updates regarding this issue will likely be posted in the above forum thread.
We use Joomla 3.4.4 for our company website, with mod_rewrite and SEF URLs enabled.
In our company website, we use categories only to organize articles internally, not for public access.
Nevertheless, Google has somehow found the categories and displays them in its search results. People clicking on these category results land on a page listing several articles, which is not intended.
How can I prevent Google from indexing the category pages?
I'll try to set the robots field in the category options to "noindex, follow". Hope this helps.
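If that setting takes effect, the rendered category page should end up with a robots meta tag along these lines (a sketch of the expected HTML output, not necessarily Joomla's exact markup):

    <!-- In the category page's head: don't index the page itself,
         but still follow the links on it -->
    <meta name="robots" content="noindex, follow" />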
A quick workaround: add some RewriteRules to .htaccess that redirect the unwanted category requests to the main page.
I scanned all the Google results, and by now I have about 10 RewriteRules for unwanted URIs; one is sketched below.
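Here is roughly what one such rule looks like, using a hypothetical unwanted category alias of 10-videos; the real rules would name your own category URIs:

    # Hypothetical rule: permanently redirect one unwanted category
    # (and anything beneath it) to the main page
    RewriteEngine On
    RewriteRule ^10-videos(/.*)?$ / [R=301,L]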
This was a major problem with our websites. Google searches would show several unwanted categories, complete with their number prefix (10-videos). Clicking the Google result would show a dozen assorted articles, all of which were marked noindex, nofollow. As the category itself was marked noindex, nofollow, and the global default was noindex, nofollow, it was a complete mystery why this was happening.
After several years of frustration, I finally solved it. There are two parts: a long-term permanent solution, and a short-term temporary solution that will remove the pages from Google searches within 24 hours.
The long-term permanent solution is to disallow the categories in robots.txt. When the site is re-crawled, they should go away. Add the offending categories at the end of robots.txt; this will also take care of their sub-categories. These paths are case sensitive in Google, so make sure to use only lower case (Disallow: /10-videos).
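For example, the end of robots.txt could look like this, with 10-videos standing in for each offending category alias:

    User-agent: *
    # Block the category and everything beneath it; paths are case sensitive
    Disallow: /10-videos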
The short-term solution, which lasts 90 days, is to manually remove the URLs in Google Search Console. This can currently only be done in the old version of GSC; it is not yet available in the new version.
In the new Search Console, click Index : Coverage and make sure the Valid tab is highlighted. At the bottom, click on “Indexed, not submitted in sitemap” to see the offending links.
In the old version, go to Google Index : Remove URLs and click Temporarily Hide. Enter just the category (10-videos), as the tool automatically adds the website link. This will also take care of the sub-categories. By the next day the bad URLs will be gone (for 90 days). These, too, are case sensitive, so make sure to use only lower case.
Hello, I have a personal site, and about 1 month ago I rebuilt the complete site. I submitted a new sitemap.xml file and it is not indexed yet, but I'm getting 404 crawler errors for the old URLs.
Google said the sitemap is correct, so, any ideas? Must I do something, or just wait longer?
It's not really important, because it's just a personal site, but I'm curious about why this is happening.
Sorry for my bad English, I'm Spanish. Thanks in advance.
This just takes time.
I experienced some speed-up in that process by using other Google services such as Places or Analytics. But to answer your question:
If your sitemap has been detected correctly, it will work, but it might take some time.
Having submitted my sitemap.xml normally the first time through Webmaster Tools, I notice that the submitted-URLs plot (beside the indexed-URLs one, under the Optimization -> Sitemaps menu) changes every day without my doing anything. I use Drupal 7 with the XML sitemap module (http://drupal.org/project/xmlsitemap) and there are no automated tasks enabled.
Does this mean that URLs are submitted "internally" by Google every day? Or is there something wrong that I need to resolve?
Many thanks for help.
Google will remember any sitemaps you submit, and its crawler will automatically download them and the associated resources more or less whenever it feels like doing so. This is usually reflected in your Webmaster Tools. In all likelihood it will even do so without you entering your sitemap on their website, if your site gets linked to. The same goes for pretty much any other bot and crawler out in the wild.
No need to worry, everything is doing what it's supposed to. It's a Good Thing(tm) when Google crawls your site frequently :).
In a News sitemap, we need to include URLs for all our articles, but only from the last three days; we shouldn't include older articles. Why shouldn't we include older articles, and what would happen if we kept them?
I assume you're talking about an XML sitemap for a crawling engine like Google?
Yes, you can keep the old articles; in fact, this is preferable.