Will sitemaps (with indexes and text url lists) work with cross subdomains? - sitemap

I have two subdomains I use for my website: static.example.com and www.example.com. Due to the nature of my web server, it is best for me to serve the static content (CSS, JS files, and hopefully sitemaps) from static.example.com.
I have put Sitemap: https://static.example.com/sitemap.xml into the robots.txt for www.example.com. However, I will need several sitemap indexes with hundreds of thousands to a few million URLs under different subdirectories.
For example, I have the following subdirectories in the main website:
www.example.com/articles
www.example.com/questions
www.example.com/videos
...
Therefore, can I structure my sitemap.xml in this way:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://static.example.com/sitemaps/article.xml</loc>
</sitemap>
<sitemap>
<loc>https://static.example.com/sitemaps/question.xml</loc>
</sitemap>
<sitemap>
<loc>https://static.example.com/sitemaps/video.xml</loc>
</sitemap>
</sitemapindex>
Then for example in the article sitemap:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://static.example.com/sitemaps/article/1-10000.txt</loc>
<lastmod>2021-04-22T19:50:00+00:00</lastmod>
</sitemap>
<sitemap>
<loc>https://static.example.com/sitemaps/article/10001-20000.txt</loc>
<lastmod>2021-04-22T19:50:00+00:00</lastmod>
</sitemap>
</sitemapindex>
And in each .txt file I will list the URLs that address the main website. For example:
https://www.example.com/article/1
https://www.example.com/article/5
https://www.example.com/article/8
...
Is this structure okay? The cross-submit mechanism explained here explicitly allows me to put my main sitemap under a different domain, and for .txt URL lists it tells me to put them in the highest-level directory. I didn't see it mention serving URL lists or sitemap indexes from a different subdomain.
Is it possible for me to serve my sitemaps and URL lists in this way?

This won't work by default. The sitemaps protocol states (see the section on "Sitemap file location"):
Note that this means that all URLs listed in the Sitemap must use the same protocol (http, in this example) and reside on the same host as the Sitemap. For instance, if the Sitemap is located at http://www.example.com/sitemap.xml, it can't include URLs from http://subdomain.example.com.
However, there are ways to make it work. With Google, for example, it works as long as all subdomains have been verified in Search Console (see details here). More generally, you need to edit the robots.txt files to prove that you own all these hosts (even if they are just subdomains). See the "Sitemaps & Cross Submits" section of the same sitemaps protocol for details.
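On the practical side, the chunked .txt URL lists described in the question are workable as long as each file stays within the protocol's limits (50,000 URLs and 50 MB uncompressed per file). A minimal sketch of the splitting step, assuming the question's numbered file-name scheme (which is illustrative, not required by the protocol):

```python
# Sketch: split a large list of URLs into plain-text sitemap files,
# respecting the sitemaps.org limit of 50,000 URLs per file.
# File names like "1-10000.txt" mirror the question; the protocol
# does not prescribe any naming scheme.
import os

MAX_URLS_PER_FILE = 50_000  # per-file limit from the sitemaps protocol

def write_url_chunks(urls, out_dir, chunk_size=MAX_URLS_PER_FILE):
    """Write urls into numbered .txt files; return the file names created."""
    os.makedirs(out_dir, exist_ok=True)
    names = []
    for start in range(0, len(urls), chunk_size):
        chunk = urls[start:start + chunk_size]
        name = f"{start + 1}-{start + len(chunk)}.txt"
        with open(os.path.join(out_dir, name), "w", encoding="utf-8") as f:
            f.write("\n".join(chunk) + "\n")
        names.append(name)
    return names
```

Each generated file can then be referenced from a `<sitemap><loc>…</loc></sitemap>` entry in the sitemap index, exactly as in the question's second XML snippet.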

Related

How to customise the default language sitemap url in a multilanguage Hugo website

I have a multilanguage website in Hugo and right now the sitemap generated automatically is the following:
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://domain/en/sitemap.xml</loc>
<lastmod>2022-04-20T08:34:57+02:00</lastmod>
</sitemap>
<sitemap>
<loc>https://domain/it/sitemap.xml</loc>
<lastmod>2022-04-20T08:34:57+02:00</lastmod>
</sitemap>
</sitemapindex>
The issue is that the content in English, which is the default language, does not contain /en in the URL but simply the slug itself, such as /products, /blog. The Italian content does contain the language indication in the URL, such as /it/prodotti, /it/blog.
Sitemap-wise, it doesn't seem advisable to have the English sitemap at /en/sitemap.xml. It should be at /domain/sitemap_en.xml instead.
Any clue on how to customise the localised url of the sitemap?
Thank you.
Here is the Hugo built-in template for sitemapindex:
https://github.com/gohugoio/hugo/blob/master/tpl/tplimpl/embedded/templates/_default/sitemapindex.xml
It uses the .SitemapAbsURL variable, but I couldn't find where it comes from in the documentation. However, you could rewrite the sitemapindex template using .Permalink, for example.
To override the built-in sitemapindex.xml template, create a new file in either of these locations:
layouts/sitemapindex.xml
layouts/_default/sitemapindex.xml
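As a sketch, an override using .Permalink instead of .SitemapAbsURL might look like the following (untested, loosely adapted from Hugo's embedded template; the range is over the list of sites that Hugo passes to this template):

```xml
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {{ range . }}
  <sitemap>
    <!-- build the per-language sitemap URL from the site's home permalink -->
    <loc>{{ .Home.Permalink }}sitemap.xml</loc>
    {{ if not .LastChange.IsZero }}
    <lastmod>{{ .LastChange.Format "2006-01-02T15:04:05-07:00" }}</lastmod>
    {{ end }}
  </sitemap>
  {{ end }}
</sitemapindex>
```

Note this still produces per-language sitemap URLs derived from each site's base URL; to get a flat scheme like /sitemap_en.xml you would also need to adjust where the individual sitemaps are published.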

web.config file not being read by Google PageSpeed Insights

I have read through answers here and still stuck: IIS7 Cache-Control
I have the following web.config.xml file in the root directory of my website:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.webServer>
<staticContent>
<clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00"/>
</staticContent>
</system.webServer>
</configuration>
The purpose of this web.config file is to pass the Google PageSpeed Insight 'leverage browser caching' test. I am using Windows Plesk hosting, and therefore cannot use a .htaccess file for this.
No matter how I format the contents of the web.config file, Google does not seem to recognise that any browser caching is occurring. I am not sure if it is just Google, or whether the images and other static resources on my page are actually being cached. Is there an easy way to check this?
Can anyone see any issues with my web.config.xml contents that might be causing the issue? Or is there anything else I need to do with it other than stick it in the root directory of my site?
The file name should be web.config, not web.config.xml.
*.config is already an XML type of file.
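As for an easy way to check: fetch a static asset directly (e.g. with curl -I) and inspect the Cache-Control response header. With the clientCache setting above, IIS should emit a max-age in seconds corresponding to 365 days. A small sketch of parsing and sanity-checking that header (the helper below is illustrative, not part of any library):

```python
# Sketch: parse a Cache-Control header and check it matches the
# intended 365-day lifetime from the web.config clientCache setting.
def max_age_seconds(cache_control):
    """Extract the max-age value (in seconds) from a Cache-Control header."""
    for directive in cache_control.split(","):
        name, _, value = directive.strip().partition("=")
        if name.lower() == "max-age":
            return int(value)
    return None  # no max-age directive present

# 365.00:00:00 in web.config means 365 days = 31,536,000 seconds.
expected = 365 * 24 * 60 * 60
header = "public, max-age=31536000"  # example header from a response
assert max_age_seconds(header) == expected
```

If the header is missing entirely, the web.config is not being applied (wrong file name, as noted above, or the server is not IIS-managed for that path).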

Google does not index my images - using sitemap, multi-lang subdomains and static subdomain

Most of my images cannot be found in the Google Image Search.
I have submitted Google Sitemaps. There are no problems reported on Search Console, but only 1 image out of 34 is indexed. I suspect my multi-language setup could be a problem.
I have a website that serves output in different languages. For each language I have a subdomain: de.openisles.org and en.openisles.org.
For each of the language domains I have a sitemap, each containing the language-dependent text.
My sitemap entries look like this:
<!-- de.openisles.org/sitemap.xml -->
<url>
<loc>http://de.openisles.org/media/screenshots/2016-01-03-demanded-goods.html</loc>
<image:image>
<image:loc>http://static.openisles.org/media/screenshots/2016-01-03-demanded-goods.png</image:loc>
<image:caption>Infopanel: verlangte Güter</image:caption>
</image:image>
</url>
<!-- en.openisles.org/sitemap.xml -->
<url>
<loc>http://en.openisles.org/media/screenshots/2016-01-03-demanded-goods.html</loc>
<image:image>
<image:loc>http://static.openisles.org/media/screenshots/2016-01-03-demanded-goods.png</image:loc>
<image:caption>Info panel: Demanded goods</image:caption>
</image:image>
</url>
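One detail worth double-checking in entries like these: the image: prefix must be bound to the image sitemap namespace on the enclosing <urlset>, or the file is not valid XML. Per Google's image sitemap extension, the declaration looks like this:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- <url> entries with <image:image> children go here -->
</urlset>
```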
The two websites link to each other, so that Google knows it's the same content in another language:
<link rel="alternate" hreflang="de" href="http://de.openisles.org/media/screenshots/2016-01-03-demanded-goods.html" />
<link rel="alternate" hreflang="en" href="http://en.openisles.org/media/screenshots/2016-01-03-demanded-goods.html" />
Because images are not language-dependent (I do not want them to be) I have an additional subdomain static.openisles.org. To tell Google that my static server belongs to me, I added this subdomain also in the Search Console.
My question is simple: What am I doing wrong? Why is Google not indexing my images?
It's entirely possible that nothing is wrong with your sitemap, especially if Google Search Console doesn't say anything.
Google has its own algorithm for what to index and what not to index, and submitting a sitemap does not guarantee indexing; it only helps Google to more fully map out your website, if it decides to crawl it.
I've submitted a sitemap for a library that contained 4,000,000 URLs, but it's been close to a month now and Google has only indexed around 14,000.
I think the fact that even one of your images has been indexed is a good sign - Google was able to find it! Have patience, my friend, and I think you'll find the other images will slowly get indexed as well.
Best of luck!

Unable to generate sitemaps for my website

I have used a sitemap on my website for a while, so I thought I should update it. But when I use any online sitemap generator, the sitemap isn't generated properly. The site is not based on any CMS.
My current sitemap: http://mycampusnotes.com/sitemap.xml
No matter which sitemap generator website I use, I get only these 3 URLs as output:
<?xml version="1.0" encoding="UTF-8"?>
<urlset
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
<!-- created with Free Online Sitemap Generator www.xml-sitemaps.com -->
<url>
<loc>http://mycampusnotes.com/</loc>
</url>
<url>
<loc>http://mycampusnotes.com/Default.aspx</loc>
</url>
<url>
<loc>http://mycampusnotes.com/privacyPolicy.html</loc>
<lastmod>2014-07-03T13:23:21+00:00</lastmod>
</url>
</urlset>
I haven't changed permissions on any folder. Is there some script or added file that is blocking sitemap generation? I get these same 3 URLs on every sitemap generator website. The only change I made was to completely redo the front page (main page); could that be the reason sitemap generation fails?
That's right, and the reason is that your home page contains only 2 outbound internal links, and neither of those 2 pages contains internal links to further pages. Crawler-based sitemap generators can only discover pages reachable by following links from the home page, so everything else on the site is invisible to them.

Multiple Sitemap: entries in robots.txt?

I have been searching around using Google but I can't find an answer to this question.
A robots.txt file can contain the following line:
Sitemap: http://www.mysite.com/sitemapindex.xml
but is it possible to specify multiple sitemap index files in the robots.txt and have the search engines recognize that and crawl ALL of the sitemaps referenced in each sitemap index file? For example, will this work:
Sitemap: http://www.mysite.com/sitemapindex1.xml
Sitemap: http://www.mysite.com/sitemapindex2.xml
Sitemap: http://www.mysite.com/sitemapindex3.xml
Yes, it is possible to have more than one sitemap index file:
You can have more than one Sitemap index file.
Emphasis mine.
It is also possible to list multiple sitemap files within robots.txt; see, likewise on the sitemaps.org site:
You can specify more than one Sitemap file per robots.txt file.
Sitemap: http://www.example.com/sitemap-host1.xml
Sitemap: http://www.example.com/sitemap-host2.xml
Emphasis mine; this can hardly be misread, so simply put: it can be done.
This is also necessary for cross-submits, for which, by the way, robots.txt is the chosen mechanism.
Incidentally, Google, Yahoo and Bing are all members of sitemaps.org:
Sitemap 0.90 is offered under the terms of the Attribution-ShareAlike Creative Commons License and has wide adoption, including support from Google, Yahoo!, and Microsoft.
So you can rest assured that your sitemap entries will be properly read by the search engine bots.
Submitting them via webmaster tools cannot hurt either, as John Mueller commented.
If your sitemap is over 10 MB (uncompressed) or has more than 50,000 entries, Google requires that you use multiple sitemaps bundled with a sitemap index file.
Using Sitemap index files (to group multiple sitemap files)
In your robots.txt, point to a sitemap index, which should look like this:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>http://www.example.com/sitemap1.xml.gz</loc>
<lastmod>2012-10-01T18:23:17+00:00</lastmod>
</sitemap>
<sitemap>
<loc>http://www.example.com/sitemap2.xml.gz</loc>
<lastmod>2012-01-01</lastmod>
</sitemap>
</sitemapindex>
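For large sites, an index like the one above is usually generated rather than written by hand. A minimal sketch using only the Python standard library (the function name and entry format are illustrative):

```python
# Sketch: build a sitemap index XML document from (loc, lastmod) pairs,
# using only the standard library.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(entries):
    """entries: iterable of (loc, lastmod_or_None); returns XML as a string."""
    # Register the sitemaps namespace as the default (no prefix in output).
    ET.register_namespace("", SITEMAP_NS)
    root = ET.Element(f"{{{SITEMAP_NS}}}sitemapindex")
    for loc, lastmod in entries:
        sm = ET.SubElement(root, f"{{{SITEMAP_NS}}}sitemap")
        ET.SubElement(sm, f"{{{SITEMAP_NS}}}loc").text = loc
        if lastmod:
            ET.SubElement(sm, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
    return ET.tostring(root, encoding="unicode")
```

Write the returned string (with an XML declaration prepended) to sitemap_index.xml and reference that one file from robots.txt.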
It's recommended to create a sitemap index file rather than putting separate XML sitemap URLs in your robots.txt file.
Then put the sitemap index URL in your robots.txt file, as below:
Sitemap: http://www.yoursite.com/sitemap_index.xml
If you want to learn how to create a sitemap index file, follow this guide from sitemaps.org.
Best Practice:
Create separate image and video sitemaps if your website has a huge amount of such content.
Check the spelling of the robots file: it should be robots.txt; don't use robot.txt or any other misspelling.
Put the robots.txt file in the root directory only.
For more info, you can visit robots.txt's official website.
You need to specify this code in your sitemap.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>http://www.exemple.com/sitemap1.xml.gz</loc>
</sitemap>
<sitemap>
<loc>http://www.exemple.com/sitemap2.xml.gz</loc>
</sitemap>
</sitemapindex>
source: https://support.google.com/webmasters/answer/75712?hl=fr#
It is possible to list them, but it is up to each search engine to decide what to do with them. I suspect many search engines will either "keep digesting" more and more entries or, alternatively, take the last sitemap they find as the real one.
I propose that the question be "if I want ____ search engine to index my site, would I be able to define multiple sitemaps?"

Resources