I have a master sitemap that contains links to my other sitemaps and is accessible at a path like:
www.website.com/sitemap.xml
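For reference, the master file is a standard sitemap index, along these lines (the child sitemap names here are just placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
        <loc>http://www.website.com/sitemap-pages.xml</loc>
    </sitemap>
    <sitemap>
        <loc>http://www.website.com/sitemap-posts.xml</loc>
    </sitemap>
</sitemapindex>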
I wanted to ask if this is enough for the search engines, or if I need to link to it from my site.
As for linking: I know I can use a robots.txt file, but is it possible to just add a link in the head of the site? Something like (and I'm just guessing):
<head>
<link rel="sitemap" type="application/xml" title="Sitemap" href="/sitemap.xml">
</head>
Thank you,
Adam
This is totally okay.
The sitemap should be located in the site root; unless you point to it from robots.txt, that is the only place search engines will look for it.
I suggest you use Google Webmaster Tools to submit the sitemap for your domain, so you can get indexed and monitor search engine behavior.
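As an aside, since you mention robots.txt: as far as I know, search engines don't use a <link> in the head for sitemap discovery, so if you ever need to point crawlers at a sitemap outside the root, the standard way is a Sitemap line in robots.txt:
Sitemap: http://www.website.com/sitemap.xml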
Hopefully this info will help you.
Most of my images cannot be found in Google Image Search.
I have submitted sitemaps to Google. There are no problems reported in Search Console, but only 1 image out of 34 is indexed. I suspect my multi-language setup could be the problem.
I have a website which serves output in different languages. For each language I have a subdomain: de.openisles.org and en.openisles.org.
Each language domain has its own sitemap containing the language-dependent text.
My sitemap entries look like this:
<!-- de.openisles.org/sitemap.xml -->
<url>
    <loc>http://de.openisles.org/media/screenshots/2016-01-03-demanded-goods.html</loc>
    <image:image>
        <image:loc>http://static.openisles.org/media/screenshots/2016-01-03-demanded-goods.png</image:loc>
        <image:caption>Infopanel: verlangte Güter</image:caption>
    </image:image>
</url>

<!-- en.openisles.org/sitemap.xml -->
<url>
    <loc>http://en.openisles.org/media/screenshots/2016-01-03-demanded-goods.html</loc>
    <image:image>
        <image:loc>http://static.openisles.org/media/screenshots/2016-01-03-demanded-goods.png</image:loc>
        <image:caption>Info panel: Demanded goods</image:caption>
    </image:image>
</url>
The two websites link to each other, so that Google knows it's the same content in another language:
<link rel="alternate" hreflang="de" href="http://de.openisles.org/media/screenshots/2016-01-03-demanded-goods.html" />
<link rel="alternate" hreflang="en" href="http://en.openisles.org/media/screenshots/2016-01-03-demanded-goods.html" />
Because images are not language-dependent (and I do not want them to be), I have an additional subdomain, static.openisles.org. To tell Google that this static server belongs to me, I also added the subdomain in Search Console.
My question is simple: What am I doing wrong? Why is Google not indexing my images?
It's entirely possible that nothing is wrong with your sitemap, especially if Google Search Console doesn't say anything.
Google has its own algorithm for what to index and what not to index, and submitting a sitemap does not guarantee indexing; it only helps Google to more fully map out your website, if it decides to crawl it.
I've submitted a sitemap for a library that contained 4,000,000 URLs, but it's been close to a month now and Google has only indexed around 14,000.
I think the fact that even one of your images has been indexed is a good sign - Google was able to find it! Have patience, my friend, and I think you'll find the other images will slowly get indexed as well.
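One thing worth double-checking, since your snippet doesn't show it: the opening urlset tag must declare the image sitemap namespace, otherwise the image: elements are ignored entirely. It should look like this:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">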
Best of luck!
I've developed a site which is available via two top-level domain names. The language on both sites is Dutch: one is for Dutch visitors and one for Belgian visitors.
The .be version of the site was recently "launched". Under the hood it's the same site of course, and we're using a link element to avoid getting penalized for duplicate content. (Google's support page)
So; there's this page: www.domain.nl|be/vakantie/oostenrijk/tirol/
And depending on the TLD, this is the link element we implement:
<!-- Dutch site visitors -->
<link rel="alternate" hreflang="nl-NL" href="http://www.bergenmeer.nl/vakantie/oostenrijk/"/>
<!-- Belgian site visitors -->
<link rel="alternate" hreflang="nl-BE" href="http://www.bergenmeer.be/vakantie/oostenrijk/"/>
The Belgian version has been live for about six weeks. Both sites are equipped with a sitemap listing the URLs for that domain. But we're seeing the following in Google's cache:
The live version of this page (see the URL and the phone number at the top right).
The cached version of this page (see the URL and the phone number at the top right).
When you load this page (despite some performance issues, which we're looking into) and inspect the network traffic, you'll see the page opens with an HTTP 200 response. No redirects whatsoever. Why is Google not showing the Belgian version of the page?
Thanks for taking the time to share your thoughts.
Ben
For .be you could have
<link rel="canonical" href="http://www.bergenmeer.be/vakantie/oostenrijk/"/>
<link rel="alternate" hreflang="nl-NL" href="http://www.bergenmeer.nl/vakantie/oostenrijk/"/>
and for .nl you could have
<link rel="canonical" href="http://www.bergenmeer.nl/vakantie/oostenrijk/"/>
<link rel="alternate" hreflang="nl-BE" href="http://www.bergenmeer.be/vakantie/oostenrijk/"/>
This gives Google a hint about which version you want prioritised on each domain, and therefore which should make it into the cache, as it currently appears to be using only the alternate.
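Also note that Google expects hreflang annotations to be bidirectional and self-referencing, so ideally each page lists every language version including itself; something like the following on both pages (whether x-default points at .nl or .be is your call, the line below is just a suggestion):
<link rel="alternate" hreflang="nl-NL" href="http://www.bergenmeer.nl/vakantie/oostenrijk/"/>
<link rel="alternate" hreflang="nl-BE" href="http://www.bergenmeer.be/vakantie/oostenrijk/"/>
<link rel="alternate" hreflang="x-default" href="http://www.bergenmeer.nl/vakantie/oostenrijk/"/>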
We are just getting started with SEO/AJAX, so I'm hoping someone can help us figure this out. One of the #! URLs is showing up as the first organic result for our startup nurturelist.com. Although this link technically works, we would 1) not like any #! URLs to show up in search results, because they look weird and we have non-#! versions, and 2) like the second organic result in the image to be the one that actually appears at the top.
Thanks very much for any thoughts on how we can make this happen...
Do you just not want the #! URLs to show up in search results? Make a robots.txt in your root directory (in most cases the public_html directory). One catch: the fragment (everything after the #) is never sent to the server, and a raw # starts a comment in robots.txt, so a rule like Disallow: /#!/ won't work as written. Under Google's AJAX crawling scheme, a #! URL is crawled as its ?_escaped_fragment_= equivalent, so that is what you block:
User-agent: *
Disallow: /*_escaped_fragment_=
This prevents Google from crawling the escaped-fragment versions of all your /#!/ pages.
However:
If the page has already been indexed by Googlebot, using a robots.txt file won't remove it from the index. You'll either have to use the Google Webmaster Tools URL removal tool after you apply the robots.txt, or instead you can add a noindex command to the page via a meta tag or X-Robots-Tag in the HTTP headers.
(Source)
Here is a link to the Google Webmaster Tools URL Removal Tool
So add this to pages you don't want indexed:
<meta name="ROBOTS" content="NOINDEX, NOFOLLOW" />
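Or, if you'd rather go the HTTP header route the quote mentions, you can send X-Robots-Tag from the server. A minimal sketch for Apache with mod_headers enabled (the filename is a placeholder, match it to your own pages):
# hypothetical filename - adjust the pattern to your own pages
<Files "ajax-page.html">
    Header set X-Robots-Tag "noindex, nofollow"
</Files>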
I'm currently building a music-based website and I want to build something like this template. It uses AJAX and deep linking. (And it makes use of the History.js library - please notice how there's no '#' in the URLs.)
The reason I want to use these 'ajaxy' methods (or maybe use the template altogether) is so that when music is playing, it remains uninterrupted as the user navigates the site.
My worry is that my site won't be crawlable by Google, but I think I can modify code in the page source to fix that. If I look at the source code of the template, in the head I see:
<meta name="description" content="">
<meta name="author" content="">
<meta name="keywords" content="">
Now if I add this to the head:
<meta name="fragment" content="!">
will that make the site crawlable? Is there other code I need to add on top of this? Or is it just not possible for this template?
I'm following this guide: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started, and I'm on step 3. I will of course have to complete the other steps, but I don't know if I'm heading in the right direction or towards a dead end!
Any help would be very much appreciated. Many thanks in advance.
From what you said, it sounds like your site updates the address bar with clean URLs as you navigate via AJAX. That's good. The next thing you want to do is make sure those URLs work: if you go directly to one, do you see the specific content it represents? And would a crawler also see the correct content without running JavaScript? Progressive enhancement works well for that. The final thing you want to do is make sure bots can pick up those URLs.
I've not played with the fragment meta tag, but it looks like it is only for the home page, and you still need to implement the escaped-fragment page. Maybe it does support other pages, but the article does not cover that.
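For reference, under that (now deprecated) scheme the crawler rewrites the URL and expects the server to answer the rewritten form with a pre-rendered HTML snapshot, roughly like this (example.com is a placeholder):
http://example.com/#!/about → crawled as http://example.com/?_escaped_fragment_=/about
http://example.com/about → with the fragment meta tag, crawled as http://example.com/about?_escaped_fragment_=
So the meta tag on its own isn't enough; until the server returns real content for those _escaped_fragment_ URLs, the crawler still sees an empty shell.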
I've coded a news PHP script. At the end of each news item I have a Facebook share button. The problem is I can't get thumbnail images to display with Facebook share.
I tried the
<link rel="image_src" href="" />
element without any success. The interesting thing is that some of the domains using my news PHP script have no problem with it, but some do.
Domain without any problems:
http://www.yenialanya.com/manset/vergi-denetmenine-itiraz.htm (please check the bottom of the news)
Domains with problems:
http://www.usakhabermerkezi.com/egitim-ogretim/usak-universitesi-rektorluk-secimleri-sonuclandi-iste-secim-sonuclari.htm
http://www.demokrathaber.net/dunya/dunyanin-ekseni-kaydi.htm
http://www.tebilisim.com/v4/siyaset/benzin-zamlardan-bizde-hosnut-degiliz.htm
I also tried AddThis and it didn't solve the problem.
All of the domain names above are using the same system. I thought it might be because of the system, so I tried a clean HTML page:
http://www.phpsistem.com/fb/
As you can see in the last example, I used two different kinds of sharing options. The first opens a popup and passes all parameters via the URL, yet some domains display images and some don't. I also added the AddThis option.
I also suspected .htaccess and cleaned everything out of it, since I thought it might be blocking something. I took every step I could think of very carefully.
This issue is starting to get annoying; I would be glad if anyone could help me out.
Use the Open Graph protocol:
<meta property="og:title" content="The Rock"/>
<meta property="og:type" content="movie"/>
<meta property="og:url" content="http://www.imdb.com/title/tt0117500/"/>
<meta property="og:image" content="http://ia.media-imdb.com/rock.jpg"/>
<meta property="og:site_name" content="IMDb"/>
<meta property="fb:admins" content="USER_ID"/>
<meta property="og:description"
content="A group of U.S. Marines, under command of
a renegade general, take over Alcatraz and
threaten San Francisco Bay with biological
weapons."/>
To test each link, use the URL Linter.
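Running a URL through the linter also makes Facebook re-scrape the page, which refreshes its cached share preview, so it's worth re-running each problem URL through it after changing the tags.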
Look at this forum; most of the threads there ask the same question about why the OG image is not working on the Like button. Maybe it's a bug? See Bug 16580.
Are you using a public server or a local one? Facebook share doesn't show pictures if the URLs come from localhost, since Facebook's scraper can't reach them.
Facebook seems to want images that are at least 200px in both directions, whether supplied in the OG metadata, or just embedded on the page. They have updated their URL linter to show this error for the OG metadata recently. I can't find sources now, but I thought they used to have a maximum pixel dimension of less than 200px previously...
Also, I've seen problems displaying thumbnail images in Chrome on OS X, while the same pages show no problem in browsers on Windows. Really strange.
Go to http://developers.facebook.com/tools/debug and fill in your URL.
If the response code is 503, your website is not accessible. It could be that your website is under construction…