All Social Plugin Comments vanish from article

We've had Facebook comments via the social plugin implemented for some time on our website, on a per-article basis. Occasionally, an article 'dumps' all its comments. They still show up in the app moderation back end, but none of them permit 'Visit Website'. They may be approved, but they no longer appear on the article page, though they will still show up on the poster's Facebook wall if they chose to post there. This seems to happen the most on 'noisier' articles, but I suspect that is simply because more posters means more people to notice the disappearance. The article will then accept new comments normally, but all the old comments seem to be permanently lost.
Obviously, changing the canonical URL for the article would cause this. However, we've had it happen on articles I do not believe changed. Is there anything else that can cause comments to become disassociated from the article that I could possibly correct? Can this happen if the URL (not the canonical one!) changes? For SEO purposes we have the article headlines in the URLs, but the plugin is set to a canonical URL without the headline to avoid disruption if the headline is updated. Is that enough?

You should think of the URL for an article as a unique string identifier for the comments plugin. The URL you define should be unique (used for only one page). The formatting of the URL doesn't matter. However, even the slightest change in the URL will disrupt the comments plugin, because the string is now technically different. I'm not sure what else could be causing your issue.
It may help to reread the documentation on the comments plugin here: https://developers.facebook.com/docs/reference/plugins/comments/
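For reference, a stable setup pins the plugin to one fixed string and never lets it vary. Below is a minimal sketch of the XFBML embed described in that documentation; the domain, article ID, and SDK version are placeholders, not your actual values.

    <!-- Comments plugin keyed to a short, permanent URL (placeholder values). -->
    <div id="fb-root"></div>
    <script async defer
            src="https://connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.3"></script>

    <!-- data-href is the unique string identifier for this thread; it must never
         change, even if the visible, headline-bearing URL of the article does. -->
    <div class="fb-comments"
         data-href="https://www.example.com/articles/12345"
         data-numposts="10"
         data-width="100%"></div>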

Related

New Google Plus Comments in Blog - How to View / Receive Notifications?

I have a Blogger blog and I used to have Blogger's own comment system in it. I didn't like some parts of it, so I tried changing to Google+ comments instead.
I have no problems with the comment box; it's implemented well, works fine, etc. But when I had Blogger comments, I could see the newest comments my visitors had posted site-wide, and I also received email notifications when someone posted a comment on any post of my blog.
However, now, with Google Plus comments, I don't seem to get any sort of notification (no emails, not even that alert thingy in the top-right corner of Google that only ever shows YouTube comments I don't care about). Also, I know of no way to check the most recent comments on my website.
I kind of need either of these features (most recent / notifications) so I can reply to people when they post comments on my blog. After all, I have dozens of posts; it's not viable to check every single one of them for new comments every single day.
How can I view the most recent Google Plus comments on my website? Or at least receive an email when there is a new Google Plus comment posted on my website?
P.S.: I'm not interested in an API for these. There should be an actual user interface somewhere for these things, right?
As it currently stands, this feature has not worked since October 2016.
According to a post by a Google employee in the official Blogger Forum on 2 February 2017:
Hi all,
Thanks for posting.
Just wanted to let you know that the concerned team is aware of this issue and is working on it. I will keep you all posted as soon as I get an update from them.
Best,
Theo
Any updates regarding this issue will likely be posted in the above forum thread.

Ajax Website, problems with History and SEO

I have a few problems I could use some input on.
I have a website where all the content is loaded with AJAX, and it works quite well. There are a few issues with that approach though, or rather some UX issues:
Users cannot copy the URL of the loaded content, since the address bar always shows the default URL only.
SEO takes a hit, since the content cannot be crawled; the sitemap is only about 2 pages, even though a normal user browsing the site will see a lot more.
Browser history (back and forward) does not work. Hitting the back button goes to the main page.
Now, I have searched and read a lot.
Google has a hack that seems to allow the site to be crawled IF you use # in your URL; it does not work with an empty URL, which leads me to...
Manipulating the browser history with pushState/popState.
Now, I have tried getting it to work, but I just can't get my head around which approach is the best way to take. Should I redo all my AJAX?
Right now I have 2 div boxes, and I switch between them with the loaded content to get that nice, sweet transition between pages. My front page is basically just 2 empty divs, nothing else. It works, but I get the feeling it is a pretty bad way to do it. Thoughts?
If anyone knows some good guides, feel free to share them; as I said, I have read a lot, but I might have missed some golden ones out there.
Google does execute some JavaScript when indexing and ranking pages. However, text which is not immediately visible to users is demoted when establishing content relevancy.
Manipulating the browser history with pushState/popState.
It is very unlikely Google will trust your content if you need to use those tricks. And content which is not trusted is not ranked.
UPDATE: Manipulating browser history with pushState is ok.
Moreover, if your URLs change all the time, Google won't appreciate it, unless you manage to set canonical links.
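For the record, the pattern the update refers to looks roughly like the sketch below. It assumes a hypothetical loadContent() function standing in for whatever AJAX-and-transition code already swaps your two divs; the /articles/... paths are made up.

    <script>
    // Hypothetical stand-in for the existing AJAX code that fetches a page
    // fragment and swaps it into the visible div with the usual transition.
    function loadContent(path) {
        /* ...existing AJAX + transition logic... */
    }

    // Navigation: load via AJAX, then record a real, crawlable URL in the history.
    function navigate(path) {
        loadContent(path);
        history.pushState({ path: path }, '', path);
    }

    // Back/forward buttons: popstate fires with the state stored above,
    // so the matching content can be reloaded instead of losing the page.
    window.addEventListener('popstate', function (event) {
        if (event.state && event.state.path) {
            loadContent(event.state.path);
        }
    });

    // Example: navigate('/articles/some-article');
    </script>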

Magento cookie text replaces description in Google

Quick question: since I've added the Magento cookie options in 1.7.0.2, Google has swapped my description (the bit of text under the main link in search results) for the text that I have in my cookie confirmation box. Not only is this terrible for people who find us through Google, I doubt Googlebot will be all too pleased with it either. All my pages have descriptions set, but for some reason they are not being used; the cookie explanation text is used instead. Does anyone know how I can change this, or stop it from happening?
Many Thanks
I was facing the exact same problem: Google was showing the cookie warning text as description in search results for my Magento store.
The problem turned out to be my meta description being too short. The solution for me was making the meta description longer, at least about 150 characters (including spaces).
What goes in your meta description tag is set in Magento's back office: System > Configuration > General > Design, under HTML Head, Default Description.
After saving, I cleared the cache and checked the page source to confirm the updated meta description. To make things with Google go faster, I used their Webmaster Tools to submit the store URL for crawling. After a little wait, Google was showing the store's description in the search results just like it's supposed to.
Hope this can still help you!
Cheers
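For anyone comparing notes, the end result in the rendered page source should be an ordinary meta description of roughly 150 characters or more; the store name and wording below are placeholders.

    <!-- Rendered output after filling in Default Description under
         System > Configuration > General > Design, HTML Head. -->
    <meta name="description"
          content="Example Store sells widgets, gadgets and accessories online. Free shipping on orders over 50 euro, 30-day returns and friendly support. Placeholder text, but long enough to clear the ~150 character mark." />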
Could you paste your cookie confirmation box and how it works, as well as some of your meta descriptions?
Blank out as necessary, just need the gist of the structure.

Get URI fragment (hash) to affect SEO? Get indexed by SEs?

I am building a forum site where the post is retrieved on the same page as the listing via AJAX. When a new post is shown, the URI fragment is changed (e.g. .php#1_This-is-the-first-post). The title and meta tags are changed as well.
My question is this: I have read that search engines aren't able to use #these-words, so my entire site won't be able to be indexed (it will look like one page).
What can I do to get around this, or at least make my sub-pages indexable?
NOTE: I have built almost all of the site, so radical changes would be hard. SEO is my weakest geek skill.
Add non-AJAX versions of every page, and link to them from your popups as "permalinks" (or whatever you want to call them). Not only are your pages unavailable to search engines, they also can't be bookmarked or emailed to friends. I recently worked with some designers on a site and talked them out of using an AJAX-only design. They ended up putting article "teasers" in popups and making users go to a page with a bookmarkable URL to read the complete texts.
As difficult as it may be, the "best" answer may be to re-architect your site to use the hash-based URL scheme more sparingly.
Short of that, I'd suggest the following:
Create an alternative, non-hash based URL scheme. This is a must.
Create a sitemap that allows search engines to find your existing pages through the new URL scheme.
Slowly port your site over. You might consider adding these deeper links on the page, or encouraging users to share those links instead of the hash-based ones, etc.
Hope this helps!
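As a rough illustration of the first two suggestions, assuming a made-up /post/... scheme for the non-hash URLs:

    <!-- Hypothetical example: each post that is loaded via AJAX also exists at a
         plain, non-hash URL that crawlers, bookmarks and emails can reach.
         AJAX view:      forum.php#1_This-is-the-first-post   (not indexable)
         Permalink view: /post/1-this-is-the-first-post        (indexable)      -->

    <!-- Inside the AJAX-loaded post, link to the standalone version: -->
    <a href="/post/1-this-is-the-first-post" rel="bookmark">Permalink</a>

    <!-- The sitemap you submit to search engines should list these /post/... URLs. -->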

How to force a page to be removed from the search engine index?

Situation: Google has indexed a page in a forum. The thread has now been deleted. How (and whether) can I make Google and other search engines delete the cached copy? I doubt they would have anything against that, since the linked page does not exist anymore, and keeping the index updated and valid should be in their best interests.
Is this possible or do I have to wait months for an index update? Or will the page now stay there forever?
I am not the owner of the site in question, so I can't change robots.txt, for example. I would like to force the update as a "third party".
I also noticed that a new page I created on that site two days ago is already in the cache. Given that, can I estimate how long it will take for an invalid page on this domain to be dropped?
EDIT: So I did the test. It took Google just under 2 months to drop the page. Quite a long time...
It's damn near impossible to get it removed; however, replacing the page with entirely blank content will ensure that you nuke the ranking of the page when it is re-spidered.
You can't really make Google delete anything, except perhaps in extreme circumstances. You can adjust your robots.txt file to promote a revisit interval that might update things sooner, but if it is a low-traffic site, you might not get a revisit very soon.
EDIT:
Since you are not the site owner, you can modify the meta tags on the page with "revisit-after" tags, as discussed here.
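For clarity, the tag being referred to is conventionally written as below; it is only a hint, not all crawlers honour it, and only whoever controls the page can add it.

    <!-- Conventional form of the "revisit-after" hint; support varies by search engine. -->
    <meta name="revisit-after" content="7 days" />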
You can't make search engines remove the link, but don't worry; the link will be removed soon since it is no longer active. You need not wait months for this to happen.
If your site is registered with Google Webmaster Tools, you can request that pages be removed from the index. It works; I tried and used it in the past.
EDIT: Since you are not the owner, I am afraid that this solution would not work.
