Facebook comments linking only to base url, sub-pages not working - comments

I've looked and looked and oddly cannot find a solution or direct discussion of what seems like a very common scenario. I have developed a site with multiple pages and multiple articles per page. I want users to be able to post a Facebook comment on an individual article. The URL was displaying correctly for each article, but the app_id was not registered correctly. I fixed that, and now when a person posts a comment, Facebook uses the base URL only and picks a random article picture. Why doesn't it simply register the URL I provided?
Here is the code...
<div id="fb-root"></div><script src="http://connect.facebook.net/en_US/all.js#xfbml=1"></script><fb:comments href="http://www.AmericanWomenMedia.com/fi/<%= Session["PageName"] %>?id=<%= Session["CurrentArticleID"] %>" num_posts="3" width="500"></fb:comments>
Any help would be greatly appreciated.
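One thing worth checking (not confirmed in this thread) is whether each article page declares its own Open Graph URL: when Facebook scrapes the page behind the href, it treats whatever og:url it finds as the canonical object, so a site-wide og:url pointing at the base URL will collapse every article's comments back to it. A minimal sketch of per-article head tags, with placeholder values standing in for the session-driven URL and app id:

<!-- Hypothetical <head> markup for one article page; the app id, query string and image are placeholders. -->
<meta property="fb:app_id" content="YOUR_APP_ID" />
<!-- og:url is what Facebook treats as the canonical URL when it scrapes the page. -->
<meta property="og:url" content="http://www.AmericanWomenMedia.com/fi/SomePage?id=123" />
<meta property="og:title" content="Article title" />
<meta property="og:image" content="http://www.AmericanWomenMedia.com/images/article-123.jpg" />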

Related

Custom URL for Component Search Results

I am writing to seek help displaying custom search results at a SEF URL on the Joomla CMS.
Example: This is a page with a customized search, https://jobwalkins.in/search.html?search=IT&exf_5=1&exf_4=-1&option=com_jomclassifieds&view=search&Itemid=147
I would like to display this link as https://jobwalkins.in/today-walkins-in-hyderabad.html
I am using https://extensions.joomla.org/extension/jom-classifieds/ as the extension.
Any helpful inputs will be greatly appreciated. I am looking forward to hearing from you soon.
Best Regards,
Syed H
I was able to get the desired output using https://extensions.joomla.org/extension/sh404sef/. The website in question https://jobwalkins.in/ now shows the predefined search results in custom URLs.
Here are a few of them which I was able to achieve:
https://jobwalkins.in/jobs-in-bangalore.html where the actual link was https://jobwalkins.in/search.html?search=&exf_5=2&exf_4=-1&option=com_jomclassifieds&view=search&Itemid=147
https://jobwalkins.in/today-walkins-in-hyderabad.html where the actual link was https://jobwalkins.in/search.html?search=&exf_5=1&exf_4=-1&option=com_jomclassifieds&view=search&Itemid=147
It even works for links where a keyword is searched. For example, I searched for the keyword "fresher" and set the page to render at the custom URL https://jobwalkins.in/fresher-jobs.html, where the actual link was https://jobwalkins.in/search.html?search=fresher&exf_5=-1&exf_4=-1&option=com_jomclassifieds&view=search&Itemid=147
The sh404SEF https://extensions.joomla.org/extension/sh404sef/ worked great and helped me address my concern very well.
Hope this post is useful for someone who may have a similar issue.

SEO with AngularJS and an ASP.NET RESTful service

I have developed a website using AngularJS and Web API.
The problem is that the AJAX-rendered content is not crawlable by Google, and no one can find the website using Google search.
After reading many articles regarding this issue, including one with all the explanatory links, the Google AJAX crawling protocol, and also a Stack Overflow question, I couldn't find a proper solution. Those that mention ASP.NET solutions talk about MVC, and I need only simple REST via Web API; other articles don't cover ASP.NET at all.
Is there any simple explanation?
I'm the one who asked this same question long ago, so I will answer from my experience:
Firstly, if all your content is accessible via unique URIs (including the hashbang if you use it), modern search engines should index it just fine. In fact, Google can now index JavaScript-generated content. You can try that via Google Webmaster Tools and see how your site is indexed.
Secondly, there are libraries that help you serve pre-rendered content to search engines if you need to, but in my case I didn't bother much with it, since Google is indexing the JS content nicely.
I've seen others ask this question, and maybe I'm missing something or this is outdated, but I don't see why AngularJS needs to be an issue with SEO.
Say you have a landing page and it has a bunch of links. Assuming you're using HTML5 mode in AngularJS (and I'm not sure that's 100% necessary) and something like ngRoute, the links on the landing page can work both as "angular" (JavaScript) links and as "old school" (full page load) links.
If you're a human user you can click a link and it will do angular magic and adjust the content without loading the full page. Ok, all fine.
But if you instead copy the link and paste it in a new tab or new browser, it will still work - assuming you've set up routes correctly.
I'm not an SEO expert by any stretch of the imagination, but as I understand it, having links that load pages with real and useful content is the core of SEO, and done this way, AngularJS should work fine. The key thing to check is that a link still works when you copy and paste it (not just click it).
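A minimal sketch of that setup, assuming ngRoute and HTML5 mode (the module name, routes, and templates below are made up for illustration):

<!-- Hypothetical single-page setup; module, routes, and templates are placeholders. -->
<!DOCTYPE html>
<html ng-app="demoApp">
<head>
  <!-- A base href is required when html5Mode is enabled. -->
  <base href="/">
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.8.2/angular.min.js"></script>
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.8.2/angular-route.min.js"></script>
  <script>
    angular.module('demoApp', ['ngRoute'])
      .config(['$routeProvider', '$locationProvider', function ($routeProvider, $locationProvider) {
        $routeProvider
          .when('/', { template: '<h1>Landing page</h1>' })
          .when('/about', { template: '<h1>About us</h1>' })
          .otherwise({ redirectTo: '/' });
        // "Pretty" URLs without the hashbang; the server must also be configured
        // to return this page for /about so a pasted link still loads.
        $locationProvider.html5Mode(true);
      }]);
  </script>
</head>
<body>
  <a href="/about">About</a>
  <div ng-view></div>
</body>
</html>

If a user clicks the About link, Angular swaps the view without a full page load; if the same URL is pasted into a new tab, the server serves the page and Angular restores the same view.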

One-page AJAX-based WordPress site. How should I do it?

I am trying to create a one-page WordPress website, something like the ones you sometimes see in ThemeForest's WP section: the whole website is a long page that has everything in one place, from about us, to portfolio, to some blog posts, to contacts.
Placing everything on one page is not difficult. But when I started thinking about how to present individual posts and pages, I realised that I probably need a general way of getting post data via AJAX and creating new blocks with JS. How should I go about this? I suppose this has been done before, but I struggle to find anything this specific in the Codex or a tutorial with best practices.
Any advice or link will be greatly appreciated.
You could use a plugin such as jQuery EasyTabs (download it here), which has a built-in Ajax component.
I've found that the easiest way is to just get all content to load into the divs ahead of time, vs. trying to load all pages through Ajax. However, appending something like '?ajax/ajax' to the end of your urls through the Easytabs plugin is one option that I have successfully used in the past.
If you decide to use the easytabs functionality, there is ample documentation on the page that I linked to.
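If you'd rather pull the post data yourself, here is a rough sketch of fetching posts as JSON and appending blocks client-side, assuming WordPress 4.7+ with the built-in REST API (the container id and query are placeholders):

<!-- Hypothetical client-side loader; the container id and query parameters are placeholders. -->
<div id="latest-posts"></div>
<script>
  // The core REST API exposes posts at /wp-json/wp/v2/posts (WordPress 4.7+).
  fetch('/wp-json/wp/v2/posts?per_page=3')
    .then(function (response) { return response.json(); })
    .then(function (posts) {
      var container = document.getElementById('latest-posts');
      posts.forEach(function (post) {
        var block = document.createElement('article');
        // title.rendered and excerpt.rendered are HTML strings returned by the API.
        block.innerHTML = '<h2>' + post.title.rendered + '</h2>' + post.excerpt.rendered;
        container.appendChild(block);
      });
    });
</script>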

Displaying correct Joomla website information when posting a website link

I was hoping someone could help me fix an issue. When someone posts a link to my Joomla-created website, they get the heading "Whats New?", which is my default article page for the site (the current blog articles).
For example, if someone posted my link on facebook, it would look like this:
Whats New?
MyDomain.com
Description of website goes here...
Everything looks great except for the "Whats New?". Is there a way to put my website's name instead of the name of the default page? How about showing an image? When posted on Facebook, there is just text and no image.
Thanks, any help would be greatly appreciated
Facebook uses Open Graph data to build those posts. If Facebook isn't offered Open Graph data, it will use its own methods to try to find the information it needs, sometimes with useless results. There are a lot of options to fix this. The Joomla extensions directory has a few Open Graph extensions for you to install, and some of those should work fine. You can always write something yourself or add the data to your template. But don't expect results right away, because Facebook caches those media objects for some time.
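If you go the template route, the tags themselves are just meta elements in the page head; a rough sketch, with all values as placeholders drawn from the example above:

<!-- Hypothetical head markup for a Joomla template; every value is a placeholder. -->
<meta property="og:title" content="My Webpage Name" />
<meta property="og:description" content="Description of website goes here..." />
<meta property="og:image" content="https://MyDomain.com/images/share-image.jpg" />
<meta property="og:url" content="https://MyDomain.com/" />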
Open graph: https://developers.facebook.com/docs/opengraph/
Joomla Extensions: http://extensions.joomla.org/extensions/site-management/seo-a-metadata/open-graph
There are more ways to fix this, but this is probably the easiest for you. Hope it helps.
Good Luck.
In the Joomla backend, do the following:
Open the menu item that the article is assigned to.
On the right-hand side, open the Page Display Options panel.
Add whatever you like to the Browser Page Title parameter.
Hope this helps

How does Facebook grab the text of the article when pasting the url?

I'm a bit curious about this useful Facebook functionality. When I paste a URL in the 'What's on your mind?' box, it almost perfectly gets the body of the article. How does Facebook do this?
Thanks!
It's part of how Facebook Share works.
The URL Linter is pretty helpful as well. For example, if we test it with this very question, you can scroll down and see where it's getting the data from
"Hello, Im a bit curious about this
Facebook's useful functionality. When
I paste a URL on the 'What's on your
mind?' box, it almost perfectly gets
the body of the article. How does
Facebook do this?" extracted from
<description> or first <p>
I can't speak for Facebook specifically, but there are entire companies dedicated to providing that kind of service. For example, Reddit recently outsourced preview generation to a 3rd party.
So, essentially, there's a certain amount of automation and a large amount of manual tweaking and configuration.
You might also look at the Readability tool, which extracts the main content of a web page - that might provide some insight into the processes involved.
You can put your own entries into the shared content by using the Open Graph protocol described on the Facebook developer website.
It basically goes to the page and sniffs for IDs in the HTML marked as "content" or "main", and probably a few other common terms people use when building a site to mark where things like menus, the main body, sidebars, and the main article are placed in the page (whether it's pulled in dynamically or not).
For example, look at the source of this page itself. You'll see an area that begins div id="content"
Bingo. That's where the Facebook sniffer begins. It then probably grabs the first picture it finds within that area, along with the first bit of text.
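As a toy illustration only (this is not Facebook's actual code, just the kind of heuristic described above: prefer Open Graph tags, otherwise fall back to the first paragraph and image inside a #content area), a page could be mined for a preview roughly like this:

<script>
  // Toy heuristic; the selectors and fallback order are illustrative, not Facebook's.
  function buildPreview(doc) {
    function og(name) {
      var tag = doc.querySelector('meta[property="og:' + name + '"]');
      return tag ? tag.getAttribute('content') : null;
    }
    var content = doc.querySelector('#content') || doc.body;
    var firstParagraph = content.querySelector('p');
    var firstImage = content.querySelector('img');
    return {
      title: og('title') || doc.title,
      description: og('description') || (firstParagraph ? firstParagraph.textContent.trim() : ''),
      image: og('image') || (firstImage ? firstImage.src : null)
    };
  }
  // Example: build a preview object for the current page.
  console.log(buildPreview(document));
</script>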
