Facebook Instant Article - multiple URLs to the same article - URL parameters

I have a rel=canonical link in my Facebook Instant Article pointing to
www.domain.com?p=news&newsid=10
but when someone accesses one of the links below, which carry an extra ref parameter, Facebook fails to match the Instant Article and falls back to the regular web page, because the URL is different:
www.domain.com?p=news&newsid=10&ref=1
www.domain.com?p=news&newsid=10&ref=2
www.domain.com?p=news&newsid=10&ref=3
Any idea how I can serve one article for multiple links?

From the Instant Articles team here.
The key the Instant Articles fetcher uses is the canonical URL field.
In your case, you should have all of those links render the same article, and every response should declare the following URL as its canonical:
www.domain.com?p=news&newsid=10
This way the same Instant Article is rendered for any ref=* query parameter that anyone shares in the Facebook news feed.
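Concretely, every ref=* variant of the article page would be served with the same canonical declared in the head of its Instant Article markup; a minimal sketch (illustrative markup, not your exact article):

<!doctype html>
<html lang="en" prefix="op: http://media.facebook.com/op#">
<head>
  <meta charset="utf-8"/>
  <!-- Same canonical URL, no matter which ref=* variant was shared -->
  <link rel="canonical" href="http://www.domain.com?p=news&newsid=10"/>
  <meta property="op:markup_version" content="v1.0"/>
</head>
<body>
  <article>
    <!-- article content -->
  </article>
</body>
</html>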
You can also put these fields into op-tracker tags to better track your content, or even track the original URL that was shared. For better tracking, check the example below, extracted from the documentation:
<figure class="op-tracker">
<iframe>
<script>
// The URL the user shared
// (if there are no redirections, otherwise the final URL in the chain)
var urlSharedByUser = ia_document.shareURL;
// The article title
var title = ia_document.title;
// Referrer is always set to 'ia.facebook.com'
var referrer = ia_document.referrer;
</script>
</iframe>
</figure>

Related

Crawlable Ajax content. SEO-ing without hashbang. Is my way ok?

I'm going to build my application based on ajax, and my URLs are something like:
http://server.com/module/#function_name,param1,param2...etc
After reading some discussions about Google's suggestion, the hashbang (#!), it wasn't hard to realize that it is not the best solution. There are several reasons:
The URL is pretty ugly, anyway.
It would be terrible if someday Google (or another search engine) suggests a better solution than the hashbang: I'd have to keep my ugly hashbang URLs, or write some JS code to keep links to my pages alive.
HTML5 pushState will be popular someday.
For all the reasons above, I decided to do it my own way: my navigation links will look like this:
<a href="http://server.com/module/for-crawler/function-name/param1/param2/...">
Some text </a>
And some jQuery code will make it load Ajax content instead of changing the page like a normal link:
$(function(){
    $('a').live('click', function(e){
        var realURL = translateURL( $(this).attr('href') );
        loadContent( realURL );
        e.preventDefault();
        return false;
    });
});
/* -- the function translateURL will turn url like :
..... http://server.com/module/for-crawler/function-name/param1/param2/...
Into:
..... http://server.com/module/#function-name/param1/param2/...
That's the real url I think all ajaxers are used to dealing with
*/
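The question doesn't show translateURL itself; a minimal sketch of what it could look like, assuming the /for-crawler/ prefix sits exactly where the example URLs above put it:

function translateURL(url) {
    // http://server.com/module/for-crawler/function-name/param1/param2/...
    //   becomes
    // http://server.com/module/#function-name/param1/param2/...
    return url.replace('/for-crawler/', '/#');
}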
When a crawler reads my page, it will follow the URL in the "href" attribute, and I will serve it a static, non-JS version of my page just for Google to read. After some days my page gets indexed, and users will see it in Google's results like this:
http://server.com/module/for-crawler/function-name/param1/param2/...
I'm then going to use JS again to redirect the user to my normal Ajax version, I mean, to the real URL:
http://server.com/module/#function-name/param1/param2/...
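A minimal sketch of that redirect, reusing the hypothetical translateURL helper sketched earlier: when a JS-capable visitor lands on a crawler URL, bounce them to the hash version.

// Runs on the static /for-crawler/ pages only.
if (window.location.pathname.indexOf('/for-crawler/') !== -1) {
    window.location.replace(translateURL(window.location.href));
}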
That's the best approach I can think of at this time. Please give me advice: should I do it that way, or can I do better? Thanks, guys!
Depending on your audience, I would suggest using HTML5 pushState anyway.
If the client doesn't support HTML5 pushState, simply let it use the same version of your app that the crawlers do. In my opinion, a page reload is not as bad as a hashed URL. Since users share URLs, your hashed URLs get exposed to other users, and such a URL wouldn't work for, say, Facebook's link-sharing previews or any other client that doesn't support JavaScript.
Instead, I would use the crawler-friendly app only in combination with HTML5 pushState. With pushState you always expose a single URL, independent of your client's JavaScript support.
First, detect whether pushState is supported:
function supports_history_api() {
    return !!(window.history && history.pushState);
}
Then your click-handler would look something like this:
$('a').live('click', function(e){
    var url = $(this).attr('href');
    e.preventDefault();
    loadContent( url );
    history.pushState({"url": url}, $(this).attr('title'), url);
    return false;
});
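One piece the answer doesn't show: with pushState you also want to handle the back and forward buttons, otherwise the URL changes but the content doesn't. A small sketch, using the same hypothetical loadContent as above:

// Restore content when the user navigates with back/forward.
window.onpopstate = function(e){
    if (e.state && e.state.url) {
        loadContent(e.state.url);
    }
};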

ASP.NET MVC3 Html.Raw changing absolute URLs?

I'm using ASP.NET MVC 3 and the @Html.Raw helper in my view.
I'm passing it some HTML markup that I have stored in a database. The markup contains some URLs that point to other places on my site, for example http://www.foo.com/events. The data is forum posts, so the page they're displayed on has a URL of the form http://www.foo.com/forums/thread/42/slug.
However, when the page is rendered, the saved URLs are rendered in modified form as:
http://www.foo.com/forums/thread/42/events/
This only happens for URLs on my site. If the URL points to some external site, it is unchanged.
I have verified that what I'm passing into @Html.Raw is the correct URL (http://www.foo.com/events). Why is it getting changed as the page is rendered? Is there an easy way to disable this "feature"?
Here's my code for displaying the markup:
<div>
    @Html.Raw(post.Body)
</div>
and here's the controller code that generates the page data:
var post = _forumRepository.GetPostById(id);
var model = new ForumPostView()
{
    Body = post.Body,
    PostDate = post.DatePosted,
    PostedBy = post.Author,
    PostId = post.Id
};
return View(model);
I have verified via the debugger that the exact URL in post.Body before being passed to the view is of the form "http://www.foo.com/events" (no trailing slash). I have also verified via the debugger that the value is unchanged before it is passed into @Html.Raw.
It sounds like the URLs that point to other pages on your site are not absolute. Are you certain they start with a / or http? If not, it's behaving exactly as it's supposed to and treating them like relative URLs, and thus appending them to the current URL.
(Html.Raw will not manipulate the string, so it's not at fault here)
Also, it wouldn't hurt to show us your code.
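To illustrate the resolution behavior described above, here is a quick sketch using the browser's URL parser (a modern convenience, purely for demonstration), with the paths from the question:

// The forum post is displayed at this address:
var base = 'http://www.foo.com/forums/thread/42/slug';

// A relative href is resolved against the current directory...
new URL('events', base).href;   // "http://www.foo.com/forums/thread/42/events"

// ...while a root-relative (or absolute) href is not:
new URL('/events', base).href;  // "http://www.foo.com/events"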
No, in fact I am an idiot. The URLs were indeed stored in relative form without a leading /, which is why they ended up being relative to the current page. The text displayed was absolute, which is what I saw when I looked at the db. That's what I get for debugging on a few hours' sleep ;)

How can I paste text to a pastebin site through API's HTTP POST using a bookmarklet?

I want to paste some text to a pastebin site through the site's API.
As far as I have figured out (I have limited programming knowledge), I need two things: first, to grab the selected text, and then to post it via HTTP POST to the pastebin site.
I tried to do this...
javascript:'<body%20onload="document.forms[0].submit()"><form%20method="post"%20action="http://sprunge.us"><input%20type="hidden"%20name="sprunge"%20value="+ document.getSelection() +"></form>'
...which (you guessed it!) EVERY TIME returns a page in which the pasted text is literally "+ document.getSelection() +".
Any help?
You have to build a form programmatically, add the required fields, and then post it, all in JavaScript.
Here is an example -- you will at least have to change the URL and the fieldname:
<a href="
javascript:(function(){
var myform = document.createElement('form');
myform.method='post';
/* change this URL: */
myform.action='http://my-example-pastebin.com/submit.php';
/* The goodies go here: */
var myin=document.createElement('input');
/* Change the fieldname here: */
myin.setAttribute('name','fieldname_for_pasted_text');
myin.setAttribute('value',document.getSelection());
myform.appendChild(myin);
/* If you need another field for username etc: */
myin=document.createElement('input');
myin.setAttribute('name','some_field_1');
myin.setAttribute('value','some_field_value_1');
myform.appendChild(myin);
myform.submit();
})()
">Bookmarklet for posting selected text to an online pastebin</a>
The above, compacted without comments and linebreaks (this is the href of the bookmarklet link):
javascript:(function(){var myform=document.createElement('form');myform.method='post';myform.action='http://my-example-pastebin.com/submit.php';var myin=document.createElement('input');myin.setAttribute('name','fieldname_for_pasted_text');myin.setAttribute('value',document.getSelection());myform.appendChild(myin);myin=document.createElement('input');myin.setAttribute('name','some_field_1');myin.setAttribute('value','some_field_value_1');myform.appendChild(myin);document.body.appendChild(myform);myform.submit();})()
I'm not familiar with sprunge.us, but if you've got the URL and fieldname right in your example, you could get this to work by search-replace:
http://my-example-pastebin.com/submit.php → http://sprunge.us/
fieldname_for_pasted_text → sprunge
You should also remove the second field (some_field_1, some_field_value_1) included in my example.
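Putting those substitutions together, a sketch of the sprunge.us version (assuming, as the question does, that sprunge.us accepts a plain form POST with a single sprunge field):

javascript:(function(){
    var myform = document.createElement('form');
    myform.method = 'post';
    myform.action = 'http://sprunge.us/';
    var myin = document.createElement('input');
    myin.setAttribute('name', 'sprunge');
    myin.setAttribute('value', document.getSelection());
    myform.appendChild(myin);
    document.body.appendChild(myform);
    myform.submit();
})()

Before using it as a bookmarklet, this would be compacted onto one line, just like the example above.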

Is there a way to AJAX load a page and change URL in URL bar without hashing?

This is probably going to get a resounding no, but I am wondering whether it is possible to have the URL change dynamically without using hashing, and without invoking an HTTP request from the browser?
My client is keen on using AJAX for the main navigation. This is fine when the end user goes to the front page first, but when they use a deep link, it works, yet it forces an extra load: the page first loads the front page and then invokes the AJAX from the hash.
UPDATE: Given that what I want to avoid is the page reload (the reason is that it looks bad), could I prevent the reload by catching the hash with PHP before the headers are sent, and redirecting before the page loads? This way only one page loads, and the redirect is all but invisible to the user. Not sure how to do this, but it seems like it should be possible?
Yes, this is possible. I often do this to store state in the hash part of the URL. The result is that the page doesn't reload, but if the user does reload, they're taken to the right page.
Using this method, the URL will look like: "/index#page=home" or "/index#page=about"
You'll need to write a JavaScript function that handles navigation, and you'll need a containing div that gets rewritten with the contents fetched from AJAX.
<a href="javascript:link('home')">Home</a>
<a href="javascript:link('about')">About</a>
<a href="javascript:link('questions')">Questions</a>
<div id="content"></div>
<script type="text/javascript">
function link(page) {
location.hash = "page="+page;
loadPage(page);
}
// NOTE: This is using MooTools. Use the AJAX method in whatever
// JavaScript framework you're using.
function loadPage(page) {
new Request.HTML({
url: "/ajax/"+page+".html",
onSuccess: function(tree, elements, html) {
document.id('content').setProperty('html', html);
}
}).get();
}
</script>
Now, you'll also need to have something that checks the hash on page load to load the right content initially. Again, this is using MooTools, but use whatever onLoad method your JavaScript framework provides.
<script type="text/javascript">
document.addEvent('domready', function() {
parts = location.hash.split('=');
loadPage(parts[1]);
}
</script>
OK, the problem is that opening an AJAX link of the form http://example.com/#xyz results in a full page being downloaded to the browser, and then the AJAX-altered content is swapped in once the page has loaded and checked the hash part of its URL. The user has a disconcerting experience.
You can hugely improve this by making a page that just contains the static elements - menus, etc. - and a loading GIF in the content area. This page checks its URL upon loading and dynamically fetches the content specified by the hash part. The page can have any URL you want; we'll use http://example.com/a. Links to this page (http://example.com/a#xyz) now provide a good user experience for users with scripting enabled.
However, new users won't come to the site by fetching http://example.com/a; they'll fetch http://example.com. This is fine - serve the full page, including the home page content and links that don't require scripting to work (e.g., http://example.com/xyz). A script run on loading this page should alter the href of AJAXable links to their AJAX form (http://example.com/a#xyz); thus the first link a user clicks on will result in a full page load but subsequent ones won't.
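That rewriting step might look something like the sketch below; the ajaxable class used to mark the links and the /a entry page are assumptions for illustration, not something prescribed by this answer:

// For visitors with scripting: turn plain links like /xyz
// into their AJAX form /a#xyz.
document.addEventListener('DOMContentLoaded', function () {
    var links = document.querySelectorAll('a.ajaxable');
    for (var i = 0; i < links.length; i++) {
        var path = links[i].getAttribute('href');          // e.g. "/xyz"
        links[i].setAttribute('href', '/a#' + path.replace(/^\//, ''));
    }
});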
The only remaining problem is if a no-script user gets sent an AJAX link. You can add a noscript block to the AJAX page that contains a message explaining the problem and provides a link back to the homepage; you could include instructions on how to enable scripting, or even on how to modify the link by removing the a# and pressing Enter.
It's not a great answer, but you can offer a different link in the page itself; e.g., if the address bar shows /#xyz you include a link to /xyz somewhere in the page. You could also add a link or button that uses script to bookmark the page, which would again use the non-AJAX form of the link.

How to "bookmark" page or content fetched using AJAX?

How to "bookmark" page or content fetched using AJAX?
It looks like it could be easy if we just add the details to the "anchor" (the hash), and then use routing, whether in PHP code or Ruby on Rails's routes.rb, to catch that part and show the content or page accordingly (the whole page or partial content)?
Then it could be quite simple? It looks like that's how Facebook does it. What are other good ways to do it?
Update: There is now the HTML5 History API (pushState and the popstate event), which supersedes the HTML4 hashchange functionality. History.js provides cross-browser compatibility and an optional hashchange fallback for HTML4 browsers.
To store a page's history, the most popular and best supported way is using hash changes. This means that if you go from yoursite/page.html#page1 to yoursite/page.html#page2, you can track that change, and because we are using hashes it is picked up by bookmarks and the back and forward buttons.
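For reference, a minimal native version of that idea without any plugin (loadContentFor is a placeholder for whatever fetches and renders a given state):

// Fires whenever the part after '#' changes, including via the
// back/forward buttons and when a bookmarked URL is opened.
window.onhashchange = function () {
    var state = window.location.hash.replace(/^#/, '');
    loadContentFor(state);   // placeholder loader
};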
You can find a great way to bind to hash changes using the jQuery History project
http://www.balupton.com/projects/jquery-history
There is also a full featured AJAX extension for it, allowing you to easily integrate Ajax requests to your states/hashes to transform your website into a full featured Web 2.0 Application:
http://www.balupton.com/projects/jquery-ajaxy
They both provide great documentation on their demo pages to explain what is going on.
Here is an example of using jQuery History (as taken from the demo site):
// Bind a handler for ALL hash/state changes
$.History.bind(function(state){
    // Update the current element to indicate which state we are now on
    $current.text('Our current state is: [' + state + ']');
    // Update the page's title with our current state on the end
    document.title = document_title + ' | ' + state;
});
// Bind a handler for state: apricots
$.History.bind('/apricots', function(state){
    // Update Menu
    updateMenu(state);
    // Show apricots tab, hide the other tabs
    $tabs.hide();
    $apricots.stop(true,true).fadeIn(200);
});
And an example of jQuery Ajaxy (as taken from the demo site):
'page': {
    selector: '.ajaxy-page',
    matches: /^\/pages\/?/,
    request: function(){
        // Log what is happening
        window.console.debug('$.Ajaxy.configure.Controllers.page.request', [this, arguments]);
        // Adjust Menu
        $menu.children('.active').removeClass('active');
        // Hide Content
        $content.stop(true,true).fadeOut(400);
        // Return true
        return true;
    },
    response: function(){
        // Prepare
        var Ajaxy = $.Ajaxy; var data = this.State.Response.data; var state = this.state;
        // Log what is happening
        window.console.debug('$.Ajaxy.configure.Controllers.page.response', [this, arguments], data, state);
        // Adjust Menu
        $menu.children(':has(a[href*="'+state+'"])').addClass('active').siblings('.active').removeClass('active');
        // Show Content
        var Action = this;
        $content.html(data.content).fadeIn(400, function(){
            Action.documentReady($content);
        });
        // Return true
        return true;
    }
}
And if you ever want to get the querystring params (so yoursite/page.html#page1?a.b=1&a.c=2) you can just use:
$.History.bind(function(state){
    var params = state.queryStringToJSON(); // would give you back {a:{b:1,c:2}}
});
So check out those demo links to see them in action, and for all installation and usage details.
If you use jQuery, you can do it in a simple manner: just use the Ajaxify plugin. It can manage bookmarking of Ajax pages and many other things.
Check these; something may help you:
How to change URL from javascript: http://doet.habrahabr.ru/blog/15736/
How to pack the app state into url: http://habrahabr.ru/blogs/javascript/92505/
An approach description: http://habrahabr.ru/blogs/webstandards/92300/
Note: all articles are in Russian, so either Google Translate them, or just review the code and guess the details.
Take a look at the Single Page Interface Manifesto.
I tried many packages. The jQuery History plugin seems to be the most complete:
http://github.com/tkyk/jquery-history-plugin
