I am developing a single page application using hashbangs (#!).
So the application URLs are in the format #!/api/generalelections.
When I enter such a URL directly in the browser, MVC does not seem to recognize the hashbang; it always takes me to the default action configured in Global.asax.
Could anyone suggest how to handle this so the request is routed to the proper action, which is api/generalelections?
Everything after the # is treated by the browser as a fragment identifier (an in-page anchor) and is never sent to the server.
In your XHR requests to the server you should use 'normal' URIs, not your hashbang navigation ones.
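As a rough sketch of that split (the content element and the hashchange handler here are assumptions; only the api/generalelections route comes from the question): the hashbang drives client-side navigation, while the XHR uses the real server route.

    // Client-side routing: react to the hashbang, but call the real MVC route via XHR.
    window.addEventListener("hashchange", function () {
      var route = window.location.hash.replace(/^#!\//, ""); // e.g. "api/generalelections"

      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/" + route); // a 'normal' URI -- the server never sees the "#!" part
      xhr.onload = function () {
        // "content" is a hypothetical container element for the rendered view
        document.getElementById("content").innerHTML = xhr.responseText;
      };
      xhr.send();
    });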
Related
I have tried to set my site up (http://www.diablo3values.com) according to the guidelines set out here: https://developers.google.com/webmasters/ajax-crawling/. However, it appears that Google has updated its index (because I see the revisions to the meta description tags) but the AJAX content does not show up in the index.
I am trying to use the “Handle pages without hash fragments” option.
If you view either of the following:
http://www.diablo3values.com/?_escaped_fragment_=
http://www.diablo3values.com/about?_escaped_fragment_=
you will correctly see the HTML snapshot with my content. (Those are the two pages I am most concerned about.)
Any ideas? Am I doing something wrong? How do you get Google to correctly recognize the tag?
I'm typing this as an answer, since it got a little too long to be a comment.
First of all, your links seem to point to localhost:8080/about, not /about, which is probably why Google doesn't index it in the first place.
Second, here's my experience with pushState URLs and Google's AJAX crawling:
My experience is that AJAX crawling with pushState URLs is handled a little differently by Google than with hashbang URLs. Since Google won't know that your URL is a pushState URL (it looks just like a regular URL), you need to add <meta name="fragment" content="!"> to all your pages, not only the "root" page. And Google doesn't seem to know that the pages are part of the same application, so it treats every page as a separate AJAX application.
So the Google bot will never actually build a navigation structure inside _escaped_fragment_, like _escaped_fragment_=/about, as it would with a hashbang URL (#!/about). Instead, it will request /about?_escaped_fragment_= (which you apparently already have set up). This goes for all your "deep links": instead of /?_escaped_fragment_=/thelink, Google will always request /thelink?_escaped_fragment_=.
But as said initially, the reason it doesn't work for you is probably that you have localhost:8080 URLs in the HTML generated for _escaped_fragment_.
Googlebot only knows to crawl the escaped fragment if your URLs conform to the hashbang convention. As users navigate your site, your URLs need to be:
http://www.diablo3values.com/
http://www.diablo3values.com/#!contact
http://www.diablo3values.com/#!about
Googlebot actually needs to see these URLs in the source code so that it can follow them. Then it knows to download the following URLs:
http://www.diablo3values.com/?_escaped_fragment_=contact
http://www.diablo3values.com/?_escaped_fragment_=about
On your site you appear to be loading a new page on each click, and then loading the content of each page via AJAX too. That is not how I would expect an AJAX site to work. Usually the purpose of using AJAX is that the user never has to load a whole new page: when the user clicks, the new content section is loaded and inserted into the current page. You serve the navigation once, and after that you only serve escaped fragments of the content.
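As a rough sketch of that pattern (the #! routes and the /fragments/ path below are assumptions, not the site's actual structure): clicking a navigation link only swaps out the content area, while the address changes to a #! URL that Googlebot can map onto _escaped_fragment_ requests.

    // Hashbang navigation: the shell page is loaded once, only the content area changes.
    // "/fragments/<page>.html" is a hypothetical endpoint that returns just the content HTML.
    function loadPage(page) {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/fragments/" + page + ".html");
      xhr.onload = function () {
        document.getElementById("content").innerHTML = xhr.responseText;
      };
      xhr.send();
    }

    // Links point at #!about, #!contact, etc. Googlebot sees them in the source and
    // requests /?_escaped_fragment_=about, /?_escaped_fragment_=contact instead.
    window.addEventListener("hashchange", function () {
      loadPage(window.location.hash.replace(/^#!/, "") || "home");
    });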
I'm trying to figure out the right configuration for a cross-origin AJAX call from a Safari extension injected script.
My configuration in Extension Builder:
Extension Website Access: All
Include Secure Pages: true
Whitelist: -
Blacklist: -
My goal is to get something like Chrome's "permissions": "http://mysite.com/*" and be able to pull user configuration from a web service.
Note: JSONP produces a warning, so I would prefer to avoid it.
Any luck with this? I'm having the same problem. The same setup works fine in the Chrome extension, but I'm hitting Access-Control-Allow-Origin errors when trying to do it in a Safari extension.
FIXED - UPDATE:
Hey, I figured out what the problem is: you need to do the cross-domain AJAX via the background page. What I ended up doing is determining all the requests I need to make in the injected script, then passing the requests as messages to the background page. The background page listens for messages from the injected script, makes the appropriate AJAX calls, and then sends the results back to the injected script via a message. The injected script in turn listens for messages from the background page; once it gets the message(s) with the AJAX results, it takes the appropriate action in the page being viewed.
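A minimal sketch of that flow using the (legacy) Safari extension messaging API; the message names, the config URL, and the applyUserConfig helper are placeholders, not the answerer's actual code:

    // --- injected script ---
    // Ask the global (background) page to make the cross-domain request on our behalf.
    safari.self.tab.dispatchMessage("fetchConfig", { url: "http://mysite.com/config.json" });

    // Receive the result once the global page has fetched it.
    safari.self.addEventListener("message", function (event) {
      if (event.name === "configResult") {
        applyUserConfig(event.message); // hypothetical function that uses the config in the page
      }
    }, false);

    // --- global (background) page ---
    // The global page is not subject to the injected script's cross-origin restriction.
    safari.application.addEventListener("message", function (event) {
      if (event.name !== "fetchConfig") return;

      var xhr = new XMLHttpRequest();
      xhr.open("GET", event.message.url);
      xhr.onload = function () {
        // Send the response back to the tab that asked for it.
        event.target.page.dispatchMessage("configResult", xhr.responseText);
      };
      xhr.send();
    }, false);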
If you click on a result in Google Instant, the referer sent by your browser to the destination website contains a bunch of parameters, including the all-important q=[autocompleted query].
But you're coming from a page whose URL is simply http://www.google.com/ with a bunch of stuff after the # character, i.e. as an on-page anchor.
So the browser appears to be sending a URL as the referer which is different from the URL of the page that you were viewing when you clicked.
There doesn't seem to be an additional redirection, so how on earth do they do that?
Most of the time, a Google search result actually sends you to a Google redirect page rather than directly to the target page. They use JavaScript to switch the target of the link onmousedown as you click on it.
You can see this effect by click-and-holding on the search result link and watching your status bar.
This isn't specific to Google Instant, they've been doing it for quite a long time on their standard results pages.
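A hedged illustration of the technique (the /url?q= shape below is only for the example, not necessarily Google's exact endpoint): the visible href stays clean, but a mousedown handler rewrites it just before the navigation happens.

    // The link appears to point straight at the result:
    //   <a id="result" href="http://example.com/">Example result</a>
    var link = document.getElementById("result");

    // Just before the click completes, the href is swapped to a redirect/tracking URL,
    // so the click actually goes through that redirect page first.
    link.addEventListener("mousedown", function () {
      link.href = "/url?q=" + encodeURIComponent("http://example.com/");
    });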
The fragment (anchor) part of the URL can be manipulated client-side without a new request to the server. Even with static anchor links (e.g. a link to #section-foo), clicking on them does not cause a new request to be sent to the server; it is processed completely within the browser.
The JavaScript behind Google Instant is simply altering the fragment programmatically before making a request to the server.
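A quick sketch of why that works (nothing here is Google's actual code): changing the fragment updates the address bar and history without any request leaving the browser, while the data is fetched separately over XHR.

    // Updating the fragment changes the address bar, but no request is sent to the server.
    window.location.hash = "#q=" + encodeURIComponent("something to search");

    // The actual results are fetched separately over XHR (the endpoint below is illustrative).
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/search?q=" + encodeURIComponent("something to search"));
    xhr.onload = function () {
      document.getElementById("results").innerHTML = xhr.responseText;
    };
    xhr.send();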
Which Google are you using?
My URL after searching is this:
http://www.google.es/#sclient=psy&hl=es&q=something+to+search&aq=f&aqi=g4g-o1&aql=&oq=&gs_rfai=&pbx=1&fp=b0....
It does include the q= part.
I have a page that gathers environment status from a couple of IBM WebSphere servers using iframes similar to this:
<iframe src="http://server:9060/ibm/console/status?text=true&type=server&node=NODE&name=ServerName_server_NODE"></iframe>
and it happily prints out "Started" or "Unavailable" etc. But if I load the same URL in a normal browser, sometimes it works and sometimes it does not: some of the servers show a login page, while others simply return HTTP code 500.
So what's the difference between loading the page through an iframe versus directly in the browser?
I can tell you that the iframe solution works no matter which machine I run it on, so I do not believe it has anything to do with the user who is opening the page. And before you ask why I don't just keep the solution that works: it takes a long time to open the page with the iframes compared to a page where everything is requested through AJAX.
Update: Using jQuery to perform the AJAX call returns "error" and "undefined" for the servers that I can't see in a normal browser.
One difference is that an iframe has to render the view, while XHR does not.
An iframe is essentially the same as opening the URL with the browser: in both cases the browser's credentials are used, so there will be no difference between the two.
Secondly, loading something in an iframe should take about the same time as requesting it through XHR, since in both cases the browser makes an HTTP request and waits for the response. The iframe does take extra time to render the content onto the page, but if you plan on displaying the result with AJAX anyway, an iframe and an XHR solution end up more or less the same.
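For comparison, a minimal jQuery sketch of the XHR equivalent of the iframe in the question (the #serverStatus element is an assumption); note that, unlike the iframe, this call is subject to the same-origin policy mentioned in the next answer.

    // XHR equivalent of the status iframe: same URL and same browser credentials,
    // but no document rendering -- and it is subject to the same-origin policy.
    $.ajax({
      url: "http://server:9060/ibm/console/status?text=true&type=server&node=NODE&name=ServerName_server_NODE",
      success: function (status) {
        $("#serverStatus").text(status); // e.g. "Started" or "Unavailable"
      },
      error: function (xhr, textStatus) {
        $("#serverStatus").text("error: " + textStatus); // what the questioner is seeing
      }
    });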
In the case of an AJAX request, the same-origin policy (which restricts cross-domain calls) comes into the picture, so you can't make a cross-domain call using XHR. One alternative is to embed a Flex SWF file in your page (as an ActiveX control in IE), call into Flex from JavaScript, and let Flex make the cross-domain call (which it can do if the target domain allows it via crossdomain.xml) and render the result back through JavaScript.
Greetings,
Here's the problem I'm having. I have a page which redirects directly to another page the first time it is visited. If the user clicks 'back', though, the page behaves differently and instead displays content (tracking session IDs to make sure this is the second time the page has been loaded). To do this, I tell the user's browser to disable caching for the relevant page.
This works well in IE7, but Firefox 3 won't let me click 'back' to a page that resulted in a redirect. I assume it does this to prevent the typical back-->redirect again loop that frustrates so many users. Any ideas for how I may override this behavior?
Alexey
EDIT: The page which we redirect to is an external site over which we have no control. Server-side redirects won't work because they wouldn't create a 'back' entry in the browser's history.
To quote:
Some people in the thread are talking about server-side redirects and redirect headers (the same thing)... keep in mind that we need client-side redirection, which can be done in two ways:
a) A META refresh header - not recommended, and it has some problems
b) JavaScript, which can be done in at least three ways ("location", "location.href" and "location.replace()") - see the sketch after this quote
The server-side redirect won't and shouldn't activate the back button, and can't display the typical "You'll be redirected now" page... so it's no good (it's what we're doing at the moment, actually, where you're immediately redirected to the "lucky" page).
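For reference, a small sketch of the three client-side JavaScript variants mentioned in option (b) above; the target URL is just a placeholder:

    var target = "http://example.com/landing"; // placeholder destination

    // (1) and (2): both navigate and leave the redirecting page in the session history,
    // so the user can click "back" to it (which is what triggers this question's issue).
    window.location = target;
    // window.location.href = target;

    // (3): replace() navigates without adding a history entry -- the redirecting page is
    // replaced in the history, so "back" skips over it entirely.
    // window.location.replace(target);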
I think the Mozilla team takes a step in the right direction by breaking this particularly annoying pattern. Finding a way around it somewhat defeats the purpose, doesn't it?
Instead of redirecting on first encounter, you could simply make your page render differently when a user hits it the first time. Should be easy enough on the server side, since you already have the code that is able to make that distinction.
You can get around this by creating an iframe and saving the state of the page in a form field inside the iframe before doing the redirect. All browsers preserve the form fields of an iframe.
This page has a really good description of how to get it working. It is the same technique Google Maps uses when you click on map search results.
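Roughly, that technique looks like the following sketch (the hidden iframe, its state.html document and the field ids are assumptions, not the linked page's exact code):

    // Assumes a hidden iframe in the page whose form field survives back/forward navigation:
    //   <iframe id="stateFrame" src="state.html" style="display:none"></iframe>
    // and state.html contains:  <input type="text" id="state">

    function saveStateAndRedirect(state, url) {
      var frameDoc = document.getElementById("stateFrame").contentWindow.document;
      frameDoc.getElementById("state").value = JSON.stringify(state); // persisted by the browser
      window.location.href = url;
    }

    // When the user comes back, read the value to detect that this is the second visit.
    function restoreState() {
      var frameDoc = document.getElementById("stateFrame").contentWindow.document;
      var saved = frameDoc.getElementById("state").value;
      return saved ? JSON.parse(saved) : null;
    }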
I'm strongly in favor of the Firefox behaviour.
The most basic way to redirect is to let the server send HTTP status code 302 + Location header back to the client. This way the client (typically a browser) will not place the request URI into its history, but just resend the same request to the advocated URI.
Now it seems that Firefox has started to apply that behaviour also to pages that attempt redirection on the client, e.g. via JavaScript in the onload event.
If you want the browser not to display a page, I think the best solution is if the server does not send the page in the first place.
It's possibly an aid to eliminate repeated actions.
A common way people do things is:
page 1 -> [Action] -> page 2 -> redirect to page 2 without the action parameters.
Now if you were permitted to click the back button in this situation and visit the page without the redirect, the action would be blindly re-performed.
Instead, Firefox presumes the server sent a redirect header for a good reason.
Note, however, that content can still be delivered after the redirect header: sending a redirect header (at least in PHP) doesn't terminate execution, so in theory, if you were to ignore the redirect request, you would see the page doing weird things.
(I circumvent this by routing all our redirects through the same function call, which calls an explicit terminate directly after sending the redirect, because people writing code assume that is how it behaves.)
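The point above is about PHP, but the wrap-and-terminate idea is the same in any server framework. A hypothetical Express-style sketch of such a central redirect helper:

    const express = require("express");
    const app = express();

    // Central redirect helper: every redirect goes through here, and callers return
    // immediately afterwards so no further response code runs for this request.
    function redirectAndStop(res, url) {
      res.redirect(302, url);
    }

    app.post("/action", function (req, res) {
      // ... perform the action (update the database, etc.) ...
      return redirectAndStop(res, "/page2"); // explicit return: the redirect alone
    });                                      // would not stop the handler

    app.listen(3000);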
In the URL bar of Firefox, type about:config
and change this setting:
browser.sessionstore.postdata
from 0 to 1.