Script to auto-generate AJAX URLs for Google Analytics?

I inherited a site that's completely built using AJAX, so I'm using this code (the standard GA code) to detect page loads:
var _gaq=[["_setAccount","UA-#######-#"],["_trackPageview"]];
(function(d,t){var g=d.createElement(t),s=d.getElementsByTagName(t)[0];g.async=1;
g.src=("https:"==location.protocol?"//ssl":"//www")+".google-analytics.com/ga.js";
s.parentNode.insertBefore(g,s)}(document,"script"));
and the _gaq.push call:
_gaq.push(['_trackPageview', '/#pagename']);
Except this is a big site with over 30 pages, and I don't want to have to write one in for every page. Is there a script that can find the paths/page names and fill them in for me?
I don't know AJAX OR JavaScript, so apologies if this is not worded correctly.

You can use _gaq.push; however, because an AJAX request can mean so many things (sometimes a page load, sometimes a postback of data, and sometimes just a ping to make sure the user is still alive), you wouldn't want to track every XMLHttpRequest.
Depending on what language your site is written in, I would strongly suggest making use of some form of partial page component (PHP include, ASP.NET MVC partial, server-side include, etc.), which would stop you rewriting the code a million times if you change it.
However, you will still need to edit your 30-odd pages if they are hard-coded, or just one master page if it's a sensible design.
Check out this link for a sample of how to track AJAX requests using the push method.
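As a rough illustration (not the linked sample), here's a minimal sketch assuming the site uses hash-based navigation (the #pagename from the question): it reports a pageview on every hashchange, so you don't have to hand-write a _trackPageview call for all 30+ pages.
// Minimal sketch, assuming hash-based navigation (#pagename) and the
// async _gaq snippet from the question: report the new "page" to
// Google Analytics automatically whenever the hash changes.
window.addEventListener('hashchange', function () {
  var page = '/' + location.hash.replace(/^#\/?/, ''); // "#about" -> "/about"
  _gaq.push(['_trackPageview', page]);
});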

Related

How to code in Laravel so that the requests are executed without refreshing the page?

In Laravel, when we use forms to store or delete a resource, the page is refreshed. What is the best technology to avoid refreshing the page while the request is being processed? AJAX, Vue.js, etc.?
There are two main ways to handle HTTP requests: synchronously and asynchronously.
Laravel is a PHP framework and therefore uses... PHP, which is a synchronous language. This implies a page refresh for every request you make. The point is, every PHP framework has this behavior; this is the way PHP works.
So let's answer your question: indeed, you need an asynchronous technology to make a request to the server and get the response without refreshing the page. The technology of choice in this case is JavaScript, which is able to make AJAX calls.
An AJAX (Asynchronous JavaScript and XML) call will, as its name states, make an asynchronous request. But AJAX is just a way of doing things, not really a technology in itself. Yes, JavaScript frameworks like Vue.js use AJAX, but they are overkill if all you need is to make a few AJAX requests.
Using Axios or even jQuery is much easier and will let you make a request, grab the answer, and modify your page without a refresh very quickly :)
[EDIT]
The process to achieve what you are looking for is pretty simple (see the sketch after this list):
1. Use Axios or jQuery to make an AJAX call (an asynchronous request)
2. Handle this request with Laravel, as you do for every other request
3. Return something (or not, that's up to you) to alert your user that something happened
4. This response will be handled by JavaScript
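A minimal sketch of steps 1 and 4, assuming a hypothetical Laravel route POST /tasks that returns JSON like {"message": "Task saved"}, plus the usual <meta name="csrf-token"> tag in the page head and a #status element on the page:
// Step 1: make the asynchronous request with Axios; no page refresh.
var token = document.querySelector('meta[name="csrf-token"]').content;
axios.post('/tasks', { title: 'Buy milk' }, {
  headers: { 'X-CSRF-TOKEN': token }
}).then(function (response) {
  // Step 4: JavaScript handles the response and updates the page.
  document.getElementById('status').textContent = response.data.message;
}).catch(function (error) {
  console.error('Request failed:', error);
});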
Vue is suitable for small projects where you just want to add a little bit of reactivity, submit a form with AJAX, show the user a modal, display the value of an input as the user is typing, or many other similarly straightforward things. Yet it also scales well and is fantastic for huge projects.

Loading Ajax Response Data with Adsense Codes inside

I'm 10000000% sure that this question has been asked before; however, the majority of the responses I came across were from back in 2005, 2006 and so on. Not to mention, almost all of the questions themselves were too general. Therefore, I'm asking this so that anyone else who needs to figure this out won't have to dig through about 50 webpages to get an idea.
My question is simply this: I have a webpage with Google Ads embedded in its HTML. The website was first developed as a static HTML site where each link reloaded a new page. Never mind the backend technology; the website itself now produces purely dynamic content. The site is close to completion, and a fully-AJAX listener has been added to all the links. When any link is clicked, JavaScript takes over, parses the link, and sets it using pushState or the hashbang. The page is then requested from the server via AJAX and the content is updated using document.getElementById('container').innerHTML = ajax.responseText; This gives an almost 100% reliable way of accessing content that was replaced by AJAX.
This all works fine, but the responseText itself may, and in this case WILL, contain Google Ads, and I was just wondering how to display them as if this were a static page. Clearly this doesn't work as-is. Here are the options that I've come across:
Use an iframe:
An iframe seems to be an effective way to load the content; just stick the AdSense code into a simple adsense.html file, load it in an iframe, and let the browser do the rest.
Injecting the ad code directly into the page isn't possible:
it's against their TOS
document.write() is ignored in content inserted by an AJAX request
Your best option is to create a simple iframe:
<iframe src="advert.html"></iframe>
and in advert.html, add your advert code.
It's then loaded fine, without problems.
Good luck
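As a rough sketch of how this fits the AJAX flow from the question (the 'ad-slot' placeholder id is hypothetical, and advert.html is the file holding your AdSense snippet):
// Minimal sketch: after swapping in the AJAX response, append the ad
// iframe so the ad runs in its own document, where document.write()
// still works.
document.getElementById('container').innerHTML = ajax.responseText;
var frame = document.createElement('iframe');
frame.src = 'advert.html';
frame.width = '728';   // match your ad slot size
frame.height = '90';
document.getElementById('ad-slot').appendChild(frame);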

Ajax-generated content, crawling and blacklisting

My website uses AJAX.
I've got a user list page which lists users in an AJAX table (with paging and more-information stuff...).
The URL of this page is:
/user-list
The user list is created by AJAX. When the user clicks on one user, he is redirected to a page whose URL is: /member/memberName
So we can see here that AJAX is used to generate content and not to manage navigation (with the # character).
I want to detect bots so that all pages can be indexed.
So, for normal users I want to display an AJAX table with paging and cool AJAX effects (more info...), and when I detect a bot I want to display all users (without paging) with a link to each member page, like this:
JohnBob...
Do you think I could be blacklisted for this technique? If so, could you please provide an alternative solution that keeps these clean URLs and doesn't require redeveloping the user-list without AJAX?
Google supports a specification to make AJAX crawlable:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I did an experiment and it works:
http://seo-website-designer.com/SEO-Ajax-Google-Solution
As this is a Google specification, you won't get penalized (unless you abuse it).
That said, only Google supports it at the moment (AFAIK).
Also, I believe following the concept of Progressive Enhancement is a better approach: create a working HTML website, then make the JavaScript enhance it.
Maybe use the URLs with an onclick to trigger your AJAX scripting? That is, a link whose href points at the real /member/memberName page, with an onclick handler that loads it via AJAX instead.
I don't think Google would punish you for this: you primarily use JavaScript, but you do provide a fallback for their bot, so your site doesn't get any less accessible.
EDIT
OK, I misunderstood. Then my guess would be that you basically have two options:
1. Write a different part of your site where bots end up, or,
2. Rewrite your current site to, for example, always serve a 'full' page, with an option to request only, say, the content div. Then you can fetch just the content with JavaScript, but bots will always get a complete page (a sketch of this follows).
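A minimal sketch of option 2, assuming each user link points at a real page like /member/memberName; the class name, the ?partial=1 convention, and the 'content' id are all hypothetical:
// Progressive enhancement: intercept clicks on member links and fetch
// just the content div; bots ignore JavaScript, follow the href, and
// always get the full page.
var links = document.querySelectorAll('a.member-link');
Array.prototype.forEach.call(links, function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.href + '?partial=1'); // ask for the content only
    xhr.onload = function () {
      document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
});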

Is it possible to do a partial postback on the web?

I read some paragraphs in a book saying that it is not possible to do a partial postback on the web, even if AJAX is employed: AJAX will post back everything and update only the AJAX-ified controls.
However, on pages I made using AJAX, I used Fiddler to monitor the traffic. I found that on the initial page load, everything was loaded, including pictures... But when I click a button and do an AJAX postback, I can see that only some data is loaded... It looks like the whole page doesn't need to be reloaded.
I don't know whether what I see is correct, or whether the book is.
Thank you guys.
That depends on what you put into the term "postback".
The AJAX call will send the complete form data back to the server, just as if the form were posted normally. The server will answer with a partial response that contains only the parts of the page that should be updated.
So, the request is not partial, but the response is.
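To make that concrete, here's a minimal hand-rolled sketch (the form id and target div are hypothetical): the complete form is sent, but only one region of the page is updated with the response.
// The request carries the complete form data; only the response, and
// therefore only this region of the page, is partial.
var form = document.getElementById('orderForm');
form.addEventListener('submit', function (event) {
  event.preventDefault();                 // stop the full-page post
  var xhr = new XMLHttpRequest();
  xhr.open('POST', form.action);
  xhr.onload = function () {
    document.getElementById('result').innerHTML = xhr.responseText;
  };
  xhr.send(new FormData(form));
});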
I am not sure how you are posting back from the client side; I am guessing you are using UpdatePanels. How well you 'AJAX-ify' a web page depends on which method you employ:
UpdatePanels - read Dave Ward's posting on them: http://encosia.com/2007/07/11/why-aspnet-ajax-updatepanels-are-dangerous/
PageMethods to post back to a web service, get the data, and update the DOM to display the result
jQuery and other such AJAX frameworks to post back to a web service (see the sketch below)
I am sure the link above will clear things up a bit.
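A minimal sketch of the jQuery option, assuming a hypothetical ASP.NET web service method at /Services/Orders.asmx/GetOrderCount and a #orderCount element on the page:
// Call a (hypothetical) ASP.NET [WebMethod] with jQuery and update the
// DOM with the result; only the data travels, not the whole page.
$.ajax({
  type: 'POST',
  url: '/Services/Orders.asmx/GetOrderCount',
  contentType: 'application/json; charset=utf-8',
  dataType: 'json',
  success: function (response) {
    // ASP.NET script services wrap the return value in a "d" property.
    $('#orderCount').text(response.d);
  }
});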
I'm having a hard time understanding your terminology. I'm not really sure what a "postback" is, much less a "partial" one. I do know that one of the basic ways to transmit information to an HTTP server is via a POST request, which is usually used when submitting forms. If you mean that the entire form is transmitted when you click a submit button, I believe you'd be right.
You also seem to be doing something with AJAX, but it's difficult to tell. The whole point of AJAX is to display dynamic data on a page without reloading it. Defining what to send and what to do with the results is entirely up to your own JavaScript. So unless you're using a framework, which you don't specify, there is no such thing as "AJAX-ified controls".
In any case, "AJAX" usually means using the XMLHttpRequest object of modern browsers to send data to servers without refreshing the page. When you create such a request, you specify exactly what data to send; this has nothing to do with HTML forms. One caveat: if you are indeed using a library for AJAX, it might impose additional limits on how you structure the information you send.

Iframe vs. normal/AJAX GET request

I have a page that gathers environment status from a couple of IBM WebSphere servers using iframes similar to this:
<iframe src="http://server:9060/ibm/console/status?text=true&type=server&node=NODE&name=ServerName_server_NODE"></iframe>
and it happily prints out "Started" or "Unavailable", etc. But if I load the same URL in a normal browser, sometimes it works and sometimes it does not: some servers show a login page, while others simply return HTTP code 500.
So what's the difference between loading the page through an iframe and through a browser?
I can tell you that the iframe solution works no matter which machine I am doing it on, so I do not believe it has anything to do with the user who's opening the page. And before you ask why I don't keep the solution that works: it's because opening the page with the iframes takes a long time, versus a page where everything is requested through AJAX.
Update: Using jQuery to perform the AJAX call returns "error" and "undefined" for the servers that I can't see in a normal browser.
One difference is that an iframe has to render the view, while XHR does not.
An iframe is essentially the same as opening the page with the browser. In both cases the browser's credentials are used, so there will be no difference between the two.
Secondly, loading something in an iframe should take about the same time as requesting it through XHR, since in both cases the browser makes an HTTP request and waits for the response. I should add, though, that an iframe also takes time to render its content onto the page. However, if you plan on displaying the result with AJAX anyway, the iframe and XHR solutions will be more or less the same.
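For comparison, a minimal sketch of the XHR version of the status check (the 'status-cell' target id is hypothetical); note that, unlike the iframe, XHR is subject to the same-origin policy mentioned in the next answer, so the console would have to be served from the same origin as the page:
// Fetch the same status URL with XHR instead of an iframe; nothing is
// rendered in a sub-document, the text lands wherever we put it.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/ibm/console/status?text=true&type=server&node=NODE&name=ServerName_server_NODE');
xhr.onload = function () {
  document.getElementById('status-cell').textContent = xhr.responseText;
};
xhr.onerror = function () {
  console.log('Request failed (possibly a cross-origin restriction)');
};
xhr.send();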
In the case of an AJAX request, the same-origin policy (which restricts cross-domain calls) comes into the picture, so you can't make a cross-domain call using XHR. One alternative is to embed a Flex SWF file in your page as an ActiveX control and make the Flex call through JavaScript; Flex is then responsible for making the cross-domain call (which it can, if the targeted domain allows it via crossdomain.xml) and rendering the result using JavaScript again.
