Does Ajax always require the use of node.js? - ajax

I am learning about the use of AJAX in web development, and I need to know: does AJAX always require the use of Node.js or jQuery?
Thanks.

That is a very broad question, so the answer might be broad as well:
The short answer: Ajax does not require jQuery nor Node.js.
In practice, Ajax is a set of techniques that lets Javascript send data to and retrieve data from a server asynchronously. Ajax is fully available in plain, vanilla Javascript, and it works as follows (example taken from Wikipedia, see sources):
// This is the client-side script.

// Initialize the Http request.
var xhr = new XMLHttpRequest();
xhr.open('get', 'send-ajax-data.php');

// Track the state changes of the request.
xhr.onreadystatechange = function() {
    var DONE = 4; // readyState 4 means the request is done.
    var OK = 200; // status 200 is a successful return.
    if (xhr.readyState === DONE) {
        if (xhr.status === OK) {
            alert(xhr.responseText); // 'This is the returned text.'
        } else {
            alert('Error: ' + xhr.status); // An error occurred during the request.
        }
    }
};

// Send the request to send-ajax-data.php
xhr.send(null);
This is a classic example, showing both how to use Ajax with vanilla Javascript and why it's much easier with a library such as jQuery, which shortens the same snippet to just:
$.ajax({
    url: "http://fiddle.jshell.net/favicon.png",
}).done(function(data) {
    // Do something with data.
});
Sources (including vanilla Ajax examples):
Wikipedia: Ajax
A Guide to Vanilla Ajax Without jQuery
jQuery: ajax()

There is no need to use Node.js to perform an Ajax request. You can make an Ajax request using plain, vanilla Javascript. However, jQuery makes Ajax requests very easy and cross-browser compatible with just a few lines of code, so I recommend sticking with jQuery instead of vanilla Javascript.
You can find more information regarding the jQuery Ajax feature here: http://api.jquery.com/jquery.ajax/
You can also find more information about the vanilla Javascript Ajax request feature here:
http://www.w3schools.com/ajax/
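For instance, a GET request with jQuery can be as short as this (the /some-endpoint URL is just a placeholder for illustration):
$.ajax({
    url: '/some-endpoint',
    type: 'GET'
}).done(function(data) {
    // Do something with the returned data.
    console.log(data);
}).fail(function(xhr, status) {
    console.log('Request failed: ' + status);
});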

No. Most browsers supply means to perform asynchronous javascript requests, but libraries such as jQuery partly came about to smooth over the differences between browsers, making ajax a lot more portable.
Modern browsers don't differ as much, so portability is probably less of an issue, but using libraries has become common practice.
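As a sketch of the kind of branching those libraries used to hide: older Internet Explorer versions exposed Ajax through ActiveXObject rather than XMLHttpRequest, so portable hand-written code had to check for both:
// Sketch of the pre-library, cross-browser way to obtain a request object.
function createRequest() {
    if (window.XMLHttpRequest) {
        // Standards-based browsers (and IE7+).
        return new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        // IE6 and older used an ActiveX control instead.
        return new ActiveXObject('Microsoft.XMLHTTP');
    }
    return null;
}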

Related

Node request, cheerio - how to handle additional ajax load

I'm using node, request and cheerio to fetch data from an html page. This has not been a problem, but one page loads additional data through ajax to fill different containers. These are empty and undefined when the initial request is done; how do I handle this the best way?
request(url, function (error, response, html) {
    if (!error && response.statusCode == 200) {
        var $ = cheerio.load(html);
        forum_url = $('.this.url.is.loaded.separatly.with.ajax').eq(1).attr('href');
    }
});
Cheerio isn't really designed with ajax in mind. If you are able to extract the urls that need to be downloaded, you would likely have to maintain multiple separate $ objects, as it's unlikely they can be merged easily.
Usually, in cases where you need to execute javascript found on a scraped page, we would turn to Phantom.js. Phantom is a headless browser that you control using javascript; it's pretty cool.
You can check out some Phantom.js web scraping code here: http://code4node.com/snippet/web-scraping-with-node-and-phantomjs
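As a rough sketch of the "separate $ objects" idea: if you can find the URL the page requests via ajax (check the browser's network tab), fetch it yourself and load the returned HTML into a second cheerio instance. The URLs below are made up purely for illustration:
var request = require('request');
var cheerio = require('cheerio');

// Hypothetical page and ajax endpoint, only for illustration.
var url = 'http://example.com/forum';

request(url, function (error, response, html) {
    if (!error && response.statusCode == 200) {
        var $page = cheerio.load(html);
        // Second request for the fragment the page would normally pull in via ajax.
        request('http://example.com/ajax/forum-list', function (err, res, fragment) {
            if (!err && res.statusCode == 200) {
                var $fragment = cheerio.load(fragment);
                var forum_url = $fragment('a').first().attr('href');
                console.log(forum_url);
            }
        });
    }
});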

How to use NodeJS with node-rest-client methods to post dynamic data to front end HTML

I am rather new to NodeJS so hopefully I am able to articulate my question(s) properly. My goal is to create a NodeJS application that will use the node-rest-client to GET data and asynchronously display it in HTML on client side.
I have several node-rest-client methods created and currently I am calling my GET data operation when a user navigates to the /getdata page. The response is successfully logged to the console but I'm stumbling on the best method to dynamically populate this data in an HTML table on the /getdata page itself. I'd like to follow Node best practices, ensure durability under high user load and ultimately make sure I'm not coding a piece of junk.
How can I bind data returned from my Express routes to the HTML front end?
Should I use separate "router.get" routes for each node-rest-method?
How can I bind a GET request to a button and have it GET new data when clicked?
Should I consider using socket.io, angularjs and ajax to pipe data from the server side to client side?
-Thank you for reading.
This is an example of the route that is currently rendering the getdata page as well as calling my getDomains node-rest-client method. The page is rendering correctly and the data returned by getDomains is successfully printed to the console, however I'm having trouble getting the data piped to the /getdata page.
router.get('/getdata', function(req, res) {
    res.render('getdata', {title: 'This is the get data page'});
    console.log("Rendering:: Starting post requirement");
    args = {
        headers: {"Cookie": req.session.qcsession, "Accept": "application/xml"},
    };
    qcclient.methods.getDomains(args, function(data, response) {
        var theProjectsSTRING = JSON.stringify(data);
        var theProjectsJSON = JSON.parse(theProjectsSTRING);
        console.log('Processing JSON.Stringify on DATA');
        console.log(theProjectsSTRING);
        console.log('Processing JSON.Parse on theProjectsSTRING');
        console.log('');
        console.log('Parsing the array ' + theProjectsJSON.Domains.Domain[0].$.Name);
    });
});
I've started to experiment with creating several routes for my different node-rest-client methods that will use res.send to return the data, and then perhaps I could bind an AJAX call or use angularjs to parse the data and display it to the user.
router.get('/domaindata', function(req, res) {
    var theProjectsSTRING;
    var theProjectsJSON;
    args = {
        headers: {"Cookie": req.session.qcsession, "Accept": "application/xml"},
    };
    qcclient.methods.getDomains(args, function(data, response) {
        //console.log(data);
        theProjectsSTRING = JSON.stringify(data);
        theProjectsJSON = JSON.parse(theProjectsSTRING);
        console.log('Processing JSON.Stringify on DATA');
        console.log(theProjectsSTRING);
        console.log('Processing JSON.Parse on theProjectsSTRING');
        console.log('');
        console.log('Parsing the array ' + theProjectsJSON.Domains.Domain[0].$.Name);
        res.send(theProjectsSTRING);
    });
});
I looked into your code. You are using res.render(..) and res.send(..). First of all, you should understand the basic request-response cycle: the request object gives us the values passed in from the routes, and the response returns values after some kind of processing has been done on those request values. More particularly, in Express you will be using req.params, and req.body if values are passed through the body of the HTML.
So all response-related statements (res.send(..), res.json(..), res.jsonp(..), res.render(..)) should come at the end of your function(req, res) {...}, where there is no other processing left to be done; otherwise you will get errors.
As per modern web application development practice in javascript, frameworks such as Ruby on Rails, ExpressJS, Django, Play etc. all work as a REST engine, and the front-end routing logic is written in javascript. If you are using AngularJS, then ngRoute and the open source ui-router make this work really easy. If you look closely at some of the popular MEAN seed projects such as mean.io and mean.js, even they use ExpressJS as the REST engine and AngularJS does the heavyweight job in the front end.
Very often you will be sending JSON data from the backend, so for that you can use res.json(..). To consume the data from your endpoints you can use the angularjs ngResource service.
Let's take the simplest case: you have a GET /domaindata endpoint:
router.get('/domaindata', function(req, res) {
    ..
    ..
    res.json({somekey: 'somevalue'});
});
In the front end you can access this using the AngularJS ngResource service:
var MyResource = $resource('/domaindata');
MyResource.query(function(results) {
    $scope.myValue = results;
    // myValue is now bound to the view.
});
I would suggest you have a look at ui-router for front-end routing.
If you are looking for a sample implementation, you can look into this project which I wrote some time back; it can also give you an overview of implementing login and session management using JSON Web Tokens.
There are a lot of things to understand; let me know if you need help with anything.
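If you would rather not pull in Angular, a minimal jQuery sketch against the /domaindata route above works too, assuming the route responds with res.json(...) as suggested; the element ids here are made up for illustration:
// Fetch /domaindata when a button is clicked and fill a table body.
$('#refresh-btn').on('click', function () {
    $.getJSON('/domaindata', function (data) {
        var rows = '';
        data.Domains.Domain.forEach(function (domain) {
            rows += '<tr><td>' + domain.$.Name + '</td></tr>';
        });
        $('#domain-table tbody').html(rows);
    });
});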

Which is the better way to make AJAX requests: $.post or $.ajax?

I have seen many articles about making ajax requests.
Most of them use $.ajax for jQuery ajax posting, and some of them use $.post.
I want to know what the best way is if I want to post using ajax. Which method makes the ajax request fast and lightweight?
$.post is a shorthand way of using $.ajax for POST requests, so no difference.
$.ajax is generally better to use if you need some advanced configuration.
$.post is just shorthand for $.ajax({type: 'POST'}). It makes no difference to the speed or weight of the request, just changes the readability of your code.
$.post is just a shorthand for $.ajax({ type: 'POST' }) [see reference], so there is no appreciable performance difference, but there is still a readability improvement.
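In other words, these two calls do exactly the same thing (the /save URL is just a placeholder):
// Shorthand form.
$.post('/save', { name: 'value' }, function (response) {
    console.log(response);
});

// The same request spelled out with $.ajax.
$.ajax({
    type: 'POST',
    url: '/save',
    data: { name: 'value' },
    success: function (response) {
        console.log(response);
    }
});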

ajax form submit cross server

I have several servers on an intranet. I am passing data from one server to be processed on another server. Attempting to use ajax but I am a noob.
<script type="text/javascript" src="jquery-1.8.0.js"></script>
<script type="text/javascript">
function print(oForm) {
    var toggle = oForm.elements["toggle"].value;
    var ticket_type_id = oForm.elements["ticket_type_id"].value;
    var printer_id = oForm.elements["printer_id"].value;
    var store_id = oForm.elements["store_id"].value;
    var data = oForm.elements["data"].value;
    var dataString = "toggle=" + toggle + "&ticket_type_id=" + ticket_type_id + "&printer_id=" + printer_id + "&store_id=" + store_id + "&data=" + data;
    $.ajax({
        type: "POST",
        url: "http://192.168.12.103/crowncontrol/backend/processes/print.php",
        data: dataString,
        success: function(data) {
            alert("successful");
        }
    });
}
</script>
The above URL does not work.
But if I make the url:
"../../../backend/processes/print.php"
Which is the same location, it works fine.
Also, if I send it via an anchor GET request it works fine:
href="http://192.168.12.103/crowncontrol/backend/processes/print.php?etc"
The reason I am using ajax is that I want my print.php script to run without the user noticing. The reason I can't use url:"../../../backend/processes/print.php" is that I will be sending information from one server to other servers on my intranet.
Any help would be appreciated. I've spent far too long trying to get it to work on my own.
AFTER getting help from the answers below, instead of the entire ajax code above I used:
$.getJSON('http://192.168.12.103/crowncontrol/backend/processes/print.php?callback=?', dataString, function(res) {
    //alert('Success');
});
also:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js" type="text/javascript">
This is a result of the same origin policy. You cannot perform normal cross-domain AJAX requests, for security reasons (see the link about the same origin policy below).
Fortunately for you, jQuery includes JSONP support, which uses script tag injection instead of XMLHttpRequest.
Instead of creating and using an xhr object (an XMLHttpRequest, which is how ajax is normally done), it creates a script tag with an src attribute set to your URL. It should work.
Try changing your code to :
$.ajax(
{
type:"POST",
url:"http://192.168.12.103/crowncontrol/backend/processes/print.php?callback=?",
data:dataString,
success: function(data){
alert("successful");
}
}
);
(notice the ?callback=? part)
Here is a jsonp tutorial for jQuery
Here is some information about jsonp and some information about the same origin policy
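To illustrate what jQuery does under the hood when it sees ?callback=?, JSONP boils down to injecting a script tag; the handleData name below is made up for illustration, and print.php would have to cooperate by wrapping its output in that callback:
// Roughly what a JSONP request does behind the scenes.
function handleData(data) {
    console.log('Got cross-domain data:', data);
}

var script = document.createElement('script');
script.src = 'http://192.168.12.103/crowncontrol/backend/processes/print.php?callback=handleData';
document.getElementsByTagName('head')[0].appendChild(script);
// The server must respond with something like: handleData({"status": "ok"});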
An easy way to deal with this problem is to make a script file on your own server and then route the requests through it. Use the logic below (a sketch follows after these steps):
Instead of making the AJAX request directly to the cross-domain address, make the AJAX request to a new script on your own server.
In that script, take the incoming request and make the required call to the cross-domain address.
Then receive the response from the cross-domain server and send it back to the client.
The client receives the result from your own server, which now has the required data.
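A minimal sketch of such a proxy, written here with Node and Express purely for illustration (your setup may call for a PHP proxy next to print.php instead); the /proxy/print path is made up:
// Same-origin proxy: the browser posts to /proxy/print on this server,
// and this server forwards the request to the other intranet machine.
var express = require('express');
var request = require('request');
var app = express();

app.post('/proxy/print', function (req, res) {
    req.pipe(request.post('http://192.168.12.103/crowncontrol/backend/processes/print.php'))
       .pipe(res);
});

app.listen(3000);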

Protecting prototype.js based XHR requests against CSRF

Django has been updated to 1.3, and in fact ever since 1.2.5, it has extended the scheme to pass a Cross Site Request Forgery protection token to XMLHttpRequests. The Django folks helpfully provide an example for jQuery to apply a specific header to every XHR.
Prototype (and thus Scriptaculous) has to comply with this scheme, yet I can't find a way to tell Prototype to add the X-CSRFToken header. The best would be to do it once in a way that applies across the app (like for jQuery).
Is there a way to do that?
This is a wild guess but you could try extending the base AJAX class...
Ajax.Base.prototype.initialize = Ajax.Base.prototype.initialize.wrap(
    function (callOriginal, options) {
        var headers = options.requestHeaders || {};
        headers["X-CSRFToken"] = getCookie("csrftoken");
        options.requestHeaders = headers;
        return callOriginal(options);
    }
);
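Note that getCookie above isn't part of Prototype; it's the cookie-reading helper the Django documentation uses in its jQuery example. A plain-javascript sketch of it might look like this:
// Read a cookie value by name (returns null if the cookie is not set).
function getCookie(name) {
    var cookies = document.cookie ? document.cookie.split(';') : [];
    for (var i = 0; i < cookies.length; i++) {
        var cookie = cookies[i].replace(/^\s+|\s+$/g, '');
        if (cookie.substring(0, name.length + 1) === name + '=') {
            return decodeURIComponent(cookie.substring(name.length + 1));
        }
    }
    return null;
}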
