Does anyone have an example of how to fire forge.ajax periodically in a Trigger.io app? It seems to fire once, then die silently. Is Trigger.io removing the setTimeout or stopping it? I'm using this technique adapted from Paul Irish.
// Wrap this function in a closure so we don't pollute the namespace
(function worker() {
  forge.request.ajax({
    url: 'ajax/test.html',
    complete: function() {
      // Schedule the next request when the current one's complete
      setTimeout(worker, 5000);
    }
  });
})();
Could it be a scope issue perhaps?
Thanks for any advice.
While forge.request.ajax is similar to jQuery.ajax, it is not the same, and it has no complete callback.
You probably want to put your setTimeout in both the success and error callbacks.
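A sketch of that fix, with forge.request.ajax stubbed so it runs outside a Trigger.io app (in a real app, drop the stub and restore the 5000 ms delay):

```javascript
// forge.request.ajax has no `complete` option, so the next poll is scheduled
// from both `success` and `error`. The `forge` object below is a stub for
// illustration only; inside a Trigger.io app the real API is already present.
var polls = 0;
var forge = {
  request: {
    ajax: function (opts) { opts.success('ok'); } // stub: always "succeeds"
  }
};

(function worker() {
  forge.request.ajax({
    url: 'ajax/test.html',
    success: function () {
      polls += 1;
      if (polls < 3) setTimeout(worker, 10); // use 5000 in the real app
    },
    error: function () {
      if (polls < 3) setTimeout(worker, 10); // keep polling after failures too
    }
  });
})();
```

The stub caps the loop at three polls so the sketch terminates; the real worker reschedules itself indefinitely from whichever callback fires.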
Related
I have just implemented AJAX calls using fetch in react-native. However, I haven't managed to implement queuing of these AJAX calls properly. Can anyone help me?
If you want to make parallel requests, you can take advantage of the fact that fetch returns a promise, and then use Promise.all to wait for all of the promises to complete.
For example:
var urls = ['http://url1.net', 'http://url2.net'];
var requests = [];
urls.forEach((url) => {
  var request = fetch(url); // You can also pass options or any other parameters
  requests.push(request);
});

// Then, wait for all of the promises to finish. The requests run in parallel.
Promise.all(requests).then((results) => {
  // `results` holds an array with the result of each promise.
}).catch((err) => {
  // Promise.all fails fast: if any request fails, this catch is called immediately.
});
I noticed that you added the 'multithreading' tag. Note that this code won't be doing any threading for you, as JavaScript (generally) runs in a single thread.
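If you also need the queued (sequential) behaviour the question mentions, the same promises can be chained instead of run in parallel. A minimal sketch, with fetchUrl standing in as a hypothetical replacement for fetch(url) so it runs anywhere:

```javascript
// Stub standing in for fetch(url); replace with the real fetch in react-native.
function fetchUrl(url) {
  return Promise.resolve('response for ' + url);
}

var urls = ['http://url1.net', 'http://url2.net'];

// reduce() chains the requests: each one starts only after the previous finished.
var queued = urls.reduce(function (chain, url) {
  return chain.then(function (results) {
    return fetchUrl(url).then(function (body) {
      results.push(body); // collect responses in request order
      return results;
    });
  });
}, Promise.resolve([]));

queued.then(function (results) {
  // `results` holds the responses, in the order the URLs were queued
});
```

The accumulator promise acts as the queue's tail; pushing a new URL just extends the chain, so no explicit queue data structure is needed.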
When should you use async: false versus async: true in an AJAX call? In terms of performance, does it make any difference?
For example:
$.ajax({
  url: endpoint,
  type: "post",
  async: false,
  success: function(data) {
    if (i == 1) {
      getMetricData(data);
    } else if (i == 2) {
      capture = data;
    }
  }
});
It's not related to performance...
You set async to false when you need that AJAX request to complete before the browser moves on to the rest of your code:
<script>
// ...
$.ajax(... async: false ...); // Hey browser! Finish this request first,
                              // then move on to the rest of the code.
$.ajax(...); // Executed after the previous async:false request completes.
</script>
When async is set to false, a synchronous call is made instead of an asynchronous one.
When the async setting of the jQuery AJAX function is set to true, an asynchronous call is made. AJAX itself stands for Asynchronous JavaScript and XML, so if you make the call synchronous by setting async to false, it is no longer really an AJAX call.
For more information, please refer to this link.
It is best practice to go asynchronous if you can do several things in parallel (no inter-dependencies).
If you need the request to complete before loading the next thing, you could use a synchronous call, but note that this option is deprecated to discourage overuse of synchronous requests:
jQuery.ajax() method's async option deprecated, what now?
In basic terms synchronous requests wait for the response to be received from the request before it allows any code processing to continue. At first this may seem like a good thing to do, but it absolutely is not.
As mentioned, while the request is in process the browser will halt execution of all script and also rendering of the UI, as the JS engine of the majority of browsers is (effectively) single-threaded. This means that to your users the browser will appear unresponsive, and they may even see OS-level warnings that the program is not responding, asking whether its process should be ended. It's for this reason that synchronous JS has been deprecated and you see warnings about its use in the devtools console.
The alternative of asynchronous requests is by far the better practice and should always be used where possible. This means that you need to know how to use callbacks and/or promises in order to handle the responses to your async requests when they complete, and also how to structure your JS to work with this pattern. There are many resources already available covering this, so I won't go into it here.
There are very few occasions where a synchronous request is necessary. In fact the only one I can think of is when making a request within the beforeunload event handler, and even then it's not guaranteed to work.
In summary, you should look to learn and employ the async pattern in all requests. Synchronous requests are now an anti-pattern that causes more issues than it generally solves.
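As a sketch of the async pattern described above: here requestMetric is a hypothetical stand-in for a promise-returning request (jQuery's $.ajax also returns a thenable jqXHR you can use the same way), and the continuation runs when the response arrives instead of blocking the page:

```javascript
// Hypothetical async request; pretend this performs the AJAX call and
// resolves with the parsed server response.
function requestMetric(endpoint) {
  return Promise.resolve({ value: 42 }); // stubbed response for illustration
}

requestMetric('/metrics')
  .then(function (data) {
    // Runs later, when the response arrives; the UI stays responsive meanwhile.
    return data.value;
  })
  .catch(function (err) {
    // Handle network/server errors here instead of freezing the page.
  });
```

The key difference from async: false is that nothing waits: code after the call keeps running, and the work that depends on the response lives inside the .then callback.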
In this PlunkerDemo, I'm trying to broadcast an event from the parent controller to the child controller. However, doing it directly in the parent controller doesn't work: the handler never receives the event. Triggering the broadcast from an ng-click or a setTimeout works, though. Is it due to the scope life cycle?
http://beta.plnkr.co/edit/ZU0XNK?p=preview
See the comments of the accepted answer. They explain my problem.
Any changes to Angular scope must happen within the Angular framework; if any changes have to be made outside the framework, we have to use the $scope.$apply function.
$apply() is used to execute an expression in Angular from outside of the Angular framework.
In your case you are triggering the $broadcast within setTimeout, whose callback runs outside the Angular framework.
So you have two solutions, either use the $timeout service provided by angular or use .$apply function.
I prefer to use the $timeout function.
var ParentCtrl = function($scope, $rootScope, $timeout) {
  $scope.broadcast = function() {
    $rootScope.$broadcast('Sup', 'Here is a parameter');
  };

  $timeout(function() {
    $scope.$broadcast('Sup');
  }, 1000);

  // This one does not work! Most likely due to the scope life cycle.
  $scope.$broadcast('Sup');

  $scope.$on('SupAgain', function() {
    console.log('SupAgain got handled!');
  });
};
Demo: Fiddle
Using $apply
setTimeout(function() {
  $scope.$apply(function() {
    $scope.$broadcast('Sup');
  });
}, 1000);
A more reliable option is to use $interval in the child controller: instead of one long timeout, poll at a short interval.
Also, instead of a broadcast, use a service with a flag. Each poll checks whether the flag is set; once the parent controller sets it, the child stops the timer on the next poll, and that indicates the event happened. The parent controller can also share data with the child controller via the service.
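The flag-polling idea can be sketched in plain JavaScript (names like syncFlag are made up for illustration; inside Angular you would register the shared object as a service and use $interval/$interval.cancel instead of setInterval/clearInterval):

```javascript
// Shared object standing in for an Angular service.
var syncFlag = { done: false, data: null };

// Parent side: set the flag instead of calling $broadcast.
function parentFires() {
  syncFlag.data = 'Here is a parameter';
  syncFlag.done = true;
}

// Child side: poll every 10 ms until the flag is seen.
var seen = null;
var timer = setInterval(function () {
  if (syncFlag.done) {
    clearInterval(timer); // stop polling on first detection
    seen = syncFlag.data; // react to the "event" here
  }
}, 10);

parentFires();
```

Because the child reads the flag on its own schedule, it no longer matters whether the parent fires before or after the child has registered, which is exactly the life-cycle problem $broadcast runs into.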
So here's the rub - I have a system that, when an event occurs, fires off a chain of disparate asynchronous code (some code even fires off more events).
Now during acceptance testing I fire that same event - but what strategy should use to notify the test runner that all the subsequent events are finished and ready to move on to the next step?
I started out just waiting for a fixed amount of time, but that was always just a fudge.
Now I'm hooking into the tail events and moving on when they have all finished. But I can see this becoming very complex as the system grows.
Just wondering if there is an alternative strategy that I've missed. Any suggestions?
FWIW I'm using cucumber.js & zombie to test an express app on node.js.
Cheers and thanks for your time,
Gordon
Obviously, the solution will be different depending on the application, but I find it helpful to architect my applications in such a way that callbacks for the top-level asynchronous functions are tracked all the way to the end. Once all the top-level callbacks are done, you can fire some event or call some callback that indicates everything is over. Using something like async's parallel, you could do something like this:
eventEmitter.on('someEvent', function(some, data, callback) {
  async.parallel([
    function(cb) { firstAsyncThing(some, cb); },
    function(cb) { secondAsyncThing(data, cb); }
  ], function(err, results) {
    // Called when all functions passed in the array have called their `cb`
    callback(err, results);
  });
});
So then you can pass a callback into your event:
eventEmitter.emit('someEvent', 'some', 'data', function(error, results) {
  // The event has been fully handled
});
What is the best way to implement an Ajax request queue using jQuery? Specifically, I want to accomplish the following:
A user triggers any number of Ajax requests within a web page, which need to be queued up and submitted sequentially.
The web page needs to receive responses from the server and adjust itself accordingly.
Finally, if an error occurs (connection lost, server failed to respond, etc.), I want jQuery to invoke a JavaScript function.
I'm struggling with the last requirement in particular, as the error handling mechanism in jQuery's Ajax functions is not very intuitive. Are there any examples/tutorials that could help me with this task?
Thanks!
I've made a few buffers like this; see my simple task buffer and deferred item queue for examples.
As for error handling, you can always use .ajaxError.
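For the sequential-queue part of the question, here is a minimal plain-promise sketch. send is a hypothetical stand-in for the function that actually performs the $.ajax call (it just needs to return a promise), and the per-request error callback covers the "invoke a JavaScript function on error" requirement:

```javascript
// A queue where each request starts only after the previous one settles.
function RequestQueue(send) {
  this.send = send;
  this.tail = Promise.resolve(); // each request chains off the previous one
}

RequestQueue.prototype.enqueue = function (req, onSuccess, onError) {
  var self = this;
  this.tail = this.tail.then(function () {
    // onError handles the failure, so the queue keeps going afterwards.
    return self.send(req).then(onSuccess, onError);
  });
  return this.tail;
};

// Usage with a stubbed transport, for illustration:
var log = [];
var queue = new RequestQueue(function (req) {
  return req === 'bad' ? Promise.reject(new Error('server failed'))
                       : Promise.resolve('ok: ' + req);
});
queue.enqueue('a', function (res) { log.push(res); });
queue.enqueue('bad', null, function (err) { log.push('error: ' + err.message); });
queue.enqueue('b', function (res) { log.push(res); });
```

Because onError swallows the rejection, one failed request does not stall the requests queued behind it; rethrow inside onError if you want a failure to abort the rest of the queue instead.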
jQuery's ajax has an error option to which you can attach a function; it fires only if an error occurs. See the example below:
jQuery.ajax({
  type: "POST",
  url: "url",
  dataType: "json",
  data: {},
  success: function(data) {
    alert("Succeed!");
  },
  error: function(xhr, ajaxOptions, thrownError) {
    alert(xhr.status);
    alert(thrownError);
  }
});
I hope you find this helpful.