async await execution in javascript

I found some SO questions/answers, but I am unable to get the async/await JavaScript concept right. Please advise. My question is: shouldn't console.log('end') be executed two times?
debugger;
async function withoutAwait() {
  console.log('without await')
}
async function withAwait() {
  await 0
  console.log('with await')
}
console.log('start')
withoutAwait()
withAwait()
console.log('end')
Credit for the question goes to Konrad Linowski.

No, console.log('end') will not be executed twice. JavaScript executes statements one after another on a single thread. Marking a function async means its body can suspend at an await and finish later, while the caller carries on immediately.
If the awaited work in withAwait takes 25 seconds, then console.log('with await') fires 25 seconds later. Meanwhile, console.log('end') fires right after the function call returns.

There's only one line of code with console.log('end'), and it isn't inside a function (let alone a function that gets called multiple times), so no, it shouldn't be called twice.
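To make the ordering explicit, here is a sketch of the same snippet rewritten to record events in an array instead of logging them directly (the array and the final queueMicrotask are my additions, not part of the original question):

```javascript
const order = [];

async function withoutAwait() {
  // No await: the whole body runs synchronously before the call returns.
  order.push('without await');
}

async function withAwait() {
  // `await 0` suspends here; the rest of the body is queued as a
  // microtask, so the caller's remaining code runs first.
  await 0;
  order.push('with await');
}

order.push('start');
withoutAwait();
withAwait();
order.push('end');

// Once the microtask queue drains, the recorded order is:
// start, without await, end, with await
queueMicrotask(() => console.log(order.join(', ')));
```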

Related

Get value from a promise near_sdk

I have a function that returns a promise, expected to return a u128 value from another contract. How can I get this value from the promise?
Here is my code:
I tried async/await in futures/Rust, but it didn't work.
The value is baked into the call_result variable you have there. The promise will be scheduled after query_staked_amount finishes and then once the promise finishes executing, the callback query_staked_amount_callback is invoked.
In this callback, the result of the original promise (get_staked_amount) will be available in that variable you have call_result.
Hope that answers your question.

waiting for web worker making http call to finish

I'm creating multiple web workers making http calls. I have to limit the number of web workers and so I'm trying to wait for some of the workers to finish.
here's an example of what I thought might work using Promises:
anArray.map(async contact => {
  await new Promise((res, rej) => {
    const worker = new Worker('./http.worker', { type: 'module' });
    worker.onmessage = () => {
      res();
    };
    worker.postMessage(contact);
  });
});
I thought this would wait for each promise to resolve before moving on to the next item... but it doesn't.
What could I do to make this work? or... I've also thought of building an array of workers and run a recursive loop that checks/waits for one to be available... I'm open to general suggestions for solving this.
.map() is not promise-aware. It does not look at the return value from each iteration to see if it's a promise and then pause the loop. Instead, it just blindly runs all the iterations one after another. When you return promises from .map(), which you are doing with the async callback, that just means that your .map() will produce an array of promises and all your loop iterations will be "in-flight" at the same time, not sequenced.
If you want to iterate a loop and pause the loop in each iteration until a promise resolves, then use a regular for loop:
async function someFunc() {
  for (let contact of anArray) {
    await new Promise((res, rej) => {
      const worker = new Worker('./http.worker', { type: 'module' });
      worker.onmessage = () => {
        res();
      };
      worker.postMessage(contact);
    });
  }
}
FYI, http calls in JavaScript are non-blocking and asynchronous, so it's not entirely clear why you're doing them in WebWorkers. Unless you have CPU-intensive processing of the result, you can do the http requests in the main thread just fine without blocking it.
Also FYI, for a number of different options for processing an array, with only N requests in flight at the same time (where you decide what value N is), see these various answers:
runN(fn, limit, cnt, options): Loop through an API on multiple requests
pMap(array, fn, limit): Make several requests to an api that can only handle 20 at a time
rateLimitMap(array, requestsPerSec, maxInFlight, fn): Proper async method for max requests per second
mapConcurrent(array, maxConcurrent, fn): Promise.all() consumes all my ram
There are also features to do this built into the Bluebird promise library and the Async-promises library.
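As a rough sketch of the "at most N in flight" idea behind those linked answers (this is only an illustration, not the implementation any of them uses):

```javascript
// Run fn(item) for every item in array, with at most maxConcurrent
// promises pending at once. Resolves with the results in input order.
function mapConcurrent(array, maxConcurrent, fn) {
  return new Promise((resolve, reject) => {
    const results = new Array(array.length);
    let nextIndex = 0;
    let inFlight = 0;
    let finished = 0;

    if (array.length === 0) {
      resolve(results);
      return;
    }

    function runMore() {
      // Launch work until the concurrency limit or the end of the array.
      while (inFlight < maxConcurrent && nextIndex < array.length) {
        const i = nextIndex++;
        inFlight++;
        Promise.resolve(fn(array[i])).then(value => {
          results[i] = value;
          inFlight--;
          finished++;
          if (finished === array.length) {
            resolve(results);
          } else {
            runMore();
          }
        }, reject);
      }
    }
    runMore();
  });
}
```

In the worker scenario, fn would wrap one worker's postMessage/onmessage pair in a promise, so only maxConcurrent workers run at once.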

Pattern for Observables that includes acknowledgement

I'm working on something that is recording data coming from a queue. It was easy enough to process the queue into an Observable so that I can have multiple endpoints in my code receiving the information in the queue.
Furthermore, I can be sure that the information arrives in order. That bit works nicely as well since the Observables ensure that. But, one tricky bit is that I don't want the Observer to be notified of the next thing until it has completed processing the previous thing. But the processing done by the Observer is asynchronous.
As a more concrete example that is probably simple enough to follow: imagine my queue contains URLs. I'm exposing those as an Observable in my code. Then I subscribe an Observer whose job is to fetch the URLs and write the content to disk (this is a contrived example, so don't take issue with these specifics). The important point is that fetching and saving are async. My problem is that I don't want the observer to be given the "next" URL from the Observable until it has completed the previous processing.
But the call to next on the Observer interface returns void. So there is no way for the Observer to communicate back to me that it has actually completed the async task.
Any suggestions? I suspect there is probably some kind of operator that could be coded up that would basically withhold future values (queue them up in memory?) until it somehow knew the Observer was ready for it. But I was hoping something like that already existed following some established pattern.
I ran into a similar use case before:
window.document.onkeydown = (e) => {
  return false
}
let count = 0;
let asyncTask = (name, time) => {
  time = time || 2000
  return Rx.Observable.create(function(obs) {
    setTimeout(function() {
      count++
      obs.next('task:' + name + count);
      console.log('Task:', count, ' ', time, 'task complete')
      obs.complete();
    }, time);
  });
}
let subject = new Rx.Subject()
let queueExec$ = new Rx.Subject()
Rx.Observable.fromEvent(btnA, 'click').subscribe(() => {
  queueExec$.next(asyncTask('A', 4000))
})
Rx.Observable.fromEvent(btnB, 'click').subscribe(() => {
  queueExec$.next(asyncTask('B', 4000))
})
Rx.Observable.fromEvent(btnC, 'click').subscribe(() => {
  queueExec$.next(asyncTask('C', 4000))
})
queueExec$.concatMap(value => value)
  .subscribe(function(data) {
    console.log('onNext', data);
  },
  function(error) {
    console.log('onError', error);
  }, function() {
    console.log('completed')
  });
What you describe sounds like "backpressure". You can read about it in the RxJS 4 documentation: https://github.com/Reactive-Extensions/RxJS/blob/master/doc/gettingstarted/backpressure.md. However, it mentions operators that don't exist in RxJS 5. For example, have a look at "Controlled Observables", which should cover what you need.
I think you could achieve the same with concatMap and an instance of Subject:
const asyncOperationEnd = new Subject();
source.concatMap(val => asyncOperationEnd
    .mapTo(void 0)
    .startWith(val)
    .take(2) // that's `val` and the `void 0` that ends this inner Observable
  )
  .filter(Boolean) // Always ignore `void 0`
  .subscribe(val => {
    // do some async operation...
    // call `asyncOperationEnd.next()` and let `concatMap` process another value
  });
From your description it actually seems like the "observer" you're mentioning works like a Subject, so it might make more sense to make a custom Subject class that you could use in any Observable chain.
Isn't this just concatMap?
// Requests are coming in a stream, with small intervals or without any.
const requests = Rx.Observable.of(2, 1, 16, 8, 16)
  .concatMap(v => Rx.Observable.timer(1000).mapTo(v));
// Fetch, it takes some time.
function fetch(query) {
  return Rx.Observable.timer(100 * query)
    .mapTo('!' + query).startWith('?' + query);
}
requests.concatMap(q => fetch(q));
https://rxviz.com/v/Mog1rmGJ
If you want to allow multiple fetches simultaneously, use mergeMap with concurrency parameter.
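Without RxJS, the same "next value only after the previous one is acknowledged" behaviour can be sketched with a promise chain. This is a plain-JavaScript analogue of concatMap, not the RxJS implementation:

```javascript
// Serializes async handlers: each pushed value is processed only after
// the handler for the previous value has resolved (the acknowledgement).
function createAckQueue(handler) {
  let tail = Promise.resolve();
  return function push(value) {
    // Chain this value's processing after everything already queued.
    tail = tail.then(() => handler(value));
    return tail; // resolves when this value has been fully processed
  };
}
```

In the URL example, handler would fetch a URL and write it to disk; pushing the next URL never starts it early, because it is chained behind the previous handler's promise.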

Sending AJAX ASYNC Request one after another at an interval of 15 seconds

I want to poll the data from the web server (PHP) at an interval of 15 seconds, 50 to 100 times (or, let's say, in an infinite loop, until the stopFlag variable is set to true).
For this data polling, I am going to send asynchronous AJAX requests to the web server.
How can I achieve this?
I have tried to solve this puzzle by myself, but unfortunately I failed, as there is no keyword for pausing script execution in JavaScript.
Is there any way to make it work, or any workaround? Kindly let me know, or share your experience if you have already faced this issue.
You have to use a callback for the timeout; it will recursively call the next function.
You can also use jQuery which can help you to make your code more compact. The result might look something like this:
var finished = false;
function keepTrying() {
  if (finished) {
    return;
  }
  $.ajax(params);
  setTimeout(function() {
    keepTrying();
  }, 15000);
}
And in params you would have a success function like this:
function success() {
  finished = true;
}
Just call keepTrying() the first time; it will loop until it is successful. This code is a bit ugly but hopefully you get the idea.
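The same pattern can be sketched as a reusable helper, with the task and the interval injected (the helper name pollUntil and the success convention are my assumptions, not from the answer above):

```javascript
// Repeatedly runs task() every intervalMs until it resolves to true
// or maxAttempts is reached. Resolves with the number of attempts made.
function pollUntil(task, intervalMs, maxAttempts) {
  return new Promise((resolve, reject) => {
    let attempts = 0;
    function attempt() {
      attempts++;
      Promise.resolve(task()).then(done => {
        if (done) {
          resolve(attempts);
        } else if (attempts >= maxAttempts) {
          reject(new Error('gave up after ' + attempts + ' attempts'));
        } else {
          // Schedule the next attempt; nothing blocks in between.
          setTimeout(attempt, intervalMs);
        }
      }, reject);
    }
    attempt();
  });
}
```

With the question's setup, task would issue the AJAX request to the PHP endpoint and resolve to the stop condition, intervalMs would be 15000, and maxAttempts 50 to 100.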

mocha - callback to "end" never gets called?

I'm new to mocha and am trying to implement a new test. I'm finding that the callback to my end method never gets called:
it('should allow valid urls', function() {
  var myUrl = "http://localhost:8080/test";
  api.get(myUrl)
    .end(function(err, res) {
      console.log('THIS IS THE END, MY FRIEND');
    });
});
Does anyone know why? I've tried expect with a callback as well, and it never gets called too.
Turns out that mocha does analysis of the function's arguments, and I'd forgotten to put an argument in my mocha callback, even though it's never referenced in my function or any visible code at all!
So the solution was simply to add a parameter, done, to my it function, and it worked, even though it's not visibly used in the immediate context ;-)
it('should allow valid urls', function(done){
...
EDIT: Note that done should be called in my callback, as mentioned by @oligofren, but I hadn't gotten to that point yet and was surprised to see the callback itself not firing.
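The arity check can be seen in plain JavaScript: a function's declared parameter count is its length property, which is, as far as I understand it, what mocha inspects to decide whether a test is asynchronous:

```javascript
// A test callback with no parameters: mocha treats it as synchronous.
const syncTest = function () {};

// A test callback that declares `done`: mocha sees length === 1 and
// waits for done() to be called before ending the test.
const asyncTest = function (done) {};

console.log(syncTest.length, asyncTest.length); // 0 1
```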
