RxJS Service Call Throttling / Queuing

I'm attempting to use RxJS to implement service call throttling / queuing.
For example, Google Maps' Geocoder API. Let's say I don't want this to be called more than once a second, but one or more parts of my application may request a geocode more often than that. I'd want the requests to queue, with adjacent requests being at least 1s apart, but I'd also want to be able to 'cancel' a request if it is no longer required during this wait.
Is this an applicable use of RxJS, and if so what might this look like?
Thanks.

Here is something that should guide you (jsfiddle):
// Helper functions
function remove_from_queue(queue, id) {
  queue.forEach(function (x, index) {
    if (x.execute.request === id) {
      queue.splice(index, 1);
    }
  });
  // console.log('queue after removal', queue);
}

function add_to_queue(queue, id) {
  queue.push({ execute : { request : id } });
}

function getFirstInQueue(queue) {
  return queue[0];
}

function noop(x) {}

function log(label) {
  return function (x) {
    console.log.call(console, label, x);
  };
}

function timestamp(label) {
  return function (x) {
    console.log.call(console, Date.now() - startingDate, label, x);
  };
}

function label(label) {
  return function (x) {
    var res = {};
    res[label] = x;
    return res;
  };
}
var startingDate = Date.now();

var requests$ = Rx.Observable.generateWithRelativeTime(
  { request : 1 },
  function (x) { return x.request < 10; },
  function (x) { return { request : x.request + 1 }; },
  function (x) { return x; },
  function (x) { return 100; }
);

var cancelledRequests$ = Rx.Observable.generateWithRelativeTime(
  { request : 1 },
  function (x) { return x.request < 20; },
  function (x) { return { request : x.request + 4 }; },
  function (x) { return x; },
  function (x) { return 500; }
);

var timer$ = Rx.Observable.interval(990).map(function () { return {}; }).take(10);

var source$ = Rx.Observable.merge(
  requests$.map(label('execute')),
  cancelledRequests$.map(label('cancel')),
  timer$
)
//.do(log('source'));
var controlledSource$ = source$
  .scan(function (state, command) {
    var requestsToExecuteQueue = state.requestsToExecuteQueue;
    if (command.cancel) {
      remove_from_queue(requestsToExecuteQueue, command.cancel.request);
    }
    if (command.execute) {
      add_to_queue(requestsToExecuteQueue, command.execute.request);
    }
    console.log('queue', requestsToExecuteQueue.slice());
    return {
      command : command,
      requestExec$ : Rx.Observable
        .return(getFirstInQueue(requestsToExecuteQueue))
        .filter(function (x) { return x; })
        .do(function (x) { remove_from_queue(requestsToExecuteQueue, x.execute.request); }),
      requestsToExecuteQueue : requestsToExecuteQueue
    };
  }, { command : undefined, requestExec$ : undefined, requestsToExecuteQueue : [] })
  .pluck('requestExec$')
  .sample(Rx.Observable.interval(1000))
  .mergeAll();

controlledSource$.do(timestamp('executing request:')).subscribe(noop);
Basically:
scan is used to manage the state (the queue of requests, additions and removals)
for each incoming event, we pass along an observable which (when subscribed to) releases the first element of the queue and removes that element from the queue
sample is used to take one such observable every second
mergeAll subscribes to that observable
we have to use a timer$ observable to keep polling the queue even after the source of requests has completed (you still need to empty the queue of remaining requests). You can adapt that logic to your real case, for example by having timer$ emit for X seconds after your source completes, or whatever suits you best.
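For reference, the same queue-with-cancellation idea can be sketched more compactly with pipeable RxJS (v6+). This is only an illustration, not the code above: it assumes each request carries an id, geocode() stands in for the real Geocoder call, and unlike the queue above a cancelled request still consumes its one-second slot here.
import { Subject, of } from "rxjs";
import { concatMap, delay, filter } from "rxjs/operators";

// Push { id, address } objects here from anywhere in the app.
const geocodeRequests$ = new Subject();
// Ids cancelled while their request is still waiting in the queue.
const cancelled = new Set();
const cancelGeocode = id => cancelled.add(id);

const throttledGeocodes$ = geocodeRequests$.pipe(
  // concatMap processes one request at a time, in order.
  concatMap(req =>
    of(req).pipe(
      delay(1000),                         // keep adjacent executions at least 1s apart
      filter(r => !cancelled.delete(r.id)) // drop the request if it was cancelled while queued
    )
  )
);

// geocode() is a placeholder for the actual Google Maps Geocoder call.
throttledGeocodes$.subscribe(req => geocode(req.address));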

Related

Re-execute async RxJS stream after delay

I'm using RxJS 6 to lazily step through iterable objects using code similar to the example below. This is working well, but I'm having trouble solving my final use case.
Full code here
import { EMPTY, defer, from, of } from "rxjs";
import { delay, expand, mergeMap, repeat } from "rxjs/operators";

function stepIterator(iterator) {
  return defer(() => of(iterator.next())).pipe(
    mergeMap(result => result.done ? EMPTY : of(result.value))
  );
}

function iterateValues({ params }) {
  const { values, delay: delayMilliseconds } = params;
  const isIterable = typeof values[Symbol.iterator] === "function";
  // Iterable values which are emitted over time are handled manually. Otherwise
  // the values are provided to Rx for resolution.
  if (isIterable && delayMilliseconds > 0) {
    const iterator = values[Symbol.iterator]();
    // The first value is emitted immediately, the rest are emitted after time.
    return stepIterator(iterator).pipe(
      expand(v => stepIterator(iterator).pipe(delay(delayMilliseconds)))
    );
  } else {
    return from(values);
  }
}

const options = {
  params: {
    // Any iterable object is walked manually. Otherwise delegate to `from()`.
    values: ["Mary", "had", "a", "little", "lamb"],
    // Delay _between_ values.
    delay: 350,
    // Delay before the stream restarts _after the last value_.
    runAgainAfter: 1000,
  }
};

iterateValues(options)
  // Is not repeating?!
  .pipe(repeat(3))
  .subscribe(
    v => {
      console.log(v, Date.now());
    },
    console.error,
    () => {
      console.log('Complete');
    }
  );
I'd like to add in another option which will re-execute the stream, an indefinite number of times, after a delay (runAgainAfter). I'm having trouble composing this in cleanly without factoring the result.done case deeper. So far I've been unable to compose the run-again behavior around iterateValues.
What's the best approach to accomplish the use case?
Thanks!
Edit 1: repeat just hit me in the face. Perhaps it means to be friendly.
Edit 2: No, repeat isn't repeating but the observable is completing. Thanks for any help. I'm confused.
For posterity, here is the full code sample for a revised edition that is repeat-able and uses a consistent delay between items.
import { concat, EMPTY, defer, from, interval, of, throwError } from "rxjs";
import { delay, expand, mergeMap, repeat } from "rxjs/operators";

function stepIterator(iterator) {
  return defer(() => of(iterator.next())).pipe(
    mergeMap(result => (result.done ? EMPTY : of(result.value)))
  );
}

function iterateValues({ params }) {
  const { values, delay: delayMilliseconds, times = 1 } = params;
  const isIterable =
    values != null && typeof values[Symbol.iterator] === "function";
  if (!isIterable) {
    return throwError(new Error(`\`${values}\` is not iterable`));
  }
  // Iterable values which are emitted over time are handled manually. Otherwise
  // the values are provided to Rx for resolution.
  const observable =
    delayMilliseconds > 0
      ? defer(() => of(values[Symbol.iterator]())).pipe(
          mergeMap(iterator =>
            stepIterator(iterator).pipe(
              expand(v => stepIterator(iterator).pipe(delay(delayMilliseconds)))
            )
          )
        )
      : from(values);
  return observable.pipe(repeat(times));
}
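A minimal usage sketch for this revised version (same option names as before, with the new times parameter; the runAgainAfter delay is handled separately below):
iterateValues({
  params: {
    values: ["Mary", "had", "a", "little", "lamb"],
    delay: 350, // delay between values
    times: 3    // repeat the whole sequence three times
  }
}).subscribe(v => console.log(v, Date.now()));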
I'll be honest, there could well be a better solution. In mine, I ended up encapsulating the delay logic in a custom runAgainAfter operator, making it an independent piece that doesn't affect your code logic directly.
Full working code is here
And the code of runAgainAfter if anybody needs it:
import { Observable } from "rxjs";

export const runAgainAfter = delay => observable => {
  return new Observable(observer => {
    let timeout;
    let subscription;
    const subscribe = () => {
      return observable.subscribe({
        next(value) {
          observer.next(value);
        },
        error(err) {
          observer.error(err);
        },
        complete() {
          // Instead of completing, re-subscribe to the source after the delay.
          timeout = setTimeout(() => {
            subscription = subscribe();
          }, delay);
        }
      });
    };
    subscription = subscribe();
    return () => {
      subscription.unsubscribe();
      clearTimeout(timeout);
    };
  });
};
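Since runAgainAfter(delay) is a plain Observable-to-Observable function, it can be passed straight to pipe. A minimal usage sketch, assuming iterateValues and options from the question are in scope:
iterateValues(options)
  .pipe(runAgainAfter(options.params.runAgainAfter))
  .subscribe(v => console.log(v, Date.now()));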
Hope it helps <3

How to batch additions to arrays/lists returned by rxjs observables?

I have an observable that returns arrays/lists of things: Observable
And I have a use case where it is a pretty costly affair for the downstream consumer of this observable to have more items added to this list. So I'd like to slow down the rate of additions made to this list, but not lose any.
Something like an operator that takes this observable and returns another observable with the same signature, but whenever a new list gets pushed on it and it has more items than last time, then only one or a few are added at a time.
So if the last push was a list with 3 items and next push has 3 additional items with 6 items in total, and the batch size is 1, then this one list push gets split into 3 individual pushes of lists with lengths: 4, 5, 6
So additions are batched, and this way the consumer can more easily keep up with new additions to the list. Or the consumer doesn't have to stall for so long each time while processing additional items in the array/list, because the additions are split up and spread over a configurable size of batches.
I made an addAdditionalOnIdle operator that you can apply to any rxjs observable using the pipe operator. It takes a batchSize parameter, so you can configure the batch size. It also takes a dontBatchAfterThreshold, which stops batching once the list exceeds a certain size; that was useful for my purposes. The result also contains a morePending value, which you can use to show a loading indicator while you know more data is incoming.
The implementation uses the new requestIdleCallback function internally to schedule the batched pushes of additional items when there is idle time in the browser. This function is not available in IE or Safari yet, but I found this excellent polyfill for it, so you can use it today anyway: https://github.com/aFarkas/requestIdleCallback :)
See the implementation and example usage of addAdditionalOnIdle below:
const { NEVER, of, Observable } = rxjs;
const { concat } = rxjs.operators;

/**
 * addAdditionalOnIdle
 *
 * Only works on observables that produce values that are of type Array.
 * Adds additional elements on window.requestIdleCallback
 *
 * @param batchSize The amount of values that are added on each idle callback
 * @param dontBatchAfterThreshold Return all items after amount of returned items passes this threshold
 */
function addAdditionalOnIdle(
  batchSize = 1,
  dontBatchAfterThreshold = 22,
) {
  return (source) => {
    return Observable.create((observer) => {
      let idleCallback;
      let currentPushedItems = [];
      let lastItemsReceived = [];
      let sourceSubscription = source
        .subscribe({
          complete: () => {
            observer.complete();
          },
          error: (error) => {
            observer.error(error);
          },
          next: (items) => {
            lastItemsReceived = items;
            if (idleCallback) {
              return;
            }
            if (lastItemsReceived.length > currentPushedItems.length) {
              const idleCbFn = () => {
                if (currentPushedItems.length > lastItemsReceived.length) {
                  observer.next({
                    morePending: false,
                    value: lastItemsReceived,
                  });
                  idleCallback = undefined;
                  return;
                }
                const to = currentPushedItems.length + batchSize;
                const last = lastItemsReceived.length;
                if (currentPushedItems.length < dontBatchAfterThreshold) {
                  for (let i = 0; i < to && i < last; i++) {
                    currentPushedItems[i] = lastItemsReceived[i];
                  }
                } else {
                  currentPushedItems = lastItemsReceived;
                }
                if (currentPushedItems.length < lastItemsReceived.length) {
                  idleCallback = window.requestIdleCallback(() => {
                    idleCbFn();
                  });
                } else {
                  idleCallback = undefined;
                }
                observer.next({
                  morePending: currentPushedItems.length < lastItemsReceived.length,
                  value: currentPushedItems,
                });
              };
              idleCallback = window.requestIdleCallback(() => {
                idleCbFn();
              });
            } else {
              currentPushedItems = lastItemsReceived;
              observer.next({
                morePending: false,
                value: currentPushedItems,
              });
            }
          },
        });
      return () => {
        sourceSubscription.unsubscribe();
        sourceSubscription = undefined;
        lastItemsReceived = undefined;
        currentPushedItems = undefined;
        if (idleCallback) {
          window.cancelIdleCallback(idleCallback);
          idleCallback = undefined;
        }
      };
    });
  };
}

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      break;
    }
  }
}

let testSource = of(
  [1, 2, 3],
  [1, 2, 3, 4, 5, 6],
).pipe(
  concat(NEVER)
);

testSource
  .pipe(addAdditionalOnIdle(2))
  .subscribe((list) => {
    // Simulate a slow synchronous consumer with a busy loop sleep implementation
    sleep(1000);
    document.body.innerHTML += "<p>" + list.value + "</p>";
  });

<script src="https://unpkg.com/rxjs@6.5.3/bundles/rxjs.umd.js"></script>

Wait for n executions of a method and then continue after complete

I have a function that returns a promise:
Setup.zoomIn() : Promise<void> {...}
I would like to use rxjs to invoke that function 5 times, with a delay of 1 second between each, like this:
let obs = Observable.create(observer => {
  let count = 0;
  setTimeout(() => {
    Setup.zoomIn();
    count++;
    observer.next();
  }, 1000);
  if (count === 5) { observer.complete(); }
});

obs.subscribe(() =>
  console.log('zoomed out')
);
Only when that has finished executing would I like to continue and perform further steps:
obs.toPromise().then(() => {
  // do some stuff here but only after zoom has been invoked 5 times
});
Create a list of observables for the zoomIn calls and concat them with another Observable.
function zoomIn(i) {
  return new Promise(res => {
    setTimeout(() => res(i), 1000);
  });
}

function anotherPromise() {
  return Rx.Observable.defer(() => {
    return new Promise(res => {
      setTimeout(() => res('anotherPromise'), 3000);
    });
  });
}

const zoomInList = Array(5).fill(0).map((x, i) => i).map(i =>
  Rx.Observable.defer(() => {
    return zoomIn(i);
  })
);

Rx.Observable.concat(...zoomInList, anotherPromise())
  .subscribe(x => console.log(x));
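If you then need a plain promise to continue with further steps once the five zoom-ins have run (as in the question), the concatenated observable can be converted with toPromise(), which resolves when the sequence completes. A minimal sketch reusing zoomInList from above:
Rx.Observable.concat(...zoomInList)
  .toPromise()
  .then(() => {
    // do some stuff here, but only after zoomIn has been invoked 5 times
  });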

How to return from within an observer?

I was trying to return from the filter function, but return doesn't seem to work with callbacks. Here this.store.let(getIsPersonalized$) is an observable emitting boolean values, and this.store.let(getPlayerSearchResults$) is an observable emitting objects of the video class.
How do I run this synchronously? Can I avoid the asynchronous callback altogether, given that I can't modify the observables received from the store?
isPersonalized$ = this.store.let(getIsPersonalized$);
videos$ = this.store.let(getPlayerSearchResults$)
  .map((vids) => this.myFilter(vids));

myFilter(vids) {
  this.isPersonalized$.subscribe((x) => {
    if (x) {
      return this.fileterX(vids); // Return from here
    } else {
      return this.fileterY(vids); // Or Return from here
    }
  });
}

fileterX(vids) {
  return vids.filter((vid) => vid.views > 100);
}

fileterY(vids) {
  return vids.filter((vid) => vid.views < 20);
}
I got it working this way: you don't need myFilter(vids) at all if you move the branching into isPersonalized$'s subscribe. Here is the updated code.
isPersonalized$ = this.store.let(getIsPersonalized$);
videos$: Observable<any>;

ngOnInit() {
  this.isPersonalized$.subscribe((x) => {
    if (x) {
      this.videos$ = this.store.let(getPlayerSearchResults$)
        .map((vids) => this.fileterX(vids));
    } else {
      this.videos$ = this.store.let(getPlayerSearchResults$)
        .map((vids) => this.fileterY(vids));
    }
  });
}

fileterX(vids) {
  return vids.filter((vid) => vid.views > 100);
}

fileterY(vids) {
  return vids.filter((vid) => vid.views < 20);
}
It looks like you want to evaluate the latest value of isPersonalized$ within the map function; I'd do that via withLatestFrom (example: the first observable toggles true/false every 5s, the second emits an increasing number every 1s):
const isPersonalized$ = Rx.Observable.interval(5000)
  .map(value => value % 2 === 0);

const getPlayerSearchResults$ = Rx.Observable.interval(1000)
  .withLatestFrom(isPersonalized$)
  .map(bothValues => {
    const searchResult = bothValues[0];
    const isPersonalized = bothValues[1];
    ...
  });
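Applied to the question's code, the branching then moves into the map itself. A sketch using the question's own identifiers (fileterX/fileterY), under the assumption that isPersonalized$ emits before the first search result arrives:
videos$ = this.store.let(getPlayerSearchResults$)
  .withLatestFrom(this.isPersonalized$)
  .map(([vids, isPersonalized]) =>
    isPersonalized ? this.fileterX(vids) : this.fileterY(vids)
  );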

Transitioning away from Object.observe

I've been using Object.observe() as part of a nw.js project that is now transitioning from nw.js v0.12.3 to the latest version.
I have code like this:
..(myclass)..
data: { a: 0, b: 42 },
setupHandlers: function () {
  Object.observe(this.data, changes => this.draw());
},
draw: function () { .. }
My initial conversion looks like:
data: { _a: 0, _b: 42 },
get a() { return this.data._a; },
set a(val) { this.data._a = val; this.draw(); },
get b() { return this.data._b; },
set b(val) { this.data._b = val; this.draw(); },
and then change every place that wrote to data (myobj.data.a = 1) to instead write to the object (myobj.a = 1), thus using the setter.
It's a very labor-intensive conversion; is there an easier way?
We ended up using Proxy to catch attribute assignment:
const shallow_observer = function (obj, fn) {
  return new Proxy(obj, {
    set(target, name, val) {
      target[name] = val;
      if (fn) fn(target, name, val);
      return true;
    }
  });
};
which allowed us to do:
data: { a: 0, b: 42 },
setupHandlers: function () {
  this.data = shallow_observer(this.data, (data, field, value) => this.draw());
},
draw: function () { .. }
We have a deep_observer function too (which is much more complex), that detects changes in a nested data structure, but the shallow_observer was sufficient for all our use-cases.
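For illustration only, a much simplified sketch of the idea (not the actual deep_observer mentioned above): nested objects are wrapped in Proxies lazily on property access, so writes at any depth trigger the callback:
const deep_observer = function (obj, fn) {
  const handler = {
    get(target, name) {
      const value = target[name];
      // Wrap nested objects on the way down so their writes are observed too.
      return value && typeof value === "object" ? new Proxy(value, handler) : value;
    },
    set(target, name, val) {
      target[name] = val;
      if (fn) fn(target, name, val);
      return true;
    }
  };
  return new Proxy(obj, handler);
};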
