I found a custom operator that I want to use.
This is an operator that retries HTTP requests. The code is from Stephen Fluin: https://github.com/StephenFluin/http-operators/blob/master/operators/retryExponentialBackoff.operator.ts.
The problem is that after all these retries it does not put an error in the stream, it only completes.
I want it to throw an error. How can I do that?
I think this part should be modified:
error(err: any) {
if (count <= maxTries) {
subscription.add(scheduler.schedule(subscribe, initialWait * Math.pow(2, count++)));
}
},
Here is the whole operator's code
/**
* Repeats underlying observable on a timer
*
 * @param maxTries The maximum number of attempts to make, or -1 for unlimited
 * @param initialWait Number of seconds to wait for refresh
*/
export const retryExponentialBackoff = (
maxTries = -1,
initialWait = 1,
scheduler: SchedulerLike = asyncScheduler
) => <T>(
source: Observable<T>
) => {
return new Observable<T>(subscriber => {
let count = 1;
const subscription = new Subscription();
const subscribe = () =>
subscription.add(
source.subscribe({
next(value: T) {
count = 1;
subscriber.next(value);
},
error(err: any) {
if (count <= maxTries) {
subscription.add(scheduler.schedule(subscribe, initialWait * Math.pow(2, count++)));
}
},
complete() {
subscriber.complete();
},
})
);
subscribe();
return subscription;
});
};
I would try to add the error bubbling to the subscriber like so:
error(err: any) {
if (count <= maxTries) {
subscription.add(scheduler.schedule(subscribe, initialWait * Math.pow(2, count++)));
}
else {
subscriber.error(err);
}
},
So that after your maxTries count has been exhausted, the error is emitted downstream.
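For illustration, a minimal usage sketch with the modified operator; the http client and the '/api/data' endpoint are placeholders, not part of the original code:
// hypothetical usage; http is assumed to be Angular's HttpClient and '/api/data' a placeholder endpoint
http.get('/api/data').pipe(
  retryExponentialBackoff(3, 1)
).subscribe({
  next: data => console.log(data),
  // with the else branch added, this handler is reached once maxTries is exhausted
  error: err => console.error('Failed after retries', err),
});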
I am trying to use cypress-wait-until (https://github.com/NoriSte/cypress-wait-until) for a simple case:
Visit a page
Check if an element is there
If not, reload the page until this element is found
Once found, assert the element exists
Working code (cypress-wait-until not used)
before(() => {
cy.visit('http://localhost:8080/en/registration');
});
describe('Foo', () => {
it('should check that registration button is displayed', () => {
const selector = 'button[data-test=startRegistration-individual-button]';
cy.get(selector).should('exist');
});
});
Not working, timed out retrying
before(() => {
cy.visit('http://localhost:8080/en/registration');
});
describe('Foo', () => {
it('should check that registration button is displayed', () => {
const options = { timeout: 8000, interval: 4000 };
const selector = 'button[data-test=startRegistration-individual-button]';
cy.waitUntil(() => cy.reload().then(() => Cypress.$(selector).length), options);
cy.get(selector).should('exist');
});
});
Not working, see error below
before(() => {
cy.visit('http://localhost:8080/en/registration');
});
describe('Foo', () => {
it('should check that registration button is displayed', () => {
const options = { timeout: 8000, interval: 4000 };
const selector = 'button[data-test=startRegistration-individual-button]';
cy.waitUntil(() => {
cy.reload();
return Cypress.$(selector).length;
}, options);
cy.get(selector).should('exist');
});
});
For the two versions that do not work, as soon as I remove cy.reload(), they start to work.
Question
What can I do to make it work with a reload?
EDIT
This command I wrote works correctly.
Cypress.Commands.add('refreshUntil', (selector: string, opts?: { retries: number; waitAfterRefresh: number }) => {
const defaultOptions = {
retries: 10,
waitAfterRefresh: 2500,
};
const options = { ...defaultOptions, ...opts };
function check(selector: string): any {
if (Cypress.$(selector).length) { // Element is there
return true;
}
if (options.retries === 0) {
throw Error(`${selector} not found`);
}
options.retries -= 1;
cy.log(`Element ${selector} not found. Remaining attempts: ${options.retries}`);
cy.reload();
// Wait some time for the server to respond
return cy.wait(options.waitAfterRefresh).then(() => check(selector));
}
check(selector);
});
I could see two potential differences with waitUntil from cypress-wait-until:
Cypress.$(selector).length would be new on each try
There is a wait time after the reload before checking again if the element is there
EDIT 2
Here is the working solution using cypress-wait-until
cy.waitUntil(() => cy.reload().wait(2500).then(() => Cypress.$(selector).length), options);
Cypress rules apply inside cy.waitUntil() as well as outside, so .wait(2500) would be bad practice.
It would be better to change your non-retrying Cypress.$(selector).length into a proper retrying Cypress command. That way you get 4 seconds (default) of retrying but only wait as long as needed.
Particularly since cy.waitUntil() repeats n times, you are hard-waiting (wasting) a lot of seconds.
cy.waitUntil(() => { cy.reload(); return cy.get(selector); }, options)
// NB retry period for `cy.get()` is > 2.5 seconds
Here is the working solution using cypress-wait-until
cy.waitUntil(() => cy.reload().wait(2500).then(() => Cypress.$(selector).length), options);
I ended up writing my own method (inspired by cypress-wait-until) without the need for a hard wait time.
/**
 * Run a check, then reload and wait until an element is displayed.
 * Retries for a specified amount of time.
 *
 * @function
 * @param {function} firstCheckFunction - The function to run before checking if the element is displayed.
 * @param {string|{ selector: string, type: string }} selector - The selector to search for. Can be a string or an object with selector and type properties.
 * @param {WaitUntilOptions} [opts={timeout: 5000, interval: 500}] - The options object, with timeout and interval properties.
 * @throws {Error} if the firstCheckFunction parameter is not a function.
 * @throws {Error} if the specified element is not found after all retries.
 * @example
 * cy.waitFirstRefreshUntilDisplayed(() => {...}, '#element-id');
 * cy.waitFirstRefreshUntilDisplayed(() => {...}, { selector: 'element-id', type: 'div' });
*/
Cypress.Commands.add('waitFirstRefreshUntilDisplayed', (firstCheckFunction, selector: string | { selector: string, type: string }, opts = {}) => {
if (!(firstCheckFunction instanceof Function)) {
throw new Error(`\`firstCheckFunction\` parameter should be a function. Found: ${firstCheckFunction}`);
}
let parsedSelector = '';
// Define the default options for the underlying `cy.wait` command
const defaultOptions = {
timeout: 5000,
interval: 500,
};
const options = { ...defaultOptions, ...opts };
// Calculate the number of retries to wait for the element to be displayed
let retries = Math.floor(options.timeout / options.interval);
const totalRetries = retries;
if (typeof selector === 'string') {
parsedSelector = selector;
}
if (typeof selector !== 'string' && selector.selector && selector.type) {
parsedSelector = `${selector.type}[data-test=${selector.selector}]`;
}
// Define the check function that will be called recursively until the element is displayed
function check(selector: string): boolean {
if (Cypress.$(selector).length) { // Element exists
return true;
}
if (retries < 1) {
throw Error(`${selector} not found`);
}
if (totalRetries !== retries) { // we don't reload the first time
cy.log(`Element ${parsedSelector} not found. ${retries} left`);
cy.reload();
}
// Wait for firstCheckFunction to return true,
// then pause for the time defined in options.interval,
// and call the check function recursively
cy.waitUntil(firstCheckFunction, options).then(() => { // first wait for firstCheckFunction to be true
cy.wait(options.interval).then(() => { // Then we loop until the selector is displayed
retries -= 1;
return check(selector);
});
});
return false;
}
check(parsedSelector);
});
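For illustration, a hypothetical call from a spec file; the '#app' check, the selector object, and the option values are placeholders:
it('should check that registration button is displayed', () => {
  cy.visit('http://localhost:8080/en/registration');
  cy.waitFirstRefreshUntilDisplayed(
    () => Cypress.$('#app').length > 0, // placeholder check: wait for the app shell to render first
    { selector: 'startRegistration-individual-button', type: 'button' },
    { timeout: 10000, interval: 500 }
  );
  cy.get('button[data-test=startRegistration-individual-button]').should('exist');
});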
I'm using RxJS to connect to a WebSocket service, and in case of failure I want to retry 3 times, wait 30 seconds, and then repeat infinitely. How can I do this?
I found a solution. First, create the following operator:
function retryWithDelay<T>(
repetitions: number,
delay: number
): (a: Observable<T>) => Observable<T> {
let count = repetitions;
return (source$: Observable<T>) =>
source$.pipe(
retryWhen((errors) =>
errors.pipe(
delayWhen(() => {
count--;
if (count === 0) {
count = repetitions;
return timer(delay);
}
return timer(0);
})
)
)
);
}
Then, use it like this:
function connect(url: string) {
return webSocket({ url })
.pipe(retryWithDelay(3, 30000));
}
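For completeness, a minimal consumption sketch (the URL is a placeholder). Since the operator resets its counter and keeps retrying, the error callback is not expected to fire under normal circumstances:
// placeholder URL; messages are logged as they arrive
const socket$ = connect('wss://example.com/ws');

const subscription = socket$.subscribe({
  next: message => console.log('received', message),
  error: err => console.error(err), // not expected to fire, since the operator retries indefinitely
});

// later, when the connection is no longer needed:
// subscription.unsubscribe();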
You can do this as follows:
//emit value every 200ms
const source = Rx.Observable.interval(200);
//output the observable
const example = source
.map(val => {
if (val > 5) {
throw new Error('The request failed somehow.');
}
return val;
})
.retryWhen(errors => errors
//log error message
.do(val => console.log(`Some error occurred: ${val}, pausing for 30 seconds.`))
//restart in 30 seconds
.delayWhen(val => Rx.Observable.timer(30 * 1000))
);
const subscribe = example
.subscribe({
next: val => console.log(val),
error: val => console.log(`This will never happen`)
});
See the working example: https://jsbin.com/goxowiburi/edit?js,console
Be aware that this is an infinite loop, so make sure you are not introducing unintended consequences into your code.
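If you ever want to give up instead of retrying forever, one common variation (a sketch, not part of the original answer; the limit of 3 is an arbitrary example) is to count the failures with scan inside retryWhen and re-throw once the limit is reached:
const limitedExample = source
  .map(val => {
    if (val > 5) {
      throw new Error('The request failed somehow.');
    }
    return val;
  })
  .retryWhen(errors => errors
    // count the failures; after the 3rd one, re-throw so the error reaches the subscriber
    .scan((attempts, err) => {
      if (attempts >= 3) {
        throw err;
      }
      return attempts + 1;
    }, 0)
    // otherwise wait 30 seconds before the next attempt
    .delayWhen(() => Rx.Observable.timer(30 * 1000))
  );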
I'm using NestJS. I want to get all data from a paginated API (I don't know the total number of pages). Right now I'm using a while loop to get all the data until the API returns 204 No Content. This is my code so far:
async getProduct() {
let productFinal: ProductItem[] = [];
let products: ProductItem[] = [];
let offset = 1;
let state = COLLECTING_STATE.InProgress;
let retryCount = 1;
do {
const path = `product?limit=50&offset=${offset}`;
products = await this.httpService
.get(path, { headers, validateStatus: null })
.pipe(
concatMap((response) => {
// if the statusCode is "204", the loop is complete
if (response.status === 204) {
state = COLLECTING_STATE.Finish;
}
// check if the response is error
if (response.status < 200 || response.status >= 300) {
// log error
Logger.error(
`[ERROR] Error collecting product on offset: ${offset}. StatusCode: ${
response.status
}. Error: ${JSON.stringify(response.data)}. Retrying... (${retryCount})`,
undefined,
'Collect Product'
);
// increment the retryCount
retryCount++;
// return throwError to trigger retry event
return throwError(`[ERROR] Received status ${response.status} from HTTP call`);
}
// return the data if OK
return of(response.data.item);
}),
catchError((err) => {
if (err?.code) {
// log error
Logger.error(
`Connection error: ${err?.code}. Retrying... (${retryCount})`,
undefined,
'Collect Product'
);
// increment the retryCount
retryCount++;
}
return throwError(err);
}),
// retry three times
retry(3),
// if still error, then stop the loop
catchError((err) => {
Logger.error(
`[ERROR] End retrying. Error: ${err?.code ?? err}`,
undefined,
'Collect Product'
);
state = COLLECTING_STATE.Finish;
return of(err);
})
)
.toPromise();
// set retryCount to 1 again
retryCount = 1;
// check if products is defined
if (products?.length > 0) {
// if so, push the product to final variable
productFinal = union(products, productFinal);
}
// increment the offset
offset++;
// and loop while the state is not finish
} while ((state as COLLECTING_STATE) !== COLLECTING_STATE.Finish);
return productFinal;
}
The endpoint product?limit=50&offset=${offset} is from a third-party service; it doesn't have a single endpoint to grab all the data, so this is the only way. It has a maximum limit of 50 items per offset, and the response contains no nextPage or totalPage information, so I have to keep an offset variable and increment it after the previous request completes.
How do I replace the while loop with an RxJS operator? And can it be optimized to make more than one request at a time (maybe four or five), thus taking less time to get all the data?
Based on the answer to RxJS Observable Pagination, but incrementing the offset every time a request is made:
const { of, timer, defer, EMPTY, from, concat } = rxjs; // = require("rxjs")
const { map, tap, mapTo, mergeMap, take } = rxjs.operators; // = require("rxjs/operators")
// simulate network request
function fetchPage({ limit, offset }) {
// 204 response
if (offset > 20) {
return of({ status: 204, items: null });
}
// regular data response
return timer(100).pipe(
tap(() =>
console.log(`-> fetched elements from ${offset} to ${offset+limit}`)
),
mapTo({
status: 200,
items: Array.from({ length: limit }).map((_, i) => offset + i)
})
);
}
const limit = 10;
function getItems(offset = 0) {
return defer(() => fetchPage({ limit, offset })).pipe(
mergeMap(({ status, items }) => {
if (status === 204) {
return EMPTY;
}
const items$ = from(items);
const next$ = getItems(offset + limit);
return concat(items$, next$);
})
);
}
// process only first 100 items, without fetching all of the data
getItems()
.pipe(take(100))
.subscribe({
next: console.log,
error: console.error,
complete: () => console.log("complete")
});
<script src="https://unpkg.com/rxjs@6.6.2/bundles/rxjs.umd.min.js"></script>
Regarding the possible optimization of making parallel requests: I don't think it will work well. Instead you could show data progressively, as soon as items are loaded, or change the API as was suggested in the comments.
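As a side note, the same recursion can also be expressed with the expand operator; here is a rough equivalent sketch, reusing the fetchPage helper and the limit defined above:
const { EMPTY, from } = rxjs;
const { expand, map, mergeMap } = rxjs.operators;

function getItemsWithExpand() {
  const fetch = offset =>
    fetchPage({ limit, offset }).pipe(map(page => ({ offset, page })));

  return fetch(0).pipe(
    // keep requesting the next offset until the API answers with 204
    expand(({ offset, page }) =>
      page.status === 204 ? EMPTY : fetch(offset + limit)
    ),
    // drop the terminating 204 page and flatten the items of every regular page
    mergeMap(({ page }) => (page.status === 204 ? EMPTY : from(page.items)))
  );
}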
I have an observable that returns arrays/lists of things: Observable<T[]>.
And I have a use case where it is a pretty costly affair for the downstream consumer of this observable to have more items added to this list. So I'd like to slow down the rate of additions made to this list, but not lose any.
Something like an operator that takes this observable and returns another observable with the same signature, but whenever a new list is pushed that has more items than the last one, only one or a few items are added at a time.
So if the last push was a list with 3 items and the next push has 3 additional items (6 items in total), and the batch size is 1, then this one push gets split into 3 individual pushes of lists with lengths 4, 5, and 6.
This way additions are batched and the consumer can more easily keep up with new additions to the list; it doesn't have to stall for as long each time while processing additional items in the array/list, because the additions are split up and spread over batches of a configurable size.
I made an addAdditionalOnIdle operator that you can apply to any RxJS observable using the pipe operator. It takes a batchSize parameter, so you can configure the batch size. It also takes a dontBatchAfterThreshold, which stops batching of the list after a certain list size, which was useful for my purposes. The result also contains a morePending value, which you can use to show a loading indicator while you know more data is incoming.
The implementation uses the new requestIdleCallback function internally to schedule the batched pushes of additional items when there is idle time in the browser. This function is not available in IE or Safari yet, but I found this excellent polyfill for it, so you can use it today anyway: https://github.com/aFarkas/requestIdleCallback :)
See the implementation and example usage of addAdditionalOnIdle below:
const { NEVER, of, Observable } = rxjs;
const { concat } = rxjs.operators;
/**
* addAdditionalOnIdle
*
* Only works on observables that produce values that are of type Array.
* Adds additional elements on window.requestIdleCallback
*
 * @param batchSize The amount of values that are added on each idle callback
 * @param dontBatchAfterThreshold Return all items after amount of returned items passes this threshold
*/
function addAdditionalOnIdle(
batchSize = 1,
dontBatchAfterThreshold = 22,
) {
return (source) => {
return Observable.create((observer) => {
let idleCallback;
let currentPushedItems = [];
let lastItemsReceived = [];
let sourceSubscription = source
.subscribe({
complete: () => {
observer.complete();
},
error: (error) => {
observer.error(error);
},
next: (items) => {
lastItemsReceived = items;
if (idleCallback) {
return;
}
if (lastItemsReceived.length > currentPushedItems.length) {
const idleCbFn = () => {
if (currentPushedItems.length > lastItemsReceived.length) {
observer.next({
morePending: false,
value: lastItemsReceived,
});
idleCallback = undefined;
return;
}
const to = currentPushedItems.length + batchSize;
const last = lastItemsReceived.length;
if (currentPushedItems.length < dontBatchAfterThreshold) {
for (let i = 0 ; i < to && i < last ; i++) {
currentPushedItems[i] = lastItemsReceived[i];
}
} else {
currentPushedItems = lastItemsReceived;
}
if (currentPushedItems.length < lastItemsReceived.length) {
idleCallback = window.requestIdleCallback(() => {
idleCbFn();
});
} else {
idleCallback = undefined;
}
observer.next({
morePending: currentPushedItems.length < lastItemsReceived.length,
value: currentPushedItems,
});
};
idleCallback = window.requestIdleCallback(() => {
idleCbFn();
});
} else {
currentPushedItems = lastItemsReceived;
observer.next({
morePending: false,
value: currentPushedItems,
});
}
},
});
return () => {
sourceSubscription.unsubscribe();
sourceSubscription = undefined;
lastItemsReceived = undefined;
currentPushedItems = undefined;
if (idleCallback) {
window.cancelIdleCallback(idleCallback);
idleCallback = undefined;
}
};
});
};
}
function sleep(milliseconds) {
var start = new Date().getTime();
for (var i = 0; i < 1e7; i++) {
if ((new Date().getTime() - start) > milliseconds){
break;
}
}
}
let testSource = of(
[1,2,3],
[1,2,3,4,5,6],
).pipe(
concat(NEVER)
);
testSource
.pipe(addAdditionalOnIdle(2))
.subscribe((list) => {
// Simulate a slow synchronous consumer with a busy loop sleep implementation
sleep(1000);
document.body.innerHTML += "<p>" + list.value + "</p>";
});
<script src="https://unpkg.com/rxjs@6.5.3/bundles/rxjs.umd.js"></script>
I'm attempting to use RxJS to implement service call throttling / queuing.
For example, Google Maps' Geocoder API. Let's say I don't want this to be called more than once a second, but one or more parts of my application may request a geocode more often than that. I'd want the requests to queue, with adjacent requests being at least 1s apart, but I'd also want to be able to 'cancel' a request if it is no longer required during this wait.
Is this an applicable use of RxJS, and if so what might this look like?
Thanks.
Here is something that should guide you (jsfiddle):
// Helper functions
function remove_from_queue(queue, id) {
queue.forEach(function(x, index){
if (x.execute.request === id) {
queue.splice(index, 1);
}
});
// console.log('queue after removal', queue);
}
function add_to_queue (queue, id){
queue.push({execute : {request : id}});
}
function getFirstInQueue(queue){
return queue[0];
}
function noop(x) {}
function log(label) {
return function (x) {
console.log.call(console, label, x);
}
}
function timestamp(label){
return function (x) {
console.log.call(console, Date.now() - startingDate, label,x );
}
}
function label(label){
return function (x) {
var res = {};
res[label] = x;
return res;
}
}
var startingDate = Date.now();
var requests$ = Rx.Observable.generateWithRelativeTime(
{request : 1},
function (x) { return x.request < 10; },
function (x) { return {request : x.request + 1}; },
function (x) { return x; },
function (x) { return 100 ; }
);
var cancelledRequests$ = Rx.Observable.generateWithRelativeTime(
{request : 1},
function (x) { return x.request < 20; },
function (x) { return {request : x.request + 4}; },
function (x) { return x; },
function (x) { return 500 ; }
);
var timer$ = Rx.Observable.interval(990).map(function (){return {}}).take(10);
var source$ = Rx.Observable.merge(
requests$.map(label('execute')),
cancelledRequests$.map(label('cancel')),
timer$
)
//.do(log('source'));
controlledSource$ = source$
.scan(function (state, command){
var requestsToExecuteQueue = state.requestsToExecuteQueue;
if (command.cancel) {
remove_from_queue(requestsToExecuteQueue, command.cancel.request);
}
if (command.execute) {
add_to_queue(requestsToExecuteQueue, command.execute.request);
}
console.log('queue', requestsToExecuteQueue.slice())
return {
command : command,
requestExec$ : Rx.Observable
.return(getFirstInQueue(requestsToExecuteQueue))
.filter(function(x){return x})
.do(function(x){remove_from_queue(requestsToExecuteQueue, x.execute.request)}),
requestsToExecuteQueue : requestsToExecuteQueue
}
}, {command : undefined, requestExec$ : undefined, requestsToExecuteQueue : []})
.pluck('requestExec$')
.sample(Rx.Observable.interval(1000))
.mergeAll();
controlledSource$.do(timestamp('executing request:')).subscribe(noop)
Basically:
scan is used to manage the state (the queue of requests, additions and removals)
for each request, we pass an observable which (when subscribed to) releases the first element of the queue and removes that element from the queue
sample is used to get one such observable every second
mergeAll subscribes to that observable
we have to use a timer$ object to continue polling the queue even after the source of requests has completed (you still need to empty the queue of remaining requests). You can adapt that logic to your real case by having timer$ emit for X seconds after the completion of your source, for example, or whatever suits you best.
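For reference, a rough sketch of the same idea with newer pipeable-operator RxJS; the GeocodeRequest shape and the geocode stub are placeholder assumptions. Requests are queued with concatMap so they run one at a time with at least one second between them, and a cancellation set lets a queued request be dropped before it executes:
import { EMPTY, Subject, timer } from 'rxjs';
import { concatMap, mergeMap } from 'rxjs/operators';

interface GeocodeRequest { id: number; address: string; }

// stand-in for the real Google Maps geocoder call
const geocode = (address: string) => Promise.resolve({ address });

const queue$ = new Subject<GeocodeRequest>();
const cancelled = new Set<number>();

const results$ = queue$.pipe(
  // one request at a time, each preceded by a 1s gap
  concatMap(req =>
    timer(1000).pipe(
      mergeMap(() =>
        cancelled.has(req.id)
          ? EMPTY                  // dropped: cancelled while waiting in the queue
          : geocode(req.address)   // run the actual request
      )
    )
  )
);

results$.subscribe(result => console.log(result));

// enqueue a request:
// queue$.next({ id: 1, address: '1600 Amphitheatre Parkway' });
// cancel it while it is still queued:
// cancelled.add(1);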