I'm trying to make an API call for each element of an array, and emit a specific event if every API response is true.
I was doing the following:
let emit = true;
array.forEach(element => {
  this.service.getElement(element).subscribe(loaded => {
    if (!loaded) {
      emit = false;
    }
  });
});
this.loaded.emit(emit);
But the last line always emits true
How can I wait for every request to be resolved before making the output event emission?
So the last line will always emit true because the service code is executed asynchronously. This means that everything runs except the callback function passed to subscribe; that callback is executed once the stream emits a new element. I guess you are making HTTP requests to an endpoint here. If so, the callback function is executed for each HTTP response, and the order is not guaranteed.
What you can do is:
forkJoin(
  ...array.map(e => this.service.getElement(e))
).subscribe(responseArray => {
  console.log(responseArray);
  console.log('Complete');
});
The callback only executes once all HTTP responses are available, and responseArray has the exact same order as the elements in the array.
Note: Keep in mind that if you do
let emit = false;
forkJoin(...).subscribe(() => emit = true);
console.log(emit);
It would print false, because the callback executes asynchronously. If that seems like strange behaviour to you, I'd highly recommend reading about the JavaScript event loop.
Because the HTTP API call is async, this.loaded.emit(emit) is executed first.
Fix:
let emit = true;
array.forEach(element => {
  this.service.getElement(element).subscribe(loaded => {
    if (!loaded) {
      emit = false;
      this.loaded.emit(emit);
    }
  });
});
If you want to execute this.loaded.emit(emit) only when every API response is true, try using forkJoin.
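For instance, a minimal sketch reusing the names from the question (array, this.service.getElement and this.loaded) might look like this:
// Emit a single value once every request has completed.
// forkJoin emits an array of results in the same order as the input array.
forkJoin(array.map(element => this.service.getElement(element)))
  .subscribe(results => {
    this.loaded.emit(results.every(loaded => loaded));
  });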
Related
I've implemented API data caching in my app so that if data is already present it is not re-fetched.
I can intercept the initial fetch
cy.intercept('**/api/things').as('api');
cy.visit('/things')
cy.wait('@api') // passes
To test the cache is working I'd like to explicitly test the opposite.
How can I modify the cy.wait() behavior similar to the way .should('not.exist') modifies cy.get() to allow the negative logic to pass?
// data is cached from first route, how do I assert no call occurs?
cy.visit('/things2')
cy.wait('@api')
  .should('not.have.been.called') // fails with "no calls were made"
Minimal reproducible example
<body>
  <script>
    setTimeout(() => {
      fetch('https://jsonplaceholder.typicode.com/todos/1')
    }, 300)
  </script>
</body>
Since we test a negative, it's useful to first make the test fail. Serve the above HTML and use it to confirm the test fails, then remove the fetch() and the test should pass.
The add-on package cypress-if can change default command behavior.
cy.get(selector)
.if('exist').log('exists')
.else().log('does.not.exist')
Assume your API calls are made within 1 second of the action that would trigger them - the cy.visit().
cy.visit('/things2')
cy.wait('@alias', {timeout: 1100})
  .if(result => {
    expect(result.name).to.eq('CypressError') // confirm error was thrown
  })
You will need to overwrite the cy.wait() command to check for a chained .if() command.
Cypress.Commands.overwrite('wait', (waitFn, subject, selector, options) => {
  // Standard behavior for numeric waits
  if (typeof selector === 'number') {
    return waitFn(subject, selector, options)
  }
  // Modified alias wait with following if()
  if (cy.state('current').attributes.next?.attributes.name === 'if') {
    return waitFn(subject, selector, options).then((pass) => pass, (fail) => fail)
  }
  // Standard alias wait
  return waitFn(subject, selector, options)
})
As yet only cy.get() and cy.contains() are overwritten by default.
Custom Command for same logic
If the if() syntax doesn't feel right, the same logic can be used in a custom command
Cypress.Commands.add('maybeWaitAlias', (selector, options) => {
  const waitFn = Cypress.Commands._commands.wait.fn
  // waitFn returns a Promise
  // which Cypress resolves to the `pass` or `fail` values
  // depending on which callback is invoked
  return waitFn(cy.currentSubject(), selector, options)
    .then((pass) => pass, (fail) => fail)
  // by returning the `pass` or `fail` value
  // we are stopping the "normal" test failure mechanism
  // and allowing downstream commands to deal with the outcome
})
cy.visit('/things2')
cy.maybeWaitAlias('@alias', {timeout: 1000})
  .should(result => {
    expect(result.name).to.eq('CypressError') // confirm error was thrown
  })
I also tried cy.spy() but with a hard cy.wait() to avoid any latency in the app after the route change occurs.
const spy = cy.spy()
cy.intercept('**/api/things', spy)
cy.visit('/things2')
cy.wait(2000)
.then(() => expect(spy).not.to.have.been.called)
Running in a burn test of 100 iterations, this seems to be ok, but there is still a chance of flaky test with this approach, IMO.
A better way would be to poll the spy recursively:
const spy = cy.spy()
cy.intercept('**/api/things', spy)
cy.visit('/things2')
const waitForSpy = (spy, options, start = Date.now()) => {
  const { timeout, interval = 30 } = options;
  if (spy.callCount > 0) {
    return cy.wrap(spy.lastCall)
  }
  if ((Date.now() - start) > timeout) {
    return cy.wrap(null)
  }
  return cy.wait(interval, { log: false })
    .then(() => waitForSpy(spy, { timeout, interval }, start))
}

waitForSpy(spy, { timeout: 2000 })
  .should('eq', null)
A neat little trick I learned from Gleb's Network course.
You will want to use cy.spy() with your intercept and cy.get() on the alias to be able to check that no calls were made.
// initial fetch
cy.intercept('**/api/things').as('api');
cy.visit('/things')
cy.wait('@api')
cy.intercept('METHOD', '**/api/things', cy.spy().as('apiNotCalled'))
// trigger the fetch again but will not send since data is cached
cy.get('@apiNotCalled').should('not.been.called')
My situation is as follows: I am performing sequential HTTP requests, where one HTTP request depends on the response of the previous. I would like to combine the response data of all these HTTP requests into one observable. I have implemented this before using an async generator. The code for this was relatively simple:
async function* AsyncGeneratorVersion() {
  let moreItems = true; // whether there is a next page
  let lastAssetId: string | undefined = undefined; // used for pagination
  while (moreItems) {
    // fetch current batch (this performs the HTTP request)
    const batch = await this.getBatch(/* arguments */, lastAssetId);
    moreItems = batch.more_items;
    lastAssetId = batch.last_assetid;
    yield* batch.getSteamItemsWithDescription();
  }
}
I am trying to move away from async generators, and towards RxJs Observables. My best (and working) attempt is as follows:
const observerVersion = new Observable<SteamItem>((subscriber) => {
  (async () => {
    let moreItems = true;
    let lastAssetId: string | undefined = undefined;
    while (moreItems) {
      // fetch current batch (this performs the HTTP request)
      const batch = await this.getBatch(/* arguments */, lastAssetId);
      moreItems = batch.more_items;
      lastAssetId = batch.last_assetid;
      const items = batch.getSteamItemsWithDescription();
      for (const item of items) subscriber.next(item);
    }
    subscriber.complete();
  })();
});
Now, I believe that there must be some way of improving this Observable variant; this code does not seem very reactive to me. I have tried several things using pipe, but unfortunately they were all unsuccessful.
I found concatMap to come close to a solution. It allowed me to concatenate the next HTTP request as an observable (created with the this.getBatch method); however, I could not find a good way to avoid abandoning the response of the current HTTP request.
How can this be achieved? In short, I believe this problem could be described as appending data to an observable from inside the observable itself. (But perhaps this is not a good way of handling this situation.)
TLDR;
Here's a working StackBlitz demo.
Explanation
Here would be my approach:
// Faking an actual request
const makeReq = (prevArg, response) =>
  new Promise((r) => {
    console.log(`Running promise with the prev arg as: ${prevArg}!`);
    setTimeout(r, 1000, { prevArg, response });
  });

// Preparing the sequential requests.
const args = [1, 2, 3, 4, 5];

from(args)
  .pipe(
    // Running the requests sequentially.
    mergeScan(
      (acc, crtVal) => {
        // `acc?.response` will refer to the previous response
        // and we're using it for the next request.
        return makeReq(acc?.response, crtVal);
      },
      // The seed (works the same as in `reduce`).
      null,
      // Making sure that only one request is run at a time.
      1
    ),
    // Combining all the responses into one object
    // and emitting it after all the requests are done.
    reduce((acc, val, idx) => ({ ...acc, [`request${idx + 1}`]: val }), {})
  )
  .subscribe(console.warn);
Firstly, from(array) will emit each item from the array, synchronously and one by one.
Then, there is mergeScan. It is exactly the result of combining scan and merge. With scan, we can accumulate values (in this case we're using it to have access to the response of the previous request), and what merge does is allow us to use observables.
To make things a bit easier to understand, think of the Array.prototype.reduce function. It looks something like this:
[].reduce((acc, value) => { return { ...acc }}, /* Seed value */{});
What merge does in mergeScan is allow the accumulator to return something like (acc, value) => new Observable(...) instead of return { ...acc }. The latter implies synchronous behavior, whereas with the former we can have asynchronous behavior.
Let's go a bit step by step:
when 1 is emitted, makeReq(undefined, 1) will be invoked
after the first makeReq(from above) resolves, makeReq(1, 2) will be invoked
after makeReq(1, 2) resolves, makeReq(2, 3) will be invoked and so on...
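If it helps, here is a minimal, self-contained sketch of the same pattern with no HTTP involved (a delay stands in for each request), just to make the sequential chaining visible:
import { from, of } from 'rxjs';
import { mergeScan, delay, tap } from 'rxjs/operators';

from([1, 2, 3])
  .pipe(
    mergeScan(
      (prev, val) =>
        of(val).pipe(
          delay(500), // pretend this is the request
          tap(() => console.log(`previous result was ${prev}`))
        ),
      0, // seed
      1  // concurrency of 1 => strictly one "request" at a time
    )
  )
  .subscribe((result) => console.log('result:', result));
Each inner observable only starts after the previous one has completed, because the concurrency argument is 1.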
Somebody I consulted regarding this matter came up with this solution; I think it's quite elegant:
defer(() => this.getBatch(options)).pipe(
  expand(({ more_items, last_assetid }) =>
    more_items
      ? this.getBatch({ ...options, startAssetId: last_assetid })
      : EMPTY,
  ),
  concatMap((batch) => batch.getSteamItemsWithDescription()),
);
From my understanding, the use of expand here is very similar to the use of mergeScan in @Andrei's answer.
I am currently working on a project where I send UDP commands to a Tello drone.
The problem is that it uses UDP, and when I send commands too fast, before the previous one has finished, the second command/action doesn't take place. I am using RxJS for this project and I want to create a mechanism to wait for the response ("ok" or "error") from the drone.
My idea is to have two different observables: one observable that is the input stream of responses from the drone, and one observable of observables that I use as a commandQueue. This commandQueue holds simple observables, each with one command I want to send, and I only want to send the next command when I have received the "ok" message on the response observable. When I get the "ok", I complete the simple command observable, and it automatically receives the next value on the commandQueue, which is the next command.
My code only works when I send an array of commands, but I want to call the function multiple times, sending the commands one by one.
The following code is the function in question; testSubject is an observable used to send the next command to the drone.
async send_command_with_return(msg) {
  let parentobject = this;
  let zeroTime = timestamp();
  const now = () => numeral((timestamp() - zeroTime) / 10e3).format("0.0000");
  const asyncTask = data =>
    new Observable(obs => {
      console.log(`${now()}: starting async task ${data}`);
      parentobject.Client.pipe(take(1)).subscribe(
        dataa => {
          console.log("loool")
          obs.next(data);
          this.testSubject.next(data);
          console.log(`${now()}: end of async task ${data}`);
          obs.complete();
        },
        err => console.error("Observer got an error: " + err),
        () => console.log("observer asynctask finished with " + data + "\n")
      );
    });
  let p = this.commandQueue.pipe(concatMap(asyncTask)).toPromise(P); // commandQueue is a subject in the constructor
  console.log("start filling queue with " + msg);
  zeroTime = timestamp();
  this.commandQueue.next(msg);
  // ["streamon", "streamoff", "height?", "temp?"].forEach(a => this.commandQueue.next(a));
  await p;
  // this.testSubject.next(msg);
}

streamon() {
  this.send_command_with_return("streamon");
}

streamoff() {
  this.send_command_with_return("streamoff");
}

get_speed() {
  this.send_command_with_return("speed?");
}

get_battery() {
  this.send_command_with_return("battery?");
}
}
let tello = new Tello();
tello.init();
tello.streamon();
tello.streamoff();
You can accomplish sending commands one at a time by using a simple Subject to push commands through, and piping those emissions through concatMap, which executes them one at a time.
Instead of trying to put all the logic in a single function, it may be easier to make a simple class, maybe called TelloService or something:
class TelloService {
  private commandQueue$ = new Subject<Command>();

  constructor(private telloClient: FakeTelloClient) {
    this.commandQueue$
      .pipe(
        concatMap(command => this.telloClient.sendCommand(command))
      )
      .subscribe()
  }

  sendCommand(command: Command) {
    this.commandQueue$.next(command);
  }
}
When the service is instantiated, it subscribes to the commandQueue$ and for each command that is received, it will "do the work" of making your async call. concatMap is used to process commands one at a time.
Consumers would simply call service.sendCommand() to submit commands to the queue. Notice commands are submitted one at a time; it's not necessary to submit an array of commands.
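For example, usage might look like this (FakeTelloClient and the command strings are placeholders borrowed from the snippet above):
// Hypothetical usage: each call just queues a command; concatMap inside
// the service guarantees they are sent to the drone one at a time.
const service = new TelloService(new FakeTelloClient());
service.sendCommand('streamon');
service.sendCommand('streamoff');
service.sendCommand('battery?');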
Here is a working StackBlitz example.
To address your condition of waiting until you receive an ok or error response before continuing, you can use takeWhile(); this means the inner observable will not complete until the condition is met.
To introduce a max wait time, you can use takeUntil() with timer() to end the stream if the timer emits:
this.commandQueue$
  .pipe(
    concatMap(command => this.telloClient.sendCommand(command).pipe(
      takeWhile(status => !['ok', 'error'].includes(status), true),
      takeUntil(timer(3000))
    ))
  )
  .subscribe()
Here's an updated StackBlitz.
The goal is to iterate through a collection of IDs, making an HTTP call for each ID. For each ID, I'm using a service with a get() method that returns an Observable. Each time the get() method is called, I'm subscribing to the returning Observable and trying to push the result into an array, which will eventually get passed on to a different method for a new operation.
Relevant service method:
public get(departmentId: number): Observable<IDepartmentModel> {
  return super.get<IDepartmentModel>(departmentId);
}
note: the super class is leveraging Angular Http, which is well tested and confirmed to be working correctly. The problem with the logic isn't here...
Relevant component methods:
note the departmentService.get() call that's being called several times within the forEach.
setInitialDepartmentsAssignedGridData(): void {
  this.settingsForDropdownSelectedCompanyId = this.userForm.get('defaultCompany').get('defaultCompanyId').value;
  let departments: IDepartmentModel[] = [];
  this.userService.user.getValue() // confirmed: valid user is being pulled back from the userService (logic is fine here..)
    .userCompanies.find(comp => comp.companyId === this.settingsForDropdownSelectedCompanyId) // getting a valid match here (logic is fine here..)
    .departmentIds.forEach(deptId => this.departmentService.get(deptId).first().subscribe(dept => { // getting a valid department back here (logic is fine here...)
      departments.push(dept); // HERE LIES THE PROBLEM
    }));
  this.setDepartmentsAssignedRowData(departments);
}
setDepartmentsAssignedRowData(departments: IDepartmentModel[]): void {
  console.log('setDeptAssignedRowData called'); // confirmed: method is getting called...
  console.log(departments); // confirmed: fully-composed collection of departments logged to the console...
  departments.forEach(dept => {
    console.log(dept);
  }); // Y U NO WORK!?
  departments.map((department) => {
    console.log(department); // Y U NO WORK?
    this.departmentAssignedRowData.push({
      departmentId: department.id,
      departmentName: department.name
    });
  });
  this.departmentAssignedGridOptions.api.setRowData(this.departmentAssignedRowData);
}
The problem is, although what's getting logged to the console is a fully-composed department-objects array, it's not TRULY "there"; what's getting passed to setDepartmentsAssignedRowData is an empty array.
I'm sure what's happening is that the async operations are not complete before the departments array gets passed to the second method. Some of what I've read online says to use forkJoin, but I can't see how that will look in this context. I've also read concatMap may work, but again, in this context, I'm not sure how to make that work...
In this context, how do I leverage RxJS to make sure the intended, fully-composed departments array is truly ready to be passed?
Thanks for any insight you can provide. Help is much appreciated!
You are correct, you need forkJoin
let observableArray = this.userService.user.getValue()
  .userCompanies.find(comp => comp.companyId === this.settingsForDropdownSelectedCompanyId)
  .departmentIds.map(deptId => this.departmentService.get(deptId)) // map is building out an array of observables
This will be an array of HTTP request observables that you want to run in parallel. Now you can pass this array to forkJoin.
Observable.forkJoin(...observableArray)
The result of forkJoin will be an array of results from observableArray. forkJoin will not emit to the next operator in the sequence until all of the observables in observableArray have completed (so when all of the HTTP requests have finished).
So altogether the code will be
let observableArray = this.userService.user.getValue()
  .userCompanies.find(comp => comp.companyId === this.settingsForDropdownSelectedCompanyId)
  .departmentIds.map(deptId => this.departmentService.get(deptId));

Observable.forkJoin(...observableArray).subscribe(res => {
  // res = [resId0, resId1, resId2, ..., resIdX];
});
You mentioned passing the result on to another operation. If that operation is another HTTP request to which you pass the array of data (from forkJoin), then you can use the flatMap operator.
Observable.forkJoin(...observableArray)
  .flatMap(res => {
    return this.otherApi(res);
  })
  .subscribe(res => {
    // res is the result of the otherApi call
  });
flatMap will chain your API requests together. So altogether what is happening is:
run array of observables in parallel
once complete, run second api (otherApi)
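Put back into the component from the question, the whole thing might look roughly like this (a sketch only, reusing the names from the question):
setInitialDepartmentsAssignedGridData(): void {
  this.settingsForDropdownSelectedCompanyId = this.userForm.get('defaultCompany').get('defaultCompanyId').value;

  let observableArray = this.userService.user.getValue()
    .userCompanies.find(comp => comp.companyId === this.settingsForDropdownSelectedCompanyId)
    .departmentIds.map(deptId => this.departmentService.get(deptId).first()); // first() ensures each observable completes

  // forkJoin waits for every department request to complete, then emits
  // the results as one array in the same order as departmentIds
  Observable.forkJoin(...observableArray).subscribe(departments => {
    this.setDepartmentsAssignedRowData(departments);
  });
}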
I am trying to get a time-expiry cache to work for an observable that abstracts a "request-response" exchange, using postMessage and message events on the window.
The remote window expects a message getItemList and replies to it with a message of type {type: 'itemList', data: []}.
I would like to model the itemList$ observable in such a way that it caches the last result for 3 seconds, so that no new requests are made during that time; however, I cannot think of a way to achieve that in an elegant (read: one observable, no subjects) and succinct manner.
Here is the example in code:
const remote = someIframe.contentWindow;
const getPayload = message => message.data;
const ofType = type => message => message.type === type;
// all messages coming in from the remote iframe
const messages$ = Observable.fromEvent(window, 'message')
  .map(getPayload)
  .map(JSON.parse);

// the observable of (cached) items
const itemList$ = Observable.defer(() => {
  console.log('sending request');
  // sending a request here, should happen once every 3 seconds at most
  remote.postMessage('getItemList');
  // listening to remote messages with the type `itemList`
  return messages$
    .filter(ofType('itemList'))
    .map(getPayload);
})
  .cache(1, 3000);

/**
 * Always returns a promise of the list of items
 * @returns {Promise<T>}
 */
function getItemList() {
  return itemList$
    .first()
    .toPromise();
}

// poll every second
setInterval(() => {
  getItemList()
    .then(response => console.log('got response', response));
}, 1000);
I am aware of the (very similar) question, but I am wondering if anyone can come up with a solution without explicit subjects.
Thank you in advance!
I believe you are looking for the RxJS operator throttle:
Documentation on rxjs github repo
Returns an Observable that emits only the first item emitted by the source Observable during sequential time windows of a specified duration.
Basically, if you would like to wait until the inputs have quieted for a certain period of time before taking action, you want to debounce.
If you do not want to wait at all, but do not wish to make more than one query within a specific amount of time, you will want to throttle. From your use case, I think you want to throttle.
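A rough sketch of the idea in the RxJS 5 style used in the question, using throttleTime (the duration-based form of throttle); the trigger stream and the 3-second window here are illustrative assumptions:
// Hypothetical: a stream of "please fetch" triggers, throttled so that
// at most one request is actually sent per 3-second window.
const trigger$ = Observable.interval(1000); // e.g. poll every second

trigger$
  .throttleTime(3000) // pass the first trigger, ignore the rest for 3 seconds
  .subscribe(() => remote.postMessage('getItemList'));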