Multiple ApolloLink based on context - apollo-client

I want to implement a way to switch between different links based on the context set on a GraphQL query. What I have so far is something like this, which works fine but doesn't seem like a solution that will stay clean over time.
const link = ApolloLink.from([
  HandlerLink1,
  HandlerLink2,
  ApolloLink.split(
    operation => operation.getContext().service === "x",
    LinkX,
    ApolloLink.split(
      operation => operation.getContext().service === "y",
      LinkY,
      ApolloLink.split(
        operation => operation.getContext().service === "z",
        LinkZ,
        LinkN
      )
    )
  )
]);
Is there a better way to do this than nesting the splits?

I have run into the same problem. The code below works fine for me:
const client = new ApolloClient({
  cache,
  link: ApolloLink.split(
    (operation) => operation.getContext().clientName === 'link1',
    Link1, // <= apollo will send to this if clientName is "link1"
    ApolloLink.split(
      (operation) => operation.getContext().clientName === 'link2',
      Link2, // <= apollo will send to this if clientName is "link2"
      Link3, // <= else Link3 will run
    ),
  ),
  resolvers: {},
})
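If you would rather not write the nested splits by hand at all, one option (a sketch I am adding here, not taken from the answers above; the serviceLinks map, the routedLink name, and the import path are illustrative) is to fold a plain map of context values to links into the same nested structure with reduceRight:

// Import path depends on your Apollo Client version
// (e.g. 'apollo-link' for Apollo Client 2.x, '@apollo/client' for 3.x).
import { ApolloLink } from '@apollo/client';

// Hypothetical map of context `service` values to the links that handle them.
const serviceLinks = {
  x: LinkX,
  y: LinkY,
  z: LinkZ,
};

// Fold the map into the same chain of nested ApolloLink.split calls,
// with LinkN as the final fallback when no service matches.
const routedLink = Object.entries(serviceLinks).reduceRight(
  (fallback, [service, serviceLink]) =>
    ApolloLink.split(
      operation => operation.getContext().service === service,
      serviceLink,
      fallback
    ),
  LinkN
);

const link = ApolloLink.from([HandlerLink1, HandlerLink2, routedLink]);

Adding a new service is then just another entry in the map rather than another level of nesting.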

Related

What operator is used to get several values from observable

return this.usersTableService.fetchRequestedPageUsersIds(request).pipe(
  switchMap((idsToFetch) => {
    requestedIds = idsToFetch;
    return [this.usersTableService.getNewIdsToFetch(requestedIds, entities), of(idsToFetch)];
  }),
  //.......?(([newIds, idsToFetch]) => {
    return this._fetchNewUsersFromAPI(requestedIds, request, newIds, entities);
  }),
  catchError((err) => of(loadPageFail(err)))
);
What operator should I use in order to get the value of the tuple returned in the previous step?
You can use forkJoin for this:
return this.usersTableService.fetchRequestedPageUsersIds(request).pipe(
  switchMap((idsToFetch) => {
    return forkJoin([this.usersTableService.getNewIdsToFetch(idsToFetch, entities), of(idsToFetch)]);
  }),
  mergeMap(([newIds, idsToFetch]) => {
    return this._fetchNewUsersFromAPI(idsToFetch, request, newIds, entities);
  }),
  catchError((err) => of(loadPageFail(err)))
);
You would normally use the map operator (https://stackblitz.com/edit/so-tuple-map?file=index.ts):
const obs$ = of(1).pipe(map(y => ['abc', 'def']), map(([str1, str2]) => str1 + str2))
But if you try that you will encounter other issues with your code, i.e.:
it's not good practice to store a local variable inside a switchMap and then return it using of
_fetchNewUsersFromAPI needs to be inside a switchMap
Ultimately you'll still be faced with the fundamental problem of how to pass parameters down the observable chain, which I suspect is how you've ended up in this situation to begin with.
There is currently a bountied question asking about the same problem here: How to pass results between chained observables
IMO the best solution from that question is to use nested pipes, i.e.:
const newUsers$ = requestsSubject.pipe(
  switchMap(request =>
    this.usersTableService.fetchRequestedPageUsersIds(request).pipe(
      switchMap(idsToFetch =>
        this.usersTableService.getNewIdsToFetch(idsToFetch).pipe(
          switchMap(newIds =>
            this._fetchNewUsersFromAPI(idsToFetch, request, newIds, entities)
          )
        )
      )
    )
  )
);
An alternative way using await and toPromise:
async function getUsers(request) {
  const idsToFetch = await this.usersTableService.fetchRequestedPageUsersIds(request).toPromise();
  const newIds = await this.usersTableService.getNewIdsToFetch(idsToFetch, entities).toPromise();
  const newUsers = await this._fetchNewUsersFromAPI(idsToFetch, request, newIds, entities).toPromise();
  return newUsers;
}

How to join streams based on a key

This is for redux-observable, but I think the general pattern is fairly generic RxJS.
I have a stream of events (from redux-observable, these are redux actions) and I'm specifically looking to pair up two different types of events for the same "resource" - "resource set active" and "resource loaded" - and emit a new event when these events "match up". The problem is that these events can arrive in any order, for any resource, and can be fired multiple times. You might set something active before it is loaded, or load something before it is set active, and other resources might get loaded or set active in between.
What I want is a stream of "this resource, which is now loaded, is now active" - which also means that once a resource is loaded, it can be considered forever loaded.
If these events were not keyed by a resource id, then it would be very simple:
First I would split them up by type:
const setActive$ = action$.filter(a => a.type === 'set_active');
const loaded$ = action$.filter(a => a.type === 'loaded');
In a simple case where there is no keying, I'd say something like:
const readyevent$ = setActive$.withLatestFrom(loaded$)
then readyevent$ is just a stream of set_active events where there has been at least one loaded event.
But my problem is that the set_active and loaded events are each keyed by a resource id, and for reasons beyond my control, the property to identify the resource is different in the two events. So this becomes something like:
const setActive$ = action$.filter(a => a.type === 'set_active').groupBy(a => a.activeResourceId);
const loaded$ = action$.filter(a => a.type === 'loaded').groupBy(a => a.id);
but from this point I can't really figure out how to then "re-join" these two streams-of-grouped-observables on the same key, so that I can emit a stream of withLatestFrom actions.
I believe this does what you are describing:
// RxJS 5 (Rx 5.5.10) assumed as the global `Rx`.
const action$ = Rx.Observable.from([
  { activeResourceId: 1, type: 'set_active' },
  { id: 2, type: 'loaded' },
  { id: 1, type: 'random' },
  { id: 1, type: 'loaded' },
  { activeResourceId: 2, type: 'set_active' }
]).zip(
  Rx.Observable.interval(500),
  (x) => x
).do((x) => { console.log('action', x); }).share();

const setActiveAction$ = action$.filter(a => a.type === 'set_active')
  .map(a => a.activeResourceId)
  .distinct();
const allActive$ = setActiveAction$.scan((acc, curr) => [...acc, curr], []);

const loadedAction$ = action$.filter(a => a.type === 'loaded')
  .map(a => a.id)
  .distinct();
const allLoaded$ = loadedAction$.scan((acc, curr) => [...acc, curr], []);

Rx.Observable.merge(
  setActiveAction$
    .withLatestFrom(allLoaded$)
    .filter(([activeId, loaded]) => loaded.includes(activeId)),
  loadedAction$
    .withLatestFrom(allActive$)
    .filter(([loadedId, active]) => active.includes(loadedId))
).map(([id]) => id)
  .distinct()
  .subscribe((id) => { console.log(`${id} is loaded and active`); });
The basic approach is to create a distinct stream for each action type and join it with the distinct aggregate of the other. Then merge the two streams. This will emit the value when there are matching setActive and loaded events. The distinct() on the end of the merge makes it so that you will only get notified once. If you want a notification on each setActive action after the initial one then just remove that operator.
groupBy looks somewhat complicated for this: there's a key value in there, but you get an Observable of Observables, so it may be a little hard to get right.
I would map the id to a common property and then use scan to combine. I use this pattern for grouping in my app.
The accumulator in the scan is an object, which is used as an associative array - each property is an id and the property value is an array of actions accumulated so far.
After the scan, we convert to an observable stream of arrays of matching actions - sort of like withLatestFrom but some arrays may not yet be complete.
The next step is to filter for those arrays we consider complete.
Since you say
where there has been at least one loaded event
I'm going to assume that the presence of two or more actions, with at least one of type 'loaded', is enough to consider a group complete - but it's a bit tricky to tell from your question whether that is sufficient.
Finally, reset that id in the accumulator as presumably it may occur again later in the stream.
const setActive$ = action$.filter(a => a.type === 'set_active')
  .map(a => { return { id: a.activeResourceId, action: a } });
const loaded$ = action$.filter(a => a.type === 'loaded')
  .map(a => { return { id: a.id, action: a } });

const accumulator = {};

const readyevent$: Observable<action[]> =
  Observable.merge(setActive$, loaded$)
    .scan((acc, curr) => {
      acc[curr.id] = acc[curr.id] || [];
      acc[curr.id].push(curr.action);
      return acc;
    }, accumulator)
    .mergeMap((grouped: {}) => Observable.from(
      Object.keys(grouped).map(key => grouped[key])
    ))
    .filter((actions: action[]) => {
      return actions.length > 1 && actions.some(a => a.type === 'loaded');
    })
    .do(actions => {
      const id = actions.find(a => a.type === 'loaded').id;
      accumulator[id] = [];
    });

Is there a better way to form this code example?

I'm new to rxjs and using redux-observable. The short of it is that I need to make a couple of promise requests when I get a connection and then output the results. I'm wondering if there is a way to join this into a single map at the end, avoid calling store.dispatch multiple times, and have the retry work for each individual read. Thanks ahead of time for your comments.
export const handleBleConnectionSuccess = (action$, store, { bleCommunicator }) =>
  action$.ofType(c.BLE_CONNECTION_SUCCESS)
    .do((a) => {
      Observable.fromPromise(bleCommunicator.readCharacteristic(a.device.id, gattInfo.uuid, gattInfo.firmwareRevision.uuid))
        .do((value) => store.dispatch({ type: c.DEVICE_FIRMWARE_VERSION, device: { ...a.device, firmwareVersion: value } }))
        .retry(3);
      Observable.fromPromise(bleCommunicator.readCharacteristic(a.device.id, gattInfo.uuid, gattInfo.modelNumber.uuid))
        .do(value => store.dispatch({ type: c.DEVICE_MODEL_NUMBER, device: { ...a.device, modelNumber: value } }))
        .retry(3);
    })
    .mapTo({ type: 'DEVICE_INFORMATION_REQUESTED' });
I'm wondering if there is a way to join this into a single map at the end and not have to call store.dispatch multiple times and have the retry work for each individual read
Yes, there is a better way, and it's possible to do what you want.
From the syntax, I'm guessing that you use ngrx (effects) (and not redux-observable).
So with ngrx/effects you could do it like this:
export const handleBleConnectionSuccess = (
  action$,
  store,
  { bleCommunicator }
) =>
  action$.ofType(c.BLE_CONNECTION_SUCCESS).switchMap(a => {
    // readCharacteristic returns a promise; defer wraps it in an observable
    // so that retry(3) re-issues the read for that characteristic on failure
    const readCharacteristic = characteristicUuid =>
      Observable.defer(() =>
        bleCommunicator.readCharacteristic(a.device.id, gattInfo.uuid, characteristicUuid)
      );
    return Observable.merge(
      readCharacteristic(gattInfo.firmwareRevision.uuid)
        .map(value => ({
          type: c.DEVICE_FIRMWARE_VERSION,
          device: { ...a.device, firmwareVersion: value },
        }))
        .retry(3),
      readCharacteristic(gattInfo.modelNumber.uuid)
        .map(value => ({
          type: c.DEVICE_MODEL_NUMBER,
          device: { ...a.device, modelNumber: value },
        }))
        .retry(3),
      Observable.of({ type: 'DEVICE_INFORMATION_REQUESTED' })
    );
  });

How to buffer observables in pairs and execute them by the pair?

I have an xhr request that returns an array, which I then use to execute subsequent xhr requests, like so:
const Rx = require('rxjs/Rx');
const fetch = require('node-fetch');

const url = `url`;

// Get array of tables
const tables$ = Rx.Observable
  .from(fetch(url).then((r) => r.json()));

// Get array of columns
const columns$ = (table) => {
  return Rx.Observable
    .from(fetch(`${url}/${table.TableName}/columns`).then(r => r.json()));
};

tables$
  .mergeMap(tables => Rx.Observable.forkJoin(...tables.map(columns$)))
  .subscribe(val => console.log(val));
I would like to execute the column requests in chunks so that the requests are not all sent to the server at once.
This SO question is somewhat in the same direction but not completely: Rxjs: Chunk and delay stream?
Now I'm trying something like this:
tables$
  .mergeMap(tables => Rx.Observable.forkJoin(...tables.map(columns$)))
  .flatMap(e => e)
  .bufferCount(4)
  .executeTheChunksSerial(magic)
  .flatMap(e => e)
  .subscribe(val => console.log(val));
But I cannot wrap my head around how to execute the chunks in series...
You can utilize the concurrency argument of mergeMap to send at most x concurrent requests to your server:
// RxJS 5 (Rx 5.4.3) assumed as the global `Rx`.
const getTables = Promise.resolve([{ tableName: 'foo' }, { tableName: 'bar' }, { tableName: 'baz' }]);
const getColumns = (table) => Rx.Observable.of('a,b,c')
  .do(_ => console.log('getting columns for table: ' + table))
  .delay(250);

Rx.Observable.from(getTables)
  .mergeAll()
  .mergeMap(
    table => getColumns(table.tableName),
    (table, columns) => ({ table, columns }),
    2)
  .subscribe(console.log)
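If you specifically want the chunked behaviour sketched in the question - run one chunk of requests at a time, with the requests inside a chunk in parallel - a minimal sketch using bufferCount and concatMap could look like the following. It is not from the original answer; it reuses the tables$ and columns$ names from the question, and the chunk size of 4 is arbitrary:

tables$
  // flatten the array of tables into individual emissions
  .mergeMap(tables => Rx.Observable.from(tables))
  // group the tables into chunks of 4
  .bufferCount(4)
  // run one chunk at a time; requests inside a chunk run in parallel via forkJoin
  .concatMap(chunk => Rx.Observable.forkJoin(chunk.map(columns$)))
  .subscribe(columns => console.log(columns));

concatMap only subscribes to the next chunk's forkJoin after the previous one completes, which is the serial behaviour the executeTheChunksSerial placeholder stands for.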

Efficiently maintain a set of objects with 'CRUD-Observables'

I've been experimenting with feathersjs and angular2/Rx.
What I'm trying to achieve is building an angular2 service that wraps a feathersjs service in such a way that one can just subscribe to an Observable that emits an up-to-date set of items after any CRUD operation.
It basically works. However, I don't find the way it's done very elegant:
Wrapping and unwrapping every object that's coming in doesn't seem efficient. Am I taking 'Everything's a stream' too far?
getService() {
  let data$: Observable<any> = Observable.from([initialSetOfItems]);
  let created$: Observable<any> = Observable.fromEvent(feathersService, 'created').map(o => { return { action: 'c', data: o } });
  let updated$: Observable<any> = Observable.fromEvent(feathersService, 'updated').map(o => { return { action: 'u', data: o } });
  let removed$: Observable<any> = Observable.fromEvent(feathersService, 'removed').map(o => { return { action: 'r', data: o } });

  return data$
    .merge(created$, updated$, removed$)
    .scan((arr: any[], newObj) => {
      switch (newObj.action) {
        case 'c':
          return [].concat(arr, newObj.data);
        case 'u':
          let indexToUpdate = arr.findIndex((element) => (element.id === newObj.data.id));
          if (indexToUpdate > -1) {
            arr[indexToUpdate] = newObj.data;
          }
          return arr;
        case 'r':
          return arr.filter(element => (element.id != newObj.data.id));
      }
    });
}
I know this might be opinionated. Please bear with me. Rx is a little hard to wrap your head around.
How would you guys try to achieve this?
I think what you are looking for is exactly what feathers-reactive is supposed to do. It is a plugin that turns any service method into an RxJS observable that automatically updates on real-time events. It can be used like this:
const feathers = require('feathers');
const memory = require('feathers-memory');
const rx = require('feathers-reactive');
const RxJS = require('rxjs');

const app = feathers()
  .configure(rx(RxJS))
  .use('/messages', memory());

const messages = app.service('messages');

messages.create({
  text: 'A test message'
}).then(() => {
  // Get a specific message with id 0. Emit the message data once it resolves
  // and every time it changes e.g. through an updated or patched event
  messages.get(0).subscribe(message => console.log('My message', message));

  // Find all messages and emit a new list every time anything changes
  messages.find().subscribe(messages => console.log('Message list', messages));

  setTimeout(() => {
    messages.create({ text: 'Another message' }).then(() =>
      setTimeout(() => messages.patch(0, { text: 'Updated message' }), 1000)
    );
  }, 1000);
});
If you want to give it a try, we would love to get some feedback (and bug reports).
