How can I stop the subscription in GraphQL? - graphql

Here's a piece of code I'm interested in (it's taken from the /examples/ directory):
Subscription: {
  counter: {
    subscribe: (parent, args, { pubsub }) => {
      const channel = Math.random().toString(36).substring(2, 15) // random channel name
      let count = 0
      // added var refreshIntervalId =
      var refreshIntervalId = setInterval(() => pubsub.publish(channel, { counter: { count: count++ } }), 2000) // <----
      return pubsub.asyncIterator(channel)
    },
    // my new changes that hopefully will work
    onDisconnect: (webSocket, context) => {
      clearInterval(refreshIntervalId);
    }
  }
}
I'm a bit concerned about the best way to do this: how can I pass refreshIntervalId between subscribe() and onDisconnect() so the interval is stopped once the connection is closed?
Update: I realized that onDisconnect belongs in the server's options (not in the resolver block), so I probably shouldn't worry about this at all; the server should handle disconnection by default.
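For reference, if the cleanup does need to live next to the resolver, one common pattern is to wrap the async iterator so the interval is cleared when the client stops listening. This is only a sketch: withCleanup and counterSubscription are hypothetical names, and it assumes a graphql-subscriptions-style pubsub whose asyncIterator supports return().
// Hypothetical helper: runs `cleanup` when the consumer stops iterating,
// i.e. when the client unsubscribes or the connection drops.
function withCleanup(asyncIterator, cleanup) {
  const originalReturn = asyncIterator.return
    ? asyncIterator.return.bind(asyncIterator)
    : undefined;
  asyncIterator.return = () => {
    cleanup();
    return originalReturn
      ? originalReturn()
      : Promise.resolve({ value: undefined, done: true });
  };
  return asyncIterator;
}
// Sketch of the resolver using it; the interval id stays in the subscribe closure,
// so no onDisconnect hook is needed for this particular cleanup.
const counterSubscription = {
  subscribe: (parent, args, { pubsub }) => {
    const channel = Math.random().toString(36).substring(2, 15);
    let count = 0;
    const refreshIntervalId = setInterval(
      () => pubsub.publish(channel, { counter: { count: count++ } }),
      2000
    );
    return withCleanup(pubsub.asyncIterator(channel), () =>
      clearInterval(refreshIntervalId)
    );
  }
};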

Related

RxJs - how to make observable behave like queue

I'm trying to achieve the following:
private beginTransaction(): Observable<void> {
  ..
}
private test(): void {
  this.beginTransaction().subscribe((): void => {
    this.commitTransaction();
  });
  this.beginTransaction().subscribe((): void => {
    this.commitTransaction();
  });
}
beginTransaction can be called concurrently, but each call should delay its observable until the first (or only) in-progress beginTransaction has finished.
In other words: only one transaction can be in progress at any time.
What I have tried:
private transactionInProgress: boolean = false;
private canBeginTransaction: Subject<void> = new Subject<void>();
private bla3(): void {
  this.beginTransaction().subscribe((): void => {
    console.log('beginTransaction 1');
    this.commitTransaction();
  });
  this.beginTransaction().subscribe((): void => {
    console.log('beginTransaction 2');
    this.commitTransaction();
  });
  this.beginTransaction().subscribe((): void => {
    console.log('beginTransaction 3');
    this.commitTransaction();
  });
}
private commitTransaction(): void {
  this.transactionInProgress = false;
  this.canBeginTransaction.next();
}
private beginTransaction(): Observable<void> {
  if (this.transactionInProgress) {
    return of(undefined)
      .pipe(
        skipUntil(this.canBeginTransaction),
        tap((): void => {
          console.log('begin transaction');
        })
      );
  }
  this.transactionInProgress = true;
  return of(undefined);
}
What you've asked about is fairly vague and general; a more constrained scenario could probably look a whole lot simpler.
Regardless, here I create a pipeline that only lets transaction(): Observable be subscribed to once at a time.
Here's how that might look:
/****
 * Represents what each transaction does. Isn't concerned about
 * order/timing/'transactionInProgress' or anything like that.
 *
 * Here is a fake transaction that just takes 3-5 seconds to emit
 * the string: `Hello ${name}`
 ****/
function transaction(args): Observable<string> {
  const name = args?.message;
  const duration = 3000 + (Math.random() * 2000);
  return of("Hello").pipe(
    tap(_ => console.log("starting transaction")),
    switchMap(v => timer(duration).pipe(
      map(_ => `${v} ${name}`)
    )),
    tap(_ => console.log("Ending transaction"))
  );
}
// Track transactions
let currentTransactionId = 0;
// Start transactions
const transactionSubj = new Subject<any>();
// Perform transaction: concatMap ensures we only start a new one if
// there isn't a current transaction underway
const transaction$ = transactionSubj.pipe(
  concatMap(({id, args}) => transaction(args).pipe(
    map(payload => ({id, payload}))
  )),
  shareReplay(1)
);
/****
 * Begin a new transaction, we give it an ID since transactions are
 * "hot" and we don't want to return the wrong (earlier) transactions,
 * just the current one started with this call.
 ****/
function beginTransaction(args): Observable<any> {
  return defer(() => {
    const currentId = currentTransactionId++;
    transactionSubj.next({id: currentId, args});
    return transaction$.pipe(
      first(({id}) => id === currentId),
      map(({payload}) => payload)
    );
  })
}
// Queue up 3 transactions, each one will wait for the previous
// one to complete before it will begin.
beginTransaction({message: "Dave"}).subscribe(console.log);
beginTransaction({message: "Tom"}).subscribe(console.log);
beginTransaction({message: "Tim"}).subscribe(console.log);
Asynchronous Transactions
The current setup requires transactions to be asynchronous, or you risk losing the first one. The workaround for that is not simple, so I've built an operator that subscribes, then calls a function as soon as possible after that.
Here it is:
function initialize<T>(fn: () => void): MonoTypeOperatorFunction<T> {
  return s => new Observable(observer => {
    const bindOn = name => observer[name].bind(observer);
    const sub = s.subscribe({
      next: bindOn("next"),
      error: bindOn("error"),
      complete: bindOn("complete")
    });
    fn();
    return {
      unsubscribe: () => sub.unsubscribe()
    };
  });
}
and here it is in use:
function beginTransaction(args): Observable<any> {
  return defer(() => {
    const currentId = currentTransactionId++;
    return transaction$.pipe(
      initialize(() => transactionSubj.next({id: currentId, args})),
      first(({id}) => id === currentId),
      map(({payload}) => payload)
    );
  })
}
Aside: Why Use defer?
Consider rewriting beginTransaction:
function beginTransaction(args): Observable<any> {
  const currentId = currentTransactionId++;
  return transaction$.pipe(
    initialize(() => transactionSubj.next({id: currentId, args})),
    first(({id}) => id === currentId),
    map(({payload}) => payload)
  );
}
In this case, the ID is set at the moment you invoke beginTransaction.
// The ID is set here, but it won't be used until subscribed
const preppedTransaction = beginTransaction({message: "Dave"});
// 10 seconds later, that ID gets used.
setTimeout(
  () => preppedTransaction.subscribe(console.log),
  10000
);
If transactionSubj.next were called without the initialize operator, the problem would be even worse: transactionSubj.next would fire 10 seconds before the observable is subscribed to, so you'd be sure to miss the output.
The problems continue:
What if you want to subscribe to the same observable twice?
const preppedTransaction = beginTransaction({message: "Dave"});
preppedTransaction.subscribe(
  value => console.log("First Subscribe: ", value)
);
preppedTransaction.subscribe(
  value => console.log("Second Subscribe: ", value)
);
I would expect the output to be:
First Subscribe: Hello Dave
Second Subscribe: Hello Dave
Instead, you get
First Subscribe: Hello Dave
First Subscribe: Hello Dave
Second Subscribe: Hello Dave
Second Subscribe: Hello Dave
Because you don't get a new ID on subscribing, the two subscriptions share one ID. defer fixes this by not assigning an ID until subscription. That becomes seriously important when managing errors in streams, since it lets you re-try an observable after it errors.
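As a quick illustration of that last point (a sketch reusing beginTransaction from above): retry resubscribes after an error, and because of defer each new attempt gets its own id and its own transactionSubj.next call.
import { retry } from "rxjs/operators";
// If the transaction errors, retry resubscribes; defer re-runs the factory,
// so the retried attempt announces itself with a fresh id.
beginTransaction({ message: "Dave" })
  .pipe(retry(2))
  .subscribe({
    next: console.log,
    error: err => console.error("gave up after retries", err)
  });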
I am not sure I have understood the problem right, but it looks to me as if concatMap is the operator you are looking for.
An example could be the following
const transactionTriggers$ = from([
  't1', 't2', 't3'
])
function processTransaction(trigger: string) {
  console.log(`Start processing transaction triggered by ${trigger}`)
  // do whatever needs to be done and then return an Observable
  console.log(`Transaction triggered by ${trigger} processing ......`)
  return of(`Transaction triggered by ${trigger} processed`)
}
transactionTriggers$.pipe(
  concatMap(trigger => processTransaction(trigger)),
  tap(console.log)
).subscribe()
You basically start from a stream of events, where each event is supposed to trigger the processing of a transaction.
Then you use the processTransaction function to do whatever you have to do to process a transaction. processTransaction needs to return an Observable which emits the result of the processing once the transaction has been processed, and then completes.
Then in the pipe you can use tap to do further work with the result of the processing, if required.
You can try the code in this stackblitz.

Delay batch of observables with RxJS

I perform HTTP requests against my db and have noticed that if I send all the requests at once, some of them get timeout errors. I'd like to add a delay between calls so the server doesn't get overloaded. I'm trying to find the RxJS solution to this problem and don't want to add a setTimeout.
Here is what I currently do:
let observables = [];
for (let int = 0; int < 10000; int++) {
  observables.push(new Observable((observer) => {
    db.add(doc[int], (err, result) => {
      observer.next();
      observer.complete();
    })
  }))
}
forkJoin(observables).subscribe(
  data => {
  },
  error => {
    console.log(error);
  },
  () => {
    db.close();
  }
);
You can indeed achieve this quite nicely with RxJS. You'll need higher-order observables, which means you emit observables into an observable, and the higher-order observable flattens them out for you.
The nice thing about this approach is that you can easily run X requests in parallel without having to manage the pool of requests yourself.
Here's the working code:
import { Observable, Subject } from "rxjs";
import { mergeAll, take, tap } from "rxjs/operators";
// this is just a mock to demonstrate how it'd behave if the API was
// taking 2s to reply for a call
const mockDbAddHttpCall = (id, cb) =>
  setTimeout(() => {
    cb(null, `some result for call "${id}"`);
  }, 2000);
// I have no idea what your response type looks like so I'm assigning
// any but of course you should have your own type instead of this
type YourRequestType = any;
const NUMBER_OF_ITEMS_TO_FETCH = 10;
const calls$$ = new Subject<Observable<YourRequestType>>();
calls$$
  .pipe(
    mergeAll(3),
    take(NUMBER_OF_ITEMS_TO_FETCH),
    tap({ complete: () => console.log(`All calls are done`) })
  )
  .subscribe(console.log);
for (let id = 0; id < NUMBER_OF_ITEMS_TO_FETCH; id++) {
  calls$$.next(
    new Observable(observer => {
      console.log(`Starting a request for ID "${id}"`);
      mockDbAddHttpCall(id, (err, result) => {
        if (err) {
          observer.error(err);
        } else {
          observer.next(result);
          observer.complete();
        }
      });
    })
  );
}
And a live demo on Stackblitz: https://stackblitz.com/edit/rxjs-z1x5m9
Please open the console of your browser and note that the log showing when a call is triggered appears straight away for 3 of them; after that, each new call waits for one to finish before being picked up.
Looks like you could use an initial timer to trigger the http calls. e.g.
timer(delayTime).pipe(combineLatest(()=>sendHttpRequest()));
This would only trigger the sendHttpRequest() method after the timer observable had completed.
So with your solution, you could do the following...
observables.push(
  timer(delay + int).pipe(
    combineLatest(
      new Observable((observer) => {
        db.add(doc[int], (err, result) => {
          observer.next();
          observer.complete();
        });
      })
    )
  )
);
Where delay could probably start off at 0 and you could increase it using the int index of your loop by some margin.
Timer docs: https://www.learnrxjs.io/learn-rxjs/operators/creation/timer
Combine latest docs: https://www.learnrxjs.io/learn-rxjs/operators/combination/combinelatest
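If you go the staggered-delay route, here is a sketch of how it might look. Note it swaps in concatMap after the timer instead of the combineLatest operator shown above, and the 100 ms margin is picked purely for illustration; observables, db and doc are the same variables as in the question.
import { Observable, timer } from "rxjs";
import { concatMap } from "rxjs/operators";
const STAGGER_MS = 100; // illustrative margin per index
for (let int = 0; int < 10000; int++) {
  observables.push(
    timer(int * STAGGER_MS).pipe(
      // once the timer fires, switch to the actual db call
      concatMap(() => new Observable((observer) => {
        db.add(doc[int], (err, result) => {
          if (err) {
            observer.error(err);
          } else {
            observer.next();
            observer.complete();
          }
        });
      }))
    )
  );
}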
merge with concurrent value:
mergeAll and mergeMap both let you cap the number of concurrently subscribed observables. mergeAll(1)/mergeMap(LAMBDA, 1) is basically concatAll()/concatMap(LAMBDA).
merge is basically just the static version of mergeAll.
Here's how you might use that:
let observables = [...Array(10000).keys()].map(intV =>
  new Observable(observer => {
    db.add(doc[intV], (err, result) => {
      observer.next();
      observer.complete();
    });
  })
);
const MAX_CONCURRENT_REQUESTS = 10;
merge(...observables, MAX_CONCURRENT_REQUESTS).subscribe({
  next: data => {},
  error: err => console.log(err),
  complete: () => db.close()
});
Of note: This doesn't batch your calls, but it should solve the problem described and it may be a bit faster than batching as well.
mergeMap with concurrent value:
Perhaps a slightly more RxJS way using range and mergeMap
const MAX_CONCURRENT_REQUESTS = 10;
range(0, 10000).pipe(
  mergeMap(intV =>
    new Observable(observer => {
      db.add(doc[intV], (err, result) => {
        observer.next();
        observer.complete();
      });
    }),
    MAX_CONCURRENT_REQUESTS
  )
).subscribe({
  next: data => {},
  error: err => console.log(err),
  complete: () => db.close()
});

Re-execute async RxJS stream after delay

I'm using RxJS 6 to lazily step through iterable objects, using code similar to the example below. This is working well, but I'm having trouble solving my final use case.
Full code here
import { EMPTY, defer, from, of } from "rxjs";
import { delay, expand, mergeMap, repeat } from "rxjs/operators";
function stepIterator (iterator) {
  return defer(() => of(iterator.next())).pipe(
    mergeMap(result => result.done ? EMPTY : of(result.value))
  );
}
function iterateValues ({ params }) {
  const { values, delay: delayMilliseconds } = params;
  const isIterable = typeof values[Symbol.iterator] === "function";
  // Iterable values which are emitted over time are handled manually. Otherwise
  // the values are provided to Rx for resolution.
  if (isIterable && delayMilliseconds > 0) {
    const iterator = values[Symbol.iterator]();
    // The first value is emitted immediately, the rest are emitted after time.
    return stepIterator(iterator).pipe(
      expand(v => stepIterator(iterator).pipe(delay(delayMilliseconds)))
    );
  } else {
    return from(values);
  }
}
const options = {
  params: {
    // Any iterable object is walked manually. Otherwise delegate to `from()`.
    values: ["Mary", "had", "a", "little", "lamb"],
    // Delay _between_ values.
    delay: 350,
    // Delay before the stream restarts _after the last value_.
    runAgainAfter: 1000,
  }
};
iterateValues(options)
  // Is not repeating?!
  .pipe(repeat(3))
  .subscribe(
    v => {
      console.log(v, Date.now());
    },
    console.error,
    () => {
      console.log('Complete');
    }
  );
I'd like to add another option which will re-execute the stream, an indefinite number of times, after a delay (runAgainAfter). I'm having trouble composing this cleanly without pushing the result.done case deeper. So far I've been unable to compose the run-again behavior around iterateValues.
What's the best approach to accomplish the use case?
Thanks!
Edit 1: repeat just hit me in the face. Perhaps it means to be friendly.
Edit 2: No, repeat isn't repeating; the observable just completes. Thanks for any help. I'm confused.
For posterity, here is the full code for a revised version that is repeat-able and uses a consistent delay between items. The original didn't repeat because the iterator was created once and was already exhausted by the time repeat resubscribed; deferring the creation of the iterator gives each subscription a fresh one.
import { concat, EMPTY, defer, from, interval, of, throwError } from "rxjs";
import { delay, expand, mergeMap, repeat } from "rxjs/operators";
function stepIterator(iterator) {
  return defer(() => of(iterator.next())).pipe(
    mergeMap(result => (result.done ? EMPTY : of(result.value)))
  );
}
function iterateValues({ params }) {
  const { values, delay: delayMilliseconds, times = 1 } = params;
  const isIterable =
    values != null && typeof values[Symbol.iterator] === "function";
  if (!isIterable) {
    return throwError(new Error(`\`${values}\` is not iterable`));
  }
  // Iterable values which are emitted over time are handled manually. Otherwise
  // the values are provided to Rx for resolution.
  const observable =
    delayMilliseconds > 0
      ? defer(() => of(values[Symbol.iterator]())).pipe(
          mergeMap(iterator =>
            stepIterator(iterator).pipe(
              expand(v => stepIterator(iterator).pipe(delay(delayMilliseconds)))
            )
          )
        )
      : from(values);
  return observable.pipe(repeat(times));
}
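A usage sketch with the options from the question, passing times to exercise repeat:
iterateValues({
  params: {
    values: ["Mary", "had", "a", "little", "lamb"],
    delay: 350,
    times: 3
  }
}).subscribe(
  v => console.log(v, Date.now()),
  console.error,
  () => console.log("Complete")
);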
To be honest, there may well be a better solution. In mine, I ended up encapsulating the delay logic in a custom runAgainAfter operator, keeping it an independent piece that doesn't affect your code's logic directly.
Full working code is here
And the code of runAgainAfter if anybody needs it:
import { Observable } from "rxjs";
export const runAgainAfter = delay => observable => {
  return new Observable(observer => {
    let timeout;
    let subscription;
    const subscribe = () => {
      return observable.subscribe({
        next(value) {
          observer.next(value);
        },
        error(err) {
          observer.error(err);
        },
        complete() {
          timeout = setTimeout(() => {
            subscription = subscribe();
          }, delay);
        }
      });
    };
    subscription = subscribe();
    return () => {
      subscription.unsubscribe();
      clearTimeout(timeout);
    };
  });
};
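Because pipe accepts any Observable-to-Observable function, the operator drops straight into a pipe. A usage sketch with the options object from the question:
iterateValues(options)
  .pipe(runAgainAfter(options.params.runAgainAfter))
  .subscribe(v => console.log(v, Date.now()));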
Hope it helps <3

Admin on rest - implementing aor-realtime

I'm having a really hard time understanding how to implement aor-realtime (trying to use it with Firebase; reads only, no writes).
The first place I get stuck: this library generates a saga, right? How do I connect that with a restClient/resource? I have a few custom sagas that alert me on errors, but those are backed by a main restClient/resource and only handle side effects. In this case I just don't understand what the role of the client is and how it interacts with the generated saga (or vice versa).
The other question is about persistence: updates stream in, and the initial set of records is not loaded in one go. Should I be calling observer.next() with each update, or cache the updated records and call next() with the entire collection to date?
Here's my current attempt at doing the latter, but I'm still lost as to how to connect it to my Admin/Resource.
import realtimeSaga from 'aor-realtime';
import { client, getToken } from '../firebase';
import { union } from 'lodash'
let cachedToken
const observeRequest = path => (type, resource, params) => {
  // Filtering so that only chats are updated in real time
  if (resource !== 'chat') return;
  let results = {}
  let ids = []
  return {
    subscribe(observer) {
      let databaseRef = client.database().ref(path).orderByChild('at')
      let events = [ 'child_added', 'child_changed' ]
      events.forEach(e => {
        databaseRef.on(e, ({ key, val }) => {
          results[key] = val()
          ids = union([ key ], ids)
          observer.next(ids.map(id => results[id]))
        })
      })
      const subscription = {
        unsubscribe() {
          // Clean up after ourselves
          databaseRef.off()
          results = {}
          ids = []
          // Notify the saga that we cleaned up everything
          observer.complete();
        }
      };
      return subscription;
    },
  };
};
export default path => realtimeSaga(observeRequest(path));
How do I connect that with a restClient/resource?
Just add the created saga to the custom sagas of your Admin component.
About the restClient: if you need it in your observer, pass it to the function which returns your observer, just as you did with path. That's actually how it's done in the readme.
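For example, the wiring might look like the sketch below. Here restClient, ChatList and the './chatSaga' import path are placeholders for whatever you already have; customSagas is the Admin prop admin-on-rest documents for registering extra sagas.
import React from 'react';
import { Admin, Resource } from 'admin-on-rest';
// the default export shown above: path => realtimeSaga(observeRequest(path))
import createChatSaga from './chatSaga';
const chatSaga = createChatSaga('/chats');
// restClient and ChatList below are whatever you already use elsewhere in the app
const App = () => (
  <Admin restClient={restClient} customSagas={[chatSaga]}>
    <Resource name="chats" list={ChatList} />
  </Admin>
);
export default App;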
Should I be calling observer.next() with each update? or cache the updated records and call next() with the entire collection to-date.
It depends on the type parameter which is one of the admin-on-rest fetch types:
CRUD_GET_LIST: you should return the entire collection, updated
CRUD_GET_ONE: you should return the resource specified in params (which should contain its id)
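A hedged sketch of what that branching could look like inside subscribe, reusing the results/ids caches from the snippet above (the payload shapes mirror what an admin-on-rest restClient would return; notifyFor is just an illustrative name):
// Illustrative only: shape the emission according to the fetch type.
const notifyFor = (observer, type, params) => {
  if (type === 'CRUD_GET_LIST') {
    // the entire, updated collection plus a total for pagination
    observer.next({ data: ids.map(id => results[id]), total: ids.length });
  } else if (type === 'CRUD_GET_ONE') {
    // just the record whose id is in params
    observer.next({ data: results[params.id] });
  }
};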
Here's the solution I came up with, with guidance from @gildas:
import realtimeSaga from "aor-realtime";
import { client } from "../../../clients/firebase";
import { union } from "lodash";
const observeRequest = path => {
  return (type, resource, params) => {
    // Filtering so that only chats are updated in real time
    if (resource !== "chats") return;
    let results = {}
    let ids = []
    const updateItem = res => {
      results[res.key] = { ...res.val(), id: res.key }
      ids = Object.keys(results).sort((a, b) => results[b].at - results[a].at)
    }
    return {
      subscribe(observer) {
        const { page, perPage } = params.pagination
        const offset = perPage * (page - 1)
        const databaseRef = client
          .database()
          .ref(path)
          .orderByChild("at")
          .limitToLast(offset + perPage)
        const notify = () => observer.next({ data: ids.slice(offset, offset + perPage).map(e => results[e]), total: ids.length + 1 })
        databaseRef.once('value', snapshot => {
          snapshot.forEach(updateItem)
          notify()
        })
        databaseRef.on('child_changed', res => {
          updateItem(res)
          notify()
        })
        const subscription = {
          unsubscribe() {
            // Clean up after ourselves
            databaseRef.off();
            // Notify the saga that we cleaned up everything
            observer.complete();
          }
        };
        return subscription;
      }
    };
  }
};
export default path => realtimeSaga(observeRequest(path));

Create an Rx.Subject using Subject.create that allows onNext without subscription

When creating an Rx.Subject using Subject.create(observer, observable), the resulting Subject is lazy: when I try to use subject.onNext without having a subscription, it doesn't pass messages on. If I subject.subscribe() first, I can use onNext immediately after.
Let's say I have an Observer, created like so:
function createObserver(socket) {
  return Observer.create(msg => {
    socket.send(msg);
  }, err => {
    console.error(err);
  }, () => {
    socket.removeAllListeners();
    socket.close();
  });
}
Then, I create an Observable that accepts messages:
function createObservable(socket) {
  return Observable.fromEvent(socket, 'message')
    .map(msg => {
      // Trim out unnecessary data for subscribers
      delete msg.blobs;
      // Deep freeze the message
      Object.freeze(msg);
      return msg;
    })
    .publish()
    .refCount();
}
The subject is created using these two functions.
observer = createObserver(socket);
observable = createObservable(socket);
subject = Subject.create(observer, observable);
With this setup, I'm not able to call subject.onNext immediately (even if I don't care about subscribing). Is this by design? What's a good workaround?
These are actually TCP sockets, which is why I haven't relied on the super slick websocket subjects.
The basic solution, caching nexts before subscription with ReplaySubject:
I think all you wanted to do is use a ReplaySubject as your observer.
const { Observable, Subject, ReplaySubject } = Rx;
const replay = new ReplaySubject();
const observable = Observable.create(observer => {
  replay.subscribe(observer);
});
const mySubject = Subject.create(replay, observable);
mySubject.onNext(1);
mySubject.onNext(2);
mySubject.onNext(3);
mySubject.subscribe(x => console.log(x));
mySubject.onNext(4);
mySubject.onNext(5);
Results in:
1
2
3
4
5
A socket implementation (example, don't use)
... but if you're looking at doing a socket implementation, it gets a lot more complicated. Here is a working socket implementation, but I don't recommend you use it. Rather, I'd suggest you use one of the community-supported implementations, either in rxjs-dom (if you're on RxJS 4 or lower) or as part of RxJS 5, both of which I've helped work on.
function createSocketSubject(url) {
  let replay = new ReplaySubject();
  let socket;
  const observable = Observable.create(observer => {
    socket = new WebSocket(url);
    socket.onmessage = (e) => {
      observer.onNext(e);
    };
    socket.onerror = (e) => {
      observer.onError(e);
    };
    socket.onclose = (e) => {
      if (e.wasClean) {
        observer.onCompleted();
      } else {
        observer.onError(e);
      }
    }
    let sub;
    socket.onopen = () => {
      sub = replay.subscribe(x => socket.send(x));
    };
    return () => {
      socket && socket.readyState === 1 && socket.close();
      sub && sub.dispose();
    }
  });
  return Subject.create(replay, observable);
}
const socket = createSocketSubject('ws://echo.websocket.org');
socket.onNext('one');
socket.onNext('two');
socket.subscribe(x => console.log('response: ' + x.data));
socket.onNext('three');
socket.onNext('four');
Here's the obligatory JsBin
