Apollo Client - refetchQueries after multiple updates - react-apollo

I'm using Apollo Client 3 and trying to build an editable table.
I have multiple API mutations and should trigger refetchQueries only after the last mutation.
For example:
const [updateName] = useMutation(updateNameD)
const [updateAge] = useMutation(updateAgeD)
const [updateCity] = useMutation(updateCityD, {refetchQueries: () => [{ query: UsersDocument }],})
The issue: the order of the mutations won't always be the same, and I need to run refetchQueries after whichever mutation happens to be last.

If all three mutations are always executed together, you can combine them into a single mutation document; refetchQueries will then run only after the combined mutation finishes. Note that when you combine them, you will have only one useMutation hook.
Let me know if you have a different use case.
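For illustration, a minimal sketch of such a combined mutation (the field names updateName, updateAge and updateCity, and their arguments, are assumptions rather than confirmed schema fields):
const UPDATE_USER = gql`
  mutation UpdateUser($id: ID!, $name: String, $age: Int, $city: String) {
    updateName(id: $id, name: $name) { id name }
    updateAge(id: $id, age: $age) { id age }
    updateCity(id: $id, city: $city) { id city }
  }
`;

const [updateUser] = useMutation(UPDATE_USER, {
  refetchQueries: () => [{ query: UsersDocument }],
});

// One call performs all three updates; refetchQueries runs once, after the
// combined mutation resolves.
await updateUser({ variables: { id, name, age, city } });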
[EDIT]
If you cannot group the mutations into one document, the approach below can work.
const [updateName] = useMutation(updateNameD)
const [updateAge] = useMutation(updateAgeD)
const [updateCity] = useMutation(updateCityD)
const [refetchPageData] = useLazyQuery(UsersDocument)
Promise.all([updateName(), updateAge(), updateCity()]).then(res => {
  refetchPageData()
})

Found another workaround: the useQuery hook also returns a refetch method, which you can call wherever you want:
const [updateName] = useMutation(updateNameD)
const [updateAge] = useMutation(updateAgeD)
const [updateCity] = useMutation(updateCityD)
const {data, refetch} = useQuery(UsersDocument)
await Promise.all([updateName(), updateAge(), updateCity()])
refetch()

Related

Different query for same __typename data returns undefined on Apollo Graphql from cache

I am fairly new to Apollo and GraphQL and I don't quite know what I am doing wrong. I will try to explain my problem with some simplified code, but if I am missing some information please let me know and I will update it.
I have an Apollo Client instance consuming an external (Directus CMS) GraphQL API that I cannot modify.
I am querying a paginated (infinite scroll) list of posts with offset and limit variables, and the data returned is correct. I am using the following typePolicy to merge the result arrays in my InMemoryCache. This code is from an example in the Apollo docs:
posts: {
  merge(existing, incoming, { readField, mergeObjects }) {
    const merged = existing ? existing.slice(0) : [];
    const postIdToIndex = Object.create(null);
    if (existing) {
      existing.forEach((post, index) => {
        postIdToIndex[readField("id", post)] = index;
      });
    }
    incoming.forEach((post) => {
      const id = readField("id", post);
      const index = postIdToIndex[id];
      if (typeof index === "number") {
        // Merge the new post data with the existing post data.
        merged[index] = mergeObjects(merged[index], post);
      } else {
        // First time we've seen this post in this array.
        postIdToIndex[id] = merged.length;
        merged.push(post);
      }
    });
    return merged;
  }
}
and this is a simplified version of my query:
query($offset: Int $limit: Int) {
posts(limit: $limit offset: $offset) {
id
name
}
}
With this merge function and query, pagination works and everything loads correctly. But then I try to fetch a single post, selecting additional fields, with the following query:
query($post_id: Int) {
  posts(filter: { id: { _eq: $post_id } }) {
    id
    name
    other_data
  }
}
This constantly returns undefined from useQuery(), although I can see the result coming through my merge function and being merged via the mergeObjects() call. If I use a fetchPolicy of no-cache for this query, the data returns correctly, so it has something to do with the cache.
I hope someone can lead me in the right direction so I can fix this problem.
Thank you everyone in advance!!
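One direction that may be worth checking (a sketch under assumptions, not a verified fix): if the posts field policy is keyed so that both queries share a single cached list (for example with keyArgs: false, as in the infinite-scroll examples), the filtered result gets folded into the paginated list by the merge function rather than cached as its own entry. Keying the cache entry by the filter argument keeps the two result sets apart:
import { InMemoryCache } from "@apollo/client";

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        posts: {
          // Separate cache entries per filter value; offset and limit are
          // left out of keyArgs so successive pages still merge together.
          keyArgs: ["filter"],
          merge(existing = [], incoming) {
            // The de-duplicating merge from the question belongs here; a
            // plain append is shown only to keep the sketch short.
            return [...existing, ...incoming];
          },
        },
      },
    },
  },
});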

Parse Server - Get Pinned Object using Labels

I am storing objects in the Local Datastore via pinning. I can pin objects under a label (e.g. following). If I want to return all of the people that a user is following, there doesn't seem to be a way to do that. I can't even find a way to return all pinned objects regardless of their label. Am I missing something?
Here is my code for storing a person object in my Local Datastore:
peer.pinWithName( 'Followed' );
I can find out if the peer is followed using:
const Followed = Parse.Object.extend( 'Peer' );
const query = new Parse.Query( Followed );
query.fromLocalDatastore();
response = await query.get( peer.id );
Querying all objects from the Local Datastore:
const Followed = Parse.Object.extend('Peer');
const query = new Parse.Query(Followed);
query.fromLocalDatastore();
response = await query.find();
Reference: https://docs.parseplatform.org/js/guide/#querying-the-local-datastore
Querying an object from a named pin:
const Followed = Parse.Object.extend('Peer');
const query = new Parse.Query(Followed);
query.fromPinWithName('Followed');
response = await query.get(peer.id);
Querying all objects from a named pin:
const Followed = Parse.Object.extend('Peer');
const query = new Parse.Query(Followed);
query.fromPinWithName('Followed');
response = await query.find();
Reference: http://parseplatform.org/Parse-SDK-JS/api/2.7.0/Parse.Query.html#fromPinWithName

How to share a single Subject source

I'm trying to share a Subject source across multiple functions that filter for their actions and do the appropriate tasks; the actions that are not filtered should fall through without modification.
I've tried merging the same source, but it doesn't really work the way I need it to...
const source = new Subject()
source.next({ type: 'some type', action: {} })
merge(
  source,
  source.pipe(filter(/* ... */) /* do something */),
  source.pipe(filter(/* ... */) /* do something */),
  source.pipe(filter(/* ... */) /* do something */),
  source.pipe(filter(/* ... */) /* do something */),
).subscribe(/* ... */)
In this case I get the original source plus the filtered ones, so matched actions come through twice.
I'm expecting to be able to provide the same source to multiple functions that can filter on types and do async work; the types that were not filtered should fall through untouched. Hope this is clear enough, otherwise I will try to make a better example. Thanks!
Basically you want one source of actions; a Subject is a fine way to do this.
Then you want to do some processing on each type of action. You can filter and subscribe to each substream:
const addSub = source.pipe(filter(a => a.type === "add")).subscribe(function onAddAction(a) {});
const removeSub = source.pipe(filter(a => a.type === "remove")).subscribe(function onRemove(a) {});
Or you can prepare the substreams and then merge them back into a single stream of processed actions.
const add$ = source.pipe(filter(a => a.type === "add"), tap(onAdd));
const remove$ = source.pipe(filter(a => a.type === "remove"), tap(onRemove));
const processedAction$ = merge(add$, remove$);
processedAction$.subscribe(logAction);
If you need to do some preprocessing on all actions, you can use share or shareReplay so that toAction is called only once per item.
const subject = new Subject();
const action$ = subject.pipe(map(toAction), share());
const add$ = action$.pipe(filter(isAdd));
...
merge(add$, remove$).subscribe(logAction);
And if you have problems splitting:
function not(predicate) {
return function(item, ...args) {
return !predicate(item, ...args);
}
}
function any(...predicates) {
return function(item, ...args) {
return predicates.some(p => p(item, ...args));
}
}
const a = source.pipe(filter(fa), map(doA));
const b = source.pipe(filter(fb), map(doB));
const c = source.pipe(filter(fc), map(doC));
const rest = source.pipe(filter(not(any(fa, fb, fc))));
merge(a, b, c, rest).subscribe(logAction);
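Tying the fragments above together, a minimal self-contained sketch of the split-and-merge pattern (assuming RxJS 6+; the "add"/"remove" action types and handlers are illustrative):
import { Subject, merge } from "rxjs";
import { filter, tap } from "rxjs/operators";

const source = new Subject();

const isAdd = (a) => a.type === "add";
const isRemove = (a) => a.type === "remove";

// Substreams with their side effects.
const add$ = source.pipe(filter(isAdd), tap((a) => console.log("added", a)));
const remove$ = source.pipe(filter(isRemove), tap((a) => console.log("removed", a)));

// Everything no substream handled falls through unchanged.
const rest$ = source.pipe(filter((a) => !isAdd(a) && !isRemove(a)));

merge(add$, remove$, rest$).subscribe((a) => console.log("processed", a));

source.next({ type: "add", payload: 1 });     // handled by add$, then logged as processed
source.next({ type: "unknown", payload: 2 }); // falls through via rest$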

Using Config.skip with a React-Apollo Query

I'm having some trouble making use of the Config.skip property inside of my graphql() wrapper.
The intent is for the query to be fired with an argument of currentGoalID only after a user has selected an item from the drop-down (passing the associated currentGoalID) and the (Redux) state has been updated with a value for currentGoalID.
Otherwise, I expect (as per Apollo documentation) that:
... your child component doesn’t get a data prop at all, and the options or props methods are not called.
In this case though, it seems that my skip property is being ignored despite the absence of a value for currentGoalID, and the options are still being evaluated, because the webpack compiler/linter throws on line 51: props is not defined...
I can successfully console.log the value of currentGoalID without the graphql() wrapper. Any idea why config.skip isn't working? I'd also like advice on the proper use of this in the graphql() call; I've excluded it here, but am unsure of the context. Thanks.
class CurrentGoal extends Component {
constructor(props) {
super(props)
}
render (){
console.log(this.props.currentGoalID);
return( <p>Current Goal: {null}</p>
)
}
}
const mapStateToProps = (state, props) => {
return {
currentGoal: state.goals.currentGoal,
currentGoalID: state.goals.currentGoalID,
currentGoalSteps: state.goals.currentGoalSteps
}
}
const FetchGoalDocByID = gql `
query root($varID:String) {
goalDocsByID(id:$varID) {
goal
}
}`;
const CurrentGoalWithState = connect(mapStateToProps)(CurrentGoal);
const CurrentGoalWithData = graphql(FetchGoalDocByID, {
skip: (props) => !props.currentGoalID,
options: {variables: {varID: props.currentGoalID}}
})(CurrentGoalWithState);
// export default CurrentGoalWithState
export default CurrentGoalWithData
See the answer here: https://stackoverflow.com/a/47943253/763231
connect must be the outermost wrapper, applied after graphql, so that graphql receives the props coming from Redux.
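A sketch of that ordering, reusing the names from the question and also making options a function of props (the plain-object form is what makes line 51 throw props is not defined); treat it as an illustration of the linked answer rather than a verified drop-in:
const CurrentGoalWithData = graphql(FetchGoalDocByID, {
  // Both skip and options receive the wrapped component's props, which now
  // include currentGoalID because connect wraps the graphql-enhanced component.
  skip: ({ currentGoalID }) => !currentGoalID,
  options: ({ currentGoalID }) => ({ variables: { varID: currentGoalID } }),
})(CurrentGoal);

// connect is applied last (outermost), so the Redux props reach graphql's
// skip and options.
export default connect(mapStateToProps)(CurrentGoalWithData);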

RethinkDB: Query multiple tables with chaining

In RethinkDB I am trying to chain multiple queries across multiple database tables. The idea is the same as stored procedures in traditional databases. Basically I query all the users connected to a device, and then for each user I try to get the rules attached to them from a rules table.
Here is the gist of the ReQL query I am writing. But forEach does not work, since it expects a write query rather than a read query, and do() is failing as well. Any suggestions?
const get_distance = function(location){
const final_location = 500;
return r.expr(500).sub(location);
};
const run_rule = function(device,distance){
return r.db('locationtracker_development').table('customer_details').filter(function(cust){
return cust("deviceId").contains(device);
}).pluck("userId").forEach(function(userName){
//TODO Work on each user
return r.db('locationtracker_development').table('user_rules').filter({'userId':userName('userId')});
});
};
r.do(get_distance(100)).do(function(dist){
return run_rule('gXtzAawbc6',dist);
});
I got it resolved with help from the RethinkDB Slack channel.
Here is the code:
const get_distance = function(location){
const final_location = 500;
return r.expr(500).sub(location);
};
const run_rule = function(device,distance){
return r.db('locationtracker_development').table('customer_details').filter(function(cust){
return cust("deviceId").contains(device);
}).coerceTo('array').map(function(doc){
return doc('userId');
}).do(function(userIds){
//return userIds;
return r.db('locationtracker_development').table('user_rules').
getAll(r.args(userIds),{index:'userId'});
});
};
r.do(get_distance(100)).do(function(dist){
return run_rule('gXtzAawbc6',dist);
});
Basically, the selection returned by filter needs to be coerced to an array; then, using map and do, we can query across multiple tables.
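For completeness, a sketch of running the query above with the RethinkDB JavaScript driver; get_distance and run_rule are the functions defined in the answer, and the host/port are assumptions for a local instance:
const r = require('rethinkdb');

r.connect({ host: 'localhost', port: 28015 }).then(async (conn) => {
  // run_rule resolves to the matching user_rules documents.
  const cursor = await r.do(get_distance(100)).do(function (dist) {
    return run_rule('gXtzAawbc6', dist);
  }).run(conn);

  const rules = await cursor.toArray();
  console.log(rules);

  await conn.close();
});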
