I want to be able to do updates on an object while it is still being created.
For example: Say I have a to-do list where I can add items with names. I also want to be able to edit names of items.
Now say a user with a slow connection creates an item. In that case I fire off a create-item mutation and optimistically update my UI. That works great; so far, no problem.
Now let's say the create item mutation is taking a bit of time due to a slow network. In that time, the user decides to edit the name of the item they just created. For an ideal experience:
1. The UI should immediately update with the new name.
2. The new name should eventually be persisted on the server.
I can achieve #2 by waiting for the create mutation to finish (so that I can get the item ID), then making an update name mutation. But that means parts of my UI will remain unchanged until the create item mutation returns and the optimistic response of the update name mutation kicks in. This means #1 won't be achieved.
So I'm wondering: how can I achieve both #1 and #2 using Apollo Client?
Note: I don't want to add spinners or disable editing. I want the app to feel responsive even with a slow connection.
If you have access to the server, you can implement an upsert operation and reduce both the create and the update to a single mutation:
mutation {
  upsertTodoItem(
    where: {
      key: $itemKey # Some unique key generated on client
    }
    update: {
      listId: $listId
      text: $itemText
    }
    create: {
      key: $itemKey
      listId: $listId
      text: $itemText
    }
  ) {
    id
    key
  }
}
So you will have a sequence of identical mutations differing only in their variables, and an optimistic response can accordingly be configured for this one mutation. On the server you check whether an item with the given key already exists and either create or update it.
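For illustration, the optimistic response for that single upsert mutation could be configured roughly like this (a sketch: UPSERT_TODO_ITEM is assumed to be the mutation above wrapped in gql, and the __typename and temporary id are assumptions about your schema):

client.mutate({
  mutation: UPSERT_TODO_ITEM,
  variables: { itemKey, listId, itemText },
  optimisticResponse: {
    upsertTodoItem: {
      __typename: 'TodoItem', // assumed type name
      id: itemKey,            // temporary id until the real response arrives
      key: itemKey,
    },
  },
})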
Additionally, you might want to use apollo-link-debounce to reduce the number of requests while the user is typing.
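A rough setup sketch for that (treat the interval, endpoint, and names as placeholders; exact options may vary by version): operations that pass the same debounceKey in their context are debounced together, so rapid edits to one item collapse into a single request.

import { ApolloLink } from 'apollo-link'
import { HttpLink } from 'apollo-link-http'
import DebounceLink from 'apollo-link-debounce'

const link = ApolloLink.from([
  new DebounceLink(500), // default debounce interval in ms
  new HttpLink({ uri: '/graphql' }),
])

// Later, when firing the upsert from above:
client.mutate({
  mutation: UPSERT_TODO_ITEM,
  variables: { itemKey, listId, itemText },
  context: { debounceKey: itemKey }, // debounce per item
})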
I think the easiest way to achieve your desired effect is to actually drop optimistic updates in favor of managing the component state yourself. I don't have the bandwidth at the moment to write out a complete example, but your basic component structure would look like this:
<ApolloConsumer>
  {(client) => (
    <Mutation mutation={CREATE_MUTATION}>
      {(create) => (
        <Mutation mutation={EDIT_MUTATION}>
          {(edit) => (
            <Form create={create} update={edit} />
          )}
        </Mutation>
      )}
    </Mutation>
  )}
</ApolloConsumer>
Let's assume we're dealing with just a single field -- name. Your Form component would start out with an initial state of
{ name: '', created: null, updates: null }
Upon submitting, the Form would do something like:
onCreate () {
  this.props.create({ variables: { name: this.state.name } })
    .then(({ data, errors }) => {
      // handle errors whichever way
      this.setState({ created: data.created })
      if (this.state.updates) {
        const id = data.created.id
        this.props.update({ variables: { ...this.state.updates, id } })
      }
    })
    .catch(errorHandler)
}
Then the edit logic looks something like this:
onEdit () {
  if (this.state.created) {
    const id = this.state.created.id
    this.props.update({ variables: { name: this.state.name, id } })
      .then(({ data, errors }) => {
        this.setState({ updates: null })
      })
      .catch(errorHandler)
  } else {
    this.setState({ updates: { name: this.state.name } })
  }
}
In effect, your edit mutation is either fired immediately when the user submits (since we already got a response back from our create mutation), or the user's changes are held in component state and then sent once the create mutation completes.
That's a very rough example, but should give you some idea on how to handle this sort of scenario. The biggest downside is that there's potential for your component state to get out of sync with the cache -- you'll need to ensure you handle errors properly to prevent that.
That also means that if you want to use this form for just edits, you'll need to fetch the data out of the cache and use it to populate your initial state (i.e. this.state.created in the example above). You can use the Query component for that; just make sure you don't render the actual Form component until you have the data prop provided by the Query component.
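A rough sketch of that edit-only wrapper (GET_TODO and its fields are assumptions for illustration, not part of the original answer):

const GET_TODO = gql`
  query GetTodo($id: ID!) {
    todo(id: $id) {
      id
      name
    }
  }
`

const EditTodo = ({ id }) => (
  <Query query={GET_TODO} variables={{ id }}>
    {({ loading, data }) => {
      // Don't render the Form until the data is available, so its initial
      // state (this.state.created) can be seeded from the cached todo.
      if (loading || !data) return null
      return <Form created={data.todo} />
    }}
  </Query>
)

In practice this would be nested inside the same Mutation components as before so the Form still receives its create and update props.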
Related
So, I've been searching the internet for an example of a React query hook being called outside the body of a function component, but I was unable to find one.
I have a GraphQL query, and I need to pass it a value from the form, so I need to call it inside Formik's onSubmit function.
I have generated my hooks with GraphQL Codegen. Since we cannot call React hooks outside a function component, I am uncertain how I can do this.
So, this is my React hook for the query:
const [{ data, fetching, stale, error, extensions, operation }] = useForgotPasswordQuery({ variables: { email: "value from the form" } });
The mutation hook, unlike the query hook, returns a function that I can call outside of it:
const [, forgotPassword] = useForgotPasswordMutation();
...
<Formik
  initialValues={{ email: '' }}
  onSubmit={async (values, { setErrors }) => {
    const response = await forgotPassword({ email: values.email });
  }}
>
For this use case, is it not possible to use a query? Or is there a way that I am unaware of?
I've read here on Stack Overflow that the convention is:
Query — for querying data (SELECT operations)
Mutation — for creating new and updating/deleting existing data (INSERT, UPDATE, DELETE)
In a way, this convention makes sense to me and I want to follow it, so I am trying to do this with a query rather than a mutation.
I have more than 50 fields (text inputs and dropdowns) in a reactive form. The fields depend on each other's value changes in order to trigger validation and to display related fields after a selection.
I subscribed to the value changes in ngOnInit() as below:
ngOnInit() {
  this.setPageValidation();
}

setPageValidation() {
  this.NameSubscription = this.FormGroup.get('personnel').get('name')
    .valueChanges.subscribe(data => {
      this.enableOrders();
    });

  this.StateSubscription = this.FormGroup.get('personnel').get('state')
    .valueChanges.subscribe(data => {
      this.enableAccount();
    });

  // ...more value-changes subscriptions, about 40 more fields
}
The form takes a long time to load because of all the value-changes subscriptions that are set up when it loads.
I tried moving the code to ngOnChanges(), but then the enabling and display of dependent fields is not triggered by the initial values filled in from the table when those fields have values. Only the first field is populated, and the rest are not displayed based on its value.
Thanks in advance. I'd really appreciate any suggestions for a better approach that avoids this performance issue.
You can do this with a single subscription.
this.personnelSubscription = this.FormGroup.get('personnel').valueChanges
  .subscribe(data => {
    if (data) {
      // `data` is the current value of the whole personnel group,
      // e.g. data.name, data.state, ...
      // Pick out the fields you need here and apply your validations.
    }
  });
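Building on that, a rough sketch of how the existing enableOrders() / enableAccount() handlers from the question could hang off that one subscription, and also run once for the initial values loaded from the backend (this assumes each handler reads the current form values itself):

setPageValidation() {
  const personnel = this.FormGroup.get('personnel');

  // One subscription for the whole group instead of ~40 per-control subscriptions.
  this.personnelSubscription = personnel.valueChanges.subscribe(() => {
    this.enableOrders();
    this.enableAccount();
    // ...the remaining dependent-field handlers
  });

  // Run the handlers once so fields that depend on the initial values
  // (the ones pre-filled from the table) are enabled/displayed too.
  this.enableOrders();
  this.enableAccount();
}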
I'm trying to figure out how queries in Apollo Client are supposed to interact with the cache.
Specifically, I want to know if we run a query that fetches all todos:
todos {
  title
  completed
}
And then later we run a query that fetches a single todo that was already fetched by the todos query and requests the exact same fields:
todo(id: $id) {
  title
  completed
}
Should the second query a) fetch the data from the cache, or b) make a network request?
My assumption was that it would be case A. This is based on this quote from an official Apollo blog post:
https://www.apollographql.com/blog/demystifying-cache-normalization/
For example, if we were to:
Perform a GetAllTodos query, normalizing and caching all todos from a backend
Call GetTodoById on a todo that we had already retrieved with GetAllTodos
...then Apollo Client could just reach into the cache and get the object directly without making another request.
However, in my app I kept getting case B: it always made an additional network request, even though I had already requested all the data in a different query.
I assumed that I was doing something wrong, so I checked out this Apollo Full-stack Tutorial repo (https://github.com/apollographql/fullstack-tutorial) and updated the LaunchDetails query to only request the same data that was already requested in the GetLaunchList query. This replicated the same scenario I detailed above with the todos.
The queries now look like this:
export const GET_LAUNCHES = gql`
  query GetLaunchList($after: String) {
    launches(after: $after) {
      cursor
      hasMore
      launches {
        ...LaunchTile
      }
    }
  }
  ${LAUNCH_TILE_DATA}
`;

export const GET_LAUNCH_DETAILS = gql`
  query LaunchDetails($launchId: ID!) {
    launch(id: $launchId) {
      ...LaunchTile
    }
  }
  ${LAUNCH_TILE_DATA}
`;
I ran the application, and found that a new network request was made for the LaunchDetails query, even though all the required data was already in the cache after the GetLaunchList query was run.
I haven't been able to find any answer to this in the documentation, and the results I'm seeing from the example tutorial app seem to be at odds with the quote from the blog piece above.
Is it the case that a query will only look to the cache if the query has already been run before? Can it not fetch cached data if that data was cached by a different query? Am I missing something?
Please see this better (in my opinion) answer here:
https://stackoverflow.com/a/66053242/6423036
Copying directly from that answer, credit to the author:
This functionality exists, but it's hard to find if you don't know what you're looking for. In Apollo Client v2 you're looking for cache redirect functionality, in Apollo Client v3 this is replaced by type policies / field read policies (v3 docs).
Apollo doesn't 'know' your GraphQL schema and that makes it easy to set up and work with in day-to-day usage. However, this implies that given some query (e.g. getBooks) it doesn't know what the result type is going to be upfront. It does know it afterwards, as long as the __typename's are enabled. This is the default behaviour and is needed for normalized caching.
Let's assume you have a getBooks query that fetches a list of Books. If you inspect the cache after this request is finished using Apollo devtools, you should find the books in the cache using the Book:123 key in which Book is the typename and 123 is the id. If it exists (and is queried!) the id field is used as identifier for the cache. If your id field has another name, you can use the typePolicies of the cache to inform Apollo InMemoryCache about this field.
If you've set this up and you run a getBook query afterwards, using some id as input, you will not get any cached data. The reason is as described before: Apollo doesn't know upfront which type this query is going to return.
So in Apollo v2 you would use a cacheRedirect to 'redirect' Apollo to the right cache:
cacheRedirects: {
  Query: {
    getBook(_, args, { getCacheKey }) {
      return getCacheKey({
        __typename: 'Book',
        id: args.id,
      });
    },
  },
},
(args.id should be replaced by another identifier if you have specified another key in the typePolicy)
When using Apollo v3, you need a type policy / field read policy:
typePolicies: {
  Query: {
    fields: {
      getBook(_, { args, toReference }) {
        return toReference({
          __typename: 'Book',
          id: args.id,
        });
      },
    },
  },
},
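For completeness, that v3 policy is passed to the cache when it is constructed, roughly like this (the getBook field name mirrors the example above; the endpoint is a placeholder):

import { ApolloClient, InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Redirect getBook(id) reads to the normalized Book:<id> entry.
        getBook(_, { args, toReference }) {
          return toReference({ __typename: 'Book', id: args.id })
        },
      },
    },
  },
})

const client = new ApolloClient({ uri: '/graphql', cache })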
The second query will make a network request:
todo(id: $id) {
  title
  completed
}
The Apollo cache isn't very smart; it's just storage. For more complicated operations you need to read from and write to it manually.
The reason for this is that Apollo doesn't know about your schema and data structure. It doesn't know that todo(id: $id) looks an item up by id, so it can't optimize that into a cache lookup.
If you don't want a second fetch, you have to read the cached item yourself, for example with a fragment:
try {
  return client.readFragment({
    id: 'Todo:5', // The value of the to-do item's unique identifier
    fragment: gql`
      fragment TodoFragment on Todo {
        id
        title
        completed
      }
    `,
  });
} catch (_e) {
  // if no fragment is found there will be an error, so fall back to the network
  return client.query({ query: QUERY, variables: { id: 5 } });
}
The way the Apollo cache works is that if you run two queries:
load todos

todos {
  id
  title
  completed
}

load single todo

todo(id: $id) {
  id
  title
  completed
}
If you load the list of todos and then load a single todo, the second query will update that todo's data in the cache (both queries select id, so their results normalize to the same cache entry).
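To illustrate, after both queries the normalized cache would look roughly like this in Apollo Client 3's devtools (field values here are placeholders):

{
  'Todo:5': { __typename: 'Todo', id: '5', title: 'Buy milk', completed: false },
  ROOT_QUERY: {
    // both the list field and the single-item field point at the same entry
    todos: [{ __ref: 'Todo:5' } /* , ... */],
    'todo({"id":"5"})': { __ref: 'Todo:5' },
  },
}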
The setup:
My basic setup is a Next.js app querying data from a GraphQL API.
I am fetching an array of objects from the API and am able to display that array on the client.
I want to be able to filter the data based on Enum values that are defined in the API schema. I am able to pass these values programmatically and the data is correctly updated.
I want those filters to be persistent when a user leaves the page & come back. I was originally planning to use Redux, but then I read about apollo-link-state and the ability to store local (client) state into the Apollo store, so I set out to use that instead. So far, so good.
The problem:
When I try to combine the local query and the remote query into a single one, I get the following error: networkError: TypeError: Cannot read property 'some' of undefined
My query looks like this:
const GET_COMBINED = gql`
  {
    items {
      id
      details
    }
    filters @client
  }
`
And I use it inside a component like this:
const Items = () => (
  <Query query={GET_COMBINED}>
    {({ loading, error, data: { items, filters } }) => {
      ...do stuff...
    }}
  </Query>
)

export default Items
If, however, I run the queries separately, like the following:
const GET_ITEMS = gql`
  {
    items {
      id
      details
    }
  }
`

const GET_FILTERS = gql`
  {
    filters @client
  }
`
And nest the queries inside the component:
const Items = () => (
  <Query query={GET_ITEMS}>
    {({ loading, error, data: { items } }) => {
      return (
        <Query query={GET_FILTERS}>
          {({ data: { filters } }) => {
            ...do stuff...
          }}
        </Query>
      )
    }}
  </Query>
)

export default Items
Then it works as intended!
But it seems far from optimal to nest queries like this when a single query would - in theory, at least - do the job. And I truly don't understand why the combined query won't work.
I've stripped my app to its bare bones trying to understand, but the gist of it is, whenever I try to combine fetching local & remote data into a single query, it fails miserably, while in isolation both work just fine.
Is the problem coming from SSR/Next? Am I doing it wrong? Thanks in advance for your help!
Edit 2 - additional details
The error is triggered by react-apollo's getDataFromTree; however, even when I choose to skip the query during SSR (by passing the ssr: false prop to the Query component), the combined query still fails. Besides, both the remote AND local queries work server-side when run separately. I am puzzled.
I've put together a small repo based on NextJS's with-apollo example that reproduces the problem here: https://github.com/jaxxeh/next-with-apollo-local
Once the app is running, clicking on the Posts (combined) link straight away will trigger an error, while Posts (split) link will display the data as intended.
Once the data has been loaded, the Posts (combined) page will show data, but the attempt to load extra data will trigger an error. Reloading (i.e. server-rendering) the page will also trigger an error. Checkboxes will be functional and their state preserved across the app.
The Posts (split) page will fully function as intended. You can load extra post data, reload the page and set checkboxes.
So there is clearly an issue with the combined query, be it on the server-side (error on reload) or the client-side (unable to display additional posts). Direct writes to the local state (which bypass the query altogether) do work, however.
I've removed the Apollo init code for brevity & clarity, it is available on the repo linked above. Thank you.
Add an empty object as your resolver map to the config you pass to withClientState:
const stateLink = withClientState({
  cache,
  defaults: {
    filters: ['A', 'B', 'C', 'D']
  },
  resolvers: {},
  typeDefs: `
    type Query {
      filters: [String!]!
    }
  `,
})
There's a related issue here. It would be great if the constructor threw some kind of error when the option is missing, or if the docs were clearer about it.
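For context, the stateLink is then composed into the client's link chain ahead of the HTTP link, roughly like this (apollo-link-state era setup; the endpoint is a placeholder):

import { ApolloClient } from 'apollo-client'
import { ApolloLink } from 'apollo-link'
import { HttpLink } from 'apollo-link-http'
import { InMemoryCache } from 'apollo-cache-inmemory'
import { withClientState } from 'apollo-link-state'

const cache = new InMemoryCache()

const stateLink = withClientState({
  cache,
  defaults: { filters: ['A', 'B', 'C', 'D'] },
  resolvers: {}, // the empty resolver map from the answer above
})

const client = new ApolloClient({
  cache,
  link: ApolloLink.from([stateLink, new HttpLink({ uri: '/graphql' })]),
})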
I'm using Graph.cool (GraphQL as a service) and am wondering how to do a mass update to a collection, similar to a SQL UPDATE.
In my case I need to update the suffix of a URL in the imageUrl column of my database: I need to swap {someid}_sm.jpg for {someid}_lg.jpg.
How do I do that with a GraphQL mutation? I don't want to reload the entire dataset, and I'm looking for a way that doesn't involve manually iterating through the entire list with a GraphQL client.
mutation {
  updatePost() // what goes here?
}
Migration script
The best approach is indeed to use a migration script that combines multiple mutations so only one HTTP request is sent to the GraphQL backend.
Consider this schema:
type Image {
  id: ID!
  name: String!
}
We can include the same mutation multiple times in one request with GraphQL aliases:
mutation {
  first: updateImage(id: "first-id", name: "01_lg.jpg") {
    id
    name
  }
  second: updateImage(id: "second-id", name: "02_lg.jpg") {
    id
    name
  }
}
We'll make use of this mechanism in our migration script. I'll describe it with Lokka and Node; however, you can choose whatever language and GraphQL client you prefer.
First, we query all existing images to obtain their id and name:
const queryImages = async () => {
  const result = await client.query(`{
    images: allImages {
      id
      name
    }
  }`)
  return result.images
}
Then we replace the names accordingly and construct one big request including the necessary updateImage mutations with a different GraphQL alias for each.
If your image names might contain the string sm in the {someid} part mentioned in your question, this script will break! In that case, please adjust accordingly.
const migrateImages = async (images) => {
  // beware! if your image names contain the string 'sm' elsewhere, adjust the replacement accordingly!
  const updateMutations = _.chain(images)
    .map(image => ({ id: image.id, name: image.name.replace('sm', 'lg') }))
    .map(image => `
      ${image.id}: updateImage(id: "${image.id}", name: "${image.name}") {
        id
        name
      }`)
    .value()
    .join('\n')

  const result = await client.mutate(`{
    ${updateMutations}
  }`)

  console.log(`Updated ${Object.keys(result).length} images`)
  console.log(result)
}
That's it. If you have to update thousands of images, batching the mutations in groups of, say, a hundred may be better than putting all of them in one request. Note that mutations run sequentially on the GraphQL server.
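A rough sketch of that batching idea, reusing the lodash `_` and Lokka `client` from the snippet above (the helper name and batch size are just for illustration):

const migrateImagesInBatches = async (images, batchSize = 100) => {
  for (const batch of _.chunk(images, batchSize)) {
    const updateMutations = batch
      .map(image => ({ id: image.id, name: image.name.replace('sm', 'lg') }))
      .map(image => `
        ${image.id}: updateImage(id: "${image.id}", name: "${image.name}") {
          id
          name
        }`)
      .join('\n')

    // One HTTP request per batch; batches are sent one after another.
    const result = await client.mutate(`{
      ${updateMutations}
    }`)
    console.log(`Updated ${Object.keys(result).length} images in this batch`)
  }
}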
Running the migration
Currently, I suggest the following workflow for running the migration:
Clone your project
Run the migration script on your cloned project
Verify that the migration ran successfully. Double check :)
Run the migration on your original project
You can find the code and further instructions here.
While this approach is great for migrations that are as straightforward as in your example, it's not perfect for all situations. We're already thinking about creating an integrated experience for this use case, such as an interactive migration right in your Graphcool project, with simulated migrations, checks and more. If you have suggestions, let me know in Slack.