How to form a unique constraint with multiple fields in keystonejs?
const Redemption = list({
  access: allowAll,
  fields: {
    program: relationship({ ref: 'Program', many: false }),
    type: text({ label: 'Type', validation: { isRequired: true }, isIndexed: 'unique' }),
    name: text({ label: 'name', validation: { isRequired: true }, isIndexed: 'unique' }),
  },
  // TODO: validation to check that program, type, name form a unique constraint
})
The best way I can think to do this currently is by adding another field to the list and concatenating your other values into it using a hook. This lets you enforce uniqueness across the three values (combined) at the DB level.
The list config (and hook) might look like this:
const Redemption = list({
  access: allowAll,
  fields: {
    program: relationship({ ref: 'Program', many: false }),
    type: text({ validation: { isRequired: true } }),
    name: text({ validation: { isRequired: true } }),
    compoundKey: text({
      isIndexed: 'unique',
      ui: {
        createView: { fieldMode: 'hidden' },
        itemView: { fieldMode: 'read' },
        listView: { fieldMode: 'hidden' },
      },
      graphql: { omit: ['create', 'update'] },
    }),
  },
  hooks: {
    resolveInput: async ({ item, resolvedData }) => {
      const program = resolvedData.program?.connect?.id || (item ? item.programId : 'none');
      const type = resolvedData.type || item?.type;
      const name = resolvedData.name || item?.name;
      resolvedData.compoundKey = `${program}-${type}-${name}`;
      return resolvedData;
    },
  },
});
A few things to note here:
I've removed the isIndexed: 'unique' config from the main three fields. If I understand the problem you're trying to solve correctly, you actually don't want these values (on their own) to be distinct.
I've also removed the label config from your example. The label defaults to the field key so, in your example, that config is redundant.
As you can see, I've added the compoundKey field to store our composite values:
The ui settings make the field read-only in the Admin UI
The graphql settings block writes to the field via the GraphQL API too (you could do the same thing with access control but I think just omitting the field is a bit cleaner)
And of course the unique index, which will be enforced by the DB
I've used a resolveInput hook as it lets you modify data before it's saved. To account for both create and update operations we need to consult both the resolvedData and item arguments - resolvedData gives us new/updated values (fields not being updated are undefined) and item gives us the existing values in the DB. By combining values from both we can build the correct compound key each time and add it to the returned object.
And it works! When creating a redemption we'll be prompted for the 3 main fields (the compound key is hidden), the compound key is correctly set from the values entered, and editing any of the values updates the compound key too.
Note that the compound key field is read-only for clarity.
And if we check the resultant DB structure, we can see our unique constraint being enforced:
CREATE TABLE "Redemption" (
id text PRIMARY KEY,
program text REFERENCES "Program"(id) ON DELETE SET NULL ON UPDATE CASCADE,
type text NOT NULL DEFAULT ''::text,
name text NOT NULL DEFAULT ''::text,
"compoundKey" text NOT NULL DEFAULT ''::text
);
CREATE UNIQUE INDEX "Redemption_pkey" ON "Redemption"(id text_ops);
CREATE INDEX "Redemption_program_idx" ON "Redemption"(program text_ops);
CREATE UNIQUE INDEX "Redemption_compoundKey_key" ON "Redemption"("compoundKey" text_ops);
Attempting to violate the constraint will produce an error.
If you wanted to customise this behaviour you could implement a validateInput hook and return a custom ValidationFailureError message.
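A minimal sketch of such a hook, assuming Keystone 6's hook API (the context.db lookup and the error message are illustrative, not from the original answer):

hooks: {
  // resolveInput as above...
  validateInput: async ({ resolvedData, item, context, addValidationError }) => {
    // resolveInput has already run, so resolvedData holds the new compound key
    const compoundKey = resolvedData.compoundKey ?? item?.compoundKey;
    if (!compoundKey) return;
    const existing = await context.db.Redemption.findOne({
      where: { compoundKey },
    });
    // flag a duplicate, ignoring the item currently being updated
    if (existing && existing.id !== item?.id) {
      addValidationError('A redemption with this program, type and name already exists');
    }
  },
},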
Related
I'd like to display a list of users, based on a filtered Apollo query
// pseudo query
if (user.name === 'John') return true
User names can be edited. Unfortunately, if I change a user name to James, the user is still displayed in my list (the query is set to fetch from cache first).
I tried to update this by using cache.modify:
cache.modify({
  id: cache.identify({
    __typename: 'User',
    id: userId,
  }),
  fields: {
    name: () => {
      return newName; // newName is the new input value
    },
  },
});
But I'm not quite sure this is the correct way to do so.
Of course, if I use refetchQueries: ['myUsers'], I get the correct result, but obviously, this is a bit overkill to refetch the whole list every time a name is updated.
Did I miss something?
I've inherited a project that's setting up an InMemoryCache with the following keyFields syntax. None of the examples showcase this particular signature (that I can find, at least). All the examples I see use multiple fields placed directly in the keyFields attribute. Is this looking for any nested "myField" attributes? How is this expected to appear in the GraphQL data? (Apollo Client 3.2)
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      /// query info
    },
    UserData: {
      fields: {
        fieldA: {
          merge(existing = [], incoming = []) {
            return incoming;
          },
        },
        fieldB: {
          merge(existing = [], incoming = []) {
            return incoming;
          },
        },
      },
      keyFields: [["myField"]], // <-- What is this looking for?
    },
  },
});
This leads to an invariant violation error:
Uncaught Invariant Violation: Missing field 'myField' while extracting keyFields from {"id":"462a349...... (does not contain myField)
Your code seems fine when it comes to the fields map. keyFields, on the other hand, is a slightly different question. You could totally skip setting it.
The purpose of keyFields is to uniquely identify your record, so the cache knows what to update. Just like in a relational database, a primary key consists of one or more columns that together make your record unique.
I believe this is well documented in Apollo's documentation; see:
https://www.apollographql.com/docs/react/caching/cache-configuration/#customizing-cache-ids
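For reference, the common form is a flat array of field names; a nested array is only meaningful directly after a field name, where it selects subfields of that field for the cache ID. A sketch, reusing the UserData type from the question:

const cache = new InMemoryCache({
  typePolicies: {
    UserData: {
      // the cache ID is built from UserData's own myField value,
      // e.g. UserData:{"myField":"abc"}
      keyFields: ['myField'],
      // to key on a subfield, the nested array follows its parent field,
      // e.g. keyFields: ['owner', ['id']] keys on owner.id
    },
  },
});

Either way, every UserData object returned by your queries must include the listed fields, which is why the [["myField"]] form above throws when myField is missing from the response.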
I'm using Gatsby with Netlify CMS and have some optional fields in a file collection. The problem with this is that I'm unable to retrieve these fields using GraphQL, as the field doesn't exist if it was left blank.
For example, let's say I have the following collection file:
label: "Primary Color",
name: "primary",
file: "data/palette.yaml",
widget: "object",
fields: [
{
label: "Light",
name: "light",
required: false,
widget: "string"
},
{
label: "Main",
name: "main",
required: false,
widget: "string"
},
{
label: "Dark",
name: "dark",
required: false,
widget: "string"
},
{
label: "Contrast Text",
name: "contrastText",
required: false,
widget: "string"
}
]
All fields are optional. So let's say the user only enters a value for main. This then saves the data as:
primary:
  main: '#ff0000'
light, dark and contrastText are not saved at all - they are simply left out entirely.
When I query the data in GraphQL, I obviously need to check for ALL fields since I have no idea which optional fields were filled in by the user and which were left blank. This means my query should be something like:
query MyQuery {
  paletteYaml {
    primary {
      light
      main
      dark
      contrastText
    }
  }
}
Using the above example where the user only filled in the main field, the above query will throw an error as light, dark and contrastText fields do not exist.
I am using a file collection type (as opposed to a folder collection type) for this, so I can't set a default value. It wouldn't matter if I could set a default value anyway, since GraphQL and YAML do not accept undefined as a value - they can only accept null or an empty string ("") as the best alternative.
Even if I manually save the YAML file with all field values set to null or "", this wouldn't work either, as it would cause additional issues: I am deep merging the query result with another JavaScript object.
I simply need to have GraphQL return undefined for each blank (missing) field instead of throwing an error, or not return the blank/missing fields at all.
This seems like a common issue (handling optional fields in Netlify CMS) but there is nothing in the documentation about it. How do people handle this issue?
I want to create a new GraphQL API and I have an issue that I am struggling to fix.
The code is open source and can be found at: https://github.com/glitr-io/glitr-api
I want to create a mutation to create a record with relations... it seems the record is created correctly with all the expected relations (when checking directly in the database), but the value returned by the create<YourTableName> method is missing all the relations.
So I get an error on the API because "Cannot return null for non-nullable field Meme.author.". I am unable to figure out what could be wrong in my code.
The resolver looks like the following:
...
const newMeme = await ctx.prisma.createMeme({
  author: {
    connect: { id: userId },
  },
  memeItems: {
    create: memeItems.map(({
      type,
      meta,
      value,
      style,
      tags = []
    }) => ({
      type,
      meta,
      value,
      style,
      tags: {
        create: tags.map(({ name = '' }) => (
          {
            name
          }
        ))
      }
    }))
  },
  tags: {
    create: tags.map(({ name = '' }) => (
      {
        name
      }
    ))
  }
});
console.log('newMeme', newMeme);
...
The value of newMeme in the console.log here (which is what is returned by this resolver) is:
newMeme {
  id: 'ck351j0f9pqa90919f52fx67w',
  createdAt: '2019-11-18T23:08:46.437Z',
  updatedAt: '2019-11-18T23:08:46.437Z',
}
Those fields are just the auto-generated fields, so I get an error for the following mutation because I tried to get the author:
mutation {
  meme(
    memeItems: [{
      type: TEXT
      meta: "test1-meta"
      value: "test1-value"
      style: "test1-style"
    }, {
      type: TEXT
      meta: "test2-meta"
      value: "test2-value"
      style: "test2-style"
    }]
  ) {
    id,
    author {
      displayName
    }
  }
}
Can anyone see what issue could be causing this?
(As previously mentioned, the record is created successfully with all relationships as expected when checking directly in the database.)
As described in the Prisma docs, the promise returned by the Prisma client's write functions, e.g. the createMeme function, only resolves with the scalar fields of the object:
When creating new records in the database, the create-method takes one input object which wraps all the scalar fields of the record to be created. It also provides a way to create relational data for the model; this can be supplied using nested object writes.
Each method call returns a Promise for an object that contains all the scalar fields of the model that was just created.
See: https://www.prisma.io/docs/prisma-client/basic-data-access/writing-data-JAVASCRIPT-rsc6/#creating-records
To also return the relations of the object you need to read the object again using an info fragment or the fluent api, see: https://www.prisma.io/docs/prisma-client/basic-data-access/reading-data-JAVASCRIPT-rsc2/#relations
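In this resolver that could look something like the following (a sketch, assuming the Prisma 1 client from the linked docs; the fragment fields mirror the question's schema):

// re-read the created meme together with its relations via an info fragment
const memeWithRelations = await ctx.prisma.meme({ id: newMeme.id }).$fragment(`
  fragment MemeWithRelations on Meme {
    id
    createdAt
    updatedAt
    author { id displayName }
    memeItems { id type meta value style }
    tags { id name }
  }
`);
return memeWithRelations;

// or, with the fluent API, fetch a single relation:
// const author = await ctx.prisma.meme({ id: newMeme.id }).author();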
We are in a situation where the response of our GraphQL query has to return some dynamic properties of an object. In our case we are not able to predefine all possible properties, so it has to be dynamic.
As we see it, there are two options to solve this.
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      /*
        THIS is our special field which needs to return a dynamic object
      */
    },
    // ...
  },
});
As you can see in the example code, elements is the property which has to return a dynamic object. A resolved response could look like this:
{
  name: 'some name',
  elements: {
    an_unknown_key: {
      some_nested_field: {
        some_other: true,
      },
    },
    another_unknown_prop: 'foo',
  },
}
1) Return a "Any-Object"
We could just return any object - so GraphQL do not need to know which fields the Object has. When we tell GraphQL that the field is the type GraphQlObjectType it needs to define fields. Because of this it seems not to be possible to tell GraphQL that someone is just an Object.
Fo this we have changed it like this:
elements: {
  type: new GraphQLObjectType({ name: 'elements' }),
},
2) Define dynamic field properties via a function
When we define fields as a function we can build our field object dynamically. But the fields function would need some information (in our case, the information which would be passed to elements) and we would need to access it to build the field object.
Example:
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      type: new GraphQLObjectType({
        name: 'elements',
        fields: (argsFromElements) => {
          // here we can now access keys from "args"
          const fields = {};
          argsFromElements.keys.forEach((key) => {
            // some logic here ..
            fields[someGeneratedProperty] = someGeneratedGraphQLType;
          });
          return fields;
        },
      }),
      args: {
        keys: {
          type: new GraphQLList(GraphQLString),
        },
      },
    },
    // ...
  },
});
This could work, but the question is whether there is a way to pass the args and/or the resolved object through to the fields function.
Question
So our question now is: which way would be recommended in our case with GraphQL, and is solution 1 or 2 possible? Maybe there is another solution?
Edit
Solution 1 would work when using the ScalarType. Example:
type: new GraphQLScalarType({
  name: 'elements',
  serialize(value) {
    return value;
  },
}),
I am not sure if this is a recommended way to solve our situation.
Neither option is really viable:
GraphQL is strongly typed. GraphQL.js doesn't support some kind of "any" field, and all object types defined in your schema must have fields defined. If you look in the docs, fields is required -- if you try to leave it out, you'll hit an error.
Args are used to resolve queries on a per-request basis. There's no way to pass them back to your schema. Your schema is supposed to be static.
As you suggest, it's possible to accomplish what you're trying to do by rolling your own custom Scalar. I think a simpler solution would be to just use JSON -- you can import a custom scalar for it like this one. Then just have your elements field resolve to a JSON object or array containing the dynamic fields. You could also manipulate the JSON object inside the resolver based on arguments if necessary (if you wanted to limit the fields returned to a subset defined in the args, for example).
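A minimal sketch of the JSON route, assuming the graphql-type-json package (the field names mirror the question's example):

const GraphQLJSON = require('graphql-type-json');
const { GraphQLObjectType, GraphQLString } = require('graphql');

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      // the JSON scalar passes the nested object through untyped
      type: GraphQLJSON,
      // the resolver can still shape the blob per request, e.g. pick keys from args
      resolve: (source) => source.elements,
    },
  },
});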
Word of warning: The issue with utilizing JSON, or any custom scalar that includes nested data, is that you're limiting the client's flexibility in requesting what it actually needs. It also results in less helpful errors on the client side -- I'd much rather be told that the field I requested doesn't exist or returned null when I make the request than to find out later down the line that the JSON blob I got didn't include a field I expected it to.
One more possible solution could be to declare any such dynamic object as a string. Then pass a stringified version of the object as the value from your resolver functions, and finally parse that string back to JSON on the client side to make it an object again.
I'm not sure if it's a recommended way or not, but I tried this approach and it worked smoothly, so I'm sharing it here.
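A sketch of what that looks like on both ends (a hypothetical elements field, mirroring the question's example):

// server side: declare the dynamic field as a plain string and
// serialize the object in the resolver
elements: {
  type: GraphQLString,
  resolve: (source) => JSON.stringify(source.elements),
},

// client side: parse the string back into an object
const elements = JSON.parse(data.someType.elements);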