I am looking to specify a required combination of parameters in a GraphQL query.
The query should be valid either with no params at all (returning all cats) or when filtering by both size AND species.
extend type Query {
cats(size: String, species: String): [Cat]
}
Is the only way to do this via the resolver (throw an error if only one arg is passed), or is there a neater way?
I don't believe this is supported by the spec. You could define a new input type and use that instead, though:
input CatFilter {
size: String!
species: String!
}
extend type Query {
cats(filter: CatFilter): [Cat]
}
That way the parameter is optional, but if given, both properties are required.
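For reference, here is a minimal resolver sketch under that schema; the data-layer calls getAllCats and getCatsBySizeAndSpecies are hypothetical placeholders:
const resolvers = {
  Query: {
    // `filter` is either undefined (return everything) or an object whose
    // `size` and `species` are both guaranteed non-null by the schema.
    cats: (parent, { filter }) => {
      if (!filter) {
        return getAllCats(); // hypothetical data-layer call
      }
      return getCatsBySizeAndSpecies(filter.size, filter.species);
    },
  },
};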
I have a schema that looks like this.
exports.typeDefs = gql`
type User {
userid: ID!
name: String
}
type Post {
post_id: ID!
post_category: String!
post_type: String!
post_hashtag: String
user: User
}
`;
The Post type has a field named "post_hashtag". I want to define another schema type that collects the post_hashtag property from every node of the Post type.
I tried the hashtag type below and attached a Cypher query to it.
type hashtag {
  post_hashtag: String
    @cypher(statement: "MATCH (n:Test_Temp) RETURN n.post_hashtag")
}
But it returns only the first hashtag it finds and stores that on the hashtag node. This is not what I want: I want all the hashtags that exist on any Post node.
Example: If I query
query{
hashtag{
post_hashtag
}
}
This should give all the hashtags available on any of the Post nodes, but instead it returns only one hashtag.
I've been trying this for a few days, going through different solutions, but none have worked.
Any suggestions, please?
I've figured out the problem. We just need to use Neo4j's collect() function to return an array of hashtags and store it on the hashtag node. That way all the hashtags are saved on the node and we can retrieve them later.
First, change the schema a little, like this:
type hashtag {
  post_hashtag: [String]
    @cypher(statement: "MATCH (n:Test_Temp) RETURN n.post_hashtag")
}
Notice that I've changed the type of post_hashtag to an array of strings.
Then the Cypher statement inside the directive changes to the one below:
MATCH (n:Test_Temp) RETURN collect(n.post_hashtag)
And it is done.
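As a side note, if you want to see what the collect() query returns outside of the directive, here is a minimal sketch using the official neo4j-driver (the connection details are placeholders):
const neo4j = require('neo4j-driver');

const driver = neo4j.driver('bolt://localhost:7687', neo4j.auth.basic('neo4j', 'password'));

async function fetchHashtags() {
  const session = driver.session();
  try {
    const result = await session.run(
      'MATCH (n:Test_Temp) RETURN collect(n.post_hashtag) AS hashtags'
    );
    // collect() aggregates every n.post_hashtag into a single list,
    // so the result is one record containing one array.
    return result.records[0].get('hashtags');
  } finally {
    await session.close();
  }
}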
I have an input type in my schema that specifies lots of attributes, as it's intended to do. The issue is that what I'm sending to the mutation that will persist these objects is an object with arbitrary fields that may change. As it stands, if I send attributes not specified in the schema, I get the error:
Validation error of type WrongType: argument 'input' with value (...) contains a field not in 'BotInput': 'ext_gps' @ 'setBot'
Concretely, my input type did not specify the attribute ext_gps, and that field was provided.
My Question
Is there a way to make the input validation simply ignore any attributes not in the schema, so that the mutation still runs with whatever was specified in the schema? It will often be the case that I don't want to persist the additional attributes, so dropping them is fine, as long as the other attributes get added.
GraphQL does not support arbitrary fields; there is an RFC to support a Map type, but it has not been merged/approved into the specification.
I see two possible workarounds, both of which require changing your schema a little bit.
Say you have the following schema:
type Mutation {
saveBot(input: BotInput) : Boolean
}
input BotInput {
id: ID!
title: String
}
and the input object is:
{
"id": "123",
"title": "GoogleBot",
"unrelated": "field",
"ext_gps": "else"
}
Option 1: Pass the arbitrary fields as AWSJSON
You would change your schema to:
type Mutation {
saveBot(input: BotInput) : Boolean
}
input BotInput {
id: ID!
title: String
# This will contain all the arbitrary fields as a JSON string, provided your clients can pluck them from the original object and serialize them to JSON.
arbitraryFields: AWSJSON
}
So the input in our example would be now:
{
"id": "123",
"title": "GoogleBot",
"arbitraryFields": "{\"unrelated\": \"field\", \"ext_gps\": \"else\"}"
}
In your resolver, you could take the arbitraryFields string, deserialize it, and hydrate the values on the BotInput object before passing it to the data source.
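As a rough sketch of that hydration step, assuming a direct Lambda resolver behind the mutation (with a VTL or pipeline resolver the same logic would live in the mapping template), and with saveToDataSource as a hypothetical persistence call:
exports.handler = async (event) => {
  const { id, title, arbitraryFields } = event.arguments.input;

  // Deserialize the AWSJSON string and merge the extra fields back in.
  const extras = arbitraryFields ? JSON.parse(arbitraryFields) : {};
  const bot = { id, title, ...extras };

  await saveToDataSource(bot); // hypothetical persistence call
  return true;
};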
Option 2: Pass the input as AWSJSON
The principle is the same but you pass the entire BotInput as AWSJSON.
type Mutation {
saveBot(input: AWSJSON) : Boolean
}
You don't have to do the resolver hydration and you don't have to change your client, but you lose the GraphQL type validation as the whole BotInput is now a blob.
I have this schema on my graphcool:
type User @model {
  id: ID! @isUnique
  name: String!
  email: String!
  password: String!
}
Using playground, I can execute this properly:
query {
User(id: "1234") {
id
name
}
}
But this query:
query {
User(name: "Thomas") {
id
name
}
}
throws an error:
Unknown argument 'name' on field 'User' of type 'Query'. (line 2, column 8):
User(name: "Thomas")
Why? And how do I fix this? From my point of view, anything that's already on the model can be queried immediately, right? By the way, I'm very new to GraphQL, and there are almost no articles that talk about this error (every tutorial just assumes it will work immediately), so please give a more elaborate answer if necessary.
GraphQL does not intrinsically allow arbitrary queries against objects.
Somewhere in your schema there will be an additional declaration like
type Query {
User(id: ID!): User
}
The names in the Query type are the top-level queries you can run, and the arguments listed in that query are the only arguments they accept. (There is a corresponding Mutation type for top-level mutations, which can change the underlying state, and use the mutation keyword in a query.)
If you control the server implementation, you could add a parameter or an additional top-level query
userByName(name: String!): User
but you'd also have to provide an implementation of this query or handle the additional parameter, which is a code change.
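The question is about Graphcool's generated API, but if you were running your own server, a resolver for such an extra query might look roughly like this; the db.users.findByName call is a placeholder for whatever data layer you use:
const resolvers = {
  Query: {
    // Backs the extra top-level query `userByName(name: String!): User`.
    userByName: (parent, { name }, context) => {
      return context.db.users.findByName(name); // hypothetical data-layer call
    },
  },
};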
When writing queries I can define a resolver on any field and that field’s value will be determined by its resolver, regardless of query depth.
However, when writing a mutation I seem to only be able to define resolvers at the root level. Adding a resolve method to fields in my args or input type does not seem to have any effect.
What’s the best way deal with nested input in mutations?
What do you mean by nested input in your mutations? GraphQL input types do not have resolvers; resolvers only determine how to fetch results. If you would like to have nested input, say I want to create a user together with a company, I would then define CreateUserInput and CreateCompanyInput types, for example like this in SDL:
input CreateCompanyInput {
name: String!
type: CompanyEnum!
}
input CreateUserInput {
username: String!
firstname: String!
lastname: String!
company: CreateCompanyInput!
}
type Mutation {
createUser(input: CreateUserInput!): User
}
This way I am basically nesting arguments and can implement more complex mutations. In addition, I can reuse CreateCompanyInput for a createCompany mutation if I ever need one. The whole CreateUserInput, including the nested CreateCompanyInput, then arrives in the createUser resolver as the input argument, and I can wrap everything in a transaction since I'm creating two new records. I'm not sure if this is what you mean by nested input; if you mean something else, just let me know :)
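A minimal createUser resolver sketch along those lines, where the db/transaction calls are placeholders for your actual data layer:
const resolvers = {
  Mutation: {
    // The nested CreateCompanyInput arrives as a plain object on `input.company`.
    createUser: async (parent, { input }, context) => {
      const { company, ...userFields } = input;
      // Hypothetical data layer: create both records inside one transaction.
      return context.db.transaction(async (tx) => {
        const createdCompany = await tx.companies.create(company);
        return tx.users.create({ ...userFields, companyId: createdCompany.id });
      });
    },
  },
};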
I've recently started researching the possibility of using GraphQL for requesting dynamic data configurations. The very first thing that jumps out at me is GraphQL's strongly typed nature.
Is there a way for GraphQL schemas to handle arrays of mixed type objects? I would greatly appreciate either an explanation or possibly a reference I can read over.
I am currently working with GraphQL with Node.js but a later implementation will be out of a Java Container. All data will be JSON pulled from MongoDB.
You either have to make these disparate types implement the same interface, make your resolvers return unions, or create a custom scalar to hold the dynamic data.
The cleanest approach is the first one: if your resulting objects can be of a limited number of types, define the types so that they implement the same interface, and type your resolvers by the interface. This allows the client to conditionally select sub-fields based on the actual type, and you maintain type safety.
The second approach has similar limitations: you need to know the possible types ahead of time, but they do not have to implement the same interface. It is preferable when the possible values are unrelated to each other and have either/or semantics, like success/failure.
The custom scalar approach is the only one in which you do not need to know the possible types of the result, i.e. the structure of the result can be completely dynamic. Here's an implementation of that approach, known as JSON scalar (i.e. cram any JSON-serializable structure into a scalar value). The big downside of this approach is that it makes sub-selection impossible, as the entire value becomes one big scalar (even though it's a complex object).
Since the question is asking about an array of objects of unknown types, I'll point out that you can, of course, have a list of all the options above.
Examples:
#Interface for any search result
interface SearchResult {
title: String!
url: String!
}
#A specific kind of search result
type Book implements SearchResult {
title: String!
url: String!
author: Author!
isbn: String!
}
type Article implements SearchResult {
title: String!
url: String!
categories: [Category]!
}
type Query {
#Search can return a mix of Books and Articles
search(keyword: String!): [SearchResult!]
}
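When resolvers return interface or union types, the server also needs to know which concrete type each item is. With graphql-js / Apollo Server that is usually a __resolveType function; a minimal sketch for the interface example above (the isbn check and searchBackend call are just illustrative placeholders):
const resolvers = {
  SearchResult: {
    // Decide the concrete type for each object returned by `search`.
    __resolveType(obj) {
      return obj.isbn ? 'Book' : 'Article';
    },
  },
  Query: {
    search: (parent, { keyword }) => searchBackend(keyword), // hypothetical search call
  },
};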
Or
#No interface this time
type Book {
name: String! #No common fields with Article
author: Author!
publisher: Publisher!
}
type Article {
title: String!
url: String!
categories: [Category]!
}
union SearchResult = Book | Article
type Query {
#Search can return a mix of Books and Articles
search(keyword: String!): [SearchResult!]
}
Or
scalar JSON
type Query {
#Search can return anything at all... All bets are off
search(keyword: String!): [JSON!]
}
If the data is completely JSON and you would rather preserve it as-is, check out the JSON scalar type. Basically:
import { GraphQLObjectType } from 'graphql';
import GraphQLJSON from 'graphql-type-json';
export default new GraphQLObjectType({
name: 'MyType',
fields: {
myField: { type: GraphQLJSON },
},
});
I think it's possible to make a custom/generic type that will fit the need.
That way it's still a strongly typed array, but the type will be flexible enough to hold what you need.
Here is an example with custom types:
https://github.com/stylesuxx/graphql-custom-types