Apollo-based server giving error code BAD_USER_INPUT

I'm using the Vendure headless e-commerce framework and trying to run a GraphQL mutation according to the specs, but I'm running into an error I cannot seem to decipher.
The error I'm getting is:
{
  "errors": [{
    "extensions": {
      "code": "BAD_USER_INPUT"
    },
    "locations": [{
      "line": 1,
      "column": 28
    }]
  }]
}
This is in turn pointing to the $input variable.
query: "mutation AddPaymentToOrder($input: PaymentInput!) {\n ... etc ...
Usually when I've gotten this error code it means I've made a mistake in the mutation, but in those cases I get a clear message about what is wrong. With this mutation I get nothing.
The Vendure docs on this mutation are quite simple: https://www.vendure.io/docs/graphql-api/shop/mutations/#addpaymenttoorder
The mutation I've written is quite simple as well for now:
export const AddPaymentToOrderQuery = gql`
  mutation AddPaymentToOrder($input: PaymentInput!) {
    addPaymentToOrder(input: $input) {
      ... on Order {
        id
      }
    }
  }
`;
The data I send looks like this:
type Variables = {
  input: {
    method: string;
  }
};
Does anyone understand this error?

The problem was a misunderstanding on my part: the input was missing the metadata element, which I had forgotten.
Input should look like:
type Variables = {
  input: {
    method: string;
    metadata: Object;
  }
};
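For completeness, here is a minimal sketch of how the corrected call might look from Apollo Client. The method code and the empty metadata object are placeholder values, and the /shop-api endpoint is an assumption based on Vendure's defaults:
import { ApolloClient, InMemoryCache, gql } from '@apollo/client';

const AddPaymentToOrderQuery = gql`
  mutation AddPaymentToOrder($input: PaymentInput!) {
    addPaymentToOrder(input: $input) {
      ... on Order {
        id
      }
    }
  }
`;

// '/shop-api' is Vendure's default Shop API path; adjust to your setup.
const client = new ApolloClient({ uri: '/shop-api', cache: new InMemoryCache() });

client.mutate({
  mutation: AddPaymentToOrderQuery,
  variables: {
    input: {
      method: 'example-payment-method', // placeholder payment method code
      metadata: {},                     // the previously missing field; {} is enough to satisfy the type
    },
  },
});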

Related

Apollo mixes two different arrays of the same query seemingly at random

With a schema like
schema {
  query: QueryRoot
}
scalar MyBigUint
type Order {
  id: Int!
  data: OrderCommons!
  kind: OrderType!
}
type OrderBook {
  bids(limit: Int): [Order!]!
  asks(limit: Int): [Order!]!
}
type OrderCommons {
  quantity: Int!
  price: MyBigUint! # it doesn't matter whether it's MyBigUint or a simple Int; the issue occurs either way
}
enum OrderType {
  BUY
  SELL
}
type QueryRoot {
  orderbook: OrderBook!
}
And a query: query { orderbook { bids { data { price } }, asks { data { price } } } }
In a GraphQL playground for my GraphQL API (and at the network level of my Apollo app too) I receive a result like
{
  "data": {
    "orderbook": {
      "bids": [
        {
          "data": {
            "price": "127"
          }
        },
        {
          "data": {
            "price": "74"
          }
        },
        ...
      ],
      "asks": [
        {
          "data": {
            "price": "181"
          }
        },
        {
          "data": {
            "price": "187"
          }
        },
        ...
      ]
    }
  }
}
where, for the purpose of this question, the bids are ordered in descending order by price like ["127", "74", "73", "72"], etc, and asks are ordered in ascending order, accordingly.
However, in Apollo, after a query is done, I notice that one of the arrays gets seemingly random data.
For the purpose of the question, the useQuery React hook is used, but the same happens when I query imperatively from a freshly initialized ApolloClient.
const { data, subscribeToMore, ...rest } = useQuery<OrderbookResponse>(GET_ORDERBOOK_QUERY);
console.log(data?.orderbook?.bids?.map(r => r.data.price));
console.log(data?.orderbook?.asks?.map(r => r.data.price));
Here, corrupted Bids data gets printed, i.e. ['304', '306', '298', '309', '277', '153', '117', '108', '87', '76'] (notice the order being wrong, at the least), whereas the Asks data looks just fine. Inspecting the network, I find that the Bids are not only properly ordered there, but also have different (correct, from the DB) values!
Therefore, it seems something's getting corrupted on the way while Apollo delivers the data.
What could be the issue here, and where should I start debugging such an issue? There seem to be no warnings from Apollo either; it appears to just silently corrupt the data.
I'm clearly doing something wrong, but what?
The issue seems to stem from how Apollo caches data.
My Bids and Asks can have the same numeric IDs while sharing the same Order GraphQL type. Apollo therefore assumes that a Bid and an Ask with the same ID are the same thing, and the resulting data gets wrecked as a consequence.
An easy fix is to tell Apollo, on cache initialization, that the Order type has a composite key:
cache: new InMemoryCache({
  typePolicies: {
    Order: {
      keyFields: ['id', 'kind'],
    }
  }
})
This way it will understand that an Ask and a Bid Order entity with the same ID are indeed different pieces of data.
Note that the kind field (along with id) must also be added to the query strings accordingly, as sketched below.
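For illustration, the order book query might then look like this (a sketch assuming the GET_ORDERBOOK_QUERY document used with useQuery above):
import { gql } from '@apollo/client';

// Selecting both id and kind lets InMemoryCache build the
// composite cache key ['id', 'kind'] for every Order entity.
const GET_ORDERBOOK_QUERY = gql`
  query {
    orderbook {
      bids { id kind data { price } }
      asks { id kind data { price } }
    }
  }
`;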

Unable to filter custom data in siteMetaData in Gatsby using GraphQL in GraphiQL

I've created a basic Gatsby site with the default starter. I'm now trying to add some custom data (the people array) to gatsby-config.js like so:
module.exports = {
  siteMetadata: {
    title: `Gatsby Default Starter`,
    description: `XXX`,
    author: `#gatsbyjs`,
    people: [
      { id: 1234, name: "Bill Smith", sales: 143, birthdate: "2233-03-22" },
      { id: 5678, name: "Roger Miller", sales: 281, birthdate: "2230-01-06" }
    ]
  },
  plugins: [
    `gatsby-plugin-react-helmet`,
    {
      resolve: `gatsby-source-filesystem`,
      options: { name: `images`, path: `${__dirname}/src/images` }
    },
    `gatsby-transformer-sharp`,
    `gatsby-plugin-sharp`,
    {
      resolve: `gatsby-plugin-manifest`,
      options: {
        name: `gatsby-starter-default`,
        short_name: `starter`,
        start_url: `/`,
        background_color: `#663399`,
        theme_color: `#663399`,
        display: `minimal-ui`,
        icon: `src/images/gatsby-icon.png`
      }
    }
  ]
}
Then, in GraphiQL, what I'm trying to do is a query to get a list of people, but limited to just those with sales above 200; that's my end goal. So first, I did a basic test:
{
  site {
    siteMetadata {
      people {
        name
      }
    }
  }
}
That works, I get all people back. Then, I tried:
{
  site {
    siteMetadata {
      people(sales: { gt: 200 }) {
        name
      }
    }
  }
}
That gets me an error "Unknown argument sales on field people of type SiteSiteMetadata". That kinda seems to be telling me that Sift underneath Gatsby doesn't have any notion of my custom fields in its schema, which would kind of make sense to me. So, as a test, I try this:
{
  site {
    siteMetadata(author: { eq: "none" }) {
      author
      title
    }
  }
}
My expectation is that the query runs successfully but returns an empty result set, since the author element's value isn't "none". But instead I get the same basic error, now telling me "Unknown argument author on field siteMetadata of type Site", and now I'm confused, because it seems like it should know about THOSE fields even if it doesn't know about the arbitrary ones I add. Then again, maybe that query won't ever work because there's only a single siteMetadata object versus trying to query an array. I'm not sure.
So then I do some research and I see some reference to 'filter', so I try this now:
{
  site {
    siteMetadata(filter: { eq: "none" }) {
      author
      title
    }
  }
}
That gets me "Unknown argument filter on field siteMetadata of type Site."
And now I'm kind of out of ideas.
I did find one post that seemed to imply you can't query custom data elements like this at all, but some replies seem to imply you in fact can (and clearly that first test worked, so the data is found; I just can't get the filtering to work). Maybe I'm using the wrong syntax, but if so, I can't seem to find what the correct syntax looks like (and what's worse, the ONE example in the Gatsby docs that MIGHT provide an answer errors out in the playground, so I can't see the code).
It seems like such a simple thing, but I'm at a loss. Any help would be greatly appreciated. Thanks!
EDIT: After I wrote this, I tried putting the data in a separate file that gets loaded with the gatsby-transformer-json plugin, and tried to query that. The data gets loaded, but I still can't filter the query. I can do:
{
  testData {
    people {
      name
      sales
    }
  }
}
...and that works, returns my data fine. But if I try:
{
  testData {
    people(sales: { gt: 200 }) {
      name
      sales
    }
  }
}
...or...
{
  testData {
    people(filter: { sales: { gt: 200 } }) {
      name
      sales
    }
  }
}
...I get the same types of errors. So, I think that at least proves this isn't an issue of querying from siteMetadata specifically, but I still don't know how to make it do what I want.
For anyone who wants to reproduce this, just add the file data.json in the root of the project with this content:
{
  "people": [
    { "id": 1234, "name": "Bill Smith", "sales": 143, "birthdate": "2233-03-22" },
    { "id": 5678, "name": "Roger Miller", "sales": 281, "birthdate": "2230-01-06" }
  ]
}
Then, add this to the plugins array in gatsby-config.js:
{
  resolve: `gatsby-transformer-json`,
  options: { typeName: `testData` }
},
{
  resolve: `gatsby-source-filesystem`,
  options: { name: `data`, path: `${__dirname}/data.json` }
}
No other changes from the initially-generated project are needed. Then, just hop into GraphiQL and try to execute the queries above.
Or, to make things easier, I've created a codesandbox instance that demonstrates this:
https://codesandbox.io/s/gatsby-graphql-querying-json-issue-47km4
EDIT2: I had the thought that maybe this is an issue with GraphiQL itself. So, I created a simple component:
import React from "react"
import { useStaticQuery, graphql } from "gatsby"

const Test = () => {
  const testData = useStaticQuery(graphql`
    query TestDateQuery {
      testData(filter: { sales: { gte: 200 } }) {
        people {
          name
        }
      }
    }
  `)
  console.log("testData", testData)
  return <div />
}

export default Test
I then dropped that into my main Layout component. I get the same sort of error (about filter being an unknown argument) rather than seeing my data. If I remove the filter, I DO see ALL the data. So, again, I can't figure out why just filter isn't working, but that's what I've narrowed it down to.

Apollo - GraphQLError: Syntax Error: Expected $, found Name

I'm using GraphQL in the UI/playground with the syntax below, and it works pretty well.
mutation {
  createUnit(unit: { name: "hehehe great!!!" }) {
    id
    name
  }
}
My question is, how can I use this mutation with Apollo Client? I found something like this:
mutation createUnit(unit: {name: String!}) {
  createUnit(unit: {name: "looks great!!!"}) {
    id
    name
  }
}
But unfortunately I'm getting the error:
GraphQLError: Syntax Error: Expected $, found Name "unit"
Make sure you have defined the type Unit in your schema and registered createUnit as a mutation in your resolvers:
module.exports.Mutation = {
  createUnit
}
To use it in the GraphQL playground, you just have to declare the variables you are going to use in the mutation, like this:
mutation($unit: Unit) {
  createUnit(unit: $unit) {
    id
    name
  }
}
and in the variables pane add:
{
  "unit": {
    "name": "looks great!!!"
  }
}
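Since the question asked about Apollo Client specifically, here is a minimal sketch of sending the same mutation through the client (assuming @apollo/client; the Unit variable type and the /graphql endpoint must match whatever your schema and server actually declare):
import { ApolloClient, InMemoryCache, gql } from '@apollo/client';

const CREATE_UNIT = gql`
  mutation CreateUnit($unit: Unit) {
    createUnit(unit: $unit) {
      id
      name
    }
  }
`;

const client = new ApolloClient({ uri: '/graphql', cache: new InMemoryCache() });

// Passing the value as a variable, instead of inlining an object literal in
// the operation signature, avoids the "Expected $, found Name" syntax error.
client.mutate({
  mutation: CREATE_UNIT,
  variables: { unit: { name: 'looks great!!!' } },
});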

Apollo Array of Custom Input types as Mutation argument throws "__typename": Unknown field error

I do not quite understand what the error even is, which makes it hard to tackle the problem. Checking the server console doesn't show any descriptive error either. I have added all the code related to this issue below.
Here is the mutation:
mutation SaveTrials($event: ID!, $input: [ResultTrialsInputType!]!) {
  saveTrials(event: $event, results: $input) {
    results {
      id
      trials
    }
  }
}
I am using Graphene (Python) on the backend, but the types correspond to the following:
input ResultTrialsInputType {
  id: ID
  person: ID!
  trials: [String]
}
Here is the Python code if it matters:
class ResultTrialsInputType(graphene.InputObjectType):
    id = graphene.ID()
    person = graphene.ID(required=True)
    trials = graphene.List(graphene.String)
When I send data from Apollo using the mutation above, this is what is sent to the API:
{
  "operationName": "SaveTrials",
  "variables": {
    "event": "207e9f27-be66-4564-9c28-ac92ec44235d",
    "input": [
      {
        "id": "8eb80b8b-c93a-44b1-9624-e75436c13780",
        "trials": [
          "32.1",
          "92.2",
          "12.1",
          "12.2",
          "23.2",
          ""
        ],
        "__typename": "ResultTrialsObjectType",
        "person": "a6f18ab5-df23-421e-b916-73e569bf73ad"
      }
    ]
  },
  "query": "mutation SaveTrials($event: ID!, $input: [ResultTrialsInputType!]!) {\n saveTrials(event: $event, results: $input) {\n results {\n id\n trials\n __typename\n }\n __typename\n }\n}\n"
}
The response to this query is an error about "__typename":
{
  "errors": [
    {
      "message": "Variable \"$input\" got invalid value [{\"__typename\": \"ResultTrialsObjectType\", \"person\": \"a6f18ab5-df23-421e-b916-73e569bf73ad\", \"id\": \"8eb80b8b-c93a-44b1-9624-e75436c13780\", \"trials\": [\"32.1\", \"92.2\", \"12.1\", \"12.2\", \"23.2\", \"\"]}].\nIn element #0: In field \"__typename\": Unknown field.",
      "locations": [
        {
          "line": 1,
          "column": 34
        }
      ]
    }
  ]
}
Anywhere else in my application, where an input argument is not an array of custom objects, this works as expected. What is the deal here? Am I setting up my input arguments in the wrong way? Or am I missing something?
I tried to add __typename manually to the input type; however, nothing happened.
Thanks!
EDIT: Now that I look at this again, for some reason __typename is ResultTrialsObjectType, but it should be ResultTrialsInputType. How is this value generated? Does Apollo generate it, or does the server generate it and Apollo fetch it?
Your schema specifies that ResultTrialsInputType has three fields: id, person, trials. __typename is a special meta-field that signifies the type of an object -- it should not be added to the schema. In fact, any names that start with two underscores are reserved and should not be used for field names.
As the error indicates, the issue is that __typename is not a field that's specified for ResultTrialsInputType, but you're sending it anyway.
Apollo will automatically attach __typename to any selection sets in your request (not inputs or variable values). So a query like this:
query {
  foo {
    bar
  }
}
becomes:
query {
  foo {
    bar
    __typename
  }
}
Apollo needs the __typename for every Object returned in your response in order to effectively cache the response. However, this means any time you are working with a data object returned by Apollo, it will have __typename properties throughout its structure.
What this boils down to is that, generally speaking, you cannot and should not make a query, mutate the response, and then turn around and use it as an input to another query or mutation, at least not without stripping the __typename fields first, as sketched below.
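A minimal sketch of such a cleanup step (the omitTypename helper is illustrative, not part of Apollo's API):
// Recursively drops every __typename property from a response object so the
// data can be reused as an input value. Illustrative helper, not an Apollo API.
function omitTypename(value) {
  return JSON.parse(
    JSON.stringify(value),
    (key, val) => (key === '__typename' ? undefined : val)
  );
}

// Usage: clean fetched results before sending them back as $input
// (data.results is a hypothetical response object from an earlier query).
// const input = omitTypename(data.results);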

How to properly format data with AppSync and DynamoDB when Lambda is in between

Receiving data with AppSync directly from DynamoDB works in my case, but when I try to put a Lambda function in between, I receive an error that says "Can't resolve value (/issueNewMasterCard/masterCards): type mismatch error, expected type LIST".
Looking at the AppSync CloudWatch response mapping output, I get this:
"context": {
"arguments": {
"userId": "18e946df-d3de-49a8-98b3-8b6d74dfd652"
},
"result": {
"Item": {
"masterCards": {
"L": [
{
"M": {
"cardId": {
"S": "95d67f80-b486-11e8-ba85-c3623f6847af"
},
"cardImage": {
"S": "https://s3.eu-central-1.amazonaws.com/logo.png"
},
"cardWallet": {
"S": "0xFDB17d12057b6Fe8c8c434653456435634565"
},...............
Here is how I configured my response mapping template:
$utils.toJson($context.result.Item)
I'm doing this mutation:
mutation IssueNewMasterCard {
  issueNewMasterCard(userId: "18e946df-d3de-49a8-98b3-8b6d74dfd652") {
    masterCards {
      cardId
    }
  }
}
and this is my schema:
type User {
  userId: ID!
  masterCards: [MasterCard]
}
type MasterCard {
  cardId: String
}
type Mutation {
  issueNewMasterCard(userId: ID!): User
}
The Lambda function:
// Imports added for completeness; the original snippet assumed an initialized client.
const AWS = require('aws-sdk');
const dynamoDB = new AWS.DynamoDB();

exports.handler = (event, context, callback) => {
  const userId = event.arguments.userId;
  const userParam = {
    Key: {
      "userId": { S: userId }
    },
    TableName: "FidelityCardsUsers"
  };
  dynamoDB.getItem(userParam, function(err, data) {
    if (err) {
      console.log('error from DynamoDB: ', err);
      callback(err);
    } else {
      console.log('mastercards: ', JSON.stringify(data));
      callback(null, data);
    }
  });
};
I think the problem is that the getItem you use with the DynamoDB data source is not the same as the DynamoDB.getItem function in the aws-sdk.
Specifically, it seems the data-source version returns an already unmarshalled response (that is, instead of something: { L: [ list of things ] } it just returns something: [ list of things ]).
This is important, because it means that $utils.toJson($context.result.Item) in your current setup is returning { masterCards: { L: [ ..., which is why you are seeing the type error: masterCards in this case is an object with a key L, rather than an array/list.
To solve this in the resolver, you can use the $util.dynamodb.toDynamoDBJson(Object) macro (https://docs.aws.amazon.com/appsync/latest/devguide/resolver-util-reference.html#dynamodb-helpers-in-util-dynamodb). That is, your resolver should be:
$util.dynamodb.toDynamoDBJson($context.result.Item)
Alternatively you might want to look at the AWS.DynamoDB.DocumentClient class (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html). This includes versions of getItem, etc. that automatically marshal and unmarshall the proprietary DynamoDB typing back into native JSON. (Frankly I find this much nicer to work with and use it all the time).
In that case you can keep your old resolver, because you'll be returning an object where masterCards is just a JSON array.
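For illustration, here is a sketch of the Lambda rewritten with the DocumentClient, assuming the same table and key as above. masterCards then comes back as a plain JSON array, so the original $utils.toJson($context.result.Item) resolver works unchanged:
const AWS = require('aws-sdk');
// DocumentClient unmarshalls DynamoDB attribute values into native JSON.
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
  const params = {
    TableName: 'FidelityCardsUsers',
    Key: { userId: event.arguments.userId }, // plain value, no { S: ... } wrapper
  };
  docClient.get(params, (err, data) => {
    if (err) {
      callback(err);
    } else {
      // data.Item.masterCards is now a plain array of card objects.
      callback(null, data);
    }
  });
};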
