How to properly format data with AppSync and DynamoDB when Lambda is in between - aws-lambda

Receiving data with AppSync directly from DynamoDB works for my case, but when I try to put a Lambda function in between, I receive an error that says "Can't resolve value (/issueNewMasterCard/masterCards) : type mismatch error, expected type LIST"
Looking at the AppSync CloudWatch response mapping output, I get this:
"context": {
"arguments": {
"userId": "18e946df-d3de-49a8-98b3-8b6d74dfd652"
},
"result": {
"Item": {
"masterCards": {
"L": [
{
"M": {
"cardId": {
"S": "95d67f80-b486-11e8-ba85-c3623f6847af"
},
"cardImage": {
"S": "https://s3.eu-central-1.amazonaws.com/logo.png"
},
"cardWallet": {
"S": "0xFDB17d12057b6Fe8c8c434653456435634565"
},...............
Here is how I configured my response mapping template:
$utils.toJson($context.result.Item)
I'm doing this mutation:
mutation IssueNewMasterCard {
  issueNewMasterCard(userId: "18e946df-d3de-49a8-98b3-8b6d74dfd652") {
    masterCards {
      cardId
    }
  }
}
and this is my schema:
type User {
userId: ID!
masterCards: [MasterCard]
}
type MasterCard {
cardId: String
}
type Mutation {
issueNewMasterCard(userId: ID!): User
}
The Lambda function:
// the DynamoDB client is not shown in the question; presumably it is created like this:
const AWS = require('aws-sdk');
const dynamoDB = new AWS.DynamoDB();

exports.handler = (event, context, callback) => {
  const userId = event.arguments.userId;
  const userParam = {
    Key: {
      "userId": { S: userId }
    },
    TableName: "FidelityCardsUsers"
  };
  dynamoDB.getItem(userParam, function(err, data) {
    if (err) {
      console.log('error from DynamoDB: ', err);
      callback(err);
    } else {
      console.log('mastercards: ', JSON.stringify(data));
      callback(null, data);
    }
  });
};

I think the problem is that the GetItem operation you use with the DynamoDB data source is not the same as the DynamoDB.getItem function in the aws-sdk.
Specifically, it seems like the data source version returns an already unmarshalled response (that is, instead of something: { L: [ list of things ] } it just returns something: [ list of things ]).
This is important, because it means that $utils.toJson($context.result.Item) in your current setup is returning { masterCards: { L: [ ..., which is why you are seeing the type error: masterCards in this case is an object with a key L, rather than an array/list.
To solve this in the resolver, you can use the $util.dynamodb.toDynamoDBJson(Object) macro (https://docs.aws.amazon.com/appsync/latest/devguide/resolver-util-reference.html#dynamodb-helpers-in-util-dynamodb), i.e. your resolver should be:
$util.dynamodb.toDynamoDBJson($context.result.Item)
Alternatively you might want to look at the AWS.DynamoDB.DocumentClient class (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html). This includes versions of getItem, etc. that automatically marshal and unmarshal the proprietary DynamoDB typing to and from native JSON. (Frankly, I find this much nicer to work with and use it all the time.)
In that case you can keep your old resolver, because you'll be returning an object where masterCards is just a JSON array.
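For reference, here is a minimal sketch of what the Lambda could look like with DocumentClient, assuming the same table and key as in the question (the setup is an assumption, not the asker's actual code):

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
  const params = {
    TableName: 'FidelityCardsUsers',
    // with DocumentClient the key is plain JSON, no { S: ... } wrapping
    Key: { userId: event.arguments.userId }
  };
  docClient.get(params, function(err, data) {
    if (err) {
      callback(err);
    } else {
      // data.Item.masterCards is now a plain array, so the original
      // $utils.toJson($context.result.Item) response template works unchanged
      callback(null, data);
    }
  });
};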

Related

What happens if you do a GetItem in DynamoDB using a projection expression, if an attribute in the expression may not exist

Inside a Lambda, I'm calling getItem on a table with a projection expression for a single field. This is working fine.
// ddb is an AWS.DynamoDB client created earlier in the Lambda
const usersTableParams = {
  TableName: 'users',
  Key: {
    'user-name': { S: userID }
  },
  ProjectionExpression: 'notificationEndpointARN'
};

ddb.getItem(usersTableParams, function (err, data) {
  if (err) {
    console.log('error getting user info', err);
  } else {
    // success
    // code...
  }
});
Now I want to add another attribute to the projection expression, but that attribute might not exist yet on the item. (If it doesn't exist I will add it at the end of the function).
Does the function fail, does it return null for that attribute, does it not return that attribute at all?
I can't find the answer in the documentation or in any google searches.
If the ProjectionExpression contains an attribute that doesn't exist on the item, it doesn't throw any error or return null.
The attribute simply won't appear in the result; only the attributes that were found are returned.
cli> aws dynamodb get-item --table-name my-DynamoDBTable-I3BL7EX05JQR --key file://test.json --projection-expression "data_type,ts,username"
{
  "Item": {
    "ts": {
      "N": "1600755209826"
    },
    "data_type": {
      "S": "Int32"
    }
  }
}
You can refer to this for details: https://docs.aws.amazon.com/cli/latest/reference/dynamodb/get-item.html
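To make that concrete in Node.js, here is a sketch of the same call with a second attribute added to the projection; the attribute name secondAttribute and the surrounding setup are hypothetical:

const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB();
const userID = 'some-user'; // placeholder

const params = {
  TableName: 'users',
  Key: {
    'user-name': { S: userID }
  },
  // secondAttribute may not exist on the item yet
  ProjectionExpression: 'notificationEndpointARN, secondAttribute'
};

ddb.getItem(params, function (err, data) {
  if (err) {
    console.log('error getting user info', err);
    return;
  }
  // The call does not fail and the attribute is not returned as null;
  // the key is simply absent from data.Item when the item doesn't have it.
  if (data.Item && data.Item.secondAttribute) {
    console.log('secondAttribute:', data.Item.secondAttribute.S);
  } else {
    console.log('secondAttribute not set yet');
  }
});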

Can't customize the value of a GraphQL enum

I have looked at this question: How to use or resolve enum types with graphql-tools?
And this doc: https://www.apollographql.com/docs/graphql-tools/scalars/#internal-values
Now, I want to customize the values of a GraphQL enum.
typeDefs.ts:
import { gql } from 'apollo-server';
export const typeDefs = gql`
  enum Device {
    UNKNOWN
    DESKTOP
    HIGH_END_MOBILE
    TABLET
    CONNECTED_TV
  }

  type CampaignPerformanceReport {
    campaignNme: String!
    campaignId: ID!
    device: Device
  }

  type Query {
    campaignPerformanceReports: [CampaignPerformanceReport]!
  }
`;
resolvers.ts:
import { IResolvers } from 'graphql-tools';
import { IAppContext } from './appContext';
export const resolvers: IResolvers = {
  Device: {
    UNKNOWN: 'Other',
    DESKTOP: 'Computers',
    HIGH_END_MOBILE: 'Mobile devices with full browsers',
    TABLET: 'Tablets with full browsers',
    CONNECTED_TV: 'Devices streaming video content to TV screens',
  },
  Query: {
    async campaignPerformanceReports(_, __, { db }: IAppContext) {
      return db.campaignPerformanceReports;
    },
  },
};
As you can see, I customize the values of the Device enum in the resolver.
db.ts: a fake db with data
enum Device {
  UNKNOWN = 'Other',
  DESKTOP = 'Computers',
  HIGH_END_MOBILE = 'Mobile devices with full browsers',
  TABLET = 'Tablets with full browsers',
  CONNECTED_TV = 'Devices streaming video content to TV screens',
}

export const db = {
  campaignPerformanceReports: [
    {
      campaignId: 1,
      campaignNme: 'test',
      device: Device.DESKTOP,
    },
  ],
};
I also made an integration test for this:
test.only('should query campaign performance reports correctly with executable graphql schema', async () => {
  const schema = makeExecutableSchema({ typeDefs, resolvers });
  console.log(printSchema(schema));
  const server: ApolloServerBase = new ApolloServer({ schema, context: { db } });
  const { query }: ApolloServerTestClient = createTestClient(server);
  const res: GraphQLResponse = await query({ query: Q.campaignPerformanceReports });
  expect(res).toMatchInlineSnapshot(`
    Object {
      "data": Object {
        "campaignPerformanceReports": Array [
          Object {
            "campaignId": "1",
            "campaignNme": "test",
            "device": "DESKTOP",
          },
        ],
      },
      "errors": undefined,
      "extensions": undefined,
      "http": Object {
        "headers": Headers {
          Symbol(map): Object {},
        },
      },
    }
  `);
});
As you can see from the snapshot test result, the value of the device field is still "DESKTOP"; I expected the value to be "Computers".
Dependency versions:
"apollo-server": "^2.9.3",
"apollo-server-express": "^2.9.3",
"graphql": "^14.5.4",
The minimal repo: https://github.com/mrdulin/apollo-graphql-tutorial/tree/master/src/custom-scalar-and-enum
The internal values you specify for a GraphQL enum are just that -- internal. This is stated in the documentation:
These don't change the public API at all, but they do allow you to use that value instead of the schema value in your resolvers
If you map the enum value DESKTOP to the internal value Computers, only the behavior of your resolvers will be affected. Specifically:
If a field takes an argument of the type Device and the argument is passed the value DESKTOP, the value actually passed to the resolver function will be Computers.
If a field itself has the type Device and we want it to return DESKTOP, we will need to return Computers inside our resolver instead.
Take for example a schema that looks like this:
type Query {
someQuery(device: Device!): Device!
}
If you don't specify internal values, the resolver works like this:
function (parent, args) {
console.log(args.device) // "DESKTOP"
return 'DESKTOP'
}
If you do specify internal values, the resolver looks like this:
function (parent, args) {
console.log(args.device) // "Computers"
return 'Computers'
}
The resolver is the only thing impacted by providing internal values for each enum value. What doesn't change:
How the enum value is serialized in the response. Enum values are always serialized as strings of the enum value name.
How the enum value is written as a literal inside a document. For example, if querying the above same field, we would always write: { someQuery(device: DESKTOP) }
How the enum value is provided as a variable. A variable of the type Device would always be written as "DESKTOP".
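For example, using the someQuery field from the schema above, both the literal form and the variable form use the enum value name, never the internal value (a sketch, not code from the question):

import { gql } from 'apollo-server';

// Literal form: the enum value name is written directly in the document
const literalQuery = gql`
  { someQuery(device: DESKTOP) }
`;

// Variable form: the variable is also supplied as the enum value name
const variableQuery = gql`
  query ($device: Device!) {
    someQuery(device: $device)
  }
`;
// variables: { "device": "DESKTOP" }, never "Computers"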
NOTE: While the question pertains specifically to Apollo Server, the above applies to vanilla GraphQL.js as well. For example, this enum
const DeviceEnum = new GraphQLEnumType({
  name: 'Device',
  values: {
    UNKNOWN: { value: 'Other' },
    DESKTOP: { value: 'Computers' },
    HIGH_END_MOBILE: { value: 'Mobile devices with full browsers' },
    TABLET: { value: 'Tablets with full browsers' },
    CONNECTED_TV: { value: 'Devices streaming video content to TV screens' },
  }
})
will still behave as described above.

GraphQL nested query returns null

I am trying to use a GraphQL nested query (I am 80% sure this is a nested query?) to get information on the listing and the chef (author) of the listing. I can get the listing info just fine, but I am unable to get the chef info.
I was under the impression that the default resolver (user) would fire when getListing(args) returned without a valid User object for the chef. But the default resolver does not appear to be firing.
How do I properly get the nested information?
For example, my query is:
query getListing($listingID: String!) {
  getListing(listingID: $listingID) {
    name
    chef {
      firstName
    }
  }
}
The query returns:
{
  "data": {
    "getListing": {
      "name": "Test",
      "chef": {
        "firstName": null
      }
    }
  }
}
The function getListing(args) queries the DB and returns:
{
name: 'Test',
chef: 'testUsername',
listingID: 'testListingID'
}
My Schema is:
type Listing {
uuid: String!
name: String!
chef: User!
}
type User {
username: String
firstName: String
}
type Query {
getUser(jwt: String!): User
getListing(listingID: String): Listing
}
And my resolvers are:
const resolvers = {
  Query: {
    getListing: async (parent, args, context, info) => {
      console.log('GET_LISTING');
      return getListing(args);
    },
    getUser: async (parent, args, context, info) => {
      console.log('GET_USER');
      return getUser(args);
    },
  },
  User: async (parent, args) => {
    console.log('USER RESOLVER');
    return getUser(args);
  },
};
Other Info:
I am using Apollo Server running on AWS Lambda integrating with DynamoDB on the backend.
Resolvers exist only at the field level. You can't resolve a type (i.e. User). You can only resolve a field that has that type (i.e. chef).
const resolvers = {
  // ...
  Listing: {
    chef: (parent, args) => {
      return getUser()
    },
  },
}
It's unclear what sort of parameters getUser accepts, so you'll need to modify the above example accordingly. You won't use args unless you actually specify arguments for the field being resolved in your schema. It looks like the returned listing has a chef property that's the name of the user, so you can access that value with parent.chef.
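Putting that together, a minimal sketch (assuming getUser can look a user up by username, which the question doesn't confirm) might look like:

const resolvers = {
  // ...
  Listing: {
    chef: (parent) => {
      // parent is the listing returned by getListing(), so parent.chef
      // holds the chef's username ('testUsername' in the example above)
      return getUser({ username: parent.chef });
    },
  },
};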

How to get rid of the redundant wrapper object of a mutation result?

When I'm making a request to my backend through a mutation like this:
mutation {
  resetPasswordByToken(token: "my-token") {
    id
  }
}
I'm getting a response in such format:
{
  "data": {
    "resetPasswordByToken": {
      "id": 3
    }
  }
}
And that wrapper object, named the same as the mutation, seems somewhat awkward (and at least redundant) to me. Is there a way to get rid of that wrapper to make the returned result a bit cleaner?
This is how I define the mutation now:
export const ResetPasswordByTokenMutation = {
  type: UserType,
  description: 'Sets a new password and sends an informing email with the password generated',
  args: {
    token: { type: new GraphQLNonNull(GraphQLString) },
    captcha: { type: GraphQLString },
  },
  resolve: async (root, args, request) => {
    const ip = getRequestIp(request);
    const user = await Auth.resetPasswordByToken(ip, args);
    return user.toJSON();
  }
};
In a word: No.
resetPasswordByToken is not a "wrapper object", but simply a field you've defined in your schema that resolves to an object (in this case, a UserType). While it's common to request just one field on your mutation type at a time, it's possible to request any number of fields:
mutation {
  resetPasswordByToken(token: "my-token") {
    id
  }
  someOtherMutation {
    # some fields here
  }
  andYetAnotherMutation {
    # some other fields here
  }
}
If we were to flatten the structure of the response like you suggest, we would not be able to distinguish the data returned by one mutation from the data returned by another. We likewise need to nest all of this inside data to keep our actual data separate from any returned errors (which appear in a separate errors entry).
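If all you want is a flatter object in your application code, you can unwrap the field after the response arrives; a small client-side sketch (the response value stands in for whatever your GraphQL client returns):

// Stand-in for the parsed GraphQL result your client returns:
const response = { data: { resetPasswordByToken: { id: 3 } } };

// Unwrap the mutation field to get a flat user object:
const { resetPasswordByToken: user } = response.data;
console.log(user.id); // 3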

Relay mutation fragments intersection

I don't use a Relay container, because I'd like to have more control over components. Instead I use a HOC + Relay.Store.forceFetch, which fetches any given query with variables. So I have the following query:
query {
  root {
    search(filter: $filter) {
      selectors {
        _id,
        data {
          title,
          status
        }
      },
      selectorGroups {
        _id,
        data {
          title,
        }
      }
    }
  }
}
Then I have to do a mutation on the selector type.
export default class ChangeStatusMutation extends Relay.Mutation {
  getMutation() {
    return Relay.QL`mutation {selectors_status_mutation}`;
  }

  getVariables() {
    return {
      id: this.props.id,
      status: this.props.status
    };
  }

  getFatQuery() {
    return Relay.QL`
      fragment on selectors_status_mutationPayload {
        result {
          data {
            status
          }
        }
      }
    `;
  }

  static fragments = {
    result: () => Relay.QL`
      fragment on selector {
        _id,
        data {
          title,
          status
        }
      }`,
  };

  getOptimisticResponse() {
    return {
      result: {
        _id: this.props.id,
        data: {
          status: this.props.status
        }
      }
    };
  }

  getConfigs() {
    return [{
      type: 'FIELDS_CHANGE',
      fieldIDs: {
        result: this.props.id
      },
    }];
  }
}
Call mutation in component:
const mutation = new ChangeStatusMutation({id, status, result: selector});
Relay.Store.commitUpdate(mutation);
After the mutation is committed, the selector in the Relay store is not changed. I guess that's because of an empty tracked fragment query, so the mutation is performed without any fields:
ChangeStatusMutation($input_0: selectors_statusInput!) {
  selectors_status_mutation(input: $input_0) {
    clientMutationId
  }
}
But the selector being modified was already fetched by Relay, and I pass it to the mutation via props. So Relay knows the type that should be changed, how to find the item, and which fields should be replaced, yet it cannot perform the intersection. What's wrong?
So, you're definitely a bit "off the ranch" here by avoiding Relay containers, but I think this should still work...
Relay performs the query intersection by looking up the node indicated by your FIELDS_CHANGE config. In this case, your fieldIDs points it at the result node with ID this.props.id.
Are you sure you have a node with that ID in your store? I'm noticing that in your forceFetch query you fetch some kind of alternative _id but are not actually fetching id. Relay requires an id field to be present on anything that you later want to refetch or use the declarative mutation API on...
I'd start by checking the query you're sending to fetch whatever this result type is. I don't see you fetching that anywhere in your question description, so I'm just assuming that maybe you aren't fetching that right now?
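As a sketch of that suggestion (assuming the selector type implements Relay's Node interface, which the question doesn't show), you would fetch id both in the forceFetch query (e.g. selectors { id, _id, data { title, status } }) and in the mutation's fragment, so FIELDS_CHANGE can find the node:

import Relay from 'react-relay';

export default class ChangeStatusMutation extends Relay.Mutation {
  // ...other methods unchanged from above...

  static fragments = {
    result: () => Relay.QL`
      fragment on selector {
        id        # Relay's global object identifier, required for refetching
        _id,
        data {
          title,
          status
        }
      }`,
  };

  getConfigs() {
    return [{
      type: 'FIELDS_CHANGE',
      fieldIDs: {
        // this.props.id must be the same global id fetched above
        result: this.props.id
      },
    }];
  }
}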
