Trying to write to AWS DynamoDB via API - aws-lambda

I am new to AWS and I've slowly been trying to perform different actions. I recently set up an API that allows me to query a DynamoDB table, and now I am trying to set up an API that will allow me to update a value in the table with the current temperature. This data will come from a script running on a Raspberry Pi.
I've been wading through so many tutorials but I haven't gotten this quite locked down. I am able to write to the db using a hard-coded Python script, so I know my db and roles are set up correctly. I am now trying to create a Node-based Lambda function that will accept params from the URL and put the values into the table. I am missing something.
First, do I need to map the values in the API? Some guides do it, others do not. Like I said, ideally I want to pass them in as URL params.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

exports.handler = (event, context, callback) => {
  dynamodb.putItem({
    TableName: "temperature",
    Item: {
      "tempid": {
        S: event.queryStringParameters["tempid"]
      }
    }
  }, function(err, data) {
    if (err) {
      console.log(err, err.stack);
      callback(null, {
        statusCode: '500',
        body: err
      });
    } else {
      callback(null, {
        statusCode: '200',
        body: 'Result from ' + event.queryStringParameters["tempid"] + '!'
      });
    }
  });
};
When I test it in the API using "tempid=hottub1" in the query string I get this error:
START RequestId: 1beb4572-65bf-4ab8-81a0-c217677c3acc Version: $LATEST
2020-07-09T14:02:05.773Z 1beb4572-65bf-4ab8-81a0-c217677c3acc INFO { tempid: 'hottub1' }
2020-07-09T14:02:05.774Z 1beb4572-65bf-4ab8-81a0-c217677c3acc ERROR Invoke Error {"errorType":"TypeError","errorMessage":"Cannot read property 'tempid' of undefined","stack":["TypeError: Cannot read property 'tempid' of undefined"," at Runtime.exports.handler (/var/task/index.js:11:47)"," at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"]}
EDIT
If I print out the event I can see that the value is coming in and I am apparently referencing it wrong. Still looking.
{
  "tempid": "hottub1"
}

It needed to be in this format:
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

exports.handler = (event, context, callback) => {
  console.info("EVENT\n" + JSON.stringify(event.tempid, null, 2));
  var temperatureid = JSON.stringify(event.tempid, null, 2);
  dynamodb.putItem({
    TableName: "temperature",
    Item: {
      "tempid": {
        S: temperatureid
      }
    }
  }, function(err, data) {
    if (err) {
      console.log(err, err.stack);
      callback(null, { statusCode: '500', body: err });
    } else {
      callback(null, { statusCode: '200', body: 'Result from ' + temperatureid + '!' });
    }
  });
};
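For context on why the reference changed: with a Lambda proxy integration, API Gateway delivers query string values under event.queryStringParameters, while a non-proxy integration with a mapping template (or a console test event like the one above) hands the mapped values to the handler as the event object itself, which is why event.tempid works here. A small, hedged sketch that tolerates both shapes and avoids JSON.stringify, which would wrap a plain string in literal double quotes before it is stored:

  // Sketch only: fall back from proxy-style query parameters to a mapped/test event.
  var temperatureid = (event.queryStringParameters && event.queryStringParameters.tempid)
    || event.tempid;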

Related

Postman: how to export API responses from the collection runner with iterations to a file using a node script

I am completely new to writing node scripts in Postman.
My requirement:
I have an API to get user details. I want to iterate for n number of users. I created a Runner collection and it executes, but I want to write each request's response to a file.
Can anyone help me with how to do this?
I watched a YouTube video (https://www.youtube.com/watch?v=cCRmry10874) for this, but in my case I have a runner collection with a data file.
When I exported the collection, I didn't get the different values from the data file.
const newman = require('newman');

newman.run({
  collection: require('./collection.json'),
  reporters: 'cli'
}, (err) => {
  if (err) { throw err; }
  console.log('collection run complete');
});

const fs = require('fs');

fs.writeFile('response.txt', 'Some text', (error) => {
  if (error) {
    console.error(error);
  }
});
Thanks
Can you try this?
const newman = require('newman');
const fs = require('fs');

newman.run(
  {
    collection: require("./collection.json"),
    reporters: "cli",
    iterationData: "./data.json",
  },
  (err, summary) => {
    if (err) {
      throw err;
    }
    const results = summary.run.executions;
    results.forEach((result) => {
      fs.appendFileSync("response.txt", result.response.text());
    });
  }
);
If you are not limited to using a text file, I would suggest using htmlextra.
It produces an HTML report of your runs, including the response bodies.
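A minimal sketch of how that might look, assuming the newman-reporter-htmlextra package is installed alongside newman; the report path and data file name are placeholders:

  const newman = require('newman');

  newman.run({
    collection: require('./collection.json'),
    iterationData: './data.json',  // placeholder data file
    reporters: ['cli', 'htmlextra'],
    reporter: {
      htmlextra: {
        export: './report.html',   // where the HTML report is written
      },
    },
  }, (err) => {
    if (err) { throw err; }
    console.log('run complete, report written to ./report.html');
  });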

Next.js seems to cache files in a route of _next/data/[path].json, preventing getStaticProps from running on server-side render

The issue appears to happen when I post the link on platforms like Discord and Slack, which send a request to the link to produce a URL preview. The link in this case follows this structure ("normal format"): www.domain.com/ctg/[...ids].
Within [...ids] I pass one of two ids for the same object; the object has the following structure:
type Catalogue {
  id: ID!
  edit_id: String!
  user_id: String!
  title: String
  ...
}
The first id I could pass into [...ids] would be Catalogue.id
The second id I could pass into [...ids] would be Catalogue.edit_id
Whenever either of those inputs for [...ids] is passed as part of a request, the following getStaticProps is run:
export const getStaticProps: GetStaticProps = async ({ params }) => {
  const { ids } = params;
  let catalogue: CatalogueType | null = await fetchFullCatalogue(ids[0]);
  return {
    props: {
      catalogue_prop: catalogue,
      params,
    },
  };
};
with fetchFullCatalogue being:
export const fetchFullCatalogue = async (
  id: string
): Promise<CatalogueType | null> => {
  let catalogue: CatalogueType;
  const fetchToUrl =
    process.env.NODE_ENV === "development"
      ? "http://localhost:4000/graphql"
      : process.env.BACKEND_URL + "/graphql";
  // create an axios fetch request to the http://localhost:4000/graphql
  const query = `
    <...SOME FRAGMENTS LATER...>
    fragment AllCatalogueFields on Catalogue {
      id
      edit_id
      user_id
      status
      title
      description
      views
      header_image_url
      header_color
      author
      profile_picture_url
      event_date
      location
      created
      updated
      labels {
        ...AllLabelFields
      }
      listings {
        ...AllListingFields
      }
    }
    query Catalogues($id: ID, $edit_id: String) {
      catalogues(id: $id, edit_id: $edit_id) {
        ...AllCatalogueFields
      }
    }`;
  const config: AxiosRequestConfig = {
    method: "post",
    url: fetchToUrl,
    headers: {
      "Content-Type": "application/json",
    },
    data: JSON.stringify({
      query,
      variables: { id: id, edit_id: id },
    }),
  };
  let response = await axios(config);
  if (response.data.errors) return null;
  catalogue = response.data.data.catalogues[0];
  console.log("catalogue", catalogue);
  return catalogue;
};
The request it makes is to the following API endpoint:
Query: {
  catalogues: async (
    _: null,
    args: { id: string; edit_id: string }
  ): Promise<Catalogue[]> => {
    let catalogues: Catalogue[];
    // when both id and edit_id are passed
    if (args.id && args.edit_id) {
      catalogues = await getFullCatalogues(args.id, "id", true);
      // the following convoluted request is the result of
      // me responding to the fact that only the edit_id was working
      if (catalogues.length === 0) {
        catalogues = await getFullCatalogues(args.edit_id, "edit_id", true);
        if (catalogues.length === 0) {
          throw new UserInputError("No catalogues found");
        }
      } else {
        catalogues = await getFullCatalogues(
          catalogues[0].edit_id,
          "edit_id",
          true
        );
      }
      console.log("catalogues", catalogues);
    } else if (args.id) {
      catalogues = await getFullCatalogues(args.id);
    } else if (args.edit_id) {
      catalogues = await getFullCatalogues(args.edit_id, "edit_id");
    } else {
      const res = await db.query(fullCatalogueQuery());
      catalogues = res.rows;
    }
    return catalogues;
  },
  ...
},
This results in the following output within the deployed logs:
The logs show the data from when the Catalogue is first created, which simultaneously navigates me to the URL of "normal format" with Catalogue.id, interpreted as /_next/data/qOrdpdpcJ0p6rEbV8eEfm/ctg/dab212a0-826f-42fb-ba21-6ebb3c1350de.json. This contains the default data from when the Catalogue is first generated, with Catalogue.title being "Untitled List".
Before sending both requests I changed the Catalogue.title to "asd".
Notice how the request with the Catalogue.edit_id which was sent as the "normal format" was interpreted as /ctg/ee0dc1d7-5458-4232-b208-1cbf529cbf4f?edit=true. This resulted in the correct data being returned with Catalogue.title being "asd".
Yet the following request with the Catalogue.id, although of the same "normal format", never produced any logs.
(I have tried sending the request without the params ?edit=true and the same happens)
Another important detail is that the (faulty) request with the Catalogue.id produces the (faulty) URL preview much faster than the request with Catalogue.edit_id.
My best theory as to why this is happening is that the data of the URL with Catalogue.id is somehow stored/cached. This would happen as the Catalogue is first created. In turn, the old stored Catalogue would be returned instead of making the fetch again, whereas the Catalogue.edit_id URL makes the fetch again.
References:
Live site: https://www.kuoly.com/
Client: https://github.com/CakeCrusher/kuoly-client
Backend: https://github.com/CakeCrusher/kuoly-backend
Anything helps; I feel like I've tried everything under the sun. Thanks in advance!
I learned that for my purposes I had to use getServerSideProps instead of getStaticProps.
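For reference, a minimal sketch of that switch, written in plain JavaScript and dropping the GetStaticProps typing; getServerSideProps runs on every request, so the page is never served from a pre-rendered /_next/data/... JSON file:

  // Sketch of the getStaticProps -> getServerSideProps switch described above.
  export async function getServerSideProps({ params }) {
    const { ids } = params;
    const catalogue = await fetchFullCatalogue(ids[0]);
    return {
      props: {
        catalogue_prop: catalogue,
        params,
      },
    };
  }

If static generation were still desirable, returning a revalidate value from getStaticProps (Incremental Static Regeneration) is the other common way to keep the cached JSON from going stale.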

apollo-server-lambda: Unable to determine event source based on event

I am using apollo-server-lambda for my app. I have created a custom authorization HTTP header and it is required: if authorization: LETMEIN is sent, the request is authorized and all data is returned; if the authorization header is missing or wrong, an error is thrown. For local development I used serverless-offline, and in the local environment it works as expected, but when I deploy my code to AWS the API endpoint does not work and always throws an error.
When I test my function in the AWS console I get the error "Unable to determine event source based on event".
I don't get what I am doing wrong.
Here is my code
/* eslint-disable @typescript-eslint/no-var-requires */
import { ApolloServerPluginLandingPageGraphQLPlayground } from 'apollo-server-core';
import { ApolloServer, AuthenticationError } from 'apollo-server-lambda';
import schema from '../graphql/schema';
import resolvers from '../resolvers';
import runWarm from '../utils/run-warm';

export const authToken = (token: string) => {
  if (token === 'LETMEIN') {
    return;
  } else {
    throw new AuthenticationError('No authorization header supplied');
  }
};

const server = new ApolloServer({
  typeDefs: schema,
  resolvers,
  debug: false,
  plugins: [ApolloServerPluginLandingPageGraphQLPlayground()],
  context: ({ event }) => {
    //console.log(context);
    if (event.headers) {
      authToken(event.headers.authorization);
    }
  },
});

export default runWarm(
  server.createHandler({
    expressGetMiddlewareOptions: {
      cors: {
        origin: '*',
        credentials: true,
        allowedHeaders: ['Content-Type', 'Origin', 'Accept'],
        optionsSuccessStatus: 200,
        maxAge: 200,
      },
    },
  })
);
This is my Lambda function
/**
 * Running warm functions help prevent cold starts
 */
const runWarm =
  (lambdaFunc: AWSLambda.Handler): AWSLambda.Handler =>
  (event, context, callback) => {
    // Detect the keep-alive ping from CloudWatch and exit early. This keeps our
    // lambda function running hot.
    if (event.source === 'serverless-plugin-warmup') {
      return callback(null, 'pinged');
    }
    return lambdaFunc(event, context, callback);
  };

export default runWarm;
This is not a direct answer, but might help, and could be useful if anyone else (like me) found this thread because of the error "Unable to determine event source based on event" when using apollo-server-lambda.
That error is coming from @vendia/serverless-express, which is being used by apollo-server-lambda.
Within serverless-express, in src/event-sources/utils.js, there is a function called getEventSourceNameBasedOnEvent(), which is throwing the error. It needs to find something in the event object, and after a bit of experimentation I found that writing the lambda function like this solved the issue for me:
const getHandler = (event, context) => {
  const server = new ApolloServer({
    typeDefs,
    resolvers,
    debug: true,
  });
  const graphqlHandler = server.createHandler();
  if (!event.requestContext) {
    event.requestContext = context;
  }
  return graphqlHandler(event, context);
};

exports.handler = getHandler;
Note that the context object is added to the event object under the key "requestContext"; that's the fix.
(Also note that I have defined typeDefs and resolvers elsewhere in the code)
I can't guarantee this is the ideal thing to do, but it did work for me.
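Applied to the runWarm wrapper from the question, the same guard could be added before the wrapped handler is invoked. A sketch (plain JavaScript, types dropped), assuming the warmup ping and a missing requestContext are the only special cases to handle:

  const runWarm = (lambdaFunc) => (event, context, callback) => {
    // Exit early on the keep-alive ping from serverless-plugin-warmup.
    if (event.source === 'serverless-plugin-warmup') {
      return callback(null, 'pinged');
    }
    // @vendia/serverless-express inspects the event (e.g. event.requestContext)
    // to work out the event source; console test events may not carry one.
    if (!event.requestContext) {
      event.requestContext = context;
    }
    return lambdaFunc(event, context, callback);
  };

  export default runWarm;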

Read filtered data in AWS DynamoDB with an AWS Node.js Lambda

I want to get an element from DynamoDB in my Node.js AWS Lambda. I want to select by the "owner" column, but it does not work. I tried the same syntax with the "id" column and the result is OK. Do I need to add an index in DynamoDB? Where?
'use strict';
var AWS = require('aws-sdk');
var documentClient = new AWS.DynamoDB.DocumentClient({'region': 'eu-west-1'});

exports.handler = function(event, context, callback) {
  console.log(JSON.stringify(event));
  const claims = event.requestContext.authorizer.claims;
  const username = claims['cognito:username'];
  var params = {
    TableName : "tp-exam",
    Key: {
      owner: username
    }
  };
  documentClient.get(params, function(err, data){
    if (err) {
      console.log("Error", err);
      const errResponse = {
        statusCode: 500,
        headers: {
          "Access-Control-Allow-Origin": "*"
        },
        body: JSON.stringify({ Error: 500, device : "DynamoDB"})
      };
      callback(null, errResponse);
    } else {
      console.log("Success", data.Item);
      const response = {
        statusCode: 200,
        headers: {
          "Access-Control-Allow-Origin": "*"
        },
        body: JSON.stringify(data.Item)
      };
      callback(null, response);
    }
  });
};
My error is:
ValidationException: The provided key element does not match the schema
at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:52:27)
at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:688:14)
at Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)
at AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10
at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)
at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:690:12)
at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
code: 'ValidationException',
time: 2021-03-01T21:46:40.263Z,
requestId: 'C7UHG8354A92SGP2T4FRRFU4GFVV4KQNSO5AEMVJF66Q9ASUAAJG',
statusCode: 400,
retryable: false,
retryDelay: 32.73628085995177
}
We need a Global Secondary Index (GSI) to get or query on an alternate key, and to query using a GSI we need to use the query API.
const documentClient = new AWS.DynamoDB.DocumentClient();

documentClient.query(
  {
    TableName: "tp-exam",
    IndexName: "owner-index",
    KeyConditionExpression: "#owner_attr = :ownerVal",
    ExpressionAttributeValues: {
      ":ownerVal": "John",
    },
    ExpressionAttributeNames: {
      "#owner_attr": "owner",
    },
  },
  function (err, data) {
    console.log("err", err, "data", data);
  }
);
owner is a reserved keyword, so we need to use ExpressionAttributeNames to substitute the actual attribute name.
This error is saying that you are not providing the correct primary key to the DynamoDb#get operation.
A couple of tips:
Make sure you're providing the correct name of your partition key. Are you sure your partition key attribute is named owner?
Make sure you are providing the entire primary key. Primary keys in DynamoDB come in two forms, simple and composite. A simple primary key is made up of a partition key only. A composite primary key is made up of a partition key and a sort key. Your error suggests that you might have a composite primary but aren't specifying the sort key portion of the primary key.
If neither of those issues resolves your problem, post the details of your DynamoDB table so we can see how your keys are defined.
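For illustration, if the table used a composite primary key, the get call would have to supply both parts. A sketch with hypothetical key names (id as partition key, owner as sort key); they must match the table's real schema:

  // Hypothetical key names, for illustration only.
  documentClient.get({
    TableName: "tp-exam",
    Key: {
      id: "some-item-id", // partition key
      owner: "John"       // sort key, only if the table defines one
    }
  }, function (err, data) {
    console.log("err", err, "data", data);
  });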
Alternatively, if the owner attribute is not part of your primary key and you'd like to search by that field, you have a few options: you can create a new GSI using owner as the primary key, or you can use the scan operation to search by a non-key attribute.
Note that the first option (creating a GSI) is the preferred method.
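If you do go the scan route instead, a minimal sketch might look like the following; note that a scan reads every item and only filters afterwards, so it becomes slow and expensive on large tables, which is why the GSI approach above is preferred:

  documentClient.scan({
    TableName: "tp-exam",
    FilterExpression: "#owner_attr = :ownerVal",
    ExpressionAttributeNames: { "#owner_attr": "owner" }, // owner is a reserved word
    ExpressionAttributeValues: { ":ownerVal": "John" },
  }, function (err, data) {
    console.log("err", err, "items", data && data.Items);
  });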

How do I include runtime arguments while executing a Google Cloud workflow in Node.js?

I'm trying to include runtime variables while executing a Google Cloud workflow. I can't find documentation for doing so unless you're using the REST API.
Here's my code, which is mostly from their documentation; I just get null for the arguments. I think it could be something with the second parameter it expects on createExecution, named execution, but I can't figure it out.
const { ExecutionsClient } = require('@google-cloud/workflows');
const client = new ExecutionsClient();

const execute = () => {
  return client.createExecution(
    {
      parent: client.workflowPath('project_id', 'location', 'name'),
    },
    {
      argument: {
        users: ['info here'],
      },
    },
  );
};

module.exports = execute;
Thanks for the help!
In case anyone else has this problem: you pass the parameter execution to createExecution() along with parent. It's just an object, and you can specify argument there, which takes a string. Stringify your object and you're good to go!
const { ExecutionsClient } = require('@google-cloud/workflows');
const client = new ExecutionsClient();

const execute = () => {
  return client.createExecution({
    parent: client.workflowPath('', '', ''),
    execution: {
      argument: JSON.stringify({
        users: [],
      }),
    },
  });
};

module.exports = execute;
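A short usage sketch, assuming the google-gax convention that createExecution resolves to an array whose first element is the created Execution resource:

  const execute = require('./execute'); // hypothetical path to the module above

  execute()
    .then(([execution]) => {
      // The execution name identifies the run; the stringified argument is made
      // available to the workflow itself (declared via params in its definition).
      console.log('Created execution:', execution.name);
    })
    .catch(console.error);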
