I've got 3 functions:
A cron job Lambda function
An event-driven function which detects when a new record is added to DynamoDB
A reusable function which is currently called by the 2 functions above
The Cron job function
export async function scheduledFunction() {
    const detailsHistory = await sharedFunction(param1);
}
The event-driven function
export async function eventFunction(event) {
    event.Records.forEach(async record => {
        if (record.eventName === 'INSERT') {
            await sharedFunction(param1)
        }
    })
}
The function called by both the event and scheduled functions
const sharedFunction = async (param1) => {
    const apiUrl = 'xxxxxx';
    const details = await axios.get(apiUrl, {
        headers: {
            'x-api-key': xxxx
        }
    });
}
The event function works when DynamoDB has a new insert and then calls the 3rd party API, which works as expected.
The scheduled function fires every 4 hours and works: it gets to the sharedFunction, but when it gets to the API call await axios.get it just does nothing. I'm not getting any errors in CloudWatch. I've placed console.log() calls before and after the call, and it logs the one before but nothing after.
You should always put async code inside a try ... catch block. Also, forEach won't wait for promises; you will need to use a for ... of loop instead. Try this:
export async function eventFunction(event) {
    try {
        for (let record of event.Records) {
            if (record.eventName === 'INSERT') {
                await sharedFunction(param1)
            }
        }
    }
    catch (err) {
        console.log(err);
        return err;
    }
}
Shared function:
const sharedFunction = async (param1) => {
    try {
        const apiUrl = 'xxxxxx';
        return await axios.get(apiUrl, {
            headers: {
                'x-api-key': xxxx
            }
        });
    }
    catch (err) {
        return err;
    }
}
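To see why forEach doesn't wait, here is a minimal standalone sketch (an illustration only, not code from the question): the forEach version logs 'done' before any item has been processed, while the for ... of version awaits each one in turn. In a Lambda handler this matters, because the invocation can end once the handler's promise resolves, before the un-awaited callbacks ever run.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function withForEach(items) {
    // forEach fires the async callbacks and returns immediately;
    // nothing awaits the promises they create.
    items.forEach(async (item) => {
        await delay(100);
        console.log('processed', item);
    });
    console.log('done'); // logs first
}

async function withForOf(items) {
    // for ... of awaits each iteration before moving on.
    for (const item of items) {
        await delay(100);
        console.log('processed', item);
    }
    console.log('done'); // logs last
}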
Related
I'm not sure what's going on here. I have set up an API route in NextJS that returns before the data has been loaded. Can anyone point out any error here please?
I have this function that gets the data from makeRequest():
export async function getVendors() {
    const vendors = await makeRequest(`Vendor.json`);
    console.log({ vendors });
    return vendors;
}
Then the route: /api/vendors.js
export default async (req, res) => {
    const response = await getVendors();
    return res.json(response);
};
And this is the makeRequest function:
const makeRequest = async (url) => {
    // Get Auth Header
    const axiosConfig = await getHeader();
    // Intercept Rate Limited API Errors & Retry
    api.interceptors.response.use(
        function (response) {
            return response;
        },
        async function (error) {
            await new Promise(function (res) {
                setTimeout(function () {
                    res();
                }, 2000);
            });
            const originalRequest = error.config;
            if (error.response.status === 401 && !originalRequest._retry) {
                token[n] = null;
                originalRequest._retry = true;
                const refreshedHeader = await getHeader();
                api.defaults.headers = refreshedHeader;
                originalRequest.headers = refreshedHeader;
                return Promise.resolve(api(originalRequest));
            }
            return Promise.reject(error);
        }
    );
    // Call paginated API and return number of requests needed.
    const getQueryCount = await api.get(url, axiosConfig).catch((error) => {
        throw error;
    });
    const totalItems = parseInt(getQueryCount.data['#attributes'].count);
    const queriesNeeded = Math.ceil(totalItems / 100);
    // Loop through paginated API and push data to dataToReturn
    const dataToReturn = [];
    for (let i = 0; i < queriesNeeded; i++) {
        setTimeout(async () => {
            try {
                const res = await api.get(`${url}?offset=${i * 100}`, axiosConfig);
                console.log(`adding items ${i * 100} through ${(i + 1) * 100}`);
                const { data } = res;
                const arrayName = Object.keys(data)[1];
                const selectedData = await data[arrayName];
                selectedData.map((item) => {
                    dataToReturn.push(item);
                });
                if (i + 1 === queriesNeeded) {
                    console.log(dataToReturn);
                    return dataToReturn;
                }
            } catch (error) {
                console.error(error);
            }
        }, 3000 * i);
    }
};
The issue I'm having is that getVendors() returns before makeRequest() has finished getting the data.
Looks like your issue stems from your use of setTimeout. You're trying to return the data from inside the setTimeout call, and this won't work for a few reasons. So in this answer, I'll go over why I think it's not working, as well as a potential solution for you.
setTimeout and the event loop
Take a look at this code snippet, what do you think will happen?
console.log('start')
setTimeout(() => console.log('timeout'), 1000)
console.log('end')
When you use setTimeout, the callback is scheduled to run on a later turn of the event loop. That's why end is logged before timeout.
So when you use setTimeout to return the data, the function has already ended before the code inside the timeout even starts.
If you're new to the event loop, here's a really great talk: https://youtu.be/cCOL7MC4Pl0
returning inside setTimeout
However, there's another fundamental problem here: data returned inside the setTimeout callback is the return value of that callback (which setTimeout discards), not of your parent function. Try running this, what do you think will happen?
const foo = () => {
    setTimeout(() => {
        return 'foo timeout'
    }, 1000)
}
const bar = () => {
    setTimeout(() => {
        return 'bar timeout'
    }, 1000)
    return 'bar'
}
console.log(foo())
console.log(bar())
This is a result of a) the event loop mentioned above, and b) inside of the setTimeout, you're creating a new function with a new scope.
The solution
If you really need the setTimeout at the end, use a Promise. With a Promise, you can use the resolve parameter to resolve the outer promise from within the setTimeout.
const foo = () => {
    return new Promise((resolve) => {
        setTimeout(() => resolve('foo'), 1000)
    })
}
const wrapper = async () => {
    const returnedValue = await foo()
    console.log(returnedValue)
}
wrapper()
Quick note
Since you're calling the setTimeout inside of an async function, you will likely want to move the setTimeout into its own plain function. Wrapping new Promise in an async function is redundant: await flattens the nesting either way, so the async wrapper adds nothing.
// redundant: the async wrapper adds nothing here
const foo = async () => {
    return new Promise((resolve) => resolve(true))
}
// await already unwraps the promise, so result is just true
const result = await foo()
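Applied to the pagination loop from the question, a sketch of makeRequest rewritten with an awaited delay (assuming the same api, getHeader, url, and response shape as the original code; the retry interceptor is unchanged and omitted here for brevity):
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const makeRequest = async (url) => {
    const axiosConfig = await getHeader();
    // Call the paginated API once to work out how many requests are needed.
    const getQueryCount = await api.get(url, axiosConfig);
    const totalItems = parseInt(getQueryCount.data['#attributes'].count);
    const queriesNeeded = Math.ceil(totalItems / 100);
    const dataToReturn = [];
    for (let i = 0; i < queriesNeeded; i++) {
        // Await the pause instead of scheduling a detached timeout, so the
        // function only returns once every page has been fetched.
        if (i > 0) await delay(3000);
        const res = await api.get(`${url}?offset=${i * 100}`, axiosConfig);
        const arrayName = Object.keys(res.data)[1];
        dataToReturn.push(...res.data[arrayName]);
    }
    return dataToReturn;
};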
I am using a NodeJS env with the Serverless Framework.
The service is an endpoint for contact form submissions. The code looks something like this.
I have two async calls: one writes to DynamoDB and the other sends an email via SES.
module.exports.blog = async (event, context, callback) => {
    const data = JSON.parse(event.body);
    const handler = 'AB';
    const sesParams = getSesParams(handler, data);
    if (typeof data.text !== 'string') {
        callback(null, validationErrRes);
        return;
    }
    try {
        await logToDB(handler, data);
    } catch (dbErr) {
        console.error(dbErr);
        callback(null, errRes(dbErr, 'Failed to log to DB'));
        return;
    }
    try {
        await SES.sendEmail(sesParams).promise();
    } catch (emailErr) {
        console.error(emailErr);
        callback(null, errRes(emailErr, 'Failed to send mail'));
        return;
    }
    callback(null, successResponse);
    return;
};
The response takes exactly 6 sec, while the DB put and sendEmail together take < 300 ms in total.
PS: Running both async calls in parallel does not help much.
Try removing the callback from your function definition, along with the calls to your callback function, and just return the successResponse. Since yours is already an async function, you do not need to use a callback. You can also just return errors.
module.exports.blog = async (event, context) => {
and
return {
    statusCode: 200
}
and
return validationErrRes
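Put together, a sketch of the whole handler without callbacks (reusing the question's helpers logToDB, getSesParams, errRes, validationErrRes, and successResponse, whose definitions aren't shown):
module.exports.blog = async (event) => {
    const data = JSON.parse(event.body);
    const handler = 'AB';
    const sesParams = getSesParams(handler, data);
    if (typeof data.text !== 'string') {
        return validationErrRes;
    }
    try {
        await logToDB(handler, data);
    } catch (dbErr) {
        console.error(dbErr);
        return errRes(dbErr, 'Failed to log to DB');
    }
    try {
        await SES.sendEmail(sesParams).promise();
    } catch (emailErr) {
        console.error(emailErr);
        return errRes(emailErr, 'Failed to send mail');
    }
    return successResponse;
};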
Well, my Lambda function works well according to the logs, but it never completes in the CodePipeline stage. I have already granted the role the permissions needed to notify the pipeline ("codepipeline:PutJobSuccessResult",
"codepipeline:PutJobFailureResult") and even set the maximum time to 20 sec, but it is still not working (the function actually ends at 800 ms).
const axios = require('axios')
const AWS = require('aws-sdk');
const url = 'www.exampleurl.com'
exports.handler = async (event, context) => {
    const codepipeline = new AWS.CodePipeline();
    const jobId = event["CodePipeline.job"].id;
    const stage = event["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;
    const putJobSuccess = function(message) {
        var params = {
            jobId: jobId
        };
        codepipeline.putJobSuccessResult(params, function(err, data) {
            if (err) { context.fail(err); }
            else { context.succeed(message); }
        });
    };
    const putJobFailure = function(message) {
        var params = {
            jobId: jobId,
            failureDetails: {
                message: JSON.stringify(message),
                type: 'JobFailed',
                externalExecutionId: context.invokeid
            }
        };
        codepipeline.putJobFailureResult(params, function(err, data) {
            if (err) console.log(err)
            context.fail(message);
        });
    };
    try {
        await axios.post(url, { content: stage })
        putJobSuccess('all fine')
    } catch (e) {
        putJobFailure(e)
    }
};
The root issue
Because the AWS SDK call codepipeline.putJobSuccessResult is asynchronous, it runs in the background while the rest of the handler continues. The issue seems to be that the Lambda function finishes its execution before codepipeline.putJobSuccessResult has a chance to complete.
The solution
Await codepipeline.putJobSuccessResult so that it is forced to complete before the handler returns its response to Lambda.
const putJobSuccess = function(id) {
    console.log("Telling Codepipeline test passed for job: " + id)
    var params = {
        jobId: id
    };
    // Return the promise so the caller can await it; don't also pass a
    // callback, since mixing the two styles invokes both.
    return codepipeline.putJobSuccessResult(params).promise()
};
exports.lambdaHandler = async (event, context) => {
    ...
    await putJobSuccess(jobId)
    return response
};
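The failure path can be promisified the same way. A sketch, assuming jobId and context are in scope as they are in the question's handler:
const putJobFailure = function(message) {
    var params = {
        jobId: jobId,
        failureDetails: {
            message: JSON.stringify(message),
            type: 'JobFailed',
            externalExecutionId: context.invokeid
        }
    };
    // Returning the promise lets the catch block await this too.
    return codepipeline.putJobFailureResult(params).promise()
};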
Whenever I see this issue, most of the time it is due to PutJobSuccessResult never being invoked. The best way to check is to go to CloudTrail > Event History and look for Event name = 'PutJobSuccessResult' during the time range in which you expect the Lambda function to have called this API. If you do not find 'PutJobSuccessResult' there, have another look at the code and at the Lambda execution logs in CloudWatch.
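If you'd rather check programmatically than in the console, the same lookup is available through the SDK; a minimal sketch, assuming default credentials and region:
const AWS = require('aws-sdk');
const cloudtrail = new AWS.CloudTrail();

async function findPutJobSuccessCalls() {
    // Search recent CloudTrail Event History for PutJobSuccessResult calls.
    const res = await cloudtrail.lookupEvents({
        LookupAttributes: [
            { AttributeKey: 'EventName', AttributeValue: 'PutJobSuccessResult' }
        ]
    }).promise();
    return res.Events;
}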
Objective: show errors on AWS X-ray (all errors from lambda).
'use strict';
const AWSXRay = require('aws-xray-sdk-core'),
    AWS = AWSXRay.captureAWS(require('aws-sdk')),
    env = process.env;
AWS.config.update({
    region: env.REGION
});
const dynamodbDocumentClient = new AWS.DynamoDB.DocumentClient();
exports.handler = async (event, context, callback) => {
    let seg = AWSXRay.getSegment();
    try {
        const params = {
            /*PARAMS HERE*/
        };
        let res = await dynamodbDocumentClient.scan(params).promise();
        throw('ERROR FOR TESTING');
        return res;
        //callback(null,res);
    }
    catch(err) {
        let subseg = seg.addNewSubsegment('error');
        subseg.addMetadata("error", "error", "my_error");
        subseg.addAnnotation('errr', 'this is a test');
        subseg.addError(err);
        subseg.addErrorFlag();
        subseg.close();
        console.log('==ERROR==', err);
        return err;
    }
};
When I use AWSXRay.captureAWS, the subsegment 'error' doesn't show in X-Ray. If I don't use captureAWS, the error appears in X-Ray correctly.
The issue is likely with how you're structuring your Lambda. First, you shouldn't be using callback in an async function handler; async function handlers should return native promises or errors per the docs.
If you want to use a callback to return the result of your function, which it seems like you do, the handler should be non-async. That being said, you're throwing your error after the callback, which will not be reflected in X-Ray, because the X-Ray segment for a Lambda function closes when the callback is called. If an error is thrown before the callback (e.g. during the DynamoDB call), it should be captured in your error subsegment.
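For illustration, a minimal sketch of that callback style with a non-async handler (the table name here is hypothetical):
exports.handler = (event, context, callback) => {
    // Non-async handler: completion is signalled through the callback,
    // and the Lambda X-Ray segment closes when the callback fires.
    dynamodbDocumentClient.scan({ TableName: 'my-table' }, (err, res) => {
        if (err) {
            callback(err);
            return;
        }
        callback(null, res);
    });
};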
EDIT
After the original post was edited, I was unable to reproduce the error with this code:
const AWSXRay = require('aws-xray-sdk');
const AWS = AWSXRay.captureAWS(require('aws-sdk'));
const ddb = new AWS.DynamoDB.DocumentClient();
exports.handler = async (event, context) => {
    try {
        const res = await ddb.scan({TableName: 'scorekeep-game'}).promise();
        throw('Test Error');
        return res;
    } catch (e) {
        console.log('caught!');
        var sub2 = AWSXRay.getSegment().addNewSubsegment('err');
        sub2.addError(e);
        sub2.addErrorFlag();
        sub2.close();
        return e;
    }
};
I have a DynamoDB Put request wrapped into an async function.
async function putter(param1, param2) {
    const paramsPut = {
        TableName: MyTableName,
        Item: {
            "hashKey": param1,
            "sortKey": param2,
        }
    };
    dynamodb.put(paramsPut, function(err, data) {
        if (err) {
            console.log("Failure")
            console.log(data)
            return data
        }
        else {
            console.log("Success")
            console.log(data)
            return data
        }
    });
};
The return for the async function is placed in the callback; this should provide back a promise once the put operation has been performed (either successfully or unsuccessfully).
I then invoke this async put function from another async function:
var param1 = "50";
var param2 = "60";
async function main() {
    await putter(param1, param2)
    console.log("Feedback received")
}
When I invoke this async main function, I would expect it to log the Success statement from the put function prior to writing "Feedback received", as it should await the put function's response.
However, my console logs "Feedback received" prior to logging the "Success" statement that I was awaiting.
What am I missing here? Thanks for your support!
Try changing your code as follows:
try {
    const data = await dynamodb.put(paramsPut).promise()
    console.log("Success")
    console.log(data)
    return data
} catch (err) {
    console.log("Failure", err.message)
    // there is no data here, you can return undefined or similar
}
Almost every function in the AWS SDK has a promise() variant that returns the result as a Promise, which you can then await. Don't mix callbacks with promises (async/await); it makes the code hard to read, and it's better to stick with one technique everywhere.
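Folded back into the question's putter, the full rewrite would look something like this (same table and key names as the original):
async function putter(param1, param2) {
    const paramsPut = {
        TableName: MyTableName,
        Item: {
            "hashKey": param1,
            "sortKey": param2,
        }
    };
    try {
        // Awaiting the promise means putter only resolves after DynamoDB
        // acknowledges the write, so main() really does wait for it.
        const data = await dynamodb.put(paramsPut).promise();
        console.log("Success");
        return data;
    } catch (err) {
        console.log("Failure", err.message);
    }
}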