How do I invoke a SAM Lambda function locally with X-Ray statements? - aws-lambda

I'm receiving the following error when invoking an AWS SAM Lambda function locally:
Missing AWS Lambda trace data for X-Ray. Ensure Active Tracing is enabled and no subsegments are created outside the function handler.
Below you can see my function:
/** Bootstrap */
require('dotenv').config()
const AWSXRay = require('aws-xray-sdk')
/** Libraries*/
const se = require('serialize-error')
/** Internal */
const logger = require('./src/utils/logger')
const ExecuteService = require('./src/service')
/**
*
*/
exports.handler = async (event) => {
  const xraySegment = AWSXRay.getSegment()
  const message = process.env.NODE_ENV == 'production' ? JSON.parse(event.Records[0].body) : event
  try {
    await ExecuteService(message)
  } catch (err) {
    logger.error({
      error: se(err)
    })
    return err
  }
}
In addition, I have Tracing set to Active in my template.yml.
What part of the documentation am I clearly misreading, missing, or reading over?

For now you can't invoke a SAM Lambda function locally with X-Ray, because it is not supported yet.
See [Feature Request] Add support for X-Ray on SAM Local #217 and aws-lambda-runtime-interface-emulator#level-of-support, which states:
"The component does not support X-ray and other Lambda integrations locally."
If you don't care about X-Ray and just want your code to work, you can check the AWS_SAM_LOCAL environment variable to prevent X-Ray usage:
let AWSXRay
if (!process.env.AWS_SAM_LOCAL) {
  AWSXRay = require('aws-xray-sdk')
}
// ...
if (!process.env.AWS_SAM_LOCAL) {
  const xraySegment = AWSXRay.getSegment()
}

I am using SAM CLI, version 1.36.0.
I noticed that importing aws-xray-sdk does not cause this issue, but invoking its functions does.
To avoid a million if statements, I created a middle-man service that imports aws-xray-sdk and exports no-ops when the env var process.env.AWS_SAM_LOCAL is set.
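That middle-man module looks roughly like this (a minimal sketch; the file name src/utils/xray.js and the exact set of stubbed methods are assumptions, stub whichever X-Ray calls your code actually makes):

// src/utils/xray.js — hypothetical middle-man module
let AWSXRay

if (process.env.AWS_SAM_LOCAL) {
  // running under `sam local invoke`: export harmless stubs
  const noopSegment = {
    addNewSubsegment: () => noopSegment,
    addAnnotation: () => {},
    addMetadata: () => {},
    close: () => {}
  }
  AWSXRay = {
    getSegment: () => noopSegment,
    captureAWS: (sdk) => sdk,
    captureAsyncFunc: (name, fn) => fn(noopSegment)
  }
} else {
  // deployed to AWS: export the real SDK
  AWSXRay = require('aws-xray-sdk')
}

module.exports = AWSXRay

The handler then requires ./src/utils/xray instead of aws-xray-sdk and never has to branch on the environment.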

Related

How to run nextjs in AWS lambda with `experimental-edge` runtime

I'm trying to find a way to run Next.js (v13.0.6) with OG image generation logic (using @vercel/og) in AWS Lambda.
Everything works fine locally (in dev and prod mode), but when I try to execute the Lambda handler I get "statusCode": 500.
It only fails for APIs that involve ImageResponse (and runtime: 'experimental-edge', a requirement for @vercel/og).
I'm pretty sure the problem is caused by the Edge Runtime not being configured correctly.
Here is my handler code.
next build with output: 'standalone' in next.config.js creates the .next/standalone folder.
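For reference, the config setting in question is just this (a minimal sketch):

// next.config.js
module.exports = {
  output: 'standalone'
}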
Inside standalone, handler.js:
const { parse } = require('url');
const NextServer = require('next/dist/server/next-server').default
const serverless = require('serverless-http');
const path = require('path');

process.env.NODE_ENV = 'production'
process.chdir(__dirname)

const currentPort = parseInt(process.env.PORT, 10) || 3000

const nextServer = new NextServer({
  hostname: 'localhost',
  port: currentPort,
  dir: path.join(__dirname),
  dev: false,
  customServer: false,
  conf: {...} // copied from `server.js` in the same folder
});

const requestHandler = nextServer.getRequestHandler();

// this is an AWS Lambda handler that converts the Lambda event
// to an http request that the Next server can process
const handler = serverless(async (req, res) => {
  // const parsedUrl = parse(req.url, true);
  try {
    await requestHandler(req, res);
  } catch (err) {
    console.error(err);
    res.statusCode = 500
    res.end('internal server error')
  }
});

module.exports = {
  handler
}
I'm testing it locally with local-lambda, but I get similar results when testing against the deployed AWS Lambda.
What is confusing is that server.js (in .next/standalone) has a similar setup; it only adds an http server on top of it.
Update: AWS Lambda logs show
ERROR [Error [CompileError]: WebAssembly.compile(): Compiling function #64 failed: invalid value type 'Simd128', enable with --experimental-wasm-simd #+3457 ]
Update 2: the first error was fixed by selecting Node 16 for the AWS Lambda; now I'm getting this error:
{
"errorType": "Error",
"errorMessage": "write after end",
"trace": [
"Error [ERR_STREAM_WRITE_AFTER_END]: write after end",
" at new NodeError (node:internal/errors:372:5)",
" at ServerlessResponse.end (node:_http_outgoing:846:15)",
" at ServerlessResponse.end (/var/task/node_modules/next/dist/compiled/compression/index.js:22:783)",
" at NodeNextResponse.send (/var/task/node_modules/next/dist/server/base-http/node.js:93:19)",
" at NextNodeServer.handleRequest (/var/task/node_modules/next/dist/server/base-server.js:332:47)",
" at processTicksAndRejections (node:internal/process/task_queues:96:5)",
" at async /var/task/index.js:34:5"
]
}
At the moment of writing, Vercel's runtime: 'experimental-edge' seems to be unstable (I ran into multiple issues with it).
I ended up recreating the @vercel/og lib without the wasm and next.js dependencies (can be found here) and simply using it in AWS Lambda. It depends on @resvg/resvg-js instead of the wasm version, which uses native binaries, so there should not be much perf loss compared to wasm.
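The rasterization step looks roughly like this (a minimal sketch, not the author's library, assuming the @resvg/resvg-js API; generating the SVG markup itself, which @vercel/og does with satori, is a separate step):

// Hypothetical plain Node.js Lambda: turn an SVG string into a PNG response
// using @resvg/resvg-js (native binary, no wasm or Edge Runtime needed).
const { Resvg } = require('@resvg/resvg-js')

exports.handler = async () => {
  const svg = '<svg xmlns="http://www.w3.org/2000/svg" width="1200" height="630">' +
    '<rect width="100%" height="100%" fill="black"/>' +
    '</svg>'
  const png = new Resvg(svg).render().asPng()
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'image/png' },
    body: png.toString('base64'),
    isBase64Encoded: true
  }
}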

Some modules can be used with NodejsFunction(AWS CDK) and some cannot?

I defined a lambda function using NodejsFunction of AWS CDK.
I installed the modules I want to use in my Lambda function into node_modules of the CDK project.
When I execute sam local invoke, some modules succeed and others fail.
When an error occurs, the error message "File not found in '/var/task/...'" is displayed.
Does this mean that some modules can be used with NodejsFunction and some cannot?
lambda-stack.ts
new lambda.NodejsFunction(this, 'SampleFunction', {
  runtime: Runtime.NODEJS_14_X,
  entry: 'lambda/sample-function/index.ts'
})
lambda/sample-function/index.ts (use 'date-fns') -> succeeded!
import { format } from 'date-fns'

export const handler = async () => {
  try {
    console.log(format(new Date(), "'Today is a' eeee"))
  } catch (error) {
    console.log(error)
  }
}
lambda/sample-function/index.ts (use 'chrome-aws-lambda') -> failed
const chromium = require('chrome-aws-lambda')

export const handler = async () => {
  try {
    const browser = await chromium.puppeteer.launch()
  } catch (error) {
    // Cannot find module '/var/task/puppeteer/lib/Browser'
    console.log(error)
  }
}
lambda/sample-function/index.ts (use 'pdfkit') -> failed
const PDFDocument = require('pdfkit')

export const handler = async () => {
  try {
    const doc = new PDFDocument()
  } catch (error) {
    // no such file or directory, open '/var/task/data/Helvetica.afm'
    console.log(error)
  }
}
It seems that "pdfkit" and "chrome-aws-lambda" are packages that use binary/asset files, and you need to verify that those files can be found in the Lambda.
When you create a Lambda using 'new lambda.NodejsFunction()', an esbuild process runs in the background that bundles all files into one, so make sure you don't see any error related to that build during synth.
To verify this is the problem, you could try uploading your Lambda with node_modules and check if it works.
Alternatively you can:
look for (or create) a "lambda layer" that will contain those binaries (see example), or
build your Lambda with all of the dependencies as a Docker image and set the Lambda to run that image.
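Another option worth trying (a sketch, not verified against your stack): NodejsFunction's bundling options can install the listed packages into node_modules instead of letting esbuild inline them, so their asset and binary files stay on disk where the packages expect them:

new lambda.NodejsFunction(this, 'SampleFunction', {
  runtime: Runtime.NODEJS_14_X,
  entry: 'lambda/sample-function/index.ts',
  bundling: {
    // install these packages instead of bundling them, so e.g. pdfkit can
    // resolve its font data files relative to its own package folder
    nodeModules: ['pdfkit', 'chrome-aws-lambda']
  }
})

chrome-aws-lambda is very large, though, so a layer or container image may still be the better fit for it.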

"Unsupported Media Type" using serverless offline

I'm working on a small serverless-offline assignment and I got the error Unsupported Media Type when I tried to invoke one Lambda function from another.
I found a solution, but when I tried to apply it to my project it was not working.
All the details are in the link below. Could anyone help me with that?
https://github.com/dherault/serverless-offline/issues/1005#issue-632401297
There are three possible solutions.
1. Make sure that lambda_A uses the same port and host that lambda_B is running on.
Lambda_A:
const { Lambda } = require('aws-sdk');
const lambda = new Lambda({
  region: 'us-east-1',
  endpoint: 'http://localhost:3000',
});
module.exports.main = async (event, context) => {
  // invoke lambda_B locally; the function name is a placeholder (<service>-<stage>-lambda_B)
  const result = await lambda
    .invoke({
      FunctionName: 'my-service-local-lambda_B',
      InvocationType: 'RequestResponse',
      Payload: JSON.stringify({ from: 'lambda_A' }),
    })
    .promise();
  return JSON.parse(result.Payload);
};
Lambda_B is running on http://localhost:3000.
2. You have to configure serverless-offline in both functions.
https://www.serverless.com/plugins/serverless-offline#usage-with-invoke
3. Do lambda_A and lambda_B have the correct stage? Remember to use sls offline --stage local for both functions.

Winston CloudWatch Transport not Creating Logs When Running on Lambda

I have an Express.js app that is set up to run from within an AWS Lambda function. When I deploy this app to the Lambda, the console logs show up in the Lambda's CloudWatch log group (i.e. /aws/lambda/lambda-name), but it doesn't create the new CloudWatch log group specified in the configuration.
If I run the lambda function locally and generate logs it will create a CloudWatch Log Group for the local environment.
The Lambda Functions are connecting to an RDS instance so they are contained within a VPC.
The Lambda has been assigned the CloudWatchFullAccess policy so it should not be a permissions error.
I've looked at the Lambda logs and I'm not seeing any errors coming through related to this.
const env = process.env.NODE_ENV || 'local'
const config = require('../../config/env.json')[env]
const winston = require('winston')
const WinstonCloudwatch = require('winston-cloudwatch')
const crypto = require('crypto')

let startTime = new Date().toISOString()

const logger = winston.createLogger({
  exitOnError: false,
  level: 'info',
  transports: [
    new winston.transports.Console({
      json: true,
      colorize: true,
      level: 'info'
    }),
    new WinstonCloudwatch({
      awsAccessKeyId: config.aws.accessKeyId,
      awsSecretKey: config.aws.secretAccessKey,
      logGroupName: 'my-api-' + env,
      logStreamName: function () {
        // Spread log streams across dates as the server stays up
        let date = new Date().toISOString().split('T')[0]
        return 'my-requests-' + date + '-' +
          crypto.createHash('md5')
            .update(startTime)
            .digest('hex')
      },
      awsRegion: 'us-east-1',
      jsonMessage: true
    })
  ]
})

const winstonStream = {
  write: (message, encoding) => {
    // use the 'info' log level so the output will be picked up by both transports
    logger.info(message)
  }
}

module.exports.logger = logger
module.exports.winstonStream = winstonStream
Then within my Express app:
const morgan = require('morgan')
const { winstonStream } = require('./providers/loggers')

app.use(morgan('combined', { stream: winstonStream }))
Confirming that the problem was related to the Lambda function being in a VPC and not granted public access to the internet through subnets, route tables, NAT and internet gateways, as described within this post: https://gist.github.com/reggi/dc5f2620b7b4f515e68e46255ac042a7
I believe that to access external internet services you'd need what you described.
But to access an AWS service outside the VPC you can create a VPC endpoint.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cloudwatch-logs-and-interface-VPC.html
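If the infrastructure happens to be defined in CDK, the endpoint is a single construct (a minimal sketch; the stack/construct names are assumptions, and props.vpc stands for the VPC the Lambda already runs in):

// Hypothetical CDK v2 stack adding an interface endpoint for CloudWatch Logs,
// so a Lambda inside the VPC can reach the Logs API without a NAT gateway.
const { Stack, aws_ec2: ec2 } = require('aws-cdk-lib')

class LoggingEndpointStack extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props)
    new ec2.InterfaceVpcEndpoint(this, 'CloudWatchLogsEndpoint', {
      vpc: props.vpc, // the ec2.IVpc the Lambda is attached to
      service: ec2.InterfaceVpcEndpointAwsService.CLOUDWATCH_LOGS
    })
  }
}

module.exports = { LoggingEndpointStack }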

CloudFormation: The runtime parameter of nodejs6.10 is no longer supported for creating or updating AWS Lambda functions

I'm trying to update a CloudFormation template with a few Lambda functions in it. The last version of the template was deployed a few years ago, and all the Lambda functions currently have a runtime of nodejs6.10.
I have updated the runtime for all functions to nodejs10.x, but when I deploy the template, I get the following message:
The runtime parameter of nodejs6.10 is no longer supported for creating or updating AWS Lambda functions
I've created a change set and reviewed it, and it includes an update to the runtime property for each Lambda function; however, CloudFormation seems to be ignoring it.
Is there something I'm missing?
Context: I assume that you encountered this issue (you got a "nodejs version not supported" error message after you ran amplify push followed by amplify add auth).
Go to amplify -> backend -> auth -> cognito -> click the cognito CloudFormation template,
search for "Runtime: node",
change it to "Runtime: nodejs8.10" (or whatever latest runtime the error message recommends),
then re-run:
$ amplify push
Unfortunately, I found I had to update the runtime of all functions in the template outside of CloudFormation to get the stacks to deploy. I used this script:
const AWS = require('aws-sdk')
const lambda = new AWS.Lambda(...)

main().catch(err => {
  console.error(err)
  process.exit(1)
})

async function main() {
  const functions = await getFunctions()
  await Promise.all(
    functions
      // filter only functions you want to update
      .filter(...)
      .filter(x => x.Runtime !== 'nodejs10.x')
      .map(updateFunction)
  )
}

async function updateFunction(func) {
  await lambda
    .updateFunctionConfiguration({
      FunctionName: func.FunctionName,
      Runtime: 'nodejs10.x'
    })
    .promise()
  console.log(`function updated: ${func.FunctionName}`)
}

async function getFunctions() {
  let marker
  let functions = []
  do {
    const result = await lambda
      .listFunctions({
        Marker: marker
      })
      .promise()
    functions = [...functions, ...result.Functions]
    marker = result.NextMarker
  } while (marker)
  return functions
}
