How to run Next.js in AWS Lambda with the `experimental-edge` runtime

I'm trying to find a way to run Next.js (v13.0.6) with OG image generation logic (using @vercel/og) in AWS Lambda.
Everything works fine locally (in dev and prod mode), but when I try to execute the Lambda handler I get "statusCode": 500.
It only fails for APIs that involve ImageResponse (and runtime: 'experimental-edge', which @vercel/og requires).
I'm pretty sure the problem is that the Edge Runtime is not being configured correctly.
Here is my handler code.
next build with output: 'standalone' in next.config.js creates the folder .next/standalone.
Inside standalone, handler.js:
const { parse } = require('url');
const NextServer = require('next/dist/server/next-server').default;
const serverless = require('serverless-http');
const path = require('path');

process.env.NODE_ENV = 'production';
process.chdir(__dirname);

const currentPort = parseInt(process.env.PORT, 10) || 3000;

const nextServer = new NextServer({
  hostname: 'localhost',
  port: currentPort,
  dir: path.join(__dirname),
  dev: false,
  customServer: false,
  conf: {...} // copied from `server.js` in the same folder
});

const requestHandler = nextServer.getRequestHandler();

// this is an AWS Lambda handler that converts the Lambda event
// to an HTTP request that the Next server can process
const handler = serverless(async (req, res) => {
  // const parsedUrl = parse(req.url, true);
  try {
    await requestHandler(req, res);
  } catch (err) {
    console.error(err);
    res.statusCode = 500;
    res.end('internal server error');
  }
});

module.exports = {
  handler
};
I'm testing it locally with local-lambda, but I get similar results when testing against the deployed AWS Lambda.
What is confusing is that server.js (in .next/standalone) has a similar setup; it only adds an HTTP server on top of it.
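For comparison, a minimal sketch of what that generated server.js does (paraphrased from Next 13's standalone output, not a verbatim copy):

const http = require('http');
const path = require('path');
const NextServer = require('next/dist/server/next-server').default;

const currentPort = parseInt(process.env.PORT, 10) || 3000;

const nextServer = new NextServer({
  hostname: 'localhost',
  port: currentPort,
  dir: path.join(__dirname),
  dev: false,
  conf: {...} // same conf object as in handler.js above
});
const requestHandler = nextServer.getRequestHandler();

// the only real difference from the Lambda handler: a plain HTTP server
http.createServer(async (req, res) => {
  try {
    await requestHandler(req, res);
  } catch (err) {
    console.error(err);
    res.statusCode = 500;
    res.end('internal server error');
  }
}).listen(currentPort, () => {
  console.log(`> Ready on http://localhost:${currentPort}`);
});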
Update:
The AWS Lambda logs show:
ERROR [Error [CompileError]: WebAssembly.compile(): Compiling function #64 failed: invalid value type 'Simd128', enable with --experimental-wasm-simd @+3457]
Update 2:
The first error was fixed by selecting Node.js 16 for the AWS Lambda runtime; now I'm getting this error:
{
  "errorType": "Error",
  "errorMessage": "write after end",
  "trace": [
    "Error [ERR_STREAM_WRITE_AFTER_END]: write after end",
    "    at new NodeError (node:internal/errors:372:5)",
    "    at ServerlessResponse.end (node:_http_outgoing:846:15)",
    "    at ServerlessResponse.end (/var/task/node_modules/next/dist/compiled/compression/index.js:22:783)",
    "    at NodeNextResponse.send (/var/task/node_modules/next/dist/server/base-http/node.js:93:19)",
    "    at NextNodeServer.handleRequest (/var/task/node_modules/next/dist/server/base-server.js:332:47)",
    "    at processTicksAndRejections (node:internal/process/task_queues:96:5)",
    "    at async /var/task/index.js:34:5"
  ]
}

At the moment of writing, Vercel's runtime: 'experimental-edge' seems to be unstable (I ran into multiple issues with it).
I ended up recreating the @vercel/og lib without the WASM and Next.js dependencies; it can be found here,
and I simply use it in AWS Lambda. It depends on @resvg/resvg-js instead of the WASM version, which uses native binaries, so there should not be much performance loss compared to WASM.
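For reference, a minimal sketch of that approach in a plain Node.js Lambda handler, using satori (the layout engine behind @vercel/og) plus @resvg/resvg-js; the font path and element tree are illustrative, and the exact import shape can vary by satori version:

const satori = require('satori').default;
const { Resvg } = require('@resvg/resvg-js');
const fs = require('fs');

// satori requires at least one font; this path is hypothetical
const fontData = fs.readFileSync('./fonts/Inter-Regular.ttf');

module.exports.handler = async () => {
  // satori renders a React-element-like tree to an SVG string
  const svg = await satori(
    { type: 'div', props: { style: { fontSize: 64 }, children: 'Hello OG' } },
    {
      width: 1200,
      height: 630,
      fonts: [{ name: 'Inter', data: fontData, weight: 400, style: 'normal' }],
    }
  );
  // resvg-js rasterizes the SVG with a native binary, no WASM involved
  const png = new Resvg(svg).render().asPng();
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'image/png' },
    body: png.toString('base64'),
    isBase64Encoded: true,
  };
};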

Related

Some modules can be used with NodejsFunction(AWS CDK) and some cannot?

I defined a Lambda function using the NodejsFunction construct of the AWS CDK.
I installed the modules I want to use in my Lambda function into the CDK project's node_modules.
When I execute sam local invoke, some modules succeed and others fail.
When an error occurs, the message "File not found in '/var/task/...'" is displayed.
Does this mean that some modules can be used with NodejsFunction and some cannot?
lambda-stack.ts
new lambda.NodejsFunction(this, 'SampleFunction', {
  runtime: Runtime.NODEJS_14_X,
  entry: 'lambda/sample-function/index.ts'
})
lambda/sample-function/index.ts (use 'date-fns') -> succeeded!
import { format } from 'date-fns'

export const handler = async () => {
  try {
    console.log(format(new Date(), "'Today is a' eeee"))
  } catch (error) {
    console.log(error)
  }
}
lambda/sample-function/index.ts (use 'chrome-aws-lambda') -> failed
const chromium = require('chrome-aws-lambda')

export const handler = async () => {
  try {
    const browser = await chromium.puppeteer.launch()
  } catch (error) {
    // Cannot find module '/var/task/puppeteer/lib/Browser'
    console.log(error)
  }
}
lambda/sample-function/index.ts (use 'pdfkit') -> failed
const PDFDocument = require('pdfkit')

export const handler = async () => {
  try {
    const doc = new PDFDocument()
  } catch (error) {
    // no such file or directory, open '/var/task/data/Helvetica.afm'
    console.log(error)
  }
}
It seems that "pdfkit" and "chrome-aws-lambda" are packages that rely on extra binary/asset files, and you need to verify that those files actually end up in the Lambda.
When you create a Lambda using new lambda.NodejsFunction(), an esbuild process runs in the background and bundles all files into one, so make sure you don't see any errors related to that build during synth (a bundling-based alternative is sketched after this list).
To verify that this is the problem, you could try uploading your Lambda with node_modules included and check whether it works.
Alternatively, you can:
look for (or create) a Lambda layer that contains those binaries (see example), or
build your Lambda with all of its dependencies as a Docker image and configure the Lambda to run that image.
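If you want to stay with NodejsFunction, its bundling options can also keep such packages out of the esbuild bundle and ship them as regular node_modules instead; a sketch (the module choice is illustrative):

new lambda.NodejsFunction(this, 'SampleFunction', {
  runtime: Runtime.NODEJS_14_X,
  entry: 'lambda/sample-function/index.ts',
  bundling: {
    // installed into the asset's node_modules instead of being inlined,
    // so the package's asset files (e.g. pdfkit's .afm fonts) survive
    nodeModules: ['pdfkit'],
  },
})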

Lambda Layers not installing with Serverless

Currently getting the following error with MongoDB:
no saslprep library specified. Passwords will not be sanitized
We are using Webpack, so simply installing the module doesn't work (Webpack just ignores it). I found this thread, which talks about how to exclude it from Webpack compilations, but then I would have to manually load it into every Lambda function, which led me to Lambda Layers.
Following the Serverless guide on using Lambda layers allowed me to get my layer published to AWS and included in all of my functions, but for some reason it doesn't install the modules. If I download the layer using the AWS GUI, I get a folder with just the package.json and package-lock.json files.
My file structure is:
my-project
|_ layers
   |_ saslprep
      |_ package.json
and my serverless.yml is:
layers:
  saslprep:
    path: layers/saslprep
    compatibleRuntimes:
      - nodejs14.x
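Worth noting (an observation about layer packaging, not from the thread): Serverless zips the layer path as-is and does not run npm install for you, and the Node.js runtime only resolves layer modules placed under nodejs/node_modules inside the archive. So the layer directory typically needs to look like this before deploying, i.e. after running npm install inside layers/saslprep/nodejs:

my-project
|_ layers
   |_ saslprep
      |_ nodejs
         |_ package.json
         |_ node_modules
            |_ saslprep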
This is not my preferred solution, as I'd like to use SCRAM-SHA-256, but the way I got around this error/warning was by changing the authMechanism from SCRAM-SHA-256 to SCRAM-SHA-1 in the connection string. The serverless-bundle package most likely needs to add this dependency to enable support for MongoDB 4.0's SHA-256 (my best guess!).
You can specify this authentication mechanism by setting the authMechanism parameter to the value SCRAM-SHA-1 in the connection string, as shown in the following sample code.
const { MongoClient } = require("mongodb");

// Replace the following with values for your environment.
const username = encodeURIComponent("<username>");
const password = encodeURIComponent("<password>");
const clusterUrl = "<MongoDB cluster url>";
const authMechanism = "SCRAM-SHA-1";

// Replace the following with your MongoDB deployment's connection string.
const uri =
  `mongodb+srv://${username}:${password}@${clusterUrl}/?authMechanism=${authMechanism}`;

// Create a new MongoClient
const client = new MongoClient(uri);

// Function to connect to the server
async function run() {
  try {
    // Connect the client to the server
    await client.connect();
    // Establish and verify connection
    await client.db("admin").command({ ping: 1 });
    console.log("Connected successfully to server");
  } finally {
    // Ensures that the client will close when you finish/error
    await client.close();
  }
}
run().catch(console.dir);

"Unsupported Media Type" using serverless offline

I'm working on a small serverless-offline assignment, and I got an "Unsupported Media Type" error when I tried to invoke one Lambda function from another.
I found a solution, but when I tried to apply it to my project it didn't work.
All the details are in the link below. Could anyone help me with that?
https://github.com/dherault/serverless-offline/issues/1005#issue-632401297
There are three possible solutions.

1. Make sure that Lambda_A uses the same host and port that Lambda_B is running on (the invoke call itself is sketched after this list).

Lambda_A:

const { Lambda } = require('aws-sdk');

const lambda = new Lambda({
  region: 'us-east-1',
  endpoint: 'http://localhost:3000',
});

module.exports.main = async (event, context) => {
  // invoke
}

Lambda_B: is running on http://localhost:3000

2. Make sure you have configured serverless-offline in both functions.
https://www.serverless.com/plugins/serverless-offline#usage-with-invoke

3. Do Lambda_A and Lambda_B have the correct stage? Remember to run sls offline --stage local for both functions.
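For completeness, a sketch of what the elided invoke call in Lambda_A might look like with the v2 aws-sdk; the function name and payload here are made up, and serverless-offline usually names functions service-stage-functionName:

const { Lambda } = require('aws-sdk');

const lambda = new Lambda({
  region: 'us-east-1',
  // must match the host/port where serverless-offline is serving Lambda_B
  endpoint: 'http://localhost:3000',
});

module.exports.main = async (event, context) => {
  const result = await lambda
    .invoke({
      FunctionName: 'my-service-local-lambda_B', // hypothetical name
      InvocationType: 'RequestResponse',
      Payload: JSON.stringify({ some: 'payload' }),
    })
    .promise();
  return JSON.parse(result.Payload);
};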

Heroku postgres node connection timeout

I'm trying to connect to a Postgres database from my Heroku Node app. It works when running locally, both through node and via the heroku local web command, but when running on Heroku it times out while waiting for pool.connect.
I'm running the following code snippet through the Heroku console (I've also tried using this code in my app directly, but this is more efficient than redeploying each time):
node -e "
const { Pool } = require('pg');

const pool = new Pool({
  connectionTimeoutMillis: 15000,
  connectionString: process.env.DATABASE_URL + '?sslmode=require',
  ssl: {
    rejectUnauthorized: true
  }
});
console.log('pool created');

(async () => {
  try {
    console.log('connecting');
    const client = await pool.connect(); // this never resolves
    console.log('querying');
    const { rows } = await client.query('SELECT * FROM test_table LIMIT 1;');
    console.log('query success', rows);
    client.release();
  } catch (error) {
    console.log('query error', error);
  }
})()
"
Things I've tried so far:
- Using the pg Client instead of Pool
- Using ssl: true instead of ssl: { rejectUnauthorized: true }
- Using client.query without using pool.connect
- Increasing and omitting connectionTimeoutMillis (it resolves quickly when running locally, since I'm querying a database that has just one row)
- Using callbacks and promises instead of async / await
- Setting the connectionString both with the ?sslmode=require parameter and without it
- Using pg versions ^7.4.1 and ^7.18.2
My assumption is that there's something I'm missing in either the Heroku setup or SSL. Any help would be greatly appreciated, thanks!
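For reference, one pg configuration detail that often matters with Heroku Postgres (a general observation, not something from this thread): Heroku's server certificates are not signed by a CA the client can verify, so strict certificate verification tends to fail, and the commonly used configuration is:

const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  // Heroku Postgres certs can't be verified by default,
  // so certificate verification is usually relaxed here
  ssl: { rejectUnauthorized: false },
});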

Winston CloudWatch Transport not Creating Logs When Running on Lambda

I have an Express.js app that is set up to run from within an AWS Lambda function. When I deploy the app to the Lambda, the console logs show up in the Lambda's own CloudWatch log group (i.e. /aws/lambda/lambda-name), but it doesn't create the new CloudWatch log group specified in the configuration.
If I run the Lambda function locally and generate logs, it creates a CloudWatch log group for the local environment.
The Lambda functions connect to an RDS instance, so they are contained within a VPC.
The Lambda has been assigned the CloudWatchFullAccess policy, so it should not be a permissions error.
I've looked at the Lambda logs and I'm not seeing any errors related to this.
const env = process.env.NODE_ENV || 'local'
const config = require('../../config/env.json')[env]
const winston = require('winston')
const WinstonCloudwatch = require('winston-cloudwatch')
const crypto = require('crypto')

let startTime = new Date().toISOString()

const logger = winston.createLogger({
  exitOnError: false,
  level: 'info',
  transports: [
    new winston.transports.Console({
      json: true,
      colorize: true,
      level: 'info'
    }),
    new WinstonCloudwatch({
      awsAccessKeyId: config.aws.accessKeyId,
      awsSecretKey: config.aws.secretAccessKey,
      logGroupName: 'my-api-' + env,
      logStreamName: function () {
        // Spread log streams across dates as the server stays up
        let date = new Date().toISOString().split('T')[0]
        return 'my-requests-' + date + '-' +
          crypto.createHash('md5')
            .update(startTime)
            .digest('hex')
      },
      awsRegion: 'us-east-1',
      jsonMessage: true
    })
  ]
})

const winstonStream = {
  write: (message, encoding) => {
    // use the 'info' log level so the output will be picked up by both transports
    logger.info(message)
  }
}

module.exports.logger = logger
module.exports.winstonStream = winstonStream
Then, within my Express app:

const morgan = require('morgan')
const { winstonStream } = require('./providers/loggers')

app.use(morgan('combined', { stream: winstonStream }))
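A separate pitfall worth ruling out with winston-cloudwatch on Lambda (my addition, not from the original post): uploads are batched asynchronously, and the execution environment can be frozen before a batch is sent. The transport exposes kthxbye() to flush pending logs, e.g. at the end of a handler:

const { logger } = require('./providers/loggers');

module.exports.handler = async (event) => {
  logger.info('handling request');
  // ... handle the request ...

  // flush winston-cloudwatch's buffered logs before the runtime freezes
  const cloudwatch = logger.transports.find((t) => typeof t.kthxbye === 'function');
  if (cloudwatch) {
    await new Promise((resolve) => cloudwatch.kthxbye(resolve));
  }
  return { statusCode: 200, body: 'ok' };
};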
Confirming that the problem was related to the Lambda function being in a VPC and not being granted public internet access through subnets, route tables, NAT, and internet gateways, as described in this post: https://gist.github.com/reggi/dc5f2620b7b4f515e68e46255ac042a7
I believe that to access external internet services you'd need what you described.
But to reach an AWS service without leaving the VPC, you can create a VPC endpoint:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cloudwatch-logs-and-interface-VPC.html
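A sketch of that endpoint in CDK terms, to match the CDK examples above (the construct ID and the vpc reference are assumptions):

import * as ec2 from 'aws-cdk-lib/aws-ec2';

// inside a Stack, where `vpc` is the VPC the Lambda runs in
new ec2.InterfaceVpcEndpoint(this, 'CloudWatchLogsEndpoint', {
  vpc,
  service: ec2.InterfaceVpcEndpointAwsService.CLOUDWATCH_LOGS,
});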
