How do you do async setup outside of a lambda? - aws-lambda

Config makes a call to the parameter store and returns a config object. I need to wait before initialising mysql.
const config = require('./config');
const mysql = require('serverless-mysql')(config);
exports.handler = (event, context) => {
// mysql stuff
}

I assume you need to wait for this to happen?
const mysql = require('serverless-mysql')(config)??
If so, then do this:
const config = require('./config');
async function mySQLStuff() {
try {
const mysql = await require('serverless-mysql')(config);
return mysql;
} catch (error) {
// handle error
}
}
exports.handler = (event, context) => {
mySQLStuff()
.then((data) => { /* mysql stuff */ });
};
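An alternative pattern commonly used for Lambda cold starts is to kick off the async setup at module load and await the resulting promise inside the handler, so the work happens once per container and warm invocations reuse it. A minimal sketch, where `loadConfig` and `initDb` are hypothetical stand-ins for the parameter-store call and the serverless-mysql initialisation:

```javascript
// Kick off async setup at module load; every invocation awaits the same promise.
// loadConfig/initDb are stand-ins for the parameter-store call and
// require('serverless-mysql')(config) initialisation.
async function loadConfig() {
  // e.g. fetch settings from the parameter store here
  return { host: 'localhost', user: 'app' };
}

async function initDb() {
  const config = await loadConfig();
  // e.g. const mysql = require('serverless-mysql')(config); return mysql;
  return { config, query: async (sql) => [] };
}

const dbPromise = initDb(); // started once per container (cold start)

// exports.handler = handler; in a real Lambda — named here so it can be called directly
const handler = async (event, context) => {
  const db = await dbPromise; // already resolved on warm invocations
  // mysql stuff, e.g. const rows = await db.query('SELECT 1');
  return { statusCode: 200 };
};
```

Because the promise is created outside the handler, concurrent invocations in the same container all await the same in-flight setup instead of repeating it.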

Related

Sequelize not closing connections when using AWS Lambdas

I'm having an issue where despite replicating the pattern for using Sequelize in AWS Lambdas as described in the docs here: https://sequelize.org/docs/v6/other-topics/aws-lambda/ - the connections are still kept for 10+ minutes even though they are idle.
Here's my Sequelize class, that lives inside the DB package in a private CodeArtifact repo.
const { Sequelize, DataTypes } = require("sequelize");
const fs = require('fs');
const path = require('path');
const basename = path.basename(__filename);
let modelsMap = new Map();
let sequelize = null;
module.exports = class SequelizeClient {
constructor() {}
static async getSequelize(configuration, options){
if(!sequelize){
console.log('load sequelize');
sequelize = await SequelizeClient.#loadSequelize(configuration, options)
} else {
sequelize.connectionManager.initPools();
// restore `getConnection()` if it has been overwritten by `close()`
if (sequelize.connectionManager.hasOwnProperty("getConnection")) {
delete sequelize.connectionManager.getConnection;
}
}
return sequelize;
}
static #initSequelize(options) {
return new Sequelize(
options.database,
options.user,
options.password,
{
host: options.host,
port: parseInt(options.port) || 5432,
logging: console.log,
dialect: "postgres",
pool: {
max: 2,
min: 0,
idle: 1000,
acquire: 3000,
evict: 1000
}
},
);
}
static async #loadSequelize(configuration) {
sequelize = SequelizeClient.#initSequelize(configuration);
await sequelize.authenticate();
return sequelize;
}
static async getModels(sequelize) {
const databaseName = sequelize.config.database;
let models = modelsMap.get(databaseName);
if(models){
return models;
}
models = {};
console.log('load models');
fs
.readdirSync(`${__dirname}/models`)
.filter((file) => file.indexOf('.') !== 0 && file !== basename && file.slice(-3) === '.js')
.forEach((file) => {
const model = require(path.join(`${__dirname}/models`, file))(sequelize, DataTypes);
const name = model.name.charAt(0).toUpperCase() + model.name.slice(1);
models[name] = model;
});
Object.keys(models).forEach((modelName) => {
if (models[modelName].associate) {
models[modelName].associate(models);
}
});
modelsMap.set(databaseName, models);
return models;
}
static async closeConnections(){
await sequelize.connectionManager.close();
}
}
And here's how I use this code inside a Lambda.
const { SequelizeClient } = require("@myPrivatePackage/database");
exports.handler = async (event) => {
try {
const sequelize = await SequelizeClient.getSequelize(someDbParams);
const { MyModel } = await SequelizeClient.getModels(sequelize);
return await MyModel.findOne(whereObj);
} finally {
SequelizeClient.closeConnections();
}
};
Obviously, I cleaned up the code to remove things that are not directly impacting the issue, such as specific queries, code that I'd like to avoid being public etc.
Based on the info I gathered online I'd assume that the connections would be closed once the query is done, but checking pgsql with this
SELECT pid, now() - query_start as duration, query
FROM pg_stat_activity
WHERE pid <> pg_backend_pid()
AND datname = 'db_name'
order by duration desc;
shows a lot of queries whose duration is in excess of 5-10 minutes. Eventually, after around 15 minutes, the connections go away.
Any idea as to what I'm doing wrong? Or missing? Or fundamentally not understanding when it comes to these concepts? I've been scratching my head for a few days now.

Deno oak websocket must have a method that returns an async iterator

I am trying to build a WebSocket server with oak (not the native Deno one).
The following code is how I build the server.
import {Application, Router, Context, send } from "https://deno.land/x/oak@v10.6.0/mod.ts";
const runWS = async (ctx: Context, next: () => Promise<unknown>) => {
try{
const ws = await ctx.upgrade();
ws.onopen = () => {
chatConnection(ws);
};
ws.onclose = () => { console.log('Disconnected from the client!');};
}catch{await next();}
}
let sockets = new Map<string, WebSocket>();
const chatConnection = async (ws: WebSocket) => {
console.log('new websocket, ws: ',ws);
const uid = globalThis.crypto.randomUUID();
sockets.set(uid,ws);
console.log('socket: ',sockets);
for await (const ev of ws){
console.log('ev: ', ev);
}
}
export const wsRoutes = new Router()
.get('/ws', runWS);
But in the for loop (at the end), it says for ws: Type 'WebSocket' must have a '[Symbol.asyncIterator]()' method that returns an async iterator. What's the deal with this, and how do I fix it?
The error message is providing you with useful information: the WebSocket is not AsyncIterable, which means that it cannot be used with a for await...of loop.
Here is the type documentation for WebSocket in Deno. It is (for the most part) the same as the WHATWG standard WebSocket that is documented on MDN.
If your intention is to respond to incoming message events, you'll need to attach an event listener:
webSocket.addEventListener("message", (messageEvent) => {
// Do something in response to each message event
});
Additional:
Here's an observation based on the code you've shown, but not in response to your question:
It's probably more ergonomic to store the sockets as the keys of your map, and the associated state data in the values. (This is the inverse of what you've shown). Here's an example of why:
import {
Router,
type RouterMiddleware,
} from "https://deno.land/x/oak@v10.6.0/mod.ts";
// You seem to want to log data to the console.
// This function will help you easily log only certain properties of objects:
/**
* Functional implementation of the type utility
* [`Pick<Type, Keys>`](https://www.typescriptlang.org/docs/handbook/utility-types.html#picktype-keys)
*/
function pick<T, K extends keyof T>(
obj: T,
keys: readonly K[],
): Pick<T, K> {
const result = {} as Pick<T, K>;
for (const key of keys) result[key] = obj[key];
return result;
}
type SocketData = { id: string };
const socketMap = new Map<WebSocket, SocketData>();
// Do something when a connection is opened
function handleOpen(ev: Event, ws: WebSocket) {
const socketData: SocketData = { id: window.crypto.randomUUID() };
socketMap.set(ws, socketData);
console.log({
event: pick(ev, ["type"]),
socketData,
});
}
// Do something when an error occurs
function handleError(ev: Event, ws: WebSocket) {
const socketData = socketMap.get(ws);
console.log({
event: pick(ev, ["type"]),
socketData,
});
socketMap.delete(ws);
}
// Do something when a connection is closed
function handleClose(ev: CloseEvent, ws: WebSocket) {
ev.code; // number
ev.reason; // string
ev.wasClean; // boolean
const socketData = socketMap.get(ws);
console.log({
event: pick(ev, ["type", "code", "reason", "wasClean"]),
socketData,
});
socketMap.delete(ws);
}
// Do something when a message is received
// Change `unknown` to the type of message payloads used in your application.
// (for example, JSON messages are `string`)
function handleMessage(ev: MessageEvent<unknown>, ws: WebSocket) {
ev.data; // unknown
ev.lastEventId; // string
ev.ports; // readonly MessagePort[]
const socketData = socketMap.get(ws);
if (socketData) {
socketData.id; // string
}
console.log({
event: pick(ev, ["type", "data", "lastEventId", "ports"]),
socketData,
});
}
const webSocketMiddleware: RouterMiddleware<"/ws"> = async (ctx, next) => {
const ws = ctx.upgrade();
ws.addEventListener("open", (ev) => handleOpen(ev, ws));
ws.addEventListener("error", (ev) => handleError(ev, ws));
ws.addEventListener("close", (ev) => handleClose(ev, ws));
ws.addEventListener("message", (ev) => handleMessage(ev, ws));
await next();
};
export const router = new Router();
router.get("/ws", webSocketMiddleware);
This is my updated code. It avoids the problem entirely.
import {Application, Router, Context, send } from "https://deno.land/x/oak@v10.6.0/mod.ts";
interface BroadcastObj{
name: string,
mssg: string
}
const runWS = async (ctx: Context, next: () => Promise<unknown>) => {
if(!ctx.isUpgradable){
ctx.throw(501);
}
const uid = globalThis.crypto.randomUUID();
try{
const ws = await ctx.upgrade();
ws.onopen = () => {
chatConnection(ws, uid);
};
ws.onmessage = (m) => {
let mssg = m.data as string;
if(typeof(mssg) === 'string'){
chatMessage(JSON.parse(mssg));
}
};
ws.onerror = (e) => {console.log('error occured: ', e);};
ws.onclose = () => { chatDisconnect(uid);};
}catch{await next();}
}
let sockets = new Map<string, WebSocket>();
const chatConnection = (ws: WebSocket, uid: string) => {
sockets.set(uid, ws);
}
const chatMessage = (msg: BroadcastObj) => {
sockets.forEach((ws: WebSocket) => {
ws.send(JSON.stringify(msg));
});
}
const chatDisconnect = (uid: string) => {
sockets.delete(uid);
}
export const wsRoutes = new Router()
.get('/ws', runWS);

Umzug Migration UP with Aws Lambda function not working

I have built a test app using NestJS + Sequelize ORM + a Docker database (local for now). As per the documentation, I am using the umzug library with an AWS Lambda SAM template and triggering the Lambda handler. Connection pooling is implemented to reuse the existing Sequelize connection. Below is the lambdaEntry.ts file where I trigger the umzug.up() function. It runs, but it is not migrating the files.
When run from the command prompt with node migrate up, it works correctly. I am testing it using the sam invoke command.
require('ts-node/register');
import { Server } from 'http';
import { NestFactory } from '@nestjs/core';
import { Context } from 'aws-lambda';
import * as serverlessExpress from 'aws-serverless-express';
import * as express from 'express';
import { ExpressAdapter } from '@nestjs/platform-express';
import { eventContext } from 'aws-serverless-express/middleware';
import { AppModule } from './app.module';
import sharedBootstrap from './sharedBootstrap';
const { Sequelize } = require('sequelize');
const { Umzug, SequelizeStorage } = require('umzug');
import configuration from '.././config/config';
const fs = require('fs');
let lambdaProxy: Server;
let sequelize = null;
async function bootstrap() {
const expressServer = express();
const nestApp = await NestFactory.create(
AppModule,
new ExpressAdapter(expressServer),
);
nestApp.use(eventContext());
sharedBootstrap(nestApp);
await nestApp.init();
return serverlessExpress.createServer(expressServer);
}
export const handler = (event: any, context: Context) => {
if (!lambdaProxy) {
bootstrap().then((server) => {
lambdaProxy = server;
serverlessExpress.proxy(lambdaProxy, event, context);
(async () => {
if (!sequelize) {
console.log('New connection::');
sequelize = await loadSequelize();
} else {
sequelize.connectionManager.initPools();
if (sequelize.connectionManager.hasOwnProperty('getConnection')) {
delete sequelize.connectionManager.getConnection;
}
}
try {
console.log('MIGRATOR::');
const umzug = new Umzug({
migrations: { glob: 'src/migrations/*.ts' },
context: sequelize.getQueryInterface(),
storage: new SequelizeStorage({ sequelize }),
logger: console,
});
await umzug
.pending()
.then((migrations: any) => {
console.log('pending ? : ', JSON.stringify(migrations));
//test for file exists.
for (const migration of migrations) {
try {
if (fs.existsSync(migration.path)) {
console.log('file exists');
}
} catch (err) {
console.log('file does not exists');
console.error(err);
}
}
async () => {
//BELOW FUNCTION IS TRIGGERING BUT NOT GETTING MIGRATION LOADED.
await umzug.up();
};
})
.catch((e: any) => console.log('error2 ? ', e));
} finally {
await sequelize.connectionManager.close();
}
})();
});
} else {
serverlessExpress.proxy(lambdaProxy, event, context);
}
};
async function loadSequelize() {
const sequelize = new Sequelize(
configuration.database,
configuration.username,
configuration.password,
{
dialect: 'mysql',
host: configuration.host,
port: Number(configuration.port),
pool: {
max: 2,
min: 0,
idle: 0,
acquire: 3000,
evict: 600,
},
},
);
await sequelize.authenticate();
return sequelize;
}
I was able to solve the issue after a lot of tries. I separated out the Sequelize connection code, called it from the app side, and triggered it from lambdaEntry.
lambdaEntry.js file:
async function bootstrap(uuid = null) {
console.log('Calling bootstrap');
const expressServer = express();
const nestApp = await NestFactory.create(
AppModule,
new ExpressAdapter(expressServer),
);
nestApp.use(eventContext());
sharedBootstrap(nestApp);
await nestApp.init();
try {
// Write a function in a service (e.g. PurchaseListService) and trigger umzug up from there.
const migrateResult1 = await nestApp.get(PurchaseListService).migrate('down');
console.log(migrateResult1);
const migrateResult2 = await nestApp.get(PurchaseListService).migrate('up');
console.log(migrateResult2);
} catch (err) {
throw err;
}
return serverlessExpress.createServer(expressServer);
}
export const handler = (event: any, context: Context) => {
if (!lambdaProxy) {
bootstrap().then((server) => {
lambdaProxy = server;
serverlessExpress.proxy(lambdaProxy, event, context);
});
} else {
serverlessExpress.proxy(lambdaProxy, event, context);
}
};
/code/src/purchaselist/purchaselist.service.ts
async migrate(id: string): Promise<any> {
console.log('migrate script triggered', id);
const sequelize = PurchaseListItem.sequelize;
const umzug = new Umzug({
migrations: { glob: 'src/migrations/*.{ts,js}' },
context: sequelize.getQueryInterface(),
storage: new SequelizeStorage({ sequelize }),
logger: console,
});
let consoleDisplay = 'Umzug LOGS:::<br/>';
switch (id) {
default:
case 'up':
await umzug.up().then(function (migrations) {
console.log('Umzug Migrations UP::<br/>', migrations);
consoleDisplay +=
'Umzug Migrations UP::<br/>' + JSON.stringify(migrations);
});
break;
}
return consoleDisplay;
}

Dataloader is not working as expected in graphql resolver

I have a DataLoader alongside my GraphQL resolvers, like the following:
async function testDataLoader(accountNumber, req, args) {
const dummy = new DataLoader(async accountNumber => {
return new Promise(async (resolve, reject) => {
// rest call
return resolve([<rest result>])
});
});
return dummy.load(accountNumber)
}
export default {
Friends: {
query1: async ({req, args}) => {
const data = await testDataLoader(["12121"], req, args);
// do something with data
},
query2: async ({req, args}) => {
const data = await testDataLoader(["12121"], req, args);
// do something with data
}
}
};
When we query like:
Friends {
query1
query2
}
I expect DataLoader to call my REST service only once. However, I can see that my REST service is called twice. I'm not sure where I'm making a mistake.
The issue is that every time you're calling testDataLoader, you're creating a new instance of DataLoader. You should create a single DataLoader instance per request (per resource you're loading). This way every time you call load you're interacting with the same cache.
You could do something like:
const dummy = new DataLoader(...);
async function testDataLoader(accountNumber) {
return dummy.load(accountNumber)
}
But this would persist the DataLoader between requests, which you don't want to do. What you should do is create the DataLoader instance as part of your context, which is recreated each time a request is executed.
const context = async ({ req }) => {
return {
testLoader: new DataLoader(...),
};
};
const server = new ApolloServer({
...
context,
})
Then just call your loader directly inside your resolver:
query2: async (parent, args, context) => {
const data = await context.testLoader.load(["12121"]);
...
}
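To make the "called once" behaviour concrete without pulling in the library, here is a toy loader that queues keys and flushes them once per microtask. This is a simplified sketch of the batching that DataLoader performs, not its actual implementation (it omits caching and error handling):

```javascript
// Toy batching loader: collects all keys loaded in the same tick and
// resolves them with ONE call to the batch function, like DataLoader does.
function makeLoader(batchFn) {
  let queue = [];
  return {
    load(key) {
      return new Promise((resolve) => {
        queue.push({ key, resolve });
        if (queue.length === 1) {
          // Flush on the next microtask, after all synchronous loads are queued.
          queueMicrotask(async () => {
            const batch = queue;
            queue = [];
            const results = await batchFn(batch.map((q) => q.key));
            batch.forEach((q, i) => q.resolve(results[i]));
          });
        }
      });
    },
  };
}

let restCalls = 0;
const accountLoader = makeLoader(async (keys) => {
  restCalls += 1; // stands in for one REST request covering all keys
  return keys.map((k) => ({ account: k }));
});

// Two resolvers loading in the same tick share a single batch call,
// which is why query1 and query2 hit the REST service only once.
Promise.all([accountLoader.load('12121'), accountLoader.load('12121')])
  .then(([a, b]) => console.log(restCalls, a.account, b.account));
```

This is also why the loader must be shared: if each resolver builds its own loader (as in the question), each one has its own queue and flushes separately.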

Await and Async callbacks hell

I want to make the UserDataGenerator class work like a traditional SYNC class.
My expectation is that userData.outputStructure can give me the data prepared.
let userData = new UserDataGenerator(dslContent)
userData.outputStructure
getFieldDescribe(this.inputStructure.tableName, field) is an ASYNC call which invokes Axios.get.
Below is my current progress, but it still isn't waiting for the data to be ready when I print out userData.outputStructure:
export default class UserDataGenerator {
inputStructure = null;
outputStructure = null;
fieldDescribeRecords = [];
constructor(dslContent) {
this.outputStructure = Object.assign({}, dslContent, initSections)
this.process()
}
async process() {
await this.processSectionList()
return this.outputStructure
}
async processSectionList() {
await this.inputStructure.sections.map(section => {
this.outputStructure.sections.push(this.processSection(section));
})
}
async processSection(section) {
let outputSection = {
name: null,
fields: []
}
let outputFields = await section.fields.map(async(inputField) => {
return await this._processField(inputField).catch(e => {
throw new SchemaError(e, this.inputStructure.tableName, inputField)
})
})
outputSection.fields.push(outputFields)
return outputSection
}
async _processField(field) {
let resp = await ai
switch (typeof field) {
case 'string':
let normalizedDescribe = getNormalizedFieldDescribe(resp.data)
return new FieldGenerator(normalizedDescribe, field).outputFieldStructure
}
}
You're trying to await arrays, which doesn't work as you expect. When dealing with arrays of promises, you still need to use Promise.all before you can await it - just like you cannot chain .then on the array.
So your methods should look like this:
async processSectionList() {
const sections = await Promise.all(this.inputStructure.sections.map(section =>
this.processSection(section)
));
this.outputStructure.sections.push(...sections);
}
async processSection(section) {
return {
name: null,
fields: [await Promise.all(section.fields.map(inputField =>
this._processField(inputField).catch(e => {
throw new SchemaError(e, this.inputStructure.tableName, inputField)
})
))]
};
}
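The point about awaiting arrays is easy to demonstrate in isolation: await on an array of promises returns the same array of pending promises, whereas Promise.all resolves them to values. A minimal sketch:

```javascript
// `await` on a non-promise value (such as an array) returns it unchanged,
// even if the array contains promises; Promise.all is what resolves them.
async function demo() {
  const promises = [1, 2, 3].map(async (n) => n * 2);

  const awaited = await promises; // still the same array of Promise objects
  console.log(awaited[0] instanceof Promise); // true

  const resolved = await Promise.all(promises); // the actual values
  console.log(resolved); // [ 2, 4, 6 ]
}
demo();
```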
