Import DynamoDB Table - YML - aws-lambda

I have a serverless.yml file for a Node project on AWS. I also have a DynamoDB table defined in another file. I can deploy the project with no issues; however, VS Code shows a red error on my import line: Incorrect type expected...
serverless.yml
service: auction-service

plugins:
  - serverless-bundle
  - serverless-pseudo-parameters

provider:
  name: aws
  runtime: nodejs12.x
  memorySize: 256
  stage: ${opt:stage, 'dev'}
  region: eu-west-1
  environment:
    AUCTIONS_TABLE_NAME: ${self:custom.AuctionsTable.name}
  iam:
    role:
      statements:
        - ${file(iam/AuctionsTableIAM.yml):AuctionsTableIAM}

resources:
  Resources:
    AuctionsTable: ${file(resources/AuctionsTable.yml):AuctionsTable}

functions:
  createAuction:
    handler: src/handlers/createAuction.handler
    events:
      - http:
          method: POST
          path: /auction

custom:
  AuctionsTable:
    name: !Ref AuctionsTable
    arn: !GetAtt AuctionsTable.Arn
  bundle:
    linting: false
AuctionsTable.yml:
AuctionsTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: AuctionsTable--${self:provider.stage}
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S
    KeySchema:
      - AttributeName: id
        KeyType: HASH

Try This
resources:
  Resources:
    - ${file(resources/AuctionsTable.yml):AuctionsTable}
Note, you will need to add the name AuctionsTable to the imported file.
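In other words, `${file(resources/AuctionsTable.yml):AuctionsTable}` selects the mapping under a top-level `AuctionsTable` key, so the imported file must start with that key. A minimal sketch:

```yaml
# resources/AuctionsTable.yml
AuctionsTable:                  # the key that ${file(...):AuctionsTable} selects
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: AuctionsTable--${self:provider.stage}
```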

Related

Lambda(Serverless Framework) + DynamoDB Accelerator(DAX): App can not connect

I created my app using AWS (API Gateway/Lambda/DynamoDB), but the loading speed of the app is too slow, so I want to improve it using DynamoDB DAX (a cache).
serverless.yml
service: myapp
frameworkVersion: '2.31'

provider:
  name: aws
  runtime: nodejs14.x
  lambdaHashingVersion: 20201221
  stage: $opt:stage
  region: us-east-1
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dax:*
      Resource: '*'
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: '*'
  vpc:
    securityGroupIds:
      - !GetAtt daxSecurityGroup.GroupId
    subnetIds:
      - !Ref daxSubnet

functions:
  graphql:
    handler: ./build/src/app.handler
    events:
      - http:
          path: graphql
          method: ANY
    environment:
      DAX_ENDPOINT: !GetAtt daxCluster.ClusterDiscoveryEndpoint
    vpc:
      securityGroupIds:
        - !GetAtt daxSecurityGroup.GroupId
      subnetIds:
        - !Ref daxSubnet

resources:
  Resources:
    daxCluster:
      Type: AWS::DAX::Cluster
      Properties:
        ClusterName: dax-cluster
        IAMRoleARN: !GetAtt daxRole.Arn
        NodeType: dax.t2.small
        ReplicationFactor: 1
        SecurityGroupIds:
          - !GetAtt daxSecurityGroup.GroupId
        SubnetGroupName: !Ref daxSubnetGroup
    daxRole:
      Type: AWS::IAM::Role
      Properties:
        AssumeRolePolicyDocument:
          Statement:
            - Action:
                - sts:AssumeRole
              Effect: Allow
              Principal:
                Service:
                  - dax.amazonaws.com
          Version: '2012-10-17'
        RoleName: dax-role
        ManagedPolicyArns:
          - arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
    daxSecurityGroup:
      Type: AWS::EC2::SecurityGroup
      Properties:
        GroupDescription: Security Group for Dax
        GroupName: dax-security-group
        VpcId: !Ref daxVpc
    daxSecurityGroupIngress:
      Type: AWS::EC2::SecurityGroupIngress
      DependsOn: daxSecurityGroup
      Properties:
        GroupId: !GetAtt daxSecurityGroup.GroupId
        IpProtocol: tcp
        FromPort: 8111
        ToPort: 8111
        SourceSecurityGroupId: !GetAtt daxSecurityGroup.GroupId
    daxVpc:
      Type: AWS::EC2::VPC
      Properties:
        CidrBlock: 10.0.0.0/16
        EnableDnsHostnames: true
        EnableDnsSupport: true
        InstanceTenancy: default
        Tags:
          - Key: Name
            Value: dax-cluster
    daxSubnet:
      Type: AWS::EC2::Subnet
      Properties:
        AvailabilityZone:
          Fn::Select:
            - 0
            - Fn::GetAZs: ''
        CidrBlock: 10.0.0.0/20
        Tags:
          - Key: Name
            Value: dax-cluster
        VpcId: !Ref daxVpc
    daxSubnetGroup:
      Type: AWS::DAX::SubnetGroup
      Properties:
        Description: Subnet group for DAX
        SubnetGroupName: dax-subnet-group
        SubnetIds:
          - !Ref daxSubnet
    MyTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.stage}_table
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        PointInTimeRecoverySpecification:
          PointInTimeRecoveryEnabled: true
app.ts
import { DynamoDB } from "aws-sdk";
import AmazonDaxClient from "amazon-dax-client";

export default async (args: any) => {
  ...
  const endpoint = process.env.DAX_ENDPOINT as string;
  const config = { ... };
  const dax: any = new AmazonDaxClient({ endpoints: [endpoint], region: "us-east-1" });
  const dynamodb = new DynamoDB.DocumentClient({ ...config, service: dax });
  await dynamodb
    .transactWrite({
      TransactItems: [
        {
          Delete: {
            TableName: "development_table",
            Key: {
              id: args.id,
              createdAt: createdAt,
            },
          },
        },
      ],
    })
    .promise();
  return ( ... );
};
My app can not connect to API Gateway (the request times out).
Please help me.
I tried...
serverless.yml
...
functions:
  graphql:
    handler: ./build/src/app.handler
    events:
      - httpApi:
          path: /{id+}
          method: GET
      - httpApi:
          path: /
          method: POST
...
But the app still can not connect.
I referred to an AWS blog.
package.json
"@types/amazon-dax-client": "^1.2.3",
"amazon-dax-client": "^1.2.9",

Unable to interpolate values in Serverless Framework for AWS

I'm trying to organize this serverless YML file, but I'm getting failures. I've scanned their docs to understand how to interpolate via file substitution, and I just can't figure it out:
I've read all these:
https://www.serverless.com/framework/docs/providers/aws/guide/variables#reference-properties-in-other-files
https://forum.serverless.com/t/split-up-include-reference-serverless-yml-file/3747
https://github.com/AnomalyInnovations/serverless-stack-demo-api/blob/master/serverless.yml#L128-L137
ERRORS
Serverless Error ----------------------------------------
The CloudFormation template is invalid: [/Resources/IamRoleLambdaExecution/Type/Policies/0/PolicyDocument/Statement/2] 'null' values are not allowed in templates
Serverless.yaml
service: newsletter
frameworkVersion: '2'

plugins:
  - serverless-bundle

provider:
  name: aws
  runtime: nodejs12.x
  memorySize: 256
  stage: ${opt:stage, 'dev'}
  region: us-west-2
  iamRoleStatements:
    - ${file(resources/iam/UsersSubscriptionTable.yml):UsersSubscriptionTableIAM}

resources:
  - ${file(/resources/dynamo/UsersSubscriptionTable.yml):UsersSubscriptionTable}

functions:
  createEmailEntry:
    handler: src/Email.addUser
    events:
      - http:
          method: POST
          path: /subscribe
  removeEmailEntry:
    handler: src/Email.removeUser
    events:
      - http:
          method: GET
          path: /unsubscribe

# Not recommended for production-use
custom:
  bundle:
    linting: false
resources/iam/UsersSubscriptionTable.yml
Resources:
  UsersSubscriptionTableIAM:
    Effect: Allow
    Action:
      - dynamodb:PutItem
      - dynamodb:GetItem
    Resource:
      - arn:aws:dynamodb:${aws:region}:${aws:accountId}:table/MyCoolTable
resources/dynamo/UsersSubscriptionTable.yml
Resources:
  UsersSubscriptionTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: MyCoolTable
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: email
          AttributeType: S
      KeySchema:
        - AttributeName: email
          KeyType: HASH
This was a huge pain to figure out:
provider:
  name: aws
  runtime: nodejs12.x
  memorySize: 256
  stage: ${opt:stage, 'dev'}
  region: us-west-2
  iam:
    role:
      statements:
        - ${file(resources/iam/UsersSubscriptionTable.yml):UsersSubscriptionTableIAM}

resources:
  Resources:
    UsersSubscriptionTable: ${file(resources/dynamo/UsersSubscriptionTable.yml):UsersSubscriptionTable}
Resources shouldn't be an array in this case.
In addition, the imported files shouldn't have a top-level Resources key either:
UsersSubscriptionTableIAM:
  Effect: Allow
  Action:
    - dynamodb:PutItem
    - dynamodb:GetItem
  Resource:
    - arn:aws:dynamodb:${aws:region}:${aws:accountId}:table/MyCoolTable

UsersSubscriptionTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: MyCoolTable
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: email
        AttributeType: S
    # Specifies the email as the partition key (primary key)
    KeySchema:
      - AttributeName: email
        KeyType: HASH

AWS Automation Document not updating Lambda Alias

I've created an Automation Document using CloudFormation to update the live alias for a given function. It runs OK without any errors and I'm not seeing anything in CloudTrail. But when I check which version is set to alias:live, it is left unchanged.
template.yml
AWSTemplateFormatVersion: "2010-09-09"
Description: "AWS CloudFormation Template for Response Plans"
Parameters:
  Environment:
    Type: String
    Default: "sandbox"
  Domain:
    Type: String
  Team:
    Type: String
  NotificationARN:
    Type: AWS::SSM::Parameter::Value<String>
Resources:
  ResponsePlan:
    Type: AWS::SSMIncidents::ResponsePlan
    Properties:
      Actions:
        - SsmAutomation:
            RoleArn: !GetAtt Role.Arn
            DocumentName: UpdateAliasDocument
      DisplayName: "UpdateLambdaAlias"
      IncidentTemplate:
        Impact: 3
        NotificationTargets:
          - SnsTopicArn:
              Ref: NotificationARN
        Summary: "String"
        Title: "String"
      Name: "UpdateLambdaAlias"
      Tags:
        - Key: "Team"
          Value: !Ref Team
        - Key: "Domain"
          Value: !Ref Domain
        - Key: "Environment"
          Value: !Ref Environment
  Document:
    Type: AWS::SSM::Document
    Properties:
      Content:
        schemaVersion: "2.2"
        parameters:
          FunctionVersion:
            type: "String"
            default: "1"
          FunctionName:
            type: "String"
        mainSteps:
          - name: "UpdateLambdaAlias"
            action: aws:runShellScript
            inputs:
              runCommand:
                - aws lambda update-alias --function-name {{FunctionName}} --name live --function-version {{FunctionVersion}}
      DocumentType: "Command"
      TargetType: /
      Tags:
        - Key: "Team"
          Value: !Ref Team
  Role:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action: sts:AssumeRole
      Path: /
      Policies:
        - PolicyName: EC2Instances
          PolicyDocument:
            Statement:
              - Effect: Allow
                Action:
                  - ec2:*
                Resource:
                  - !Sub arn:${AWS::Partition}:ec2:${AWS::Region}:${AWS::AccountId}:instance/*
        - PolicyName: UpdateAliasPolicy
          PolicyDocument:
            Statement:
              - Effect: Allow
                Action:
                  - lambda:UpdateFunctionConfiguration
                Resource:
                  - !Sub arn:${AWS::Partition}:lambda:${AWS::Region}:${AWS::AccountId}:function:${Environment}-*
  Instance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0c2b8ca1dad447f8a
      InstanceType: t2.micro
      Monitoring: true
      Tags:
        - Key: "Team"
          Value: !Ref Team
Update
It looks like no target is being found to run the script on.
It looks like your Role entry does not have the required permissions to execute the update-alias command. Your policy only allows for lambda:UpdateFunctionConfiguration.
You will at least need the lambda:UpdateAlias permission as well. If this is not enough, you could try being very permissive with your role and then reducing the permissions afterwards.
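As a sketch, the adjusted UpdateAliasPolicy statement from the template above would gain only the extra action:

```yaml
- PolicyName: UpdateAliasPolicy
  PolicyDocument:
    Statement:
      - Effect: Allow
        Action:
          - lambda:UpdateFunctionConfiguration
          - lambda:UpdateAlias  # required by `aws lambda update-alias`
        Resource:
          - !Sub arn:${AWS::Partition}:lambda:${AWS::Region}:${AWS::AccountId}:function:${Environment}-*
```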

Serverless wsgi unable to import werkzeug

I'm having issues deploying my serverless application to AWS. In AWS the logs show:
Unable to import module 'wsgi_handler': No module named 'werkzeug'
I have explicitly specified werkzeug in my requirements.txt but it seems that when I run sls deploy the packages specified are not being put inside the zip file that is uploaded to my S3 bucket.
Below is a copy of my serverless.yml file:
service: serverless-flask

plugins:
  - serverless-python-requirements
  - serverless-wsgi
  - serverless-dynamodb-local

custom:
  tableName: 'transactions-table-${self:provider.stage}'
  wsgi:
    app: app.app # entrypoint is app.app, i.e. the app object in the app.py module
    packRequirements: false
  pythonRequirements:
    dockerizePip: true
  dynamodb:
    stages:
      - test
      - dev
    start:
      migrate: true

provider:
  name: aws
  runtime: python3.6
  stage: dev
  region: us-east-1
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
      Resource:
        - { "Fn::GetAtt": ["TransactionsDynamoDBTable", "Arn"] }
  environment:
    TRANSACTIONS_TABLE: ${self:custom.tableName}

functions:
  app:
    handler: wsgi_handler.handler
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'

resources:
  Resources:
    TransactionsDynamoDBTable:
      Type: 'AWS::DynamoDB::Table'
      Properties:
        AttributeDefinitions:
          - AttributeName: transactionId
            AttributeType: S
          - AttributeName: timestamp
            AttributeType: S
        KeySchema:
          - AttributeName: transactionId
            KeyType: HASH
          - AttributeName: timestamp
            KeyType: RANGE
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        TableName: ${self:custom.tableName}
Here is my requirements.txt:
boto3==1.11.17
botocore==1.14.17
Click==7.0
docutils==0.15.2
Flask==1.1.1
itsdangerous==1.1.0
Jinja2==2.11.1
jmespath==0.9.4
MarkupSafe==1.1.1
python-dateutil==2.8.1
s3transfer==0.3.3
six==1.14.0
urllib3==1.25.8
Werkzeug==1.0.0
As far as I know, using the serverless-wsgi plugin should handle the installation of this package automatically, but I'm seeing no .requirements folder being created in the .serverless folder or in the zip file that serverless creates.
The requirements.txt file contained inside the zip file only contains the following:
-i https://pypi.org/simple
I'm not sure what I'm doing wrong but the only solution so far has been to tear down the infrastructure and redeploy with a new url which isn't ideal.
Adding a reference to the lambda layer did the trick for me (see the layers section):
api:
  timeout: 30
  handler: wsgi_handler.handler
  layers:
    - { Ref: PythonRequirementsLambdaLayer }
  events:
    - http: ANY /
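For that Ref to resolve, serverless-python-requirements must actually be building a layer, which is controlled by the plugin's layer option. A sketch of the relevant custom section, assuming the question's existing settings:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    layer: true  # package the requirements as a Lambda layer (PythonRequirementsLambdaLayer)
```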
You need to add your files manually to the package.
In your serverless.yml, add this:
package:
  individually: true
  exclude:
    - ./**
  include:
    - requirements.txt
    - <other files>
Once you deploy, you can download the packaged zip from AWS and verify if your files are there.

Serverless Framework - Setting up resource permissions for dynamodb

I have the following serverless.yml file.
I'm trying to assign read/write permissions to the generated DynamoDB table.
So far it creates my Lambda and the DynamoDB table, but the Lambda isn't granted permission to access the table. I get no errors, yet the permission never seems to be attached.
Can anyone shed any light, please?
service:
  name: catcam

custom:
  stage: ${opt:stage, self:provider.stage}
  tableName: ${self:custom.stage}-notes
  environment:
    tableName: ${self:custom.tableName}

plugins:
  - '@hewmen/serverless-plugin-typescript'
  - serverless-plugin-optimize
  - serverless-offline

provider:
  name: aws
  runtime: nodejs12.x
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource:
        - { "Fn::GetAtt": ["NotesTable", "Arn"] }
        # - { !GetAtt NotesTable.Arn }

functions:
  main: # The name of the lambda function
    # The module 'handler' is exported in the file 'src/lambda'
    handler: src/lambda.handler
    events:
      - http:
          method: any
          path: /{any+}

resources:
  Resources:
    NotesTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:custom.tableName}
        AttributeDefinitions:
          - AttributeName: userId
            AttributeType: S
          - AttributeName: noteId
            AttributeType: S
        KeySchema:
          - AttributeName: userId
            KeyType: HASH
          - AttributeName: noteId
            KeyType: RANGE
        # On-demand billing; no capacity to manage
        BillingMode: PAY_PER_REQUEST
Turns out there was nothing wrong with the above; it's correct! It was me being a banana and not matching the full, stage-qualified table name in the application, i.e. the notes table becomes dev-notes for the dev stage. Maybe the above will help someone.
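On the application side, the fix amounts to deriving the table name the same way the template does. A small sketch, where buildTableName and resolveTableName are hypothetical helper names and the env var names mirror the environment block above:

```typescript
// Mirrors the serverless.yml interpolation `${self:custom.stage}-notes`
// so the application and the template agree on the table name.
function buildTableName(stage: string, base: string = "notes"): string {
  return `${stage}-${base}`;
}

// Prefer the value injected via the `environment` block; fall back to
// deriving it from a stage variable. STAGE is an assumed variable name.
function resolveTableName(env: Record<string, string | undefined>): string {
  return env.tableName ?? buildTableName(env.STAGE ?? "dev");
}
```

Either way, the handler never hard-codes "notes" alone, which is what caused the mismatch.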
