Filter Criteria in Lambda Function - aws-lambda

I want to enable DynamoDB Streams on my Lambda using the AWS CDK, which I am able to do, but I also want to enable filter criteria on the Lambda.
I am getting this error:
Invalid filter pattern definition. (Service: AWSLambda; Status Code: 400; Error Code: InvalidParameterValueException)
This is the event I am getting from DynamoDB streams:
{
  "input": {
    "Records": [
      {
        "eventID": "e92e0072a661a06df0e62e411f",
        "eventName": "INSERT",
        "eventVersion": "1.1",
        "eventSource": "aws:dynamodb",
        "awsRegion": "<region>",
        "dynamodb": {
          "ApproximateCreationDateTime": 1639500357,
          "Keys": {
            "service": { "S": "service" },
            "key": { "S": "key" }
          },
          "NewImage": {
            "service": { "S": "service" },
            "channel": { "S": "email" },
            "key": { "S": "key" }
          },
          "SequenceNumber": "711500000000015864417",
          "SizeBytes": 168,
          "StreamViewType": "NEW_IMAGE"
        },
        "eventSourceARN": "arn:aws:dynamodb:<region>:<account>:table/table-name/stream/2021-12-14T13:00:29.888"
      }
    ]
  },
  "env": {
    "lambdaContext": {
      "callbackWaitsForEmptyEventLoop": true,
      "functionVersion": "$LATEST",
      "functionName": "functionName",
      "memoryLimitInMB": "128",
      "logGroupName": "/aws/lambda/functionName",
      "logStreamName": "2021/12/14/[$LATEST]028531c7b489b8ec69bace700acc0",
      "invokedFunctionArn": "arn:aws:lambda:<region>:<account>:function:functionName",
      "awsRequestId": "c72e80252-4722-b9f0-a03b7f8b820e"
    },
    "region": "<region-name>"
  }
}
The event source mapping code is:
const mapping = new lambda.CfnEventSourceMapping(this, 'event', {
  functionName: "functionName",
  batchSize: 1,
  bisectBatchOnFunctionError: true,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  eventSourceArn: <stream-arn>,
  filterCriteria: filter,
});
I want the eventName to be INSERT and the channel to be email here. What should the value of the filter criteria be? It's not working for me.

<Edit> CDK filter helpers added in v2.42.0
The original workaround is no longer necessary. The CDK now has event-source filters for Lambda, Kinesis and SQS. Pass the filter to the L2 EventSourceMapping construct:
const source: EventSourceMapping = new lambda.EventSourceMapping(this, "EventSourceMapping", {
  target: func,
  eventSourceArn: table.tableStreamArn,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  filters: [
    lambda.FilterCriteria.filter({
      eventName: lambda.FilterRule.isEqual("INSERT"),
      dynamodb: { NewImage: { channel: { S: lambda.FilterRule.isEqual("email") } } },
    }),
  ],
});
</Edit>
Here's the DynamoDB streams filter Pattern syntax for new records with a channel of email:
`{ \"eventName\": [\"INSERT\"], \"dynamodb\": { \"NewImage\": {\"channel\": { \"S\" : [\"email\"]}} } }`
In other words, the Pattern is a stringified JSON filter rule with escaped quotes. The pattern is applied against each stream record.
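If hand-escaping the quotes is error-prone, the same pattern string can be built with JSON.stringify (a small sketch mirroring the rule above):
// Produces the exact pattern string above without manual escaping:
const pattern = JSON.stringify({
  eventName: ['INSERT'],
  dynamodb: { NewImage: { channel: { S: ['email'] } } },
});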
Here is the full CDK syntax. The code starts with the usual L2 EventSourceMapping. It then uses escape hatch syntax to set FilterCriteria on the underlying L1 CfnEventSourceMapping:
// start with the L2 type - Note: the OP code starts with an L1 `CfnEventSourceMapping`
const source: EventSourceMapping = new lambda.EventSourceMapping(this, 'EventSourceMapping', {
  target: func,
  eventSourceArn: table.tableStreamArn,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
});
// escape hatch - get an L1 reference
const cfnSource = source.node.defaultChild as lambda.CfnEventSourceMapping;
cfnSource.addPropertyOverride('FilterCriteria', {
  Filters: [
    {
      Pattern: `{ \"eventName\": [\"INSERT\"], \"dynamodb\": { \"NewImage\": {\"channel\": { \"S\" : [\"email\"]}} } }`,
    },
  ],
});

Related

DynamoDB streams filter with nested fields not working

I have a Lambda hooked up to my DynamoDB stream. It is configured to trigger if both criteria are met:
eventName = "MODIFY"
status > 10
My filter looks as follows:
{"eventName": ["MODIFY"], "dynamodb": {"NewImage": {"status": [{"numeric": [">", 10]}]}}}
If the filter is configured to only trigger when the event name is MODIFY, it works; however, anything more complicated than that does not trigger my Lambda. The event looks as follows:
{
  "eventID": "ba1cff0bb53fbd7605b7773fdb4320a8",
  "eventName": "MODIFY",
  "eventVersion": "1.1",
  "eventSource": "aws:dynamodb",
  "awsRegion": "us-east-1",
  "dynamodb": {
    "ApproximateCreationDateTime": 1643637766,
    "Keys": {
      "org": { "S": "test" },
      "id": { "S": "61f7ebff17afad170f98e046" }
    },
    "NewImage": {
      "status": { "N": "20" }
    }
  }
}
Testing with the test_event_pattern endpoint confirms the filter is valid:
filter = {
    "eventName": ["MODIFY"],
    "dynamodb": {
        "NewImage": {
            "status": [{"numeric": [">", 10]}]
        }
    }
}
response = client.test_event_pattern(
    EventPattern=json.dumps(filter),
    Event="{\"id\": \"e00c66cb-fe7a-4fcc-81ad-58eb60f5d96b\", \"eventName\": \"MODIFY\", \"dynamodb\": {\"NewImage\":{\"status\": 20}}, \"detail-type\": \"myDetailType\", \"source\": \"com.mycompany.myapp\", \"account\": \"123456789012\", \"time\": \"2016-01-10T01:29:23Z\", \"region\": \"us-east-1\"}"
)
print(response)
# {'Result': True, 'ResponseMetadata': {'RequestId': ...}}
Is there something that I'm overlooking? Do DynamoDB filters not work on the actual new image?
You probably already found this out yourself, but for anyone else: the filter is missing the DynamoDB-JSON-specific type key ("N") on the numeric leaf:
{
  "eventName": ["MODIFY"],
  "dynamodb": {
    "NewImage": {
      "status": { "N": [{ "numeric": [">", 10] }] }
    }
  }
}
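For reference, the same corrected rule can also be written with the CDK filter helpers shown earlier on this page (a sketch; func and table are assumed to be an existing Function and a Table with a stream enabled):
new lambda.EventSourceMapping(this, 'Mapping', {
  target: func,
  eventSourceArn: table.tableStreamArn,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  filters: [
    lambda.FilterCriteria.filter({
      eventName: lambda.FilterRule.isEqual('MODIFY'),
      // FilterCriteria.filter serializes this object into the pattern string,
      // so the raw numeric rule passes through unchanged:
      dynamodb: { NewImage: { status: { N: [{ numeric: ['>', 10] }] } } },
    }),
  ],
});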

How to set Job-specific minPayment in v2 jobs for job type webhook?

Chainlink v1 jobs allowed setting a job-specific minimum payment with the minPayment keyword:
{
  "initiators": [
    {
      "type": "RunLog",
      "params": { "address": "0x51DE85B0cD5B3684865ECfEedfBAF12777cd0Ff8" }
    }
  ],
  "tasks": [
    {
      "type": "HTTPGet",
      "confirmations": 0,
      "params": { "get": "https://bitstamp.net/api/ticker/" }
    },
    {
      "type": "JSONParse",
      "params": { "path": ["last"] }
    },
    {
      "type": "Multiply",
      "params": { "times": 100 }
    },
    { "type": "EthUint256" },
    { "type": "EthTx" }
  ],
  "startAt": "2020-02-09T15:13:03Z",
  "endAt": null,
  "minPayment": "1000000000000000000"
}
It seems to be missing from v2 TOML jobs. For now, only directRequest-type v2 jobs have it, which was added with this PR.
The v1 job spec that serves our purpose is:
{
  name: 'get-request',
  initiators: [
    {
      type: 'external',
      params: {
        name: process.env.CHAINLINK_EI_NAME,
        body: {},
      },
    },
  ],
  tasks: [
    {
      type: 'httpget',
    },
    {
      type: 'jsonparse',
    },
    {
      type: process.env.CHAINLINK_BRIDGE_NAME,
    },
  ],
  minPayment: '1',
}
How can we set minPayment for webhook type jobs in v2 TOML jobs?
You can do that using the minContractPaymentLinkJuels field.
For example:
type = "directrequest"
schemaVersion = 1
name = "my job"
contractAddress = "ORACLE_ADDRESS_HERE"
minContractPaymentLinkJuels = 1000000000000000000
observationSource = """
ds1 [type=bridge name="bridge-data-feed" requestData="{\\"data\\": {\\"from\\":\\"eth\\", \\"to\\":\\"USD\\"}}"];
ds1
"""
For webhook-type jobs, you'll actually want to use a custom spec (for example, with an external initiator):
type = "webhook"
schemaVersion = 1
externalInitiators = [
{ name = "my-external-initiator-1", spec = "{\"minContractPaymentLinkJuels\": 1000000000000000000}" },
]
observationSource = """
ds1 [type=bridge name="bridge-data-feed" requestData="{\\"data\\": {\\"from\\":\\"eth\\", \\"to\\":\\"USD\\"}}"];
ds1
"""
The spec defines the JSON payload that will be sent to the External Initiator on job creation, provided the external initiator has a URL, and you can check on that side that the amount sent to the node is correct.

Incrementing a value in nested attributes in AWS Lambda and DynamoDB

This is my query to add a new field or increment a nested attribute
const params = {
  TableName: process.env.DYNAMODB_GAMES_TABLE,
  Key: {
    id: gameId
  },
  UpdateExpression: 'set players.#player.#score = players.#player.#score + :s',
  ExpressionAttributeNames: {
    '#player': playerId,
    '#score': 'score'
  },
  ExpressionAttributeValues: {
    ':s': 1
  },
  ReturnValues: "ALL_NEW"
};
This is the error I get
{
  "message": "The document path provided in the update expression is invalid for update",
  "code": "ValidationException",
  "time": "2020-05-21T03:03:14.328Z",
  "requestId": "Q04QEP1G3E2LAM43I04ADLM4IRVV4KQNSO5AEMVJF66Q9ASUAAJG",
  "statusCode": 400,
  "retryable": false,
  "retryDelay": 27.814212380235393
}
My object looks like:
{
  "id": "09e7a690",
  "players": {
    "M3EDJeHtoAMCLJg": [
      {
        "cardId": "1",
        "cardTitle": "test",
        "pun": "this is a pun"
      }
    ]
  }
}
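Note that SET with + requires every step of the document path to already exist: in the object above, players.M3EDJeHtoAMCLJg holds a list of cards rather than a map with a score attribute, so the path players.#player.#score cannot be resolved, which is what raises this ValidationException. If the player entry were stored as a map, if_not_exists could supply a starting value so the first increment also succeeds. A minimal sketch under that assumed shape (not a verified fix; the :zero default is an addition):
// Assumes players.#player exists as a map; if_not_exists() defaults the
// score to :zero so the very first increment does not hit a missing path.
const params = {
  TableName: process.env.DYNAMODB_GAMES_TABLE,
  Key: { id: gameId },
  UpdateExpression:
    'SET players.#player.#score = if_not_exists(players.#player.#score, :zero) + :s',
  ExpressionAttributeNames: {
    '#player': playerId,
    '#score': 'score',
  },
  ExpressionAttributeValues: {
    ':zero': 0,
    ':s': 1,
  },
  ReturnValues: 'ALL_NEW',
};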

apollo-link-state add field with default value to type

Here's what I'm trying to accomplish:
I have a GraphQL API endpoint that returns me a Project object like this (unrelated fields removed):
{
  "data": {
    "Project": {
      "id": "cjp4b84wkochq0167gpu8oa7h",
      "requests": [
        {
          "id": "cjpbb6jcdpwj00167y4acl5a1",
          "__typename": "Request"
        },
        {
          "id": "cjpbbhlaxpwlx01675jzfyb0j",
          "__typename": "Request"
        },
        {
          "id": "cjpbbifg7pwmg0167s0ob1bm6",
          "__typename": "Request"
        }
      ],
      "__typename": "Project"
    }
  }
}
I want to use apollo-link-state to add a client-side field to all of these Request objects like this:
{
  "data": {
    "Project": {
      "id": "cjp4b84wkochq0167gpu8oa7h",
      "requests": [
        {
          "id": "cjpbb6jcdpwj00167y4acl5a1",
          "expanded": false,
          "__typename": "Request"
        },
        {
          "id": "cjpbbhlaxpwlx01675jzfyb0j",
          "expanded": false,
          "__typename": "Request"
        },
        {
          "id": "cjpbbifg7pwmg0167s0ob1bm6",
          "expanded": false,
          "__typename": "Request"
        }
      ],
      "__typename": "Project"
    }
  }
}
This would allow me to remove local state from my Component that renders these requests. The problem is that when I define defaults for my ApolloClient clientState as follows:
const client = new ApolloClient({
  clientState: {
    defaults: {
      Project: {
        __typename: 'Project',
        requests: [{ __typename: 'Request', expanded: false }],
      },
    },
  },
});
Apollo adds it as a new Project object instead of adding it to the existing one (which has an id):
ROOT_QUERY
Project: Project
requests: [Request]
0:
expanded: false
Project({"id":"cjp4b84wkochq0167gpu8oa7h"}): Project
▾Project:cjp4b84wkochq0167gpu8oa7h
When I give it the id, it adds the "hi" field to the correct project, but the requests are still missing the expanded field. And giving the id only works for that specific project, obviously.
const client = new ApolloClient({
  clientState: {
    defaults: {
      'Project({"id":"cjp4b84wkochq0167gpu8oa7h"})': {
        __typename: 'Project',
        hi: true,
        requests: [{ __typename: 'Request', expanded: false }],
      },
    },
  },
});
ROOT_QUERY
Project({"id":"cjp4b84wkochq0167gpu8oa7h"}): Project
▾Project:cjp4b84wkochq0167gpu8oa7h
hi: true
requests: [Request]
0:▾Request:cjpbb6jcdpwj00167y4acl5a1
...unrelated fields
1:▾Request:cjpbbhlaxpwlx01675jzfyb0j
2:▾Request:cjpbbifg7pwmg0167s0ob1bm6
I also tried using the typeDefs field on the clientState object like this:
typeDefs: [`
  schema {
    query: RootQuery
  }
  type RootQuery {
    Project($id: ID): Project
  }
  type Project {
    id: ID!
    requests: [Request]
  }
  type Request {
    id: ID!
    expanded: Boolean
  }
`],
but this doesn't seem to change anything in the cache, and I don't know if I can even give it a default value like this.
Maybe I'm misunderstanding how apollo-link-state works (or even how GraphQL works); any answer to point me in the right direction is appreciated. I'm very much a beginner when it comes to GraphQL and Apollo.
You need to provide a client-side resolver in your clientState configuration.
const clientState = {
  resolvers: {
    Project: {
      expanded: () => false
    }
  }
};
And then you'd pass this into your ApolloClient like so:
const apolloClient = new ApolloClient({ clientState });
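A minimal usage sketch for completeness (an assumption on my part: the resolver key must match the type whose field you query, so for the question's Request.expanded it would sit under a Request key). Client-only fields are then requested with the @client directive:
import gql from 'graphql-tag';

// Resolver registered under the Request type so every Request object
// gets the client-only field (assumed to match the question's goal):
const clientState = {
  resolvers: {
    Request: {
      expanded: () => false,
    },
  },
};

// The client-only field is marked with @client in the query:
const PROJECT_QUERY = gql`
  query Project($id: ID!) {
    Project(id: $id) {
      id
      requests {
        id
        expanded @client
      }
    }
  }
`;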

DynamoDB DocumentClient returns Set of strings (SS) attribute as an object

I'm new to DynamoDB.
When I read data from the table with AWS.DynamoDB.DocumentClient class, the query works but I get the result in the wrong format.
Query:
{
TableName: "users",
ExpressionAttributeValues: {
":param": event.pathParameters.cityId,
":date": moment().tz("Europe/London").format()
},
FilterExpression: ":date <= endDate",
KeyConditionExpression: "cityId = :param"
}
Expected:
{
  "user": "boris",
  "phones": ["+23xxxxx999", "+23xxxxx777"]
}
Actual:
{
  "user": "boris",
  "phones": {
    "type": "String",
    "values": ["+23xxxxx999", "+23xxxxx777"],
    "wrapperName": "Set"
  }
}
Thanks!
The `unmarshall` function from `AWS.DynamoDB.Converter` is one solution if your data comes like this, e.g.:
{
  "Attributes": {
    "last_names": { "S": "UPDATED last name" },
    "names": { "S": "I am the name" },
    "vehicles": { "NS": ["877", "9801", "104"] },
    "updatedAt": { "S": "2018-10-19T01:55:15.240Z" },
    "createdAt": { "S": "2018-10-17T11:49:34.822Z" }
  }
}
Please notice the object/map spec per attribute, holding the attribute type. That means you are using the `DynamoDB` class and not the `DynamoDB.DocumentClient`.
`unmarshall` will convert a DynamoDB record into a JavaScript object, as stated and backed by AWS. Ref. https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
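A minimal sketch of that conversion (assuming the AWS SDK v2 for JavaScript; attribute values are taken from the record above):
import * as AWS from 'aws-sdk';

const typed = {
  names: { S: 'I am the name' },
  updatedAt: { S: '2018-10-19T01:55:15.240Z' },
};

// Converts the DynamoDB-typed attribute map into a plain object:
// { names: 'I am the name', updatedAt: '2018-10-19T01:55:15.240Z' }
const plain = AWS.DynamoDB.Converter.unmarshall(typed);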
Nonetheless, I faced the exact same use case as yours: having only one attribute of type set (NS in my case), I had to do the conversion manually. Here's a snippet:
// Please notice the <setName>, which represents your set attribute name
ddbTransHandler.update(params).promise().then((value) => {
  value.Attributes[<setName>] = value.Attributes[<setName>].values;
  return value; // or value.Attributes
});
Cheers,
Hamlet
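For the query in the question itself, the same unwrapping could look like this (a sketch; the params object is the query from the question, and the result shape is assumed):
import { DynamoDB } from 'aws-sdk';

const docClient = new DynamoDB.DocumentClient();

// Each returned item's phones attribute comes back as a Set wrapper,
// whose plain array lives on `.values`.
async function queryUsers(params: DynamoDB.DocumentClient.QueryInput) {
  const data = await docClient.query(params).promise();
  return (data.Items ?? []).map((item) => ({
    ...item,
    phones: item.phones.values,
  }));
}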
