Latency between API Gateway and Lambda - aws-lambda

I'm investigating latency between API Gateway and Lambda. Although the latency may seem low, there is a source of latency that I cannot account for: it appears to be integration latency, from when API Gateway makes a request to Lambda to when Lambda begins execution.
We have ruled out cold starts by enabling provisioned concurrency, and we have no spill-over into un-provisioned containers.
Example access log & corresponding Lambda report:
{
  "requestTime": "09/Jul/2021:12:18:48 +0000",
  "requestId": "redacted",
  "httpMethod": "POST",
  "path": "/redacted",
  "resourcePath": "/redacted",
  "status": 200,
  "responseLatency": 56,
  "wafError": "-",
  "wafStatus": "-",
  "wafLatency": "-",
  "authnError": "-",
  "authnStatus": "200",
  "authnLatency": "2",
  "authzError": "-",
  "authzStatus": "200",
  "authzLatency": "0",
  "integrationRequestId": "redacted",
  "integrationResponseStatus": "200",
  "integrationLatency": "52",
  "integrationServiceStatus": "200",
  "identitySourceIp": "redacted",
  "identityUserAgent": "redacted",
  "identityUser": "redacted"
}
REPORT RequestId: redacted Duration: 37.83 ms Billed Duration: 38 ms Memory Size: 2048 MB Max Memory Used: 77 MB
Has anyone encountered this gap before? The overhead from API GW seems minimal.
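For reference, the unexplained gap can be read straight off the numbers above. A minimal sketch (values copied from the access log and REPORT line; the interpretation of integrationLatency as the full round trip to Lambda and back is the usual access-log one):

```python
# Values taken from the access log and Lambda REPORT line in the question.
integration_latency_ms = 52    # API GW "integrationLatency"
response_latency_ms = 56       # API GW "responseLatency"
lambda_duration_ms = 37.83     # Lambda REPORT "Duration"

# integrationLatency spans the request to Lambda and the response back,
# so subtracting the function's own runtime leaves the invocation overhead.
invocation_overhead_ms = integration_latency_ms - lambda_duration_ms
apigw_overhead_ms = response_latency_ms - integration_latency_ms

print(f"invocation overhead: {invocation_overhead_ms:.2f} ms")  # invocation overhead: 14.17 ms
print(f"API Gateway overhead: {apigw_overhead_ms} ms")          # API Gateway overhead: 4 ms
```

So roughly 14 ms here is spent between API Gateway handing off the request and the function code running, which matches the gap described above.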

Related

Golang ACMEv2 HTTP-01 challenge not challenging server

With this code I am attempting a manual HTTP-01 challenge to better understand how the process works. All the requests return 201/200 responses with the expected bodies, and I am able to successfully create the challenge.
However, the ACME server never seems to challenge the HTTP server.
I get a successful return when POSTing to the challenge URL:
2022/07/17 13:49:28 challenge response {
  "type": "http-01",
  "status": "pending",
  "url": "https://acme-staging-v02.api.letsencrypt.org/acme/chall-v3/3039193714/PVI-4A",
  "token": "yoevDKY_bARdM5uHmVsk3s5lPK8BsBWC-SfmRN8MkLM"
}
However when polling the authorization status I can see that it stays pending:
2022/07/17 13:49:43 authorization response {
  "identifier": {
    "type": "dns",
    "value": "billabull.com"
  },
  "status": "pending",
  "expires": "2022-07-24T13:49:27Z",
  "challenges": [
    {
      "type": "http-01",
      "status": "pending",
      "url": "https://acme-staging-v02.api.letsencrypt.org/acme/chall-v3/3039193714/PVI-4A",
      "token": "yoevDKY_bARdM5uHmVsk3s5lPK8BsBWC-SfmRN8MkLM"
    },
    {
      "type": "dns-01",
      "status": "pending",
      "url": "https://acme-staging-v02.api.letsencrypt.org/acme/chall-v3/3039193714/uHeVHQ",
      "token": "yoevDKY_bARdM5uHmVsk3s5lPK8BsBWC-SfmRN8MkLM"
    },
    {
      "type": "tls-alpn-01",
      "status": "pending",
      "url": "https://acme-staging-v02.api.letsencrypt.org/acme/chall-v3/3039193714/RomB0g",
      "token": "yoevDKY_bARdM5uHmVsk3s5lPK8BsBWC-SfmRN8MkLM"
    }
  ]
}
Currently I poll for 2 minutes (with the server available) before timing out, so I feel that it should reasonably happen within that time frame.
I have also tested that the HTTP server is made available on port 80 from the domain billabull.com, and making a GET request to the challenge path does return the correct key authorization. However the ACME server is never making a request to the server to begin with.
Does anyone know why the ACME server might not be challenging my server?
I had to use a body of []byte("{}") rather than []byte{} for the challenge endpoint.
Edit: For some reason this endpoint doesn't error out, but others will if you pass an incorrect body.
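The polling step described above can be sketched language-agnostically (Python here for brevity). The `fetch` callable is hypothetical, standing in for the signed POST-as-GET request to the authorization URL; the status values are the RFC 8555 ones:

```python
import time

def poll_authorization(fetch, auth_url, timeout_s=120, interval_s=5, sleep=time.sleep):
    """Poll an ACME authorization object until it leaves 'pending' or we time out.

    `fetch` is a hypothetical callable standing in for a signed POST-as-GET
    request; it returns the decoded authorization JSON (as logged above).
    """
    elapsed = 0
    while elapsed <= timeout_s:
        authz = fetch(auth_url)
        if authz["status"] != "pending":
            return authz  # typically 'valid' or 'invalid'
        sleep(interval_s)
        elapsed += interval_s
    raise TimeoutError(f"authorization still pending after {timeout_s}s")
```

Note that RFC 8555 requires the challenge-triggering request to carry an empty JSON object `{}` as its payload, not an empty byte string, which is consistent with the `[]byte("{}")` fix above: until that POST is accepted, the server has no reason to attempt validation and the authorization stays pending no matter how long you poll.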

How to get values from logs in alert text messages in Elasticsearch Kibana

I am continuously sending health data from my Ubuntu machine to Elasticsearch using td-agent. This health data contains the CPU temperature, which I have to monitor. So I have created an alert: if the temperature value increases to more than 60°F, it sends an alert to my Microsoft Teams channel. This whole setup is working fine.
Below is the log data:
{
  "_index": "health_skl_gateway",
  "_type": "_doc",
  "_id": "DwxjinkBwxSy0OQ_4rhS",
  "_version": 1,
  "_score": null,
  "_source": {
    "Data": {
      "WiFiIP": "N/A",
      "signal_strength": "N/A",
      "signal_percent": 0,
      "signal_level": "N/A",
      "EthIP": "192.168.100.30 ",
      "TotalDisk": "916G",
      "UsedDisk": "40G",
      "FreeDisk": "830G",
      "DiskPercent": "5%",
      "TotalRAM": "16312468",
      "UsedRAM": "3735596",
      "FreeRAM": "5866548",
      "CPU": 27,
      "cpu_temp": 57,
      "Internet": true,
      "Publish msg count": 442,
      "Created": "2021-05-20T15:26:51.557564",
      "DeviceId": "TX-G1-318",
      "UpTime": "2021-05-19T07:13:05"
    },
    "hostname": "TX-G1-318",
    "Version": "V2"
  },
  "fields": {
    "Data.UpTime": [
      "2021-05-19T07:13:05.000Z"
    ],
    "Data.Created": [
      "2021-05-20T15:26:51.557Z"
    ]
  },
  "sort": [
    1621524411557
  ]
}
In Kibana alerting, I have set up an alert that fires if the count of documents in the index health_skl_gateway where Data.cpu_temp is greater than 60 reaches 3 within the last 10 minutes, and it then sends an alert to the Microsoft Teams channel. Below is how I have configured the message that is sent to Microsoft Teams.
In the message, I am just sending static text, but I want to send the actual Data.cpu_temp value in the message.
Is this possible? How can we do this? Thanks
Did you try using double braces, like this? I guess the mapping is done the same way for all alert types.
In the server monitoring example, the email action type is used, and server is mapped to the body of the email using the template string CPU on {{server}} is high.
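If double braces work for your alert type, the substitution itself is just mustache-style templating. A rough illustration of what the action message engine does (this is not Kibana's actual engine, and which variables are available depends on the alert and action type):

```python
import re

def render_template(template, context):
    """Replace {{name}} placeholders with values from a flat context dict,
    mimicking how alert action messages substitute variables.
    Unknown placeholders are left untouched."""
    def repl(match):
        key = match.group(1).strip()
        if key in context:
            return str(context[key])
        return match.group(0)  # leave unresolved placeholders as-is
    return re.sub(r"\{\{\s*([^}]+?)\s*\}\}", repl, template)

message = render_template(
    "CPU temperature on {{hostname}} is {{Data.cpu_temp}}",
    {"hostname": "TX-G1-318", "Data.cpu_temp": 57},
)
print(message)  # CPU temperature on TX-G1-318 is 57
```

So in the Teams message body you would reference the field with a double-braced variable name, provided the alert type exposes that field to the action context.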

Youtube Data API - /channels Endpoint not Returning Smaller Users

I am creating a networking app for musicians. I was wanting to use the Youtube Data API to let users connect their Youtube channel to their profile within my app. I got everything in place and working via making requests to URLs similar to https://www.googleapis.com/youtube/v3/channels?part=snippet,statistics&forUsername=PewDiePie&key=[YOUR_API_KEY]. This works great and returns this JSON:
{
  "kind": "youtube#channelListResponse",
  "etag": "\"p4VTdlkQv3HQeTEaXgvLePAydmU/bj_rirVFbrVoTIOa6lCGdaXaG5M\"",
  "pageInfo": { "totalResults": 1, "resultsPerPage": 5 },
  "items": [
    {
      "kind": "youtube#channel",
      "etag": "\"p4VTdlkQv3HQeTEaXgvLePAydmU/Blp06js4r7j93y1EfKve84oXWpo\"",
      "id": "UC-lHJZR3Gqxm24_Vd_AJ5Yw",
      "snippet": {
        "title": "PewDiePie",
        "description": "I make videos.",
        "publishedAt": "2010-04-29T10:54:00.000Z",
        "thumbnails": {
          "default": { "url": "https://yt3.ggpht.com/a/AGF-l79FVckie4j9WT-4cEW6iu3gPd4GivQf_XNSWg=s88-c-k-c0xffffffff-no-rj-mo", "width": 88, "height": 88 },
          "medium": { "url": "https://yt3.ggpht.com/a/AGF-l79FVckie4j9WT-4cEW6iu3gPd4GivQf_XNSWg=s240-c-k-c0xffffffff-no-rj-mo", "width": 240, "height": 240 },
          "high": { "url": "https://yt3.ggpht.com/a/AGF-l79FVckie4j9WT-4cEW6iu3gPd4GivQf_XNSWg=s800-c-k-c0xffffffff-no-rj-mo", "width": 800, "height": 800 }
        },
        "localized": { "title": "PewDiePie", "description": "I make videos." },
        "country": "US"
      },
      "statistics": {
        "viewCount": "24334379402",
        "commentCount": "0",
        "subscriberCount": "102000000",
        "hiddenSubscriberCount": false,
        "videoCount": "4054"
      }
    }
  ]
}
Most of my app's users will be smaller musicians, likely with less than 10k youtube subscribers. Take my sister for example, this is a link to her youtube channel: https://www.youtube.com/channel/UCe4Eogv2uGaKUe4x3VNrwsg.
Trying to search for her YouTube channel with the API via https://www.googleapis.com/youtube/v3/channels?part=snippet,statistics&forUsername=Audrey_Chopin&key=[YOUR_API_KEY] (and variations such as replacing Audrey_Chopin with Audrey%20Chopin or Audrey+Chopin) yields no results: { "kind": "youtube#channelListResponse", "etag": "\"p4VTdlkQv3HQeTEaXgvLePAydmU/zJL80hJ0IwMo5wddECFapC8I6Q4\"", "pageInfo": { "totalResults": 0, "resultsPerPage": 5 }, "items": [] }.
Are smaller users not supposed to be returned from this endpoint? If so, is there any way I can implement users to search for their profile without forcing the user to do the OAuth process, i.e. signing into their Youtube account?
It seems that using the /search endpoint works better for smaller channels, though less information is available from this endpoint (I am unable to get the subscriber count and video count, which were included in the "statistics" part of the /channels endpoint).
So updating
https://www.googleapis.com/youtube/v3/channels?part=snippet,statistics&forUsername=Audrey_Chopin&key=[YOUR_API_KEY]
to
https://www.googleapis.com/youtube/v3/search?part=snippet&channelType=any&maxResults=50&order=relevance&q=Audrey%20Chopin&type=channel&key=[YOUR_API_KEY]
yielded smaller channels, though without as much data as when using the /channel endpoint.
I am still curious, if anybody knows, why the /channels endpoint does not return smaller channels.
Since you know the user's channel id, simply issue a query to the Channels endpoint on the URL:
https://www.googleapis.com/youtube/v3/channels?part=...&id=$CHANNEL_ID&key=$APP_KEY,
and you'll obtain all public (i.e. non-private) info attached to the referenced channel -- without needing any further authentication. Of course you can specify the part parameter as you see fit.
On the other hand, please note that querying the Search.List endpoint for snippet part is much more costly than querying the Channels.List endpoint for both snippet and statistics parts: 100 vs. 5 quota points.
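To make the quota point concrete, here is a sketch of building the Channels.list request keyed by channel id rather than the legacy forUsername parameter (the channel id below is the one from the question; the helper name is made up):

```python
from urllib.parse import urlencode

API_BASE = "https://www.googleapis.com/youtube/v3"

def channels_by_id_url(channel_id, api_key, parts=("snippet", "statistics")):
    """Build a Channels.list request URL keyed by channel id, which works
    for any channel regardless of size (forUsername only matches channels
    with a legacy username)."""
    query = urlencode({"part": ",".join(parts), "id": channel_id, "key": api_key})
    return f"{API_BASE}/channels?{query}"

# Approximate quota costs cited above: Search.list is 100 units, Channels.list is 5.
SEARCH_COST, CHANNELS_COST = 100, 5

url = channels_by_id_url("UCe4Eogv2uGaKUe4x3VNrwsg", "YOUR_API_KEY")
print(url)
print(f"Search.list is {SEARCH_COST // CHANNELS_COST}x more expensive than Channels.list")
```

So if the app can capture the channel id once (e.g. from the user pasting their channel URL), every subsequent lookup can use the cheap Channels.list call.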

Kinesis creates multiple records with the same sequence number

Based on the Kinesis documentation, the sequence number is supposed to be unique; however, we see the same value being reused across multiple records. Our event producer is a Spring Boot application that uses the KPL internally, and the consumers are AWS Lambdas. We performed re-sharding a couple of times during the test. Below you can see a sample sequence number reused more than once. How is that even possible?
"Records": [{
  "kinesis": {
    "kinesisSchemaVersion": "1.0",
    "partitionKey": "00000000000000002",
    "sequenceNumber": "49596124085897508159438713510240079964989152308217511954",
    "data": "************************",
    "approximateArrivalTimestamp": 1558991793.009
  },
  "eventSource": "aws:kinesis",
  "eventVersion": "1.0",
  "eventID": "shardId-000000000001:49596124085897508159438713510240079964989152308217511954",
  "eventName": "aws:kinesis:record",
  "invokeIdentityArn": "-----------------",
  "awsRegion": "us-east-1",
  "eventSourceARN": "-----------------"
}, {
  "kinesis": {
    "kinesisSchemaVersion": "1.0",
    "partitionKey": "00000000000000003",
    "sequenceNumber": "49596124085897508159438713510240079964989152308217511954",
    "data": "************************",
    "approximateArrivalTimestamp": 1558991793.009
  },
  "eventSource": "aws:kinesis",
  "eventVersion": "1.0",
  "eventID": "shardId-000000000001:49596124085897508159438713510240079964989152308217511954",
  "eventName": "aws:kinesis:record",
  "invokeIdentityArn": "-----------------",
  "awsRegion": "us-east-1",
  "eventSourceARN": "-----------------"
}, {
  "kinesis": {
    "kinesisSchemaVersion": "1.0",
    "partitionKey": "00000000000000004",
    "sequenceNumber": "49596124085897508159438713510240079964989152308217511954",
    "data": "************************",
    "approximateArrivalTimestamp": 1558991793.009
  },
  "eventSource": "aws:kinesis",
  "eventVersion": "1.0",
  "eventID": "shardId-000000000001:49596124085897508159438713510240079964989152308217511954",
  "eventName": "aws:kinesis:record",
  "invokeIdentityArn": "-----------------",
  "awsRegion": "us-east-1",
  "eventSourceARN": "-----------------"
}]
When Kinesis stream writers use KPL with user record aggregation (see Consumer De-aggregation) user records are batched together and delivered as a single Kinesis record to regular Kinesis consumers. Kinesis record sequence numbers are unique in this case, but we need to implement de-aggregation.
However, when enhanced fan-out is enabled for Lambdas, user records are delivered as individual Kinesis records (no de-aggregation is required) and they share the same sequence number.
So the Kinesis record sequence number is not always unique.
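A toy illustration of the behavior described above (the field names mirror the KPL/KCL de-aggregation convention, where a subsequence number disambiguates user records sharing a parent sequence number; this is a sketch, not the actual de-aggregation wire format):

```python
def expand_aggregated_record(kinesis_record):
    """Illustrative only: user records packed into one KPL-aggregated Kinesis
    record inherit the parent record's sequence number when delivered
    individually, so (sequenceNumber, subSequenceNumber) -- not the sequence
    number alone -- is what identifies a record uniquely."""
    parent_seq = kinesis_record["sequenceNumber"]
    return [
        {
            "sequenceNumber": parent_seq,       # shared with siblings
            "subSequenceNumber": i,             # disambiguates within the batch
            "partitionKey": user["partitionKey"],
            "data": user["data"],
        }
        for i, user in enumerate(kinesis_record["userRecords"])
    ]

aggregated = {
    "sequenceNumber": "49596124085897508159438713510240079964989152308217511954",
    "userRecords": [
        {"partitionKey": "00000000000000002", "data": b"a"},
        {"partitionKey": "00000000000000003", "data": b"b"},
    ],
}
records = expand_aggregated_record(aggregated)
assert len({r["sequenceNumber"] for r in records}) == 1  # all share one sequence number
```

This matches the event shown in the question: three records, three partition keys, one sequence number.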

Getting a 403 Forbidden Error for Youtube Data API

I am getting a 403 forbidden error when making an API call to the YouTube Data API.
I have tried to generate different types of keys (Web Browser, Server, etc.). The key is unrestricted. I have tried making the call from a server and from postman for Chrome. The request URL and response is below.
https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=1&q=surfing&key={api-key}
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "Forbidden"
      }
    ],
    "code": 403,
    "message": "Forbidden"
  }
}
Here are the headers:
alt-svc: quic=":443"; ma=2592000; v="43,42,41,39,35"
cache-control: private, max-age=0
content-encoding: gzip
content-length: 118
content-type: application/json; charset=UTF-8
date: Tue, 10 Jul 2018 15:00:27 GMT
expires: Tue, 10 Jul 2018 15:00:27 GMT
server: GSE
status: 403
vary: Origin, X-Origin
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
x-xss-protection: 1; mode=block
This error is a Core API error, as specified in the YouTube API documentation:
Access forbidden. The request may not be properly authorized.
You can check the step-by-step guide provided in the documentation; it includes a step on how to properly acquire user authorization.
Intended for developers who want to write applications that interact
with YouTube. It explains basic concepts of YouTube and of the API
itself. It also provides an overview of the different functions that
the API supports.
These types of errors are among the YouTube API's core API errors:
forbidden (403): Access forbidden. The request may not be properly
authorized.
quotaExceeded (403): The request cannot be
completed because you have exceeded your quota.
You can try adding OAuth using this documentation on YouTube Data API Overview as a guide.
If your application will use any API methods that require user
authorization, read the authentication guide to learn how to implement
OAuth 2.0 authorization.
If you are still getting the same error, verify that the YouTube Data API v3 service is enabled for this key in your Google Developers console.
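When handling these responses programmatically, it helps to distinguish the two 403 reasons (authorization vs. quota). A small sketch parsing the error body shown above (the helper name is made up):

```python
import json

def classify_youtube_error(body):
    """Pull the HTTP code and error reasons out of a YouTube Data API error
    body, e.g. to tell 'forbidden' (authorization) from 'quotaExceeded'."""
    payload = json.loads(body)
    error = payload.get("error", {})
    reasons = [e.get("reason") for e in error.get("errors", [])]
    return error.get("code"), reasons

# The 403 body from the question, verbatim.
body = ('{"error": {"errors": [{"domain": "global", "reason": "forbidden", '
        '"message": "Forbidden"}], "code": 403, "message": "Forbidden"}}')

code, reasons = classify_youtube_error(body)
print(code, reasons)  # 403 ['forbidden']
```

A reason of 'forbidden' points at the API not being enabled or the key not being authorized, while 'quotaExceeded' means the project has simply run out of daily quota.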
URL : https://www.googleapis.com/youtube/v3/search?part=snippet&maxResults=1&q=surfing&key={API_KEY}
{
  "kind": "youtube#searchListResponse",
  "etag": "\"XI7nbFXulYBIpL0ayR_gDh3eu1k/vxoFCv0dm4WdeKtXnUk7GXCJeao\"",
  "nextPageToken": "CAEQAA",
  "regionCode": "IN",
  "pageInfo": {
    "totalResults": 1000000,
    "resultsPerPage": 1
  },
  "items": [
    {
      "kind": "youtube#searchResult",
      "etag": "\"XI7nbFXulYBIpL0ayR_gDh3eu1k/Amykv1hEk5vzuqlcAS8z2BEptrU\"",
      "id": {
        "kind": "youtube#video",
        "videoId": "CWYDxh7QD34"
      },
      "snippet": {
        "publishedAt": "2014-09-02T16:52:33.000Z",
        "channelId": "UCblfuW_4rakIf2h6aqANefA",
        "title": "Best surfing action from Red Bull Cape Fear 2014",
        "description": "Click for the FULL EVENT: http://www.redbullcapefear.com/ The southern tip of Sydney Australia is home to one of the most treacherous waves on the planet: ...",
        "thumbnails": {
          "default": {
            "url": "https://i.ytimg.com/vi/CWYDxh7QD34/default.jpg",
            "width": 120,
            "height": 90
          },
          "medium": {
            "url": "https://i.ytimg.com/vi/CWYDxh7QD34/mqdefault.jpg",
            "width": 320,
            "height": 180
          },
          "high": {
            "url": "https://i.ytimg.com/vi/CWYDxh7QD34/hqdefault.jpg",
            "width": 480,
            "height": 360
          }
        },
        "channelTitle": "Red Bull",
        "liveBroadcastContent": "none"
      }
    }
  ]
}
I tested this with multiple api-keys and I didn't hit an issue.
curl https://www.googleapis.com/youtube/v3/search\?part\=snippet\&maxResults\=1\&q\=surfing\&key\={api-key}
{
  "kind": "youtube#searchListResponse",
  "etag": "\"XI7nbFXulYBIpL0ayR_gDh3eu1k/r9B676JRBM0twgG6dy2MZT_1KnQ\"",
  "nextPageToken": "CAEQAA",
  "regionCode": "US",
  "pageInfo": {
    "totalResults": 1000000,
    "resultsPerPage": 1
  },
  "items": [
    {
      "kind": "youtube#searchResult",
      "etag": "\"XI7nbFXulYBIpL0ayR_gDh3eu1k/E8GZG_CZfJeaVF75eZYmJHnGe0c\"",
      "id": {
        "kind": "youtube#video",
        "videoId": "rj7xMBxd5iY"
      },
      "snippet": {
        "publishedAt": "2017-11-12T11:09:52.000Z",
        "channelId": "UCiiFGfvlKvX3uzMovO3unaw",
        "title": "BIG WAVE SURFING COMPILATION 2017",
        "description": "BIG WAVE SURFING COMPILATION 2017 ** REVISED **AMAZING FOOTAGE ** WITH 60-100FT- HUGE SURF Please Subscribe if You Would like to see More ...",
        "thumbnails": {
          "default": {
            "url": "https://i.ytimg.com/vi/rj7xMBxd5iY/default.jpg",
            "width": 120,
            "height": 90
          },
          "medium": {
            "url": "https://i.ytimg.com/vi/rj7xMBxd5iY/mqdefault.jpg",
            "width": 320,
            "height": 180
          },
          "high": {
            "url": "https://i.ytimg.com/vi/rj7xMBxd5iY/hqdefault.jpg",
            "width": 480,
            "height": 360
          }
        },
        "channelTitle": "Absolutely Flawless",
        "liveBroadcastContent": "none"
      }
    }
  ]
}
I had the same problem, and it was resolved by enabling the YouTube Data API v3 from the API Library.
