I've written a small lambda function and deployed to AWS using the serverless framework. It provides a single function that returns a png file.
When the resource is opened in a browser it correctly loads a png.
When requested with curl, as in curl "https://******.execute-api.us-east-1.amazonaws.com/dev/image.png", it produces a base64-encoded version of the image.
When I request it on the command line with an Accept header, as in curl -H "Accept: image/png" "https://******.execute-api.us-east-1.amazonaws.com/dev/image.png", it produces a binary image/png version of the image.
How do I manipulate the request to the API gateway so that all requests have "Accept: image/png" set on them regardless of origin? Or is there another way to ensure that the response will always be binary rather than base64?
Source Code
The handler code loads a png image from disk and then returns a response object with a base64 encoded output of the image.
// handler.js
'use strict';

const fs = require('fs');

const image = fs.readFileSync('./1200x600.png');

module.exports = {
  image: async (event) => {
    return {
      statusCode: 200,
      headers: {
        "Content-Type": "image/png",
      },
      isBase64Encoded: true,
      body: image.toString('base64'),
    };
  },
};
The serverless configuration sets up the function and uses the "serverless-apigw-binary" and "serverless-apigwy-binary" plugins to set content handling and the binary MIME types for the response.
# serverless.yml
service: serverless-png-facebook-test

provider:
  name: aws
  runtime: nodejs8.10

functions:
  image:
    handler: handler.image
    memorySize: 128
    events:
      - http:
          path: image.png
          method: get
          contentHandling: CONVERT_TO_BINARY

plugins:
  - serverless-apigw-binary
  - serverless-apigwy-binary

custom:
  apigwBinary:
    types:
      - 'image/*'
package.json
{
  "name": "serverless-png-facebook-test",
  "version": "1.0.0",
  "main": "handler.js",
  "license": "MIT",
  "dependencies": {
    "serverless-apigw-binary": "^0.4.4",
    "serverless-apigwy-binary": "^1.0.0"
  }
}
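For context, the conversion that contentHandling: CONVERT_TO_BINARY requests is simply a base64 decode of the Lambda body before it goes on the wire. A minimal sketch of that round trip (the byte values are illustrative PNG signature bytes, not taken from the project above):

```javascript
// What the handler returns: base64 text with isBase64Encoded: true.
const original = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG signature bytes
const lambdaBody = original.toString('base64');

// What API Gateway sends to a client whose Accept header matches a
// registered binary media type: the decoded bytes.
const wireBytes = Buffer.from(lambdaBody, 'base64');
console.log(wireBytes.equals(original)); // true
```

When the Accept header does not match a registered binary media type, the base64 text itself is returned, which is the behavior described in the question.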
Related
According to the AWS SAM template documentation, you can specify a MinimumCompressionSize attribute on AWS::Serverless::Api resources, which should compress response bodies that exceed the given threshold, yet in local testing this doesn't work.
Take, for instance, a slightly modified Node 16 "Hello World" project generated using sam init:
template.yaml:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  sam-app

  Sample SAM Template for sam-app

Globals:
  Function:
    Timeout: 3
    MemorySize: 128

Resources:
  CompressedApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: "Prod"
      MinimumCompressionSize: 0

  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello-world/
      Handler: app.lambdaHandler
      Runtime: nodejs16.x
      Architectures:
        - x86_64
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
            RestApiId: !Ref CompressedApi
And the corresponding node code for app.lambdaHandler:
exports.lambdaHandler = async (event, context) => {
  let response;
  try {
    // const ret = await axios(url);
    response = {
      'statusCode': 200,
      'body': JSON.stringify({
        message: 'hello world',
        // location: ret.data.trim()
      })
    }
  } catch (err) {
    console.log(err);
    return err;
  }
  return response
};
After running sam local start-api and opening 127.0.0.1:3000/hello, I would expect (since my browser sends an Accept-Encoding header offering gzip, deflate, or identity compression) to see a compressed response, even for a small payload of 25 bytes, because my MinimumCompressionSize is set to zero.
Instead I get an uncompressed response.
Why doesn't the Lambda function response get compressed? Did I make some mistake in composing this template.yaml?
Note: I'm aware it's possible to set up gzip compression with an HttpApi resource (instead of the Api type used here) by having the Lambda function itself compress the response body, using a process like the one described in this blog post, but my understanding is that this type of Api resource should support compression out of the box and that there's some mistake I'm making in setting up my application.
The event body is not getting base64-encoded when invoking sam local start-api and sending a multipart request, but it is encoded in the cloud. I'd like to have the same behavior in my local environment.
Steps to reproduce the issue:
Create the Hello World project provided by sam init
Add a POST method:
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction
    Properties:
      CodeUri: hello-world/
      Handler: app.lambdaHandler
      Runtime: nodejs12.x
      Events:
        HelloWorld:
          Type: Api # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
          Properties:
            Path: /hello
            Method: post
            ContentHandling: CONVERT_TO_BINARY # This is not doing the magic as I was expecting
            BinaryMediaTypes:
              - "*/*"
              - multipart/form-data
Return the isBase64Encoded flag in the handler:
exports.lambdaHandler = async (event, context) => {
  let response;
  try {
    // const ret = await axios(url);
    response = {
      'statusCode': 200,
      'body': JSON.stringify({
        message: event.isBase64Encoded,
        // location: ret.data.trim()
      })
    }
  } catch (err) {
    console.log(err);
    return err;
  }
  return response
};
Perform the HTTP request:
curl --location --request POST 'http://127.0.0.1:3000/hello' \
  --header 'saa: csv/csv-file' \
  --form 'foo=@/home/user/csv-file.csv'
The response is always the same:
{
  "message": false
}
I've tried to use the proxy integration but it didn't work.
My workaround is to have something like this in the handler:
const csvString = event.isBase64Encoded ? Buffer.from(event.body, 'base64').toString('utf-8') : event.body;
I'm trying to run a simple script on AWS Lambda, using Serverless to deploy it. The script fetches a URL and returns it (a proxy), but for some reason I can't see the response.
The script in question:
'use strict';

let axios = require('axios')

module.exports.hello = async (event, context) => {
  let res = await axios.get('http://example.com')
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: res,
      input: event,
    }),
  }
};
My serverless YML:
service: get-soundcloud-tracks

provider:
  name: aws
  runtime: nodejs8.10
  profile: home

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: users/create
          method: get
          cors: true
The solution was changing res to res.data inside the JSON.stringify call.
I have a serverless function (code below) deployed on AWS Lambda using the Serverless framework. From within this function I call an external web API.
When I run serverless invoke -f my_func I get the expected response, but if I run a curl command it fails and I get {"message": "Internal server error"}.
This is my curl command:
curl -X GET \
  https://0wb3echzu8.execute-api.us-east-1.amazonaws.com/dev/my_func \
  -H 'cache-control: no-cache' \
  -H 'postman-token: 1fc551c0-7010-e1e3-7287-6a0d82d1e91a'
This is my code:
var request = require("request");

var options = {
  method: 'GET',
  url: 'https://api.coinmarketcap.com/v2/listings/',
  headers: {
    'postman-token': '090e284e-62ad-29f0-0f10-92ae14656a37',
    'cache-control': 'no-cache'
  }
};

module.exports.my_func = (event, context, callback) => {
  request(options, function (error, response, body) {
    if (error) { console.log(error); callback(null, error) }
    console.log(body)
    callback(null, body)
  });
};
This is the serverless.yml file:
service: aws-nodejs
app: sonarsic
tenant: vincent

provider:
  name: aws
  runtime: nodejs6.10

functions:
  my_func:
    handler: handler.my_func
    events:
      - http:
          path: my_func
          method: get
          cors: true
It must have something to do with calling the web API in my function; if I don't call a web API, it works fine.
If I check the logs via serverless logs -f my_func, I only get the logs of the calls that worked using serverless invoke.
What can I do to find out what is going wrong inside my function when it is invoked via curl?
Adding cors: true to the http section does not solve the problem.
CloudWatch is attached to the Lambda function, but it seems nothing is written to it.
After some discussion in chat, we discovered that the statusCode on the response object was missing:
let request = require('request');

let options = {
  method: 'GET',
  url: 'https://api.coinmarketcap.com/v2/listings/',
  headers: {
    'postman-token': '090e284e-62ad-29f0-0f10-92ae14656a37',
    'cache-control': 'no-cache',
  },
};

module.exports.my_func = (event, context, callback) => {
  request(options, (error, response, body) => {
    if (error) { console.log(error); callback(null, error) }
    console.log(body)
    callback(null, { body, statusCode: 200 }) // response statusCode added
  });
};
I want to use Google Cloud Storage JSON API to upload some JSON as file into Cloud Storage. My code is as follows:
import google from 'googleapis';
import fs from 'fs';

export function uploadFileToStorage(json, callback) {
  const storage = google.storage({
    version: 'v1'
  });

  // 'https://www.googleapis.com/auth/cloud-platform'
  const request = {
    bucket: 'bucket-id',
    resource: {},
    uploadType: 'media',
    name: 'test',
    media: {
      mimeType: 'application/json',
      body: json
    },
    auth: 'api-key'
  };

  storage.objects.insert(request, callback);
}
The JSON is passed to the uploadFileToStorage() function, which uses googleapis to connect to Google Cloud Storage. The code should work, but I am not able to pass the API key correctly.
It is throwing a 400 - Bad request error.