If I understand correctly, it is possible to deploy my Lambda function using a Docker container image that I have built and pushed to AWS ECR.
https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/
But the article mentioned above does not explain how to integrate this with the Docker image that I am already using. For example, I have a dockerized API built with a Flask application. How do I convert it into a Lambda function?
No clear path for development in a serverless environment.
I have an API Gateway backed by some Lambda functions declared in Terraform. I deploy to the cloud and everything is fine, but how do I go about setting up a proper workflow for development? It feels like a struggle to push every small code change to the cloud just to run your code while developing. Terraform has started getting some support from the SAM framework for running your Lambda functions locally (https://aws.amazon.com/blogs/compute/better-together-aws-sam-cli-and-hashicorp-terraform/), but there is still no way to simulate a local server and test your endpoints in Postman, for example.
First of all, I use the serverless plugin instead of Terraform, so my answer is based on what you provided and what I found around.
From what I understood so far from the provided documentation, you are able to run the SAM CLI with Terraform (cf. the "Local testing" chapter).
You can follow that documentation to invoke your functions locally.
I recommend using JSON files to create your test cases instead of injecting the payload via stdin.
The first step is to create your payload in a JSON file and to invoke your Lambda with that JSON payload, like:
sam local invoke "YOUR_LAMBDA_NAME" -e ./path/to/yourjsonfile.json
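The JSON file simply holds the event your function will receive. As a minimal sketch, an API Gateway style payload could look like this (the field values are purely illustrative, adjust them to whatever your handler expects):

{
  "httpMethod": "GET",
  "path": "/hello",
  "queryStringParameters": { "name": "world" },
  "headers": { "Accept": "application/json" },
  "body": null
}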
I'm trying to install azure-cli on AWS Lambda for integration purposes. The azure-cli package seems to be too large for AWS Lambda, and I am unable to upload the zip file.
I want to create a service principal (client secret) in Azure from a Lambda function using Python.
The only way I have found to create a service principal is through azure-cli.
Is there any other way to create a client secret? Or is there a way to reduce the azure-cli package size so it can be uploaded to AWS Lambda?
I have gone through many blog posts online, but they all require azure-cli to create the client secret.
install azure-cli on aws lambda
Do you mean pip installing a Python package for use within AWS Lambda?
If so: one of the great things about using Python is the availability of a huge number of libraries that help you implement solutions quickly without having to code all the classes and functions from scratch. AWS Lambda offers a set of Python libraries that you can import into your function. The problem starts when you have to use libraries that are not available. One way to handle this is to install the library locally, inside the same folder as your lambda_function.py file, zip the files, and upload the archive to the AWS Lambda console. Installing libraries locally and uploading them every time you create a new Lambda function can be a laborious and inconvenient process.
To make your life easier, AWS offers the possibility to upload your libraries as Lambda layers: a file structure where you store your libraries, upload it to Lambda independently, and use it in your code whenever needed. Once you create a Lambda layer, it can be reused by any other new Lambda function.
Getting started with AWS Lambda layers for Python only takes a few steps.
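As a minimal sketch (assuming Python 3.9, the AWS CLI configured, and requests as an example dependency), building and publishing a layer from your machine could look like:

mkdir -p python
pip install requests -t python/          # layer zips must place libraries under a top-level python/ directory
zip -r layer.zip python
aws lambda publish-layer-version --layer-name my-shared-libs --zip-file fileb://layer.zip --compatible-runtimes python3.9

After attaching the layer to a function, import requests works in the handler without bundling the library into the function's own package.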
Could this API be deployed to Google Cloud Functions?
https://github.com/Mdsp9070/someoneFlix/tree/master/backend
I tried to deploy but I'm getting this error:
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Build failed: main.go:16:2: import "flix-api.localhost/flix-api" is a program, not an importable package; Error ID: 975560ac
You have to implement the correct signature to handle function requests, and the entry point has to be an exported function:
// Cloud Functions (Go) invokes an exported function with the standard net/http handler signature.
func MyFunction(w http.ResponseWriter, r *http.Request) {
    // ... handle the request and write the response to w
}
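If you did want to deploy it as a Cloud Function, the command would be roughly along these lines (a sketch only; the function name, entry point, runtime version, and region are assumptions):

gcloud functions deploy flix-api --entry-point MyFunction --runtime go113 --trigger-http --allow-unauthenticated --region us-central1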
Here, you start your own web server with several endpoints. That's not the Cloud Functions pattern; it's more of a Cloud Run service. Try this (with the correct project ID):
gcloud alpha builds submit --pack=image=gcr.io/PROJECT_ID/backend && \
gcloud run deploy --platform=managed --region=us-central1 --image=gcr.io/PROJECT_ID/backend --port=3333 --allow-unauthenticated backend
Then call the URL provided. I tested with your code and it worked on my side; I just got an error in the logs, "Error on loadinf .env file". You might have to set --set-env-vars to add environment variables.
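For instance (a sketch; the variable names are placeholders, not ones your app necessarily reads), the environment variables can be passed at deploy time:

gcloud run deploy backend --platform=managed --region=us-central1 --image=gcr.io/PROJECT_ID/backend --port=3333 --allow-unauthenticated --set-env-vars=DB_HOST=mydb.example.com,API_KEY=changeme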
If you're interested, I can explain the commands in more detail.
EDIT
Some explanations
Cloud Functions and Cloud Run share the same backend. Cloud Run hosts a web server (in a customizable container). Cloud Functions packages the function into a web server (that's why you have to respect the function signature, so that the generic web server can call it).
Cloud Run can handle concurrent requests on the same instance (up to 80), Cloud Functions only 1. You can get the exact same behavior if you set the Cloud Run concurrency parameter to 1.
Cloud Run needs a container. When you have your code, you can write a Dockerfile (you can find examples in the documentation, and a rough sketch below). You can build the container with Cloud Build or with docker build. In my example, I used an alpha (and undocumented) Cloud Build command based on the Buildpacks project. Buildpacks detect your language and main file and automatically create a standard container from your code. Perfect for a quick test and for containers that don't require customization.
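For reference, a minimal multi-stage Dockerfile for a Go backend like this one might look roughly like the following (the Go version, paths, and port are assumptions to adapt to the repository):

# Build stage: compile the Go binary
FROM golang:1.16 AS build
WORKDIR /app
COPY . .
RUN go build -o /server .

# Runtime stage: small base image with just the binary
FROM gcr.io/distroless/base-debian10
COPY --from=build /server /server
# The app listens on 3333, which matches the --port flag used above
EXPOSE 3333
CMD ["/server"]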
I am developing an AWS Lambda function and I have the option of using one of these two, but I can't find any good explanation of the difference between them. Which one should be used, and in which case?
The AWS Serverless Application Model (AWS SAM) is used to define a serverless application. You then deploy this application to AWS Lambda via S3.
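For illustration, a minimal SAM template defining a single function might look something like this (a sketch only; the handler, runtime, and code path are assumptions):

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler   # module.function inside CodeUri
      Runtime: python3.9
      CodeUri: ./src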
SAM comes into play when testing the Lambda function locally, because it's not easy to deploy to AWS Lambda and test every time you make a code change.
You can configure SAM in your IDE (Eclipse, for example), test and finalise the code, and then deploy it to Lambda.
For more info about SAM, see https://github.com/awslabs/serverless-application-model/blob/master/HOWTO.md
Hi, I'm trying to connect to a MySQL server hosted on AWS using an AWS Lambda function. I'm very new to this, so it would be of great help if someone could provide some sample code.
The objective is to develop an Alexa skill which retrieves certain data from the DB and provides it as output.
Please read the Lambda documentation on creating a Lambda deployment package, which will answer your question. Ensure the packaging environment is the same as the Lambda environment (Amazon Linux):
http://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
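To give a rough idea of the handler side, here is a sketch using the pure-Python PyMySQL client (the host, credentials, table, and query are placeholders, and PyMySQL has to be bundled in the deployment package or a layer):

import pymysql

# Placeholder connection details; in practice read these from environment variables or Secrets Manager.
DB_HOST = "your-db-instance.us-east-1.rds.amazonaws.com"
DB_USER = "admin"
DB_PASSWORD = "change-me"
DB_NAME = "alexa_data"

def lambda_handler(event, context):
    # Open a connection for this invocation; for heavier traffic you would reuse it across invocations.
    connection = pymysql.connect(host=DB_HOST, user=DB_USER, password=DB_PASSWORD,
                                 database=DB_NAME, connect_timeout=5)
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT item_name FROM items LIMIT 5")
            rows = [row[0] for row in cursor.fetchall()]
    finally:
        connection.close()

    # The Alexa skill code can turn this data into a spoken response.
    return {"items": rows}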