Is it possible for one Lambda function to deploy a new Lambda function, with serverless.yml and handler.py stored in S3 or GitHub?
I am using the Serverless Framework and Python.
An AWS Lambda function can deploy another Lambda function using an S3 trigger. The simple flow looks like this:
Bundle project and upload to S3 --> S3 Trigger --> Lambda (Create or Update Function Code)
Here is the complete documentation
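As a rough sketch of the last step in that flow (assuming boto3 is available in the Lambda runtime; the function name is a placeholder, and the handler's role would also need `lambda:UpdateFunctionCode` permission), the updating Lambda could look like this:

```python
# Hypothetical target name; in practice this might come from an
# environment variable or be derived from the uploaded key.
FUNCTION_NAME = "my-target-function"

def handler(event, context, client=None):
    """S3 ObjectCreated trigger: point the target Lambda at the
    freshly uploaded deployment bundle."""
    if client is None:
        import boto3  # deferred so tests can inject a fake client
        client = boto3.client("lambda")
    record = event["Records"][0]["s3"]
    return client.update_function_code(
        FunctionName=FUNCTION_NAME,
        S3Bucket=record["bucket"]["name"],
        S3Key=record["object"]["key"],
    )
```

The same call with `Publish=True` would also cut a new version on each upload.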
Related
I would like to trigger an AWS Lambda function whenever a new file is added to Amazon FSx, in order to perform an action on the file using the Lambda function that gets notified.
While considering AWS CloudTrail, EventBridge, and CloudWatch to trigger the Lambda function, I was unable to find FSx among the data source options for these monitoring services. Any suggestion on what tool can be used?
When I try to deploy to an existing Lambda function configured in serverless.yml as follows, it says "An error occurred: ApiLambdaFunction - an-existing-function-name-created-by-my-devops already exists."
functions:
  api:
    name: an-existing-function-name-created-by-my-devops
So is it not allowed to deploy to an existing Lambda function that was not created by Serverless?
Since Serverless manages your resources via a CloudFormation stack, you may be able to import the Lambda function in the console (Import Existing Resources into a CloudFormation Stack) and then run the deploy again.
I have not tried this, and there is most probably a better solution.
Edit: the precondition is that you successfully created your stack before adding your desired function.
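If you would rather script the import than use the console, the same operation is exposed through the CloudFormation API as an IMPORT change set. This is only a hedged sketch: the stack name, logical ID, and template body are placeholders you would supply yourself, and the template must already describe the function (CloudFormation requires a DeletionPolicy on imported resources).

```python
def import_function(stack, logical_id, function_name, template_body, cfn=None):
    """Create an IMPORT change set that brings an existing Lambda
    function under the stack's management. The change set still has
    to be reviewed and executed afterwards."""
    if cfn is None:
        import boto3  # deferred so a fake client can be injected in tests
        cfn = boto3.client("cloudformation")
    return cfn.create_change_set(
        StackName=stack,
        ChangeSetName="import-existing-lambda",
        ChangeSetType="IMPORT",
        ResourcesToImport=[{
            "ResourceType": "AWS::Lambda::Function",
            "LogicalResourceId": logical_id,
            "ResourceIdentifier": {"FunctionName": function_name},
        }],
        TemplateBody=template_body,
    )
```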
I am developing an AWS Lambda function and I have the option of using one of these two frameworks, but I can't find a good comparison of the differences between them. Which one should be used, and in which case?
The AWS Serverless Application Model (AWS SAM) is used to define a serverless application. You deploy the application to AWS Lambda via S3.
SAM is especially useful for testing an AWS Lambda function locally, because it is not practical to deploy to AWS Lambda every time you make a code change.
You can configure SAM in your IDE (such as Eclipse), test and finalize the code, and then deploy it to Lambda.
For more info about SAM: https://github.com/awslabs/serverless-application-model/blob/master/HOWTO.md
So I have an existing CloudFormation stack up and running. However, I haven't found a solution to my problem, which is that I want my resources, for example EC2 and Lambda, to have up-to-date code.
It seems that a CloudFormation stack doesn't update if the template has no changes. I keep my code in an S3 bucket as a zip file, but when this file changes, CloudFormation doesn't notice it.
Is my best bet a git hook script that uses the AWS CLI to update the EC2 and Lambda code, or is there some more elegant way for CloudFormation to notice these changes?
Create a new Lambda function that updates your existing Lambda and EC2 code, or have it call CloudFormation to update them. On your S3 bucket, configure an ObjectCreated event that invokes that new Lambda function. Then, whenever a new zip file is put in S3, your EC2 and Lambda code gets updated.
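For the CloudFormation variant, one way to make the stack notice the new zip is to pass the S3 key as a stack parameter that feeds the function's Code block, so each upload under a new key changes the parameter value and forces an update. A minimal sketch, assuming a hypothetical stack name and a template that exposes a CodeS3Key parameter:

```python
def handler(event, context, cfn=None):
    """S3 ObjectCreated trigger: re-run the existing stack so the
    Lambda resource picks up the newly uploaded zip key. Assumes the
    template wires the CodeS3Key parameter into the function's Code."""
    if cfn is None:
        import boto3  # deferred so tests can inject a fake client
        cfn = boto3.client("cloudformation")
    key = event["Records"][0]["s3"]["object"]["key"]
    return cfn.update_stack(
        StackName="my-app-stack",  # hypothetical stack name
        UsePreviousTemplate=True,
        Parameters=[{"ParameterKey": "CodeS3Key", "ParameterValue": key}],
        Capabilities=["CAPABILITY_IAM"],
    )
```

Uploading to the same key every time would not work with this approach; the key (or an S3 object version) has to change for CloudFormation to see a difference.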
I am trying to understand whether we can create a Lambda function which, when executed, will create a new Lambda function (if it doesn't exist) or publish a new version if it already exists. The new Lambda function's code will be stored in CodeCommit, so the invoking Lambda function should be able to clone the CodeCommit repo and use the cloned code to create the new one. Any suggestions?
Here is how I would do it:
CodeCommit (Trigger) --> Lambda (Trigger External Process to Build and
Zip to S3) --> S3 (Create Trigger) --> Lambda (Update Lambda Code [Self or Other Lambda])
With CodeCommit you can also pass in parameters that differentiate repositories, to trigger different build processes.
Hope it helps.
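The create-or-update decision in the final step can be sketched like this (a hedged example, not a full pipeline: runtime, role, and handler values are placeholders, and the zip bytes are assumed to have been built and fetched already):

```python
def deploy(zip_bytes, name, role_arn, client=None):
    """Create the function if it doesn't exist yet,
    otherwise just push the new code."""
    if client is None:
        import boto3  # deferred so tests can inject a fake client
        client = boto3.client("lambda")
    try:
        client.get_function(FunctionName=name)
    except client.exceptions.ResourceNotFoundException:
        return client.create_function(
            FunctionName=name,
            Runtime="python3.12",        # placeholder runtime
            Role=role_arn,
            Handler="handler.handler",   # placeholder handler
            Code={"ZipFile": zip_bytes},
        )
    return client.update_function_code(FunctionName=name, ZipFile=zip_bytes)
```

Passing `Publish=True` to `update_function_code` would give you the "new version" behavior you describe.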