How to avoid AWS SAM rebuilding and re-uploading a Gradle function with unchanged code? - gradle

I'm developing an application with Micronaut, using the SAM CLI to deploy it on AWS Lambda. As I added dependencies and developed new features, the function packages got bigger and bigger (they are now around 250 MB). This makes deployment take a while.
On top of that, every time I edit template.yaml and run sam build && sam deploy to try a new configuration for S3, RDS, etc., I have to wait for Gradle to build the function again (even though it's unchanged since the last deployment) and for the whole package to be uploaded to S3.
Since I'm configuring this application through a lot of trial and error in SAM, waiting for this whole process to complete just to get an error because of some misconfiguration is getting quite counterproductive.
Also, my SAM S3 bucket is already at 10 GB after just a single day of work. This may get expensive in the long run.
Is there a way to avoid these Gradle rebuilds and re-uploads when the function code is unchanged?

If you are only updating the template.yml file, you can copy the new version to the ./.aws-sam/build folder and then run sam deploy:
$ cp template.yml ./.aws-sam/build/template.yml
$ sam deploy
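Depending on your SAM CLI version, the build step itself can also reuse cached artifacts for functions whose source hasn't changed; a hedged sketch:
# Reuse cached build artifacts where the function source is unchanged
# (requires a reasonably recent SAM CLI).
$ sam build --cached
$ sam deploy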
If you are editing a Lambda, you can update the function code by itself (after it has been created in the template and deployed, of course). That can be done via the AWS CLI update-function-code command:
rm index.zip
cd lambda
zip -X -r ../index.zip *
cd ..
aws lambda update-function-code --function-name MyLambdaFunction --zip-file fileb://index.zip
More info can be found here:
Alexa Blogs - Publishing Your Skill Code to Lambda via the Command Line Interface
AWS CLI Command Reference - lambda - update-function-code
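Note that with a package as large as the one in the question (~250 MB), the direct --zip-file upload may exceed the size limit for that path, so a variant that stages the zip in S3 first can be used instead; the bucket and key names below are placeholders:
# Stage the zip in S3, then point update-function-code at the object.
aws s3 cp index.zip s3://my-deploy-bucket/index.zip
aws lambda update-function-code \
    --function-name MyLambdaFunction \
    --s3-bucket my-deploy-bucket \
    --s3-key index.zip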
my SAM S3 bucket is at 10 GB
Heh. Yeah, start deleting stuff. Maybe you can write a script using aws s3?
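For example, something along these lines clears out old deployment artifacts; the bucket name is a placeholder for your own SAM deployment bucket, so double-check it before deleting:
# Remove all objects from the SAM deployment bucket; they are re-uploaded
# on the next sam deploy.
aws s3 rm s3://my-sam-deployment-bucket --recursive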

Related

CDK: which command should I run after changing my code?

I'm running aws serverless using aws cdk and aws-sdk.
I wrote my code and then ran the following commands:
cdk synth
cdk deploy
Now I update the code locally on my machine and want to push the changes.
Which command/s should I run now?
Should I run cdk destroy in between?
Thanks
Running cdk deploy will first synthesize the stacks and then deploy the changes. No need to run synth prior.
Deploying will apply the current stack and destroy any resources that are no longer in the code, so there is no need to run cdk destroy first.
Use cdk watch. The CDK will observe the files you specified in your cdk.json file and automatically deploy your changes, which is also much faster.
Here is the cdk watch documentation:
https://cdkworkshop.com/20-typescript/30-hello-cdk/300-cdk-watch.html#cdk-watch (start here)
https://aws.amazon.com/blogs/developer/increasing-development-speed-with-cdk-watch/
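A minimal usage sketch, assuming the watch section of your cdk.json already points at your source directories:
# Continuously watch the paths listed in cdk.json and redeploy on change;
# supported changes (e.g. Lambda code) are hotswapped instead of going
# through a full CloudFormation deployment.
cdk watch
# One-off fast deployment that hotswaps supported resources where possible.
cdk deploy --hotswap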
Just run the same commands again
cdk synth
cdk deploy

How to overwrite the folder in an S3 bucket when using sam deploy for Lambda deployment

I am using the command below to deploy a Lambda. It always creates a new folder (a .template file and one more) in the S3 bucket when there is a change in my Lambda project files. I want to overwrite the folder, so that at any point in time there is only one folder. How do I do that?
sam deploy --no-fail-on-empty-changeset --s3-bucket bucketName --capabilities CAPABILITY_IAM --stack-name stackName --parameter-overrides "ParameterKey=Stage,ParameterValue=staging"
The easiest way to keep deployment artefact buckets "tidy" is to add a Bucket Lifecycle Rule that expires (deletes) objects after an arbitrary number of days.
You can create the rule in the S3 Console (Management > Lifecycle rules). You can safely expire all the objects, including the "current" version. Artefacts are only read at deploy time and will be re-created with the next sam deploy.
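The same rule can also be set from the CLI; a hedged sketch where the bucket name and the expiry period are placeholders:
# Expire every object in the deployment bucket after 7 days.
aws s3api put-bucket-lifecycle-configuration \
    --bucket my-sam-deployment-bucket \
    --lifecycle-configuration '{
      "Rules": [
        {
          "ID": "expire-old-deployment-artefacts",
          "Status": "Enabled",
          "Filter": {"Prefix": ""},
          "Expiration": {"Days": 7}
        }
      ]
    }'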

AWS-CLI Upload only one file in lambda (update-function-code)

I am using the AWS CLI to upload my files into my Lambda (because I want to ship my own boto3).
The problem is that I have to upload the whole project (my files + boto3) to my Lambda.
I have to wait ~5 min each time (my connection is kinda bad).
The question is: can I upload only the files that I want (like git does)?
Currently I use this command:
zip -r function.zip . && aws lambda update-function-code --function-name MYFUNC --zip-file fileb://function.zip && rm function.zip
Thanks
Create a Lambda layer for your common files and attach it to your Lambda.
Then you can upload just the frequently changing files directly.
Lambda layers
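A hedged sketch of that setup, assuming a Python runtime (names, runtime version, and the layer ARN are placeholders; for Python, the layer contents must sit under a python/ directory inside the zip):
# Package boto3 (and other rarely-changing dependencies) as a layer.
pip install boto3 -t layer/python
cd layer && zip -r ../layer.zip python && cd ..
# Publish the layer and note the LayerVersionArn in the output.
aws lambda publish-layer-version \
    --layer-name my-boto3-layer \
    --zip-file fileb://layer.zip \
    --compatible-runtimes python3.9
# Attach the layer to the function (replace the ARN with the one returned above).
aws lambda update-function-configuration \
    --function-name MYFUNC \
    --layers arn:aws:lambda:us-east-1:123456789012:layer:my-boto3-layer:1
# From now on, zip and upload only your own (small) source files,
# excluding the libraries now provided by the layer (paths are illustrative).
zip -r function.zip . -x "boto3/*" "botocore/*" && \
    aws lambda update-function-code --function-name MYFUNC --zip-file fileb://function.zip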

AWS: CodeDeploy for Lambda can't read appspec

I'm attempting to setup CodePipeline to manage the deployment of a very simple Lambda function.
I'm completely stuck on a problem with the deployment step, and cannot figure out what could be wrong.
When the pipeline attempts to run the CodeDeploy action, it fails with the error...
BundleType must be either YAML or JSON
This is my appspec...
version: 0.0
Resources:
  - my-function:
      Type: AWS::Lambda::Function
      Properties:
        Name: "my-function"
My pipeline doesn't have a build step, as it's just a simple js file, with no dependencies, so no build is required.
I've tried adding an action to deploy to S3, and I can confirm that the zip file that's being sent to s3 contains the appspec.yml and index.js and that these are both in the root.
Most of the examples I've seen use a buildspec, but I'm not sure why I would need this, or what it would even do if I had one.
There is nothing wrong with your setup; it is a shortcoming of the services that you cannot use CodeDeploy in a CodePipeline action to deploy a Lambda function.
The reason is that CodeDeploy expects a JSON or YAML appspec file for a Lambda deployment, but CodePipeline currently only supports ZIP as a bundle type, so the error is thrown.
The workaround customers use to deploy a Lambda in CodePipeline is a CloudFormation deploy action (SAM, to be exact). Please see this tutorial on the recommended approach:
https://docs.aws.amazon.com/lambda/latest/dg/build-pipeline.html

AWS: Help setting up CodeDeploy in a CodePipeline

It looks like it's impossible to get CodeDeploy to work in a CodePipeline project with CodeBuild.
First I set up a pipeline with 3 stages: Source, Build and Deploy. The first 2 stages work perfectly, but the third (CodeDeploy) throws this error:
CodeBuild pushes the output artifacts to S3 in a .zip file, which is not supported by CodeDeploy.
To work around this, I tried to set up a Lambda function between CodeBuild and CodeDeploy like this: Source -> CodeBuild -> Invoke Lambda -> CodeDeploy. The Lambda function uploads the appspec.yml file to S3 and calls putJobSuccessResult, but I still get the same error:
BundleType must be either YAML or JSON
There is a known limitation: deploying a Lambda through CodePipeline with CodeDeploy as the deployment provider is not supported yet.
This is because CodePipeline will always zip the bundle/artifact, whereas CodeDeploy expects a YAML/JSON file (the appspec) as the source for a Lambda function deployment.
In order to work around this limitation, you have two options:
Run AWS CLI commands inside your CodeBuild stage to update/deploy your Lambda function,
OR
Use CodeBuild to package your Lambda function code and push the artifact to a CloudFormation stage, which will update or create your Lambda function resource; see the packaging sketch after the reference below. You should find the reference documentation at [1] useful for the information required to package your SAM application.
Ref:
[1] SAM Packaging - https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-deploying.html#serverless-sam-cli-using-package-and-deploy
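A hedged sketch of the packaging step you would run inside CodeBuild (template path and bucket name are placeholders):
# Upload the local code referenced by the SAM template to S3 and write a
# "packaged" template whose CodeUri values point at the uploaded artifacts;
# the CloudFormation deploy action then consumes packaged.yaml.
aws cloudformation package \
    --template-file template.yaml \
    --s3-bucket my-artifact-bucket \
    --output-template-file packaged.yaml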
