AWS CodeBuild + AWS Lambda = Error: Could not find the required 'MyAssembly.deps.json' - aws-lambda

I received the following error in CloudWatch Logs after using AWS CodePipeline (AWS CodeBuild) to deploy my C# Lambda function code:
Could not find the required 'MyAssembly.deps.json'.
This file should be present at the root of the deployment package.: LambdaException

The problem in my case was that the Linux file permissions on the files inside the zip were set to 000, so when AWS Lambda extracted the zip it did not have permission to read MyAssembly.deps.json.
I was using C#'s System.IO.Compression.ZipFile.CreateFromDirectory to author the zip file. I had to shell out to the native zip program to produce a zip file that worked.
Big thanks to https://forums.aws.amazon.com/message.jspa?messageID=856247
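For illustration only, here is a rough sketch of the underlying issue (in Python rather than the C# I was using, and with made-up paths): each zip entry stores Unix permission bits in its external attributes, and forcing them to something readable avoids the 000 problem when Lambda unpacks the package.

import os
import zipfile

# Sketch: build a deployment zip and force readable Unix permissions (0o755)
# on every entry, so the extracted files are not mode 000.
# "publish" and "package.zip" are placeholder paths.
def zip_with_permissions(source_dir: str, zip_path: str) -> None:
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                full_path = os.path.join(root, name)
                arcname = os.path.relpath(full_path, source_dir)
                info = zipfile.ZipInfo(arcname.replace(os.sep, "/"))
                info.compress_type = zipfile.ZIP_DEFLATED
                # Unix permissions live in the top 16 bits of external_attr.
                info.external_attr = 0o755 << 16
                with open(full_path, "rb") as f:
                    zf.writestr(info, f.read())

zip_with_permissions("publish", "package.zip")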

I know this is a bit of an old question, but I'm writing this answer for anyone still facing the problem on a Windows system.
This is with .NET Core 3.1.
First, run this command in the Package Manager Console to make sure the .deps.json file is included in the publish output:
dotnet publish /p:GenerateRuntimeConfigurationFiles=true
Then zip all the files in the publish folder into an archive named after the namespace folder and upload the zip file to AWS Lambda using the console.
That worked for me.
If it doesn't, zip all the project files (not the published ones) instead and upload that to AWS Lambda.
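If you'd rather script the "zip all the files in the publish folder" step than do it by hand in Explorer, here is a minimal sketch in Python (the publish path and archive name are assumptions; adjust them to your project) that packs the publish output with the files at the root of the archive, which is where Lambda expects the .deps.json to be:

import shutil

# Zip everything under the publish folder into MyFunction.zip, with the
# files at the root of the archive. The path below is a guess; adjust it
# to your target framework and build configuration.
publish_dir = r"bin\Release\netcoreapp3.1\publish"
shutil.make_archive("MyFunction", "zip", root_dir=publish_dir)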

Related

aws s3 glacier restore from vault

I have a vault and need to restore one of the folders from it. I initiated the job using the AWS CLI and got the inventory as a JSON file, but I am unable to retrieve the complete folder from the inventory. Can anyone help me restore the folder?
I am able to get the inventory in CSV format to see the archive IDs of the files, but is it possible to retrieve the complete folder, since it shows a separate archive ID for every file in the folder?

unzip requirements when invoking aws lambda

I'm trying to invoke an AWS Lambda function from another AWS Lambda function using boto3 in Python.
To build the first Lambda I used the Serverless Framework with custom: zip:true.
When I invoke it I get the message:
{'errorMessage': "Unable to import module 'handler': No module named 'joblib'", 'errorType': 'Runtime.ImportModuleError'}
How can I unzip the first Lambda's requirements?
Thanks for your help
By default, Serverless creates a bucket with a generated name like
<service name>-serverlessdeploymentbuck-1x6jug5lzfnl7 to store your
service's stack state.
Please go to S3 and find your deployment bucket. There you can download your latest Lambda archive and unzip it; that is how you can see whether all the required modules were really uploaded.
Here is how to add extra dependencies not included by AWS.
https://docs.aws.amazon.com/lambda/latest/dg/python-package.html#python-package-dependencies
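As a rough sketch of the "download the archive and check it" step mentioned above (the bucket name and prefix below are placeholders; use whatever Serverless actually created for your service), you can pull the newest deployment artifact and list its contents to confirm joblib really made it into the package:

import zipfile
import boto3

# Placeholder names; replace with your real deployment bucket and service prefix.
BUCKET = "my-service-serverlessdeploymentbuck-1x6jug5lzfnl7"
PREFIX = "serverless/my-service/"

s3 = boto3.client("s3")

# Find the most recently modified .zip artifact under the prefix.
objects = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX).get("Contents", [])
zips = [o for o in objects if o["Key"].endswith(".zip")]
latest = max(zips, key=lambda o: o["LastModified"])
s3.download_file(BUCKET, latest["Key"], "lambda-package.zip")

# List the entries and check whether joblib was actually packaged.
with zipfile.ZipFile("lambda-package.zip") as zf:
    names = zf.namelist()
print("joblib packaged:", any(n.startswith("joblib/") for n in names))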

cloudformation validation not working locally on windows

I'm trying to validate a CloudFormation template using the AWS CLI locally on my Windows machine.
The command is:
aws cloudformation validate-template --template-body file:///C:/AWS/template.json
But I'm getting the error below:
Error parsing parameter '--template-body': Unable to load param file file:///C:/AWS/template.json: [Errno 2] No such file or directory: 'file:///C:/AWS/template.json'
Check the permissions on the AWS directory and on your template.json file as well.
On Windows, files created in the system drive (C:\) are sometimes restricted by user permissions, so other programs are not allowed to access them easily.
Second way:
You can upload your template to any S3 bucket and then validate the file using its S3 URL.
For that operation the AWS CLI already has the permissions it needs.
Below is the command you can try; change the S3 URL to point to your own bucket and stored file:
aws cloudformation validate-template --template-url https://s3.amazonaws.com/cloudformation-templates-us-east-1/S3_Bucket.template
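If the CLI keeps tripping over the Windows file:// path, a third option (just a sketch; it assumes boto3 is installed and your credentials are configured) is to read the template yourself and call the same validation API through boto3:

import boto3

# Read the template from disk and validate it via the CloudFormation API,
# sidestepping the CLI's file:// parameter handling on Windows.
with open(r"C:\AWS\template.json") as f:
    template_body = f.read()

cfn = boto3.client("cloudformation")
result = cfn.validate_template(TemplateBody=template_body)
print(result.get("Parameters", []))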

How to avoid AWS SAM rebuilding and reuploading a Gradle function with unchanged code?

I'm developing an application with Micronaut, using the SAM CLI to deploy it on AWS Lambda. As I kept adding dependencies and developing new features, the function packages got bigger and bigger (they are now around 250 MB). This makes deployment take a while.
On top of that, every time I edit template.yaml and then run sam build && sam deploy to try a new configuration for S3, RDS, etc., I have to wait for Gradle to build the function again (even though it hasn't changed since the last deployment) and for the whole package to be uploaded to S3.
Since I'm configuring this application through a lot of trial and error with SAM, waiting for this process to complete just to get an error caused by some misconfiguration is getting quite counterproductive.
Also, my SAM S3 bucket is at 10 GB after just a single day of work. This may get expensive in the long run.
Is there a way to avoid those Gradle rebuilds and reuploads when the function code is unchanged?
If you are only updating the template.yml file, you can copy the new version into the ./.aws-sam/build folder and then run sam deploy:
$ cp template.yml ./.aws-sam/build/template.yml
$ sam deploy
If you are editing a Lambda function, you could try updating the function code by itself (after you have created it in the template and deployed it, of course). That can be done via the AWS CLI update-function-code command:
rm index.zip
cd lambda
zip -X -r ../index.zip *
cd ..
aws lambda update-function-code --function-name MyLambdaFunction --zip-file fileb://index.zip
More info can be found here:
Alexa Blogs - Publishing Your Skill Code to Lambda via the Command Line Interface
AWS CLI Command Reference - lambda - update-function-code
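Roughly the same update-function-code call can be scripted in Python with boto3, if you prefer that to the CLI (sketch only; the function name and zip path are the placeholders from the commands above):

import boto3

# Push a locally built zip straight to an existing function, skipping
# sam build / sam deploy entirely. Names below are placeholders.
with open("index.zip", "rb") as f:
    zipped_code = f.read()

lambda_client = boto3.client("lambda")
lambda_client.update_function_code(
    FunctionName="MyLambdaFunction",
    ZipFile=zipped_code,
)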
my SAM S3 bucket is at 10 GB
Heh. Yeah, start deleting stuff. Maybe you can write a script using aws s3?
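For what it's worth, a minimal sketch of such a cleanup script (the bucket name and the seven-day cutoff are assumptions; double-check what you delete, since the newest artifacts back your current stack):

from datetime import datetime, timedelta, timezone

import boto3

# Placeholder name; use your actual SAM deployment bucket.
BUCKET = "aws-sam-cli-managed-default-samclisourcebucket-xxxxxxxx"
CUTOFF = datetime.now(timezone.utc) - timedelta(days=7)

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Collect every object older than the cutoff.
stale = []
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < CUTOFF:
            stale.append({"Key": obj["Key"]})

# delete_objects accepts at most 1000 keys per call.
for i in range(0, len(stale), 1000):
    s3.delete_objects(Bucket=BUCKET, Delete={"Objects": stale[i:i + 1000]})

print(f"Deleted {len(stale)} objects older than 7 days")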

How to deploy a WAR file from s3 to AWS EC2?

I have an AWS EC2 instance running, with a Maven project running on Tomcat 7. What I have tried: I am using Jenkins for CI, so whenever a new push happens to GitHub, Jenkins starts a build and, once the build completes, uploads the WAR file to AWS S3.
Where I am stuck is that I cannot find a way to deploy the WAR file to the AWS EC2 instance.
I have tried to use CodeDeploy, but at one point it showed me that it supports only tar, tar.gz, and zip. Is there any way to deploy the WAR file from S3 to the AWS EC2 instance?
Thank you.
You can use AWS CodeDeploy, which can manage deployments from an S3 bucket and can automate deploying your files/scripts to an EC2 instance.
From the Overview of a Deployment
Here's how it works:
First, you create deployable content – such as web pages, executable
files, setup scripts, and so on – on your local development machine or
similar environment, and then you add an application specification
file (AppSpec file). The AppSpec file is unique to AWS CodeDeploy; it
defines the deployment actions you want AWS CodeDeploy to execute. You
bundle your deployable content and the AppSpec file into an archive
file, and then upload it to an Amazon S3 bucket or a GitHub
repository. This archive file is called an application revision (or
simply a revision).
Next, you provide AWS CodeDeploy with
information about your deployment, such as which Amazon S3 bucket or
GitHub repository to pull the revision from and which set of instances
to deploy its contents to. AWS CodeDeploy calls a set of instances a
deployment group. A deployment group contains individually tagged
instances, Amazon EC2 instances in Auto Scaling groups, or both.
Each time you successfully upload a new application revision that you
want to deploy to the deployment group, that bundle is set as the
target revision for the deployment group. In other words, the
application revision that is currently targeted for deployment is the
target revision. This is also the revision that will be pulled for
automatic deployments.
Next, the AWS CodeDeploy agent on each
instance polls AWS CodeDeploy to determine what and when to pull the
revision from the specified Amazon S3 bucket or GitHub repository.
Finally, the AWS CodeDeploy agent on each instance pulls the target
revision from the specified Amazon S3 bucket or GitHub repository and,
using the instructions in the AppSpec file, deploys the contents to
the instance.
AWS CodeDeploy keeps a record of your deployments so
that you can get information such as deployment status, deployment
configuration parameters, instance health, and so on.
The good part is that CodeDeploy has no additional cost; you only pay for the resources (EC2, S3) that are used in your pipeline.
Assuming you have already created an S3 bucket:
Step 1: Create an IAM user/role that has access to the S3 bucket where you are placing the WAR file.
Step 2: Write a custom script that downloads the WAR file from S3 to your EC2 instance.
You can also use the AWS CLI to download contents from S3 to your local machine.
Create a startup.sh file and add these contents:
aws s3 cp s3://com.yoursitename/warFile/sample.war /tmp
sudo mv /tmp/sample.war /var/lib/tomcat/webapps/ROOT.war
sudo service tomcat restart
