I want to download an AWS Lambda function's source package as a zip file. I know there is an option to download a Lambda function as a SAM file or as a deployment package in the Lambda console, but I don't have access to the AWS console in the production environment.
I want to achieve the same functionality with the AWS CLI using minimal shell script commands. After downloading the Lambda source in zip format, I will create the Lambda function in the production environment via the AWS CLI:
aws lambda create-function --region [AWSREGION] --function-name [FUNCTION] --zip-file fileb://[ZIPFILE] --role [ROLEARN] --handler [FILENAME].lambda_handler --description="[FUNCTIONDESCRIPTION]" --runtime [RUNTIME] --timeout [TIMEOUT] --memory-size [MEMORYSIZE] --profile [PROFILENAME]
Please help me with this; help with Linux shell script commands would be highly appreciated.
Bash one-liner:
aws lambda get-function --function-name function_name --query 'Code.Location' --output text | xargs wget -O function_name.zip
(Note the capital -O: lowercase -o tells wget where to write its log, not the download. --output text strips the JSON quotes from the URL.)
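If wget is not available, an equivalent sketch with curl (function_name is a placeholder, as above; for curl, lowercase -o is the output-file flag):
aws lambda get-function --function-name function_name --query 'Code.Location' --output text | xargs curl -o function_name.zip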
You may find the answer here:
Download an already uploaded Lambda function
A simple bash solution is also provided at https://gist.github.com/nemaniarjun/defdde356b6678352bcd4af69b7fe529
# Download all AWS Lambda functions in parallel
# Assumes you have run `aws configure` and set the output mode to "text"
# Works with "aws-cli/1.16.72 Python/3.6.7 Linux/4.15.0-42-generic botocore/1.12.62"
download_code () {
  local OUTPUT=$1
  aws lambda get-function --function-name "$OUTPUT" | head -n 1 | cut -f 2 | xargs wget -O "./lambda_functions/$OUTPUT.zip"
}
mkdir -p lambda_functions
for run in $(aws lambda list-functions | cut -f 6 | xargs);
do
  download_code "$run" &
done
Edit:
Credit to the original author. Just sharing the code since the URL may become unreachable later.
I've created a simple Bash script based on the original Python script shared earlier.
The difference is that it accepts JSON input and downloads the files sequentially rather than in parallel.
AWS Lambda Download Bash Script
#!/bin/sh
## List the names of all Lambda functions. Can be constrained by using --max-items
for i in `aws lambda list-functions | grep FunctionName | cut -d ":" -f2 | cut -d '"' -f2`
do
  echo "Fetching code for function: $i"
  ## Using each name, get the function details and then download the zip file containing the source code.
  aws lambda get-function --function-name "$i" | grep Location | awk -F' ' '{print $2}' | xargs wget -O "$i.zip"
  echo "Code downloaded to $i.zip"
done
Looking at your requirements, you can use the aws lambda get-function CLI command to download a Lambda function's deployment package.
See Synopsis.
get-function
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
You can also see the full details.
But this command alone will not give you the zip file. If you execute this command:
aws lambda get-function --function-name MyLambdaFunction
it will give you a result similar to the one below:
{
"Code": {
"RepositoryType": "S3",
"Location": "https://awslambda-eu-west-1-tasks.s3.eu-west-1.amazonaws.com/snapshots/014747066885/MyLambdaFunction-aa227fd0-4d4a-4690-9447-6e1818aaa752?versionId=HoQu5vbudzRpYLe0laIVQIahVN2NVxET&X-Amz-Security-Token=FQoGZXIvYXdzEIr%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDB%2FdpZU6fCyQG%2ByhJyK3A7Dycy5L9hVWmExELuh6f0jFskmKJ62GhGf3J7LC94wB5E5CU2jplsLhw%2Fd%2FmmmJktzo07wI3XLWvSj6zxbHvJFdscCAqF7AYZOhRQR4mOIN6HkanRrHMBHeoTeDqOT6Yk8elhQYfno7dSHP%2FwdNVutS9P0SNmDLDhrxNLAxceDz8nBj1N9AZqhfMwV65OCtTubgLaLSFei75DosXIUaylWsrXgrz4B%2F6bo8LmeDxhNcYefGOBMvwKtyFSdPAP1TulcJpwQIUIC3losjtcTnRt9CSTxhn7TPMDfw4QI5ITKvxgNzO5T2TF2cJVqbotFFVdqPQNHuL2XLMNU24BwjSwF%2FsKWlV6tygXhdQWpTrJFRW%2FqxV%2BX2C1yq0sjpWtc5SerkrmqHvvDjA0L7GlOpG8Q1BLHyQWj0FPmuhrrPyjyFCNqVkpo6eUl35yK%2BHWa1hsXoEPyccoqdHa4gU%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20190203T092717Z&X-Amz-SignedHeaders=host&X-Amz-Expires=600&X-Amz-Credential=ASIA54NGUQSHZ4CZTFNT%2F20190203%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=ee3bbef557cff32f86d26abc769b14"
},
"Configuration": {
"TracingConfig": {
"Mode": "PassThrough"
},
"Version": "$LATEST",
"CodeSha256": "l6q5ldtk0YEhEv3wnJhhCiAPyRd2XB1/8nT+ZWk=",
"FunctionName": "MyLambdaFunction",
"MemorySize": 3008,
"RevisionId": "a3bdbef4-8616-4c6a-ba19-074acb80b143",
"CodeSize": 6083880,
"FunctionArn": "arn:aws:lambda:us-east-1:014747066885:function:MyLambdaFunction",
"Handler": "lambda_function.lambda_handler",
"Role": "arn:aws:iam::014747066885:role/lambda_admin",
"Timeout": 900,
"LastModified": "2019-01-30T10:09:50.283+0000",
"Runtime": "python3.6",
"Description": "Test MyLambdaFunction"
}
}
Note that Code.Location is a presigned S3 URL that expires after a short time (X-Amz-Expires=600, i.e. ten minutes, in the sample above), so download it promptly. You need to extract that URL and download the zip file; you can use the two commands below to get a Lambda function in zip file format.
aws lambda get-function --function-name MyLambdaFunction --profile [AWS_PROFILE] | grep "Location" | awk -F ": " '{ print $2}' | sed 's/"//g' > ~/MyLambdaFunction.txt
wget -i ~/MyLambdaFunction.txt -O ~/MyLambdaFunction.zip
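Alternatively, a shorter sketch that skips the intermediate file by letting the CLI extract the URL itself via --query and --output text (same placeholders as above):
aws lambda get-function --function-name MyLambdaFunction --profile [AWS_PROFILE] --query 'Code.Location' --output text | xargs wget -O ~/MyLambdaFunction.zip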
Here are the full shell script commands:
FUNCTION_NAME=${1}
AWS_PROFILE=[AWSPROFILE]
aws lambda get-function --function-name ${FUNCTION_NAME} --profile ${AWS_PROFILE} | grep "Location" | awk -F ": " '{ print $2}' | sed 's/"//g' > ~/${FUNCTION_NAME}.txt
wget -i ~/${FUNCTION_NAME}.txt -O ~/${FUNCTION_NAME}.zip
You can save it as a shell script (e.g. getLambdaFunction.sh) and execute it with the command below:
./getLambdaFunction.sh [FUNCTIONNAME]
After getting the Lambda package as a zip file, you can create the Lambda function:
aws lambda create-function --region us-east-1 --function-name MyLambdaFunction --zip-file fileb://MyLambdaFunction.zip --role arn:aws:iam::[AWSACCOUNT]:role/service-role/[LAMBDAROLE] --handler lambda_function.lambda_handler --description="My Lambda Function" --runtime "python3.6" --profile [AWSPROFILE]
In this sample, it is assumed that MyLambdaFunction is the Lambda function name, us-east-1 is the AWS region, and the runtime is Python 3.6.
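As an optional sanity check (a sketch, not part of the original workflow): Lambda reports CodeSha256, the base64-encoded SHA-256 digest of the deployment package, so you can compare it against your local zip after uploading (assumes openssl is installed):
# compute the local digest in the same base64-of-binary-SHA-256 form Lambda uses
LOCAL_SHA=$(openssl dgst -sha256 -binary MyLambdaFunction.zip | openssl base64)
REMOTE_SHA=$(aws lambda get-function --function-name MyLambdaFunction --profile [AWSPROFILE] --query 'Configuration.CodeSha256' --output text)
[ "$LOCAL_SHA" = "$REMOTE_SHA" ] && echo "package verified" || echo "SHA-256 mismatch"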
If the chosen answer didn't work for you either, try this:
Install jq:
# e.g. on a Debian-based distro (Ubuntu, Mint, MX Linux, ...)
sudo apt install jq
Use this script (paste it into a file named download_all_lambda_functions.sh, then run bash download_all_lambda_functions.sh):
download_code () {
# clean double quotes if found
local OUTPUT=${1//\"/}
local dest=./lambda_functions/$OUTPUT.zip
local URL=$(aws lambda get-function --function-name $OUTPUT --query 'Code.Location')
# Using curl instead of wget
echo $URL | xargs curl -o $dest
}
mkdir -p lambda_functions
for run in $(aws lambda list-functions | jq -r .Functions[].FunctionName);
do
echo Found lambda function: $run
download_code "$run"
done
echo "Completed Downloading all the Lamdba Functions!"
I believe the chosen answer's code broke because the AWS CLI now returns JSON output, which is better parsed with a tool like jq.
I am calling AWS CLI commands in a bash script. I need to add tags to files whose prefixes are as follows:
/base/user1/foo/file1
/base/user2/foo/fileA
/base/user3/foo/fileX
I only want to tag those under "foo", but if a user has files like:
/base/user1/bar/fileZ
, I don't want to tag those under "bar".
I have a script:
#!/bin/bash
aws s3api list-objects --bucket myBucket --query 'Contents[].{Key: Key}' --prefix myPrefix --output text | xargs -n 1 aws s3api put-object-tagging --bucket myBucket --tagging "{\"TagSet\": [{ \"Key\": \"myKey\", \"Value\": \"myValue\" }]}" --key
This works fine as long as myPrefix is an absolute path like:
/base/user1/foo/
, but I have too many users to do this manually, so I wanted to use something like:
/base/*/foo/
for the prefix. However, that throws an error:
An error occurred (NoSuchKey) when calling the PutObjectTagging operation: The specified key does not exist
Is there a way in bash, with a loop or something, to traverse down to the "foo" level so that the full prefixes /base/user1/foo/, /base/user2/foo/, /base/user3/foo/ are defined dynamically? Thanks for any response.
I was able to find a decent solution on my own. Replacing this line from the original post:
aws s3api list-objects --bucket myBucket --query 'Contents[].{Key: Key}' --prefix myPrefix --output text | xargs -n 1 aws s3api put-object-tagging --bucket myBucket --tagging "{\"TagSet\": [{ \"Key\": \"myKey\", \"Value\": \"myValue\" }]}" --key
, with the following:
for BUCKET_PATH in $(aws s3 ls --recursive --summarize $BUCKET);
do
if [[ $BUCKET_PATH == *$PREFIX* ]]; then
echo $BUCKET_PATH
aws s3api put-object-tagging --bucket ${BUCKET} --key ${BUCKET_PATH} --tagging "{\"TagSet\": [{ \"Key\": \"${KEY}\", \"Value\": \"${VALUE}\" }]}"
fi
done
works great.
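For what it's worth, a variant sketch that lists only the matching keys with a JMESPath filter, instead of word-splitting the aws s3 ls output (BUCKET, KEY, and VALUE are assumed to be set as in the script above; keys containing spaces would still need extra handling):
for OBJECT_KEY in $(aws s3api list-objects-v2 --bucket "$BUCKET" --query "Contents[?contains(Key, '/foo/')].Key" --output text)
do
  # tag each object whose key contains the /foo/ segment
  aws s3api put-object-tagging --bucket "$BUCKET" --key "$OBJECT_KEY" --tagging "{\"TagSet\": [{ \"Key\": \"${KEY}\", \"Value\": \"${VALUE}\" }]}"
done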
I am using the AWS EC2 CLI to filter on stopped instances, then create an AMI from each of them, with the AMI name taken from the instance's Name tag.
aws ec2 describe-instances --output text --profile proj --query 'Reservations[*].[Instances[*].[InstanceId, InstanceType, State.Name, Platform, Placement.AvailabilityZone, PublicIpAddress, PrivateIpAddress,[Tags[?Key==`Name`].Value][0][0]]]' --filters Name=instance-state-name,Values=stopped | awk '{print $1, $8}' | xargs -n2 aws ec2 create-image --profile proj --instance-id {} --name {} --no-reboot
How can I get xargs to differentiate the two parameters coming from awk (instance ID and Name tag value), so that they are correctly passed to ec2 create-image as the --instance-id and --name parameters respectively?
You do not need awk. With the AWS CLI you are extracting eight values first and then using awk to pick two of them. Why? Just extract the two values you need directly:
--query 'Reservations[*].[Instances[*].[InstanceId, [Tags[?Key==`Name`].Value][0][0]]]'
will return only the values you are interested in. Then use xargs to pass the arguments to your next command. Note that xargs does not expand $1 and $2 by itself, so wrap the command in sh -c:
xargs -n2 sh -c 'command --arg1 "$1" --arg2 "$2"' sh
Your entire command becomes:
aws ec2 describe-instances --output text --profile proj --query 'Reservations[*].[Instances[*].[InstanceId, [Tags[?Key==`Name`].Value][0][0]]]' --filters Name=instance-state-name,Values=stopped | xargs -n2 sh -c 'aws ec2 create-image --profile proj --instance-id "$1" --name "$2" --no-reboot' sh
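If you prefer to avoid the xargs/sh -c pattern, a while-read loop sketch does the same thing (same query, filter, and profile assumed as above):
aws ec2 describe-instances --output text --profile proj --query 'Reservations[*].[Instances[*].[InstanceId, [Tags[?Key==`Name`].Value][0][0]]]' --filters Name=instance-state-name,Values=stopped |
while read -r INSTANCE_ID NAME; do
  # each line of text output is "<instance-id> <name-tag>", which read splits into the two variables
  aws ec2 create-image --profile proj --instance-id "$INSTANCE_ID" --name "$NAME" --no-reboot
done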
I have a shell script, given below, that adds an AWS instance to its Auto Scaling group's scale-in protection. When I run the commands individually they work fine, but when I put them in a shell file and tried to execute it, I got an error. See the script below:
set -x
INSTANCE_ID=$(wget -q -O - http://169.254.169.254/latest/meta-data/instance-id)
ASG_NAME=$(aws ec2 describe-tags --filters "Name=resource-id,Values=$INSTANCE_ID" --region us-east-2 | jq '.Tags[] | select(.["Key"] | contains("a:autoscaling:groupName")) | .Value')
ASG_NAME=$(echo $ASG_NAME | tr -d '"')
aws autoscaling set-instance-protection --instance-ids $INSTANCE_ID --auto-scaling-group-name $ASG_NAME --protected-from-scale-in --region us-east-2
The error is given below. I think the issue is with the second line: it is not able to get ASG_NAME. I tried some escape characters but nothing is working.
+++ wget -q -O - http://169.254.169.254/latest/meta-data/instance-id
++ INSTANCE_ID=i-----
+++ aws ec2 describe-tags --filters Name=resource-id,Values=i------ --region us-east-2
+++ jq '.Tags[] | select(.["Key"] | contains("a:autoscaling:groupName")) | .Value'
++ ASG_NAME=
+++ echo
+++ tr -d '"'
++ ASG_NAME=
++ aws autoscaling set-instance-protection --instance-ids i---- --auto-scaling-group-name --protected-from-scale-in --region us-east-2
usage: aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
To see help text, you can run:
aws help
aws <command> help
aws <command> <subcommand> help
aws: error: argument --auto-scaling-group-name: expected one argument
Solved the issue following @chepner's recommendation. Modified the second line to:
ASG_NAME=$(aws ec2 describe-tags --filters "Name=resource-id,Values=$INSTANCE_ID" --region us-east-2 --query 'Tags[1].Value')
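A slightly more robust sketch filters on the exact tag key instead of relying on Tags[1] ordering (aws:autoscaling:groupName is the tag AWS attaches to instances in an Auto Scaling group; --output text also strips the quotes, making the tr -d step unnecessary):
ASG_NAME=$(aws ec2 describe-tags --region us-east-2 --filters "Name=resource-id,Values=$INSTANCE_ID" "Name=key,Values=aws:autoscaling:groupName" --query 'Tags[0].Value' --output text)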
We have to check the status of the instance, and I am trying to capture any error to a logfile. The logfile has the instance information, but the error is not being written to it. Below is the code; let me know what needs to be corrected.
function wait-for-status {
instance=$1
target_status=$2
status=unknown
while [[ "$status" != "$target_status" ]]; do
status=`aws rds describe-db-instances \
--db-instance-identifier $instance | head -n 1 \
| awk -F \ '{print $10}'` >> ${log_file} 2>&1
sleep 30
echo $status >> ${log_file}
done
}
Rather than using all that head/awk stuff, if you want a value out of the CLI, you should use the --query parameter. For example:
aws rds describe-db-instances --db-instance-identifier xxx --query 'DBInstances[*].DBInstanceStatus'
See: Controlling Command Output from the AWS Command Line Interface - AWS Command Line Interface
Also, if your goal is to wait until an Amazon RDS instance is available, then you should use the db-instance-available waiter (see the AWS CLI Command Reference):
aws rds wait db-instance-available --db-instance-identifier xxx
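Putting both suggestions together, a minimal sketch of the original function rewritten with --query (assuming $log_file is set elsewhere, as in the question):
function wait-for-status {
  instance=$1
  target_status=$2
  status=unknown
  while [[ "$status" != "$target_status" ]]; do
    # --output text yields the bare status string; 2>> sends any CLI error to the log file
    status=$(aws rds describe-db-instances --db-instance-identifier "$instance" \
      --query 'DBInstances[0].DBInstanceStatus' --output text 2>> "${log_file}")
    echo "$status" >> "${log_file}"
    sleep 30
  done
}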
I wrote a simple bash script to download my RDS postgres files.
But the kicker is that it all works fine in the terminal, yet when I try the same thing in the script I get an error:
An error occurred (DBLogFileNotFoundFault) when calling the DownloadDBLogFilePortion operation: DBLog File: "error/postgresql.log.2017-11-05-23", is not found on the DB instance
The command in question is this:
aws rds download-db-log-file-portion --db-instance-identifier foobar --starting-token 0 --output text --log-file error/postgresql.log.2017-11-05-23 >> test.log
It all works fine, but when I put the exact same line in the bash script I get the error message that there are no db log files - which is nonsense, they are there.
This is the bash script:
download_generate_report() {
for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | awk {'print $2'} | grep $2 )
do
echo $filename
echo $1
aws rds download-db-log-file-portion --db-instance-identifier $1 --starting-token 0 --output text --log-file $filename >> /home/ubuntu/pgbadger_script/postgres_logs/postgres_$1.log.$2
done
}
Tnx,
Tom
I rewrote your script a little and it seems to work for me. It barked about grep, so this version uses jq instead.
for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | jq -r '.DescribeDBLogFiles[] | .LogFileName' )
do
aws rds download-db-log-file-portion --db-instance-identifier $1 --output text --no-paginate --log-file $filename >> /tmp/postgres_$1.log.$2
done
Thank you Ian. I had an issue with AWS CLI 2.4 where the log files downloaded truncated.
To solve this I replaced --no-paginate with --starting-token 0; more info in the RDS Reference.
Finally, in bash:
#!/bin/bash
set -x
for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | jq -r '.DescribeDBLogFiles[] | .LogFileName' )
do
aws rds download-db-log-file-portion --db-instance-identifier $1 --output text --starting-token 0 --log-file $filename >> $filename
done