aws cli - download rds postgres logs in a bash script - bash

I wrote a simple bash script to download my RDS Postgres log files.
The kicker is that it all works fine in the terminal, but when I try the same thing in the script I get an error:
An error occurred (DBLogFileNotFoundFault) when calling the DownloadDBLogFilePortion operation: DBLog File: "error/postgresql.log.2017-11-05-23", is not found on the DB instance
The command in question is this:
aws rds download-db-log-file-portion --db-instance-identifier foobar --starting-token 0 --output text --log-file error/postgresql.log.2017-11-05-23 >> test.log
Run on its own it works fine, but when I put the exact same line in the bash script I get the error that the DB log file is not found - which is nonsense, it is there.
This is the bash script:
download_generate_report() {
    for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | awk {'print $2'} | grep $2 )
    do
        echo $filename
        echo $1
        aws rds download-db-log-file-portion --db-instance-identifier $1 --starting-token 0 --output text --log-file $filename >> /home/ubuntu/pgbadger_script/postgres_logs/postgres_$1.log.$2
    done
}
Tnx,
Tom

I rewrote your script a little and it seems to work for me. It barked about grep; this version uses jq instead.
for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | jq -r '.DescribeDBLogFiles[] | .LogFileName' )
do
    aws rds download-db-log-file-portion --db-instance-identifier $1 --output text --no-paginate --log-file $filename >> /tmp/postgres_$1.log.$2
done

Thank you Ian. I had an issue with AWS CLI 2.4 where the log files downloaded truncated.
To solve this I replaced --no-paginate with --starting-token 0; more info in the RDS Reference.
Finally, in bash:
#!/bin/bash
set -x
for filename in $( aws rds describe-db-log-files --db-instance-identifier $1 | jq -r '.DescribeDBLogFiles[] | .LogFileName' )
do
    aws rds download-db-log-file-portion --db-instance-identifier $1 --output text --starting-token 0 --log-file $filename >> $filename
done

Related

File redirection not working in shell script for aws cli output

I'm creating ec2 instances and would like to get the user_data for each instance using the command:
aws ec2 describe-instance-attribute --instance-id i-xxxx --attribute userData --output text --query "UserData.Value" | base64 --decode > file.txt
When running this directly in the terminal it works: I'm able to get the userData printed into file.txt. However, I need this to run in a shell script I have, which gets the instance id as a parameter.
The lines in test.sh are the following:
#!/bin/bash
echo "$(aws ec2 describe-instance-attribute --instance-id $1 --attribute userData --output text --query "UserData.Value" | base64 --decode)" > file.txt
Where $1 is the instance-id. When running:
./test.sh i-xxxxxxx
It creates an empty file.txt. I have changed the line in the script to:
echo "$(aws ec2 describe-instance-attribute --instance-id $1 --attribute userData --output text --query "UserData.Value" | base64 --decode)"
and it prints the userData to stdout. So why is it not working with file redirection?
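As a local sanity check (no AWS involved; the base64 payload here is just the string "Hello"), both redirection forms capture piped output the same way, which suggests the empty file comes from the aws command producing no output inside the script (e.g. a different profile, region, or credentials in the script's environment) rather than from the redirection itself:

```shell
# Local stand-in for the aws|base64 pipeline: "SGVsbG8=" decodes to "Hello".
printf 'SGVsbG8=' | base64 --decode > direct.txt
# Same content via the echo/command-substitution form used in test.sh:
echo "$(printf 'SGVsbG8=' | base64 --decode)" > subst.txt
cat direct.txt subst.txt
```

If both files come out non-empty here, the redirection syntax is not the culprit.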
Thank you,

Create a file from Shell script Variable

I am trying to create a file from a shell script variable (.sh file ext)
#!/bin/bash
LOGNAME=`date +"error/postgresql.log.%Y-%m-%d-%H00.csv"`
echo Log File Name is: ${LOGNAME}
echo > ${LOGNAME}
aws rds download-db-log-file-portion --db-instance-identifier randomDBname --log-file-name ${LOGNAME} --starting-token 0 --output -n text > ${LOGNAME} --profile aws
sleep 20
I receive this error:
error/postgresql.log.%Y-%m-%d-%H00.csv: No such file or directory
I have tried the following and all have the same error:
echo >> ${LOGNAME}
echo -n >> ${LOGNAME}
echo "Starting Log" > ${LOGNAME}
The echo log file name works fine and prints the log file name without issue. Any ideas what is causing this?
Resolution:
#!/bin/bash
LOGNAME=`date +"error/postgresql.log.%Y-%m-%d-%H00.csv"`
FILENAME=`date +"error-postgresql.log.%Y-%m-%d-%H00.csv"`
echo Log File Name is: ${LOGNAME}
aws rds download-db-log-file-portion --db-instance-identifier randomDBname --log-file-name ${LOGNAME} --starting-token 0 --output csv > ${FILENAME} --profile aws
Found the issue: it was the / in error/postgresql.
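The resolution above renames the local file by hand; as a sketch, the slash can also be stripped programmatically (the / makes the redirect target a path under a nonexistent error/ directory, so flatten it with tr):

```shell
# Remote log file name, as RDS expects it (contains a "/"):
LOGNAME=$(date +"error/postgresql.log.%Y-%m-%d-%H00.csv")
# Local file name: replace each "/" with "-" so the redirect has a valid target
FILENAME=$(echo "$LOGNAME" | tr '/' '-')
echo "$FILENAME"
```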

Download AWS Lambda source package as Zip from AWS CLI

I want to download AWS Lambda source package as Zip. I know that there is an option to download lambda function as SAM file or deployment package in lambda console. But I don't have access to AWS console in the production environment.
See the attached screens.
Below are the two available options.
I want to do the same via the AWS CLI with minimal shell script commands. After downloading the lambda source in zip format, I will create the lambda function in the production environment via the AWS CLI.
aws lambda create-function --region [AWSREGION] --function-name [FUNCTION] --zip-file fileb://[ZIPFILE] --role [ROLEARN] --handler [FILENAME].lambda_handler --description="[FUNCTIONDESCRIPTION]" --runtime [RUNTIME] --timeout [TIMEOUT] --memory-size [MEMORYSIZE] --profile [PROFILENAME]
Please help me with this matter, help in linux shell script commands will be highly appreciated.
bash one-liner:
aws lambda get-function --function-name function_name --query 'Code.Location' --output text | xargs wget -O function_name.zip
You may find the answer here :
Download an already uploaded Lambda function
A simple bash solution is also provided at https://gist.github.com/nemaniarjun/defdde356b6678352bcd4af69b7fe529
# Parallelly download all aws-lambda functions
# Assumes you have ran `aws configure` and have output-mode as "text"
# Works with "aws-cli/1.16.72 Python/3.6.7 Linux/4.15.0-42-generic botocore/1.12.62"
download_code () {
    local OUTPUT=$1
    aws lambda get-function --function-name $OUTPUT | head -n 1 | cut -f 2 | xargs wget -O ./lambda_functions/$OUTPUT.zip
}
mkdir lambda_functions
for run in $(aws lambda list-functions | cut -f 6 | xargs)
do
    download_code "$run" &
done
Edit:
Credits to the original author. Just sharing the code since the URL may become unreachable later.
I've created a simple Bash script based off the original Python script shared earlier.
The difference being that it accepts JSON input and downloads the files in sequence rather than in parallel.
AWS Lambda Download Bash Script
#!/bin/sh
## List the names of all Lambda functions. Can be constrained by using --max-items
for i in `aws lambda list-functions | grep FunctionName | cut -d ":" -f2 | cut -d '"' -f2`
do
    echo 'Fetching code for function:' $i
    ## Using each name, get the function details and then download the zip file containing the source code.
    aws lambda get-function --function-name $i | grep Location | awk -F' ' '{print $2}' | xargs wget -O $i.zip
    echo 'Code downloaded to' $i.zip
done
Looking at your requirements, you can use the aws lambda get-function CLI command to download the lambda function deployment package.
See Synopsis.
get-function
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
You can also see the full details.
But this command will not give you the zip file directly. If you execute the command
aws lambda get-function --function-name MyLambdaFunction
it will give you a result similar to the one below:
{
    "Code": {
        "RepositoryType": "S3",
        "Location": "https://awslambda-eu-west-1-tasks.s3.eu-west-1.amazonaws.com/snapshots/014747066885/MyLambdaFunction-aa227fd0-4d4a-4690-9447-6e1818aaa752?versionId=HoQu5vbudzRpYLe0laIVQIahVN2NVxET&X-Amz-Security-Token=FQoGZXIvYXdzEIr%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDB%2FdpZU6fCyQG%2ByhJyK3A7Dycy5L9hVWmExELuh6f0jFskmKJ62GhGf3J7LC94wB5E5CU2jplsLhw%2Fd%2FmmmJktzo07wI3XLWvSj6zxbHvJFdscCAqF7AYZOhRQR4mOIN6HkanRrHMBHeoTeDqOT6Yk8elhQYfno7dSHP%2FwdNVutS9P0SNmDLDhrxNLAxceDz8nBj1N9AZqhfMwV65OCtTubgLaLSFei75DosXIUaylWsrXgrz4B%2F6bo8LmeDxhNcYefGOBMvwKtyFSdPAP1TulcJpwQIUIC3losjtcTnRt9CSTxhn7TPMDfw4QI5ITKvxgNzO5T2TF2cJVqbotFFVdqPQNHuL2XLMNU24BwjSwF%2FsKWlV6tygXhdQWpTrJFRW%2FqxV%2BX2C1yq0sjpWtc5SerkrmqHvvDjA0L7GlOpG8Q1BLHyQWj0FPmuhrrPyjyFCNqVkpo6eUl35yK%2BHWa1hsXoEPyccoqdHa4gU%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20190203T092717Z&X-Amz-SignedHeaders=host&X-Amz-Expires=600&X-Amz-Credential=ASIA54NGUQSHZ4CZTFNT%2F20190203%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=ee3bbef557cff32f86d26abc769b14"
    },
    "Configuration": {
        "TracingConfig": {
            "Mode": "PassThrough"
        },
        "Version": "$LATEST",
        "CodeSha256": "l6q5ldtk0YEhEv3wnJhhCiAPyRd2XB1/8nT+ZWk=",
        "FunctionName": "MyLambdaFunction",
        "MemorySize": 3008,
        "RevisionId": "a3bdbef4-8616-4c6a-ba19-074acb80b143",
        "CodeSize": 6083880,
        "FunctionArn": "arn:aws:lambda:us-east-1:014747066885:function:MyLambdaFunction",
        "Handler": "lambda_function.lambda_handler",
        "Role": "arn:aws:iam::014747066885:role/lambda_admin",
        "Timeout": 900,
        "LastModified": "2019-01-30T10:09:50.283+0000",
        "Runtime": "python3.6",
        "Description": "Test MyLambdaFunction"
    }
}
Now you need to extract the Location URL from that response and download the zip file from it.
You can use the two commands below to get the lambda function as a zip file.
aws lambda get-function --function-name MyLambdaFunction --profile [AWS_PROFILE] | grep "Location" | awk -F ": " '{ print $2}' | sed 's/"//g' > ~/MyLambdaFunction.txt
wget -i ~/MyLambdaFunction.txt -O ~/MyLambdaFunction.zip
Here is the full shell script:
FUNCTION_NAME=${1}
AWS_PROFILE=[AWSPROFILE]
aws lambda get-function --function-name ${FUNCTION_NAME} --profile ${AWS_PROFILE} | grep "Location" | awk -F ": " '{ print $2}' | sed 's/"//g' > ~/${FUNCTION_NAME}.txt
wget -i ~/${FUNCTION_NAME}.txt -O ~/${FUNCTION_NAME}.zip
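The grep/awk/sed pipeline can be checked locally against a canned line shaped like the get-function JSON response above (the URL below is a made-up placeholder, not a real presigned URL):

```shell
# One line in the shape of the "Location" entry from `aws lambda get-function`
line='    "Location": "https://awslambda-us-east-1-tasks.s3.amazonaws.com/snapshots/0123/fn.zip?X-Amz-Signature=abc"'
# Same extraction as in the script: split on ': ', then strip the quotes
url=$(echo "$line" | grep "Location" | awk -F ": " '{ print $2 }' | sed 's/"//g')
echo "$url"
```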
You can save it as a shell script (e.g. getLambdaFunction.sh) and execute it with the command below.
./getLambdaFunction.sh [FUNCTIONNAME]
After getting the lambda package as a zip file, you can create the lambda function:
aws lambda create-function --region us-east-1 --function-name MyLambdaFunction --zip-file fileb://MyLambdaFunction.zip --role arn:aws:iam::[AWSACCOUNT]:role/service-role/[LAMBDAROLE] --handler lambda_function.lambda_handler --description="My Lambda Function" --runtime "python3.6" --profile [AWSPROFILE]
In this sample it is assumed that MyLambdaFunction is the lambda function name, us-east-1 is the AWS region, and the runtime is Python 3.6.
If the chosen answer didn't work for you either, try this:
Install jq:
# for e.g on a Debian based distro (ubuntu, mint, MXlinux, ..)
sudo apt install jq
Use this script
(paste it into a file named download_all_lambda_functions.sh, then run bash download_all_lambda_functions.sh)
download_code () {
    # clean double quotes if found
    local OUTPUT=${1//\"/}
    local dest=./lambda_functions/$OUTPUT.zip
    local URL=$(aws lambda get-function --function-name $OUTPUT --query 'Code.Location')
    # Using curl instead of wget
    echo $URL | xargs curl -o $dest
}
mkdir -p lambda_functions
for run in $(aws lambda list-functions | jq -r .Functions[].FunctionName)
do
    echo Found lambda function: $run
    download_code "$run"
done
echo "Completed Downloading all the Lambda Functions!"
I believe the chosen answer's code got broken because the AWS CLI now returns JSON, which is better parsed with a tool like jq.
Find gist here
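The `${1//\"/}` expansion in `download_code` above strips any stray double quotes from the function name (they appear when jq runs without `-r`); a quick local check of that expansion, with a made-up name:

```shell
raw='"my-function"'
# bash pattern substitution: delete every double quote in the value
clean=${raw//\"/}
echo "$clean"
```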

error in awscli call doesn't get sent to logfile

We have to check the status of an instance, and I am trying to capture any error to a logfile. The logfile has the instance information, but the error is not being written to it. Below is the code; let me know what needs to be corrected.
function wait-for-status {
    instance=$1
    target_status=$2
    status=unknown
    while [[ "$status" != "$target_status" ]]; do
        status=`aws rds describe-db-instances \
            --db-instance-identifier $instance | head -n 1 \
            | awk -F \  '{print $10}'` >> ${log_file} 2>&1
        sleep 30
        echo $status >> ${log_file}
    done
}
Rather than using all that head/awk stuff, if you want a value out of the CLI, you should use the --query parameter. For example:
aws rds describe-db-instances --db-instance-identifier xxx --query 'DBInstances[*].DBInstanceStatus'
See: Controlling Command Output from the AWS Command Line Interface - AWS Command Line Interface
Also, if your goal is to wait until an Amazon RDS instance is available, then you should use db-instance-available — AWS CLI Command Reference:
aws rds wait db-instance-available --db-instance-identifier xxx

Run bash script against 2 text files as variables

I need to run the following command in a bash script or any script for that matter:
$ aws rds download-db-log-file-portion --db-instance-identifier $LIST1 --log-file-name $LIST2 --region us-west-2
I have a text file with all the hostnames $LIST1 and another file with all the log files names $LIST2.
Basically I would like to know the best way to take each entry from $LIST1 and download all the logs for that entry from $LIST2.
$LIST1 sample:
host1
host2
host3
$LIST2 sample:
error/mysql-error.log
error/mysql-error.log.0
error/mysql-error.log.1
Example of a regular run of the command:
$ aws rds download-db-log-file-portion --db-instance-identifier host1 --log-file-name error/mysql-error.log --region us-west-2
The problem is I have 100+ hosts and each host has about 90 logs.
Here is a bash solution:
while read -r host
do
    while read -r logfile
    do
        aws rds download-db-log-file-portion --db-instance-identifier $host --log-file-name $logfile --region us-west-2
    done < logfiles.txt
done < hosts.txt
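A dry run of the nested loops with the sample lists (echoing the command instead of calling aws) shows it produces one download per host/logfile pair, 3 × 3 = 9 here:

```shell
# Sample inputs matching the question
printf 'host1\nhost2\nhost3\n' > hosts.txt
printf 'error/mysql-error.log\nerror/mysql-error.log.0\nerror/mysql-error.log.1\n' > logfiles.txt
count=0
while read -r host
do
    while read -r logfile
    do
        # swap echo for the real aws call once the pairing looks right
        echo "aws rds download-db-log-file-portion --db-instance-identifier $host --log-file-name $logfile --region us-west-2"
        count=$((count+1))
    done < logfiles.txt
done < hosts.txt
echo "$count commands"
```

Because both loops read from file redirections (not pipes), `count` survives the loop in the same shell.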
