JMESPath query expression with bash variable - bash

Messing around with a simple aws cli query to check for the existence of a Lambda function and echo the associated role if it exists:
#!/bin/bash
fname=$1
role=$(aws lambda list-functions --query 'Functions[?FunctionName == `$fname`].Role' --output text)
echo "$fname role: $role"
However, $fname appears to resolve to an empty string in the aws command. I've tried escaping the backticks, swapping ` to ', and a myriad of other thrashing edits (and yes, I'm passing a string on the command line when invoking the script :)
How do I properly pass a variable into JMESPath query inside a bash script?

Because the whole JMESPath expression is enclosed in single quotes, bash does not expand the $fname variable. To fix this, surround the expression with double quotes so bash can expand the variable, and use single quotes (JMESPath raw string literals) around the $fname value:
aws lambda list-functions --query "Functions[?FunctionName == '$fname'].Role" --output text
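A quick way to see the difference without calling AWS at all is to echo both quoting styles (the function name my-func is just a placeholder):

```shell
fname="my-func"
# Single quotes: bash passes the string through verbatim, $fname is NOT expanded
q1='Functions[?FunctionName == `$fname`].Role'
# Double quotes: bash expands $fname; the inner single quotes become
# JMESPath raw string literals
q2="Functions[?FunctionName == '$fname'].Role"
echo "$q1"
echo "$q2"
```

The first echo prints the literal text `$fname`; the second prints the expanded name, which is what the AWS CLI needs to receive.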

Swapping the backticks to single quotes didn't work for me... :(
But escaping the backticks works :)
Here are my outputs:
aws elbv2 describe-listeners --load-balancer-arn $ELB_ARN --query "Listeners[?Port == '$PORT'].DefaultActions[].TargetGroupArn | [0]"
null
aws elbv2 describe-listeners --load-balancer-arn $ELB_ARN --query "Listeners[?Port == \`$PORT\`].DefaultActions[].TargetGroupArn | [0]"
"arn:aws:elasticloadbalancing:ap-southeast-2:1234567:targetgroup/xxx"
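The likely reason the backticks matter in this case: Port is a number in the JSON output, and JMESPath backticks create a JSON literal (so `443` is a number), while single quotes create a raw string ('443'), which never compares equal to a numeric field. A sketch of the escaped-backtick construction, with a placeholder port:

```shell
PORT=443
# Backticks escaped for bash, preserved for JMESPath: `443` is a JSON number
# literal that matches the numeric Port field; '443' would be a raw string
# and match nothing.
query="Listeners[?Port == \`$PORT\`].DefaultActions[].TargetGroupArn | [0]"
echo "$query"
```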

Related

How to do a bash `for` loop in terraform templatefile?

I'm trying to include a bash script in an AWS SSM Document, via the Terraform templatefile function. In the aws:runShellScript section of the SSM document, I have a Bash for loop with an @ sign that seems to be creating an error during terraform validate.
Version of terraform: 0.13.5
Inside main.tf file:
resource "aws_ssm_document" "magical_document" {
  name            = "magical_ssm_doc"
  document_type   = "Command"
  document_format = "YAML"
  target_type     = "/AWS::EC2::Instance"
  content = templatefile(
    "${path.module}/ssm-doc.yml",
    {
      Foo: var.foo
    }
  )
}
Inside my ssm-doc.yaml file, I loop through an array:
for i in "${arr[@]}"; do
  if test -f "$i" ; then
    echo "[monitor://$i]" >> $f
    echo "disabled=0" >> $f
    echo "index=$INDEX" >> $f
  fi
done
Error:
Error: Error in function call
Call to function "templatefile" failed:
./ssm-doc.yml:1,18-19: Invalid character;
This character is not used within the language., and 1 other diagnostic(s).
I tried escaping the @ symbol, like \@, but it didn't help. How do I fix this?
Although the error is pointing to the # symbol as being the cause of the error, it's the ${ } that's causing the problem, because this is Terraform interpolation syntax, and it applies to templatefiles too. As the docs say:
The template syntax is the same as for string templates in the main Terraform language, including interpolation sequences delimited with ${ ... }.
And the way to escape interpolation syntax in Terraform is with a double dollar sign.
for i in "$${arr[@]}"; do
  if test -f "$i" ; then
    echo "[monitor://$i]" >> $f
    echo "disabled=0" >> $f
    echo "index=$INDEX" >> $f
  fi
done
The interpolation syntax is useful with templatefile if you're trying to pass in an argument, such as, in the question, Foo. This argument can then be accessed within the yaml file as ${Foo}.
By the way, although this article didn't give the answer to this exact issue, it helped me get a deeper appreciation for all the work Terraform is doing to handle different languages via the templatefile function. It had some cool tricks for doing replacements to escape for different scenarios.

Variable not expanding in Double quotes for bash script

I have a bash script where I'm trying to call curl with a variable value as input. When executing the script, the variable value is not expanded inside the double quotes.
Expected curl in script after variable expansion should be as following:
/usr/bin/curl -s -vvvv http://hmvddrsvr:8044/query/service -u iamusr:pssd -d 'statement=DELETE FROM `test_bucket` WHERE type = "Metadata" AND market = "ES" AND status = "active" AND meta(test_bucket).id="fgsd34sff334" '
Getting executed as follows when observed in debug mode:
/usr/bin/curl -s -vvvv http://hmvddrsvr:8044/query/service -u iamusr:pssd -d 'statement=DELETE FROM `test_bucket` WHERE type = "Metadata" AND market = "ES" AND status = "active" AND meta(test_bucket).id=\""$idp_sub"\" '
My bash script is as follows:
#!/bin/bash
idp_sub=""
for idp_sub in $(cat /opt/SP/jboss/home/mayur/es_idp_sub.txt)
do
  /usr/bin/curl -s -vvvv http://hmvddrsvr:8044/query/service -u iamusr:pssd -d 'statement=DELETE FROM `test_bucket` WHERE type = "Metadata" AND market = "ES" AND status = "active" AND meta(test_bucket).id=\""$idp_sub"\" ' -o /opt/SP/jboss/home/mayur/es_delete_response.txt
done
How do I expand the variable value within double quotes, as shown in the expected output above?
Your double-quoted string is inside single quotes, where it won't be expanded.
Compare:
foo=bar
echo 'foo=\""$foo\"'
echo 'foo="'"$foo"'"'
In the second example, we end the single quotes, and double-quote $foo, then start new single quotes for the final '.
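Running the comparison on its own shows the difference:

```shell
foo=bar
# Everything inside single quotes is literal, including $foo
echo 'foo=\""$foo\"'
# Quote splice: 'foo="' + expansion of $foo + '"'
echo 'foo="'"$foo"'"'
```

The first line prints the raw text with an unexpanded $foo; the second prints foo="bar".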
It's probably easier to read if we expand using printf instead:
printf 'foo=%s\n' "$foo"
That's something you might want to run as a process substitution.
BUT...
This is a wrong and dangerous way to construct an SQL query (and the web server is also poor, if it forwards arbitrary queries - I hope it has no write permissions to the data). Read about "SQL command injection" and come back to this code when you understand the issues.
Nothing inside single quotes will be expanded by bash, including any double-quotes, and variable names. The good news is you can end your single-quoted section and immediately start a double-quoted section to introduce the variable, and it will all be concatenated into a single argument for the application (curl). Try:
/usr/bin/curl -s -vvvv http://hmvddrsvr:8044/query/service -u iamusr:pssd -d 'statement=DELETE FROM `test_bucket` WHERE type = "Metadata" AND market = "ES" AND status = "active" AND meta(test_bucket).id=\"'"$idp_sub"'\" ' -o /opt/SP/jboss/home/mayur/es_delete_response.txt
You can make your code strongly injection-proof by rejecting any string containing a double-quote, but you might reject some strings that have been legitimately escaped.
If you can use the q syntax to quote the string, you can make it more injection-proof, but I guess the attacker just has to inject ]":
/usr/bin/curl -s -vvvv http://hmvddrsvr:8044/query/service -u iamusr:pssd -d 'statement=DELETE FROM `test_bucket` WHERE type = "Metadata" AND market = "ES" AND status = "active" AND meta(test_bucket).id=q\"['"$idp_sub"]'\" ' -o /opt/SP/jboss/home/mayur/es_delete_response.txt
You could then search for and reject the pattern string ]" as your anti-injection, which will allow a much wider class of legitimate strings. You would have to tell the users that you have applied q[] quoting to their input, so they don't have to.

Getting value out from square brackets in "aws ec2 describe-instances" output

I am saving the output of my command in a variable and using it in another command:
instance_id=$(aws ec2 describe-instances --query Reservations[*].Instances[*].[InstanceId] --filters "Name=tag:staging,Values=staging")
I get this output: [ [ [ "i-09140824c1b7f9ea7" ] ] ]
How do I remove the brackets from the output so I can use the variable in this command:
aws ec2 associate-address --instance-id $instance_id --allocation-id allocid
I am new to bash so any help would be appreciated.
You can extract the value and decode it from JSON to plain text with the following:
instance_id_json='[ [ [ "i-09140824c1b7f9ea7" ] ] ]'
instance_id=$(jq -r '.[][][]' <<<"$instance_id_json")
Consider changing your original code to:
instance_id_json=$(aws ec2 describe-instances --query 'Reservations[*].Instances[*].[InstanceId]' --filters 'Name=tag:staging,Values=staging')
...note that we're putting the query in quotes (since it has glob characters), not just the filter (and since the filter is intended to be literal text with no expansions, we're defaulting to single-quotes rather than double as a matter of form).
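If jq isn't available, a much cruder fallback is to strip the JSON punctuation with tr. This is fragile, and only safe here because a single instance ID contains no brackets, quotes, or spaces:

```shell
instance_id_json='[ [ [ "i-09140824c1b7f9ea7" ] ] ]'
# Delete every [, ], " and space character, leaving just the ID.
# Breaks immediately if the JSON ever contains more than one value.
instance_id=$(tr -d '[]" ' <<<"$instance_id_json")
echo "$instance_id"
```

The jq approach from the answer is preferable whenever jq is installed, since it actually parses the JSON instead of pattern-stripping it.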

bash function with third arg is an array

I'm trying to write a bash function that does the following:
If there is no 3rd argument, run a command.
If there is a third argument, take every argument from the third one and run a command.
The problem I have is the last bit of the command --capabilities CAPABILITY_IAM in the else statement that I don't want to pass in all the time if I have multiple parameters.
An error occurred (InsufficientCapabilitiesException) when calling the CreateStack operation: Requires capabilities : [CAPABILITY_NAMED_IAM]
// that means I need to pass in --capabilities CAPABILITY_IAM
Is there a way to tell bash that: hey, take all the args from the 3rd one, then add the --capabilities CAPABILITY_IAM after? Like in JavaScript I can do this:
function allTogetherNow(a, b, ...c) {
  console.log(`${a}, ${b}, ${c}. Can I have a little more?`);
}
allTogetherNow('one', 'two', 'three', 'four')
Here's my function:
cloudformation_create() {
  if [ -z "$3" ]; then
    aws cloudformation create-stack --stack-name "$1" --template-body file://"$2" --capabilities CAPABILITY_IAM
  else
    aws cloudformation create-stack --stack-name "$1" --template-body file://"$2" --parameters "${@:3}" --capabilities CAPABILITY_IAM
  fi
}
And the 3rd and so on parameters look like this if I don't use a bash function:
aws cloudformation create-stack --stack-name MY_STACK_NAME --template-body file://MY_FILE_NAME --parameters ParameterKey=KeyPairName,ParameterValue=TestKey ParameterKey=SubnetIDs,ParameterValue=SubnetID1 --capabilities CAPABILITY_IAM
Update 22 May 2019:
Following Dennis Williamson's answer below. I've tried:
Passing the parameters in the AWS way:
cloudformation_create STACK_NAME FILE_NAME ParameterKey=KeyPairName,ParameterValue=TestKey ParameterKey=SubnetIDs,ParameterValue=SubnetID1
Got error:
An error occurred (ValidationError) when calling the CreateStack operation: Parameters: [...] must have values
Pass in as a string:
cloudformation_create STACK_NAME FILE_NAME "ParameterKey=KeyPairName,ParameterValue=TestKey ParameterKey=SubnetIDs,ParameterValue=SubnetID1"
Got error:
An error occurred (ValidationError) when calling the CreateStack operation: ParameterValue for ... is required
Pass in without ParameterKey and ParameterValue:
cloudformation_create STACK_NAME FILE_NAME KeyPairName=TestKey SubnetIDs=SubnetID1
Got error:
Parameter validation failed:
Unknown parameter in Parameters[0]: "PARAM_NAME", must be one of: ParameterKey, ParameterValue, UsePreviousValue, ResolvedValue
// list of all the params with the above error
Pass in without ParameterKey and ParameterValue and as a string. Got error:
Parameter validation failed:
Unknown parameter in Parameters[0]: "PARAM_NAME", must be one of: ParameterKey, ParameterValue, UsePreviousValue, ResolvedValue
I tried Alex Harvey's answer and got this:
An error occurred (ValidationError) when calling the CreateStack operation: Template format error: unsupported structure.
Based on LeBlue's answer and a quick read of the docs, it looks like you need to build the argument to --parameters from the arguments to the function:
cloudformation_create () {
  local IFS=,
  local parameters="${*:3}"
  # if block not shown
  aws cloudformation ... --parameters "$parameters" ...
}
This presumes that your function is called like this:
cloudformation_create foo bar baz=123 qux=456
that is, with the key, value pairs already formed.
The snippet above works by setting IFS to a comma. Using $* inside quotes causes the elements contained in it to be joined using the first character of IFS. If you need to make use of the word splitting features in another part of your function, you may want to save the current value of IFS before changing it then restore it after the joining assignment.
As a result, the expansion of $parameters will be baz=123,qux=456
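The join can be sketched in isolation (join_params and the argument values are made up for illustration):

```shell
join_params() {
  # Setting IFS locally means "${*:3}" joins args 3..N with a comma
  # without disturbing word splitting elsewhere in the script.
  local IFS=,
  printf '%s\n' "${*:3}"
}
join_params foo bar baz=123 qux=456
```

The call prints baz=123,qux=456: the first two arguments are skipped and the rest are comma-joined.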
I am not sure that answers your question but maybe it will help you.
$# is the number of parameters.
$@ will include all parameters and you can pass that.
#!/bin/bash
foo()
{
    echo "params in foo: " $#
    echo "p1: " $1
    echo "p2: " $2
    echo "p3: " $3
}
echo "number of parameters: " $#
foo $@ 3 # pass params and add one to the end
Call:
./test.sh 1 2
Output:
number of parameters: 2
params in foo: 3
p1: 1
p2: 2
p3: 3
I suspect the parameter expansion is wrong, as --parameters probably needs to have one argument.
Either quote all arguments to cloudformation_create that need to end up as value for the --parameters flag:
cloudformation_create "the-stack" "the-filename" "all the parameters"
or rewrite the function to not expand into multiple arguments with "$*" (merge every arg into one)
cloudformation_create () {
  ...
  else
    aws cloudformation ... --parameters "${*:3}" --capabilities CAPABILITY_IAM
  fi
}
This will preserve all values as one string/argument, both will translate to:
aws cloudformation ... --parameters "all other parameters" --capabilities CAPABILITY_IAM
as opposed to your version:
aws cloudformation ... --parameters "all" "other" "parameters" --capabilities CAPABILITY_IAM
First of all, thank you all for your help.
I've realised the issue (and my mistake):
AWS returned the error with Requires capabilities : [CAPABILITY_NAMED_IAM] and my function has [CAPABILITY_IAM].
Depending on the template and its IAM-related resources, [CAPABILITY_NAMED_IAM] or [CAPABILITY_IAM] is required. I found the answer here helpful.
So in my case, the bash function is good, for the template I was trying to create, I need to pass in --capabilities CAPABILITY_NAMED_IAM. I've tried it and it works.
I would write it this way:
cloudformation_create() {
  local stack_name=$1
  local template_body=$2
  shift 2 ; local parameters="$*"
  local command=(aws cloudformation create-stack
                 --stack-name "$stack_name"
                 --template-body "file://$template_body")
  [ -n "$parameters" ] && \
    command+=(--parameters "$parameters")
  command+=(--capabilities CAPABILITY_IAM)
  "${command[@]}"
}
Note:
calling shift 2 results in $3 being rotated to $1 such that you can just use $@ as normal.
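A minimal sketch of that shift behaviour (show_rest and the argument values are made up):

```shell
show_rest() {
  # Drop the first two positional parameters; $1 is now the old $3.
  shift 2
  echo "$@"
}
show_rest stack-name template.yml Key1=a Key2=b
```

The call prints only Key1=a Key2=b, which is what ends up behind --parameters in the function above.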

bash script - unable to set variable with double quotes in value

Need help fixing this bash script to set a variable with a value including double quotes. Somehow I am defining this incorrectly, as my values foo and bar are not enclosed in double quotes as needed.
My script thus far:
#!/usr/local/bin/bash
set -e
set -x
host='127.0.0.1'
db='mydev'
_account="foo"
_profile="bar"
_version=$1
_mongo=$(which mongo);
exp="db.profile_versions_20170420.find({account:${_account}, profile:${_profile}, version:${_version}}).pretty();";
${_mongo} ${host}/${db} --eval "$exp"
set +x
Output shows:
+ host=127.0.0.1
+ db=mydev
+ _account=foo
+ _profile=bar
+ _version=201704112004
++ which mongo
+ _mongo=/usr/local/bin/mongo
+ exp='db.profile_versions_20170420.find({account:foo, profile:bar, version:201704112004}).pretty();'
+ /usr/local/bin/mongo 127.0.0.1/mydev --eval 'db.profile_versions_20170420.find({account:foo, profile:bar, version:201704112004}).pretty();'
MongoDB shell version: 3.2.4
connecting to: 127.0.0.1/mydev
2017-04-22T15:32:55.012-0700 E QUERY [thread1] ReferenceError: foo is not defined :
@(shell eval):1:36
What i need is account:"foo", profile:"bar" to be enclosed in double quotes.
In bash (and other POSIX shells), the following 2 states are equivalent:
_account=foo
_account="foo"
What you want to do is preserve the quotation marks, so you can do the following:
_account='"foo"'
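A quick check that the embedded quotes survive expansion:

```shell
_account='"foo"'
# The double quotes are part of the value itself, so they appear in the output.
echo "account:${_account}"
```

This prints account:"foo", which is exactly the form the mongo eval string needs.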
Since part of what you're doing here is forming JSON, consider using jq -- which will guarantee that it's well-formed, no matter what the values are.
host='127.0.0.1'
db='mydev'
_account="foo"
_profile="bar"
_version=$1
json=$(jq -n --arg account "$_account" --arg profile "$_profile" --arg version "$_version" \
'{$account, $profile, version: $version | tonumber}')
exp="db.profile_versions_20170420.find($json).pretty();"
mongo "${host}/${db}" --eval "$exp"
This makes jq responsible for adding literal quotes where appropriate, and will avoid attempted injection attacks (for instance, via a version passed in $1 containing something like 1, "other_argument": "malicious_value"), by replacing any literal " in a string with \"; a literal newline with \n, etc -- or, with the | tonumber conversion, failing outright with any non-numeric value.
Note that some of the syntax above requires jq 1.5 -- if you have 1.4 or prior, you'll want to write {account: $account, profile: $profile} instead of being able to write {$account, $profile} with the key names inferred from the variable names.
When you need to use double quotes inside a double quoted string, escape them with backslashes:
$ foo="account:\"foo\"" sh -c 'echo $foo'
account:"foo"
I needed to enquote something already in a variable and stick that in a variable. Expanding on Robert Seaman's answer, I found this worked:
VAR='"'$1'"'
(single quote, double quote, single quote, variable, single quote, double quote, single quote)
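A quick sanity check of that sequence. Note that double-quoting the variable itself ("$1") is slightly safer than the bare $1 above, in case the value contains spaces:

```shell
set -- hello        # simulate the script's first positional parameter
VAR='"'"$1"'"'      # single quote, double quote, single quote, "$1", ...
echo "$VAR"
```

This prints "hello", with the double quotes baked into the variable's value.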
