Suppress Errors in readarray Command in Bash

I'm parsing AWS policy documents and I'm trying to send the errors in this command to /dev/null so that the user doesn't see them.
This is my code:
readarray -t aws_policy_effects < \
  <( if aws iam get-policy-version --policy-arn "$aws_policy_arn" \
          --version-id "$aws_policy_version_id" --profile="$aws_key" 2> /dev/null | \
        jq -r '.PolicyVersion.Document.Statement[].Effect'
     then
       true
     else
       aws iam get-policy-version --policy-arn "$aws_policy_arn" \
         --version-id "$aws_policy_version_id" --profile="$aws_key" | \
         jq -r '.PolicyVersion.Document.Statement.Effect'
     fi)
I'm using an 'if' statement so that the right jq query gets used based on the aws policy that we're reading.
I get this error:
jq: error (at <stdin>:22): Cannot index string with string "Effect"
Because this command always runs as the first condition of the if, and it fails when the Statement of the AWS policy document is not a list:
++ jq -r '.PolicyVersion.Document.Statement[].Effect'
++ aws iam get-policy-version --policy-arn arn:aws:iam::123456789101:policy/IP_RESTRICTION --version-id v11 --profile=company-lab
Why isn't the error being buried by sending it to /dev/null? How can I get the error to not print out to the screen using this if statement?

jq is telling you that your jq query is not always valid for the input it receives. So one option would be to make your jq query more robust, e.g. by using the postfix ? operator (e.g. .Effect?), or testing the type of the input, etc.
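For example, a single filter can branch on the type of Statement itself (a sketch, not tested against every policy shape):
jq -r '.PolicyVersion.Document.Statement | if type == "array" then .[].Effect else .Effect end'
With a filter like this, the shell-level if/else is no longer needed, since jq does the branching.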

I didn't redirect jq's standard error, only the standard error of aws. Putting 2> /dev/null at the end of the jq command works.
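For reference, here is the original command with both stderr streams silenced (a sketch of that fix):
readarray -t aws_policy_effects < \
  <( if aws iam get-policy-version --policy-arn "$aws_policy_arn" \
          --version-id "$aws_policy_version_id" --profile="$aws_key" 2> /dev/null | \
        jq -r '.PolicyVersion.Document.Statement[].Effect' 2> /dev/null
     then
       true
     else
       aws iam get-policy-version --policy-arn "$aws_policy_arn" \
         --version-id "$aws_policy_version_id" --profile="$aws_key" 2> /dev/null | \
         jq -r '.PolicyVersion.Document.Statement.Effect'
     fi)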

Related

Passing Variables in CURL GET

I'm trying to write a sh script that reads data from a JSON file and, after parsing it, passes the values as variables to a curl request (GET).
The parsing works correctly, but the curl request keeps failing.
I parse the variables, after opening the JSON, with the help of JQ:
ruleId=$($JSON | jq '.[].ruleId')
alert=$($JSON | jq '.[].alertName')
...
I'm passing the variables in this way:
curl --data-urlencode "ruleId=$ruleId" --data-urlencode "newLevel=$newLevel" --data-urlencode "url=$url" --data-urlencode "urlIsRegex=$urlIsRegex" --data-urlencode "enabled=$enabled" --data-urlencode "parameter=$parameter" --data-urlencode "evidence=$evidence" "http://localhost:8090/JSON/alertFilter/action/addGlobalAlertFilter"
The error that i get back is:
curl: (3) URL using bad/illegal format or missing URL
Can you help me?
Here is my test script:
#!/bin/sh
#set -e
# Info to test this script
# 1. Start docker
# 2. Run this command from terminal: docker run -u zap -p 8080:8090 -i owasp/zap2docker-stable zap.sh -daemon -host 0.0.0.0 -port 8080 -config api.disablekey=true -config api.addrs.addr.name=.* -config api.addrs.addr.regex=true
# 3. Get the JSON from api (test for now)
# 4. bash filter.sh
# 5. Check for filter to be set with browser http://localhost:8080/UI/alertFilter/ => view global filter
#open example JSON file (API) and check for alert filter list
curl -s 'https://api.npoint.io/c29e3a68be632f73fc22' > whitelist_tmp.json
whitelist=whitelist_tmp.json
#Parser WITHOUT loop
ruleId=$($whitelist | jq '.[].ruleId')
alert=$($whitelist | jq '.[].alertName')
newLevel=$($whitelist | jq '.[].newLevel')
url=$($whitelist | jq '.[].url')
urlIsRegex=$($whitelist | jq '.[].urlIsRegex')
enabled=$($whitelist | jq '.[].enabled')
parameter=$($whitelist | jq '.[].parameter')
evidence=$($whitelist | jq '.[].evidence')
#First test-query WITHOUT url-encoding => not working
#curl -X -4 --retry 2 --retry-connrefused --retry-delay 3 "http://localhost:8080/JSON/alertFilter/action/addGlobalAlertFilter/?ruleId=${ruleId}\&newLevel=${newLevel}\&url=${url}\&urlIsRegex=${urlIsRegex}\&parameter=${parameter}\&enabled=${enabled}\&evidence=${evidence}"
#Second test-query WITH url-encoding
curl --data-urlencode "ruleId=$ruleId" --data-urlencode "newLevel=$newLevel" --data-urlencode "url=$url" --data-urlencode "urlIsRegex=$urlIsRegex" --data-urlencode "enabled=$enabled" --data-urlencode "parameter=$parameter" --data-urlencode "evidence=$evidence" "http://localhost:8080/JSON/alertFilter/action/addGlobalAlertFilter"
jq accepts a filename argument, and it would be better to use that than change whitelist to include a call to cat:
jq -r 'MYFILTER' "$whitelist"
Using filters such as .[].evidence makes your script very fragile, as it assumes the top-level array will always have just one entry. If you want just the first entry in the array, you could specify that, though maybe using first would be even more robust:
first(.[].evidence)
Although jq is fairly light-weight, it is worth noting that all those calls to jq could be reduced to one, though of course your shell script would then have to be modified as well. See e.g.
Attempting to assign multiple variables using jq
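For example, all of these values could be pulled out in a single pass (a sketch, assuming bash, and that the first array entry is the one you want):
IFS=$'\t' read -r ruleId alert newLevel url urlIsRegex enabled parameter evidence < <(
  jq -r 'first(.[]) | [.ruleId, .alertName, .newLevel, .url, .urlIsRegex, .enabled, .parameter, .evidence] | @tsv' whitelist_tmp.json
)
@tsv escapes embedded tabs and newlines, so the fields stay aligned with the read.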
Try changing
whitelist=whitelist_tmp.json
to
whitelist="cat whitelist_tmp.json"
I think the error was occurring because the shell was trying and failing to execute "whitelist_tmp.json" as its own command/function instead of piping the contents of the file to jq.
You also want to add the -r option to jq in the variables you are going to pass in curl, like so:
ruleId=$($whitelist | jq -r '.[].ruleId')
alert=$($whitelist | jq -r '.[].alertName')
newLevel=$($whitelist | jq -r '.[].newLevel')
url=$($whitelist | jq -r '.[].url')
urlIsRegex=$($whitelist | jq -r '.[].urlIsRegex')
enabled=$($whitelist | jq -r '.[].enabled')
parameter=$($whitelist | jq -r '.[].parameter')
evidence=$($whitelist | jq -r '.[].evidence')
Without the -r option, any strings that jq outputs will be quoted, which can break the quoting in your curl command. The -r option tells jq to output the raw string data without any quotation.
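The difference is easy to see in isolation:
$ echo '{"url": "http://example.com"}' | jq '.url'
"http://example.com"
$ echo '{"url": "http://example.com"}' | jq -r '.url'
http://example.com
With the first form, curl receives the quotes as part of the value.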

jq: error: syntax error, unexpected INVALID_CHARACTER (Unix shell quoting issues?) at <top-level>

I was following this link to stream data from MySQL to a Kafka topic on my Ubuntu machine. In the Kafka Connect setup topic there, when I run this to check whether my connectors are running (as suggested there):
curl -s "http://localhost:8083/connectors" | jq '.[]' | xargs -I mysql-connector curl -s "http://localhost:8083/connectors/mysql-connector/status" | jq -c -M '[.name,.connector.state,.tasks[].state] | \
join(":|:")'| column -s : -t| sed 's/\"//g'| sort
I got this error:
jq: error: syntax error, unexpected INVALID_CHARACTER (Unix shell quoting issues?) at <top-level>, line 1:
[.name,.connector.state,.tasks[].state] | \
jq: 1 compile error
(23) Failed writing body
(23) Failed writing body
I am totally stuck. Anyone please help if possible.
N.B.: This is not duplicate question, though question with similar headline exists but problem is different and I have checked them well.
You are abusing \. Use:
curl -s "http://localhost:8083/connectors" |
jq '.[]' |
xargs -I mysql-connector curl -s "http://localhost:8083/connectors/mysql-connector/status" |
jq -c -M '[.name,.connector.state,.tasks[].state] |
join(":|:")' |
column -s : -t |
tr -d \" |
sort
It might be cleaner to put the jq all on one line, but the key point is that if you try to escape the newline inside single quotes, you end up with a literal backslash in the jq command that does not belong there.
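You can reproduce the problem in miniature; inside single quotes the backslash is passed through to jq as a literal character, which it rejects (the exact error text may vary by jq version):
$ jq -n '1 | \
.'
jq: error: syntax error, unexpected INVALID_CHARACTER (Unix shell quoting issues?) at <top-level>, line 1: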

Formatting JSON string for AWS IoT

I'm getting "JSON format error" from the AWS console when I try to publish a temperature value from a variable; this works correctly:
mosquitto_pub -t \$aws/things/my-xxxx/shadow/update -m '{"state": {"desired":
{"temperature": 1 }}}' -q 1
I want to replace "1" with a variable, so I created a shell script with the mosquitto_pub command, and I want to pass an argument to the script, calling "./publish.sh Temperature_Value", where the temperature value is an int:
Trying this I get errors from AWS console:
DATA=${1}
mosquitto_pub -t \$aws/things/my-xxxx/shadow/update -m '{"state": {"desired":
{"temperature": $DATA }}}' -q 1
What am I doing wrong?
Thanks
Your escaping is wrong. This is the right escaping:
mosquitto_pub -t \$aws/things/my-xxxx/shadow/update -m "{\"state\": {\"desired\":
{\"temperature\": $1 }}}" -q 1
Remember that variables within single quotes ' are not interpolated.
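An alternative that avoids hand-escaping the JSON is to let jq build the payload (a sketch, assuming jq is available and the first script argument is a number):
payload=$(jq -n --argjson temp "$1" '{state: {desired: {temperature: $temp}}}')
mosquitto_pub -t \$aws/things/my-xxxx/shadow/update -m "$payload" -q 1
Because --argjson validates its argument as JSON, jq will fail loudly on bad input instead of publishing a malformed document.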
Regards!

Error in awscli call doesn't send to logfile

We have to check the status of an instance, and I am trying to capture any error to a logfile. The logfile has the instance information, but the error is not being written to it. Below is the code; let me know what needs to be corrected.
function wait-for-status {
    instance=$1
    target_status=$2
    status=unknown
    while [[ "$status" != "$target_status" ]]; do
        status=`aws rds describe-db-instances \
            --db-instance-identifier $instance | head -n 1 \
            | awk -F \ '{print $10}'` >> ${log_file} 2>&1
        sleep 30
        echo $status >> ${log_file}
    done
}
Rather than using all that head/awk stuff, if you want a value out of the CLI, you should use the --query parameter. For example:
aws rds describe-db-instances --db-instance-identifier xxx --query 'DBInstances[*].DBInstanceStatus'
See: Controlling Command Output from the AWS Command Line Interface - AWS Command Line Interface
Also, if your goal is to wait until an Amazon RDS instance is available, you should use wait db-instance-available (see the AWS CLI Command Reference):
aws rds wait db-instance-available --db-instance-identifier xxx
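Putting the two together, here is a sketch of the original loop that also sends CLI errors to the logfile (assuming log_file is already set). Note that a redirection on the outer assignment does not apply to the command substitution, which is why the original >> captured nothing:
function wait-for-status {
    instance=$1
    target_status=$2
    status=unknown
    while [[ "$status" != "$target_status" ]]; do
        # 2>> inside the substitution appends the CLI's stderr to the logfile
        status=$(aws rds describe-db-instances \
            --db-instance-identifier "$instance" \
            --query 'DBInstances[0].DBInstanceStatus' \
            --output text 2>> "${log_file}")
        echo "$status" >> "${log_file}"
        sleep 30
    done
}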

Command line tool to access Amazon Athena

I am looking for a command line tool to make queries to Amazon Athena.
It works with JDBC, using the driver com.amazonaws.athena.jdbc.AthenaDriver, but I haven't found any command line tool that works with it.
Expanding on the previous answer from @MasonWinsauer. Requires bash and jq.
#!/bin/bash
# Athena queries are fundamentally asynchronous, so we have to:
# 1) Make the query, and tell Athena where in s3 to put the results (tell it the same place as the UI uses).
# 2) Wait for the query to finish
# 3) Pull down the results and un-wacky-Jsonify them.
# run the query, use jq to capture the QueryExecutionId, and then capture that into a bash variable
queryExecutionId=$(
aws athena start-query-execution \
--query-string "SELECT Count(*) AS numBooks FROM books" \
--query-execution-context "Database=demo_books" \
--result-configuration "OutputLocation"="s3://whatever_is_in_the_athena_UI_settings" \
--region us-east-1 | jq -r ".QueryExecutionId"
)
echo "queryExecutionId was ${queryExecutionId}"
# Wait for the query to finish running.
# This will wait for up to 60 seconds (30 * 2)
for i in $(seq 1 30); do
queryState=$(
aws athena get-query-execution --query-execution-id "${queryExecutionId}" --region us-east-1 | jq -r ".QueryExecution.Status.State"
);
if [[ "${queryState}" == "SUCCEEDED" ]]; then
break;
fi;
echo " Awaiting queryExecutionId ${queryExecutionId} - state was ${queryState}"
if [[ "${queryState}" == "FAILED" ]]; then
# exit with "bad" error code
exit 1;
fi;
sleep 2
done
# Get the results.
aws athena get-query-results \
--query-execution-id "${queryExecutionId}" \
--region us-east-1 > numberOfBooks_wacky.json
# Todo un-wacky the json with jq or something
# cat numberOfBooks_wacky.json | jq -r ".ResultSet.Rows[] | .Data[0].VarCharValue"
As of version 1.11.89, the AWS command line tool supports Amazon Athena operations.
First, you will need to attach the AmazonAthenaFullAccess policy to the IAM role of the calling user.
Then, to get started querying, you will use the start-query-execution command as follows:
aws athena start-query-execution \
    --query-string "SELECT * FROM MyDb.MyTable" \
    --result-configuration "OutputLocation"="s3://MyBucket/logs" [Optional: EncryptionConfiguration] \
    --region <region>
This will return a JSON object containing the QueryExecutionId, which can be used to retrieve the query results with the following command:
aws athena get-query-results \
    --query-execution-id <id> \
    --region <region>
This also returns a JSON object containing the results and metadata.
More information can be found in the official AWS Documentation.
Hope this helps!
You can try AthenaCLI, which is a command line client for the Athena service that supports auto-completion and syntax highlighting, and is a proud member of the dbcli community:
https://github.com/dbcli/athenacli
athena-cli should be a good start.
