Pick values from text file and put them after string - bash

I'd like to pick two values from a text file and paste them into another file after certain strings:
#!/bin/bash
IAM_ROLE=$(curl --silent http://169.254.169.254/latest/meta-data/iam/security-credentials)
curl --silent http://169.254.169.254/latest/meta-data/iam/security-credentials/$IAM_ROLE | jq -r '.AccessKeyId, .SecretAccessKey' > /tmp/aws_credentials
curl --silent http://169.254.169.254/latest/dynamic/instance-identity/document | jq -r .region > /tmp/aws_region
{ read -r val1
read -r val2
sed -i 's! access_key= .* *$! access_key= $val1 !; s! secret_key= .* *$! secret_key= $val2 !;' /etc/trafficserver/s3_auth_v4.config
} < /tmp/aws_credentials
/etc/trafficserver/s3_auth_v4.config looks like:
access_key=
secret_key=
version=4
v4-region-map=region_map.config
However, the sed part of the script does nothing.
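For what it's worth, two things conspire to make that sed a no-op: the single quotes keep $val1 and $val2 from expanding (sed searches for the literal string $val1), and the patterns require spaces around access_key= that the config file shown above does not contain. A minimal sketch of a fix, assuming that config format:
{ read -r val1
read -r val2
# Double quotes let $val1/$val2 expand; anchor on the bare key names
# that actually appear in the file. Assumes the values contain no "!"
# or "&" characters (AWS keys normally don't).
sed -i "s!^access_key=.*!access_key=$val1!; s!^secret_key=.*!secret_key=$val2!" /etc/trafficserver/s3_auth_v4.config
} < /tmp/aws_credentials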

Related

How to get a command variable inside another command variable?

Example here:
gitrepo=$(jq -r '.gitrepo' 0.json)
releasetag=$(curl --silent ""https://api.github.com/repos/\"$gitrepo\""/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
echo "$releasetag"
Used \" to escape characters.
0.json:
{
"type": "github-releases",
"gitrepo": "ipfs/go-ipfs"
}
How do I get $gitrepo to work inside $releasetag?
Thanks in advance!
Bash variables expand inside double-quoted (") strings, so no extra escaping is needed:
gitrepo="$(jq -r '.gitrepo' 0.json)"
releasetag="$(
curl --silent "https://api.github.com/repos/$gitrepo/releases/latest" \
| grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/'
)"
echo "$releasetag"
Btw, as you are using jq to extract .gitrepo from 0.json, you could also use it in the exact same way to extract .tag_name from curl's output (instead of using grep and sed) like so:
gitrepo="$(jq -r '.gitrepo' 0.json)"
releasetag="$(
curl --silent "https://api.github.com/repos/$gitrepo/releases/latest" \
| jq -r '.tag_name'
)"
echo "$releasetag"
And to simplify it even further (depending on your use case), just write:
curl --silent "https://api.github.com/repos/$(jq -r '.gitrepo' 0.json)/releases/latest" \
| jq -r '.tag_name'

Is it possible to output multiple values to different files from a single jq invocation?

I have a curl command from which I would like to extract two texts into two different files, file1.crt and file2.key. It is currently configured this way, repeating the curl:
curl -X GET -H "X-Vault-Token:{{ vault_token }}" https://fopp.com/v1/ACME/data/SSL/fopp.com | jq -r .data.data.crt > /files/nginx/ssl/file1.crt &&
curl -X GET -H "X-Vault-Token:{{ vault_token }}" https://fopp.com/v1/ACME/data/SSL/fopp.com | jq -r .data.data.key > /files/nginx/ssl/file2.key
I would like to know whether jq could handle the extraction in just one command.
Not with only jq itself, but it's easy enough to combine with a bit of shell. If your values can't contain literal newlines, this can be as easy as:
curl -X GET -H "X-Vault-Token:{{ vault_token }}" https://fopp.com/v1/ACME/data/SSL/fopp.com \
| jq -r '.data.data.crt, .data.data.key' \
| { IFS= read -r crt && printf '%s\n' "$crt" > /files/nginx/ssl/file1.crt;
IFS= read -r key && printf '%s\n' "$key" > /files/nginx/ssl/file2.key; }
If the values can contain newlines, then you need to use a different separator. Consider:
curl -X GET -H "X-Vault-Token:{{ vault_token }}" https://fopp.com/v1/ACME/data/SSL/fopp.com \
| jq -j '.data.data.crt, "\u0000", .data.data.key, "\u0000"' \
| { IFS= read -r -d '' crt && printf '%s\n' "$crt" > /files/nginx/ssl/file1.crt;
IFS= read -r -d '' key && printf '%s\n' "$key" > /files/nginx/ssl/file2.key; }
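If you'd rather skip the read gymnastics altogether, another option (a matter of taste rather than necessity) is to capture the response once and run jq twice over the saved copy; this still makes only one curl request:
# Save the whole response, then extract each field from the copy.
response=$(curl -X GET -H "X-Vault-Token:{{ vault_token }}" https://fopp.com/v1/ACME/data/SSL/fopp.com)
jq -r '.data.data.crt' <<<"$response" > /files/nginx/ssl/file1.crt
jq -r '.data.data.key' <<<"$response" > /files/nginx/ssl/file2.key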

Echo the command result in a file.txt

I have a script such as:
cat list_id.txt | while read line; do for ACC in $line;
do
echo -n "$ACC\t"
curl -s "link=fasta&retmode=xml" |\
grep TSeq_taxid |\
cut -d '>' -f 2 |\
cut -d '<' -f 1 |\
tr -d "\n"
echo
sleep 0.25
done
done
This script allows me to take a list of IDs in list_id.txt and get the corresponding names from the database at https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?db=nuccore&id=${ACC}&rettype=fasta&retmode=xml
So from this script I get something like
CAA42669\t9913
V00181\t7154
AH002406\t538120
And what I would like is to print or echo this result directly into a file called new_ids.txt. I tried echo >> new_ids.txt, but the file is empty.
Thanks for your help.
A minimal refactoring of your script might look like
# Avoid useless use of cat
# Use read -r
# Don't use upper case for private variables
while read -r line; do
for acc in $line; do
echo -n "$acc\t"
# No backslash necessary after | character
curl -s "link=fasta&retmode=xml" |
# Probably use a proper XML parser for this
grep TSeq_taxid |
cut -d '>' -f 2 |
cut -d '<' -f 1 |
tr -d "\n"
echo
sleep 0.25
done
done <list_id.txt >new_ids.txt
This could probably still be simplified significantly, but without knowledge of what your input file looks like exactly, or what curl returns, this is somewhat speculative.
tr -s ' \t\n' '\n' <list_id.txt |
while read -r acc; do
curl -s "link=fasta&retmode=xml" |
awk -v acc="$acc" '/TSeq_taxid/ {
split($0, a, /[<>]/); print acc "\t" a[3] }'
sleep 0.25
done >new_ids.txt
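For reference, the awk version writes real tab separators, so with the sample IDs above new_ids.txt should come out looking like:
CAA42669	9913
V00181	7154
AH002406	538120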

exporting environment variables with spaces using jq

So, I'm trying to export an environment variable that comes from an API that returns JSON values. I'd like to use jq to do it as a one-liner, but if the values have spaces I cannot get it working.
Trying without surrounding the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=" + .Value')
/app/src $ printenv KEY
value
/app/src $
Next, I try wrapping the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=\"" + .Value + "\""')
sh: export: space": bad variable name
/app/src $
For all of the below, I'm assuming that:
json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
It can be done, but ONLY IF YOU TRUST YOUR INPUT.
A solution that uses eval might look like:
eval "$(jq -r '.params[] | "export \(.Name | #sh)=\(.Value | #sh)"' <<<"$json")"
The @sh builtin in jq escapes content to be eval-safe in bash, and the eval invocation then ensures that the content goes through all parsing stages (so literal quotes in the data emitted by jq become syntactic).
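To see what eval actually consumes, run the jq part on its own; with the $json above it emits shell-quoted assignments along these lines:
$ jq -r '.params[] | "export \(.Name | @sh)=\(.Value | @sh)"' <<<"$json"
export 'KEY'='value with space'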
However, all solutions that allow arbitrary shell variables to be assigned have innate security problems, as the ability to set variables like PATH, LD_LIBRARY_PATH, LD_PRELOAD and the like can be leveraged into arbitrary code execution.
Better form is to generate a NUL-delimited key/value list...
build_kv_nsv() {
jq -j '.params[] |
((.Name | gsub("\u0000"; "")),
"\u0000",
(.Value | gsub("\u0000"; "")),
"\u0000")'
}
...and either populate an associative array...
declare -A content_received=( )
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
content_received[$name]=$value
done < <(build_kv_nsv <<<"$json")
# print the value of the populated associative array
declare -p content_received
...or to use a namespace that's prefixed to guarantee safety.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
printf -v "received_$name" %s "$value" && export "received_$name"
done < <(build_kv_nsv <<<"$json")
# print names and values of our variables that start with received_
declare -p "${!received_#}" >&2
If the values are known not to contain (raw) newlines, and if you have access to mapfile, it may be worthwhile considering using it, e.g.
$ json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
$ mapfile -t KEY < <( jq -r '.params[] | .Value' <<< "$json" )
$ echo N=${#KEY[@]}
N=1
If the values might contain (raw) newlines, then you'd need a version of mapfile with the -d option (bash 4.4 or newer), which could be used as illustrated below:
$ json='{"params":[{ "Name":"KEY1","Value":"value with space"}, { "Name":"KEY2","Value":"value with \n newline"}]}'
$ mapfile -d $'\0' KEY < <( jq -r -j '.params[] | .Value + "\u0000"' <<< "$json" )
$ echo N=${#KEY[@]}
N=2
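If you need the names alongside the values, the same pattern extends to interleaved NUL-terminated fields; a sketch using the two-key $json above (-t strips the delimiters):
$ mapfile -d '' -t KV < <( jq -j '.params[] | .Name, "\u0000", .Value, "\u0000"' <<< "$json" )
$ echo N=${#KV[@]}
N=4
$ echo "${KV[0]}=${KV[1]}"
KEY1=value with space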

How do I detect a failed subprocess in a bash read statement?

In bash we can set a shell variable from a sequence of commands using read and process substitution. But I'm having trouble detecting errors in my processing in one edge case: a part of the subprocess pipeline producing some output before erroring.
A simplified example which takes an input file, looks for a line starting with "foo" and sets var to the first word on that line is:
set -e
set -o pipefail
set -o nounset
die() {
echo "DEAD: $1" >&2
exit 1
}
read -r var rest < <( \
cat data.txt \
| grep foo \
|| die "PIPELINE" \
) || die "OUTER"
echo "var=$var"
Running this with data.txt like
blah
zap foo awesome
bang foo
will output
var=zap
Running this on a data.txt file that doesn't contain foo outputs (to stderr)
DEAD: PIPELINE
DEAD: OUTER
This is all as expected.
We can introduce another no-op stage like cat at the end of the pipeline
...
read -r var rest < <( \
cat data.txt \
| grep foo \
| cat \
|| die "PIPELINE" \
) || die "OUTER"
...
and everything continues to work.
But if the additional stage is paste -s -d' ' and the input does not contain "foo", the output is
var=
DEAD: PIPELINE
Which seems to show that the pipeline errors, but read succeeds with an empty line. (It looks like paste -s -d' ' outputs a line of output even when its input is empty.)
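A quick way to confirm what paste does with empty input on the system in question (one byte of output: a bare newline):
$ printf '' | paste -s -d' ' | wc -c
1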
Is there a simple way to detect this failure of the pipeline, and cause the main script to error out?
I guess I could check that the variable is not empty, but this is a simplified version; I'm actually using sed and paste to join multiple lines and set multiple variables, like
read -r v1 v2 v3 rest < <( \
cat data.txt \
| grep "^foo=" \
| sed -e 's/foo=//' \
| paste -s -d' ' \
|| die "PIPELINE"
) || die "OUTER"
You could use another grep to check that the output of paste contains something; grep . matches only non-empty lines and exits non-zero when it matches nothing, so the || die fires on empty output:
read -r var rest < <( \
cat data.txt \
| grep foo \
| paste -s -d' ' \
| grep . \
|| die "PIPELINE" \
) || die "OUTER"
In the end I went with two different solutions depending on the context.
The first was to pipe the results to a temporary file. This will process the entire file before performing the read, and thus any failures in the pipe will cause the script to fail.
cat data.txt \
| grep "^foo=" \
| sed -e 's/foo=//' \
| paste -s -d' ' \
> "$TMP/result.txt" \
|| die "PIPELINE"
read -r var rest < "$TMP/result.txt" || die "OUTER"
The second was to just test that the variables were set. While this meant
there was a bunch of duplication that I wanted to avoid, it seemed the most bullet-proof solution.
read -r var rest < <( cat data.txt \
| grep "^foo=" \
| sed -e 's/foo=//' \
| paste -s -d' ' \
|| die "PIPELINE"
) || die "OUTER"
[ ! -z "$var" ] || die "VARIABLE NOT SET"
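A compact variant of that last check when several variables are involved, offered as a style preference, loops over the names with indirect expansion:
for v in v1 v2 v3; do
[ -n "${!v}" ] || die "VARIABLE $v NOT SET"
done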
