Passing certs to curl from an environment variable - bash

I am working on a CORS monitor for my application and need to do the following:
1. Download certificates from our HashiCorp Vault instance
2. curl an endpoint using those certs
3. Evaluate the response
I can currently do this as follows:
vault kv get -field crt my/cert/path/CACERT > CACERT.crt
vault kv get -field crt my/cert/path/TESTCERT > CERT.crt
vault kv get -field key my/cert/path/TESTCERT > KEY.key
curl -v \
--cacert CACERT.crt \
--cert CERT.crt \
--key KEY.key \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
rm CACERT.crt CERT.crt KEY.key
While this works, I would prefer not to write the certs to a file and would rather keep them in memory, for example in environment variables, or perhaps via some bash-ism I'm not aware of.
In my mind it would look something more like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert "$CACERT" \
--cert "$CERT" \
--key "$KEY" \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
Obviously this won't work, as curl expects file paths, but for illustrative purposes: is this possible? If so, is it a poor approach? Perhaps there is another way to do this that I haven't considered? I am aware that I could do the above relatively easily with Python, but I'd prefer to stick to bash + curl if at all possible.

Bash supports process substitution using the <(cmd) syntax. The shell replaces the expression with a filename (typically of the form /dev/fd/N) from which the output of cmd can be read. This lets you pass command output as an argument wherever a filename is expected.
In your example, you would do something like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert <(echo "$CACERT") \
--cert <(echo "$CERT") \
--key <(echo "$KEY") \
--location \
--request GET 'https://my.end.point'
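As a quick sanity check, you can see what curl actually receives: bash replaces each <(cmd) with a file-descriptor path such as /dev/fd/63, which curl then opens and reads like any other file. A minimal illustration:
echo <(echo "hello")  # prints something like /dev/fd/63
cat <(echo "hello")   # prints: hello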

With a POSIX-only shell, process substitution can be replaced with explicitly created named pipes.
Background tasks stream the different keys and certificates to their dedicated named pipes.
#!/usr/bin/env sh
# These are the randomly named pipes
cacert_pipe=$(mktemp -u)
cert_pipe=$(mktemp -u)
key_pipe=$(mktemp -u)
# Remove the named pipes on exit
trap 'rm -f -- "$cacert_pipe" "$cert_pipe" "$key_pipe"' EXIT
# Create the named pipes
mkfifo -- "$cacert_pipe" "$cert_pipe" "$key_pipe" || exit 1
# Start background shells to stream data to the respective named pipes
vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
curl -v \
--cacert "$cacert_pipe" \
--cert "$cert_pipe" \
--key "$key_pipe" \
--location \
--request GET 'https://example.com/my.end.point'
The downside compared to Bash's process substitution method is that the streaming tasks need to be restarted explicitly before each call to curl that uses the named pipes as file arguments.
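If curl has to be called more than once, one option (a sketch reusing the pipe names from above; the function name fetch_endpoint is made up) is to wrap the writers and the curl invocation in a function, so the pipes are refilled on every call:
# Sketch: refill the pipes, then run curl; call fetch_endpoint once per request
fetch_endpoint() {
vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
curl -v \
--cacert "$cacert_pipe" \
--cert "$cert_pipe" \
--key "$key_pipe" \
--location \
--request GET 'https://example.com/my.end.point'
}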

Related

Passing key and pem file data in curl command not working

I am trying to invoke the API using the curl command, but unfortunately I am not able to keep the files. I am trying to pass the .key and .pem files' data to the command via variables, but I am not able to pass them correctly. Below is the command in my .sh file:
response=$(curl --key "$5" --cert "$6" -k -X "$2" -d "$payload" "$4")
I am calling the script this way:
key="${key}"
pem="${pem}"
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr "$key" "$pem"
It gives the below error:
curl: (58) could not load PEM client certificate from -----BEGIN CERTIFICATE-----
This command works fine if I pass the files directly, so is there any way to pass the data via string variables in the curl command?
If you only have your key data in a variable and can't write it to a file yourself for some reason, an alternative solution is to use process substitution.
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr \
<(printf '%s' "$key") \
<(printf '%s' "$pem")
This requires bash and still uses files under the hood, but it doesn't require you to manage the files yourself or know where they're located.
--key and --cert take the name of a file containing certificate data, not the certificate data itself.
... "$(cat my_client.key)" "$(cat my_client.pem)"
Should just be
... my_client.key my_client.pem
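If process substitution is not available (for example in plain POSIX sh), a fallback sketch, reusing the variables from the question, is a short-lived temporary file with restrictive permissions that is removed as soon as the script returns:
# Sketch: write the data to permission-restricted temp files, clean up afterwards
keyfile=$(mktemp) && chmod 600 "$keyfile"
pemfile=$(mktemp) && chmod 600 "$pemfile"
printf '%s\n' "$key" >"$keyfile"
printf '%s\n' "$pem" >"$pemfile"
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr "$keyfile" "$pemfile"
rm -f "$keyfile" "$pemfile"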

extract a value from the output of a script and store in a variable windows?

According to the document Get Azure AD tokens for service principals:
curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token \
-d 'client_id=<client-id>' \
-d 'grant_type=client_credentials' \
-d 'scope=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d%2F.default' \
-d 'client_secret=<client-secret>'
Now I get the correct output. The Azure AD access token is in the access_token value within the output of the call.
What I want is to get the value of access_token and store it in a variable, so that I can use it in subsequent REST API scripts.
But I'm not very familiar with Bash and curl; can anyone offer advice?
Use jq to extract access_token from the JSON, and VAR=$(...) to store it in a variable:
ACCESS_TOKEN=$(curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token \
-d 'client_id=<client-id>' \
-d 'grant_type=client_credentials' \
-d 'scope=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d%2F.default' \
-d 'client_secret=<client-secret>' \
| jq -r .access_token )
then you can use ACCESS_TOKEN like
curl -d access_token="$ACCESS_TOKEN"
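For a subsequent REST call, the token typically goes into an Authorization header; a sketch against a placeholder URL:
# Sketch: use the token as a Bearer credential (the URL is a placeholder)
curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://my.api.example/resource'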
But be wary: Bash is a poor language for complex scripting. Rather than implementing complex logic in Bash, you should probably switch to a better-suited scripting language such as Python, Perl, or PHP. (The same goes for Windows's cmd and PowerShell; all three are unsuitable for, though not incapable of, complex logic.)

Save the file with the name of the array key

I have a script that downloads files. Each link consists of two parts: a stable part hard-coded in the curl command and a second part stored in an associative array.
I want the script to download each file and name it after the corresponding key from my array.
declare -A metricssArray
metricssArray[key1]='some link'
metricssArray[key2]='some link'
curl --location --request GET 'First part of link'+${metricssArray[key1]} --output example.csv
I know that this will give a file named "example.csv":
--output example.csv
But I need the file to be named key1.csv, after the key from my array:
curl --location --request GET 'First part of link'+${metricssArray[key1]} --output example.csv
When I change to ${metricssArray[key2]} or key3 in the curl command, the file should be downloaded with the name key2.csv, and so on.
Or maybe I will download all of them, in which case I need to know which file belongs to which link/array key.
for key in "${!metricssArray[@]}"; do
# ........^..............^ iterates over the keys
curl --location \
--request GET \
--output "$key".csv \
-u name:password \
'first part of the link'+${metricssArray[$key]}
done
I have tried to put -u after the link, but I get a file that contains a 403 status code.
I'm not entirely certain what the question is.
If you're asking about how to iterate over the associative array, you can do this:
declare -A metricssArray
metricssArray[key1]='some link'
metricssArray[key2]='some link'
for key in "${!metricssArray[@]}"; do
# ........^..............^ iterates over the keys
curl --location \
--request GET \
--output "$key".csv \
'First part of link'+${metricssArray[$key]}
done
To build the URL, you might want:
# the array notation does not require line continuations
curl_opts=(
--location
--request GET
--user "username:password"
)
url_root=http://example.com/start/of/path
for key in "${!metricssArray[@]}"; do
url="${url_root}/${metricssArray[$key]}"
curl "${curl_opts[#]}" --output "$key".csv "$url"
done
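With hypothetical keys key1 and key2, this loop writes key1.csv and key2.csv, so every downloaded file can be matched back to its array key.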

Is it possible to tail Ansible AWX logs via curl?

I would like to tail the output of an Ansible job live via a single shell command in order to fill a log file in real time.
I've tried this command:
curl -f -k -N -H 'Content-Type: application/json' -XPOST \
--user admin:awxsecret \
http://192.168.42.100/api/v2/jobs/1620/
...but it only returns the output generated thus far, not waiting for newly-generated content.
As @charles-duffy said, "AWX does support websockets", so I will work with this solution.
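In the meantime, a crude polling workaround is possible; this sketch assumes the job stdout endpoint (/api/v2/jobs/<id>/stdout/) and the credentials from the question, and simply re-fetches the full output every few seconds:
# Sketch: overwrite job.log with the latest full output until interrupted
while :; do
curl -f -k -s --user admin:awxsecret \
'http://192.168.42.100/api/v2/jobs/1620/stdout/?format=txt' >job.log
sleep 5
done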

How to use data from a file in a curl command?

I have a text file containing FQDN and hostname pairs. The file looks like this:
first_fqnd first_hostname
second_fqdn second_hostname
..... .....
I have to update some data using curl in a bash script, taking the FQDN and hostname from this text file and making a curl call for every FQDN/hostname pair.
My curl should be like this:
curl -H "Content-Type:application/json" -XPUT "https://pup.pnet.pl/api/hosts/**fqdn from file**" -d '{"host":{"name": "**hostname from file**"}}' --cacert bundle.pem --cert xxx-pem.cer --key xxx-privkey.pem
How can I pass these values from the file to curl? I've thought about using awk, but I don't know how to use it within a curl command.
Use a while construct to read the file line by line, splitting each whitespace-separated pair into the two relevant variables, fqdn and hostn:
while read -r fqdn hostn; do
curl -H .... -XPUT "https://pup.pnet.pl/api/hosts/${fqdn}" \
-d '{"host":{"name": "'"${hostn}"'"}}' --cacert ....
done <file.txt
Try something like this:
#!/bin/bash
while read -r fqdn hostname; do
curl -H "Content-Type:application/json" -XPUT \
"https://pup.pnet.pl/api/hosts/${fqdn}" \
-d '{"host":{"name": "'"${hostname}"'"}}' --cacert bundle.pem \
--cert xxx-pem.cer --key xxx-privkey.pem
done <input_file.txt
The while read fqdn hostname loop takes input from standard input line by line, splitting each line on Bash's Internal Field Separator into the "column" variables $fqdn and $hostname. See Catching User Input for more information.
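If a hostname could ever contain characters that need JSON escaping, a safer variant (a sketch; jq is used the same way earlier on this page to parse a response) is to let jq build the request body instead of splicing strings into the quotes:
#!/bin/bash
# Sketch: build the JSON body with jq, then PUT it for every pair in the file
while read -r fqdn hostn; do
payload=$(jq -n --arg name "$hostn" '{host: {name: $name}}')
curl -H "Content-Type:application/json" -XPUT \
"https://pup.pnet.pl/api/hosts/${fqdn}" \
-d "$payload" --cacert bundle.pem \
--cert xxx-pem.cer --key xxx-privkey.pem
done <input_file.txt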
