Save the file with the name of the array key - bash

I have a script that downloads files. Each link consists of two parts: a stable prefix written directly in the curl command and a second part stored in an associative array.
So I want the script to download each file and name it after its key in the array.
declare -A metricssArray
metricssArray[key1]='some link'
metricssArray[key2]='some link'
curl --location --request GET 'First part of link'+${metricssArray[key1]} --output example.csv
I know that will give a file name "example.csv"
But I need to have a file with the name key1.csv as my key from the array
When I change it to ${metricssArray[key2]} or key3 in the curl command, the file should be downloaded with the name key2.csv, and so on.
Or maybe I'll download all of them, in which case I need to know which file belongs to which link/array key.
for key in "${!metricssArray[@]}"; do
# ........^..............^ iterates over the keys
curl --location \
--request GET \
--output "$key".csv \
-u name:password \
'first part of the link'+${metricssArray[$key]}
done
I have tried putting -u after the link, but then I get a file that contains a 403 status code.

I'm not entirely certain what the question is.
If you're asking about how to iterate over the associative array, you can do this:
declare -A metricssArray
metricssArray[key1]='some link'
metricssArray[key2]='some link'
for key in "${!metricssArray[@]}"; do
# ........^..............^ iterates over the keys
curl --location \
--request GET \
--output "$key".csv \
'First part of link'+${metricssArray[$key]}
done
To build the URL, you might want:
# the array notation does not require line continuations
curl_opts=(
--location
--request GET
--user "username:password"
)
url_root=http://example.com/start/of/path
for key in "${!metricssArray[@]}"; do
url="${url_root}/${metricssArray[$key]}"
curl "${curl_opts[@]}" --output "$key".csv "$url"
done
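Putting it together, here is a minimal end-to-end sketch; the base URL, paths, and credentials below are placeholders, not values from the question:
#!/usr/bin/env bash
declare -A metricssArray
metricssArray[key1]='path/to/first'    # placeholder path
metricssArray[key2]='path/to/second'   # placeholder path
curl_opts=(
  --location
  --request GET
  --user "username:password"           # placeholder credentials
)
url_root='https://example.com/metrics' # placeholder base URL
for key in "${!metricssArray[@]}"; do
  # each download is written to a file named after its array key
  curl "${curl_opts[@]}" --output "${key}.csv" "${url_root}/${metricssArray[$key]}"
done
Each file lands as key1.csv, key2.csv, and so on, which also answers the "which file belongs to which key" part of the question.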

Related

Bash Script CURL Request a File Getting Failed to open/read local data from file/application

I have a bash script that loops through all files in a directory and makes a curl request to some URL
for FILE in ./files/*;
do
echo "Making a request..."
echo $FILE
curl --location --request POST "${URL}" \
--form 'file=@"${FILE}"' \
sleep 100
done
echo "done!"
The curl request was copied from postman so I'm confident that it works.
When I run the script, I get the following
Making a request...
./files/split1.csv
curl: (26) Failed to open/read local data from file/application
The issue seems to be how to handle string interpolation here.
You are very close, just remove the single quotes to get the string interpolation to work.
curl --location --request POST "${URL}" \
--form file=@"${FILE}"
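Applied to the full loop from the question, that gives something like the sketch below; note the trailing backslash after --form must go, otherwise the sleep line is glued onto the curl command as extra arguments:
for FILE in ./files/*;
do
  echo "Making a request..."
  echo "$FILE"
  curl --location --request POST "${URL}" \
    --form file=@"${FILE}"
  sleep 100
done
echo "done!"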

bash script to convert string value to json and then return json

I am very new to bash scripting, please can someone point me in the right direction on how to accomplish my task?
I have a curl call which returns a string, which I want to convert to json.
My curl statement -
curl --insecure -X POST 'https://url/api/IPAM/GetIP' --header 'Content-Type: application/json' -d '{"key1": "value1"}'
This curl statement returns a plain string, for example: 10.100.100.100
I want to fetch this string and return the output in json format:
{"IP":"10.100.100.100"}
I don't want to use jquery or python to do this because this entire script will be run by a wrapper that only understands bash.
You can use jq to process your IP string into a JSON string and package it into a JSON object of your choice.
ip="10.100.100.100"
jq --arg ip "$ip" -cn '{"IP":$ip}'
Result:
{"IP":"10.100.100.100"}
Now if working with the result of your example curl POST request:
rawip_string=$(curl --insecure -X POST 'https://url/api/IPAM/GetIP' --header 'Content-Type: application/json' -d '{"key1": "value1"}')
jq --arg ip "$rawip_string" -cn '{"IP":$ip}'
If you'd rather not rely on external tools like jq, you can capture the IP in a variable and concatenate it into the JSON yourself:
$ return=$(echo 10.100.100.100)
$ echo "{\"IP\":\"${return}\"}"
{"IP":"10.100.100.100"}
Like this
printf '{"IP":"%s"}' "$(curl --insecure -X POST 'https://url/api/IPAM/GetIP' --header 'Content-Type: application/json' -d '{"key1": "value1"}')"
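If this is called from more than one place, a small wrapper function keeps the call sites tidy. A sketch reusing the question's endpoint and payload; note that command substitution already strips the trailing newline from curl's output:
get_ip_as_json() {
  local ip
  ip=$(curl --insecure -X POST 'https://url/api/IPAM/GetIP' \
    --header 'Content-Type: application/json' \
    -d '{"key1": "value1"}')
  printf '{"IP":"%s"}' "$ip"
}
get_ip_as_json   # prints {"IP":"10.100.100.100"}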

Passing certs to curl from an environment variable

I am working on a CORS monitor for my application, and need to do the following:
Download certificates from our Hashicorp Vault instance
Curl an endpoint using those certs
Evaluate the response
I can do this currently as follows:
vault kv get -field crt my/cert/path/CACERT > CACERT.crt
vault kv get -field crt my/cert/path/TESTCERT > CERT.crt
vault kv get -field key my/cert/path/TESTCERT > KEY.key
curl -v \
--cacert CACERT.crt \
--cert CERT.crt \
--key KEY.key \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
rm CACERT.crt CERT.crt KEY.key
While this works, I would prefer to not write the certs to a file, and would rather keep them in memory, such as using environment variables or maybe some bash-isms I'm not aware of.
In my mind it would look something more like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert $CACERT \
--cert $CERT \
--key $KEY \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
Obviously this won't work, as curl expects file paths, but for illustrative purposes: is this possible? If so, is it a poor approach? Perhaps there is another way to approach this that I haven't considered? I am aware that I could do the above with Python relatively easily, however I'd prefer to stick to bash + curl if at all possible.
Bash supports process substitution using the <(cmd) syntax. This causes the output of cmd to be substituted with a filename. This allows you to pass command output as an argument where a filename is expected.
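As a quick illustration of the mechanism (my own example, not from the original answer), the substituted filename is typically a /dev/fd path that the receiving program simply opens and reads:
wc -l <(printf 'a\nb\n')
# prints something like: 2 /dev/fd/63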
In your example, you would do something like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert <(echo "$CACERT") \
--cert <(echo "$CERT") \
--key <(echo "$KEY") \
--location \
--request GET 'https://my.end.point'
With a POSIX-only shell, it is possible to replace process substitution with explicitly created named pipes.
Background tasks stream the different keys and certificates to their dedicated named pipes.
#!/usr/bin/env sh
# These are the randomly named pipes
cacert_pipe=$(mktemp -u)
cert_pipe=$(mktemp -u)
key_pipe=$(mktemp -u)
# Remove the named pipes on exit
trap 'rm -f -- "$cacert_pipe" "$cert_pipe" "$key_pipe"' EXIT
# Create the named pipes
mkfifo -- "$cacert_pipe" "$cert_pipe" "$key_pipe" || exit 1
# Start background shells to stream data to the respective named pipes
vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
curl -v \
--cacert "$cacert_pipe" \
--cert "$cert_pipe" \
--key "$key_pipe" \
--location \
--request GET 'https://example.com/my.end.point'
The downside compared with Bash's process substitution method is that the streaming tasks need to be restarted explicitly before each curl call that uses the named pipes as file arguments.
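So for a second request with the same pipes, the writers have to be relaunched first; a sketch under the same setup as above:
# each pipe is drained by the first curl, so refill all three before reusing
vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
curl -v \
--cacert "$cacert_pipe" \
--cert "$cert_pipe" \
--key "$key_pipe" \
--location \
--request GET 'https://example.com/my.end.point'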

For loop from file with multiple columns-BASH

I have the following text file (output.txt):
TECH-746 TECH 10400
TECH-747 TECH 10400
I need to read all columns into 3 variables and then pass them to a curl command. A simple while read won't work with curl (I don't know why), so I need to use a for loop (which does work with curl). Can I use one for loop, or do I need to nest multiple ones?
for project in `cat output.txt`; do
echo $project
curl -D- -u user:pass -X POST --data "{\"fields\":{\"project\":{\"key\":\"TECH\"},\"parent\":{\"key\":\"$project\"},\"summary\":\"test",\"description\":\"test.\",\"issuetype\":{\"name\":\"Sub-task\"}}}" -H "Content-Type:application/json" --silent https://jira.company.com/rest/api/latest/issue/ >/dev/null
The code above works, so I just want to extend it to include the other columns in the file.
while read can pull a line into distinct variables.
while read -r project col2 col3
do
curl -D- -u user:pass -X POST --data "{\"fields\":{\"project\":{\"key\":\"TECH\"},\"parent\":{\"key\":\"$project\"},\"summary\":\"test\",\"description\":\"test.\",\"issuetype\":{\"name\":\"Sub-task\"}}}" -H "Content-Type:application/json" --silent https://jira.company.com/rest/api/latest/issue/ >/dev/null
done < output.txt
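As an aside, a common reason while read loops appear to break with commands like curl is the command consuming the loop's stdin. That is a general shell pattern rather than something the original answer stated, but redirecting the command's stdin from /dev/null rules it out:
while read -r project col2 col3
do
    # </dev/null stops curl from reading (and draining) the loop's input
    curl --silent https://jira.company.com/rest/api/latest/issue/ </dev/null >/dev/null
done < output.txt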
EDIT:
The curl command also missed escaping one quote; compare the two lines below. The first is the original, the second is the corrected one:
curl -D- -u user:pass -X POST --data "{\"fields\":{\"project\":{\"key\":\"TECH\"},\"parent\":{\"key\":\"$project\"},\"summary\":\"test",\"description\":\"test.\",\"issuetype\":{\"name\":\"Sub-task\"}}}" -H "Content-Type:application/json" --silent https://jira.company.com/rest/api/latest/issue/ >/dev/null
curl -D- -u user:pass -X POST --data "{\"fields\":{\"project\":{\"key\":\"TECH\"},\"parent\":{\"key\":\"$project\"},\"summary\":\"test\",\"description\":\"test.\",\"issuetype\":{\"name\":\"Sub-task\"}}}" -H "Content-Type:application/json" --silent https://jira.company.com/rest/api/latest/issue/ >/dev/null
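To actually use all three columns, reference the extra variables in the payload. A sketch only: where each value belongs depends on your JIRA schema, so the placement of $col2 and $col3 here is illustrative, not prescribed by the original answer:
while read -r project col2 col3
do
curl -D- -u user:pass -X POST --data "{\"fields\":{\"project\":{\"key\":\"$col2\"},\"parent\":{\"key\":\"$project\"},\"summary\":\"test $col3\",\"description\":\"test.\",\"issuetype\":{\"name\":\"Sub-task\"}}}" -H "Content-Type:application/json" --silent https://jira.company.com/rest/api/latest/issue/ >/dev/null
done < output.txt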

POST multiple files with -d in curl

I'm using curl to create several classifications. I have written the JSON for the classifications, and the files are all in one folder. I would like to create all the classifications in one go, but with curl I can only create them one at a time. How can I do it in one request?
curl -u admin:admin -H "Content-Type: application/json" -X POST -d @pii.json http://127.0.0.1:21000/api/atlas/v2/types/typedefs
The curl manual for -d says 'Multiple files can also be specified'. How can I do this? All my attempts have failed.
Do I need a bash script instead? If so, could you help me - I'm not a coder and I'm struggling without an example!
Thanks in advance.
You probably don't want to use multiple -d with JSON data since curl concatenates multiple ones with a & in between. As described in the man page for -d/--data:
If any of these options is used more than once on the same command line, the data pieces specified will be merged together with a separating &-symbol. Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that looks like 'name=daniel&skill=lousy'.
You can however easily and conveniently pass several files on stdin to let curl use them all in one go:
cat a.json b.json c.json | curl -d @- -u admin:admin -H "Content-Type: application/json" http://127.0.0.1:21000/api/atlas/v2/types/typedefs
(please note that -X POST has no place on a command line that uses -d)
I found the following to work in the end:
<fileToUpload.dat xargs -I % curl -X POST -T "{%}" -u admin:admin -H "Content-Type: application/json" http://127.0.0.1:21000/api/atlas/v2/types/typedefs
Where fileToUpload.dat contained a list of the .json files.
This worked where Daniel's answer didn't, probably due to the contents of my files. Hopefully it is useful to others if Daniel's solution doesn't work for them.
I needed to upload all the *.json files from a folder via curl and I made this little script.
nfiles=*.json
echo "Enter user:"
read -r user
echo "Enter password:"
read -rs password
for file in $nfiles
do
echo -e "\n----$file----"
curl --user "$user:$password" -i -X POST "https://foo.bar/foo/bar" -H "Content-Type: application/json" -d "@$file"
done
Maybe it fits your needs.
