Passing key and pem file data in curl command not working - bash

I am trying to invoke the API using the curl command, but unfortunately I am not able to keep the key and certificate as files. I am trying to pass the .key and .pem files' data in the command, but I am not able to pass it correctly. Below is my command in my .sh file:
response=$(curl --key "$5" --cert "$6" -k -X "$2" -d "$payload" "$4")
I am calling the script the following way:
key="${key}"
pem="${pem}"
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr "$key" "$pem"
It gives the following error:
curl: (58) could not load PEM client certificate from -----BEGIN CERTIFICATE-----
This command works fine if I pass the files directly, so is there any way to pass the data via string variables in the curl command?

If you only have your key data in a variable and can't write it to a file yourself for some reason, an alternative solution is to use process substitution.
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr \
<(printf '%s' "$key") \
<(printf '%s' "$pem")
This requires bash and still uses files under the hood, but it doesn't require you to manage the files yourself or know where they're located.
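Under the hood, each <(...) expands to a file-descriptor path that curl opens like a regular file. For illustration only (the exact path varies by system; with bash on Linux it is typically something like /dev/fd/63):
echo <(printf '%s' "$key")
/dev/fd/63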

--key and --cert take the name of a file containing certificate data, not the certificate data itself.
... "$(cat my_client.key)" "$(cat my_client.pem)"
Should just be
... my_client.key my_client.pem
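If you really do only have the data in variables and process substitution is not an option, a temp-file sketch along these lines should work (the mktemp names and the trap are illustrative; the positional arguments match the original script):
keyfile=$(mktemp)
pemfile=$(mktemp)
trap 'rm -f "$keyfile" "$pemfile"' EXIT  # clean up the files when the script exits
printf '%s\n' "$key" > "$keyfile"
printf '%s\n' "$pem" > "$pemfile"
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr "$keyfile" "$pemfile"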

Related

CURL request fails with error msg "failed to decrypt request"

I have implemented a curl script to execute an API call which requires an encrypted string as data, but it fails with 'failed to decrypt request'. I have tried with Python as well (requests and Crypto modules) but failed with the same error message.
#!/usr/bin/env bash
output=$(openssl enc -aes-256-cbc -pass pass:xxxxxxx -in msg.txt -base64)
curl -X POST \
"https://api.swing.tradesmartonline.in/api/v1/login-token" \
-H "Content-Type: application/json" \
-d '{"data":"$output","app":"APPLICATION_ID"}'
Output is: {"status":"false","data":"","error_msg":"failed to decrypt
request."}
Whereas I am able to decrypt the message locally using the command below:
echo "$output" | openssl enc -aes-256-cbc -base64 -pass pass:xxxxxxx -d
Thanks in advance
The shell will not expand your variables in a single quoted string. Either:
Use double quotes and escape embedded double quotes with \":
"{\"data\":\"$output\",\"app\":\"APPLICATION_ID\"}"
Use single quotes for static text and double quotes for the parts that contain variables (this means less escaping):
'{"data":"'"$output"'","app":"APPLICATION_ID"}'

Passing certs to curl from an environment variable

I am working on a CORS monitor for my application, and need to do the following:
Download certificates from our Hashicorp Vault instance
Curl an endpoint using those certs
Evaluate the response
I can do this currently as follows:
vault kv get -field crt my/cert/path/CACERT > CACERT.crt
vault kv get -field crt my/cert/path/TESTCERT > CERT.crt
vault kv get -field key my/cert/path/TESTCERT > KEY.key
curl -v \
--cacert CACERT.crt \
--cert CERT.crt \
--key KEY.key \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
rm CACERT.crt CERT.crt KEY.key
While this works, I would prefer to not write the certs to a file, and would rather keep them in memory, such as using environment variables or maybe some bash-isms I'm not aware of.
In my mind it would look something more like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert $CACERT \
--cert $CERT \
--key $KEY \
--location \
--request GET 'https://my.end.point'
# <evaluate here>
Obviously this won't work, as curl is expecting file paths, but for illustrative purposes: is this possible? If possible, is it a poor approach? Perhaps there is another way to approach this that I haven't considered? I am aware that I could do the above with Python relatively easily; however, I'd prefer to stick to bash + curl if at all possible.
Bash supports process substitution using the <(cmd) syntax. This causes the output of cmd to be substituted with a filename. This allows you to pass command output as an argument where a filename is expected.
In your example, you would do something like this:
CACERT=$(vault kv get -field crt my/cert/path/CACERT)
CERT=$(vault kv get -field crt my/cert/path/TESTCERT)
KEY=$(vault kv get -field key my/cert/path/TESTCERT)
curl -v \
--cacert <(echo "$CACERT") \
--cert <(echo "$CERT") \
--key <(echo "$KEY") \
--location \
--request GET 'https://my.end.point'
With a POSIX-only shell, process substitution can be replaced with explicitly created named pipes: background tasks stream the keys and certificates to their dedicated named pipes.
#!/usr/bin/env sh
# Generate random (not yet created) file names for the pipes
cacert_pipe=$(mktemp -u)
cert_pipe=$(mktemp -u)
key_pipe=$(mktemp -u)
# Remove the named pipes on exit
trap 'rm -f -- "$cacert_pipe" "$cert_pipe" "$key_pipe"' EXIT
# Create the named pipes
mkfifo -- "$cacert_pipe" "$cert_pipe" "$key_pipe" || exit 1
# Start background shells to stream data to the respective named pipes
vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
curl -v \
--cacert "$cacert_pipe" \
--cert "$cert_pipe" \
--key "$key_pipe" \
--location \
--request GET 'https://example.com/my.end.point'
The downside compared to Bash's process substitution method is that the streaming tasks need to be restarted explicitly before each call to curl that uses the named pipes as file arguments.
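For repeated requests, the writers can be wrapped in a small helper that refills the pipes before every call; a sketch reusing the pipes created above:
# Restart the background writers; each pipe delivers its data to exactly one read
stream_certs() {
    vault kv get -field crt my/cert/path/CACERT >"$cacert_pipe" &
    vault kv get -field crt my/cert/path/TESTCERT >"$cert_pipe" &
    vault kv get -field key my/cert/path/TESTCERT >"$key_pipe" &
}
stream_certs  # must run again before every curl invocation
curl --cacert "$cacert_pipe" --cert "$cert_pipe" --key "$key_pipe" 'https://example.com/my.end.point'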

Sending file using CURL in windows

I'm trying to send a file using curl in Windows.
Here's the command I'm using:
C:\curl>curl -X POST -F chat_id=@telegramchannel -F photo=@IMAGE.png https://api.telegram.org/bot812312342:XXXXXXXXXXXXXXXXXXXXXX/sendPhoto
and I keep getting this error:
curl: (26) Failed to open/read local data from file/application
Does anybody know how to solve it, and how to use -F properly with files on Windows?
Thanks
If telegramchannel is not a file, then you have to escape @ with a backslash or use single quotes to encapsulate the content, as @ has special meaning in the curl -F context.
Either
curl -X POST -F chat_id='@telegramchannel' -F photo=@IMAGE.png https://api.telegram.org/bot812312342:XXXXXXXXXXXXXXXXXXXXXX/sendPhoto
or
curl -X POST -F chat_id=\@telegramchannel -F photo=@IMAGE.png https://api.telegram.org/bot812312342:XXXXXXXXXXXXXXXXXXXXXX/sendPhoto
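Alternatively, curl's --form-string option avoids the issue entirely: it always treats the value as a literal string, so @ and < get no special handling:
curl -X POST --form-string chat_id=@telegramchannel -F photo=@IMAGE.png https://api.telegram.org/bot812312342:XXXXXXXXXXXXXXXXXXXXXX/sendPhoto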

How to pass "escape hell" to docker run

I'm trying to pass several commands to an alpine container from PowerShell/cmd. The problem is that the commands contain quotes, single quotes, and escaped quotes.
Sample command (the actual command is one line; I'm splitting it here so it is easier to look at):
docker run neilpang/acme.sh bin/sh -c --%
"acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain";
openssl pkcs12 -inkey /acme.sh/domain/domain.key -in /acme.sh/domain/domain.cer -export -out pfx -password pass:password;
curl -X POST -d "{\"certName\":\"certName1\",\"certData\":\"$(tail -n +2 /acme.sh/domain/domain.cer > b64; head -n -1 b64)\"}" url -H Content-Type:application/json;
curl -X POST -d "{\"certName\":\"certName2\",\"certData\":\"$(openssl base64 -in pfx -out b64; cat b64)\"}" url -H Content-Type:application/json"
What I've tried:
Using single quotes around the whole expression (after --%) returns something like quoted string not closed
Using single quotes around the JSON works fine, but $() doesn't get interpreted, so I don't get proper output
Escaping " before/after the JSON
Without " before/after the JSON, results in some garbage
I never got it to work without --%, but it appears PS does some string manipulations even when using --%
I did some experiments with cmd, no luck either.
I'm open to any way of making this work :)
For those requesting errors, here's a sample:
-n: line 1: syntax error: unexpected end of file (expecting ")") # quotes around expression
-p: line 1: syntax error: unterminated quoted string # single quotes around expression
no error, but certData empty # no quotes around json
no quotes in json >> not valid json >> doesnt get parsed # escaped quotes around json
Executing the strings I'm trying to pass inside the container results in a working "solution", whereas trying to pass them through docker run results in the different defects described above. I'm not sure where these arise: from PowerShell (although I firmly believe --% does the work here and PowerShell passes everything properly), from docker, or from bash itself. The closest I got to working is the ' version, but it doesn't evaluate the expression inside $(), so that's the problem.
You'll find the quoting issues significantly easier if you turn your very long, very involved command line into a shell script, package it into a custom image, and then run that.
Shell script, let's call it run_acme.sh:
#!/bin/sh
acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain"
openssl pkcs12 \
-inkey /acme.sh/domain/domain.key \
-in /acme.sh/domain/domain.cer \
-export \
-out pfx \
-password pass:password
CER_B64=$(tail -n +2 /acme.sh/domain/domain.cer | head -n -1)
curl -X POST \
--data-binary "{\"certName\":\"certName1\",\"certData\":\"$CER_B64\"}" \
-H 'Content-Type: application/json' \
url
PFX_B64=$(openssl base64 -in pfx)
curl -X POST \
--data-binary "{\"certName\":\"certName1\",\"certData\":\"$PFX_B64\"}" \
-H 'Content-Type: application/json'
url
Dockerfile:
FROM neilpang/acme.sh
COPY run_acme.sh /bin
CMD ["/bin/run_acme.sh"]
Then you can docker build this like any other custom image, and docker run it, and the very long command line will be baked into the shell script in your image.
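For completeness, the build-and-run steps would look something like this (my-acme is a hypothetical tag; the chmod makes sure the script is executable, since COPY preserves the file's permissions):
chmod +x run_acme.sh
docker build -t my-acme .
docker run my-acme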
I'd suggest using splatting to make this significantly cleaner to maintain. Note: I'm uncertain whether your shell script needs to escape the quotes around the -d parameter.
$curl1 = '"{\"certName\":\"certName1\",' +
'\"certData\":\"$(tail -n +2 /acme.sh/domain/domain.cer > b64; head -n -1 b64)\"}"'
$curl2 = '"{\"certName\":\"certName2\",' +
'\"certData\":\"$(openssl base64 -in pfx -out b64; cat b64)\"}"'
$dockerArgs = @(
'run', 'neilpang/acme.sh', 'bin/sh', '-c'
'"acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain";' +
'openssl pkcs12 -inkey /acme.sh/domain/domain.key -in /acme.sh/domain/domain.cer ' +
'-export -out pfx -password pass:password;' +
"curl -X POST -d $curl1 url -H Content-Type:application/json;" +
"curl -X POST -d $curl2 url -H Content-Type:application/json"""
)
docker.exe @dockerArgs
The PowerShell escape character is a grave accent (`), or you can double up on the quotes ("") to escape double quotes.

File path pointing to variable in bash script

I wonder if it is possible to provide the file content itself instead of a file path. For example, curl needs a file path for the certificate; what can I do when I have the certificate in a variable?
I can't put it on stdin, because I'm generating XML to send there.
Why do I put the certificate into a variable? Because in the end I'd like to have just one file with the script, which I can easily distribute to my colleagues.
The only solution which comes to my mind is to save the variable to a file and then provide the file path to curl:
read -d '' CERT << "CERTEND"
-----BEGIN CERTIFICATE-----
### something very secret here ###
-----END CERTIFICATE-----
CERTEND
curl -# -k --cert ??? -d @xml.xml -H "Content-Type: application/soap+xml" -H 'SOAPAction: ""' https://1.1.1.1/EndpointPort
You can use process substitution:
curl --cert <(echo "$CERT") ...
which creates a pipe for you. The output of the command in parentheses (echo "$CERT" in our case) goes to this pipe, and the filename of the pipe is substituted as the argument.
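Putting it together with the script from the question, a minimal sketch (printf is used instead of echo to avoid any surprises with flag interpretation):
read -d '' CERT << "CERTEND"
-----BEGIN CERTIFICATE-----
### something very secret here ###
-----END CERTIFICATE-----
CERTEND
curl -# -k --cert <(printf '%s\n' "$CERT") -d @xml.xml -H "Content-Type: application/soap+xml" -H 'SOAPAction: ""' https://1.1.1.1/EndpointPort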
