Escape newline character while encoding in base64 - bash

I have a situation where in my BASH script, I need to encode an environment variable to base64 and write it to a json file, which is then picked up by docker.
I have the below code for this:
USER64=$(echo $USER | base64)
echo '{"auths": {"auth": "'"$USER64"'"}}' > ~/.docker/config.json
This works, but the problem is the encoded value of $USER contains a \n so the echo writes it into the config file as 2 lines. How can I escape all the \n while encoding the $USER and write it to the config file?

As added incentive to use jq, it can do the base64 encoding for you:
jq -n --arg u "$USER" '{auths: {auth: ($u | @base64)}}' > ~/.docker/config.json
(And as far as I can tell, @base64 works on the original value of $USER, not its JSON-encoded value.)
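For example, with a hypothetical value standing in for $USER (the filter is spelled @base64, and -c requests compact one-line output):

```shell
# @base64 encodes the raw string bound to $u; jq handles all the JSON
# quoting, so quotes or backslashes in the value are safe too.
jq -nc --arg u "someuser" '{auths: {auth: ($u | @base64)}}'
# → {"auths":{"auth":"c29tZXVzZXI="}}
```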

You can use the substitution operator in shell parameter expansion (the // form replaces every occurrence):
echo '{"auths": {"auth": "'"${USER64//$'\n'/\\n}"'"}}' > ~/.docker/config.json
But you can also use an option to base64 to prevent it from putting newlines into the encoding in the first place.
USER64=$(echo "$USER" | base64 --wrap=0)
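To see the difference, here is a sketch with a made-up value long enough to trigger GNU base64's default 76-column wrapping (note that --wrap=0/-w0 is GNU coreutils; the macOS base64 does not wrap by default and lacks the option). printf '%s' is used in place of echo so that no trailing newline gets encoded into the value itself:

```shell
# 90 input bytes encode to 120 base64 characters, which the default
# 76-column wrapping splits across two lines.
value=$(printf 'x%.0s' $(seq 1 90))

wrapped=$(printf '%s' "$value" | base64)
single=$(printf '%s' "$value" | base64 --wrap=0)

printf '%s\n' "$wrapped" | wc -l   # 2 lines
printf '%s\n' "$single" | wc -l    # 1 line
```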

Related

Keep the trailing newlines when reading a PEM with yq

I have the following use case: I need to read PEM keys from a YAML file using yq v4. It's important to keep the trailing newlines, otherwise a future read of those PEM keys will fail.
I haven't found a way in Bash to read a PEM from a YAML file and store it in a variable while keeping the trailing newlines.
Naturally, if I used $(), Bash would remove the trailing newlines.
Do you have any other idea?
I seriously doubt that you genuinely need to do this (see comments on the question), but using a process substitution to feed input to the read command (configured to expect end-of-input to be signified by a NUL rather than a newline) will work:
IFS='' read -r -d '' input < <(yq ... && printf '\0')
Be sure you check stored contents with echo "$input" or declare -p input, not echo $input. (That's true in the command-substitution case too).
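A runnable sketch of the difference, with a stub function standing in for the yq call:

```shell
#!/bin/bash
# Stub standing in for `yq ...`: emits data that ends in two newlines.
get_pem() { printf -- '-----END KEY-----\n\n'; }

# Append a NUL after the real output; read -d '' consumes everything
# up to that NUL, so the trailing newlines survive.
IFS='' read -r -d '' input < <(get_pem && printf '\0')

stripped=$(get_pem)   # command substitution strips trailing newlines

echo "${#input}"      # 19 bytes: the text plus both newlines
echo "${#stripped}"   # 17 bytes: trailing newlines removed
```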

curl request for login with password having special character in bash script?

I'm reading password from input file and login with that password.
However, if the password has special characters, it fails.
How can I make it work for passwords with or without special characters?
I have tried '$pass', but it doesn't seem to work.
ip="$(echo $line | awk -F ',' '{print $1}')"
pass="$(echo $line | awk -F ',' '{print $3}')"
output_json="$(curl -u admin:'$pass' -X GET -H "Content-Type:application/json" https://$ip:443/admin -k)"
You should always quote variable expansions.
Use single quotes to disable variable expansions and other special characters.
Some issues with your current code:
echo $line is not properly quoted, and will break on whitespace and other special characters; use echo "$line"
As @GordonDavisson suggested in the comments, printf '%s\n' "$line" would actually be safer than echo, which may not work correctly depending on the contents of $line
admin:'$pass' will resolve to the literal characters admin:$pass being passed to curl; use "admin:${pass}"
https://$ip:443/admin is also not properly quoted, use "https://${ip}:443/admin"
If $line is being set with a literal password in the script you'll want single-quotes to have the shell ignore special characters; line='...,sbxy$sT_i7d6I*7'
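Putting those fixes together (the CSV line and password below are made up for illustration; the curl call itself is shown but not executed):

```shell
#!/bin/sh
line='10.0.0.1,admin,p@ss!w0rd'   # hypothetical ip,user,password record

# printf '%s\n' is safer than echo for arbitrary data, and every
# expansion is double-quoted so special characters pass through intact.
ip="$(printf '%s\n' "$line" | awk -F ',' '{print $1}')"
pass="$(printf '%s\n' "$line" | awk -F ',' '{print $3}')"

echo "$ip"     # 10.0.0.1
echo "$pass"   # p@ss!w0rd

# The request then becomes (double quotes so $pass and $ip expand):
#   output_json="$(curl -u "admin:${pass}" -X GET \
#       -H "Content-Type: application/json" -k "https://${ip}:443/admin")"
```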

Get base64 version of password

I have this command:
echo -n "cdt_main!@#$" | base64
that fails with:
bash: !@#$": event not found
lulz b/c of the special characters.
So my best guess is that this is correct:
echo -n "cdt_main\!\@#\$" | base64
at least I assume so, because there is no error now. Unfortunately I cannot test the base64 version of the password until I know that it's right - I have only one chance to get it right, otherwise something will blow up. Does that look right to you?
Using the decoding trick, I have:
echo -n "cdt_main\!\@#\$" | base64 | base64 --decode
which yields:
cdt_main\!\@#$
Given that output, I'm not sure this is working as planned, because backslashes are in there.
The problem is that bash's history expansion is turned on. Unless history expansion is something you actively want and use, the solution is to turn it off.
Observe that this fails with event not found:
$ echo -n "cdt_main!@#$" | base64
bash: !@#: event not found
But, this succeeds:
$ set +H
$ echo -n "cdt_main!@#$" | base64
Y2R0X21haW4hQCMk
set +H turns off history expansion.
Alternative: use single quotes
For strings inside double quotes, the shell will perform a wide variety of expansions. To prevent that, put strings instead in single quotes.
For example, the following succeeds even with history expansion turned on:
$ set -H
$ echo -n 'cdt_main!@#$' | base64
Y2R0X21haW4hQCMk
Portable approach
The -n option to echo is not supported by all shells. For a more portable solution, one should, as Gordon Davisson suggests, use printf:
printf "%s" 'cdt_main!@#$' | base64
@John1024 is right, but an even more generic solution is to use:
echo -n 'cdt_main!@#$' | base64
Single quotes (') mean the characters in the string aren't interpreted by bash, which is very nice.
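Since there is only one chance to get this right, a round-trip check is cheap reassurance. In a non-interactive script, history expansion is off anyway, so the single-quoted form needs no escaping:

```shell
#!/bin/sh
password='cdt_main!@#$'   # single quotes: no expansion, no escaping

encoded=$(printf '%s' "$password" | base64)
decoded=$(printf '%s' "$encoded" | base64 --decode)

echo "$encoded"   # Y2R0X21haW4hQCMk

# Only trust the encoding if it decodes back to the original.
[ "$decoded" = "$password" ] && echo 'round trip OK'
```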

Mac Dirname of String From Variable

In my bash_profile I created a deploy function. There I read a path from a file:
deployPath=$(jq '.["prod-deploy-path"]' ./package.json)
When I print the path, I get the following:
"/var/www/user/websitefolder/dist"
Now I want to remove the last part of the URL:
pwdPath=$(dirname $deployPath)
but I receive
"/var/www/user/websitefolder/dist
(notice the last quotation mark is missing)
If I do the same thing with a fixed string instead of the variable
dirname "/var/www/user/websitefolder/dist"
I receive what I want to receive:
/var/www/user/websitefolder
What do I need to write in my bash_profile to get the same result with the variable as with a string?
You will need to use the -r option for jq:
deployPath=$(jq -r '.["prod-deploy-path"]' ./package.json)
which will output a raw string.
Your problem is not that you used a variable, but that the content of the variable is wrapped in quotes, e.g.:
with_quotes='"something here"'
without_quotes='something here'
echo "$with_quotes" # "something here"
echo "$without_quotes" # something here
jq will by default print a JSON encoded string. This also means that if your string contains double quotes they will be escaped like this: "hello \"world"
From man jq:
--raw-output / -r
With this option, if the filter's result is a string then it will be written
directly to standard output rather than being formatted as a JSON string
with quotes. This can be useful for making jq filters talk to non-JSON-based
systems
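A self-contained sketch (writing a throwaway package.json so the jq calls have something to read):

```shell
#!/bin/sh
# Throwaway input file mirroring the question's package.json key.
cat > /tmp/pkg.json <<'EOF'
{ "prod-deploy-path": "/var/www/user/websitefolder/dist" }
EOF

json=$(jq '.["prod-deploy-path"]' /tmp/pkg.json)    # JSON string, quotes kept
raw=$(jq -r '.["prod-deploy-path"]' /tmp/pkg.json)  # raw string, no quotes

echo "$json"      # "/var/www/user/websitefolder/dist"
echo "$raw"       # /var/www/user/websitefolder/dist
dirname "$raw"    # /var/www/user/websitefolder
```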

Saving backslash-escaped characters to variable in bash

I've just written a bash script that takes some info from the mysql database and reads it line by line, extracting tab-separated columns into separate variables, something like this:
oldifs=$IFS
result=result.txt
$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server > $result
cat $result | grep -e ^[0-9].*$ | while IFS=$'\t' read id foo bar baz
do
# some code
done
IFS=$oldifs
Now, while this works OK and I'm satisfied with the result (especially since I'm going to move the query to another script and let cron regenerate the result.txt file contents once a week or so, since I'm dealing with a table that changes maybe once or twice a year), I'm curious about the possibility of putting the query's result in a variable instead of a file.
I have noticed that in order to echo out backslash-escaped characters, I need to tell the command explicitly to interpret such characters as special chars:
echo -e "some\tstring\n"
But, being a bash noob that I am, I have no idea how to place the backslash escaped characters (the tabs and newlines from the query) inside a variable and just work with it the same way I'm working with the external file (just changing the cat with echo -e). I tried this:
result=`$mysql -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server`
but the backslash escaped characters are converted into spaces this way :(. How can I make it work?
To get the output of a command, use $(...). To avoid word splitting and other bash processing of the result, you will need to quote it. Single quotes ('$(...)') will not work, since they prevent the substitution from happening at all.
Note that once the output is in your variable, you will probably need to (double) quote it wherever you use it if you need to preserve anything that's in $IFS.
$ listing="$(ls -l)"
$ echo "$listing"
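Applied to the question: once the (double-quoted) command substitution has captured the query output, a here-string can replace the temporary file, and the tabs and newlines survive. A sketch with printf standing in for the mysql call:

```shell
#!/bin/bash
# Stand-in for $(mysql -e "SELECT id,foo,bar,baz ..."): tab-separated rows.
result=$(printf '1\tfoo1\tbar1\tbaz1\n2\tfoo2\tbar2\tbaz2\n')

# "$result" keeps the tabs/newlines; <<< feeds it to the loop in place
# of `cat result.txt`. IFS=$'\t' applies to `read` alone, and -r keeps
# backslashes literal.
while IFS=$'\t' read -r id foo bar baz; do
    echo "id=$id baz=$baz"
done <<< "$result"
# id=1 baz=baz1
# id=2 baz=baz2
```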
Could you try to set double quotes around $result - thus echo -e "$result"?
% awk '/^[0-9]/ { print $2, $3, $4, $5 }' <<SQL | set -- -
> $("${mysql}" -e "SELECT id,foo,bar,baz FROM $db.$table" -u $user --password=$pass -h $server)
> SQL
% printf '%s\t' "${@}"
<id> <foo> <bar> <baz>
You might get some use out of this. The heredoc should obviate any escaping issues, awk will separate on tabs by default, and set accepts the input as a builtin argv array. printf isn't necessary, but it's better than echo - especially when working with escape characters.
You could also use read as you did above - but to better handle backslashes use the -r argument if you go that route. The above method would work best as a function and you could then iterate over your variables with shift and similar.
-Mike
