How to pass variable to exec command in expect script? - shell

The purpose of the script is to read an encrypted file ".pass" and decrypt it using a PublicKey; the decrypted output should be saved so that
puts $output
shows the decrypted password.
The PublicKey will change every time based on my key-generation logic, so I want to set it as a variable:
#!/usr/bin/expect
set value "PublicKey"
set output [ exec sh -c {cat .pass | cut -d'&' -f1 | openssl base64 -d | openssl enc -d -rc2 -k "$value" } ]
puts $output

Tcl {braces} are like shell 'single quotes' -- no variable expansion is performed within.
You need to use different quoting:
set value "PublicKey"
set cmd "cat .pass | cut -d'&' -f1 | openssl base64 -d | openssl enc -d -rc2 -k $value"
set output [ exec sh -c $cmd ]
puts $output
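If the key might contain characters that are special to the shell, interpolating it into $cmd can also break or subvert the command. A minimal sketch of an alternative (untested; same pipeline as above) keeps the braces and passes the value to sh as a positional parameter instead:
set value "PublicKey"
# "$1" is expanded by sh at run time, not by Tcl, so the braces can stay literal
set output [ exec sh -c {cat .pass | cut -d'&' -f1 | openssl base64 -d | openssl enc -d -rc2 -k "$1"} sh $value ]
puts $output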

There are typographic quotation marks in your file - you need to instead use only ' and " (straight quotation marks) to quote strings.

Related

Is there a way to output jq into multiple variables for bash script?

Basically I have a bash script that at one point makes an API call and a cert and key are generated and returned in json. I pipe it to jq and can select either the cert or the key and store it in a variable.
Something like this:
CERT=$(API call | jq -r '.certificate')
or
KEY=$(API call | jq -r '.key')
I want to store each in its own variable but I can't make the call twice because it will generate a new cert/key.
I know that I can just store both in a file and then manipulate after to accomplish my task but I am curious if jq offers a direct way to selectively store each value in its own variable?
You could store the original JSON in a variable (instead of a file), and then extract the key and cert from that:
apiResult=$(API call)
cert=$(jq -r '.certificate' <<<"$apiResult")
key=$(jq -r '.key' <<<"$apiResult")
Notes: I recommend using lower- or mixed-case variables, to avoid accidental conflicts with the many all-caps names that have special meanings. Also, <<< is a bashism and won't work in all other shells; if you need this to be portable to e.g. dash, use something like key=$(printf '%s\n' "$apiResult" | jq -r '.key').
Unless there are NUL characters in the strings, there should be no need to call jq more than once, or to serialize the data.
In the simplest case, you could proceed along the following lines:
{ IFS= read -r certificate
IFS= read -r key
echo "certificate=$certificate"
echo "key=$key"
} < <(API call | jq -r '.certificate, .key')
If the values do not contain NUL but might contain newline characters,
then you could use NUL as the delimiter. For the sake of variety,
we could also use a while loop:
while IFS= read -r -d $'\0' certificate
do
IFS= read -r -d $'\0' key
echo "certificate=$certificate"
echo "key=$key"
done < <(API call | jq -rj '[.certificate, .key] | join("\u0000")')
Conversely ...
If the values of interest might contain literal NUL values ("\u0000"), then the question seems to be problematic, as bash variables in effect cannot contain literal NULs.
If any of the values of interest might contain literal NUL values, then here are two strategies for extracting the "raw" string equivalents into separate files:
Save the JSON output in a (temporary) file, and invoke jq -r once per value of interest in the obvious way.
Set up a bash pipeline starting with:
API call | jq -r '.certificate, .key | @base64'
and continuing with a loop in which each line is decoded, e.g. using base64 --decode or jq's @base64d.
The second strategy might make sense if the API call produces a very large JSON document.
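A minimal sketch of that second strategy (cert.pem and key.pem are illustrative file names, "API call" stands for your actual command, and GNU base64 is assumed):
i=0
outfiles=(cert.pem key.pem)
while IFS= read -r line; do
    # each line is one base64-encoded value; decode it into its own file
    printf '%s' "$line" | base64 --decode > "${outfiles[i]}"
    i=$((i + 1))
done < <(API call | jq -r '.certificate, .key | @base64')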
If neither of the values contain a newline, you can easily use read:
$ read cert key < <( echo '{"certificate": "foo", "key": "bar"}' | jq -r .certificate,.key | tr \\n ' ')
$ echo $cert
foo
$ echo $key
bar
or:
$ { read cert; read key; } << EOF
> $( echo '{"certificate": "foo", "key": "bar"}' | jq -r .certificate,.key )
> EOF
$ echo $cert:$key
foo:bar
The certificate almost certainly will contain newlines, so you may want to serialize that data (e.g., base64 encode it). But you're probably better off using Gordon Davisson's approach.
Say you created a program that output shell commands.
$ data_source | jq -r '@sh "certificate=\( .certificate )\nkey=\( .key )"'
certificate='the certificate'
key='the key'
Then, all you would need to do is evaluate them.
eval "$( data_source | jq -r '#sh "certificate=\( .certificate )\nkey=\( .key )"' )"
If you were dealing with many variables, you could use the following:
eval "$(
data_source |
jq -r '
{ "certificate": "certificate", "key": "key" } as $map |
( $map | keys_unsorted[] ) as $shell_var |
$shell_var + @sh "=\( .[$map[$shell_var]] )"
'
)"
You could even use paths.
eval "$(
data_source |
jq -r '
{ "certificate": [ "certificate" ], "key": [ "key" ] } as $map |
( $map | keys_unsorted[] ) as $shell_var |
$shell_var + @sh "=\( getpath($map[$shell_var]) )"
'
)"

need help in grep (BASH)

I am trying to make a tool that gets a list of proxies. I have downloaded the index page of one of the free proxy sites; I use this:
wget http://free-proxy.cz/en/proxylist/country/all/https/ping/all
and it outputs something like this:
<script type="text/javascript">document.write(Base64.decode("MTg1LjExNC4xMzcuMTQ="))</script></td><td style=""><span class="fport" style=''>12195</span></td><td><small>HTTPS</small></td><td class="left"><div style="padding-left:2px"><img src="/flags/blank.gif" class="flag flag-ua" alt="Ukraine" /> Ukraine</div></td><td class="small"><small></small></td><td class="small"><small></small></td><td class="small"><small>High anonymity</small></td><td> <i class="icon-black icon-question-sign"></i></td><td><small>2.4%</small><div class="progress"><div class="fill" style="width:4%;background-color:red;"></div></div></td><td><div style="padding-left:5px"><small>649 ms</small> <div class="progress"><div class="fill" style="width:94%;background-color:#A5DA74;;"></div></div></div></td><td><small>8 hours ago</small></td></tr><tr><td style="text-align:center" class="left"><script type="text/javascript">document.write(Base64.decode("MTYxLjk3LjEzOC4yMzg="))</script></td><td style=""><span class="fport" style=''>3128</span></td><td>
As you can see, the IP is encrypted in base64 and the port is in plain text.
I try to grep the base64 strings first, and this works:
echo (outputs) | grep -Eo '("[A-Za-z0-9]{12,30}[=]{0,2}")' | cut -d '"' -f2
and I try this code to get the ports:
echo (output) | grep -Eo "(class=\"fport\" style=''>[0-9]{1,9})" | cut -d '>' -f2
How can I combine them to produce output like this:
(base64 code):(port)
and after that I want to decode the base64 and make it look like:
IP:PORT
1st step
base64 is not encryption, but an encoding. If you are working on
Linux or another Unix variant, the base64 command (a base64 encoder/decoder)
will be pre-installed. If not, it can easily be installed with your
OS-dependent package manager.
Then please try to execute:
base64 -d <<< "MTg1LjExNC4xMzcuMTQ="
It will output:
185.114.137.14
2nd step
Then we can combine the base64 decoder with your command pipeline.
The problem is that base64 decoding ignores newlines, so we need to process
the result of the pipeline line by line. Assuming the variable $output
holds the output of the wget command, please try:
while IFS= read -r line; do
base64 -d <<< "$line"
echo
done < <(echo "$output" | grep -Eo '("[A-Za-z0-9]{12,30}[=]{0,2}")' | cut -d '"' -f2)
It will print something like:
185.114.137.14
161.97.138.238
The <(command) notation is a process substitution: the output of the
echo .. grep .. cut pipeline is fed to the while loop via stdin,
and the while loop processes the base64-encoded strings line by line.
3rd step
Now we want to merge the IPs and PORTs in a format of IP:PORT.
We can make use of paste command. The final script will be:
paste -d ":" <(
while IFS= read -r line; do
base64 -d <<< "$line"
echo
done < <(echo "$output" | grep -Eo '("[A-Za-z0-9]{12,30}[=]{0,2}")' | cut -d '"' -f2)
) \
<(echo "$output" | grep -Eo "(class=\"fport\" style=''>[0-9]{1,9})" | cut -d '>' -f2)
Output:
185.114.137.14:12195
161.97.138.238:3128
The paste command takes filenames as arguments. Here we make use
of process substitution again, in the form paste <(command1) <(command2),
which saves us from creating temporary files.
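For completeness, the $output variable assumed above can be captured straight from wget, for example (-qO- tells wget to be quiet and write the page to stdout):
output=$(wget -qO- "http://free-proxy.cz/en/proxylist/country/all/https/ping/all")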

Password protected shell script

I want to make my script password protected. If I use this code it works:
ACTUAL="sam123"
read -s -p "Password: " enteredpass
I also want to protect the script from being read with cat and vi. I tried to use vim -x <script> to encrypt it but then it won't allow me to run it.
I am using a generic user and haven't gotten anywhere.
You can't do this securely without your sysadmin's help, but you can do something sorta-kinda-maybe-not-really-adequate without it.
So, let's say you create your script like so:
cat >myscript <<EOF
echo "Doing something super secret here"
EOF
...but you don't want anyone who doesn't know the password to run it, even if they're using a shared account. You can do this by encrypting it:
gpg -ac <myscript >myscript.asc
...and then embedding that ASCII-armored output into a script:
#!/usr/bin/env bash
{ gpg -d | bash -s "$@"; } <<'EOF'
-----BEGIN PGP MESSAGE-----
jA0EBwMCBogTuO9LcuZg0lsB2wqrsPU8Bw2DRzAZr+hiecYTOe//ajXfcjPI4G6c
P3anEYb0N4ng6gsOhKqOYpZU9JzVVkxeL73CD1GSpcQS46YlKWJI8FKcPckR6BE+
7vqkcPWwcS7oy4H2
=gmFu
-----END PGP MESSAGE-----
EOF
That said, other users in the shared account can still collect your password if they connect to and trace your process while it's running -- running strace on the copy of bash -s will show the text being fed into its stdin. In general, you shouldn't rely on shared accounts for anything that needs to remain confidential.
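Usage would look roughly like this (secret.sh is a hypothetical name for the wrapper script; gpg prompts for the passphrase and then the decrypted script runs with your arguments):
chmod +x secret.sh
./secret.sh arg1 arg2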
Late answer for posterity: how about using openssl? Here's my scriptencrypt.sh.
It generates a new .sh file that requires a password to run.
#!/bin/bash
if [ -z "$1" ]; then echo "usage: $(basename $0) script"; exit 1; fi
script=$(cat "$1")
checksum="$(echo "$script" | md5sum | awk '{ print $1 }')"
extension=$([[ "$(basename $1)" =~ .\.. ]] && echo ".${1##*.}" || echo "")
cat << EOF > "${1%.*}.enc${extension}"
#!/bin/bash
read -r -d '' encrypted_script << EOF2
$(openssl aes-256-cbc -a -salt -in /dev/stdin -out /dev/stdout <<< "${script}")
EOF2
read -s -p "Enter script password: " password
echo
unencrypted_script=\$(openssl aes-256-cbc -d -a -salt -in /dev/stdin -out /dev/stdout <<< "\${encrypted_script}" -pass pass:"\${password}" 2>/dev/null | tr -d '\000')
clear
checksum="\$(echo "\$unencrypted_script" | md5sum | awk '{ print \$1 }')"
if [ "\${checksum}" = "${checksum}" ]; then
eval "\${unencrypted_script}"
exit 0
else
echo "Wrong password inserted"
exit 1
fi
EOF
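Hypothetical usage, assuming the generator above is saved as scriptencrypt.sh and fed myscript.sh (openssl prompts for an encryption password while generating, and the generated file asks for it again when run):
chmod +x scriptencrypt.sh
./scriptencrypt.sh myscript.sh   # produces myscript.enc.sh next to the original
./myscript.enc.sh                # prompts "Enter script password: " and runs the decrypted script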

Execute commands from shell script fails

I'm trying to read a file which contains lines like this:
Run COMMAND with options "OPTIONS" and arguments "ARGUMENTS"
Then I want to execute this command with given options and arguments. For example I'd like to execute these commands:
Run pwd with options "" and arguments ""
Run ls with options "-al" and arguments "$HOME"
Run ls with options "-al" and arguments "Example: \"strange folder name\""
This is my code
#!/bin/bash
while read -r line
do
COMMAND=$(echo "$line" | cut -d" " -f 2)
OPTIONS=$(echo "$line" | cut -d" " -f 5 | tr -d '"')
ARGUMENTS=$(echo "$line" | cut -d" " -f 8)
$COMMAND $OPTIONS $ARGUMENTS
done <$1
The first example works as it should, the second one gives me the error ls: cannot access $HOME: No such file or directory, and the third one does not store the name of the folder in $ARGUMENTS correctly.
the second one gives me the error ls: cannot access $HOME: No such file or directory
This is because a folder literally named $HOME does not exist. I am not talking about the value of the $HOME variable, but about the literal string. The shell does not perform parameter expansion in this situation.
the third one does not store the name of the folder in $ARGUMENTS correctly
This is because -f 8 only extracts field 8; try -f 8- to extract the 8th field and all the following fields until the end of the line.
You can give a try to this version below:
while read -r line; do
COMMAND=$(printf "%s" "${line}" | cut -d" " -f 2)
OPTIONS=$(printf "%s" "${line}" | cut -d" " -f 5 | tr -d '"')
ARGUMENTS=$(printf "%s" "${line}" | cut -d" " -f 8-)
$COMMAND $OPTIONS "$(eval printf \"%s\" "$ARGUMENTS")"
done < "${1}"
eval is a shell built-in command which is used here to enable parameter expansion of ARGUMENTS, if applicable.
I have to warn you that eval is usually said to be risky to use.
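The difference between -f 8 and -f 8- can be seen directly with the third example line:
line='Run ls with options "-al" and arguments "Example: \"strange folder name\""'
printf '%s\n' "$line" | cut -d" " -f 8    # prints: "Example:
printf '%s\n' "$line" | cut -d" " -f 8-   # prints: "Example: \"strange folder name\""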

Variable expanding in a script shell

My code is:
nb_lignes=`wc -l $1 | cut -d " " -f1`
for i in $(seq $(($nb_lignes - 1)) )
do
machine=`head $1 -n $i | tail -1`
machine1=`head $1 -n $nb_lignes | tail -1`
ssh root@$machine -x " scp /home/file.txt root@$machine1:/home && rm -r /home/file.txt"
done
Is $machine1 treated as a variable or as a literal string? If it is a string, how can I make it expand as a variable?
$machine will expand to the result of head $1 -n $i | tail -1, and $machine1 will expand to the result of head $1 -n $nb_lignes | tail -1.
You can check this yourself, e.g. by echoing the variables.
Btw, ssh root@ …
$machine1 will be expanded to give the value of the variable machine1, because you are using double quotes ("). If you had used single quotes (') then it would not have been expanded.
One possible confusion is when you embed a variable inside other text. In this case you are fine, because the trailing character is a : (root@$machine1:/home), which is not a valid character in a Bash variable name. Some shells (csh) would not have liked that; if you are not sure, you can delimit the variable name using { }, for example:
root@${machine1}:/home
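A quick illustration of the difference (the value is made up):
machine1=server42
echo "root@$machine1:/home"      # double quotes: expands to root@server42:/home
echo 'root@$machine1:/home'      # single quotes: stays literal root@$machine1:/home
echo "root@${machine1}:/home"    # braces delimit the variable name explicitly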
