shell command line to run a script N times

I'm trying to run a curl command for each of the following numbers, but the number isn't being substituted into the URL:
for i in {2,51,52,53,54}; do curl -v -X PUT http://localhost:8198/v3/progress/i/?status=Open; done
What is going wrong?

In bash you reference a variable's value with $, so i should be $i.
This should make it work:
for i in {2,51,52,53,54}; do curl -v -X PUT http://localhost:8198/v3/progress/$i/?status=Open; done
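A small defensive variant of the same command, with the URL quoted so the shell never tries to glob the ? (host, port and path are taken from the question):
for i in {2,51,52,53,54}; do curl -v -X PUT "http://localhost:8198/v3/progress/$i/?status=Open"; done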

Related

bash script from docker does not work as expected if statement [duplicate]

This question already has answers here:
Difference between sh and Bash (11 answers)
Pattern matching in UNIX Case statement (1 answer)
Closed 1 year ago.
I am using this image which has bash v4.3.48 and curl v7.56.1:
https://hub.docker.com/r/bizongroup/alpine-curl-bash/tags?page=1&ordering=last_updated
Inside the container I have the following script:
email_dest="iz#gmail.com}}"
suffix="#gmail.com"
dm_to=${email_dest%"$suffix"}
if [[ $email_dest == *"#users.noreply.github.com"* ]]
then
echo "Email address is no reply. Please fix your email preferences in Github"
elif [[ $email_dest == *$suffix* ]]
then
curl -X POST -H 'Content-type: application/json' --data '{"text":"Hello <#'"$dm_to"'>. '{{inputs.parameters.workflow_name}}' "}' https://hooks.slack.com/services/T01JNE5DXA7/B0246T84N75/hHDk7RUg2BWl2bYbPoN9r
else
echo "Email address is not of digibank domain!"
fi
If I run this script with bash <script_name> it works as expected (it runs the curl command). But if I run it with sh <script_name> it does not run the curl command:
/ # bash send-message.sh
ok/ #
/ # sh send-message.sh
Email address is not of digibank domain!
Any suggestion of what it could be, and what should be changed so it will work with sh?
The cause lies in the differences between bash and sh:
sh sticks to the POSIX shell specification, whereas bash adds many extensions on top of it.
As a best practice you should always include a shebang:
#!/usr/bin/env bash
echo "this is going to run within bash"
With this you can omit calling the script via bash myscript and, after making it executable (chmod +x myscript), just call it with ./myscript; it will always run under bash (even if your interactive shell is zsh, sh, or anything else).
However, if you truly want a script that runs with both sh and bash, then you should rewrite it to be plain sh compliant (i.e. POSIX).
TL;DR
Any suggestion of what it could be, and what should be changed so it will work with sh?
In your script you are using bash extensions such as [[, which is why it does not work with sh.
Check out the links above for more differences and for how you can "convert" your bash script into an sh script.
The following page has a great summary of what to change to get a bash script working under dash, which is an implementation of sh: http://mywiki.wooledge.org/Bashism
Furthermore, you can check your script for such issues with https://www.shellcheck.net/
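As a rough, untested sketch of such a conversion for the script above, the [[ pattern matches could be expressed with a POSIX case statement (the curl call itself stays exactly as in the original):
case "$email_dest" in
  *"@users.noreply.github.com"*)
    echo "Email address is no reply. Please fix your email preferences in Github"
    ;;
  *"$suffix"*)
    # the original curl call to the Slack webhook goes here, unchanged
    ;;
  *)
    echo "Email address is not of digibank domain!"
    ;;
esac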

curl works, but won't execute in BASH script

The following curl command works from the command line. I get a valid response from the server.
curl -X POST https://somebaseurl/api/v1/auth/login -H "Content-Type:application/json" -d '{"email": "foo@bar.com", "password": "foo"}'
However, I am trying to write a bash script like this:
baseUrl=https://somebaseurl
contentTypeJson="\"Content-Type:application/json\""
credentials="'{\"email\": \"foo#bar.com",\"password\": \"foo\"}'"
login="curl -X POST $baseUrl/api/v1/auth/login -H $contentTypeJson -d $credentials"
echo ${login}
response=`${login}`
echo ${response}
I get a bad request response from the server. However, if I copy the echoed curl command directly into my terminal, it works. What am I doing wrong?
edit:
As requested I get
Bad Request For request 'POST api/v1/auth/login' [Expected application/json]
Bash and cURL can be quite particular about how quotes are used within a script. If the escaping gets thrown off, everything else can easily fail. Running the script through shellcheck.net is often very helpful in identifying such issues. Below is a revised version of the script after fixing it based on those suggestions:
#!/bin/bash
baseUrl="https://somebaseurl/api/v1/auth/login"
contentTypeJson="Content-Type:application/json"
credentials="{\"email\": \"foo#bar.com\", \"password\": \"foo\"}"
login="$(curl -X POST "$baseUrl" -H "$contentTypeJson" -d "$credentials")"
echo "${login}"
response="${login}"
echo "${response}"
Executing with backticks interprets the stored command only as a sequence of words and doesn't treat the embedded quotes specially. To have the shell re-parse the quotes as if they had been typed interactively, use eval ${login} instead.
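A minimal illustration of the difference, using a throwaway echo command rather than the curl call (none of this is from the original script):
cmd='echo "hello   world"'
$cmd          # word-splits only: prints "hello world" with the quote characters kept and the spaces collapsed
eval "$cmd"   # re-parses the quotes: prints hello   world as one argument, spaces preserved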
As an aside, bash has a -x option which will show you commands as they are being executed (run your script with bash -x script.sh instead of bash script.sh or ./script.sh). This will show you the commands correctly quoted, and is more helpful than printing them out using echo.

Trouble escaping quotes in a variable held string during a Sub-shell execution call [duplicate]

This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate] (3 answers)
Closed 6 years ago.
I'm trying to write a database call from within a bash script and I'm having problems with a sub-shell stripping my quotes away.
This is the bones of what I am doing.
#---------------------------------------------
#! /bin/bash
export COMMAND='psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o ${EXPORT_FILE} 2>&1'
PSQL_RETURN=`${COMMAND}`
#---------------------------------------------
If I use an 'echo' to print out the ${COMMAND} variable the output looks fine:
echo ${COMMAND}
screen output:-
#---------------
psql drupal7 -F , -t --no-align -c "SELECT DISTINCT hostname FROM accesslog;" -o /DRUPAL/INTERFACES/EXPORTS/ip_list.dat 2>&1
#---------------
Also if I cut and paste this screen output it executes just fine.
However, when I try to execute the command as a variable within a sub-shell call, it gives an error message.
The error is from the psql client to the effect that the quotes have been removed from around the ${SQL} string.
The error suggests psql is trying to interpret the terms in the sql string as parameters.
So it seems the string and quotes are composed correctly but the quotes around the ${SQL} variable/string are being interpreted by the sub-shell during the execution call from the main script.
I've tried to escape them using various methods: \", \\", \\\", "", \"" '"', \'"\', ... ...
As you can see from my 'try it all' approach I am no expert and it's driving me mad.
Any help would be greatly appreciated.
Charlie101
Instead of storing the command in a string variable, it is better to use a bash array here:
cmd=(psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o "${EXPORT_FILE}")
PSQL_RETURN=$( "${cmd[@]}" 2>&1 )
Quoting "${cmd[@]}" expands each array element as its own word, so the quoting around ${SQL} survives intact.
Rather than evaluating the contents of a string, why not use a function?
call_psql() {
  # optional, if variables are already defined in global scope
  DB_NAME="$1"
  SQL="$2"
  EXPORT_FILE="$3"
  psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE" 2>&1
}
then you can just call your function like:
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
It's entirely up to you how elaborate you make the function. You might like to check for the correct number of arguments (using something like (( $# == 3 ))) before calling the psql command.
Alternatively, perhaps you'd prefer just to make it as short as possible:
call_psql() { psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1; }
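For example, a hedged sketch of the argument-count check mentioned above (the usage message and return value are only illustrative):
call_psql() {
  # refuse to run unless exactly three arguments were supplied
  (( $# == 3 )) || { echo "usage: call_psql DB_NAME SQL EXPORT_FILE" >&2; return 1; }
  psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1
}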
In order to capture the command that is being executed for debugging purposes, you can use set -x in your script. This will print the contents of the function, including the expanded variables, when the function (or any other command) is called. You can switch this behaviour off using set +x, or if you want it on for the whole duration of the script you can change the shebang to #!/bin/bash -x. This saves you explicitly echoing throughout your script to find out what commands are being run; you can just turn on set -x for a section.
A very simple example script using the shebang method:
#!/bin/bash -x
ec() {
echo "$1"
}
var=$(ec 2)
Running this script, either directly (after making it executable) or by calling it with bash -x, gives:
++ ec 2
++ echo 2
+ var=2
Removing the -x from the shebang or the invocation results in the script running silently.

bash read is being skipped when run from curl pipe

I'm building a bootstrap for a github project and would like it to be a simple one-liner. The script requires a password input.
This works and stops the script to wait for input:
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh -o setup.sh
bash setup.sh
This does not, and just skips over the input request:
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh | bash
setup.sh contains code something like:
# code before
read -p "Password:" -s password
# code after
Is it possible to have a clean one-liner? If so, how might one do it?
Workaround:
Use three commands instead of piping output.
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh -o vhosts.sh && bash vhosts.sh && rm vhosts.sh
I had the same exact problem as the OP and was looking for an answer. This question was one of the first hits on Google for me, and since it doesn't have a real answer yet, here's the command I eventually stumbled upon that solved my need to use read in a remote script.
bash <(curl -s https://example.com/my-bash-script.sh)
With the pipe, read reads from standard input (the pipe), but the shell has already consumed all of standard input in order to read the script, so there is nothing left for read to read. With the process substitution above, the script arrives on a separate file descriptor and standard input stays connected to the terminal, which is why read works again.
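If editing setup.sh itself is acceptable, another common workaround (just a sketch, separate from the process-substitution answer above) is to point read at the controlling terminal so it no longer competes with the piped script for standard input:
# inside setup.sh: read the password from the terminal, not from stdin
read -p "Password:" -s password < /dev/tty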

How do I use a variable in the --data section of a curl command?

I am writing a bash script to call curl from the command line. I sadly cannot figure out how to substitute a simple variable into the -d section of the curl request.
Why doesn't this work?
#!/bin/sh
name=$1
test -z $name && echo "Repo name required." 1>&2 && exit 1
curl -u 'metaraine' https://api.github.com/user/repos -d '{"name":"$name"}'
It doesn't actually substitute the value of $name into the data.
What about this?
curl -u 'metaraine' https://api.github.com/user/repos -d "{\"name\":\"$name\"}"
That is, escape the inner quotes and wrap the {} in double quotes instead of single ones: variables are not expanded inside single quotes, but they are inside double quotes.
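An equivalent style, if you prefer to keep most of the JSON in single quotes, is to drop out of the single quotes just around the variable (only an alternative spelling of the same request):
curl -u 'metaraine' https://api.github.com/user/repos -d '{"name":"'"$name"'"}'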
