Bash function to automate curl POST authentication not executing as intended - Windows

I am using the Ubuntu bash shell on a Windows machine, and I am trying to create a function to test the endpoints of a Spring application. First I must authenticate with the server via the following curl command, which works as expected when run directly:
$> curl -i -X "POST" -d"${auth}" -H"${contentType}" "${host}/login"
When I copy and paste this command into a function, it blows up and curl does not execute as intended; it should produce the same output as above.
$> function springlogin() { curl -i -X "POST" -d"${auth}" -H"${contentType}" "${host}/login"; }
$> springlogin
What am I missing here? Is this due to some variable expansion I'm not aware of, or something else entirely?
My end goal is to use the output of this function to authorize my endpoint API calls from the command line.

Thanks to Gordon Davisson, I figured it out. This is something we do at work, but I didn't think to do it in my home code: I need to unset the function in my profile before declaring it again.
unset springlogin
function springlogin() { curl -i -X "POST" -d"${auth}" -H"${contentType}" "${host}/login"; }
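As a follow-up on the stated end goal (reusing the login output to authorize later API calls), here is a minimal sketch. It assumes the server returns the session token in an Authorization response header (common in Spring Security JWT setups, but not guaranteed), that auth, contentType, and host are already set, and the springcall helper name is made up for illustration:

unset -f springlogin springcall
function springlogin() {
# -s silences progress, -D - writes response headers to stdout,
# -o /dev/null discards the body; keep only the Authorization header line
AUTH_HEADER=$(curl -s -D - -o /dev/null -X "POST" -d "${auth}" -H "${contentType}" "${host}/login" | grep -i '^Authorization:' | tr -d '\r')
}
function springcall() {
# Replay the captured header against any endpoint, e.g.: springcall api/users
curl -i -H "${AUTH_HEADER}" "${host}/$1"
}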

Related

Bash script for LDAP website API requests

I want to write a bash script to automate this call:
curl --header "Authorization: Bearer <PANDA TOKEN HERE>" https://www.websitehere.com/api/v2/groups/GROUP NAME/details
My goal is for the bash script to iterate over a list of group names (GROUP NAME). I have my panda token working. Unfortunately, I have no previous experience with bash. I have installed WSL and curl.
I tried this bash script, but it doesn't work:
#!/bin/bash
group_details=("acton_team_coordinators" "acs-peripherals" "admingrocery" "ae-central-leads")
for group in "${group_details[@]}"
do
curl -v GET " https://www.websitehere.com/api/v2/groups/acton_team_coordinators/details" -H "<PANDA TOKEN HERE>"
done
I got an error in CMD.
You forgot to use the "group" variable as a variable, by prefixing it with "$" (the URL hard-codes acton_team_coordinators instead).
It is good practice to surround variable names with a pair of braces ("{}"), because there are cases where it is otherwise ambiguous where the variable name ends.
You also had a space inside the double quotes, before the "http...". I am not sure whether that could create issues, but it is best to remove that space.
The modified script would look like this (it also drops the stray GET word, which curl would otherwise treat as an extra URL, and restores the full Authorization: Bearer header from the original call):
#!/bin/bash
out_capture="somewhere"
group_details=("acton_team_coordinators" "acs-peripherals" "admingrocery" "ae-central-leads")
for group in "${group_details[@]}"
do
curl -v "https://www.websitehere.com/api/v2/groups/${group}/details" -H "Authorization: Bearer <PANDA TOKEN HERE>" > "${out_capture}/${group}.out"
done
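If the group names eventually come from a text file rather than a hardcoded array, the same loop can read the file line by line. A sketch, where groups.txt (one group name per line) and the output directory are assumptions:

#!/bin/bash
out_capture="somewhere"
mkdir -p "${out_capture}" # make sure the output directory exists
while IFS= read -r group; do
curl -sS "https://www.websitehere.com/api/v2/groups/${group}/details" -H "Authorization: Bearer <PANDA TOKEN HERE>" > "${out_capture}/${group}.out"
done < groups.txt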

Shell script: How to use curl or wget to call URLs from a list? I can get only the last row to work! Help! :)

Can I get some help with a shell script and curl/wget?
I need to call multiple URLs from a file.
While testing I'm using just 3 rows in that file, meaning that I have 3 URLs.
First I create the file, and then I read it row by row, calling each URL via curl or wget using the block below:
cat ${vMyFile} | while read vMyRow # I cat the file and read it in a "while" loop
do
echo curl -f ${vMyRow} >>${vMyLogFile} # I use curl to call the rows with the URLs
done
The script runs just fine, and the log file (vMyLogFile) shows me that everything worked.
curl called all 3 URLs with no errors, BUT... I can only get the last row (URL) to work.
When I call these URLs, a service is activated, and only the server on line 3 (the last one) gets activated. :(
If I take the curl lines 1 and 2 from the log file and run them in the terminal, the service is activated successfully.
I have already tried using sleep and wait between the curl calls, and still can't get it to work properly.
Is there a "CLOSE" or "END" that I must use before calling curl again in the same process?
I need to call multiple URLs from a file.
wget has an option for exactly this use case, namely -i. If no other options are specified, it should be followed by the name of a file which holds one URL per line, so if this file is named, for example, urls.txt, the simplest usage is
wget -i urls.txt
If you want to know more about the behavior of -i, consult wget's man page.
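If curl is preferred over wget, a roughly equivalent one-liner is possible. A sketch, assuming urls.txt holds one URL per line with no embedded spaces or quotes:

xargs -n 1 curl -f < urls.txt

xargs runs curl -f once per line, so each URL gets its own curl process.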

Curl command in Bash script returns no URL specified

Trying to run a curl command against a test page results in "no URL specified!" when run via a bash script. This is a test that runs through a proxy (1.2.3.4 in the code example).
The command runs as expected from the command line. I've tried a few variations of quotes around the components of the command; however, I still get the same "no URL specified" result.
echo $(curl -x http://1.2.3.4:80 https://example.company.com)
The expected result is an HTML page; instead, the above error appears, but only when run from this small bash script.
The command itself runs just fine as "curl -x http://1.2.3.4:80 https://example.company.com" from the command line.
Appreciate in advance any help you can provide!
I have literally edited it down to the following. Everything else is commented out. Still the same error.
#!/bin/bash
curl -x http://1.2.3.4:80 https://example.company.com
In your example, you want to use double quotes around the subshell and single quotes for the curl parameters.
Make sure to set the correct shebang (#!/bin/bash).
You could also try to run it as:
cat <(curl -x http://1.2.3.4:80 https://example.company.com)
However, I am not able to reproduce your error.
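To illustrate the quoting suggestion: if the intent of the echo $(curl ...) line is to capture the page into a variable, double quotes around the command substitution preserve the output verbatim. A sketch using the same proxy and URL:

#!/bin/bash
# -s suppresses the progress meter so only the page body is captured
html="$(curl -s -x http://1.2.3.4:80 https://example.company.com)"
echo "${html}"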

curl: Is it possible to set timeout globally in shell

Is it possible to set the timeout value for curl globally, e.g. via an environment variable or a config file?
I have a shell script with some 20 curl commands scattered all over and would like to avoid specifying --connect-timeout everywhere.
This is for shell scripting, not PHP or C/C++ or ...
As far as I can see on the man page, there is no such environment variable.
You could make a function called curl:
curl () {
command curl --connect-timeout 60 "$@"
}
So whenever you call curl, it will call this function, which in turn calls the curl command (command suppresses shell function lookup) with the same arguments.
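For example, once the wrapper is defined (say, in your profile), ordinary-looking calls pick up the timeout, and command bypasses it when the real binary is needed:

curl https://example.com # actually runs: curl --connect-timeout 60 https://example.com
command curl https://example.com # skips the wrapper, so no default timeout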

Bash script doesn't get all the parameters I've given

I have a simple bash script that runs its parameter as a command on the server. I need this because I build the required command as a string on another server and try to execute it remotely.
PROFILE=/coremedia/home/picroot/.profile
source $PROFILE
"$1"
The parameter I am sending to the script:
/coremedia/pic-cms-tools/bin/cm publish -u admin -p admin -t "/Config/Static Texts/PDF Texts/pdf.eudatasheet.ocEnergyConsConvAlone" "/Config/Static Texts/PDF Texts/pdf.eudatasheet.ocEnergyConsForcedAlone"
But it couldn't find the necessary command; it stops when it reaches:
/coremedia/pic-cms-tools/bin/cm
I've tried many configurations on my side to handle the string, but I still couldn't reach a solution; obviously I am missing some small but critical thing...
Any help would be appreciated, many thanks in advance!
Replace the "$1" line with eval "$@" to evaluate all the parameters, not just the first one.
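A minimal sketch of the corrected script (the file name run.sh is made up), assuming the command line arrives as script arguments; eval re-parses the quotes embedded in the string:

#!/bin/bash
PROFILE=/coremedia/home/picroot/.profile
source "$PROFILE"
# Evaluate every argument, not just the first word
eval "$@"

Invoked, for example, as:
./run.sh '/coremedia/pic-cms-tools/bin/cm publish -u admin -p admin -t "/Config/Static Texts/PDF Texts/pdf.eudatasheet.ocEnergyConsConvAlone" "/Config/Static Texts/PDF Texts/pdf.eudatasheet.ocEnergyConsForcedAlone"'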
