I need to execute a curl script on Jenkins to check a URL's status - shell

The script should check the HTTP status code of the URL and report an error when the status code doesn't match the expected one, e.g. 200.
In Jenkins, if this script fails, the build should be marked as failed and a mail should be triggered through a post-build procedure.

Another interesting feature of curl is its -f/--fail option. If set, it tells curl to fail on any HTTP error: curl exits with a code different from 0 if the server's response status code was not 1xx/2xx/3xx, i.e. if it was 4xx or above. So
curl --silent --fail "http://www.example.org/" >/dev/null
or (equivalently):
curl -sf "http://www.example.org/" >/dev/null
would have an exit code of 22 rather than 0 if the URL could not be found or some other HTTP error occurred. See man curl for a description of curl's various exit codes.
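For example, a Jenkins "Execute shell" build step could check that exit code explicitly (a minimal sketch; the URL is a placeholder):
curl --silent --fail "http://www.example.org/" >/dev/null
status=$?
if [ "$status" -ne 0 ]; then
    echo "URL check failed, curl exit code: $status" >&2
    exit 1
fi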

You can use a simple shell command, as referred to in this answer:
curl -s -o /dev/null -w "%{http_code}" http://www.example.org/

The build will fail if the following shell script is added as a build step:
response=$(curl -s -o /dev/null -w "%{http_code}\n" http://www.example.org/)
if [ "$response" != "200" ]
then
    exit 1
fi
exit 1 will mark the build as failed.
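A variant of the same script (a sketch) that also prints the unexpected code to the Jenkins console before failing the build:
response=$(curl -s -o /dev/null -w "%{http_code}" http://www.example.org/)
if [ "$response" != "200" ]
then
    echo "Expected HTTP 200 but got $response" >&2
    exit 1
fi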

Jenkins also has the HTTP Request Plugin, which can trigger HTTP requests.
For example, this is how you can check the response status and content:
def response = httpRequest "http://httpbin.org/response-headers?param1=${param1}"
println('Status: '+response.status)
println('Response: '+response.content)

You could try:
response=`curl -k -s -X GET --url "<url_of_the_request>"`
echo "${response}"

How about passing the URL at run time to curl in a bash script?
URL=www.google.com
"curl --location --request GET URL"
How can we pass the URL at run time?
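One way to do this (a sketch, not taken from the answers above): take the URL as a positional argument and expand the variable instead of quoting the literal word URL; the script name check_url.sh is hypothetical.
#!/bin/bash
# take the URL as the first argument and expand it in the curl call
URL="$1"
curl --location --request GET "$URL"
# usage: ./check_url.sh https://www.google.com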

Related

Curl: Save HTTP Error in Bash to log and data to file

I'm making a curl request to a REST API and I want to save the HTTP response code to a log file only if an error occurs, and the API response to another file if no error occurs.
I'm trying it with:
error=$(curl -v -o "test.json" -H "Authorization: Basic ABCDEF" "https://api.abc.com")
and
error=$(curl --fail -o "test.json" -H "Authorization: Basic ABCDEF" "https://api.abc.com")
If I add an if [ 0 -eq $? ] after the curl request with --fail, I can detect that an error occurred, but I am not able to save the HTTP error code to a log.
Thanks.
Since, according to man curl, the option --fail
... is not fail-safe and there are occasions where non-successful response codes will slip through, especially when authentication is involved (response codes 401 and 407).
you could use --write-out and http_code.
The numerical response code that was found in the last retrieved HTTP(S) or FTP(s) transfer.
I.e.:
ERROR=$(curl --silent --fail --header "Authorization: Basic ABCDEF" "https://api.abc.com" --output "test.json" --write-out "%{http_code}")
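Building on that (a sketch): write the body to test.json, capture the status code, and append it to a log only when the code indicates an error. Here error.log is a hypothetical file name, and --fail is dropped so the code can be inspected directly.
code=$(curl --silent --header "Authorization: Basic ABCDEF" "https://api.abc.com" --output "test.json" --write-out "%{http_code}")
if [ "$code" -ge 400 ]; then
    echo "$(date): HTTP $code from https://api.abc.com" >> error.log
fi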
Welcome to stackoverflow.
This should do the trick.
#send all output to file named out
curl -v -o "test.json" -H "Authorization: Basic ABCDEF" "https://api.abc.com" >out 2>&1
# find HTTP/2 code in the output
error=`grep "HTTP/2" out | tail -1 | rev | cut -c1-5 | rev`
# print the error code
echo $error
Example run:
mamuns-mac:jenkins xmrashid$ ./get_error.sh
503
mamuns-mac:jenkins xmrashid$
Good Luck.

curl 400 bad request (in bash script)

I'm trying to execute the following script in bash:
#!/bin/bash
source chaves.sh
HEAD='"X-Cachet-Token:'$CACHET_KEY'"'
SEARCH="'{"'"status"'":1,"'"id"'":"'"7"'","'"enabled"'":true}'"
echo $SEARCH
if curl -s --head --request GET http://google.com.br | grep "200 OK" > /dev/null; then
    echo 'it ran'
    curl -X PUT -H '"Content-Type:application/json;"' -H '"'X-Cachet-Token:$CACHET_KEY'"' -d $SEARCH $CACHET_URL/7
else
    echo 'it failed'
    curl -X PUT -H '"Content-Type: application/json;"' -H $x -d '{"status":1,"id":"7","enabled":true}' $CACHET_URL/7
fi
But I keep receiving a 400 Bad Request from the server.
When I try to run the same line (echoed by the script, then copied with Ctrl+C and Ctrl+V) directly in the terminal, the command runs without problems.
The sourced file holds the path settings and a token variable I need to use, and as far as I have tested it is being read correctly.
edit 1 - hiding some sensitive content
edit 2 - posting the output line (grabbed through Ctrl+C, Ctrl+V)
The command I need to send to the server is:
curl -X PUT -H "Content-Type:application/json;" \
  -H "X-Cachet-Token:4A7ixgkU4hcCWFReQ15G" \
  -d '{"status":1,"id":"7","enabled":true}' \
  http://XXX.XXX.XXX.XXX/api/v1/components/7
The output I grabbed through the echo command is exactly the command I want, but it doesn't run inside the script, only outside.
I'm a bit new to curl; any help is appreciated.
Sorry for the bad English and thanks in advance.
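A likely culprit is the layered quoting around the headers and the JSON body. A sketch of the same PUT request with plainer quoting (assuming the same variables from chaves.sh):
#!/bin/bash
source chaves.sh
# single-quote the JSON body once; expand the variables inside double quotes
SEARCH='{"status":1,"id":"7","enabled":true}'
curl -X PUT \
  -H "Content-Type: application/json" \
  -H "X-Cachet-Token: $CACHET_KEY" \
  -d "$SEARCH" \
  "$CACHET_URL/7"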

Bash curl call second command only when status is not 200

I'm looking for a solution to combine two curl requests in bash, calling the second curl only when the first doesn't return status 200.
I tried:
curl -s "https://example.com/first" || curl -s "https://example.com/second"
but it still calls both, because the first curl is considered successful even if it returns, for example, status 404.
How is it possible to call the second only when the first doesn't return status 200?
Thanks for the help.
curl -s -o /dev/null -w "%{http_code}" https://example.com | grep -q "^200$" || curl -s https://example.com/2.html
Edit: added an improvement by @tripleee so the output is not polluted with the grep output.
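The same logic written out as an if block (a sketch), which can be easier to extend:
status=$(curl -s -o /dev/null -w "%{http_code}" "https://example.com/first")
if [ "$status" != "200" ]; then
    curl -s "https://example.com/second"
fi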

curl --fail without suppressing stdout

I have a script that uploads a file to a WebDav server using curl.
curl --anyauth --user user:password file http://webdav-server/destination/
I want two things at the same time:
have the script output to stdout (which is directed to a log file)
detect whether the upload succeeded
As far as I know, curl returns an exit code of 0 even in 401 (Unauthorized) or 407 (Proxy Authentication Required) situations. The --fail option can be used to change this behavior, but it suppresses stdout.
What would be the best workaround for this? tee and grep?
curl writes its progress and error messages to stderr (2), the error stream, instead of stdout (1). Redirect it in the command using 2>&1:
curl --fail --anyauth --user user:password file http://webdav-server/destination/ > logFile 2>&1
retval=$?
if [ $retval -eq 0 ]; then
    echo "curl command succeeded and the log is present in logFile"
fi
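Another workaround (a sketch, assuming the upload is done with -T and that writing the body straight to the log file is acceptable): let --write-out report the status code and decide the exit status yourself, so nothing is suppressed.
code=$(curl --anyauth --user user:password -T file "http://webdav-server/destination/" --output logFile --write-out "%{http_code}")
if [ "$code" -ge 400 ]; then
    echo "Upload failed with HTTP $code" >&2
    exit 1
fi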

Login via curl fails inside bash script, same curl succeeds on command line

I'm running this login via curl in my bash script. I want to make sure I can log in before executing the rest of the script, where I actually log in, store the cookie in a cookie jar, and then execute another curl against the API thousands of times. I don't want to run all that if I've failed to log in.
Problem is, the basic login returns 401 when it runs inside the script. But when I run the exact same curl command on the command line, it returns 200!
basic_login_curl="curl -w %{http_code} -s -o /dev/null -X POST -d \"username=$username&password=$password\" $endpoint/login"
echo $basic_login_curl
outcome=`$basic_login_curl`
echo $outcome
if [ "$outcome" == "401" ]; then
echo "Failed login. Please try again."; exit 1;
fi
This outputs:
curl -w %{http_code} -s -o /dev/null -X POST -d "username=bdunn&password=xxxxxx" http://stage.mysite.it:9301/login
401
Failed login. Please try again.
Copied the output and ran it on the cmd line:
$ curl -w %{http_code} -s -o /dev/null -X POST -d "username=bdunn&password=xxxxxx" http://stage.mysite.it:9301/login
200$
Any ideas? LMK if there's more from the code you need to see.
ETA: Please note: the issue is not that it doesn't match 401; it's that running the same curl login command inside the script fails to authenticate, whereas it succeeds when I run it on the actual command line.
Most of the issues reside in how you are quoting/not quoting variables and the subshell execution. Setting up your command like the following is what I would recommend:
basic_login_curl=$(curl -w "%{http_code}" -s -o /dev/null -X POST -d "username=$username&password=$password" "$endpoint/login")
The rest basically involves quoting everything properly:
basic_login_curl=$(curl -w "%{http_code}" -s -o /dev/null -X POST -d "username=$username&password=$password" "$endpoint/login")
# echo "$basic_login_curl" # not needed since what follows repeats it.
outcome="$basic_login_curl"
echo "$outcome"
if [ "$outcome" = "401" ]; then
echo "Failed login. Please try again."; exit 1;
fi
Running the script through shellcheck.net can be helpful in resolving issues like this as well.
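For completeness, a sketch of the array form that shellcheck typically steers you toward, which avoids storing the command in a string at all (same variables as above):
# keep the command in an array so the quoting survives, then expand it when executing
basic_login_cmd=(curl -w "%{http_code}" -s -o /dev/null -X POST -d "username=$username&password=$password" "$endpoint/login")
outcome=$("${basic_login_cmd[@]}")
if [ "$outcome" = "401" ]; then
    echo "Failed login. Please try again."
    exit 1
fi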
