curl returns HTTP code 503 when called from a script - bash

I'm trying to call a REST API and get the resulting HTTP code using curl. If I type this in a terminal:
curl -s -o /dev/null -I -w '%{http_code}' -X POST 'http://localhost/gitlab/api/v3/projects?private_token=my_private_token&name=blabla' -H 'Content-Length: 0'
It works and returns HTTP code 201 ("Created"). Now I try to use this command in a bash script, replacing part of the URL with a variable:
echo "Enter base URL :"
read gitlab_url # Here I type 'http://localhost/gitlab', to generate the same URL as in first code snippet
code_status=$(curl -s -o /dev/null -I -w '%{http_code}' -X POST "$gitlab_url/api/v3/projects?private_token=my_private_token&name=blabla" -H 'Content-Length: 0')
echo "$code_status"
This time it returns HTTP code 503 ("Service Unavailable"). To see whether there is any difference between the hard-coded URL and the generated one, I do:
echo "curl -s -o /dev/null -I -w '%{http_code}' -X POST '$gitlab_url/api/v3/projects?private_token=my_private_token&name=blabla' -H 'Content-Length: 0'"
# Output:
curl -s -o /dev/null -I -w '%{http_code}' -X POST 'http://localhost/gitlab/api/v3/projects?private_token=my_private_token&name=blabla' -H 'Content-Length: 0'
And if I execute that output directly in a terminal, it works and returns 201. So why does this command fail when I use it in a script? Is there anything I missed?

It was a proxy problem, which I could see by adding -v to the curl command.
When curl is typed directly in a terminal I get:
* About to connect() to localhost port 80 (#0)
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
And when I use it in a bash script I get:
* About to connect() to proxy proxy.my.company port xxx (#0)
* Trying xx.xx.xx.xx... connected
* Connected to proxy.my.company (xx.xx.xx.xx) port xxx (#0)
So to fix it I added this at the top of my script:
export no_proxy=localhost,127.0.0.1
export http_proxy=""
I am very surprised to have to do this, because I already have a no_proxy environment variable that references localhost and 127.0.0.1.
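If you would rather not override the proxy settings for the whole script, curl can also bypass the proxy for a single request; a minimal sketch, reusing the URL and token placeholders from the question:

# Bypass any configured proxy just for this call (--noproxy takes a host list).
code_status=$(curl -s -o /dev/null -I -w '%{http_code}' --noproxy localhost,127.0.0.1 \
  -X POST "$gitlab_url/api/v3/projects?private_token=my_private_token&name=blabla" \
  -H 'Content-Length: 0')

# Alternatively, clear the proxy variables only in curl's environment:
code_status=$(env http_proxy= https_proxy= curl -s -o /dev/null -I -w '%{http_code}' \
  -X POST "$gitlab_url/api/v3/projects?private_token=my_private_token&name=blabla" \
  -H 'Content-Length: 0')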

Try running it as bash -x script.sh to see the exact commands the script executes.

Related

Curl and Variables inside http

I'm trying to combine variables for the source IP, destination IP, and port, then send them to our software so it can tell me whether the traffic is allowed or not.
I am using curl but I don't know how to add variables there.
I need something like this:
curl -ks -X GET 'https://tufin.com/api/path?$line_source$line_destination$line_port'
but this sends the variable names literally instead of their values. This is my script:
line_source="$(cat $HOME/src)"
line_destination="$(cat $HOME/dst)"
line_port="$(cat $HOME/port)"
prepare_curl=$(echo -e "'https://tufin.com/api/path?src=$line_source&dst=$line_destination&service=$line_port'")
result=$(curl -k -s -X GET $prepare_curl )
echo -e " source $line_source destination $line_destination port $line_port traffic is $result"
This is the error I get:
curl: option -m: expected a proper numerical parameter
curl: try 'curl --help' or 'curl --manual' for more information
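The usual fix is to let the shell expand the variables: build the URL inside double quotes and quote it again when handing it to curl. A minimal sketch along those lines, keeping the same endpoint and files as in the question (treat it as a starting point, not a tested call against the real API):

#!/bin/bash
# Read the values; $(< file) strips the trailing newline automatically.
line_source=$(<"$HOME/src")
line_destination=$(<"$HOME/dst")
line_port=$(<"$HOME/port")

# Build the URL in double quotes so the variables are expanded,
# then quote it again so curl receives it as a single argument.
url="https://tufin.com/api/path?src=${line_source}&dst=${line_destination}&service=${line_port}"
result=$(curl -k -s -X GET "$url")

echo "source $line_source destination $line_destination port $line_port traffic is $result"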

How to use CURL over SSH and get the file as well as the return value?

I want to load a file from a client's webserver. This webserver is only reachable locally, so to get to it I have to go through SSH. I need the content as well as the return value (e.g. SSH connection broke, webserver down).
What do I have to change? My first try:
#!/bin/bash
RETURN=0
CONTENT=""
sshpass -p xxxxxx ssh root@172.17.1.33 "curl -X POST http://127.0.0.1:10000/status -H 'Content-Type: application/json' > $CONTENT | bash; RETURN=$?"
If you want to get both the output of curl and its exit code:
#!/bin/bash
CONTENT=$(sshpass -p xxxxxx ssh root@172.17.1.33 "curl -X POST http://127.0.0.1:10000/status -H 'Content-Type: application/json'")
RETURN=$?
echo "$RETURN, $CONTENT"
In your original script, the variable assignments happen in the shell on the server you ssh'ed into, not in your local script.
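If you also want the HTTP status code next to curl's exit status, one option is to have curl append it to the output with --write-out and split it off afterwards. A sketch under the same assumptions as above (sshpass, the same host and endpoint):

#!/bin/bash
# Ask curl to print the body, then the HTTP status code on its own line.
OUTPUT=$(sshpass -p xxxxxx ssh root@172.17.1.33 \
  "curl -s -X POST http://127.0.0.1:10000/status -H 'Content-Type: application/json' -w '\n%{http_code}'")
RETURN=$?                          # exit status of ssh (and of curl on success)

HTTP_CODE=${OUTPUT##*$'\n'}        # last line: HTTP status code
CONTENT=${OUTPUT%$'\n'*}           # everything before it: response body

echo "exit=$RETURN http=$HTTP_CODE content=$CONTENT"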

Script that will print HTTP headers for multiple servers

I've created the following bash script:
#!/bin/bash
for ip in $(cat targets.txt); do
"curl -I -k https://"${ip};
"curl -I http://"${ip}
done
However, I am not receiving the expected output, which is the HTTP header responses from the IP addresses listed in targets.txt.
I'm not sure how curl can attempt both HTTP and HTTPS (80/443) within one command, so I've used two separate curl commands.
nmap might be more appropriate for the task: nmap -iL targets.txt -p T:80,443 -sV --script=banner --open
Perform a network map (nmap) of hosts from the input list (-iL targets.txt) on TCP ports 80 and 443 (-p T:80,443) with service/version detection (-sV) and use the banner grabber script (--script=banner, ref. https://nmap.org/nsedoc/scripts/banner.html). Return results for open ports (--open).
... or masscan (ref. https://github.com/robertdavidgraham/masscan): masscan $(cat targets.txt) -p 80,443 --banners
Mass scan (masscan) all targets on ports 80 and 443 (-p 80,443) and grab banners (--banners).
Remove the quotes around your curl commands. You also don't need the ; after the first curl.
#!/bin/bash
for ip in $(cat targets.txt); do
curl -I -k https://${ip}
curl -I http://${ip}
done
I added some echo statements to @John's answer to make the results of the curl executions easier to read. I also added port 8080 in case a proxy is listening there.
#!/bin/bash
for ip in $(cat $1); do
echo "> Webserver Port Scan on IP ${ip}."
echo "Attempting IP ${ip} on port 443..."
curl -I -k https://${ip}
echo "Attempting IP ${ip} on port 80..."
curl -I http://${ip}
echo "Attempting IP ${ip} on port 8080..."
curl -I http://${ip}:8080
done
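Looping over $(cat file) word-splits and glob-expands each entry, so a while read loop over the file is slightly more robust. A sketch of the same scan in that style, assuming one host per line in targets.txt, with a --max-time added so unreachable hosts don't stall the loop:

#!/bin/bash
# Read one target per line; -r keeps backslashes literal, IFS= keeps whitespace intact.
while IFS= read -r ip; do
    [ -z "$ip" ] && continue                     # skip blank lines
    echo "> Webserver Port Scan on IP ${ip}."
    curl -I -k --max-time 10 "https://${ip}"
    curl -I --max-time 10 "http://${ip}"
    curl -I --max-time 10 "http://${ip}:8080"
done < targets.txt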

How to retrieve error code from cURL on shell

I know a similar question was posted, but I can't get it to work on my machine.
I tried the first answer from the mentioned question, i.e. response=$(curl --write-out %{http_code} --silent --output /dev/null servername), and when I echo $response I get 000 (not sure if that is the desired output).
However, when trying to do so with my cURL command, I get no output.
This is my command:
curl -k --silent --ftp-pasv --ftp-ssl --user C:is_for_cookies --cert localcert_cert.pem --key certs/localcert_pkey.pem ftps://10.10.10.10:21/my_file.txt
and I use it with
x=$(curl -k --silent --ftp-pasv --ftp-ssl --user C:is_for_cookies --cert localcert_cert.pem --key certs/localcert_pkey.pem ftps://10.10.10.10:21/my_file.txt)
but when I try to echo $x all I get is a newline...
I know that cURL is failing, because when I run the same command without --silent I get: curl: (7) Couldn't connect to server
This question is tagged with both sh and bash because I've tried it in both with the same results.
I found this option which kind of helps (but I still don't know how to assign it to a variable, which should be easier than this...):
--stderr <file>
Redirect all writes to stderr to the specified file instead. If the file name is a plain '-', it is instead written to stdout.
If this option is used several times, the last one will be used.
When I use it like this:
curl -k --silent -S --stderr my_err_file --ftp-pasv --ftp-ssl --user C:is_for_cookies --cert localcert_cert.pem --key certs/localcert_pkey.pem ftps://10.10.10.10:21/my_file.txt
I can see the errors (i.e. curl: (7) Couldn't connect to server) inside that file.
I used --silent to suppress all output, -S (--show-error) to un-suppress the error messages, and --stderr <file> to redirect them to the file.
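To get both the error text and the actual error code (curl's exit status) into variables, one option is to redirect stderr to a temporary file and read it back afterwards. A minimal sketch reusing the same FTPS command as above:

#!/bin/bash
err_file=$(mktemp)

# --show-error re-enables the error messages that --silent would hide.
content=$(curl -k --silent --show-error --ftp-pasv --ftp-ssl \
    --user C:is_for_cookies \
    --cert localcert_cert.pem --key certs/localcert_pkey.pem \
    ftps://10.10.10.10:21/my_file.txt 2>"$err_file")
rc=$?                       # curl's exit status, e.g. 7 for "Couldn't connect to server"

error=$(<"$err_file")
rm -f "$err_file"

if [ "$rc" -ne 0 ]; then
    echo "curl failed with exit code $rc: $error"
else
    echo "$content"
fi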

Ldap search with negative parameter

I'm trying to do a search on my LDAP base like this:
ldapsearch -x -h localhost -p 389 -D uid=xxxadmin,ou=administrators,ou=topologymanagement,o=netscaperoot -v -w 12345 -b "ou=Usuarios,ou=Alunos,ou=XXXX,o=xxXXXxx" -f (!(objectClass=ntUser)) 1.1
Basically I want to list all the entries that do not have the objectClass ntUser, so that I can then add that objectClass to them.
I'm getting this as a response:
-bash: !: event not found
From http://www.openldap.org/lists/openldap-software/200104/msg00196.html
This message comes from the shell (bash). It states that the `!' command didn't find the event you unintentionally asked for. This happens because double quotes in bash do not prevent history expansion. Use single quotes instead.
Your search should be like this:
ldapsearch -x -h localhost -p 389 -D 'uid=xxxadmin,ou=administrators,ou=topologymanagement,o=netscaperoot' -v -w 12345 -b 'ou=Usuarios,ou=Alunos,ou=XXXX,o=xxXXXxx' -f '(!(objectClass=ntUser))' 1.1
Your search should work. But, for bash, you will need to quote the parameters.
Something like:
ldapsearch -x -h localhost -p 389 -D uid=xxxadmin,ou=administrators,ou=topologymanagement,o=netscaperoot -v -w 12345 -b "ou=Usuarios,ou=Alunos,ou=XXXX,o=xxXXXxx" -f "(!(objectClass=ntUser))" 1.1
Tested with both OpenLDAP:
@(#) $OpenLDAP: ldapsearch (Ubuntu) (Mar 17 2014 21:19:27) $ buildd@aatxe:/build/buildd/openldap-2.4.31/debian/build/clients/tools
(LDAP library: OpenLDAP 20431)
ldapsearch -x -h localhost -p 389 -D "cn=admin" -W -b "dc=example,dc=com" -s sub -a always -z 1000 "(!(objectClass=inetOrgPerson))" "objectClass"
and OpenDJ
ldapsearch --version
OpenDJ 2.7.0-20140727
Build 20140727000040Z
ldapsearch -h localhost -p 389 -D "cn=admin" -b "dc=example,dc=com" -s sub -a always -z 1000 "(!(objectClass=inetOrgPerson))" "objectClass"
-jim
It's happening because bash treats ! as a special character:
"!" Start a history substitution, except when followed by a space, tab, the end of the line, ‘=’ or ‘(’
So finally, you should be able to solve your problem by putting single quotes around the filter, as follows:
ldapsearch -x -h localhost -p 389 -D uid=xxxadmin,ou=administrators,ou=topologymanagement,o=netscaperoot -v -w 12345 -b "ou=Usuarios,ou=Alunos,ou=XXXX,o=xxXXXxx" -f '(!(objectClass=ntUser))' 1.1
Please refer to the following question on Stack Overflow:
Which characters need to be escaped in Bash? How do we know it?
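One more note: history expansion only happens in interactive shells, so the same filter inside a script would not trigger this error, and you can switch it off entirely if you prefer to keep double quotes. A sketch, following the positional-filter form used in the second answer above:

set +H      # or: set +o histexpand; disables bash history expansion in this shell
ldapsearch -x -h localhost -p 389 \
  -D "uid=xxxadmin,ou=administrators,ou=topologymanagement,o=netscaperoot" \
  -v -w 12345 \
  -b "ou=Usuarios,ou=Alunos,ou=XXXX,o=xxXXXxx" \
  "(!(objectClass=ntUser))" 1.1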
