Hi, I have the following simple script that reads URLs from a text file and writes each URL, together with its response code, to another text file.
#!/bin/bash
while read -r url
do
    urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "$url")
    echo "$url"
    echo "$url $urlstatus" >> urlstatus.txt
done < "$1"
As an example I am trying the following link:
www.proddigia.com/inmueble/pisos/en-venta/el-putget-i-el-farro/sarria-sant-gervasi/barcelona/6761
However, I get 0 as the response. When I check with Google I get 200. Am I missing something in the script?
Zero is not a valid HTTP response code.
If curl is unable to establish an HTTP connection to the server, or if the server (somehow) fails to deliver a well-formed HTTP response message, there will be no "http code" to return in that variable. Zero is what you would probably see in that scenario.
It could also be that the value of $url that you are using is invalid. For example, if the URL is enclosed in < and > characters, curl won't understand it. I would expect a zero in that case too.
The problem is that --silent is telling curl to throw away all of the error messages, so it can't tell you what the problem is.
I suggest that you see what you get by running the following command:
curl -o /dev/null --head "$url"
with the identical url string to the one you are currently using.
I just figured out that if you use a txt file created on Windows, it does not work as expected in Ubuntu. That was the reason I got 0: the Windows file has CRLF line endings, so every URL the script reads ends with a stray carriage return that curl chokes on. You need to create the txt file in Ubuntu and copy the links over there. Thanks for the answers anyway.
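If you would rather keep the original file, here is a minimal sketch of an alternative fix, assuming the problem really is Windows (CRLF) line endings and the input file is called urls.txt:
# Convert the file once...
tr -d '\r' < urls.txt > urls_unix.txt        # or: dos2unix urls.txt
# ...or strip the carriage return inside the loop instead:
while read -r url
do
    url=${url%$'\r'}   # drop a trailing CR if present
    urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "$url")
    echo "$url $urlstatus" >> urlstatus.txt
done < "$1"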
Related
I am calling a service via curl and in case of success I want to get the result in a variable. So I do this call:
result=$(curl -s $URL)
For the success case this works well. In case of an error I get just null in the variable.
In case of an error the service returns an HTTP status code such as 400 along with some details in the response.
How can I also evaluate the error and fetch the error response?
stderr to a file:
command 2> file
stderr to stdout (so both end up in the file):
command > file 2>&1
Try these flags with curl so errors are still reported on stderr:
--fail --silent --show-error
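Those redirections capture curl's own error messages. If you also want the error body that the service sends back together with its status code, one option is to have curl append the status code to the output with --write-out and split it off afterwards. A minimal sketch, assuming $URL is already set:
# Fetch the body and the HTTP status code in a single call;
# -w appends the status code on its own line after the body.
response=$(curl -s -w '\n%{http_code}' "$URL")
http_code=${response##*$'\n'}   # last line: the status code
body=${response%$'\n'*}         # everything before it: the response body

if [ "$http_code" -ge 400 ]; then
    echo "Request failed with $http_code: $body" >&2
else
    echo "$body"
fi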
I am using a shell script to find the TPS and error count for a single day and store the data in a txt file.
Now I want to use the same shell script to post data from that result file onto a Slack group, but sometimes the command gives Bad Request, sometimes Invalid Payload, and sometimes it works.
curl -X -POST -H --silent --data-urlencode "payload={\"text\": \"$(cat <filepath>/filename.txt)\"}" "<slack url>"
Please help
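Two things stand out in that command: -X -POST should be -X POST, and -H is given no header value, so it swallows the next argument. Intermittent Invalid Payload errors also commonly come from quotes or newlines in the file breaking the hand-built JSON. A sketch of a cleaner call, assuming a standard Slack incoming webhook and using jq to build the payload safely (the <filepath> and <slack url> placeholders are kept from the question):
# jq -Rs reads the whole file as one JSON string, so quotes and newlines
# in filename.txt can no longer break the payload.
payload=$(jq -Rs '{text: .}' "<filepath>/filename.txt")

curl --silent -X POST \
     -H 'Content-type: application/json' \
     --data "$payload" \
     "<slack url>"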
I've got a bash script I use to create an ngrok tunnel; I then use dweet.io to post the tunnel address and port.
If that's meaningless to you, don't worry: essentially I'm using wget --post-data to post a string to an address.
This bash script is auto-started with a cron job.
while true
do
#Gets the internal IP
IP="$(hostname -I)"
#Gets the external IP
EXTERNALIP="$(curl -s https://canihazip.com/s )"
echo "Dweeting IP... "
TUNNEL="$(curl -s http://localhost:4040/api/tunnels)"
echo "${TUNNEL}" > tunnel_info.json
#Gets the tunnel's address and port
TUNNEL_TCP=$(grep -Po 'tcp:\/\/[^"]+' ./tunnel_info.json )
#Pushes all this information to dweet.io
wget -q --post-data="tunnel=${TUNNEL_TCP}&internal_ip=${IP}&external_ip=${EXTERNALIP}" http://dweet.io/dweet/for/${dweet_id_tunnel}
sleep $tunnel_delay
done
This works. However, the directory I start the script from gets spammed with files named
dweet_id_tunnel.1,
dweet_id_tunnel.2,
dweet_id_tunnel.3,
...
These contain the HTTP response from the wget --post-data from dweet.io.
As this script runs regularly, it's rather annoying to have a folder filled with thousands of these responses. I'm not sure why they're even made because I added the -q argument to wget, which should suppress responses.
Any idea what I need to change to stop these files being created?
wget fetches the response and saves it to a file; that's what it does. If you don't want that, add -O /dev/null, or switch to curl which seems to be more familiar to you anyway, as well as more versatile.
The -q option turns off reporting, not downloading (i.e. progress messages etc, similar to curl -s).
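A minimal sketch of the corrected line from the script, keeping -q for quiet output and adding -O /dev/null so the response body is discarded instead of saved:
# Discard dweet.io's response instead of writing it to a numbered file.
wget -q -O /dev/null --post-data="tunnel=${TUNNEL_TCP}&internal_ip=${IP}&external_ip=${EXTERNALIP}" "http://dweet.io/dweet/for/${dweet_id_tunnel}"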
I'm using cURL to check links for uptime in a bash script like this:
curl -Lo /dev/null --silent --head --write-out '%{http_code}' $link
Where link="http://deadlink/" for example.
It returns 000 because it's a dead link, which is fine, but whenever I get a 000 response from cURL (since it could happen for many reasons), I'd like to apply some logic to work out what's actually going on. For instance, is it a connection refused, a timeout, an SSL failure, etc.?
I assume the best way would be to isolate the exit code directly from cURL and apply tests to it using if statements. That would be fine, and the closest I've got to extracting the error code from cURL by itself is this:
failState=$(curl -Ss $link; echo "error code is $?" )
echo $failState
Which nicely returns:
curl: (6) Could not resolve host: brokenlink
error code is 6
How do I get the "6" into a variable?
You can assign $? to another variable right after assigning curl's output to failState:
failState=$(curl -Ss "$link"); exitCode=$?
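From there you can branch on the exit code. A minimal sketch using a few of the documented curl exit codes (6: couldn't resolve host, 7: failed to connect, 28: timeout, 35/60: SSL problems); the messages are just illustrative:
case $exitCode in
    0)     echo "curl succeeded" ;;
    6)     echo "could not resolve host" ;;
    7)     echo "connection refused or failed to connect" ;;
    28)    echo "operation timed out" ;;
    35|60) echo "SSL/TLS problem" ;;
    *)     echo "curl failed with exit code $exitCode" ;;
esac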
I have a cron job that gets a list of prices from a website in JSON format and copies it into the right folder. It looks like this:
curl 'https://somesite.web/api/list.json' > ~/list.json.tmp && cp ~/list.json.tmp /srv/www/list.json > /dev/null
The problem is that a couple of times the website was down while the cron job was trying to get the list, and I ended up with an empty JSON file. To prevent this in the future, is there a way to make the cron job only copy the file if it's not empty (there is no cp option to do this)? Or should I create a script to do that and call the script after getting the list?
Maybe curl --fail will accomplish what you want? From the man page:
-f, --fail
(HTTP) Fail silently (no output at all) on server errors. This is mostly done to better enable scripts etc to better deal with failed attempts. In normal cases when an HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and return error 22.
This would cause curl to exit with a failure code, and thus the && in your statement would not execute the copy.
curl ... && [ -s ~/list.json.tmp ] && cp ~/list.json.tmp /srv/www/list.json
The -s test is true if the named file exists and is not empty.
(Incidentally, the > /dev/null redirection is not necessary. The cp command might print error messages to stderr, but it shouldn't print anything to stdout, which is what you're redirecting.)
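Putting the two suggestions together, a sketch of the full cron command (using the same URL and paths as the question) might look like this:
# --fail makes curl exit non-zero on HTTP errors; -s keeps cron output quiet.
# The -s file test then guards against copying an empty download.
curl --fail -s 'https://somesite.web/api/list.json' > ~/list.json.tmp \
    && [ -s ~/list.json.tmp ] && cp ~/list.json.tmp /srv/www/list.json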