Get cURL response in bash

I have a simple bash script that uploads files to an FTP. I was wondering how to get a response from curl that I can record (error or success)?
eval curl -T "${xmlFolder}"/"${xmlFile}" "${mediaFTP}"
Thanks in advance

Given the command provided, this should suffice (printf runs only if curl fails, at which point $? holds curl's exit code):
curl -T "$xmlFolder/$xmlFile" "$mediaFTP" ||
    printf '%s\n' "$?"
Or, if you want to discard curl's output:
curl -T "$xmlFolder/$xmlFile" "$mediaFTP" >/dev/null ||
    printf '%s\n' "$?"
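If you also want to record the outcome somewhere, a slightly fuller sketch (same variables as the question; upload.log is a hypothetical log file):
if curl -T "$xmlFolder/$xmlFile" "$mediaFTP"; then
    echo "upload of $xmlFile succeeded" >> upload.log
else
    # inside the else branch, $? still holds curl's exit code
    echo "upload of $xmlFile failed with exit code $?" >> upload.log
fi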

The bash variable $? holds the exit status of the previous command: 0 on success, non-zero on failure. So you could do:
curl -T "${xmlFolder}/${xmlFile}" "${mediaFTP}"   # the eval from the question is unnecessary
err=$?
if [ "$err" -ne 0 ]
then
    echo "Failed with error code $err" >&2
    exit "$err"
fi
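The numeric code also tells you why the transfer failed; man curl documents the exit codes (for example, 6 means the host could not be resolved and 7 means the connection failed). A small sketch building on the above:
curl -T "${xmlFolder}/${xmlFile}" "${mediaFTP}"
err=$?
case $err in
    0) ;;                                                # success
    6) echo "DNS lookup of the FTP host failed" >&2 ;;
    7) echo "could not connect to the FTP host" >&2 ;;
    *) echo "curl failed with exit code $err" >&2 ;;
esac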

Related

Unable to execute awk command in a function, but working directly in the shell

I want to create a bash utility function to remove duplicate lines. I am using this function:
function remove_empty_lines() {
    if ! command -v awk &> /dev/null
    then
        echo '[x] ERR: "awk" command not found'
        return
    fi
    if [[ -z "$1" ]]
    then
        echo "usage: remove_empty_lines <file-name> [--replace]"
        echo
        echo "Arguments:"
        echo -e "\t--replace\t (Optional) If not passed, the result will be redirected to stdout"
        return
    fi
    if [[ ! -f "$1" ]]
    then
        echo "[x] ERR: \"$1\" file not found"
        return
    fi
    echo $0
    local CMD="awk '!seen[$0]++' $1"
    if [[ "$2" = '--reload' ]]
    then
        CMD+=" > $1"
    fi
    echo $CMD
}
If I run the main awk command directly, it works. But when I execute the same $CMD from the function, I get this error:
$ remove_empty_lines app.js
/bin/bash
awk '!seen[/bin/bash]++' app.js
The original code is broken in several ways:
1. When used with --reload, it would truncate the output file's contents before awk could ever read them (see How can I use a file in a command and redirect output to the same file without truncating it?).
2. It never actually ran the command, and, for the reasons described in BashFAQ #50, storing a shell command in a string is inherently buggy (one can work around some of those issues with eval; BashFAQ #48 describes why doing so introduces security bugs).
3. It wrote error messages (and other "diagnostic content") to stdout instead of stderr; this means that if your function's output was redirected to a file, you could never see its errors; they'd end up jumbled into the output.
4. Error cases were handled with a return even in cases where $? would be zero; this means that return itself would report a zero/successful/truthy status, not revealing to the caller that any error had taken place.
Presumably the reason you were storing your output in CMD was to be able to perform a redirection conditionally, but that can be done in other ways. Below, we always create a file descriptor out_fd, pointing it either at stdout (when called without --reload) or at a temporary file (when called with --reload); if and only if awk succeeds, we then move the temporary file over the output file, replacing it in one atomic operation.
remove_empty_lines() {
    local out_fd rc=0 tempfile=
    command -v awk &>/dev/null || { echo '[x] ERR: "awk" command not found' >&2; return 1; }
    if [[ -z "$1" ]]; then
        printf '%b\n' >&2 \
            'usage: remove_empty_lines <file-name> [--reload]' \
            '' \
            'Arguments:' \
            '\t--reload\t(Optional) If not passed, the result will be redirected to stdout'
        return 1
    fi
    [[ -f "$1" ]] || { echo "[x] ERR: \"$1\" file not found" >&2; return 1; }
    if [ "$2" = --reload ]; then
        tempfile=$(mktemp -t "$1.XXXXXX") || return
        exec {out_fd}>"$tempfile" || { rc=$?; rm -f "$tempfile"; return "$rc"; }
    else
        exec {out_fd}>&1
    fi
    awk '!seen[$0]++' <"$1" >&$out_fd || { rc=$?; rm -f "$tempfile"; return "$rc"; }
    exec {out_fd}>&-  # close our file descriptor
    if [[ $tempfile ]]; then
        mv -- "$tempfile" "$1" || return
    fi
}
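Usage would then look like this (a sketch, assuming app.js exists):
remove_empty_lines app.js            # deduplicated lines on stdout
remove_empty_lines app.js --reload   # rewrites app.js in place, atomically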
First off, the output from your function call is not an error but rather the output of two echo commands (echo $0 and echo $CMD).
And as Charles Duffy has pointed out, at no point does the function actually run $CMD.
As for the inclusion of /bin/bash in your function's echo output ... the main problem is the reference to $0; by definition $0 is the name of the running process, which in the case of a function is the shell under which the function is being called. Consider the following when run from a bash command prompt:
$ echo $0
-bash
As you can see from your output, in your environment this expands to /bin/bash.
On a related note, the reference to $0 within double quotes causes the $0 to be evaluated, so this:
local CMD="awk '!seen[$0]++' $1"
becomes
local CMD="awk '!seen[/bin/bash]++' app.js"
I'm thinking what you want is something like:
echo $1 # the name of the file to be processed
local CMD="awk '!seen[\$0]++' $1" # escape the '$' in '$0'
becomes
local CMD="awk '!seen[$0]++' app.js"
That should fix the issues shown in your function's output; as for the other issues, you're getting a good bit of feedback in the various comments.
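A quick way to see the quoting difference at an interactive prompt (where $0 is typically -bash or bash):
$ CMD="awk '!seen[$0]++' app.js"     # double quotes: $0 expands immediately
$ echo "$CMD"
awk '!seen[-bash]++' app.js
$ CMD="awk '!seen[\$0]++' app.js"    # escaped: the string keeps a literal $0 for awk
$ echo "$CMD"
awk '!seen[$0]++' app.js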

How to use exit code as conditional in simple 2 line bash script

url="http://localhost:8080/matlib"
until $(curl "$url" --max-time 10) == 0; do stuck_pid=$(chown_pid); kill -9 $stuck_pid && "killing chmod process";done
What I'm trying to do is curl this address and, if it times out after 10 seconds, kill a stuck PID.
The part that is failing is the '== 0'; the intention was to compare curl's return code with 0, but I'm receiving the following error:
-bash: ==: command not found
This indeed is the problem:
$(curl "$url" --max-time 10) == 0
The == operator must appear inside [[ ... ]] or [ ... ] brackets. More importantly, you are not comparing curl's exit status at all: $(...) is command substitution, which runs the command and substitutes its output, so you are comparing curl's output.
You should be using just:
until curl "$url" --max-time 10; do ...; done
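Putting that back into your loop, a minimal sketch (chown_pid is your helper, assumed to print the PID of the stuck process):
url="http://localhost:8080/matlib"
until curl -s "$url" --max-time 10 >/dev/null; do
    stuck_pid=$(chown_pid)                          # assumed to print a PID
    kill -9 "$stuck_pid" && echo "killed stuck process $stuck_pid"
done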

Getting exit status of somecommand after "if ! somecommand"

I can't seem to get the exit code of a command executed in a Bash if condition:
#!/bin/bash
set -eu
if ! curl -sS --fail http://not-there; then
    echo "ERROR: curl failed with exit code of $?" >&2
fi
But $? is always zero, even though my curl exits with a non-zero status.
If I run the curl command outside of the if condition, $? is set correctly.
Am I missing something here?
In your original code, $? is returning the exit status not of curl, but of ! curl.
To preserve the original value, choose a control structure that doesn't require that inversion:
curl -sS --fail http://not-there || {
    echo "ERROR: curl failed with exit code of $?" >&2
    exit 1
}
...or something akin to:
if curl -sS --fail http://not-there; then
    : "Unexpected success"
else
    echo "ERROR: curl failed with exit status of $?" >&2
fi
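To see the inversion on its own, consider this two-line sketch (false always exits with status 1):
! false
echo "$?"   # prints 0: the status of '! false', not of 'false'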
Another way to achieve this is to collect the return code first, then perform the if test.
#!/bin/bash
set -eu
status=0
curl -sS --fail http://not-there || status=$?
if ((status)); then
    echo "ERROR: curl failed with exit code of $status" >&2
fi
I find this method especially convenient when checking several commands for failure, where you want your script or function to return an error code at the end if any of them failed; see the sketch after this answer.
Please note that in the above I use an arithmetic test, which is true (exit status 0) if the value inside is non-zero and false (non-zero) otherwise. It is shorter (and, to my taste, more readable) than something like [[ $status != 0 ]].
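For example, the several-commands pattern mentioned above might look like this (the URLs are placeholders):
status=0
curl -sS --fail http://example.com/a || status=$?
curl -sS --fail http://example.com/b || status=$?
curl -sS --fail http://example.com/c || status=$?
exit "$status"   # non-zero if any transfer failed (code of the last failure)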

Write to file if header is a 404 using Bash and Curl on Linux

I have a simple script that accepts 2 arguments, a URL and a log file location. Theoretically, it should capture the header status code from the curl command and, if it is a 404, append the URL to the log file. Any idea where it is failing?
#!/bin/bash
CMP='HTTP/1.1 404 Not Found' # This is the 404 Pattern
OPT=`curl --config /var/www/html/curl.cnf -s -D - "$1" -o /dev/null | grep 404` # Status Response
if [ $OPT = $CMP ]
then
    echo "$1" >> "$2" # Append URL to File
fi
Your test expands $OPT and $CMP unquoted, so their multi-word values are split into several arguments and [ sees a malformed expression rather than a single string comparison. Try the following simpler method, which checks the exit status of the grep command rather than comparing its output:
#!/bin/bash
CMP='HTTP/1.1 404 Not Found'
if curl -s -I "$1" | grep -q "$CMP"; then
    echo "$1" >> "$2"
fi
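For reference, curl can also report the status code directly via its --write-out option, which avoids matching the exact header line; a minimal sketch with the same arguments ($1 is the URL, $2 the log file):
#!/bin/bash
code=$(curl -s -o /dev/null -w '%{http_code}' "$1")   # prints just the numeric status code
if [ "$code" = "404" ]; then
    echo "$1" >> "$2"
fi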

Check wget's return value

I'm writing a script to download a bunch of files, and I want it to inform when a particular file doesn't exist.
r=`wget -q www.someurl.com`
if [ $r -ne 0 ]
then echo "Not there"
else echo "OK"
fi
But it gives the following error on execution:
./file: line 2: [: -ne: unary operator expected
What's wrong?
Others have correctly posted that you can use $? to get the most recent exit code:
wget_output=$(wget -q "$URL")
if [ $? -ne 0 ]; then
    ...
This lets you capture both the stdout and the exit code. If you don't actually care what it prints, you can just test it directly:
if wget -q "$URL"; then
    ...
And if you want to suppress the output:
if wget -q "$URL" > /dev/null; then
    ...
$r is the text output of wget (which you've captured with backticks). To access the return code, use the $? variable.
$r is empty, and therefore your condition becomes if [ -ne 0 ] and it seems as if -ne is used as a unary operator. Try this instead:
wget -q www.someurl.com
if [ $? -ne 0 ]
...
EDIT As Andrew explained before me, backticks return standard output, while $? returns the exit code of the last operation.
You could just do:
wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
-(~)----------------------------------------------------------(07:30 Tue Apr 27)
risk#DockMaster [2024] --> wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:30:56-- http://ruffingthewitness.com/
Resolving ruffingthewitness.com... 69.56.251.239
Connecting to ruffingthewitness.com|69.56.251.239|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.1'
[ <=> ] 14,252 72.7K/s in 0.2s
2010-04-27 07:30:58 (72.7 KB/s) - `index.html.1' saved [14252]
WE GOT IT
-(~)-----------------------------------------------------------------------------------------------------------(07:30 Tue Apr 27)
risk#DockMaster [2025] --> wget ruffingthewitness.biz && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:31:05-- http://ruffingthewitness.biz/
Resolving ruffingthewitness.biz... failed: Name or service not known.
wget: unable to resolve host address `ruffingthewitness.biz'
zsh: exit 1 wget ruffingthewitness.biz
Failure
-(~)-----------------------------------------------------------------------------------------------------------(07:31 Tue Apr 27)
risk#DockMaster [2026] -->
The best way to capture the result from wget and also check the call status:
wget -O filename URL
if [[ $? -ne 0 ]]; then
    echo "wget failed"
    exit 1
fi
This way you can check the status of wget and also store the output data: if the call succeeds, use the stored output; otherwise the script exits with the error "wget failed".
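The same check reads a little more directly when the command itself is the if condition; a sketch:
if ! wget -O filename URL; then
    echo "wget failed" >&2
    exit 1
fi
# on success, the downloaded data is in 'filename'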
I had been trying all of these solutions without luck.
wget runs non-interactively, and in my case I could not catch the return code with $?.
One solution is to use the --server-response option and search for the HTTP 200 status code.
Example:
wget --server-response -q -o wgetOut http://www.someurl.com
sleep 5
_wgetHttpCode=$(gawk '/HTTP/{ print $2 }' wgetOut)
if [ "$_wgetHttpCode" != "200" ]; then
    echo "[Error] $(cat wgetOut)"
fi
Note: wget needs some time to finish its work, which is why I added the "sleep 5". This is not the best way to do it, but it worked well enough to test the solution.
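For reference, wget distinguishes this case in its own exit status: per the wget manual, status 8 means the server issued an error response (such as a 404). A sketch that avoids parsing the log file:
wget -q http://www.someurl.com
rc=$?
if [ "$rc" -eq 8 ]; then
    echo "[Error] server returned an HTTP error response (e.g. 404)" >&2
fi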
