how to suppress/remove ALL output from a cURL command (ruby) - ruby

I have:
info = `curl -s http://www.{insertwebsitehere}.com`
followed by some more arguments, and I want to suppress everything that gets printed to the terminal. I know that -s got rid of the progress bar, but I can't figure out how to completely remove ALL output of the cURL command.
This command gets run many times during the execution of the program, and the output it produces messes everything up.
From what I understand there is no argument which removes it all and I have to redirect it; I just can't figure out how.
I've tried adding > /dev/null/ to the end of it, but it gave me errors.
Any ideas?
Thanks

You can suppress stderr by adding 2>/dev/null:
info = `curl -s http://www.{insertwebsitehere}.com 2>/dev/null`
but a better way is to use an HTTP library and fetch the page without starting an additional process in a shell. For example, there's curb - libcurl bindings for Ruby.
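If you do keep shelling out, Ruby's standard-library Open3 module (an alternative I'm suggesting here, not part of the original answer) captures stdout and stderr as separate strings, so nothing from the child process reaches the terminal at all. A ruby one-liner stands in for curl below so the example runs offline:

```ruby
require 'open3'

# capture3 returns the child's stdout, stderr and exit status separately;
# neither stream is printed to the terminal.
stdout, stderr, status = Open3.capture3('ruby', '-e', 'puts "payload"; warn "progress noise"')

stdout # => "payload\n"        (the data you want)
stderr # => "progress noise\n" (captured, never printed)
```

With curl you would pass the real command line instead of the ruby stand-in, and simply ignore the stderr string.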

Remove the trailing / from /dev/null:
info = `curl -s http://www.{insertwebsitehere}.com > /dev/null`
But this will set info to an empty string, which is probably not what you want.
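The two redirections behave quite differently inside backticks; a ruby one-liner stands in for curl below so the difference can be seen offline:

```ruby
# > /dev/null discards stdout, so the backticks capture nothing:
empty = `ruby -e 'puts "payload"' > /dev/null`

# 2> /dev/null discards only stderr, so the payload is still captured:
kept = `ruby -e 'puts "payload"; warn "noise"' 2> /dev/null`

empty # => ""
kept  # => "payload\n"
```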

Related

Curl command in Bash script returns no URL specified

Trying to run a curl command to a test page results in "no URL specified!" when attempting to run it via a bash script. This is a test that runs through a proxy (1.2.3.4 in the code example).
The command runs as expected from the command line. I've tried a few variations of quotes around the components of the command, but still get the same "no URL specified" result.
echo $(curl -x http://1.2.3.4:80 https://example.company.com)
The expected result is an HTML file, however instead the above issue is found only when run from this small bash script file.
The command itself runs just fine as "curl -x http://1.2.3.4:80 https://example.company.com" from the command line.
Appreciate in advance any help you can provide!
I have literally edited it down to the following. Everything else is commented out. Still the same error.
#!/bin/bash
curl -x http://1.2.3.4:80 https://example.company.com
In your example you want to use double quotes around the subshell and single quotes for the curl parameters.
Make sure to set the correct shebang (#!/bin/bash).
You could also try to run it as:
cat <(curl -x http://1.2.3.4:80 https://example.company.com)
However, I am not able to reproduce your example.

Why does curl send report on speed and time as error output

I'm using the following command in a script:
curl -O --time-cond $_input_file_name $_location/$_input_file_name
and it produces a report with this heading:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
but it seems to be sent to error output, even though the transfer has been successful and the return code from curl is zero. Why does it do this? Is there a way to suppress this without suppressing actual error messages? Adding -s or -sS to the curl command doesn't seem to alter this behaviour.
Running the command in a terminal, the -s option does suppress the output. The problem arises only within a script. The script is being triggered in crontab via cronic.
I'm working in Debian 9.1 with curl 7.52.1 (x86_64-pc-linux-gnu).
Curl was designed, at least originally, to send its output to stdout by default (see here), something a large number of other Unix utilities also do.
Some programs will allow you to write their output to stdout by specifying - as an output file name but this is not the way curl went.
All the progress messages are therefore sent to stderr so that they don't corrupt the actual stream of data coming out on stdout.
If you examine the man page, you should see that the --silent --show-error options should disable the progress stuff while still showing an error.
Use "-s -S"
-S, --show-error
When used with -s, --silent, it makes curl show an error message if it fails.
-s, --silent
Silent or quiet mode. Don't show progress meter or error messages. Makes Curl mute. It will still output the data you ask for, potentially even to the terminal/stdout unless you redirect it.
Use -S, --show-error in addition to this option to disable progress meter but still show error messages.
See also -v, --verbose and --stderr.
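The stdout/stderr split is easy to observe without curl or a network; any command that writes its meter to stderr behaves the same way (the child process below is a stand-in for curl, not curl itself):

```ruby
# simulate curl: payload on stdout, a progress-meter-like line on stderr
child = %q{ruby -e '$stdout.print "<html></html>"; $stderr.print "% Total % Received"'}

# keep the payload, divert the meter to a log file
data  = `#{child} 2> meter.log`
meter = File.read('meter.log')

data  # => "<html></html>"
meter # => "% Total % Received"
```

Replacing `2> meter.log` with `2> /dev/null` discards the meter entirely, which is effectively what -s does inside curl itself.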

Ubuntu command to copy file only if origin is not empty

I have a cronjob getting a list of prices from a website in JSON format and copying it into the right folder and it looks like this:
curl 'https://somesite.web/api/list.json' > ~/list.json.tmp && cp ~/list.json.tmp /srv/www/list.json > /dev/null
The problem is that a couple of times the website was down while the cron job was trying to get the list, and it got an empty JSON file. To prevent this in the future, is there a way to make the cron job copy the file only if it's not empty (there's no cp option to do this)? Or should I create a script to do that and call it after getting the list?
Maybe curl --fail will accomplish what you want? From the man page:
-f, --fail
(HTTP) Fail silently (no output at all) on server errors. This is mostly done to enable scripts etc to better deal with failed attempts. In normal cases when an HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and return error 22.
This would cause curl to exit with a failure code, and thus the && in your statement would not execute the copy.
curl ... && [ -s ~/list.json.tmp ] && cp ~/list.json.tmp /srv/www/list.json
The -s test is true if the named file exists and is not empty.
(Incidentally, the > /dev/null redirection is not necessary. The cp command might print error messages to stderr, but it shouldn't print anything to stdout, which is what you're redirecting.)
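If the job ever moves from a one-line crontab entry into a Ruby script, the same non-empty check can be expressed with File.size?, which returns nil for a missing or empty file (the helper name below is mine, not from the answer):

```ruby
require 'fileutils'

# copy src to dest only when src exists and is non-empty;
# File.size? returns nil for a missing or empty file, like `[ -s file ]`
def copy_if_nonempty(src, dest)
  FileUtils.cp(src, dest) if File.size?(src)
end
```

Usage would mirror the cron line, e.g. `copy_if_nonempty(File.expand_path('~/list.json.tmp'), '/srv/www/list.json')`.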

bash how to allow interactive session when running through a pipe

I'm creating a script that I want people to run with
curl -sSL install.domain.com | bash
As RVM, oh-my-zsh and many others does,
However, I'm having issues because my script is interactive (it uses read and select), and the user is not being prompted; the script just skips those steps because it's being executed through a |.
I've tried wrapping my whole script in { }.
I was also thinking of having the script download itself again into a tmp folder and execute from there, but I'm not sure that will work.
You can explicitly tell read to read from the terminal using
read var < /dev/tty
The solution I found is to ask the user to run it like:
bash <( curl -sSL install.domain.com )
This way the script is passed as an argument and standard input remains untouched.
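The difference is that when the script body arrives as an argument (or via process substitution), stdin stays free for read; when it arrives through a pipe, stdin is the script text itself. A self-contained simulation (bash -c stands in for a script passed as an argument, and the user's answer is fed programmatically):

```ruby
# the script body is an argument, so stdin is still available for `read`
out = IO.popen(['bash', '-c', 'read line; echo "got: $line"'], 'r+') do |io|
  io.puts 'hello'   # plays the role of the interactive user's input
  io.close_write
  io.read
end

out # => "got: hello\n"
```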
This is a non-problem, as users should not be executing code directly from the stream. The user should be downloading the code to a file first, then verifying that the script they receive has, for example, an MD5 hash that matches the hash you provide on your website. Only after confirming that the script they receive is the script you sent should they execute it.
$ curl -sSL install.domain.com > installer.sh
$ md5sum installer.sh # Does the output match the hash posted on install.domain.com?
$ bash installer.sh
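The same verification can be scripted; Ruby's stdlib Digest computes the checksum without the md5sum binary (SHA-256 is used below instead of MD5 as an assumed stronger choice, and the file contents are a stand-in for a real installer):

```ruby
require 'digest'

published = Digest::SHA256.hexdigest("echo hello\n")  # the hash posted on the site
File.write('installer.sh', "echo hello\n")            # the downloaded script
actual = Digest::SHA256.file('installer.sh').hexdigest

actual == published # => true, only then run `bash installer.sh`
```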

How can I get a JSON object out of my ruby application into the terminal?

I have a ruby program that I invoke via terminal. When I invoke a particular command, I want to get my JSON response object as a string in the terminal. How can I do this? I've tried storing the output (printed with say_ok) of the command in a bash variable, but have had no luck. I'm not sure what the correct approach is here, so any help is appreciated.
Ruby
say_ok response.body
Bash
JSONOBJECT=$(myrubyapp postandgetresponse)
echo "${JSONOBJECT}"
Edit:
Returning the object instead of using say_ok seems to have worked. I will update once verified.
If you want to capture script output from stdout to bash variable, you can do it like this:
% CAPTURE=$(ruby -e "puts 'hello'" | tee /dev/tty)
hello
% echo $CAPTURE
hello
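On the Ruby side, what makes the $(...) capture work is simply writing the JSON to stdout with a plain puts (the response hash below is a hypothetical stand-in for response.body):

```ruby
require 'json'

response = { 'symbol' => 'XYZ', 'price' => 42 }  # stand-in for the app's response
STDOUT.puts JSON.generate(response)              # goes to stdout, so $(...) captures it
```

Anything printed to stderr (or via a helper that writes elsewhere) will not be picked up by command substitution.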
