Consider the one-liner
$ ruby -e 'puts 1 + 1'
which uses ruby as a command-line calculator. I would like to write just the expression, without puts. Is there a switch for this in the ruby command?
It is not possible with ruby command-line switches alone, but it's easily achievable with a small shell wrapper:
⮀ cat /usr/local/bin/rubyoneliner
#!/bin/sh
ruby -e "puts $*"
⮀ rubyoneliner '1 + 1'
2
or with a bash/zsh function.
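A minimal bash/zsh function along the same lines (rubycalc is just an illustrative name):

rubycalc() { ruby -e "puts $*"; }

$ rubycalc '1 + 1'
2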
There is no way to have an implicit print in Ruby. However, you can shrink your one-liner a little more by replacing puts with p and moving the require into the command line:
$ ruby -rsy -e 'p 42.kWh.in :MJ'
The -rsy switch replaces the require 'sy'.
Otherwise you can use the fact that the -p option implicitly prints the $_ variable, with something like (much longer, however):
$ printf whatever | ruby -rsy -pe '$_ = 42.kWh.in :MJ'
or, uglier:
$ printf whatever | ruby -rsy -pe 'sub("whatever", (42.kWh.in :MJ).to_s)'
My question is similar to this one: How to detect if my shell script is running through a pipe?. The difference is that the script I’m working on is written in Ruby.
Let’s say I run:
./test.rb
I expect colored text on stdout, but when I run
./test.rb | cat
I expect the color codes to be stripped out.
Use $stdout.isatty or, more idiomatically, $stdout.tty?. I created a little test.rb file to demonstrate; its contents:
puts $stdout.isatty
Results:
$ ruby test.rb
true
$ ruby test.rb | cat
false
Reference: https://ruby-doc.org/core/IO.html#method-i-isatty
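Applied to the original problem, you can branch on tty? to decide whether to emit ANSI color codes at all. A minimal sketch as a one-liner:

$ ruby -e 'puts $stdout.tty? ? "\e[32mgreen\e[0m" : "plain"'
green
$ ruby -e 'puts $stdout.tty? ? "\e[32mgreen\e[0m" : "plain"' | cat
plain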
Use IO#stat.pipe?.
IO#tty? returns true only on a TTY device; it returns false for UNIX-style pipes (see "man 2 pipe").
$ echo "something" | ruby -e 'puts $stdin.stat.pipe?'
true
$ echo "something" | ruby -e 'puts $stdin.tty?'
false
$ ruby -e 'puts $stdin.tty?'
true
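Redirecting from a regular file is yet another case: there both tty? and pipe? return false, which you can check with, for example:

$ ruby -e 'puts $stdin.stat.pipe?' < /etc/hostname
false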
I can do math like
perl -e 'print 5253413/39151' -l
But I don't quite get how to take advantage of Perl's ability to do math with my own predefined bash variables. I've tried
var1=$(some wc command that yields a number); var2=$(some wc command that yields another number)
perl -e 'print var1/var2' -l
But it doesn't work.
There are two main ways to do this.
Within the Perl code you can use the %ENV built-in hash to access environment variables that are exported from the shell:
$ export var1=5253413
$ export var2=39151
$ perl -E 'say $ENV{var1}/$ENV{var2}'
134.183366963807
You can use the shell's interpolation facility to insert the value of a shell variable into a command. This is best done by passing the values as parameters to the perl one-liner rather than interpolating them directly into the code:
$ var1=5253413
$ var2=39151
$ perl -E '($v1, $v2) = @ARGV; say $v1/$v2' $var1 $var2
134.183366963807
Two less common ways to do this make use of long-standing perl features.
The first is the core module Env, which ties process environment variables to perl variables:
sh$ export VAR1=1000
sh$ export VAR2=3
sh$ perl -MEnv -E 'say $VAR1/$VAR2' # imports all environ vars
333.333333333333
sh$ perl -MEnv=VAR1,VAR2 -E 'say $VAR1/$VAR2' # imports only VAR1, VAR2
333.333333333333
Note that the variables need to be present in the environment inherited by the perl process, for example via export VAR as above, or set explicitly for a single command (as in FOO=hello perl -MEnv -E 'say $FOO').
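For instance:

sh$ FOO=hello perl -MEnv -E 'say $FOO'
hello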
The second and rather more obscure way is to use perl's -s switch to set arbitrary variables from the command line:
sh$ VAR1=1000
sh$ VAR2=3
sh$ perl -s -E 'say $dividend/$divisor' -- -dividend=$VAR1 -divisor=$VAR2
333.333333333333
awk does something similar with its -v switch.
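For comparison, a rough awk equivalent using the same (unexported) shell variables:

sh$ awk -v dividend=$VAR1 -v divisor=$VAR2 'BEGIN { print dividend / divisor }'
333.333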
I believe the spirit of the question is to pass variables without exporting environment variables.
Besides using perl -s -e expression -- -perlvar=val, below is code that uses two other mechanisms to pass variables to perl.
a=x; b=N; c=z;
b=y perl -e '$pa='$a';' -e "\$pc=$c;" -e 'print "$pa$ENV{b}$pc\n";'
echo $a$b$c
Passing a and c works the same way; only the quoting differs. When chaining -e expressions like this, it is important to end each expression with a semicolon, because they are concatenated into a single program.
Passing b also goes through the environment, but instead of exporting the variable, the assignment is placed before the command on the same command line, so it lands directly in perl's %ENV for that one process.
The final echo command is there to emphasize that the shell's own definition of $b is unchanged.
The mechanism used for b is also the more secure one: another process cannot inspect the perl process's environment for the value, and it never appears in the command-line argument list.
I'm using uname -n as an example. I've tried other shell commands, using the full pathname to the shell command, and other delimiters, such as %x( ) and %x[ ].
$ uname -n
my-server
$ which env
/usr/bin/env
$ which ruby
/home/ubuntu/.rvm/rubies/ruby-2.2.1/bin/ruby
$ irb
2.2.1 :001 > %x{uname -n}
=> "my-server\n"
2.2.1 :002 > exit
$ cat ET.rb
#!/usr/bin/env ruby
%x{uname -n}
$ ruby ET.rb
$ ### !!!?!?!? I'm expecting "my-server"
IRB shows you the result of any Ruby statement you feed it. In a Ruby script, however, you have to print the value yourself with puts or print:
puts %x{uname -n}
or:
print %x{uname -n}
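With that change the script prints the hostname as expected:

$ cat ET.rb
#!/usr/bin/env ruby
puts %x{uname -n}
$ ruby ET.rb
my-server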
I have a ruby bin I'd like to pass information to in this fashion:
some_text | ./bin/my_ruby_bin
where some_text will be accessible by ARGV
Is this possible with ruby + shell, or am I taking the wrong approach here?
Here is a simple solution that works for my case, though there are many ways to do this:
# ./bin/my_ruby_bin
#!/usr/bin/env ruby -n
puts "hello: #{$_}"
notice the -n flag
from command line:
echo 'world' | ./bin/my_ruby_bin
# => hello: world
More on ruby -n:
ruby -h
-n assume 'while gets(); ... end' loop around your script
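If you don't want the implicit gets loop, reading standard input yourself works just as well, for example as a one-liner:

$ echo 'world' | ruby -e 'puts "hello: #{$stdin.read.chomp}"'
hello: world

This also sidesteps the multi-argument shebang line (#!/usr/bin/env ruby -n), which not every platform supports.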
I noticed today that Bash's printf has a -v option:
-v var assign the output to shell variable VAR rather than
display it on the standard output
If I invoke it like this, it works:
$ printf -v var "Hello world"
$ printf "$var"
Hello world
Coming from a pipe, it does not work:
$ grep "Hello world" test.txt | xargs printf -v var
-vprintf: warning: ignoring excess arguments, starting with `var'
$ grep "Hello world" test.txt | xargs printf -v var "%s"
-vprintf: warning: ignoring excess arguments, starting with `var'
xargs will invoke /usr/bin/printf (or wherever that binary is installed on your system). It will not invoke bash's builtin function. And only a builtin (or sourcing a script or similar) can modify the shell's environment.
Even if it could call bash's builtin, the xargs in your example runs as a separate child process, and a child process cannot modify its parent's environment anyway. So what you're trying cannot work.
A few options I see if I understand your sample correctly; sample data:
$ cat input
abc other stuff
def ignored
cba more stuff
Simple variable (a bit tricky depending on what exactly you want):
$ var=$(grep a input)
$ echo $var
abc other stuff cba more stuff
$ echo "$var"
abc other stuff
cba more stuff
With an array, if you want the individual words as array elements:
$ var=($(grep a input))
$ echo "${var[0]}"-"${var[1]}"
abc-other
Or if you want the whole lines in each array element:
$ IFS=$'\n' var=($(grep a input)) ; unset IFS
$ echo "${var[0]}"-"${var[1]}"
abc other stuff-cba more stuff
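On bash 4+ you can also avoid the IFS juggling by using readarray (a.k.a. mapfile) with process substitution:

$ readarray -t var < <(grep a input)
$ echo "${var[0]}"-"${var[1]}"
abc other stuff-cba more stuff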
There are two printfs: one is a shell builtin, which is what runs when you just type printf, and the other is a regular binary, usually /usr/bin/printf. The latter doesn't take a -v argument, hence the error message. Since printf is given as an argument to xargs here, the binary is run, not the shell builtin.
Additionally, since it's at the receiving end of a pipeline, it runs as a subprocess. Variables can only be inherited from parent to child process, not the other way around, so even if the printf binary could modify the environment, the change wouldn't be visible to the parent process. So there are two reasons why your command cannot work. But you can always do var=$(something | bash -c 'some operation using builtin printf').
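As a concrete instance of that last pattern (assuming test.txt contains a single matching line), the length check shows the padding was applied:

$ var=$(grep "Hello world" test.txt | bash -c 'printf "%-25s" "$(cat)"')
$ echo "${#var}"
25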
Mat gives an excellent explanation of what's going on and why.
If you want to iterate over the output of a command and set a variable to successive values using Bash's sprintf-style printf feature (-v), you can do it like this:
grep "Hello world" test.txt | xargs bash -c 'printf -v var "%-25s" "$#"; do_something_with_formatted "$var"' _ {} \;