Pipe symbol before command in "open" - ruby

I stumbled over the following line of code
open("|cd lib && /opt/jruby/bin/jruby jasper_pdf.rb") { |input| open("log/jasper_pdf.log", "w") { |f| f.write(input.read) } }
What is the pipe symbol before the cd command for?

The Ruby documentation for Kernel#open says:
If path starts with a pipe character ("|"), a subprocess is created,
connected to the caller by a pair of pipes. The returned IO object may
be used to write to the standard input and read from the standard
output of this subprocess.
In your case it is used to log the output of the process spawned by the command /opt/jruby/bin/jruby jasper_pdf.rb to the file log/jasper_pdf.log.
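A minimal sketch of that behaviour (the command here is illustrative):

```ruby
# Kernel#open with a leading "|" spawns a subprocess via the shell;
# the IO object given to the block reads the subprocess's stdout.
output = open("|echo hello") { |io| io.read }
puts output  # => hello
```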
It is roughly equivalent to using the Open3 module like this:
require 'open3'

Open3.popen2e('cd lib && /opt/jruby/bin/jruby jasper_pdf.rb') do |_, output, _|
  open('log/jasper_pdf.log', 'w') do |f|
    f.write(output.read)
  end
end

Related

Ruby - Using Kernel.exec with custom STDOUT

I'm trying to exec a shell process such that its standard output is prefixed with an identifier.
My approach is to write a custom IO object that re-implements write, passing it as the :out argument to exec (documented under Process::spawn).
require "delegate"

class PrefixedStdout < DelegateClass(IO)
  def initialize(prefix, io)
    @prefix = prefix
    super(io)
  end

  def write(str)
    super("#{@prefix}: #{str}")
  end
end

pr_stdout = PrefixedStdout.new("my_prefix", $stdout)
pr_stdout.write("hello\n") # outputs "my_prefix: hello"
exec("echo hello", out: pr_stdout) # outputs "hello"
Somehow exec is bypassing PrefixedStdout#write and calling $stdout.write directly. How do I force exec to use my prefixed output stream as its stdout?
What gets preserved in the other process is the underlying file descriptor (or rather, the descriptors are hooked up under the hood), so, as I commented, I don't think you'll ever get writes to that descriptor funnelled through your write method - exec replaces the running process with a new one.
A possible approach is to create a pipe, pass one end to your child process, and then read from the other end, inserting prefixes as needed.
For example, you might do:
IO.pipe do |read_pipe, write_pipe|
  fork do
    exec("echo hello", out: write_pipe)
  end
  write_pipe.close
  while line = read_pipe.gets
    puts "prefix: #{line}"
  end
end
You might also be interested in IO.popen which wraps some of this up.
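For instance, a rough equivalent of the pipe-and-fork snippet above with IO.popen (command and prefix are illustrative):

```ruby
# IO.popen forks and wires up the pipe for us; the block receives an
# IO reading the child's stdout, which we prefix line by line.
lines = IO.popen("echo hello") do |io|
  io.map { |line| "my_prefix: #{line}" }
end
puts lines  # => my_prefix: hello
```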
Somehow exec is bypassing PrefixedStdout#write and calling $stdout.write
Take a look at this example:
class MyIO < IO
  def initialize(fd)
    super
  end

  def write(str)
    STDOUT.puts 'write called'
    super
  end
end

fd = IO.sysopen("data.txt", "w")
io = MyIO.new(fd)
io.write "goodbye\n"

puts '---now with exec()...'
exec("echo hello", :out => io)
--output:--
write called
---now with exec()...
Now, what do you think is in the file data.txt?
spoiler:
$ cat data.txt
hello
So passing an IO object to exec() 'works', but not the way you expected: exec() never calls io.write() to write the output of the child process to io. Instead, I assume exec() obtains the file descriptor for io, then passes it to some C code, which does some system level redirection of the output from the child process to the file data.txt.
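A small sketch of that point, assuming a POSIX system (the file name is illustrative): the redirection behaves the same with a plain IO, because only the descriptor is handed to the child.

```ruby
# Only the file descriptor matters to spawn/exec; no Ruby-level
# write method is ever consulted for the child's output.
fd = IO.sysopen("data2.txt", "w")
io = IO.new(fd)
pid = Process.spawn("echo hello", out: io)  # same redirection exec performs
Process.wait(pid)
io.close
puts File.read("data2.txt")  # => hello
```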
Do you have to use exec()? If not:
prefix = "prefix: "
cmd = 'echo hello'
output = `#{cmd}`
puts "#{prefix}#{output}"
--output:--
prefix: hello

Ruby: Printing system output in real time?

I have a ruby rake task that calls a bash script via:
Open3.popen('/path/file_converter.sh', file_list, output_format)
That bash script outputs logs to the command line as it processes (which takes from 30 secs to 5 hours)
When I call the rake task, the output from bash is returned to the command line, but only as one large message after the entire script has run. Does anyone know of a way to pipe command-line output directly to Ruby output as it occurs?
According to the documentation you should be able to use the output stream given in the block:
Open3.popen3('/path/file_converter.sh', file_list, output_format) do |_, out, _, _|
  out.each_line do |line|
    puts line
  end
end
Put the output into a file and run the process in the background in a new thread. You can then parse the file while it is being written.
class FileConverter
  def initialize
    @output_file = '/tmp/something.txt'
    output_format = 'foo'
    file_list = 'bar foo something'
    @child = Thread.new do
      `/path/file_converter.sh #{file_list} #{output_format} > #{@output_file} 2>&1`
    end
  end

  def data
    File.readlines(@output_file)
  end

  def parse
    while @child.alive?
      # parse data # TODO: need to implement real parsing
      sleep 0.5
    end
  end
end
fc = FileConverter.new
fc.parse

Can't pipe into gvim from a ruby sub process

I've been attempting to use the PA_translator.vim plugin but found that it doesn't work in Win32 Gvim. This appears to be because in an embedded Vim Ruby script it's not possible to use any of the commands which pipe in from a subprocess. The original process builds a command dynamically and then launches a subprocess to obtain a JSON snippet like so:
cmd = "require 'net/http'; p Net::HTTP.get('%s', '%s')"% [host, path]
response = `ruby -e "#{cmd}"`
If I run this in a command-line Ruby script it works fine, but inside a Vim script the pipe appears to return an empty string.
I've also tried several other methods which produce the same result:
response = ''
IO.popen("ruby.exe", "w+") do |f|
  f.write cmd
  f.close_write
  response = f.read
  p response
end
And even:
def redirect
  orig_defout = $stdout
  $stdout = StringIO.new
  yield
  $stdout.string
ensure
  $stdout = orig_defout
end

response = redirect { eval cmd }
All of these seem to fail for the same reason: it's not possible to get the output from the pipe, and I get back an empty string. GVim is a true Win32 process; is there some reason why piping from a subprocess won't work?
EDIT: If I try to capture piped output from embedded vim/perl, that works fine, so I guess it's some particular issue with the vim -> win32 -> ruby combination:
fun! SayHello()
perl << EOF
$bob = `ls`;
VIM::Msg($bob);
EOF
endfun

How to replace STDIN, STDOUT, STDERR in ruby19

In ruby18 I sometimes did the following to get a subprocess with full control:
stdin, @stdin = IO.pipe
@stdout, stdout = IO.pipe
@stderr, stderr = IO.pipe
@pid = fork do
  @stdin.close
  STDIN.close
  stdin.dup
  @stdout.close
  STDOUT.close
  stdout.dup
  @stderr.close
  STDERR.close
  stderr.dup
  exec(...)
end
This does not work in Ruby 1.9: the close method for STDIN, STDOUT, and STDERR no longer closes the underlying file descriptor. How do I do this in Ruby 1.9?
Check out Process.spawn, Open3, and the childprocess gem.
I can't tell exactly what you're trying to do there, but you can take control of a child process's IO in many ways.
Using Unix pipes:
readme, writeme = IO.pipe
pid = fork {
  $stdout.reopen writeme
  readme.close
  exec(...)
}
Juggling the IOs with Process.spawn:
pid = spawn(command, :err=>:out)
Or wrapping the process in Open3's popen3:
require 'open3'
include Open3

popen3(RUBY, '-r', THIS_FILE, '-e', 'hello("Open3", true)') do |stdin, stdout, stderr|
  stdin.write("hello from parent")
  stdin.close_write
  stdout.read.split("\n").each do |line|
    puts "[parent] stdout: #{line}"
  end
  stderr.read.split("\n").each do |line|
    puts "[parent] stderr: #{line}"
  end
end
You might also consider Jesse Storimer's Working With Unix Processes. It has a lot of information and his writing style is very easy to read and understand. The book doubles as a reference guide that is somehow more useful than a lot of the actual documentation.
references:
http://pleac.sourceforge.net/pleac_ruby/processmanagementetc.html
http://rubydoc.info/stdlib/core/1.9.3/Process.spawn
http://devver.wordpress.com/2009/10/12/ruby-subprocesses-part_3/
This post shows one way to temporarily replace stdin in Ruby:
begin
  save_stdin = $stdin.dup        # a dup by any other name
  $stdin.reopen('/dev/null')     # dup2, essentially
  # do stuff
ensure
  $stdin.reopen(save_stdin)      # restore original $stdin
  save_stdin.close               # and dispose of the copy
end
Since this question is one of the top google hits for “ruby replace stdin,” I hope this will help others looking for how to do that.

How do I get the STDOUT of a ruby system() call while it is being run?

Similar to Getting output of system() calls in Ruby, I am running a system command, but in this case I need to output the STDOUT from the command as it runs.
As in the linked question, the answer is again not to use system at all, as system does not support this.
However, this time the solution isn't to use backticks but IO.popen, which returns an IO object that you can use to read the output as it is being generated.
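A minimal sketch of that pattern (the command is illustrative; a real long-running converter would go in its place):

```ruby
# IO.popen yields an IO connected to the child's stdout; each_line
# returns lines as the child emits them, not in one batch at exit.
lines = []
IO.popen("echo one; echo two") do |io|
  io.each_line do |line|
    lines << line  # handled as soon as the child produces it
    puts line
  end
end
```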
In case someone wants to read both stdout and stderr:
It is important to read them in parallel, not first one and then the other, because programs may write to stdout and stderr alternately or even in parallel. So you need threads. This fact isn't even Ruby-specific.
Stolen from here.
require 'open3'

cmd = './packer_mock.sh'
data = {:out => [], :err => []}

# see: http://stackoverflow.com/a/1162850/83386
Open3.popen3(cmd) do |stdin, stdout, stderr, thread|
  # read each stream from a new thread
  { :out => stdout, :err => stderr }.each do |key, stream|
    Thread.new do
      until (raw_line = stream.gets).nil? do
        parsed_line = Hash[:timestamp => Time.now, :line => "#{raw_line}"]
        # append new lines
        data[key].push parsed_line
        puts "#{key}: #{parsed_line}"
      end
    end
  end

  thread.join # don't exit until the external process is done
end
Here is my solution:
require 'open3'

def io2stream(shell, &block)
  Open3.popen3(shell) do |_, stdout, stderr|
    while line = stdout.gets
      block.call(line)
    end
    while line = stderr.gets
      block.call(line)
    end
  end
end

io2stream("ls -la", &lambda { |str| puts str })
With the following you can capture the stdout of a system command:
output = capture(:stdout) do
  system("pwd") # your system command goes here
end
puts output
Shortened version:
output = capture(:stdout) { system("pwd") }
Similarly, we can also capture standard error with :stderr.
The capture method is provided by active_support/core_ext/kernel/reporting.rb.
Looking at that library's code comments, capture is going to be deprecated, so it is unclear what the currently supported replacement is.
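If capture does go away, a hand-rolled stand-in is easy to sketch (the method name here is my own). Note that this StringIO version only catches Ruby-level writes; unlike a tempfile-plus-reopen approach, it will miss output written directly by system() subprocesses, which goes to the real file descriptor.

```ruby
require "stringio"

# Swap $stdout for a StringIO, run the block, return what was written.
def capture_stdout
  old, $stdout = $stdout, StringIO.new
  yield
  $stdout.string
ensure
  $stdout = old
end

puts capture_stdout { puts "hi" }  # => hi
```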