I want to have a method defined on Object that takes a block and passes the receiver to the block. An implementation would look like the following:
class Object
def apply(&pr); pr.call(self); end
end
2.apply{|x| x * 3} # => 6
Is there already a standard way to do this, or a well-known library that has a method with a similar use? If so, I don't want to reinvent the wheel.
It happens to me very often that I have a method taking an optional block: when there is no block, I want to return some return_value calculated within the method, but when there is a block, I want to return the result of applying the block to that return_value. For now, I have bunches of lines like:
def method ..., &pr
...
pr ? pr.call(return_value) : return_value
end
but I want to consistently write
def method ..., &pr
...
pr ? return_value.apply(&pr) : return_value
end
or even better, with a slightly modified definition of apply,
def method ..., &pr
...
return_value.apply(&pr)
end
I guess Object.tap is what you are looking for:
"Abc".tap do |str|
puts str
end
Is that not identical to def apply; yield self; end? – steenslag
@steenslag Yes. It is. I want to have that effect with self as the receiver. – sawa
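For readers on newer Rubies: Object#yield_self (added in Ruby 2.5) and its alias Object#then (Ruby 2.6) provide exactly this "pass the receiver to the block and return the block's value" behaviour:
2.then { |x| x * 3 }       # => 6
2.yield_self { |x| x * 3 } # => 6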
Is this what you mean?
2.instance_eval { * 3 }
# => 6
Unfortunately, that doesn't work. instance_eval simply runs code as if the receiver was self. Operators don't presume self as the receiver, so you'd actually have to write this:
2.instance_eval { self * 3 }
# => 6
However, as a proof of concept, this is possible:
Numeric.send(:define_method, :plus) { |x| self + x }
2.instance_eval { plus 3 }
# => 5
(After reading OP's edit) AFAIK the canonical way to write this is:
def purpose(*args) #no &bl or &pr
res = 42 #huge calculation
return res unless block_given?
yield res
end
p purpose(1,2)
purpose{|n| puts "from the block: #{n}"}
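The same pattern can also be collapsed into a one-liner, closer to the ternary style from the question; a sketch:
def purpose(*args)
  res = 42 # huge calculation
  block_given? ? yield(res) : res
end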
Related
I would like to define the [] method on a class of my own creation to take a block. I have done so as follows.
class A
def self.[](*args, &block)
puts "I am calling #{block} on #{args}."
block.(*args)
end
end
I can invoke this as follows.
# Explicit method invocation
A.[](1) { |x| puts x }
# With a procedure argument
arg = proc { |x| puts x }
A[2, &arg]
However, what I would like to be able to do is this.
A[3] { |x| puts x }
Which unfortunately seems to produce a syntax error. Is there a block syntax for the bracket method, or am I stuck with the first two ways of invoking it? In fact, more generally, which Ruby method names will allow blocks in their invocation, as it seems that there might be a limitation on when this is allowed?
There's not much you can do against a syntax error, so you'll have to change the syntax.
If you accept:
to define (i.e. pollute) an uppercase method inside Kernel (similar to Kernel#Array)
to use parens instead of brackets
You could write:
class A
def self.call_block_with_args(*args, &block)
puts "I am calling #{block} on #{args}."
block.call(*args)
end
end
module Kernel
def A(*args, &block)
A.call_block_with_args(*args, &block)
end
end
It works this way:
A(3) { |x| puts x }
#=>
# I am calling #<Proc:0x000000012b9c50@block_brackets.rb:14> on [3].
# 3
It's not clean, but it's probably the closest you can be to A[3] { |x| puts x }.
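Another compromise, if an explicit dot is acceptable, is the .() shorthand for call, which does accept a block. A sketch reusing the call_block_with_args method defined above:
class A
  def self.call(*args, &block)
    call_block_with_args(*args, &block)
  end
end
A.call(3) { |x| puts x }
A.(3) { |x| puts x }  # same thing via the .() sugar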
Blocks work with normal method calls only.
Ruby has plenty of operators; listing all of them here would be exhausting, as there are more than two dozen. Even `a` (backticks), !a and -a are method calls in Ruby. And obviously there are limitations on all these operators, e.g. + must take exactly one parameter, et cetera.
Fun fact: loop is a method call too.
I'm writing a simple method that adds num to the return value of the block that is passed to it and I noticed that &block and &prc both work. I know that a proc is an object and can be assigned to a variable which could be handy. Is that the only difference though? Is there any difference between these two when it comes to performance, convention, or versatility? Is it ever better to use &block instead of &prc?
def adder(num = 1, &block)
yield + num
end
vs.
def adder(num = 1, &prc)
yield + num
end
Is there any difference between these two when it comes to
performance, convention, or versatility?
There is no difference between these; you can name it whatever you want, it's just a name. Some devs call it &blk, some &block or &b or &foo ...
>> def foo &foo
>> yield
>> end
=> :foo
>> foo do
?> puts '1'
>> end
1
Strictly speaking, & is an operator which you can apply to any object, and it will take care of converting that object to a Proc by calling to_proc.
>> def bar(&some_proc)
>> some_proc
>> end
=> :bar
>> p = bar { puts 'Call proc' }
=> #<Proc:0x005601e6d69c80@(irb):4>
>> p.call
Call proc
=> nil
>> p.class
=> Proc
Only one thing is important: the name should be informative.
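As a side note, that same to_proc conversion is what powers the familiar symbol shorthand, for example:
[1, 2, 3].map(&:to_s) # & calls Symbol#to_proc  => ["1", "2", "3"]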
Like any argument to your method, the name is largely subjective. Typically you'll see &block used, if only by convention, but the name itself can be anything you want so long as it's a valid variable name.
In your example you're declaring a block name but not actually using the name. Keep in mind that any Ruby method can be given a block, there's no way to restrict this, but it's up to the method itself to use the block if it wants. That block can be called zero or more times either immediately or at some point in the future. Giving the block to the method surrenders control, so be sure to read the documentation on any given method carefully. There can be surprises.
If you need to chain through a block, declare it with a name:
def passes_through(&block)
[ 1, 2, 3, 4 ].each(&block)
end
If you are going to yield on the block there's no need here:
def direct_call
[ 1, 2, 3, 4 ].each do |n|
yield n
end
end
If you're going to preserve the call and use it later, that's also a case for naming it:
def preserved_call(&block)
@callback = block
end
def make_callback
@callback and @callback.call
end
Any method can check if a block was supplied:
def tests_for_block
if (block_given?)
yield 'value'
else
'value'
end
end
There's a small but measurable cost to capturing a block by declaring it in the method signature; a fair amount of work has to be done to properly capture all the variables that might be used in a closure. In performance-sensitive code you'll want to avoid this.
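If you want to see that cost yourself, here is a rough sketch using the standard Benchmark library (absolute numbers will vary by Ruby version and machine):
require 'benchmark'

# declares and materializes the block as a Proc object
def capturing(&block)
  yield
end

# uses only the implicit block
def yielding
  yield
end

n = 1_000_000
Benchmark.bm(10) do |x|
  x.report('capturing') { n.times { capturing { :noop } } }
  x.report('yielding')  { n.times { yielding  { :noop } } }
end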
You can dynamically create a block:
def captures_conditionally
if (block_given?)
@callback = Proc.new
end
end
The Proc.new method will assume control over whatever block has been supplied to the method if one has been.
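One caveat worth adding: capturing the implicit block with a bare Proc.new is deprecated on recent Rubies (it warns on 2.7 and, as far as I know, raises an ArgumentError on 3.0), so on those versions the conditional capture has to fall back to a named parameter, for example:
def captures_conditionally(&callback)
  @callback = callback if block_given?
end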
In your example there is no difference between &block and &prc, because in each case you are just passing a block to be called in the method.
Blocks and procs are similar in that they are both chunks of code.
[1,2,3].each {|x| puts x }
everything within the {} is the block.
A proc is just a block of code that you can name and call at a later time.
put_element = Proc.new {|x| puts x}
Then you use put_element as an argument to your method.
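For example, passing it where a literal block would otherwise go:
put_element = Proc.new { |x| puts x }
[1, 2, 3].each(&put_element) # prints 1, 2, 3, same as the literal block above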
I have this code:
l = lambda { a }
def some_function
a = 1
end
I just want the lambda to access a from a particular scope where a is already defined, for example inside some_function above, or defined a little later in the same scope, like this:
l = lambda { a }
a = 1
l.call
Then I found that when l is called, it still uses its own binding, not the binding of the place where it is called.
And then I tried to use it as:
l.instance_eval do
a = 1
call
end
But this also failed, in a way I find strange and can't explain.
I know one solution is to use eval, where I could specify a binding and execute code given as a string, but I really do not want to do that.
I also know I could use a global variable or an instance variable. However, my code sits in a fairly deeply embedded environment, so I don't want to disturb the already completed parts unless it is really necessary.
I have looked at the Proc class in the documentation and found a method named binding that returns the Proc's context. But it only provides a way to access the binding, not to change it, except through Binding#eval, which again evaluates a string, exactly what I don't want to do.
Now the question is: is there a better (or more elegant) way to implement this, or is using eval already the accepted approach?
Edit to reply to @Andrew:
Okay, this is a problem I ran into while writing a lexical parser. I defined an array with a fixed number of items, including at least a Proc and a regular expression. My goal is to match the regular expressions and execute the Procs under my special scope, where the Procs involve some local variables that are defined later. That is how I ran into the problem above.
Actually I don't think it is entirely the same as that question, as mine is about how to pass a binding into a Proc rather than how to pass one out.
@Niklas:
Got your answer; I think that is exactly what I want. It has solved my problem perfectly.
You can try the following hack:
class Proc
def call_with_vars(vars, *args)
Struct.new(*vars.keys).new(*vars.values).instance_exec(*args, &self)
end
end
To be used like this:
irb(main):001:0* lambda { foo }.call_with_vars(:foo => 3)
=> 3
irb(main):002:0> lambda { |a| foo + a }.call_with_vars({:foo => 3}, 1)
=> 4
This is not a very general solution, though. It would be better if we could give it a Binding instance instead of a Hash and do the following:
l = lambda { |a| foo + a }
foo = 3
l.call_with_binding(binding, 1) # => 4
Using the following, more complex hack, this exact behaviour can be achieved:
class LookupStack
def initialize(bindings = [])
@bindings = bindings
end
def method_missing(m, *args)
@bindings.reverse_each do |bind|
begin
method = eval("method(%s)" % m.inspect, bind)
rescue NameError
else
return method.call(*args)
end
begin
value = eval(m.to_s, bind)
return value
rescue NameError
end
end
raise NoMethodError
end
def push_binding(bind)
@bindings.push bind
end
def push_instance(obj)
@bindings.push obj.instance_eval { binding }
end
def push_hash(vars)
push_instance Struct.new(*vars.keys).new(*vars.values)
end
def run_proc(p, *args)
instance_exec(*args, &p)
end
end
class Proc
def call_with_binding(bind, *args)
LookupStack.new([bind]).run_proc(self, *args)
end
end
Basically we define ourselves a manual name lookup stack and instance_exec our proc against it. This is a very flexible mechanism. It not only enables the implementation of call_with_binding, it can also be used to build up much more complex lookup chains:
l = lambda { |a| local + func(2) + some_method(1) + var + a }
local = 1
def func(x) x end
class Foo < Struct.new(:add)
def some_method(x) x + add end
end
stack = LookupStack.new
stack.push_binding(binding)
stack.push_instance(Foo.new(2))
stack.push_hash(:var => 4)
p stack.run_proc(l, 5)
This prints 15, as expected :)
UPDATE: The code is now also available on GitHub. I use it for one of my projects now too.
class Proc
def call_with_obj(obj, *args)
m = nil
p = self
Object.class_eval do
define_method :a_temp_method_name, &p
m = instance_method :a_temp_method_name; remove_method :a_temp_method_name
end
m.bind(obj).call(*args)
end
end
And then use it as:
class Foo
def bar
"bar"
end
end
p = Proc.new { bar }
bar = "baz"
p.call_with_obj(self) # => baz
p.call_with_obj(Foo.new) # => bar
Perhaps you don't actually need to define a later, but instead only need to set it later.
Or (as below), perhaps you don't actually need a to be a local variable (which itself references an array). Instead, perhaps you can usefully employ a class variable, such as @@a. This works for me, by printing "1":
class SomeClass
def l
@l ||= lambda { puts @@a }
end
def some_function
@@a = 1
l.call
end
end
SomeClass.new.some_function
A similar way:
class Context
attr_reader :_previous, :_arguments
def initialize(_previous, _arguments)
@_previous = _previous
@_arguments = _arguments
end
end
def _code_def(_previous, _arguments = [], &_block)
define_method("_code_#{_previous}") do |_method_previous, _method_arguments = []|
Context.new(_method_previous, _method_arguments).instance_eval(&_block)
end
end
_code_def('something') do
puts _previous
puts _arguments
end
I'd like to write a method that yields values in one place and pass it as a parameter to another method that will invoke it with a block. I'm convinced it can be done but somehow I'm not able to find the right syntax.
Here's some sample (non-working) code to illustrate what I'm trying to achieve:
def yielder
yield 1
yield 2
yield 3
end
def user(block)
block.call { |x| puts x }
end
# later...
user(&yielder)
$ ruby x.rb
x.rb:2:in `yielder': no block given (yield) (LocalJumpError)
from x.rb:12:in `<main>'
FWIW, in my real code, yielder and user are in different classes.
Update
Thanks for your answers. As Andrew Grimm mentioned, I want the iterator method to take parameters. My original example left this detail out. This snippet provides an iterator that counts up to a given number. To make it work, I made the inner block explicit. It does what I want, but it's a bit ugly. If anyone can improve on this I'd be very interested in seeing how.
def make_iter(upto)
def iter(upto, block)
(1 .. upto).each do |v|
block.call(v)
end
end
lambda { |block| iter(upto, block) }
end
def user(obj)
obj.call Proc.new { |x| puts x }
end
# later...
user(make_iter(3))
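One possible cleanup, sketched with the standard Enumerator class instead of the hand-rolled lambda plus inner method:
def make_iter(upto)
  Enumerator.new do |y|
    (1..upto).each { |v| y << v }
  end
end

def user(enum)
  enum.each { |x| puts x }
end

user(make_iter(3)) # prints 1, 2, 3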
This doesn't use a lambda or unbound method, but it is the simplest way to go...
def f
yield 1
yield 2
end
def g x
send x do |n|
p n
end
end
g :f
When you write &yielder, you're calling yielder and then trying to apply the & (convert-to-Proc) operator on the result. Of course, calling yielder without a block is a no-go. What you want is to get a reference to the method itself. Just change that line to user(method :yielder) and it will work.
I think this might be along the lines of what you want to do:
def yielder
yield 1
yield 2
yield 3
end
def user(meth)
meth.call { |x| puts x }
end
# later...
user( Object.method(:yielder) )
Some related info here: http://blog.sidu.in/2007/11/ruby-blocks-gotchas.html
As has been pointed out, the underlying problem is that when you try to pass a method as a parameter, Ruby executes it, as a side effect of parentheses being optional.
I liked the simplicity of the symbol approach mentioned before, but I would be afraid of my future self forgetting that the iterator needs to be passed as a symbol for it to work. Since readability is a desired feature, you may instead wrap your iterator in an object, which you can pass around without fear of having code unexpectedly executed.
Anonymous object as iterator
That is: using an anonymous object with just one function as the iterator. Pretty immediate to read and understand. But due to the way Ruby handles scope, the iterator cannot easily receive parameters: any parameters received by the iterator function are not automatically available within each.
def iterator
  obj = Object.new
  def obj.each
    yield("Value 1")
    yield("Value 2")
    yield("Value 3")
  end
  obj
end
def iterate(my_iterator)
my_iterator.each do |value|
puts value
end
end
iterate iterator
Proc object as iterator
Using a Proc object as iterator lets you easily use any variables passed to the iterator constructor. The dark side: this starts looking weird. Reading the Proc.new block is not immediate for the untrained eye. Also: not being able to use yield makes it a bit uglier IMHO.
def iterator(prefix:)
Proc.new { |&block|
block.call("#{prefix} Value 1")
block.call("#{prefix} Value 2")
block.call("#{prefix} Value 3")
}
end
def iterate(my_iterator)
my_iterator.call do |value|
puts value
end
end
iterate iterator(prefix: 'The')
Lambda as iterator
Ideal if you want to obfuscate your code so hard that no one else besides you can read it.
def iterator(prefix:)
-> (&block) {
block.call("#{prefix} Value 1")
block.call("#{prefix} Value 2")
block.call("#{prefix} Value 3")
}
end
def iterate(my_iterator)
my_iterator.call do |value|
puts value
end
end
iterate iterator(prefix: 'The')
Class as iterator
And finally the good ol' OOP approach. A bit verbose to initialize for my taste, but with little or no surprise effect.
class Iterator
def initialize(prefix:)
#prefix = prefix
end
def each
yield("#{#prefix} Value 1")
yield("#{#prefix} Value 2")
yield("#{#prefix} Value 3")
end
end
def iterate(my_iterator)
my_iterator.each do |value|
puts value
end
end
iterate Iterator.new(prefix: 'The')
I'm trying to use Ruby 1.9.1 as an embedded scripting language, so that "end-user" code gets written in a Ruby block. One issue with this is that I'd like the users to be able to use the 'return' keyword in the blocks, so they don't need to worry about implicit return values. With this in mind, this is the kind of thing I'd like to be able to do:
def thing(*args, &block)
value = block.call
puts "value=#{value}"
end
thing {
return 6 * 7
}
If I use 'return' in the above example, I get a LocalJumpError. I'm aware that this is because the block in question is a Proc and not a lambda. The code works if I remove 'return', but I'd really prefer to be able to use 'return' in this scenario. Is this possible? I've tried converting the block to a lambda, but the result is the same.
Simply use next in this context:
$ irb
irb(main):001:0> def thing(*args, &block)
irb(main):002:1> value = block.call
irb(main):003:1> puts "value=#{value}"
irb(main):004:1> end
=> nil
irb(main):005:0>
irb(main):006:0* thing {
irb(main):007:1* return 6 * 7
irb(main):008:1> }
LocalJumpError: unexpected return
from (irb):7:in `block in irb_binding'
from (irb):2:in `call'
from (irb):2:in `thing'
from (irb):6
from /home/mirko/.rvm/rubies/ruby-1.9.1-p378/bin/irb:15:in `<main>'
irb(main):009:0> thing { break 6 * 7 }
=> 42
irb(main):011:0> thing { next 6 * 7 }
value=42
=> nil
return always returns from the enclosing method, but if you test this snippet in irb there is no enclosing method, which is why you get a LocalJumpError.
break returns a value from the block and ends its call. If your block was invoked with yield or .call, then break also breaks out of that iterator.
next returns a value from the block and ends its call. If your block was invoked with yield or .call, then next returns that value to the line where yield was called.
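A small illustration of the difference, using a hypothetical call_twice helper:
def call_twice
  puts yield
  puts yield
end

call_twice { next 'from next' }   # prints 'from next' twice; call_twice itself returns nil
call_twice { break 'from break' } # prints nothing; 'from break' becomes call_twice's return value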
You cannot do that in Ruby.
The return keyword always returns from the method or lambda in the current context. In blocks, it will return from the method in which the closure was defined. It cannot be made to return from the calling method or lambda.
RubySpec demonstrates that this is indeed the correct behaviour for Ruby (admittedly not a real implementation, but it aims for full compatibility with C Ruby):
describe "The return keyword" do
# ...
describe "within a block" do
# ...
it "causes the method that lexically encloses the block to return" do
# ...
it "returns from the lexically enclosing method even in case of chained calls" do
# ...
I admire s12chung's answer. Here is my little improvement on it, which avoids cluttering the context with the method __thing.
def thing(*args, &block)
o = Object.new
o.define_singleton_method(:__thing, block)
puts "value=#{o.__thing}"
end
thing { return 6 * 7 }
You are looking at it from the wrong point of view.
This is an issue with thing, not with the lambda.
def thing(*args, &block)
block.call.tap do |value|
puts "value=#{value}"
end
end
thing {
6 * 7
}
I had the same issue writing a DSL for a web framework in Ruby... (the web framework Anorexic will rock!)...
Anyway, I dug into the Ruby internals and found a simple solution using the LocalJumpError raised when a Proc calls return... it runs well in the tests so far, but I'm not sure it's foolproof:
def thing(*args, &block)
if block
block_response = nil
begin
block_response = block.call
rescue Exception => e
if e.message == "unexpected return"
block_response = e.exit_value
else
raise e
end
end
puts "value=#{block_response}"
else
puts "no block given"
end
end
The if statement in the rescue clause could probably look something like this:
if e.is_a? LocalJumpError
but it's uncharted territory for me, so I'll stick to what I tested so far.
I found a way, but it involves defining a method as an intermediate step:
def thing(*args, &block)
define_method(:__thing, &block)
puts "value=#{__thing}"
end
thing { return 6 * 7 }
Where is thing invoked? Are you inside a class?
You may consider using something like this:
class MyThing
def ret b
@retval = b
end
def thing(*args, &block)
implicit = block.call
value = @retval || implicit
puts "value=#{value}"
end
def example1
thing do
ret 5 * 6
4
end
end
def example2
thing do
5 * 6
end
end
end
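Used like this (note that @retval sticks around between calls on the same instance, so this sketch assumes a fresh instance per call):
MyThing.new.example1 # prints "value=30" (the explicit ret 5 * 6 wins over the implicit 4)
MyThing.new.example2 # prints "value=30" (falls back to the block's implicit value)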
I believe this is the correct answer, despite the drawbacks:
def return_wrap(&block)
Thread.new { return yield }.join
rescue LocalJumpError => ex
ex.exit_value
end
def thing(*args, &block)
value = return_wrap(&block)
puts "value=#{value}"
end
thing {
return 6 * 7
}
This hack allows users to use return in their procs without consequences, self is preserved, etc.
The advantage of using a Thread here is that otherwise, in some cases, you won't get the LocalJumpError at all, and the return will happen in the most unexpected place (inside a top-level method, unexpectedly skipping the rest of its body).
The main disadvantage is the potential overhead (you can replace the Thread+join with just the yield if that's enough in your scenario).
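For reference, the lighter variant mentioned above, dropping the Thread when its isolation isn't needed, would be roughly:
def return_wrap(&block)
  yield
rescue LocalJumpError => ex
  ex.exit_value
end

The trade-off is exactly the one described: without the Thread, a return that does find its lexically enclosing method won't raise a LocalJumpError at all and will simply return from that method.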