Dynamically define a method inside an instance method - ruby

I am working on a context-oriented programming project in Ruby, and I have come to this problem:
Suppose that I have a class Klass:
class Klass
  def my_method
    proceed
  end
end
I also have a proc stored in a variable impl, and impl contains { puts "it works!" }.
From somewhere outside Klass, I would like to define a method called proceed inside the method my_method, so that if I call Klass.new.my_method, I get the result "it works!".
So the final result should be something like this:
class Klass
  def my_method
    def proceed
      puts "it works!"
    end
    proceed
  end
end
Or if you have any other idea to make the call to proceed inside my_method work, that is also fine. But the proceed of another method (let's say my_method_2) isn't the same as the proceed of my_method.
In fact, the proceed of my_method represents an old version of my_method, and the proceed of my_method_2 represents an old version of my_method_2.
Thanks for your help

Disclaimer: you are doing it wrong!
There must be a more robust, elegant and Ruby-ish way to achieve what you want. If you still want to abuse metaprogramming, here you go:
class Klass
  def self.proceeds
    @proceeds ||= {}
  end
  def def_proceed(&block)
    self.class.proceeds[caller.first[/`.*?'/]] = block
  end
  def proceed(*args)
    self.class.proceeds[caller.first[/`.*?'/]].(*args)
  end
  def m_1
    def_proceed { puts 1 }
    proceed
  end
  def m_2
    def_proceed { puts 2 }
    proceed
  end
end
inst = Klass.new
inst.m_1
#⇒ 1
inst.m_2
#⇒ 2
What you in fact need is Module#prepend: prepend a module containing the new implementation and call super from there.
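A minimal sketch of that approach, assuming the new behaviour can live in a module (MyMethodPatch is a placeholder name and the method bodies are simplified):

class Klass
  def my_method
    puts "old my_method" # plays the role of the "old version" that proceed should reach
  end
end

module MyMethodPatch
  def my_method
    puts "it works!"
    super # calls the original Klass#my_method, i.e. what the question calls proceed
  end
end

Klass.prepend(MyMethodPatch)
Klass.new.my_method
# it works!
# old my_method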

One way of doing that is to construct a hash whose keys are the names of the methods calling proceed and whose values are procs that represent the implementations of proceed for each method calling it.
class Klass
  singleton_class.send(:attr_reader, :proceeds)
  @proceeds = {}
  def my_method1(*args)
    proceed(__method__, *args)
  end
  def my_method2(*args)
    proceed(__method__, *args)
  end
  def proceed(m, *args)
    self.class.proceeds[m].call(*args)
  end
end
def define_proceed(m, &block)
  Klass.proceeds[m] = block
end
define_proceed(:my_method1) { |*arr| arr.sum }
define_proceed(:my_method2) { |a,b| "%s-%s" % [a,b] }
k = Klass.new
k.my_method1(1,2,3) #=> 6
k.my_method2("cat", "dog") #=> "cat-dog"

Related

Is there any difference in using `yield self` in a method with parameter `&block` and `yield self` in a method without a parameter `&block`?

I understand that
def a(&block)
  block.call(self)
end
and
def a()
  yield self
end
lead to the same result, assuming a block is passed, as in a { }. Since I stumbled over some code like this, my question is whether it makes any difference, or whether there is any advantage to having the following (if I do not otherwise use the variable/reference block):
def a(&block)
  yield self
end
This is a concrete case where I do not understand the use of &block:
def rule(code, name, &block)
  @rules = [] if @rules.nil?
  @rules << Rule.new(code, name)
  yield self
end
The only advantage I can think of is for introspection:
def foo; end
def bar(&blk); end
method(:foo).parameters #=> []
method(:bar).parameters #=> [[:block, :blk]]
IDEs and documentation generators could take advantage of this. However, it does not affect Ruby's argument passing. When calling a method, you can pass or omit a block, regardless of whether it is declared or invoked.
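For example (a small illustrative sketch, not taken from the original answer): a block can be passed to a method that neither declares nor uses it, and omitted from one that declares it.

def ignores_block
  :done
end
def declares_block(&blk)
  blk
end

ignores_block { puts "never called" } #=> :done (the block is silently ignored)
declares_block                        #=> nil (blk is nil when no block is given)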
The main difference between
def pass_block
  yield
end
pass_block { 'hi' } #=> 'hi'
and
def pass_proc(&blk)
  blk.call
end
pass_proc { 'hi' } #=> 'hi'
is that blk, being an instance of Proc, is an object and can therefore be passed to other methods. By contrast, blocks are not objects and therefore cannot be passed around.
def pass_proc(&blk)
  puts "blk.is_a?(Proc)=#{blk.is_a?(Proc)}"
  receive_proc(blk)
end
def receive_proc(proc)
  proc.call
end
pass_proc { 'ho' }
blk.is_a?(Proc)=true
#=> "ho"

block hack, how to simplify block

I have a question about Ruby blocks.
For example I have a Ruby Class:
class NewClass
  def initialize
    @a = 1
  end
  def some_method
    puts @a
  end
end
When I do something like that:
NewClass.new do |c|
  c.some_method
end
Everything is good, but is there any possibility to do it somehow like this:
NewClass.new do
  some_method
end
Any ideas?
Your current code will just ignore the block anyway, since you don't yield to it. For what you are trying to do in your first example, you need the yield self idiom in initialize.
For why you need a block variable in the first place, think about what the receiver for some_method would be in your second example. Without an explicit receiver it's the top level main (unless this code is part of some other class of course, where that enclosing class would be self). See Dave Thomas' blog post Changing self in Ruby (or Yehuda Katz' post as pointed out by Niklas B. in the comments) for more info on that topic (the comments clear up the "proc invocation" part).
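For reference, the yield self idiom mentioned above would look roughly like this (a sketch, not code from the question):

class NewClass
  def initialize
    @a = 1
    yield self if block_given? # hand the freshly built instance to the block
  end
  def some_method
    puts @a
  end
end

NewClass.new do |c|
  c.some_method # prints 1; note that the block variable c is still needed
end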
Edit: all that said, this seems to work, but I prefer the yield self version and example 1:
class NewClass
  def initialize
    @a = 1
  end
  def some_method
    puts "Hello: #{@a}"
  end
  def self.build(&block)
    x = self.new
    x.instance_eval(&block)
    x
  end
end
NewClass.build do
  some_method
end
This allows you to execute the block without a block variable and will return the new instance of the class for assigning to a variable etc.
class NewClass
  def initialize(&block)
    @a = 1
    instance_eval(&block)
  end
  def some_method
    puts @a
  end
end
NewClass.new do
  some_method
end
Using instance_eval should do the job, but unless you know what you are doing (and it's not just out of laziness), I'd advise against it and go with your original approach.
def initialize(&block)
  @a = 1
  self.instance_eval(&block) if block_given?
end

Ruby: automatically wrapping methods in event triggers

Here's what I have/want:
module Observable
  def observers; @observers; end
  def trigger(event, *args)
    good = true
    return good unless (@observers ||= {})[event]
    @observers[event].each do |e|
      next if e.call(self, *args)
      good = false
      break
    end
    good
  end
  def on(event, &block)
    @observers ||= {}
    @observers[event] ||= []
    @observers[event] << block
  end
end
class Item < Thing
  include Observable
  def pickup(pickuper)
    return unless trigger(:before_pick_up, pickuper)
    pickuper.add_to_pocket self
    trigger(:after_pick_up, pickuper)
  end
  def drop(droper)
    return unless trigger(:before_drop, droper)
    droper.remove_from_pocket self
    trigger(:after_drop, droper)
  end
  # Lots of other methods
end
# How it all should work
Item.new.on(:before_pick_up) do |item, pickuper|
  puts "Hey #{pickuper}, that's my #{item}"
  false # The pickuper never picks up the object
end
While starting to create a game in Ruby, I thought it would be great if it could be based entirely around observers and events. The problem is that having to write all of these triggers seems like a waste, as it produces a lot of duplicated code. I feel there must be some metaprogramming technique out there to wrap methods with extra functionality.
Ideal scenario:
class CustomBaseObject
  class << self
    ### Replace with correct meta magic
    def public_method_called(name, *args, &block)
      return unless trigger(:"before_#{name}", *args)
      yield block
      trigger(:"after_#{name}", *args)
    end
    ###
  end
end
And then I would have all of my objects inherit from this class.
I'm still new to Ruby's more advanced metaprogramming subjects, so any knowledge about this type of thing would be awesome.
There are several ways to do it with the help of metaprogramming magic. For example, you can define a method like this:
def override_public_methods(c)
  c.instance_methods(false).each do |m|
    m = m.to_sym
    c.class_eval %Q{
      alias #{m}_original #{m}
      def #{m}(*args, &block)
        puts "Foo"
        result = #{m}_original(*args, &block)
        puts "Bar"
        result
      end
    }
  end
end
class CustomBaseObject
  def test(a, &block)
    puts "Test: #{a}"
    yield
  end
end
override_public_methods(CustomBaseObject)
foo = CustomBaseObject.new
foo.test(2) { puts 'Block!' }
# => Foo
Test: 2
Block!
Bar
In this case, you figure out all the required methods defined in the class by using instance_methods and then override them.
Another way is to use so-called 'hook' methods:
module Overrideable
  def self.included(c)
    c.instance_methods(false).each do |m|
      m = m.to_sym
      c.class_eval %Q{
        alias #{m}_original #{m}
        def #{m}(*args, &block)
          puts "Foo"
          result = #{m}_original(*args, &block)
          puts "Bar"
          result
        end
      }
    end
  end
end
class CustomBaseObject
  def test(a, &block)
    puts "Test: #{a}"
    yield
  end
  include Overrideable
end
The included hook, defined in this module, is called when you include that module. This requires that you include the module at the end of the class definition, because included has to know about all of the already defined methods. I think it's rather ugly :)
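As an alternative (not part of the original answer, just a sketch), Module#prepend avoids both the aliasing and the include-at-the-end requirement: a prepended module sits in front of the class in the ancestor chain, so its methods wrap the originals and reach them via super. The Triggering module and wrap method below are hypothetical names:

module Triggering
  # Wrap existing instance methods of klass so they fire hypothetical
  # before/after triggers around the original implementation.
  def self.wrap(klass, *method_names)
    wrapper = Module.new do
      method_names.each do |name|
        define_method(name) do |*args, &block|
          puts "before_#{name}"          # e.g. trigger(:"before_#{name}", *args)
          result = super(*args, &block)  # calls the original method
          puts "after_#{name}"           # e.g. trigger(:"after_#{name}", *args)
          result
        end
      end
    end
    klass.prepend(wrapper)
  end
end

class CustomBaseObject
  def test(a)
    puts "Test: #{a}"
  end
end

Triggering.wrap(CustomBaseObject, :test)
CustomBaseObject.new.test(2)
# before_test
# Test: 2
# after_test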

Wrapper for method with a block

I want to use the combination method with a custom class.
If my class looks like this...
class MyClass
  def initialize
    @data = []
  end
  def to_a
    @data
  end
end
I could call this...
myobj = MyClass.new
myobj.to_a.combination(2) {|a,b| puts "#{a} #{b}" }
But I'd much rather have this...
myobj.combination {|a,b| puts "#{a} #{b}" }
I've tried to write a method on my class to wrap the combination method, passing the block, but it's not working.
def combination(&block)
  @data.to_a.combination(2) block.call
end
Also, does anyone know why combination is in the Array class and not Enumerable? I'd have thought it would have been more useful there.
The block is a special type of parameter to Array#combination (much like you've got it in your own definition). The correct invocation is:
def combination(&block)
  @data.to_a.combination(2, &block)
end
Try this:
def combination(&block)
  @data.to_a.combination(2) { |pair| block.call(*pair) }
end

How can I run Ruby functions backwards?

I want to have a class that runs functions backwards, like foo.method1.method2.method3, where the functions run as method3, then method2, then method1. But it goes 1, 2, 3 now. I think this is called lazy evaluation, but I'm not sure.
I know next to nothing about Ruby, so please excuse this question if it's simple and I should know this already.
Sure, you can do it. It sounds like you were on the right path thinking of lazy evaluation; you just need to end every chain of method calls with a method that runs the queued methods.
class Foo
  def initialize
    @command_queue = []
  end
  def method1
    @command_queue << :_method1
    self
  end
  def method2
    @command_queue << :_method2
    self
  end
  def method3
    @command_queue << :_method3
    self
  end
  def exec
    @command_queue.reverse.map do |command|
      self.send(command)
    end
    @command_queue = []
  end
  private
  def _method1
    puts "method1"
  end
  def _method2
    puts "method2"
  end
  def _method3
    puts "method3"
  end
end
foo = Foo.new
foo.method1.method2.method3.exec
method3
method2
method1
Maybe you could chain the method calls, build an evaluation stack, and execute it later. This requires that you call an extra method to evaluate the stack. You could use private methods for the actual implementations.
class Weirdo
  def initialize
    @stack = []
  end
  def method_a
    @stack << [:method_a!]
    self # so that the next call gets chained
  end
  def method_b(arg1, arg2)
    @stack << [:method_b!, arg1, arg2]
    self
  end
  def method_c(&block)
    @stack << [:method_c!, block]
    self
  end
  def call_stack
    while @stack.length > 0 do
      send(*@stack.pop)
    end
  end
  private
  # actual method implementations
  def method_a!
    # method_a functionality
  end
  def method_b!(arg1, arg2)
    # method_b functionality
  end
  def method_c!(block)
    # method_c functionality; the proc is passed as a regular argument by send
  end
end
so that you can do something like
w = Weirdo.new
w.method_a.method_b(3,5).method_c{ Time.now }
w.call_stack # => executes c first, b next and a last.
Update
Looks like I managed to miss Pete's answer and posted almost exactly the same answer. The only difference is the ability to pass on the arguments to the internal stack.
What you really want here is a proxy class that captures the messages, reverses them, and then forwards them on to the actual class:
# This is the proxy class that captures the messages, reverses them, and then forwards them
class Messenger
  def initialize(target)
    @obj = target
    @messages = []
  end
  def method_missing(name, *args, &block)
    @messages << [name, args, block]
    self
  end
  # The return value of this method is an array of the return values of the invoked methods
  def exec
    @messages.reverse.map { |name, args, block| @obj.send(name, *args, &block) }
  end
end
# this is the actual class that implements the methods you want to invoke
# every method on this class just returns its name
class Test
  def self.def_methods(*names)
    names.each { |v| define_method(v) { v } }
  end
  def_methods :a, :b, :c, :d
end
# attach the proxy, store the messages, forward the reversed messages onto the actual class
# and then display the resulting array of method return values
Messenger.new(Test.new).a.b.c.exec.inspect.display #=> [:c, :b, :a]
I don't think that this is possible. Here is why:
method3 is operating on the result of method2
method2 is operating on the result of method1
therefore method3 requires that method2 has completed
therefore method2 requires that method1 has completed
thus, the execution order must be method1 -> method2 -> method3
I think your only option is to write it as:
foo.method3.method2.method1
As a side note, lazy evaluation is simply delaying a computation until the result is required. For example, if I had a database query that went:
@results = Result.all
Lazy evaluation would not perform the query until I did something like:
puts @results
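In Ruby itself, the closest built-in analogue is Enumerator::Lazy (an aside, not part of the original answer): each step in the chain is only recorded, and nothing is computed until a terminal call forces it.

pipeline = (1..Float::INFINITY).lazy.map { |n| n * 2 }.select(&:even?)
# Nothing has been computed yet; the chain is only recorded.
pipeline.first(3) #=> [2, 4, 6] (the computation happens here, on demand)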
This is most interesting and useful; I think that the proxy pattern can indeed be used.
As I read your initial post, I was reminded of a debugger that IBM once provided which allowed executing both forward and backward (I think it was on Eclipse for Java).
I see that OCaml has such a debugger: http://caml.inria.fr/pub/docs/manual-ocaml/manual030.html.
Don't forget that closures can either complicate or help with this issue.
