How to call __post_init__ in both the subclass and the base class when both are defined with @dataclass [duplicate]

I'm trying to use the new python dataclasses to create some mix-in classes (already as I write this I think it sounds like a rash idea), and I'm having some issues. Behold the example below:
from dataclasses import dataclass

@dataclass
class NamedObj:
    name: str

    def __post_init__(self):
        print("NamedObj __post_init__")
        self.name = "Name: " + self.name

@dataclass
class NumberedObj:
    number: int = 0

    def __post_init__(self):
        print("NumberedObj __post_init__")
        self.number += 1

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        super().__post_init__()
        print("NamedAndNumbered __post_init__")
If I then try:
nandn = NamedAndNumbered('n_and_n')
print(nandn.name)
print(nandn.number)
I get
NumberedObj __post_init__
NamedAndNumbered __post_init__
n_and_n
1
Suggesting it has run __post_init__ for NumberedObj, but not for NamedObj.
What I would like is to have NamedAndNumbered run __post_init__ for both of its mix-in classes, NamedObj and NumberedObj. One might think that it could be done if NamedAndNumbered had a __post_init__ like this:
def __post_init__(self):
    super(NamedObj, self).__post_init__()
    super(NumberedObj, self).__post_init__()
    print("NamedAndNumbered __post_init__")
But this just gives me an error AttributeError: 'super' object has no attribute '__post_init__' when I try to call NamedObj.__post_init__().
At this point I'm not entirely sure if this is a bug/feature with dataclasses or something to do with my probably-flawed understanding of Python's approach to inheritance. Could anyone lend a hand?

This:
def __post_init__(self):
    super(NamedObj, self).__post_init__()
    super(NumberedObj, self).__post_init__()
    print("NamedAndNumbered __post_init__")
doesn't do what you think it does. super(cls, obj) will return a proxy to the class after cls in type(obj).__mro__ - so, in your case, to object. And the whole point of cooperative super() calls is to avoid having to explicitly call each of the parents.
The way cooperative super() calls are intended to work is, well, by being "cooperative" - IOW, everyone in the mro is supposed to relay the call to the next class (actually, the super name is a rather sad choice, as it's not about calling "the super class", but about "calling the next class in the mro").
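To see where each super(cls, obj) call lands, you can print the MRO; a minimal sketch with the question's classes (method bodies omitted):

```python
from dataclasses import dataclass

@dataclass
class NamedObj:
    name: str

@dataclass
class NumberedObj:
    number: int = 0

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    pass

# super(cls, obj) starts its lookup at the class *after* cls in this list:
print([c.__name__ for c in NamedAndNumbered.__mro__])
# ['NamedAndNumbered', 'NumberedObj', 'NamedObj', 'object']
# so super(NumberedObj, self) finds NamedObj, while
# super(NamedObj, self) finds object - which has no __post_init__
```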
IOW, you want each of your "composable" dataclasses (which are not mixins - mixins only have behaviour) to relay the call, so you can compose them in any order. A first naive implementation would look like:
@dataclass
class NamedObj:
    name: str

    def __post_init__(self):
        super().__post_init__()
        print("NamedObj __post_init__")
        self.name = "Name: " + self.name

@dataclass
class NumberedObj:
    number: int = 0

    def __post_init__(self):
        super().__post_init__()
        print("NumberedObj __post_init__")
        self.number += 1

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        super().__post_init__()
        print("NamedAndNumbered __post_init__")
BUT this doesn't work, since for the last class in the mro (here NamedObj), the next class in the mro is the builtin object class, which doesn't have a __post_init__ method. The solution is simple: just add a base class that defines this method as a noop, and make all your composable dataclasses inherit from it:
class Base(object):
    def __post_init__(self):
        # just intercept the __post_init__ calls so they
        # aren't relayed to `object`
        pass

@dataclass
class NamedObj(Base):
    name: str

    def __post_init__(self):
        super().__post_init__()
        print("NamedObj __post_init__")
        self.name = "Name: " + self.name

@dataclass
class NumberedObj(Base):
    number: int = 0

    def __post_init__(self):
        super().__post_init__()
        print("NumberedObj __post_init__")
        self.number += 1

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        super().__post_init__()
        print("NamedAndNumbered __post_init__")
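For reference, the fixed hierarchy as one self-contained snippet (with the print tracing dropped for brevity); every hook now runs, in MRO order:

```python
from dataclasses import dataclass

class Base:
    def __post_init__(self):
        # intercept the call so it isn't relayed to object
        pass

@dataclass
class NamedObj(Base):
    name: str
    def __post_init__(self):
        super().__post_init__()
        self.name = "Name: " + self.name

@dataclass
class NumberedObj(Base):
    number: int = 0
    def __post_init__(self):
        super().__post_init__()
        self.number += 1

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        super().__post_init__()

nandn = NamedAndNumbered("n_and_n")
print(nandn.name)    # Name: n_and_n
print(nandn.number)  # 1
```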

The problem (most probably) isn't related to dataclasses. The problem is in Python's method resolution: calling a method via super() invokes the first such method found in the parent classes along the MRO chain. So to make it work you need to call the methods of the parent classes manually:
@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        NamedObj.__post_init__(self)
        NumberedObj.__post_init__(self)
        print("NamedAndNumbered __post_init__")
Another approach (if you really like super()) is to continue the MRO chain by calling super() in all parent classes (but then every class in the chain needs to define a __post_init__):
@dataclass
class MixinObj:
    def __post_init__(self):
        pass

@dataclass
class NamedObj(MixinObj):
    name: str

    def __post_init__(self):
        super().__post_init__()
        print("NamedObj __post_init__")
        self.name = "Name: " + self.name

@dataclass
class NumberedObj(MixinObj):
    number: int = 0

    def __post_init__(self):
        super().__post_init__()
        print("NumberedObj __post_init__")
        self.number += 1

@dataclass
class NamedAndNumbered(NumberedObj, NamedObj):
    def __post_init__(self):
        super().__post_init__()
        print("NamedAndNumbered __post_init__")
In both approaches:
>>> nandn = NamedAndNumbered('n_and_n')
NamedObj __post_init__
NumberedObj __post_init__
NamedAndNumbered __post_init__
>>> print(nandn.name)
Name: n_and_n
>>> print(nandn.number)
1

Related

undefined method error when pass function from one class to the other

I am working on a Ruby script where I have two classes, Logger and Test. Logger records the function name along with its input and output when a function is called.
Then I have a Test class where I define a test_func to double the input. However, I get an undefined method error when using the func_logger function after the logger is initialized. I guess it is a scope issue, so logger doesn't know where to find test_func. If so, which part should I modify to make it work? Thanks in advance.
p.s. I have to maintain the code structure in the way below, so I can't move any function outside a class.
class Logger
  def initialize(id)
    @id = id
  end

  def func_logger(func, input)
    func_name = func.to_s
    output = method(func).call(*input)
    {"name" => func_name, "input" => input, "output" => output}
  end
end

class Test
  def initialize(id)
    @id = id
  end

  def exec(input)
    def test_func(a)
      a * 2
    end
    logger = Logger.new(@id)
    logger.func_logger(:test_func, input)
  end
end
test = Test.new(0)
puts test.exec(2).inspect
Traceback (most recent call last):
3: from main.rb:29:in `<main>'
2: from main.rb:24:in `exec'
1: from main.rb:8:in `func_logger'
main.rb:8:in `method': undefined method `test_func' for class `Logger' (NameError)
exit status 1
Calling method inside Logger is equivalent to calling self.method(...), which tries to get a Method instance by looking up a method with the passed name on self. There is obviously no test_func on the Logger instance. Calling method on the calling object instead will produce a Method instance bound to the Test instance, which is what you want to pass. This Method instance has a #name property, and you can #call it with your arguments:
class Logger
  def initialize(id)
    @id = id
  end

  def func_logger(func, *input, &block)
    output = func.call(*input, &block)
    {"name" => func.name.to_s, "input" => input, "output" => output}
  end
end

class Test
  def initialize(id)
    @id = id
  end

  def exec(input)
    def test_func(a)
      a * 2
    end
    logger = Logger.new(@id)
    logger.func_logger(method(:test_func), input)
  end
end

Adding methods in __init__() in Python

I'm making classes which are similar, but with different functions, depending on the use of the class.
class Cup:
    def __init__(self, content):
        self.content = content

    def spill(self):
        print(f"The {self.content} was spilled.")

    def drink(self):
        print(f"You drank the {self.content}.")
Coffee = Cup("coffee")
Coffee.spill()
> The coffee was spilled.
However, it is already known during the initialization of an object whether the cup will be spilled or drunk. If there are many cups, there's no need for all of them to have both functions, because only one of them will be used. How can I add a function during initialization?
Intuitively it should be something like this, but this apparently didn't work:
def spill(self):
    print(f"The {self.content} was spilled.")

class Cup:
    def __init__(self, content, function):
        self.content = content
        self.function = function

Coffee = Cup("coffee", spill)
Coffee.function()
> The coffee was spilled
If you create a class in Python with methods, e.g.
class A:
    def method(self, param1, param2):
        ...
Python makes sure that when you call A().method(x, y), the self parameter is filled with the instance of A. When you specify the method yourself, outside of the class, you also have to make sure that the binding is done properly:
import functools

class Cup:
    def __init__(self, content, function):
        self.content = content
        self.function = functools.partial(function, self)
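Putting the answer's functools.partial approach together with the spill function from the question gives a runnable sketch:

```python
import functools

def spill(self):
    print(f"The {self.content} was spilled.")

class Cup:
    def __init__(self, content, function):
        self.content = content
        # bind the free function to this instance so it receives self
        self.function = functools.partial(function, self)

coffee = Cup("coffee", spill)
coffee.function()  # The coffee was spilled.
```

An alternative is types.MethodType(function, self), which creates a genuine bound method rather than a partial, and so also behaves like a method under introspection.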

How should I call object methods from a static method in a ruby class?

From an academic / "for interests sake" (best practice) perspective:
In a Ruby class, I want to provide a static method that calls instance methods of the class. Is this considered an acceptable way of doing it, or is there is a "better" way?
class MyUtil
  def apiClient
    @apiClient ||= begin
      Vendor::API::Client.new("mykey")
    end
  end

  def self.setUpSomething(param1, param2, param3 = nil)
    util = self.new # <-- is this ok?
    util.apiClient.call("foo", param2)
    # some logic and more util.apiClient.call()s re-use the client
  end
end
And then I can use this lib easily:
MyUtil.setUpSomething("yes","blue")
vs
MyUtil.new().setUpSomething()
# or
util = MyUtil.new()
util.setUpSomething()
The environment is sys admin scripts that get executed in a controlled manner, 1 call at a time (i.e. not a webapp type of environment that could be subjected to high load).
Personally I would probably do something like:
class MyUtil
  API_KEY = "mykey"

  def apiClient
    @apiClient
  end

  def initialize
    @apiClient = Vendor::API::Client.new(API_KEY)
    yield self if block_given?
  end

  class << self
    def setUpSomething(arg1, arg2, arg3 = nil)
      self.new do |util|
        # setup logic goes here
      end
    end

    def api_call(arg1, arg2, arg3)
      util = setUpSomething(arg1, arg2, arg3)
      util.apiClient.call("foo", arg2)
      # do other stuff
    end
  end
end
The difference is subtle, but in this version setUpSomething guarantees that it returns the instance of the class, and it's more obvious what you're doing.
Here setUpSomething is responsible for setting up the object and then returning it so it can be used for things; this way you have a clear separation between creating the object and setting it up.
In this particular case you likely need a class instance variable:
class MyUtil
  @apiClient = Vendor::API::Client.new("mykey")

  def self.setUpSomething(param1, param2, param3 = nil)
    @apiClient.call("foo", param2)
    # some logic and more @apiClient.call()s re-use the client
  end
end
If you want a lazy instantiation, use an accessor:
class MyUtil
  class << self
    def api_client
      @apiClient ||= Vendor::API::Client.new("mykey")
    end

    def setUpSomething(param1, param2, param3 = nil)
      api_client.call("foo", param2)
      # some logic and more api_client.call()s re-use the client
    end
  end
end
Brett, what do you think about this:
class MyUtil
  API_KEY = 'SECRET'

  def do_something(data)
    api_client.foo(data)
  end

  def do_something_else(data)
    api_client.foo(data)
    api_client.bar(data)
  end

  private

  def api_client
    @api_client ||= Vendor::API::Client.new(API_KEY)
  end
end
Usage
s = MyUtil.new
s.do_something('Example') # First call
s.do_something_else('Example 2')

If a Ruby constant is defined, why isn't it listed by constants()?

Class names seem to show up when you check for them with const_defined?() but not when you try to list them with constants(). Since I'm interested in listing any defined class names, I'm just trying to work out what's going on. Example:
class MyClass
  def self.examine_constants
    puts self.name
    puts const_defined? self.name
    puts constants.inspect
  end
end
MyClass.examine_constants
Under Ruby 2.0.0p195, that code gives the following results:
MyClass
true
[]
Shouldn't the methods agree? What am I missing please?
The Module#constants method you're using returns all the constants at the current scope. What you want to use is the class method Module.constants, which returns all the top-level constants. For instance:
class MyClass
  def self.examine_constants
    puts self.name
    puts const_defined? self.name
    puts Module.constants.inspect
  end
end
MyClass.examine_constants
Response:
MyClass
true
[...lots of other constants, :MyClass]

What is the right way to initialize an object that invoke an instance method of its mother?

The question is as of the title. Say I have a simple example below:
class Vehicle
  attr_accessor :wheels
end

class Car < Vehicle
  def initialize
    self.wheels = 4
  end
end

class Truck < Vehicle
  def initialize
    @wheels = 16
  end
end
I am curious which way is considered correct, or better, for invoking the wheels writer method of the parent Vehicle.
self.wheels = 4 is more flexible because under the hood it is calling a setter method: self.wheels=(4)
So if you ever wanted to do something with the value before it is placed into @wheels, you could define that function:
def wheels=(val)
  # do something with val
  @wheels = val
end