How do I specify a type parameter using Sorbet?
For example, I want to annotate a method that takes an argument of type A and returns a generic type T[A]:
def build_array(value)
[value]
end
The output type depends on the input type:
build_array(42)  #=> Array[Integer]
build_array('42') #=> Array[String]
You can accomplish this using type_parameters:
# typed: true
extend T::Sig
sig do
type_parameters(:T)
.params(value: T.type_parameter(:T))
.returns(T::Array[T.type_parameter(:T)])
end
def build_array(value)
[value]
end
x = build_array(5)
T.reveal_type(build_array(42)) # T::Array[Integer]
T.reveal_type(build_array('42')) # T::Array[String]
Here's a sorbet.run link with the above code.
You can also use a type parameter on a method of a generic class, e.g.:
sig do
type_parameters(:U)
.params(
blk: T.proc.params(arg0: Elem).returns(T.type_parameter(:U)),
)
.returns(Box[T.type_parameter(:U)])
end
def map(&blk)
Box.new(blk.call(@x))
end
See the example on sorbet.run.
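For context, that snippet assumes a generic Box class along these lines; this is only a hedged sketch (the Elem type member and the @x instance variable are inferred from the snippet, not taken from the linked example):
# typed: true
class Box
  extend T::Sig
  extend T::Generic

  # Elem is the type member referenced as the block's argument type above
  Elem = type_member

  sig { params(x: Elem).void }
  def initialize(x)
    @x = x
  end
end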
I want to read a JSON file and then pull a nested value out of the parsed data using a path passed in as a single string, e.g.:
def do_the_thing(file_to_load, hash_path)
file = File.read(file)
data = JSON.parse(file, { symbolize_names: true })
data[sections.to_sym]
end
do_the_thing(file_I_want, '[:foo][:bar][0]')
I've tried a few approaches but failed so far.
Thanks in advance for any help :)
Assuming you just mixed up the parameter names...
Let's assume our file is:
// test.json
{
"foo": {
"bar": ["foobar"]
}
}
Recommended solution
Does your parameter really need to be a string?
If your code can be more flexible and you can pass the arguments as plain Ruby values, you can use the Hash#dig method:
require 'json'
def do_the_thing(file, *hash_path)
file = File.read(file)
data = JSON.parse(file, symbolize_names: true)
data.dig(*hash_path)
end
do_the_thing('test.json', :foo, :bar, 0)
You should get:
"foobar"
It should work fine!
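As an aside (not part of the original answer), Hash#dig returns nil instead of raising when an intermediate key is missing, so a wrong path fails quietly:
data = { foo: { bar: ['foobar'] } }
data.dig(:foo, :bar, 0)     #=> "foobar"
data.dig(:foo, :missing, 0) #=> nil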
Read the rest of the answer if that doesn't answer your question.
Alternative solution (using the same argument)
If you REALLY need to pass that argument as a string, you can
treat your parameter so it adapts to the first solution. It won't be small or fancy code, but it will work:
require 'json'
BRACKET_REGEX = /(\[[^\[]*\])/.freeze
# Converts the literal token into its corresponding Ruby value
def treat_type(param)
# Remove the surrounding brackets from the string
# (you could do this step directly in the regex if you prefer)
param = param[1..-2]
case param[0]
# Checks if it is a string
when '\''
param[1..-2]
# Checks if it is a symbol
when ':'
param[1..-1].to_sym
else
begin
Integer(param)
rescue ArgumentError
param
end
end
end
# Converts your param string into the argument list accepted by the 'dig' method
def string_to_args(param)
# String#scan breaks the regex matches into an array
param.scan(BRACKET_REGEX).flatten.map { |match| treat_type(match) }
end
def do_the_thing(file, hash_path)
hash_path = string_to_args(hash_path)
file = File.read(file)
data = JSON.parse(file, symbolize_names: true)
data.dig(*hash_path)
end
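To sanity-check the helper on its own, string_to_args should split the path string like this:
string_to_args('[:foo][:bar][0]')   #=> [:foo, :bar, 0]
string_to_args("['baz'][:qux][42]") #=> ["baz", :qux, 42]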
so:
do_the_thing('test.json', '[:foo][:bar][0]')
returns
"foobar"
This solution, though, is prone to bugs whenever hash_path doesn't follow the expected pattern, and handling those cases would make the code even longer.
Shortest solution (Not safe)
You can use Kernel#eval, which I STRONGLY discourage for security reasons; read the documentation and understand its dangers before using it:
require 'json'
def do_the_thing(file, hash_path)
file = File.read(file)
data = JSON.parse(file, symbolize_names: true)
eval("data#{hash_path}")
end
do_the_thing('test.json', '[:foo][:bar][0]')
If the procedure you were trying to write just extracts the JSON data into an object, you might use either of the following approaches:
def do_the_thing(file_to_load)
file = File.read(file_to_load)
data = JSON.parse(file, { symbolize_names: true })
data
end
do_the_thing(file_I_want)[:foo][:bar][0]
or use Hash#dig:
def do_the_thing(file_to_load, sections)
file = File.read(file_to_load)
data = JSON.parse(file, { symbolize_names: true })
data.dig(*sections)
end
do_the_thing(file_I_want, [:foo, :bar, 0])
I am currently building my blog with Jekyll. Jekyll custom plugins are written in Ruby, and Jekyll uses Liquid for templating. I take input via a custom Liquid tag and process it there, and I would like to check whether the string contains an integer or not. I realised that the input is not a String but a Jekyll::Token, so I converted the input to a string, but I still cannot detect whether the string contains an integer. Here is my code:
module Jekyll
class TypecheckTag < Liquid::Tag
def is_int(word)
return word.count("0-3000") > 0
end
def initialize(tag_name, word, tokens)
super
@word = word.to_s
end
def render(context)
if /\A\d+\z/.match(@word)
@result = 'int'
else
@result = 'string'
end
end
end
end
Liquid::Template.register_tag('typecheck', Jekyll::TypecheckTag)
Unfortunately, it always returns 'string', even for a string like "16".
The following code works fine:
def render(input_str)
if /\A[-+]?\d+\z/.match(input_str.strip)
@result = 'int'
else
@result = 'string'
end
end
Sample input and output:
p render('16') # "int"
p render('16a') # "string"
p render('16 ') # "int"
p render('asd') #"string"
p render('') #"string"
I used strip because of render('16 ') #=> "int"; drop it if you don't want to ignore trailing whitespace. Hope it helps.
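Applied inside the Liquid tag from the question, the fix would look roughly like this (a sketch only; it assumes the tag markup contains nothing but the word to test):
module Jekyll
  class TypecheckTag < Liquid::Tag
    def initialize(tag_name, word, tokens)
      super
      # strip whitespace that may surround the tag markup
      @word = word.to_s.strip
    end

    def render(context)
      # [-+]? allows an optional sign in front of the digits
      @result = /\A[-+]?\d+\z/.match(@word) ? 'int' : 'string'
    end
  end
end

Liquid::Template.register_tag('typecheck', Jekyll::TypecheckTag)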
I have a .csv file in which every row represents one call, with a duration, number, etc. I need to create an array of Call objects. Every Call.new expects a hash of parameters, so it should be easy: it just takes rows from the CSV. But for some reason it doesn't work: when I invoke Call.new(raw_call), it's nil.
It's also impossible for me to see any output: I placed puts in various places in the code (inside blocks, etc.) and it simply doesn't show anything. I obviously have another class, Call, which holds the initialize for Call, etc.
require 'csv'
class CSVCallParser
attr_accessor :io
def initialize(io)
self.io = io
end
NAMES = {
a: :date,
b: :service,
c: :phone_number,
d: :duration,
e: :unit,
f: :cost
}
def run
parse do |raw_call|
parse_call(raw_call)
end
end
private
def parse_call(raw_call)
NAMES.each_with_object({}) do |name, title, memo|
memo[name] = raw_call[title.to_s]
end
end
def parse(&block)
CSV.parse(io, headers: true, header_converters: :symbol, &block)
end
end
CSVCallParser.new(ARGV[0]).run
Small sample of my .csv file: headers and one row:
"a","b","c","d","e","f"
"01.09.2016 08:49","International","48627843111","0:29","","0,00"
I noticed a few things that aren't going as expected. In the parse_call method,
def parse_call(raw_call)
NAMES.each_with_object({}) do |name, title, memo|
memo[name] = raw_call[title.to_s]
end
end
I tried to print name, title, and memo. I expected to get :a, :date, and {}, but what I actually got was [:a, :date], {}, and nil. That's because each_with_object on a hash yields the [key, value] pair as a single argument plus the memo, so the whole pair lands in the first block parameter, the memo lands in the second, and the third is never filled.
Also, the raw_call headers are :a, :b, :c..., not :date, :service..., so you should be using raw_call[name]; converting it to a string will not help, since the keys in raw_call are symbols.
So I modified the function to
def parse_call(raw_call)
NAMES.each_with_object({}) do |name_title, memo|
memo[name_title[1]] = raw_call[name_title[0]]
end
end
name_title[1] returns the title (:date, :service, etc)
name_title[0] returns the name (:a, :b, etc)
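As a small aside (not from the original answer), Ruby can also destructure the pair directly in the block parameters, which avoids the index lookups:
def parse_call(raw_call)
  NAMES.each_with_object({}) do |(name, title), memo|
    memo[title] = raw_call[name]
  end
end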
Also, in this method
def run
parse do |raw_call|
parse_call(raw_call)
end
end
You are not returning the results you collect, so you get nil.
So I changed it to:
def run
res = []
parse do |raw_call|
res << parse_call(raw_call)
end
res
end
Now, if I output the line
p CSVCallParser.new(File.read("file1.csv")).run
I get (I added two more rows to the CSV sample):
[{:date=>"01.09.2016 08:49", :service=>"International", :phone_number=>"48627843111", :duration=>"0:29", :unit=>"", :cost=>"0,00"},
{:date=>"02.09.2016 08:49", :service=>"International", :phone_number=>"48622454111", :duration=>"1:29", :unit=>"", :cost=>"0,00"},
{:date=>"03.09.2016 08:49", :service=>"Domestic", :phone_number=>"48627843111", :duration=>"0:29", :unit=>"", :cost=>"0,00"}]
If you want to run this program from the terminal like so
ruby csv_call_parser.rb calls.csv
(In this case, calls.csv is passed to the script via ARGV.)
You can do so by modifying the last line of the Ruby file:
p CSVCallParser.new(File.read(ARGV[0])).run
This will also return the array with hashes like before.
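If the end goal is the array of Call objects described in the question, a minimal sketch (assuming Call.new accepts a hash of attributes like the ones shown above) would be:
parsed = CSVCallParser.new(File.read(ARGV[0])).run
calls  = parsed.map { |attrs| Call.new(attrs) }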
Alternatively, parse with headers enabled and map each row to a hash:
csv = CSV.parse(csv_text, headers: true)
puts csv.map(&:to_h)
which outputs one hash per row, keyed by the headers:
[{a: 1, b: 1}, {a: 2, b: 2}]
I have an abstract base class (ABC) named BaseAbstract with several getter/setter properties defined.
I want to require that the value being set is an int in the range 0 - 15.
@luminance.setter
@abstractproperty
@ValidateProperty(Exception, types=(int,), valid=lambda x: True if 0 <= x <= 15 else False)
def luminance(self, value):
"""
Set a value that indicate the level of light emitted from the block
:param value: (int): 0 (darkest) - 15 (brightest)
:return:
"""
pass
Can someone help me figure out what my ValidateProperty class/method should look like? I started with a class and called its accepts method, but this causes an error:
function object has no attribute 'func_code'
current source:
class ValidateProperty(object):
@staticmethod
def accepts(exception, *types, **kwargs):
def check_accepts(f, **kwargs):
assert len(types) == f.func_code.co_argcount
def new_f(*args, **kwds):
for i, v in enumerate(args):
if f.func_code.co_varnames[i] in types and\
not isinstance(v, types[f.func_code.co_varnames[i]]):
arg = f.func_code.co_varnames[i]
exp = types[f.func_code.co_varnames[i]]
raise exception("arg '{arg}'={r} does not match {exp}".format(arg=arg,
r=v,
exp=exp))
# del exp (unreachable)
for k,v in kwds.__iter__():
if k in types and not isinstance(v, types[k]):
raise exception("arg '{arg}'={r} does not match {exp}".format(arg=k,
r=v,
exp=types[k]))
return f(*args, **kwds)
new_f.func_name = f.func_name
return new_f
return check_accepts
One of us is confused about how decorators, descriptors (e.g. properties), and abstracts work -- I hope it's not me. ;)
Here is a rough working example:
from abc import ABCMeta, abstractproperty
class ValidateProperty:
def __init__(inst, exception, arg_type, valid):
# called on the @ValidateProperty(...) line
#
# save the exception to raise, the expected argument type, and
# the validator code for later use
inst.exception = exception
inst.arg_type = arg_type
inst.validator = valid
def __call__(inst, func):
# called after the def has finished, but before it is stored
#
# func is the def'd function, save it for later to be called
# after validating the argument
def check_accepts(self, value):
if not inst.validator(value):
raise inst.exception('value %s is not valid' % value)
func(self, value)
return check_accepts
class AbstractTestClass(metaclass=ABCMeta):
@abstractproperty
def luminance(self):
# abstract property
return
@luminance.setter
@ValidateProperty(Exception, int, lambda x: 0 <= x <= 15)
def luminance(self, value):
# abstract property with validator
return
class TestClass(AbstractTestClass):
# concrete class
val = 7
@property
def luminance(self):
# concrete property
return self.val
@luminance.setter
def luminance(self, value):
# concrete property setter
# call base class first to activate the validator
AbstractTestClass.__dict__['luminance'].__set__(self, value)
self.val = value
tc = TestClass()
print(tc.luminance)
tc.luminance = 10
print(tc.luminance)
tc.luminance = 25
print(tc.luminance)
Which results in:
7
10
Traceback (most recent call last):
File "abstract.py", line 47, in <module>
tc.luminance = 25
File "abstract.py", line 40, in luminance
AbstractTestClass.__dict__['luminance'].__set__(self, value)
File "abstract.py", line 14, in check_accepts
raise inst.exception('value %s is not valid' % value)
Exception: value 25 is not valid
A few points to think about:
ValidateProperty is much simpler here because a property setter only takes two parameters: self and the new value.
When using a class as a decorator, and the decorator takes arguments, you need __init__ to save the parameters and __call__ to actually deal with the def'd function.
Calling a base class property setter is ugly, but you could hide that in a helper function.
You might want to use a custom metaclass to ensure the validation code is run (which would also avoid the ugly base-class property call).
I suggested a metaclass above to eliminate the need for a direct call to the base class's abstractproperty, and here is an example of such:
from abc import ABCMeta, abstractproperty
class AbstractTestClassMeta(ABCMeta):
def __new__(metacls, cls, bases, clsdict):
# create new class
new_cls = super().__new__(metacls, cls, bases, clsdict)
# collect all base class dictionaries
base_dicts = [b.__dict__ for b in bases]
if not base_dicts:
return new_cls
# iterate through clsdict looking for properties
for name, obj in clsdict.items():
if not isinstance(obj, (property)):
continue
prop_set = getattr(obj, 'fset')
# found one, now look in bases for validation code
validators = []
for d in base_dicts:
b_obj = d.get(name)
if (
b_obj is not None and
isinstance(b_obj.fset, ValidateProperty)
):
validators.append(b_obj.fset)
if validators:
def check_validators(self, new_val):
for func in validators:
func(new_val)
prop_set(self, new_val)
new_prop = obj.setter(check_validators)
setattr(new_cls, name, new_prop)
return new_cls
This subclasses ABCMeta, and has ABCMeta do all of its work first, then does some additional processing. Namely:
go through the created class and look for properties
check the base classes to see if they have a matching abstractproperty
check the abstractproperty's fset code to see if it is an instance of ValidateProperty
if so, save it in a list of validators
if the list of validators is not empty
make a wrapper that will call each validator before calling the actual property's fset code
replace the found property with a new one that uses the wrapper as the setter code
ValidateProperty is a little different as well:
class ValidateProperty:
def __init__(self, exception, arg_type):
# called on the @ValidateProperty(...) line
#
# save the exception to raise and the expected argument type
self.exception = exception
self.arg_type = arg_type
self.validator = None
def __call__(self, func_or_value):
# on the first call, func_or_value is the function to use
# as the validator
if self.validator is None:
self.validator = func_or_value
return self
# every subsequent call will be to do the validation
if (
not isinstance(func_or_value, self.arg_type) or
not self.validator(None, func_or_value)
):
raise self.exception(
'%r is either not a type of %r or is outside '
'argument range' %
(func_or_value, type(func_or_value))
)
The base AbstractTestClass now uses the new AbstractTestClassMeta, and has the validator code directly in the abstractproperty:
class AbstractTestClass(metaclass=AbstractTestClassMeta):
@abstractproperty
def luminance(self):
# abstract property
pass
@luminance.setter
@ValidateProperty(Exception, int)
def luminance(self, value):
# abstract property validator
return 0 <= value <= 15
The final class is the same:
class TestClass(AbstractTestClass):
# concrete class
val = 7
@property
def luminance(self):
# concrete property
return self.val
@luminance.setter
def luminance(self, value):
# concrete property setter
# call base class first to activate the validator
# AbstractTestClass.__dict__['luminance'].__set__(self, value)
self.val = value
I'm creating a hash that will eventually be dumped to disk as YAML, but first I need to capture multiple values stored in a file on disk and insert them into the hash. I can successfully create a variable with comma-separated values, but I need to insert those values under my "classes" key:
variable_values = "class1,class2,class3"
Ultimately, I need to get them into my test hash so it simulates something like this:
test_hash = {'Classes' => ['class1', 'class2', 'class3']}
Finally, I can output it to YAML so it looks like this:
---
classes:
- class1
- class2
- class3
What's the best way to iterate through the values and insert them into the hash? Thanks for any help you can offer!
You'd probably want something like:
test_hash = {'Classes' => variable_values.split(',')}
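Putting it together (note that the YAML sample in the question uses a lowercase classes key, so that's what this sketch uses):
require 'yaml'

variable_values = "class1,class2,class3"
test_hash = { 'classes' => variable_values.split(',') }

puts test_hash.to_yaml
# ---
# classes:
# - class1
# - class2
# - class3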
If you're wanting to serialize Ruby classes themselves (I'm not able to tell for sure), you'll probably want the following code (courtesy of opensoul.org, and as used in the Small Eigen Collider):
class Module
yaml_as "tag:ruby.yaml.org,2002:module"
def Module.yaml_new( klass, tag, val )
if String === val
val.split(/::/).inject(Object) {|m, n| m.const_get(n)}
else
raise YAML::TypeError, "Invalid Module: " + val.inspect
end
end
def to_yaml( opts = {} )
YAML::quick_emit( nil, opts ) { |out|
out.scalar( "tag:ruby.yaml.org,2002:module", self.name, :plain )
}
end
end
class Class
yaml_as "tag:ruby.yaml.org,2002:class"
def Class.yaml_new( klass, tag, val )
if String === val
val.split(/::/).inject(Object) {|m, n| m.const_get(n)}
else
raise YAML::TypeError, "Invalid Class: " + val.inspect
end
end
def to_yaml( opts = {} )
YAML::quick_emit( nil, opts ) { |out|
out.scalar( "tag:ruby.yaml.org,2002:class", self.name, :plain )
}
end
end
The code currently throws an exception if you try to serialize/deserialize anonymous classes (something I could fix but don't need to), and apart from that it works well for me.