sphinx-apidoc cannot understand var and type args in one line. How to fix this? - python-sphinx

I am using Sphinx reST style to write docstrings for functions, and I have created a mustache template to customize automatic docstring generation in VS Code, as below:
{{! Sphinx Docstring Template }}
{{summaryPlaceholder}}
{{extendedSummaryPlaceholder}}
{{#args}}
:param {{var}} {{typePlaceholder}}: {{descriptionPlaceholder}}
{{/args}}
{{#kwargs}}
:param {{var}} {{typePlaceholder}}{{#default}}, optional defaults to {{/default}}{{&default}}: {{descriptionPlaceholder}}
{{/kwargs}}
{{#exceptionsExist}}
{{#exceptions}}
:raises {{type}}: {{descriptionPlaceholder}}
{{/exceptions}}
{{/exceptionsExist}}
{{#yieldsExist}}
{{#yields}}
:yield {{typePlaceholder}}: {{descriptionPlaceholder}}
{{/yields}}
{{/yieldsExist}}
{{#returnsExist}}
{{#returns}}
:return {{typePlaceholder}}: {{descriptionPlaceholder}}
{{/returns}}
{{/returnsExist}}
For example:
# myproject/src/foo.py
def my_func(arg1: int, arg2: int, arg3: str = "some_text") -> tuple[str, str, str]:
    """my summary

    my extended_summary

    :param arg1 int: arg1 my description
    :param arg2 int: arg2 my description
    :param arg3 str Default "some_text": arg3 my description
    :return tuple[str, str, str]: return my description
    """
    return "a", "b", "c"
Now I want to use sphinx-apidoc to generate the documentation.
I run sphinx-quickstart in myproject/docs. Then I run sphinx-apidoc -o ./docs/_modules ./src,
change directory to docs, and run make html.
However, the documentation that gets generated is wrong: Sphinx's one-line field form expects the type before the name (:param type name:), so a line like :param arg1 int: is misparsed, with the name and type swapped.
If I modify the function's docstring to the standard format as below:
def my_func(arg1: int, arg2: int, arg3: str = "some_text") -> tuple[str, str, str]:
    """my summary

    my extended_summary

    :param arg1: arg1 my description
    :type arg1: int
    :param arg2: arg2 my description
    :type arg2: int
    :param arg3: arg3 my description, defaults to "some_text"
    :type arg3: str, optional
    :return: return my description
    :rtype: tuple[str, str, str]
    """
    return "a", "b", "c"
Then the correct documentation gets generated.
As the template I have created for the docstrings is a standard alternative, how can I adjust sphinx-apidoc to use my mustache template and generate correct documentation?
Summary:
I have customized docstring generation (in VS Code) with a mustache template.
I need sphinx-apidoc to understand the generated docstrings correctly.

I think @mzn's comment is the real answer to this question.
Btw, the usual way to customize the output from sphinx-apidoc is to use Jinja templates. See https://www.sphinx-doc.org/en/master/man/sphinx-apidoc.html#cmdoption-sphinx-apidoc-0
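For example, something like this should pick up custom Jinja templates (a sketch; the template directory path is your choice):

sphinx-apidoc --templatedir ./docs/_templates/apidoc -o ./docs/_modules ./src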
Anyway, I modified the mustache file and I am getting the result I expect:
{{! Sphinx Docstring Template }}
{{summaryPlaceholder}}
{{extendedSummaryPlaceholder}}
{{#args}}
:param {{typePlaceholder}} {{var}}: {{descriptionPlaceholder}}
{{/args}}
{{#kwargs}}
:param {{typePlaceholder}}{{#default}}, optional {{var}}: {{descriptionPlaceholder}}, defaults to {{/default}}{{&default}}
{{/kwargs}}
{{#exceptionsExist}}
{{#exceptions}}
:raises {{type}}: {{descriptionPlaceholder}}
{{/exceptions}}
{{/exceptionsExist}}
{{#yieldsExist}}
{{#yields}}
:yield {{typePlaceholder}}: {{descriptionPlaceholder}}
{{/yields}}
{{/yieldsExist}}
{{#returnsExist}}
{{#returns}}
:return: {{descriptionPlaceholder}}
:rtype: {{typePlaceholder}}
{{/returns}}
{{/returnsExist}}
def my_func(arg1: int, arg2: int, arg3: str = "some_text") -> tuple[str, str, str]:
    """summary

    extended_summary

    :param int arg1: description
    :param int arg2: description
    :param str, optional arg3: description, defaults to "some_text"
    :return: description
    :rtype: tuple[str, str, str]
    """
    return "a", "b", "c"

Related

How to specify type parameter

How to specify type parameter using sorbet?
For example, I want to annotate a method with an argument of type A returning generic type T[A].
def build_array(value)
  [value]
end
The output type depends on the input type:
build_array(42) #=> return Array[Integer]
build_array('42') #=> return Array[String]
You can accomplish this using type_parameters:
# typed: true
extend T::Sig

sig do
  type_parameters(:T)
    .params(value: T.type_parameter(:T))
    .returns(T::Array[T.type_parameter(:T)])
end
def build_array(value)
  [value]
end

x = build_array(5)
T.reveal_type(build_array(42))  # T::Array[Integer]
T.reveal_type(build_array('42')) # T::Array[String]
Here's a sorbet.run link with the above code.
You can try using Generic for the method definition.
Eg:
sig do
  type_parameters(:U)
    .params(
      blk: T.proc.params(arg0: Elem).returns(T.type_parameter(:U)),
    )
    .returns(Box[T.type_parameter(:U)])
end
def map(&blk)
  Box.new(blk.call(@x))
end
See example from sorbet.run

Overriding from_yaml to add custom YAML tag

Is overriding from_yaml enough to register a tag for a class, or is it necessary to use yaml.add_constructor(Class.yaml_tag, Class.from_yaml)? If I don't use the add_constructor method, my YAML tags are not recognized. Example of what I have:
import yaml

class Something(yaml.YAMLObject):
    yaml_tag = u'!Something'

    @classmethod
    def from_yaml(cls, loader, node):
        # Set attributes to None if not in file
        values = loader.construct_mapping(node, deep=True)
        attr = ['attr1', 'attr2']
        result = {}
        for val in attr:
            try:
                result[val] = values[val]
            except KeyError:
                result[val] = None
        return cls(**result)
Is this enough for it to work? I'm confused with the use of from_yaml vs any other constructor you would register using the method I mentioned above. I suppose there's something fundamental I'm missing, since they say:
Subclassing YAMLObject is an easy way to define tags, constructors,
and representers for your classes. You only need to override the
yaml_tag attribute. If you want to define your custom constructor and
representer, redefine the from_yaml and to_yaml method
correspondingly.
There is indeed no need to register explicitly:
import yaml

class Something(yaml.YAMLObject):
    yaml_tag = u'!Something'

    def __init__(self, *args, **kw):
        print('some_init', args, kw)

    @classmethod
    def from_yaml(cls, loader, node):
        # Set attributes to None if not in file
        values = loader.construct_mapping(node, deep=True)
        attr = ['attr1', 'attr2']
        result = {}
        for val in attr:
            try:
                result[val] = values[val]
            except KeyError:
                result[val] = None
        return cls(**result)

yaml_str = """\
test: !Something
  attr1: 1
  attr2: 2
"""

d = yaml.load(yaml_str)
which gives:
some_init () {'attr1': 1, 'attr2': 2}
But there is no need at all to use PyYAML's load() which is
documented to be unsafe. You can just use safe_load if you set the yaml_loader class attribute:
import yaml

class Something(yaml.YAMLObject):
    yaml_tag = u'!Something'
    yaml_loader = yaml.SafeLoader

    def __init__(self, *args, **kw):
        print('some_init', args, kw)

    @classmethod
    def from_yaml(cls, loader, node):
        # Set attributes to None if not in file
        values = loader.construct_mapping(node, deep=True)
        attr = ['attr1', 'attr2']
        result = {}
        for val in attr:
            try:
                result[val] = values[val]
            except KeyError:
                result[val] = None
        return cls(**result)

yaml_str = """\
test: !Something
  attr1: 1
  attr2: 2
"""

d = yaml.safe_load(yaml_str)
as this gives the same:
some_init () {'attr1': 1, 'attr2': 2}
(done both with Python 3.6 and Python 2.7)
The registering is done in the __init__() of the metaclass of yaml.YAMLObject:
class YAMLObjectMetaclass(type):
    """
    The metaclass for YAMLObject.
    """
    def __init__(cls, name, bases, kwds):
        super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
        if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
            cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
            cls.yaml_dumper.add_representer(cls, cls.to_yaml)
So maybe you are somehow interfering with that initialisation in your full class definition. Try to start with a minimal implementation as I did, and add the functionality on your class that you need until things break.
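For what it's worth, the same implicit registration also covers dumping: as the metaclass source above shows, to_yaml is registered as a representer at class-creation time. A minimal sketch of that side (assuming the attributes are stored on the instance so the default to_yaml can read them from __dict__; yaml_dumper = yaml.SafeDumper makes it work with safe_dump):

import yaml

class Something(yaml.YAMLObject):
    yaml_tag = u'!Something'
    yaml_loader = yaml.SafeLoader   # register the constructor with the safe loader
    yaml_dumper = yaml.SafeDumper   # register the representer with the safe dumper

    def __init__(self, attr1=None, attr2=None):
        self.attr1 = attr1
        self.attr2 = attr2

print(yaml.safe_dump(Something(attr1=1)))
# emits something like:
# !Something
# attr1: 1
# attr2: null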

How to use substitution definitions with code blocks?

I tried using substitution definitions together with code blocks in Sphinx documentation, however it doesn't work. This is my reST source code:
.. |foo| code-block:: python

   foo = 1

|foo|
Sphinx throws the following errors:
/.../examples.rst:184: WARNING: Substitution definition "foo" empty or invalid.
.. |foo| code-block:: python
foo = 1
/.../examples.rst:193: ERROR: Undefined substitution referenced: "foo".
How can I make this example work?
Sphinx version: 1.5.5 (with Sphinx 1.6.2 the above error turns into a warning).
This is not possible without changing code-block.
I have created a Sphinx extension which provides substitution-code-block for this purpose.
It allows you to define substitutions in conf.py and then use these substitutions in .. substitution-code-block blocks.
This extension is at https://github.com/adamtheturtle/sphinx-substitution-extensions.
However, this is a very small amount of code.
To enable this in your own codebase without a third party extension, create a module in your codebase with the following contents:
"""
Custom Sphinx extensions.
"""
from typing import List
from sphinx.application import Sphinx
from sphinx.directives.code import CodeBlock
class SubstitutionCodeBlock(CodeBlock): # type: ignore
"""
Similar to CodeBlock but replaces placeholders with variables.
"""
def run(self) -> List:
"""
Replace placeholders with given variables.
"""
app = self.state.document.settings.env.app
new_content = []
self.content = self.content # type: List[str]
existing_content = self.content
for item in existing_content:
for pair in app.config.substitutions:
original, replacement = pair
item = item.replace(original, replacement)
new_content.append(item)
self.content = new_content
return list(CodeBlock.run(self))
def setup(app: Sphinx) -> None:
"""
Add the custom directives to Sphinx.
"""
app.add_config_value('substitutions', [], 'html')
app.add_directive('substitution-code-block', SubstitutionCodeBlock)
Then, add this module to the extensions list defined in conf.py.
Then, set the substitutions variable in conf.py e.g. to [('|foo|', 'bar')] to replace |foo| with bar in every substitution-code-block.
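Putting it together, a usage sketch (the module name custom_extensions is hypothetical; it only needs to be importable from conf.py, e.g. via sys.path):

# conf.py
extensions = ['custom_extensions']
substitutions = [('|foo|', 'bar')]

and in an .rst file:

.. substitution-code-block:: python

   foo = |foo|

renders a literal block containing foo = bar.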

How to provide value validation at abstract class level?

I have an ABC BaseAbstract class with several getter/setter properties defined.
I want to require that the value to be set is an int and from 0 - 15.
@luminance.setter
@abstractproperty
@ValidateProperty(Exception, types=(int,), valid=lambda x: True if 0 <= x <= 15 else False)
def luminance(self, value):
    """
    Set a value that indicates the level of light emitted from the block

    :param value: (int): 0 (darkest) - 15 (brightest)
    :return:
    """
    pass
Can someone help me figure out what my ValidateProperty class/method should look like? I started with a class and called the accepts method, but this is causing an error:
function object has no attribute 'func_code'
current source:
class ValidateProperty(object):
    @staticmethod
    def accepts(exception, *types, **kwargs):
        def check_accepts(f, **kwargs):
            # f.func_code and f.func_name are Python 2 attributes; in
            # Python 3 they are f.__code__ and f.__name__, which is why
            # this raises "function object has no attribute 'func_code'"
            assert len(types) == f.func_code.co_argcount
            def new_f(*args, **kwds):
                for i, v in enumerate(args):
                    if f.func_code.co_varnames[i] in types and \
                            not isinstance(v, types[f.func_code.co_varnames[i]]):
                        arg = f.func_code.co_varnames[i]
                        exp = types[f.func_code.co_varnames[i]]
                        raise exception("arg '{arg}'={r} does not match {exp}".format(arg=arg,
                                                                                      r=v,
                                                                                      exp=exp))
                # del exp (unreachable)
                for k, v in kwds.__iter__():
                    if k in types and not isinstance(v, types[k]):
                        raise exception("arg '{arg}'={r} does not match {exp}".format(arg=k,
                                                                                      r=v,
                                                                                      exp=types[k]))
                return f(*args, **kwds)
            new_f.func_name = f.func_name
            return new_f
        return check_accepts
One of us is confused about how decorators, descriptors (e.g. properties), and abstracts work -- I hope it's not me. ;)
Here is a rough working example:
from abc import ABCMeta, abstractproperty

class ValidateProperty:

    def __init__(inst, exception, arg_type, valid):
        # called on the @ValidateProperty(...) line
        #
        # save the exception to raise, the expected argument type, and
        # the validator code for later use
        inst.exception = exception
        inst.arg_type = arg_type
        inst.validator = valid

    def __call__(inst, func):
        # called after the def has finished, but before it is stored
        #
        # func is the def'd function, save it for later to be called
        # after validating the argument
        def check_accepts(self, value):
            if not inst.validator(value):
                raise inst.exception('value %s is not valid' % value)
            func(self, value)
        return check_accepts

class AbstractTestClass(metaclass=ABCMeta):

    @abstractproperty
    def luminance(self):
        # abstract property
        return

    @luminance.setter
    @ValidateProperty(Exception, int, lambda x: 0 <= x <= 15)
    def luminance(self, value):
        # abstract property with validator
        return

class TestClass(AbstractTestClass):
    # concrete class

    val = 7

    @property
    def luminance(self):
        # concrete property
        return self.val

    @luminance.setter
    def luminance(self, value):
        # concrete property setter
        # call base class first to activate the validator
        AbstractTestClass.__dict__['luminance'].__set__(self, value)
        self.val = value

tc = TestClass()
print(tc.luminance)
tc.luminance = 10
print(tc.luminance)
tc.luminance = 25
print(tc.luminance)
Which results in:
7
10
Traceback (most recent call last):
File "abstract.py", line 47, in <module>
tc.luminance = 25
File "abstract.py", line 40, in luminance
AbstractTestClass.__dict__['luminance'].__set__(self, value)
File "abstract.py", line 14, in check_accepts
raise inst.exception('value %s is not valid' % value)
Exception: value 25 is not valid
A few points to think about:
The ValidateProperty is much simpler because a property setter only takes two parameters: self and the new_value
When using a class for a decorator, and the decorator takes arguments, then you will need __init__ to save the parameters, and __call__ to actually deal with the def'd function
Calling a base class property setter is ugly, but you could hide that in a helper function (see the sketch after this list)
you might want to use a custom metaclass to ensure the validation code is run (which would also avoid the ugly base-class property call)
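For that helper, something like this would do (a sketch; the name and signature are mine):

def call_base_setter(base, name, instance, value):
    # invoke base's property setter for `name` on `instance`,
    # which runs any attached validator
    base.__dict__[name].__set__(instance, value)

With this, the concrete setter body becomes call_base_setter(AbstractTestClass, 'luminance', self, value) followed by self.val = value.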
I suggested a metaclass above to eliminate the need for a direct call to the base class's abstractproperty, and here is an example of such:
from abc import ABCMeta, abstractproperty

class AbstractTestClassMeta(ABCMeta):

    def __new__(metacls, cls, bases, clsdict):
        # create new class
        new_cls = super().__new__(metacls, cls, bases, clsdict)
        # collect all base class dictionaries
        base_dicts = [b.__dict__ for b in bases]
        if not base_dicts:
            return new_cls
        # iterate through clsdict looking for properties
        for name, obj in clsdict.items():
            if not isinstance(obj, (property)):
                continue
            prop_set = getattr(obj, 'fset')
            # found one, now look in bases for validation code
            validators = []
            for d in base_dicts:
                b_obj = d.get(name)
                if (
                        b_obj is not None and
                        isinstance(b_obj.fset, ValidateProperty)
                        ):
                    validators.append(b_obj.fset)
            if validators:
                def check_validators(self, new_val):
                    for func in validators:
                        func(new_val)
                    prop_set(self, new_val)
                new_prop = obj.setter(check_validators)
                setattr(new_cls, name, new_prop)
        return new_cls
This subclasses ABCMeta, and has ABCMeta do all of its work first, then does some additional processing. Namely:
go through the created class and look for properties
check the base classes to see if they have a matching abstractproperty
check the abstractproperty's fset code to see if it is an instance of ValidateProperty
if so, save it in a list of validators
if the list of validators is not empty
make a wrapper that will call each validator before calling the actual property's fset code
replace the found property with a new one that uses the wrapper as the setter code
ValidateProperty is a little different as well:
class ValidateProperty:

    def __init__(self, exception, arg_type):
        # called on the @ValidateProperty(...) line
        #
        # save the exception to raise and the expected argument type
        self.exception = exception
        self.arg_type = arg_type
        self.validator = None

    def __call__(self, func_or_value):
        # on the first call, func_or_value is the function to use
        # as the validator
        if self.validator is None:
            self.validator = func_or_value
            return self
        # every subsequent call will be to do the validation
        if (
                not isinstance(func_or_value, self.arg_type) or
                not self.validator(None, func_or_value)
                ):
            raise self.exception(
                    '%r is either not a type of %r or is outside '
                    'argument range' %
                    (func_or_value, type(func_or_value))
                    )
The base AbstractTestClass now uses the new AbstractTestClassMeta, and has the validator code directly in the abstractproperty:
class AbstractTestClass(metaclass=AbstractTestClassMeta):

    @abstractproperty
    def luminance(self):
        # abstract property
        pass

    @luminance.setter
    @ValidateProperty(Exception, int)
    def luminance(self, value):
        # abstract property validator
        return 0 <= value <= 15
The final class is the same:
class TestClass(AbstractTestClass):
    # concrete class

    val = 7

    @property
    def luminance(self):
        # concrete property
        return self.val

    @luminance.setter
    def luminance(self, value):
        # concrete property setter
        # no need to call the base class first any more; the metaclass
        # wraps this setter so the validator runs automatically
        # AbstractTestClass.__dict__['luminance'].__set__(self, value)
        self.val = value
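A quick usage sketch of this metaclass variant (mirroring the earlier session):

tc = TestClass()
tc.luminance = 10      # int in range: validator returns True, value stored
tc.luminance = 25      # raises Exception: validator returns False
tc.luminance = 'dark'  # raises Exception: fails the isinstance check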

yaml use key or parent key as value

I’ve just started using YAML (through pyyaml) and I was wondering if there is any way to state that the value of a key is the key name itself or the parent key.
For example
---
foo: &FOO
  bar: !.
  baz: !..
foo2:
  <<: *FOO
...
should give:
{'foo': {'bar': 'bar', 'baz': 'foo'}, 'foo2': {'bar': 'bar', 'baz': 'foo2'}}
(notice the dot and double dot on bar and baz respectively - those are just placeholders for getting the key name and parent key name)
I've tried using add_constructor:
def key_construct(loader, node):
    # return the key here
    pass

yaml.add_constructor('!.', key_construct)
but Node doesn't hold the key (or a reference to the parent) and I couldn't find a way to get them.
EDIT:
So, here is my real use case and a solution based on Anthon's response:
I have a logger configuration file (in yaml), and I wanted to reuse some definitions there:
handlers:
  base: &base_handler
    (): globals.TimedRotatingFileHandlerFactory
    name: ../
    when: midnight
    backupCount: 14
    level: DEBUG
    formatter: generic
  syslog:
    class: logging.handlers.SysLogHandler
    address: ['localhost', 514]
    facility: local5
    level: NOTSET
    formatter: syslog
  access:
    <<: *base_handler
  error:
    <<: *base_handler
loggers:
  base: &base_logger
    handlers: [../, syslog]
    qualname: ../
  access:
    <<: *base_logger
  error:
    <<: *base_logger
    handlers: [../, syslog, email]
The solution, as Anthon suggested, was to traverse the configuration dictionary after it was loaded:
def expand_yaml(val, parent=None, key=None, key1=None, key2=None):
    if isinstance(val, str):
        if val == './':
            parent[key] = key1
        elif val == '../':
            parent[key] = key2
    elif isinstance(val, dict):
        for k, v in val.items():
            expand_yaml(v, val, k, k, key1)
    elif isinstance(val, list):
        parent[key] = val[:]  # support inheritance of the list (as in *base_logger)
        for index, e in enumerate(parent[key]):
            expand_yaml(e, parent[key], index, key1, key2)
    return val
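For completeness, this is roughly how it gets wired up when loading the configuration (the file name logging.yaml is hypothetical):

import yaml

with open('logging.yaml') as f:
    config = expand_yaml(yaml.safe_load(f))

after which config can be handed to e.g. logging.config.dictConfig, assuming the file carries the keys dictConfig expects.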
You don't have much context when you are constructing an element, so you are not going to find your key, and certainly not the parent key, to fill in the values, without digging in the call stack for the context (the loader knows about foo, bar and baz, but not in a way you can use to determine which is the corresponding key or parent_key).
What I suggest you do is create a special node that you return from key_construct, and then, after the YAML load, walk the structure that yaml.load() returned and fill in the values. Unless you have other ! objects, that structure is a pure combination of sequences/lists and mappings/dicts, which is easy to walk ¹:
import ruamel.yaml as yaml

yaml_str = """\
foo: &FOO
  bar: !.
  baz: !..
foo2:
  <<: *FOO
"""

class Expander:
    def __init__(self, tag):
        self.tag = tag

    def expand(self, key, parent_key):
        if self.tag == '!.':
            return key
        elif self.tag == '!..':
            return parent_key
        raise NotImplementedError

    def __repr__(self):
        return "E({})".format(self.tag)

def expand(d, key=None, parent_key=None):
    if isinstance(d, list):
        for elem in d:
            expand(elem, key=key, parent_key=parent_key)
    elif isinstance(d, dict):
        for k in d:
            v = d[k]
            if isinstance(v, Expander):
                d[k] = v.expand(k, parent_key)
            expand(d[k], key, parent_key=k)
    return d

def key_construct(loader, node):
    return Expander(node.tag)

yaml.add_constructor('!.', key_construct)
yaml.add_constructor('!..', key_construct)

data = yaml.load(yaml_str)
print(data)
print(expand(data))
gives you:
{'foo': {'bar': E(!.), 'baz': E(!..)}, 'foo2': {'bar': E(!.), 'baz': E(!..)}}
{'foo': {'bar': 'bar', 'baz': 'foo'}, 'foo2': {'bar': 'bar', 'baz': 'foo2'}}
¹ This was done using ruamel.yaml of which I am the author. PyYAML, of which ruamel.yaml is a functional superset, should work the same.
