When using sphinx's automodule (https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html),
I simply write in a .rst file:
.. automodule:: my_module
   :members:
It documents my_module fine, but it doesn't find the inner modules like my_module.inner_module0 and my_module.inner_module1. Is there something that needs to be specified in the __init__.py file besides the __all__ variable?
Also, I'm aware of sphinx-apidoc. But that command documents far too much (exposes every function/folder including undocumented ones).
It sounds like you want to give the automodule directive a package name and have it recurse into the directory and document each Python module. That isn't supported yet. You will need to specify the full dotted module name for each module you want to document.
For example, given the following directory structure (from the Python documentation), you cannot specify .. automodule:: sound.formats and have it document all the modules in the directory. You will have to write an automodule directive for each module: .. automodule:: sound.formats.wavread, .. automodule:: sound.formats.wavwrite, and so on.
sound/                      Top-level package
      __init__.py           Initialize the sound package
      formats/              Subpackage for file format conversions
              __init__.py
              wavread.py
              wavwrite.py
              aiffread.py
              aiffwrite.py
              auread.py
              auwrite.py
              ...
      effects/              Subpackage for sound effects
              __init__.py
              echo.py
              surround.py
              reverse.py
              ...
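In practice the .rst file ends up with one directive per module, along these lines (the :members: option here is just illustrative):

.. automodule:: sound.formats.wavread
   :members:

.. automodule:: sound.formats.wavwrite
   :members: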
It seems to me that using the :imported-members: option (see the autodoc documentation) should now be possible, if __init__.py imports those submodules.
However, I'm personally not able to make this work (yet).
EDIT: Possibly a known bug.
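For reference, the combination I expected to work looks like this (a sketch; the names follow the question):

# my_module/__init__.py
from . import inner_module0, inner_module1

and, in the .rst file:

.. automodule:: my_module
   :members:
   :imported-members: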
I successfully used rules_go to build a gRPC service:
go_proto_library(
    name = "processor_go_proto",
    compilers = ["@io_bazel_rules_go//proto:go_grpc"],
    importpath = "/path/to/proto/package",
    proto = ":processor_proto",
    deps = ["//services/shared/proto/common:common_go_proto"],
)
However, I'm not sure how to import the resulting file in VSCode. The generated file is nested under bazel-bin and under the original proto file path, so to import it, it seems I would need to write out the entire path (including the bazel-bin part) to the generated Go file. As far as I can tell, there is no way to instruct VSCode to look under certain folders that only contain Go packages/files; everything seems to need a go.mod file. This makes it quite difficult to develop in.
For clarity, my directory structure looks something like this:
WORKSPACE
bazel-bin/
    path/
        to/
            generated_Go_file.go
go.mod
go.sum
proto/
    path/
        to/
            gRPC_proto.proto
main.go
main.go should use the generated_Go_file.go.
Is there a way around this?
I don't use Bazel and so cannot help with the Bazel configuration. It's likely there is a way to specify the generated code's location so that you can revise this to reflect your preference.
The layout you describe for the generated code is workable, though, and a common pattern. Often the generated proto|gRPC code is placed in a module's gen subdirectory. This is somewhat similar to vendoring, where your code incorporates what may often be a 3rd-party's stubs (client|server). The stubs must reflect the proto(s) package(s) and, when these are 3rd-party, using gen or bazel-bin provides a way to keep potentially multiple namespaces discrete.
You're correct that the import in main.go could (!) be prefixed with the module name from go.mod (its first line) followed by the folder path to the generated code. This is standard Go packaging and treats the generated code in a similar way to vendored modules.
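A minimal sketch of such an import, assuming the first line of go.mod reads module example.com/myservice (a hypothetical name) and the layout shown above:

// main.go
package main

import (
    // module name + folder path to the generated package; shown as a
    // blank import for brevity, in practice you would alias it and
    // call the generated client/server stubs
    _ "example.com/myservice/bazel-bin/path/to"
)

func main() {}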
Another approach is to use|place the generated code in a different module.
For code generated from 3rd-party protos, this may be preferable and the generated code may be provided by the 3rd-party in a module that you can go get or add to your go.mod.
An example of this approach is Google's Well-Known Types. The proto sources are bundled with protoc (in its lib directory) and, when protoc compiles sources that reference any of these, the generated Go code includes imports that reference a Google-hosted location of the generated code (!) for these types (google.golang.org/protobuf/types/known).
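For instance, a .proto file that uses google.protobuf.Timestamp compiles to Go code importing the hosted package for that type:

import "google.golang.org/protobuf/types/known/timestamppb"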
Alternatively, you can replicate this behavior without using an external repo. The bazel-bin folder must be outside of the current module. Each distinct module in bazel-bin would need its own go.mod file. Your code's go.mod would then require the module(s) and use a replace directive (replace name => path/to/module) to point each one at its local location. You don't need to publish the module to an external repo; the replace provides a local reference.
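A sketch of what the consuming go.mod could look like under that layout (module names and paths are hypothetical):

module example.com/myservice

go 1.16

require example.com/generated v0.0.0

// the generated module has its own go.mod and lives outside this
// module's tree; replace points the requirement at that local path
replace example.com/generated => ../bazel-bin/path/to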
I'm building documentation for a platform that includes modules. I would like to let the documentation live in these modules' repositories and include it in the "master" doc with the include directive.
I tried the following :
.. include:: https://github.com/12rambau/sepal_ui_template/blob/master/doc/en.rst
But nothing was added to the file
Is it possible to use an absolute link in the include command?
My main objective is not to use the include command but to avoid code duplication and use a file that is available on the web. Based on @Steve Piercy's answer I came up with this solution:
In the conf.py file I copy the content of the file from GitHub (be careful to use the raw.githubusercontent.com link to avoid importing HTML):
# [...]
# -- Copy the modules documentation ------------------------------------------
from urllib.request import urlretrieve

# fetch the raw .rst from GitHub into the local doc tree
# (the modules/ directory must already exist)
urlretrieve(
    "https://raw.githubusercontent.com/12rambau/sepal_ui_template/master/doc/en.rst",
    "modules/sepal_ui_template.rst",
)
After that, the file is created under modules/sepal_ui_template.rst in my documentation and I can safely access it. It will be downloaded again every time I rebuild my documentation.
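If the repeated download is ever unwanted, a small guard in conf.py can skip it when the file already exists (a sketch using the same URL and path as above):

import os
from urllib.request import urlretrieve

dest = "modules/sepal_ui_template.rst"
os.makedirs(os.path.dirname(dest), exist_ok=True)  # make sure modules/ exists
if not os.path.exists(dest):  # skip the download when a copy is already present
    urlretrieve(
        "https://raw.githubusercontent.com/12rambau/sepal_ui_template/master/doc/en.rst",
        dest,
    )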
No. A fully qualified URL is not relative to the document. According to the docs for the include directive:
The directive argument is the path to the file to be included, relative to the document containing the directive.
There are alternatives, including this one.
I have a Python project using Sphinx for docs. I am building the docs remotely on readthedocs.io service.
I used sphinx-quickstart and it generated an index.rst file with these links in the footer:
Indices and tables
~~~~~~~~~~~~~~~~~~
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
When I push changes to readthedocs.io and build the docs, my build succeeds. Docs that I manually linked via toctree directive all work fine.
The search link works fine.
But the genindex link goes to an empty page, titled "Index"
And the modindex page links to py-modindex.html, which is a 404.
Following this guide: https://samnicholls.net/2016/06/15/how-to-sphinx-readthedocs I had run sphinx-apidoc -o api-docs/ ../myproject to generate the autodoc .rst files.
I linked the resulting api-docs/modules.rst in the toctree section at the top of my index.rst... That link works, and if I click through, the api-docs have been generated correctly.
Also generated by sphinx-apidoc were files for each package in my project; they contain directives like:
myproject.whatever module
-------------------------

.. automodule:: myproject.whatever
   :members:
   :undoc-members:
   :show-inheritance:
If I browse directly to these pages they have content, but they don't appear in the index (only in the tocs they are manually linked in).
I also have some manually authored pages, again linked via toc.
My docs/conf.py looks like:
import os
import sys

sys.path.insert(0, os.path.abspath("../"))

extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.viewcode",
    "sphinx.ext.napoleon",
    "sphinx_autodoc_typehints",
]

templates_path = ["_templates"]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

html_theme = "alabaster"
html_static_path = ["_static"]
I believe the fact that the HTML generated from the autodoc .rst stub files is filled with modules and classes extracted from the .py files in my project indicates that the sys.path fix and autodoc are basically working.
So my questions are:
How do I make :ref:`genindex` have some content?
How do I fix :ref:`modindex` pointing to py-modindex.html, which does not exist?
genindex and modindex are automatically managed by Sphinx. Two cases should be considered:
Any declaration in a .rst file will be inserted in those indexes. For example, if you declare a function from the Python domain:
Your rst file
-------------
.. py:function:: name(parameters)
It will be inserted in the indexes even if it doesn't have a corresponding function in any .py file.
Using autodoc directives, the same applies with a few more rules. The autodoc extension will substitute domain declarations (like the one above) depending on whether the object has a docstring and whether you are using the :members: and/or :undoc-members: options. So you have to verify you are using the correct option and directive for your case.
Your rst file
-------------

.. autoclass:: Your_Module.Your_Class
   :members:
The above example will be substituted by a :py:class:: domain declaration if the corresponding class has a docstring; if not, you need to add the :undoc-members: option.
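To illustrate with hypothetical code:

class Documented:
    """Has a docstring, so :members: alone declares and indexes it."""

class Undocumented:
    # No docstring: only declared (and indexed) when the
    # :undoc-members: option is also given.
    pass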
The symptoms you are describing, of empty indexes, happen when you haven't declared anything in the .rst files, with the nuance that the autodoc directives may or may not make those declarations for you, depending on whether the objects have docstrings and whether you used the right options in the directives.
EDIT: You should also run make clean before your builds (e.g. make html) because inconsistencies between builds can break the index.
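For example, from the docs directory:

make clean
make html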
As I eventually worked out in the comments, thanks to help from @bad_coder, my problem was specific to building the docs on readthedocs.io
Building the docs locally worked fine.
The reason came down to use of sphinx.ext.autodoc, perhaps in conjunction with sphinx_autodoc_typehints, which seems to need to import my actual Python code. Checking the logs of my apparently successful readthedocs build showed that there were actually warnings like:
WARNING: autodoc: failed to import module 'whatever' from module 'myproject'; the following exception was raised:
No module named 'somelib'
i.e. the docs had only partially built; it had skipped the parts it couldn't do.
The build worked locally because I was already in a virtualenv with all my project's dependencies installed.
(IMHO this seems like bad design in sphinx.ext.autodoc and/or sphinx_autodoc_typehints ... good static-analysis tools exist for Python which can build an AST or CST and extract structure and docstrings without importing any code.)
Well anyway, this meant that I needed to tell readthedocs.io how to install all my project deps. This is slightly complicated by the fact that I'm using Poetry, which is not explicitly supported. That means I don't have a requirements.txt file to point to (and I don't want to manually create one that duplicates everything in my pyproject.toml).
Fortunately the pyproject.toml file is understandable by pip, so we're able to use the pip install method described here for readthedocs.io to install both my project deps, plus extra deps that are only needed for building docs: https://github.com/readthedocs/readthedocs.org/issues/4912#issuecomment-664002569
To summarise:
Deleted my docs/requirements.txt
Added:
[tool.poetry.dependencies]
...
sphinx = {version = "^3.1.1", optional = true}
sphinx-autodoc-typehints = {version = "^1.11.1", optional = true}
and:
[tool.poetry.extras]
docs = [
    "sphinx",
    "sphinx-autodoc-typehints"
]
to my pyproject.toml
Updated my .readthedocs.yml to:
version: 2

sphinx:
  configuration: docs/conf.py

python:
  version: 3.8
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs
Pushed these changes to readthedocs.io ...voilà, now it works.
Using the defgroup Doxygen keyword in a Doxygen comment block, it is possible to define a "module". Then, using the ingroup Doxygen keyword in any other Doxygen comment block, even in other source files, it is possible to add C++ classes and enums to the defined module.
Then, running doxygen doxyfile, the documentation is generated, and it has a nice modules tab, where all the defined modules are listed, one per line.
I have a C++ project and I would like to also have the module list that Doxygen generates available in Sphinx. I activated the autodoc, breathe and exhale Sphinx extensions.
Running make html runs Doxygen and generates the Sphinx documentation, but the list of modules generated by Doxygen is missing from the Sphinx output. In conf.py I have:
# Setup the `exhale` extension
exhale_args = {
    # These arguments are required.
    "containmentFolder": "./api",
    "rootFileName": "library_root.rst",
    "rootFileTitle": "Library API",
}
The api/library_root.html generated by Sphinx/exhale has a nice expandable list of all namespaces, each containing its classes and enums. It also has another nice expandable list of all directories in the project, each with all its files.
So my question is this: how can I get Sphinx to also generate the list of modules that Doxygen has no trouble generating? It doesn't matter whether it is in library_root.html or another HTML file.
I found that if I use the \name Doxygen keyword, like this: \name module1 (the leading backslash can also be an at symbol), in a Doxygen comment, breathe seems to recognize it, whereas it seems to not recognize, and ignores, \defgroup/\ingroup.
So, in an rst file, I can then manually say
.. doxygengroup:: module1
   :outline:

.. doxygengroup:: module2
   :outline:

.. doxygengroup:: module3
   :outline:
to get the list of modules (they are labeled "group", not "module"). But I want this to be autogenerated, like in Doxygen, where I just define the modules in Doxygen comments in the source and they all appear in the modules tab of the Doxygen output without my having to do anything else. I would prefer that they are labeled "module", but I can live with "group" as long as it works.
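Something like the following in conf.py is the kind of autogeneration I have in mind (a sketch; the group names and output file are hypothetical):

# conf.py -- generate the group listing instead of writing it by hand
groups = ["module1", "module2", "module3"]

with open("modules.rst", "w") as f:
    f.write("Modules\n=======\n\n")
    for group in groups:
        # one outline directive per Doxygen group
        f.write(".. doxygengroup:: {}\n   :outline:\n\n".format(group))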
I also tried adding in index.rst
.. automodule:: My Project
   :members:
before .. toctree::, but that has no effect; it probably only works for Python.
I use Linux and have Sphinx 1.6.7 on one machine and 1.8.5 on another, Python 2.7
I have a go package located on my filesystem (not in the $GOPATH), called bitbucket.org/me/awesome.
~/awesome> tree
.
├── main.go
├── go.mod
├── go.sum
└── subpackageA
    └── main.go
My go.mod looks like:
module bitbucket.org/me/awesome

require (
    ... // lots of external dependencies
)

replace bitbucket.org/me/awesome => ./
In main.go in my top-level directory, I import a subpackage as follows:
import "bitbucket.org/me/awesome/subpackageA"
which all seems pretty normal. go get works. However, when I clone this entire repository somewhere else (say in a Docker image) and run go get for the first time, I get errors like:
package bitbucket.org/me/awesome/subpackageA: https://api.bitbucket.org/2.0/repositories/me/awesome?fields=scm: 403 Forbidden,
which means it's not using the local filesystem version of the packages, even though I told it to with the replace directive in the go.mod file.
What am I doing wrong? How do I ensure that subpackages are used from the filesystem instead of attempting to be fetched from the internet?
Go has no (real) notion of "subpackage". All packages are basically treated equal. This means that a replace bitbucket.org/me/awesome does not influence package bitbucket.org/me/awesome/subpackageA as these are two individual, unrelated packages. The folder layout does not introduce a relation of subpackageA to awesome, or the other way around *).
So you need to add an individual replace directive for subpackageA:
replace bitbucket.org/me/awesome/subpackageA => ./subpackageA
*) Nitpicking for absolute correctness: folder layout does have influence for folders named internal (these cannot be imported from other projects) and for folders named vendor (which may contain vendored packages), and searching for a go.mod file stops at the repo root.
For another approach, you can have go.mod like this:
module awesome
Then import the subpackage like this:
import "awesome/subpackageA"
https://golang.org/doc/code.html