I'm building documentation for a platform that is made up of modules. I would like the documentation to live in each module's repository and be included in the "master" doc with the include directive.
I tried the following:
.. include:: https://github.com/12rambau/sepal_ui_template/blob/master/doc/en.rst
But nothing was added to the file.
Is it possible to use an absolute link in the include directive?
My main objective is not to use the include directive per se, but to avoid code duplication by using a file that is available on the web. Based on @Steve Piercy's answer, I came up with this solution:
In the conf.py file, I copy the content of the file from GitHub.
Be careful to use the raw.githubusercontent.com link to avoid importing HTML.
# [...]
# -- Copy the modules documentation ------------------------------------------
from urllib.request import urlretrieve

urlretrieve(
    "https://raw.githubusercontent.com/12rambau/sepal_ui_template/master/doc/en.rst",
    "modules/sepal_ui_template.rst",
)
After that, the file is created under modules/sepal_ui_template.rst in my documentation and I can safely access it.
It will be downloaded again every time I rebuild my documentation.
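Note that urlretrieve can fail on a clean checkout if the target folder does not exist yet; a minimal, hedged variant of the snippet above that creates it first (same folder layout, nothing else assumed):

# -- Copy the modules documentation ------------------------------------------
from pathlib import Path
from urllib.request import urlretrieve

# make sure the target folder exists before downloading into it
Path("modules").mkdir(exist_ok=True)
urlretrieve(
    "https://raw.githubusercontent.com/12rambau/sepal_ui_template/master/doc/en.rst",
    "modules/sepal_ui_template.rst",
)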
No. A fully qualified URL is not relative to the document. According to the docs for the include directive:
The directive argument is the path to the file to be included, relative to the document containing the directive.
There are alternatives, including this one.
I successfully used rules_go to build a gRPC service:
go_proto_library(
    name = "processor_go_proto",
    compilers = ["@io_bazel_rules_go//proto:go_grpc"],
    importpath = "/path/to/proto/package",
    proto = ":processor_proto",
    deps = ["//services/shared/proto/common:common_go_proto"],
)
However, I'm not sure how to import the resulting file in VSCode. The generated file is nested under bazel-bin and under the original proto file path, so to import it, it seems like I would need to write out the entire path (including the bazel-bin part) to the generated Go file. To my understanding, there doesn't seem to be a way to instruct VSCode to look under certain folders that only contain Go packages/files; everything seems to need a go.mod file. This makes it quite difficult to develop in.
For clarity, my directory structure looks something like this:
WORKSPACE
bazel-bin
- path
- to
- generated_Go_file.go
go.mod
go.sum
proto
- path
- to
- gRPC_proto.proto
main.go
main.go should use the generated_Go_file.go.
Is there a way around this?
I don't use Bazel and so cannot help with the Bazel configuration. It's likely there is a way to specify the generated code location so that you can revise this to reflect your preference.
The layout you describe for the generated code is workable, though, and a common pattern. Often the generated proto|gRPC code is placed in a module's gen subdirectory.
This is somewhat similar to vendoring, where your code incorporates what may often be a 3rd-party's stubs (client|server) into your code. The stubs must reflect the proto(s) package(s) and, when these are 3rd-party, using gen or bazel-bin provides a way to keep potentially multiple namespaces discrete.
You're correct that the import for main.go could (!) be prefixed with the module name from go.mod (first line) followed by the folder path to the generated code. This is standard Go packaging and treats the generated code in a similar way to vendored modules.
Another approach is to use|place the generated code in a different module.
For code generated from 3rd-party protos, this may be preferable, and the generated code may be provided by the 3rd-party in a module that you can go get or add to your go.mod.
An example of this approach is Google's Well-Known Types. The protos (sources) are bundled with protoc (lib directory) and, when protoc compiles sources that reference any of these, the Go code that is generated includes imports that reference a Google-hosted location of the generated code (!) for these types (google.golang.org/protobuf/types/known).
Alternatively, you can replicate this behavior without having to use an external repo. The bazel-bin folder must be outside of the current module. Each distinct module in bazel-bin would need its own go.mod file. You would include in a require block in your code's go.mod file references to the modules' (one or more) locations. You don't need to publish the modules to an external repo but can simply add a replace directive (name => path/to/module) to provide a local reference.
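A rough sketch of what such a go.mod could look like (the module paths, the version placeholder, and the relative location of the generated module are all hypothetical):

// go.mod of the consuming code; every name below is illustrative
module example.com/myservice

go 1.18

// the generated code lives in its own module, with its own go.mod
require example.com/processor_go_proto v0.0.0

// unpublished: resolve the require from a local directory instead
replace example.com/processor_go_proto => ../path/to/generated/module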
go.mod's replace directive is a local configuration option: different developers could have the local module source in different locations.
It just feels wrong including this option in a file that has to be committed to a repo from which others can use the module (be it private or public).
Is there a way to specify this somewhere else than in go.mod?
Example:
https://github.com/Drean64/c64/blob/master/src/go.mod#L5
module github.com/Drean64/c64
go 1.18
replace github.com/Drean64/cpu6502 => ../../cpu6502/src
The replace directive is a temporary solution when you want to use local modules, but I prefer to use build flags. The -modfile flag below works well, and you can use it while building or running the program.
Example: go run -modfile=local.mod main.go
-modfile file
in module aware mode, read (and possibly write) an alternate go.mod
file instead of the one in the module root directory. A file named
"go.mod" must still be present in order to determine the module root
directory, but it is not accessed. When -modfile is specified, an
alternate go.sum file is also used: its path is derived from the
-modfile flag by trimming the ".mod" extension and appending ".sum".
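In other words, local.mod is just an alternate go.mod kept next to the real one; a hedged sketch based on the example module above (the require version is a placeholder):

module github.com/Drean64/c64

go 1.18

require github.com/Drean64/cpu6502 v0.0.0

replace github.com/Drean64/cpu6502 => ../../cpu6502/src

The committed go.mod stays free of the replace line, and go run -modfile=local.mod main.go uses local.sum alongside it, as described above.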
I use the replace directive only when I need a temporary solution.
different developers could have the local module source in different locations.
By using a relative path, you could reference a folder which is a submodule of your main repository project, which means all developers would benefit from the same local replace directive.
Is there a way to specify this somewhere else than in go.mod?
It does not seem so: the replace directive is linked to go.mod, and:
replace directives only apply in the main module’s go.mod file and are ignored in other modules.
I have a Python project using Sphinx for docs. I am building the docs remotely on readthedocs.io service.
I used sphinx-quickstart and it generated an index.rst file with these links in the footer:
Indices and tables
~~~~~~~~~~~~~~~~~~
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
When I push changes to readthedocs.io and build the docs, my build succeeds. Docs that I manually linked via toctree directive all work fine.
The search link works fine.
But the genindex link goes to an empty page, titled "Index"
And the modindex page links to py-modindex.html, which is a 404.
Following this guide: https://samnicholls.net/2016/06/15/how-to-sphinx-readthedocs I had run sphinx-apidoc -o api-docs/ ../myproject to generate the autodoc .rst files.
I linked the resulting api-docs/modules.rst in the toctree section at the top of my index.rst... That link works, and if I click through, the api-docs have been generated correctly.
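For reference, the toctree entry looks roughly like this (a sketch; the :maxdepth: value is arbitrary):

.. toctree::
   :maxdepth: 2

   api-docs/modules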
Also generated by sphinx-apidoc were files for each package in my project; they contain directives like:
myproject.whatever module
-------------------------
.. automodule:: myproject.whatever
   :members:
   :undoc-members:
   :show-inheritance:
If I browse directly to these pages they have content, but they don't appear in the index (only in the toctrees they are manually linked from).
I also have some manually authored pages, again linked via toc.
My docs/conf.py looks like:
import os
import sys

sys.path.insert(0, os.path.abspath("../"))

extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.viewcode",
    "sphinx.ext.napoleon",
    "sphinx_autodoc_typehints",
]

templates_path = ["_templates"]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

html_theme = "alabaster"
html_static_path = ["_static"]
I believe the fact that the HTML generated from the autodoc .rst stub files is filled with modules and classes extracted from the .py files in my project indicates that the sys.path fix and autodoc are basically working.
So my question is:
How do I make :ref:`genindex` have some content?
How do I fix :ref:`modindex`, which points to py-modindex.html, a page that does not exist?
genindex and modindex are automatically managed by Sphinx. Two cases should be considered:
Any declaration in a .rst file will be inserted in those indexes. For example, if you declare a function from the Python domain:
Your rst file
-------------
.. py:function:: name(parameters)
It will be inserted in the indexes even if it doesn't have a corresponding function in any .py file.
Using autodoc directives, the same applies, with a few more rules. The autodoc extension will substitute domain declarations (like the one above) depending on whether the object has a docstring and whether you are using the :members: and/or :undoc-members: options. So you have to verify you are using the correct option and directive for your case.
Your rst file
-------------
.. autoclass:: Your_Module.Your_Class
   :members:
The above example will be substituted by a :py:class:: domain declaration if the corresponding class has a docstring; if not, you need to add the :undoc-members: option.
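That is, for a class without docstrings the directive would need to become something like:

.. autoclass:: Your_Module.Your_Class
   :members:
   :undoc-members: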
The symptoms you are describing, of having empty indexes, happen when you haven't declared anything in the .rst files, with the nuance that the autodoc directives may or may not make those declarations for you, depending on whether the objects have docstrings and whether you used the right options in the directives.
EDIT: You should also run make clean before your builds (e.g. make html), because inconsistencies between builds can break the index.
As I eventually worked out in the comments, thanks to help from @bad_coder, my problem was specific to building the docs on readthedocs.io.
Building the docs locally worked fine.
The reason came down to the use of sphinx.ext.autodoc, perhaps in conjunction with sphinx_autodoc_typehints, which seems to need to import my actual Python code. Checking the logs of my apparently successful readthedocs build showed that there were actually warnings like:
WARNING: autodoc: failed to import module 'whatever' from module 'myproject'; the following exception was raised:
No module named 'somelib'
i.e. the docs had only partially built; it had skipped the parts it couldn't do.
The build worked locally because I was already in a virtualenv with all my project's dependencies installed.
(IMHO this seems like a bad design of sphinx.ext.autodoc and/or sphinx_autodoc_typehints... good static-analysis tools exist for Python which can build an AST or CST and extract structure and docstrings without importing any code.)
Well anyway, this meant that I needed to tell readthedocs.io how to install all my project deps. This is slightly complicated by the fact I'm using Poetry, which is not explicitly supported. This means I don't have a requirements.txt file to point to (and I don't want to manually create one that is a duplicate of everything in my pyproject.toml).
Fortunately the pyproject.toml file is understandable by pip, so we're able to use the pip install method for readthedocs.io described here to install both my project deps and the extra deps that are only needed for building docs: https://github.com/readthedocs/readthedocs.org/issues/4912#issuecomment-664002569
To summarise:
Deleted my docs/requirements.txt
Added:
[tool.poetry.dependencies]
...
sphinx = {version = "^3.1.1", optional = true}
sphinx-autodoc-typehints = {version = "^1.11.1", optional = true}
and:
[tool.poetry.extras]
docs = [
    "sphinx",
    "sphinx-autodoc-typehints"
]
to my pyproject.toml
Updated my .readthedocs.yml to:
version: 2

sphinx:
  configuration: docs/conf.py

python:
  version: 3.8
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs
Pushed these changes to readthedocs.io ...voilà, now it works.
I am using asciidoc with asciidoctor to create documentation for a current project.
I notice there is markup to include files in the documentation, like so:
link:index.html
or
link:protocol.json[Open the JSON file]
Is it possible to include a docx file as a link so that it would open externally or could be downloaded?
Also can I put this file in a folder inside my asciidoc directory (for the sake of organization) and still be able to properly reference it?
You write something like this:
Open this link:somefile.docx[Word file] should work.
Or this link:file:///C:/Users/xxx/docs/otherfile.docx[second file].
It works with relative or absolute paths.
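For the organization part of the question, a relative path into a subfolder next to the document works the same way (the files/ folder name is just an example):
link:files/somefile.docx[Word file]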
You need to ensure that the path to your file will be correct for the reader of your document.
Example: if you put the HTML files produced by Asciidoctor on a webserver (public or intranet), having a path referencing your local C: is not a good idea.
It is hard to tell you what to do without knowledge of your publication/distribution toolchain.
I'm experimenting with Sphinx and ReadTheDocs (RTD) to compile my documentation on every GitHub push. Unfortunately, RTD found multiple doc/docs folders containing a conf.py file.
My repository uses git submodules to embed third-party libraries. Some of them are also documented using Sphinx. I assume the biggest (longest-lasting) documentation build wins and overwrites all static HTML pages in the final RTD view.
How can I exclude or tell RTD to ignore the folders of these sub-modules:
lib/cocotb
lib/osvvm
lib/vunit
docs/source/_themes/sphinx_rtd_theme
My documentation is located here:
docs/source/conf.py
docs/source/index.rst
As far as I have found, RTD does support *.yml files, but there is no entry to define the documentation root folder.
Any ideas to solve my problem?
Inside conf.py, there is a list that looks like this:
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
You can put the files you want to ignore inside it, like:
exclude_patterns = ["lib/cocotb", "lib/osvvm", "lib/vunit", "docs/_themes/sphinx_rtd_theme"]
Please note that the patterns here are relative to the source directory; you can put / at the beginning of each pattern above to make this more explicit.
The main documentation folder and its conf.py can be configured in the Advanced Settings tab of the per-project settings.
Example value: documentation/conf.py
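If you would rather keep this in the repository than in the web UI, the v2 .readthedocs.yml format also accepts a sphinx.configuration key pointing at the conf.py to build; a minimal sketch using the paths from the question:

version: 2

sphinx:
  configuration: docs/source/conf.py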