Re: Using sphinx-apidoc to automatically generate API reference from docstrings via the autodoc extension...
I copied the global /site-packages/sphinx/templates/apidoc/package.rst_t template to a local folder and made a nonsense edit. When I built the API docs, the nonsense edit wasn't visible, i.e. the local templates didn't seem to override the global ones.
Here's my local docs/source directory:
docs
|_ source
| |_ templates
| | |_ package.rst_t
| |_ conf.py
| |_ index.rst
conf.py contains this directive:
templates_path = ['templates']
I built the API docs using this command:
sphinx-apidoc -e -M -f --templatedir=./templates -o ./source/autodoc ../myproject
Can anyone see what's going wrong?
my-project
|_ kube
| |_ kustomize
| | |_ base
| | | |_ sql
| | | | |_ dbsturct
| | | | | |_ liqubase
| | | | | | |_ db.changelog-master.xml
|_ src
| |_ my.java.code
| |_ resources
In my project I keep db.changelog-master.xml outside the resources folder, so it is not on the classpath. This folder structure is not included in pom.xml as part of the resources.
I have configured spring.liquibase.change-log=file:///C:/my-project/kube/kustomize/base/sql/dbsruct/liquibase/db.changelog-master.xml
It throws a FileNotFoundException.
Is there a way to configure db.changelog-master.xml without including it as a resources folder in pom.xml?
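For illustration only (a guess, not a confirmed fix): Spring Boot resolves spring.liquibase.change-log as a Spring Resource, so a file: URL should be able to point outside the classpath. The path below is copied from the tree above; note that its spelling (dbsturct/liqubase) differs from the one used in the configured property (dbsruct/liquibase), which by itself would explain a file-not-found error:

```properties
# Hedged sketch: a file: URL relative to the working directory,
# using the directory names exactly as shown in the tree above.
spring.liquibase.change-log=file:kube/kustomize/base/sql/dbsturct/liqubase/db.changelog-master.xml
```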
I am trying to write a GitHub action that makes a copy of a file in my repo into a different subdir in the repo. This is my dir structure (before the file is copied)
my_project
|_ .github
| |_ workflows
| | |_ run_tests.yml
| | |_ copy_file.yml
|_ tests
| |_ testthat
| | |_ test1.R
| | |_ test2.R
|_ my_file.json
|_ copyfile.sh
This is what I want my file structure to be after the file is copied
my_project
|_ .github
| |_ workflows
| | |_ run_tests.yml
| | |_ copy_file.yml
|_ tests
| |_ testthat
| | |_ test1.R
| | |_ test2.R
| |_ copy_of_my_file.json
|_ my_file.json
|_ copyfile.sh
So basically, my GitHub action should make a copy of my_file.json named copy_of_my_file.json and place it inside the tests subdirectory. I've built a bash script containing the following:
#!/bin/bash
cp my_file.json tests/copy_of_my_file.json
Locally, I run chmod u+x copyfile.sh and then bash copyfile.sh and the file is indeed copied. My copy_file.yml workflow is as follows:
name: Copy my_file.json to tests subdirectory
on: [push, pull_request, workflow_dispatch]
jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout 🛎️
        uses: actions/checkout@v2
      - name: Make copy of my_file.json 📚
        run: |
          chmod u+x "${GITHUB_WORKSPACE}/copyfile.sh"
          bash "${GITHUB_WORKSPACE}/copyfile.sh"
The action runs with no errors, but the file doesn't get copied. I tried other actions from the GitHub Marketplace with no luck. I also tried changing the action's run step to a plain cp my_file.json tests/copy_of_my_file.json, but that doesn't work either.
The problem was that the file needed to be committed to the repo. You can write your own commit step or pick an action from the GitHub Actions Marketplace.
In the background, what happens is that the action runs on a GitHub-hosted runner. The checkout action fetches the files from the repo, and anything done afterward happens on that runner. Therefore, anything that needs to be "saved" must be pushed back to the repo or uploaded as an action artifact.
Is it possible to hard-code dependencies into libraries built with Bazel? The reason is that if I build somelib I can use it in the workspace, but as soon as I copy the lib somewhere else I lose all its dependencies (Bazel cache). Which creates a problem when I want to deploy or install the libraries into the system.
some_folder
|_ thirdparty
| |_ WORKSPACE
| |_ somelib
| | |_ src
| | | |_ a.c
| | |_ BUILD
| | |_ include
| | | |_ a.h
| |_ include
| | |_ b.h
It sounds like you want to build a fully statically linked library. This can be done in Bazel by building the library with cc_binary and the linkshared attribute set to True. According to the documentation, you also have to name your library libfoo.so or similar.
What enables the static library here is the behavior of cc_binary's linkstatic attribute. When True, which is the default, all dependencies that can be linked statically into the binary will be. Note that linkstatic does NOT behave the same on cc_library; see the documentation.
So, basically you want something like this in your BUILD file
cc_binary(
    name = "libfoo.so",
    srcs = [...],  # cc_binary has no hdrs attribute; list private headers in srcs
    linkshared = True,
    # linkstatic = True is the default for cc_binary, so there is no need to set it.
)
Good luck!
I'm experimenting with Sphinx and ReadTheDocs (RTD) to compile my documentation on every GitHub push. Unfortunately, RTD finds multiple doc/docs folders containing a conf.py file.
My repository uses git submodules to embed third-party libraries, some of which are also documented with Sphinx. I assume the biggest (longest-running documentation build) wins and overwrites all static HTML pages in the final RTD view.
How can I exclude or tell RTD to ignore the folders of these sub-modules:
lib/cocotb
lib/osvvm
lib/vunit
docs/source/_themes/sphinx_rtd_theme
My documentation is located here:
docs/source/conf.py
docs/source/index.rst
As far as I have found, RTD does support *.yml config files, but there is no entry to define the documentation root folder.
Any ideas to solve my problem?
Inside conf.py, there is a list that looks like this
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
You can put the patterns you want to ignore inside it, like:
exclude_patterns = ["lib/cocotb", "lib/osvvm", "lib/vunit", "docs/_themes/sphinx_rtd_theme"]
Please note that the patterns are relative to the source directory; you can put a / at the beginning of each pattern above to make this clearer.
The main documentation folder and its conf.py can be configured in the Advanced Settings tab in the per project settings.
Example value: documentation/conf.py
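As a possibly newer alternative to the per-project settings page (a sketch assuming RTD's v2 config file format, which did gain such an entry): a .readthedocs.yaml at the repository root can point at the conf.py directly and skip submodule checkout entirely:

```yaml
# .readthedocs.yaml — hedged sketch using the v2 config schema
version: 2
sphinx:
  configuration: docs/source/conf.py
submodules:
  exclude: all
```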
Is it possible to have a subdirectory for all my pages?
Currently:
rake new_page['siht-daer-uoy-nac']
generates the markdown files like so:
source/
|_ _posts
|_ <some-other-directories>
|_ siht-daer-uoy-nac
| |_ index.markdown
then doing a
rake generate
takes care of everything and spurts out a pretty html file.
My problem/question:
Call me OCD-afflicted, but I would like to have a directory structure like so:
source/
|_ _posts
|_ <some-other-directories>
|_ _pages
| |_ siht-daer-uoy-nac
| | |_ index.markdown
Having my top directory structure littered with a bunch of page slugs makes me cry a little inside. I understand Jekyll is merely a static site generator and plays its part only up to the point of HTML generation (and deployment).
Is it possible to maintain this kind of a folder structure for my pages?
Update:
I don't think it's possible out of the box to have a _pages directory without significantly messing around with the Octopress/Jekyll source code (one of these days). In the meantime, a workaround to group a bunch of similar page-like posts is, as ngm suggested below:
rake new_page["osx-essential-software/2011.markdown"]
# creates /source/osx-essential-software/2011/index.markdown
rake new_page["osx-essential-software/2010.markdown"]
# creates /source/osx-essential-software/2010/index.markdown
Not sure if it's by design or by accident, but if you do:
rake new_page["foo/nac-i-sey"]
you'll get a page for nac-i-sey under a foo subdirectory.
So if you wanted another page under foo, you could do:
rake new_page["foo/another-page"]