Build process customization
Read the Docs has a well-defined build process that works for many projects. We also allow customization of builds in two ways:
- Extend the build process: keep using the default build process, adding your own commands.
- Override the build process: this option gives you full control over your build. Read the Docs supports any tool that generates HTML.
Extend the build process
In the normal build process, the pre-defined jobs checkout, system_dependencies, create_environment, install, build, and upload are executed. Read the Docs also exposes these jobs, which allows you to customize the build process by adding shell commands.
The jobs where users can customize our default build process are:
Step | Customizable jobs
---|---
Checkout | post_checkout
System dependencies | pre_system_dependencies, post_system_dependencies
Create environment | pre_create_environment, post_create_environment
Install | pre_install, post_install
Build | pre_build, post_build
Upload | No customizable jobs currently
Note
The pre-defined jobs (checkout, system_dependencies, etc.) cannot be overridden or skipped. You can fully customize things in Override the build process.
These jobs are defined using the Configuration file reference with the build.jobs key. This example configuration defines commands to be executed before installing and after the build has finished:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  jobs:
    pre_install:
      - bash ./scripts/pre_install.sh
    post_build:
      - curl -X POST \
        -F "project=${READTHEDOCS_PROJECT}" \
        -F "version=${READTHEDOCS_VERSION}" https://example.com/webhooks/readthedocs/
User-defined job limitations
- The current working directory is at the root of your project’s cloned repository.
- Environment variables are expanded for each individual command (see Environment variable reference).
- Each command is executed in a new shell process, so modifications made to the shell environment do not persist between commands.
- Any command returning a non-zero exit code will cause the build to fail immediately (note there is a special exit code to cancel the build).
- build.os and build.tools are required when using build.jobs.
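Because each command runs in its own shell process, exporting an environment variable in one command has no effect on the next. A minimal sketch of the workaround (the DOCS_ENV variable and the Sphinx invocation are illustrative assumptions, not part of any real project):

```yaml
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      # This export is lost as soon as this command's shell exits,
      # so a later command would see an empty $DOCS_ENV.
      - export DOCS_ENV=production
      # Instead, set the variable and consume it in the same command:
      - DOCS_ENV=production python -m sphinx -b html docs/ _build/html
```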
build.jobs examples
We’ve included some common examples where using build.jobs will be useful. These examples may require some adaptation for each project’s use case; we recommend you use them as a starting point.
Unshallow git clone
Read the Docs does not perform a full clone in the checkout job, in order to reduce network data and speed up the build process. Instead, it performs a shallow clone and only fetches the branch or tag that you are building documentation for. Because of this, extensions that depend on the full Git history will fail. To avoid this, it’s possible to unshallow the git clone:
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      - git fetch --unshallow || true
If your build also relies on the contents of other branches, it may also be necessary to re-configure git to fetch these:
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      - git fetch --unshallow || true
      - git config remote.origin.fetch '+refs/heads/*:refs/remotes/origin/*' || true
      - git fetch --all --tags || true
Cancel build based on a condition
When a command exits with code 183, Read the Docs will cancel the build immediately. You can use this approach to cancel builds that you don’t want to complete based on some conditional logic.
Note
Why was 183 chosen as the exit code?
It’s the word “skip” encoded in ASCII (the sum of its character values), taken modulo 256 because the Unix implementation does this automatically for exit codes greater than 255.
>>> sum(list("skip".encode("ascii")))
439
>>> 439 % 256
183
Here is an example that cancels builds from pull requests when there are no changes to the docs/ folder compared to the origin/main branch:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.12"
  jobs:
    post_checkout:
      # Cancel building pull requests when there are no changes in the docs directory or YAML file.
      # You can add any other files or directories that you'd like here as well,
      # like your docs requirements file, or other files that will change your docs build.
      #
      # If there are no changes (git diff exits with 0) we force the command to return with 183.
      # This is a special exit code on Read the Docs that will cancel the build immediately.
      - |
        if [ "$READTHEDOCS_VERSION_TYPE" = "external" ] && git diff --quiet origin/main -- docs/ .readthedocs.yaml;
        then
          exit 183;
        fi
This other example shows how to cancel a build if the commit message contains “skip ci”:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.12"
  jobs:
    post_checkout:
      # Use `git log` to check if the latest commit contains "skip ci";
      # in that case, exit the command with 183 to cancel the build.
      - (git --no-pager log --pretty="tformat:%s -- %b" -1 | grep -viq "skip ci") || exit 183
Generate documentation from annotated sources with Doxygen
It’s possible to run Doxygen as part of the build process to generate documentation from annotated sources:
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      # Note that this HTML won't be automatically uploaded,
      # unless your documentation build includes it somehow.
      - doxygen
Use MkDocs extensions with extra required steps
There are some MkDocs extensions that require specific commands to be run to generate extra pages before performing the build. For example, pydoc-markdown:
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      - pydoc-markdown --build --site-dir "$READTHEDOCS_OUTPUT/html"
Avoid having a dirty Git index
Read the Docs needs to modify some files before performing the build in order to integrate with some of its features. As a result, the Git index can become dirty (Git will detect modified files). If the project uses an extension that generates a version number based on Git metadata (like setuptools_scm), this could cause an invalid version number to be generated. In that case, the Git index can be updated to ignore the files that Read the Docs has modified.
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_install:
      - git update-index --assume-unchanged environment.yml docs/conf.py
Perform a check for broken links
Sphinx comes with a linkcheck builder that checks for broken external links included in the project’s documentation. This helps ensure that all external links are still valid and readers aren’t linked to non-existent pages.
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      - python -m sphinx -b linkcheck -D linkcheck_timeout=1 docs/ $READTHEDOCS_OUTPUT/linkcheck
Support Git LFS (Large File Storage)
If the repository contains large files that are tracked with Git LFS, some extra steps are required to download their content. You can use the post_checkout user-defined job for this.
version: 2

build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      # Download and uncompress the binary
      # https://git-lfs.github.com/
      - wget https://github.com/git-lfs/git-lfs/releases/download/v3.1.4/git-lfs-linux-amd64-v3.1.4.tar.gz
      - tar xvfz git-lfs-linux-amd64-v3.1.4.tar.gz
      # Modify LFS config paths to point to where the git-lfs binary was downloaded
      - git config filter.lfs.process "`pwd`/git-lfs filter-process"
      - git config filter.lfs.smudge "`pwd`/git-lfs smudge -- %f"
      - git config filter.lfs.clean "`pwd`/git-lfs clean -- %f"
      # Make LFS available in the current repository
      - ./git-lfs install
      # Download content from the remote
      - ./git-lfs fetch
      # Replace the pointer files with the real content
      - ./git-lfs checkout
Install Node.js dependencies
It’s possible to install Node.js together with the required dependencies by using user-defined build jobs. To set this up, define the version of Node.js to use and install the dependencies with build.jobs.post_install:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.9"
    nodejs: "16"
  jobs:
    post_install:
      # Install dependencies defined in your ``package.json``
      - npm ci
      # Install any other extra dependencies to build the docs
      - npm install -g jsdoc
Install dependencies with Poetry
Projects managed with Poetry can use the post_create_environment user-defined job to install Python dependencies with Poetry. Take a look at the following example:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  jobs:
    post_create_environment:
      # Install poetry
      # https://python-poetry.org/docs/#installing-manually
      - pip install poetry
    post_install:
      # Install dependencies with 'docs' dependency group
      # https://python-poetry.org/docs/managing-dependencies/#dependency-groups
      # VIRTUAL_ENV needs to be set manually for now.
      # See https://github.com/readthedocs/readthedocs.org/pull/11152/
      - VIRTUAL_ENV=$READTHEDOCS_VIRTUALENV_PATH poetry install --with docs

sphinx:
  configuration: docs/conf.py
Install dependencies with uv
Projects can use uv to install Python dependencies, usually reducing the time taken to install compared to pip. Take a look at the following example:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  commands:
    - asdf plugin add uv
    - asdf install uv latest
    - asdf global uv latest
    - uv venv
    - uv pip install .[docs]
    - .venv/bin/python -m sphinx -T -b html -d docs/_build/doctrees -D language=en docs $READTHEDOCS_OUTPUT/html
You can use -r docs/requirements.txt, etc. instead as needed. MkDocs projects could instead use NO_COLOR=1 .venv/bin/mkdocs build --strict --site-dir $READTHEDOCS_OUTPUT/html.
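The MkDocs variant mentioned above can be sketched as a full configuration file. This is only a sketch: the "docs" extra and a default mkdocs.yml at the repository root are assumptions you should adjust to your project.

```yaml
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  commands:
    - asdf plugin add uv
    - asdf install uv latest
    - asdf global uv latest
    - uv venv
    # The "docs" extra is an assumption; adjust to your project's metadata.
    - uv pip install .[docs]
    # NO_COLOR=1 keeps ANSI escape codes out of the build log.
    - NO_COLOR=1 .venv/bin/mkdocs build --strict --site-dir $READTHEDOCS_OUTPUT/html
```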
Update Conda version
Projects using Conda may need to install the latest available version of Conda. This can be done by using the pre_create_environment user-defined job to update Conda before creating the environment. Take a look at the following example:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "miniconda3-4.7"
  jobs:
    pre_create_environment:
      - conda update --yes --quiet --name=base --channel=defaults conda

conda:
  environment: environment.yml
Override the build process
Warning
This feature is in beta and could change without warning. We are currently testing the new addons integrations we are building on projects using the build.commands configuration key.
If your project requires full control of the build process, and extending the build process is not enough, all the commands executed during builds can be overridden using the build.commands configuration key. As Read the Docs does not have control over the build process, you are responsible for running all the commands required to install requirements and build your project.
Where to put files
It is your responsibility to generate HTML and other formats of your documentation using build.commands. The contents of the $READTHEDOCS_OUTPUT/<format>/ directory will be hosted as part of your documentation. We store the base folder name _readthedocs/ in the environment variable $READTHEDOCS_OUTPUT and encourage you to use it to generate paths.
Supported formats are published if they exist in the following directories:
- $READTHEDOCS_OUTPUT/html/ (required)
- $READTHEDOCS_OUTPUT/htmlzip/
- $READTHEDOCS_OUTPUT/pdf/
- $READTHEDOCS_OUTPUT/epub/
Note
Remember to create the folders before adding content to them. You can ensure that the output folder exists by adding the following command:
mkdir -p $READTHEDOCS_OUTPUT/html/
Search support
Read the Docs will automatically index the content of all your HTML files, respecting the search option.
You can access the search from the Read the Docs dashboard, or by using the Server side search API.
Note
In order for Read the Docs to index your HTML files correctly, they should follow the conventions described at Server side search integration.
build.commands examples
This section contains examples that showcase what is possible with build.commands. Note that you may need to modify and adapt these examples depending on your needs.
Pelican
Pelican is a well-known static site generator that’s commonly used for blogs and landing pages. If you are building your project with Pelican you could use a configuration file similar to the following:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    python: "3.10"
  commands:
    - pip install pelican[markdown]
    - pelican --settings docs/pelicanconf.py --output $READTHEDOCS_OUTPUT/html/ docs/
Docsify
Docsify generates documentation websites on the fly, without the need to build static HTML. These projects can be built using a configuration file like this:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    nodejs: "16"
  commands:
    - mkdir --parents $READTHEDOCS_OUTPUT/html/
    - cp --recursive docs/* $READTHEDOCS_OUTPUT/html/
Asciidoc
Asciidoctor is a fast processor for converting and generating documentation from AsciiDoc source. The Asciidoctor toolchain includes Asciidoctor.js which you can use with custom build commands. Here is an example configuration file:
version: 2

build:
  os: "ubuntu-22.04"
  tools:
    nodejs: "20"
  commands:
    - npm install -g asciidoctor
    - asciidoctor -D $READTHEDOCS_OUTPUT/html index.asciidoc