# Tutorial
## Generate your package

To generate a package using cookiecutter, run:

```bash
cookiecutter [--checkout BRANCH-NAME] https://github.com/usnistgov/cookiecutter-nist-python.git
```

where the optional argument in brackets can be used to generate from a specific branch.

Alternatively (and highly recommended), you can use cruft. This allows the generated files to be updated as the template itself is updated. For this, run:

```bash
cruft create [--checkout BRANCH-NAME] https://github.com/usnistgov/cookiecutter-nist-python.git
```
## Create a git repo

```bash
cd {my-project}

# init the project
git init

# add all files (should be used with care)
git add .
git commit -m 'a meaningful message'
```
## Using pre-commit

It is highly recommended to enable pre-commit. See Setup development environment for installation instructions. To install the pre-commit hooks, run:

```bash
pre-commit install
```

This enables a variety of code checkers (linters) that run whenever you commit a file. Alternatively, you can run the hooks over all files using:

```bash
pre-commit run --all-files
```

You can also run pre-commit on all files via nox using:

```bash
nox -s lint
```
## Using nox

This project makes extensive use of nox to automate testing, typing,
documentation creation, etc. One downside of using tox with this particular
workflow is the need for multiple scripts/makefiles, while with nox, nearly
everything is self-contained in the file `noxfile.py`. nox also allows for a
mix of conda and virtualenv environments. The default is for the development
environment to use conda, while all other environments are virtualenvs. There
are conda sessions for testing (`test-conda`), typing (`typing-conda`), docs
(`docs-conda`), etc.
## Installing interpreters for virtualenv creation

If using virtualenvs across multiple python versions (e.g., `test`, `typing`,
etc.), you'll need to install a python interpreter for each version. If you use
pyenv, you should be good to go.

Instead of pyenv, I use conda to create multiple environments, each holding a
different python version. You can use the following script to create the needed
conda environments:

```bash
python tools/create_pythons.py -p 3.8 3.9 ...
```

Run with `--help` for more options.
To tell nox where to find python interpreters created like above, either define the environment variable:

```bash
NOX_PYTHON_PATH="~/.conda/envs/python-3.*/bin"
```

or add the following to the user config file `config/userconfig.toml`:

```toml
# config/userconfig.toml
[nox.python]
paths = ["~/.conda/envs/python-3.*/bin"]
```

The variable `nox.python.paths` is a list of paths (with optional wildcards)
added to the environment variable `PATH` when searching for python
interpreters. If using the environment variable `NOX_PYTHON_PATH`, separate
multiple paths with colons (`:`). Either of the above will add the paths
`~/.conda/envs/python-3.*/bin` to the search path.
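As an illustration only (this is a sketch, not the actual nox or template implementation), wildcard paths like those above could be expanded and prepended to `PATH` roughly as follows:

```python
import glob
import os


def expand_python_paths(patterns: list[str]) -> list[str]:
    """Expand ``~`` and wildcards in each pattern, keeping only real directories."""
    expanded: list[str] = []
    for pattern in patterns:
        expanded.extend(
            p
            for p in sorted(glob.glob(os.path.expanduser(pattern)))
            if os.path.isdir(p)
        )
    return expanded


def prepend_to_path(dirs: list[str], path: str) -> str:
    """Prepend directories to a ``PATH``-style (``os.pathsep``-separated) string."""
    return os.pathsep.join([*dirs, path]) if dirs else path
```

Interpreters found under the expanded directories then take precedence over any on the original `PATH`.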
## Nox session options

To see all nox sessions, run:

```bash
nox --list
```

To simplify passing options to underlying commands, the options to a particular
nox session use `+` instead of `-`. For example, to pass options to pytest,
use:

```bash
nox -s test -- ++test-opts -x -v
```

Using `+` for the session option `++test-opts` means we don't have to escape
`-x` or `-v`. To see all session options:

```bash
nox -- ++help/+h
```

Note that these options should be passed after `--`. For example, to build and
open the documentation, run:

```bash
nox -s docs -- +d build open
```
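Conceptually, the `+`-prefixed scheme groups any following arguments under the most recent `+` flag, so `-`-prefixed values pass through untouched. The sketch below illustrates the idea; it is not the actual noxfile parser:

```python
from __future__ import annotations


def parse_session_args(args: list[str]) -> dict[str, list[str]]:
    """Group arguments under the most recent ``+``-prefixed option flag.

    Illustrative only: the real session-option parsing lives in ``noxfile.py``.
    """
    parsed: dict[str, list[str]] = {}
    current: str | None = None
    for arg in args:
        if arg.startswith("+"):
            # a new session option, e.g. "++test-opts" or "+d"
            current = arg.lstrip("+")
            parsed.setdefault(current, [])
        elif current is not None:
            # plain values (including "-x", "-v") attach to the current option
            parsed[current].append(arg)
    return parsed
```

Because `-x` and `-v` never look like session options, they reach pytest without any escaping.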
## Creating `environment.yaml`/`requirements.txt` files

The project is set up to create `environment.yaml` and `requirements.txt`
files from `pyproject.toml`. This can be done using:

```bash
nox -s requirements
```

This uses pyproject2conda to create the requirement files. Note that all
requirement files are placed under something like
`requirements/py{version}-{env-name}.yaml` (conda environments) or
`requirements/{env-name}.txt` (virtual environments).
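The naming scheme can be illustrated with simple format strings (a sketch for clarity, not code from the project):

```python
def conda_env_file(version: str, env_name: str) -> str:
    """Conda environment file path, e.g. ``requirements/py3.11-dev.yaml``."""
    return f"requirements/py{version}-{env_name}.yaml"


def venv_requirements_file(env_name: str) -> str:
    """Virtualenv requirements file path, e.g. ``requirements/test.txt``."""
    return f"requirements/{env_name}.txt"
```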
Additionally, requirement files for virtualenvs (i.e., `requirements.txt`-like
files) will be "locked" using `pip-compile` from pip-tools. These files are
placed under `requirements/lock`. Note that the session `requirements`
automatically calls the session `pip-compile`.

To upgrade the dependencies in the lock files, you'll need to pass the option:

```bash
nox -s requirements/pip-compile -- +L/++pip-compile-upgrade
```
## ipykernel

The environments created by the nox sessions `dev` and `docs-conda` will try to
add meaningful display names for ipykernel. These are installed at the user
level. To clean up the kernels (i.e., remove installed kernels that point to a
removed environment), you can use the script `tools/clean_kernelspec.py`:

```bash
python tools/clean_kernelspec.py
```
## Building the docs

We use nox to isolate the documentation build. Specific tasks can be run with:

```bash
nox -s docs -- +d [commands]
```

where `commands` can be one of:

- `clean`: remove old doc build
- `build`/`html`: build html documentation
- `spelling`: check spelling
- `linkcheck`: check the links
- `symlink`: rebuild symlinks from `examples` to `docs/examples`
- `release`: make pages branch for documentation hosting (using ghp-import)
- `livehtml`: live documentation updates
- `open`: open the documentation in a web browser
- `serve`: serve the created documentation webpage (needed to view javascript in created pages)
## Testing with nox

The basic command is:

```bash
nox -s test -- [++test-opts] [++no-cov]
```

where you can pass in additional pytest options via `++test-opts`. For example:

```bash
nox -s test -- ++test-opts -x -v
```

Use the session `test-conda` to test under a conda environment.

Note that by default, these sessions install an isolated copy of the package,
as opposed to installing with `pip install -e . --no-deps`. This is similar to
how tox works. It uses the nox session `build` behind the scenes, so it should
be a fast operation.
## Building distribution for conda

For the most part, we use grayskull to create the conda recipe. However, I've
had issues getting it to play nicely with `pyproject.toml` for some of the
'extra' variables. So we use grayskull to build the majority of the recipe, and
append the file `config/recipe-append.yaml`. For some edge cases (install name
different from package name, etc.), you'll need to manually edit this file to
create the final recipe.

To build the conda recipe using grayskull:

```bash
nox -s conda-recipe -- ++conda-recipe [recipe, recipe-full]
```

To build the conda distribution:

```bash
nox -s conda-build -- ++conda-build [build,clean]
```

To upload the recipe, you'll need to run an external command like:

```bash
nox -s conda-build -- ++conda-build-run "anaconda upload PATH-TO-TARBALL"
```
## Building distribution for pypi

The basic command is:

```bash
nox -s build
```

To upload the pypi distribution:

```bash
nox -s publish -- +p [release, test]
```

where:

- `test`: upload to testpypi
- `release`: upload to pypi
## Testing pypi or conda installs

To test a specific version from pypi, run:

```bash
nox -s testdist-pypi -- ++version [version]
```

and to do likewise from conda:

```bash
nox -s testdist-conda -- ++version [version]
```

## Testing notebooks with nbval

To test the expected output of notebooks using nbval, run:

```bash
nox -s test-notebook
```
## Type checking

Run:

```bash
nox -s typing -- +m [commands] [options]
```

Use the session `typing-conda` to check typing in a conda environment.

Note that the repo is set up to use a single install of mypy and pyright. The
script `tools/pipxrun.py` will check whether an appropriate version of the
typecheckers is installed. If not, they will be run (and cached) using
`pipx run`.
## Setup development environment

This project uses a host of tools to (hopefully) make development easier. We
recommend installing some of these tools system wide. For this, we recommend
using either pipx or condax. We mostly use conda/condax, but the choice is
yours. For conda, we recommend actually using mamba. Alternatively, you can set
up conda to use the faster mamba solver. See here for details.
### Create development environment with conda

To create a development environment using conda/mamba, run:

```bash
conda env create -n {env-name} -f requirements/py{version}-dev.yaml
conda activate {env-name}
pip install -e . --no-deps
```

If you want to include some extra tools in the environment (instead of using
condax or pipx), use `requirements/py{version}-dev-complete.yaml` instead.
### Create development environment with pip

Run something like the following:

```bash
python -m venv .venv
source .venv/bin/activate

# unlocked
python -m pip install -r requirements/dev.txt

# locked
pip-sync --python-path .venv/bin/python requirements/lock/py{version}-dev.txt

python -m pip install -e . --no-deps
```

If you want to include the extra tools, replace `dev.txt` with
`dev-complete.txt`.
### Create development environment with nox

If you'd like to use nox to manage your development environment, use the
following:

```bash
nox -s dev -- [++dev-envname dev/dev-complete]
```

where the option `++dev-envname` (default `dev`) can be used to specify what
kind of development environment you'd like. This will create a conda
environment under `.venv`. To instead create a virtualenv-based development
environment, use `nox -s dev-venv ...`.

If you go this route, you may want to use something like zsh-autoenv (if using
the zsh shell) or autoenv (if using bash) to auto-activate the development
environment when in the parent directory.

Note that you can bootstrap the whole process with pipx using:

```bash
pipx run --spec nox \
    nox -s dev/dev-venv -- \
    ++dev-envname dev/dev-complete
```
### Development tools

We recommend installing the following tools with pipx or condax. If you'd like
to install them in the development environment instead, use the `dev-complete`
version of the commands above.

Additional tools are:

- uv (optional, highly recommended)
- scriv (optional)
- pyright (optional)
- cruft (optional)
- commitizen (optional)
- cog (optional)
- nbqa (optional)

These are set up using the following:

```bash
# install pipx using something like ...
pip install --user pipx

condax/pipx install pre-commit

# optional packages
pipx install scriv
condax/pipx install uv
condax/pipx install pyright
condax/pipx install cruft
condax/pipx install commitizen
condax/pipx install cogapp
condax/pipx install nbqa
```
Note that the repo is set up to automatically use pipx for many of these tools.
Behind the scenes, the makefile and `noxfile.py` will invoke
`tools/pipxrun.py`. This will either run the tool with `pipx run`, or, if it is
already installed (with the proper version), run the tool from the install.
This prevents having to install a bunch of tooling in the "dev" environment,
and also avoids creating a bunch of throwaway nox environments. This is
experimental, and I might change back to using small nox environments again in
the future.
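The decision logic can be sketched as follows (a simplification with a naive version compare; the real implementation is `tools/pipxrun.py`):

```python
from __future__ import annotations


def choose_runner(
    tool: str, installed_version: str | None, required: str
) -> list[str]:
    """Return the command to run: the local install if its version is
    sufficient, otherwise ``pipx run`` pinned to the required version.

    Illustrative only: uses a naive tuple comparison of dotted versions.
    """

    def parse(v: str) -> tuple[int, ...]:
        return tuple(int(p) for p in v.split("."))

    if installed_version is not None and parse(installed_version) >= parse(required):
        # good local install: run it directly
        return [tool]
    # missing or too old: fall back to (cached) pipx run
    return ["pipx", "run", f"{tool}=={required}"]
```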
## Package version

Versioning is handled with hatch-vcs. The package version is set by the git
tag. For convenience, you can override the version by passing nox the option
`++version ...`. This is useful for updating the docs, etc.

Note that the version in a given environment/session can become stale. The
easiest way to update the installed package version is to reinstall the
package. This can be done using the following:

```bash
# using pip
pip install -e . --no-deps

# using uv
uv pip install -e . --no-deps
```

To do this in a given session, use:

```bash
nox -s {session} -- +P/++update-package
```
## Using setuptools instead of hatchling

By default, the repo uses hatchling to build the package. I've found that
setuptools is overkill for python-only projects. However, if you'd like to use
setuptools (if, for example, your package includes non-python code), you can
use something like the following:

```toml
# pyproject.toml
[build-system]
build-backend = "setuptools.build_meta"
requires = [
    "setuptools>=61.2",
    "setuptools_scm[toml]>=8.0",
]

# ...

[tool.setuptools]
zip-safe = true # if using mypy, must be false
include-package-data = true
license-files = ["LICENSE"]

[tool.setuptools.packages.find]
namespaces = true
where = ["src"]

[tool.setuptools.dynamic]
readme = { file = [
    "README.md",
    "CHANGELOG.md",
    "LICENSE",
], content-type = "text/markdown" }

[tool.setuptools_scm]
fallback_version = "999"
```

Also remove the sections `tool.hatch.version` and
`tool.hatch.metadata.hooks.fancy-pypi-readme`. You may have to add the file
`MANIFEST.in` to include/exclude files as needed.