Tutorial#

Install cookiecutter#

You can install cookiecutter in any of the following ways. Using [pipx] or [condax] is recommended, so that cookiecutter is installed centrally and isolated from your projects.

# using conda or mamba
conda/mamba install [-n ENVIRONMENT] cookiecutter
# using pip or pipx or condax
pip/pipx/condax install cookiecutter

Generate your package#

To generate a package using cookiecutter, run:

cookiecutter [--checkout BRANCH-NAME] https://github.com/usnistgov/cookiecutter-nist-python.git

where the optional argument in brackets can be used to specify a specific branch.

Alternatively (and highly recommended), you can use [cruft]. This allows the generated files to be updated as the template itself is updated. For this, run:

cruft create [--checkout BRANCH-NAME] https://github.com/usnistgov/cookiecutter-nist-python.git

Create a git repo#

cd {my-project}
# init the project
git init
# add all files (should be used with care)
git add .
git commit -m 'a meaningful message'

Using pre-commit#

It is highly recommended to enable pre-commit. See Setup development environment for installation instructions. To install the pre-commit hooks, run:

pre-commit install

This will run a variety of code checkers (linters) whenever you commit a file. Alternatively, you can run the hooks over all files using:

pre-commit run --all-files

You can also run pre-commit on all files via nox using:

nox -s lint

Using nox#

This project makes extensive use of nox to automate testing, type checking, documentation creation, etc. One downside of using tox with this particular workflow is the need for multiple scripts, while with nox, nearly everything is self-contained in the file noxfile.py. nox also allows for a mix of conda and virtualenv environments.

Installing interpreters for virtualenv creation#

If using virtualenvs across multiple python versions (e.g., for the test and typecheck sessions), you'll need to install a python interpreter for each version. If using pyenv, you should be good to go.

Instead of using pyenv, I use uv to manage python versions. For example:

uv python install 3.12

I also set the global uv config file (~/.config/uv/uv.toml on macOS and Linux) to use only managed python:

python-preference = "only-managed"

nox is set up to automatically work with uv. Note that a python interpreter may need to be installed before it can be used with nox.

Nox session options#

To see all nox sessions, run:

nox --list

To simplify passing options to underlying commands, the options to a particular nox session use + instead of - for options. For example, to pass options to pytest, use:

nox -s test -- ++test-opts -x -v

Using + for the session option ++test-opts means we don’t have to escape -x or -v. To see all options:

nox -- ++help/+h

Note that these options should be passed after --. For example, to build and open the documentation, run:

nox -s docs -- +d build open

Creating environment.yaml/requirements.txt files#

The project is set up to create environment.yaml and requirements.txt files from pyproject.toml. This can be done using:

just requirements

This uses pyproject2conda to create the requirement files. Note that all requirement files are placed under something like requirements/py{version}-{env-name}.yaml (conda environments) or requirements/{env-name}.txt (virtual environments).

Additionally, requirement files for virtualenvs (i.e., requirements.txt-like files) will be "locked" using uv pip compile. These files are placed under requirements/lock. This uses the script tools/requirements_lock.py. The uv.lock file will also be updated. To upgrade the locked requirements, pass the option --upgrade/-U.
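As a sketch of where these environments come from: pyproject2conda is typically configured in pyproject.toml. The exact table and key names below are assumptions based on common pyproject2conda usage; check the pyproject2conda documentation for your version:

```toml
# Hypothetical pyproject2conda configuration (table/key names are assumptions)
[tool.pyproject2conda]
# python versions to generate conda environment files for
python = ["3.11", "3.12"]

[tool.pyproject2conda.envs.dev]
# build the "dev" environment from the project's dev extra
extras = ["dev"]
```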

Using just as task runner#

The project includes a justfile to be invoked using just to simplify common tasks. Run just with no options to see available commands.

ipykernel#

The environments created by nox -s dev, or by running just install-kernel, will try to add meaningful display names for ipykernel. These are installed at the user level. To clean up the kernels (i.e., remove installed kernels that point to an environment that no longer exists), you can use the script tools/clean_kernelspec.py:

python tools/clean_kernelspec.py

Building the docs#

We use nox to isolate the documentation build. Specific tasks can be run with

nox -s docs -- +d [commands]

where commands can be one of:

  • clean : remove the old doc build

  • build/html : build the html documentation

  • spelling : check spelling

  • linkcheck : check links

  • release : make the pages branch for documentation hosting (using ghp-import)

  • livehtml : live documentation updates

  • open : open the documentation in a web browser

  • serve : serve the built documentation webpage (needed to view javascript in the created pages)
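Several of these commands can be combined in a single invocation. For example, a full rebuild that opens the result (combining the commands listed above) might look like:

```shell
# remove the old build, rebuild the html docs, then open them in a browser
nox -s docs -- +d clean build open
```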

Testing with nox#

The basic command is:

nox -s test -- [++test-opts] [++no-cov]

where you can pass in additional pytest options via ++test-opts. For example:

nox -s test -- ++test-opts -x -v

Use session test-conda to test under a conda environment.

Note that by default, these will install an isolated copy of the package, as opposed to installing with pip install -e . --no-deps. This is similar to how tox works. This uses the nox session build behind the scenes, and should therefore be a fast operation.

Building distribution for conda#

For the most part, we use grayskull to create the conda recipe. However, I’ve had issues getting it to play nice with pyproject.toml for some of the ‘extra’ variables. So, we use grayskull to build the majority of the recipe, and append the file config/recipe-append.yaml. For some edge cases (install name different from package name, etc), you’ll need to manually edit this file to create the final recipe.

To build the conda recipe using grayskull:

nox -s conda-recipe -- ++conda-recipe [recipe, recipe-full]

To build the conda distribution:

nox -s conda-build -- ++conda-build [build,clean]

To upload the distribution, you'll need to run an external command like:

nox -s conda-build -- ++conda-build-run "anaconda upload PATH-TO-TARBALL"

Building distribution for pypi#

The basic command is:

nox -s build

To upload the pypi distribution:

nox -s publish -- +p [release, test]

  • test : upload to testpypi

  • release : upload to pypi

Testing pypi or conda installs#

Run:

nox -s testdist-pypi -- ++version [version]

to test a specific version from pypi and

nox -s testdist-conda -- ++version [version]

to do likewise from conda.

Testing notebooks with nbval#

To test the expected output of notebooks using nbval, run:

nox -s test-notebook

Type checking#

Run:

nox -s typecheck -- +m [commands] [options]

Note that the repo is set up to use a single install of mypy and pyright. The script tools/typecheck.py will run the checkers via uvx and point each checker to the appropriate python executable.
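Conceptually, this amounts to something like the following. The exact invocation is an assumption (see tools/typecheck.py for the real logic), but mypy's --python-executable and pyright's --pythonpath flags are the standard way to point an externally installed checker at a specific environment:

```shell
# hypothetical sketch of what tools/typecheck.py effectively runs:
# mypy, run via uvx, checking against the session's interpreter
uvx mypy --python-executable .venv/bin/python src
# pyright, likewise pointed at the same interpreter
uvx pyright --pythonpath .venv/bin/python src
```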

Setup development environment#

This project uses a host of tools to (hopefully) make development easier. We recommend installing some of these tools system-wide. For this, we recommend using uv (or pipx or condax). We mostly use uv, but the choice is yours. For conda, we recommend using mamba. Alternatively, you can set up conda to use the faster mamba solver. See here for details.

Create development environment with conda#

To install a development environment using conda/mamba run:

conda env create -n {env-name} -f requirements/py{version}-dev.yaml
conda activate {env-name}
pip install -e . --no-deps

Create development environment with uv/pip#

The easiest way to create a development environment, if using the uv.lock mechanism, is:

uv sync

If the project does not use uv.lock, or you don’t want to use uv to manage your environment, then use one of the following:

# using venv
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements/lock/py{version}-dev.txt
python -m pip install -e . --no-deps
# using uv
uv venv --python 3.11 .venv
uv pip sync requirements/lock/py{version}-dev.txt

Note that if the project is set up to use uv.lock but you'd like to use one of the above, you may have to run something like:

uv export --dev > requirements.txt

and use this requirement file in the commands above.

If the project includes an ipython kernel, you can install it with:

just install-kernel

Alternatively, you can simply use:

nox -s dev

which will create a virtual environment under .venv. If you go this route, you may want to use something like zsh-autoenv (if using zsh shell) or autoenv (if using bash) to auto activate the development environment when in the parent directory.
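For example, with zsh-autoenv the activation file is .autoenv.zsh at the project root (a minimal sketch; adjust the file name if your autoenv tool uses a different convention):

```shell
# .autoenv.zsh -- sourced automatically when entering the project directory
source ./.venv/bin/activate
```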

Development tools#

The only required tool is uv, but it is highly recommended to also install just. Other tools used by the project (pre-commit, for example) can be installed using:

uv tool install pre-commit

Note that the repo is set up to automatically use uvx for many of these tools. Behind the scenes, the justfile and noxfile.py will invoke uvx with constraints from requirements/lock/uvx-tools.txt. This runs each tool with the proper version. Note that if the tool is already installed with the proper version, uvx will use it. This prevents having to install a bunch of tooling in the "dev" environment, and also avoids creating a bunch of throwaway nox environments.

Package version#

Versioning is handled with hatch-vcs. The package version is set by the git tag. For convenience, you can override the version in nox by passing ++version .... This is useful for updating the docs, etc.

Note that the version in a given environment/session can become stale. The easiest way to update the installed package version is to reinstall the package. This can be done using the following:

# using pip
pip install -e . --no-deps
# using uv
uv pip install -e . --no-deps

To do this in a given session, use:

nox -s {session} -- +P/++update-package

Using setuptools instead of hatchling#

The repo uses hatchling by default for building the package. I've found that setuptools is overkill for python-only projects. However, if you'd like to use setuptools (if, for example, your package includes non-python code), you can use something like the following:

# pyproject.toml
[build-system]
build-backend = "setuptools.build_meta"
requires = [
    "setuptools>=61.2",
    "setuptools_scm[toml]>=8.0",
]

...

[tool.setuptools]
zip-safe = true # if using mypy, this must be false
include-package-data = true
license-files = ["LICENSE"]

[tool.setuptools.packages.find]
namespaces = true
where = ["src"]

[tool.setuptools.dynamic]
readme = { file = [
    "README.md",
    "CHANGELOG.md",
    "LICENSE"
], content-type = "text/markdown" }

[tool.setuptools_scm]
fallback_version = "999"

Also remove the sections tool.hatch.version and tool.hatch.metadata.hooks.fancy-pypi-readme. You may also have to add a MANIFEST.in file to include/exclude files as needed.
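If you do add a MANIFEST.in, a minimal sketch might look like the following (the paths are hypothetical placeholders for your project):

```text
# MANIFEST.in -- control which files go into the sdist
include CHANGELOG.md
recursive-include src *.c *.h
prune tests
```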