Contributing#

Contributions are welcome, and they are greatly appreciated! Every little bit helps, and credit will always be given.

You can contribute in many ways:

Types of Contributions#

Report Bugs#

Report bugs by opening an issue on the project's GitHub issue tracker.

If you are reporting a bug, please include:

  • Your operating system name and version.

  • Any details about your local setup that might be helpful in troubleshooting.

  • Detailed steps to reproduce the bug.

Fix Bugs#

Look through the GitHub issues for bugs. Anything tagged with “bug” and “help wanted” is open to whoever wants to implement it.

Implement Features#

Look through the GitHub issues for features. Anything tagged with “enhancement” and “help wanted” is open to whoever wants to implement it.

Write Documentation#

This project could always use more documentation, whether as part of the official docs, in docstrings, or even on the web in blog posts, articles, and such.

Submit Feedback#

The best way to send feedback is to file an issue on GitHub.

If you are proposing a feature:

  • Explain in detail how it would work.

  • Keep the scope as narrow as possible, to make it easier to implement.

  • Remember that this is a volunteer-driven project, and that contributions are welcome!

Making a contribution#

Ready to contribute? Here’s how to make a contribution.

  • Fork the repo on GitHub.

  • Clone your fork locally:

    git clone git@github.com:your_name_here/cmomy.git
    

    If the repo includes submodules, you can fetch them either with the initial clone using:

    git clone --recurse-submodules git@github.com:your_name_here/cmomy.git
    

    or after the clone using

    cd cmomy
    git submodule update --init --recursive
    
  • Create a development environment. See Setup development environment for details.

  • Install the pre-commit hooks with:

    pre-commit install
    

    To update the recipe, periodically run:

    pre-commit autoupdate
    

    If recipes change over time, you can clean up old installs with:

    pre-commit gc
    
  • Create a branch for local development:

    git checkout -b name-of-your-bugfix-or-feature
    

    Now you can make your changes locally. Alternatively, you may want to use git-flow to manage your branches.

  • When you’re done making changes, check that your changes pass the pre-commit checks and the tests. To run the pre-commit checks, use:

    pre-commit run [--all-files]
    

    To run tests, use:

    pytest
    

    To test against multiple python versions, use nox:

    nox -s test
    
  • Create a changelog fragment. See scriv for more info.

    scriv create --edit
    
  • Commit your changes and push your branch to GitHub:

    git add .
    git commit -m "Your detailed description of your changes."
    git push origin name-of-your-bugfix-or-feature
    

    Note that the pre-commit hooks will force the commit message to be in the conventional style (see the example after this list). To assist with this, you may want to commit using commitizen:

    cz commit
    
  • Submit a pull request through the GitHub website.
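
As noted above, commit messages must be in the conventional style. As a rough illustration (the exact rules come from the commitizen/pre-commit configuration), a minimal conventional-style commit message might look like:

fix: correct typo in documentation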

Pull Request Guidelines#

Before you submit a pull request, check that it meets these guidelines:

  • The pull request should include tests.

  • If the pull request adds functionality, the docs should be updated. Put your new functionality into a function with a docstring, and add the feature to the list in CHANGELOG.md. You should use scriv for this.

  • The pull request should work for all supported python versions.

Using pre-commit#

It is highly recommended to enable pre-commit. See Setup development environment for installation instructions. To install the pre-commit hooks, run:

pre-commit install

This will enable a variety of code checkers (linters) to run on the staged files whenever you commit. Alternatively, you can run the hooks over all files using:

pre-commit run --all-files

You can also run pre-commit on all files via nox using:

nox -s lint

Shared numba cache#

This project uses numba to accelerate calculations. For development, we set the environment variable NUMBA_CACHE_DIR to {project_root}/.numba_cache (i.e., .numba_cache under the project root). If using zsh-autoenv, there is a recipe in the Makefile to create this environment variable on entering the repo directory. The noxfile.py and the notebooks also set this environment variable. This should greatly speed up recompilation of numba-compiled functions during development.
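
If you are not using zsh-autoenv or nox, you can also set the variable manually in your shell before running tests or scripts (a minimal sketch; adjust the path to your checkout):

export NUMBA_CACHE_DIR="$PWD/.numba_cache"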

Using nox#

This project makes extensive use of nox to automate testing, typing, documentation creation, etc. One downside of using tox with this particular workflow is the need for multiple scripts/makefiles, while with nox, almost everything is self-contained in the file noxfile.py. nox also allows for a mix of conda and virtualenv environments. The default is for the development environment to use conda, while all other environments are virtualenvs. There are conda sessions for testing (test-conda), typing (typing-conda), docs (docs-conda), etc.
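
For example, to run the test suite in a conda environment rather than a virtualenv:

nox -s test-conda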

Installing interpreters for virtualenv creation#

If using virtualenvs across multiple python versions (e.g., test, typing, etc), you’ll need to install python interpreters for each version. If using pyenv, you should be good to go.

Instead of using pyenv, I use uv to manage python versions. For example:

uv python install python3.12

I also set the global uv config file (~/.config/uv/uv.toml on macOS and Linux) to use only managed python:

python-preference = "only-managed"

The noxfile.py is set up to automatically add the python interpreters installed by uv to the path. Note that a python version needs to be installed before it can be used with nox.
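
For example, to make the interpreters needed by the test sessions available before running nox (the specific versions below are only illustrative):

uv python install 3.11 3.12
nox -s test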

Nox session options#

To see all nox sessions, run:

nox --list

To simplify passing options to underlying commands, options to a particular nox session use + instead of -. For example, to pass options to pytest, use:

nox -s test -- ++test-opts -x -v

Using + for the session option ++test-opts means we don’t have to escape -x or -v. To see all options:

nox -- ++help/+h

Note that these options should be passed after --. For example, to build and open the documentation, run:

nox -s docs -- +d build open

Creating environment.yaml/requirements.txt files#

The project is set up to create environment.yaml and requirements.txt files from pyproject.toml. This can be done using:

nox -s requirements

This uses pyproject2conda to create the requirement files. Note that all requirement files are under something like requirements/py{version}-{env-name}.yaml (conda environment) or requirements/{env-name}.txt (virtual environment).

Additionally, requirement files for virtualenvs (i.e., requirements.txt-style files) will be “locked” using uv pip compile. These files are placed under requirements/lock. Note that the session requirements automatically calls the session uv-compile.
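
For example, for a hypothetical dev environment and python 3.11, the generated files would look something like:

requirements/py311-dev.yaml      # conda environment
requirements/dev.txt             # virtualenv requirements
requirements/lock/py311-dev.txt  # locked requirements (via uv pip compile)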

To upgrade the dependencies in the lock, you’ll need to pass the option:

nox -s uv-compile -- +L/++pip-compile-upgrade

ipykernel#

The environments created by the nox sessions dev and docs-conda will try to add meaningful display names for ipykernel. These are installed at the user level. To clean up the kernels (that is, to remove installed kernels that point to a removed environment), you can use the script tools/clean_kernelspec.py:

python tools/clean_kernelspec.py

Building the docs#

We use nox to isolate the documentation build. Specific tasks can be run with

nox -s docs -- +d [commands]

where commands can be one of:

  • clean : remove old doc build

  • build/html : build html documentation

  • spelling : check spelling

  • linkcheck : check the links

  • symlink : rebuild symlinks from examples to docs/examples

  • release : make pages branch for documentation hosting (using ghp-import)

  • livehtml : Live documentation updates

  • open : open the documentation in a web browser

  • serve : serve the built documentation webpage (needed to view javascript in the created pages).
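
Commands can be combined in a single invocation. For example, to remove the old build, rebuild the html documentation, and open it in a browser:

nox -s docs -- +d clean build open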

Testing with nox#

The basic command is:

nox -s test -- [++test-opts] [++no-cov]

where you can pass in additional pytest options via ++test-opts. For example:

nox -s test -- ++test-opts -x -v

Use session test-conda to test under a conda environment.

Note that by default, these will install an isolated copy of the package, as opposed to installing with pip install -e . --no-deps. This is similar to how tox works. This uses the nox session build behind the scenes, and should therefore be a fast operation.
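
For example, to run the tests under a conda environment while passing extra options to pytest (assuming test-conda accepts the same options as test):

nox -s test-conda -- ++test-opts -x -v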

Building distribution for conda#

For the most part, we use grayskull to create the conda recipe. However, I’ve had issues getting it to play nice with pyproject.toml for some of the ‘extra’ variables. So, we use grayskull to build the majority of the recipe, and append the file config/recipe-append.yaml. For some edge cases (install name different from package name, etc), you’ll need to manually edit this file to create the final recipe.

To build the conda recipe using grayskull:

nox -s conda-recipe -- ++conda-recipe [recipe, recipe-full]

To build the conda distribution:

nox -s conda-build -- ++conda-build [build,clean]

To upload the built distribution, you’ll need to run an external command like:

nox -s conda-build -- ++conda-build-run "anaconda upload PATH-TO-TARBALL"

Building distribution for pypi#

The basic command is:

nox -s build

To upload the pypi distribution:

nox -s publish -- +p [release, test]

  • test : upload to testpypi

  • release : upload to pypi
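
For example, to upload to testpypi before doing a full release:

nox -s publish -- +p test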

Testing pypi or conda installs#

Run:

nox -s testdist-pypi -- ++version [version]

to test a specific version from pypi and

nox -s testdist-conda -- ++version [version]

to do likewise from conda.
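
For example, to test a (hypothetical) version 0.9.0 from pypi:

nox -s testdist-pypi -- ++version 0.9.0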

Testing notebooks with nbval#

To test the notebooks’ expected output using nbval, run:

nox -s test-notebook

Type checking#

Run:

nox -s typing -- +m [commands] [options]

Use typing-conda to test typing in a conda environment.
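
For example, assuming mypy and pyright are among the supported commands (check noxfile.py for the actual command names), you might run:

nox -s typing -- +m mypy pyright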

Note that the repo is set up to use a single install of mypy and pyright. The script tools/uvxrun.py will check if an appropriate version of the type checkers is installed. If not, they will be run (and cached) using uvx.

Setup development environment#

This project uses a host of tools to (hopefully) make development easier. We recommend installing some of these tools system-wide. For this, we recommend using uv (or pipx or condax). We mostly use uv, but the choice is yours. For conda, we recommend using mamba. Alternatively, you can set up conda to use the faster libmamba solver. See here for details.
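
As one possibility (assuming a reasonably recent conda installation), the libmamba solver can be enabled with:

conda install -n base conda-libmamba-solver
conda config --set solver libmamba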

Create development environment with conda#

To install a development environment using conda/mamba run:

conda env create -n {env-name} -f requirements/py{version}-dev.yaml
conda activate {env-name}
pip install -e . --no-deps

If you want to include some extra tools in the environment (instead of using uvx, condax, or pipx), use requirements/py{version}-dev-complete.yaml instead.

Create development environment with pip#

Run something like the following:

python -m venv .venv
source .venv/bin/activate
# unlocked
python -m pip install -r requirements/dev.txt
# locked:
pip-sync --python-path .venv/bin/python requirements/lock/py{version}-dev.txt
python -m pip install -e . --no-deps

Or if using uv:

uv venv --python 3.11 .venv
uv pip install -r requirements/dev.txt
# or locked
uv pip sync requirements/lock/py311-dev.txt

If you want to include the extra tools, replace dev.txt with dev-complete.txt.

Create development environment with nox#

If you’d like to use nox to manage your development environment, use the following:

nox -s dev -- [++dev-envname dev/dev-complete]

where the option ++dev-envname (default dev) can be used to specify what kind of development environment you’d like. This will create a conda environment under .venv. To instead create a virtualenv-based development environment, use nox -s dev-venv ....

If you go this route, you may want to use something like zsh-autoenv (if using zsh shell) or autoenv (if using bash) to auto-activate the development environment when entering the project directory.
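
As a rough sketch (the file name and contents here are assumptions; adapt to your setup), a zsh-autoenv entry file for a conda environment created under .venv might look like:

# .autoenv.zsh
conda activate ./.venv
export NUMBA_CACHE_DIR="$PWD/.numba_cache"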

Note that you can bootstrap the whole process with uvx using:

uvx nox -s dev/dev-venv -- \
     ++dev-envname dev/dev-complete

Development tools#

We recommend installing the following additional tools with uv, but feel free to use pipx or condax. If you’d like to install them in the development environment instead, use the dev-complete version of the commands above.

uv tool/condax/pipx install pre-commit
# optional packages
uv tool/pipx install scriv
uv tool/condax/pipx install uv
uv tool/condax/pipx install pyright
uv tool/condax/pipx install cruft
uv tool/condax/pipx install commitizen
uv tool/condax/pipx install cogapp
uv tool/condax/pipx install nbqa

Note that the repo is set up to automatically use uvx for many of these tools. Behind the scenes, the makefile and noxfile.py will invoke tools/uvxrun.py, which runs the tool with uvx at the proper version. If the tool is already installed with the proper version, it will be used directly. This prevents having to install a bunch of tooling in the “dev” environment, and also avoids creating a bunch of throwaway nox environments. This is experimental, and I might change back to using small nox environments again in the future.

Package version#

Versioning is handled with hatch-vcs. The package version is set by the git tag. For convenience, you can override the version by passing ++version ... to the nox session. This is useful for updating the docs, etc.
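
For example (hypothetical version string; see nox -- ++help for the exact option handling), you might rebuild the docs with an overridden version:

nox -s docs -- ++version 0.1.dev0 +d build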

Note that the version in a given environment/session can become stale. The easiest way to update the installed package version is to reinstall the package. This can be done using the following:

# using pip
pip install -e . --no-deps
# using uv
uv pip install -e . --no-deps

To do this in a given session, use:

nox -s {session} -- +P/++update-package