2 changes: 1 addition & 1 deletion .github/workflows/bump-version.yml
Original file line number Diff line number Diff line change
@@ -8,7 +8,7 @@ name: "Bump Patch Version"
on:
push:
branches:
- master
- main
paths-ignore:
- .cruft.json
- .editorconfig
6 changes: 3 additions & 3 deletions .github/workflows/main.yml
@@ -1,9 +1,9 @@
name: RavenPy
name: Testing

on:
push:
branches:
- master
- main
pull_request:

env:
@@ -192,7 +192,7 @@ jobs:
~/.cache/raven-testdata
key: ${{ hashFiles('src/ravenpy/testing/registry.txt') }}-${{ env.RAVEN_TESTDATA_BRANCH }}-conda-${{ matrix.os }}

- name: Prefetch RavenPy test data
- name: Prefetch RavenPy testing data
run: |
python -c "import ravenpy.testing.utils as rtu; rtu.populate_testing_data()"

85 changes: 85 additions & 0 deletions .github/workflows/notebooks.yml
@@ -0,0 +1,85 @@
name: Notebooks

on:
push:
branches:
- main
pull_request:
schedule:
- cron: '0 9 * * 1'

env:
RAVEN_TESTDATA_BRANCH: v2025.6.12

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}

permissions:
contents: read

jobs:
notebooks:
name: Test Notebooks (Anaconda, ${{ matrix.os }})
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [ "ubuntu-latest", "macos-latest" ]
python-version: [ "3.13" ] # pymetalink not yet supported in Python 3.14
defaults:
run:
shell: bash -l {0}
steps:
- name: Harden Runner
uses: step-security/harden-runner@95d9a5deda9de15063e7595e9719c11c38c90ae2 # v2.13.2
with:
disable-sudo: true
egress-policy: audit

- name: Checkout Repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false

- name: Setup Conda (Micromamba) with Python${{ matrix.python-version }}
uses: mamba-org/setup-micromamba@add3a49764cedee8ee24e82dfde87f5bc2914462 # v2.0.7
with:
cache-downloads: true
cache-environment: true
cache-environment-key: environment-${{ matrix.python-version }}-${{ matrix.os }}-${{ github.head_ref }}
environment-file: environment-dev.yml
create-args: >-
python=${{ matrix.python-version }}

- name: Install RavenPy
run: |
python -m pip install --no-deps --editable .

- name: List installed packages
run: |
micromamba list
python -m pip check || true

- name: Cache test data (macOS)
if: matrix.os == 'macos-latest'
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: |
~/Library/Caches/raven-testdata
key: ${{ hashFiles('src/ravenpy/testing/registry.txt') }}-${{ env.RAVEN_TESTDATA_BRANCH }}-conda-${{ matrix.os }}
- name: Cache test data (Ubuntu)
if: matrix.os == 'ubuntu-latest'
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
with:
path: |
~/.cache/raven-testdata
key: ${{ hashFiles('src/ravenpy/testing/registry.txt') }}-${{ env.RAVEN_TESTDATA_BRANCH }}-conda-${{ matrix.os }}

- name: Prefetch RavenPy testing data
run: |
python -c "import ravenpy.testing.utils as rtu; rtu.populate_testing_data()"

- name: Test RavenPy
run: |
make test-notebooks
2 changes: 1 addition & 1 deletion .github/workflows/scorecard.yml
@@ -13,7 +13,7 @@ on:
- cron: '41 8 * * 4'
push:
branches:
- master
- main

# Read-all permission is not technically needed for this workflow.
permissions:
1 change: 0 additions & 1 deletion .readthedocs.yml
@@ -33,5 +33,4 @@ python:
- method: pip
path: .
extra_requirements:
- dev
- docs
39 changes: 23 additions & 16 deletions CHANGELOG.rst
@@ -2,19 +2,26 @@
Changelog
=========

..
`Unreleased <https://github.com/CSHS-CWRA/RavenPy>`_ (latest)
-------------------------------------------------------------
`Unreleased <https://github.com/CSHS-CWRA/RavenPy>`_ (latest)
-------------------------------------------------------------

Contributors:
Contributors: Trevor James Smith (:user:`Zeitsperre`).

Changes
^^^^^^^
* Dependency updates. (PR #584):
* Updated required `xskillscore` (``>= 0.0.29``) and `climpred` (``>= 2.6.0``).
* Removed version pins on `intake` and `intake-xarray`.
* Minimum required `birdy` is now ``>= 0.9.1``.

Changes
^^^^^^^
* No change.
Fixes
^^^^^
* Updated notebooks to address several deprecation warnings stemming from `dask`, `numpy`, and `xarray`. (PR #584)

Fixes
^^^^^
* No change.
Internal changes
^^^^^^^^^^^^^^^^
* Added a Makefile recipe and a GitHub Workflow to run tests against the notebooks using ``pytest --nbval`` on changes as well as on a weekly schedule. (PR #584)
* Fixed a bug in several workflows that was impeding triggers when Pull Requests are merged to `main`. (PR #584)

.. _changes_0.20.0:

@@ -42,15 +49,15 @@ Fixes
Internal changes
^^^^^^^^^^^^^^^^
* Updated the cookiecutter template to the latest version (PR #548):
* Updated the Contributor Covenant Agreement to v3.0.
* Added a `CITATION.cff` file.
* Removed `black`, `blackdoc`, and `isort`, as well as their configurations.
* Updated `pyproject.toml` to be `PEP 639 <https://peps.python.org/pep-0639>`_-compliant.
* Updated the Contributor Covenant Agreement to v3.0.
* Added a `CITATION.cff` file.
* Removed `black`, `blackdoc`, and `isort`, as well as their configurations.
* Updated `pyproject.toml` to be `PEP 639 <https://peps.python.org/pep-0639>`_-compliant.
* Pinned `pydantic` below v2.12 due to breaking changes in their API. (PR #548)
* Unpinned `pydantic` as newer 2.12 patch releases appear to have addressed regressions. (PR #559)
* Unpinned `pydantic` as newer 2.12 patch releases appear to have addressed regressions. (PR #559)
* Pinned `pydap` >=3.5.6 and `h5netcdf` >=1.5.0 to ensure modern versions with better `xarray` support are installed by default. (PR #559)
* Updated the cookiecutter template to the latest version (PR #569):
* Added a workflow for automatically accepting and merging periodic updates from Dependabot affecting CI dependencies.
* Added a workflow for automatically accepting and merging periodic updates from Dependabot affecting CI dependencies.
* Added a `pytest` fixture to perform a teardown of changes performed within the installed `ravenpy` source location. (PR #572)

.. _changes_0.19.1:
15 changes: 13 additions & 2 deletions Makefile
@@ -6,7 +6,7 @@ import os, webbrowser, sys

from urllib.request import pathname2url

webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1])))
webbrowser.open(sys.argv[1])
endef
export BROWSER_PYSCRIPT

@@ -54,6 +54,8 @@ clean-test: ## remove test and coverage artifacts
rm -fr htmlcov/
rm -fr .pytest_cache

## Testing targets:

lint/flake8: ## check style with flake8
python -m ruff check src/ravenpy tests
python -m flake8 --config=.flake8 src/ravenpy tests
@@ -73,7 +75,14 @@ coverage: ## check code coverage quickly with the default Python
python -m coverage html
$(BROWSER) htmlcov/index.html

autodoc: clean-docs ## create sphinx-apidoc files:
NOTEBOOKS := $(shell find $(CURDIR)/docs/notebooks -name '*.ipynb')

test-notebooks: ## test all notebooks under docs/notebooks
python -m pytest --nbval --numprocesses=logical --maxprocesses=8 --dist=loadscope $(NOTEBOOKS)

## Sphinx targets:

autodoc: clean-docs ## create sphinx-apidoc files
sphinx-apidoc -o docs/apidoc --private --module-first src/ravenpy

autodoc-custom-index: clean-docs ## create sphinx-apidoc files but with special index handling for indices and indicators
@@ -100,6 +109,8 @@ endif
servedocs: docs ## compile the docs watching for changes
watchmedo shell-command -p '*.rst' -c '$(MAKE) -C docs html' -R -D .

## Development targets:

dist: clean ## builds source and wheel package
python -m flit build
ls -l dist
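The new `test-notebooks` recipe in the Makefile diff above collects every notebook under `docs/notebooks` with `$(shell find ...)` and hands the list to `pytest --nbval` with `pytest-xdist` options. The following is a minimal stdlib sketch of that discovery-and-command-building step, useful for checking the selection logic without GNU make; the function names `collect_notebooks` and `nbval_command` are illustrative, not part of RavenPy.

```python
from pathlib import Path


def collect_notebooks(root: str) -> list[str]:
    """Mirror the Makefile's $(shell find docs/notebooks -name '*.ipynb'):
    recursively gather every notebook path under ``root``, sorted."""
    return sorted(str(p) for p in Path(root).rglob("*.ipynb"))


def nbval_command(notebooks: list[str]) -> list[str]:
    """Build the pytest invocation used by the test-notebooks target."""
    return [
        "python", "-m", "pytest", "--nbval",
        "--numprocesses=logical", "--maxprocesses=8", "--dist=loadscope",
        *notebooks,
    ]
```

`--dist=loadscope` keeps all cells of one notebook on the same `pytest-xdist` worker, which matters because `nbval` executes a notebook's cells in order and they share kernel state.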
43 changes: 12 additions & 31 deletions docs/conf.py
@@ -101,10 +101,6 @@ def rebuild_readme():
"notebooks/paper/*.ipynb",
]

# nbsphinx_execute = "auto"
# nbsphinx_timeout = 1
# nbsphinx_allow_errors = True

extlinks = {
"issue": ("https://github.com/CSHS-CWRA/RavenPy/issues/%s", "GH/%s"),
"pull": ("https://github.com/CSHS-CWRA/RavenPy/pull/%s", "PR/%s"),
@@ -121,33 +117,18 @@ def rebuild_readme():

# To avoid having to install these and burst memory limit on ReadTheDocs.
autodoc_mock_imports = [
"affine",
"cftime",
"cf_xarray",
"click",
"climpred",
"clisops",
"fiona",
"gdal",
"h5netcdf",
"netCDF4",
"osgeo",
"geopandas",
"haversine",
"holoviews",
"hvplot",
"lxml",
"owslib",
"pandas",
"pyproj",
"rasterio",
"rioxarray",
"scipy",
"shapely",
"spotpy",
"statsmodels",
"xarray",
"xclim",
"affine",
"fiona",
"geopandas",
"holoviews",
"hvplot",
"lxml",
"osgeo",
"netCDF4",
"pyproj",
"rasterio",
"rioxarray",
"shapely"
]

# Add any paths that contain templates here, relative to this directory.
Original file line number Diff line number Diff line change
@@ -417,7 +417,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
"version": "3.13.11"
}
},
"nbformat": 4,
24 changes: 16 additions & 8 deletions docs/notebooks/08_Getting_and_bias_correcting_CMIP6_data.ipynb
Original file line number Diff line number Diff line change
@@ -365,7 +365,7 @@
"\n",
" out[exp] = {}\n",
" for variable in [\"tasmin\", \"tasmax\", \"pr\"]:\n",
" print(exp, variable)\n",
" print(f\"Now processing: {exp} x {variable}\")\n",
" query = dict(\n",
" experiment_id=exp,\n",
" table_id=\"day\",\n",
@@ -558,7 +558,9 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
"tags": [
"nbval-skip"
]
},
"outputs": [],
"source": [
@@ -569,7 +571,8 @@
" corrected_ref_precip.to_dataset(name=\"pr\"),\n",
" corrected_ref_tasmax.to_dataset(name=\"tasmax\"),\n",
" corrected_ref_tasmin.to_dataset(name=\"tasmin\"),\n",
" ]\n",
" ],\n",
" compat=\"no_conflicts\"\n",
")\n",
"\n",
"# Write to temporary folder.\n",
@@ -581,8 +584,9 @@
" [\n",
" corrected_fut_precip.to_dataset(name=\"pr\"),\n",
" corrected_fut_tasmax.to_dataset(name=\"tasmax\"),\n",
" corrected_fut_tasmin.to_dataset(name=\"tasmin\"),\n",
" ]\n",
" corrected_fut_tasmin.to_dataset(name=\"tasmin\")\n",
" ],\n",
" compat=\"no_conflicts\"\n",
")\n",
"\n",
"fn_fut = tmp / \"future_dataset.nc\"\n",
@@ -593,7 +597,9 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
"tags": [
"nbval-skip"
]
},
"outputs": [],
"source": [
@@ -605,7 +611,9 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
"tags": [
"nbval-skip"
]
},
"outputs": [],
"source": [
@@ -625,7 +633,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
"version": "3.13.12"
}
},
"nbformat": 4,
5 changes: 4 additions & 1 deletion docs/notebooks/11_Climatological_ESP_forecasting.ipynb
Original file line number Diff line number Diff line change
@@ -181,6 +181,9 @@
"# Adjust the streamflow to convert missing data from -1.2345 format to NaN. Set all negative values to NaN.\n",
"q_obs = q_obs.where(q_obs > 0, np.nan)\n",
"\n",
"# Drop non-numerical variable\n",
"q_sims = q_sims.drop_vars(\"basin_fullname\")\n",
"\n",
"# Compute the Continuous Ranked Probability Score using xskillscore\n",
"xs.crps_ensemble(q_obs, q_sims, dim=\"time\").q_sim.values[0]"
]
@@ -287,7 +290,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
"version": "3.13.12"
}
},
"nbformat": 4,
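The notebook change above drops the non-numerical `basin_fullname` variable before calling `xs.crps_ensemble`, since the score is only defined on numeric data. As a reference point for what that call computes, here is a minimal NumPy sketch of the standard empirical ensemble CRPS estimator, CRPS = E|X - y| - 0.5 E|X - X'|; it is an illustrative single-point form, not xskillscore's actual vectorized implementation.

```python
import numpy as np


def crps_ensemble(obs: float, ens: np.ndarray) -> float:
    """Empirical CRPS of an ensemble ``ens`` against one observation ``obs``:
    mean absolute error of members minus half the mean pairwise member spread."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.abs(ens - obs).mean()                       # E|X - y|
    term2 = 0.5 * np.abs(ens[:, None] - ens[None, :]).mean()  # 0.5 * E|X - X'|
    return float(term1 - term2)
```

A perfect ensemble (all members equal to the observation) scores 0, and for a single-member ensemble the CRPS reduces to the absolute error, which is why lower values indicate sharper, better-calibrated forecasts.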