
PVNet


This project is used for training PVNet and running PVNet on live data.

PVNet is a multi-modal late-fusion model for predicting renewable energy generation from weather data. The NWP (Numerical Weather Prediction) and satellite data are sent through a neural network which encodes them down to 1D intermediate representations. These are concatenated together with recent generation, the calculated solar coordinates (azimuth and elevation), and the location ID, which has been passed through an embedding layer. This concatenated 1D feature vector is passed through an output network which outputs predictions of the future energy yield.
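The sketch below is not PVNet's actual implementation; it is a minimal illustration of the late-fusion idea described above, and all module names, dimensions and hyperparameters are made up for the example.

import torch
import torch.nn as nn

# Minimal late-fusion sketch (illustrative only, not PVNet's actual code):
# each modality is encoded to a 1D vector, the vectors are concatenated with
# recent generation, solar coordinates and a location embedding, and an
# output network maps the result to the forecast horizon.
class ToyLateFusion(nn.Module):
    def __init__(self, embed_dim=16, hidden=64, n_locations=330, forecast_steps=16):
        super().__init__()
        self.nwp_encoder = nn.Sequential(nn.Flatten(), nn.LazyLinear(hidden), nn.ReLU())
        self.sat_encoder = nn.Sequential(nn.Flatten(), nn.LazyLinear(hidden), nn.ReLU())
        self.location_embedding = nn.Embedding(n_locations, embed_dim)
        self.output_network = nn.Sequential(nn.LazyLinear(hidden), nn.ReLU(), nn.Linear(hidden, forecast_steps))

    def forward(self, nwp, sat, recent_generation, solar_coords, location_id):
        features = torch.cat(
            [
                self.nwp_encoder(nwp),                 # (batch, hidden)
                self.sat_encoder(sat),                 # (batch, hidden)
                recent_generation,                     # (batch, history_steps)
                solar_coords,                          # (batch, n_solar_features) azimuth + elevation
                self.location_embedding(location_id),  # (batch, embed_dim)
            ],
            dim=1,
        )
        return self.output_network(features)           # (batch, forecast_steps)

In PVNet itself, the encoders, embedding and output network are configured through the Hydra config files described below.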

Experiments

Our paper based on this repo was accepted into the Tackling Climate Change with Machine Learning workshop at ICLR 2024 and can be viewed here.

Some more structured notes on experiments we have performed with PVNet are here.

Setup / Installation

git clone git@github.com:openclimatefix/PVNet.git
cd PVNet
pip install .

The commit history is extensive. To save download time, use a depth of 1:

git clone --depth 1 git@github.com:openclimatefix/PVNet.git

This means only the latest commit and its associated files will be downloaded.

Next, in the PVNet repo, install PVNet as an editable package:

pip install -e .

Additional development dependencies

pip install ".[dev]"

Getting started with running PVNet

Before running any code in PVNet, copy the example configuration to a configs directory:

cp -r configs.example configs

You will be making local amendments to these configs. See the README in configs.example for more info.

Datasets

At a minimum, in order to create data samples and run PVNet, you will need to supply paths to NWP and GSP data. PV data can also be used. Some suggested locations for downloading these datasets are listed below:

GSP (Grid Supply Point) - Regional PV generation data
The University of Sheffield provides API access to download this data: https://www.solar.sheffield.ac.uk/api/

Documentation for querying generation data aggregated by GSP region can be found here: https://docs.google.com/document/d/e/2PACX-1vSDFb-6dJ2kIFZnsl-pBQvcH4inNQCA4lYL9cwo80bEHQeTK8fONLOgDf6Wm4ze_fxonqK3EVBVoAIz/pub#h.9d97iox3wzmd

NWP (Numerical weather predictions)
OCF maintains a Zarr formatted version of the German Weather Service's (DWD) ICON-EU NWP model here: https://huggingface.co/datasets/openclimatefix/dwd-icon-eu which includes the UK

PV
OCF maintains a dataset of PV generation from 1311 private PV installations here: https://huggingface.co/datasets/openclimatefix/uk_pv
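As a quick sanity check that a dataset you have downloaded is readable, the Zarr stores above can be opened with xarray. This is only a sketch; the local path is illustrative and assumes you have already downloaded the data.

import xarray as xr

# Open a locally downloaded Zarr store and inspect its variables and coordinates
ds = xr.open_zarr("/path/to/downloaded/nwp.zarr")  # illustrative path
print(ds)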

Connecting with ocf-data-sampler for sample creation

Outside the PVNet repo, exit the conda environment created for PVNet, then clone the ocf-data-sampler repo (https://github.com/openclimatefix/ocf-data-sampler) and create a new environment for it:

git clone git@github.com:openclimatefix/ocf-data-sampler.git
conda create -n ocf-data-sampler python=3.11

Then, from inside the ocf-data-sampler repo, install its packages:

pip install .

Then exit this environment, switch back into the PVNet conda environment, and install ocf-data-sampler in editable mode (-e). This links the package directly to the source code in the ocf-data-sampler repo.

pip install -e <PATH-TO-ocf-data-sampler-REPO>

If the local version of ocf-data-sampler you install is more recent than the version specified in PVNet, it is not guaranteed to work properly with this library.
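To check which copy and version of ocf-data-sampler the PVNet environment is actually using (and that it points at your local clone), pip can report the installed location; the distribution name should match what pip reported during the editable install:

pip show ocf-data-sampler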

Streaming samples (no pre-save)

PVNet now trains and validates directly from streamed_samples (i.e. no pre-saving to disk).

Make sure you have copied example configs (as already stated above): cp -r configs.example configs

Set up and config example for streaming

We will use the following example config file to describe your data sources: /PVNet/configs/datamodule/configuration/example_configuration.yaml. Ensure that the file paths in example_configuration.yaml point to the correct locations: search for PLACEHOLDER to find where to enter the locations of your files. Delete or comment out the sections for data sources you are not using.
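For example, a filled-in entry ends up looking something like the snippet below. The key structure and path here are illustrative; keep whatever structure your copy of example_configuration.yaml already has and only replace the PLACEHOLDER values.

gsp:
  zarr_path: "/data/pv_gsp.zarr"  # illustrative path replacing a PLACEHOLDER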

At run time, the datamodule config PVNet/configs/datamodule/streamed_samples.yaml points to your chosen configuration file:

configuration: "/FULL-PATH-TO-REPO/PVNet/configs/datamodule/configuration/example_configuration.yaml"

You can also update train/val/test time ranges here to match the period you have access to.
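For illustration, those time ranges typically look something like the following; the key names and dates here are illustrative, so match them to whatever your copy of streamed_samples.yaml already contains.

train_period: ["2020-01-01", "2021-12-31"]
val_period: ["2022-01-01", "2022-06-30"]
test_period: ["2022-07-01", "2022-12-31"]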

If downloading private data from a GCP bucket, make sure to authenticate gcloud (the public satellite data does not need authentication):

gcloud auth login
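Depending on how your environment picks up credentials, you may also need application-default credentials for client libraries (this is an assumption about your setup, not a PVNet requirement):

gcloud auth application-default login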

You can provide multiple storage locations as a list. For example:

satellite:
  zarr_path:
    - "gs://public-datasets-eumetsat-solar-forecasting/satellite/EUMETSAT/SEVIRI_RSS/v4/2020_nonhrv.zarr"
    - "gs://public-datasets-eumetsat-solar-forecasting/satellite/EUMETSAT/SEVIRI_RSS/v4/2021_nonhrv.zarr"

ocf-data-sampler is currently set up to use 11 channels from the satellite data (the 12th, HRV, is not used).

⚠️ NB: Our publicly accessible satellite data is currently saved with a blosc2 compressor, which is not supported by the tensorstore backend that PVNet now relies on. We are in the process of updating this; for now, the paths above cannot be used with this codebase.

Training PVNet

How PVNet is run is determined by the configuration files. The example configs in PVNet/configs.example work with streamed_samples using datamodule/streamed_samples.yaml.

Update the following before training:

  1. In configs/model/late_fusion.yaml:
    • Update the list of encoders to match the data sources you are using. For different NWP sources, keep the same structure but ensure that (see the sketch after this list):
      • in_channels matches the number of variables your NWP source supplies
      • image_size_pixels matches the spatial crop of your NWP resolution and the settings in your datamodule configuration (unless you have coarsened the data, e.g. for ECMWF)
  2. In configs/trainer/default.yaml:
    • Set the accelerator to CPU (e.g. accelerator: cpu) if running on a system without a supported GPU
  3. In configs/datamodule/streamed_samples.yaml:
    • Point configuration: to your local example_configuration.yaml (or your custom one)
    • Adjust the train/val/test time ranges to your available data
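As a sketch of what the encoder settings in step 1 look like, an NWP encoder entry carries at least the two fields below. Only those two parameters are shown and the values are illustrative; take the surrounding structure (including any _target_ class paths) from your copy of late_fusion.yaml.

nwp_encoders_dict:
  ukv:
    in_channels: 11          # number of variables your NWP source supplies
    image_size_pixels: 24    # spatial crop; must match the datamodule configuration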

If you create custom config files, update the main ./configs/config.yaml defaults:

defaults:
  - trainer: default.yaml
  - model: late_fusion.yaml
  - datamodule: streamed_samples.yaml
  - callbacks: null
  - experiment: null
  - hparams_search: null
  - hydra: default.yaml
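For example, if you saved a custom datamodule config as configs/datamodule/my_datamodule.yaml (the name is illustrative), point the datamodule entry at it:

  - datamodule: my_datamodule.yaml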

Now train PVNet:

python run.py

You can override any setting with Hydra, e.g.:

python run.py datamodule=streamed_samples datamodule.configuration="/FULL-PATH/PVNet/configs/datamodule/configuration/example_configuration.yaml"
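Other settings can be overridden in the same way; for example, to force CPU training for a quick smoke test (assuming the accelerator setting described above):

python run.py trainer.accelerator=cpu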

Backtest

If you have successfully trained a PVNet model and have a saved model checkpoint, you can use it to create a backtest, i.e. forecasts on historical data to evaluate forecast accuracy/skill. This can be done by running one of the scripts in this repo, such as the UK GSP backtest script or the PV site backtest script; further info on how to run these is in each backtest file.

Testing

You can use python -m pytest tests to run tests.
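While iterating, pytest's -k flag runs only the tests whose names match a filter expression (the expression below is illustrative):

python -m pytest tests -k "datamodule"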

Contributors ✨

Thanks goes to these wonderful people (emoji key):

Felix 💻
Sukhil Patel 💻
James Fulton 💻
Alexandra Udaltsova 💻 👀
Megawattz 💻
Peter Dudfield 💻
Mahdi Lamb 🚇
Jacob Prince-Bieker 💻
codderrrrr 💻
Chris Briggs 💻
tmi 💻
Chris Arderne 💻
Dakshbir 💻
MAYANK SHARMA 💻
aryan lamba 💻
michael-gendy 💻
Aditya Suthar 💻
Markus Kreft 💻
Jack Kelly 🤔
zaryab-ali 💻
Lex-Ashu 💻

This project follows the all-contributors specification. Contributions of any kind welcome!