
Model Hook Injection: Perturbation#

Adding model noise by using custom hooks.

This example will demonstrate how to run an ensemble inference workflow to generate a perturbed ensemble forecast. The perturbation is injected through the model's front and rear hooks, which are applied to the tensor data immediately before and after the model's forward call.
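Conceptually, a hook is a callable that maps a `(tensor, coords)` pair to a new `(tensor, coords)` pair; the default hooks are identities, and swapping them out lets us perturb data around the forward call. A minimal sketch, using numpy arrays as stand-ins for torch tensors to keep it self-contained:

```python
import numpy as np


def identity_hook(x, coords):
    """Default (no-op) hook shape: pass data and coordinates through unchanged."""
    return x, coords


def noisy_front_hook(x, coords, amplitude=0.05):
    """Example front hook: add Gaussian noise to the input before the forward call."""
    return x + amplitude * np.random.standard_normal(x.shape), coords


x = np.zeros((2, 3, 4))         # stand-in for the model input tensor
coords = {"variable": ["t2m"]}  # stand-in coordinate dictionary
y, c = noisy_front_hook(x, coords)
assert y.shape == x.shape and c is coords
```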

This example also illustrates how you can subselect data for IO. In this example we will only output two variables: total column water vapor (tcwv) and 500 hPa geopotential (z500). To run this, make sure that the model you select predicts these variables, or change them appropriately.

In this example you will learn:

  • How to instantiate a built-in prognostic model

  • Creating a data source and IO object

  • Changing the model front/rear hooks

  • Choosing a subselection of coordinates to save to an IO object

  • Post-processing results

Creating an Ensemble Workflow#

To start, let’s begin by creating an ensemble workflow to use. We encourage users to explore and experiment with their own custom workflows that borrow ideas from the built-in workflows inside earth2studio.run or the examples.

Creating our own generalizable ensemble workflow is straightforward when we rely on the component interfaces defined in Earth2Studio (dependency injection). Here we create a run method that accepts the following:

  • time: Input list of datetimes / strings to run inference for

  • nsteps: Number of forecast steps to predict

  • nensemble: Number of ensembles to run for

  • prognostic: Our initialized prognostic model

  • data: Initialized data source to fetch initial conditions from

  • io: IO store that data is written to.

  • output_coords: CoordSystem of output coordinates that should be saved. Should be a proper subset of model output coordinates.
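The parameters above can be wired together as a minimal dependency-injected loop. The sketch below uses toy stand-ins, not the actual earth2studio.run.ensemble implementation (which also handles batching, device placement, perturbation, and coordinate subselection):

```python
import numpy as np


def run_ensemble(times, nsteps, nensemble, prognostic, data, io, output_coords):
    """Sketch of a dependency-injected ensemble loop.

    prognostic: callable mapping a state array to the next state
    data: callable mapping a time string to an initial-condition array
    io: any dict-like store; output_coords would narrow what gets written
    """
    for time in times:
        x0 = data(time)
        # Replicate the initial condition across ensemble members
        x = np.repeat(x0[None, ...], nensemble, axis=0)
        for step in range(nsteps):
            x = prognostic(x)
            io[(time, step)] = x  # a real backend subselects via output_coords
    return io


# Toy components standing in for the Earth2Studio interfaces
io = run_ensemble(
    times=["2024-01-30"],
    nsteps=2,
    nensemble=4,
    prognostic=lambda x: 0.9 * x,    # toy prognostic model
    data=lambda t: np.ones((3, 3)),  # toy data source
    io={},
    output_coords={},
)
assert io[("2024-01-30", 1)].shape == (4, 3, 3)
```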

Set Up#

With the ensemble workflow defined, we now need to create the individual components.

We need the following components: a prognostic model (DLWP), a data source (GFS), and an IO backend (Zarr).

We will first run the ensemble workflow using an unmodified model, that is, a model with the default (identity) front and rear hooks. Then we will define new hooks for the model and rerun the inference request.

import os

os.makedirs("outputs", exist_ok=True)
from dotenv import load_dotenv

load_dotenv()  # TODO: make common example prep function

import numpy as np

from earth2studio.data import GFS
from earth2studio.io import ZarrBackend
from earth2studio.models.px import DLWP
from earth2studio.perturbation import Gaussian
from earth2studio.run import ensemble

# Load the default model package which downloads the checkpoint from NGC
package = DLWP.load_default_package()
model = DLWP.load_model(package)

# Create the data source
data = GFS()

# Create the IO handler, store in memory
chunks = {"ensemble": 1, "time": 1}
io_unperturbed = ZarrBackend(file_name="outputs/05_ensemble.zarr", chunks=chunks)

Execute the Workflow#

First, we will run the ensemble workflow with an earth2studio.perturbation.Gaussian() input perturbation as the control.

The workflow will return the provided IO object back to the user, which can then be used for post-processing. Some IO backends have additional APIs that are handy for post-processing or saving to file; check the API docs for more information.

nsteps = 4 * 12
nensemble = 16
batch_size = 4
forecast_date = "2024-01-30"
output_coords = {
    "lat": np.arange(25.0, 60.0, 0.25),
    "lon": np.arange(230.0, 300.0, 0.25),
    "variable": np.array(["tcwv", "z500"]),
}

# First run with no model perturbation
io_unperturbed = ensemble(
    [forecast_date],
    nsteps,
    nensemble,
    model,
    data,
    io_unperturbed,
    Gaussian(noise_amplitude=0.01),
    output_coords=output_coords,
    batch_size=batch_size,
)
2024-06-25 13:59:25.640 | INFO     | earth2studio.run:ensemble:294 - Running ensemble inference!
2024-06-25 13:59:25.640 | INFO     | earth2studio.run:ensemble:302 - Inference device: cuda
Fetching GFS for 2024-01-29 18:00:00: 100%|██████████| 7/7 [00:03<00:00,  2.16it/s]
Fetching GFS for 2024-01-30 00:00:00: 100%|██████████| 7/7 [00:02<00:00,  2.49it/s]
2024-06-25 13:59:32.181 | SUCCESS  | earth2studio.run:ensemble:315 - Fetched data from GFS
2024-06-25 13:59:32.194 | INFO     | earth2studio.run:ensemble:337 - Starting 16 Member Ensemble Inference with 4 number of batches.
Total Ensemble Batches: 100%|██████████| 4/4 [00:12<00:00,  3.23s/it]
2024-06-25 13:59:45.117 | SUCCESS  | earth2studio.run:ensemble:382 - Inference complete

Now let’s introduce a slight model perturbation using the prognostic model hooks defined in earth2studio.models.px.utils.PrognosticMixin. Note that center.unsqueeze(-1) is DLWP-specific, since DLWP operates on a cubed sphere with grid dimensions (nface, lat, lon) instead of just (lat, lon). If you switch out the model, consider removing the unsqueeze().

model.front_hook = lambda x, coords: (
    x
    - 0.1
    * x.var(dim=0)
    * (x - model.center.unsqueeze(-1))
    / (model.scale.unsqueeze(-1)) ** 2
    + 0.1 * (x - x.mean(dim=0)),
    coords,
)
# Also could use model.rear_hook = ...
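The hook above can be read as two competing terms: the first relaxes each member toward a climatological center, while the second pushes members away from the ensemble mean, inflating spread. A numpy sketch of the same arithmetic, with toy center and scale values standing in for model.center / model.scale:

```python
import numpy as np

x = np.random.rand(4, 2, 8, 8)        # (ensemble, variable, lat, lon)
center = np.full((1, 2, 1, 1), 0.5)   # toy climatological center per variable
scale = np.ones((1, 2, 1, 1))         # toy climatological scale per variable

x_pert = (
    x
    - 0.1 * x.var(axis=0) * (x - center) / scale**2  # relax toward climatology
    + 0.1 * (x - x.mean(axis=0))                     # inflate ensemble spread
)
assert x_pert.shape == x.shape
```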

io_perturbed = ZarrBackend(
    file_name="outputs/05_ensemble_model_perturbation.zarr", chunks=chunks
)
io_perturbed = ensemble(
    [forecast_date],
    nsteps,
    nensemble,
    model,
    data,
    io_perturbed,
    Gaussian(noise_amplitude=0.01),
    output_coords=output_coords,
    batch_size=batch_size,
)
2024-06-25 13:59:45.119 | INFO     | earth2studio.run:ensemble:294 - Running ensemble inference!
2024-06-25 13:59:45.119 | INFO     | earth2studio.run:ensemble:302 - Inference device: cuda
Fetching GFS for 2024-01-29 18:00:00: 100%|██████████| 7/7 [00:00<00:00, 53.50it/s]
Fetching GFS for 2024-01-30 00:00:00: 100%|██████████| 7/7 [00:00<00:00, 53.19it/s]
2024-06-25 13:59:45.885 | SUCCESS  | earth2studio.run:ensemble:315 - Fetched data from GFS
2024-06-25 13:59:45.898 | INFO     | earth2studio.run:ensemble:337 - Starting 16 Member Ensemble Inference with 4 number of batches.
Total Ensemble Batches: 100%|██████████| 4/4 [00:13<00:00,  3.26s/it]
2024-06-25 13:59:58.957 | SUCCESS  | earth2studio.run:ensemble:382 - Inference complete

Post Processing#

The last step is to post process our results. Here we plot and compare the ensemble mean and standard deviation from using an unperturbed/perturbed model.

Notice that the Zarr IO function has additional APIs to interact with the stored data.
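For instance, arrays in the store can be sliced directly by name. A sketch with a plain dict of numpy arrays standing in for the ZarrBackend, with shapes matching the output_coords above (16 members, 1 time, 49 lead times, a 140 x 280 lat/lon patch):

```python
import numpy as np

# Stand-in for the ZarrBackend's dict-like access; dims are
# (ensemble, time, lead_time, lat, lon)
io = {"tcwv": np.random.rand(16, 1, 49, 140, 280)}

ens_mean = io["tcwv"][:, 0, 0].mean(axis=0)  # ensemble mean at the first lead time
ens_std = io["tcwv"][:, 0, 0].std(axis=0)    # ensemble spread at the first lead time
assert ens_mean.shape == (140, 280) and ens_std.shape == (140, 280)
```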

import cartopy.crs as ccrs
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

levels_unperturbed = np.linspace(0, io_unperturbed["tcwv"][:].max())
levels_perturbed = np.linspace(0, io_perturbed["tcwv"][:].max())


std_levels_perturbed = np.linspace(0, io_perturbed["tcwv"][:].std(axis=0).max())

plt.close("all")
fig = plt.figure(figsize=(20, 10), tight_layout=True)
ax0 = fig.add_subplot(2, 2, 1, projection=ccrs.PlateCarree())
ax1 = fig.add_subplot(2, 2, 2, projection=ccrs.PlateCarree())
ax2 = fig.add_subplot(2, 2, 3, projection=ccrs.PlateCarree())
ax3 = fig.add_subplot(2, 2, 4, projection=ccrs.PlateCarree())


def update(frame):
    """This function updates the frame with a new lead time for animation."""
    import warnings

    warnings.filterwarnings("ignore")
    ax0.clear()
    ax1.clear()
    ax2.clear()
    ax3.clear()

    ## Update unperturbed image
    im0 = ax0.contourf(
        io_unperturbed["lon"][:],
        io_unperturbed["lat"][:],
        io_unperturbed["tcwv"][:, 0, frame].mean(axis=0),
        transform=ccrs.PlateCarree(),
        cmap="Blues",
        levels=levels_unperturbed,
    )
    ax0.coastlines()
    ax0.gridlines()

    im1 = ax1.contourf(
        io_unperturbed["lon"][:],
        io_unperturbed["lat"][:],
        io_unperturbed["tcwv"][:, 0, frame].std(axis=0),
        transform=ccrs.PlateCarree(),
        cmap="RdPu",
        levels=std_levels_perturbed,
        norm=LogNorm(vmin=1e-1, vmax=std_levels_perturbed[-1]),
    )
    ax1.coastlines()
    ax1.gridlines()

    im2 = ax2.contourf(
        io_perturbed["lon"][:],
        io_perturbed["lat"][:],
        io_perturbed["tcwv"][:, 0, frame].mean(axis=0),
        transform=ccrs.PlateCarree(),
        cmap="Blues",
        levels=levels_perturbed,
    )
    ax2.coastlines()
    ax2.gridlines()

    im3 = ax3.contourf(
        io_perturbed["lon"][:],
        io_perturbed["lat"][:],
        io_perturbed["tcwv"][:, 0, frame].std(axis=0),
        transform=ccrs.PlateCarree(),
        cmap="RdPu",
        levels=std_levels_perturbed,
        norm=LogNorm(vmin=1e-1, vmax=std_levels_perturbed[-1]),
    )
    ax3.coastlines()
    ax3.gridlines()

    for i in range(16):
        ax0.contour(
            io_unperturbed["lon"][:],
            io_unperturbed["lat"][:],
            io_unperturbed["z500"][i, 0, frame] / 100.0,
            transform=ccrs.PlateCarree(),
            levels=np.arange(485, 580, 15),
            colors="black",
            linestyles="dashed",
        )

        ax2.contour(
            io_perturbed["lon"][:],
            io_perturbed["lat"][:],
            io_perturbed["z500"][i, 0, frame] / 100.0,
            transform=ccrs.PlateCarree(),
            levels=np.arange(485, 580, 15),
            colors="black",
            linestyles="dashed",
        )
    plt.suptitle(
        f'Forecast Starting on {forecast_date} - Lead Time - {io_perturbed["lead_time"][frame]}'
    )

    ax0.set_title("Unperturbed Ensemble Mean - tcwv + z500 contours")
    ax1.set_title("Unperturbed Ensemble Std - tcwv")
    ax2.set_title("Perturbed Ensemble Mean - tcwv + z500 contours")
    ax3.set_title("Perturbed Ensemble Std - tcwv")

    if frame == 0:
        plt.colorbar(
            im0, ax=ax0, shrink=0.75, pad=0.04, label="kg m^-2", format="%2.1f"
        )
        plt.colorbar(
            im1, ax=ax1, shrink=0.75, pad=0.04, label="kg m^-2", format="%1.2e"
        )
        plt.colorbar(
            im2, ax=ax2, shrink=0.75, pad=0.04, label="kg m^-2", format="%2.1f"
        )
        plt.colorbar(
            im3, ax=ax3, shrink=0.75, pad=0.04, label="kg m^-2", format="%1.2e"
        )


# Uncomment this for animation
# import matplotlib.animation as animation
# update(0)
# ani = animation.FuncAnimation(
# fig=fig, func=update, frames=range(1, nsteps), cache_frame_data=False
# )
# ani.save(f"outputs/05_model_perturbation_{forecast_date}.gif", dpi=300)


for lt in [10, 20, 30, 40]:
    update(lt)
    plt.savefig(
        f"outputs/05_model_perturbation_{forecast_date}_leadtime_{lt}.png",
        dpi=300,
        bbox_inches="tight",
    )
[Figure: four-panel comparison for the forecast starting on 2024-01-30, lead time 240 hours. Panels: Unperturbed Ensemble Mean (tcwv + z500 contours), Unperturbed Ensemble Std (tcwv), Perturbed Ensemble Mean (tcwv + z500 contours), Perturbed Ensemble Std (tcwv).]

Total running time of the script: (0 minutes 44.126 seconds)
