Statistical Inference#

This example demonstrates how to run a simple inference workflow to generate a forecast and then save a statistic of that data. A handful of built-in statistics are available in earth2studio.statistics, but here we will demonstrate how to define a custom statistic and run inference with it.

In this example you will learn:

  • Instantiating a built-in prognostic model

  • Creating a data source and IO object

  • Defining a custom statistic

  • Running a simple built-in workflow

  • Post-processing results

Creating a statistical workflow#

Start by creating a simple inference workflow to use. We encourage users to explore and experiment with their own custom workflows that borrow ideas from the built-in workflows inside earth2studio.run or the examples.

Creating our own generalizable workflow to use with statistics is straightforward when we rely on the component interfaces defined in Earth2Studio (that is, we use dependency injection). Here we create a run method that accepts the following:

  • time: Input list of datetimes / strings to run inference for

  • nsteps: Number of forecast steps to predict

  • nensemble: Number of ensemble members (not used in this simple example)

  • prognostic: Our initialized prognostic model

  • statistic: Our custom statistic

  • data: Initialized data source to fetch initial conditions from

  • io: IOBackend

We do not run an ensemble inference workflow here, even though ensembles are common in statistical inference. See the ensemble examples for details on how to extend this example for that purpose.

import os

os.makedirs("outputs", exist_ok=True)
from dotenv import load_dotenv

load_dotenv()  # TODO: make common example prep function

from datetime import datetime

import numpy as np
import pandas as pd
import torch
from loguru import logger
from tqdm import tqdm

from earth2studio.data import DataSource, fetch_data
from earth2studio.io import IOBackend
from earth2studio.models.px import PrognosticModel
from earth2studio.statistics import Statistic
from earth2studio.utils.coords import map_coords
from earth2studio.utils.time import to_time_array

logger.remove()
logger.add(lambda msg: tqdm.write(msg, end=""), colorize=True)


def run_stats(
    time: list[str] | list[datetime] | list[np.datetime64],
    nsteps: int,
    nensemble: int,
    prognostic: PrognosticModel,
    statistic: Statistic,
    data: DataSource,
    io: IOBackend,
) -> IOBackend:
    """Simple statistics workflow

    Parameters
    ----------
    time : list[str] | list[datetime] | list[np.datetime64]
        List of string, datetimes or np.datetime64
    nsteps : int
        Number of forecast steps
    nensemble : int
        Number of ensemble members to run inference for.
    prognostic : PrognosticModel
        Prognostic models
    statistic : Statistic
        Custom statistic to compute and write to IO.
    data : DataSource
        Data source
    io : IOBackend
        IO object

    Returns
    -------
    IOBackend
        Output IO object
    """
    logger.info("Running simple statistics workflow!")
    # Load model onto the device
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    logger.info(f"Inference device: {device}")
    prognostic = prognostic.to(device)
    # Fetch data from data source and load onto device
    time = to_time_array(time)
    x, coords = fetch_data(
        source=data,
        time=time,
        lead_time=prognostic.input_coords()["lead_time"],
        variable=prognostic.input_coords()["variable"],
        device=device,
    )
    logger.success(f"Fetched data from {data.__class__.__name__}")

    # Set up IO backend
    total_coords = coords.copy()
    output_coords = prognostic.output_coords(prognostic.input_coords())
    total_coords["lead_time"] = np.asarray(
        [output_coords["lead_time"] * i for i in range(nsteps + 1)]
    ).flatten()
    # Remove reduced dimensions from statistic
    for d in statistic.reduction_dimensions:
        total_coords.pop(d, None)

    io.add_array(total_coords, str(statistic))

    # Map lat and lon if needed
    x, coords = map_coords(x, coords, prognostic.input_coords())

    # Create prognostic iterator
    model = prognostic.create_iterator(x, coords)

    logger.info("Inference starting!")
    with tqdm(total=nsteps + 1, desc="Running inference") as pbar:
        for step, (x, coords) in enumerate(model):
            s, coords = statistic(x, coords)
            io.write(s, coords, str(statistic))
            pbar.update(1)
            if step == nsteps:
                break

    logger.success("Inference complete")
    return io
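The inference loop above relies on the prognostic iterator protocol: the iterator yields the initial condition first and then one state per forecast step, which is why the loop runs for nsteps + 1 iterations. A minimal sketch of that pattern, using hypothetical stand-ins for a real PrognosticModel (the `0.99 * x` update is a placeholder, not a real model step):

```python
import numpy as np

def create_iterator(x, coords):
    # Yield the initial condition (lead_time 0) first, then one state per step.
    step = 0
    while True:
        yield x, {**coords, "lead_time": np.array([np.timedelta64(step * 24, "h")])}
        step += 1
        x = 0.99 * x  # placeholder for one model step

states = []
for step, (x, coords) in enumerate(create_iterator(np.ones(3), {"variable": np.array(["msl"])})):
    states.append((x.copy(), coords))
    if step == 2:  # nsteps = 2 -> nsteps + 1 = 3 states collected
        break
```

The break-on-`step == nsteps` check mirrors run_stats: the iterator itself is infinite, so the caller decides how far to roll the forecast out.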

Set Up#

With the statistical workflow defined, we now need to create the individual components.

We need the following:

from collections import OrderedDict

import numpy as np
import torch

from earth2studio.data import GFS
from earth2studio.io import NetCDF4Backend
from earth2studio.models.px import Pangu24
from earth2studio.utils.type import CoordSystem

# Load the default model package which downloads the checkpoint from NGC
package = Pangu24.load_default_package()
model = Pangu24.load_model(package)

# Create the data source
data = GFS()

# Create the IO handler, which writes output to a NetCDF file on disk
io = NetCDF4Backend(
    file_name="outputs/soi.nc",
    backend_kwargs={"mode": "w"},
)


# Create the custom statistic
class SOI:
    """Custom metric calculation the Southern Oscillation Index.

    SOI = ( standardized_tahiti_slp - standardized_darwin_slp ) / soi_normalization

    soi_normalization = std( historical ( standardized_tahiti_slp - standardized_darwin_slp ) )

    standardized_*_slp = (*_slp - climatological_mean_*_slp) / climatological_std_*_slp

    Note
    ----
    __str__
        Name that will be applied to the output of this statistic, primarily for IO purposes.
    reduction_dimensions
        Dimensions that this statistic reduces over. This is used to help automatically determine
        the output coordinates, primarily used for IO purposes.
    """

    def __str__(self) -> str:
        return "soi"

    def __init__(
        self,
    ):
        # Read in Tahiti and Darwin SLP data
        from physicsnemo.utils.filesystem import _download_cached

        file_path = _download_cached(
            "http://data.longpaddock.qld.gov.au/SeasonalClimateOutlook/SouthernOscillationIndex/SOIDataFiles/DailySOI1933-1992Base.txt"
        )
        ds = pd.read_csv(file_path, sep=r"\s+")
        dates = pd.date_range("1999-01-01", freq="d", periods=len(ds))
        ds["date"] = dates
        ds = ds.set_index("date")
        ds = ds.drop(["Year", "Day", "SOI"], axis=1)
        ds = ds.rolling(30, min_periods=1).mean().dropna()

        self.climatological_means = torch.tensor(
            ds.groupby(ds.index.month).mean().to_numpy(), dtype=torch.float32
        )
        self.climatological_std = torch.tensor(
            ds.groupby(ds.index.month).std().to_numpy(), dtype=torch.float32
        )

        standardized = ds.groupby(ds.index.month).transform(
            lambda x: (x - x.mean()) / x.std()
        )
        diff = standardized["Tahiti"] - standardized["Darwin"]

        self.normalization = torch.tensor(
            diff.groupby(ds.index.month).std().to_numpy(), dtype=torch.float32
        )

        self.tahiti_coords = {
            "variable": np.array(["msl"]),
            "lat": np.array([-17.65]),
            "lon": np.array([210.57]),
        }
        self.darwin_coords = {
            "variable": np.array(["msl"]),
            "lat": np.array([-12.46]),
            "lon": np.array([130.84]),
        }

        self.reduction_dimensions = list(self.tahiti_coords)

    def __call__(
        self, x: torch.Tensor, coords: CoordSystem
    ) -> tuple[torch.Tensor, CoordSystem]:
        """Computes the SOI given an input.

        coords must be a superset of both

        tahiti_coords = {
            'variable': np.array(['msl']),
            'lat': np.array([-17.65]),
            'lon': np.array([210.57])
        }

        and

        darwin_coords = {
            'variable': np.array(['msl']),
            'lat': np.array([-12.46]),
            'lon': np.array([130.84])
        }

        So make sure that the model chosen predicts the `msl` variable.

        Parameters
        ----------
        x : torch.Tensor
            Input tensor
        coords : CoordSystem
            coordinate system belonging to the input tensor.

        Returns
        -------
        tuple[torch.Tensor, CoordSystem]
            Returns the SOI and appropriate coordinate system.
        """
        tahiti, _ = map_coords(x, coords, self.tahiti_coords)
        darwin, _ = map_coords(x, coords, self.darwin_coords)

        tahiti = tahiti.squeeze((-3, -2, -1)) / 100.0
        darwin = darwin.squeeze((-3, -2, -1)) / 100.0
        output_coords = OrderedDict(
            {k: v for k, v in coords.items() if k not in self.reduction_dimensions}
        )

        # Get time coordinates
        times = coords["time"].reshape(-1, 1) + coords["lead_time"].reshape(1, -1)
        months = torch.broadcast_to(
            torch.as_tensor(
                [pd.Timestamp(t).month for t in times.flatten()],
                device=tahiti.device,
                dtype=torch.int32,
            ).reshape(times.shape),
            tahiti.shape,
        )

        cm = self.climatological_means.to(tahiti.device)
        cs = self.climatological_std.to(tahiti.device)
        norm = self.normalization.to(tahiti.device)

        # Month values are 1-12; the climatology rows are 0-indexed
        tahiti_std_anomaly = (tahiti - cm[months - 1, 0]) / cs[months - 1, 0]
        darwin_std_anomaly = (darwin - cm[months - 1, 1]) / cs[months - 1, 1]

        return (tahiti_std_anomaly - darwin_std_anomaly) / norm[months - 1], output_coords


soi = SOI()
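The standardized-anomaly arithmetic inside SOI.__call__ can be sketched in isolation with synthetic climatology. All numbers below are made-up placeholders, not the real Long Paddock 1933-1992 statistics, and `soi_index` is a hypothetical helper, not part of earth2studio:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-month climatologies (12 values each), placeholder data only
tahiti_mean = rng.normal(1012.0, 1.0, 12)
tahiti_std = rng.uniform(1.0, 2.0, 12)
darwin_mean = rng.normal(1008.0, 1.0, 12)
darwin_std = rng.uniform(1.0, 2.0, 12)
norm = rng.uniform(0.8, 1.2, 12)  # std of the historical standardized difference

def soi_index(tahiti_slp, darwin_slp, month):
    """SOI = (standardized Tahiti SLP - standardized Darwin SLP) / normalization."""
    m = month - 1  # months are 1-12, climatology rows are 0-indexed
    t = (tahiti_slp - tahiti_mean[m]) / tahiti_std[m]
    d = (darwin_slp - darwin_mean[m]) / darwin_std[m]
    return (t - d) / norm[m]
```

Note that when both stations sit exactly at their monthly climatological means, the index is zero by construction.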
Downloading pangu_weather_24.onnx: 100%|██████████| 1.10G/1.10G [00:07<00:00, 163MB/s]

Execute the Workflow#

With all components initialized, running the workflow is a single line of Python code. The workflow returns the provided IO object back to the user, which can then be used for post-processing. Some IO backends have additional APIs that can be handy for post-processing or saving to file; check the API docs for more information. We simulate a trajectory of 60 time steps, i.e. 2 months, using Pangu24.

nsteps = 60
nensemble = 1
io = run_stats(["2022-01-01"], nsteps, nensemble, model, soi, data, io)
2025-05-15 03:09:25.755 | INFO     | __main__:run_stats:117 - Running simple statistics workflow!
2025-05-15 03:09:25.755 | INFO     | __main__:run_stats:120 - Inference device: cuda

Fetching GFS data:   0%|          | 0/69 [00:00<?, ?it/s]

2025-05-15 03:09:50.206 | DEBUG    | earth2studio.data.gfs:fetch_array:352 - Fetching GFS grib file: noaa-gfs-bdp-pds/gfs.20220101/00/atmos/gfs.t00z.pgrb2.0p25.f000 151656416-759878

Fetching GFS data: 100%|██████████| 69/69 [00:02<00:00, 29.44it/s]
2025-05-15 03:09:52.716 | SUCCESS  | __main__:run_stats:131 - Fetched data from GFS
2025-05-15 03:09:52.717 | INFO     | __main__:run_stats:151 - Inference starting!

Running inference:   0%|          | 0/61 [00:00<?, ?it/s]
Running inference:   3%|▎         | 2/61 [00:01<00:40,  1.45it/s]
Running inference:   5%|▍         | 3/61 [00:02<00:53,  1.08it/s]
Running inference:   7%|▋         | 4/61 [00:03<01:00,  1.05s/it]
Running inference:   8%|▊         | 5/61 [00:05<01:03,  1.13s/it]
Running inference:  10%|▉         | 6/61 [00:06<01:04,  1.17s/it]
Running inference:  11%|█▏        | 7/61 [00:07<01:04,  1.20s/it]
Running inference:  13%|█▎        | 8/61 [00:08<01:04,  1.22s/it]
Running inference:  15%|█▍        | 9/61 [00:10<01:04,  1.23s/it]
Running inference:  16%|█▋        | 10/61 [00:11<01:03,  1.24s/it]
Running inference:  18%|█▊        | 11/61 [00:12<01:02,  1.25s/it]
Running inference:  20%|█▉        | 12/61 [00:14<01:01,  1.25s/it]
Running inference:  21%|██▏       | 13/61 [00:15<01:00,  1.26s/it]
Running inference:  23%|██▎       | 14/61 [00:16<00:59,  1.26s/it]
Running inference:  25%|██▍       | 15/61 [00:17<00:57,  1.26s/it]
Running inference:  26%|██▌       | 16/61 [00:19<00:56,  1.26s/it]
Running inference:  28%|██▊       | 17/61 [00:20<00:55,  1.26s/it]
Running inference:  30%|██▉       | 18/61 [00:21<00:54,  1.26s/it]
Running inference:  31%|███       | 19/61 [00:22<00:52,  1.26s/it]
Running inference:  33%|███▎      | 20/61 [00:24<00:51,  1.26s/it]
Running inference:  34%|███▍      | 21/61 [00:25<00:50,  1.26s/it]
Running inference:  36%|███▌      | 22/61 [00:26<00:49,  1.26s/it]
Running inference:  38%|███▊      | 23/61 [00:27<00:47,  1.26s/it]
Running inference:  39%|███▉      | 24/61 [00:29<00:46,  1.26s/it]
Running inference:  41%|████      | 25/61 [00:30<00:45,  1.26s/it]
Running inference:  43%|████▎     | 26/61 [00:31<00:44,  1.26s/it]
Running inference:  44%|████▍     | 27/61 [00:32<00:42,  1.26s/it]
Running inference:  46%|████▌     | 28/61 [00:34<00:41,  1.26s/it]
Running inference:  48%|████▊     | 29/61 [00:35<00:40,  1.26s/it]
Running inference:  49%|████▉     | 30/61 [00:36<00:39,  1.26s/it]
Running inference:  51%|█████     | 31/61 [00:37<00:37,  1.26s/it]
Running inference:  52%|█████▏    | 32/61 [00:39<00:36,  1.26s/it]
Running inference:  54%|█████▍    | 33/61 [00:40<00:35,  1.26s/it]
Running inference:  56%|█████▌    | 34/61 [00:41<00:34,  1.26s/it]
Running inference:  57%|█████▋    | 35/61 [00:43<00:32,  1.26s/it]
Running inference:  59%|█████▉    | 36/61 [00:44<00:31,  1.26s/it]
Running inference:  61%|██████    | 37/61 [00:45<00:30,  1.26s/it]
Running inference:  62%|██████▏   | 38/61 [00:46<00:29,  1.26s/it]
Running inference:  64%|██████▍   | 39/61 [00:48<00:27,  1.26s/it]
Running inference:  66%|██████▌   | 40/61 [00:49<00:26,  1.26s/it]
Running inference:  67%|██████▋   | 41/61 [00:50<00:25,  1.26s/it]
Running inference:  69%|██████▉   | 42/61 [00:51<00:23,  1.26s/it]
Running inference:  70%|███████   | 43/61 [00:53<00:22,  1.26s/it]
Running inference:  72%|███████▏  | 44/61 [00:54<00:21,  1.26s/it]
Running inference:  74%|███████▍  | 45/61 [00:55<00:20,  1.26s/it]
Running inference:  75%|███████▌  | 46/61 [00:56<00:18,  1.26s/it]
Running inference:  77%|███████▋  | 47/61 [00:58<00:17,  1.26s/it]
Running inference:  79%|███████▊  | 48/61 [00:59<00:16,  1.26s/it]
Running inference:  80%|████████  | 49/61 [01:00<00:15,  1.26s/it]
Running inference:  82%|████████▏ | 50/61 [01:01<00:13,  1.26s/it]
Running inference:  84%|████████▎ | 51/61 [01:03<00:12,  1.26s/it]
Running inference:  85%|████████▌ | 52/61 [01:04<00:11,  1.26s/it]
Running inference:  87%|████████▋ | 53/61 [01:05<00:10,  1.26s/it]
Running inference:  89%|████████▊ | 54/61 [01:07<00:08,  1.26s/it]
Running inference:  90%|█████████ | 55/61 [01:08<00:07,  1.26s/it]
Running inference:  92%|█████████▏| 56/61 [01:09<00:06,  1.26s/it]
Running inference:  93%|█████████▎| 57/61 [01:10<00:05,  1.26s/it]
Running inference:  95%|█████████▌| 58/61 [01:12<00:03,  1.26s/it]
Running inference:  97%|█████████▋| 59/61 [01:13<00:02,  1.26s/it]
Running inference:  98%|█████████▊| 60/61 [01:14<00:01,  1.26s/it]
Running inference: 100%|██████████| 61/61 [01:15<00:00,  1.26s/it]
Running inference: 100%|██████████| 61/61 [01:15<00:00,  1.24s/it]
2025-05-15 03:11:08.569 | SUCCESS  | __main__:run_stats:160 - Inference complete

Post Processing#

The last step is to post-process our results.

Note that the NetCDF IO backend provides additional APIs for interacting with the stored data.

import matplotlib.pyplot as plt

# Combine the initial time with each lead time to get valid forecast times
times = io["time"][:].flatten() + io["lead_time"][:].flatten()

fig = plt.figure(figsize=(12, 4))
ax = fig.add_subplot(1, 1, 1)
ax.plot(times, io["soi"][:].flatten())
ax.set_title("Southern Oscillation Index")
ax.grid(True)

plt.savefig("outputs/07_southern_oscillation_index_prediction_2022.png")
io.close()
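A common next step after plotting an index like this is to smooth out high-frequency variability before interpreting it. The sketch below illustrates this with a centered rolling mean over a 2-day window (8 six-hourly steps). It uses synthetic stand-in arrays for `times` and the SOI values, since the `io` backend from the example is not available here; the window length and the noise model are illustrative assumptions, not part of the original workflow.

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins for the arrays read from the IO backend above:
# one initial time plus 6-hourly lead times (61 steps, matching the
# progress bar in the example output).
times = np.datetime64("2022-01-01") + np.arange(61) * np.timedelta64(6, "h")
soi = np.sin(np.linspace(0, 4 * np.pi, 61)) + 0.3 * np.random.default_rng(0).normal(size=61)

# A centered rolling mean over a 2-day window (8 six-hourly steps)
# smooths the high-frequency noise; min_periods=1 avoids NaNs at the edges.
series = pd.Series(soi, index=pd.DatetimeIndex(times))
smoothed = series.rolling(window=8, center=True, min_periods=1).mean()

print(len(smoothed), float(smoothed.std()) < float(series.std()))
```

With real data, `series` would instead be built from `io["soi"][:].flatten()` and the combined forecast times before closing the backend.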
Southern Oscillation Index

Total running time of the script: (2 minutes 14.766 seconds)

Gallery generated by Sphinx-Gallery