nvalchemi.data.datapipes.ZarrArrayConfig#
- class nvalchemi.data.datapipes.ZarrArrayConfig(*, compressors=None, filters=None, serializer=None, chunk_size=None, shard_size=None, write_empty_chunks=True)[source]#
Configuration for Zarr array compression, chunking, and sharding.
- Parameters:
compressors (tuple[zarr.abc.codec.Codec, ...] | None) – Compressor codec(s) to apply. E.g. (zarr.codecs.ZstdCodec(level=3),).
filters (tuple[zarr.abc.codec.Codec, ...] | None) – Array-to-array filter codec(s). E.g. (zarr.codecs.TransposeCodec(order=(1, 0)),).
serializer (zarr.abc.codec.Codec | None) – Bytes serializer codec. E.g. zarr.codecs.BytesCodec(endian="little").
chunk_size (int | None) – Chunk length along dimension 0. Other dimensions use their full extent. None uses Zarr defaults.
shard_size (int | None) – Shard length along dimension 0. When set, multiple chunks are stored in a single storage object. Must be a multiple of chunk_size when both are specified. None disables sharding.
write_empty_chunks (bool) – Whether to write chunks that are entirely fill-valued. Default True.
- model_config = {'arbitrary_types_allowed': True}#
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
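The constraint that shard_size must be a multiple of chunk_size can be illustrated with a minimal, dependency-free sketch. The class below is a hypothetical stand-in for ZarrArrayConfig (which is a pydantic model with zarr codec fields); only the chunking/sharding fields from the parameter list above are modeled, and the validation logic is an assumption based on the documented rule:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ZarrArrayConfigSketch:
    # Hypothetical stand-in for nvalchemi.data.datapipes.ZarrArrayConfig.
    # Codec fields (compressors, filters, serializer) are omitted to keep
    # this sketch free of zarr/pydantic dependencies.
    chunk_size: Optional[int] = None
    shard_size: Optional[int] = None
    write_empty_chunks: bool = True

    def __post_init__(self) -> None:
        # Per the docs: when both are specified, shard_size must be a
        # multiple of chunk_size, so each shard holds a whole number
        # of chunks. None disables the corresponding feature.
        if (
            self.chunk_size is not None
            and self.shard_size is not None
            and self.shard_size % self.chunk_size != 0
        ):
            raise ValueError("shard_size must be a multiple of chunk_size")


# Valid: each shard stores 8192 // 1024 = 8 chunks.
cfg = ZarrArrayConfigSketch(chunk_size=1024, shard_size=8192)
```

With the real class, one would instead pass zarr codecs for the compression fields, e.g. compressors=(zarr.codecs.ZstdCodec(level=3),), as shown in the parameter descriptions above.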