zarr.abc.codec#

Attributes#

CodecInput

CodecOutput

Classes#

ArrayArrayCodec

Base class for array-to-array codecs.

ArrayBytesCodec

Base class for array-to-bytes codecs.

ArrayBytesCodecPartialDecodeMixin

Mixin for array-to-bytes codecs that implement partial decoding.

ArrayBytesCodecPartialEncodeMixin

Mixin for array-to-bytes codecs that implement partial encoding.

BaseCodec

Generic base class for codecs.

BytesBytesCodec

Base class for bytes-to-bytes codecs.

CodecPipeline

Base class for implementing CodecPipeline.

Module Contents#

class zarr.abc.codec.ArrayArrayCodec[source]#

Bases: BaseCodec[zarr.core.buffer.NDBuffer, zarr.core.buffer.NDBuffer]

Base class for array-to-array codecs.
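
For orientation, a minimal schematic array-to-array codec is sketched below: it negates chunk values on encode and restores them on decode. This is not part of zarr; metadata serialization (from the Metadata base class) and codec registration are omitted, and the spec.prototype.nd_buffer / as_numpy_array() buffer helpers are assumptions about zarr's internal buffer API that should be checked against zarr.core.buffer and zarr.core.array_spec.

    from dataclasses import dataclass

    from zarr.abc.codec import ArrayArrayCodec
    from zarr.core.array_spec import ArraySpec


    @dataclass(frozen=True)
    class NegateCodec(ArrayArrayCodec):
        """Schematic codec: stores chunks with their sign flipped."""

        is_fixed_size = True

        def compute_encoded_size(self, input_byte_length: int, chunk_spec: ArraySpec) -> int:
            return input_byte_length  # negation does not change the byte length

        async def encode(self, chunks_and_specs):
            # None chunks are passed through untouched, as required by the base class
            return [
                None if chunk is None
                else spec.prototype.nd_buffer.from_numpy_array(-chunk.as_numpy_array())
                for chunk, spec in chunks_and_specs
            ]

        async def decode(self, chunks_and_specs):
            return [
                None if chunk is None
                else spec.prototype.nd_buffer.from_numpy_array(-chunk.as_numpy_array())
                for chunk, spec in chunks_and_specs
            ]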

class zarr.abc.codec.ArrayBytesCodec[source]#

Bases: BaseCodec[zarr.core.buffer.NDBuffer, zarr.core.buffer.Buffer]

Base class for array-to-bytes codecs.
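
As a sketch of the serialization role this class plays, the hypothetical codec below writes chunks as raw C-order bytes and reads them back using the shape and dtype from the chunk spec. The buffer helpers (as_numpy_array, tobytes, to_bytes, from_bytes, from_numpy_array) and the spec attributes used here are assumptions about zarr internals; zarr's own BytesCodec in zarr.codecs is the real implementation of this idea.

    from dataclasses import dataclass

    import numpy as np

    from zarr.abc.codec import ArrayBytesCodec


    @dataclass(frozen=True)
    class RawBytesCodec(ArrayBytesCodec):
        """Schematic codec: chunk bytes are the array's raw C-order buffer."""

        is_fixed_size = True

        async def encode(self, chunks_and_specs):
            return [
                None if chunk is None
                else spec.prototype.buffer.from_bytes(chunk.as_numpy_array().tobytes())
                for chunk, spec in chunks_and_specs
            ]

        async def decode(self, chunks_and_specs):
            return [
                None if buf is None
                else spec.prototype.nd_buffer.from_numpy_array(
                    np.frombuffer(buf.to_bytes(), dtype=spec.dtype).reshape(spec.shape)
                )
                for buf, spec in chunks_and_specs
            ]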

class zarr.abc.codec.ArrayBytesCodecPartialDecodeMixin[source]#

Mixin for array-to-bytes codecs that implement partial decoding.

async decode_partial(
batch_info: collections.abc.Iterable[tuple[zarr.abc.store.ByteGetter, zarr.core.indexing.SelectorTuple, zarr.core.array_spec.ArraySpec]],
) → collections.abc.Iterable[zarr.core.buffer.NDBuffer | None][source]#

Partially decodes a batch of chunks. This method determines parts of a chunk from the slice selection, fetches these parts from the store (via ByteGetter) and decodes them.

Parameters:
batch_info : Iterable[tuple[ByteGetter, SelectorTuple, ArraySpec]]

Ordered set of information about slices of encoded chunks. The slice selection determines which parts of the chunk will be fetched. The ByteGetter is used to fetch the necessary bytes. The chunk spec contains information about the construction of an array from the bytes.

Returns:
Iterable[NDBuffer | None]
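
A codec gains this mixin's interface by implementing decode_partial. The sketch below shows the simplest possible strategy, assuming the mixin is combined with an ArrayBytesCodec subclass: fetch the whole chunk, decode it with the codec's regular decode(), and apply the selection afterwards. ByteGetter.get(), default_buffer_prototype() and NDBuffer indexing are assumptions about zarr internals; a genuinely partial decoder (such as zarr's sharding codec) would request only the byte ranges it needs.

    from zarr.abc.codec import ArrayBytesCodecPartialDecodeMixin
    from zarr.core.buffer import default_buffer_prototype


    class WholeChunkPartialDecode(ArrayBytesCodecPartialDecodeMixin):
        """Schematic mixin usage: decode whole chunks, then slice."""

        async def decode_partial(self, batch_info):
            out = []
            for byte_getter, selection, chunk_spec in batch_info:
                raw = await byte_getter.get(prototype=default_buffer_prototype())
                if raw is None:
                    out.append(None)  # missing chunk
                    continue
                # self.decode is provided by the ArrayBytesCodec this mixin is combined with
                [chunk] = list(await self.decode([(raw, chunk_spec)]))
                out.append(None if chunk is None else chunk[selection])
            return out
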
class zarr.abc.codec.ArrayBytesCodecPartialEncodeMixin[source]#

Mixin for array-to-bytes codecs that implement partial encoding.

async encode_partial(
batch_info: collections.abc.Iterable[tuple[zarr.abc.store.ByteSetter, zarr.core.buffer.NDBuffer, zarr.core.indexing.SelectorTuple, zarr.core.array_spec.ArraySpec]],
) → None[source]#

Partially encodes a batch of chunks. This method determines parts of a chunk from the slice selection, encodes them and writes these parts to the store (via ByteSetter). If merging with existing chunk data in the store is necessary, this method will read from the store first and perform the merge.

Parameters:
batch_info : Iterable[tuple[ByteSetter, NDBuffer, SelectorTuple, ArraySpec]]

Ordered set of information about slices of to-be-encoded chunks. The slice selection determines which parts of the chunk will be encoded. The ByteSetter is used to write the necessary bytes and fetch bytes for existing chunk data. The chunk spec contains information about the chunk.
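
The sketch below outlines the read-modify-write fallback described above, assuming the mixin is combined with an ArrayBytesCodec subclass: read the stored chunk if it exists, write the selected region into it, re-encode the whole chunk and store it. ByteSetter.get()/set(), default_buffer_prototype(), NDBuffer.create()/__setitem__ and the chunk-spec attributes are assumptions about zarr internals.

    from zarr.abc.codec import ArrayBytesCodecPartialEncodeMixin
    from zarr.core.buffer import default_buffer_prototype


    class WholeChunkPartialEncode(ArrayBytesCodecPartialEncodeMixin):
        """Schematic mixin usage: merge into the full chunk, then re-encode."""

        async def encode_partial(self, batch_info):
            for byte_setter, chunk_value, selection, chunk_spec in batch_info:
                existing = await byte_setter.get(prototype=default_buffer_prototype())
                chunk = None
                if existing is not None:
                    # merge with the chunk data already in the store
                    [chunk] = list(await self.decode([(existing, chunk_spec)]))
                if chunk is None:
                    # no previous data: start from a chunk filled with the fill value
                    chunk = chunk_spec.prototype.nd_buffer.create(
                        shape=chunk_spec.shape,
                        dtype=chunk_spec.dtype,
                        fill_value=chunk_spec.fill_value,
                    )
                chunk[selection] = chunk_value
                [encoded] = list(await self.encode([(chunk, chunk_spec)]))
                if encoded is not None:
                    await byte_setter.set(encoded)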

class zarr.abc.codec.BaseCodec[source]#

Bases: zarr.abc.metadata.Metadata, Generic[CodecInput, CodecOutput]

Generic base class for codecs.

Codecs can be registered via zarr.codecs.registry.

Warning

This class is not intended to be used directly. Please subclass ArrayArrayCodec, ArrayBytesCodec or BytesBytesCodec instead.

abstract compute_encoded_size(
input_byte_length: int,
chunk_spec: zarr.core.array_spec.ArraySpec,
) → int[source]#

Given an input byte length, this method returns the output byte length. Raises a NotImplementedError for codecs with variable-sized outputs (e.g. compressors).

Parameters:
input_byte_length : int
chunk_spec : ArraySpec
Returns:
int
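
As an illustration of this contract, the two hypothetical snippets below show a fixed-size codec reporting the input length unchanged and a compressor declining to predict its output size:

    from zarr.core.array_spec import ArraySpec


    class FixedSizeExample:
        def compute_encoded_size(self, input_byte_length: int, chunk_spec: ArraySpec) -> int:
            # e.g. a permutation or an endianness swap: size is unchanged
            return input_byte_length


    class CompressorExample:
        def compute_encoded_size(self, input_byte_length: int, chunk_spec: ArraySpec) -> int:
            # compressed size depends on the data, so it cannot be computed up front
            raise NotImplementedError
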
async decode(
chunks_and_specs: collections.abc.Iterable[tuple[CodecOutput | None, zarr.core.array_spec.ArraySpec]],
) → collections.abc.Iterable[CodecInput | None][source]#

Decodes a batch of chunks. Chunks can be None, in which case they are ignored by the codec.

Parameters:
chunks_and_specs : Iterable[tuple[CodecOutput | None, ArraySpec]]

Ordered set of encoded chunks with their accompanying chunk spec.

Returns:
Iterable[CodecInput | None]
async encode(
chunks_and_specs: collections.abc.Iterable[tuple[CodecInput | None, zarr.core.array_spec.ArraySpec]],
) → collections.abc.Iterable[CodecOutput | None][source]#

Encodes a batch of chunks. Chunks can be None, in which case they are ignored by the codec.

Parameters:
chunks_and_specs : Iterable[tuple[CodecInput | None, ArraySpec]]

Ordered set of to-be-encoded chunks with their accompanying chunk spec.

Returns:
Iterable[CodecOutput | None]
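
Since encode() and decode() both operate on batches of (chunk, spec) pairs, a single chunk is processed by wrapping it in a one-element batch. The helper below sketches a round trip through any concrete codec; codec, chunk and spec are placeholders for a BaseCodec subclass, its input buffer and the matching ArraySpec.

    async def round_trip(codec, chunk, spec):
        # encode() and decode() are coroutines returning iterables of results
        [encoded] = list(await codec.encode([(chunk, spec)]))
        [decoded] = list(await codec.decode([(encoded, spec)]))
        return decoded
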
evolve_from_array_spec(array_spec: zarr.core.array_spec.ArraySpec) → Self[source]#

Fills in codec configuration parameters that can be automatically inferred from the array metadata.

Parameters:
array_spec : ArraySpec
Returns:
Self
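
A typical use is a codec with an optional parameter whose default depends on the array, for example a permutation that should cover all dimensions. The hypothetical sketch below fills such a parameter from the spec's shape; the frozen-dataclass layout and array_spec.shape are assumptions about zarr internals.

    from dataclasses import dataclass, replace

    from zarr.core.array_spec import ArraySpec


    @dataclass(frozen=True)
    class HypotheticalPermuteCodec:
        order: tuple[int, ...] | None = None

        def evolve_from_array_spec(self, array_spec: ArraySpec):
            if self.order is None:
                # default to the identity permutation over all dimensions
                return replace(self, order=tuple(range(len(array_spec.shape))))
            return self
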
resolve_metadata(
chunk_spec: zarr.core.array_spec.ArraySpec,
) → zarr.core.array_spec.ArraySpec[source]#

Computes the spec of the chunk after it has been encoded by the codec. This is important for codecs that change the shape, data type or fill value of a chunk. The spec will then be used for subsequent codecs in the pipeline.

Parameters:
chunk_spec : ArraySpec
Returns:
ArraySpec
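
For instance, a hypothetical codec that casts chunks to float32 before further encoding would report the new dtype so that downstream codecs see the encoded form. The sketch assumes ArraySpec is a frozen dataclass whose dtype field can be swapped with dataclasses.replace:

    from dataclasses import replace

    import numpy as np

    from zarr.core.array_spec import ArraySpec


    def resolve_metadata(self, chunk_spec: ArraySpec) -> ArraySpec:
        # downstream codecs in the pipeline will see float32 chunks
        return replace(chunk_spec, dtype=np.dtype("float32"))
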
validate(
*,
shape: zarr.core.common.ChunkCoords,
dtype: numpy.dtype[Any],
chunk_grid: zarr.core.chunk_grids.ChunkGrid,
) → None[source]#

Validates that the codec configuration is compatible with the array metadata. Raises errors when the codec configuration is not compatible.

Parameters:
shape : ChunkCoords

The array shape

dtype : np.dtype[Any]

The array data type

chunk_grid : ChunkGrid

The array chunk grid
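
For example, a hypothetical codec that only supports floating-point data could reject other dtypes up front rather than failing later during encoding:

    import numpy as np

    from zarr.core.chunk_grids import ChunkGrid
    from zarr.core.common import ChunkCoords


    def validate(self, *, shape: ChunkCoords, dtype: np.dtype, chunk_grid: ChunkGrid) -> None:
        if dtype.kind != "f":
            raise ValueError(f"this codec only supports floating-point dtypes, got {dtype}")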

is_fixed_size: bool[source]#
class zarr.abc.codec.BytesBytesCodec[source]#

Bases: BaseCodec[zarr.core.buffer.Buffer, zarr.core.buffer.Buffer]

Base class for bytes-to-bytes codecs.
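
The hypothetical bytes-to-bytes codec below wraps the standard library's gzip module, purely to show the shape of a compressor: it transforms encoded Buffers into other Buffers and cannot predict its output size. Buffer.to_bytes(), spec.prototype.buffer.from_bytes() and the frozen-dataclass layout are assumptions about zarr internals; zarr ships a real GzipCodec in zarr.codecs.

    import gzip
    from dataclasses import dataclass

    from zarr.abc.codec import BytesBytesCodec
    from zarr.core.array_spec import ArraySpec


    @dataclass(frozen=True)
    class ToyGzipCodec(BytesBytesCodec):
        """Schematic compressor: gzip the encoded chunk bytes."""

        level: int = 5
        is_fixed_size = False

        def compute_encoded_size(self, input_byte_length: int, chunk_spec: ArraySpec) -> int:
            raise NotImplementedError  # compressed size depends on the data

        async def encode(self, chunks_and_specs):
            return [
                None if buf is None
                else spec.prototype.buffer.from_bytes(gzip.compress(buf.to_bytes(), self.level))
                for buf, spec in chunks_and_specs
            ]

        async def decode(self, chunks_and_specs):
            return [
                None if buf is None
                else spec.prototype.buffer.from_bytes(gzip.decompress(buf.to_bytes()))
                for buf, spec in chunks_and_specs
            ]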

class zarr.abc.codec.CodecPipeline[source]#

Base class for implementing CodecPipeline. A CodecPipeline implements the read and write paths for chunk data. On the read path, it is responsible for fetching chunks from a store (via ByteGetter), decoding them and assembling an output array. On the write path, it encodes the chunks and writes them to a store (via ByteSetter).

abstract compute_encoded_size(
byte_length: int,
array_spec: zarr.core.array_spec.ArraySpec,
) → int[source]#

Given an input byte length, this method returns the output byte length. Raises a NotImplementedError for codecs with variable-sized outputs (e.g. compressors).

Parameters:
byte_length : int
array_spec : ArraySpec
Returns:
int
abstract decode(
chunk_bytes_and_specs: collections.abc.Iterable[tuple[zarr.core.buffer.Buffer | None, zarr.core.array_spec.ArraySpec]],
) → collections.abc.Iterable[zarr.core.buffer.NDBuffer | None][source]#
Async:

Decodes a batch of chunks. Chunks can be None, in which case they are ignored by the codec.

Parameters:
chunk_bytes_and_specs : Iterable[tuple[Buffer | None, ArraySpec]]

Ordered set of encoded chunks with their accompanying chunk spec.

Returns:
Iterable[NDBuffer | None]
abstract encode(
chunk_arrays_and_specs: collections.abc.Iterable[tuple[zarr.core.buffer.NDBuffer | None, zarr.core.array_spec.ArraySpec]],
) → collections.abc.Iterable[zarr.core.buffer.Buffer | None][source]#
Async:

Encodes a batch of chunks. Chunks can be None, in which case they are ignored by the codec.

Parameters:
chunk_arrays_and_specs : Iterable[tuple[NDBuffer | None, ArraySpec]]

Ordered set of to-be-encoded chunks with their accompanying chunk spec.

Returns:
Iterable[Buffer | None]
abstract evolve_from_array_spec(array_spec: zarr.core.array_spec.ArraySpec) → Self[source]#

Fills in codec configuration parameters that can be automatically inferred from the array metadata.

Parameters:
array_spec : ArraySpec
Returns:
Self
classmethod from_codecs(codecs: collections.abc.Iterable[Codec]) → Self[source]#
Abstractmethod:

Creates a codec pipeline from an iterable of codecs.

Parameters:
codecs : Iterable[Codec]
Returns:
Self
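
A sketch of assembling a pipeline from zarr's built-in v3 codecs. BatchedCodecPipeline is assumed to be the default implementation living in the internal zarr.core.codec_pipeline module; its location may change between zarr versions.

    from zarr.codecs import BytesCodec, GzipCodec
    from zarr.core.codec_pipeline import BatchedCodecPipeline

    # array-to-bytes codec first, then bytes-to-bytes compression
    pipeline = BatchedCodecPipeline.from_codecs([BytesCodec(), GzipCodec(level=5)])
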
abstract read(
batch_info: collections.abc.Iterable[tuple[zarr.abc.store.ByteGetter, zarr.core.array_spec.ArraySpec, zarr.core.indexing.SelectorTuple, zarr.core.indexing.SelectorTuple]],
out: zarr.core.buffer.NDBuffer,
drop_axes: tuple[int, ...] = (),
) → None[source]#
Async:

Reads chunk data from the store, decodes it and writes it into an output array. Partial decoding may be utilized if the codecs and stores support it.

Parameters:
batch_info : Iterable[tuple[ByteGetter, ArraySpec, SelectorTuple, SelectorTuple]]

Ordered set of information about the chunks. The first slice selection determines which parts of the chunk will be fetched. The second slice selection determines where in the output array the chunk data will be written. The ByteGetter is used to fetch the necessary bytes. The chunk spec contains information about the construction of an array from the bytes.

out : NDBuffer
abstract validate(
*,
shape: zarr.core.common.ChunkCoords,
dtype: numpy.dtype[Any],
chunk_grid: zarr.core.chunk_grids.ChunkGrid,
) → None[source]#

Validates that all codec configurations are compatible with the array metadata. Raises errors when a codec configuration is not compatible.

Parameters:
shape : ChunkCoords

The array shape

dtype : np.dtype[Any]

The array data type

chunk_grid : ChunkGrid

The array chunk grid

abstract write(
batch_info: collections.abc.Iterable[tuple[zarr.abc.store.ByteSetter, zarr.core.array_spec.ArraySpec, zarr.core.indexing.SelectorTuple, zarr.core.indexing.SelectorTuple]],
value: zarr.core.buffer.NDBuffer,
drop_axes: tuple[int, ...] = (),
) → None[source]#
Async:

Encodes chunk data and writes it to the store. Merges with existing chunk data by reading first, if necessary. Partial encoding may be utilized if the codecs and stores support it.

Parameters:
batch_info : Iterable[tuple[ByteSetter, ArraySpec, SelectorTuple, SelectorTuple]]

Ordered set of information about the chunks. The first slice selection determines which parts of the chunk will be encoded. The second slice selection determines where in the value array the chunk data is located. The ByteSetter is used to fetch and write the necessary bytes. The chunk spec contains information about the chunk.

value : NDBuffer
property supports_partial_decode: bool[source]#
Abstractmethod:

property supports_partial_encode: bool[source]#
Abstractmethod:

zarr.abc.codec.CodecInput[source]#
zarr.abc.codec.CodecOutput[source]#