centaur

attention
class parts.centaur.attention.AttentionBlock(hidden_size, attention_dropout, layer_postprocess_dropout, training, cnn_dropout_prob, regularizer=None, conv_params=None, n_heads=1, window_size=None, back_step_size=None, name='attention_block')

Bases: object

Attention block for the Centaur model.
__init__(hidden_size, attention_dropout, layer_postprocess_dropout, training, cnn_dropout_prob, regularizer=None, conv_params=None, n_heads=1, window_size=None, back_step_size=None, name='attention_block')

Attention block constructor. A construction sketch follows the parameter list.
Parameters:
- hidden_size – dimensionality of hidden embeddings.
- attention_dropout – dropout rate for attention layer.
- layer_postprocess_dropout – dropout rate for sublayer.
- training – whether the layer is in training mode.
- cnn_dropout_prob – dropout probability for the CNN layers.
- regularizer – regularizer for the convolution kernel.
- conv_params – description of convolutional layer.
- n_heads – number of attention heads. Defaults to 1.
- window_size – size of the attention window used to force monotonic attention during inference. Defaults to None.
- back_step_size – number of steps the attention is allowed to move backward during inference. Defaults to None.
- name – name of the block.
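A minimal construction sketch, assuming the TensorFlow 1.x setting this codebase targets. The keyword arguments mirror the documented signature; the concrete values are illustrative only, not recommendations.

from parts.centaur.attention import AttentionBlock

attention = AttentionBlock(
    hidden_size=256,                # dimensionality of hidden embeddings
    attention_dropout=0.1,          # dropout rate for the attention layer
    layer_postprocess_dropout=0.1,  # dropout rate for the sublayer
    training=True,
    cnn_dropout_prob=0.5,           # dropout probability for the CNN layers
    n_heads=4,                      # multi-head attention instead of the default 1
    window_size=None,               # monotonic windowing is an inference-time control
    name="attention_block_0",
)

Note that window_size and back_step_size only take effect at inference time, where they constrain how the attention position may move.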
batch_norm

conv_block
class parts.centaur.conv_block.ConvBlock(name, conv, norm, activation_fn, dropout, training, is_residual, is_causal)

Bases: object

Convolutional block for the Centaur model.
__init__(name, conv, norm, activation_fn, dropout, training, is_residual, is_causal)

Convolutional block constructor. A construction sketch follows the parameter list.
Parameters:
- name – name of the block.
- conv – convolutional layer.
- norm – normalization layer to use after the convolutional layer.
- activation_fn – activation function to use after the normalization.
- dropout – dropout rate.
- training – whether the layer is in training mode.
- is_residual – whether the block should contain a residual connection.
- is_causal – whether the convolutional layer should be causal.
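A minimal construction sketch. The conv and norm arguments are pre-built layer objects; the tf.layers choices below are an assumption about what the block accepts (this page does not pin down the expected types), and the hyperparameter values are illustrative.

import tensorflow as tf
from parts.centaur.conv_block import ConvBlock

# Hypothetical layer objects; the exact expected types are an assumption.
conv_layer = tf.layers.Conv1D(filters=256, kernel_size=5, padding="same")
norm_layer = tf.layers.BatchNormalization()

block = ConvBlock(
    name="conv_block_0",
    conv=conv_layer,
    norm=norm_layer,
    activation_fn=tf.nn.relu,  # applied after normalization
    dropout=0.1,
    training=True,
    is_residual=True,          # add a residual connection around the block
    is_causal=False,           # non-causal convolution
)

A causal convolution lets each output position depend only on current and earlier positions, which is what autoregressive use requires; is_causal=False leaves the full receptive field available.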
prenet

class parts.centaur.prenet.Prenet(n_layers, hidden_size, activation_fn, dropout=0.5, regularizer=None, training=True, dtype=None, name='prenet')

Bases: object

Centaur decoder pre-net.
__init__(n_layers, hidden_size, activation_fn, dropout=0.5, regularizer=None, training=True, dtype=None, name='prenet')

Pre-net constructor. A construction sketch follows the parameter list.
Parameters:
- n_layers – number of fully-connected layers to use.
- hidden_size – number of units in each pre-net layer.
- activation_fn – activation function to use.
- dropout – dropout rate. Defaults to 0.5.
- regularizer – regularizer for the convolution kernel. Defaults to None.
- training – whether the layer is in training mode. Defaults to True.
- dtype – dtype of the layer’s weights. Defaults to None.
- name – name of the block.
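A minimal construction sketch using the documented defaults where they exist; the activation function and sizes are illustrative.

import tensorflow as tf
from parts.centaur.prenet import Prenet

prenet = Prenet(
    n_layers=2,                # number of fully-connected layers
    hidden_size=256,           # units per pre-net layer
    activation_fn=tf.nn.relu,  # illustrative choice
    dropout=0.5,               # documented default
    training=True,             # documented default
    name="prenet",
)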