Core modules#
Layers#
Layers are the basic building blocks of neural networks.
The following layers are available:

- Apply a 1D convolution.
- Apply a 2D convolution.
- Apply a Gated Recurrent Unit (GRU) cell.
- Apply a linear transformation over the final dimension of the input.
- Apply a Long Short-Term Memory (LSTM) cell.
- Apply an Elman Recurrent Neural Network (RNN) cell.
- Apply callable modules (e.g. layers, activation functions, etc.) connected in a cascading sequence.
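To make the last two entries concrete, here is a minimal sketch of a linear layer applied over the final dimension of its input, and a sequential container that chains callables. This is an illustrative NumPy implementation, not the library's actual API; the class names, initialization scheme, and shapes are assumptions.

```python
import numpy as np

class Linear:
    """Sketch of a linear layer: y = x @ W + b over the last axis of x.
    (Hypothetical implementation; uniform init scale is an assumption.)"""
    def __init__(self, in_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(in_features)
        self.W = rng.uniform(-scale, scale, (in_features, out_features))
        self.b = np.zeros(out_features)

    def __call__(self, x):
        # Contract over the final dimension; leading dimensions are preserved.
        return x @ self.W + self.b

class Sequential:
    """Sketch of a sequential container: apply callables in order."""
    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# Any callable can participate, e.g. an activation function between layers.
model = Sequential(Linear(4, 8), np.tanh, Linear(8, 2))
out = model(np.zeros((3, 4)))
print(out.shape)  # (3, 2)
```

Because the linear transform only touches the final dimension, the same layer accepts inputs with any number of leading (batch) dimensions.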