Activations
Activations are non-linear functions applied to a layer's output, transforming it into a new representation; without them, a stack of linear layers would collapse into a single linear map.
The following table lists the available activations:
| Activation | Description |
|------------|-------------|
| ELU | Exponential Linear Unit (ELU) activation function. |
| Leaky ReLU | Leaky Rectified Linear Unit (Leaky ReLU) activation function. |
| ReLU | Rectified Linear Unit (ReLU) activation function. |
| SELU | Scaled Exponential Linear Unit (SELU) activation function. |
| Sigmoid | Sigmoid activation function. |
| Softplus | Soft-plus activation function. |
| Softsign | Soft-sign activation function. |
| Tanh | Hyperbolic Tangent (Tanh) activation function. |
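As a reference for what each activation computes, here is a minimal NumPy sketch of the functions above using their standard textbook definitions (the SELU constants are the commonly published values); it illustrates the math only and is not the library's own implementation:

```python
import numpy as np

def relu(x):
    # max(0, x) elementwise
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # identity for x >= 0, small linear slope for x < 0
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # smooth exponential curve below zero, identity above
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU with self-normalizing constants
    return scale * elu(x, alpha)

def sigmoid(x):
    # squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softplus(x):
    # smooth approximation of ReLU: log(1 + exp(x)), computed stably
    return np.logaddexp(0.0, x)

def softsign(x):
    # tanh-like curve with polynomial (slower-saturating) tails
    return x / (1.0 + np.abs(x))

def tanh(x):
    # squashes inputs into (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))  # [0. 0. 2.]
```

Note that all of these operate elementwise, so they apply unchanged to tensors of any shape.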