Optimizers#
Optimizers are algorithms that update the parameters of a model.
The following table lists the available optimizers:
Base class#
Note
This is the base class for all the optimizers. It is not intended to be used directly.
API#
- class warp_nn.optimizers.Optimizer(
- parameters: list[array],
- *,
- lr: float = 0.001,
- device: str | Device | None = None,
- max_norm: float | None = None,
- disable_graph: bool = False,
- )
Bases: ABC

Base class for all optimizers.
- Parameters:
parameters – Model parameters.
lr – Learning rate.
device – Device to use for the optimizer.
max_norm – Maximum global norm for gradient clipping (see clip_by_total_norm()).
disable_graph – Whether to disable graph capture.
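Since Optimizer is abstract, a concrete optimizer must subclass it. The following plain-Python sketch mirrors the constructor signature documented above; the step() method, the attribute names, and the SGD subclass are illustrative assumptions rather than part of the documented warp_nn API, and the real class operates on warp arrays on a device rather than on Python lists.

```python
from abc import ABC, abstractmethod


class Optimizer(ABC):
    """Sketch of the documented base-class interface (not the real warp_nn code)."""

    def __init__(self, parameters, *, lr=0.001, device=None,
                 max_norm=None, disable_graph=False):
        self.parameters = parameters      # model parameters to update
        self.lr = lr                      # learning rate
        self.device = device              # device to use for the optimizer
        self.max_norm = max_norm          # maximum global norm for clipping
        self.disable_graph = disable_graph  # whether to disable graph capture

    @abstractmethod
    def step(self, grads):
        """Subclasses apply one parameter update here (hypothetical hook)."""


class SGD(Optimizer):
    """Minimal concrete subclass: vanilla gradient descent on nested lists."""

    def step(self, grads):
        for param, grad in zip(self.parameters, grads):
            for i in range(len(param)):
                param[i] -= self.lr * grad[i]
```

A subclass only needs to supply the update rule; construction and hyperparameter storage follow the base-class signature.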
- clip_by_total_norm()#
Clip parameters’ gradients in place by scaling them down so that their total (global) norm does not exceed max_norm.
Reference: Pascanu et al., “On the difficulty of training recurrent neural networks”, https://arxiv.org/abs/1211.5063
- Parameters:
max_norm – Maximum global norm.
disable_graph – Whether to disable graph capture.
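The clipping step can be sketched in plain Python. This is an illustrative implementation of global-norm clipping as described in the referenced paper, not the warp_nn method itself, which operates on warp arrays on the configured device; the standalone function signature and the list-of-lists gradient layout are assumptions.

```python
import math


def clip_by_total_norm(grads, max_norm):
    """Scale all gradients down in place so their global L2 norm
    does not exceed max_norm. Returns the pre-clipping total norm."""
    # Global norm over every element of every gradient.
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        for grad in grads:
            for i in range(len(grad)):
                grad[i] *= scale
    return total_norm
```

For example, clipping `[[3.0, 4.0]]` (global norm 5.0) with `max_norm=1.0` rescales it to `[[0.6, 0.8]]`; gradients already within the limit are left untouched.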