warp.optim#

Optimization algorithms for gradient descent and linear systems.

This module provides gradient-based optimizers (Adam, SGD) that update parameter arrays from externally computed gradients (for example, gradients produced by wp.Tape). The warp.optim.linear submodule provides iterative linear solvers.

Usage:

This module must be explicitly imported:

import warp.optim
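
For example, a script that uses both the optimizers and the linear solvers imports the module and submodule by name; importing warp alone does not make them available. A minimal sketch:

import warp as wp
import warp.optim          # optimizers: Adam, SGD
import warp.optim.linear   # iterative linear solvers

wp.init()
print(warp.optim.Adam, warp.optim.SGD)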

Submodules#

These modules are automatically available when you import warp.optim.

warp.optim.linear

Iterative solvers for linear systems.
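
As an illustration, the sketch below solves a small dense symmetric positive-definite system with the conjugate-gradient solver. The exact cg call shown (matrix, right-hand side, in-place solution vector, plus tol and maxiter keywords) is an assumption; check the warp.optim.linear reference for the definitive signature.

import numpy as np
import warp as wp
from warp.optim.linear import cg

wp.init()

n = 64
# Dense SPD system A x = b with A = 2*I, so the exact solution is x = b / 2.
A = wp.array(2.0 * np.eye(n, dtype=np.float32), dtype=wp.float32)
b = wp.array(np.ones(n, dtype=np.float32), dtype=wp.float32)
x = wp.zeros(n, dtype=wp.float32)

# Solve in place into x (tol and maxiter keywords assumed).
cg(A, b, x, maxiter=100, tol=1.0e-6)
print(x.numpy()[:4])  # expect values close to 0.5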

API#

Adam

Adaptive Moment Estimation (Adam) optimizer.
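
As a sketch of typical usage, the loop below minimizes a simple squared-error loss, with wp.Tape supplying the gradients. It assumes the constructor takes a list of parameter arrays plus a learning rate, and that step() takes the matching list of gradient arrays; the exact signature is given in the class reference.

import numpy as np
import warp as wp
import warp.optim

wp.init()

@wp.kernel
def square_error(x: wp.array(dtype=float), target: float, loss: wp.array(dtype=float)):
    # Accumulate (x[i] - target)^2 into a single-element loss array.
    i = wp.tid()
    d = x[i] - target
    wp.atomic_add(loss, 0, d * d)

n = 8
x = wp.array(np.random.randn(n).astype(np.float32), dtype=float, requires_grad=True)
loss = wp.zeros(1, dtype=float, requires_grad=True)

opt = warp.optim.Adam([x], lr=0.1)  # assumed: list of parameter arrays + learning rate

for _ in range(200):
    loss.zero_()
    tape = wp.Tape()
    with tape:
        wp.launch(square_error, dim=n, inputs=[x, 2.0, loss])
    tape.backward(loss)   # fills x.grad
    opt.step([x.grad])    # assumed: list of gradient arrays matching the parameters
    tape.zero()

print(x.numpy())  # entries should end up close to the target value 2.0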

SGD

Stochastic Gradient Descent (SGD) optimizer with optional momentum.
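
SGD is a drop-in replacement for Adam in the loop above; the sketch below only shows construction, and the momentum keyword name is an assumption to be checked against the class reference.

import warp as wp
import warp.optim

wp.init()

x = wp.zeros(8, dtype=float, requires_grad=True)

# Plain SGD, and SGD with momentum (keyword name assumed).
opt_plain = warp.optim.SGD([x], lr=0.05)
opt_momentum = warp.optim.SGD([x], lr=0.05, momentum=0.9)

# Either optimizer is then driven the same way as Adam:
# call opt_momentum.step([x.grad]) after tape.backward(loss).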