Apex 0.1

AMP: Automatic Mixed Precision

  • apex.amp
    • opt_levels and Properties
      • Properties
      • opt_levels
        • O0: FP32 training
        • O1: Mixed Precision (recommended for typical use)
        • O2: “Almost FP16” Mixed Precision
        • O3: FP16 training
    • Unified API
    • Checkpointing
    • Advanced use cases
      • Advanced Amp Usage
        • GANs
        • Gradient clipping
        • Custom/user-defined autograd functions
        • Forcing particular layers/functions to a desired type
        • Multiple models/optimizers/losses
        • Gradient accumulation across iterations
        • Custom data batch types
    • Transition guide for old API users
      • For users of the old “Amp” API
      • For users of the old FP16_Optimizer
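
Taken together, the apex.amp entries above reduce to a small API surface:
amp.initialize wraps the model and optimizer, amp.scale_loss guards the
backward pass, and amp.state_dict supports checkpointing. A minimal sketch
(the toy model, synthetic data, and filename are illustrative, not from the
docs):

    import torch
    from apex import amp

    # Toy model and optimizer; amp.initialize must be called after the
    # model is on the GPU and before the training loop begins.
    model = torch.nn.Linear(512, 512).cuda()
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

    # O1 (mixed precision) is the recommended opt_level for typical use.
    model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

    for step in range(10):
        x = torch.randn(64, 512, device="cuda")
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        # Scale the loss so FP16 gradients do not underflow, then
        # backpropagate through the scaled loss.
        with amp.scale_loss(loss, optimizer) as scaled_loss:
            scaled_loss.backward()
        optimizer.step()

    # Checkpointing: amp.state_dict() captures amp state (such as the
    # current loss scale) so a resumed run scales gradients identically.
    torch.save({"model": model.state_dict(),
                "optimizer": optimizer.state_dict(),
                "amp": amp.state_dict()},
               "checkpoint.pt")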

Distributed Training

  • apex.parallel
    • Utility functions
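
apex.parallel provides a DistributedDataParallel wrapper optimized for
NCCL-based, one-process-per-GPU training. A minimal sketch, assuming the
script is launched with torch.distributed.launch (which supplies
--local_rank) and that a toy model stands in for a real one:

    import argparse
    import torch
    from apex.parallel import DistributedDataParallel as DDP

    parser = argparse.ArgumentParser()
    parser.add_argument("--local_rank", type=int, default=0)
    args = parser.parse_args()

    # One process per GPU: bind this process to its device, then join
    # the process group via environment variables set by the launcher.
    torch.cuda.set_device(args.local_rank)
    torch.distributed.init_process_group(backend="nccl",
                                         init_method="env://")

    model = torch.nn.Linear(512, 512).cuda()
    # Unlike torch.nn.parallel.DistributedDataParallel, apex's wrapper
    # takes no device_ids argument; it uses the current CUDA device.
    model = DDP(model)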

Fused Optimizers

  • apex.optimizers
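
The fused optimizers are drop-in replacements for their torch.optim
counterparts that apply the parameter update in a single fused CUDA kernel.
A minimal sketch using FusedAdam (FusedLAMB, FusedNovoGrad, and FusedSGD
follow the same pattern; the toy model is illustrative):

    import torch
    from apex.optimizers import FusedAdam

    model = torch.nn.Linear(512, 512).cuda()
    # Constructed like torch.optim.Adam.
    optimizer = FusedAdam(model.parameters(), lr=1e-3)

    x = torch.randn(64, 512, device="cuda")
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()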

Fused Layer Norm

  • apex.normalization.fused_layer_norm
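
apex.normalization.FusedLayerNorm mirrors torch.nn.LayerNorm but runs as a
fused CUDA kernel. A minimal sketch (shapes are illustrative):

    import torch
    from apex.normalization import FusedLayerNorm

    # Same constructor arguments as torch.nn.LayerNorm; normalization
    # is over the trailing normalized_shape dimensions.
    layer_norm = FusedLayerNorm(normalized_shape=512).cuda()

    x = torch.randn(64, 512, device="cuda")
    y = layer_norm(x)  # same shape as x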

All modules for which code is available

  • apex.amp._amp_state
  • apex.amp.frontend
  • apex.amp.handle
  • apex.fp16_utils.fp16_optimizer
  • apex.fp16_utils.fp16util
  • apex.fp16_utils.loss_scaler
  • apex.normalization.fused_layer_norm
  • apex.optimizers.fused_adam
  • apex.optimizers.fused_lamb
  • apex.optimizers.fused_novograd
  • apex.optimizers.fused_sgd
  • apex.parallel
    • apex.parallel.distributed
    • apex.parallel.optimized_sync_batchnorm
