layer
LoRA (Low-Rank Adaptation) module implementation.
Classes
| Class | Description |
| --- | --- |
| LoRAModule | Base class for LoRA (Low-Rank Adaptation) modules. |
- class LoRAModule

  Bases: DynamicModule

  Base class for LoRA (Low-Rank Adaptation) modules.

  This module wraps existing layers and adds trainable low-rank decomposition matrices (LoRA adapters) whose outputs are added to the original layer's output, as sketched below.

  - _lora_adapters

    Dictionary mapping adapter names to their LoRA A and B matrices.
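  The following is a minimal, hypothetical sketch of the pattern this class implements: a wrapped base layer plus named low-rank adapter pairs whose outputs are added to the base output. The names `LoRASketch`, `lora_a`, `lora_b`, and the nested dictionary layout are assumptions for illustration, not the class's actual internals.

  ```python
  import torch
  import torch.nn as nn

  class LoRASketch(nn.Module):
      """Illustrative stand-in for LoRAModule, not the real implementation."""

      def __init__(self, base: nn.Linear, rank: int = 8, scale: float = 1.0):
          super().__init__()
          self.base = base  # the wrapped (typically frozen) layer
          self.scale = scale
          # One named adapter; each adapter is a low-rank pair: A projects
          # down to `rank` dims, B projects back up to the output dims.
          self._lora_adapters = nn.ModuleDict({
              "default": nn.ModuleDict({
                  "lora_a": nn.Linear(base.in_features, rank, bias=False),
                  "lora_b": nn.Linear(rank, base.out_features, bias=False),
              })
          })

      def forward(self, x: torch.Tensor) -> torch.Tensor:
          y = self.base(x)
          for adapter in self._lora_adapters.values():
              # Each adapter's contribution is added to the base output.
              y = y + self.scale * adapter["lora_b"](adapter["lora_a"](x))
          return y
  ```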
  - property adapter_names: set

    Return the set of all registered adapter names.
  - forward(x, *args, **kwargs)

    Forward pass with LoRA adaptation.

    Parameters:
    - x (Tensor) – Input tensor
    - *args – Additional positional arguments for the base layer
    - **kwargs – Additional keyword arguments for the base layer

    Returns:
    - Output from the base layer plus the active LoRA adaptations

    Return type:
    - Any
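    Continuing the hypothetical `LoRASketch` above (not the real class), a call behaves like the base layer with each adapter's low-rank contribution added on top:

    ```python
    layer = LoRASketch(nn.Linear(16, 32), rank=4, scale=0.5)
    x = torch.randn(2, 16)
    y = layer(x)  # = layer.base(x) + 0.5 * B(A(x)) for the "default" adapter
    print(y.shape)  # torch.Size([2, 32])
    ```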
 
  - abstract update_layer_lora(adapter_name, attr_config)

    Create and register a new LoRA adapter.

    This method must be implemented by subclasses to create the appropriate LoRA A and B matrices for the specific layer type.

    Parameters:
    - adapter_name (str) – Name for the new adapter
    - attr_config (PEFTAttributeConfig) – Configuration containing the rank, scale, and initialization settings

    Return type:
    - None
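    A subclass for linear layers might fulfill this contract roughly as follows. This is a sketch under assumed names (`LoRALinearSketch`, `lora_a`, `lora_b`, and the `attr_config.rank` field), not the library's actual implementation; it also omits per-adapter scale handling for brevity.

    ```python
    class LoRALinearSketch(LoRASketch):
        """Hypothetical subclass showing one way to implement the hook."""

        def update_layer_lora(self, adapter_name, attr_config) -> None:
            # Build the low-rank A/B pair sized by the configured rank.
            lora_a = nn.Linear(self.base.in_features, attr_config.rank, bias=False)
            lora_b = nn.Linear(attr_config.rank, self.base.out_features, bias=False)
            # Common LoRA initialization: random A, zeroed B, so a freshly
            # registered adapter leaves the layer's output unchanged.
            nn.init.kaiming_uniform_(lora_a.weight)
            nn.init.zeros_(lora_b.weight)
            # Register the pair under the given adapter name.
            self._lora_adapters[adapter_name] = nn.ModuleDict(
                {"lora_a": lora_a, "lora_b": lora_b}
            )
    ```

    Zero-initializing B is the standard LoRA choice: it guarantees the adapter contributes nothing at creation time, so adding an adapter never perturbs the wrapped layer's behavior until training updates the weights.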