plugins
Plugins for sparse attention integration with various frameworks.
Functions
register_sparse_attention_on_the_fly(model) | Dynamically register sparse attention for any model.
- register_sparse_attention_on_the_fly(model)
Dynamically register sparse attention for any model.
This function automatically detects the attention modules in the given model and registers them for sparse attention optimization; a usage sketch follows the parameter list below.
- Parameters:
model (Module) – The model whose attention modules should be registered for sparse attention.
- Returns:
True if any attention modules were registered, False otherwise.
- Return type:
bool
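A minimal usage sketch, assuming the function is importable from this plugins module and using a Hugging Face model purely for illustration; the exact import path and the set of supported attention modules depend on your installation.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed import path for the function documented above; adjust to
# match where the plugins module lives in your installation.
from plugins import register_sparse_attention_on_the_fly

# Any model with attention modules can be passed; GPT-2 is used here
# only as a convenient example.
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Detect attention modules in the model and register them for
# sparse attention optimization.
registered = register_sparse_attention_on_the_fly(model)

if not registered:
    print("No attention modules were registered for sparse attention.")
```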