attention

Quantization-specific pieces of the attention kernels. This is a placeholder for the combined sparse + quantized attention path.
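As a rough illustration of the quantized side of such a kernel (the sparse part is not sketched here), the snippet below shows attention over per-row symmetric int8-quantized K and V that are dequantized on the fly. All names (`quantize_int8`, `attention_with_quantized_kv`) are hypothetical, not part of this repo; this is a minimal NumPy sketch, not the actual kernel implementation.

```python
import numpy as np

def quantize_int8(x):
    # Hypothetical helper: per-row symmetric int8 quantization.
    # Each row is scaled so its max |value| maps to 127.
    scale = np.abs(x).max(axis=-1, keepdims=True) / 127.0
    scale = np.where(scale == 0.0, 1.0, scale)  # avoid divide-by-zero on all-zero rows
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def attention_with_quantized_kv(q, k, v):
    # Hypothetical sketch: K and V are stored as int8 + per-row scales,
    # then dequantized to float32 just before the attention matmuls.
    k_q, k_scale = quantize_int8(k)
    v_q, v_scale = quantize_int8(v)
    k_deq = k_q.astype(np.float32) * k_scale
    v_deq = v_q.astype(np.float32) * v_scale
    scores = (q @ k_deq.T) / np.sqrt(np.float32(q.shape[-1]))
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v_deq
```

A real fused kernel would keep K/V in int8 in memory and fold the scales into the matmuls; the combined sparse + quant path would additionally skip masked-out key blocks.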