Linear Attention Approximations
Explore linear-complexity attention mechanisms, including Performer and Linformer, efficient transformer variants that scale to very long sequences.
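The core idea behind these methods is that if the softmax similarity is replaced by a kernel with feature map phi, attention can be rewritten as phi(Q) (phi(K)^T V), which costs O(N d^2) instead of the O(N^2 d) of standard attention. A minimal sketch, assuming the elu(x) + 1 feature map (one common choice from the linear-attention literature; Performer instead uses random Fourier/positive features to approximate softmax):

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized linear attention sketch.

    With a positive feature map phi, attention becomes
    phi(Q) @ (phi(K).T @ V), computed in O(N * d^2) rather than
    the O(N^2 * d) of softmax attention.
    """
    def phi(x):
        # elu(x) + 1: a simple positive feature map (an assumed choice,
        # not the only one used by these methods)
        return np.where(x > 0, x + 1.0, np.exp(x))

    Qp, Kp = phi(Q), phi(K)        # (N, d) feature-mapped queries/keys
    kv = Kp.T @ V                  # (d, d_v) summary of all keys/values
    z = Qp @ Kp.sum(axis=0)        # (N,) per-query normalizer
    return (Qp @ kv) / z[:, None]  # (N, d_v) attention output

rng = np.random.default_rng(0)
N, d = 512, 64
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)  # (512, 64)
```

Note that the (d, d_v) key-value summary is independent of sequence length, which is exactly what lets these mechanisms handle very long sequences; Linformer achieves a similar effect differently, by projecting keys and values down to a fixed length.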