State-space LLMs: Do we need Attention?
https://www.interconnects.ai/p/llms-beyond-attention
HiPPO
HiPPO: Recurrent Memory with Optimal Polynomial Projections
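The LegS variant of HiPPO gives a closed-form state matrix for compressing a sequence onto Legendre polynomials. A minimal sketch of that construction, assuming the (A, B) entries from the paper above; the function name is mine:

```python
import numpy as np

def hippo_legs(N):
    # HiPPO-LegS (A, B) for state dimension N; entries per Gu et al., 2020.
    A = np.zeros((N, N))
    for n in range(N):
        for k in range(N):
            if n > k:
                A[n, k] = np.sqrt(2 * n + 1) * np.sqrt(2 * k + 1)
            elif n == k:
                A[n, k] = n + 1
    B = np.sqrt(2 * np.arange(N) + 1).reshape(N, 1)
    # The memory ODE applies the negation: c'(t) = -(1/t) A c(t) + (1/t) B f(t)
    return A, B
```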
LSSL
Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers
S4
Efficiently Modeling Long Sequences with Structured State Spaces
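A discretized state-space layer can be applied to a whole sequence as one long convolution; the kernel below is materialized naively in O(L·N²), and S4's contribution is computing it cheaply via a structured parameterization of A. A sketch under that reading; names are mine:

```python
import numpy as np

def ssm_conv_kernel(A_bar, B_bar, C, L):
    # Unroll x_k = A_bar x_{k-1} + B_bar u_k, y_k = C x_k into the kernel
    # K = (C B_bar, C A_bar B_bar, ..., C A_bar^{L-1} B_bar).
    K = np.zeros(L)
    x = B_bar.copy()            # holds A_bar^l B_bar
    for l in range(L):
        K[l] = (C @ x).item()
        x = A_bar @ x
    return K
```

Applying the layer is then a causal convolution, e.g. `np.convolve(u, K)[:len(u)]`.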
Mamba
Mamba: Linear-Time Sequence Modeling with Selective State Spaces
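Mamba's selectivity makes B, C, and the step size Δ functions of the input token, so the layer can no longer be a fixed convolution and is computed as a scan instead. A minimal sequential sketch of that recurrence (the paper uses a hardware-aware parallel scan; all names here are mine):

```python
import numpy as np

def selective_scan(u, A, Bs, Cs, deltas):
    # u: (L,) input; A: (N,) diagonal state matrix;
    # Bs, Cs: (L, N) input-dependent B_t, C_t; deltas: (L,) input-dependent steps.
    L, N = Bs.shape
    x = np.zeros(N)
    y = np.zeros(L)
    for t in range(L):
        A_bar = np.exp(deltas[t] * A)   # zero-order-hold discretization of A
        B_bar = deltas[t] * Bs[t]       # simplified (Euler) discretization of B
        x = A_bar * x + B_bar * u[t]    # elementwise, since A is diagonal
        y[t] = Cs[t] @ x
    return y
```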
Legendre
Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
Long-range dependence
https://en.wikipedia.org/wiki/Long-range_dependence
CTM: continuous-time model
Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
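The continuous-time view shared by LMU-style models and the SSM papers above is the standard linear state-space system (the feedthrough term D u(t) is often dropped or treated as a skip connection):

$$
x'(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t) + D\,u(t)
$$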
SSM: the fundamental state-space model
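To run on discrete tokens, the continuous (A, B) are discretized for a step size Δ; LSSL/S4 use the bilinear (Tustin) transform. A sketch, assuming dense matrices; the helper name is mine:

```python
import numpy as np

def discretize_bilinear(A, B, dt):
    # A_bar = (I - dt/2 A)^{-1} (I + dt/2 A);  B_bar = (I - dt/2 A)^{-1} dt B
    I = np.eye(A.shape[0])
    inv = np.linalg.inv(I - (dt / 2) * A)
    return inv @ (I + (dt / 2) * A), inv @ (dt * B)
```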
LRA
Long Range Arena: A Benchmark for Efficient Transformers
