Gated DeltaNet: The “Surgical Eraser” Solving Linear Attention’s Memory Problem

7 min read · 23 hours ago


The “Infinite Floor” vs. The “Whiteboard”

Imagine a librarian (the Model) trying to answer questions based on a massive stack of books (the Context).

  • Standard Transformers (Attention) are like a librarian who lays every single page out on an infinite floor. To answer a question, they look at every page simultaneously. It’s perfect, but as the books pile up, the floor runs out of space, and the librarian collapses from exhaustion (Quadratic Complexity O(L²)).
  • RNNs and Mamba (State Space Models) are like a librarian with a single small whiteboard. They read one page, scribble some notes, erase a little, and move to the next. It’s incredibly fast (Linear Complexity O(L)...
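
To ground the analogy, here is a minimal NumPy sketch of the two regimes (an illustration of mine, not code from the article or the Gated DeltaNet paper): full softmax attention builds the L × L "floor", while a gated delta-rule recurrence keeps only a small d × d "whiteboard" state S and, per token, fades it with a gate α_t, surgically erases the slot addressed by the current key, and writes the new value: S_t = S_{t−1}(α_t(I − β_t k_t k_tᵀ)) + β_t v_t k_tᵀ. Function names, shapes, gate values, and the random inputs are assumptions for illustration; causal masking and normalization details are omitted.

```python
import numpy as np

# Illustrative sketch only -- names, shapes, and inputs are hypothetical,
# not code from the article or the Gated DeltaNet paper.

def softmax_attention(Q, K, V):
    """The 'infinite floor': every query scores every key, materializing
    an L x L matrix, so compute and memory grow as O(L^2).
    (Causal masking omitted for brevity.)"""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])                   # (L, L)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                        # (L, d)

def gated_delta_rule(Q, K, V, alpha, beta):
    """The 'whiteboard' with a surgical eraser: one d x d state S,
    updated once per token, so compute grows as O(L).
    Per-token update (gated delta rule):
        S_t = S_{t-1} (alpha_t (I - beta_t k_t k_t^T)) + beta_t v_t k_t^T
    alpha_t gently fades the whole board; the k_t k_t^T term erases only
    the slot addressed by key k_t before the new value is written."""
    L, d = Q.shape
    S = np.zeros((d, d))
    I = np.eye(d)
    out = np.zeros((L, d))
    for t in range(L):
        q, k, v = Q[t:t+1], K[t:t+1], V[t:t+1]                # (1, d) rows
        S = S @ (alpha[t] * (I - beta[t] * (k.T @ k))) + beta[t] * (v.T @ k)
        out[t] = (S @ q.T).ravel()                            # read the board
    return out

# Toy usage with random projections and per-token gates.
L, d = 6, 4
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, L, d))
alpha = rng.uniform(0.9, 1.0, size=L)   # forget gate (how much to fade the board)
beta = rng.uniform(0.0, 1.0, size=L)    # write strength / erase intensity
print(softmax_attention(Q, K, V).shape, gated_delta_rule(Q, K, V, alpha, beta).shape)
```

The point of the contrast: the first function's cost scales with the square of the context length, while the second touches only a fixed-size state per token, which is why the whiteboard never "runs out of floor" no matter how many books arrive.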
