MIT's Recursive Language Models Just Killed Context Limits
pub.towardsai.net·3d
🔧DSPy

MIT Just Killed the Context Window

Recursive Language Models Are the Future

15 min read

The End of “Context Rot” and the Dawn of Truly Infinite AI Reasoning


A comparison of GPT-5 and a corresponding RLM on three long-context tasks of increasing complexity: S-NIAH (find a single fact hidden in a huge pile of text), OOLONG (understand a complicated story where all the details connect), and OOLONG-Pairs (compare two different complicated stories with each other). For each task, input length scales from 2¹³ to 2¹⁸. GPT-5 performance degrades significantly as a function of both input length and task complexity, while the RLM maintains strong performance. Inputs beyond the red region do not fit in GPT-5’s…
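
The comparison above hinges on one idea the title gestures at: rather than pushing the entire input through a single context window, a recursive language model can query itself over pieces of the input and then reason over the partial results. Below is a minimal Python sketch of that divide-and-recurse pattern, offered only as an illustration and not as the MIT paper's actual method; `call_model`, `CONTEXT_BUDGET`, and the simple half-split strategy are all assumptions for the example.

```python
# Hypothetical sketch of recursive long-context querying.
# `call_model` stands in for any LLM API call; it is NOT the paper's method.

def call_model(prompt: str) -> str:
    """Placeholder for an LLM call. A real implementation would hit an API."""
    return f"[model answer based on {len(prompt)} chars of prompt]"

CONTEXT_BUDGET = 8_000  # assumed size (in characters) any single call may see

def recursive_query(question: str, context: str) -> str:
    # Base case: the context fits in one call, so ask the model directly.
    if len(context) <= CONTEXT_BUDGET:
        return call_model(f"Context:\n{context}\n\nQuestion: {question}")

    # Recursive case: split the context in half and query each half.
    mid = len(context) // 2
    left = recursive_query(question, context[:mid])
    right = recursive_query(question, context[mid:])

    # Combine the two partial answers with one more bounded call
    # (assumed to stay within the budget, since answers are short).
    return call_model(
        f"Partial answer A: {left}\n"
        f"Partial answer B: {right}\n"
        f"Question: {question}\n"
        "Combine these into a single final answer."
    )

if __name__ == "__main__":
    huge_text = "lorem ipsum " * 10_000  # ~120k characters, far past the budget
    print(recursive_query("Where is the hidden fact?", huge_text))
```

The point of the sketch is only that each individual model call stays bounded no matter how long the input grows, which is the property the figure attributes to the RLM as input length climbs from 2¹³ to 2¹⁸.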
