Published on February 4, 2026 12:41 AM GMT

I’ve been reading through Toby Ord’s recent sequence on AI scaling. General notes come first, then my thoughts.

Notes

  • The Scaling Paradox basically argues that the scaling laws are pretty bad: each fixed gain in performance requires a multiplicative increase in compute, so progress will hit a wall fairly quickly unless the next generation or two of models somehow speeds up AI research, we find a new scaling paradigm, and so on (a toy sketch of this shape follows these notes).
  • Inference Scaling and the Log X Chart says that inference-time scaling is also not a big deal, because the returns are again logarithmic in compute. My intuition here is th...
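
To make the shape of that claim concrete, here is a minimal sketch of the kind of power-law scaling curve these posts describe. The functional form (loss falling as a power of compute) is the standard Chinchilla-style fit, but the constants a and alpha and the compute baseline are assumptions picked for illustration, not values from Ord's posts or any real fit.

```python
# A toy power-law scaling curve, to make the "logarithmic returns"
# point concrete. The form loss = a * C**-alpha matches the usual
# Chinchilla-style fits; the constants here are made up for
# illustration, not taken from Ord's posts or any real model.

def loss(compute: float, a: float = 10.0, alpha: float = 0.05) -> float:
    """Hypothetical training loss as a function of compute (FLOP)."""
    return a * compute ** -alpha

base = 1e21  # arbitrary compute baseline, in FLOP
for mult in [1, 10, 100, 1_000, 10_000]:
    print(f"{mult:>6}x compute -> loss {loss(base * mult):.3f}")

# Each 10x of compute multiplies the loss by the same constant factor
# (10**-alpha, roughly 0.89 here), so each further fixed-size gain
# costs ten times as much compute as the last one.
```

Under a curve like this, every additional order of magnitude of compute buys roughly the same increment of improvement, which is the "wall" both posts are pointing at once the exponential growth in inputs can no longer be sustained.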
