In the 1940s, physicist Richard Feynman famously raced an abacus expert through a series of calculations. The abacus handled basic arithmetic like addition and multiplication with ease. But when it came to division and cube roots, Feynman pulled ahead. The abacus relied on indirect, digit-by-digit methods like repeated subtraction and estimation for these operations, which made it slower and less precise.

Early computers had similar limitations. Though far faster than an abacus, they relied on fixed-point arithmetic, a scheme that assigns a fixed number of bits to the digits before and after the binary point. That fixes both range and precision in advance: every bit devoted to fractions is a bit taken away from the whole-number range, and vice versa. This made it difficult for them to handle tasks requiring both large and small values, like modeling weather patterns or simulating forces in physics…
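To make the tradeoff concrete, here is a minimal sketch of fixed-point encoding. The specific format (16-bit signed, with 8 bits reserved for the fraction) is an illustrative assumption, not any particular machine's design:

```python
# Illustrative 16-bit signed fixed-point format: 8 integer bits, 8 fractional bits.
# (An assumed format for demonstration, not a historical machine's layout.)
SCALE = 1 << 8  # 256 steps per unit: the smallest step is 1/256 ≈ 0.0039

def to_fixed(x: float) -> int:
    """Encode a real number as a 16-bit signed fixed-point integer."""
    raw = round(x * SCALE)
    if not -(1 << 15) <= raw < (1 << 15):
        raise OverflowError(f"{x} is outside the representable range (±128)")
    return raw

def to_float(raw: int) -> float:
    """Decode a 16-bit fixed-point integer back to a real number."""
    return raw / SCALE

# Values finer than the 1/256 step vanish entirely:
print(to_float(to_fixed(0.001)))   # rounds down to 0.0
# And the whole-number range tops out just below 128:
print(to_float(to_fixed(127.99)))  # still fits
# to_fixed(128.0) would raise OverflowError.
```

With only 16 bits, this format cannot hold 0.001 and 1,000,000 at the same time no matter where the point is placed; floating point solves exactly that by storing a separate exponent.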
