Quantum computing (QC) and AI have one thing in common: They make mistakes.

There are two keys to handling mistakes in QC. First, we've made tremendous progress in error correction over the last year. Second, QC focuses on problems where generating a solution is extremely difficult but verifying it is easy. Think about factoring a 2048-bit number (around 600 decimal digits) into its two prime factors. That's a problem that would take a classical computer an impractically long time, but a quantum computer can solve it quickly, though with a significant chance of an incorrect answer. So you have to test the result by multiplying the factors to see if you get the original number. Multiplying two 1024-bit numbers? Easy, very easy for a modern classical computer. And if the answer's wrong, the quantum computer tries again.

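Here's a minimal sketch of that check-and-retry pattern. The `quantum_factor` function is a made-up placeholder standing in for a noisy quantum routine (real hardware would run Shor's algorithm); the point is that the classical verification step is just one multiplication.

```python
import random


def quantum_factor(n):
    """Hypothetical stand-in for a quantum factoring routine.

    This is not a real quantum API; it simulates an answer that is
    sometimes wrong to illustrate the verify-and-retry loop.
    """
    # Placeholder: usually returns the correct factors of 15, sometimes not.
    return (3, 5) if random.random() > 0.3 else (3, 7)


def factor_with_verification(n, max_attempts=10):
    """Query the noisy factoring routine until the cheap classical check
    (multiplying the candidate factors) confirms the answer."""
    for attempt in range(1, max_attempts + 1):
        p, q = quantum_factor(n)
        if p * q == n:  # verification costs a single multiplication
            return p, q, attempt
    raise RuntimeError(f"no verified factorization of {n} after {max_attempts} attempts")


p, q, attempts = factor_with_verification(15)
print(f"15 = {p} * {q} (verified after {attempts} attempt(s))")
```

The asymmetry is the whole trick: generating the factors is the hard part, while checking a proposed answer is cheap enough that wrong answers cost almost nothing.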
One of the problems…
