Quantum Computing: A Primer

watch time: 28 minutes

One of the key insights of legendary physicist and Nobel laureate Richard Feynman was that quantum mechanics (the branch of physics that deals with subatomic particles, the uncertainty principle, and many other concepts beyond classical physics) is simply far too complicated to simulate efficiently on traditional computers.

Nature, of course, handles these complex calculations effortlessly; computers, however, can't do those same calculations (or would take a prohibitively long time and amount of resources to do so). But this isn't just about doing more with computers, faster (or smaller): it's about solving problems that we couldn't solve with traditional computers at all; it's a difference of kind, not just degree.

So what is a quantum computer, and what are "qubits" — especially as compared to a traditional computer and bits? What is Grover's algorithm? And beyond sheer processing speed, what are some of the new applications that wouldn't have been possible before? From how traditional computers work and how quantum computers will work to why this all matters, a16z Deal and Research team head Frank Chen walks us through the basics of quantum computing in this slide presentation. And even if you feel like you finally understand after watching this, just remember what Feynman once said: "If you think you understand quantum mechanics, then you don't understand quantum mechanics."
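Since the talk covers Grover's algorithm — quantum search that finds a marked item among N possibilities in roughly √N steps instead of N — here is a toy classical simulation sketching its core loop. This is only an illustrative sketch (the function name and parameters are made up, and we simply track state-vector amplitudes in plain Python; real hardware applies these steps as quantum gates):

```python
import math

def grover_search(n_qubits, marked):
    """Classically simulate Grover's search over N = 2**n_qubits items."""
    N = 2 ** n_qubits
    # Start in the uniform superposition: every index has amplitude 1/sqrt(N).
    amps = [1.0 / math.sqrt(N)] * N
    # Roughly (pi/4) * sqrt(N) iterations maximize the marked amplitude.
    iterations = math.floor(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion: reflect every amplitude about the mean
        # ("inversion about the mean").
        mean = sum(amps) / N
        amps = [2 * mean - a for a in amps]
    # Measurement probabilities are the squared amplitudes.
    return [a * a for a in amps]

probs = grover_search(3, marked=5)
best = max(range(len(probs)), key=probs.__getitem__)
print(best, round(probs[best], 3))  # the marked item dominates: 5 0.945
```

With 8 items, just 2 iterations concentrate about 95% of the measurement probability on the marked index — the quadratic speedup over checking items one by one that the talk refers to.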


some sources and recommended further reading:

Quantum Computing 101 — University of Waterloo Institute for Quantum Computing
Quantum Computing Since Democritus — Scott Aaronson
Quantum category — Scott Aaronson's blog
Quantum Computing entry — Wikipedia
Grover’s Quantum Search Algorithm — Craig Gidney, Twisted Oak Studios