Chapter 17. Q
Quantum computing
Quantum computing is where quantum mechanics and computing meet. Quantum refers to really, really tiny things: the tiniest things, atomic and subatomic particles. Physicists tell me that the physics of these really tiny things is strange: the laws of physics that apply when you drop a tennis ball down a flight of stairs don’t exactly apply to these really, really tiny things. The study of how they work is called quantum mechanics.
The data our computers process is all binary. Even when you write in a higher-level programming language with human-readable words, it all eventually gets converted to 1s and 0s in the CPU. Each “1” or “0” is called a bit, and bits are how the electronic signals in our computers behave.
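To make that concrete, here’s a minimal Python sketch (my own illustration, not from the book) that prints the 8-bit pattern behind each character of a string:

```python
# A toy illustration: every character a program handles is ultimately
# stored as a pattern of bits. This prints each byte of "Qubit" in binary.
text = "Qubit"
for byte in text.encode("ascii"):
    print(f"{chr(byte)!r}: {byte:3d} -> {byte:08b}")  # e.g. 'Q':  81 -> 01010001
```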
Quantum computers are designed to leverage the properties of quantum mechanics. So, instead of using bits, they use qubits, or quantum bits. Because of a quantum phenomenon called superposition, a qubit can be “1,” “0,” or both “1” and “0” simultaneously. It’s mind-blowing! And when a unit of computer data can be 1 and 0 simultaneously, it greatly increases how much math a computer can do at any given time.
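Here’s a rough Python sketch (my own, under the simplifying assumption that a single qubit can be modeled as a two-element vector of complex amplitudes) of what superposition looks like in the math:

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes: one for |0>, one
# for |1>. A Hadamard gate turns a definite |0> into an equal superposition.
ket0 = np.array([1, 0], dtype=complex)            # the |0> state: definitely "0"
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = hadamard @ ket0                           # equal parts |0> and |1>
probabilities = np.abs(state) ** 2                # Born rule: |amplitude|^2
print(probabilities)                              # [0.5 0.5] -- 50/50 until measured
```

Measuring the qubit collapses it to a plain 0 or 1, with the probabilities printed above.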
Quantum computers have been in research and development ever since physicist Richard Feynman proposed the concept in 1982. In 2017, IBM launched some of the first commercially available quantum computers through its IBM Q initiative. Microsoft also has a quantum computing service, Azure Quantum. As of 2023, quantum computing is still experimental and accessible only for enterprise ...