From Barron's December 13:
Think AI Is Baffling? Here’s How to Pretend You Understand Quantum Computing.
Google introduced a new quantum computing chip this week that is faster than a supercomputer by the age of the universe. It may be the next big thing, so you need to know at least a few key concepts.
What an embarrassing week to be a supercomputer. These are machines that connect many processors to work on complicated jobs, like modeling weather or conjuring a koala-in-a-top-hat emoji. Going fast is kind of their whole thing. But a software company just dunked on the entire industry. Google said that it has a new “quantum computing” chip called Willow that can perform a calculation in less than five minutes that would take today’s fastest supercomputers longer than the age of the universe.
“Our logical qubits now operate below the critical quantum error correction threshold,” explained a company scientist in a windbreaker and sneakers who I’m pretty sure can bend metal with his mind. It’s a significant development in terms of chained geometric equity progression, meaning that whatever that guy was talking about made shares of parent Alphabet go up more than 5% on both Tuesday and Wednesday.
My first thought was whether quantum computing’s potential for good in fields like drug discovery will outweigh its dangers in code breaking and cyberwarfare. But my even first-er thought was whether there’s an exchange-traded fund. It turns out that the Defiance Quantum ETF (ticker: QTUM) is up 38% this year, versus 29% for the Global X Artificial Intelligence & Technology ETF (AIQ), meaning that quantum computing is suddenly even hotter than AI.
What’s that, you say? Quantum computing and AI aren’t alternatives but rather complements, and to suggest otherwise raises questions about my quantum computing bona fides? Stop trying to blow my cover. Quantum computing is already popping up in water cooler conversations across ordinary offices, which means you need to understand it in a hurry—only I’ve tried, and the stuff is impenetrable. What you really need is a guide to pretending to understand quantum computing. And there I think I can help.
Step 1 is to explain to no doubt impressed onlookers that classical computing is based on bits, which are stored in two possible states, 0 or 1. “That’s why we call them bits,” be sure to say. “It’s a portmanteau for binary digits.” Pause here for admiring nods.
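If an onlooker asks for a demonstration, a short Python sketch (my addition, not Barron's) makes the point: a bit is exactly one of two values, and n of them can label 2**n distinct states, one at a time.

```python
# A classical bit holds exactly one of two values.
bit = 1
assert bit in (0, 1)

# Three bits can label 2**3 = 8 distinct states -- but a classical
# register is in exactly one of them at any given moment.
n = 3
states = [format(i, f"0{n}b") for i in range(2 ** n)]
print(states)  # ['000', '001', '010', '011', '100', '101', '110', '111']
```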
Next, explain that quantum computing is based on what are called qubits. “That’s with a q,” make clear, or else your colleagues will picture cubits, and Noah’s Ark—we’re going in a totally different direction. A qubit is a basic unit of quantum information that can exist in what’s called a superposition of multiple possible states. It’s difficult to understand this because it comes from quantum mechanics, or how subatomic particles behave, which is totally different from how the objects around us behave.
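For the rare colleague who wants substance, here is a toy sketch, mine rather than Barron's or Google's, of a single qubit as a pair of complex amplitudes. The probabilities of measuring 0 or 1 are the squared magnitudes of those amplitudes; the equal-superposition values below are just one illustrative choice.

```python
import numpy as np

# A single qubit as two complex amplitudes over the basis states |0> and |1>.
alpha = beta = 1 / np.sqrt(2)  # equal superposition (illustrative choice)
qubit = np.array([alpha, beta], dtype=complex)

# Amplitudes must be normalized: |alpha|^2 + |beta|^2 = 1.
assert np.isclose(np.sum(np.abs(qubit) ** 2), 1.0)

# Measurement collapses the superposition to 0 or 1 with these probabilities.
probs = np.abs(qubit) ** 2
outcome = np.random.choice([0, 1], p=probs)
print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}, measured: {outcome}")
```

This two-amplitude picture is the standard starting point; the hard part, and the reason that scientist was talking about error-correction thresholds, is keeping many such amplitudes coherent at once.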
In quantum mechanics, particles can be in two states at the same time until they’re observed. There was an exasperated Austrian called Schrödinger who tried to explain this to normals in terms of a make-believe cat trapped in a box with an unstable radioactive material that may or may not decay, triggering a Geiger counter that releases poison. You’re not sure whether the cat is dead until you open the box; or, as quantum mechanics would have it, the cat is both alive and dead, in a superposition of the two, with a certain probability of each. For me, this only raises more questions, including whether Schrödinger had pets.
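The box-opening business can at least be caricatured on an ordinary computer. The Monte Carlo toy below is my own classical stand-in, not actual quantum mechanics, and the 50% decay probability is an assumption of the thought experiment, not a measured value.

```python
import random

P_DECAY = 0.5  # assumed chance the atom has decayed by the time you look

def open_box(p=P_DECAY):
    """'Observing' the box resolves the uncertainty into one definite outcome."""
    return "dead" if random.random() < p else "alive"

# Each individual box yields a definite cat; only the long-run
# frequencies reflect the underlying probabilities.
trials = 10_000
dead = sum(open_box() == "dead" for _ in range(trials))
print(f"dead in {dead / trials:.1%} of {trials} boxes")
```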
Best to skip superposition when explaining quantum computing, but remember the term, along with entanglement, decoherence, and interference. If your colleagues chime in with any of these, it means they know more than you. Immediately fake a sneezing fit and leave the conversation....
....MUCH MORE
This week: