Quantum computers will take us beyond the binary age, into a perplexing new era. And they're already here
Each day, humans create 2.5 quintillion bytes of data. A byte is the amount of data needed by a computer to encode a single letter. A quintillion is one followed by 18 zeros. We float on an ocean of data.
You’d arrive at an even bigger number if you put it in terms of “bits”, the basic building block out of which every wonder of the digital age is built. A bit is simply a one or a zero or, equivalently, a single switch inside an electronic processor that must be either on or off. Put eight in a row and you have 256 possible combinations, enough to label and store every character on your keyboard; there are thus eight bits to the byte.
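If you want to see that in action, a few lines of Python make the point; the character “Q” below is just an arbitrary example, and any modern Python interpreter will do.

```python
# One keyboard character stored as one byte: eight ones and zeros.
character = "Q"                 # any key on the keyboard
code = ord(character)           # the character's numeric code
bits = format(code, "08b")      # that number written out as eight switches

print(character, "->", bits)    # Q -> 01010001
print(2 ** 8)                   # eight switches allow 256 patterns in all
```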
These days your newspapers, your tax records, your shopping list and perhaps your love life are nothing more than a long series of “ons” and “offs” generated by the digital processors that lurk in your phone, your car, or your TV. The correct sequence of ones and zeros is all that computers need in order to control the traffic lights at the end of your street, run a nuclear power station, or find you a date for next Friday night. From one perspective, they are simply doing—on a vast scale—the tallying and reckoning we have always done on our fingers: on our digits.
The “digital age” is a colossal achievement of human ingenuity. But this world of ones and zeros is not an end state. Humankind has passed through other ages before: bronze, iron, the era of steam and then of the telegraph, each of which constituted a revolution, before being brought to a close by some further advance of human ingenuity. And that raises a question—if our present digital age will pass just like all the rest, what might come after it?
We are starting to see the answer to that question, and it looks as though the successor to the age of the digital computer will be a startlingly new kind of device—the quantum computer.
In 1981, Richard Feynman, the Nobel prize-winning physicist from the California Institute of Technology, presented a paper with the title “Simulating Physics with Computers.” “What kind of computer are we going to use to simulate physics?” Feynman asked, and he chased that first question with a second: “what kind of physics are we going to imitate?” The answer to that came clear as a bell. “Nature isn’t classical, dammit,” said Feynman, “and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem because it doesn’t look so easy.”
Feynman had it dead right. What he was proposing was not easy. Instead of a computer that ran according to the laws of classical physics, as all conventional computers do, he was proposing a computer that ran according to the most advanced picture of the physical world known to science: quantum mechanics. Feynman was putting forward the idea of a machine built on a completely different set of scientific principles. It was a stunning suggestion. The laws of quantum mechanics relate to the behaviour of subatomic particles and packets of energy. The idea that quantum mechanical states could be harnessed and somehow used for computation was deeply provocative.
A quantum computer would work in a completely different way to the classical kind. Instead of “bits”, it would use “qubits”, that is, quantum bits. Feynman proposed that a machine of this sort would allow scientists to model quantum states and gain new insights into the behaviour of atoms and particles. But there were possibilities beyond pure science. For certain kinds of problem, quantum computers would be able to carry out calculations at many times the speed of traditional computers. Not only that, they might be able to do things that a conventional computer could not do at all.
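For readers who like to see the machinery, a tiny sketch in Python gives the rough idea of what a qubit is; the particular numbers below are illustrative rather than anything Feynman wrote down.

```python
# A classical bit is either 0 or 1. A qubit is described by two complex
# numbers, its "amplitudes", and can hold a blend of both values at once.
alpha = beta = 2 ** -0.5          # an equal blend of 0 and 1

# When the qubit is read out, the squared amplitudes give the odds of
# seeing a 0 or a 1; together they must add up to 1.
print(abs(alpha) ** 2, abs(beta) ** 2)   # roughly 0.5 and 0.5

# Describing n qubits takes 2**n amplitudes, which is why even a modest
# register of qubits quickly outruns what a classical machine can track.
print(2 ** 50)                           # over a quadrillion amplitudes for 50 qubits
```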
All of which would have struck Feynman’s 1981 audience as pretty far-out. Even now the idea of a quantum computer has a tang of science fiction about it. Which it should not, because quantum computers already exist. You can go online and use one right now. In May 2016, IBM debuted its “Quantum Experience,” which allows users to access a quantum system through a cloud application and run algorithms and experiments. In the summer of 2017, IBM upgraded the processor behind the application and in November announced plans for an even more powerful device.
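To give a flavour of what “running an experiment” on such a machine involves, here is a minimal sketch written with Qiskit, IBM’s open-source programming toolkit; the toolkit and the particular two-qubit circuit are assumptions of the sketch rather than details from IBM’s announcement.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# A tiny two-qubit circuit of the sort the Quantum Experience lets you submit.
circuit = QuantumCircuit(2)
circuit.h(0)       # put the first qubit into an equal superposition of 0 and 1
circuit.cx(0, 1)   # entangle the second qubit with the first

# For two qubits an ordinary computer can still work out the resulting state;
# for many more it cannot, which is the whole point of building the real thing.
print(Statevector.from_instruction(circuit))
```

The printed state is the famous “Bell state”, an entangled pair in which the two qubits can no longer be described independently of one another.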
IBM is not alone. Google is currently experimenting with an even more powerful quantum chip, and has plans to upgrade it further. In April 2017, a number of Google’s senior researchers released a paper called “Characterizing Quantum Supremacy in Near-Term Devices.” In that abstruse-sounding title, the phrase “quantum supremacy” is the most significant. It denotes the moment when a quantum computer can perform a task that no classical computer could complete in any feasible amount of time. The paper’s authors, who include Hartmut Neven, Engineering Director at Google and the founder and manager of its Quantum Artificial Intelligence Lab, wrote that “quantum supremacy can be achieved in the near-term.”
The potential of quantum computing technology is enormous, and billions of dollars are being poured into research by companies, including not only Google and IBM but also Facebook and Microsoft; by universities in the US, UK, Australia and elsewhere; and by the Chinese government (which has invested heavily in developing quantum communication systems). This brings with it a huge freight of complex challenges and questions. The most central question of all, aside from how you make one, is what a quantum computer would actually do. The answers to that are not straightforward, and involve negotiating a dense mash of computer science, physics, mathematics and philosophy.
Scott Aaronson is a Professor of Computer Science at the University of Texas at Austin. He is a leading authority on quantum computing and I spoke to him extensively in researching this article. “If you are interested in what is the ultimate limits of what we could compute or what we could know,” he said, “then in some sense you need to know something about quantum computing. Because this is the most powerful kind of computation based on the understood laws of physics.”
***

Humans have always looked at the heavens. The first Babylonian star catalogues date from 1200 BC. The Egyptians used astronomy to calculate the timing of the flooding of the Nile, and it was the Greek thinker Aristarchus of Samos who, in the third century BC, first suggested that the sun was at the centre of the solar system. Over a thousand years passed before that idea entered western science, when Copernicus made his pronouncements on the heliocentric model. In the seventeenth century, Isaac Newton set out the law of universal gravitation, an immense moment of intellectual progress which gave such a powerful picture of how the universe behaved that it remained broadly unchanged for nearly two hundred years....MUCH MORE