Today's conventional digital computers are limited by their inability to process data that isn't broken down into binary language. A circuit stores an information "bit" by either applying a charge to a transistor or not, which the machine reads as a one or a zero. Continual advances in photolithography, the technology behind printing circuits on semiconductor wafers, have allowed the number of transistors on a chip, and with it computing power, to double roughly every two years, a trend known as Moore's Law.
Billions of transistors only a few nanometers across can now be packed onto a single chip, but experts believe the industry may be only a few years away from hitting a wall. "Shrink them any further, to the size of a few atoms perhaps, and they could begin exhibiting quantum mechanical properties of their own," warned Ford's expert in the field, Joydip Ghosh.
Moreover, digital computers are designed to evaluate each possible outcome one after the other until every calculation is completed with total certainty. This can take so long that by the time they finish, the data is already worthless, as in the case of live traffic conditions. The advent of multicore chips allows some operations to be performed at the same time, but in isolation from one another, which limits their usefulness.
Enter renowned physicist Richard Feynman, who in 1982 first proposed building a computer able to overlap multiple calculations at once, evaluating variables as part of an interconnected system in constant flux.
It has taken decades, but Google claims its Bristlecone chip features 72 quantum bits or "qubits," while smaller rival Rigetti plans to deploy a 128-qubit system this year. It may not sound like much, but a quantum mechanical principle allows each qubit to store information as both a one and a zero simultaneously. "Thanks to this effect known as superpositioning, with just 250 qubits you could store more bits of information than there are atoms in the universe," said IBM researcher Winfried Wilcke, who is the company's senior manager for nanoscale science and technology. "That is the enormous potential of a quantum computer."
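The exponential growth behind Wilcke's point is simple arithmetic: each added qubit doubles the number of basis states a register can hold in superposition. A minimal sketch in plain Python (no quantum hardware involved) makes the scaling concrete:

```python
# Each qubit doubles the number of basis states a quantum register can
# hold in superposition: n qubits span 2**n states.
def basis_states(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 2, 10, 50, 250):
    print(n, basis_states(n))

# 2**250 is a 76-digit number; describing such a state classically would
# take on the order of 2**250 complex amplitudes.
print(len(str(basis_states(250))))   # -> 76
```

Note that estimates of the number of atoms in the observable universe are themselves rough (commonly quoted figures span a few orders of magnitude around 10^80), so the comparison is an illustration of scale, not a precise count.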
But this is only one part of the equation. Physicists have discovered that pairs of particles can be "entangled," influencing each other even over vast distances. This same phenomenon can be applied to a qubit chip, forming the basis for researchers to track interactions between correlated inputs, regardless of whether these variables represent cars in a fleet or electrons in a lithium atom.
"Entanglement is the quantum mechanical engine that drives a machine's parallel computing -- only then can the exponential power of the qubit truly be harnessed," Ford's Ghosh said.
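As a toy illustration of the correlations Ghosh describes (not a model of how real hardware operates), a two-qubit Bell state can be simulated in plain Python: measuring either qubit yields a random result, but the two results always agree.

```python
import random
from math import sqrt

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
# Applying a Hadamard gate to the first qubit of |00> and then a CNOT
# yields the entangled Bell state (|00> + |11>) / sqrt(2):
h = 1 / sqrt(2)
state = [h, 0.0, 0.0, h]

def measure(state):
    """Sample both qubits according to the Born rule (|amplitude|^2)."""
    r, acc = random.random(), 0.0
    for idx, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return idx >> 1, idx & 1   # (first qubit, second qubit)
    return 1, 1

# Each qubit individually reads out 0 or 1 at random, yet the pair is
# always perfectly correlated: only (0, 0) and (1, 1) ever occur.
outcomes = [measure(state) for _ in range(1000)]
assert all(a == b for a, b in outcomes)
```

The sketch is a classical simulation, which is exactly the problem it illustrates: tracking the state vector of n entangled qubits classically requires 2^n amplitudes, while the quantum machine simply is that system.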
The world has finally caught up with Feynman's idea, but considerable technical challenges remain. To create the superposed state of both one and zero, qubits are often made from loops of superconducting metal such as niobium, which allow a current to flow in two opposite directions. The metal has to be cold before all electrical resistance is eliminated. Very cold. Dozens of cryogenic tubes deep-freeze the circuits before a liquefied helium isotope lowers the temperature further, to almost -273 degrees Celsius. To put that in perspective, more warmth has been detected in the far reaches of deep space. "Inside a quantum computer is probably the coldest point in the measured universe," said former D-Wave boss Bo Ewald.
That's not all, though. The qubits must also be placed in a vacuum to minimize environmental influences. Since cosmic rays can induce memory errors even in conventional computers, layer upon layer of steel-cage shielding isolates the qubits -- even from the effects of Earth's own magnetic field.
"Think of it as a series of Russian dolls protecting something that looks like the arm of a Terminator at the center," Ewald said. Nevertheless, the slightest interference can disrupt this delicate quantum state, known as coherence, and invalidate the results; the longer a computation runs, the more unreliable the data becomes.
A top priority for tech companies building these machines is improving error detection and correction rates, a problem largely solved for their digital cousins. IBM's Wilcke is confident that progress in prolonging system stability, and with it the integrity of the data, will continue: "Whereas coherence once lasted only nanoseconds, over time we've been able to extend this to microseconds. It's the new Moore's Law."
IBM is now researching a new form of qubit as part of a three-year contract with the U.S. Defense Department's advanced research division, DARPA. Governments are interested in part due to the risk these computers pose to cybersecurity.
Scientists predict they will soon be fast enough to crack the world's most sophisticated encryption algorithms.
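The concern can be made concrete with a toy example. Public-key schemes such as RSA rest on the difficulty of factoring large semiprimes: the classical trial-division approach sketched below takes time on the order of the square root of n, which is hopeless for the thousand-bit moduli used in practice, whereas Shor's quantum algorithm factors in polynomial time.

```python
# Toy illustration of the factoring problem behind RSA. Trial division
# works instantly on small numbers but scales as ~sqrt(n), making real
# RSA moduli (hundreds of digits) infeasible to factor classically.
# Shor's algorithm on a large, stable quantum computer would not be.
def factor(n: int) -> tuple[int, int]:
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1   # n is prime

print(factor(3233))   # -> (53, 61): a toy "RSA modulus"
```

The modulus 3233 here is purely illustrative; breaking a real 2048-bit key this way would take longer than the age of the universe on any classical machine.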