Quantum computing: Explained

The future of computing is closer than ever.

Only a few generations ago, computers the size of rooms sputtered and struggled to calculate simple sums from punched-tape programs. Today, the tiny transistors in your smartwatch are capable of more advanced computing than scientists of the 1930s could've ever dreamed. Now, computational science is on the verge of another revolution: quantum computers.

Quantum computers have long been the holy grail of computer science — these machines, in theory, would be capable of outperforming any classical computer. While we won't be popping down to the local Best Buy to purchase our own quantum computers anytime soon, advances in this technology are becoming more realistic every day. Companies like Google and IBM are muscling for quantum supremacy, and startups in Silicon Valley are already pumping out quantum algorithms by the thousands.

Science of Quantum Computing

At the center of quantum computing is quantum physics itself: strange, counterintuitive, and first worked out in the early 20th century.

Classical physics, the kind that describes how apples fall from trees, relies on exact, measurable quantities to make calculations — for example, the height of an apple tree branch and the weight of the apple. But quantum physics refuses to provide these kinds of quantities. Instead, quantum physical properties are described by probabilities and uncertainties. Usually, this would be a pain. But when it comes to eking out more computing power, it can actually come in handy.

Traditional computing measures information in bits that can represent either 0 or 1 and can be combined to create every display and click of your computer. But in quantum computing, even these binary bits of information aren't so certain. A quantum bit, known as a qubit, can represent 0, 1, or a state that exists between both. This is known in physics as a superposition.

Taking advantage of this quantum quirk, computations that would've required long strings of binary bits can be carried out with far fewer qubits. String multiple qubits together, and the computational possibilities start to skyrocket: each added qubit doubles the number of states the machine can represent at once.
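The two ideas above — a qubit as a blend of 0 and 1, and the doubling of possibilities with each added qubit — can be sketched in a few lines of Python. This is a toy bookkeeping model, not a real quantum simulator: a qubit is represented by two amplitudes whose squares give the measurement probabilities.

```python
import math

# Toy model of a single qubit: two amplitudes (a, b) with a^2 + b^2 = 1.
# Measuring the qubit yields 0 with probability a^2 and 1 with probability b^2.
a = b = 1 / math.sqrt(2)            # an equal superposition of 0 and 1
prob_zero, prob_one = a**2, b**2    # each comes out to 0.5

# n qubits together are described by 2**n amplitudes — the "skyrocketing"
# state space that quantum computers exploit.
def state_size(n_qubits: int) -> int:
    return 2 ** n_qubits

print(round(prob_zero, 2), round(prob_one, 2))   # 0.5 0.5
print(state_size(1))    # 2
print(state_size(10))   # 1024
print(state_size(53))   # 9007199254740992 — Google's Sycamore-scale state space
```

Nothing here runs any quantum algorithm; it only shows why simulating even a few dozen qubits classically becomes intractable, since the number of amplitudes grows exponentially.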

Why It Matters

Once out of their awkward adolescence, these machines will be particularly good at computations like factoring, searching, and modeling, and they will be able to do so at speeds far faster than any classical computer.

These features won’t necessarily be used to speed up your Google search, but they will help scientists make breakthroughs in fields like chemistry and pharmacology. Using quantum-speed chemical modeling, it may be possible to develop vaccine candidates in a number of days instead of years.

Quantum information processing will also change how our world is run. Experts believe that quantum computers will be able to easily crack open existing cryptography systems, exposing some of the world's most sensitive encrypted information. To future-proof these systems, scientists are already looking to develop quantum encryption that will take advantage of quantum characteristics to be impenetrable. Just as a spy might pop a cyanide pill rather than tell you government secrets, information encrypted by qubits will disappear as soon as someone attempts to tamper with it.

The Efforts

When it comes to actually designing a quantum computer, increasing your number of qubits is the name of the game. In recent years, IBM and Google have risen to the top of the pack with their supercooled, superconducting qubits. These qubits experience no electrical resistance (a property of superconducting materials), and because they're manufactured rather than naturally collected (as is the case for trapped-ion qubits), they can be more easily controlled.

IBM's computer currently operates 65 qubits and Google's 53 — enough for Google's machine to solve, in a matter of minutes, a thorny calculation that would take a classical computer 10,000 years. This milestone, which Google's Sycamore computer reached in 2019, is called "quantum supremacy."

Both companies have already announced plans to build 1,000-qubit computers in the coming decades, which could bring these already fast minutes-long calculations down to only fractions of a second. But even then, you're more likely to find these computers crunching large data at a server farm than juggling your internet tabs.

The Challenges

But part of what makes actually building a quantum computer like this difficult is that qubits, the secret weapon behind these calculations, are notoriously unstable. To make calculations, researchers have to hold these qubits very precariously in certain energy states, but vibration from highways or even ambient room temperature can be enough disturbance to send these qubits into a tizzy. Scientists call this “decoherence.”

One way to control for these disturbances is to keep these computers very close to absolute zero (-450 degrees Fahrenheit) in giant freezers. But this is incredibly costly. Developing so-called “warm” quantum computing — computers that could run at room temperature — could drastically save on costs and space for future computers.

Additionally, quantum computers right now have to dedicate a handful of their limited qubits to keeping track of accumulating errors caused by those disturbances. Developing more efficient ways to track or circumvent these errors can help return these sidelined qubits to the main game and, in turn, increase computing power.
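The trade-off described above — spending extra qubits to guard the ones doing the work — can be illustrated with the simplest classical error-correcting scheme, the three-bit repetition code. This is only a loose classical analogy (real quantum error correction is far more subtle, since qubits can't simply be copied), but the redundancy-for-reliability bargain is the same:

```python
from collections import Counter

# Classical analogy for error correction: encode one logical bit as three
# physical bits, then recover it by majority vote. A single flipped bit
# (a "disturbance") no longer destroys the stored information.
def encode(bit: int) -> list[int]:
    return [bit] * 3

def decode(bits: list[int]) -> int:
    # Majority vote: the most common value wins, correcting one flip.
    return Counter(bits).most_common(1)[0][0]

codeword = encode(1)       # [1, 1, 1]
codeword[0] ^= 1           # a disturbance flips one bit -> [0, 1, 1]
print(decode(codeword))    # 1 — the logical bit survives the error
```

The cost is visible in the code: three physical bits carry only one logical bit. Quantum error-correcting codes make an analogous sacrifice, which is why today's machines give up some of their scarce qubits to error tracking.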

The goalposts may seem far away now, but researchers are confident that quantum computing will one day have a huge impact on our lives.

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at [email protected]
