
Why Google’s quantum breakthrough is ‘truly remarkable’ – and what happens next

Google’s Willow quantum computing chip has 105 physical qubits crafted from superconducting “transmons,” a technique pioneered in 2007 at Yale University.

Google

Google’s quantum computing scientists this month demonstrated breakthroughs that reinforce the sense that quantum computing is for real and will find its place among other kinds of computers as a valuable resource. 

But much remains to be done: Google’s latest quantum chip, called Willow and fabricated in its Santa Barbara research facility, is a memory chip. It doesn’t actually process anything; it simply stores a bit to be read. Doing anything with it will involve the long work of developing logical circuits to make use of the “qubits” that make up the chip. 

Also: Agentic AI is the top strategic technology trend for 2025

The fundamental breakthrough, as explained in Nature magazine (which published Google’s early-release research paper), is to show that the errors of qubits can be reduced below a noise level called the threshold, and that, once they are, the machine can reliably represent information; that is, represent it with a tolerable level of error.

To understand, consider the basic proposition of today’s quantum hardware. To make any quantum “bit” of information, you have to combine multiple physical quantum bits, or qubits, which can be made of a variety of materials. Google’s Willow is a follow-on to its earlier chip Sycamore, and both use a superconducting form of capacitor, cooled to a fraction of a degree above absolute zero, called a “transmon,” developed at Yale University in 2007. 

Using transmons or other forms of physical qubits, researchers at numerous institutions, not just Google, have been making progress for years in combining multiple qubits to make a single “logical” qubit. A physical qubit holds its state only fleetingly, for mere microseconds, so its lifetime isn’t long enough for the “decoder” circuitry of the quantum machine to read its information.


The logical qubit, really a summary of all the physical qubits, can persist long enough, roughly double the lifetime of a physical qubit, for its value to be read and therefore put to use.

The challenge has been to suppress the errors that arise as the multiple qubits succumb to environmental noise. Too much noise and the logical qubit becomes useless. Error correction of various sorts has been in development for years, but Google’s breakthrough is the first to reduce the individual errors in the physical qubits below the level necessary to produce a workable logical qubit: the threshold. 
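
To build intuition for why a threshold exists at all, consider a classical analogy: a repetition code, in which one logical bit is stored as several noisy copies and read back by majority vote. The Python sketch below is purely illustrative; it is not Google’s surface-code scheme, and the error rates are placeholders chosen for the demonstration.

```python
# Toy classical analogy of error correction: one logical bit is stored as
# n noisy physical copies and read back by majority vote. This is NOT
# Google's surface code; it just shows why a threshold exists.
import random

def logical_error_rate(p_physical, n_copies, trials=100_000):
    """Estimate how often majority voting over n noisy copies misreads the bit."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p_physical for _ in range(n_copies))
        if flips > n_copies // 2:  # a majority of copies got corrupted
            errors += 1
    return errors / trials

for p in (0.1, 0.6):  # below vs. above this toy code's threshold of 0.5
    rates = [logical_error_rate(p, n) for n in (1, 3, 5, 7)]
    print(f"physical error {p}:", [f"{r:.4f}" for r in rates])
```

Run it and the pattern appears: with physical error below the toy threshold, adding copies drives the logical error down; above it, adding copies makes things worse. Quantum error correction obeys the same logic, with a far lower threshold (commonly cited as roughly 1% for surface codes).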


Key stats for Google’s Willow quantum chip.

Google

The key to Google’s Willow chip, which boosts the number of physical qubits to 105, is a variety of physical changes to the fabrication of the chip that lead to reduced noise in each physical qubit. The result is that “each time the code distance increases by two, the logical error per cycle is reduced by more than half,” as Google’s lead author, Rajeev Acharya, and collaborators write. 

That’s exciting because reliable logical qubits can be scaled; that is, more and more physical qubits can be added while keeping noise below the threshold level, and getting a predictably reliable logical qubit as a result. 
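
That scaling claim can be restated as a simple formula: if each increase of the code distance by two divides the logical error per cycle by some factor, errors are suppressed exponentially as the code grows. The sketch below is a back-of-the-envelope illustration; the starting error rate is an assumed placeholder, not a figure from Google’s paper, and the suppression factor is set to 2 to match the “reduced by more than half” quote.

```python
# Back-of-the-envelope model of exponential error suppression below threshold.
# If each +2 step in code distance divides the logical error per cycle by a
# factor Lambda, then eps(d) = eps(d0) / Lambda ** ((d - d0) / 2).
# The numbers below are illustrative assumptions, not figures from the paper.
LAMBDA = 2.0     # "the logical error per cycle is reduced by more than half"
EPS_D3 = 3e-3    # assumed placeholder: logical error per cycle at distance 3

def logical_error(distance, base_distance=3, base_eps=EPS_D3, lam=LAMBDA):
    return base_eps / lam ** ((distance - base_distance) / 2)

for d in (3, 5, 7, 9, 11):
    print(f"distance {d}: ~{logical_error(d):.1e} logical errors per cycle")
```

The exponent is what matters: each step in distance costs more physical qubits but buys a multiplicative reduction in error, which is why staying below threshold makes scaling worthwhile.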

Also: A buyer’s guide to quantum as a service: Qubits for hire

Why that’s profound: Scaling, the ability to assemble billions of transistors on a square of silicon to create increasingly powerful circuits, is the fundamental achievement of traditional computer chips. If you can now reliably scale physical qubits, you can see a path to creating logical qubit circuits in the same way, with increasing power and performance. 

The coverage of Google’s announcement about Willow in Nature magazine and its competitor Science magazine cites numerous experts in the field. The consensus, as Nature’s article states, is that this is “a truly remarkable breakthrough.”

Still, it’s worth keeping in mind how much work is yet to be done. For one thing, crossing the error threshold doesn’t mean the work of reducing errors is over. Error rates must now fall much further before logical qubits are accurate enough for practical use, as Acharya and the team note.

Also: What is quantum computing today? The how, why, and when of a paradigm shift

“Orders of magnitude remain between present logical error rates and the requirements for practical quantum computation,” they write. For example, “high-energy impact events” in the surrounding environment, occurring “approximately once every ten seconds,” can destroy reliable, error-free logical qubit operation.
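
A rough calculation shows why that once-every-ten-seconds figure matters. The cycle time used below is an assumption on the order of a microsecond, typical of superconducting surface-code experiments, not a number quoted in this article.

```python
# Rough scale: how many error-correction cycles fit between the
# "high-energy impact events" the paper describes. The cycle time below is
# an assumption (on the order of a microsecond, typical of superconducting
# surface-code experiments), not a figure quoted in this article.
CYCLE_TIME_S = 1e-6    # assumed error-correction cycle time
EVENT_PERIOD_S = 10.0  # "approximately once every ten seconds"

cycles = EVENT_PERIOD_S / CYCLE_TIME_S
print(f"~{cycles:.0e} clean cycles between disruptive events")  # ~1e+07
```

Ten million or so clean cycles sounds like a lot, but long computations can need far more, so such events must be detected and recovered from rather than simply waited out.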

In other words, the fact that logical qubits can be scaled, though a breakthrough, means researchers now face a long journey to actually perform that scaling: building larger chips (more transmons) and improving ways of detecting and mitigating logical errors. Google says it has a roadmap for how to get there.


Google’s quantum computing roadmap.

Google

Beyond the scaling issue, there’s the limitation of the current device type: It’s not yet a computer chip.

Willow’s logical qubit is the equivalent of a capacitor: it stores a bit. It doesn’t do anything with that bit, yet. It’s just a memory for holding information. To perform an operation, it will need to be extended to combinations of several logical qubits linked to form logical operations such as adding and multiplying. (The actual form of logical operations for a quantum processor may be rather more abstruse, but you get the idea.)
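
The gap between memory and processing can be made concrete with a tiny state-vector simulation. The NumPy sketch below is purely illustrative: it simulates ideal qubits on a laptop, whereas performing the same gates fault-tolerantly on error-corrected logical qubits is exactly the hard engineering problem still ahead.

```python
# Illustrative only: the gap between storing a qubit and operating on it,
# shown with an ideal state-vector simulation in NumPy.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)

# "Memory": hold a state and read it back; at the logical level, this is
# what Willow demonstrates.
stored = ket0.copy()

# "Processing": apply gates. A Hadamard followed by a CNOT entangles two
# qubits, the kind of operation future logical circuits will need.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

pair = np.kron(H @ ket0, ket0)  # put the first qubit into superposition
bell = CNOT @ pair              # entangled Bell state (|00> + |11>) / sqrt(2)
print(np.round(bell, 3))
```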

Also: What is quantum computing? Everything you need to know about the strange world of quantum computers

As John Preskill, a theoretical physicist at the California Institute of Technology in Pasadena, told Nature’s Davide Castelvecchi, “We want to do protected qubit operations, not just memory.” The Google team’s longstanding goal for doing real operations is a million-qubit chip: a million physical qubits, enough to yield logical qubits that can be strung together to form true circuits for true computation.

To get from here to actual circuits — collections of logical qubits that do something — is a very long journey, both for the actual chip and for the software that will ultimately make working quantum computers. Probably, today’s C++ and Python won’t be enough.

It’s certainly a breakthrough; just be mindful of all the work ahead.

