Google’s quantum computing scientists this month demonstrated breakthroughs that reinforce the sense that quantum computing is real and will eventually find its place as a valuable resource alongside other kinds of computers.
But much remains to be done: Google’s latest quantum chip, called Willow and fabricated in its Santa Barbara research facility, is a memory chip. It doesn’t actually process any functions; it simply stores a bit to be read. Doing anything with it will involve the long work of developing logical circuits to make use of the “qubits” that make up the chip.
The fundamental breakthrough, as explained in the journal Nature (which published Google’s early-release research paper), is to show that the errors of qubits can be reduced below a noise level called the threshold, and that once that happens, the machine can reliably represent information, that is, represent it with a tolerable level of error.
To understand, consider the basic proposition of today’s quantum hardware. To make any usable quantum “bit” of information, you have to combine multiple physical quantum bits, or qubits, which can be made of a variety of materials. Google’s Willow is a follow-on to Google’s earlier Sycamore chip, and both use a superconducting form of capacitor, cooled to near absolute zero, called a “transmon,” developed at Yale University nearly 20 years ago.
Using transmons or other forms of physical qubits, researchers at numerous institutions, not just Google, have for years been making progress in combining multiple physical qubits into a single “logical” qubit. A physical qubit holds its state for only a tiny fraction of a second, and so its lifetime isn’t long enough for the “decoder” circuitry of the quantum machine to read its information.
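To get a feel for why that fleeting lifetime matters, here is a minimal Python sketch of exponential decoherence; the lifetime constant used is an illustrative assumption, not a measured Willow figure:

```python
import math

# Toy model: a physical qubit's state survives roughly with probability
# exp(-t / T1), where T1 is its characteristic lifetime.
# T1 here is an illustrative assumption, not a measured Willow figure.
T1 = 100e-6  # assumed lifetime: 100 microseconds

for t_us in (10, 50, 100, 200, 500):
    t = t_us * 1e-6  # convert microseconds to seconds
    survival = math.exp(-t / T1)
    print(f"after {t_us:>3} microseconds: {survival:.1%} chance the state survives")
```

The point of the sketch: the longer the decoder circuitry takes to read the qubit, the more likely the information has already leaked away.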
The logical qubit, in effect a summary of all the physical qubits, can persist long enough, roughly double the lifespan of a physical qubit, for its value to be read out and therefore be useful.
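Google’s chips use an error-correcting scheme called the surface code, but the simplest toy version of the same idea, encoding one logical bit redundantly across many noisy physical bits and recovering it by majority vote, can be sketched in a few lines of Python; the copy counts and error rate below are illustrative assumptions:

```python
import random

def read_logical_bit(true_bit: int, n_physical: int, error_rate: float) -> int:
    """Toy repetition code: store one logical bit as n_physical noisy
    copies, then recover it by majority vote. (Willow uses the far more
    capable surface code; this only illustrates redundancy beating noise.)"""
    copies = [true_bit ^ (random.random() < error_rate) for _ in range(n_physical)]
    return int(sum(copies) > n_physical / 2)

random.seed(0)
TRIALS = 10_000
ERROR_RATE = 0.10  # assumed 10% chance each physical copy flips

for n in (1, 5, 25, 101):
    wrong = sum(read_logical_bit(1, n, ERROR_RATE) != 1 for _ in range(TRIALS))
    print(f"{n:>3} physical copies -> logical error rate ~ {wrong / TRIALS:.4f}")
```

Even in this crude toy, piling on more physical copies drives the logical error rate down, which is the redundancy principle behind the real machine.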
The challenge has been to suppress the errors that arise as the multiple qubits succumb to environmental noise; too much noise and the logical qubit becomes useless. Error correction of various sorts has been in development for years, but Google’s breakthrough is the first to reduce the individual errors in the physical qubits below the level necessary to produce a workable logical qubit: the threshold.
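A standard back-of-envelope scaling law captures what operating below the threshold buys you; the threshold value and physical error rates in this sketch are illustrative assumptions, not Google’s published figures:

```python
# Back-of-envelope surface-code scaling: logical error rate goes roughly
# as (p / p_th) ** ((d + 1) / 2), where p is the physical error rate,
# p_th the threshold, and d the code distance (bigger d = more qubits).
# The threshold and the p values below are illustrative assumptions.
P_TH = 0.01  # assumed threshold of about 1%

def logical_error_rate(p: float, d: int) -> float:
    return (p / P_TH) ** ((d + 1) / 2)

for p in (0.005, 0.02):  # one rate below the threshold, one above
    side = "below" if p < P_TH else "above"
    rates = ", ".join(f"d={d}: {logical_error_rate(p, d):.3f}" for d in (3, 5, 7))
    print(f"p={p:.3f} ({side} threshold) -> {rates}")
```

Below the threshold, each step up in code size multiplies the logical error rate down; above it, adding more qubits only compounds the noise. That is why pushing physical error rates under the threshold was the milestone worth chasing.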