Quantum's Progress
The DiVincenzo Criteria Revisited
It has been 26 years since David DiVincenzo proposed the criteria that now bear his name.
These criteria, published in 2000, remain the foundational checklist for building a viable quantum computer, but current systems are still in the noisy intermediate-scale quantum (NISQ) era (or, as I have been calling the latest stage, the “Frontier Era”; see the Quantum’s Business archives). Full achievement would require satisfying them at a scale where error-corrected, useful computations reliably outperform classical supercomputers, something leading roadmaps project for the late 2020s or beyond.
Image source: Brian Lenahan/Midjourney
Short answer: No company has fully achieved the DiVincenzo criteria for a practical, large-scale, fault-tolerant universal quantum computer as of March 2026.
David DiVincenzo Bio Since 2000
Since publishing the criteria, David P. DiVincenzo continued his influential career in quantum information science at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York. Having joined the company in the mid-1980s, he served as a research staff member and later in various management roles, including as manager of the Quantum Information Group. His February 2000 paper, “The Physical Implementation of Quantum Computation,” formalized and expanded the five DiVincenzo criteria (scalable qubits, initialization, universal gates, long coherence times, and individual measurement) that became the foundational roadmap for building practical quantum computers, while also surveying candidate physical platforms such as superconducting circuits, quantum dots, and trapped ions.
Over the following decade, he advanced theoretical work on superconducting flux qubits, electron-spin qubits in semiconductors, circuit quantum electrodynamics, quantum error correction, and scalable architectures, collaborating closely with IBM colleagues and publishing extensively on topics ranging from stoquastic Hamiltonians to readout techniques.
After more than 25 years at IBM, DiVincenzo accepted the Alexander von Humboldt Professorship in 2010 and transitioned in January 2011 to RWTH Aachen University as professor and to Forschungszentrum Jülich as director of the Institute of Theoretical Nanoelectronics (and co-founder of the joint Institute for Quantum Information), where his IBM-honed expertise continued to drive research on spin and superconducting qubit systems until his directorship concluded in July 2025.
Quick Recap of the Core DiVincenzo Criteria (for quantum computation)
1. The system is a scalable physical system of well-characterized qubits, i.e. qubits whose properties are well understood.
2. It must be possible to initialize the qubits into a defined initial state.
3. A universal set of elementary quantum gates, i.e. computational operations, can be performed.
4. Individual qubits (at least one) can be measured.
5. The coherence time of the system is much longer than the gate operation time, i.e. the duration of a computational operation.
(There are also two communication criteria for networking qubits, but they are secondary for standalone computing.)
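Criterion 5 can be made concrete with a back-of-the-envelope calculation: the ratio of coherence time to gate time roughly bounds how many operations fit in a single coherent run. A minimal sketch, using representative round numbers of the kind quoted later in this piece (illustrative figures, not vendor specifications):

```python
# Rough bound on operations per coherent run: coherence time / gate time.
# All numbers below are representative round figures for illustration,
# not official vendor specifications.

def ops_per_coherence(coherence_s: float, gate_s: float) -> float:
    """Approximate number of gates that fit within one coherence window."""
    return coherence_s / gate_s

# Superconducting qubit: T1 ~100 microseconds, gate ~50 nanoseconds
sc_ratio = ops_per_coherence(100e-6, 50e-9)

# Trapped ion: T2 ~1 second, two-qubit gate ~100 microseconds
ion_ratio = ops_per_coherence(1.0, 100e-6)

print(f"Superconducting: ~{sc_ratio:,.0f} gates per coherence window")
print(f"Trapped ion:     ~{ion_ratio:,.0f} gates per coherence window")
```

Both platforms comfortably satisfy the “much longer than” requirement in this crude sense; the criterion bites when circuits must run deep enough for error correction, which multiplies the required operation count.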
Current Status: Companies Have Come Very Close (and Meet the Criteria in Prototypes)
NOTE: This is a point in time perspective only. Quantum hardware is evolving rapidly.
Multiple platforms demonstrate all five criteria experimentally on current hardware, especially in small-to-medium systems (dozens to hundreds of qubits). Researchers noted as early as the 2010s that “all criteria are now satisfied” in lab setups, but scaling them while maintaining fidelity and uniformity is the persistent gap. No single modality fully nails them simultaneously for fault-tolerant, large-scale operation. Here’s how the leading companies/platforms stack up (based on detailed assessments of superconducting, trapped-ion, and other systems):
Trapped-ion systems (Quantinuum, IonQ): These come the closest overall and are frequently described as robustly satisfying the five criteria in practice.
Excellent coherence (T₂ up to seconds–minutes, allowing millions of operations).
Near-perfect initialization and measurement (>99.99% fidelity).
Universal gates with very high fidelity (>99%, all-to-all connectivity via lasers).
Well-characterized (identical ions by nature).
Limitations: Scalability challenged beyond ~100 ions per trap (modular shuttling/photonic links in development). Quantinuum’s H-series and IonQ’s Forte systems exemplify this.
Superconducting qubits (IBM, Google Quantum AI, Rigetti): Strong on scalability and gates; they meet the criteria on working processors but with tighter margins.
Hundreds of physical qubits (e.g., IBM’s 127-qubit Eagle and 156-qubit Heron processors, with the Condor chip crossing 1,000 qubits; Google’s Willow chip).
Fast gates (~10–100 ns) and universal sets (e.g., CNOT/iSWAP at ~99%+ fidelity).
Initialization via cooling/active reset (>99%); measurement via dispersive readout (~99%).
Coherence (T₁ ~50–200 μs or better in newer designs) is “just sufficient” for current circuits but requires error correction for depth.
Limitations: Fabrication variations, crosstalk, and shorter coherence vs. ions. Google’s 2019 Sycamore and 2024 Willow demos (error correction threshold crossed) show strong progress.
Other platforms (photonic like PsiQuantum/Xanadu, neutral atoms like QuEra, silicon spins): They satisfy most criteria in prototypes but lag in full demonstrations or scale. Photonics excels at communication criteria (flying qubits).
Key Gaps and Roadmaps (Why Not “Achieved” Yet)
Scalability (Criterion 1) and fault tolerance are the main shortfalls. Current devices have tens to 1,000+ noisy physical qubits; logical/error-corrected qubits are emerging but small (e.g., Quantinuum’s high-fidelity logical-qubit demonstrations and IBM’s roadmap toward hundreds of logical qubits). No system yet delivers negligible error rates at useful scale.
Companies openly state they’re still building toward this: IBM targets “quantum advantage” (useful tasks beating classical) by end of 2026 and fault-tolerant systems ~2029. Google, Quantinuum, etc., have similar timelines.
Recent breakthroughs (Google Willow’s below-threshold error correction, IBM’s LDPC codes, high quantum volumes) show rapid progress—but experts agree full practical achievement is still years away.
Achieving classical-computing reliability of “fifteen nines” (99.9999999999999%) still seems distant.
Summary
Companies like Quantinuum, IonQ, IBM, and Google have built real physical quantum computers that satisfy the DiVincenzo criteria on today’s hardware (and trapped-ion platforms are arguably the closest). But for a truly scalable, error-corrected, commercially transformative machine? Not yet—that’s the ongoing engineering challenge the entire industry is racing toward. Progress is real and accelerating, with 2026–2029 expected to be pivotal.
Brian Lenahan is founder and chair of the Quantum Strategy Institute, author of seven Amazon-published books on quantum technologies and artificial intelligence, and a Substack Top 50 Rising in Technology. Brian’s focus on the practical side of technology ensures you will get the guidance and inspiration you need to gain value from quantum now and into the future. Brian does not purport to be an expert in each field or subfield for which he provides science communication.
Brian’s books are available on Amazon. Quantum Strategy for Business course is available on the QURECA platform.
Copyright © 2026 Aquitaine Innovation Advisors


