The Billion-Dollar Question
"When will quantum computers be useful?"
This question has haunted the field for three decades. The answer keeps shifting: five years away, ten years away, always just over the horizon. Skeptics call it the "nuclear fusion of computing"—perpetually promising, perpetually distant.
But something has changed. In 2024 and 2025, quantum computers achieved milestones that weren't supposed to happen for years. Error correction that actually works. Logical qubits that outlast their physical components. Algorithms running on real hardware, producing results classical computers strain to verify.
The field may still be years from practical quantum computing. Or it may be months away from the first application that matters. The honest answer is: nobody knows for certain. The technology has surprised skeptics before; it has also disappointed optimists.
This chapter examines the practical path to useful quantum computing: what needs to happen, what's happening now, and what applications might emerge first.
2026 Snapshot — The State of Quantum Hardware
Qubit Modalities
Superconducting qubits:
- Current leaders: IBM, Google, Rigetti, Amazon (through partners)
- Largest systems: 1,000+ qubits (IBM)
- Advantages: Fastest gate operations; most mature fabrication
- Disadvantages: Requires extreme cooling (~15 millikelvin); high error rates; challenging to scale
- Recent progress: Google's Willow chip showed error correction improvement with scale—a crucial milestone¹
Trapped ions:
- Current leaders: IonQ, Quantinuum (Honeywell + Cambridge Quantum)
- Largest systems: ~50 high-quality qubits
- Advantages: Highest fidelity gates; long coherence times; all-to-all connectivity
- Disadvantages: Slower operations; scaling requires complex architectures
- Recent progress: Quantinuum achieved record gate fidelities; demonstrated real algorithmic advantage²
Photonic:
- Current leaders: Xanadu, PsiQuantum
- Approach: Qubits encoded in light
- Advantages: Room temperature operation; natural for networking
- Disadvantages: Probabilistic gates; different error profile
- Recent progress: Photonic systems demonstrated quantum advantage for specific problems³
Neutral atoms:
- Current leaders: QuEra, Pasqal, Atom Computing
- Approach: Individual atoms trapped by lasers
- Advantages: Large qubit counts possible; natural parallelism
- Disadvantages: Newer technology; gate operations still developing
- Recent progress: Systems with 1,000+ atoms demonstrated; promising scaling path⁴
Topological (Microsoft):
- Status: Not yet demonstrated
- Theory: Qubits encoded in topological states; inherently error-resistant
- Potential: Could dramatically reduce overhead for error correction
- Reality: Microsoft claimed progress in 2025 but full topological qubits not yet achieved⁵
The Error Problem
Error rates today:
- Single-qubit gates: 99.9% fidelity (0.1% error)
- Two-qubit gates: 99-99.9% fidelity (0.1-1% error)
- Measurement: 99%+ fidelity
Why this matters: A useful quantum algorithm might need thousands of gates. With 0.1% error per gate, the chance of at least one error after 1,000 gates is about 63%; after 10,000 gates, an error is virtually certain.
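The arithmetic behind those figures is simple compounding, sketched below under the simplifying assumption that gate errors are independent (real errors can correlate, which usually makes things worse):

```python
# Probability of at least one error after n gates, each failing with
# probability p, assuming errors are independent.

def p_at_least_one_error(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

print(p_at_least_one_error(0.001, 1_000))    # ~0.632: result likely corrupted
print(p_at_least_one_error(0.001, 10_000))   # ~0.99995: error virtually certain
```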
Error correction: Encodes information across multiple physical qubits to create one "logical" qubit. Errors can be detected and corrected without destroying quantum information.
The overhead: Current estimates require 1,000-10,000 physical qubits per logical qubit for useful error correction. A useful quantum computer might need millions of physical qubits.
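Why redundancy can help at all is easiest to see in the classical three-bit repetition code, the conceptual ancestor of quantum codes. (Quantum codes must also handle phase errors and cannot simply copy states, so this is an analogy, not a quantum code.)

```python
import random

# Classical 3-bit repetition code: encode one logical bit as three copies,
# decode by majority vote. A logical error needs >= 2 of the 3 bits to flip,
# so for physical error rate p the logical rate is 3p^2(1-p) + p^3, which
# is smaller than p whenever p < 0.5 (the "threshold" of this toy code).

def logical_error_rate(p: float, trials: int = 200_000) -> float:
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:              # majority vote decodes incorrectly
            errors += 1
    return errors / trials

p = 0.01
print(logical_error_rate(p))        # ~0.0003, versus physical rate 0.01
print(3 * p**2 * (1 - p) + p**3)    # analytic value, ~0.000298
```

Below threshold, encoding suppresses errors; above it, encoding makes them worse. That is why Google's below-threshold demonstration matters.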
Key Milestones Achieved
Quantum supremacy (Google, 2019): The Sycamore processor performed a calculation impractical for classical computers. Not useful in itself, but it demonstrated that quantum advantage exists.
Error correction threshold (Google, 2024-25): Willow processor showed that adding more qubits to error correction codes actually reduced errors—the crucial threshold for scalable quantum computing.⁶
Real algorithmic results (Quantinuum, IonQ, 2024-25): Small algorithms running on real hardware producing verified results. Not yet commercially valuable but demonstrating capability.
Notable Players in Detail
IBM
Approach: Superconducting qubits; heavy investment in ecosystem.
Hardware roadmap:
- Condor (2023): 1,121 qubits
- Heron (2024): Improved error rates, modular design
- Target: 100,000 qubits by 2033
Software: Qiskit framework; extensive documentation; IBM Quantum Network for cloud access.
Strategy: Build the ecosystem. Make quantum computing accessible. Bet on being the platform everyone uses.
Status: Largest installed base; most accessible cloud quantum computing; hardware lags some competitors in quality metrics.
Google
Approach: Superconducting qubits; fundamental research focus.
Key achievements:
- Sycamore "quantum supremacy" (2019)
- Willow error correction milestone (2024-25)
- Strong publication record
Strategy: Solve the hard problems first. Error correction is the priority. Commercialization can wait.
Status: Leading in error correction research; smaller commercial presence than IBM.
Microsoft
Approach: Topological qubits (unique); Azure Quantum platform (pragmatic).
Topological bet: Theoretically more stable qubits. High risk, high potential reward. Not yet proven to work.
Azure Quantum: Cloud platform offering access to multiple quantum providers (IonQ, Quantinuum, etc.).
Strategy: Hedge with platform while betting on breakthrough technology.
Status: Platform growing; topological approach still unproven but potentially game-changing.
IonQ
Approach: Trapped ions; commercial focus.
Status: Public company (NYSE: IONQ); cloud access via AWS, Azure, Google Cloud.
Hardware: #AQ (algorithmic qubits) metric emphasizes quality over quantity.
Strategy: Practical applications first. Don't wait for perfect error correction.
Notable: Has shown commercial revenue from quantum computing services.
Quantinuum
Approach: Trapped ions; formed from Honeywell Quantum Solutions and Cambridge Quantum.
Status: Highest-fidelity quantum operations demonstrated. Strong enterprise focus.
Hardware: H-series processors; emphasis on quality over quantity.
Strategy: Precision first. Work with enterprises on real problems.
Notable: Has demonstrated algorithms running with real advantage over classical approaches for specific problems.
Startups
PsiQuantum: Photonic approach; massive funding ($700M+); claims path to million-qubit machine.
Xanadu: Photonic; open-source software (PennyLane); cloud access.
QuEra: Neutral atoms; emerged from Harvard/MIT research; large qubit counts.
Pasqal: Neutral atoms; European leader; analog quantum computing approach.
Atom Computing: Neutral atoms; demonstrated 1,000+ qubit array.
Rigetti: Superconducting; cloud platform; has faced challenges.
The Path to Fault Tolerance
What Fault Tolerance Means
Definition: A fault-tolerant quantum computer can run arbitrarily long calculations without accumulating errors that corrupt the result.
Requirements:
- Error rates below a threshold
- Error correction that actually reduces errors
- Enough qubits to implement correction
- Fast enough correction to keep up with errors
Status: The field is at the beginning of this path. Error correction has been demonstrated; fault tolerance at scale has not.
The Error Correction Journey
Theoretical foundation (1990s): Shor, Steane, and others proved quantum error correction was possible in principle.
Small demonstrations (2010s): Error correction codes implemented on a few qubits. Proof of concept.
The threshold (2024-25): Google's demonstration that adding qubits to error correction codes reduces rather than increases errors. This is the crucial transition.⁷
Next steps: Larger codes; multiple logical qubits; actual computation on error-corrected qubits.
The Scaling Challenge
Physical qubits needed: Estimates vary from 1,000 to 10,000+ physical qubits per logical qubit, depending on error rates and algorithm requirements.
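A rough sense of where such estimates come from, using commonly quoted surface-code scaling: a distance-d surface code uses roughly 2d² physical qubits per logical qubit, and the logical error rate falls roughly as (p/p_th)^((d+1)/2) for physical error rate p below threshold p_th. The constants below are illustrative assumptions, not measurements from any specific machine.

```python
# Rough surface-code overhead estimate. Assumed (illustrative) scaling:
#   physical qubits per logical qubit ~ 2 * d^2 for code distance d
#   logical error rate ~ 0.1 * (p / p_th) ** ((d + 1) // 2)

def distance_needed(p: float, p_th: float, target: float) -> int:
    d = 3
    while 0.1 * (p / p_th) ** ((d + 1) // 2) > target:
        d += 2                      # surface-code distances are odd
    return d

# A physical error rate at half the threshold, targeting one logical
# error per trillion operations:
d = distance_needed(p=0.005, p_th=0.01, target=1e-12)
print(d, 2 * d * d)                 # code distance, physical qubits per logical qubit
```

With these assumed constants the answer lands around 10,000 physical qubits per logical qubit; better hardware (p further below threshold) shrinks the overhead sharply, which is why estimates span an order of magnitude.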
Connectivity: Error correction requires qubits to interact with their neighbors rapidly.
Classical control: Error correction requires massive classical processing to decode syndromes and apply corrections in real time.
Current state: Logical qubits have been demonstrated in small numbers, but arguably few if any are yet robust enough to compute with. Hundreds or thousands are needed for useful computation.
Timeline Estimates
Optimistic (5-7 years): Fault-tolerant quantum computing achieved. First practical applications for chemistry and optimization.
Moderate (10-15 years): Fault tolerance at scale. Clear commercial applications. Industry adoption begins.
Pessimistic (20+ years): Technical challenges prove harder than expected. Practical quantum computing remains limited.
Honest assessment: Nobody knows. The field has surprised both optimists and pessimists before.
Near-Term Applications: The NISQ Era
What Is NISQ?
NISQ: Noisy Intermediate-Scale Quantum. Coined by John Preskill.⁸
Definition: Quantum computers large enough to be interesting but too noisy for error correction.
The hope: Maybe some useful computation is possible without full fault tolerance.
The reality: Mixed results. Some promising demonstrations; no killer application yet.
Variational Algorithms
Approach: Hybrid classical-quantum algorithms. Quantum processor does part of calculation; classical computer optimizes parameters.
VQE (Variational Quantum Eigensolver): For chemistry problems. Finds ground state energies of molecules.
QAOA (Quantum Approximate Optimization Algorithm): For optimization problems.
Status: Demonstrated on real hardware. Results often comparable to classical but not clearly better. Scaling unclear.
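The hybrid loop can be illustrated with a deliberately tiny example: a one-qubit ansatz Ry(θ)|0⟩ whose energy under the Hamiltonian H = Z is ⟨Z⟩ = cos θ. A real VQE would estimate that expectation by repeated measurement on quantum hardware; here the "quantum" step is simulated exactly, and the classical optimizer is plain gradient descent. All names and constants are illustrative.

```python
import math

# Toy VQE: minimize E(theta) = <psi(theta)| Z |psi(theta)> = cos(theta)
# for the one-qubit state |psi(theta)> = Ry(theta)|0>.

def energy(theta: float) -> float:
    return math.cos(theta)          # stand-in for the quantum expectation value

def parameter_shift_gradient(theta: float) -> float:
    # Parameter-shift rule: the exact gradient from two circuit evaluations,
    # no finite-difference approximation needed.
    return 0.5 * (energy(theta + math.pi / 2) - energy(theta - math.pi / 2))

theta, lr = 0.3, 0.5                # arbitrary starting angle and learning rate
for _ in range(100):                # the classical half of the hybrid loop
    theta -= lr * parameter_shift_gradient(theta)

print(round(energy(theta), 6))      # approaches -1, the ground-state energy
```

The structure, not the physics, is the point: the quantum device is called only inside the cost (and gradient) evaluation, and everything else is ordinary classical optimization.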
Quantum Machine Learning
The pitch: Quantum computers might speed up machine learning training or enable new algorithms.
Reality check: Few demonstrations of practical advantage. Classical ML advancing rapidly.
Promising areas: Certain kernel methods; quantum neural networks for specific problems.
Status: Active research area. No commercial applications yet.
Simulation of Physical Systems
The natural fit: Quantum computers are quantum systems. They should naturally simulate other quantum systems.
Chemistry: Simulating molecular behavior for drug discovery, materials science.
Status: The most promising near-term application area. Small molecules already demonstrated.
Limitation: Interesting chemistry problems require error correction; NISQ systems struggle.
Random Circuit Sampling
What it is: Running random quantum operations and sampling the output distribution.
Why it matters: Demonstrates quantum advantage—classical computers can't efficiently reproduce results.
Limitation: Not commercially useful. Proof of concept, not application.
Practical Application Areas
Cryptography
The threat: Shor's algorithm can factor large numbers, breaking RSA and similar encryption.
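Shor's algorithm is mostly classical: only the order-finding step needs a quantum computer. A sketch of the surrounding classical logic, with order-finding done by brute force (which is precisely the part that is exponentially hard classically):

```python
from math import gcd

# Classical skeleton of Shor's factoring algorithm. The order-finding step
# below is brute force; on a quantum computer it would be replaced by
# quantum phase estimation, the entire source of the speedup.

def find_order(a: int, n: int) -> int:
    # Smallest r > 0 with a^r = 1 (mod n). Exponentially slow classically.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    if gcd(a, n) != 1:
        return gcd(a, n)            # lucky: a already shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1:
        return None                 # odd order: retry with a different a
    f = gcd(pow(a, r // 2) - 1, n)
    return f if 1 < f < n else None

print(shor_factor(15, 7))           # -> 3 (since 15 = 3 * 5)
```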
Timeline for threat: Unknown. Requires large fault-tolerant quantum computer. Estimates range from 10-30+ years.
Current risk: "Harvest now, decrypt later." Adversaries collect encrypted data today, waiting for quantum computers to break it.
Response: Post-quantum cryptography. Classical algorithms resistant to quantum attack. NIST published its first finalized standards in 2024.⁹
Quantum-safe transition: Already underway. Organizations migrating to quantum-resistant algorithms. 10-20 year transition expected.
Chemistry and Materials
Why quantum helps: Molecular behavior is fundamentally quantum. Classical computers must approximate; quantum computers can simulate directly.
Applications:
- Drug discovery: Simulate protein-ligand interactions; predict drug efficacy
- Catalyst design: Find materials for industrial processes, carbon capture
- Battery materials: Design better electrodes, electrolytes
- Superconductors: Understand and design high-temperature superconductors
Current state: Small molecules demonstrated. Larger molecules require error correction.
Timeline: First commercially valuable chemistry simulations probably within decade.
Optimization
The promise: Many optimization problems might have quantum speedup.
Applications:
- Logistics: Routing, scheduling, resource allocation
- Finance: Portfolio optimization, risk analysis
- Machine learning: Training optimization
Reality check: Proven quantum speedup for optimization is limited. Many classical algorithms are very good.
Status: Active research. Some promising results for specific problems. Not yet commercially valuable.
Finance
Applications:
- Portfolio optimization
- Risk analysis (Monte Carlo methods)
- Fraud detection
- Derivatives pricing
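Most quantum finance proposals for risk analysis build on amplitude estimation, which in principle cuts Monte Carlo estimation error from O(1/√N) to O(1/N) in the number of samples. A classical Monte Carlo baseline for comparison (the payoff and parameters here are arbitrary illustrations, not a production model):

```python
import math
import random

# Classical Monte Carlo pricing of a toy call-style payoff max(S - K, 0),
# with S lognormally distributed around s0. Estimation error shrinks as
# 1/sqrt(N); quantum amplitude estimation would shrink it as 1/N.

def mc_price(n_samples, s0=100.0, k=105.0, sigma=0.2, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # lognormal draw with mean s0 (the -sigma^2/2 term corrects the drift)
        s = s0 * math.exp(sigma * rng.gauss(0.0, 1.0) - 0.5 * sigma ** 2)
        total += max(s - k, 0.0)
    return total / n_samples

for n in (1_000, 100_000):
    print(n, round(mc_price(n), 3))  # estimate stabilizes as N grows
```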
Industry interest: High. Banks and asset managers investing in quantum research.
Current use: Experimental. No production quantum finance applications yet.
Machine Learning
Potential: Quantum computers might accelerate ML training or enable new algorithms.
Research areas:
- Quantum kernel methods
- Quantum neural networks
- Quantum generative models
Status: Promising research. No practical advantage demonstrated at scale.
Challenge: Classical AI advancing rapidly. Quantum must exceed a moving target.
AI-Quantum Synergy
AI Helping Quantum
Error correction: AI systems help design and implement quantum error correction schemes.
Calibration: ML optimizes quantum hardware calibration, reducing errors without hardware changes.
Algorithm discovery: AI might discover new quantum algorithms or optimize existing ones.
Control: Real-time ML systems manage quantum control pulses for better fidelity.
Status: Already in use. AI is integral to quantum hardware development.
Quantum Helping AI
Training acceleration: Quantum computers might speed up certain ML training algorithms.
New architectures: Quantum-classical hybrid models for specific problems.
Optimization: If quantum optimization works, could help AI hyperparameter tuning.
Status: Theoretical and early experimental. No practical advantage yet.
The Race Dynamic
Question: Will quantum computing be useful before classical AI solves the same problems?
Scenario 1: Quantum arrives in time. Some problems remain hard for classical AI. Quantum provides unique value.
Scenario 2: Classical AI advances faster. By the time quantum works, classical has already solved key problems.
Scenario 3: Synergy. Quantum and classical AI are complementary. Each advances the other.
Assessment: Probably scenario 3. Some problems will favor quantum; others classical. Integration likely.
The Business of Quantum
Investment Landscape
Total investment: Billions of dollars globally.
Government funding: US National Quantum Initiative ($1.2B+), EU Quantum Flagship (€1B+), China (less transparent but massive), UK, Japan, others.
Private investment: Billions in startups. Major tech companies investing heavily.
Public companies: IonQ, Rigetti, D-Wave on public markets. Mixed performance.
Business Models
Cloud access: IBM, Amazon (Braket), Azure, Google offer cloud quantum computing.
Consulting: Companies like Accenture, McKinsey building quantum practices.
Software: Qiskit (IBM), Cirq (Google), PennyLane (Xanadu), various others.
Hardware: Direct sales to research institutions, governments, enterprises.
Enterprise Adoption
Current state: Exploration, not production. Companies experimenting with quantum.
Industries most active: Finance, pharma, automotive, aerospace, energy.
Typical engagement: Proof of concept projects. Learning and preparation.
Challenge: No compelling ROI yet. Hard to justify production investment.
Market Projections
Current market: ~$1B annually (hardware, software, services).
Projections: Wide range. $10-50B by 2035-2040 depending on assumptions.
Key uncertainty: When fault tolerance arrives. All projections depend on this.
The Path Forward
Near-Term Likely (2026-2032)
Hardware progress: Qubit counts reach 10,000+. Error rates continue improving. Multiple modalities advance.
Error correction milestones: Multiple logical qubits demonstrated. Algorithms run on error-corrected qubits at small scale.
Applications: Chemistry simulations for small molecules achieve genuine utility. Optimization has some wins.
Cloud access expands: Quantum computing as a service matures. More organizations experiment.
Post-quantum transition: Migration to quantum-resistant cryptography accelerates. Urgency increases.
Hype cycle: Expect continued oscillation between hype and disappointment. Progress is real but uneven.
Plausible (2032-2040)
Fault tolerance achieved: At least one platform achieves fault-tolerant quantum computing at useful scale.
First killer app: Probably chemistry/materials. Simulation that provides genuine commercial value.
Industry adoption begins: Companies start using quantum computing for production workloads.
Hybrid workflows standard: Quantum processors as specialized accelerators in larger classical systems.
Cryptographic urgency: Quantum computers approach capability to break current encryption. Transition deadline.
Wild Trajectory (2040+)
Quantum is a standard tool: Like GPUs today, quantum processors are part of computational infrastructure.
Chemistry transformed: Drug discovery, materials science fundamentally changed by quantum simulation.
New algorithms discovered: Applications not yet imagined. History of computing suggests unexpected uses.
Quantum internet: Quantum communication and distributed quantum computing.
Or: Progress stalls. Fault tolerance harder than expected. Quantum remains niche research tool. Classical computing advances fill the gap.
Risks and Guardrails
Technical Risk
Risk: Fault tolerance proves harder than expected. Quantum computing remains limited indefinitely.
Guardrails: Diversified research across multiple approaches; continued fundamental science; patience with realistic expectations.
Cryptographic Risk
Risk: Quantum computers break encryption before transition complete. Massive data compromise.
Guardrails: Accelerate post-quantum transition now; don't wait for fault tolerance; "harvest now, decrypt later" awareness.
Hype and Investment Risk
Risk: Overpromising leads to funding collapse when results don't materialize. Promising research abandoned.
Guardrails: Honest communication about timelines and limitations; focus on real milestones; sustainable funding.
Geopolitical Risk
Risk: Quantum advantage goes to adversaries first. Strategic implications for security, commerce.
Guardrails: Sustained government investment; international collaboration where appropriate; export controls on key technologies.
Talent Risk
Risk: Insufficient quantum workforce to exploit advances.
Guardrails: Quantum education programs; workforce development; immigration policy for quantum talent.
What It Means
For Businesses
Now: Learn. Experiment. Build quantum literacy. Don't expect production applications.
Next five years: Deepen exploration. Identify problems that might benefit. Prepare for cryptographic transition.
When fault tolerance arrives: Be ready to adopt. First movers may have significant advantages in quantum-amenable industries.
For Researchers
Opportunity: Quantum computing is a growing field with massive funding and open problems.
Focus areas: Error correction, algorithms, applications, architecture—all need work.
Interdisciplinary: Quantum computing needs physicists, engineers, computer scientists, and domain experts.
For Society
Promise: Quantum computing could accelerate drug discovery, materials science, AI. Genuine benefit.
Risk: Cryptographic disruption. Security implications. Need proactive governance.
Timeline: Probably a decade plus for major impact. But preparation should start now.
Conclusion
Quantum computing exists in a peculiar state—simultaneously the most precisely engineered technology ever built and one that doesn't yet do anything useful.
The physics works. Qubits superpose. Entanglement creates correlations. Interference enables algorithms. Every fundamental prediction of quantum mechanics has been verified to extraordinary precision.
The engineering is catching up. Error rates that were 1% a few years ago are now 0.1% and falling. Systems that had dozens of qubits now have thousands. Error correction that was purely theoretical is now demonstrated in hardware.
What remains is the last mile—and it may be the hardest. Fault tolerance at scale. Enough qubits to run useful algorithms. Classical control systems that can keep up. Software that makes it all accessible.
When this comes together—if it comes together—the implications are significant. Problems that take classical computers the age of the universe become tractable. Drug discovery accelerates. Materials science transforms. Cryptography reinvents itself.
But the field is not there yet. The honest answer to "when will quantum computers be useful?" is still "nobody knows." The optimists have been wrong before. So have the pessimists.
What is different now is that researchers are not just waiting. They are building. Every month brings new hardware, new algorithms, new error correction schemes. The trajectory is toward capability. The question is how long the trajectory takes.
For now, quantum computing is a technology worth watching, worth learning, worth preparing for—but not yet worth betting your business on. That may change faster than anyone expects. Or it may take longer than anyone hopes.
The only certainty is uncertainty. Which is, appropriately enough, very quantum.
Endnotes — Chapter 44
1. Google's Willow chip (2024-25) demonstrated that increasing the error correction code size reduces errors—the first demonstration of "below threshold" error correction at scale.
2. Quantinuum achieved two-qubit gate fidelities above 99.9% with trapped-ion systems and demonstrated certified random number generation with quantum advantage.
3. Xanadu and others demonstrated quantum advantage for Gaussian boson sampling—potentially useful for certain optimization and chemistry problems.
4. QuEra demonstrated a 48-logical-qubit processor using neutral atoms (2023); Atom Computing demonstrated a 1,200-atom array.
5. Microsoft claimed demonstration of Majorana zero modes (a precursor to topological qubits) in 2025; full topological qubits remain undemonstrated.
6. Below-threshold error correction means adding redundancy actually helps rather than introducing more errors—the crucial transition for scalable quantum computing.
7. The threshold theorem proves fault-tolerant quantum computing is possible in principle if error rates fall below a threshold; Google's demonstration was the first large-scale verification.
8. John Preskill coined "NISQ" (Noisy Intermediate-Scale Quantum) in a 2018 paper describing the current era of quantum computing.
9. NIST selected CRYSTALS-Kyber, CRYSTALS-Dilithium, SPHINCS+, and FALCON in 2022; the first finalized standards (FIPS 203, 204, 205) were published in 2024, with a FALCON-based standard still in progress.
10. Estimates for breaking RSA-2048 with Shor's algorithm range from millions of physical qubits to potentially fewer with improved error correction; timeline estimates vary from 10 to 30+ years.