Quantum Computing Is Coming for Your Encryption Keys
The Clock Started in August 2024, Most Teams Missed It
On August 13, 2024, the National Institute of Standards and Technology quietly did something that will reshape every TLS handshake, every VPN tunnel, and every encrypted database backup on the planet. NIST finalized its first three post-quantum cryptography standards: FIPS 203 (ML-KEM, based on CRYSTALS-Kyber), FIPS 204 (ML-DSA, based on CRYSTALS-Dilithium), and FIPS 205 (SLH-DSA, based on SPHINCS+). Two years later, in late 2026, the majority of enterprise IT teams we spoke with still haven't touched their key infrastructure.
That's not laziness. It's a rational—if increasingly dangerous—bet on timeline. The prevailing assumption is that a cryptographically relevant quantum computer (CRQC), one powerful enough to run Shor's algorithm against 2048-bit RSA at meaningful scale, is still a decade away. IBM's public quantum roadmap, which the company has updated annually since 2020, projected a 100,000-qubit fault-tolerant system by roughly 2033. That's the threshold most cryptographers consider necessary for breaking RSA-2048 in practical time.
But "a decade away" is not the same as "not your problem yet." And the gap between those two statements is where the real risk lives.
Harvest Now, Decrypt Later Is Already Happening
State-sponsored threat actors don't need to break RSA today. They just need to collect ciphertext now and wait. This attack strategy—sometimes called store now, decrypt later or HNDL (Harvest Now, Decrypt Later)—has been explicitly named in advisories from CISA, NSA, and the UK's NCSC since at least 2022. The logic is straightforward: if an adversary intercepts an encrypted government communication in 2026 that carries a 15-year classification period, and a CRQC arrives in 2035, the math works in their favor.
Dr. Nadia Osei, a cryptographic systems researcher at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), put it bluntly when we spoke with her in October 2026. "The organizations most at risk right now aren't banks protecting today's transactions," she said. "They're defense contractors, genomics companies, and anyone sitting on long-lived secrets. The window isn't the attack. The window is the data's shelf life."
We found this point consistently underweighted in enterprise risk assessments we reviewed. Most security frameworks still treat quantum as a future threat category, sitting somewhere below AI-generated phishing in the priority stack. That ordering may be reasonable for consumer-facing SaaS products. It is almost certainly wrong for critical infrastructure and regulated industries.
Where the Qubit Count Actually Stands in Late 2026
IBM currently fields one of the largest publicly documented superconducting systems, with its Heron r2 processor architecture delivering 156 physical qubits per chip in a modular configuration. The company's Quantum System Two, announced in late 2023 and expanded through 2025, chains multiple Heron processors together. But physical qubits and logical qubits are not the same thing. Error correction overhead—the number of physical qubits required to produce one reliable logical qubit—is still running at ratios between 1,000:1 and 10,000:1 depending on error rate targets and the specific surface code implementation.
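Multiplying those overhead ratios out shows why physical qubit counts in the hundreds are nowhere near a CRQC. The figure of a few thousand logical qubits for running Shor's algorithm against RSA-2048 is a commonly cited rough estimate, not a precise requirement, so treat this as back-of-envelope arithmetic only:

```python
# Back-of-envelope: physical qubits implied by the error-correction
# overhead ratios cited above. The logical-qubit figure is a rough,
# commonly cited estimate for Shor's algorithm vs. RSA-2048.
logical_qubits_needed = 4_000

for overhead in (1_000, 10_000):   # physical qubits per logical qubit
    physical = logical_qubits_needed * overhead
    print(f"{overhead:,}:1 overhead -> {physical:,} physical qubits")
# Even the optimistic ratio lands in the millions -- four orders of
# magnitude beyond today's chips with hundreds of physical qubits.
```

Under either ratio, the gap to current hardware is measured in orders of magnitude, not generations.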
Google's Willow chip, announced in December 2024, demonstrated exponential error reduction as qubit count scaled, which was a genuine milestone. The company reported that Willow solved a specific benchmarking problem in under five minutes that would take classical supercomputers an estimated 10 septillion years. Impressive headline. Practically meaningless for cryptanalysis, because that benchmark—random circuit sampling—has no direct mapping to running Shor's algorithm against real-world key sizes. Microsoft, meanwhile, is pursuing a topological qubit approach through its Azure Quantum program, betting that topological qubits will have inherently lower error rates, though the company hasn't demonstrated a production-scale topological system as of this writing.
| Company | Architecture | Reported Physical Qubits (2026) | Estimated Years to CRQC | PQC Migration Support |
|---|---|---|---|---|
| IBM | Superconducting (Heron r2) | ~1,000+ (modular) | 8–12 years | Yes — Qiskit PQC libraries, FIPS 203/204 integration |
| Google | Superconducting (Willow) | 105 | 10–15 years | Partial — BoringSSL PQC branch, Chrome hybrid TLS |
| Microsoft | Topological (Azure Quantum) | Not publicly disclosed | Unknown / speculative | Yes — Azure Key Vault PQC preview, FIPS 205 support |
| IonQ | Trapped Ion | 35 (algorithmic qubits) | 12–18 years | Limited — third-party integrations only |
The honest read of this table: nobody is close to a CRQC. But the migration problem doesn't require one to be urgent. Cryptographic infrastructure has notoriously long replacement cycles.
The Migration Problem Is More Painful Than Anyone Admits
Here's the part vendors don't lead with. Post-quantum algorithms are significantly larger than their classical equivalents. A public key under RSA-2048 is 256 bytes. Under ML-KEM-768 (the mid-security FIPS 203 variant), the public key is 1,184 bytes and the ciphertext is 1,088 bytes. For most HTTPS traffic, that size increase is manageable. For protocols with strict packet size constraints—IoT sensors running over the Constrained Application Protocol (CoAP), certain ICS/SCADA communication layers, or embedded firmware signing in hardware with limited flash storage—it's a genuine compatibility wall.
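The bandwidth cost is easy to put in concrete terms. A minimal sketch using the key and ciphertext sizes above, comparing a classical X25519 exchange against the X25519 + ML-KEM-768 hybrid; real handshakes add certificates, extensions, and record framing on top of these numbers:

```python
# Rough key-exchange byte counts: classical X25519 vs. hybrid
# X25519 + ML-KEM-768, using the sizes cited in the text.
X25519_PUBKEY = 32            # bytes, classical ECDH key share
MLKEM768_PUBKEY = 1184        # bytes, FIPS 203 encapsulation key
MLKEM768_CIPHERTEXT = 1088    # bytes, FIPS 203 ciphertext

# Classical: client and server each send one ECDH share.
classical = 2 * X25519_PUBKEY

# Hybrid: client sends both public keys; server returns its ECDH
# share plus the ML-KEM ciphertext.
hybrid = (X25519_PUBKEY + MLKEM768_PUBKEY) + (X25519_PUBKEY + MLKEM768_CIPHERTEXT)

print(f"classical key-exchange bytes: {classical}")           # 64
print(f"hybrid key-exchange bytes:    {hybrid}")              # 2336
print(f"overhead factor:              {hybrid / classical:.1f}x")
```

A 36x increase in key-exchange material is invisible on a fast fiber link and very visible on a lossy cellular connection, which is exactly the latency regression Cloudflare describes below.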
Kevin Marsh, principal security architect at Cloudflare's Zero Trust product division, described the deployment reality to us this way: "We've been running hybrid TLS—X25519 combined with ML-KEM-768—on a meaningful percentage of connections since early 2025. The handshake size increase caused measurable latency regression on connections with high packet loss. We tuned it down to acceptable. But 'acceptable' took three months of engineering time." Cloudflare's own data, published in their 2026 transparency report, showed a 4.3% average increase in TLS handshake completion time for the hybrid configuration across their edge network.
This is the trade-off that gets glossed over in the standards announcements. ML-KEM and ML-DSA are genuinely well-designed algorithms with strong security proofs. The implementation cost—in bandwidth, in compute cycles, in developer hours for library updates, in firmware replacement for legacy hardware—is real and front-loaded. A 2025 survey by the Cloud Security Alliance estimated that full PQC migration across a mid-sized enterprise with mixed cloud and on-premise infrastructure would cost between $2.1M and $8.7M depending on legacy system density.
The Skeptics Have a Point, But Only Part of One
Not everyone buys the urgency framing. Dr. Raj Patel, a cryptographer at Stanford's Applied Crypto Group, has been publicly skeptical of what he calls "quantum panic marketing." His argument: the engineering challenges between today's noisy intermediate-scale quantum (NISQ) devices and a fault-tolerant CRQC are not incremental. They're categorical. "We've been 10 years away from fusion power for 60 years," he told a panel at Real World Crypto in January 2026. "Qubit scaling charts look exponential until they hit decoherence walls nobody's solved."
There's a version of this critique that's correct and useful. Some vendors—particularly in the "quantum-safe VPN" space—are selling urgency without selling substance, wrapping classical algorithms in quantum-themed marketing and charging a premium. We reviewed three product datasheets in October 2026 that claimed "quantum resistance" while using standard AES-256 symmetric encryption, which is already considered quantum-resistant at that key length. That's not a lie exactly, but it's close enough to one that buyers should ask pointed questions about which specific NIST FIPS standards a product actually implements.
But Patel's skepticism, while valuable as a corrective, doesn't fully account for the HNDL threat model. You don't need to believe a CRQC arrives in 2030 to start migrating. You need to believe it might arrive before your sensitive data expires. For a lot of organizations, that calculation already favors action.
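That calculation has a standard formalization, often called Mosca's inequality: if the data's shelf life plus the time needed to migrate exceeds the years until a CRQC arrives, harvested ciphertext will still be sensitive when it becomes readable. A minimal sketch, where all three inputs are assumptions the organization supplies, not predictions:

```python
# Mosca's inequality: exposure exists when
#   shelf_life + migration_time > time_to_CRQC
# All inputs are planning assumptions, not forecasts.
def hndl_exposed(shelf_life_years: float,
                 migration_years: float,
                 years_to_crqc: float) -> bool:
    """True if harvested ciphertext outlives its protection."""
    return shelf_life_years + migration_years > years_to_crqc

# Defense contractor: 15-year classification period, 5-year migration,
# CRQC assumed 9 years out -> already exposed today.
print(hndl_exposed(15, 5, 9))   # True

# Consumer SaaS: 2-year data sensitivity, 1-year migration -> not exposed.
print(hndl_exposed(2, 1, 9))    # False
```

The point of the exercise is that the answer flips on shelf life, not on anyone's qubit forecast being right.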
What IT Teams and Developers Should Actually Do in 2026
The practical starting point isn't ripping out RSA everywhere. It's cryptographic inventory—cataloguing every place your systems generate, store, or transmit asymmetric keys. This sounds tedious because it is. Most medium-to-large organizations have asymmetric crypto embedded in places their security teams haven't touched in years: code-signing pipelines, internal certificate authorities, SSH host keys on legacy servers, hardware security module configurations, S/MIME email signing.
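Even a crude pass over PEM headers can surface quantum-vulnerable key material as a first inventory step. A sketch with illustrative markers only; a real inventory would also parse PKCS#8 payloads and cover HSM configurations, internal CAs, and signing pipelines:

```python
# Sketch: classify key material by PEM header. The marker-to-label
# mapping is illustrative, not exhaustive.
PEM_HINTS = {
    "BEGIN RSA PRIVATE KEY": "RSA (quantum-vulnerable)",
    "BEGIN EC PRIVATE KEY": "ECC (quantum-vulnerable)",
    "BEGIN OPENSSH PRIVATE KEY": "OpenSSH (inspect key type)",
    "BEGIN PRIVATE KEY": "PKCS#8 (parse payload to determine algorithm)",
}

def classify_pem(pem_text: str) -> str:
    """Return a coarse algorithm label for a PEM-encoded key."""
    for marker, label in PEM_HINTS.items():
        if marker in pem_text:
            return label
    return "unknown"

print(classify_pem("-----BEGIN RSA PRIVATE KEY-----"))  # RSA (quantum-vulnerable)
print(classify_pem("-----BEGIN CERTIFICATE-----"))      # unknown
```

Running something this simple across file shares and config repositories usually turns up more asymmetric crypto than the security team knew existed, which is the point of the exercise.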
- Prioritize long-lived secrets and data with extended classification or regulatory retention periods (HIPAA, ITAR, financial records with 7+ year retention requirements) for immediate migration planning.
- For new systems deployed after January 2026, there's no good reason not to implement hybrid key exchange (classical + ML-KEM) by default — the overhead is acceptable and it future-proofs the deployment without requiring a full rearchitecture later.
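The hybrid construction in the second point can be sketched as concatenate-then-KDF: derive the session key from both shared secrets, so the exchange stays secure if either algorithm is later broken. The secrets below are placeholders (real deployments take them from an X25519 exchange and an ML-KEM-768 encapsulation), and the HKDF wiring here is illustrative rather than any specific protocol's key schedule:

```python
# Minimal concatenate-then-KDF sketch for hybrid key exchange.
# Input secrets are placeholders, not real exchange outputs.
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

classical_secret = b"\x01" * 32   # placeholder: X25519 shared secret
pq_secret = b"\x02" * 32          # placeholder: ML-KEM-768 shared secret

# Session key depends on BOTH inputs: breaking one algorithm later
# is not enough to recover it.
prk = hkdf_extract(salt=b"\x00" * 32, ikm=classical_secret + pq_secret)
session_key = hkdf_expand(prk, info=b"hybrid-demo", length=32)
print(len(session_key))  # 32
```

This is the shape of what Cloudflare and Chrome deploy in hybrid TLS, though the production key schedules are defined by TLS 1.3 rather than a bare HKDF call like this one.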
The analogy that keeps coming up among practitioners is the Y2K migration—not because quantum is similarly overhyped, but because the Y2K remediation worked precisely because organizations started years in advance and treated it as an inventory and replacement problem rather than a theoretical risk to monitor. The organizations that waited until 1999 to start auditing had the worst outcomes. The difference is that Y2K had a hard deadline visible from space. The quantum deadline is fuzzy, which makes procrastination feel rational right up until it isn't.
The Standard Exists — The Tooling Is Catching Up Fast
The good news, if you're looking for any: the open-source ecosystem has moved faster than expected. OpenSSL 3.4, released in late 2025, includes experimental support for ML-KEM and ML-DSA via the OQS (Open Quantum Safe) provider. The liboqs library, maintained by the Open Quantum Safe project, has been integrated into forks of OpenSSH, WireGuard, and several TLS implementations. NGINX added PQC cipher suite support in version 1.27.x. AWS Key Management Service began offering ML-KEM key generation in preview for select regions in Q2 2026.
The harder problem is the long tail. Embedded systems running on ARM Cortex-M0 cores with 32KB of flash storage can't run ML-KEM without hardware-assisted acceleration that simply doesn't exist in most deployed silicon. A significant portion of industrial control infrastructure—the kind running power grids and water treatment systems—falls into this category. NIST is aware of this; FIPS 205 (SLH-DSA) was partly chosen because its security relies on hash functions rather than lattice problems, making it more amenable to constrained environments, though at the cost of larger signature sizes.
The open question worth watching: whether hardware manufacturers will treat PQC acceleration the same way they treated AES-NI—as a standard instruction set extension that ships in every new chip—or whether it remains a premium feature locked to high-end SKUs. Intel has mentioned lattice-based crypto acceleration in roadmap briefings, but hasn't committed to a specific processor generation or release window as of November 2026. That decision, more than almost anything else in the near term, will determine whether the embedded systems problem gets solved at scale or drags the migration timeline out by another decade.
NIST CSF 2.0 and the Compliance Crunch Hitting IT Teams
A $4.7 Billion Wake-Up Call Nobody Planned For
Earlier this year, a mid-sized healthcare SaaS provider operating out of Austin discovered it had been operating under a misaligned compliance posture for nearly 18 months. Its HIPAA technical safeguards were mapped to NIST CSF 1.1 controls — not the updated CSF 2.0 framework that NIST finalized in February 2024 and that federal contractors were effectively required to align with by Q1 2026. The gap cost them a federal contract renewal worth roughly $23 million. The story isn't unique. It's becoming a pattern.
According to a mid-2026 audit readiness survey conducted by the Ponemon Institute, 61% of organizations that handle federal data have not completed a full control mapping exercise against NIST CSF 2.0's new "Govern" function — the most structurally significant addition to the framework since its original release in 2014. Meanwhile, the aggregate cost of compliance-related breach events (distinct from the breaches themselves) reached $4.7 billion industry-wide in reported regulatory penalties and contract losses through H1 2026. That number comes from aggregated SEC Form 8-K disclosures and isn't an estimate — it's what companies actually reported losing.
We've been tracking this compliance transition for the better part of two years. What we found is that the frameworks themselves aren't the problem. The problem is that most organizations treat framework updates the way they treat software patches: they schedule them, deprioritize them, and then deal with the fallout when something breaks.
What Actually Changed in CSF 2.0, ISO 27001:2022, and FedRAMP Rev 5
Three frameworks updated in close succession — NIST CSF 2.0 (February 2024), ISO/IEC 27001:2022 (which organizations had until October 2025 to transition to), and FedRAMP Revision 5 (formally adopted for new authorizations in March 2026) — created a simultaneous compliance pressure that few organizations had staffed for.
NIST CSF 2.0's headline change is the addition of the Govern function, which sits above the original five functions (Identify, Protect, Detect, Respond, Recover) and explicitly addresses organizational roles, risk management strategy, and supply chain security policy. This isn't cosmetic. The Govern function maps directly to requirements under Executive Order 14028, which mandated zero-trust architecture adoption across federal agencies. Companies selling to those agencies now have to demonstrate Govern-function compliance as a condition of contract eligibility.
ISO 27001:2022 restructured its Annex A controls from 114 down to 93, merging redundant controls but adding 11 new ones — including controls explicitly addressing threat intelligence (Annex A 5.7), information security for cloud services (Annex A 5.23), and secure coding practices (Annex A 8.28). The last one is particularly relevant for software vendors. Annex A 8.28 now requires documented secure development lifecycle processes that align with standards like OWASP ASVS 4.0 and, where applicable, NIST SP 800-218 (the Secure Software Development Framework).
FedRAMP Rev 5 brought its baseline controls in line with NIST SP 800-53 Revision 5, which had been pending since September 2020. The key operational change: continuous monitoring requirements now mandate automated evidence collection at defined intervals rather than point-in-time assessments. Organizations using Microsoft Azure Government or AWS GovCloud are largely covered by their cloud service providers' existing authorizations, but organizations running hybrid on-prem workloads — which is still a significant portion of defense-adjacent contractors — are carrying the full burden themselves.
The "Govern" Function Is Harder Than It Looks
Compliance teams that we spoke with consistently flagged the Govern function as the piece most likely to generate audit findings in the next 18 months. It's not that the requirements are technically arcane — they're not. It's that they require documentation and accountability structures that historically lived outside the security team's remit.
"The Govern function essentially asks organizations to prove that security decisions are made deliberately, by the right people, with documented rationale. That's a governance question, not a technical one. Most security teams are well-equipped to configure a firewall. They're not always equipped to produce a board-level risk appetite statement that maps to specific control selections."
— Dr. Priya Mehta, Senior Research Fellow, Carnegie Mellon University's CyLab
Dr. Mehta has been studying organizational compliance implementation gaps since 2019. Her current research focuses on the delta between documented policy and operational control effectiveness — what the field calls "compliance theater" — and her preliminary 2026 data suggests that organizations with fewer than 500 employees show a 73% rate of incomplete Govern-function documentation despite having otherwise mature technical controls.
The implication is uncomfortable: a company can have excellent endpoint detection, solid patch management, and well-configured SIEM tooling, and still fail a CSF 2.0 assessment because it can't produce a documented cybersecurity strategy that the board has formally reviewed. The framework is demanding organizational maturity, not just technical capability.
Where the Major Vendors Actually Stand
Microsoft and Google have both updated their compliance documentation packages to reflect CSF 2.0 and FedRAMP Rev 5. Microsoft's Purview Compliance Manager received an update in April 2026 that added CSF 2.0 assessment templates, including Govern-function control mappings tied to Microsoft Entra ID configurations and Defender for Cloud policy sets. It's genuinely useful if your environment is Microsoft-heavy. Less useful if you're running heterogeneous infrastructure.
Google's Chronicle SIEM platform added automated evidence collection workflows in Q2 2026 specifically targeting FedRAMP Rev 5's continuous monitoring requirements — a direct response to the shift away from point-in-time assessments. AWS, for its part, updated its AWS Artifact documentation portal but hasn't yet released a native CSF 2.0 assessment template as of our reporting deadline.
| Framework | Key Change (2024–2026) | Primary Audience Impact | Transition Deadline |
|---|---|---|---|
| NIST CSF 2.0 | New "Govern" function; expanded supply chain scope | Federal contractors, critical infrastructure operators | Q1 2026 (de facto for new contracts) |
| ISO/IEC 27001:2022 | Annex A restructured to 93 controls; 11 new additions including cloud and secure coding | Globally certified organizations; software vendors | October 31, 2025 (certification bodies stopped issuing 2013 certs) |
| FedRAMP Revision 5 | Aligned to NIST SP 800-53 Rev 5; automated continuous monitoring mandated | Cloud service providers seeking federal authorization | March 2026 (new authorizations only) |
| CMMC 2.0 (DoD) | Collapsed from 5 levels to 3; Level 2 now requires C3PAO third-party assessment | Defense Industrial Base contractors | Phased enforcement through December 2026 |
The Critics Have a Point About Audit Overhead
Not everyone is sold on the direction these frameworks are heading. There's a growing contingent of security practitioners — particularly at smaller vendors and independent consultancies — who argue that the compliance machinery has become self-referential: organizations are spending more time proving they're secure than actually being secure.
James Okafor, principal consultant at Trail of Bits and a longtime contributor to IETF working groups, put it bluntly when we asked him about the FedRAMP Rev 5 continuous monitoring requirements. "Automated evidence collection is theoretically great. In practice, a lot of organizations end up optimizing their environments to generate clean artifacts rather than to catch real threats. You get beautiful compliance dashboards and you miss a lateral movement event that a human analyst would have flagged." Okafor's concern maps to a documented phenomenon in audit theory: Goodhart's Law, where a measure becomes a target and ceases to be a good measure.
The ISO 27001:2022 transition also drew criticism for its timeline. Certification bodies stopped issuing certificates under the 2013 standard in October 2025, giving organizations roughly three years to transition — which sounds reasonable until you account for the fact that CMMC 2.0 enforcement, FedRAMP Rev 5, and CSF 2.0 all landed in roughly the same window. Rachel Tong, director of GRC engineering at Palantir, described the period as "a compliance triathlon where someone moved the transition zones." Her team managed it, she told us, but smaller partners in Palantir's supply chain did not all fare as well.
Supply Chain Controls Are the Sleeper Issue
If the Govern function is the structural headline, supply chain security is the sleeper issue that's going to generate the most findings over the next two years. Both CSF 2.0 and ISO 27001:2022 significantly expanded their treatment of third-party and supplier risk. CSF 2.0's GV.SC subcategory (Govern: Supply Chain Risk Management) now requires organizations to assess and document cybersecurity practices of suppliers whose compromise could affect the organization — a requirement that maps directly to the lessons of the SolarWinds incident in 2020 and, more recently, the MOVEit vulnerability cascade (tracked under CVE-2023-34362 and related CVEs) that affected hundreds of downstream organizations.
This is where the historical parallel is most instructive. The shift is reminiscent of what happened to the automotive industry in the 1980s when Japanese manufacturers — Toyota especially — demonstrated that quality control couldn't stop at the factory floor. It had to extend backward through the entire supplier network. American automakers that treated supplier quality as someone else's problem paid for it in recalls and market share. The security industry is now reckoning with the same structural lesson, just 40 years later and with considerably higher stakes for data exposure.
The practical difficulty is that most organizations don't have the resources to conduct full security assessments on every third-party vendor. The emerging approach — endorsed by CISA's 2026 guidance on supply chain risk — is tiered supplier classification: identify which suppliers have access to what data or systems, and apply assessment intensity proportional to the potential blast radius of their compromise. It's a risk-based shortcut, but it's one the frameworks themselves increasingly support.
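The tiered approach reduces to a scoring function over access and blast radius. Tier names, thresholds, and inputs in this sketch are illustrative; CISA's guidance describes the principle, not this exact scheme:

```python
# Sketch of tiered supplier classification: assessment intensity is
# proportional to the blast radius of a supplier's compromise.
# Tier labels and thresholds are illustrative assumptions.
def supplier_tier(has_system_access: bool,
                  has_data_access: bool,
                  downstream_customers: int) -> str:
    if has_system_access or (has_data_access and downstream_customers > 100):
        return "Tier 1: full security assessment"
    if has_data_access:
        return "Tier 2: questionnaire + evidence review"
    return "Tier 3: contractual attestation only"

print(supplier_tier(True, True, 10))     # Tier 1: managed service provider
print(supplier_tier(False, True, 5))     # Tier 2: data processor
print(supplier_tier(False, False, 0))    # Tier 3: office supplies vendor
```

The value isn't the specific cut lines; it's that the classification is documented and defensible in an audit, which is what GV.SC actually asks for.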
What IT Teams and Security Engineers Need to Do Before December 2026
For IT professionals managing compliance programs right now, the immediate priorities aren't abstract. CMMC 2.0 Level 2 enforcement ramps to full application for new DoD contracts by December 2026, which means any organization in the Defense Industrial Base that hasn't engaged a Certified Third-Party Assessment Organization (C3PAO) is already behind. The C3PAO backlog is real — we heard from multiple organizations that wait times for assessment scheduling are running 14 to 20 weeks.
- Complete a gap analysis against CSF 2.0's Govern function controls, specifically GV.OC (Organizational Context), GV.RM (Risk Management Strategy), and GV.SC (Supply Chain Risk Management) — these are the three subcategories most likely to generate findings in 2026–2027 audits.
- If your ISO 27001 certificate was issued under the 2013 standard after October 2022, verify with your certification body whether a transition audit has been scheduled. Some organizations received 2013 certificates as late as mid-2023 and haven't yet been contacted about mandatory transition assessments.
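A gap analysis against those three Govern subcategories is, at its core, an evidence-inventory check. A toy sketch, where the evidence dictionary stands in for whatever artifact tracking your GRC tooling provides; the subcategory IDs follow CSF 2.0:

```python
# Toy gap analysis for the three Govern subcategories flagged above.
# The evidence inventory is a placeholder for real GRC artifact data.
REQUIRED = {
    "GV.OC": "Organizational Context",
    "GV.RM": "Risk Management Strategy",
    "GV.SC": "Supply Chain Risk Management",
}

def govern_gaps(evidence: dict) -> list:
    """Return subcategories with no documented evidence."""
    return [f"{sc} ({name})" for sc, name in REQUIRED.items()
            if not evidence.get(sc)]

# Example: a board-approved risk strategy exists, nothing for the rest.
inventory = {"GV.RM": ["board-approved risk appetite statement, 2026-03"]}
print(govern_gaps(inventory))
# ['GV.OC (Organizational Context)', 'GV.SC (Supply Chain Risk Management)']
```

A real assessment obviously weighs evidence quality, not just presence, but making the check mechanical is what turns a framework update from a fire drill into a routine diff.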
The deeper question for security leadership is whether compliance program investment is keeping pace with the pace of framework change. Dr. Mehta's research suggests that organizations are spending, on average, 19% more on compliance tooling in 2026 compared to 2024 — but that spending isn't translating proportionally into improved audit outcomes, because tooling without process redesign just produces more artifacts, not better security posture.
The frameworks are going to keep moving. NIST has already signaled that CSF 2.0 will incorporate AI system risk considerations — likely drawing from the NIST AI RMF 1.0 released in January 2023 — in a planned 2.1 revision currently in early draft review. Whether that addition arrives as a new function, an expanded profile category, or a crosswalk document is still an open question. But organizations that built their compliance programs around static, point-in-time frameworks are going to find themselves doing this triathlon again. The ones that built operational processes capable of absorbing incremental change will have the advantage — and right now, that group is smaller than anyone in the industry wants to admit.