Table of Contents
- The Uncomfortable Part Nobody Leads With
- What Actually Breaks When Quantum Computers Arrive
- Two Algorithms, Two Very Different Problems
- How Far Away Is the Actual Threat?
- The NIST Process: Eight Years, Four Winners
- Why Waiting Is the Wrong Mental Model
- Government Deadlines That Already Exist
- If You Write Code or Run Servers, Read This Section
- If You Are Not a Developer, Read This Section
- Where This Leaves Us
The Uncomfortable Part Nobody Leads With
Most articles about quantum cryptography spend the first three paragraphs reassuring you. “Quantum computers aren’t there yet.” “RSA is still safe today.” “Don’t panic.” All of that is technically true and also kind of misses the point.
The uncomfortable reality is that the relevant window for action is not when quantum computers become powerful enough to break encryption. It is right now, while the data being generated today is still being intercepted and stored by adversaries who will eventually have that compute power. Those are two completely different timelines and most people conflate them.
This article is going to spend more time on that second timeline than most, because I think it is where the actual urgency lives. But first, a grounding in what quantum computers actually threaten and why the mathematics underneath your daily internet use has a known expiration date.
What Actually Breaks When Quantum Computers Arrive
Your connection to this website right now is protected by something called public-key cryptography. The way it works at a high level: two parties can establish a shared secret over a public channel without ever directly sending that secret. They do this by exploiting a mathematical trapdoor, a problem that is easy to compute in one direction and practically impossible to reverse.
For RSA, that trapdoor is integer factorization. Multiplying two large prime numbers together takes a fraction of a second. Given only the result and asked to find the original primes, a classical computer would take longer than the current age of the universe on numbers large enough for modern key sizes. That asymmetry is the entire foundation of the security.
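The asymmetry is easy to see even at toy scale. The sketch below multiplies two small primes (the easy direction) and then recovers them by brute-force trial division (the hard direction, whose work grows exponentially in the bit length). Real RSA moduli are 2048+ bits; these primes are tiny so the demo actually finishes:

```python
# Toy illustration of the RSA trapdoor asymmetry. NOT real key sizes:
# a 2048-bit modulus would make the "hard" direction take longer than
# the age of the universe on classical hardware.

def trial_division_factor(n: int) -> tuple[int, int]:
    """Recover p and q from n = p * q by brute force.
    Work grows roughly with sqrt(n), i.e. exponentially in the bit length."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no odd factor found")

p, q = 1_000_003, 1_000_033           # two small primes
n = p * q                             # easy direction: one multiplication

recovered = trial_division_factor(n)  # hard direction: ~sqrt(n) steps
print(n)                              # 1000036000099
print(recovered)                      # (1000003, 1000033)
```

Doubling the bit length of the primes roughly squares the work in the hard direction while barely affecting the easy one. Shor's algorithm removes exactly that gap.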
Quantum computers do not make this problem slightly easier. Shor’s algorithm, which runs on a quantum computer, makes it exponentially easier. The same factoring problem that would take a classical computer millions of years becomes tractable in hours. That is not an incremental improvement in computing power. It is a categorical change that invalidates the entire trapdoor.
RSA is the obvious target but it is not the only one. Diffie-Hellman key exchange, which underlies a large fraction of TLS connections, and elliptic curve cryptography, the more efficient modern variant used in newer systems, are both vulnerable to Shor's algorithm for similar reasons. They rely on different trapdoor problems, discrete logarithms rather than factoring, but Shor's algorithm solves those just as efficiently. The three most widely deployed public-key systems in the world all fail against the same attack.
Symmetric encryption like AES is a different story. It does not use trapdoor mathematics in the same way. Grover’s algorithm gives quantum computers a speedup against brute-force searches on symmetric keys, but only a quadratic one. The defense is straightforward: use AES-256 instead of AES-128, which doubles the effective key length and restores the security margin. That is a configuration change, not a fundamental rethinking of the algorithm.
Two Algorithms, Two Very Different Problems
It is worth being precise about Shor versus Grover because the responses to them are completely different in scale and urgency.
Shor’s algorithm breaks public-key cryptography outright. There is no “just use a bigger key” fix for RSA or ECDH against Shor, because Shor’s running time grows only polynomially with key length: doubling the key size buys a modest slowdown, not the exponential hardness the security argument depends on. The only real defense is to stop using algorithms based on integer factorization and discrete logarithms and switch to something with a different mathematical foundation. That is a migration, not a patch.
Grover’s algorithm is genuinely less scary despite getting lumped in with Shor in most coverage. Yes, it speeds up brute-force attacks. Yes, it effectively halves the bit security of symmetric keys. AES-128 drops to roughly 64-bit effective security. But here is the thing: 64-bit security against a quantum computer running Grover’s algorithm is still practically infeasible with the hardware that will realistically exist for the foreseeable future. And switching to AES-256 restores your margin entirely. This one is solved with a config change that most security-conscious systems have already made for unrelated reasons.
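The arithmetic behind "halves the bit security" is just this: a search over 2^n keys takes on the order of 2^(n/2) quantum steps under Grover, so the effective security is n/2 bits. A few lines make the numbers concrete:

```python
# Grover's quadratic speedup: searching 2^n keys takes ~2^(n/2) quantum
# steps, so the effective security of an n-bit symmetric key drops to n/2.

def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 256):
    eff = grover_effective_bits(key_bits)
    print(f"AES-{key_bits}: ~{eff}-bit effective security, "
          f"~2^{eff} = {2**eff:.2e} quantum search steps")
```

This is why AES-256 is the fix: 128-bit effective security under Grover is comfortably out of reach, whereas 64-bit is merely very hard.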
The Shor problem is the one requiring years of industry-wide infrastructure work. The Grover problem is a configuration ticket. I keep seeing articles treat them as equally urgent and they are not.
How Far Away Is the Actual Threat?
No quantum computer today can run Shor’s algorithm on a key size that matters. Breaking RSA-2048 would require somewhere between 4,000 and 10,000 stable, error-corrected logical qubits, depending on whose error correction assumptions you use. The best machines publicly announced as of early 2026 have hundreds to low thousands of physical qubits, and physical qubits are not the same thing as logical qubits. Each logical qubit requires many physical ones to implement error correction, with estimates ranging from about 1,000 to 10,000 physical qubits per logical qubit depending on how noisy the underlying hardware is.
So we are not close in the “this will happen next year” sense. The Global Risk Institute’s annual quantum threat timeline puts the probability of a cryptographically relevant machine at around 5 to 10 percent by 2030, rising to somewhere in the 50 percent range by 2035. Most professional estimates cluster in the early 2030s as the danger window, with the caveat that classified government programs, both domestic and foreign, operate outside what is publicly known.
That caveat matters more than it might seem. China, the US, and several European countries have national quantum programs running with classified budgets. The public state of the art is not necessarily the actual state of the art. The NSA has historically assumed adversary capabilities run several years ahead of what is publicly announced in sensitive areas. Whether you weight that as “classified programs are probably close” or “classified programs are probably still subject to the same physics constraints” depends on your priors, but it is a reason not to treat 2035 as a comfortable deadline.
The NIST Process: Eight Years, Four Winners
In 2016, NIST announced a public competition to find cryptographic algorithms that are secure against quantum computers. Submissions came in from research teams across the world, and over the following eight years each round of candidates went through public cryptanalysis, meaning anyone could try to break them and publish the results. Several candidates were eliminated during this process when researchers found flaws, which is exactly what a rigorous standardization process is supposed to do.
In August 2024, NIST published the first three finalized standards from this process. ML-KEM under FIPS 203 is the key encapsulation mechanism, the post-quantum replacement for the key exchange step that currently uses RSA or Diffie-Hellman. ML-DSA under FIPS 204 is the digital signature standard, replacing RSA and ECDSA for signature applications. SLH-DSA under FIPS 205 is a hash-based signature scheme included as a mathematically diverse backup: it uses completely different underlying mathematics from ML-DSA, so a breakthrough against lattice-based cryptography would not affect it.
In 2025, NIST selected HQC as a fourth standard, specifically as a backup key encapsulation mechanism to complement ML-KEM. HQC is code-based rather than lattice-based. The reasoning is the same as including SLH-DSA: if someone finds a fundamental weakness in lattice problems, you do not want your entire post-quantum infrastructure to collapse at once.
The practical state of deployment as of March 2026: Chrome and Firefox are already running ML-KEM in hybrid mode for TLS, meaning connections to supporting servers use both X25519 (classical) and ML-KEM (post-quantum) simultaneously. Cloudflare has had hybrid post-quantum TLS in production since 2023. Signal shipped its post-quantum protocol upgrade in September 2023. Apple added post-quantum protection to iMessage in iOS 17.4 in March 2024. This is not a future roadmap item. For the communication tools most people use daily, deployment is already happening or already done.
The hybrid approach is worth understanding because it comes up everywhere in deployment discussions. Running X25519 and ML-KEM together means the connection is only broken if both are broken. A classical attacker who somehow broke ML-KEM would still face X25519. A future quantum attacker who breaks X25519 would still face ML-KEM. Hybrid therefore protects against both a quantum adversary and the scenario where ML-KEM turns out to have an unexpected weakness. You get post-quantum security without betting everything on algorithms that have only been public for a few years.
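The combine step can be sketched in a few lines. This is illustrative only, not the actual TLS 1.3 key schedule, which uses HKDF and a precisely specified concatenation order; the two secrets here are random stand-ins for what a real handshake would negotiate:

```python
# Sketch of the hybrid idea: derive the session secret from BOTH the
# classical and the post-quantum shared secrets, so an attacker must
# recover both inputs to learn the output. Illustrative, not TLS 1.3.
import hashlib
import os

ss_x25519 = os.urandom(32)   # stand-in for the classical ECDH shared secret
ss_mlkem = os.urandom(32)    # stand-in for the ML-KEM encapsulated secret

# Concatenate-then-hash: knowing only one input reveals nothing about
# the output, which is exactly the "harder of the two to break" property.
session_secret = hashlib.sha256(ss_x25519 + ss_mlkem).digest()
print(session_secret.hex())
```

The design choice worth noting is that the combiner is a one-way function of both inputs, so there is no "weakest link": compromising either input alone leaves the session secret fully unpredictable.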
Why Waiting Is the Wrong Mental Model
The standard mental model for security threats goes: threat appears, industry responds, problem is solved or mitigated over time. Ransomware, phishing, SQL injection, all follow roughly this pattern. The threat is current, the defense is current, the race is between attackers and defenders happening right now.
Post-quantum is structurally different in a way that breaks this model. The threat is not current in terms of active exploitation. But the data being generated right now is a target for future exploitation in a way that cannot be patched retroactively. Once an adversary has a copy of your encrypted traffic from 2024, no future security upgrade protects that copy. The only defense is encrypting it with post-quantum algorithms before it leaves your system in the first place. This is the attack usually called harvest now, decrypt later.
The NSA’s 2022 advisory stated directly that it assumes sophisticated adversaries are already collecting encrypted data at scale for future decryption. That is not a hypothetical. State-level actors with long intelligence timelines and the storage capacity to hold large volumes of intercepted traffic have had economic incentive to do this for years. The data most at risk is whatever needs to stay confidential for a decade or more: classified government communications, long-term corporate strategy, medical and legal records, any data protected by long-lived key pairs that are not frequently rotated.
For most regular internet users, the personal risk from this specific attack is lower. Your Twitter login from 2024 is not interesting enough to store and decrypt in 2033. But if you work with data that matters over long time horizons, or you build systems that handle it, the harvest now, decrypt later problem is not a future concern. It is a present one.
Government Deadlines That Already Exist
The US government issued NSM-10 in May 2022 directing federal agencies to begin migrating to post-quantum cryptography. This was not a suggestion or a study request. It was a directive with follow-on requirements from CISA and NSA for specific timelines. The rough structure for federal systems: cryptographic inventories completed and hybrid deployments in new systems between 2024 and 2027, deprecation of classical-only algorithms by 2030, full disallowance in classified systems by 2035.
The UK’s NCSC published detailed migration timelines in 2024 with sector-specific guidance. ENISA, the EU’s cybersecurity body, published its own roadmap. Every major Western government agency responsible for cybersecurity has produced public guidance saying the same thing in slightly different language: start the inventory and migration now, finish the critical systems before 2030.
Private industry is following with less formal deadlines but real movement. AWS, Azure, and Google Cloud all support post-quantum TLS and are actively migrating internal systems. Payment networks are running pilots. Certificate authorities are preparing post-quantum certificate issuance. OpenSSL 3.5, released in early 2025, includes production-grade ML-KEM and ML-DSA support. The tooling exists. The question for any given organization is whether it has done the work to figure out where it actually uses classical public-key cryptography and in what priority order those systems need updating.
That inventory is genuinely the hard part. Large organizations can have thousands of systems using RSA in ways that are not visible from a single audit. Hardcoded in application logic, embedded in hardware security modules, baked into VPN appliances with firmware update cycles measured in years, present in code signing infrastructure that was set up a decade ago and never touched since. Finding all of it takes longer than most organizations expect when they first estimate the project.
If You Write Code or Run Servers, Read This Section
The most impactful thing you can do for any new system you are building right now is stop using RSA and classical Diffie-Hellman for key exchange by default. OpenSSL 3.5 supports X25519 plus ML-KEM hybrid key exchange. Setting that as your TLS default costs you one configuration change and provides post-quantum protection on all future traffic from that system. There is no good reason to build new infrastructure on algorithms with a known expiration date when the replacement is available and already in production at major providers.
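Concretely, that configuration change looks something like the following, assuming OpenSSL 3.5 or later. `X25519MLKEM768` is the hybrid group name registered for TLS; verify the exact name against your OpenSSL version's documentation before deploying:

```shell
# Check that a server negotiates the hybrid group (OpenSSL 3.5+ assumed):
openssl s_client -connect example.com:443 -groups X25519MLKEM768

# Or prefer it system-wide via openssl.cnf, so every application that
# uses the system OpenSSL picks it up by default:
#   [openssl_init]
#   ssl_conf = ssl_sect
#   [ssl_sect]
#   system_default = system_default_sect
#   [system_default_sect]
#   Groups = X25519MLKEM768:X25519
```

Listing `X25519` after the hybrid group keeps connections working with peers that do not yet support ML-KEM, which is the standard rollout pattern.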
For SSH, the OpenSSH project shipped hybrid post-quantum key exchange in version 9.0 back in 2022, pairing Streamlined NTRU Prime with X25519, and made it the default key exchange for new connections. More recent releases added an ML-KEM hybrid as well. Check what version your servers are running and whether your SSH configs have been updated to prefer hybrid or post-quantum key exchange algorithms. Most default configs have not changed since the servers were originally provisioned.
For existing systems, the priority order should roughly be: anything using long-lived key pairs first, anything storing encrypted data that needs to remain confidential for years second, anything handling high-volume user traffic third. A certificate authority with a 10-year root certificate is a much higher-priority migration target than a web server doing routine HTTPS with short-lived certs, even though both use RSA.
One thing that is easy to overlook: JWT tokens and API authentication often use RSA signing under the hood, sometimes in ways that are abstracted behind a library that makes it easy to forget the underlying algorithm. RS256 in JWT is RSA. ES256 is ECDSA. Both are vulnerable. If you are issuing long-lived API tokens signed with RS256, that is worth putting on the migration list.
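A quick way to audit this is to decode a token's header and check the alg field, which takes only the standard library. The token below is a hypothetical example constructed in place (header plus an empty payload, signature truncated), not a real credential:

```python
# Check a JWT for quantum-vulnerable signing algorithms using only the
# standard library. The token constructed below is a hypothetical example.
import base64
import json

def jwt_alg(token: str) -> str:
    """Decode the JWT header (first dot-separated segment), return its alg."""
    header_b64 = token.split(".")[0]
    padded = header_b64 + "=" * (-len(header_b64) % 4)  # restore b64 padding
    return json.loads(base64.urlsafe_b64decode(padded))["alg"]

# RSA (RS*/PS*) and ECDSA (ES*) families both fall to Shor's algorithm.
QUANTUM_VULNERABLE = {"RS256", "RS384", "RS512", "PS256", "PS384", "PS512",
                      "ES256", "ES384", "ES512"}

header = base64.urlsafe_b64encode(
    json.dumps({"alg": "RS256", "typ": "JWT"}).encode()).rstrip(b"=").decode()
token = header + ".e30.sig"          # hypothetical token, signature omitted

alg = jwt_alg(token)
print(alg, alg in QUANTUM_VULNERABLE)   # RS256 True
```

Running this across the tokens your services actually issue is a cheap first pass at the migration list: anything long-lived and signed with an algorithm in that set belongs on it.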
If You Are Not a Developer, Read This Section
The honest answer for regular users is that a lot of the most important post-quantum migration has already been done for you by the apps and services you use, assuming you keep them updated. Signal shipped PQXDH in September 2023. Your new Signal conversations since then are post-quantum protected. Apple added post-quantum protection to iMessage in March 2024 via the PQ3 protocol. Chrome and Firefox are running hybrid post-quantum TLS. If you are on current versions of these tools, a significant portion of your daily communication is already covered.
Where you have more direct control: local storage encryption. If you store sensitive files locally, use AES-256. VeraCrypt, BitLocker on Windows, and FileVault on macOS all support AES-256. That handles the Grover problem for local data. For passwords, most major password managers now use Argon2id for key derivation, which is the current recommendation. If yours still uses PBKDF2 with SHA-1, that is worth checking in the settings or the vendor’s security documentation.
The pq.cloudflare.com site will tell you whether your browser’s current connection is using post-quantum TLS, which is a useful real-world check rather than a theoretical one. On a current version of Chrome or Firefox talking to Cloudflare, you will likely see ML-KEM in use. That is not a guarantee that every site you visit uses it, but it shows the infrastructure is there and working.
Two-factor authentication is worth mentioning here even though it is not directly a quantum issue. TOTP codes, hardware keys, and passkeys are all based on symmetric cryptography or public-key systems with different threat profiles. FIDO2 hardware keys like YubiKey use elliptic curve cryptography for the attestation, which is a quantum-vulnerable algorithm, but the practical attack surface is very different from TLS interception. This is a longer-term concern than the harvest now problem for most users.
Where This Leaves Us
The 2030s quantum threat is real enough that governments have issued formal migration mandates, a major standards body spent eight years developing replacements, and the largest tech companies have already shipped post-quantum protocols in production. That level of institutional response does not happen for threats that are considered marginal or speculative.
At the same time, I want to be honest about uncertainty in the other direction too. The 2030s timeline is an estimate with real variance. Quantum computing has a long history of timelines slipping. Error correction is hard in ways that are not fully understood. It is possible the threat is further away than current estimates suggest, and it is possible classified programs are closer. Nobody outside a handful of national security agencies actually knows.
What is not uncertain: the NIST standards are finalized, the tools to implement them are production-ready, and the harvest now attack is a present concern for anyone protecting long-lived sensitive data. Those three things are true regardless of exactly when a cryptographically relevant quantum computer arrives.
For most people reading this, the practical upshot is: use Signal and iMessage for sensitive conversations, keep your browser updated, check that your password manager uses Argon2, and use AES-256 for local storage. That covers your personal exposure without requiring you to understand lattice cryptography. If you build systems or manage infrastructure, the action items are more involved, but the tools exist and the migration path is clear.
What does your current setup look like? Specifically curious whether anyone is already running hybrid TLS in production or has done a cryptographic inventory at their organization. Drop a comment with where you are in the process.
References (March 2026):
NIST FIPS 203, 204, 205 (August 2024): csrc.nist.gov/projects/post-quantum-cryptography
NIST HQC selection (March 2025): nist.gov
NSM-10, National Security Memorandum on Quantum Computing (May 2022): whitehouse.gov
UK NCSC PQC Migration Timelines (2024): ncsc.gov.uk
Signal PQXDH announcement (September 2023): signal.org/blog/pqxdh
Apple PQ3 protocol for iMessage, iOS 17.4 (March 2024): security.apple.com/blog/imessage-pq3
OpenSSL 3.5 release notes and ML-KEM support: openssl.org
Global Risk Institute Quantum Threat Timeline 2024: globalriskinstitute.org
The tools to protect your data from a quantum future already exist.
Whether you use them before the deadline is the only open question.