Google’s announcement yesterday sets a definitive expiration date for the encryption standards that currently protect the global economy. By 2029, the company predicts that quantum hardware will reach the scale necessary to break RSA and Elliptic Curve Cryptography, turning our current security protocols into legacy vulnerabilities. This is no longer a theoretical concern for research labs; it is a three-year implementation deadline for every infrastructure engineer and security architect.
The physics behind the 2029 timeline
The shift in Google’s stance stems from recent breakthroughs in quantum error correction and qubit scaling at their Quantum AI lab. For years, the industry operated under the assumption that a cryptographically relevant quantum computer (CRQC) was decades away. That consensus changed when researchers demonstrated the ability to suppress logical errors at a rate that allows for reliable execution of Shor’s algorithm. This algorithm can factor large integers and solve discrete logarithms in polynomial time, which effectively bypasses the mathematical difficulty that makes RSA and ECC secure.
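To see why period finding is the crux, here is a toy classical sketch of the reduction Shor's algorithm exploits: factoring an RSA modulus reduces to finding the multiplicative order of a random base. The order-finding loop below runs in exponential time classically; the quantum circuit replaces exactly this step with a polynomial-time one. The numbers are toy-sized for illustration only.

```python
import math

def find_order(a: int, n: int) -> int:
    """Find the multiplicative order r of a mod n (the step a
    quantum computer performs in polynomial time; this brute-force
    loop is the classically intractable part)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    """Reduce factoring n to order finding. Assumes a is coprime to n
    and happens to yield an even order with a nontrivial square root
    of 1 mod n; Shor's algorithm retries with a new a otherwise."""
    r = find_order(a, n)
    assert r % 2 == 0, "odd order: retry with a different base a"
    y = pow(a, r // 2, n)
    return math.gcd(y - 1, n), math.gcd(y + 1, n)

# Toy modulus n = 15, base a = 7: the order is 4, 7**2 mod 15 = 4,
# and gcd(3, 15), gcd(5, 15) recover the prime factors.
print(shor_factor(15, 7))  # (3, 5)
```

Against a real 2048-bit modulus the order-finding loop is hopeless for any classical machine, which is precisely the hardness assumption a CRQC removes.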
The jump from noisy intermediate-scale quantum (NISQ) devices to error-corrected machines is happening faster than previous benchmarks suggested. Google’s data indicates that the hardware path to a machine capable of breaking a 2048-bit RSA key is now visible within a 36-month window. When a company with Google's specific hardware roadmap issues a warning this direct, the industry must treat the 2029 date as a hard cutoff for asymmetric cryptographic integrity.
Addressing the "harvest now, decrypt later" risk
The most immediate threat is not the sudden collapse of live connections in 2029. It is the practice of "harvest now, decrypt later" (HNDL). Foreign intelligence services and well-funded criminal groups are currently intercepting and storing vast amounts of encrypted traffic from financial institutions, government agencies, and healthcare providers. They do not need to break the encryption today. They only need to hold the data until 2029, when a quantum computer can retrospectively expose every secret stored in those archives.
This means that any data with a shelf life longer than three years is already at risk. If you are protecting a trade secret or a patient record that must remain confidential until 2030, current RSA-based TLS 1.2 or 1.3 connections are insufficient. The migration to post-quantum cryptography (PQC) should have started yesterday for long-lived data. Every day of delay extends the window of vulnerability for data that is being vacuumed into storage arrays right now.
The NIST standards and hybrid key exchange
The National Institute of Standards and Technology (NIST) finalized the primary standards for PQC in 2024, providing the industry with a vetted set of algorithms. The most prominent are ML-KEM (formerly Kyber) for key encapsulation and ML-DSA (formerly Dilithium) for digital signatures. Google has already integrated ML-KEM-768 into Chrome and its internal RPC traffic. This algorithm relies on the hardness of the Module Learning With Errors (MLWE) problem over lattices, which remains resistant to both classical and quantum attacks.
Implementation is currently favoring a hybrid approach. Rather than relying solely on a new PQC algorithm, engineers are combining a classical key exchange like X25519 with a post-quantum exchange like ML-KEM. This creates a nested layer of security. If a flaw is discovered in the new lattice-based math, the classical ECC layer still provides a baseline of protection. If the quantum threat manifests as predicted, the PQC layer handles the heavy lifting. This dual-track strategy, deployed in TLS as the X25519MLKEM768 hybrid group, is the direction BoringSSL and recent OpenSSL releases have taken.
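The combining step can be sketched in a few lines: both shared secrets are concatenated and fed through a key derivation function, so an attacker must break both X25519 and ML-KEM to recover the session key. This is a minimal illustration using a hand-rolled HKDF over Python's standard library; the labels and structure are illustrative, not the actual TLS 1.3 key schedule.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    """HKDF-Expand (RFC 5869) with SHA-256."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def hybrid_session_key(ecdh_secret: bytes, mlkem_secret: bytes) -> bytes:
    """Derive one session key from both shared secrets. Either secret
    alone leaves the KDF input unpredictable, so compromising only
    one of the two exchanges reveals nothing about the key."""
    prk = hkdf_extract(b"hybrid-handshake", ecdh_secret + mlkem_secret)
    return hkdf_expand(prk, b"session key", 32)

# Illustrative fixed secrets; real ones come from the two live exchanges.
key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
print(len(key))  # 32
```

Note the design choice: concatenation into one KDF, rather than XOR of two independent keys, ensures the derivation binds both exchanges together even if one secret later turns out to be attacker-controlled.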
Navigating the complexity of crypto agility
The transition to PQC is significantly more complex than the previous move from SHA-1 to SHA-256. Post-quantum keys and signatures are substantially larger than their classical counterparts. For example, an ML-KEM-768 public key is 1,184 bytes, whereas an X25519 key is only 32 bytes. This size increase can cause packet fragmentation in UDP-based protocols like QUIC and overflow buffer limits in older load balancers and middleboxes.
Engineers need to audit their infrastructure for "crypto agility," which is the ability to swap algorithms without re-architecting the entire system. This involves identifying hardcoded assumptions about key sizes and signature lengths in custom applications. Many legacy systems will fail when presented with a 2,420-byte ML-DSA signature. Testing these edge cases in staging environments is necessary before 2027, as the overhead of these larger payloads can also impact latency-sensitive services and handshake timings in high-traffic environments.
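The failure mode is easy to reproduce. The snippet below models a hypothetical legacy parser with a hardcoded signature buffer, the exact kind of assumption a crypto-agility audit should hunt for: it accepts every classical signature it has ever seen, then silently rejects the first ML-DSA signature it meets.

```python
# Hypothetical hardcoded limit in an old parser, sized generously
# for DER-encoded ECDSA and RSA signatures but not for ML-DSA.
LEGACY_SIG_BUFFER = 512

def accepts_signature(sig: bytes) -> bool:
    """A legacy check that baked in the assumption that signatures
    stay small. It never failed under classical algorithms."""
    return len(sig) <= LEGACY_SIG_BUFFER

ecdsa_sig = b"\x00" * 72     # typical DER-encoded ECDSA P-256 size
mldsa_sig = b"\x00" * 2420   # ML-DSA-44 signature size
print(accepts_signature(ecdsa_sig), accepts_signature(mldsa_sig))  # True False
```

Grepping for fixed-size buffers, schema field limits, and column widths near anything named "sig", "key", or "cert" is a cheap first pass at finding these before 2027 deadlines force the issue.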
Immediate actions for infrastructure engineers
The first step is a comprehensive inventory of where asymmetric cryptography is used across your stack. This includes not only public-facing TLS certificates but also internal service-to-service communication, code signing, and encrypted backups. Most organizations find that their internal Public Key Infrastructure (PKI) is the hardest part to migrate because it often relies on legacy certificate authorities that do not yet support the new NIST OIDs (Object Identifiers).
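One way to make that inventory actionable is to score each endpoint by combining two facts: whether its key exchange is quantum-vulnerable, and how long its data must stay confidential. The sketch below is a hypothetical prioritization model, not a real tool; the endpoint names and scoring rule are illustrative. The logic follows the HNDL argument above: classical-only endpoints carrying long-lived data migrate first.

```python
from dataclasses import dataclass

# Asymmetric primitives that Shor's algorithm breaks.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "X25519"}

@dataclass
class Endpoint:
    name: str
    key_exchange: str
    shelf_life_years: int  # how long the data must remain confidential

def migration_priority(e: Endpoint) -> int:
    """Higher score = migrate sooner. Classical-only endpoints whose
    data must outlive 2029 are the harvest-now-decrypt-later targets;
    hybrid endpoints already have a quantum-resistant layer."""
    if e.key_exchange in QUANTUM_VULNERABLE:
        return e.shelf_life_years
    return 0

fleet = [
    Endpoint("patient-records-api", "RSA-2048", 10),
    Endpoint("metrics-ingest", "X25519", 1),
    Endpoint("edge-tls", "X25519MLKEM768", 10),  # already hybrid
]
for e in sorted(fleet, key=migration_priority, reverse=True):
    print(e.name, migration_priority(e))
```

Even a crude model like this surfaces the right order: the patient-records API outranks everything else, while the already-hybrid edge scores zero.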
Start by enabling hybrid post-quantum key exchange in your edge nodes. Cloud providers like AWS and Google Cloud already offer PQC options for their key management services (KMS) and load balancers. For internal services, update your BoringSSL or OpenSSL libraries to versions that support ML-KEM. The goal for the remainder of 2026 is to ensure that all data in transit is protected by at least one layer of quantum-resistant math. This mitigates the harvest now, decrypt later threat while you work on the more difficult task of updating long-term identity keys and root certificates.
The 2029 deadline is a functional reality for anyone managing sensitive data. The transition will take years of testing and deployment. If your organization waits until 2028 to start the migration, you will be attempting to rotate your entire cryptographic identity under the pressure of a looming systemic collapse. The math is ready, the standards are set, and the hardware is scaling.
Are your dependency trees and rotation policies prepared for a total replacement of the asymmetric stack?