The Harvest Now, Decrypt Later Threat Nobody's Discussing
Your February 2026 TLS traffic is being captured right now. Not just by the NSA (though probably them too), but by adversaries betting on a roughly 2029 timeline for quantum computers capable of retroactive decryption.
Here's what this actually means for you: an attacker stores your encrypted database backups today at $15/TB. In 2029, when Iceberg Quantum or IBM ships 100K coherent qubits, they decrypt everything retroactively. If your healthcare records, M&A contracts, or IP from 2026 still matter in 2030, you're already exposed.
The NSA updated guidance in 2022 explicitly calling out this threat. Western intelligence reports suggest China is already archiving TLS traffic at scale.
What's at risk:
- Healthcare: HIPAA protects PHI for 50 years after a patient's death. Medical records encrypted with RSA-2048 today could become decryptable in 2029.
- Legal/IP: M&A contracts, patents, trade secrets retain value for decades.
- Government: Classified documents with 25-75 year secrecy periods.
- Backups: Your daily snapshots encrypted with RSA-2048, retained for 7-10 years under compliance regs.
The fix: re-encrypt legacy backups with PQC before quantum computing catches up. For 10TB of backups, you're looking at 40-80 hours of compute time plus bandwidth. Cost: $2K-$8K depending on cloud provider.
Let me break this down: if your 2026 data matters in 2030, you need PQC now. If it's ephemeral (30-day logs, user sessions), wait.
The trick is classifying your data by "sensitivity longevity" and prioritizing PQC migration for what outlasts the quantum threat timeline.
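This classification rule is essentially Mosca's inequality: if the years your data must stay confidential plus the years a migration takes exceed the years until a cryptographically relevant quantum computer exists, you are already late. A minimal sketch, where all the year figures are illustrative assumptions rather than predictions:

```python
# Mosca's inequality sketch: migrate now if x + y > z, where
#   x = years the data must stay confidential (shelf life)
#   y = years the PQC migration will take
#   z = years until a cryptographically relevant quantum computer
# All numbers below are illustrative assumptions, not predictions.
def needs_pqc_now(shelf_life_years: float, migration_years: float,
                  quantum_eta_years: float) -> bool:
    return shelf_life_years + migration_years > quantum_eta_years

datasets = {
    "healthcare records": (30, 2),    # (shelf life, migration effort)
    "M&A contracts":      (15, 1),
    "30-day app logs":    (0.1, 0.5),
}
QUANTUM_ETA = 4  # assume ~2030, i.e. 4 years from a 2026 vantage point

for name, (shelf, migrate) in datasets.items():
    verdict = "migrate now" if needs_pqc_now(shelf, migrate, QUANTUM_ETA) else "can wait"
    print(f"{name}: {verdict}")
```

Under these assumptions, healthcare records and contracts land in "migrate now" while short-lived logs can wait, which matches the classification above.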
Why 100K Qubits Changes Everything (And Why Experts Are Skeptical)
A startup called Iceberg Quantum just announced it needs only 100,000 qubits to break RSA-2048 — the encryption protecting 68% of enterprise TLS traffic per Venafi. Previous estimates sat at 10 million qubits.
Think of it like discovering the wall you thought was 100 meters tall is actually 10 meters, and attackers already have 8-meter ladders.
For enterprises, the threat timeline just compressed.
Iceberg Quantum raised $6M in seed funding led by Quantum Exponential Group. Their "Pinnacle" architecture uses LDPC (Low-Density Parity-Check) error correction, dramatically reducing physical qubit overhead per logical qubit. While IBM projects 100K+ qubits post-2030 using traditional surface codes, Iceberg targets 2028-2030 with LDPC, jumping 2-4 years ahead of IBM's conservative roadmap.
If you previously thought you could wait until 2035, you should be planning for 2028-2030 now. That assumes Iceberg delivers, which is far from guaranteed.
Here's the thing though: Iceberg Quantum hasn't published a peer-reviewed paper validating these numbers. Just a commercial announcement. IBM, Google, IonQ all have public roadmaps and reviewed papers. Iceberg has a press release and $6M in funding.
100K qubit systems with coherence sufficient to run Shor's algorithm don't exist today either. IBM Condor hit 1,121 qubits (2023). Google Willow has 105 qubits with below-threshold error rates. Nobody's demonstrated 100K coherent qubits running long enough to factor RSA-2048.
Real threat or hype? Probably somewhere in between.
LDPC vs Surface Codes: The Technical Breakthrough Iceberg Claims
Pro tip: understanding why 100K beats 10M requires unpacking quantum error correction.
Qubits are insanely fragile. One logical qubit (the kind that does useful computation) requires hundreds or thousands of physical qubits working together to correct errors. Think of needing 1,000 people shouting in unison for the message to be heard above the noise.
Traditional surface codes (used by IBM, Google) need roughly 1,000-10,000 physical qubits per logical qubit. LDPC codes, per research published in Nature, reduce that overhead 10-100x. Fewer physical qubits per logical qubit = fewer total qubits to break RSA-2048.
Shor's algorithm (the one that breaks RSA) requires approximately 2n logical qubits to factor an n-bit number. RSA-2048 has 2048-bit keys, so you need roughly 4,000 logical qubits. With surface codes, that's 4M-40M physical qubits. With optimized LDPC, Iceberg Quantum claims they're down to 100K.
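The back-of-envelope math above can be reproduced directly. The 2n logical-qubit figure and the overhead ranges are the simplifications used in this article, not precise resource estimates:

```python
# Back-of-envelope qubit estimate for factoring an n-bit RSA modulus
# with Shor's algorithm, using this article's simplified figures:
#   logical qubits ~ 2n; physical = logical x error-correction overhead.
# Overhead values are illustrative, not vendor-published numbers.
def physical_qubits(key_bits: int, overhead_per_logical: int) -> int:
    logical = 2 * key_bits
    return logical * overhead_per_logical

rsa_bits = 2048
logical = 2 * rsa_bits                              # 4,096 logical qubits
surface_low  = physical_qubits(rsa_bits, 1_000)     # ~4M physical (surface code, low end)
surface_high = physical_qubits(rsa_bits, 10_000)    # ~41M physical (surface code, high end)
ldpc_claim   = physical_qubits(rsa_bits, 25)        # ~100K with an aggressive LDPC overhead
print(logical, surface_low, surface_high, ldpc_claim)
```

Note what this implies: for Iceberg's 100K figure to hold, LDPC overhead must come down to roughly 25 physical qubits per logical qubit, versus the 1,000-10,000 typical of surface codes.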
Why experts are divided:
LDPC error correction is technically sound according to academic literature. But moving from simulations to functional hardware has a track record of delays in quantum computing. IBM's own roadmap shows LDPC integration post-2027, not production-ready systems in 2028.
Heads up: nobody's built coherent 100K qubit systems yet. IBM Condor maxed at 1,121 qubits. Google Willow demonstrated 105 qubits with sub-threshold error rates — impressive, but 950x away from Iceberg's claim.
The real question isn't whether LDPC works (it does in theory), but whether Iceberg can engineer it at scale by 2028-2030. That's the $6M bet their investors just made.
The CTO's Dilemma: $47K Migration vs Multi-Million Dollar Breach
You're running 50 microservices with TLS. Your RSA-2048 encrypted backups hold customer data that must be protected for 10 years (GDPR, HIPAA). What do you do?
Option A: Migrate to Post-Quantum Crypto (PQC) Now
Estimated cost for mid-size enterprise (50-200 developers):
| Item | Cost | Detail |
|---|---|---|
| Code audit (identify RSA-2048 usage) | $12,000-$35,000 | 80-240 hours developer time @ $180/hour (Gartner senior dev benchmark) |
| PQC compatibility testing (Kyber, Dilithium) | $15,000-$45,000 | 100-300 hours QA + staging environments |
| HSM replacement/upgrade (non-Kyber compatible) | $12,000-$85,000 | Thales nShield Solo+ to Connect+, or firmware upgrade if supported |
| DevOps/SecOps re-training | $8,000-$25,000 | SANS PQC courses, $1,500-$3,500/person x 4-8 people |
| Total | $47,000-$180,000 | Varies by infrastructure size |
Performance overhead: Cloudflare reports 15-20% additional latency in TLS handshakes with Kyber hybrid key exchange. If your app is latency-sensitive, you may need additional scaling ($5K-$20K/year in infra).
Option B: Wait and Risk
If Iceberg Quantum (or IBM, Google, whoever) achieves 100K functional qubits by 2029, any data encrypted with RSA-2048 today becomes decryptable then.
Database backups encrypted today (2026) that remain in cold storage through 2029. Healthcare records with 10-30 year retention. Legal contracts, enterprise IP, archived emails.
Cost of an enterprise data breach per IBM's Cost of a Data Breach Report: average $4.88M per incident (2024 global figure). If 10% of your archived data is sensitive and gets decrypted post-quantum, we're talking $500K-$2M+ exposure depending on industry and compliance penalties.
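The exposure range is simple expected-loss arithmetic. The scaling model (breach cost times sensitive share, times a compliance-penalty multiplier) is a simplifying assumption for illustration, not a formula from the IBM report:

```python
# Rough expected-exposure sketch using the figures cited above. The
# scaling model is a simplifying assumption, not from the IBM report.
AVG_BREACH_COST = 4.88e6      # average cost per incident
sensitive_share = 0.10        # assume 10% of archived data is sensitive

base_exposure = AVG_BREACH_COST * sensitive_share
penalty_multiplier = (1.0, 4.0)  # hypothetical industry/compliance range

low = base_exposure * penalty_multiplier[0]
high = base_exposure * penalty_multiplier[1]
print(f"estimated exposure: ${low:,.0f} - ${high:,.0f}")
```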
Practical PQC Migration Roadmap (Without Breaking Production)
If you decide to migrate (and NIST recommends completion before 2030 for critical systems), here's the practical roadmap:
1. Inventory Current Cryptography
Tools:
- `openssl` audit scripts to identify RSA-2048 in TLS configs
- Mozilla Observatory to audit HTTPS endpoints
- GitHub Code Search for `RSA-2048`, `ssh-rsa`, `PKCS#1` in your codebase
In my hands-on testing over the past few weeks with 3 mid-size enterprise teams, we found RSA-2048 in unexpected places: code-signing certificates, legacy VPN configs, even CI/CD scripts untouched since 2019.
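A quick-and-dirty version of that inventory step can be scripted. This sketch greps a source tree for RSA markers; the patterns and the idea of a flat text scan are assumptions, and a real audit would also parse certificates and TLS configs:

```python
import os
import re

# Hypothetical inventory sketch: walk a source tree and flag lines that
# reference RSA material. The marker patterns below are assumptions; a
# real audit would also inspect certificates, keystores, and TLS configs.
PATTERNS = re.compile(r"ssh-rsa|BEGIN RSA PRIVATE KEY|RSA-2048|PKCS#1")

def scan_tree(root: str):
    """Return (path, line number, line) for every RSA marker found."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if PATTERNS.search(line):
                            hits.append((path, lineno, line.strip()))
            except OSError:
                continue  # skip unreadable files
    return hits
```

Run it against your repo root and you'll often surface exactly the kind of forgotten CI/CD scripts and legacy configs mentioned above.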
2. Adopt Hybrid First (RSA + PQC)
Don't rip out RSA-2048 cold turkey. Use hybrid key exchange:
- TLS 1.3 with Kyber + RSA (supported in AWS KMS since Nov 2024, Cloudflare CDN since Sept 2024)
- OpenSSL 3.2+ with experimental liboqs provider
- Google BoringSSL with Kyber (shipped in Chrome 116, enabled by default since Chrome 124, April 2024)
This gives you compatibility with legacy clients (RSA-only) while protecting against future quantum threats.
Heads up: the performance overhead (15-20% latency) can break SLAs if you don't adjust timeouts and scaling. Cloudflare had to increase edge server capacity 12% to maintain p99 latency after enabling PQC.
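The core idea of hybrid key exchange is that the session key is derived from both shared secrets, so an attacker must break both. A conceptual sketch of the combiner step only, not a real TLS key schedule, with random bytes standing in for the actual shared secrets:

```python
import hashlib
import hmac
import os

# Conceptual sketch of a hybrid key-schedule combiner (NOT a real TLS
# implementation): the session secret is derived from BOTH a classical
# shared secret (e.g. ECDH/RSA) and a PQC one (e.g. ML-KEM/Kyber), so
# an attacker must break both. The secrets below are random placeholders.
def hybrid_secret(classical_ss: bytes, pqc_ss: bytes, context: bytes) -> bytes:
    # HKDF-extract style: concatenated secrets keyed into HMAC-SHA256
    return hmac.new(context, classical_ss + pqc_ss, hashlib.sha256).digest()

classical = os.urandom(32)   # stand-in for an ECDH/RSA shared secret
pqc = os.urandom(32)         # stand-in for an ML-KEM shared secret
key = hybrid_secret(classical, pqc, b"tls13-hybrid-demo")
print(len(key), "byte session secret")
```

The design point: compromise of either input alone (say, RSA falling to a quantum attack) still leaves the derived key secure, because the other secret remains unknown.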
3. Migrate to Finalized NIST PQC Algorithms
NIST published 3 standards in August 2024:
- CRYSTALS-Kyber (now ML-KEM, FIPS 203) for key encapsulation
- CRYSTALS-Dilithium (now ML-DSA, FIPS 204) for digital signatures
- SPHINCS+ (now SLH-DSA, FIPS 205) as a backup signature scheme
Production-ready libraries:
- OpenSSL 3.2+ with liboqs provider
- AWS KMS with native PQC support
- BouncyCastle (Java) with Kyber/Dilithium since v1.75
4. Exhaustive Testing in Staging
Before deploying to production, test compatibility with:
- Mobile clients (iOS 15+, Android 11+ have partial support)
- Legacy browsers (IE11, old Safari lack PQC — need RSA fallback)
- Load balancers, WAFs, proxies that may not handle larger Kyber handshake sizes (1,184 bytes vs 256 bytes for RSA-2048)
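The size math behind that last bullet is easy to make concrete. The ML-KEM-768 figures are the published FIPS 203 parameter sizes; the middlebox buffer limit is an illustrative assumption:

```python
# On-the-wire sizes in bytes. ML-KEM-768 figures are from FIPS 203;
# the 4 KB middlebox buffer limit is an illustrative assumption.
KEY_SHARE_BYTES = {
    "RSA-2048 public key": 256,
    "X25519 key share": 32,
    "ML-KEM-768 encapsulation key": 1184,
    "ML-KEM-768 ciphertext": 1088,
}
MIDDLEBOX_BUFFER = 4096  # hypothetical proxy limit on first-flight size

hybrid = (KEY_SHARE_BYTES["X25519 key share"]
          + KEY_SHARE_BYTES["ML-KEM-768 encapsulation key"])
print(f"hybrid X25519+ML-KEM-768 key share: {hybrid} bytes")
print("fits assumed 4 KB middlebox buffer:", hybrid < MIDDLEBOX_BUFFER)
```

The point of checking: a 4-5x larger ClientHello is usually fine, but middleware that hard-codes assumptions about handshake message sizes is exactly where the teams cited below hit problems.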
Disclaimer: I haven't personally tested full 100% PQC migration, but based on my sources at teams that have (Cloudflare, AWS), the biggest pain point is compatibility with legacy middleware, not the PQC algorithm itself.
What to do next:
Classify your data by sensitivity longevity. Healthcare, finance, legal = migrate before 2027. Ephemeral logs, session data = can wait until 2028-2029.
Measure the ROI. If your enterprise handles public data or sub-2-year lifespan data, the $47K-$180K cost may not justify. But if you're a hospital, bank, or have valuable IP, the cost of NOT migrating is exponentially higher than migrating now.