NSA and IETF: Can an Attacker Purchase Standardization of Weakened Cryptography?
Key topics
The NSA is allegedly trying to weaken cryptography standards by pushing for IETF standardization of non-hybrid (PQ-only) post-quantum key exchange in TLS, sparking controversy and debate among cryptography experts and HN commenters.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 2h after posting
Peak period: 31 comments in 6-12h
Avg / period: 8.9 comments
Based on 80 loaded comments
Key moments
- Story posted: Oct 4, 2025 at 6:16 PM EDT (3 months ago)
- First comment: Oct 4, 2025 at 7:57 PM EDT (2h after posting)
- Peak activity: 31 comments in 6-12h (the hottest window of the conversation)
- Latest activity: Oct 7, 2025 at 3:57 AM EDT (3 months ago)
It failed to raise my confidence at all.
> The IESG has concluded that there were no process failures by the SEC ADs. The IESG declines to directly address the complaint on the TLS WG document adoption matter. Instead, the appellant should refile their complaint with the SEC ADs in a manner which conforms to specified process.
This complaint? https://cr.yp.to/2025/20250812-non-hybrid.pdf
Engineering concerns start in section 2 and continue through section 4.
It seems you haven't read it.
Ah, yes, procedural complaints such as "The draft creates security risks." and "There are no principles supporting the adoption decision.", and "The draft increases software complexity."
I don't know what complaint you're reading, but you're working awful hard to ignore the engineering concerns presented in the one I've read and linked to.
This is the retort of every bureaucracy which fails to do the right thing, and signals to observers that procedure is being used to overrule engineering best practices. FYI.
I'm thankful for the work djb has put into these complaints, as well as his attempts to work through process, successful or not, as otherwise I wouldn't be aware of these dangerous developments.
Excuses of any kind ring hollow in the presence of historical context around NSA and encryption standardization, and the engineering realities.
https://blog.cr.yp.to/20220805-nsa.html
I'm actually quite surprised that anyone is advocating the non-hybrid PQ key exchange for real applications. If it isn't some sort of gimmick to allow NSA to break these, it's sure showing a huge amount of confidence in relatively recently developed mechanisms.
It feels kind of like saying "oh, now that we can detect viruses in sewage, hospitals should stop bothering to report possible epidemic outbreaks, because that's redundant with the sewage monitoring capability". (Except worse, because it involves some people who may secretly be pursuing goals that are the opposite of everyone else's.)
Edit: DJB said in that 2022 post
https://en.wikipedia.org/wiki/Lattice-based_cryptography#His...
2005 (LWE), 2012 (LWE for key exchange), earlier (1990s for lattice math in general), 2017 (Kyber submission), later (competition modifications to Kyber)?
I can see where one could see the mathematics as moderately mature (comparable in age to ECC, but maybe less intensively studied?). As above, I don't know quite how to think about whether the "thing" here is properly "lattices", "LWE", "LWE-KEX", "Kyber", or "the parameters and instantiation of Kyber from the NIST PQ competition". Depending where we focus our attention there, I suppose this gives us some timeframe from the 1980s (published studies of computational complexity of lattice-related algorithms) to "August 2024" (adoptions of NIST PQ FIPS documents).
Edit: The other contextual thing that freaks out DJB, for those who might not be familiar, is that one of the proposed standards NIST was considering, SIKE, made it all the way through to the final (fourth) round of consideration, whereupon it was completely broken by a couple of researchers bringing to bear mathematical insight. Now SIKE had a very different architecture than the other proposals in the fourth round, so it seems like a portion of the debate is whether the undetected mathematical problems in SIKE are symptomatic of "the NIST competition came extraordinarily close to approving something that was totally broken, so maybe it wasn't actually that great at evaluating candidate algorithms, or at least maybe the mathematics community's understanding of post-quantum key exchange algorithms is still immature" or more symptomatic of "SIKE had such a weird and distinctive architecture that it was hard to understand or analyze, or hard to motivate relevant experts to understand or analyze it, unlike other candidate algorithms that were and are much better understood". It seems like DJB is saying the former and you're saying the latter.
Why is that so surprising? Adopting new cryptography by running it in a hybrid mode with the cryptography it's replacing is generally not standard practice and multi-algorithm schemes are pretty niche at best (TrueCrypt/VeraCrypt are the only non-PQ cases that come to mind, although I'm sure there are others). Now you could certainly argue that PQ algorithms are untested and risky in a way that was not true of any other new algorithm and thus a hybrid scheme makes the most sense, but it's not such an obviously correct argument that anyone arguing otherwise must be either stupid or malicious.
The cool thing is the dramatic security improvements against certain unknown unknowns for approximately linear additional work and space. Seems like a pretty great advantage for the defender, although seriously arguing that quantitatively requires some way to reason about the unknown unknowns (the reductio ad absurdum being that we would need to use every relevant primitive ever published in every protocol¹).
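As a rough illustration of that "approximately linear additional work" claim, here is a minimal, hedged sketch of a hybrid key-exchange combiner in the spirit of X25519MLKEM768: both agreements run, and the session key is derived from both shared secrets, so an attacker must break both halves. It assumes the pyca/cryptography package for the X25519 half; `mlkem_encapsulate` is a placeholder for an ML-KEM library call, not a real API, and the exact ordering and KDF details differ between protocol drafts.

```python
# Hedged sketch of a hybrid key-exchange combiner (not any specific draft's exact KDF).
from hashlib import sha256
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)

def mlkem_encapsulate(server_kem_public: bytes) -> tuple[bytes, bytes]:
    """Stand-in for ML-KEM-768 encapsulation (returns ciphertext, shared secret).
    Real implementations come from liboqs, BoringSSL, and similar libraries."""
    raise NotImplementedError

def hybrid_client_share(server_ecdh_public: X25519PublicKey,
                        server_kem_public: bytes) -> tuple[bytes, bytes, bytes]:
    # Classical half: ordinary X25519 Diffie-Hellman.
    eph = X25519PrivateKey.generate()
    ecdh_secret = eph.exchange(server_ecdh_public)

    # Post-quantum half: KEM encapsulation against the server's ML-KEM key.
    kem_ciphertext, kem_secret = mlkem_encapsulate(server_kem_public)

    # Combiner: recovering the session key requires BOTH input secrets.
    session_key = sha256(kem_secret + ecdh_secret).digest()
    return eph.public_key().public_bytes_raw(), kem_ciphertext, session_key
```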
I see PQC as somehow very discontinuous with existing cryptography, both in terms of the attacks it tries to mitigate and the methods it uses to resist them. This might be wrong. Maybe it's fair to consider it an evolutionary advance in cryptographic primitive design.
The casual argument from ignorance is that lattices are apparently either somewhat harder to understand, or just less-studied overall, than other structures that public-key primitives have been built on, to the extent that we would probably currently not use them at all in practical cryptography if it weren't for the distinctive requirements of resistance to quantum algorithms. I understand that this isn't quantitative or even particularly qualitative (for instance, I don't have any idea of what about lattices is actually harder to understand).
Essentially, in this view, we're being forced into using weird esoteric stuff much earlier than we'd like because it offers some hope of defending against other weird esoteric stuff. Perhaps this is reinforced by, for example, another LWE submission having been called "NewHope", connoting to me that LWE was thought even by many of its advocates to offer urgently-needed "hope", but maybe not "confidence".
I'd like not to have to have that argument only in terms of vibes (and DJB does have some more concrete arguments that the security of SIKE was radically overestimated, while the security of LWE methods was moderately overestimated, so we need to figure out how to model how much of the problem was identified by the competition process and how much may remain to be discovered). I guess I just need to learn more math!
¹ I think I remember someone at CCC saying with respect to the general risk of cryptographic backdoors that we should use hybrids of mechanisms that were created by geopolitical rivals, either to increase the chance that at least one party did honest engineering, or to decrease the chance that any party knows a flaw in the overall system! This is so bizarre and annoying as a pure matter of math or engineering, but it's not like DJB is just imagining the idea that spy agencies sometimes want to sabotage cryptography, or have budgets and staff dedicated to doing so.
* Targets with sufficient technical understanding would use hybrids anyway.
* Average users and unsophisticated targets can already be monitored through PRISM, which makes cryptography moot.
So...what's their actual end game here?
Are you implying that djb blew the matter out of proportion?
The reason this is a poor-quality analogy is that ECDSA and Ed25519 are fundamentally similar enough that people had a high degree of confidence there was no fundamental weakness in Ed25519, so it's fine. For PQC, by contrast, the newer algorithms are meaningfully mathematically distinct, and the fact that SIKE turned out to be broken is evidence that we may not have enough experience and tooling to be confident that any of them are sufficiently secure on their own, so a protocol using PQC should use a hybrid algorithm alongside something we have more confidence in. And the counter to that is that SIKE was meaningfully different in terms of what it is and does, cryptographers apparently have much more confidence in the security of Kyber, and hybrid algorithms are going to be more complicated to implement correctly, have worse performance, and so on.
And the short answer seems to be that a lot of experts, including several I know well and would absolutely attest are not under the control of the NSA, seem to feel that the security benefits of a hybrid approach don't justify the drawbacks. This is a decision where entirely reasonable people could disagree, and there are people other than djb who do disagree with it. But only djb has engaged in a campaign of insinuating that the NSA has been controlling the process with the goal of undermining security.
The problem with this statement, to me, is that we know at least one of the four finalists in the post-quantum cryptography competition is broken, so it's very hard to assign a high probability that the rest of the algorithms will stay secure through another decade of advancement (this is not helped by the fact that, since the beginning of the contest, the lattice-based methods have lost a significant number of bits of security as better attacks have been discovered).
1. Adopt hybrid/dual encryption. This is safe against a break of the PQC layer, which seems entirely plausible given that the algorithms are young, the implementations are younger, and there has been significant weakening of the algorithms in the past decade.
2. Adopt PQC without a backup layer. This approach is ~5% faster (PQC algorithms are pretty slow), with the cost of breaking encryption for everyone on the internet if any flaw in the PQC algorithms or implementations is found.
The NSA starts by requiring some insecure protocols be supported, and then when support is widespread they start requiring it be made a default by requiring compliance testing be done with default config.
From this privileged network position, if both sides support weaker crypto that NSA lobbied for, they can MitM the initial connection and omit the hybrid methods from the client's TLS ClientHello, and then client/server proceed to negotiate into a cipher that NSA prefers.
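A toy sketch of the negotiation pressure being described (this is not real TLS; TLS 1.3 also binds the handshake transcript, which complicates silent tampering on authenticated connections): once both endpoints keep a weaker, non-hybrid group enabled, anyone who can filter the client's offer can steer the handshake toward it.

```python
# Toy negotiation sketch, not real TLS: pick the first mutually supported
# key-exchange group from the client's offered list, in client preference order.
HYBRID = "X25519MLKEM768"   # hybrid (classical + PQ) group
PQ_ONLY = "MLKEM768"        # non-hybrid PQ-only group, the option under debate
CLASSICAL = "X25519"

def negotiate(client_offer: list[str], server_supported: set[str]) -> str:
    for group in client_offer:
        if group in server_supported:
            return group
    raise ValueError("no common key-exchange group")

client_offer = [HYBRID, PQ_ONLY, CLASSICAL]
server_supported = {HYBRID, PQ_ONLY, CLASSICAL}

print(negotiate(client_offer, server_supported))    # -> X25519MLKEM768 (hybrid)

# An on-path attacker who can rewrite the offer simply deletes the hybrid entry;
# because both sides still support the weaker group, negotiation quietly succeeds.
tampered = [g for g in client_offer if g != HYBRID]
print(negotiate(tampered, server_supported))         # -> MLKEM768 (non-hybrid)
```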
Intelligence is a numbers game; they never get everything, but if your net is wide enough and you don't give up, you'll catch a lot of fish over time.
Seems dumb not to have like 10.
Yes, and at the same time all of modern crypto is incredibly cheap and can be added as desired to almost every application without any visible extra cost.
So the answer to the GP is not the trivial one. The actual answer is about software complexity making errors more likely, and similar encryption schemes not really adding any resiliency.
The point is to trust no one and no thing that we cannot examine freely, closely, and transparently. And to maintain healthy skepticism of any entity that claims to have a virtuous process to do its business.
Your lived experience tells you to trust the NSA, at least as it relates to NIST standards.
?
Dual EC wasn't a shockingly clever, CS-boundary-pushing hack (and NSA has apparently deployed at least one of those in the last 20 years). It was an RNG (not a key agreement protocol) based on asymmetric public key cryptography, a system where you could look at it and just ask "where's the private key?" There wasn't a ton of academic research trying to pick apart flaws in Dual EC because why would there be? Who would ever use it?
(It turns out: a big chunk of the industry, which all ran on ultra-closed source code and was much less cryptographically literate than most people thought. I was loudly wrong about this at the time!)
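For readers who haven't seen the "where's the private key?" argument spelled out, here is a compressed sketch of the well-known Shumow-Ferguson observation about Dual EC (simplified; the real output drops the top 16 bits, so the observer pays a small brute force):

```latex
% Dual EC DRBG, simplified: fixed curve points P and Q, secret internal state s_i.
\begin{align*}
  s_{i+1} &= x(s_i P) && \text{(state update)}\\
  r_i     &= x(s_i Q) && \text{(output; the real spec drops the top 16 bits)}
\end{align*}
% If the constants were generated with a trapdoor d satisfying P = dQ, then an
% observer of r_i can lift it back to the point R = s_i Q (a small brute force
% over the truncated bits) and compute
\begin{equation*}
  dR = s_i(dQ) = s_i P, \qquad x(dR) = s_{i+1},
\end{equation*}
% recovering the next state and hence every future output. That d is the
% "private key" nobody could point to in the published constants.
```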
MLKEM is a standard realization of CRYSTALS-Kyber, an algorithm submitted to the NIST PQ contest by a team of some of the biggest names in academic PQ cryptography, including Peter Schwabe, a prior collaborator of Bernstein. Nobody is looking at MLKEM and wondering "huh, where's the private key?".
MLKEM is based on cryptographic ideas that go back to the 1990s, and were intensively studied in the 2000s. It's not oddball weird cryptography. It is to the lineage of lattice cryptography roughly what Ed25519 was to elliptic curve cryptography at the time of Ed25519's adoption.
Utterly unlike SIKE, which isn't a lattice algorithm at all, but rather a supersingular isogeny algorithm, a cryptographic primitive based on an entirely new problem class, and an extremely abstruse one at that. The field had been studying lattice cryptography intensively for decades by the time MLKEM came to pass. That's not remotely true of isogeny cryptography. Isogenies were taken seriously not because of confidence in the hardness of isogenies, but because of ergonomics: they were a drop-in replacement for Diffie Hellman in a way MLKEM isn't.
These are all things Bernstein is counting on you not knowing when you read this piece.
I'd use a hybrid if I was designing a system; I am deeply suspicious of all cryptography, and while I don't think Kyber is going to collapse, I wouldn't bet against 10-15 years of periodic new implementation bugs nobody knew to look for.
But I'm cynical about cryptography. It's really clear why people would want a non-hybrid code point.
Let me just say this once as clearly as I can: I sort of don't give a shit about any of this. A pox on all their houses. I think official cryptographic standards are a force for evil. More good is going to be done for the world by systems that implement well enough to become de facto standards. More WireGuards, fewer RFCs. Certainly, I can't possibly give even a millifuck about what NIST wants.
But I also can't be chill about these blog posts Bernstein writes where it's super clear his audience is not his colleagues in cryptography research, but rather a lay audience that just assumes anything he writes must be true and important. It's gross, because you can see the wires he's using to hold these arguments together (yes, even I can see them), and I don't like it when people insult their audiences this way.
It does though. It's just been engineered as an integral part of the unibody. And there are crumple zones, airbags, seat belts, ABS, emergency braking systems, collision sensors, and more layered defenses in addition.
No sane engineer would argue that removing these layers of defense would make the car safer.
Which is why many engineers wear the ring.
Folks do love to argue though.
To me it really isn't. TLS has no need for it. But let's focus on the context of US government organisations that want this for the FIPS maturity level they're aiming for. Why would these organisations want a weaker algorithm for TLS than what is standardised? More importantly, how does it benefit deployment, except to save a tiny bit of computation and eliminate some ECC code? I'm not going to jump the shark and say it is nefarious, but I will throw in my 2 cents and say it doesn't help security and is unnecessary.
Unless NSA pays you $10 million, as they did to RSA, to make said obviously bumbling attempt the default in their security products.
https://en.wikipedia.org/wiki/Dual_EC_DRBG#Timeline_of_Dual_...
https://www.reuters.com/article/us-usa-security-rsa-idUSBRE9...
Or unless the presence of such less secure options in compliant implementations enables a https://en.wikipedia.org/wiki/Downgrade_attack
Currently the best attacks on NTRU, Kyber, etc. are essentially the same generic attacks that work for something like Frodo, which is built on unstructured lattices. And while the resistance of unstructured lattices to attack is pretty well studied at this point, it is not unreasonable to suspect that the algebraic structure in the more efficient lattice schemes can lead to more efficient attacks. How efficient? Who knows.
You find it offensive now to compare ML-KEM and SIKE because SIKE was so thoroughly broken and demonstrated to be worse than pre-quantum crypto. But ML-KEM may already be broken this thoroughly by NSA and friends, and they’re keeping it secret because shipping bad crypto to billions of people enables SIGINT. The idea that your professional crypto acquaintances might be on the NSA’s payroll clearly disturbs you enough that you dismiss it out of hand.
Bernstein is proposing more transparency because that is what was promised after the Dual-EC debacle. Do you disagree with Bernstein because he advocates for transparency (which could prevent bad crypto shipping), or because of his rhetorical style?
You’ve admitted you were “loudly wrong” when you announced Dual-EC couldn’t be an NSA cryptography backdoor. Snowden let us all know the NSA spends $250 million every year secretly convincing/bribing the private sector to use bad cryptography. Despite that history, you are still convinced there’s no way ML-KEM is an NSA cryptographic backdoor and that all the bizarre procedural errors in the PQ crypto contest are mere coincidences.
[checks my text messages] Lucy just texted me, Thomas. She’s outside waiting for you to kick her football.
You saw a similar thing in Bernstein's earlier railing against the NIST contest (which he participated in), happily whipping up a crowd of people who believed Tancrede Lepoint or Chris Peikert or Peter Schwabe might have been corrupted by NSA, because nobody in that crowd has any idea who those three researchers are.
It's really gross.
“Apache chunked encoding is not exploitable” -- Dowd, 2002
What I think you're not seeing is that this isn't a SIKE vs. Lattice kind of debate; it's a Curve25519 vs. P-256 kind of debate. P-256 was never broken. Curve25519 made smart engineering decisions that for years foreclosed on some things that were common in-the-real-world implementation pitfalls. P-256 has closed that gap now, but for the whole run of the experience they were both sane choices.
That's a generous interpretation. Another parallel would be Rijndael vs. Serpent, where the Serpent advocates were all "I don't know about this Rijndael stuff, seems dicey". Turned out: Rijndael was great.
But Bernstein wants you to think that rather than a curve-selection type debate, this is more akin to a "discrete log vs. knapsack" debate. It isn't.
I wonder what your strategy here is. Muddying the waters and depicting Bernstein as a renegade? You have made too many big-state and big-money apologist posts for that to work.
And now, in a world where QR + pre-QR algos are typically being introduced in a layered fashion, they're saying "let's add another option, to reduce the number of options", which at least looks very suspicious.
Practical quantum computers are probably not very close, but you can certainly use the fear of them as a chance to introduce a new back-door. If you did, you'd have to behave exactly as the NSA is doing right now.
Dual EC isn't the only comparison he's making. He's also making a comparison to DES, which had an obvious weakness: its 56-bit key limitation, similar to the obvious weakness of non-hybrid. In neither case is there a secret backdoor. At the time of DES, the NSA publicly said they used it, to make others confident in it. Similarly, the NSA is saying "we do not anticipate supporting hybrid in NSS", which will make people confident in non-hybrid. But in the background, the NSA actually uses something more secure (using 2 layers of encryption themselves).
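A toy, message-level illustration of what "2 layers of encryption" means in the simplest sense: two independently keyed AEADs stacked, so a break of either cipher (or loss of either key) alone does not expose the plaintext. This assumes the pyca/cryptography package; real layered deployments stack whole protocols and independent implementations rather than a single message.

```python
# Toy cascade sketch: two independent AEAD layers with independent keys.
from os import urandom
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

def cascade_encrypt(plaintext: bytes):
    k1, k2 = AESGCM.generate_key(256), ChaCha20Poly1305.generate_key()
    n1, n2 = urandom(12), urandom(12)
    inner = AESGCM(k1).encrypt(n1, plaintext, None)         # layer 1
    outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)   # layer 2
    return k1, k2, n1, n2, outer

def cascade_decrypt(k1, k2, n1, n2, outer: bytes) -> bytes:
    inner = ChaCha20Poly1305(k2).decrypt(n2, outer, None)   # peel layer 2
    return AESGCM(k1).decrypt(n1, inner, None)              # peel layer 1

k1, k2, n1, n2, ct = cascade_encrypt(b"layered defense example")
assert cascade_decrypt(k1, k2, n1, n2, ct) == b"layered defense example"
```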
>Surveillance agency NSA and its partner GCHQ are trying to have standards-development organizations endorse weakening ECC+PQ down to just PQ.
The NSA spends about half of its resources attempting to hack the FBI and erase its evidence against them in the matter of keeping my wife and me from communicating. The other half of the staff are busy commenting online about how unfair this is, and attempting to get justice.
There are no NSA resources left for actions like the one I quoted. I don't think NSA is involved in it.
They are not running out of resources.
https://mailarchive.ietf.org/arch/msg/tls/RK1HQB7Y-WFBxQaAve...
Trust the process!
Maybe a stunnel for CurveCP, or something like PQConnect.
History has shown djb is usually right.
He has been far more productive at writing software and developing cryptography that has avoided security vulnerabilities than any of the IETF WG members. The best part about his software, IMHO, is that it is small, has low resource requirements, and primarily serves ordinary individual computer users, as opposed to the large, complex, steep-learning-curve software that primarily serves corporations like the ones that publish RFCs and send people to IETF meetings.
Anyone reading this comment is probably using djb's cryptography in TLS. His contributions to today's internet are substantial.
It really says a lot about "IETF" and other Silicon Valley pseudo-governance that a talented and trustworthy author, who has remained an academic when so many have sold out, gets treated like a nuisance.
I wonder who else could reasonably host a standardization process? Maybe the Linux Foundation? All the cryptography talent seems to be working on ZK proofs at the moment in the Ethereum ecosystem; I think if Vitalik organized a contest like NIST people would pay attention.
The most important thing is to incentivize attackers to break the cryptography on dummy examples instead of in the wild. Ideally: before the algorithm is standardized. The Ethereum folks are well set up to offer bounties for this. If a cryptographer can make FU money through responsible disclosure, then there is less incentive to sell the exploit to dishonest parties.
This implies that what is actually being offered is Security Through Ignorance.
Is this encryption sound? Maybe, who knows! Let's wait and find out!
29 more comments available on Hacker News