A cryptography research body held an election and they can't decrypt the results
Mood: controversial
Sentiment: negative
Category: news
Key topics: Cryptography, Election Security, Research
Discussion activity: moderate engagement
First comment: 1m after posting
Peak period: 8 comments in Hour 5
Average comments per period: 2.9
Based on 38 loaded comments
Key moments
- Story posted: Nov 22, 2025 at 10:47 PM EST (1d ago)
- First comment: Nov 22, 2025 at 10:48 PM EST (1m after posting)
- Peak activity: 8 comments in Hour 5, the hottest window of the conversation
- Latest activity: Nov 23, 2025 at 4:20 PM EST (10h ago)
How is someone losing their key a "technical problem"? Is it that hard to own up and put the actual reason in the summary? It's not like they have stockholders to placate.
we will adopt a 2-out-of-3 threshold mechanism for the management of private keys [1]
The trustee responsible has resigned so why weaken security going forward?
I would have thought cryptography experts losing keys would be pretty rare, like a fire at a Sea Parks.
Confidentiality that undermines availability might be good cryptography but it violates basic tenets of information security.
"Your Scientists Were So Preoccupied With Whether Or Not They Could, They Didn’t Stop To Think If They Should"
The human half of the problem is the loss of the key; the technical half of the problem is being unable to decrypt the election results.
> The trustee responsible has resigned so why weaken security going forward?
I don't think there's a scenario in which a 2-of-3 threshold is a significant risk to IACR.
I believe DNSSEC uses a 5-of-7 approach.
“Unfortunately, one of the three trustees has irretrievably lost their private key, an honest but unfortunate human mistake, and therefore cannot compute their decryption share. As a result, Helios is unable to complete the decryption process, and it is technically impossible for us to obtain or verify the final outcome of this election.”
⇒ that first paragraph is badly worded, but they’re not hiding facts.
I also think "3 out of 3" is not a good idea, as it allows any single key holder to prevent election outcomes that they don't like (something that may have happened here, too; I don't think cryptography experts often lose such keys by accident).
It's also important to factor in the case of "a key holder was hit by a bus, and now we can no longer access their private key".
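To make the bus-factor point concrete, here is a minimal sketch of 2-of-3 Shamir secret sharing over a prime field. This is purely illustrative: the prime, the toy secret, and the function names are my own assumptions, and real threshold decryption (as in Helios-style schemes) would share a decryption key and combine partial decryptions rather than ever reconstructing the raw key in one place.

```python
# Toy 2-of-3 Shamir secret sharing: any 2 of 3 shares recover the
# secret, so one trustee losing their share is harmless.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

def make_shares(secret, n=3, k=2):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(1234567890)
# Any two shares suffice; "hit by a bus" only destroys one of three.
assert reconstruct(shares[:2]) == 1234567890
assert reconstruct([shares[0], shares[2]]) == 1234567890
```

With a 3-of-3 scheme the polynomial has degree 2 and all three shares are needed, which is exactly the failure mode in this election.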
Even if this was an accident, isn't it theoretically possible for one of the trustees to intentionally withhold their key to trigger a re-election? There's no guarantee that people will vote the same way. I see this as a kind of vulnerability.
This is also not realistic, and Occam's razor applies strongly here: why sabotage your career and frankly embarrass yourself just to cause a tiny election delay, based on uncertain assumptions? This doesn't pass the sniff test.
In short, I think always assuming the worst in people is not healthy, and we should trust that this was indeed an honest, unfortunate mistake. This could happen to anyone.
I was merely expanding on the hypothetical case where bad politics overcame a theoretically sound selection process.
E.g. everyone provides a hash of their key first, and the actual key a few seconds later, once all the hashes have arrived. Someone claiming key loss is 'cheating' only if they claim to have lost the key during those few seconds.
In my opinion, it'd be more robust to have 4 mostly trustworthy people and a 3-of-4 secret share. That seems as good as 3 trusted people.
Because they’re an association of cryptographers. They’ve invented all these cool encrypted voting protocols that split trust among multiple people, so of course that’s what they’re going to use.
They probably designed this system to be used for government elections; how can they convince anyone to use it when they don't use it for their own elections?
- Availability is a security requirement. "Availability" of critical assets is just as important as "Confidentiality". While this seems like a truism, it is not uncommon to come across system designs, or even NSA/NIST specifications/points-of-view, that contradict this principle.
- Security is more than cryptography. Most secure systems fail or get compromised, not due to cryptanalytic attacks, but due to implementation and OPSEC issues.
Lastly, I am disappointed that IACR is publicly framing the root cause as an "unfortunate human mistake", thereby throwing a distinguished member of the community under the bus. This is a system design issue; no critical system should have a 3-of-3 quorum requirement. Devices die. Backups fail. People quit. People forget. People die. Anyone who has worked with computers or people knows that this is what they sometimes do.
IACR's system design should have accounted for this, and I wish IACR took accountability for the system design failure. I am glad that IACR is addressing this "human mistake" by making a "system design change" to a 2-of-3 quorum.
A small threshold reduces privacy, whereas a large threshold makes human error or deliberate sabotage attempts more likely. What is the optimum here? How do we evaluate the risks?
Considering that this is an election for a professional organization with thousands of members, I am going to go out on a limb and say that it should be easily possible to assemble a group of 5 people, trusted by the community/board, who wouldn't collude to break privacy. If I were in the room, I would have advocated for a 3-of-5 quorum.
But the lifecycle of the key is only a few months. That limits the availability risk a little bit, so I can be convinced to support a 2 of 3 quorum, if others feel strongly that the incremental privacy risk introduced by 3 of 5 quorum is unacceptable.
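A back-of-envelope comparison of the quorums discussed above, under the simplifying assumption (mine, not the commenter's) that each trustee independently loses their key with some probability p:

```python
# A k-of-n election fails to decrypt when more than n - k keys are lost.
from math import comb

def p_decrypt_fails(n, k, p):
    """P(more than n - k of n trustees lose their key)."""
    return sum(comb(n, m) * p**m * (1 - p)**(n - m)
               for m in range(n - k + 1, n + 1))

p = 0.05  # illustrative 5% per-trustee loss probability
for n, k in [(3, 3), (3, 2), (5, 3)]:
    print(f"{k}-of-{n}: P(cannot decrypt) = {p_decrypt_fails(n, k, p):.4f}")
```

Under this toy model, 3-of-3 fails whenever any single key is lost, while 2-of-3 and 3-of-5 both need at least two simultaneous losses, so either relaxed quorum cuts the availability risk by an order of magnitude; 3-of-5 additionally raises the collusion threshold for privacy.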
To me, the entire matter is mostly amusing; the negative impact on IACR is pretty low. I now have to spend 10-15 minutes voting again. No big deal.
It saddens me that Moti Yung is stepping down from his position as an election trustee; in my opinion, this is unwarranted. We have been using Helios voting for some time; this was bound to happen at some point.
Don't forget that the IACR is not a large political body with a decent amount of staff; it's all overworked academics (in academia or corporate) administering IACR in their spare time. Many of them are likely having to review more Eurocrypt submissions than any human could reasonably manage right now. There are structural issues in cryptography, and this event might be a symptom of the structural pressure to work way more than any human should, which is pervasive not just in cryptography, but in all of science.
From what I heard on the grapevine, this scenario was discussed when Helios was adopted; people wanted threshold schemes to avoid this exact scenario from the start, but from the sources I can find, Helios does not support this, or at least it does not make threshold encryption easy. The book Real-World Electronic Voting (2016)[^0] mentions threshold encryption under "Helios Variants and Related Systems", and the original Helios paper (2008)[^1] mentions it as a future direction.
You don't have to tell these academics that usable security is important. Usable security is a vital and accepted aspect of academic cryptography, and pretty much everyone agrees that a system is only as secure as it is usable. The hard part is finding the resources—both financial and personnel-wise—to put this lesson into practice. Studying the security of cryptographic systems and building them are two vastly different skills. Building them is harder, and there are even fewer people doing this.
[^0]: Pereira, Olivier. "Internet voting with Helios." Real-World Electronic Voting. Auerbach Publications, 2016. 293-324, https://www.realworldevoting.com/files/Chapter11.pdf
[^1]: Adida, Ben. "Helios: Web-based Open-Audit Voting." USENIX security symposium. Vol. 17. 2008, https://www.usenix.org/legacy/event/sec08/tech/full_papers/a...
Break your systems, identify the issues, fix them.
I want this to happen because I want mathematically secure elections.
That said… holy shit, you didn't think one of three groups could possibly lose a key due to human error!?