Shor's Algorithm: The One Quantum Algo That Ends RSA/ECC Tomorrow
Key topics
The article discusses Shor's algorithm and its potential to break classical cryptography, sparking a debate among commenters about the feasibility and timeline of practical quantum computers. Some argue that the current progress in quantum computing is overstated, while others discuss the need to transition to post-quantum cryptography.
Snapshot generated from the HN discussion
Discussion Activity
- Active discussion; first comment 7h after posting
- Peak period: 12 comments in the 8-10h window
- Average: 3.5 comments per period
- Based on 35 loaded comments
Key moments
- Story posted: Nov 27, 2025 at 10:19 PM EST (about 1 month ago)
- First comment: Nov 28, 2025 at 4:51 AM EST (7h after posting)
- Peak activity: 12 comments in the 8-10h window, the hottest period of the conversation
- Latest activity: Nov 29, 2025 at 5:40 AM EST (about 1 month ago)
The author mentions:
> RSA-2048: ~4096 logical qubits, 20-30 million physical qubits
> 256-bit ECC: ~2330 logical qubits, 12-15 million physical qubits
For reference, we are at ~100 physical qubits right now. There is a bit of nuance in the logical-to-physical ratio, though.
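To get a feel for where numbers like 20-30 million come from, here is a back-of-envelope sketch (my own assumptions, not the article's: a surface code costing roughly 2·d² physical qubits per logical qubit and a standard logical-error heuristic), showing how sensitive the logical-to-physical ratio is to the physical error rate:

```python
# Back-of-envelope surface-code overhead; a hypothetical estimate, not the
# article's model. Assumptions: logical error per qubit per round is roughly
# 0.1 * (p/p_th)^((d+1)/2) with threshold p_th ~ 1e-2, and about 2*d^2
# physical qubits per logical qubit (data + syndrome ancillas). Routing space
# and magic-state distillation are ignored entirely.

def code_distance(p_phys, p_logical_target, p_th=1e-2):
    """Smallest odd distance d whose estimated logical error rate meets the target."""
    assert p_phys < p_th, "physical error rate must be below threshold"
    d = 3
    while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

def physical_qubits(n_logical, p_phys, p_logical_target):
    d = code_distance(p_phys, p_logical_target)
    return n_logical * 2 * d * d, d

# RSA-2048-scale run: ~4096 logical qubits and a very small logical error budget.
for p in (1e-3, 1e-4):
    total, d = physical_qubits(4096, p, 1e-12)
    print(f"p_phys={p:.0e}: distance {d}, ~{total / 1e6:.1f}M physical qubits")
```

Even this toy estimate swings by millions of qubits across one order of magnitude in physical error rate; the article's larger figures presumably also budget for routing space and magic-state factories.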
Scepticism aside, the author does acknowledge that this may still be a while off, and it is probably smart to start switching long-running, critical systems to quantum-resistant cryptography, but I'm not a huge fan of the fear-mongering tone.
I will highlight to others that while the qubit count is not increasing exponentially, other metrics are.
Anyway, here is what Scott Aaronson recently said about quantum computing progress:
> Indeed, given the current staggering rate of hardware progress, I now think it’s a live possibility that we’ll have a fault-tolerant quantum computer running Shor’s algorithm before the next US presidential election. And I say that not only because of the possibility of the next US presidential election getting cancelled, or preempted by runaway superintelligence! (...)
> To clarify — if, before the 2028 presidential election, a fully fault-tolerant Shor’s algorithm was used even just to factor 15 into 3×5, I would view the “live possibility” here as having come to pass.
> The point is, from that point forward, it seems like mostly a predictable matter of adding more fault-tolerant qubits and scaling up, and I find it hard to understand what the showstopper would be.
https://scottaaronson.blog/?p=9325
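For context on what "running Shor's algorithm even just to factor 15" involves: Shor reduces factoring N to finding the multiplicative order of a random a modulo N, and only that order-finding step needs the quantum computer. A minimal classical sketch of the reduction, with the order brute-forced since N = 15 is tiny:

```python
# The number-theoretic skeleton of Shor's algorithm, run entirely classically.
# Only find_order() would be replaced by the quantum period-finding subroutine.
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n); the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g, n // g     # lucky guess already shares a factor with n
        r = find_order(a, n)
        if r % 2 == 1:
            continue             # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue             # trivial square root of 1, retry with another a
        p = gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_classical(15))        # (3, 5) or (5, 3)
```

Aaronson's point above is that once the quantum order-finding step works fault-tolerantly at all, the rest is largely a matter of scaling the same circuit up to 2048-bit moduli.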
Relevant quote:
> It’s like this: if you think quantum computers able to break 2048-bit cryptography within 3-5 years are a near-certainty, then I’d say your confidence is unwarranted. If you think such quantum computers, once built, will also quickly revolutionize optimization and machine learning and finance and countless other domains beyond quantum simulation and cryptanalysis—then I’d say that more likely than not, an unscrupulous person has lied to you about our current understanding of quantum algorithms.
And:
> In any case, the main reason I made my remark was just to tee up the wisecrack about whether I’m not sure if there’ll be a 2028 US presidential election.
So I would be careful posting those quotes without context; it makes Scott angry.
https://en.wikipedia.org/wiki/Integer_factorization_records
https://eprint.iacr.org/2025/1237.pdf
[1] https://www.nature.com/articles/nature12290
[2] https://algassert.com/post/2500
https://gagliardoni.net/#20250714_ludd_grandpas
An excerpt:
> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric.
> So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise.
> Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts.
> A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
Have you ever wondered what will happen to those coaxial cables seen in every quantum computer setup, which scale approximately linearly with the number of physical qubits? Multiplexing is not really an option when the qubit waiting for its control signal decoheres in the meantime.
Regarding the coaxial cables: you seem to be an expert, so tell me if I'm wrong, but this looks like a limitation of current designs (and of superconducting qubits in particular); I don't see any fundamental reason why it could not be replaced by a different technology in the future. Plus, the scaling doesn't need to be unlimited, right? Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
The QC is designed with coaxial cables running from the physical qubits to equipment outside the cryostat because the pulse measurement apparatus is most precise when built as large, bulky boxes. When you miniaturise it for placement next to the qubits, you lose precision, which increases the error rate.
I am not even sure whether conventional logic components work at such low temperatures, since everything becomes superconducting.
> Even with current "coaxial cable tech", it "only" needs to scale up to the point of reaching one logical qubit.
Having a logical qubit sitting in a big box is insufficient. One needs multiple logical qubits that can interact and be put into superposition, for example. Each logical-qubit gate is implemented as a chain of gates between pairs of physical qubits, and that cannot all be done directly at once; hence one effectively has to solve the 15-puzzle in the fewest possible steps so that the qubits don't decohere in the meantime.
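A toy illustration of that routing problem, under the simplifying assumption (mine, not the commenter's) of a 1-D chain where only neighbouring physical qubits can interact: a two-qubit gate between distant qubits first costs a chain of SWAPs, and every SWAP is time spent decohering.

```python
# Hypothetical linear-chain device: to apply a gate between logical qubits a
# and b, insert SWAPs until they are adjacent. The SWAP count is the price
# paid in time (and hence decoherence) before the actual gate can run.

def route_gate(positions, a, b):
    """Return the SWAPs needed to make qubits a and b adjacent on a 1-D chain.
    `positions` maps each qubit to its site and is updated in place."""
    swaps = []
    while abs(positions[a] - positions[b]) > 1:
        step = 1 if positions[a] < positions[b] else -1
        neighbour_site = positions[a] + step
        # find whichever qubit currently occupies that site and swap with it
        other = next(q for q, s in positions.items() if s == neighbour_site)
        positions[a], positions[other] = positions[other], positions[a]
        swaps.append((a, other))
    return swaps

positions = {q: q for q in range(6)}   # qubit q starts at site q
print(route_gate(positions, 0, 5))     # 4 SWAPs before the two-qubit gate can run
```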
Currently finishing a course where the final project is designing a semiconductor (quantum dot) based quantum computer. Obviously not mature tech yet, but it has been stressed to us throughout the course that you can build most of the control and readout circuits to work at cryogenic temperatures (2-4 K) using SLVT FETs. The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
Given the magic that happens inside the high-precision control and readout boxes connected to the qubits by coaxial cables, the fact that such a cryogenic control circuit can be built does not mean it will ever reach the same level of precision. I find it strange that I haven't seen this on the agenda for QC, where instead I see multiplexing being used.
> The theoretical limit for this quantum computing platform is, I believe, on the order of a million qubits in a single cryostat.
What are the constraints here?
> Rotate everything that lasts >10 years to pure PQC now
The author suggests switching to post-quantum cryptography, which uses relatively new algorithms that haven't been as battle-tested as older ones like RSA and ECC. Back when those were introduced, there weren't any stronger alternatives, so if they were broken, at least people knew they had done the best they could to protect their data.
Now, however, we have standardized encryption with algorithms that are (to the general public's knowledge, at least) uncrackable provided sane key lengths are chosen, so doing anything that could weaken that encryption leaves us worse off than the baseline. PQC is theoretically stronger, but it is unknown whether it will stand the test of time even against today's technology, because it is relatively new and not widely deployed.
The standard practice for rolling out PQC is to use it as an additional layer alongside current encryption standards. This adds redundancy, so that if one layer is broken the data stays safe. Using only PQC, or only RSA/ECC/whatever, leaves the system with a single point of failure.
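A minimal sketch of that hybrid layering, assuming an X25519 exchange combined with an ML-KEM encapsulation: the X25519 and HKDF calls below use the real `cryptography` package, while `mlkem768_encapsulate` is a placeholder for whichever FIPS 203 binding is actually deployed.

```python
# Hybrid key exchange sketch: the session key is derived from BOTH a classical
# X25519 shared secret and an ML-KEM shared secret, so an attacker has to
# break both to recover it.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def mlkem768_encapsulate(peer_mlkem_public_key):
    """Placeholder: return (ciphertext, shared_secret) from your ML-KEM library."""
    raise NotImplementedError

def hybrid_client_handshake(peer_x25519_public, peer_mlkem_public):
    # Classical half
    ephemeral = X25519PrivateKey.generate()
    ss_classical = ephemeral.exchange(peer_x25519_public)
    # Post-quantum half
    ciphertext, ss_pq = mlkem768_encapsulate(peer_mlkem_public)
    # Combine: the session key stays secret as long as either input does
    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None,
        info=b"hybrid x25519 + ml-kem-768 demo",
    ).derive(ss_classical + ss_pq)
    return ephemeral.public_key(), ciphertext, session_key
```

Real deployments pin down the exact concatenation order and key schedule; the point here is only that neither secret alone determines the session key.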
FYI, this is exactly what governments want (I'll let you guess why). This related post was on the front page just a few days ago: https://news.ycombinator.com/item?id=46033151
You're right that rotating every crypto algorithm to PQC right away might be a bit too aggressive. The actual best practice (like you said) is hybrid: layer ML-KEM/ML-DSA on top of RSA/ECC for redundancy. Classical algorithms aren't dead yet, but Shor's clock is ticking, and so far the NIST-standardized PQC algorithms (FIPS 203 for ML-KEM, FIPS 204 for ML-DSA) haven't been broken. That's why Cloudflare, for example, uses ML-KEM alongside X25519 for its TLS key exchange (https://cyberpress.org/cloudflare-enhances-security/).
And yeah, presenting a single algorithm as the perfect solution gives off Dual_EC vibes: a perfect spot for a backdoor.
https://eprint.iacr.org/2025/1237.pdf