Time Needed to Factor Large Integers
Posted 3 months ago · Active 3 months ago
johndcook.com · Research · story
calm · neutral · Debate: 40/100
Key topics: Cryptography, Number Theory, Computational Complexity
The article examines the time complexity of factoring large integers, prompting discussion of the implications for cryptography and the current state of factorization algorithms.
Snapshot generated from the HN discussion
Discussion Activity
Moderate engagement · First comment: 4d · Peak period: 6 comments in 96-108h · Avg per period: 3.7
Comment distribution: 11 data points (based on 11 loaded comments)
Key moments
- Story posted: Sep 30, 2025 at 2:53 PM EDT (3 months ago)
- First comment: Oct 4, 2025 at 1:54 PM EDT (4d after posting)
- Peak activity: 6 comments in 96-108h (hottest window of the conversation)
- Latest activity: Oct 5, 2025 at 8:08 AM EDT (3 months ago)
ID: 45429590 · Type: story · Last synced: 11/20/2025, 4:02:13 PM
I'd tell you to "go ahead and start computing that and tell me when you're done," but I like the universe I live in, and the entire information content of the boundary of the observable universe is something like 4*10^122 bits [1]. So you're talking about creating a black hole vastly, vastly, vastly (10^1000-plus times) larger than the observable universe, which some of your fellow universe residents may consider a hostile act, and we probably aren't going to let you finish.
While you can define such a table as having "O(1)" lookups, in the sense that the distance you'd have to travel for the answer to any given query (a number of light years dwarfing the observable universe by hundreds of orders of magnitude) is on average pretty much the same for all lookups, it's constant in a rather useless sense.
[1]: https://math.ucr.edu/home/baez/information.html
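To make the orders of magnitude concrete, here is a rough back-of-envelope sketch (mine, not the commenter's) comparing a "factor every key" lookup table against the ~4*10^122-bit bound cited above. It assumes, purely for illustration, that the table covers every key_bits-bit integer and stores about one modulus-sized factor per entry; restricting to valid RSA moduli would not change the conclusion.

```python
from math import log10

# Holographic bound on the information content of the observable universe,
# ~4 * 10^122 bits, per the Baez page cited as [1] above.
HOLOGRAPHIC_BOUND_BITS_LOG10 = log10(4) + 122

def table_size_log10_bits(key_bits: int) -> float:
    """log10 of the bits needed to store one factorization per key_bits-bit modulus."""
    num_keys_log10 = key_bits * log10(2)    # ~2^key_bits candidate moduli
    bits_per_entry_log10 = log10(key_bits)  # ~key_bits bits to record a factor
    return num_keys_log10 + bits_per_entry_log10

for kb in (1024, 2048, 4096):
    size = table_size_log10_bits(kb)
    excess = size - HOLOGRAPHIC_BOUND_BITS_LOG10
    print(f"{kb}-bit keys: table ~10^{size:.0f} bits, "
          f"~10^{excess:.0f} times the universe's information bound")
```

For 4096-bit keys this comes out to roughly 10^1100 times the bound, which is the "10^1000-plus" scale the comment is gesturing at.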
In any of these models the time would be multiplied by the size of the lookup table, resulting in a cost much higher than the number field sieve.
Plus you need to consider the (amortized) cost of populating the lookup table.
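A small sketch of that comparison, using the standard heuristic GNFS complexity L_n[1/3, (64/9)^(1/3)] against the number of entries a full lookup table would need; the constants are the textbook heuristic ones, and this is an illustration rather than a benchmark:

```python
from math import log, log10

def gnfs_cost_log10(key_bits: int) -> float:
    """log10 of the heuristic GNFS cost L_n[1/3, (64/9)^(1/3)] for a key_bits-bit n."""
    ln_n = key_bits * log(2)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3) / log(10)

def table_entries_log10(key_bits: int) -> float:
    """log10 of the entries needed just to index a full lookup table."""
    return key_bits * log10(2)

for kb in (1024, 2048):
    print(f"{kb}-bit: GNFS ~10^{gnfs_cost_log10(kb):.0f} ops, "
          f"lookup table ~10^{table_entries_log10(kb):.0f} entries")
```

For 2048-bit keys this gives roughly 10^35 GNFS operations versus roughly 10^617 table entries, which is why even amortized table population dwarfs simply running the sieve.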
> RSA encryption is not broken by factoring keys but by exploiting implementation flaws.
> Factoring a 2048-bit RSA key would require more energy than the world produces in a year, as explained here.
The above should probably carry caveats like "assuming a GNFS attacker, ..." or "ignoring hypothetical non-public mathematical breakthroughs."
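To see why those caveats matter, here is the kind of arithmetic the energy claim rests on, with every constant written out as an explicit, rough order-of-magnitude assumption (operation count, energy per operation, and world energy output are all approximate figures, not measurements):

```python
# All constants below are rough, order-of-magnitude assumptions.
GNFS_OPS_2048 = 1e35            # heuristic GNFS operation count for a 2048-bit modulus
JOULES_PER_OP = 1e-10           # assumed energy per basic operation on commodity hardware
WORLD_ENERGY_PER_YEAR_J = 6e20  # rough annual world primary energy production

energy_needed_j = GNFS_OPS_2048 * JOULES_PER_OP
print(f"Energy for one 2048-bit GNFS factorization: ~{energy_needed_j:.0e} J")
print(f"World-years of energy: ~{energy_needed_j / WORLD_ENERGY_PER_YEAR_J:.0e}")
# The conclusion stands or falls with the GNFS assumption: a better (possibly
# non-public) algorithm or far more efficient hardware changes it entirely.
```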