The Principles of Diffusion Models
Posted about 2 months ago · Active about 2 months ago
arxiv.org · Tech · story
Tone: calm, mixed · Debate: 40/100
Key topics
Diffusion Models
AI
Machine Learning
A 470-page academic paper on the principles of diffusion models is shared, sparking discussion on its comprehensiveness, relevance, and the broader implications of AI research.
Snapshot generated from the HN discussion
Discussion Activity
Moderate engagement
- First comment: 1h after posting
- Peak period: 6 comments in 6-9h
- Avg per period: 2.6
- Comment distribution: 31 data points (based on 31 loaded comments)
Key moments
- 01 Story posted: Nov 9, 2025 at 11:10 AM EST (about 2 months ago)
- 02 First comment: Nov 9, 2025 at 12:38 PM EST (1h after posting)
- 03 Peak activity: 6 comments in 6-9h, the hottest window of the conversation
- 04 Latest activity: Nov 11, 2025 at 1:22 PM EST (about 2 months ago)
ID: 45866572 · Type: story · Last synced: 11/20/2025, 6:51:52 PM
Want the full context?
Jump to the original sources
Read the primary article or dive into the live Hacker News thread when you're ready.
Ok I'll read it :)
“Are reposts ok?
If a story has not had significant attention in the last year or so, a small number of reposts is ok. Otherwise we bury reposts as duplicates.”
https://news.ycombinator.com/newsfaq.html
Also, from the guidelines: “Please don't post on HN to ask or tell us something. Send it to hn@ycombinator.com.”
For example, they probably don't want posts like "Hey Hacker News, why don't you call for the revival of Emacs and the elimination of all vi users?" and would rather you email them so they can ignore it. But they also don't want email asking "How do I italicize text in a Hacker News comment? Seriously, I can't remember, and I would have done so earlier in this comment if I could" and would rather you ask the community, who can answer it without bothering anyone working at Y Combinator.
[1] https://deepgenerativemodels.github.io/
[2] https://m.youtube.com/playlist?list=PLoROMvodv4rPOWA-omMM6ST...
Of course we don’t brute-force this in our lifetime. Evolution encoded the coarse structure of the manifold over billions of years. And then encoded a hyper-compressed meta-learning algorithm into primates across millions of years.
There’s a lot of “force” in statistics, but that force relies on pretty deep structures and choices.
https://github.com/gmaher/diffusion_principles_summary