Netflix Optimized Its Petabyte-Scale Logging System with ClickHouse
Posted 2 months ago · Active 2 months ago
clickhouse.com · Tech · story
Sentiment: supportive · positive
Debate: 0/100
Key topics
ClickHouse
Logging Systems
Netflix
Database Optimization
Netflix optimized its petabyte-scale logging system using ClickHouse, a column-store database, and the company shares its experience in a blog post.
Snapshot generated from the HN discussion
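For context on what the column-store approach looks like in practice, below is a minimal, illustrative sketch of writing and querying log rows in ClickHouse from Python with the clickhouse-connect client. The table name, schema, and host are assumptions for demonstration only; they are not taken from the article or the discussion.

# Illustrative sketch only: the table name, columns, and host below are
# assumptions for demonstration, not details from the ClickHouse article.
from datetime import datetime, timezone

import clickhouse_connect  # pip install clickhouse-connect

client = clickhouse_connect.get_client(host="localhost")

# A typical column-store layout for logs: a MergeTree table sorted by
# (service, timestamp), so filters on one service plus a narrow time
# range can skip most of the stored data.
client.command("""
CREATE TABLE IF NOT EXISTS app_logs (
    timestamp DateTime64(3),
    service   LowCardinality(String),
    level     LowCardinality(String),
    message   String
)
ENGINE = MergeTree
ORDER BY (service, timestamp)
""")

# ClickHouse favors batched inserts over row-at-a-time writes.
now = datetime.now(timezone.utc)
rows = [
    (now, "playback", "INFO", "session started"),
    (now, "playback", "WARN", "bitrate downgraded"),
]
client.insert("app_logs", rows,
              column_names=["timestamp", "service", "level", "message"])

# Aggregations read only the columns they touch, which is where the
# column-oriented format pays off for logging workloads.
result = client.query(
    "SELECT level, count() FROM app_logs "
    "WHERE service = 'playback' GROUP BY level"
)
print(result.result_rows)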
Discussion Activity
Light discussion
First comment: 4h after posting
Peak period: 1 comment in the 3-4h window
Avg / period: 1
Key moments
- 01 Story posted: Oct 24, 2025 at 3:56 AM EDT (2 months ago)
- 02 First comment: Oct 24, 2025 at 7:41 AM EDT (4h after posting)
- 03 Peak activity: 1 comment in the 3-4h window (hottest window of the conversation)
- 04 Latest activity: Oct 24, 2025 at 7:41 AM EDT (2 months ago)
ID: 45692063 · Type: story · Last synced: 11/17/2025, 9:13:28 AM
Considering that 99% of what Netflix is supposed to “do” is stream static content to users, this is an indirect signal that they’re too busy pleasing their middle managers with internal reports instead of doing “the thing” their business is primarily about.
Every time this comes up, someone makes the comment that they need those 40,000 microservices to provide recommendations and the like, but the end result is objectively and subjectively terrible.
I’m forced to Google for “what’s good on Netflix” because the in-app recommendations are hopeless.
Also: they can collect up to 12 petabytes of writes to disk daily, but still struggle with the apparently much more difficult task of distributing more than five subtitle languages at a time! Those 100 KB text files must have been a step too far for their infrastructure to handle.
Maybe next year I’ll read a self-congratulatory blog article where they’ve scaled logging to fifty exabytes per second and subtitles to six languages per region.