AI Is Killing Wikipedia's Human Traffic
Posted 3 months ago · Active 3 months ago
gizmodo.com · Tech · story
calm · negative
Debate: 20/100
Key topics
Wikipedia
AI
Traffic
The article discusses how AI is potentially reducing human traffic to Wikipedia, sparking concerns about the impact of AI on information dissemination and the role of human editors.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion · First comment: 3h
Peak period: 2 comments (2-3h)
Avg per period: 1.3
Key moments
- 01 Story posted: Oct 18, 2025 at 9:31 AM EDT (3 months ago)
- 02 First comment: Oct 18, 2025 at 12:10 PM EDT (3h after posting)
- 03 Peak activity: 2 comments in 2-3h, the hottest window of the conversation
- 04 Latest activity: Oct 18, 2025 at 10:28 PM EDT (3 months ago)
ID: 45627223 · Type: story · Last synced: 11/20/2025, 3:22:58 PM
virtue signaling and identity politics,
incredibly questionable financial expenditure,
and now AI
are blighting what should be the crowning achievement of human knowledge.
However, if something is used as a source by natural-language search, it would at least be fair to register some kind of hit for it, perhaps through deals that reward the source.
The ideal, and fairest to me, seems to be some sort of taxation/royalty-style percentage flowing back for what is verified as high-quality content. E.g. Google would need to record which content it used, and how much, both for training and as a cited source, keep aggregated statistics, and pay out a percentage of its profits, or a percentage of the cost of generating tokens if there are no profits.
Maybe there are better ideas; these are just a few off the top of my head. Since generating tokens is already costly, adding 10 percent on top doesn't seem that significant, and it could reward content creators proportionally.
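The commenter's scheme, splitting a royalty pool among sources in proportion to aggregated usage, can be sketched in a few lines. Everything here is a hypothetical illustration of the idea: the function name, the usage counts, the domains, and the cost figure are all made-up assumptions, not any real API or real statistics.

```python
# Hypothetical sketch of the proportional-royalty idea from the comment above.
# All names, domains, and numbers are illustrative assumptions.

def royalty_payouts(usage_counts, pool):
    """Split a royalty pool among sources in proportion to their usage counts."""
    total = sum(usage_counts.values())
    if total == 0:
        return {source: 0.0 for source in usage_counts}
    return {source: pool * count / total
            for source, count in usage_counts.items()}

# Assume token generation cost $1,000,000 this period; the comment's
# suggested 10% surcharge would fund a $100,000 pool.
pool = 1_000_000 * 0.10

# Hypothetical aggregated citation/training-usage statistics.
usage = {
    "wikipedia.org": 600,
    "stackoverflow.com": 300,
    "archive.org": 100,
}

for source, payout in royalty_payouts(usage, pool).items():
    print(f"{source}: ${payout:,.2f}")
```

The hard part of the proposal is not this arithmetic but the measurement: producing trustworthy, auditable usage counts per source is where any such scheme would stand or fall.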
It’s wild how often a Google summary asserts something, I click through to the cited Wikipedia link, and the article says the exact opposite.
I’m very much an AI optimist these days, but product decisions (like elevating weaker models to the top of search results) are making the world epistemically worse right now.