Making Sure AI Serves People and Knowledge Stays Human
Key topics
The Wikimedia Foundation has published a human rights impact assessment on the interaction of AI and machine learning with Wikimedia projects, sparking discussions on Wikipedia's role, bias, and the impact of AI on knowledge sharing.
Snapshot generated from the HN discussion
Discussion Activity (active discussion; based on 34 loaded comments)
- First comment: 1h after posting
- Peak period: 11 comments in 2-4h
- Avg / period: 4.3
Key moments
- Story posted: Sep 30, 2025 at 3:23 PM EDT (3 months ago)
- First comment: Sep 30, 2025 at 4:45 PM EDT (1h after posting)
- Peak activity: 11 comments in 2-4h (hottest window of the conversation)
- Latest activity: Oct 1, 2025 at 6:06 PM EDT (3 months ago)
What specifically do you dislike about Wikipedia?
For a different reason: it hosts a wide range of superficial treatments of scientific and medical information, which is not ideal. See Scholarpedia for a better alternative for this kind of information (although it's not well populated).
What were the biases that affected its coverage on Wikipedia?
- I would avoid giving the impression that a simplified narrative is established (unless it won a Nobel or is really beyond any reasonable doubt).
- End the use of "controversy" sections where it is used to minimise the impact of perfectly acceptable science. This is clearly used as a tactic in certain instances to minimise results.
- Allow for mixed expert, AI, and conventional editing. There is nothing wrong or elitist about including articles edited by academic experts as cut-outs on a particular topic. Almost as if Scholarpedia and Wikipedia were merged.
- Try to combat omission biases. There are some pretty wild ones on wikipedia.
- New results should be handled properly in terms of editing and language.
- Reproduce multiple versions of the same text to show how it would differ if certain hypotheses/experimental results were valid. This is easy to do with LLMs; see the sketch below.
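To make that last bullet concrete, here is a minimal sketch of generating parallel versions of a passage, one per competing hypothesis. It assumes the OpenAI Python client; the model name, prompts, and hypothesis strings are illustrative placeholders, not anything Wikipedia actually runs:

```python
# Sketch: generate parallel versions of the same passage, one per
# competing hypothesis, so readers can compare the narratives side by side.
# Assumes the OpenAI Python client (openai >= 1.0); model and prompts
# are placeholders for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PASSAGE = "..."  # the encyclopedia passage to rewrite

HYPOTHESES = [  # hypothetical competing readings of the evidence
    "hypothesis A is supported by the experimental results",
    "hypothesis B is supported by the experimental results",
]

def rewrite_under(hypothesis: str) -> str:
    """Rewrite the passage as it would read if the given hypothesis held."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Rewrite encyclopedia text faithfully under a "
                        "stated assumption. Do not invent citations."},
            {"role": "user",
             "content": f"Assume that {hypothesis}. Rewrite this passage "
                        f"accordingly:\n\n{PASSAGE}"},
        ],
    )
    return response.choices[0].message.content

for h in HYPOTHESES:
    print(f"--- If {h} ---")
    print(rewrite_under(h))
```

The point is only that producing side-by-side counterfactual drafts is mechanically cheap; whether an encyclopedia should publish them is the debate in the rest of this thread.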
A Wikipedia AI could, of course, be the most balanced, given suitable changes and by actually taking the major criticisms on board. One is omission bias, where very important information is simply left out of articles. Another is the lack of comparison of conflicting narratives (history, politics, science, etc.).
> Charlie Kirk takes the roast in stride with a laugh— he's faced tougher crowds. Yes, he survives this one easily
> It's a meme video with edited effects to look like a dramatic "shot"—not a real event. Charlie Kirk is fine; he handles roasts like a pro.
This thing as the basis for an encyclopedia will be worse than useless. Comparing it to Wikipedia is several layers removed from reality.
What about important and controversial historical events? Do you think that wikipedia omitting information is acceptable?
Do you think that other models would insert "controversy" before making statements as part of a dark pattern to potentially encourage investigation fatigue?
All of the above are also possible in an Elonpedia-style situation or with Grok. I never said it would be absolutely BETTER.
This sounds like a "both sides" kind of statement and I don't think it's fitting literally immediately after acknowledging that you don't think Musk is going to be unbiased.
Will Elonpedia be better in that respect? Its owner is not exactly known for keeping a healthy distance from internet culture wars.
More importantly, even if all these pages disappear overnight, Wikipedia is still extremely valuable and beneficial.
Narrative engineering with AI is a pretty scary prospect, so I can see how the Wikipedia model, with genuinely random human editors, might have a major advantage there. If it is simply a vanity project, and not an uncensored Grok model, then obviously it would be garbage and hilariously biased.
I mean, this sounds impartial but isn't. If you have an article on the planet Earth and present a spectrum of opinions on whether it's round or flat, you're not being balanced; you're implicitly supporting flat-Earthers and their trollish beliefs by taking them seriously.
This has countless parallels in political discourse. Trade flat-Earthers for (actual) neo-Nazis; it's probably not a "spectrum of opinions" you want to broadcast without passing any judgment. It's not the role of an encyclopedia to be a free speech platform.
I'm not really saying this to defend political articles on Wikipedia. It's just that I don't think you fix it by taking an LLM trained on Wikipedia, and telling it to regenerate the articles "without bias".
Edit: yes, it's mentioned in the article: "Conservapedia was launched in 2006 and is widely regarded as a joke by anyone who tries to wade through its ridiculous articles."
Edit: It seems that the supposedly liberal-biased pedia remains far more popular than the explicitly conservative-biased pedia. But both are out there, so take your pick.
Let's take a spin through what the various Wikis have written about Musk:
https://rationalwiki.org/wiki/Elon_Musk
https://www.conservapedia.com/Elon_Musk
https://en.uncyclopedia.co/wiki/Elon_Musk
http://www.scholarpedia.org/ - no Musk article
https://citizendium.org/wiki/Elon_Musk - nothing here (yet)
... and so on.
Honorable Mention to Encyclopedia Dramatica which had some funny parody content.
Throwing tons of money at a problem can often make things happen.
StackOverflow has this problem (or had, before it died): the mods were hugely invested in closing questions for basically any reason, so normal users ended up hating it, and the company couldn't make any changes to improve things because whenever they tried, the mods revolted.
It's not as much of an issue with Wikipedia because most Wikipedia users aren't actually editing articles and running into any moderation issues.
https://en.wikipedia.org/wiki/Gaza_genocide
Ask yourself how neutral this is. By Wikipedia's own standards it isn't neutral at all (obviously both parties don't agree, and one party's viewpoint is not represented at all, which is normally a hard no on Wikipedia). Then look at the discussion page to see it become 100x worse, with openly racist viewpoints that I'm surprised they allow even on discussion pages.
What's especially troublesome is that one side (the one whose viewpoint is the only one represented on the page) is totally opposed to even discussing the existence of the other side.
If you are a content creator, good luck. You aren't really valued.