Meta's Teen Accounts Are Sugar Pills for Parents, Not Safety for Kids
Posted 3 months ago · Active 3 months ago
overturned.substack.com · Tech · story
controversial · negative
Debate: 80/100
Key topics: Social Media, Child Safety, Parenting
The article criticizes Meta's teen accounts as a superficial solution to child safety concerns, sparking debate among commenters about the responsibility of social media companies versus parents in protecting minors online.
Snapshot generated from the HN discussion
Discussion Activity
Active discussion
First comment: 1h after posting
Peak period: 19 comments in 0-2h
Average per period: 6.6 comments
Comment distribution: 33 data points (based on 33 loaded comments)
Key moments
1. Story posted: Sep 30, 2025 at 10:40 AM EDT (3 months ago)
2. First comment: Sep 30, 2025 at 12:00 PM EDT (1h after posting)
3. Peak activity: 19 comments in 0-2h, the hottest window of the conversation
4. Latest activity: Oct 1, 2025 at 10:51 AM EDT (3 months ago)
ID: 45426074 · Type: story · Last synced: 11/20/2025, 12:29:33 PM
Want the full context?
Jump to the original sources
Read the primary article or dive into the live Hacker News thread when you're ready.
https://www.bbc.com/news/articles/ce32w7we01eo
Zuckerberg famously doesn’t (didn’t?) let his children use Facebook. Perhaps everyone else should take a hint.
There are all kinds of services that parents can use now to filter this, going even further than what was possible 10-15 years ago.
Note that I’m not saying Facebook doesn’t profit from being as ineffectual as possible, but ultimately parents should know better.
One of my friends was busted at school for distributing porn downloaded from a BBS on floppy disks.
Back in the day, hordes of kids were just set loose on the city to find empty lots to fuck around in, because a lot of families are just scraping by and the whole concept of full-time supervision of children is laughably naive. Now they're on the TikToks and the Facestagrams, which has its own set of advantages and disadvantages.
In this case, Meta's product that it says is suitable for teens simply doesn't meet the safety expectations most parents have, and it's good that there is reporting on it to inform them.
And I think he was right to do this.
But he’s also the person who created Facebook, who allowed it to be used by children, so it’s extremely hypocritical, no?
Why doesn’t he make Facebook for over 18s only?
Money is why.
He’ll protect his own children whilst at the same time he thinks it’s fine to harm other people’s. He thinks this whilst knowing full well that not all parents are created equal and that many parents won’t (or perhaps can’t) do what is best for their children’s welfare relating to social media. And this is because he values making money more, and because he’s an out-of-touch and elitist manchild who refuses to take responsibility for what he’s created.
So, yes, it’s Meta’s - and specifically Mark Zuckerberg’s - fault that children are exposed to harmful content, and both they and he should absolutely be held to account for it.
Fundamentally social media companies are media companies. Just because it’s a new form of media doesn’t mean they should get to dodge the scrutiny and responsibilities other media companies are subject to.
>Why doesn’t he make Facebook for over 18s only?
Should non-smokers be allowed to be tobacco executives? I think everyone agrees that smoking is bad, but whether the CEO partakes in it shouldn't really be a relevant factor either way.
I think that’s a more accurate analogy, and I think it also would be reprehensible behavior.
EDIT: What adults need to understand is that IG is an ad platform first and not an IM or connection platform, and they should leave it. If you can't convince adults, then children are a totally lost cause. Adults face the same problems; these don't go away when you're no longer a teen. Where your teen goes and what they do is up to you as a parent.
First there is the obvious question: who is giving teenagers unfettered access to the Internet? Phones cost money. Home Internet costs money. Mobile data costs money. The best you can say is that kids could get online using McDonald's WiFi with a phone they bought for lawnmower money, but we don't have to play pretend. We know how they got phones and Internet: the same way kids and teens were exposed to TV. Apparently, though, despite this obvious first step in accountability, it's all shrugged shoulders; this step is apparently so unimportant it wasn't worth mentioning at all.

I hate to just bitch about parents, because I absolutely know parenting isn't easy, is important for society, and that the conditions we're in make it hard to feel "in control" anymore. On the other hand, this isn't exactly a new problem. All the way back in 1999, South Park: Bigger, Longer and Uncut basically addressed the same god damn thing. I don't mean to equate obscenity on TV with the kinds of safety risks that the Internet can pose, but rather to point to the deflection of blame that parents engage in for things they directly facilitated. Seriously, the "Blame Canada" lyrics somehow feel as prescient as ever now:
Though honestly, I absolutely think this doesn't mean social media companies aren't to blame. Everyone knew we were selling sex and ads to kids. I think Twitter and Tumblr were violently aware that they were basically selling sex to kids, and if anything just tried as much as possible to ensure their systems had no way to account for it. (On Twitter, many people have always wanted an option to mark their accounts as 18+ only, but as far as I know you still can't. A few years ago I think they added a flag they can apply to accounts that hides them, but it's still not something you can set yourself. And although Twitter has sensitive-content warnings, it doesn't actually let you specify that something is explicit, only that it is sensitive, or maybe that it contains nudity. I think this blanketing is intentional; it provides some deniability.) For their role in intentionally blurring lines so they can sell sex and ads to kids while mining their data, they should be penalized.

But instead, we're going to destroy the entire Internet and rip it apart. Even though the vast majority of the Internet never had the problems that crop up with kids on social media at any particularly alarming scale, we're going to apply broad legislation that enforces ISP-level blocks. In my state there was a proposal to effectively ban VPNs and Internet pornography altogether!
I think ripping apart the Internet like this is basically evil for all parties involved. It's something regulators can do to show that they really care about child safety. It's something parents can support instead of taking accountability for their role in not doing the thing called "parenting". And I bet all of the massive social media companies will make it out of this just fine, essentially carving their own way around any legislation, while the rest of the Internet basically burns down, as if we really needed anything to help that along.
We will never learn.
In a perfect world, the internet shouldn't be used by anyone under 18 without monitoring by their parents. That doesn't mean we should legislate criminal penalties for parents who fail to "correctly" parent, nor should we penalize companies whose service or product is used poorly or whose use results in a negative outcome.
Some things are cultural and social, and government isn't the right tool to fix them. The cost of governance - loss of privacy and agency, unnecessary intrusion and bureaucracy, mistakes, and wasted money and time - far exceeds the cost of letting society figure its shit out.
Yeah, there will be tiktok casualties, zombies, people with feed-fried brains. There will even be suicides and tragedies and stupid avoidable deaths from meme challenges. That's better than the alternatives, and it's not ok to put the responsibility for those negative social outcomes on companies when it's a failure of parenting. It's tragic and terrible, but we shouldn't let sympathy for parents allow shifting the blame to social media or AI companies.
That being said, there should be guardrails: no algorithm tuning to target children; rapid detection, reporting, expulsion, and lifetime bans for predators on a platform; no advertising to children; and so forth. Require parental consent and monitoring, with weekly activity summaries emailed to parents, or things like that. Empowering parents and guardians and giving them tools to make their lives easier would be admirable (a rough sketch of what such a summary might look like follows below).
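To make the weekly-summary idea concrete, here is a minimal sketch in Python of what such a parent digest could look like. Everything in it (ActivityEvent, build_weekly_summary, the event kinds, the sample data) is a hypothetical illustration, not any real platform's API; a real system would pull events from the platform's data store and deliver the message via an email service rather than printing it.

```python
# Hypothetical sketch of a "weekly activity summary emailed to parents".
# None of these names correspond to a real platform API.
from dataclasses import dataclass
from datetime import datetime, timedelta
from email.message import EmailMessage


@dataclass
class ActivityEvent:
    timestamp: datetime
    kind: str       # e.g. "post", "dm", "search", "new_follower"
    detail: str


def build_weekly_summary(child_name: str, parent_email: str,
                         events: list[ActivityEvent]) -> EmailMessage:
    """Aggregate the last seven days of events into a plain-text digest."""
    cutoff = datetime.now() - timedelta(days=7)
    recent = [e for e in events if e.timestamp >= cutoff]

    # Count events by kind for a compact overview.
    counts: dict[str, int] = {}
    for e in recent:
        counts[e.kind] = counts.get(e.kind, 0) + 1

    lines = [f"Weekly activity summary for {child_name}:", ""]
    lines += [f"  {kind}: {n} event(s)" for kind, n in sorted(counts.items())]

    msg = EmailMessage()
    msg["To"] = parent_email
    msg["Subject"] = f"Weekly summary: {child_name}"
    msg.set_content("\n".join(lines))
    return msg


if __name__ == "__main__":
    sample = [
        ActivityEvent(datetime.now() - timedelta(days=1), "dm", "2 new threads"),
        ActivityEvent(datetime.now() - timedelta(days=2), "new_follower", "@unknown_account"),
    ]
    # Printing instead of sending via smtplib keeps the sketch self-contained.
    print(build_weekly_summary("Alex", "parent@example.com", sample))
```

The design choice worth noting is that the digest reports aggregate counts rather than message contents, which is the kind of parent-empowering-but-not-surveillance balance the comment seems to be gesturing at.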
With platforms like Roblox, they're effectively so large that responsible moderation is impossible, so they get infested with predators and toxic situations.
I think it's probably going to require society to limit how big platforms and companies can get - if the service or product or platform cannot be managed responsibly at a certain scale, then they're not allowed to operate at that scale anymore.