4chan Lawyer Publishes Ofcom Correspondence
Mood: heated
Sentiment: negative
Category: other
Key topics: The UK's Ofcom is attempting to regulate online safety, including 4chan, and a lawyer has published correspondence highlighting the extraterritorial implications of the Online Safety Act, sparking debate about government overreach and free speech.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 26m after posting
Peak period: Day 1 (143 comments)
Avg / period: 26.7
Based on 160 loaded comments
Key moments
- Story posted: Oct 17, 2025 at 3:31 AM EDT (about 1 month ago)
- First comment: Oct 17, 2025 at 3:57 AM EDT (26m after posting)
- Peak activity: 143 comments in Day 1 (hottest window of the conversation)
- Latest activity: Oct 28, 2025 at 7:59 AM EDT (30 days ago)
Edit: In a nutshell - almost every other transfer of goods and services across national borders is subject to quality standards. Why do we give a pass to a system that allows deep, individualised access to people's personal lives and mental processes?
I don't want the government to decide which thoughts I can access and which ones I can't, but I also understand that allowing a foreign power (let's say Russia, although "the US" works just as well) to freely run undercover propaganda and/or destabilization campaigns without any recourse doesn't look good either. And while I agree with "when in doubt, aim for the option with more freedom", I can understand those who share your position.
Step 1 is reduce your attack surface :) As a second point, democracies are propaganda campaigns - it's a feature, not a bug.
I believe that national cultural and societal norms play a key part in self-regulation. I think it's too much to ask for those balancing forces to work as effectively without first turning down the firehose.
By closing up we defend ourselves from some threats, but open the gates wide for others. Foreign actors compete against much stronger domestic media machines and, as you mentioned, have to operate in foreign cultural environments. Gaining true influence also always involves financial flows, not just propaganda campaigns, so it is certainly possible to mitigate these threats without closing off the flow of information.
Consider the opposite threat of democracies being undermined from within. If some internal "threat actor" gets control of the executive branch and of the media and also can prevent information flow from the outside, very little can be done against it.
I think it is critical to keep in mind this second possibility even when the first threat seems more urgent.
Propaganda is not necessarily to gain influence or money. Eg: Country x just wants to mess with people's heads and turn them on each other to weaken a rival country. Or: Country y runs a crafted propaganda campaign against a rival. As a result, some sector of its own economy starts doing better at the expense of its rival.
>If some internal "threat actor" gets control of the executive branch and of the media and also can prevent information flow from the outside, very little can be done against it.
I understand the scenario (it's far from new), but that's what the design of any particular democracy is supposed to minimise. Term limits, separation of government powers, etc.
That would be an interesting discussion in itself, but even so - accessing material in isolation over the internet removes all of the benefits of cultural and community self-regulation.
>freely run undercover propaganda and/or destabilization campaigns
I'm of the opinion that WWW3 has already happened - it was a war for hearts and minds waged over the internet, and we've already lost.
This is a very fancy way of saying “censorship”.
> I'm of the opinion that WWW3 has already happened - it was a war for hearts and minds waged over the internet, and we've already lost.
If the open, unfettered exchange of culture and ideas is such a threat to our system then we deserve to lose. If my only option is to be stuck in a system that enforces ideological conformity on its subjects, then I’d rather it be the Chinese system. At least it’s not so dysfunctional!
If we are receiving all of the downsides of a liberal democracy without the benefits, what’s the point anymore?
The question is: is there a defense against this?
Your answer currently is there is no defense because creating an illusion of unanimous ideological conformity counts as an exchange of ideas and that exchange must not be hindered.
The debate is over whether the right to conduct Sybil attacks is more precious than the right to freedom of thought. The question is vastly harder than many people in this thread seem to believe.
My personal take is that the right to freedom of thought is more fundamental and that the value of freedom of speech is via its support for freedom of thought.
Who is we, and who won? What did they win?
That's just as bad of an argument as so-called intellectual walls of text. Nothing needs to be done, the outcomes are not bad. My argument is as strong as yours.
Censoring view points is equivalent to signal boosting other view points. Why do you trust the UK government to select the correct view points given all the strong evidence to the contrary?
Is that a made up problem? IMO: yes. That's a PARENT'S responsibility, not mine.
There are legitimate arguments in favor of a national firewall. Nobody is making them.
This is about the worst attitude you can have in politics.
Sovereign firewalls are mostly used by countries that have them for censorship and surveillance, and I think letting governments use a pretext of digital services being able to avoid tolls and taxes to establish such a powerful tool would be a huge mistake.
If only it were that easy. For me as a parent, my approach is to implement a "Great personal firewall" - that is, internet restrictions that decrease over time as they mature, and starting with essentially zero access. Unfortunately, it's probably doomed to fail as other kids their age (5 + 7) and in their peer groups are already walking around with smartphones.
To put it bluntly, too many parents are too unengaged and lazy (or self-centered).
Now it's just outright forbidden to have anything with a chat. And no Internet.
The problem is that other 10-year-olds have mobiles, free PC access, etc., so there's constant peer pressure.
At home measures are at best a delay, not a fix. What you also have to do is actually communicate with your child. If you're strict about what they can and cannot do on the internet, they will feel shame for doing it anyway, which may also mean they would be too ashamed to talk to their parents if for example they are getting groomed online.
I'm sorry, but if your threat model is your kid getting a fucking burner phone, I don't know what to tell you.
Even this law won't fix it! Why, couldn't your kid just save up and buy a plane ticket to the US?? Oh no... we need a global law, don't we?
Or, maybe, we throw away that thinking and acknowledge that the problem is not that big and solving 99% of it is MORE than good enough.
Your kid is way more likely to die in a car wreck. Focus on that or something.
Kids go to school, have lessons, right? And a few minutes' break between lessons? How do parents expect to censor what kids talk about? Not to mention phone use. And why, exactly?
The thing is as it always is: parents lay the foundations of culture and worldview, e.g. via their own views and the religion they subscribe to. And then society and reality take over. What kind of society do you have?
I don't remember this in my late 90s LAN chats.
I tried setting up parental controls on Fortnite and it was a nightmare, having to juggle multiple accounts with multiple providers; it felt very much designed to force people to go "ahh, forget it".
They do; in the UK, if you want to have access to porn, you need to tell your ISP and they will unblock it.
Of course, that's a game of whack-a-mole because you can render porn in Minecraft servers or join one of many communities on Whatsapp or Discord if needs be. It mainly blocks the well-known bigger porn sites.
The conclusion is, it's a service problem, not a how-to-block problem:
kid-friendly content is undersupplied and often badly maintained.
To quote GabeN: Piracy is almost always a service problem and not a pricing problem
But it's not forbidden or hidden away, so kids aren't curious about it.
Yes, but the problem is that much (if not most) of that content and those services were created by adults and are despised by kids.
Pick one of the topics your kid is most interested in: is there enough kid-friendly content, or are there enough services, to fulfil all the needs?
My oldest girl is 5. She's already very aware that other kids in her class have access to tablets and phones. How on earth do I responsibly explain to her the dangers? I have enough trouble asking her to get dressed and keep her nappy dry at night.
I say "I consider", because skills self-evidently essential to a good life (emotional regulation, focus and attention span, ability to read other people's emotional states, effective communication, physical skills) are increasingly not generally considered that way.
By who, and for who? My kids (ages 5+7) watch significantly less TV than their peers (as well as currently almost zero internet access), and are frequently complimented on their command of vocabulary and ability to express themselves.
>And if we are talking about the internet in general and not just twitter/tiktok, then its largely NOT doomscrolling and ragebait.
By amount of time that people spend on the internet, it is mostly doomscrolling and ragebait. If only we could take that part of it away.
ages 0-6, increased vocabulary with increased screen time https://srcd.onlinelibrary.wiley.com/doi/10.1111/cdev.13927
> My kids (ages 5+7) watch significantly less TV than their peers (as well as currently almost zero internet access), and are frequently complimented on their command of vocabulary and ability to express themselves.
Compliments are nice I suppose, but they're a poor metric for vocabulary size.
> By amount of time that people spend on the internet, it is mostly doomscrolling and ragebait. If only we could take that part of it away.
"most" people I assume doesnt include you? Youre too smart to fall for it, obviously.
>theyre a poor metric when regarding vocabulary size.
I'm talking about school reports, among other things.
>"most" people I assume doesnt include you? Youre too smart to fall for it, obviously.
It's something I struggle with daily, and have put a lot of thought into what I want from my use of online technology. Eg, I don't have a smartphone. How can a kid be expected to make good choices if I can't?
Follow the science, bud. The science is telling you to give them screen time.
>I'm talking about school reports, among other things.
well yeah, you are now.
> It's something I struggle with daily,
this actually explains a lot
If I see some science that says this, I'll think about it.
1. Educate children about bad actors and scams. (We already do this in off-line contexts.)
2. Use available tools to limit exposure. Without this children will run into such content even when not seeking it. As demonstrated with Tiktok seemingly sending new accounts to sexualised content,(1) and Google/Meta's pathetic ad controls.
3. Be firm about when is the right age to have their own phone. There is zero possibility that they'll be able to have one secretly without a responsible parent discovering it.
4. Schools should not permit phone use during school time (enforced in numerous regions already.)
5. If governments have particular issues with websites, they can use their existing powers to block or limit access. While this is "whack-a-mole", the idea of asking each offshore offending website to comply is also "whack-a-mole" and a longer path to the intended goal.
6. Don't make the EU's "cookies" mistake. E.g. If the goal is to block tracking, then outlaw tracking, do not enact proxy rules that serve only as creative challenges to keep the status quo.
and the big one:
7. Parents must accept that their children will be exposed at some level, and need to be actively involved in the lives of their children so they can answer questions. This also means parenting in a way that doesn't condemn the child needlessly - condemnation is a sure strategy to ensure that the child won't approach their parents for help or with their questions.
Also some tips:
1. Set an example on appropriate use of social media. Doom scrolling on Tiktok and instagram in front of children is setting a bad example. Some housekeeping on personal behaviours will have a run on effect.
2. If they have social media accounts the algorithm is at some point going to recommend them to you. Be vigilant, but also handle the situation appropriately, jumping to condemnation just makes the child better at hiding their activity.
3. Don't post photos of your children online. It's not just an invasion of their privacy, but pedophile groups are known to collect, categorise and share even seemingly benign photos.
1. https://globalwitness.org/en/campaigns/digital-threats/tikto...
i know, freedom of speech, it's your money and not mine, etc.
how does this relate to what i said? i get the "we're a free platform where everyone can do everything and no one is responsible for anything", just a cheap excuse from my POV considering the unhinged, doxxy culture on there. sure, there are cute boards, nice. i am talking about the inhumane, unhinged slurry of shit.
"Sure my neighbour has a couple of cadavres in his cellar, but have you seen the pretty flowers on his balcony?"
but per usual you can't criticize 4chan in the slightest without its warriors appearing to defend it. i get it. 4chan did and does cool stuff. it also does absolutely disgusting things, surprisingly this always gets dismissed as 'it's only the couple of rogue boards which are crazy'.
i agree :)
> people buy their blue checkmarks there all the time
sadly, yes.
as someone who left reddit so long ago that they don't remember it and really does not care about it, please tell me what's worse on reddit than the constant xenophobic, transphobic and general *phobic stuff on 4chan.
phobic does not even do it justice, as it just straight up advocates for whole races or genders to kill themselves (b-b-b-but, i-i-it's just a joke, kawaii).
At least that's what happened with a scene I'm rather involved in; the threads in recent years became nothing but a cesspool of negativity, and most people knew who was behind the constant drama. What people didn't expect was that the leak revealed one of the mods was among the group constantly causing it.
maybe this is my bias, could very well be. maybe i should give it a 10th chance and browse the more useful boards.
i guess /g/ would be a start, do you have other recommendations? i mean i'm open to change my mind. for me 4chan stands for alt-right pipelines, spreading far-right ideology online etc., so i just really have a sour taste in my mouth when thinking about it.
The Sarcophagus and the new containment building sure, I meant the original one before the accident.
Most threads still get plagued by a circlejerk of wannabe neonazis repeating shibboleths and transphobia at each other ad infinitum, or if you're lucky enough you find a crumb of quality discussion, often generals, often around derivative content from other platforms or popular media.
There are the rare productive generals that do have people curating information in meaningful ways, or, even rarer, actually doing things themselves. Far more often, generals are just toxic, loosely held together "friend" circles who can't get along anywhere else due to a perpetual veil of irony that can only survive in anonymous spaces, often attacking each other for little more than to stir the pot and keep conversation going. They'll still hold a superiority complex over their use of the site, even though every single bad thing they'll say about others can be said for 4chan, times 10.
It's not 2006 anymore; 4chan isn't a creator of internet culture. 4chan is a dumpster of the web, where art goes to die.
That certainly used to be the case pre-2012. All the former hacktivists have long since left: marriage, kids, real life, etc... Now it's mostly handfuls of edgy boys on cell phones in school and 4chan-GPT creating and responding to threads. I wish I were wrong. The site went mostly dead for about two weeks when USAID was defunded and had to shift funding sources, then all the usual re-re-re-re-re-posted topics in /g/ returned. Some of them are on this site too ... inb4 they reply. Adding to this, now the general public has the real names, IP addresses and locations of all the moderators, so they are less likely to participate in doxxing.
There was a quote, "4chan is where smart people go to act stupid, facebook/reddit is where stupid people go to act smart". That probably needs to be updated.
I never said that. USAID manipulates narratives on all popular multimedia and social media sites. Anyone may post on 4chan, and anyone with a 4chan Pass may use proxies and VPNs.
I wouldn't be surprised to read that on 4chan, but on HN ... we need some credible citations. :)
Speaking of misinformation, there are efforts to suggest USAID is actually "US AID", implying they are some type of aid organization, including putting "AID" in a different color in their logo. A few times a year they contribute small amounts of resources so they can get away with saying it, but they are actually the United States Agency for International Development [3], originally meant to sway public opinion in other nations but which started targeting people in the USA and its allies.
I think the take-away is that everything on the internet, including references and citations, is probably misinformation of misinformation of misinformation. I have sympathy for AI trying to ingest all of it.
[1] - https://unherd.com/newsroom/documents-reveal-us-government-a...
[2] - https://docs.house.gov/meetings/FA/FA14/20190521/109537/HHRG...
[3] - https://en.wikipedia.org/wiki/United_States_Agency_for_Inter...
Why be disingenuous? Do you have something to lose by an honest search for the truth? Do you not want to look for it? Are you so sure that your narrow political group has the truth and no search is needed?
1. Raise the cost of conducting malign influence operations against the United States and its allies.
2. Close vulnerabilities that foreign adversaries exploit to undermine democratic institutions.
3. Separate politics from efforts to unmask and respond to foreign operations against the U.S. electoral process.
4. Strengthen partnerships with Europe to improve the transatlantic response to this transnational threat.
5. Make transparency the norm in the tech sector.
6. Build a more constructive public-private partnership to identify and address emerging tech threats.
7. Exhibit caution when reporting on leaked information and using social media accounts as journalism sources.
8. Increase support for local and independent media.
9. Extend the dialogue about foreign interference in democracies beyond Washington.
10. Remember that our democracy is only as strong as we make it.
It's significant that a political faction does everything it can to remove barriers to disinformation, for example using lawfare and other attacks to shut down research into it, using political power to disable the country's ability to protect itself.
That would seem to be the least intrusive option.
Using the internet in the UK/EU is such a horrible experience, every cookie pop-up is a reminder how badly thought out these rules are.
[1] - https://www.rtalabel.org/index.php?content=howtofaq#single
A client checking for a header is more than sufficient to block small children from seeing porn, and that is 100% more than we have today. No extra memory or CPU required, which matters on tablets or phones handed to children. No privacy invasion by daemons or other third parties.
Kid: "Mommie they said go to pornhub.com for games but it ask for password"
Mom: "Dumb trolls are picking on you, I will deal with them."
Until then I will shed no tears about your slightly lowered effectiveness at manipulating people into acting against their own best interests.
Also remember that the pop-up is an industry choice, the rules only mandate that a user should opt in, not how. No laws mandate the cookie banners, no regulations say they should be obnoxious.
There's no need, that's already the case.
All phones (the network account attached to the SIM actually, not the phone itself) come with a content filter enabled by default in the UK, adult or not.
Neither resident nor frequent visitor to the UK, so I'm behind the times when I ask: I beg your fucking pardon?
Is there further reading on this inane nanny-state horror, ideally via a Wikipedia article on the law or gentleman's agreement amongst the carriers?
Furthermore, is this more common than I assume, and I simply don't notice because I don't stray too far from the mainstream?
Yep, my thoughts exactly when I first encountered it.
> Is there further reading on this inane nanny-state horror
I tried to look something up, but it seems the articles and news about the (new) Online Safety Act have taken over all of the search results (and it's not something I want to search too hard for at work). I even asked an LLM, but it couldn't provide sources and simply said it was "voluntary" and an "industry standard". The rest of its output was drowned in the new Online Safety Act.
I suppose thanks to the OSA the old system is now history.
What's to stop that same kid to buy a porno dvd? Or to download a torrent of a porno? Or a porn magazine?
All the routers come with filtering settings as well, and ISPs ship with the filtering on by default, since that is the law and has been for several decades.
my dream is that ISPs would be allowed to sell this, but not allowed to call it internet access.
White listing worked for a while (months) when they were young, but it was super-high touch and stuff just broke all the time. You try to whitelist a site, but you have to then figure out all their CDNs.
Restricting specific sites works, sort of, until they find some place that hosts that content. Blocking youtube doesn't work(*), every search engine has a watch videos feature. (Why are you spending 3 hours a day on DDG?) There's really no way to segment youtube into "videos they need to watch for school" and "viral x hour minecraft playthrough". Somehow, we've managed to combine the biggest time waste ever with a somewhat useful for education hosting service.
That's leaving out the jailbreaks that come from finding an app's unfiltered webview and getting an open web escape there.
There's basically no reliable method for filtering even on locked down platforms.
* there's probably a way to kill it at the firewall based on dns, but that's iffy for phones and it's network wide.
The regexes are:
(^|\.)youtubei\.googleapis\.com$
(^|\.)ytstatic\.l\.google\.com$
(^|\.)ytimg\.l\.google\.com$
(^|\.)youtube-ui\.l\.google\.com$
(^|\.)youtube\.com$
(^|\.)ytimg\.com$
(^|\.)googlevideo\.com$
You can create groups and assign devices to them, and assign the block rules only to certain groups.
The only annoyance with this is that it blocks logging into Google since they redirect to YouTube to set a login cookie as part of the Google login process. If you're already logged into Google though, everything works as normal, and you can always disable pihole for five minutes if for some reason you got logged out and need to log back in.
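If you want to sanity-check what those patterns actually catch before loading them into Pi-hole, here is a small standalone sketch; the hostnames at the bottom are illustrative examples, not an exhaustive list of what YouTube uses.

```python
# Offline test of the YouTube-blocking regexes quoted above. Each pattern
# matches the bare domain and any subdomain of it, which is how Pi-hole's
# regex blocking is applied to incoming DNS queries.
import re

BLOCK_PATTERNS = [
    r"(^|\.)youtubei\.googleapis\.com$",
    r"(^|\.)ytstatic\.l\.google\.com$",
    r"(^|\.)ytimg\.l\.google\.com$",
    r"(^|\.)youtube-ui\.l\.google\.com$",
    r"(^|\.)youtube\.com$",
    r"(^|\.)ytimg\.com$",
    r"(^|\.)googlevideo\.com$",
]

def is_blocked(hostname: str) -> bool:
    """True if a DNS query for this hostname would hit one of the rules."""
    return any(re.search(pattern, hostname) for pattern in BLOCK_PATTERNS)

# Illustrative hostnames: video traffic is caught; accounts.google.com
# itself is allowed, but the sign-in redirect through youtube.com is
# blocked, which is the login quirk mentioned in the comment above.
for host in ["www.youtube.com",
             "rr4---sn-aigl6nze.googlevideo.com",
             "accounts.google.com",
             "duckduckgo.com"]:
    print(f"{host}: {'blocked' if is_blocked(host) else 'allowed'}")
```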
Neither is the tech for locking down all online identity to government-controlled access... But I have strong opinions about which one everybody should/shouldn't start creating!
Technical cookies don't require any consent so every time you see a cookie banner the website owner wants to gather more data about you than necessary. Furthermore, these rules don't require cookie banners, it's what the industry has chosen as the way to get consent to track their users.
Check the banner next time, you'll see how many “partners” they do sell your data to.
When purchasing an internet-enabled device the UK could regulate that large retailers must ask if the device is to be used by an under 18 year old. If they say yes, then they could ship with filters enabled. They could also regulate that all internet-enabled devices which could be sold to children should support child filters.
If we did this then whether or not a child views NSFW material it will be on the parent, instead of the current situation where whether a child can view NSFW material online depends on the age verification techniques of Chinese companies like TikTok or American companies like 4chan.
All mobile network connections already come with content filters enabled in the UK, adult or not, and they have to be explicitly disabled.
When you buy wifi, they already make sure you're an adult. They ask for proof of residence, you sign a contract. Children cannot buy wifi. Go ahead and try - no ISP is going to write a contract with a child.
Wifi, like tobacco and alcohol, is already age restricted.
The problem is the adults buying it then turn around and just... Hand it to children. That's not the fault of the law or society.
Like, okay, the store clerk might make sure when I buy a pack of menthols that I'm of age. But if I just go home and hand my kid the pack of menthols, all bets are off. That's not the store clerk's problem; he can't and won't get in trouble for that.
Parents and establishments are being stupid here. Same applies for public wifi. Don't want kids to use it? Okay, give it a password, only tell the password to adults. Easy peasy.
The law can't stop parents from being stupid.
But it is society's problem, and within society's capacity to attempt to manage.
https://www.childline.org.uk/info-advice/you-your-body/drugs... says it's illegal to give a child cigarettes, and the cops can confiscate them if you're 16 or below.
> The law can't stop parents from being stupid.
Sure, but reality also often means smart, caring parents still can't stop kids from... being kids. I've lived in places where half a dozen public wifi hotspots were available; even if I didn't, chances are I'd have to let my kids on wifi for homework, on computers I don't have admin rights to because they come from the school.
They can't go sign up for a new internet plan, but that's hardly required.
Sure, to an extent, but not really: we give parents a lot of freedom here.
> Sure, but reality also often means smart, caring parents still can't stop kids from... being kids. I've lived in places where half a dozen public wifi hotspots were available; even if I didn't, chances are I'd have to let my kids on wifi for homework, on computers I don't have admin rights to because they come from the school.
Okay, then lock down those networks. We don't need to lockdown the Internet as a whole.
In reality, most of those networks already are locked down.
Try searching up porn on, say, hotel wifi, it won't work.
We already have the solution.
I… very much doubt that.
Both require being an adult.
And, "free wifi", like you're talking about, already blocks porn. So problem solved, right?
What's actually the issue here? Because nobody seems to be able to articulate it. What problem are we solving?
But neither of those is "buying Wi-Fi" - that's why I'm confused.
Like you can configure your browser to do whatever you want with cookies - blocking them all, blocking only third party ones, etc. - there is no need for government regulation here.
But the legislators are completely tech illiterate and even the general public supports more interference and regulation.
It’s not possible to rely on browser controls as-is, because they do not differentiate between necessary and optional cookies.
Browser vendors could agree standards and implement them, exposing these to users and advertisers in a friendly way.
But they haven’t shown any interest in doing this.
I wonder why?
One of the hundreds of reasons do_not_track failed. You cannot do something that trusts the website operators, because they are egregiously untrustworthy.
The cookie banner everyone keeps bitching about is a direct example of this. No website is required to have a cookie banner. They choose to, because they know most users click "Yes to all", and then complain about the regulators, instead of the assholes asking you to consent to sharing your data with nearly a thousand third parties
And "browser vendors" will never do anything, because 90% of the market is a literal advertising behemoth, the rest of the market is owned by a company that makes money only when you do things not through the web browser.
My point is about UX: it could be much slicker if the browser industry standardised the consent mechanism.
You make a good point about lack of incentives.
I'd welcome a ramp-up of the legislation: outlaw the kind of tracking that needs the banners currently outright. I'm sure a lot of websites would just geo-block EU as a result (like how some did because of GDPR), but I bet the EU-compliant visitor tracking solutions would suddenly skyrocket, and overall, nothing of value would be lost, neither for the users, nor for the website administrators.
The question a user should ask is why is this website collecting my data. Marketing and adtech companies are trying to shift this question to why is the EU making websites worse.
> there is no need for government regulation here
You don't need to care about this if you respect users' privacy in the same way you don't need to care about waste water regulation when you don't pump waste into rivers.
That's what the advertising-dependent implementers who deliberately made it shittier than necessary (stuff like "you have to decline each of our 847 ad partners individually") want you to think, at least. It's mostly malicious compliance.
But people (like my girlfriend) still click "Allow all" because they don't seem to realize that the legislation requires the website to still function if you decline unnecessary cookies!
The banner is literally an attempt to FOMO you into accepting cookies you never need to accept!
IMO the EU is somewhat in dereliction of duty for not punishing cookie-banner sites.
This policy was pushed by David Cameron, who was the prime minister at the time:
https://www.gov.uk/government/speeches/the-internet-and-porn...
615 more comments available on Hacker News