RF Shielding History: When the FCC Cracked Down on Computers
Key topics
The article covers the history of the FCC's RF shielding regulations and their impact on computer design, sparking a discussion about the balance between regulation and innovation, as well as ongoing problems with electromagnetic interference.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 59m after posting
Peak period: 28 comments in 0-6h
Avg / period: 8.5
Based on 34 loaded comments
Key moments
1. Story posted: Oct 21, 2025 at 10:26 AM EDT (3 months ago)
2. First comment: Oct 21, 2025 at 11:25 AM EDT (59m after posting)
3. Peak activity: 28 comments in 0-6h (the hottest window of the conversation)
4. Latest activity: Oct 23, 2025 at 5:45 PM EDT (3 months ago)
And really, it's not the consumer's place to be aware of these things. It's the regulators'. And they've dropped the ball.
Would be nice to have more metal cases for SBCs, like the one on R4S, https://www.androidpimp.com/embedded/nanopi-r4s-review
KKSB makes metal cases for some SBCs, https://kksb-cases.com
Visually, I don't care particularly much one way or the other, but on a 6-12 layer PCB there's plenty of opportunity to closely couple and shield fast-changing signals, so I wouldn't expect that surrounding the board in a Faraday cage would be needed (and I've certainly never noticed an issue from the computers near my RF receivers).
I went out of my way to avoid extra lights, which sadly wasn’t possible for the GPU. I had to figure out how to turn those off in software.
Another reason to use dark mode I guess
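A minimal sketch of one way to turn GPU lighting off in software, assuming the card is supported by OpenRGB and its SDK server is running; the commenter didn't say which tool they actually used:

```python
# Hypothetical sketch: switch off all GPU lighting via OpenRGB's SDK server.
# Assumes the OpenRGB server is running locally (e.g. `openrgb --server`)
# and the GPU is a device OpenRGB supports.
from openrgb import OpenRGBClient
from openrgb.utils import RGBColor, DeviceType

client = OpenRGBClient()  # connects to 127.0.0.1:6742 by default
for gpu in client.get_devices_by_type(DeviceType.GPU):
    gpu.set_color(RGBColor(0, 0, 0))  # black = LEDs off
```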
If the FCC hadn't been so strict I think there's a good chance we'd be using computers with a lineage going back to Atari versus IBM today.
Commodore ate Atari's lunch with the C64 and pricing, but Atari could have launched the 400/800 at lower price points with more lax emission standards. They would have had lower peripheral price points, too, since the SIO bus and "smart peripheral" design was also an emissions strategy.
On the home computer front the Atari 8-bits beat the pants off of the PET for graphics and sound capabilities. More success in the late 70s might have enabled Atari to build down to a price point that would have prevented the C64 from even happening.
On the business side Atari R&D had interesting stuff going on (a Unix workstation, Transputer machines). Alan Kay even worked there! They were thinking about business computing. If the 8-bits had had more success I think more interesting future products could have been brought to market on the revenue they generated.
And you make a good point about the SIO bus - this was when every other machine had unshielded ribbon cables everywhere. Their devotion to daisy chained serial really crippled them in terms of speed, and when USB finally arrived, I initially scorned it due to the prejudice formed by my experience with the Atari peripherals! It turns out they were on the right track all along!
And/or many of the other manufacturers of that era. I have encountered execs from that era who still believe the whole thing was some sort of shrouded protectionism.
I kept reading "must accept" as a technical requirement, somehow like "must not be shielded against" or "must not use technical means to protect against", rather than what I now think is the intended legal sense "does not have any legal recourse against".
It's weird that they phrased it in terms of how the device itself must "accept" the interference, rather than the owner accepting it.
I guess there are two ways to look at it. Either the regulation was wildly successful, so the problems persist only in the less-regulated spaces. Or we spend a lot of effort chasing the wrong problem.
https://www.nytimes.com/2019/05/04/us/key-fobs-north-olmsted...
(https://archive.is/aTWZ2)
Apparently the regulations work well enough to provoke an official response when garage door openers stop working over the area of a few houses… a level of reliability I’d long taken for granted
It correlates with aircraft carriers coming to the Bremerton naval shipyard for repair work.
The 315 MHz band for unlicensed low-power civilian devices overlaps with several military bands, and the military users have priority (i.e., if they interfere with civilians it is the civilians' problem, and if civilians interfere with the military it is also the civilians' problem).
In particular a band used for air-to-ground aircraft communications includes the 315 MHz band. Apparently when a carrier is at the shipyard for repairs they take advantage of that opportunity to test and tune up the radio systems.
They also have radar that uses frequencies in the 315 MHz band. There was an incident in the news in 1999 when the USS Carl Vinson visited Hobart, Australia and disabled nearly every garage door opener within about 10 km of the port while it was pulling in and docking. It was fine once docked, because they don't need the radar when not underway.
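A toy illustration of the allocation overlap described above, assuming the commonly cited 225-400 MHz military UHF air band; the exact band edges here are an assumption for illustration:

```python
# Why a Part 15 device at 315 MHz can collide with military users:
# the 315 MHz remote-control band sits inside the military UHF air band.
# Band edges are rounded/assumed for illustration.
MILITARY_UHF_AIR = (225.0, 400.0)  # MHz, air-to-ground comms (primary user)
GARAGE_REMOTE = 315.0              # MHz, unlicensed Part 15 device

lo, hi = MILITARY_UHF_AIR
if lo <= GARAGE_REMOTE <= hi:
    print("Overlap: the unlicensed user must accept whatever the primary user does.")
```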
However, once aware of the potential problems it wasn't too hard or even very expensive to design hardware which avoided the most serious problems. Properly grounding components and a little bit of light shielding here and there would generally suffice to ensure most devices wouldn't cause noticeable issues more than two walls and 30 feet away. I think by the 90s the vast majority of hardware designers knew how to mitigate these issues while the evolution of consumer device speeds and designs reduced the risks of actual interference on both the 'interferor' and 'interferee' sides.
Unfortunately, the FCC's regulatory testing requirements didn't similarly evolve. Hardware designers I worked with described an opaque process of submitting a product for FCC testing only to receive a "Pass/Fail" with no transparency into how it was tested. Sometimes the exact same physical product could be resubmitted a month later with zero changes and pass. This made things unpredictable and slow, which could be a lethal combination for small hardware startups. So there emerged a sub-industry of "independent RF testing labs" which you could pay to use their pricey gear and claimed expertise to test your device and tell you why it failed, let you make a change right there and retest again until you passed. This made things more predictable but it could cost upwards of $10K (in 90s dollars) which was a serious hardship for garage startups. I was told a lot of the minor hardware changes made during such interactive testing probably did nothing to decrease actual interference in the real-world and only served to pass the test.
Then came the era of "weaponizing" FCC certification. Small startups could avoid the costs and delay of FCC testing by filing their product as a "Class A" device (intended only for use in industrial/scientific environments) instead of as a "Class B" (consumer) device. The devices still had to not interfere, but their makers could self-certify their internal tests without going through FCC testing.

When a new hardware startup threatened a large, established company's product with a cheaper, better product shipped as "Class A", BigCo would report them as either interfering or being used in consumer environments - despite the device very likely not interfering with anything. This created a lot of problems for such startups: if their cool new product ended up even once in an arguably "retail distribution channel", they could get hit with big fines - all without ever causing any actual interference, and even if the device could have passed FCC testing and been certified as Class B.

It got especially ridiculous when a lot of cheaper products were simply generic designs, like a modem using the standard Rockwell chip set and reference design. These were often made on the same production line, and even used the same circuit board in a different case, as other products which all passed FCC testing. But if you didn't have your official "FCC Cert", you could get busted.
I left the hardware space in the early 2000s so I never heard if these regs were ever modernized, but it sure seemed like they were in need of it.
When I would fire up my KIM-1, the TV would turn to snow.
There was a toy called the "Big Trak", a programmable ATV toy. If you ran that underneath the desk with a TRS-80 on it, it would crash.
The TRS-80 Model 1 was notorious for this, as you connected the computer to the expansion interface with a bare, roughly 40-pin ribbon connector. It was a beautiful broadcast antenna for computer signals.
The FCC was an impetus for the Model 3.
It had an amazing selection of ports, all unshielded and designed for flat ribbon cables. But that wouldn't fly in the USA.
[0] https://www.youtube.com/@ComputinghistoryOrgUk1
"Tempest-LoRa: Cross-Technology Covert Communication via HDMI RF Emissions", https://news.ycombinator.com/item?id=44483262
On the other hand, I have been struggling to get my IP KVM at home working and it turned out that the cause of its failure was some cheap PoE extractors that spew noise across the spectrum, especially interfering with VGA sync.
Modern equipment, assuming you aren't buying bargain-basement AliExpress junk (which I do, from time to time), is surprisingly good at RF rejection.
And, amusingly, this just popped up on Twitter: https://x.com/hakluke/status/1980479234159398989