The Peach Meme: On CRTs, Pixels and Signal Quality (Again)
Posted 3 months ago · Active 2 months ago
datagubbe.se · Tech story · High profile
Key topics
CRTs
Retro Gaming
Display Technology
The article discusses the "peach meme" and the nuances of CRT display quality, sparking a discussion of nostalgia for CRTs and how they compare technically with modern displays.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion · First comment: 7h after posting · Peak period: 37 comments (Day 8) · Avg / period: 10.4
Comment distribution: 52 data points (based on 52 loaded comments)
Key moments
- Story posted: Oct 13, 2025 at 12:33 AM EDT (3 months ago)
- First comment: Oct 13, 2025 at 7:27 AM EDT (7h after posting)
- Peak activity: 37 comments in Day 8 (the hottest window of the conversation)
- Latest activity: Oct 24, 2025 at 3:45 PM EDT (2 months ago)
ID: 45564696 · Type: story · Last synced: 11/20/2025, 2:49:46 PM
They also said the impression is different since it's so close up - what does it look like at the size you'd really see it in game?
The article mentions later that it's a PVM-20L2MD [1]. This is a professional CRT monitor for medical devices. It uses the same signals as a consumer TV, but comes with a higher quality tube that has a sharper picture.
[1] https://crtdatabase.com/crts/sony/sony-pvm-20l2md
It works beautifully, and you no longer need a clunky, heavy, dying CRT. I'm sure the purists will say it's not the same, but I've done side-by-side comparisons IRL and it's good enough for me even when pixel peeping. I prefer the emulated CRT look on a modern OLED to the real thing these days.
To properly capture VHS you need something called a TBC (time base corrector). Most have died over the years, and the ones that are left are either very expensive or dying as their caps fail. A TBC that was once considered low end commonly sells for $1k+ today.
The RetroTINK can do most of what a TBC does, and it's a modern device you can actually buy for a reasonable price. It can also upscale and deinterlace for you in the process, saving a ton of work later. The serious archivist would scoff, but it's good enough for home movies, and I would argue that it introduces less noise than 30-year-old professional equipment with dying caps.
The anime art and FMV sequences looked way better too.
For that period it even shaped my perception that analog video, and especially N64 graphics, was always bad, but all of that was vindicated by those shaders. They really do make a big difference, and they gave me a new appreciation for N64 graphics in particular.
There is some internet misconception that the inherently "blurry" output of an N64 is bad (and sure, some games are just ugly/bad from an artistic standpoint), but it's actually the smoothest image any analog console will ever produce when hooked up to a proper CRT or CRT shader, and it's consistent across all games because of the "forced" high-quality AA. Even the next generation of consoles seldom used AA.
Playing something like The Dark Knight Rises from a UHD Blu-ray on a good OLED looks incredible!
It's worse for a movie because those are 24fps.
Regarding the post's notes on the Sonic waterfall effect, the [Blargg NTSC Video Filter](https://github.com/CyberLabSystems/CyberLab-Custom-Blargg-NT...) is intended to recreate that signal artifact, and similar processing is included in a lot of the available CRT shaders. I found that RGB had a visual artifact when moving that made the waterfall flicker, but composite didn't, so I played on that setting. Running it with the beam simulator is probably causing some of that.
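To make the mechanism concrete, here's a minimal sketch (not the Blargg filter itself, just illustrative averaging in Python): composite video's limited horizontal bandwidth smears neighbouring samples on a scanline together, which is what turns the hard dither columns into something that reads as transparency.

```python
import numpy as np

def blend_scanline(scanline: np.ndarray, kernel_width: int = 3) -> np.ndarray:
    """Approximate composite blur by averaging each sample with its neighbours."""
    kernel = np.ones(kernel_width) / kernel_width
    return np.convolve(scanline, kernel, mode="same")

# Alternating 1-pixel columns: waterfall colour (0.9) over background (0.3).
dithered = np.tile([0.9, 0.3], 8)
blended = blend_scanline(dithered)

print(dithered.round(2))  # hard 0.9 / 0.3 stripes, the "RGB" look
print(blended.round(2))   # values pulled toward the 0.6 average, the "composite" look
```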
Maybe I just didn't play games that used tricks to get around the pixels?
---
That being said, I remember that "New Super Mario Brothers" on Wii appeared to use a CRT trick to try and make the background flash in a boss room. I always played my Wii in 480p, so it just looked like there were vertical lines in the boss room.
The NES had a particular quirk with its NTSC output that I always thought was very characteristic of NES. I found this article a few years ago, and was fascinated that work was done to really figure it out - https://www.nesdev.org/wiki/NTSC_video - and it's awesome that at least some emulators (FCEUX) seem to use this info to generate an experience quite similar to what I remember the NES being when I was growing up. But I don't think any NES game graphics really depended on this for any visual output. All NES games had jagged vertical lines, for example.
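As a back-of-envelope illustration of where those jagged edges come from (the real signal generation described on the nesdev page is more involved): the NES pixel clock is 1.5x the NTSC colour subcarrier, so the chroma phase a given screen column lands on shifts from scanline to scanline.

```python
PIXELS_PER_LINE = 341          # PPU pixels per NES scanline
CYCLES_PER_PIXEL = 2.0 / 3.0   # chroma cycles per pixel (pixel clock = 1.5x subcarrier)

def chroma_phase(line: int, pixel: int) -> float:
    """Subcarrier phase in degrees (mod 360) at the start of a given pixel."""
    cycles = (line * PIXELS_PER_LINE + pixel) * CYCLES_PER_PIXEL
    return (cycles * 360.0) % 360.0

# The same screen column hits a different chroma phase on each of three
# consecutive scanlines, so a vertical edge can't look clean in composite.
for line in range(4):
    print(line, round(chroma_phase(line, 100), 1))
```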
Ahh: I always wondered why I never saw interlacing artifacts on the NES! (I'm going to assume the same thing for the SNES too.)
I certainly feel that way when watching interlaced video. There's far too much bad deinterlacing out there. One of the biggest problems that I encounter is that deinterlacers tend to reduce the frame rate (i.e., 60i -> 30p instead of 60p, or 50i -> 25p instead of 50p).
(That being said, when I would watch interlaced live TV on a 30" or more TV, I'd see all kinds of interlacing artifacts.)
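For reference, a minimal sketch of the field-rate ("bob") approach that keeps the full temporal resolution, assuming a plain numpy array stands in for a decoded interlaced frame:

```python
import numpy as np

def bob_deinterlace(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split one interlaced frame (H x W) into two line-doubled progressive frames."""
    top = frame[0::2]     # even lines: the earlier field
    bottom = frame[1::2]  # odd lines: the later field, ~1/60 s (or 1/50 s) newer
    # Naive line doubling; real deinterlacers interpolate or adapt to motion.
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

interlaced = np.arange(16, dtype=float).reshape(8, 2)  # toy 8-line "frame"
first, second = bob_deinterlace(interlaced)
print(first.shape, second.shape)  # two output frames per input frame: 60i -> 60p
```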
If/when this happens, we may be able to again view things much as they were in broadcast, but on a modern display instead of an old CRT.
If we can find any content to play that way, anyhow. A lot of it is cheerfully being ruined.
Aside from the dozens of us who are interested in that, most of the rest of the folks seem convinced that television looked just as terrible as a blurry SLP VHS tape does after being played through a $12 composite-to-USB frame grabber, using a 3.5mm "aux cable" jammed in between the RCA jacks of the VCR and the converter, and ultimately delivered by an awful 360p30 codec on YouTube, before being scaled in the blurriest way possible... and they draw from this the conclusion that there are no details with any value worth preserving.
Even though television was never actually like that. It had limits, but things could be quite a lot better than that awful mess I just described.
(For those here who don't know: Towards the end of the run, the quality of a good broadcast with a good receiver would often be in the same ballpark as the composite output of a DVD player is today (but with zero data compression artifacts instead of >0), including the presentation of 50 or 60 independent display updates per second.)
To truly do that, you need to display over 20 million frames a second.
Why?
True analog video didn't capture frames; instead, each pixel was transmitted / recorded as it was captured. This becomes clear when watching shows like Mr. Rogers on an LCD: when the camera pans, the walls look all slanted. (This never happened when viewing on a CRT.) This is because the top part of the image was captured before the bottom part. I wouldn't even expect a 60i -> 60p deinterlacer to correct it.
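The skew is easy to estimate; a rough back-of-envelope (the pan speed below is an assumed number, not measured from any broadcast):

```python
FIELD_TIME_S = 1.0 / 59.94      # NTSC field duration
pan_speed_px_per_s = 600.0      # assumed pan speed in on-screen pixels per second

skew_px = pan_speed_px_per_s * FIELD_TIME_S
print(f"Top-to-bottom skew during the pan: ~{skew_px:.1f} px")
# A CRT replays the same top-to-bottom timing, so the lean is invisible there;
# a display that presents the whole field at once turns it into slanted walls.
```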
That being said, I don't want to emulate a CRT:
- I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable. (Unless I slow down the video / look at stills.)
- I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.
CRTs had a look that wasn't completely natural; it was pleasant, like old tube amplifiers and tube-based mixers, but it isn't something that I care to reproduce.
> I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable.
> I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.
---
If we want to black out lines after 3ms, we might as well bring these back: https://www.youtube.com/watch?v=ms8uu0zeU88 ("Video projectors used to be ridiculously cool", by Technology Connections.)
In that case you just need a 60Hz-synced scanout, and you can get screens that do that right now. That will beat any machine learning stabilizer.
Please allow me to restate my intent: With enough angular resolution (our eyes have limits), and enough brightness and refresh rate, we can maybe get close to what the perception of watching television once was.
And to clarify: I don't propose completely chasing the beam with OLED, but instead emulation of the CRT that includes the appearance of interlaced video (which itself can be completely full of fields of uncorrelated as-it-happens scans of the continuously-changing reality in front of the analog camera that captured it), and the scan lines that resulted, and the persistence and softness that allowed it to be perceived as well as it once was.
In this way, panning in an unmodified Mr. Rogers video works with a [future] modern display, sports games and rocket launches are perceived largely as they were instead of as a series of frames, and so on. This process doesn't have to be perfect; it just needs to be close enough that it looks roughly the same (largely no better, nor any worse) as it once did.
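One plausible building block for that kind of emulation (a hypothetical sketch, not any shipping implementation) is a rolling scan on a high-refresh panel: each 60 Hz source frame is split into sub-refreshes that each light only a moving horizontal band, approximating the beam sweep and short persistence.

```python
import numpy as np

def rolling_scan_subframes(frame: np.ndarray, subdivisions: int = 4) -> list[np.ndarray]:
    """Split one source frame into sub-frames that each light one horizontal band."""
    band = frame.shape[0] // subdivisions
    subframes = []
    for i in range(subdivisions):
        sub = np.zeros_like(frame)
        sub[i * band:(i + 1) * band] = frame[i * band:(i + 1) * band]
        subframes.append(sub)
    return subframes

frame = np.ones((240, 320))              # toy 60 Hz source frame
subs = rolling_scan_subframes(frame, 4)  # shown back-to-back on a 240 Hz panel
print(len(subs), subs[0].shape)          # 4 sub-frames, each (240, 320)
```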
My completely hypothetical method may differ rather drastically in approach from what you wish to accomplish, and that difference is something that I think is perfectly OK.
These approaches aren't mutually exclusive. There can be more than one.
And it seems that both of our approaches rely on the rote preservation of existing (interlaced, analog, real-time!) video, because once that information is discarded in favor of something that seems good today, future improvements (whether in display technology, in deinterlacing/scaler technology, or both) for any particular video become largely impossible.
In order to reach either desired result, we really need the interlaced analog source (as close as possible), and not the dodgy transfers that are so common today.
With pixel art, go ahead and add bloom, and faster phosphors become more important, but not much else. I don't think anything benefits from focus issues and power-supply sag.
It was originally proofed on a CRT. It may have been a ridiculously good Sony BVM in excellent calibration, but it was still a CRT that had CRT issues.
(I don't care much if nobody else wants that experience. I'll build it myself when I feel that modern displays have become flexible enough to accomplish my goals.)
Wanting the whole package of CRT flaws isn't just playing a record, it's playing a record with a cheap needle. Go ahead if it feels more nostalgic or right to you, and I wish you luck in finding what you want. But I don't think it adds anything, and to some extent it detracts from the original.
So I hope you have a nice day too, but I also hope you don't come away with the wrong idea about my comment.
I wasn't being fake or patronizing; there are reasons to want that, it's just that those reasons usually aren't about original authorial intent.
It’s like sometimes preferring 24 fps cinema or oil paints over photography.
It depends on what’s being displayed.
The utilitarian POV will always look for the best (least noisy / most accurate / most reproducible) medium possible. But when it comes to art, many other aspects factor in, and a little bit of noise can very well add something to a piece. Not always, but often enough.
No one is campaigning to get rid of the beautiful modern 4k OLED displays and return to CRTs for everything. But for low resolution content (retro games and computers) it looks better on a CRT.
There's pretty good modern emulation of the look, but at the end of the day it's a different technology and they do look different.
Not to mention the feel of playing through a near-zero latency analogue signal is noticeable if you've played enough of certain games. There's a reason speedrunners of retro games all play on console with a CRT.
Those gaps have finally started closing in the last few years, now that 4K 240Hz 99% DCI-P3 OLED monitors are readily available and relatively affordable.
Short version: Our eyes are constantly tracking objects on screen ("smooth pursuit"), which leads to visible smearing on OLED and LCD screens, because they hold their frame for the entire frame time rather than just flashing them for a short fraction of that time. Especially fast paced and side scrolling games like Sonic look much better on CRT screens. (But CRTs have much lower brightness and hence contrast than modern screens.)
Full explanation here: https://news.ycombinator.com/item?id=42604613
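The rough arithmetic behind that smearing is tracking speed times the time each frame stays lit (the speed below is an illustrative number):

```python
speed_px_per_s = 960.0  # assumed eye-tracking / scroll speed

for label, hold_time_s in [("60 Hz sample-and-hold", 1 / 60),
                           ("240 Hz sample-and-hold", 1 / 240),
                           ("CRT-like ~1 ms impulse", 0.001)]:
    blur_px = speed_px_per_s * hold_time_s
    print(f"{label}: ~{blur_px:.1f} px of perceived smear")
```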
I held on to my 21" Trinitron for a long time into the LCD era, because it had better contrast, resolution and sharpness. Eventually, affordable LCDs did catch up.
I don't get nostalgic about any technologies - and certainly wouldn't get nostalgic about cathode ray tubes which were big, heavy and had innate limitations. However, I am serious about vintage game preservation and I care about seeing classic game art which was created on and for CRTs accurately presented as the original developers and artists saw it. These days that's as easy as playing games from the CRT era with a good CRT-emulation pixel shader.
What frustrates me is when I see classic 80s games on popular retro YouTube channels objectively looking far worse than they actually did in the 80s. That happens because some of that artwork was painstakingly hand-crafted to exploit the unique traits of both the analog video format and CRTs. When presented without a pixel shader, some of those titles simply look wrong - and in some cases, egregiously so. I know because I'm old enough to have been there, worked with and learned from some now-legendary 80s game developers and artists.
The hard-edged, square pixel blocks many young people (who've never seen a CRT) think a retro game like Pac-Man or Joust should have are a strange historical anomaly. When I show them what the games actually looked like either via a good pixel shader or on my arcade emulation cabinet's 25-inch analog RGB, quad-sync CRT (which was made for high-end arcade cabinets), they're often shocked. I hear things like "Wow, I thought retro was cool because it looked so janky but it was actually softly beautiful." To me, the importance of CRTs (and CRT shaders) isn't about injecting analog degradation to recreate some childhood nostalgia for the crappy RCA TV your parents had in the living room (with rolling hum bars from the unshielded RF modulator), it's about making games like Pac-Man and Joust look as good as they really did on the industrial CRT in their arcade cabinet (which could be better than the best TV many consumers ever owned). Or alternatively, making console games look as good as they did to the original developers and artists, who usually used the highest-quality CRTs they could because they were after the best-possible image for the same reasons recording studios have always used reference-grade audio speakers.
So yeah, it's not honoring those historic games when retro YouTubers show them in a degraded form that looks far worse than they ever did in the day - especially when it's now so easy to present them accurately by turning on a CRT shader that's already built into many retro emulators. As others in this thread point out, even the best pixel shaders aren't completely perfect, but as a retro-purist (and video engineer whose career spanned the analog and digital video eras) I concede today's best pixel shaders are 'accurate enough' and certainly far better than hard-edged block pixels. It's weirdly tragic because what some people think 'retro' games looked like isn't worse than a bad consumer TV was - or better than a good analog RGB CRT was - it's just wrong on some bizarre third-axis of digital jank which never existed in the CRT era.
It's a shame, because I really like the original, muted colour palette. Some artists produced great results within the limitations of the LCD screen. Similar to the CRT spritework in this article, it feels like a lost art.
However, there's an interesting complication: quite a lot of Game Boy Color and Game Boy Advance developers seem to have created their game art on sRGB monitors, and then shipped those graphics without properly considering how they would appear on the LCD. Those games appear muddy and colourless on a real console - they actually look much better when emulated "incorrectly", because the two errors exactly cancel out!
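Emulators usually offer a colour-correction pass to simulate the handheld LCD; the sketch below only illustrates the general shape of such a transform, and its mixing weights and gamma are made-up placeholders rather than the coefficients any particular emulator uses.

```python
def lcd_correct(r: float, g: float, b: float,
                mix: float = 0.15, lcd_gamma: float = 1.3) -> tuple[float, float, float]:
    """r, g, b in [0, 1]; returns a duller, desaturated colour (illustrative only)."""
    keep = 1.0 - 2.0 * mix
    # Bleed some of each channel into the others to lower saturation.
    r2 = keep * r + mix * g + mix * b
    g2 = keep * g + mix * r + mix * b
    b2 = keep * b + mix * r + mix * g
    # Gamma > 1 darkens midtones, loosely mimicking the dim original panel.
    return (r2 ** lcd_gamma, g2 ** lcd_gamma, b2 ** lcd_gamma)

print(lcd_correct(1.0, 0.0, 0.0))  # pure sRGB red comes out as a muted brick red
```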
I still play Diablo I on the Sony to this day. Wonderful monitor. I will cry when it finally dies.
Given their ability to generate a painting that appears identical to a photo, could they depict how the image appears to them, eliminating any loss from mechanical capture?