Stranger Things Creator Says Turn Off “garbage” Settings
Key topics
The debate over TV settings rages on after the Stranger Things creators urged viewers to turn off "garbage" settings for a better viewing experience, sparking a lively discussion about the merits of Filmmaker Mode and the pitfalls of excessive post-processing. Some commenters, like elondaits, argue that "Game" mode can improve the viewing experience by reducing lag, while others, like astrange, caution that lower latency doesn't necessarily mean a better picture. The conversation quickly veered off-topic, with commenters like tguvot poking fun at the show's production choices and others, like ycombinatrix, defiantly refusing to adjust their viewing style to accommodate modern TVs. Amid the banter, a tongue-in-cheek consensus emerged: sometimes the real problem lies not with the TV settings but with the content itself.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 58m after posting
- Peak period: 82 comments in 0-6h
- Avg / period: 16
Based on 160 loaded comments
Key moments
- 01 Story posted: Dec 29, 2025 at 6:50 PM EST (11 days ago)
- 02 First comment: Dec 29, 2025 at 7:48 PM EST (58m after posting)
- 03 Peak activity: 82 comments in 0-6h (hottest window of the conversation)
- 04 Latest activity: Jan 1, 2026 at 2:39 PM EST (8 days ago)
With that being said, I've definitely seen TVs that just don't have FILMMAKER MODE, or that have it but don't seem to apply it to content from sources like Chromecast. The situation is far from easy to get a handle on.
They film for screens, regardless of where those might be.
When you say "modern", do you mean "configured by a clueless consumer"?
That's a lost cause. You never know what sort of random crap and filters someone like that may inflict on the final picture. You cannot possibly make it look good on every possible config.
What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.
It is perfectly understandable that people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.
Don't get me wrong, I haven't seen the first season, so I won't watch this, but creators/artists do and should care about this stuff.
Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.
Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.
And don't get me started on horrible implementations of HDR.
This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.
I watched Stranger Things at a friend's place the other day, and they didn't notice that the TV was destroying the picture in every way. It oversaturated and over-contrasted the image so much that sometimes there were only shades of two different colors on the screen. It also created some sort of in-between frames that produced jarring artifacts.
These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.
Most people have absolutely no idea what goes into making the pixels on their screens flicker with quality content.
Not sure why Netflix is destroying the experience themselves here.
For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit after appearing in season 4.
I also gradually switched to treating this season as background noise, since it fails to be better than that. It is insultingly bad in places even consumed this way.
But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions?
The point you make isn't incorrect at all. I would say that TVs should ship without any such enhancements enabled. The user should then be able to configure it as they wish.
Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.
Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.
Going back to TVs: they should not ship with spyware, log-ware, behavioral tracking, and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.
Etc.
I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but something like it. They shouldn't be in “stand out among others on the showroom floor” mode, but set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.
The equalizer analogy is perfect.
Having said that, there are a lot of bad HDR masters.
> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.
I’m interested in trying the filmmaker’s intent, like I’ll try the chef’s dinner before adding salt because it’ll probably be wonderful. But if I think the meal still needs salt, or my TV needs more brightness or contrast, I’ll add it. And even if the filmmaker or chef thinks I’m ruining their masterpiece, if I like it better that way, that’s how I’ll enjoy it.
And I’m very serious about the accessibility bit. My vision is great, but I need more contrast now than I did when I was 20. Maybe me turning up the brightness and contrast, or adding salt, lets me perceive the vision or taste the meal the same way as the director or chef does.
Martin has claimed he flew to HBO to convince them to do 10 seasons of 10 episodes instead of the 8 seasons with just 8 episodes in the final one [1]. It was straight up D.B. Weiss and David Benioff's call how the series ended.
[1]: https://variety.com/2022/tv/news/george-rr-martin-shut-out-g...
You know there are two more episodes, right? That seems like an obvious finale reveal.
They just invent stuff that they have no idea how to explain later. Just like Lost.
There are also stupid leaps of faith like Holly's mom hobbling out of bed and sticking an oxygen tank in a clothes dryer (as if that would even do anything)...
From that point on, everyone gets 10 inch thick plot armour, and then the last two episodes skip a whole season or two of character development to try and box the show off quickly.
Television writers pussying out in their finales is its own meme at this point. Makes me respect David Chase and how The Sopranos ended all that much more.
It's the way stuff is done, the characters' changed behavior, incomprehensible logic, stupid explanations, etc.
And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.
Basically I think the main problem with the show is the character of Eleven. She's boring. She isn't even really a character, as she has no thoughts or desires or personality. She is a set piece for the other characters to manipulate. That works in the first season, but by season 3 it's very tiring. She just points her hands at things and psychic powers go. Season 3 is a great example of this: Billy is a very interesting character, and you could spend a lot of time understanding why Billy is the way he is, but instead you get one dream sequence because Eleven sees his dreams and, oh, his dad sucks. Except you knew that already from season 2.
But this is basically the problem with the show. The writers like Eleven too much, and she is incredibly boring as a character after season 1.
That being said, I do think that the general narrative of the show, going from the Demogorgon to the Mind Flayer to Vecna and the abyss, is very Dungeons and Dragons. Haha. That would be a fun campaign to play.
1. Millie Bobby Brown, the actor who plays Eleven, is unfathomably stupid (watch any out-of-character interview with her) and the role has simply outgrown her acting abilities. They can't make Eleven do anything interesting because Millie can't act it.
2. Writers have introduced so many supporting characters and separate story lines that it is impossible to give any of them enough screen time for proper character development.
There are other major writing problems with the show, like the overreliance on cheap 80s culture references, but I think the main problem is with the characters. The writers simply don't understand what made the first season so good.
I hunted around on YouTube for a bit, but in nothing I landed on did she come across as “unfathomably stupid.” Young and green, maybe.
It’s honestly not the worst AI content out there! Lots of movies I wouldn’t consider watching but that I’m curious enough to see summarized (e.g. a movie where only the first title was good but two more were still published)
https://www.smbc-comics.com/comic/summary
You'd think television production would be calibrated for the median watcher's TV settings by now.
I think TV filters (vivid, dynamic brightness, speech lifting, etc) are actually a pretty decent solution to less-than-ideal (bright and noisy environment, subpar screen and audio) viewing conditions.
No.
On the other hand, things that are objective, like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera; it’s really nifty. Lots of people comment on how good the picture on my TV looks, and it’s just because it’s calibrated. It makes a big difference.
Anyways, while I am on my soapbox: one reason I don’t have a Netflix account any more is that you need the highest tier to get 4K/HDR content. Other services like Apple TV and Prime give everyone 4K. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get a better picture, when many viewers probably can’t even get 4K/HDR.
(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)
Ever look at the lyrics to Toto's Africa? We can start there, someone send a poet please
I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.
Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.
Now imagine motion smoothing synthesizing an extra 59 frames per second. If it only considers the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 positions, then 1 and 2, and so on. Instead of a circle, the tip of the hand would be tracing a dodecagon. That’s fine, but it’s not how your brain knows clocks are supposed to move.
Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.
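To make the geometry concrete, here is a minimal, purely illustrative Python sketch (not from the thread; the 30-degree step and the number of synthesized frames are assumptions matching the clock example above) showing that naive two-frame interpolation puts the in-between hand-tip positions on a straight chord rather than on the circle:

```python
import math

# The source content: a clock hand that jumps 30 degrees (5 "minutes") per
# frame, at 1 frame per second, as in the example above.
SOURCE_STEP_DEG = 30
RADIUS = 1.0

def hand_tip(angle_deg):
    """True position of the hand's tip: always exactly on the circle."""
    a = math.radians(angle_deg)
    return (RADIUS * math.sin(a), RADIUS * math.cos(a))

# Naive "motion smoothing": invent in-between frames by linearly
# interpolating the tip position between two consecutive source frames.
start, end = hand_tip(0), hand_tip(SOURCE_STEP_DEG)
for i in range(1, 5):
    t = i / 5
    x = start[0] + t * (end[0] - start[0])
    y = start[1] + t * (end[1] - start[1])
    # The interpolated tip sits on the straight chord between the two true
    # positions, so it falls inside the circle (radius < 1.0). String the
    # chords together and the hand traces a dodecagon, not a circle.
    print(f"t={t:.1f}  interpolated radius = {math.hypot(x, y):.3f}  (true radius = {RADIUS})")
```

Real interpolators work on estimated pixel motion rather than object positions, but the limitation is the same: with only two frames to look at, they can only guess straight-line motion in between.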
Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.
"Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.
For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cels and evoking snappier motion than what 24fps could otherwise show.
Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.
1: https://www.youtube.com/watch?v=zAPf5fSDGVk
Higher frame rates are superior for shooting reality. But for something that is fictional, the lower frame rate helps the audience suspend their disbelief.
If it did, horror films would be filmed at higher frame rates for extra scares.
Humans have a long history of suspending disbelief in both oral and written lore. I think that 'fps' may be functionally equivalent to the Santa Claus stories: fun for kids, but the adults need to pick up the bill.
With one caveat: in some games that use animation-inspired aesthetics, the animation itself is not smoothed out but basically runs at the slower framerate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get quick reaction time to your inputs.
People are “used to” high FPS content: live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc. are all at 30-60FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.
Movies above 24fps won't become a thing; it looks terrible and should be left for documentaries and sports.
But synthesizing these frames ends up with a higher frame rate while keeping the shutter angle / motion blur of the original frame rate, which looks off to me. It's the same reason the shutter angle is adjusted for footage that is intended to be slow motion.
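As a rough illustration of that mismatch (the 180-degree shutter and the frame rates below are assumed for the sake of the arithmetic, not taken from the comment): per-frame exposure time is the shutter angle divided by 360, divided by the frame rate, so the blur baked into 24fps footage is much heavier than what a native 60fps camera would record.

```python
def exposure_time(shutter_angle_deg: float, fps: float) -> float:
    """Per-frame exposure time implied by a shutter angle at a given frame rate."""
    return (shutter_angle_deg / 360.0) / fps

# Native 24fps with the common 180-degree shutter: 1/48 s of motion blur per frame.
blur_24 = exposure_time(180, 24)   # ~0.0208 s

# A native 60fps camera with the same 180-degree shutter: only 1/120 s of blur.
blur_60 = exposure_time(180, 60)   # ~0.0083 s

# Interpolating 24fps footage up to 60fps keeps the original 1/48 s of blur in
# every frame, real or synthesized, i.e. about 2.5x more blur per frame than
# native 60fps footage would carry, which is part of why it can look "off".
print(blur_24 / blur_60)           # 2.5
```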
Despite being a subscriber I pirate their shows to get some pixels.
Many years ago, I had a couple drinks with a guy from Netflix who worked on their video compression processes, and he fully convinced me they're squeezing every last drop out of every bit they send down the pipes. The quality is not great compared to some other streaming services, but it's actually kind of amazing how they're able to get away with serving such tiny files.
Anyway, I think we can expect these companies to mostly max out the resultant video quality of their bitstreams, and showing the average bitrate of their pricing tiers would be a great yardstick for consumers.
For us nerds, there is a hidden "stats for nerds" option.
https://blog.sayan.page/netflix-debug-mode/
No other service does this.
And for some reason, the HDR versions of their 1080p content are even more bit-starved than SDR.
Could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.
I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4K HDR version and maybe I'll be able to see what it was supposed to look like.
When I last rewatched it (early pandemic), as far as I could tell at the time there was no HDR version available, which I assume would fix it by being able to represent more variation in the darker colours.
I might hunt one down at some point, as it does exist now. Though it still wouldn't make season 8 'good'!!
Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.
(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)
625 more comments available on Hacker News