The 'Toy Story' You Remember
Key topics
The article discusses how the original 'Toy Story' film was mastered for 35mm film projection, and how subsequent digital releases have altered its appearance, sparking a discussion about the importance of preserving original film formats and the challenges of replicating their look in digital versions.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 1h after posting
- Peak period: 48 comments in 0-3h
- Average per period: 16 comments
Based on 160 loaded comments
Key moments
1. Story posted: Nov 10, 2025 at 10:17 PM EST (about 2 months ago)
2. First comment: Nov 10, 2025 at 11:18 PM EST (1h after posting)
3. Peak activity: 48 comments in the 0-3h window, the hottest stretch of the conversation
4. Latest activity: Nov 12, 2025 at 4:43 PM EST (about 2 months ago)
In the Lion King example you weren't meant to see all of the detail the artists drew. In the army men example, the color of the digital version is nothing like the color of the actual toys.
They originally made those movies the way they did intentionally, because what they wanted wasn't crystal-clear images with unrealistic colors; they wanted atmosphere and for things to look realistic.
Film grain and dust can be excessive and distracting. It's a good thing when artifacts added by dirt and age get cleaned up for transfers so we can have clear images, but the result of that cleanup should still show what the artists originally intended, and that's where Disney's digital versions really miss the mark.
That's the point in that Lion King frame, though. They drew it planning for it to get washed out by the sunlight effect, and when it's not, it absolutely ruins the "vast crowd" effect they were going for, because you can clearly see there are no more animals in the background and it's just 20 guys standing there.
I don't believe these were part of the filmmakers' vision at the time, but rather unavoidable. Nowadays they are added back into films (and video games) on purpose to create a certain (nostalgic) effect.
Colorizing a black-and-white film, for example, is not ever restoring the original intention or vision, even "subjectively." If the makers of a black-and-white film had been making a color film, they would have made different choices.
This does not mean that you should not colorize black-and-white films, you should do whatever makes you happy. I honestly can't wait until AI is recreating missing scenes or soundtracks from partially lost films, or even "re"creating entire lost films from scripts or contemporary reviews and cast lists, and expanding films to widescreen by inventing contents on the edges. But this will not be restoring a vision, this will be original work.
It's why they all have "motion smoothing" turned on on all their TVs too. Yes, it's animation, but the Blu-rays look "higher resolution", and look "smoother" and less "noisy".
All the artistic benefits you and I see are lost on most watchers.
It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.
https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-a...
This is totally bonkers, because the VHS format is crippled, color-wise as well. Many modern transfers are just crap.
An infamous case is the Buffy the Vampire Slayer tv show. The Blu-ray (edit: and streaming copies) went back to the film source, which is good, but… that meant losing the color grading and digital effects, because the final show wasn’t printed to film. Not only did they get lazy recreating the effects, they don’t seem to have done scene-by-scene color grading at all. This radically alters the color-mood of many scenes, but worse, it harms the legibility of the show, because lots of scenes were shot day-for-night and fixed in post, but now those just look like they’re daytime, so it’s often hard to tell when a scene is supposed to be taking place, which matters a lot in any show or film but kinda extra-matters in one with fucking vampires.
The result is that even a recorded-from-broadcast VHS is arguably far superior to the blu ray for its colors, which is an astounding level of failure.
(There are other problems with things like some kind of ill-advised auto-cropping seeming to have been applied and turning some wide shots into close-ups, removing context the viewer is intended to have and making scenes confusing, but the colors alone are such a failure that a poor VHS broadcast recording is still arguably better just on those grounds)
There's a fucking lot of things that are not worth it monetarily, but worth it for the sake of itself. Because it's a nice gesture. Or because it just makes people happy. Not to sound like some hippie idealist, but it's just so frustrating that everything has to be commoditized.
In modern tech circles, the utilitarian mindset is going strong, now that the hacker ethos is dead and it’s all about being corporate friendly and hireable.
It's always easy to complain about others not being generous enough with their time, but we always have an excuse for why we won't do it ourselves.
You can't, at least not if you want an acceptable result.
In photography, if you have a JPEG photo only, you can't do post-facto adjustments of the white balance, for that you need RAW - too much information has been lost during compression.
For movies it's just the same. To achieve something that actually looks good with a LUT (the fancy term for re-coloring, aka color grading), you need access to the uncompressed scans, as early in the processing pipeline as you can get (i.e. before any kind of filter is applied).
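To make that point concrete, here's a minimal NumPy sketch (the pixel values and gains are invented for illustration): once highlights have been clipped and quantized to 8 bits for delivery, a white-balance or grading tweak can no longer recover the channel relationships that the original scan contained.

```python
import numpy as np

def apply_gain(img, gains):
    """Per-channel gain: a crude stand-in for a white-balance / grading tweak."""
    return img * np.asarray(gains, dtype=np.float32)

# Hypothetical scene-linear scan with highlight detail above 1.0 (e.g. a bright sky).
scan_linear = np.array([[1.8, 1.1, 0.9]], dtype=np.float32)

# The delivery copy was clipped to [0, 1] and quantized to 8 bits.
delivery = np.round(np.clip(scan_linear, 0.0, 1.0) * 255) / 255

cooler = [0.7, 1.0, 1.2]  # shift the balance toward blue

print(apply_gain(scan_linear, cooler))  # channel ratios in the highlight survive the tweak
print(apply_gain(delivery, cooler))     # red was clipped at 1.0, so that detail is simply gone
```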
Honestly, by weakening copyright protections. People who love the works will do the work to protect them when they don't have to fear being sued into bankruptcy for trying to preserve their own culture.
> You think a kid is going to notice two pages? All they do is look at the pictures.
I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.
With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
I agree it was likely Disney being cheap, but there are tons of people who'll buy up Disney movies on physical media in the age of streaming. Not only are there Disney fans who'd rival the obsessiveness of Star Wars fans, but like Lucas, Disney just can't leave shit alone. They go back and censor stuff all the time, and you can't get the uncensored versions on their streaming platform. Aladdin is even an example where they've made changes. It's not even a new thing for Disney; the lyrics to one of the songs in Aladdin were changed long before Disney+ existed.
The Disney of yesterday might have been a bit more Jobs than Gates, compared to the Disney of today.
I think there's a discussion to be had about art, perception, and devotion to the "original" or "authentic" version of something that can't be resolved completely, but what I don't think is correct is the perception that this was overlooked or a mistake.
> During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.
But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question: should the digital release reflect their intention as they were targeting these calibrated monitors, or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.
Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.
If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that after all the thought and effort went in to the animation they allowed that putrid pea shade to go through.
[0]: https://www.youtube.com/watch?v=lPU-kXEhSgk
Another movie with the same / similar problem is the Lord of the Rings Extended Edition release, on both Blu-ray and 4K. As far as I remember, they fixed it for the theatrical version in 4K but not the extended one.
https://www.youtube.com/watch?v=XR0YBqhMtcg
Another one that's been hard to find is the 4K Matrix release with the original color grading. Ping me if you have it! (Not the 1080p release)
These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.
Someone tell this community to share over BT. Aint nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.
But you have algorithmic grain in modern codecs, so no need to waste so much space for noise?
The other’s fake noise.
One’s a real photo from 1890. The other’s an old-timey instagram filter.
It makes sense that some folks might care about the difference. Like, I love my old family Polaroids. I would not want a scanned version of those to have the “noise” removed for compression’s sake. If that had been done, I’d have limited interest in adding fake noise back to them. By far my favorite version to have would be the originals, without the “noise” smoothed out at all.
Lots of folks have similar feelings about film. Faked grain isn’t what they’re after, at all. It’s practically unrelated to what they’re looking for.
> The other’s fake noise
But since there is no such thing as the real thing, it could just as well match one of the many real noise patterns in one of the many real things floating around, or a real thing at a different point in time with more/less degradation. And you wouldn't even know the difference, thus...
> It makes sense that some folks might care about the difference
Not really, it doesn't make sense to care about identical noise you can't tell apart. Of course, plenty of people care about all kinds of nonsense, so that won't stop those folks, but let's not pretend there is some 'real physics' involved.
But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed
One is the real deal and another one is a simulation. End of story.
You can, actually: the 2006 Limited Edition DVD is a double-disc version, with one disc being the original version.
However, they are not DVD quality, because they were transferred from LaserDisc and not the original film stock.
To pick an arguably-minor but very easy to see point: the title’s different.
I can't find out if they fixed the 3% speed-up from the LaserDisc. The audio mix, at any rate, will be a combination of the three (stereo, mono, 70mm) original mixes, like on the LaserDisc, so identical to none of them. The source should predate the replacement of Latin script with made-up letters (not conceived until ROTJ, then retrofitted onto some releases of SW and Empire), so that'll be intact unless they "fixed" it.
Still stuck with sub-ordinary-dvd-quality picture, as far as official releases go, so that’s too bad. Oh well, fan 35mm scan projects solved that problem.
Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.
But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.
And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.
Piracy: saving our childhoods one frame at a time.
I can't figure out how to determine if that's intentional.
The careful eye may also notice they almost never strike the sabers against one another in that scene... because it'd break the spinning sticks. Apparent contact is usually gently done, or a trick of perspective.
A true fan who wants to preserve the film and be faithful in their scan is going to dedicate their life to getting it just right, while a megacorp will just open the original, click "Export as..." and call it a day.
https://www.reddit.com/r/toystory/comments/1hhfuiq/does_anyo...
The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon
I went down the tonemapping rabbit hole for a hobby game engine project a while ago and was surprised at how complex the state-of-the-art is.
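For the curious, the common thread in all these display transforms is a curve that compresses scene-linear highlights into displayable range. Here's a minimal sketch using the much simpler extended Reinhard operator, not AgX itself (which adds per-channel log encoding, an inset matrix, and a carefully shaped sigmoid):

```python
import numpy as np

def reinhard_extended(x, white_point=4.0):
    """Extended Reinhard tonemap: scene-linear values are compressed so that
    `white_point` maps exactly to 1.0; anything above it is clamped."""
    x = np.asarray(x, dtype=np.float32)
    y = x * (1.0 + x / white_point**2) / (1.0 + x)
    return np.clip(y, 0.0, 1.0)

hdr = np.array([0.0, 0.18, 1.0, 4.0, 16.0])  # scene-linear radiance samples
print(reinhard_extended(hdr))                # e.g. 0.18 -> ~0.15, 4.0 -> 1.0
```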
The 4K77 and related fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in e.g. bad frames, used references such as (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the (always faded, sometimes badly) color grading and contrast, and had to extensively hand-correct things like scratches, with some reels or parts of reels requiring far more of that kind of work than others. Even Jedi required a lot of it, and those reels would have been only something like 30-35 years old when they started working on them.
https://davidsimon.com/the-wire-hd-with-videos/
It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.
https://x.com/TristanACooper/status/1194298167824650240
Open both images and compare. The visual joke is completely ruined with the cropping.
This is not true at all. Being compatible with outdated, film based projectors was much more important for being able to show it in as many theaters as possible. If they wanted to do a digital screening it would have been technologically possible.
Digital cinema went with Motion JPEG2000 with high quality settings, which leads to very large files, but also much better fidelity than likely with a contemporary video codec.
https://en.wikipedia.org/wiki/Digital_cinema
I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.
Yeah, but we were still using MPEG-2 back then, weren't we?
They would have looked like utter garbage. Bitrates would have had to be so high that I'm not sure we would have actually had enough storage. I guess we could have shipped TWO hard drives.
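A back-of-the-envelope calculation supports the storage point. The render resolution (~1536x922 is the commonly cited figure) and ~81-minute runtime are assumptions from outside the thread, so treat the inputs as rough:

```python
# Rough sizing of a mid-90s digital release under assumed numbers.
width, height, fps, minutes = 1536, 922, 24, 81   # assumed render resolution and runtime

bytes_per_frame = width * height * 3              # 24-bit color, ~4.2 MB per frame
total_frames = fps * minutes * 60

uncompressed_gb = bytes_per_frame * total_frames / 1e9
print(f"uncompressed: {uncompressed_gb:.0f} GB")              # ~496 GB
print(f"at 20:1 compression: {uncompressed_gb / 20:.0f} GB")  # ~25 GB, vs ~2 GB drives of the day
```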
We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx InfiniteReality boxes with the specialist video breakout boards that did digital video or analogue with genlock. Disks were 2 GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD, it was 720 x 576 interlaced PAL.
We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.
Note that a lot of this was bad for tape machines. These cost £40k upwards, and advancing the tape by one frame to record it, then back again to reposition the tape for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.
Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.
The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.
Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.
It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.
Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.
It was possible, but much too expensive to get it into wide release that way.
I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.
Are there like multiple digital releases, one with better colour than the other?
There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there's several other examples where they did a Thing with colour grading / filtering in a remaster.
> I have an updated, I found out that T2 4K is an HDR movie that needs to be played with MadVR and enable HDR on the TV itself, now the colors are correct and I took a new screenshot: https://i.imgur.com/KTOn3Bw.jpg
> However when the TV is in HDR mode the 4K looks 100% correct, but when seeing the screenshot with HDR off then the screenshot looks still a bit wrong, here is a screenshot with correct colors: https://i.imgur.com/KTOn3Bw.jpg
https://www.youtube.com/watch?v=1mhZ-13HqLQ
There's a 35mm scan floating around from a faded copy with really weird colors sometimes
https://www.youtube.com/watch?v=Ow1KDYc9XsE
And there's an Open Matte version, which I don't know the origin of.
https://www.youtube.com/watch?v=Z2eCmhBgsyI
For me, it's the Open Matte that I consider the ultimate best version.
CD itself can replicate the same dynamic range and more, but that doesn't sell extra copies.
The same applies to people buying modern vinyl records believing them to be more authentic than a CD or (god forbid) online streaming.
Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that are not part of the original recording. Additionally, the vinyl is not catching more overtones because it's analogue; there is no true analogue path in modern music any more.
Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hi-fi system that can make use of it) is an early-80s CD release, because it sits in a sweet spot: predating the loudness war, when producers actually used the dynamic range of the CD.
[0] https://en.wikipedia.org/wiki/Loudness_war
Once better monitors became more commonplace, mastering became dynamic again.
This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.
[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).
It was sort of a happy coincidence that vinyl's limitations forced more dynamic (but less bass-y) masters. Although if your artist didn't do vinyl releases (which really was a dying medium until hipsters brought it back in the 2010s), you were hosed.
Interesting, I did not know this! I'm not doubting you, but I'm a little confused and curious about how the physics of that works out. Wouldn't being brickwalled mean the volume stays pretty constant, meaning there's less work for the needle? Or is there some kind of limit to how many overlapping waveforms a needle can pick up at once?
"Dynamic range compression" is a bit of a misleading term because it sounds like you're taking an audio signal and and squeezing it.
What you're really doing is two things: reducing (compressing) the difference between the quiet (valleys) and loudest (peaks) parts, and then pushing the volume of the peaks up to or past 0dB. Technically, that second step isn't dynamic range compression, but in practice it is / was always done. The reason they do this is because for human ears, louder sounds better. However, you lose dynamism. Imagine if you watched a movie, and a whisper during a military night raid would sound as loud as the shouty conversation they had in the planning room.
Past 0dB, a signal will 'clip'[0], which means the loudest parts of the signal cannot be expressed properly and will be cut off, leading to signal loss. Basically, 0dB is the loudest you can get.
These days, in practice, music tracks get mastered so that the average value is -14dB because streaming sites will 'normalize' tracks so that the average dB is -14dB. Here[1] you can see why that makes brickwalling bad. If your track goes full tilt and has almost no valleys, the average dB per second is rather high, so your entire track gets squeezed to average out to -14dB. But if you have lots of valleys, you can have more peaks and the average will still be -14dB!
RE: vinyl? Well, too much and / or too intense motion in the groove (the groove is effectively a physical waveform) makes the needle slightly skip out of the groove. "Too much" happens with brickwalling, "too intense" happens with very deep bass. Try to imagine the upcoming links I'm referring to as a physical groove a needle has to track, instead of a digital waveform.
Here[2] is one Death Magnetic track waveform of the brickwalled original vs. fixed remastered release. It's not too bad. But then there is this[3] insanity.
[0] https://www.youtube.com/watch?v=SXptusF7Puo / https://www.youtube.com/watch?v=g7AbmhOsrPs
[1] https://cdn.shopify.com/s/files/1/0970/0050/files/46eacedf-c...
[2] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...
[3] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...
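Here's a toy sketch of the two-step process described above (squash everything over a threshold, then apply makeup gain) and its effect on the peak-to-average ratio. Real mastering chains and streaming normalization work with LUFS rather than the plain RMS used here, so this is only illustrative:

```python
import numpy as np

def brickwall(signal, threshold=0.2, ratio=20.0):
    """Toy compressor/limiter: reduce everything above `threshold` by `ratio`,
    then apply makeup gain so the new peak sits at full scale ("louder sounds better")."""
    s = np.asarray(signal, dtype=np.float32)
    mag = np.abs(s)
    squeezed = np.where(mag > threshold,
                        np.sign(s) * (threshold + (mag - threshold) / ratio),
                        s)
    return squeezed / np.max(np.abs(squeezed))

t = np.linspace(0.0, 1.0, 44100)
dynamic = np.sin(2 * np.pi * 220 * t) * np.linspace(0.1, 1.0, t.size)  # quiet-to-loud passage
crushed = brickwall(dynamic)

# The brickwalled version has a much lower crest factor (peak relative to average level),
# so a "normalize to a target average loudness" step leaves it with far less headroom.
for name, x in (("dynamic master", dynamic), ("brickwalled master", crushed)):
    rms = np.sqrt(np.mean(x ** 2))
    print(f"{name}: peak {np.max(np.abs(x)):.2f}, rms {rms:.2f}, crest factor {np.max(np.abs(x)) / rms:.1f}")
```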
I watched Aladdin more than any other movie as a child, and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.
> Early home releases were based on those 35 mm versions.
Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA
Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8
So, yes the VHS is expected to have more magenta.
Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.
I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.
The author is not wrong that oversaturation is a source-transfer phenomenon (which will always be different unless special care is taken to compare with the source material).
On most TVs that magenta wouldn't have shown as much as the youtube video shows because TVs tended to have weaker magentas. Of course, it's not like TVs were that uniformly calibrated back then and there were variations between TVs. So depending on the TV you had, it might have ended up having too much magenta but that would have usually been with more expensive and more accurate TVs.
TLDR: Transfers are hard, any link in the chain can be not properly calibrated, historically some people in charge of transferring from one source to another compensated for perceived weak links in the chain.
Regarding my memory, it becomes shakier the more I think about it. I do remember the purples but me having watched the cartoon could have affected that.
Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.
PS: and don't get me started on how my 60-year-old eyes see color now versus what they perceived when I saw this in the theater.
I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.
It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.
I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.
> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.
While I have no doubt that this hadn't been done at that scale and resolution before, it struck me that I'd heard about the concept in a podcast episode [1] in which very early (1964) computer animation was discussed alongside the SC4020 microfilm printer, which used a Charactron CRT that could display text for exposure to film or plot lines.
[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...
https://www.youtube.com/watch?v=tyixMpuGEL8
Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."
(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")
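If you did want to fake it, the weave itself is easy to approximate: shift each frame vertically by a small, slowly drifting offset. A minimal sketch (frame data and parameters are made up, and np.roll wraps at the edges, which a real implementation would crop instead):

```python
import numpy as np

def add_film_weave(frames, max_offset_px=2.0, smoothness=0.9, seed=1):
    """Vertical gate weave: a low-pass-filtered random walk in the frame offset,
    so the image drifts slowly instead of jumping frame to frame."""
    rng = np.random.default_rng(seed)
    offset, out = 0.0, []
    for frame in frames:
        offset = smoothness * offset + (1.0 - smoothness) * rng.uniform(-max_offset_px, max_offset_px)
        out.append(np.roll(frame, int(round(offset)), axis=0))
    return out

# e.g. one second of flat grey 64x64 frames at 24 fps
clip = [np.full((64, 64), 128, dtype=np.uint8) for _ in range(24)]
wobbly = add_film_weave(clip)
```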
I wonder if artificial grain would actually make it look better.
Like when the game Splinter Cell was released, there were two additional 'views' simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.
https://www.youtube.com/watch?v=6w4bzm6ewRQ
And here I was thinking of re-watching some old Disney/Pixar movies soon :(
If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/
If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.
Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.
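As a very rough illustration of what those shaders do (real ones model phosphor masks, bloom, curvature, and the analog signal path, which is far beyond this sketch), the most recognisable ingredient is integer upscaling plus darkened scanlines:

```python
import numpy as np

def crt_ish(frame, scale=4, scanline_strength=0.35):
    """Nearest-neighbour upscale plus a darkened-scanline mask; only a tiny
    fraction of what real CRT shaders simulate."""
    up = np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1).astype(np.float32)
    rows = np.arange(up.shape[0])
    mask = np.where(rows % scale < scale // 2, 1.0, 1.0 - scanline_strength)
    return (up * mask[:, None, None]).clip(0, 255).astype(np.uint8)

# a random 240p-ish frame becomes a 960x1280 frame with visible scanlines
frame = np.random.default_rng(0).integers(0, 256, (240, 320, 3), dtype=np.uint8)
print(crt_ish(frame).shape)  # (960, 1280, 3)
```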
Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:
https://www.youtube.com/watch?v=wcym2tHiWT4
As the Aladdin still shows with its wildly altered colors, clearly other aspects matter / are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR. It's just so interesting to me.
Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.
178 more comments available on Hacker News