A 'Toaster with a Lens': The Story Behind the First Handheld Digital Camera
Key topics
The tale of Kodak's pioneering handheld digital camera is a cautionary story of innovation and disruption, with commenters weighing in on how the company's failure to capitalize on its own invention ultimately led to its downfall. As one commenter quipped, "Kodak invented the thing that killed them," highlighting the irony of a company being undone by its own groundbreaking technology. While some saw the development of digital photography as inevitable, others pointed out that Kodak's demise was a classic tale of capitalist disruption, with middle management potentially dismissing digital as a fad. The thread also sparked curiosity about the camera's legacy, with some wondering if any original photos still exist online or if the camera remains functional.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
- First comment: 41m after posting
- Peak period: 37 comments in 120-132h
- Avg / period: 7.5
Based on 60 loaded comments
Key moments
- 01 Story posted: Dec 8, 2025 at 8:04 AM EST (about 1 month ago)
- 02 First comment: Dec 8, 2025 at 8:45 AM EST (41m after posting)
- 03 Peak activity: 37 comments in 120-132h (hottest window of the conversation)
- 04 Latest activity: Dec 15, 2025 at 9:40 PM EST (23 days ago)
Other companies had already invented the CCD; it was only a matter of time before someone would digitise the signal and pair it with a storage device. It was an obvious concept.
All Kodak really did was develop an obvious concept into a prototype many years before it could be viable, and then receive a patent for it.
This is a very common story, from what I understand, whether the intent is “if you can’t beat them, buy them!” or simply to grow.
In Kodak’s case, I wonder if both those that saw it as the future and those that saw it as the end wanted to support and control it.
Also, it never ceases to amaze me that some of the best things and the most dangerous things are (1) not those that you planned on and (2) involve someone bending and breaking rules to pursue a passion project.
And yet, if you took a crayon and continued the line of maximum resolution achieved on a single chip... that line wasn't plateauing.
Somehow, everyone believed Moore's Law, but not as it applied to detectors (which are basically transistors, which is what Moore's Law discusses).
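A back-of-the-envelope sketch of that crayon-line extrapolation (the 1975 starting resolution and the doubling period below are illustrative assumptions, not figures from the thread):

```python
# Extrapolating sensor resolution under a Moore's-Law-style doubling assumption.
# Assumed (not from the thread): ~0.01 MP in 1975 (roughly the 100x100-pixel
# Sasson prototype) and a doubling of pixel count every ~3 years.
start_year = 1975
start_megapixels = 0.01
doubling_period_years = 3

for year in range(1975, 2011, 5):
    doublings = (year - start_year) / doubling_period_years
    print(f"{year}: ~{start_megapixels * 2 ** doublings:.2f} MP")
```

Under those assumptions the line passes through roughly 3 MP around 2000 and tens of megapixels by 2010, i.e. the non-plateauing trend the comment describes.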
Captured on Kodak film, I suspect.
https://petapixel.com/how-steve-sasson-invented-the-digital-...
https://petapixel.com/assets/uploads/2022/09/Prorotype-Digit...
[1] https://petapixel.com/how-steve-sasson-invented-the-digital-...
I was glad to hear Sasson found a place at Eastman Kodak and worked there for the rest of his career.
But had I been in that place at that time, I would not have invented the digital camera. That guy Sasson was clearly capable far beyond the rest of us.
I consider Wozniak (obvious example), who was at the "right time and place" in the early 1970s. He was at the engineering capital of the U.S. (Silicon Valley — already known by that name at the time), knowing adults in engineering fields who could get him otherwise expensive, new-for-the-time microprocessor chips… just as the chips were becoming more affordable—just as Don Lancaster's "TV Typewriter" and the "Altair 8800" began to grace the cover of Popular Electronics…
Woz seemed to flounder, or be somewhat overwhelmed, a decade later, when hacks with a 555 timer chip, a few NAND gates, or NTSC timing hijinks to get color were not where the industry was going. He took a back seat on the engineering side.
At the same time, not to diminish Woz's skills in 1975, there were a lot of other smart kids in the "Valley" then whose home-brew computers did become products.
(And then so much more to unpack when you allow for Jobs's contributions, U.S. schools purchasing Apple computers, etc.)
Woz had a serious brain injury in a plane crash, and I think that's when he lost his technical edge. Also, though, he was more interested in flying planes and organizing rock concerts at that point. You could imagine an alternate history where Apple took CPU design in-house in 01981 instead of 02011.
Side note, the wand was developed to solve for: trip to Mars -> bone density loss -> minerals in blood -> debilitating kidney stone 1 year from a hospital.
And the photos in the article of the old "Instamatic" Kodak film cameras (especially that 110 pocket camera) suddenly brought back to my mind that formaldehyde-like smell of developer chemicals from when I worked at a one-hour photo lab in high school.
If this were really the case, I'm surprised the US government didn't engage in antitrust action.
https://petapixel.com/why-kodak-died-and-fujifilm-thrived-a-...
TL;DR: Fujifilm diversified quickly; Kodak clung to the film business for far too long.
Yep, that's completely different from the post you replied to.
This part reminded me of the Black Triangle (2004):
https://archive.ph/qqOnP
https://news.ycombinator.com/item?id=698753
Steve Sasson's tale of technical struggle in 01975 at Kodak is real, but dozens of other people were doing the same thing at the same time at different companies, because at that point the problem of building a handheld digital camera had been reduced to a problem that one guy could solve with off-the-shelf parts. That reduction was the result of numerous small advances over the previous 50 years.
Landsat 1 was a digital camera that was initially planned in 01970 and launched into space in 01972; it just weighed a tonne, so you couldn't hold it in your hand. https://directory.eoportal.org/satellite-missions/landsat-1-... says:
> It quickly became apparent that the digital image data, acquired by the MSS (Multispectral Scanner) instrument, a whiskbroom scanning device, were of great value for a broad range of applications and scientific investigations. For the first time, the data of an orbiting instrument were available in digital form, quantified at the instrument level - providing a great deal of flexibility by offering all the capabilities of digital processing, storage, and communication.
Landsat 1 was built by General Electric, RCA, NASA, and subcontractors, and the MSS digital camera component in particular was designed by Virginia Norwood at the Hughes Aircraft Company, not at Kodak.
Ranger 7 in 01964 https://en.wikipedia.org/wiki/Ranger_7 was an electronic camera that was successfully launched into the moon and returned close-range photos of it over radio links, but, as far as I can tell, it wasn't a digital camera; the RF links were analog TV signals.
Handheld electronic cameras, for a very strong person, might date back to Philo T. Farnsworth's Image Dissector in 01927 https://en.wikipedia.org/wiki/Video_camera_tube#Experiments_... or Zworykin's Iconoscope in 01933 https://en.wikipedia.org/wiki/Video_camera_tube#Iconoscope, but in practice these were only reduced to handheld-plus-backpack size in the 01950s https://en.wikipedia.org/wiki/Professional_video_camera#Hist.... Farnsworth was at the Farnsworth Television and Radio Corporation, not at Kodak. Zworykin was at Westinghouse and RCA, not at Kodak.
The first experimental digitization of a signal from an electronic camera was probably done by Frank Gray at Bell Labs, not at Kodak, in 01947, for which he invented the Gray Code. To be able to keep up with live full-motion video data, his analog-to-digital converter was a sort of cathode-ray tube with a shadow mask in it with the code cut into it; this is described in patent 2,632,058, granted in 01953: https://patentimages.storage.googleapis.com/a3/d7/f2/0343f5f....
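For context on why the Gray code was useful for that converter: in a reflected-binary Gray code, successive values differ in exactly one bit, so a slightly misaligned read in a crude analog-to-digital converter yields an off-by-one code rather than a wildly wrong one. A minimal sketch of the standard conversion (the code itself, not Gray's shadow-mask tube):

```python
def binary_to_gray(n: int) -> int:
    """Reflected-binary Gray code of a non-negative integer."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Recover the plain binary value from a Gray code."""
    b = g
    g >>= 1
    while g:
        b ^= g
        g >>= 1
    return b

# Adjacent values differ in exactly one bit:
print([format(binary_to_gray(i), "03b") for i in range(8)])
# ['000', '001', '011', '010', '110', '111', '101', '100']
```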
The video camera tubes that were the only way to build electronic cameras up to the 50s, and which made the cameras large and heavy, were supplanted by the CCDs that Sasson used in his prototype at Kodak. The CCD was developed by Smith and Boyle at Bell Labs, not at Kodak, in 01969–70: https://en.wikipedia.org/wiki/Charge-coupled_device
However, any DRAM chip is also an image sensor, which is why they are encapsulated in black epoxy to prevent them from sensing light; without the CCD, we would have had CMOS image sensors anyway just because of the light-sensitivity of silicon.
The fundamental thing that made digital cameras not just possible but inevitable was microelectronics, a technology which owes its existence in 01975 to a long series of innovations including the point-contact transistor (Bardeen and Brattain, 01947, Bell Labs, not at Kodak); the junction transistor (Shockley, 01948, Bell Labs, not at Kodak); the monolithic integrated circuit (Noyce, 01959, Fairchild Semi, not at Kodak); the planar process (Hoerni, 01959, Fairchild Semi, not at Kodak); the MOSFET (Kahng and Atalla, 01959, Bell Labs, not at Kodak); the self-aligned silicon gate (Faggin, 01968, Fairchild, not at Kodak); and, as mentioned in the article, the microprocessor, an invention which was overdetermined in the same way as the handheld digital camera, but whoever we decide invented the microprocessor, it certainly wasn't done at Kodak.
That's fine by me. The informative posts are worth it.
My first "digital" camera was one of the Canon Ion ones that used floppy discs but actually recorded the photos in analog, so it was technically just an "electronic camera", though it was marketed as a "Still Video Camera" (!). You had to use a video capture card to get the images into the PC.
https://en.wikipedia.org/wiki/Video_Floppy
https://www.youtube.com/watch?v=4G_1uy_7B5w
The first book of David Brin’s Uplift series was written in 1980 and takes place on an antigravity spaceship carrying alien ambassadors that can penetrate deep into the Sun. Yet one of the major plot points is someone using the onboard darkroom to develop pictures that reveal something essential.
I’m hoping someone would make a new sci-fi movie with a vintage aesthetic that would intentionally emphasize and magnify this old-school analog awesomeness of galactic empires that seem to entirely lack integrated circuits.
This is what I hoped for Foundation, to replicate the 1940s now-retrofuturism I imagine while reading the books. Alas, it wasn't to be.
The “Foundation” we got has good moments and excellent production values, but it doesn’t seem to know or care exactly what the rules of its universe are. (I don’t like how Hari Seldon was apparently a font of semi-magical technology invented all at once and in secret…)
Plus, it’s just one of the best TV shows ever made in any genre.
Reminds me of Harry Turtledove's The Road Not Taken.
https://en.wikipedia.org/wiki/The_Road_Not_Taken_(short_stor...
Idea for a sci fi novel: total reliance on chatbots that predict what you want to hear based on the average of the internet ends the astonishing run of innovation we've had since the industrial revolution, and returns us to the situation humanity has been in for most of our history, in which technology develops slowly, if at all. What do things look like in a thousand years, when we're still relying on the current equivalent of slide rules and analog film?
https://petapixel.com/what-is-ccd-cmos-sensor/
and https://www.teledynevisionsolutions.com/learn/learning-cente...
A lot of it was because the film people kneecapped the digital folks.
Film was very profitable.
Until it wasn't.
The company that I worked for was a classic film company. When digital was first getting a foothold (early 1990s), I used to get lectures about how film would never die... etc.
A few years later, it was as if film had never existed. The transition was so sudden and so complete that, if you blinked, you missed it.
Years later, I saw the same kind of thing happen to my company, that happened to Kodak.
The iPhone came out, with its embedded camera, and that basically killed the discrete point-and-shoot market, which was very profitable for my company.
When the iPhone first came out, the marketing folks at my company laughed at it.
Then, they stopped laughing.